Traditionally, when accessing multimedia data across a network, a user had to wait for the entire file to be transferred before any of the information could be used. Streaming, however, allows a user to see or hear the information as it arrives, without having to wait.
Streaming technology offers a significant improvement over the download-and-play approach to multimedia file distribution, because it allows the data to be delivered to the client as a continuous flow with minimal delay before playback can begin. The multimedia data arrives, is briefly buffered before being played, and is then discarded. It is never actually stored on the user's computer.
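The arrive-buffer-play-discard cycle described above can be sketched in a few lines of Python. This is a minimal illustration, not any real player's logic; the chunk names and the buffer threshold are invented for the example.

```python
from collections import deque

# Start playback once this many chunks have arrived (illustrative value).
BUFFER_THRESHOLD = 3

def stream_playback(chunks):
    """Simulate streaming: buffer briefly, play, then discard each chunk."""
    buffer = deque()
    played = []
    buffering = True
    for chunk in chunks:
        buffer.append(chunk)                 # chunk arrives from the network
        if buffering and len(buffer) >= BUFFER_THRESHOLD:
            buffering = False                # initial buffer filled; playback begins
        if not buffering:
            played.append(buffer.popleft())  # play and discard; never stored to disk
    while buffer:                            # drain the buffer at end of stream
        played.append(buffer.popleft())
    return played

print(stream_playback(["c1", "c2", "c3", "c4", "c5"]))
```

Note that playback only begins once the initial buffer is full, which is the brief start-up delay a viewer sees before a stream starts.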
Users benefit by experiencing near-instant playback, without the frustration of having to wait for the entire file to be downloaded before they can determine whether it meets their needs or interests. In most cases, that download process took so long that it was impractical for widespread acceptance.
Streaming is a server/client technology that allows live or pre-recorded data to be broadcast in real time, opening up the network for traditional multimedia applications such as news, education, training, entertainment, advertising, and a host of other uses. Thus, streaming establishes the Internet or a company Intranet as a new broadcast medium for audio and video.
The Video Source is typically one or more streams of analogue video. It can come from cameras, DVD players or VCRs. These video sources will have an analogue video connection to the Encoding Station. It is common for live broadcasts to connect the cameras to video production and editing equipment, before being passed on to the Encoding Station.
The Encoding Station is a computer that captures the live audio and video and, typically, encodes them directly into the required streaming format. The most common systems used for encoding are Windows® XP or Windows® 2000 workstations equipped with audio and video capture cards. These systems must have the computational power to encode one or more audio and video streams, either in software or via a hardware codec. The use of a good capture card is critical in achieving these high rates with good picture quality. The card needs to be capable of capturing 640x480 @ 30 fps without dropping any pixels/frames or consuming excessive CPU time.
To meet these criteria, Winnov have developed their own ASICs (Application-Specific Integrated Circuits) that are optimised for managing data transfers to the PCI bus. In fact, the latest cards also have SDRAM on board to create an Elastic Frame Buffer that holds a digitised frame until the PCI bus is ready to receive it. This is particularly important when there is high bus traffic and/or multiple cards in the same PC. Without it, you could experience pixel drops that would degrade the video quality. In contrast, most other vendors, including Osprey, use the Conexant PCI I/F chip for managing data transfers to the PCI bus.
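Rough arithmetic shows why capturing 640x480 @ 30 fps stresses the PCI bus. The pixel format is an assumption here (the text does not state one); 16 bits per pixel corresponds to common YUV 4:2:2 capture.

```python
# Uncompressed capture bandwidth for 640x480 @ 30 fps, assuming
# 16 bits per pixel (YUV 4:2:2 -- an assumption, not stated in the text).
width, height, fps, bits_per_pixel = 640, 480, 30, 16

mbit_per_sec = width * height * fps * bits_per_pixel / 1e6
print(round(mbit_per_sec, 1), "Mbps")  # ~147.5 Mbps must cross the PCI bus
```

Sustaining roughly 147 Mbps of raw video, on top of other bus traffic, is why buffering on the card itself matters.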
Image quality can be further improved by using the Osprey®-560, which has digital video inputs, so there is no loss due to analogue-to-digital conversion of the video signal.
The Encoding Station, which needs to be near the Video Source, sends the compressed audio/video streams on to the Video Streaming Server (typically via a LAN using the UDP or TCP protocol). Individual compressed streams can vary from 20 Kbps (Kilobits/second) to 500 Kbps or more. The connection between the Encoding Station and the Video Streaming Server must accommodate the sum of the bandwidths of the individual streams, and must be a clear and reliable connection.
The Video Streaming Server is responsible for delivering the compressed video in response to each individual request for a particular video stream. This is usually handled by one of the commercial streaming media software packages such as RealNetworks® RealSystem™ or Microsoft® Windows Media™ Technologies. The bandwidth connection to the Video Streaming Server must accommodate the total bandwidth of all the requests for a video stream, unlike the Encoding Station, which must only accommodate one copy of each. As a result, the Video Streaming Server usually has a direct connection to a very high bandwidth line. For example, if there were 100 requests for a video stream compressed at 28.8 Kbps, the server would require at least a 3 Mbps connection. The Encoding Station and the Video Streaming Server can be one single system. However, unless hardware encoding is used, this would typically be for situations requiring limited performance (e.g. a single input stream and a small number of viewer requests). Even so, it would still require a fairly high-performance system. It is much more common to have two separate systems.
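The bandwidth asymmetry between the two machines can be checked with back-of-envelope arithmetic, using the 100-viewer example from the text:

```python
# Server-bandwidth example from the text: 100 unicast viewers,
# each requesting a 28.8 Kbps stream.
stream_kbps = 28.8
viewers = 100

encoder_uplink_kbps = stream_kbps           # Encoding Station: one copy per stream
server_uplink_kbps = stream_kbps * viewers  # Streaming Server: one copy per viewer

print(server_uplink_kbps / 1000, "Mbps")    # 2.88 Mbps, hence "at least 3 Mbps"
```

The Encoding Station's uplink requirement stays constant no matter how many viewers connect; only the Video Streaming Server's requirement scales with audience size.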
The Web Server for video streaming is in no way different from other Web Servers. The web site merely contains a URL link to the Video Streaming Server - one for every available video stream. Typically this is an icon on the web page to be selected.
A Video Player application is required on the requesting system to decode the specific video stream received over the Internet (or corporate Intranet). The most popular current video streaming applications are RealNetworks® RealSystem™ and Microsoft® Windows Media™ Technologies. Both of these require downloading a corresponding Video Player application, such as RealOne™ Player or Windows Media™ Player, but both of these are free. There are other video streaming applications that are implemented in such a way as to include the player in the stream, so no download is required.
Unicast vs IP Multicast.
There are two key streaming delivery techniques: unicast and multicast. Unicast refers to networking in which computers establish two-way, point-to-point connections. Most networks operate in this fashion: users request a file, and a server sends the file to those clients only. When streaming multimedia over a network, the advantage of unicast is that the client computer can communicate with the computer supplying the multimedia stream. The disadvantage of unicast is that each client that connects to the server receives a separate stream, which rapidly uses up network bandwidth.
IP Multicast refers to the networking technique in which one computer sends a single copy of the data over the network and many computers receive that data. Unlike a broadcast, routers can control where a multicast travels on the network. When streaming multimedia over the network, the advantage of multicasting is that only a single copy of the data is sent across the network, which preserves network bandwidth. The disadvantage of multicasting is that it is connectionless; clients have no control over the streams they receive. To use IP multicast on a network, the network routers must support the IP Multicast protocol. Most routers now handle multicast.
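The "connectionless" nature of multicast is visible at the socket level. A sketch in Python: rather than connecting to a server as a unicast client would, a multicast receiver simply asks its local router for membership of a group address. The group address and port below are illustrative, not from the text.

```python
import socket
import struct

# Illustrative multicast group and port (239.0.0.0/8 is administratively scoped).
GROUP, PORT = "239.1.1.1", 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# IP_ADD_MEMBERSHIP takes the group address plus the local interface
# (0.0.0.0 = let the kernel choose). No connection to any server is made.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
try:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    joined = True
except OSError:
    joined = False  # host may lack a multicast route; routers must support it

# data, addr = sock.recvfrom(2048)  # every member receives the same single copy
sock.close()
```

Note there is no back-channel to the sender here, which is exactly the trade-off the text describes: the receiver takes whatever the group carries.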
Internet Protocols.
There are several Internet protocols available for streaming data: TCP, UDP, RTP, RTSP, MMS and HTTP. Generally, each configures the data into packets, with each packet having a 'header' that identifies its contents. The protocol used is usually determined by the need for reliable or unreliable communications.
TCP is a reliable protocol designed for transmitting alphanumeric data; it can stop and correct itself when data is lost. This protocol is used to guarantee sequenced, error-free transmission, but its very nature can cause delays and reduced throughput. This can be especially annoying when streaming audio and video.
The User Datagram Protocol (UDP), by contrast, is an unreliable protocol within the IP stack, in which data may be discarded in preference to interrupting the flow.
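The reliable/unreliable distinction maps directly onto the two socket types. A loopback sketch (local to one machine, so nothing is actually lost here, but the API contrast is real):

```python
import socket

# TCP: connection-oriented byte stream; lost segments are retransmitted,
# which guarantees delivery but can add the delays mentioned above.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

client = socket.create_connection(("127.0.0.1", port))
conn, _ = server.accept()
client.sendall(b"frame-1")
tcp_data = conn.recv(16)           # guaranteed to arrive, intact and in order
print(tcp_data)

# UDP: connectionless datagrams, fire-and-forget. A lost datagram is simply
# skipped, which suits audio/video where late data is useless anyway.
udp_rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_rx.bind(("127.0.0.1", 0))
udp_tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_tx.sendto(b"frame-2", udp_rx.getsockname())
udp_data = udp_rx.recvfrom(16)[0]  # arrives on loopback, but no guarantee in general
print(udp_data)

for s in (server, client, conn, udp_rx, udp_tx):
    s.close()
```

For streaming, the key point is what happens on loss: TCP stalls the whole stream to recover the missing bytes, while UDP keeps the flow moving.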
The Real-time Transport Protocol (RTP) was developed by the Internet Engineering Task Force (IETF) to handle streaming audio and video, and uses IP Multicast. RTP is a derivative of UDP in which a time-stamp and sequence number are added to the packet header. This extra information allows the receiving client to reorder out-of-sequence packets, discard duplicates, and synchronise audio and video after an initial buffering period. The Real-Time Control Protocol (RTCP) is used to control RTP.
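The sequence number and time-stamp sit in RTP's fixed 12-byte header (RFC 3550). A sketch of parsing them; the sample packet bytes below are fabricated for illustration:

```python
import struct

def parse_rtp_header(packet):
    """Unpack the fixed 12-byte RTP header into its named fields."""
    vpxcc, mpt, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": vpxcc >> 6,       # always 2 for current RTP
        "payload_type": mpt & 0x7F,  # identifies the codec in use
        "sequence": seq,             # lets the client reorder / discard duplicates
        "timestamp": timestamp,      # lets the client sync audio and video
        "ssrc": ssrc,                # identifies the sending source
    }

# Fabricated example packet: version 2, payload type 96, seq 4711.
pkt = struct.pack("!BBHII", 0x80, 96, 4711, 160000, 0xDEADBEEF)
print(parse_rtp_header(pkt))
```

Everything after these 12 bytes is the compressed audio/video payload itself, carried over UDP exactly as the text describes.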
With RealServer™, RealNetworks introduced as its primary server protocol the Real Time Streaming Protocol (RTSP), an open, standards-based protocol for multimedia streaming. To use this protocol, URLs that point to media clips on a RealServer™ begin with rtsp://
With Windows Media™ Technologies, Microsoft introduced Microsoft Media™ Server (MMS) as its primary server protocol. The MMS protocol has both a data delivery mechanism, to ensure that packets reach the client, and a control mechanism, to handle client requests such as Stop/Play. MMS includes both Microsoft Media Server protocol/UDP (MMSU) and Microsoft Media Server protocol/TCP (MMST) as subsets, to explicitly request that the stream use UDP or TCP respectively. The Media Stream Broadcast Distribution (MSBD) protocol was used to transfer streams from the Windows Media™ Encoder to the Windows Media™ Server, or between servers. However, Windows Media™ Encoder 7 and later versions no longer support MSBD and use HTTP instead. URLs that point to media clips on a Windows Media™ Server usually begin with mms://
The Hypertext Transfer Protocol (HTTP) is the slowest of the protocols and is the one used by Internet Web Servers. HTTP is transparent to some older firewalls and can bypass security in such cases. Unlike RTSP and MMS, which can serve a stream at a steady bitrate, HTTP simply serves the stream as fast as it can, which is why it is better to have separate web and streaming servers.
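The steady-bitrate versus as-fast-as-possible distinction can be sketched as two send loops. The bitrate and chunk size here are illustrative values, and `transmit` is a hypothetical stand-in for a socket write:

```python
import time

STREAM_KBPS = 300   # illustrative encoded bitrate
CHUNK_BYTES = 1500  # roughly one packet payload

sent = []

def transmit(chunk):
    sent.append(chunk)  # stand-in for writing the chunk to a network socket

def paced_send(chunks):
    """Streaming-server style: sleep between chunks to match the encoded bitrate."""
    interval = (CHUNK_BYTES * 8) / (STREAM_KBPS * 1000)  # seconds per chunk
    for chunk in chunks:
        transmit(chunk)
        time.sleep(interval)

def burst_send(chunks):
    """HTTP style: no pacing, push everything as fast as the link allows."""
    for chunk in chunks:
        transmit(chunk)

chunks = [b"\0" * CHUNK_BYTES] * 3
paced_send(chunks)   # spreads the chunks out at ~0.04 s each
burst_send(chunks)   # returns almost immediately
print(len(sent))
```

A paced sender leaves link capacity free for other traffic, whereas a bursting HTTP server competes for the whole link - one reason to keep web and streaming duties on separate machines.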