My view is this:

Streaming media storage size (in common file-system units such as mebibytes, megabytes, gigabytes, and terabytes) is calculated from the streaming bandwidth and the length of the media, using the following formula (for a single user and file):

storage size (in mebibytes) = length (in seconds) × bit rate (in bit/s) / (8 × 1024 × 1024)

since 1 mebibyte = 8 × 1024 × 1024 bits.
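For anyone who wants to play with the numbers, here is a minimal sketch of that formula in Python (the function name storage_size_mib is my own, not from any standard library):

    def storage_size_mib(length_s, bit_rate_bps):
        # total bits = length * bit rate; divide by 8 for bytes,
        # then by 1024 * 1024 for mebibytes
        return length_s * bit_rate_bps / (8 * 1024 * 1024)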

A real-world example:

One hour of video encoded at 300 kbit/s (typical for broadband video in 2005, usually encoded at a 320×240 pixel window size) works out to:

(3,600 s × 300,000 bit/s) / (8 × 1024 × 1024) gives around 128 MiB of storage.
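Plugging the example numbers into the sketch above confirms the arithmetic (about 128 MiB):

    >>> storage_size_mib(3600, 300000)
    128.74603271484375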

If the file is stored on a server for on-demand streaming, and the stream is viewed by 1,000 people at the same time using a unicast protocol, the requirement is:

300 kbit/s × 1,000 = 300,000 kbit/s = 300 Mbit/s of bandwidth

This is equivalent to around 135 GB served per hour. With a multicast protocol, by contrast, the server sends out only a single stream that is common to all users, so the same audience would consume just 300 kbit/s of serving bandwidth.
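The unicast figures can be checked the same way; this is another rough Python sketch (the names stream_kbps and viewers are again my own choices):

    def unicast_load(stream_kbps, viewers):
        # aggregate serving bandwidth in Mbit/s (one stream per viewer)
        mbps = stream_kbps * viewers / 1000
        # data served per hour in GB (decimal gigabytes, 10^9 bytes)
        gb_per_hour = mbps * 1e6 * 3600 / 8 / 1e9
        return mbps, gb_per_hour

    >>> unicast_load(300, 1000)
    (300.0, 135.0)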

Posted By: DrDublin on November 28th 2009 at 14:26:45

