

An M3U8 file is an extensible playlist file format. It is an m3u playlist containing UTF-8 encoded text. The m3u file format is a de facto standard playlist format suitable for carrying lists of media file URLs. This is the format used as the index file for HTTP Live Streaming.
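
To make that concrete, here is a minimal sketch, written in plain Python rather than with Apple's segmenter tools, of what such an index file contains. The tags (#EXTM3U, #EXT-X-TARGETDURATION, #EXTINF, #EXT-X-ENDLIST) are standard HLS tags; the segment names and output path are illustrative assumptions.

```python
# Minimal sketch: write a simple VOD-style .m3u8 index file for three
# 10-second segments. Segment file names are hypothetical.
segments = ["fileSequence0.ts", "fileSequence1.ts", "fileSequence2.ts"]

lines = [
    "#EXTM3U",                   # required header: marks an extended m3u playlist
    "#EXT-X-VERSION:3",          # protocol version used by the tags below
    "#EXT-X-TARGETDURATION:10",  # upper bound on segment duration, in seconds
    "#EXT-X-MEDIA-SEQUENCE:0",   # sequence number of the first segment listed
]
for name in segments:
    lines.append("#EXTINF:10.0,")  # duration of the next segment
    lines.append(name)             # URL (here, a relative path) of the segment
lines.append("#EXT-X-ENDLIST")     # VOD marker: no further segments will be added

with open("prog_index.m3u8", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```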

A .ts file contains an MPEG-2 Transport Stream. This is a file format that encapsulates a series of encoded media samples, typically audio and video. The file format supports a variety of compression formats, including MP3 audio, AAC audio, H.264 video, and so on. Not all compression formats are currently supported in the Apple HTTP Live Streaming implementation, however. (For a list of currently supported formats, see Media Encoder.) MPEG-2 Transport Streams are containers, and should not be confused with MPEG-2 compression.
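
The container/codec distinction is visible at the byte level: a Transport Stream is a sequence of fixed-size 188-byte packets, each beginning with the sync byte 0x47, regardless of which codecs it carries. The following sketch, a standalone illustration rather than part of any Apple API, checks only that framing.

```python
# Rough sketch: recognize an MPEG-2 Transport Stream by its container framing
# (188-byte packets, each starting with the 0x47 sync byte). This says nothing
# about which audio/video codecs are carried inside.
TS_PACKET_SIZE = 188
TS_SYNC_BYTE = 0x47

def looks_like_transport_stream(path, packets_to_check=5):
    with open(path, "rb") as f:
        data = f.read(TS_PACKET_SIZE * packets_to_check)
    if len(data) < TS_PACKET_SIZE * packets_to_check:
        return False
    # Every packet boundary should begin with the sync byte.
    return all(
        data[i] == TS_SYNC_BYTE
        for i in range(0, len(data), TS_PACKET_SIZE)
    )

# Example (path is illustrative):
# print(looks_like_transport_stream("fileSequence0.ts"))
```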
When choosing a segment duration, the main point to consider is that shorter segments result in more frequent refreshes of the index file, which might create unnecessary network overhead for the client. Longer segments will extend the inherent latency of the broadcast and initial startup time. A duration of 10 seconds of media per file seems to strike a reasonable balance for most broadcast content.

How many files should be listed in the index file during a continuous, ongoing session? The normal recommendation is 3, but the optimum number may be larger. The important point to consider when choosing the optimum number is that the number of files available during a live session constrains the client's behavior when doing play/pause and seeking operations. The more files in the list, the longer the client can be paused without losing its place in the broadcast, the further back in the broadcast a new client begins when joining the stream, and the wider the time range within which the client can seek. The trade-off is that a longer index file adds to network overhead: during live broadcasts, the clients are all refreshing the index file regularly, so it does add up, even though the index file is typically small.

The data rate that a content provider chooses for a stream is most influenced by the target client platform and the expected network topology. The streaming protocol itself places no limitations on the data rates that can be used. The current implementation has been tested using audio-video streams with data rates as low as 64 Kbps and as high as 3 Mbps to iPhone. Audio-only streams at 64 Kbps are recommended as alternates for delivery over slow cellular connections. For recommended data rates, see Preparing Media for Delivery to iOS-Based Devices.

Note: If the data rate exceeds the available bandwidth, there is more latency before startup and the client may have to pause to buffer more data periodically. If a broadcast uses an index file that provides a moving window into the content, the client will eventually fall behind in such cases, causing one or more segments to be dropped. In the case of VOD, no segments are lost, but inadequate bandwidth does cause slower startup and periodic stalling while data buffers.
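
For a live session, the index file typically provides the moving window described above: it lists only the most recent segments and advances a media sequence number as older ones drop off. The sketch below is a hedged illustration of regenerating such a window in Python (it is not Apple's segmenter tooling); the three-file window, 10-second durations, and segment names follow the recommendations above and are otherwise assumptions.

```python
# Illustrative sketch of a sliding-window live index: only the most recent
# `window` segments are listed, and #EXT-X-MEDIA-SEQUENCE advances as old
# segments drop off the front of the list.
def live_index(all_segments, window=3, target_duration=10):
    recent = all_segments[-window:]
    first_sequence = len(all_segments) - len(recent)
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{first_sequence}",
    ]
    for name in recent:
        lines.append(f"#EXTINF:{target_duration:.1f},")
        lines.append(name)
    # No #EXT-X-ENDLIST: the broadcast is ongoing, so clients keep re-fetching
    # this index file to discover newly added segments.
    return "\n".join(lines) + "\n"

# Example: after 5 segments have been produced, only the last 3 are advertised.
produced = [f"fileSequence{i}.ts" for i in range(5)]
print(live_index(produced))
```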

Note: iPad, iPhone 3G, and iPod touch (2nd generation and later) support H.264 Baseline 3.1. If your app runs on older versions of iPhone or iPod touch, however, you should use H.264 Baseline 3.0 for compatibility. If your content is intended solely for iPad, Apple TV, iPhone 4 and later, and Mac OS X computers, you should use Main Level 3.1.
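
As a rough illustration of how these choices can be advertised to clients, the sketch below builds a variant playlist whose CODECS attributes distinguish an H.264 Baseline 3.0 stream (avc1.42001e) from a Main 3.1 stream (avc1.4d001f), alongside the 64 Kbps audio-only alternate recommended earlier. The bandwidth figures and playlist paths are hypothetical assumptions, not Apple's values.

```python
# Sketch: build a variant (master) playlist where each #EXT-X-STREAM-INF entry
# advertises its data rate and an RFC 6381 CODECS string so clients can pick a
# compatible stream. mp4a.40.2 is AAC-LC audio.
variants = [
    # (bandwidth in bits/sec, CODECS attribute, playlist URL)
    (64000,   "mp4a.40.2",             "audio_only/prog_index.m3u8"),  # cellular fallback
    (240000,  "avc1.42001e,mp4a.40.2", "low/prog_index.m3u8"),         # Baseline 3.0
    (1240000, "avc1.4d001f,mp4a.40.2", "high/prog_index.m3u8"),        # Main 3.1
]

lines = ["#EXTM3U"]
for bandwidth, codecs, url in variants:
    lines.append(f'#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},CODECS="{codecs}"')
    lines.append(url)

print("\n".join(lines))
```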
