DASH: Support convert RTMP/WebRTC to MPEG-DASH #299
DASH is becoming more and more popular in some countries.
SRS will support MPEG-DASH during this Chinese New Year holiday.
See DASH on Wikipedia. DASH segments can use MP4 or TS files. DASH players include ExoPlayer for Android, MSE for HTML5 (JS), THEOplayer, Video.js, dash.js, and the DASH-IF (DASH Industry Forum) reference player. The Helix (RealNetworks), NGINX-RTMP, and Wowza servers already support transmuxing RTMP to DASH.
The DASH-IF shows us an example, a DASH HelloWorld: http://dash.edgesuite.net/dash264/TestCases/1a/netflix/exMPD_BIP_TC1.mpd Use the dash.js player to play it.
Most of these demos use the MP4 format, not TS. ISOM (ISO Base Media File Format):
https://github.com/Bilibili/ijkplayer also supports DASH and runs on iOS and Android, and it offers a host of other functionality, such as RTMP seek.
We must fix #738 to support the MP4 muxer first.
Now that the general MP4 format is fine (see #738), we can focus on fMP4. About the DASH MPD:
To set up the playhead at startup:
Use SRS 3+; the config is:
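A minimal sketch along the lines of SRS's bundled conf/dash.conf (directive names and defaults may differ between versions, so check the full.conf of your release):

```
listen              1935;
max_connections     1000;
daemon              off;
srs_log_tank        console;
http_server {
    enabled         on;
    listen          8080;
    dir             ./objs/nginx/html;
}
vhost __defaultVhost__ {
    dash {
        enabled             on;
        dash_fragment       30;
        dash_update_period  150;
        dash_timeshift      300;
        dash_path           ./objs/nginx/html;
        dash_mpd_file       [app]/[stream].mpd;
    }
}
```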
Use the player: http://ossrs.net/dash.js/samples/dash-if-reference-player/ or open it with the stream preloaded: http://ossrs.net/dash.js/samples/dash-if-reference-player/?url=http://localhost:8080/live/livestream.mpd The player reports an error, but it is very hard to trace.
I released this feature as an Experimental feature.
Live example from FFMPEG#7382: https://livesim.dashif.org/livesim/periods_60/continuous_1/testpic_2s/Manifest.mpd
Live stream: https://livesim.dashif.org/livesim/testpic_2s/Manifest.mpd https://livesim.dashif.org/livesim/sts_1577417658/sid_6953a24f/testpic_2s/Manifest.mpd
Use the online MP4Box to parse fMP4/m4s files: http://download.tsi.telecom-paristech.fr/gpac/mp4box.js/filereader.html
https://livesim.dashif.org/livesim/mup_30/testpic_2s/Manifest.mpd https://livesim.dashif.org/livesim/sts_1577438849/sid_24086ba5/mup_30/testpic_2s/Manifest.mpd
The main reason for the DASH failure found earlier in SRS is the data_offset of the trun box, which should only include the length of the moof box plus the mdat header. FFmpeg behaves this way; for example, in a segment generated by FFmpeg:
Its data_offset is 3612 = moof (3604 bytes) + mdat header (8 bytes). After this fix, the stream can be played.
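A minimal sketch of that arithmetic (illustrative only, not SRS code; the box sizes are the ones quoted above):

```python
# data_offset in the trun box, for a single-moof/single-mdat fMP4 segment:
# samples begin right after the 8-byte mdat header, and the offset is
# measured from the start of the moof box.
MDAT_HEADER_SIZE = 8  # 32-bit size field + 'mdat' fourcc

def trun_data_offset(moof_size: int) -> int:
    return moof_size + MDAT_HEADER_SIZE

assert trun_data_offset(3604) == 3612  # matches the FFmpeg-generated segment above
```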
Of course, several DASH bugs have also been fixed, including a crash. After these changes (the sidx box still needs to be added for synchronization), DASH basically meets the requirements for an Experimental feature. d11a7b2
Here is a simple explanation of some key points about DASH, mainly the daunting MPD file with its numerous fields and combinations. One major feature of the DASH protocol is that it maps the time slices of audio and video to UTC time, allowing real-time streaming and the selection of the appropriate segment for playback. The commonly used segment descriptions in a DASH MPD are as follows (irrelevant fields omitted):

1. Using SegmentTemplate without SegmentTimeline

<?xml version="1.0" encoding="utf-8"?>
<MPD ...>
<Period start="PT0S">
<AdaptationSet mimeType="audio/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="audio" bandwidth="48000" codecs="mp4a.40.2">
<SegmentTemplate initialization="$RepresentationID$-init.mp4" media="$RepresentationID$-$Number$.m4s" duration="5000" timescale="1000">
</SegmentTemplate>
</Representation>
</AdaptationSet>
<AdaptationSet mimeType="video/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="video" bandwidth="800000" codecs="avc1.64001e" width="1280" height="720">
<SegmentTemplate initialization="$RepresentationID$-init.mp4" media="$RepresentationID$-$Number$.m4s" duration="5000" timescale="1000">
</SegmentTemplate>
</Representation>
</AdaptationSet>
</Period>
</MPD>

In this kind of MPD, since there is no explicit segment information, the player has to calculate the current segment number from the wall clock; a sketch of this calculation follows below.
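A minimal sketch of that calculation, assuming $Number$ addressing with the fixed duration/timescale from the template above (illustrative only, not SRS or dash.js code; availabilityStartTime and Period@start come from the MPD):

```python
from datetime import datetime, timezone
from typing import Optional

def current_segment_number(availability_start_time: datetime,
                           period_start: float,   # Period@start, in seconds
                           duration: int,         # SegmentTemplate@duration
                           timescale: int,        # SegmentTemplate@timescale
                           start_number: int = 1,
                           now: Optional[datetime] = None) -> int:
    """Map 'now' (UTC) to the newest fully available $Number$."""
    now = now or datetime.now(timezone.utc)
    seg_seconds = duration / timescale
    elapsed = (now - availability_start_time).total_seconds() - period_start
    # A live segment only becomes available once it has been fully produced,
    # hence the -1: the segment covering 'now' is still being written.
    return start_number + int(elapsed // seg_seconds) - 1
```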
2. SegmentTemplate with a SegmentTimeline

2.1 Using "$Number$"

<?xml version="1.0" encoding="utf-8"?>
<MPD ...>
<Period start="PT0S">
<AdaptationSet mimeType="audio/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="audio" bandwidth="48000" codecs="mp4a.40.2">
<SegmentTemplate initialization="$RepresentationID$-init.mp4" media="$RepresentationID$-$Number$.m4s" startNumber="2898" timescale="1000">
<SegmentTimeline>
<S t="100" d="4907" />
<S t="5007" d="5013" />
<S t="10020" d="4992" />
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
<AdaptationSet mimeType="video/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="video" bandwidth="800000" codecs="avc1.64001e" width="1280" height="720">
<SegmentTemplate initialization="$RepresentationID$-init.mp4" media="$RepresentationID$-$Number$.m4s" startNumber="2898" timescale="1000">
<SegmentTimeline>
<S t="0" d="5000" />
<S t="5000" d="5000" />
<S t="10000" d="5000" />
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
</Period>
</MPD>

This MPD is similar to the previous one, except that it includes a SegmentTimeline. According to the standard, the SegmentTemplate should not carry a duration in this case. Each segment's start time and duration are given by an <S> element (t is the start and d the duration, both in timescale units), and the media URL is built from startNumber plus the index of the <S> entry; see the sketch below.
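A minimal sketch of how a player expands the SegmentTimeline into concrete segments, for both the $Number$ template above and the $Time$ template in the next example (illustrative only; the repeat attribute r is ignored because these examples do not use it):

```python
def expand_timeline(entries, start_number=1):
    """entries: (t, d) pairs taken from the <S t=... d=.../> elements.
    Returns (number, t, d) tuples: $Number$ templates use `number`,
    $Time$ templates use `t`."""
    return [(start_number + i, t, d) for i, (t, d) in enumerate(entries)]

# Audio timeline from the MPD above (timescale=1000, startNumber=2898):
audio = expand_timeline([(100, 4907), (5007, 5013), (10020, 4992)], start_number=2898)
# -> [(2898, 100, 4907), (2899, 5007, 5013), (2900, 10020, 4992)]
# $Number$ media template -> audio-2898.m4s, audio-2899.m4s, audio-2900.m4s
# $Time$   media template -> audio-100.m4s,  audio-5007.m4s, audio-10020.m4s
```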
2.2 Using "$Time$"

<?xml version="1.0" encoding="utf-8"?>
<MPD ...>
<Period start="PT0S">
<AdaptationSet mimeType="audio/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="audio" bandwidth="48000" codecs="mp4a.40.2">
<SegmentTemplate initialization="$RepresentationID$-init.mp4" media="$RepresentationID$-$Time$.m4s" timescale="1000">
<SegmentTimeline>
<S t="100" d="4907" />
<S t="5007" d="5013" />
<S t="10020" d="4992" />
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
<AdaptationSet mimeType="video/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="video" bandwidth="800000" codecs="avc1.64001e" width="1280" height="720">
<SegmentTemplate initialization="$RepresentationID$-init.mp4" media="$RepresentationID$-$Time$.m4s" timescale="1000">
<SegmentTimeline>
<S t="0" d="5000" />
<S t="5000" d="5000" />
<S t="10000" d="5000" />
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
</Period>
</MPD>

The segment selection logic is the same as above; just replace $Number$ in the media template with the segment start time $Time$ (the t value of the <S> element).

3. Using SegmentList

<?xml version="1.0" encoding="utf-8"?>
<MPD ...>
<Period start="PT0S">
<AdaptationSet mimeType="audio/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="audio" bandwidth="48000" codecs="mp4a.40.2">
<SegmentList duration="5">
<SegmentURL media="video-1.m4s"/>
<SegmentURL media="video-2.m4s"/>
<SegmentURL media="video-3.m4s"/>
</SegmentList>
</Representation>
</AdaptationSet>
<AdaptationSet mimeType="video/mp4" segmentAlignment="true" startWithSAP="1">
<Representation id="video" bandwidth="800000" codecs="avc1.64001e" width="1280" height="720">
<SegmentList duration="5">
<SegmentURL media="audio-1.m4s"/>
<SegmentURL media="audio-2.m4s"/>
<SegmentURL media="audio-3.m4s"/>
</SegmentList>
</Representation>
</AdaptationSet>
</Period>
</MPD>

This type of MPD is relatively rare in live streaming, so we won't discuss it in detail for now.

Other important fields in the MPD:

availabilityStartTime: The earliest UTC time at which segments become available. Since DASH plays strictly according to UTC time, media time must be mapped to UTC time; the mapping generally follows the logic sketched after this list.

timeShiftBufferDepth: Easy to understand; it is the player's time-shift window, i.e. the player may play segments up to timeShiftBufferDepth in the past.

minimumUpdatePeriod: Also easy to understand; it is the minimum period between MPD refreshes. For MPDs that use SegmentTimeline, it is typically set to the duration of the latest segment, so that the player refreshes the MPD in time to fetch the newest segment.

minBufferTime: Minimum buffer time; the player must have more than this amount of data buffered before it starts playback.

publishTime: The time the MPD was generated at the origin. It must be rewritten every time the MPD is refreshed, and it can be used by the CDN to refresh its cache.
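A minimal sketch of the availabilityStartTime mapping and the timeShiftBufferDepth window mentioned above (illustrative pseudo-logic for the simple live case, not SRS or player code; suggestedPresentationDelay and clock drift are ignored):

```python
from datetime import datetime, timedelta

def segment_available_at(availability_start_time: datetime,
                         period_start: float,   # Period@start, in seconds
                         t: int, d: int,        # <S t=... d=.../> values
                         timescale: int) -> datetime:
    """UTC time at which a live segment becomes requestable: it can only be
    fetched once fully produced, i.e. at its media end time (t + d)."""
    media_end = (t + d) / timescale
    return availability_start_time + timedelta(seconds=period_start + media_end)

def within_timeshift_window(available_at: datetime, now: datetime,
                            time_shift_buffer_depth: float,
                            d: int, timescale: int) -> bool:
    """A segment stays requestable for roughly timeShiftBufferDepth
    (plus its own duration) after it becomes available."""
    expires = available_at + timedelta(seconds=time_shift_buffer_depth + d / timescale)
    return available_at <= now <= expires
```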
Known issues

VLC fails to play the stream.
SRS selects SegmentTemplate with a SegmentTimeline.
Fixed in v5.0.96.
Usage
Make sure your SRS version is SRS 5.0.96+. Build SRS and run:
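Typical commands, as a sketch (assuming the conf/dash.conf bundled with SRS; otherwise enable the dash section shown earlier in your own config):

```bash
cd srs/trunk
./configure && make
./objs/srs -c conf/dash.conf
```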
Publish by FFmpeg:
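For example (the input ./doc/source.flv is just a placeholder; any H.264/AAC source will do with -c copy, otherwise transcode):

```bash
ffmpeg -re -i ./doc/source.flv -c copy -f flv rtmp://localhost/live/livestream
```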
Play by ffplay:

ffplay http://localhost:8080/live/livestream.mpd

You can also open http://localhost:8080/live/livestream.mpd in the dash.js reference player linked above.