This feature requires OBS Studio 25.0 or newer.


General Overview

  • Secure Reliable Transport (SRT for short) is a relatively recent open source streaming protocol, originally developed by Haivision (first demo in 2013) and promoted by the SRT Alliance, which includes many big players in the streaming/video/telco industry. It promises:
    (1) better resilience to network issues (jitter, lost packets, delay, bandwidth fluctuations) through packet recovery mechanisms (retransmission, also known as ARQ or automatic repeat request, and FEC, forward error correction) plus internet bonding, and
    (2) low latency (as low as twice the round-trip time between encoder and ingest server, usually sub-second).

For a video demo, see here:

  • SRT is mostly used in the broadcast and corporate world at the moment. See the NAB 2018 SRT panel with ESPN, NFL, Microsoft speakers talking about their use of SRT:

The NAB 2019 SRT panel can be watched here: https://www.haivision.com/resources/webinars/broadcast-panel. (Hey Haivision, friendly suggestion: it’d be nice to have this video posted on YouTube instead of having to enter personal info! :P )

  • Unlike RTMP, SRT is an open source protocol, and the source code can be found on GitHub. While RTMP development has been abandoned since 2012, SRT development is still very much active.

  • As a protocol, it is content agnostic, although the industry uses it along with an MPEGTS container, which is the de facto standard in the broadcast industry. (By comparison, the RTMP protocol relies on the FLV container.) The MPEGTS container is usually carried over UDP, which makes it fast but unreliable and prone to packet loss. It can also be carried over TCP, which is more reliable but adds latency. SRT builds on top of UDP to transport MPEGTS with the best of both worlds: the reliability of TCP and the low latency of UDP. It also supports encryption and bonding. (A short ffmpeg illustration is given at the end of this overview.)

  • Other competing new protocols are WebRTC, Zixi (closed source) and RIST; the latter two are quite similar to SRT and all go beyond RTMP.

  • For further technical details, we recommend this video by Alex Converse, a Twitch engineer:

There is also a white paper which can be found here
or there. The API is fully documented on GitHub.
A very good source of info is the SRT Cookbook.
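
As a rough illustration of the transport comparison above (the host 192.0.2.10, the port 10000 and the latency value are placeholders, and an FFmpeg build with libsrt is assumed), the same MPEGTS payload can be sent over plain UDP or over SRT simply by changing the output URL:
ffmpeg -re -i input.mp4 -c copy -f mpegts udp://192.0.2.10:10000
ffmpeg -re -i input.mp4 -c copy -f mpegts "srt://192.0.2.10:10000?latency=200000"
The first command offers no recovery of lost packets; the second keeps the same MPEGTS payload but adds SRT's retransmission window (here 200 ms, since the option is in microseconds) and optional encryption on top of UDP.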


Can SRT be used with Twitch or my favorite service?

Short answer: NO (or not yet?)

Services

Long answer: None of the main streaming services support the SRT protocol for ingest. Most still use RTMP (Twitch, YouTube, Facebook…). (Mixer, though, relies on WebRTC through its proprietary FTL protocol, which OBS already supports.) If you use these services exclusively, there is no need to read further.

At this stage of SRT adoption, you’ll have to be technically inclined to use it. If you are able to set up your own streaming server, perhaps redirecting your streams to the main services like Twitch or YouTube, and are interested in achieving low latency with improved network resilience, read on.

The other category of users who could potentially be interested obviously belongs to the professional broadcast industry. This wiki entry can be considered fairly advanced in that it requires access to a server and the ability to set it up.

The configuration of OBS itself ranges from easy to medium in terms of difficulty. The server setup is more challenging since it requires system/network admin knowledge.

Encoders

Live software encoders:

  • FFmpeg,
  • OBS Studio, which relies on the FFmpeg libraries,
  • vMix,
  • srt-live-transmit (a demo app from the libsrt developers; it needs to be compiled from source; a build sketch is given below)
  • Larix Broadcaster and Larix Screencaster can broadcast SRT from Android and iOS. The Larix Talkback feature allows receiving an SRT audio return feed on mobile devices.
  • CameraFi Live (Android only).
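
For reference, building srt-live-transmit is usually just a matter of building libsrt itself, since the app is compiled along with the library. A minimal sketch for a typical Linux box (assuming git, cmake, tclsh and the OpenSSL development headers are installed; see the libsrt README for other platforms):
git clone https://github.com/Haivision/srt.git
cd srt
./configure
make
The resulting srt-live-transmit binary should end up in the build directory.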

Hardware encoders are also available from various vendors.

Servers

The following servers support SRT ingest:

  • Wowza (paid service)
    • Supports SRT ingest and transmuxing/distributing in RTMP/HLS/DASH
  • Nimble Streamer (free, closed source)
    • While Nimble Streamer is nominally free, it is typically used along with a non-free dashboard which is, in all fairness, quite convenient. It can also be used without the dashboard, which just requires modifying a JSON config file.
    • Supports SRT ingest and transmuxing/distributing to RTMP, SRT, NDI, MPEG-DASH, HLS, Low Latency HLS and more.
  • SRT Live Server (free, open source)
    • This option only serves SRT streams and does not transmux to HLS/RTMP/DASH. It is much more rudimentary than the other servers at this stage but it is FOSS and works fine with OBS Studio in our tests.
  • OvenMediaEngine (free, open source)
    • OvenMediaEngine (OME) can receive a video/audio source from an encoder or camera with SRT, WebRTC, RTMP, MPEG-2 TS, or RTSP-Pull as input. Then, OME transmits it using WebRTC, Low Latency MPEG-DASH (LLDASH), MPEG-DASH, and HLS as output.

Additionally, though it is technically not a server, FFmpeg can be used in listener mode to ingest an SRT stream. It won’t be able to serve the stream as a genuine server would, but it can be used to transmux to RTMP and route to nginx-rtmp for instance, which can then handle the ingest to Twitch/YouTube/Facebook/etc.
ex: ffmpeg -i srt://IP:port?mode=listener -c copy -f flv rtmp://IP:1935/app/streamName
In the same way, srt-live-transmit can be used to listen to an SRT (or UDP) stream and relay it to a final SRT URL.
ex: srt-live-transmit srt://IPsrc:port srt://IPdest:port

Players

The following players can be used to watch an SRT stream:

  • VLC (version 3.0+ on Mac/Linux, version 4.0+ on Windows)
  • ffplay (part of ffmpeg tools)
  • OBS Studio Media Source (an obvious use case is to broadcast from any SRT source to an OBS instance).
  • Haivision Play Pro - iPhone and Android.
  • Larix Player for Android, iOS, Android TV and tvOS.

Receive an SRT stream within OBS

This can be useful for two-PC setups (although NDI is probably a more common solution).
In a Media Source, uncheck ‘Local File’.
For ‘Input’, enter the SRT URL. If the stream is received from a server (in listener mode), the SRT connection will be in mode=caller (the default, so the option can be omitted). If, however, the stream is received straight from an encoder in caller mode, add mode=listener to the URL (see screenshot).
For ‘Input Format’, enter mpegts.
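
As a concrete sketch (the port 9999 and the 200 ms latency are arbitrary example values), to have OBS listen for a stream sent directly by an encoder in caller mode, you would enter:
Input: srt://0.0.0.0:9999?mode=listener&latency=200000
Input Format: mpegts
The encoder would then point its output at srt://<IP of the OBS machine>:9999.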

[Figure 1]

VLC usage

Download VLC 3.0 here or VLC 4.0 here (warning: the latter is the development version of VLC).
If you just want to test without disturbing your current VLC install, we advise downloading a portable install (zip).

Go to Media > Open Network Stream:

[Figure 2]

Enter the SRT URL, which has the form srt://IP:port.

[Figure 3]
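
Alternatively, assuming VLC is in your PATH, the same stream can be opened from the command line:
vlc srt://IP:port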

ffplay usage

It is required that ffplay be compiled with libsrt support. To the best of our knowledge, no such binary seems to be widely available, although there are no license constraints.

Just launch the command-line:
ffplay srt://IP:port
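
To check whether a given FFmpeg/ffplay build includes libsrt, you can list the supported protocols and look for srt (the grep is just a convenience filter):
ffmpeg -protocols 2>/dev/null | grep srt
If srt appears in both the input and output lists, the build was compiled with libsrt.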

How to set up OBS Studio

There are two ways of setting up OBS Studio to connect to a server. The first is simpler but offers fewer options at the moment. The second is a bit more difficult to set up but gives more fine-tuning capabilities (muxer options, for instance, are available). Note: FFmpeg may not come with SRT support in older Linux distributions, so check your repository sources to ensure that FFmpeg is built with libsrt support (see the protocol check in the ffplay section above), as there is no easy way of getting OBS Studio to reference a custom build of FFmpeg.

Note that while the discussion focuses on SRT protocol, UDP or TCP can also be used instead.

Option 1: Stream SRT using the Streaming output

Credit: Aaron Boxer, Collabora (SRT Alliance), author of the new SRT output. The output was rewritten from scratch by pkv (pkv@obsproject.com) to solve some bugs.

Works fine with: Wowza, Nimble Streamer, VLC, FFmpeg, SRT Live Server, Makito X decoder. (These are only the platforms we tested at obsproject; most should work fine.)

  1. Go to Settings > Stream
  2. In the Service dropdown, select Custom.
  3. Enter the SRT URL in the form srt://IP:port (OBS Studio will also accept any protocol relying on the MPEGTS container and supported by FFmpeg, such as UDP, TCP, RTP, etc.)
  4. Don’t enter anything for the key. It’s not used.

[Figure 4]

OBS Studio will accept options with the syntax srt://IP:port?option1=value1&option2=value2. The full list of options is the one supported by FFmpeg: http://ffmpeg.org/ffmpeg-protocols.html#srt.

The most important option is latency, expressed in microseconds (μs). It has a default value of 120 ms = 120 000 μs and should be at least 2.5 times the round-trip time between encoder and ingest server (converted to microseconds).
Ex: for a latency of 1 second, set latency=1000000.

Another sometimes-required option is mode, which can be caller, listener or rendezvous. caller opens a client connection; listener starts a server listening for incoming connections; rendezvous uses a bi-directional link where the first peer to initiate the handshake is considered the caller. The default value is caller and usually need not be set, since OBS Studio will normally be in caller mode.

A case where it is useful to set the mode to listener is when sending a stream to VLC. OBS Studio then acts as a server to VLC, which is the client. On a LAN, for instance, set OBS Studio to srt://127.0.0.1:port?mode=listener and point VLC to srt://127.0.0.1:port to establish the connection.
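
Putting the options together (the port and values are only illustrative), a complete URL with listener mode and a one-second latency entered in OBS would look like:
srt://127.0.0.1:10000?mode=listener&latency=1000000
VLC would then simply be pointed at srt://127.0.0.1:10000.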

[Figure 5]

Known issues:

  • With v25, a bug in the MPEGTS muxer makes the stream not fully compliant with the MPEGTS spec. In that case, decoding fails when transmuxing SRT to any protocol/container other than the SRT/MPEGTS combination. This is the case, for instance, with Nimble Streamer and the Makito X decoder. A dump of the MPEGTS stream therefore creates a non-conformant file (this can probably be fixed by remuxing with FFmpeg, as shown below).
    • The bug was fixed in OBS v26 in PR 2665. Use OBS version >= 26.
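
If you still have such a non-conformant dump produced by v25, a plain stream copy through FFmpeg should be enough to rewrite clean MPEGTS headers (file names are placeholders):
ffmpeg -i dump.ts -c copy fixed.ts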

Option 2: Stream SRT with the Custom FFmpeg Record output

This option is a bit more complicated. It relies on the Advanced: Custom FFmpeg Recording output.
Works fine with: Wowza, Nimble Streamer, VLC, FFmpeg, SRT Live Server (these are only the platforms we tested at obsproject; most should work fine). It does not work well with the Makito X decoder from Haivision.

  1. Go to Settings > Output
  2. In the Output mode dropdown, select Advanced.
  3. Click on Recording Tab.
  4. In the Type dropdown, select Custom Output (FFmpeg)
  5. In the FFmpeg Output Type dropdown, select Output to URL
  6. In the File Path or URL box, type the SRT URL: srt://IP:port (options like latency are entered with the syntax srt://IP:port?option1=value1&option2=value2).
    For a list of some of these options and a discussion, see the previous section or refer to FFmpeg documentation : http://ffmpeg.org/ffmpeg-protocols.html#srt.
  7. In Container Format dropdown, select mpegts.
  8. Muxer Settings can be left blank, or you can use the MPEGTS FFmpeg muxer options with the following syntax option1=value1 option2=value2 (the option=value pairs must be separated by a space).
  9. The other settings are self-explanatory. Check the ‘show all codecs’ box to display all codecs available to FFmpeg.

Note that several audio tracks can be selected. They can be identified on the ingest-server side by what is called a PID. By default, the video track has PID 0x100 (=256) and the audio tracks have PIDs 0x101, 0x102, etc. If you need to change the PIDs of your tracks, use the muxer option mpegts_start_pid in step 8.
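
For example (the value is purely illustrative), entering the following in the Muxer Settings box should move the video track to PID 0x120, with the audio tracks following at 0x121, 0x122, and so on:
mpegts_start_pid=0x120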

[Figure 6]

Known issues:
There can be issues with the Makito X decoder (under investigation). Use Option 1 for the Makito X decoder.

Pros/Cons in comparison with Option 1:

Pros:

  • MPEGTS muxer options can be customized (ex: set pid for video and audio tracks).
  • Several audio tracks can be streamed (for instance, track 1: main track, track 2: background music, track 3: commentary etc.) while in option 1 only a single track can be selected. OBS Studio supports up to 6 audio tracks.

Cons:

  • More complex to set up.
  • The stream cannot be recorded at the same time, since the Recording output is used to stream.

Examples of setups

Relay server to Twitch

One can use FFmpeg to easily relay an input SRT stream to a standard RTMP stream compatible with Twitch (or other streaming services like YouTube or Facebook). This is especially interesting if you have a bad, unstable connection between OBS and the service. Note that the server you use should have a fast and reliable connection to benefit from this (which is usually the case). Also note that proxying the stream through a server consumes a lot of bandwidth (count roughly twice the bandwidth of your video stream), which can be expensive with some providers like Amazon, GCS or DigitalOcean (it doesn’t need a fast CPU though: the server only receives, rewraps to RTMP and sends).
This approach has the advantage of being really easy to set up on OBS’s side if you prepare the server for someone else.
For example, the following command can be used on the server:

ffmpeg -i "srt://:1234?mode=listener&transtype=live&latency=3000000&ffs=128000&rcvbuf=100058624" -c copy -f flv rtmp://live-cdg.twitch.tv/app/streamKey

In this command you can change:

  • 1234: the listening port used by the SRT server. You will need to send your stream to this port. It can be anything, provided ffmpeg can bind to it.
  • latency=3000000: the latency in microseconds; here 3000 milliseconds (3 seconds). This is a simple trade-off: the higher it is, the more resilient the link is to dropped packets or latency spikes, but the longer your viewers have to wait to receive your frames; the lower it is, the more sensitive the link is to connection problems, but your viewers receive your content faster.
  • ffs and rcvbuf are buffer-size parameters. They are already set here for a latency of 3000 ms. Basically, when you increase the latency, you need to increase these values as well (because SRT has to keep more content in memory to reassemble the stream). You can find information on how to calculate them here: https://github.com/Haivision/srt/issues/703#issuecomment-495570496
  • live-cdg.twitch.tv: the nearest Twitch ingest server. Change it to the one nearest you, as listed here: Twitch Ingest Information
  • streamKey: your Twitch.tv stream key. You need to replace this with yours so Twitch knows who is streaming. Do not leak it; anyone with this key can stream to your account.

On OBS, you just need to set your stream settings to “Custom” and specify your server with this template:
srt://ipofyourserver:1234 (the port must match the listening port used in the ffmpeg command above)
You can leave the stream key empty; it is not needed.

Run the ffmpeg command shown above: it will wait indefinitely for an input stream and stop automatically when the stream ends. Managing the restart, with systemd or in a Docker container for instance, is left to you (a minimal shell sketch is given below)!
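
As a bare-bones sketch of such a restart wrapper (the stream key is a placeholder, and a proper systemd unit with Restart=always would be cleaner), a simple shell loop does the job:
while true; do
  ffmpeg -i "srt://:1234?mode=listener&transtype=live&latency=3000000&ffs=128000&rcvbuf=100058624" -c copy -f flv rtmp://live-cdg.twitch.tv/app/streamKey
  # relaunch the relay a couple of seconds after it exits
  sleep 2
done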