<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Dolby.io</title>
    <description>The latest articles on Forem by Dolby.io (@dolbyio).</description>
    <link>https://forem.com/dolbyio</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F1990%2F6b4fb10e-7e30-4f9d-b7dc-c3703825a68a.jpg</url>
      <title>Forem: Dolby.io</title>
      <link>https://forem.com/dolbyio</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/dolbyio"/>
    <language>en</language>
    <item>
      <title>Top Reasons for Updating to the New OBS 30.0 Release</title>
      <dc:creator>Jayson DeLancey</dc:creator>
      <pubDate>Tue, 10 Oct 2023 23:30:00 +0000</pubDate>
      <link>https://forem.com/dolbyio/top-reasons-for-updating-to-the-new-obs-300-release-g6m</link>
      <guid>https://forem.com/dolbyio/top-reasons-for-updating-to-the-new-obs-300-release-g6m</guid>
      <description>&lt;p&gt;Open Broadcast Software (OBS) Studio has released 30.0.0 as a public release candidate with some new updates to the long popular streaming and recording tool. You may be wondering if it is worth downloading the latest version right away so I've highlighted a few of the more note-worthy updates that impact my broadcast workflows. Specifically, including support for WHIP opens up a whole range of new opportunities.&lt;/p&gt;

&lt;h2&gt;1 - Support for WHIP Outputs from OBS&lt;/h2&gt;

&lt;p&gt;WebRTC-HTTP Ingestion Protocol (WHIP) is an IETF standard protocol for streaming media ingress that was championed by the Millicast team (now part of Dolby.io). WebRTC has been used for many applications, including real-time communications and server-rendered gaming, where ultra-low latency is prioritized even at the cost of some packet loss. You can find more background and &lt;a href="https://dolby.io/blog/what-is-whip-intro-to-webrtc-streaming-part-1/" rel="noopener noreferrer"&gt;learn more about WHIP&lt;/a&gt; from other posts, but here are some examples of where this has an impact right away:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://docs.dolby.io/streaming-apis/docs/webrtc-whip" rel="noopener noreferrer"&gt;Secure streaming services such as Dolby.io&lt;/a&gt; support WHIP for setting up protected private broadcasts for real-time applications like sports, betting, gaming, competitions, remote production, etc.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Some public commercial streaming services offer support for WHIP to enable streaming to audiences without the delivered audio/video lagging behind text chat and other messaging channels.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The access that WHIP provides is alone enough to justify an upgrade.&lt;/p&gt;
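&lt;p&gt;The WHIP exchange itself is just a single HTTP POST carrying an SDP offer. Here is a minimal sketch in Python of what a client sends; the endpoint URL, token, and SDP below are placeholders, not real Dolby.io values:&lt;/p&gt;

```python
# Minimal sketch of a WHIP publish request (per the IETF WHIP draft).
# The endpoint, token, and SDP offer below are placeholders.

def build_whip_request(endpoint, bearer_token, sdp_offer):
    """Build the method, URL, headers, and body for a WHIP ingest POST."""
    headers = {
        "Content-Type": "application/sdp",          # WHIP bodies are raw SDP
        "Authorization": "Bearer " + bearer_token,  # publish token
    }
    return "POST", endpoint, headers, sdp_offer

method, url, headers, body = build_whip_request(
    "https://example.com/whip/endpoint",        # placeholder ingest URL
    "my-publish-token",                         # placeholder token
    "v=0\r\no=- 0 0 IN IP4 127.0.0.1\r\n...",  # truncated SDP offer
)
# A 201 Created response carries the SDP answer, and its Location header
# names the resource to DELETE when the broadcast ends.
```

&lt;p&gt;Any WHIP-capable service follows this same shape, which is why a single OBS setting can target many different providers.&lt;/p&gt;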

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkytuy8phgerz34wdba63.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkytuy8phgerz34wdba63.png" alt="Use WHIP Setting" width="800" height="234"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's an example using WHIP with the &lt;a href="https://docs.dolby.io/streaming-apis/docs/hosted-viewer" rel="noopener noreferrer"&gt;hosted player&lt;/a&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmvufdshfodrax3o4dhxt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmvufdshfodrax3o4dhxt.png" alt="OBS 30 with WHIP Streaming" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Learn more about why &lt;a href="https://dolby.io/blog/obs-studio-adds-native-webrtc-streaming-with-whip/" rel="noopener noreferrer"&gt;OBS Studio Adding Native WebRTC Streaming with WHIP&lt;/a&gt; is important for interactive streaming experiences.&lt;/p&gt;

&lt;h2&gt;2 - Status Bar Changes in OBS 30.0&lt;/h2&gt;

&lt;p&gt;The status bar has had a design refresh. The information is the same but structured a bit differently, with visual icons to help identify stream health.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwiv8tmym3kzqan70m1fe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwiv8tmym3kzqan70m1fe.png" alt="OBS 30 status bar with new design and icons" width="800" height="21"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By comparison, the status bar from OBS 29:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkscso8vamjt6el0f1vmq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkscso8vamjt6el0f1vmq.png" alt="OBS 29 status bar" width="800" height="20"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You may also notice that encoders and capture screens are now sorted by name to make it easier to find devices in a predictable order. If you often share a screen or an application in your workflows, this can prevent some fumbling around while live.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm7hifweoqj7xh7m4c868.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm7hifweoqj7xh7m4c868.png" alt="Sorting in OBS 30 Selection Lists" width="800" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These changes offer some nice usability improvements.&lt;/p&gt;

&lt;h2&gt;3 - macOS Compatibility Improvements&lt;/h2&gt;

&lt;p&gt;If your primary workstation for running OBS is a Mac, there are some substantial improvements.&lt;/p&gt;

&lt;p&gt;The virtual camera is more widely compatible with other applications. If your primary use case with OBS is to add special effects and filters to regular conference tools, this can give your system some much needed reliability when switching from one application to another.&lt;/p&gt;

&lt;p&gt;There is also an audio capture fix and an option to hide OBS windows from screen captures.&lt;/p&gt;

&lt;h2&gt;4 - Safe Mode for Complex Integrations&lt;/h2&gt;

&lt;p&gt;If you use third-party plugins, custom scripts, etc., then you've probably had to scrap a broadcast or two while debugging a problem. A new "Safe Mode" launch option disables loading those extras so that you can investigate the root cause of the issue.&lt;/p&gt;

&lt;h2&gt;Download and Install OBS 30.0&lt;/h2&gt;

&lt;p&gt;There are lots of other fixes and details called out in the &lt;a href="https://github.com/obsproject/obs-studio/releases" rel="noopener noreferrer"&gt;release notes&lt;/a&gt; that may be valuable for some, but those are a few of the top updates for my workflows.&lt;/p&gt;

&lt;p&gt;You can find the latest release available for download from &lt;a href="https://obsproject.com/download" rel="noopener noreferrer"&gt;obsproject.com&lt;/a&gt;. You can also find preview versions in the &lt;a href="https://github.com/obsproject/obs-studio/releases" rel="noopener noreferrer"&gt;obsproject/obs-studio&lt;/a&gt; GitHub release history.&lt;/p&gt;

&lt;p&gt;A huge thanks and shoutout to all the &lt;a href="https://github.com/obsproject/obs-studio/graphs/contributors" rel="noopener noreferrer"&gt;contributors&lt;/a&gt; who made this release possible. If you have time and/or money, consider contributing to their &lt;a href="https://patreon.com/obsproject" rel="noopener noreferrer"&gt;Patreon&lt;/a&gt; or &lt;a href="https://github.com/obsproject/obs-studio/labels/Good%20first%20issue" rel="noopener noreferrer"&gt;Open Issues&lt;/a&gt; this Hacktober.&lt;/p&gt;

</description>
      <category>obs</category>
      <category>streaming</category>
      <category>twitch</category>
      <category>webrtc</category>
    </item>
    <item>
      <title>Interactive Live Streaming Through Virtual Cameras on Unity</title>
      <dc:creator>Angelik Laboy</dc:creator>
      <pubDate>Fri, 04 Aug 2023 01:32:54 +0000</pubDate>
      <link>https://forem.com/dolbyio/interactive-live-streaming-through-virtual-cameras-on-unity-20ka</link>
      <guid>https://forem.com/dolbyio/interactive-live-streaming-through-virtual-cameras-on-unity-20ka</guid>
      <description>&lt;p&gt;When it comes to the world of streaming, we have become accustomed to its widespread use, with various technologies striving for lower latency and faster integration. Amidst these advancements, one aspect remains to be tackled: interactivity.&lt;/p&gt;

&lt;p&gt;I mean, look at Netflix's &lt;em&gt;Black Mirror: Bandersnatch&lt;/em&gt; or &lt;em&gt;Kaleidoscope&lt;/em&gt;, all part of their endeavor to immerse audiences in the stories, becoming an integral part of the narrative. And if you observe closely, that is where we get the key to interactivity: &lt;strong&gt;the integration&lt;/strong&gt;. We have come so far in entertainment that we desire to belong to the stories in order to live the adventures of our favorite characters. Wouldn't it be cool to be in the kitchen with Carmy and Sydney from The Bear? Or to be one of the fans cheering for AFC Richmond in Ted Lasso?&lt;/p&gt;

&lt;p&gt;I would personally want to do that too, but the challenge so far has been the immersion part. We haven't really been able to dissociate from life to &lt;em&gt;believe&lt;/em&gt; we are actually in those stories. As of today (August 3, 2023), the stage is still being set for this new era of interactive streaming. By now, the XR community is ready to fire back saying their &lt;em&gt;insert X product&lt;/em&gt; is doing &lt;em&gt;it&lt;/em&gt;, and I genuinely believe they have the potential to deliver on this promise.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0waexvy8rrdatpnmoej.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk0waexvy8rrdatpnmoej.jpg" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For example, at the 2021 Annecy International Animation Film Festival, &lt;em&gt;&lt;a href="https://www.oculus.com/experiences/quest/5334662579895130/" rel="noopener noreferrer"&gt;"On The Morning You Wake (To The End Of The World)"&lt;/a&gt;&lt;/em&gt; was shown as an official selection; it recounts the experiences of a few Hawaiians who received an SMS from the state's Emergency Management Agency about a missile threat. Rather than depicting a turbulent situation of unceasing chaos, the documentary gave us a reflective look at the fragility of life. Visually, its pointillism technique reinforced the sense that this experience was shared by not just one but thousands of citizens at the same time.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Ultimately, one gets to feel like a part of the story.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5v5tu7ocgetylyvnztsg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5v5tu7ocgetylyvnztsg.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, this experience is exclusively being offered through Meta's Quest lineup, but what if there was a way to experience another one's perspective on a headset like that? What if there was a way to allow us, the audience, to control how we want to watch the film?&lt;/p&gt;

&lt;h2&gt;Live Stream from Unity&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6vp5rc0am8pnt6g7g197.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6vp5rc0am8pnt6g7g197.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I have talked your ear off about interactivity and being the main protagonist. Now, let's talk about the tech. Working at Dolby, I have had the chance to experiment with our virtual world plugins. As part of Dolby.io, the Unity and Unreal plugins offer the exciting capability to stream content from inside the engines. Whether it's viewing the third-person POV of the main character or streaming from a static virtual camera, the plugin introduces a groundbreaking way to actualize experiences in this space.&lt;/p&gt;

&lt;p&gt;In this post, I aim to demonstrate how you can set up multiple cameras inside the engine and display them all through a single stream using a multi-view configuration.&lt;/p&gt;

&lt;h2&gt;Installing the Unity Plugin&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqq5t71kmnbm8nvan4ulc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqq5t71kmnbm8nvan4ulc.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To start, I will be using the Bitesize Samples from Unity, specifically Client Driven, a multiplayer game where users pick up orbs and deposit them in their designated colors. With a fully actualized world, our focus can now shift to enhancing the cinematics.&lt;/p&gt;

&lt;p&gt;To get started, follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the project through Unity Hub.&lt;/li&gt;
&lt;li&gt;Go to Windows &amp;gt; Package Manager.&lt;/li&gt;
&lt;li&gt;In the Package Manager window, change the view to show only the "In-Project" packages.&lt;/li&gt;
&lt;li&gt;To the left of it, click on the "+" icon, and select the option to Add package from git URL....&lt;/li&gt;
&lt;li&gt;Insert the following URL into the space:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;code&gt;https://github.com/millicast/millicast-unity-sdk.git&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The installed package should read Millicast v1.1.0.&lt;/p&gt;

&lt;h2&gt;Establishing Credentials and Stream Settings&lt;/h2&gt;

&lt;p&gt;Before starting with the stream, we will need to do one more thing. In Assets, create a new folder and title it “Credentials”. Inside it, right-click and navigate to Create &amp;gt; Millicast &amp;gt; Credentials. By completing this step, you'll set up credentials once and associate them with the plugin's components, eliminating the need to rewrite them for each instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fijigg5s5iz60xhz804a3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fijigg5s5iz60xhz804a3.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next step requires connecting your Dolby.io credentials, so let's check that off. Go to Dolby.io and &lt;a href="https://dashboard.dolby.io/signup/" rel="noopener noreferrer"&gt;sign up to receive 50 GB free each month&lt;/a&gt; to get started with a few examples. Once you're inside the Dolby.io Dashboard, ensure that the Streaming tab is selected on the top left. This is where stream tokens are created to enable a broadcast; press the "+ Create" button and name the token label as desired. For Add stream names, click Allow any stream name. This option is selected so that all of the cameras can publish under the same stream token.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgq28xivbtc2jo3qitwzt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgq28xivbtc2jo3qitwzt.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the token made, navigate into its settings and view the API tab to collect your Account ID and Publishing Token.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd5011ler33t4to67y67r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd5011ler33t4to67y67r.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the same way Credentials was created, two other assets can be added: Video Configuration and Audio Configuration. The first lets you control the codec, resolution, framerate, and more, while the audio asset controls the distance at which sound is emitted as well as the volume of the stream. Customize the settings to your choosing!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcg65qclg2t370rgmlre2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcg65qclg2t370rgmlre2.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Publisher: How You Can Stream Out&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0qs65n5ukz8xs14zwawx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0qs65n5ukz8xs14zwawx.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, let's get started! In your Playground, begin by creating an empty GameObject and name it "Production." Within "Production," we will craft four different cameras, each serving a unique purpose. Position these cameras strategically throughout the environment, ensuring they capture crucial action points that the audience would want to witness. Think of these cameras as virtual security cameras, recording the exciting moments at the scoring platforms.&lt;/p&gt;

&lt;p&gt;By incorporating a variety of camera perspectives, you'll create an immersive experience for the audience, drawing them deeper into the virtual world and enhancing their enjoyment of the content.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8i59s6pwail4z2yo7jxl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8i59s6pwail4z2yo7jxl.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With those set, visit the first camera's inspector view and click &lt;strong&gt;Add component&lt;/strong&gt;. Next, search &lt;code&gt;Mc Publisher&lt;/code&gt;; this is a component from Dolby.io that allows the game to be streamed out. The first piece of information we need to give is a &lt;code&gt;Stream Name&lt;/code&gt;. Since the stream token was created as a wildcard, any name can be used. However, if the stream token was specifically named, return to the streaming dashboard to find the token’s API information, where you will see the Stream Name (e.g., “leqgbgh9”). In the Credentials space, drag and drop the asset containing all the necessary information (previously named Credentials). Do the same in the space asking for the Video Config Data, providing the required video configuration asset.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyy52mlvd96pcamia9t32.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyy52mlvd96pcamia9t32.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Check off &lt;code&gt;Publish On Start&lt;/code&gt; so that pressing Start in the engine also starts the stream on Dolby.io. For &lt;code&gt;Stream Type&lt;/code&gt;, there are three options: Video + Audio, Video Only, or Audio Only. As the options read, this is the field to select how you would like your stream to be shown.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmf5hncgpfd4nixg22gu3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmf5hncgpfd4nixg22gu3.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the next two settings, Video Source Type and Video Source Camera, simply select "Camera" and the name of the specific camera where this component is added. This space allows you to control the stream without directly adding the component to the camera itself. For example, if you prefer to manage all four cameras from a single GameObject, that is also a viable option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4llvlf1eutcia5s0gczk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4llvlf1eutcia5s0gczk.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, you will notice that it asks to use the &lt;code&gt;Audio Listener&lt;/code&gt; of the camera. Selecting this option will transmit audio from the camera’s perspective.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Note: Make sure the &lt;strong&gt;Audio Listener component&lt;/strong&gt; is enabled in order to allow any audio transmission.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Lastly, it asks whether to &lt;code&gt;Enable Multi Source&lt;/code&gt;, and we will apply this option. Multisource is the feature that allows a single stream (WebRTC or RTMP) to carry multiple video and audio feeds. Once rendered, these feeds can be switched between, giving the viewer the ability to control how they view and listen to the content. Upon selecting this option, a new field called Source Id will appear. The sourceId identifies the feed being published; when publishing from multiple cameras, it specifies which one a viewer is watching. For simplicity's sake, I am going to name the first camera "camOne".&lt;/p&gt;

&lt;p&gt;Follow this procedure for the next three cameras: keep the Stream Name the same for all, change the Video Source Camera to the current camera with the component, and use the sourceId as the next incrementing label, e.g., "camTwo," "camThree," "camFour."&lt;/p&gt;
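&lt;p&gt;Summarized as data, the four publishers share one stream name and differ only in which camera they capture and their sourceId. A quick sketch in Python using the labels from this walkthrough (the stream name is a placeholder):&lt;/p&gt;

```python
# The four Mc Publisher components share a stream name and differ only
# in which camera they capture and their sourceId label.
STREAM_NAME = "myStream"  # placeholder; any name works with a wildcard token

publishers = [
    {"camera": f"Camera{i}", "stream_name": STREAM_NAME, "source_id": sid}
    for i, sid in enumerate(["camOne", "camTwo", "camThree", "camFour"], start=1)
]

# Every publisher targets the same stream, so the multi-view player
# can switch between the four sources within a single broadcast.
assert all(p["stream_name"] == STREAM_NAME for p in publishers)
```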

&lt;h3&gt;BONUS:&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;This bitsize sample already has a camera following the mainPlayer. If you wish to stream this perspective, follow the same steps for the mainCamera object. The only difference is that you do not need to mark Enable Multi-Source in this case. When no specific sourceId is labeled, and the video is going into the same stream, it will be identified as the main source by default.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;Viewing the Multi-View&lt;/h2&gt;

&lt;p&gt;With the Mc Publishers finished, let us verify our work. On the engine’s centered buttons, press Start to commence the experience. Congrats, your game has started streaming! But where is it being streamed out to?   &lt;/p&gt;

&lt;p&gt;Let’s pay another visit to the token’s API information. On the Playback tab, you will find the hosted player URL, which is the link needed to view the ongoing livestream. In the URL, you will be asked to fill in &lt;code&gt;YourPublishName&lt;/code&gt;, which is the &lt;strong&gt;stream name of your token&lt;/strong&gt;. If you decided to add the secure viewer, it would also ask for your subscribe token information.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;https://viewer.millicast.com?streamId=accountID/McPublisherStreamName&lt;/code&gt;&lt;/p&gt;
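&lt;p&gt;If you script any of your tooling, the hosted player URL can be assembled from your Account ID and stream name. A small Python helper; the account ID and stream name shown are placeholders:&lt;/p&gt;

```python
def viewer_url(account_id, stream_name):
    """Build the hosted player URL for a Dolby.io (Millicast) stream.

    The streamId query parameter is the account ID and stream name
    joined by a slash.
    """
    return "https://viewer.millicast.com?streamId=" + account_id + "/" + stream_name

# Placeholder account ID and stream name, not real credentials.
url = viewer_url("k9Mwad", "myStream")
```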

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftz6b7leu9ndxg13ju84l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftz6b7leu9ndxg13ju84l.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Voilà, your streaming is up and running!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjjadh79wic965k3k0fkr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjjadh79wic965k3k0fkr.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You might notice that it is only showing the mainCamera's view and not the other four cameras you set up. Go to the gear icon on the bottom right and click &lt;strong&gt;Show Multi View&lt;/strong&gt;. This will activate the layout on the web viewer. Now you can click the different cameras on screen to change which feed is featured first. Equally, if all feeds are important, the gear icon also offers the option to &lt;strong&gt;change layout&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flwx1rt4euiqj43yq1x0d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flwx1rt4euiqj43yq1x0d.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Final Thoughts&lt;/h2&gt;

&lt;p&gt;We have just accomplished a stream from inside an engine. What now? Presenting a streaming solution for XR experiences can mean a lot of different things. It is an accessibility bonus for your product: users who can't use a headset still have a chance to view the experience. More importantly, it makes us think about what can be done with the extra dimensions now added to these experiences. Streaming XR experiences can allow audiences to actively participate in virtual worlds, creating a deeper level of engagement and immersion. This interactivity lets users control their actions and decisions within the XR environment, leading to a more personalized and captivating experience. XR streaming invites real-time participation in events, performances, and social gatherings. For example, esports viewers can now get closer to the action by choosing where they want to look. This personalization allows users to shape the storyline, outcomes, and interactions, tailoring the experience to their liking. As a result, users feel more invested in the content and are more likely to return for further engagement. Not to mention, it can foster a sense of community and social interaction even when physically distant.&lt;/p&gt;

&lt;p&gt;All I am saying is that... you should watch out for this field. &lt;/p&gt;

</description>
      <category>webdev</category>
      <category>unity3d</category>
      <category>virtualcamera</category>
      <category>livestream</category>
    </item>
    <item>
      <title>How-to Broadcast a WebRTC stream to Twitch</title>
      <dc:creator>Braden Riggs</dc:creator>
      <pubDate>Tue, 01 Aug 2023 16:03:31 +0000</pubDate>
      <link>https://forem.com/dolbyio/how-to-broadcast-a-webrtc-stream-to-twitch-7fa</link>
      <guid>https://forem.com/dolbyio/how-to-broadcast-a-webrtc-stream-to-twitch-7fa</guid>
      <description>&lt;p&gt;Recently, while exploring &lt;a href="https://docs.dolby.io/streaming-apis/docs/webrtc-whip" rel="noopener noreferrer"&gt;syndicating Dolby.io WebRTC&lt;/a&gt; streams, I learned that &lt;a href="https://www.linkedin.com/posts/sean-dubois_twitch-activity-7053056800861933568-TTPW/" rel="noopener noreferrer"&gt;Twitch has added support for WebRTC Ingest&lt;/a&gt; or &lt;a href="https://datatracker.ietf.org/doc/draft-ietf-wish-whip/" rel="noopener noreferrer"&gt;WHIP&lt;/a&gt; as it is known in the industry.&lt;/p&gt;

&lt;p&gt;WebRTC for streaming is an exciting choice because it can decrease stream latency compared to traditional protocols such as RTMP and HLS. Note that Twitch transmuxes the ingested WebRTC stream into a format the platform supports (HLS), which adds latency back and slows down the feed.&lt;/p&gt;

&lt;p&gt;With that said, WHIP support is a great step for the community, and with OBS now adding support for WebRTC, I had to try it out.&lt;/p&gt;

&lt;p&gt;In this guide, we'll showcase how to stream WebRTC from OBS into Twitch.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up OBS for WebRTC
&lt;/h2&gt;

&lt;p&gt;The core OBS project is working to add WebRTC support; at the moment, however, it is only available as an experimental build. You can try it by downloading the version relevant to your system &lt;a href="https://github.com/obsproject/obs-studio/actions/runs/5227109208" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Once downloaded, extract the project and install it. &lt;/p&gt;

&lt;h2&gt;
  
  
  Streaming WebRTC from OBS to Twitch
&lt;/h2&gt;

&lt;p&gt;With the project installed and launched, navigate to: &lt;br&gt;
&lt;code&gt;Settings -&amp;gt; Stream&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Inside of &lt;code&gt;Stream&lt;/code&gt; select &lt;code&gt;WHIP&lt;/code&gt; as your service:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzziaxvtcx6wnwbpw75am.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzziaxvtcx6wnwbpw75am.png" alt="The WHIP settings in OBS for WebRTC streaming" width="800" height="620"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To start a WebRTC stream to Twitch you need the &lt;code&gt;Server&lt;/code&gt; path and your &lt;code&gt;Stream key&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Twitch WHIP server
&lt;/h3&gt;

&lt;p&gt;The server is (&lt;em&gt;currently&lt;/em&gt;) the same for everyone:&lt;br&gt;
&lt;code&gt;https://g.webrtc.live-video.net:4443/v2/offer&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;Note:&lt;/strong&gt; This server currently only supports H264 and Opus encoded streams.&lt;/em&gt;&lt;/p&gt;
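&lt;p&gt;Under the hood, WHIP signaling is a single HTTP POST that carries an SDP offer, with your stream key sent as a Bearer token. Here is a minimal Python sketch of the request OBS assembles for you (the server URL is the one above; the token and SDP values are placeholders):&lt;/p&gt;

```python
# Sketch only: shows the shape of a WHIP publish request, not a full
# WebRTC client. Token and SDP values are placeholders.
def build_whip_request(server, bearer_token, sdp_offer):
    """Assemble the single HTTP POST that WHIP uses for signaling."""
    headers = {
        "Authorization": f"Bearer {bearer_token}",
        "Content-Type": "application/sdp",
    }
    return server, headers, sdp_offer

url, headers, body = build_whip_request(
    "https://g.webrtc.live-video.net:4443/v2/offer",
    "YOUR_TWITCH_STREAM_KEY",  # placeholder for your real stream key
    "v=0 ...",                 # a real SDP offer comes from your WebRTC stack
)
print(headers["Content-Type"])  # application/sdp
```

&lt;p&gt;The response body to this POST is the server's SDP answer, after which media flows over WebRTC; OBS performs this exchange for you when you click &lt;code&gt;Start Stream&lt;/code&gt;.&lt;/p&gt;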

&lt;h3&gt;
  
  
  Getting Your Twitch Stream Key
&lt;/h3&gt;

&lt;p&gt;Your Twitch Stream Key can be found on your &lt;a href="https://dashboard.twitch.tv/" rel="noopener noreferrer"&gt;dashboard&lt;/a&gt; once you've logged in, under:&lt;br&gt;
&lt;code&gt;settings -&amp;gt; stream&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F22v9s5k3vdu0djamhb7p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F22v9s5k3vdu0djamhb7p.png" alt="Your Twitch API Stream Key on the Dashboard" width="800" height="108"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Copy the &lt;code&gt;Server&lt;/code&gt; URL into the &lt;code&gt;Server&lt;/code&gt; input and your &lt;code&gt;Stream Key&lt;/code&gt; into the &lt;code&gt;Bearer Token&lt;/code&gt; input within OBS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F37isoynu2aofbef98eht.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F37isoynu2aofbef98eht.png" alt="Twitch credentials added to OBS for WHIP streaming" width="800" height="613"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Apply&lt;/code&gt;, set up OBS as usual, and click &lt;code&gt;Start Stream&lt;/code&gt; to begin your WebRTC broadcast to Twitch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fugj1ya07nxf2dgi8qg5h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fugj1ya07nxf2dgi8qg5h.png" alt="Braden Riggs broadcasting a WebRTC stream from OBS to Twitch" width="800" height="296"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Learn More
&lt;/h3&gt;

&lt;p&gt;Broadcasting a WebRTC stream to Twitch is a great feature, as it allows people to easily &lt;a href="https://docs.dolby.io/streaming-apis/docs/syndication" rel="noopener noreferrer"&gt;syndicate their WebRTC streams&lt;/a&gt; to a popular platform. Because Twitch transmuxes the WebRTC stream, some delay is added, so if you're looking for an end-to-end white-label real-time streaming solution, check out &lt;a href="https://dolby.io/products/real-time-streaming/" rel="noopener noreferrer"&gt;Dolby.io Real-time Streaming&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;A special shout out to &lt;a href="https://www.linkedin.com/in/sean-dubois/" rel="noopener noreferrer"&gt;Sean DuBois&lt;/a&gt; for his work on both the OBS project and on Twitch's WHIP support.&lt;/p&gt;

</description>
      <category>webrtc</category>
      <category>twitch</category>
      <category>webdev</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>What is WHEP? – Intro to WebRTC Streaming Part 2</title>
      <dc:creator>Griffin</dc:creator>
      <pubDate>Mon, 01 May 2023 18:51:30 +0000</pubDate>
      <link>https://forem.com/dolbyio/what-is-whep-intro-to-webrtc-streaming-part-2-3d99</link>
      <guid>https://forem.com/dolbyio/what-is-whep-intro-to-webrtc-streaming-part-2-3d99</guid>
      <description>&lt;p&gt;In &lt;a href="https://dolby.io/blog/what-is-whip-intro-to-webrtc-streaming-part-1/" rel="noopener noreferrer"&gt;the previous article&lt;/a&gt;, we discussed WebRTC and the new standard developed to help us ingest data with it, known as &lt;a href="https://datatracker.ietf.org/doc/draft-ietf-wish-whip/" rel="noopener noreferrer"&gt;WHIP&lt;/a&gt;. However, for data that is ingested, that same data will likely need to be egressed, or distributed at some point. Bring in WebRTC-HTTP egress protocol, or WHEP. Abstractly, the ingestion is the part that covers the uploading of data to a server, and the egress handles the downloading to an end user. The benefits we gained from WHIP, such as the low latency and end-to-end encryption apply here as well: WHEP enables WebRTC communication on the other end of the content delivery pipeline; WHEP assists with serving content to the viewer.&lt;/p&gt;

&lt;p&gt;In this post, we will take a look at WHEP, an IETF protocol developed to let us use WebRTC to egress content to other destinations, modernizing content delivery over the web compared with previous standards.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why is WHEP useful?
&lt;/h2&gt;

&lt;p&gt;As mentioned above, WHIP only solves half of the equation when working with WebRTC-based content delivery. While you could read &lt;a href="https://datatracker.ietf.org/doc/draft-murillo-whep/" rel="noopener noreferrer"&gt;the official IETF documentation&lt;/a&gt;, we will summarize it more simply here. WHEP aims to solve the distribution side of WebRTC-based content. See the diagram below for a visual of how it all fits together, using the Dolby.io Real-time Streaming APIs as an example:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhcu5q9iu055e67hks20w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhcu5q9iu055e67hks20w.png" alt="WHIP/WHEP Workflow"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The benefit of having WHEP supplement the egress side of broadcast WebRTC infrastructure is similar to the benefit of WHIP: standardization. The same way WHIP lets broadcasters focus on their infrastructure and its scaling without worrying about delivery logistics, WHEP lets distributors focus on the end-user experience, since they know exactly how the data will be received and handled. The end goal is to save time and resources for all parties through standardization.&lt;/p&gt;

&lt;p&gt;WHIP and WHEP do for real-time video what RTMP did for Flash video or what SRT does for transport streams. They standardize the way (the protocol) that media servers speak to each other, like a shared language, so that any WHIP encoder can talk to any WHIP server and any WHEP service can talk to any WHEP player, without any other setup. Using a WHIP/WHEP URL should simply work no matter which environment is being used.&lt;/p&gt;

&lt;p&gt;There are many situations where a standard protocol for streaming media consumption over WebRTC would be helpful. Some options or examples include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Interoperability between WebRTC services, media servers, publishers, and players&lt;/li&gt;
&lt;li&gt;Playing WebRTC streams on TVs and other smart devices that do not support custom JavaScript scripts&lt;/li&gt;
&lt;li&gt;Creating modular, reusable software for media players&lt;/li&gt;
&lt;li&gt;Integrating with &lt;a href="https://dashif.org/webRTC/report.html#54-example-client-architecture" rel="noopener noreferrer"&gt;DASH&lt;/a&gt;, a current popular standard for adaptive bitrate streaming&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Where WHEP differs from being just “the WHIP spec but in reverse” is in the specifics of the protocol. For the most part it behaves like WHIP, using HTTP requests with Bearer Tokens for authentication, but it is more flexible with &lt;a href="https://en.wikipedia.org/wiki/Session_Description_Protocol" rel="noopener noreferrer"&gt;SDP communication&lt;/a&gt;. WHEP allows an SDP offer to be delivered immediately in the same HTTP request, or lets the client send a POST request with the intent to receive an offer back. This gives more flexibility depending on the use case and environment; you can learn more in the IETF document linked above. RTSP, an older industry standard, does not support this model, for example.&lt;/p&gt;
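&lt;p&gt;To make the two SDP flows concrete, here is a minimal Python sketch of the request a WHEP player would assemble (the endpoint and token values are placeholders, not a real service):&lt;/p&gt;

```python
# Sketch only: models WHEP's two signaling flows, not a full player.
def whep_request(endpoint, token, sdp_offer=None):
    """Return the POST a WHEP player would send.

    Flow 1: the player supplies its own SDP offer in the request body.
    Flow 2: the player sends an empty POST, signaling intent to
    receive an offer from the server instead.
    """
    headers = {"Authorization": f"Bearer {token}"}
    if sdp_offer is not None:
        headers["Content-Type"] = "application/sdp"
        return ("POST", endpoint, headers, sdp_offer)
    return ("POST", endpoint, headers, "")

# Flow 1: offer delivered immediately in the same HTTP request
offer_flow = whep_request("https://example.com/whep", "TOKEN", "v=0 ...")
# Flow 2: empty POST, expecting the server to respond with an offer
server_offer_flow = whep_request("https://example.com/whep", "TOKEN")
```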

&lt;h2&gt;
  
  
  Dolby.io + WHEP
&lt;/h2&gt;

&lt;p&gt;Dolby.io is a leader in the definition and research of WHEP, which is an open standard. As with WHIP, our researchers helped develop the standard, our &lt;a href="https://docs.dolby.io/streaming-apis/reference/whep_whepsubscribe" rel="noopener noreferrer"&gt;Streaming Platform&lt;/a&gt; supports it, and we are working directly with software and hardware partners to integrate WHEP into their ecosystems. To learn more, see this Kranky Geek recording from Dolby.io Senior Director of Engineering Sergio Garcia Murillo, the lead researcher developing WHIP and WHEP:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/rIQVVJOjR0U" rel="noopener noreferrer"&gt;https://youtu.be/rIQVVJOjR0U&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We believe WHEP is the future of WebRTC egress, and we want to support the community and projects around it. WHEP is only useful if it gains wide adoption. We encourage you to try out WHEP for your next streaming project and let us know your experience. One way to do this is with our &lt;a href="https://github.com/dolbyio-samples/streaming-WHIP-WHEP-node-sample" rel="noopener noreferrer"&gt;sample app using Node&lt;/a&gt;, our &lt;a href="https://github.com/millicast/videojs-plugin-millicast-whep" rel="noopener noreferrer"&gt;sample using Video.js&lt;/a&gt;, or using another community implementation such as &lt;a href="https://www.meetecho.com/blog/whip-whep/" rel="noopener noreferrer"&gt;this one by Lorenzo Miniero&lt;/a&gt;. We’d love to hear your thoughts on our &lt;a href="https://twitter.com/dolbyio" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt; or &lt;a href="https://www.linkedin.com/company/dolbyio/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>webrtc</category>
      <category>learning</category>
    </item>
    <item>
      <title>What is WHIP? Intro to WebRTC Streaming Part 1</title>
      <dc:creator>Griffin</dc:creator>
      <pubDate>Thu, 20 Apr 2023 18:03:23 +0000</pubDate>
      <link>https://forem.com/dolbyio/what-is-whip-intro-to-webrtc-streaming-part-1-4f6d</link>
      <guid>https://forem.com/dolbyio/what-is-whip-intro-to-webrtc-streaming-part-1-4f6d</guid>
      <description>&lt;p&gt;When considering which tool to use for your real-time streaming platform, WebRTC is one of the hot concepts brought into the forefront. While WebRTC has been around since 2011 and has since been successful at being used in many scenarios, optimizing WebRTC for live generated content, such as in the broadcasting industry, as opposed to pre-existing files is where things get more complex. WHIP and WHEP are two new standards designed to assist in ingesting and egressing this media into WebRTC instead of having to rely on using older standards like RTMP to do that.&lt;/p&gt;

&lt;p&gt;In this post, we will focus on WHIP, the WebRTC-HTTP Ingestion Protocol, an IETF protocol developed to let us use WebRTC to ingest content into our platforms in place of those older protocols.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why WHIP?
&lt;/h2&gt;

&lt;p&gt;For those of you who are overwhelmed by &lt;a href="https://datatracker.ietf.org/doc/draft-ietf-wish-whip/"&gt;the official IETF document&lt;/a&gt;, WHIP (developed in the IETF WISH working group) is an open standard that you can use right now for your WebRTC-based ingestion. You can use it today with open-source software such as &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-whip-with-gstreamer"&gt;GStreamer&lt;/a&gt; or &lt;a href="https://github.com/CoSMoSoftware/OBS-studio-webrtc"&gt;OBS (fork)&lt;/a&gt; as a way to publish your content with WebRTC.&lt;/p&gt;

&lt;p&gt;A benefit of using WebRTC-based content is its extremely low latency and its security, with end-to-end encryption. However, early versions of WebRTC-based streaming were associated with poor quality and limited viewer counts. WHIP solves this by removing the translation layers previously needed to use WebRTC, which caused many of those flaws, giving us the benefits of WebRTC without the downsides. WHIP provides a standard signaling protocol for WebRTC that makes it easy to support and integrate into software and hardware.&lt;/p&gt;

&lt;p&gt;WHIP builds on established standards: HTTP POST requests for the SDP offer/answer exchange, HTTP redirects for load balancing, and authentication and authorization via the HTTP Authorization header with Bearer Tokens.&lt;/p&gt;

&lt;p&gt;Think of it like a rail network. Without any signalers, trains behave erratically: too many trains crowd one track while other tracks sit unused, causing slowdowns and risking collisions. With a signaler, trains are directed in an orderly way, keeping the system moving quickly and efficiently. WHIP acts as this signaler, handling tasks like creating or deleting endpoints as needed and performing operations like &lt;a href="https://webrtc.github.io/samples/src/content/peerconnection/trickle-ice/"&gt;Trickle ICE&lt;/a&gt;.&lt;/p&gt;
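&lt;p&gt;Concretely, the signaling that WHIP standardizes boils down to a handful of HTTP calls. A hedged Python sketch of the session lifecycle (the URLs are illustrative; per the IETF draft, a POST creates the session, PATCH requests trickle ICE candidates, and DELETE tears it down):&lt;/p&gt;

```python
# Sketch only: the HTTP verbs a WHIP client uses over one session.
def whip_session_plan(endpoint, resource_url):
    """List the calls in order; resource_url comes from the Location
    header returned by the initial POST."""
    return [
        ("POST", endpoint),        # send the SDP offer, receive the answer
        ("PATCH", resource_url),   # trickle additional ICE candidates
        ("DELETE", resource_url),  # end the broadcast, freeing the endpoint
    ]

plan = whip_session_plan(
    "https://example.com/whip",         # illustrative WHIP endpoint
    "https://example.com/whip/abc123",  # illustrative session resource
)
```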

&lt;h2&gt;
  
  
  How Does Dolby.io Fit In?
&lt;/h2&gt;

&lt;p&gt;As mentioned before, WHIP is an open standard. Dolby.io supports WHIP not only by providing integrations but also by leading the definition and research of the standard. Our researchers helped develop it, our engineers have implemented it in our &lt;a href="https://docs.dolby.io/streaming-apis/docs"&gt;Streaming Platform&lt;/a&gt;, and we have worked directly with software and hardware partners such as &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-whip-with-flowcaster"&gt;FlowCaster&lt;/a&gt; and &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-osprey-talon-whip-hardware-encoder"&gt;Osprey&lt;/a&gt; to integrate it into their platforms for both software and hardware encoding.&lt;/p&gt;

&lt;p&gt;We believe WHIP is the future of WebRTC ingestion, and we want to support the development and community around it; after all, a standard is nothing without wide adoption. We encourage you to try out WHIP today with one of the previously mentioned integrations for your next streaming project and let us know your experience. We’d love to hear your thoughts on our &lt;a href="https://twitter.com/dolbyio"&gt;Twitter&lt;/a&gt; or &lt;a href="https://www.linkedin.com/company/dolbyio/"&gt;LinkedIn&lt;/a&gt;. Or try out our &lt;a href="https://github.com/dolbyio-samples/streaming-WHIP-WHEP-node-sample"&gt;sample app using Node&lt;/a&gt; and leave some feedback on GitHub.&lt;/p&gt;

&lt;p&gt;Stay tuned for Part 2 where we will talk about the other end of the process, WHEP, or WebRTC-HTTP egress protocol, and see how WebRTC will define the future of all streaming and broadcasting communications.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>webrtc</category>
      <category>learning</category>
    </item>
    <item>
      <title>A Low-Latency Live Stream React App</title>
      <dc:creator>Braden Riggs</dc:creator>
      <pubDate>Mon, 03 Apr 2023 18:36:01 +0000</pubDate>
      <link>https://forem.com/dolbyio/a-low-latency-live-stream-react-app-53pj</link>
      <guid>https://forem.com/dolbyio/a-low-latency-live-stream-react-app-53pj</guid>
      <description>&lt;p&gt;&lt;a href="https://dolby.io/blog/a-low-latency-live-stream-react-app/" rel="noopener noreferrer"&gt;Original Article Published Here&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When building a streaming app or platform, it is important to consider how the end user experiences and engages with the content being streamed. If your users need to engage with the content creator, the delay between capture and consumption should be minimal. To achieve this, many developers rely on WebRTC, a real-time media transport technology that boasts exceptionally low delay for video and audio. By leveraging WebRTC, developers can quickly build a low-delay immersive experience, leaving plenty of time to make the UI look outstanding using front-end libraries such as ReactJS.&lt;/p&gt;

&lt;p&gt;In this guide, we're going to showcase a WebRTC ReactJS streaming app powered by &lt;a href="https://dolby.io/products/real-time-streaming/" rel="noopener noreferrer"&gt;Dolby.io Streaming&lt;/a&gt; and NodeJS. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxeu6637sxbkjwb7rvkvp.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxeu6637sxbkjwb7rvkvp.jpg" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The WebRTC React Example Code
&lt;/h2&gt;

&lt;p&gt;The WebRTC React Streaming example app can be found on the &lt;a href="https://github.com/dolbyio-samples/rts-app-react-publisher-viewer" rel="noopener noreferrer"&gt;dolbyio-samples GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;To set up the project you need four things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; A cloned &lt;a href="https://github.com/dolbyio-samples/rts-app-react-publisher-viewer" rel="noopener noreferrer"&gt;copy of the sample app&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt; &lt;a href="https://nodejs.org/en" rel="noopener noreferrer"&gt;Node v16 or greater&lt;/a&gt; installed.&lt;/li&gt;
&lt;li&gt; &lt;a href="https://yarnpkg.com/" rel="noopener noreferrer"&gt;Yarn package&lt;/a&gt; manager v1.22.19 or greater installed.&lt;/li&gt;
&lt;li&gt; &lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;A Dolby.io account&lt;/a&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Once you've cloned the repo and set up Node and Yarn, navigate to the main directory via the terminal and run the following command to install all dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;yarn
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;While your project is installing we can briefly discuss how to set up your Dolby.io account. Once you've &lt;a href="https://dashboard.dolby.io/signup/" rel="noopener noreferrer"&gt;created an account&lt;/a&gt; you'll be dropped off on the &lt;a href="https://streaming.dolby.io/#/tokens" rel="noopener noreferrer"&gt;Dolby.io Dashboard&lt;/a&gt; where you can create and manage tokens required for leveraging Dolby.io Streaming servers.&lt;/p&gt;

&lt;p&gt;Click the purple and white &lt;em&gt;+ Create&lt;/em&gt; button to create a new token. Give the token a label and your stream a unique name, then switch to the Advanced tab to enable "Multisource" as shown in the images below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxog78o77lf8rtcdm8ba8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxog78o77lf8rtcdm8ba8.png" width="330" height="619"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ogpgk84b9bjmy2h3ise.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ogpgk84b9bjmy2h3ise.png" width="333" height="619"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enabling &lt;a href="https://docs.dolby.io/streaming-apis/docs/multisource-streams" rel="noopener noreferrer"&gt;Multisource&lt;/a&gt; allows you to leverage Dolby.io Streaming to capture and deliver multiple low-delay streams at once. With the token created, click on it and gather your &lt;em&gt;stream name&lt;/em&gt;, &lt;em&gt;stream account ID&lt;/em&gt;, and &lt;em&gt;stream publishing token&lt;/em&gt; as shown in the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F48vlmkti8dq50tshwxor.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F48vlmkti8dq50tshwxor.png" width="800" height="192"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we have all the credentials required to connect to the Dolby.io servers, let's update the project credentials. To do this, edit the &lt;code&gt;.env.example&lt;/code&gt; files found inside &lt;code&gt;apps/publisher/&lt;/code&gt; and &lt;code&gt;apps/viewer/&lt;/code&gt;, renaming each file to &lt;code&gt;.env&lt;/code&gt; and populating it with the respective credentials.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fprnq387yad19wogn2k45.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fprnq387yad19wogn2k45.png" width="800" height="210"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Additionally, inside &lt;code&gt;apps/publisher/.env.example&lt;/code&gt; there is a parameter to adjust the viewer URL. For local testing this can be set to a &lt;a href="https://www.hostinger.com/tutorials/what-is-localhost" rel="noopener noreferrer"&gt;localhost URL&lt;/a&gt;; in production, however, it should be a web-accessible endpoint.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;VITE_RTS_VIEWER_BASE_URL=http://localhost:7070/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With all the credentials set up, we can now run the React streaming app. The app is split into two roles: a publisher and a viewer. The publisher app, which is what a content creator would use, serves content to the end users who participate via the viewer app.&lt;/p&gt;

&lt;p&gt;To start the publisher app experience:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;yarn nx serve publisher
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbj65ui56x47s4yoj0ug.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbj65ui56x47s4yoj0ug.jpg" width="800" height="430"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To start the viewer app experience:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;yarn nx serve viewer
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe9bygcdyt8a12zvoo9gi.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe9bygcdyt8a12zvoo9gi.jpg" width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With both the viewer and the publisher running we now have a live streaming app with Node.js and React powered by Dolby.io WebRTC Streaming. This experience can be &lt;a href="https://www.netlify.com/with/react/" rel="noopener noreferrer"&gt;deployed on a cloud service such as Netlify&lt;/a&gt; to share publicly, just remember to add your branding and styling.&lt;/p&gt;

&lt;h3&gt;
  
  
  Building your own React WebRTC streaming app
&lt;/h3&gt;

&lt;p&gt;With your Dolby.io account already created, building your own bespoke viewer and publisher experience is easy. Dolby.io Streaming has a &lt;a href="https://docs.dolby.io/streaming-apis/docs/rn" rel="noopener noreferrer"&gt;React Native SDK&lt;/a&gt; allowing developers to quickly and easily build a streaming solution. If you are interested in learning more about Dolby.io Streaming check out some of our other blogs including building a &lt;a href="https://dolby.io/blog/building-a-webrtc-live-stream-multiviewer-app/" rel="noopener noreferrer"&gt;Multiview web app&lt;/a&gt; or about our &lt;a href="https://www.youtube.com/watch?v=jUP4vyzbu5Y" rel="noopener noreferrer"&gt;Dolby.io Streaming OBS integration&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Feedback or Questions? Reach out to the team on &lt;a href="https://twitter.com/DolbyIO?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;, &lt;a href="https://www.linkedin.com/company/dolbyio/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, or via our &lt;a href="https://www.millicast.com/contactus/" rel="noopener noreferrer"&gt;support desk&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>react</category>
      <category>webrtc</category>
      <category>javascript</category>
    </item>
    <item>
<title>5 Different Ways to Broadcast SRT Streams</title>
      <dc:creator>Braden Riggs</dc:creator>
      <pubDate>Mon, 09 Jan 2023 19:16:36 +0000</pubDate>
      <link>https://forem.com/dolbyio/4-different-ways-to-broadcast-srt-streams-21jj</link>
      <guid>https://forem.com/dolbyio/4-different-ways-to-broadcast-srt-streams-21jj</guid>
      <description>&lt;p&gt;&lt;a href="https://dolby.io/blog/broadcasting-srt-streams-with-dolby-io/" rel="noopener noreferrer"&gt;Originally published here&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;SRT, or Secure Reliable Transport, is a streaming protocol that provides enhanced security and reliability for video streaming. SRT is becoming increasingly popular among broadcasters and streamers, including industry stalwarts such as ESPN, because of its ability to deliver high-quality content over challenging network conditions and to &lt;a href="https://dolby.io/solutions/remote-production-remi/" rel="noopener noreferrer"&gt;make contribution and stream ingestion easy&lt;/a&gt;. SRT streams provide improved security, low latency, and flexibility, and the protocol is supported by a global community of developers contributing to the &lt;a href="https://github.com/Haivision/srt" rel="noopener noreferrer"&gt;open-source project&lt;/a&gt;. Because of the power of SRT streams, &lt;a href="https://dolby.io/products/real-time-streaming/" rel="noopener noreferrer"&gt;Dolby.io Real-Time Streaming&lt;/a&gt; has launched support with an &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-srt" rel="noopener noreferrer"&gt;SRT open beta program&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;In this guide, we'll cover a few different ways you can start broadcasting SRT streams with Dolby.io, including OBS, vMix, and more:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dolby.io/blog/broadcasting-srt-streams-with-dolby-io/#h-streaming-srt-with-obs" rel="noopener noreferrer"&gt;Streaming SRT with OBS&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dolby.io/blog/broadcasting-srt-streams-with-dolby-io/#streaming-srt-vmix" rel="noopener noreferrer"&gt;Streaming SRT with vMix&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dolby.io/blog/broadcasting-srt-streams-with-dolby-io/#srt-iphone" rel="noopener noreferrer"&gt;Streaming SRT with your iPhone&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dolby.io/blog/collaborative-post-production-with-avid-media-composer" rel="noopener noreferrer"&gt;Streaming SRT with Avid Media Composer&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dolby.io/blog/broadcasting-srt-streams-with-dolby-io/#srt-osprey" rel="noopener noreferrer"&gt;Streaming SRT Directly from an Osprey Talon Encoder &lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Streaming SRT with OBS
&lt;/h2&gt;

&lt;p&gt;Readers familiar with the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; platform might know about &lt;a href="https://dolby.io/blog/using-webrtc-in-obs-for-remote-live-production/" rel="noopener noreferrer"&gt;our custom forked version of OBS&lt;/a&gt; designed to stream WebRTC natively. Although you can use our WebRTC-enabled OBS fork, you can also publish SRT streams to the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; servers from the original OBS project. To do this you need an &lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;active Dolby.io account, which you can create for free&lt;/a&gt;, and the &lt;a href="https://obsproject.com/" rel="noopener noreferrer"&gt;latest version of OBS installed on your system&lt;/a&gt;. To start publishing SRT streams with OBS, follow the steps below:&lt;/p&gt;

&lt;p&gt;1. &lt;a href="https://dashboard.dolby.io/signin" rel="noopener noreferrer"&gt;Login&lt;/a&gt; or &lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;create a Dolby.io account&lt;/a&gt; and &lt;a href="https://obsproject.com/" rel="noopener noreferrer"&gt;download OBS&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;2. Navigate to your Dolby.io streaming dashboard and create a new token. You can leave all the token settings at their defaults.&lt;/p&gt;

&lt;p&gt;3. Open the API tab on your newly created token dashboard and navigate to the bottom where you'll see the &lt;code&gt;SRT publish path&lt;/code&gt;, the &lt;code&gt;SRT stream ID&lt;/code&gt;, and the &lt;code&gt;SRT publish URL&lt;/code&gt;. Copy the &lt;code&gt;SRT publish URL&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fsrtbox.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fsrtbox.png" alt="Pictured is a screenshot of Dolby.io Streaming Token API tab. Highlighted on screen in a red box is the SRT publish URL used in OBS." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Dolby.io Streaming Token API tab. Highlighted box indicates the SRT publish URL used in OBS.&lt;/p&gt;

&lt;p&gt;4. Open OBS and navigate to settings, then the &lt;code&gt;Stream&lt;/code&gt; tab.&lt;/p&gt;

&lt;p&gt;5. Inside of the &lt;code&gt;Stream&lt;/code&gt; tab, set &lt;code&gt;Service&lt;/code&gt; to &lt;code&gt;Custom&lt;/code&gt; and &lt;code&gt;Server&lt;/code&gt; to the &lt;code&gt;SRT publish URL&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fobssrt.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fobssrt.jpg" alt="Pictured is a screenshot of the black and grey OBS stream settings page. On screen the Service is set to " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;OBS stream settings page. Remember to set Service to "Custom" and Server to "Your SRT Publish URL".&lt;/p&gt;

&lt;p&gt;6. Apply the changes and exit settings. You are now all set up to stream with OBS. When publishing, your SRT stream will be delivered to the Dolby.io Streaming Viewer, which can be found at the Hosted Player Path.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fhosted-player.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fhosted-player.png" alt="Pictured is a screenshot of the Doby.io Streaming Token API tab, with hosted player path highlighted in a red box. " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Dolby.io Streaming Token API tab, with hosted player path highlighted. Opening this path in a browser will launch the stream.&lt;/p&gt;

&lt;p&gt;Although the hosted player path is a great way to view the stream, you can use the &lt;a href="https://dolby.io/blog/building-a-low-latency-livestream-viewer-with-webrtc-millicast/" rel="noopener noreferrer"&gt;Dolby.io Streaming JavaScript SDK&lt;/a&gt; to build a bespoke solution.&lt;/p&gt;
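&lt;p&gt;The three values on the token's API tab are related: the &lt;code&gt;SRT publish URL&lt;/code&gt; is the &lt;code&gt;SRT publish path&lt;/code&gt; with the &lt;code&gt;SRT stream ID&lt;/code&gt; appended as the &lt;code&gt;streamid&lt;/code&gt; query parameter. A minimal Python sketch of that relationship (the host, port, and stream ID below are placeholders; use the real values from your own dashboard):&lt;/p&gt;

```python
# Compose an SRT publish URL from the dashboard's SRT publish path and
# SRT stream ID. All values below are placeholders for illustration;
# copy the real ones from your Dolby.io token's API tab.

def make_srt_publish_url(publish_path, stream_id):
    """Append the stream ID as the `streamid` query parameter."""
    return f"{publish_path}?streamid={stream_id}"

url = make_srt_publish_url("srt://example-host:10000", "accountId/streamName")
print(url)  # srt://example-host:10000?streamid=accountId/streamName
```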

&lt;blockquote&gt;
&lt;p&gt;Note: If you are using the &lt;code&gt;NVIDIA NVENC H.264&lt;/code&gt; encoder that comes included with OBS, you must set &lt;code&gt;Max B-Frames&lt;/code&gt; to &lt;code&gt;0&lt;/code&gt;. This setting can be found in Output, then Advanced Output Mode, then the Streaming tab, where Encoder is set to &lt;code&gt;NVIDIA NVENC H.264&lt;/code&gt; and Max B-frames is set to 0.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2FB-frames.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2FB-frames.png" alt="If you are using the NVIDIA NVENC H.264 encoder that comes included with OBS you must set Max B-Frames to 0. Image depicts this fix in the settings which can be found in Output, then Advanced Output Mode, then the Streaming tab, where Encoder is set to NVIDIA NVENC H.264 and then Max B-frames is set to 0. Image depicts each of these settings highlighted in red boxes for clarity." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Streaming SRT with vMix
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.vmix.com/" rel="noopener noreferrer"&gt;vMix&lt;/a&gt; is a paid windows-only remote production tool used for vision mixing. It allows users to juggle input and outputs for live broadcasts and productions and includes support for publishing SRT streams. To publish an SRT stream with vMix follow the steps below:&lt;/p&gt;

&lt;p&gt;1. &lt;a href="https://dashboard.dolby.io/signin" rel="noopener noreferrer"&gt;Login&lt;/a&gt; or &lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;create a Dolby.io account&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;2. &lt;a href="https://www.vmix.com/" rel="noopener noreferrer"&gt;Download and open vMix&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;3. Navigate to your &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; streaming dashboard and create a new token. You can leave all the token settings at their defaults.&lt;/p&gt;

&lt;p&gt;4. Open the API tab on your newly created token dashboard and navigate to the bottom where you'll see the &lt;code&gt;SRT publish path&lt;/code&gt;, &lt;code&gt;SRT stream ID&lt;/code&gt;, and the &lt;code&gt;SRT publish URL&lt;/code&gt;. Copy the &lt;code&gt;SRT publish path&lt;/code&gt; and the &lt;code&gt;SRT stream ID&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;5. Inside vMix, open &lt;code&gt;Settings&lt;/code&gt; and switch to &lt;code&gt;Output / NDI / SRT&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fvmix.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fvmix.png" alt="Pictured is a screenshot of the vMix mixing stage. Highlighted in a red box is the settings users should click on." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The vMix mixing stage. Navigate to "Settings" and click on "Outputs / NDI / SRT" to open up the SRT settings menu.&lt;/p&gt;

&lt;p&gt;6. Once you've switched to &lt;code&gt;Output / NDI / SRT&lt;/code&gt;, click the gear icon next to an output source.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fvmixsettings.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fvmixsettings.png" alt="Pictured is a screenshot of vMix settings with the SRT settings tab highlighted in red and the gear icon next to output 1 highlighted in red." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Inside the SRT settings, select the gear icon highlighted in red.&lt;/p&gt;

&lt;p&gt;7. Inside the output settings, check &lt;code&gt;Enable SRT&lt;/code&gt;, set the &lt;code&gt;Hostname&lt;/code&gt; to the Dolby.io Millicast endpoint and the &lt;code&gt;Port&lt;/code&gt; to the appropriate port (typically 10000). Additionally, include the &lt;code&gt;Stream ID&lt;/code&gt; and make sure the Quality settings match &lt;a href="https://dolby.io/blog/broadcasting-srt-streams-with-dolby-io/#srt-limits" rel="noopener noreferrer"&gt;the limitations of Dolby.io SRT streaming&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fsettings.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fsettings.png" alt="Pictured is a screenshot of the vMix Output 1 Outpub Settings with Enable SRT, Hostname, Port, StreamID, and Quality all highlighted in red boxes denoting their importance for creating a successful SRT stream." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When creating the SRT stream, define the Hostname, Port, Stream ID, and Quality.&lt;/p&gt;

&lt;p&gt;8. Press &lt;code&gt;OK&lt;/code&gt; and exit settings. You are now all set up to stream with vMix. When streaming, your SRT stream will be delivered to the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; Streaming Viewer, which can be found at the Hosted Player Path.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fhosted-player-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fhosted-player-1.png" alt="Pictured is a screenshot of the Doby.io Streaming Token API tab, with hosted player path highlighted in a red box. " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Dolby.io Streaming Token API tab, with hosted player path highlighted. Opening this path in a browser will launch the stream.&lt;/p&gt;

&lt;p&gt;Although the hosted player path is a great way to view the stream, you can use the &lt;a href="https://dolby.io/blog/building-a-low-latency-livestream-viewer-with-webrtc-millicast/" rel="noopener noreferrer"&gt;Dolby.io Streaming JavaScript SDK&lt;/a&gt; to build out a bespoke solution.&lt;/p&gt;
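&lt;p&gt;Unlike OBS, vMix (and the hardware encoders covered later in this guide) takes the hostname, port, and stream ID as separate fields rather than a single URL. Splitting the dashboard's &lt;code&gt;SRT publish URL&lt;/code&gt; into those fields can be sketched in Python (the URL below is a placeholder; use your own):&lt;/p&gt;

```python
# Split an SRT publish URL into the separate Hostname, Port, and Stream ID
# fields that vMix expects. The URL below is a placeholder; use the SRT
# publish URL from your own Dolby.io dashboard.
from urllib.parse import urlparse, parse_qs

def split_srt_url(url):
    parsed = urlparse(url)
    stream_id = parse_qs(parsed.query).get("streamid", [""])[0]
    return parsed.hostname, parsed.port, stream_id

host, port, stream_id = split_srt_url(
    "srt://example-host:10000?streamid=accountId/streamName")
print(host, port, stream_id)  # example-host 10000 accountId/streamName
```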

&lt;h2&gt;
  
  
  Streaming SRT with your iPhone
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://softvelum.com/larix/" rel="noopener noreferrer"&gt;Softvelum's Larix Broadcaster&lt;/a&gt; is a tool available for iOS, Android, and React Native that allows you to push SRT streams directly from your mobile device. To set up a Larix SRT stream on an iOS device:&lt;/p&gt;

&lt;p&gt;1. &lt;a href="https://dashboard.dolby.io/signin" rel="noopener noreferrer"&gt;Login&lt;/a&gt; or &lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;create a Dolby.io account&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;2. Download the Larix Broadcaster from the App Store.&lt;/p&gt;

&lt;p&gt;3. Navigate to your &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; streaming dashboard and create a new token. You can leave all the token settings at their defaults.&lt;/p&gt;

&lt;p&gt;4. Open the API tab on your newly created token dashboard and navigate to the bottom where you'll see the &lt;code&gt;SRT publish path&lt;/code&gt;, the &lt;code&gt;SRT stream ID&lt;/code&gt;, and the &lt;code&gt;SRT publish URL&lt;/code&gt;. Copy the &lt;code&gt;SRT publish path&lt;/code&gt; and the &lt;code&gt;SRT stream ID&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;5. Open the Larix Broadcaster and then &lt;code&gt;Settings&lt;/code&gt;. From &lt;code&gt;Settings&lt;/code&gt;, go to &lt;code&gt;Connections&lt;/code&gt; and add a new connection.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fiosapp-3.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fiosapp-3.jpeg" alt="Pictured is a screenshot from an iOS device using the Larix Broadcaster with a red box highlighting a plus icon." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create a new connection with the plus icon in the top right corner.&lt;/p&gt;

&lt;p&gt;6. Inside the connection, set the &lt;code&gt;URL&lt;/code&gt; parameter to your Dolby.io Real-Time Streaming &lt;code&gt;SRT publish path&lt;/code&gt; and set &lt;code&gt;streamid&lt;/code&gt; to your &lt;code&gt;SRT stream ID&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fiosapp-2.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fiosapp-2.jpeg" alt="Pictured is a screenshot of an iOS device using the Larix Broadcaster with a red box around URL and streamid to indicate their importance to starting the srt stream." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When adding a new connection in the Larix Broadcaster make sure to assign "streamid" and "URL".&lt;/p&gt;

&lt;p&gt;7. From here you can exit your settings and start the stream by pressing the record button on the broadcaster.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fiosapp-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fiosapp-1.png" alt="Pictured is a screenshot of an iOS device on the Larix Broadcaster screen with the recording button active and stream started. The srt stream itself is of a black screen with no features." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Press the record button on the left to start an SRT stream.&lt;/p&gt;

&lt;p&gt;8. As with the OBS and vMix examples, your SRT stream will be delivered to the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; Streaming Viewer, which can be found at the Hosted Player Path.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fhosted-player-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fhosted-player-2.png" alt="Pictured is a screenshot of the Doby.io Streaming Token API tab, with hosted player path highlighted in a red box. " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Dolby.io Streaming Token API tab, with hosted player path highlighted. Opening this path in a browser will launch the stream.&lt;/p&gt;

&lt;p&gt;Dolby.io Real-time Streaming supports a number of SDKs for creating viewer apps, &lt;a href="https://dolby.io/blog/building-a-real-time-streaming-app-with-webrtc-and-flutter-3/" rel="noopener noreferrer"&gt;including a Flutter 3 SDK&lt;/a&gt; for Android, iOS, and Web.&lt;/p&gt;

&lt;h2&gt;
  
  
  Streaming SRT directly from an Osprey Talon Encoder 
&lt;/h2&gt;

&lt;p&gt;OBS, vMix, and Larix Broadcaster are examples of software tools you can leverage for streaming SRT into the Dolby.io Streaming service, but what about hardware options? Depending on the scale of your live production, you might have access to cameras with built-in encoders that can egress SRT directly, which can also connect to the Dolby.io servers. For cameras without built-in encoders, you can connect the camera to an external encoder, some of which support SRT. One example is the Osprey Talon 4K-SC, which is not only the first WHIP encoder but can also encode SRT streams for the Dolby.io servers.&lt;/p&gt;

&lt;p&gt;1. &lt;a href="https://dashboard.dolby.io/signin" rel="noopener noreferrer"&gt;Login&lt;/a&gt; or &lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;create a Dolby.io account&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;2. Connect your Osprey Encoder to your camera and power it up.&lt;/p&gt;

&lt;p&gt;3. Download the &lt;a href="https://www.ospreyvideo.com/_files/ugd/2c643c_d4f522d8a6244a6994f12de0f40721b8.pdf" rel="noopener noreferrer"&gt;Osprey BOSS PRO application&lt;/a&gt;, which will allow you to discover the encoder on your local network. Alternatively, &lt;a href="https://www.ospreyvideo.com/_files/ugd/2c643c_d4f522d8a6244a6994f12de0f40721b8.pdf" rel="noopener noreferrer"&gt;follow this in-depth guide by the Osprey team&lt;/a&gt; for setting up your encoder.&lt;/p&gt;

&lt;p&gt;4. Click on the appropriate encoder, launch the web interface, and sign in. Information about signing into Osprey equipment &lt;a href="https://www.ospreyvideo.com/_files/ugd/2c643c_d4f522d8a6244a6994f12de0f40721b8.pdf" rel="noopener noreferrer"&gt;can be found here&lt;/a&gt;. Once signed in, you will be in the Osprey Dashboard.&lt;/p&gt;

&lt;p&gt;5. Navigate to your &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; streaming dashboard and create a new token. You can leave all the token settings at their defaults.&lt;/p&gt;

&lt;p&gt;6. Open the API tab on your newly created token dashboard and navigate to the bottom where you'll see the &lt;code&gt;SRT publish path&lt;/code&gt;, the &lt;code&gt;SRT stream ID&lt;/code&gt;, and the &lt;code&gt;SRT publish URL&lt;/code&gt;. Copy the &lt;code&gt;SRT publish path&lt;/code&gt; and the &lt;code&gt;SRT stream ID&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fsrtbox-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fsrtbox-1.png" alt="Pictured is a screenshot of Dolby.io Streaming Token API tab. Highlighted on screen in a red box is the SRT publish URL used in OBS." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Dolby.io Streaming Token API tab. Highlighted box indicates the SRT publish URL used in OBS.&lt;/p&gt;

&lt;p&gt;7. Inside the Osprey Dashboard, set &lt;code&gt;SRT Dest Address&lt;/code&gt; to the &lt;code&gt;SRT publish path&lt;/code&gt; excluding the port. Set &lt;code&gt;SRT Port&lt;/code&gt; to the port number at the end of your &lt;code&gt;SRT publish path&lt;/code&gt; (usually 10000) and set &lt;code&gt;SRT Stream ID&lt;/code&gt; to your &lt;code&gt;SRT stream ID&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fosprey.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdolby.io%2Fwp-content%2Fuploads%2F2022%2F12%2Fosprey.png" alt="Pictured on screen is a screenshot of the black and grey Osprey settings board with SRT Dest Address, SRT Port, and SRT Stream ID highlighted in red to indicate where users should input credentials to start an srt stream through the dolby.io servers." width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Set the SRT Dest Address to the SRT publish path, the SRT Port to 10000, and the SRT Stream ID to your Dolby.io Streaming Token Stream ID.&lt;/p&gt;

&lt;p&gt;8. From here, press Start and the encoder will begin streaming content through the Dolby.io servers.&lt;/p&gt;

&lt;h2&gt;
  
  
  Limitations of Publishing SRT Streams to &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Streaming SRT is just one part of the equation; &lt;a href="https://dolby.io/products/real-time-streaming/" rel="noopener noreferrer"&gt;Dolby.io Real-time Streaming&lt;/a&gt; also supports &lt;a href="https://docs.dolby.io/streaming-apis/docs/client-sdks" rel="noopener noreferrer"&gt;a number of SDKs&lt;/a&gt; for building streaming into your platforms and apps. If you are interested in learning more about how to use our SDKs, &lt;a href="https://dolby.io/blog/" rel="noopener noreferrer"&gt;check out our blog&lt;/a&gt; and let us know what you're building next.&lt;br&gt;
Feedback or Questions? Reach out to the team on &lt;a href="https://twitter.com/DolbyIO?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;, &lt;a href="https://www.linkedin.com/company/dolbyio/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, or via our &lt;a href="https://www.millicast.com/contactus/" rel="noopener noreferrer"&gt;support desk&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>api</category>
      <category>frontend</category>
      <category>programming</category>
    </item>
    <item>
      <title>Automating Your Stream Start, Intro, and Ending Processes with OBS Macros</title>
      <dc:creator>Griffin</dc:creator>
      <pubDate>Wed, 16 Nov 2022 22:04:39 +0000</pubDate>
      <link>https://forem.com/dolbyio/automating-your-stream-start-intro-and-ending-processes-with-obs-macros-1p24</link>
      <guid>https://forem.com/dolbyio/automating-your-stream-start-intro-and-ending-processes-with-obs-macros-1p24</guid>
      <description>&lt;p&gt;As you begin developing a brand and a streaming presence, it is often desirable to define and establish your branding with digital assets that are used at the beginning and ends of each of your stream. Often times we see this as an intro video that will broadcast to your audience that your stream is beginning, giving them time to settle in and get excited for the show, whether it be an auction, sports broadcast, gaming event, or something else. While entirely possible to do this manually every time, it can add to the growing complexity of tasks to do when beginning your broadcast. Sometimes you might even forget to play the video at all, human error and all. In this article, we will showcase a few different ways we can automate the media playback to occur whenever you begin your stream, taking an extra step away so you can focus on the broadcast while keeping your brand intact.&lt;/p&gt;

&lt;h2&gt;
  
  
  Starting Stream
&lt;/h2&gt;

&lt;p&gt;Before thinking about what the intro video is going to look like, first consider what the very beginning of a stream looks like. Most viewers who show up early will see a “Broadcast is not Live” page by default, which isn’t ideal. We want to let users know that you are about to go live soon, to keep them on the page instead of closing the window. If we start too early, though, not enough users will be around to see the intro video in the first place, which both loses brand awareness and makes people feel like they missed out. The solution is to still start the stream ahead of schedule, but with a static image or video that simply lets the audience know the stream is starting soon.&lt;/p&gt;

&lt;p&gt;We can automate the process of beginning this video with an OBS plugin called &lt;a href="https://github.com/WarmUpTill/SceneSwitcher/releases/latest" rel="noopener noreferrer"&gt;Advanced Scene Switcher&lt;/a&gt;. Once downloaded and installed with the appropriate installer for your system, this plugin will create a new option on the “Tools” menu of OBS with the same name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9n6essl5zntx5pne7jc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9n6essl5zntx5pne7jc.png" alt="tools menu" width="300" height="285"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note: If you are having trouble seeing this plugin in the Tools menu, make sure you installed OBS from the official site rather than from Homebrew on macOS or similar package managers. We have experienced issues with the package-manager versions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Adding Advanced Scene Switcher to OBS-WebRTC
&lt;/h2&gt;

&lt;p&gt;For use with Dolby.io Streaming, we will want to add the plugin to our installation of OBS-WebRTC. Thankfully, the plugin is compatible; however, the installer will only look for OBS-Studio. To add the plugin manually to OBS-WebRTC, do the following:&lt;/p&gt;

&lt;p&gt;MacOS:&lt;/p&gt;

&lt;p&gt;In &lt;code&gt;~/Library/Application Support/obs-studio&lt;/code&gt; there will be a folder titled “plugins”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2b2balx7k3ew97wrkod3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2b2balx7k3ew97wrkod3.png" alt="Plugins Folder" width="574" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Copy this folder with the installed plugin(s) and paste it into &lt;code&gt;~/Library/Application Support/obs-webrtc&lt;/code&gt;. Restart OBS-WebRTC and it should appear as normal.&lt;/p&gt;

&lt;p&gt;Windows:&lt;/p&gt;

&lt;p&gt;Follow the instructions above, replacing the macOS paths with the Windows paths for plugins:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;C:\Program Files\OBS-Studio\obs-plugins\64bit&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;C:\Program Files\OBS-WebRTC\obs-plugins\64bit&lt;/code&gt;&lt;/p&gt;
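&lt;p&gt;The manual copy above can also be scripted. Below is a small Python sketch that assumes the default paths shown above; adjust them for your own installation before running.&lt;/p&gt;

```python
# Copy the installed plugins from an OBS-Studio install into an
# OBS-WebRTC install. The paths in the commented example are the
# defaults mentioned above; adjust them for your own system.
import shutil
from pathlib import Path

def copy_obs_plugins(src, dst):
    """Recursively copy the plugins folder, merging into any existing one."""
    shutil.copytree(src, dst, dirs_exist_ok=True)

# macOS example (uncomment to run; requires both apps to be installed):
# base = Path.home() / "Library" / "Application Support"
# copy_obs_plugins(base / "obs-studio" / "plugins",
#                  base / "obs-webrtc" / "plugins")
```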

&lt;h2&gt;
  
  
  Using Advanced Scene Switcher
&lt;/h2&gt;

&lt;p&gt;Upon opening Advanced Scene Switcher, we should see a few options, the most important of which is whether the plugin is active. You can customize how you want OBS to auto-start the plugin, but ensure that the status is Active before using it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3rzxdzal45u1105sy2wi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3rzxdzal45u1105sy2wi.png" alt="Inactive Plugin" width="800" height="74"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, switch to the “Macro” tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02oo27vfl3bbbalyrxli.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02oo27vfl3bbbalyrxli.png" alt="Macro Tab" width="800" height="477"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here we can begin adding macros for our stream to automate multiple different actions. Let’s begin with the “Stream is about to begin” automation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Starting the Stream
&lt;/h2&gt;

&lt;p&gt;Inside the Macro tab, we have a few different panels to work with. To begin, let’s add a new macro by clicking the “+” under the left “Macros” panel and give it a name. In this case, I named it “Starting stream”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxmb810hnmpnumr96eey.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxmb810hnmpnumr96eey.png" alt="Macro Bar" width="348" height="992"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, under the “Edit macro” panel, we have two sections: macro conditions on top and macro actions on the bottom. First, let’s click the “+” under macro conditions to create a new condition. This generates a conditional statement builder for us to define what will begin our macro. In this case, we want it to read “If Streaming Stream starting”. This will trigger the macro whenever we start the stream.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsxzaborjjlyn8zrumync.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsxzaborjjlyn8zrumync.png" alt="Stream Starting" width="800" height="153"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we want to create a new macro action by clicking the “+” under the bottom panel. There are a couple of things we want to do here, the first of which is switching to our Starting stream scene. This can be done with an action with the following information: Switch scene → Switch to scene using “Cut”, which will automatically set the stream to that scene upon starting the broadcast.&lt;/p&gt;

&lt;p&gt;If you haven’t already, we suggest creating a scene with a placeholder image or looping video to direct to.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpxhkr4d9s13vvpatblsv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpxhkr4d9s13vvpatblsv.png" alt="Stream Starting Switch" width="800" height="181"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can also add some audio as part of this automation. First, let’s add a media source into our scene with some royalty-free music to play as people join the stream. This could also be a looping video if you are not using a static image in the scene. Ensure that “Loop” and “Restart playback when source becomes active” are checked so that the music doesn’t stop until you tell it to.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqh90z189xbu0oqnac1v9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqh90z189xbu0oqnac1v9.png" alt="Add Media" width="800" height="671"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, back in Advanced Scene Switcher, we can add a new macro action to our Starting stream macro that reads “Media Music Play”. This ensures that the audio file starts playing as soon as your stream does!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4z71aqkh3xxpd2mpk356.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4z71aqkh3xxpd2mpk356.png" alt="Play Music" width="800" height="119"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, upon starting your stream, OBS will automatically switch to your “Starting stream” scene and play the audio or video file, welcoming early viewers while your audience grows.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Stream Intro
&lt;/h2&gt;

&lt;p&gt;Now that we have created our “Starting soon” macro, we will want to add a macro for an intro video that plays before switching to the live video feed. This works much like the macro above, but with a different condition. In a new macro titled “Intro”, create a condition using the “Hotkey” option this time. Name it whatever seems best to you.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftw2fpyvju4fhifp4c37p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftw2fpyvju4fhifp4c37p.png" alt="Start Intro Hotkey" width="800" height="121"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can then assign the hotkey within the OBS Settings menu under “Hotkeys”, where it appears under the name submitted in the previous step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx0lvibex73stpyud2k34.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx0lvibex73stpyud2k34.png" alt="OBS Hotkeys Menu" width="800" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Back in Advanced Scene Switcher, we can now add the actions as we did before. Let’s add another Switch scene:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9uwzxuq9lr3v8a7zmqyv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9uwzxuq9lr3v8a7zmqyv.png" alt="Swap to Intro" width="800" height="147"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Another Media action:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fafkgif0m1lylfzxkipa5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fafkgif0m1lylfzxkipa5.png" alt="Play Video" width="800" height="108"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This time, we want to switch the scene one more time after the video finishes. To do this, we first need to add a “Wait” action equal to the length of the intro video. This keeps the remaining actions from triggering before the video is finished.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fraynnszdjis56wsokxwo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fraynnszdjis56wsokxwo.png" alt="Wait Command" width="800" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then after the wait, we can Switch scene one last time:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrarqut0bglis6ulkbsa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdrarqut0bglis6ulkbsa.png" alt="Swap to Live Feed" width="800" height="146"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And now, upon hitting the assigned hotkey, the starting scene will swap to your intro video, play it, then switch to the live feed when complete!&lt;/p&gt;

&lt;h2&gt;
  
  
  Ending Stream Macro
&lt;/h2&gt;

&lt;p&gt;Before ending your stream, you may want to show your viewers a “Stream is ending” screen to let anyone still tuned in know the broadcast is complete. This macro is built very similarly to the previous one. To begin, we start with another Hotkey condition:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F35xbb6ub7fjyg3r63ntx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F35xbb6ub7fjyg3r63ntx.png" alt="Ending Hotkey" width="800" height="126"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, we switch the scene to “Stream Ending”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd24nz6l45gx58hmgbqja.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd24nz6l45gx58hmgbqja.png" alt="Stream Ending Swap" width="800" height="141"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Play our exit music or video:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsdlmsl5quzxibikx3wew.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsdlmsl5quzxibikx3wew.png" alt="Play Music" width="800" height="119"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Wait for the media to finish:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4javj2vivdzttm9w2xi1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4javj2vivdzttm9w2xi1.png" alt="Wait Again" width="800" height="115"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then stop the stream with a final action of “Streaming → Stop Streaming”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3voly16bua8hv8zupd1r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3voly16bua8hv8zupd1r.png" alt="Stop Streaming" width="800" height="110"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And that’s it! One hotkey will transition to an ending slide, play music, and end the stream for you. No more fiddling around with multiple applications and buttons to finish that task.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;In this article, we outlined a few different things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Installing Advanced Scene Switcher into OBS (and OBS-WebRTC)&lt;/li&gt;
&lt;li&gt;Configuring macros for

&lt;ul&gt;
&lt;li&gt;Starting a stream&lt;/li&gt;
&lt;li&gt;Playing an intro video&lt;/li&gt;
&lt;li&gt;Ending a stream&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Adding custom hotkey support&lt;/li&gt;

&lt;li&gt;Managing autoplayed media for streams&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Advanced Scene Switcher is an extremely useful tool for turning your live streams into a more professional, broadcast-quality operation, with use cases across the many industries that need to live stream content and events.&lt;/p&gt;

&lt;p&gt;Read more about the Advanced Scene Switcher on &lt;a href="https://obsproject.com/forum/resources/advanced-scene-switcher.395/" rel="noopener noreferrer"&gt;the OBS Forum&lt;/a&gt;, or read &lt;a href="https://dolby.io/blog/using-webrtc-in-obs-for-remote-live-production/" rel="noopener noreferrer"&gt;this blog post&lt;/a&gt; on getting started with OBS-WebRTC to enable your live streams to broadcast in real time for maximum interactivity with your viewers.&lt;/p&gt;

&lt;p&gt;Happy streaming!&lt;/p&gt;

</description>
      <category>mentorship</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Using NDI in your Real Time Live Streaming Production Workflow</title>
      <dc:creator>Dan Zeitman</dc:creator>
      <pubDate>Mon, 07 Nov 2022 20:18:33 +0000</pubDate>
      <link>https://forem.com/dolbyio/using-ndi-in-your-real-time-live-streaming-production-workflow-59m7</link>
      <guid>https://forem.com/dolbyio/using-ndi-in-your-real-time-live-streaming-production-workflow-59m7</guid>
      <description>&lt;p&gt;If you're a developer who's also creating live streams and content you know that it takes a lot of effort setup a solid content streaming workflow. &lt;/p&gt;

&lt;h2&gt;
  
  
  NDI to the rescue!
&lt;/h2&gt;

&lt;p&gt;NDI® (Network Device Interface) is a free protocol for Video over IP, developed by NewTek. The innovation is in the protocol, which makes it possible to stream video and media across networks with low latency from many device sources. These NDI device sources can be physical hardware or software based. This makes it possible to connect to any device, in any location, anywhere in the world – and transmit live video to wherever you are. There is a suite of &lt;a href="https://www.ndi.tv/tools/" rel="noopener noreferrer"&gt;NDI tools&lt;/a&gt; that work directly with NDI systems and sources on your network. Combine NDI with &lt;a href="https://Dolby.io" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; Real-time Streaming to deliver real-time video for remote or interactive experiences. &lt;/p&gt;

&lt;p&gt;Dolby.io Real-time Streaming offers incredibly low-latency streams, typically under a second. Even better, besides the ability to white-label your own stream with their viewer, or &lt;a href="https://docs.dolby.io/streaming-apis/docs/getting-started" rel="noopener noreferrer"&gt;develop a complete streaming solution&lt;/a&gt;, that same stream can also be distributed through streaming services such as YouTube, Facebook, and Twitch. &lt;/p&gt;

&lt;h2&gt;
  
  
  Devices Everywhere
&lt;/h2&gt;

&lt;p&gt;There are many low-to-moderate cost prosumer video devices, PTZ cameras and security systems that offer NDI support. &lt;/p&gt;

&lt;p&gt;If you do not have a camera that supports NDI, you can simply download one of many software-based solutions that stream video and audio over your network over NDI. &lt;/p&gt;

&lt;p&gt;In fact, that shiny new iPhone with the amazing and gorgeous camera might actually provide the best camera solution for live streaming content over NDI. &lt;/p&gt;

&lt;p&gt;Some of our customers have had great success with various apps available in the App Store, such as &lt;a href="https://apps.apple.com/us/app/ndi-hx-camera/id1477266080" rel="noopener noreferrer"&gt;NDI HX Camera&lt;/a&gt; by NewTek and &lt;a href="https://apps.apple.com/us/app/stream-camera-for-ndi-hx/id1633326432" rel="noopener noreferrer"&gt;Stream Camera for NDI HX&lt;/a&gt; by fellow iOS developer Thomas Backes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Streaming Workflow
&lt;/h2&gt;

&lt;p&gt;Everyone has their own opinion on what a good live streaming content workflow actually looks like. You decide. We've created a &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-ndi" rel="noopener noreferrer"&gt;quick guide&lt;/a&gt; to make it easy for you to integrate &lt;strong&gt;&lt;em&gt;your&lt;/em&gt;&lt;/strong&gt; workflow; you have multiple options for publishing NDI out with your Dolby.io account. This &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-ndi" rel="noopener noreferrer"&gt;guide&lt;/a&gt; walks you through two of these options and assumes NDI tools are already installed on your computer.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Dolby.io recently acquired Millicast; you may note some references to it in the documentation. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  OBS WebRTC
&lt;/h2&gt;

&lt;p&gt;Besides our web application, we also provide a forked version of OBS that's fine-tuned for advanced 4K streaming and other features of the Dolby.io platform. &lt;/p&gt;

&lt;p&gt;Download the &lt;a href="https://github.com/CoSMoSoftware/OBS-studio-webrtc/releases" rel="noopener noreferrer"&gt;OBS WebRTC&lt;/a&gt; publisher.&lt;/p&gt;

&lt;p&gt;In OBS create your NDI scene and add your NDI source, which can be from a camera or a mobile app. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F78evnzga9brqzlc42g96.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F78evnzga9brqzlc42g96.png" alt="Image of OBS NDI settings panel" width="800" height="636"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You are now ready to start publishing using NDI with OBS WebRTC for a real time broadcast at scale.&lt;/p&gt;

&lt;p&gt;For the stream, OBS has the following settings:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Codec: VP9&lt;/li&gt;
&lt;li&gt;Resolution: 1920x1080&lt;/li&gt;
&lt;li&gt;Bitrate: 4000 Kbps&lt;/li&gt;
&lt;li&gt;FPS: 30&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can adjust the OBS WebRTC settings as needed to deliver the best quality and experience. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://dashboard.dolby.io/signup" rel="noopener noreferrer"&gt;Sign up&lt;/a&gt; to get started and then choose streaming to navigate to the Real-time Streaming API section.&lt;/p&gt;

</description>
      <category>streaming</category>
      <category>contentproduction</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Building a Livestream App with Flutter 3</title>
      <dc:creator>Braden Riggs</dc:creator>
      <pubDate>Mon, 31 Oct 2022 20:19:09 +0000</pubDate>
      <link>https://forem.com/dolbyio/building-a-real-time-streaming-app-with-webrtc-and-flutter-3-2ghj</link>
      <guid>https://forem.com/dolbyio/building-a-real-time-streaming-app-with-webrtc-and-flutter-3-2ghj</guid>
      <description>&lt;p&gt;&lt;a href="https://dolby.io/blog/building-a-real-time-streaming-app-with-webrtc-and-flutter-3/" rel="noopener noreferrer"&gt;Originally published here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Streaming, especially the low-latency kind, has become a popular medium to engage with an audience, &lt;a href="https://dolby.io/solutions/events/" rel="noopener noreferrer"&gt;host live events&lt;/a&gt;, and connect people virtually. For developers building streaming apps, however, there is just one issue. If we are interested in connecting to a wide audience, we need to develop for a wide range of platforms such as Android, iOS, Web, and even desktop native apps, which can quickly become a heavy lift for any team. This is where &lt;a href="https://flutter.dev/?gclid=CjwKCAjw4c-ZBhAEEiwAZ105RYihY2PWVmum6IojgwCKgGWKZg9IOYmyhWlapji_zIYo_FpW-vW8tRoCoKcQAvD_BwE&amp;amp;gclsrc=aw.ds" rel="noopener noreferrer"&gt;Flutter 3&lt;/a&gt; comes in. Released in May of 2022, Flutter 3 takes cross-platform development to the next level, allowing users to "&lt;em&gt;build for any screen&lt;/em&gt;" from a single code base. Hence, rather than building three separate apps for iOS, Android, and Web, you can build just one. To further sweeten the deal, &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; has recently released their &lt;a href="https://docs.dolby.io/streaming-apis/docs/flutter" rel="noopener noreferrer"&gt;WebRTC real-time streaming SDK for Flutter&lt;/a&gt;, allowing users to &lt;a href="https://dolby.io/products/real-time-streaming/" rel="noopener noreferrer"&gt;build cross-platform streaming apps&lt;/a&gt; that combine scalability and ultra-low delay. &lt;/p&gt;

&lt;p&gt;In this guide, we'll be exploring how to build a cross-platform real-time streaming app that works on Android, iOS, Desktop Native, and Web using the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io Streaming SDK for Flutter&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa6gjawzp26fcfved0pha.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa6gjawzp26fcfved0pha.jpg" alt="An example of the Flutter real-time streaming app in action, streaming out to a chrome tab." width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started with the Real-Time Streaming SDK
&lt;/h2&gt;

&lt;p&gt;Before we begin you need to make sure you have the &lt;a href="https://docs.flutter.dev/get-started/install" rel="noopener noreferrer"&gt;latest version of Flutter installed&lt;/a&gt; and set up on your machine. To get started with building a streaming app we need to install the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; Streaming SDK for Flutter 3 via the terminal.&lt;br&gt;
&lt;br&gt;
 &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;flutter pub add millicast_flutter_sdk
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Then run the following command in terminal to download the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;flutter pub get
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the Flutter Streaming SDK installed, you can start by creating a &lt;a href="https://docs.flutter.dev/get-started/test-drive?tab=vscode" rel="noopener noreferrer"&gt;vanilla Flutter app&lt;/a&gt; and add the most recent version of &lt;code&gt;flutter_webrtc&lt;/code&gt; to your project's &lt;code&gt;pubspec.yaml&lt;/code&gt;. You should also see that the Dolby.io Millicast Flutter SDK has been automatically added.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;flutter_webrtc: ^x.x.x
millicast_flutter_sdk: ^x.x.x
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then inside &lt;code&gt;main.dart&lt;/code&gt; you just import &lt;code&gt;flutter_webrtc&lt;/code&gt; alongside any other dependencies your project may have.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'package:flutter_webrtc/flutter_webrtc.dart';
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In addition to installing the SDK, you'll also need to &lt;a href="https://dashboard.dolby.io/signup/" rel="noopener noreferrer"&gt;create a free &lt;/a&gt;&lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt;&lt;a href="https://dashboard.dolby.io/signup/" rel="noopener noreferrer"&gt; Account&lt;/a&gt;. The free account offers 50 Gigabytes of data transfer a month, which will be plenty for building and testing out the real-time streaming app.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Interested in following along with a project that already has the SDK installed and set up? &lt;a href="https://github.com/dolbyio-samples/blog-streaming-flutter-app/tree/main/streaming_app" rel="noopener noreferrer"&gt;Check out this GitHub repository&lt;/a&gt; which contains a completed version of this app.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Building the Real-Time Streaming App with Flutter
&lt;/h3&gt;

&lt;p&gt;Building a WebRTC Flutter streaming app can be complicated, so to get started we first need to divide the app into a series of features that come together to support a real-time streaming experience. For the app to connect to the&lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt; Dolby.io&lt;/a&gt; servers, we must include a way for the user to input their streaming credentials and tokens for authentication.&lt;/p&gt;

&lt;h4&gt;
  
  
  Taking in the WebRTC Stream Credentials
&lt;/h4&gt;

&lt;p&gt;To publish and view a WebRTC stream with the&lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt; Dolby.io&lt;/a&gt; Flutter SDK we need three things: an &lt;code&gt;account ID&lt;/code&gt;, a &lt;code&gt;stream name&lt;/code&gt;, and a &lt;code&gt;publishing token&lt;/code&gt;. &lt;a href="https://docs.dolby.io/streaming-apis/docs/about-dash" rel="noopener noreferrer"&gt;These credentials can be found on your Dolby.io dashboard&lt;/a&gt; and need to be input by the user, which we can capture with the &lt;code&gt;TextFormField&lt;/code&gt; widget, where the widget, on change, updates a &lt;code&gt;TextEditingController&lt;/code&gt; variable.&lt;br&gt;
&lt;br&gt;
 &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Container(
    width: MediaQuery.of(context).size.width,
    constraints: const BoxConstraints(
        minWidth: 100, maxWidth: 400),
    child: TextFormField(
      maxLength: 20,
      controller: accID,
      decoration: const InputDecoration(
        labelText: 'Enter Account ID',
      ),
      onChanged: (v) =&amp;gt; accID.text = v,
    )),
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;em&gt;Note: In production, you don't need to have users input these credentials, instead you could use a custom login and serve the users a temporary login token. For learning more about Dolby.io tokens &lt;a href="https://dolby.io/blog/secure-token-authentication-with-dolby-io-millicast-streaming-webrtc/" rel="noopener noreferrer"&gt;check out this blog on creating and securing tokens&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Because we need three inputs to publish a WebRTC stream to the Dolby.io server, we can repeat this code for each input.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Container(
     width: MediaQuery.of(context).size.width,
     constraints: const BoxConstraints(
         minWidth: 100, maxWidth: 400),
     child: TextFormField(
       maxLength: 20,
       controller: accID,
       decoration: const InputDecoration(
         labelText: 'Enter Account ID',
       ),
       onChanged: (v) =&amp;gt; accID.text = v,
     )),
Container(
     width: MediaQuery.of(context).size.width,
     constraints: const BoxConstraints(
         minWidth: 100, maxWidth: 400),
     child: TextFormField(
       maxLength: 20,
       controller: streamName,
       onChanged: (v) =&amp;gt; streamName.text = v,
       decoration: const InputDecoration(
         labelText: 'Enter Stream Name',
       ),
     )),
 // Publishing Token Input
 Container(
     width: MediaQuery.of(context).size.width,
     constraints: const BoxConstraints(
         minWidth: 100, maxWidth: 400),
     child: TextFormField(
       controller: pubTok,
       maxLength: 100,
       onChanged: (v) =&amp;gt; pubTok.text = v,
       decoration: const InputDecoration(
         labelText: 'Enter Publishing Token',
       ),
     )),
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
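
&lt;p&gt;&lt;em&gt;As an aside, the three near-identical input blocks above can be factored into a small helper widget. This is just an optional sketch; the &lt;code&gt;credentialField&lt;/code&gt; helper name and its parameters are our own and not part of the SDK:&lt;/em&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Hypothetical helper that builds one credential input field.
Widget credentialField(BuildContext context, TextEditingController controller,
    String label, int maxLength) {
  return Container(
      width: MediaQuery.of(context).size.width,
      constraints: const BoxConstraints(minWidth: 100, maxWidth: 400),
      child: TextFormField(
        maxLength: maxLength,
        controller: controller,
        decoration: InputDecoration(labelText: label),
      ));
}

// The three inputs then become:
// credentialField(context, accID, 'Enter Account ID', 20),
// credentialField(context, streamName, 'Enter Stream Name', 20),
// credentialField(context, pubTok, 'Enter Publishing Token', 100),
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;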



&lt;p&gt;Additionally, we can add an &lt;code&gt;ElevatedButton&lt;/code&gt; for the user to press once they have added their credentials.&lt;br&gt;
&lt;br&gt;
 &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ElevatedButton(
  style: ElevatedButton.styleFrom(
    primary: Colors.deepPurple,
  ),
  onPressed: publishExample,
  child: const Text('Start Stream'),
),
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwisdn2ehwr3bq0p269du.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwisdn2ehwr3bq0p269du.jpg" alt="The sample app, on launch, captures the user’s credentials to start the stream." width="800" height="1030"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  Authentication and Publishing Streams from Flutter
&lt;/h4&gt;

&lt;p&gt;You'll notice that the &lt;code&gt;ElevatedButton&lt;/code&gt; triggers a function via its &lt;code&gt;onPressed&lt;/code&gt; parameter. This function, called &lt;code&gt;publishExample&lt;/code&gt;, checks if the credentials are valid and authenticates the stream. First, the function checks that a user has input a value for each input.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;void publishExample() async {
    if (pubTok.text.isEmpty || streamName.text.isEmpty || accID.text.isEmpty) {
      ScaffoldMessenger.of(context).showSnackBar(const SnackBar(
        backgroundColor: Colors.grey,
        content: Text(
            'Make sure Account ID, Stream Name, and Publishing Token all include values.'),
      ));
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then the function calls &lt;code&gt;publishConnect&lt;/code&gt;, an asynchronous function that takes in &lt;code&gt;streamName&lt;/code&gt;, &lt;code&gt;pubTok&lt;/code&gt;, and a third object called &lt;code&gt;localRenderer&lt;/code&gt;. &lt;code&gt;localRenderer&lt;/code&gt; is an &lt;code&gt;RTCVideoRenderer&lt;/code&gt; object included with the &lt;code&gt;flutter_webrtc&lt;/code&gt; package.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;final RTCVideoRenderer localRenderer = RTCVideoRenderer();
publish = await publishConnect(localRenderer, streamName.text, pubTok.text);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
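
&lt;p&gt;&lt;em&gt;One detail worth noting: in &lt;code&gt;flutter_webrtc&lt;/code&gt;, an &lt;code&gt;RTCVideoRenderer&lt;/code&gt; has to be initialized before it can render video, so call &lt;code&gt;initialize()&lt;/code&gt; on it before handing it off:&lt;/em&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;final RTCVideoRenderer localRenderer = RTCVideoRenderer();
// Allocate the renderer's native resources before attaching a stream.
await localRenderer.initialize();
publish = await publishConnect(localRenderer, streamName.text, pubTok.text);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;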



&lt;p&gt;Using these three parameters we have everything we need to authenticate and begin publishing a stream. Inside of the &lt;code&gt;publishConnect&lt;/code&gt; function, we need to generate a temporary publishing token using the &lt;code&gt;streamName&lt;/code&gt; and &lt;code&gt;pubTok&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Future publishConnect(RTCVideoRenderer localRenderer, String streamName, String pubTok) async {
  // Setting subscriber options
  DirectorPublisherOptions directorPublisherOptions =
      DirectorPublisherOptions(token: pubTok, streamName: streamName);

  /// Define callback for generate new token
  tokenGenerator() =&amp;gt; Director.getPublisher(directorPublisherOptions);

...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the temporary publishing token created, we can then use it to create a &lt;code&gt;publish&lt;/code&gt; object. Using this &lt;code&gt;publish&lt;/code&gt; object we could start the stream; however, we wouldn't be able to see or hear anything, because we haven't specified what kind of stream we are creating or which devices to connect to. To do this we need to specify whether the stream will include audio, video, or audio &lt;em&gt;and&lt;/em&gt; video, then pass these constraints into the &lt;code&gt;getUserMedia&lt;/code&gt; function, which maps the constraints to the default audio capture device and the default video capture device.&lt;br&gt;
&lt;br&gt;
 &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
...
Publish publish =
      Publish(streamName: 'your-streamname', tokenGenerator: tokenGenerator);

  final Map&amp;lt;String, dynamic&amp;gt; constraints = &amp;lt;String, bool&amp;gt;{
    'audio': true,
    'video': true
  };

  MediaStream stream = await navigator.mediaDevices.getUserMedia(constraints);

...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Using this stream object, we can also provide a feed to the user in the form of a viewer. To do this we need to assign our input devices to &lt;code&gt;localRenderer&lt;/code&gt; as sources.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
...

localRenderer.srcObject = stream;

...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, we can map the &lt;code&gt;stream&lt;/code&gt; object and pass it as an option to the &lt;code&gt;connect&lt;/code&gt; function, which is inherited from &lt;code&gt;publish&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
...
//Publishing Options
  Map&amp;lt;String, dynamic&amp;gt; broadcastOptions = {'mediaStream': stream};

  /// Start connection to publisher
  await publish.connect(options: broadcastOptions);
  return publish;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
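
&lt;p&gt;&lt;em&gt;Putting the fragments above together, the full &lt;code&gt;publishConnect&lt;/code&gt; function looks roughly like this (assembled from the snippets in this section, with the hard-coded stream name replaced by the &lt;code&gt;streamName&lt;/code&gt; parameter):&lt;/em&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Future publishConnect(RTCVideoRenderer localRenderer, String streamName, String pubTok) async {
  // Setting publisher options
  DirectorPublisherOptions directorPublisherOptions =
      DirectorPublisherOptions(token: pubTok, streamName: streamName);

  // Callback that generates a temporary publishing token
  tokenGenerator() =&amp;gt; Director.getPublisher(directorPublisherOptions);

  Publish publish =
      Publish(streamName: streamName, tokenGenerator: tokenGenerator);

  // Capture the default microphone and camera
  final Map&amp;lt;String, dynamic&amp;gt; constraints = &amp;lt;String, bool&amp;gt;{
    'audio': true,
    'video': true
  };
  MediaStream stream = await navigator.mediaDevices.getUserMedia(constraints);

  // Feed the local preview viewer
  localRenderer.srcObject = stream;

  // Start the connection to the publisher
  Map&amp;lt;String, dynamic&amp;gt; broadcastOptions = {'mediaStream': stream};
  await publish.connect(options: broadcastOptions);
  return publish;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;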



&lt;p&gt;With our stream connected, we can now look at setting up the viewer using &lt;code&gt;localRenderer&lt;/code&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  In-App WebRTC Stream Viewer
&lt;/h4&gt;

&lt;p&gt;Now that our stream is authenticated and publishing, we need to add a viewer object so the streamer can see themselves streaming. This can be done with &lt;a href="https://pub.dev/documentation/flutter_webrtc/latest/flutter_webrtc/RTCVideoView-class.html" rel="noopener noreferrer"&gt;an &lt;code&gt;RTCVideoView&lt;/code&gt; object&lt;/a&gt;, which takes in our &lt;code&gt;localRenderer&lt;/code&gt; object and is constrained by a container.&lt;br&gt;
&lt;br&gt;
 &lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Container(
  margin: const EdgeInsets.all(30),
  constraints: const BoxConstraints(
      minWidth: 100, maxWidth: 1000, maxHeight: 500),
  width: MediaQuery.of(context).size.width,
  height: MediaQuery.of(context).size.height / 1.7,
  decoration:
      const BoxDecoration(color: Colors.black54),
  child: RTCVideoView(localRenderer, mirror: true),
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
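

&lt;p&gt;One detail the snippets above assume: &lt;code&gt;RTCVideoView&lt;/code&gt; can only display frames once its renderer has been created and initialized. A minimal sketch of that setup, assuming the same &lt;code&gt;flutter_webrtc&lt;/code&gt; package inside a widget's &lt;code&gt;State&lt;/code&gt; class (exact placement may differ in your app):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;final RTCVideoRenderer localRenderer = RTCVideoRenderer();

@override
void initState() {
  super.initState();
  // The renderer must be initialized before it is handed to RTCVideoView.
  localRenderer.initialize();
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;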

&lt;h4&gt;
  
  
  Sharing the Real-time Stream
&lt;/h4&gt;

&lt;p&gt;With the stream authenticated and live, we want to share our content with the world. We can do this via a URL formatted with the &lt;code&gt;streamName&lt;/code&gt; and &lt;code&gt;accountID&lt;/code&gt; we collected as inputs. Using the example app as a template, we can create a function called &lt;code&gt;shareStream&lt;/code&gt; which formats the share URL and copies it to the clipboard.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;void shareStream() {
    Clipboard.setData(ClipboardData(
        text:
            "https://viewer.millicast.com/?streamId=${accID.text}/${streamName.text}"));
    ScaffoldMessenger.of(context).showSnackBar(const SnackBar(
      backgroundColor: Colors.grey,
      content: Text('Stream link copied to clipboard.'),
    ));
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Unpublishing a WebRTC Stream
&lt;/h4&gt;

&lt;p&gt;To unpublish the stream, we call &lt;code&gt;stop&lt;/code&gt; on the publish object returned from our asynchronous &lt;code&gt;publishConnect&lt;/code&gt; function, closing the connection with the &lt;a href="http://dolby.io/" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; server.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;publish.stop();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
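

&lt;p&gt;Beyond stopping the publish, it is good practice to release the capture devices and the renderer once the user is done streaming. A sketch of that cleanup, reusing the &lt;code&gt;stream&lt;/code&gt; and &lt;code&gt;localRenderer&lt;/code&gt; objects from above (exact teardown may vary by SDK version):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Stop each capture track so the camera and microphone are released.
for (final track in stream.getTracks()) {
  track.stop();
}
// Free the renderer's native resources.
await localRenderer.dispose();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;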



&lt;h4&gt;
  
  
  Flutter 3 is Truly Cross-Platform
&lt;/h4&gt;

&lt;p&gt;The power of Flutter is taking one codebase and having it work across multiple platforms. Here we can see examples of the app working on Android, the web, and Windows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysoly1b5pkvtndib0hoj.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fysoly1b5pkvtndib0hoj.jpg" alt="An example of the Flutter real-time streaming app launching on an Android emulator." width="463" height="858"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgvnej5uyreo2073efvp.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsgvnej5uyreo2073efvp.jpg" alt="An example of the Flutter real-time streaming app launching as a web app." width="800" height="617"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4a6reoog8tnjzbo6qzyh.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4a6reoog8tnjzbo6qzyh.jpg" alt="An example of the Flutter real-time streaming app launching as a Windows Native app." width="800" height="1030"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Building in this cross-platform framework saves both time and resources, letting you start building real-time streaming apps without worrying about which platform works for your users. These apps are perfect for streaming live and virtual events to the widest range of audiences, enabling high-quality interactive experiences. If you are interested in learning more about our Flutter streaming SDK, &lt;a href="https://docs.dolby.io/streaming-apis/docs/flutter" rel="noopener noreferrer"&gt;check out our documentation&lt;/a&gt; and play around with the full project on &lt;a href="https://github.com/dolbyio-samples/blog-streaming-flutter-app/tree/main/streaming_app" rel="noopener noreferrer"&gt;this GitHub repository&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Feedback or Questions? Reach out to the team on &lt;a href="https://twitter.com/DolbyIO?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;, &lt;a href="https://www.linkedin.com/company/dolbyio/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, or via our &lt;a href="https://www.millicast.com/contactus/" rel="noopener noreferrer"&gt;support desk&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>flutter</category>
      <category>webdev</category>
      <category>api</category>
      <category>android</category>
    </item>
    <item>
      <title>Dolby.io Real-time Streaming Wins Best in Media APIs Award</title>
      <dc:creator>Jayson DeLancey</dc:creator>
      <pubDate>Sat, 01 Oct 2022 01:33:19 +0000</pubDate>
      <link>https://forem.com/dolbyio/dolbyio-real-time-streaming-wins-best-in-media-apis-award-30mm</link>
      <guid>https://forem.com/dolbyio/dolbyio-real-time-streaming-wins-best-in-media-apis-award-30mm</guid>
      <description>&lt;p&gt;We're excited to share that Dolby.io's &lt;a href="https://dolby.io/products/real-time-streaming/" rel="noopener noreferrer"&gt;Real-time Streaming&lt;/a&gt; has won the &lt;strong&gt;Best in Media APIs&lt;/strong&gt; category at the &lt;strong&gt;2023 API Awards&lt;/strong&gt;. With Dolby.io, we provide a WebRTC-based real-time streaming service to enable sub-second latency, broadcast-quality color and sound, global scale, and end-to-end encryption — all with native support for web browsers and Internet-enabled devices.&lt;/p&gt;

&lt;h2&gt;
  
  
  Real-time Streaming with Sub-second Latency
&lt;/h2&gt;

&lt;p&gt;Latency is the time it takes for a live stream to get from the point of capture, such as a live feed from a video camera, to the device where the stream is being viewed. Latency is important whenever this glass-to-glass video feed is time-sensitive and viewers must receive the content in a short and predictable amount of time. This is often the case when trying to closely coordinate audio/video with an underlying data messaging service.&lt;/p&gt;

&lt;p&gt;Real-time video streaming is especially important for live events, sports, sports betting, gaming, online auctions, AR/VR virtual worlds, remote production, live streaming, and other scenarios where time-to-delivery is mission-critical for creating live human interaction. These experiences must take less than 1 second from source to viewer to be effective. For 2-way interactivity the target is even faster: 100–500 milliseconds, to support a full round trip.&lt;/p&gt;

&lt;p&gt;With Real-time Streaming you are in control of the experience for your audience. You can broadcast to massive audiences with immersive and interactive applications that rival the real world. The Streaming APIs have integrations for &lt;a href="https://docs.dolby.io/streaming-apis/docs/using-obs" rel="noopener noreferrer"&gt;OBS&lt;/a&gt;, &lt;a href="https://docs.dolby.io/streaming-apis/docs/player-plugin" rel="noopener noreferrer"&gt;Unreal Engine&lt;/a&gt;, and &lt;a href="https://docs.dolby.io/streaming-apis/docs/client-sdks" rel="noopener noreferrer"&gt;Client SDKs&lt;/a&gt; to support whatever platforms you want to build.&lt;/p&gt;

&lt;p&gt;Most folks will broadcast from &lt;a href="https://dev.to/dolbyio/top-reasons-for-updating-to-the-new-obs-300-release-g6m"&gt;Open Broadcast Software (OBS) Studio&lt;/a&gt; and drop an &lt;a href="https://docs.dolby.io/streaming-apis/docs/hosted-viewer" rel="noopener noreferrer"&gt;embedded hosted player&lt;/a&gt; in their web application like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;iframe&lt;/span&gt;
    &lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"https://viewer.millicast.com?streamId=accountId/streamName"&lt;/span&gt;
    &lt;span class="na"&gt;allowfullscreen&lt;/span&gt;
    &lt;span class="na"&gt;width=&lt;/span&gt;&lt;span class="s"&gt;"640"&lt;/span&gt;
    &lt;span class="na"&gt;height=&lt;/span&gt;&lt;span class="s"&gt;"480"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/iframe&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Dolby.io Best in Media APIs Award
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://bit.ly/3LwnSYz" rel="noopener noreferrer"&gt;2023 API Awards&lt;/a&gt; celebrate the technical innovations, adoption, and reception by leading APIs &amp;amp; Microservices used by the global developer community.  The Awards Ceremony will be presented during &lt;a href="https://bit.ly/3LwnSYz" rel="noopener noreferrer"&gt;API World 2023&lt;/a&gt;, the world's largest API &amp;amp; Microservices conference with over 4,000 attendees.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“&lt;strong&gt;Dolby.io's Real-time Streaming&lt;/strong&gt; win here at the 2023 API Awards is evidence of their leading role in the growth of the global API ecosystem,” said Jonathan Pasky, Executive Producer &amp;amp; Co-Founder of DevNetwork, producer of API World &amp;amp; the 2023 API Awards.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The 2023 API Awards received hundreds of nominations, and the Advisory Board to the API Awards has selected Dolby.io based on three criteria: &lt;br&gt;
(1) attracting notable attention and awareness in the API industry; &lt;br&gt;
(2) general regard and use by the developer &amp;amp; engineering community; and &lt;br&gt;
(3) being a technical leader in its sector for innovation. &lt;/p&gt;

&lt;h2&gt;
  
  
  Celebrate with Dolby.io at API World
&lt;/h2&gt;

&lt;p&gt;Our team will be attending &lt;a href="https://link.devnetwork.com/DWlmmw3e" rel="noopener noreferrer"&gt;API World 2023&lt;/a&gt; in-person October 24-26 in San Jose to accept the award and we'd love for you and our community to join us to celebrate.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://link.devnetwork.com/DWlmmw3e" rel="noopener noreferrer"&gt;Register now for API World 2023&lt;/a&gt; and claim one of the Free OPEN Passes we can offer for a limited time. Additionally, let me know you'll be there in the comments below then come find me at the event for some exclusive SWAG. &lt;/p&gt;

&lt;p&gt;If you haven't used Dolby.io Real-time Streaming yet, &lt;a href="https://dolby.io/signup" rel="noopener noreferrer"&gt;sign up for an account&lt;/a&gt; and take a look at some of our tutorials and how-to articles for &lt;a href="https://docs.dolby.io/streaming-apis/docs/getting-started" rel="noopener noreferrer"&gt;getting started&lt;/a&gt;. We'd also love to hear more about your experience at an invitation-only private event (space is limited).&lt;/p&gt;

</description>
      <category>media</category>
      <category>streaming</category>
      <category>api</category>
      <category>dolby</category>
    </item>
    <item>
      <title>Learn Babylon.js to Create Your Own 3D Metaverse Environments</title>
      <dc:creator>Dan Zeitman</dc:creator>
      <pubDate>Wed, 28 Sep 2022 04:06:48 +0000</pubDate>
      <link>https://forem.com/dolbyio/learn-babylonjs-to-create-your-own-3d-metaverse-environments-3o53</link>
      <guid>https://forem.com/dolbyio/learn-babylonjs-to-create-your-own-3d-metaverse-environments-3o53</guid>
      <description>&lt;p&gt;&lt;a href="https://dolby.io" rel="noopener noreferrer"&gt;Dolby.io&lt;/a&gt; recently sponsored a workshop at CascadiaJS, the Northwest’s premier JavaScript developer conference. That workshop, Learn BabylonJS to Create Your Own 3D Metaverse Environments was quite a popular workshop at the conference. The workshop sold-out early, and was standing room only. And for good reasons. there is a lot of interest in developing for the Metaverse.&lt;/p&gt;

&lt;h2&gt;
  
  
  What was all the excitement about BabylonJS and the Metaverse?
&lt;/h2&gt;

&lt;p&gt;There's been a lot of buzz about building Metaverse experiences in Unreal Engine and Unity, and yes, Dolby.io has you covered with plugins for both platforms. The new excitement for developers is the possibility of building Metaverse experiences using JavaScript. Our workshop focus at &lt;a href="https://2022.cascadiajs.com" rel="noopener noreferrer"&gt;CascadiaJS 2022&lt;/a&gt; was to help developers get started with a fully-featured JavaScript game engine called &lt;a href="https://www.babylonjs.com" rel="noopener noreferrer"&gt;Babylon.js&lt;/a&gt;, then level up the experience with Real-Time Streaming and Spatial Audio.&lt;/p&gt;

&lt;p&gt;Here's what developers had to say about the content we presented and their experience at the Dolby.io BabylonJS Metaverse workshop.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/2EU69qU9DIg"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  No More Fear of Missing Out!
&lt;/h2&gt;

&lt;p&gt;No worries if you could not attend in person.&lt;br&gt;
You can now learn how to get started with BabylonJS through our self-paced workshop, available at &lt;a href="https://github.com/dolbyio-samples/workshop-babylonjs-metaverse" rel="noopener noreferrer"&gt;DolbyIO Samples on our GitHub&lt;/a&gt;.&lt;br&gt;
You'll learn the basics of 3D development, plus how to bring real-time streaming into your own Metaverse experiences. We've developed a full repo of code samples and tutorials to help you get started with BabylonJS and Dolby.io technologies.&lt;/p&gt;

&lt;h2&gt;
  
  
  Stay connected to this project!
&lt;/h2&gt;

&lt;p&gt;Don’t forget to star and bookmark this repo, as we are going to be producing and releasing a companion video series to showcase the possibilities of creating amazing experiences for the Metaverse. Make sure you take a moment to subscribe to our new YouTube channel for video tutorials on the Metaverse, real-time streaming, spatial audio, and a lot more.&lt;/p&gt;

&lt;p&gt;Metaverse Workshop Repo:&lt;br&gt;
&lt;a href="https://github.com/dolbyio-samples/workshop-babylonjs-metaverse" rel="noopener noreferrer"&gt;https://github.com/dolbyio-samples/workshop-babylonjs-metaverse&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unreal Plugins for Live Streaming:&lt;br&gt;
Streaming Player – Play a live stream within Unreal Engine.&lt;br&gt;
&lt;a href="https://docs.dolby.io/streaming-apis/docs/player-plugin" rel="noopener noreferrer"&gt;https://docs.dolby.io/streaming-apis/docs/player-plugin&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Stream Broadcaster – broadcast a virtual camera with Dolby.io&lt;br&gt;
&lt;a href="https://docs.dolby.io/streaming-apis/docs/publisher-plugin" rel="noopener noreferrer"&gt;https://docs.dolby.io/streaming-apis/docs/publisher-plugin&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unity Plugin for real-time communications:&lt;br&gt;
&lt;a href="https://github.com/DolbyIO/comms-sdk-dotnet" rel="noopener noreferrer"&gt;https://github.com/DolbyIO/comms-sdk-dotnet&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unity Plugin Docs:&lt;br&gt;
&lt;a href="https://dolbyio.github.io/comms-sdk-dotnet/documentation/unity.html" rel="noopener noreferrer"&gt;https://dolbyio.github.io/comms-sdk-dotnet/documentation/unity.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Subscribe to our YouTube Channel:&lt;br&gt;
&lt;a href="https://www.youtube.com/c/DolbyIO" rel="noopener noreferrer"&gt;https://www.youtube.com/c/DolbyIO&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>babylonjs</category>
      <category>programming</category>
      <category>gamedev</category>
    </item>
  </channel>
</rss>
