Creating a 24 Hour YouTube Radio Station for Bob Marley on 4/20/20

Using Open Broadcaster Software

“One good thing about music, when it hits you, you feel no pain.”

— Bob Marley

Well, it’s safe to say that a lot has changed around the world since that video premiere in February. With stay-at-home orders now the norm, Harold thought we should create a 24-hour YouTube #StayHome livestream inspired by Chillhop Music. If you aren’t familiar with the format, these are essentially live streaming radio stations in the form of videos. They are typically accompanied by a subtle looping animation and generate a sense of community through their real-time chat. Personally, I’ve tuned into “lofi hip hop radio” to do programming and write case studies like this one. In addition to audio, Harold and UMe also had the ambition to include video content, including segments from their incredible LEGACY series.

Having developed a couple of dynamic YouTube video projects for Trivium and Slipknot, I was under the impression that this would be a simple execution. We would use Open Broadcaster Software (OBS) to play through a playlist of audio and video files. The album playbacks would mimic the setup of a YouTube radio station (music, metadata, looping animation) and videos would transition in and out occasionally. So, I presented three ways I thought we could pull this off.

  1. OBS provides “Scenes,” which let you compose multiple sources such as images, video, and audio. However, you must use an unofficial plugin to automatically switch between scenes, and if you want to display the metadata (song, album, artwork, etc.) on the album playback scenes, you need yet another plugin. Unofficial plugins increase the likelihood that OBS might crash, which would bring your stream down with it.
  2. Another thought was using After Effects and DataClay to pre-generate the album playback videos, and maybe even stitch together the entire broadcast into one 24-hour video. The issue here is that rendering takes time and we didn’t have a whole lot of it once we kicked off development. Also, any change in content would require a re-render.
  3. Both my Trivium and Slipknot campaigns relied on streaming a browser window, so I thought I could simply create a custom web app interface capable of playing through the albums and videos. Being web based meant we would have much more control over all the data, and changes should be very easy to make.

I really liked the idea of the 3rd option because I had several successful projects under my belt using this format, so that’s where I started.

Spoiler: All of these ideas failed.

Web App: Failed

Album playback layout from Figma

It didn’t take long to put together a simple web app in CodePen.IO which could play through a playlist of tracks. You can use Howler as a base for audio playback or simply roll your own player engine using the Web Audio API. You mostly need to keep track of the currently playing track, know when it ends, and skip to the next track to keep the stream going. Once audio playback was in, I built out a simple version of the UI, including the metadata and progress bar. I was then ready to do some streaming tests. What seemed to work well initially had a major flaw on closer inspection: the audio and visuals were wildly out of sync. I can’t remember whether the audio was early or late, but I do remember that OBS’s audio delay function did not help. I believe the issue grew progressively worse over time. I let Harold know and jumped to the next idea.
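For what it’s worth, the core of that web app was just playlist bookkeeping. Here is a minimal sketch of that logic as a plain JavaScript class; the actual Howler/Web Audio wiring is omitted and the track list is made up:

```javascript
// Minimal playlist bookkeeping: track the current item, advance when a
// track ends, and wrap around so the station never stops. In the real app,
// advance() would be subscribed to the player's end-of-track event and
// each file handed off to the audio engine.
class Playlist {
  constructor(tracks) {
    this.tracks = tracks; // e.g. [{ title, file }, ...]
    this.index = 0;
  }
  get current() {
    return this.tracks[this.index];
  }
  advance() {
    this.index = (this.index + 1) % this.tracks.length; // loop forever
    return this.current;
  }
}

// Hypothetical track list for illustration.
const radio = new Playlist([
  { title: 'Is This Love', file: 'is-this-love.mp3' },
  { title: 'No Woman, No Cry', file: 'no-woman-no-cry.mp3' },
]);
```

The UI layer (metadata, progress bar) then just re-renders from `radio.current` whenever the track changes.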

Video Generation: Failed

You may know my After Effects and DataClay setup from my infamous project for Marilyn Manson or my less infamous project for Foo Fighters. The short story is that I was gifted an 18-Core Windows machine from Intel which I use to generate data-driven videos at scale. If you bring in a plugin like RenderGarden (currently free due to COVID-19), you can also increase your rendering speed 2–3x.

I got that all set up and spent a few hours creating a dynamic composition using DataClay for our album playback scenes. In addition to handling all the dynamic text, DataClay has some really nice time sculpting features which can position all of the audio files of an album in a cohesive timeline. I started rendering Legend and spent that time creating a JSON data file for the other 10 records. It took about the duration of the album to render the video, so I did some math… 11 records at 1 hour each would take 11 hours to render. That seemed doable to me and I excitedly started rendering the second record. That’s when my After Effects file corrupted itself. I can’t remember the exact error, but I think DataClay’s time sculpting functionality had arranged the timeline in a way that AE could no longer interpret, and I couldn’t even open the file any longer.

I don’t like it when I can’t fully understand why something isn’t working (here, due to my limited AE knowledge), so I moved on from this solution rather than recreating the dynamic composition.

Scenes: Failed

Somehow, I found myself back at the initial solution. We could create a scene for each album playback and each video. Then, we would use Advanced Scene Switcher to automatically switch between each scene at specified intervals. So, I recreated an album playback scene in OBS with a VLC source and used the Tuna plugin to power the track metadata, artwork, and progress bar. I then added another scene which included one of the Bob Marley videos. Finally, I configured Advanced Scene Switcher to jump to the video once album playback was completed and back to the album when the video ended. I excitedly showed the client how well this worked and decided I would spend the next morning inputting more of our playlist.

It was that morning I realized this setup wasn’t going to work either. Tuna can only be configured to listen to one VLC source at a time, and this setup would require a different VLC source for each album playback scene. That’s when I started to get worried. In general, if we had wanted to simply do the audio playback scene (like Chillhop), things would have been much simpler. It was the fact that we were trying to seamlessly transition between different albums and videos throughout the day which made this complicated. I took a step back and cleared my head. As always, a possible solution came to me in the shower.

One VLC Source + Hacks: Success

Our OBS setup on my Intel machine

🚿 “Wait, can VLC sources play video too?” Turns out they could… Of course they could… I guess this solution was sorta hiding in plain sight, but what if we set up a single VLC Source in a single Scene with a playlist of all of our audio and video files? I quickly set up a demo to see how it might work. The VLC source sits on top of the album playback elements, so they become hidden when a video is playing. The VLC source was able to play an audio file and then play a video. No problem. However, when the video stopped playing, the last frame of the video stayed visible while the audio was playing. This prevented our album playback elements from being seen. That brings us to our first hack.

Hack 1: “Clearing” the Screen

Since the VLC source carries a memory of the last frame of the last video when it switches to audio, what if we created the smallest and shortest video possible? So that’s what I did. I created a 4x4-pixel, two-frame video in After Effects and added it to the playlist before each album playback segment. It appears as if the screen clears when in reality there is a tiny video frame in the top left corner. Now this is important: make sure your VLC source does not have any transforms applied to it. Simply keep it in the top left of the screen and allow it to dynamically resize based on the content it plays. Of course, this might be an issue for videos which are not the same size as your scene (in our case, 1920x1080). Which brings us to our next hack.
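If you end up scripting your queue rather than building it by hand, the ordering rule is easy to express. Here’s a hypothetical helper (file names invented) that drops the clear.mp4 in front of each album segment. As an aside, FFmpeg can also generate the tiny clip itself with something like `ffmpeg -f lavfi -i color=c=black:s=4x4:d=0.1 clear.mp4` if you’d rather skip After Effects:

```javascript
// Build the VLC source playlist: every album block gets the tiny clear.mp4
// in front of it so the previous video's last frame is wiped before
// audio-only playback begins.
function buildQueue(segments) {
  const queue = [];
  for (const seg of segments) {
    if (seg.type === 'album') queue.push('clear.mp4');
    queue.push(...seg.files);
  }
  return queue;
}

// Hypothetical running order: one video segment, then an album playback.
const queue = buildQueue([
  { type: 'video', files: ['legacy-episode-1.mp4'] },
  { type: 'album', files: ['01-natural-mystic.mp3', '02-positive-vibration.mp3'] },
]);
```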

Hack 2: Make All Videos 1920x1080

A lot of our videos were 1920x1080, but those that weren’t needed to be resized to fit within this size so that they covered the album playback elements and fit nicely on the screen. Using FFmpeg, we can scale each video to fit within 1920x1080 and then pad any remaining space with black bars. You can bulk resize an entire folder of MP4 videos with the following command on Windows:

for /f "tokens=1 delims=." %a in ('dir /B *.mp4') do "C:/Program Files/ffmpeg.exe" -i "%a.mp4" -vf "scale=1920:1080:force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2" -crf 29 "encoded/%a.mp4"

While I’m on the topic of FFmpeg, I also put all the audio files in the same format by converting the WAVs I was provided to MP3 files. Here’s an FFmpeg command which can do that too:

for /f "tokens=1 delims=." %a in ('dir /B *.wav') do "C:/Program Files/ffmpeg.exe" -i "%a.wav" -b:a 320k "mp3/%a.mp3"

I then manually added the title, album name, and artwork to each of these MP3 files using Fission, since those tags are what power the Tuna plugin’s display.
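If you’d rather script the tagging than click through each file, FFmpeg can write ID3 tags as well. Here’s a hypothetical Node helper that only builds the argument list (song data invented; you’d hand the array to something like `child_process.execFile`):

```javascript
// Build FFmpeg arguments that rewrite an MP3's ID3 tags without re-encoding.
// Tuna reads these tags (title, album, artist) to render the on-screen metadata.
function tagArgs(input, output, { title, album, artist }) {
  return [
    '-i', input,
    '-c', 'copy', // copy the audio stream untouched; only the tags change
    '-metadata', `title=${title}`,
    '-metadata', `album=${album}`,
    '-metadata', `artist=${artist}`,
    output,
  ];
}

const args = tagArgs('01.mp3', 'tagged/01.mp3', {
  title: 'Jamming',
  album: 'Exodus',
  artist: 'Bob Marley & The Wailers',
});
```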

Hack 3: Clear the VLC Artwork Cache (Manually)

Speaking of Tuna, I noticed on our last round of testing that some artwork images were coming up wrong. Apparently, VLC has had a known bug for many years which associates artwork to songs simply based on title. So, the song “Talkin’ Blues” may come up with the artwork for Natty Dread when playing back the album Talkin’ Blues. The only way around this at the moment is to manually clear the artwork cache, which I fucking did for 24 hours. Here’s where you can find it on Windows:



In general, this setup worked nicely for us and we had a super successful day. However, it isn’t perfect. Managing 24 hours’ worth of content using the built-in VLC source sucks. It takes about an hour to build out the playlist queue. While OBS provides a “Studio Mode” for making real-time adjustments, it doesn’t really work well with the VLC source. We wanted to add some new videos while broadcasting, and this required an abrupt change in the stream rather than a smooth transition. I think the VLC source should clear itself when a video completes; I suspect the last frame lingers simply because audio doesn’t have a size, so the source doesn’t know what to resize to.

Additional Thoughts

Preparing for The Worst

Live streaming is a high stakes game, and I ran our 4/20 stream from the Intel computer which sits next to my desk and runs on home internet. I noticed that bad weather was in the forecast for Sunday night and decided I would also configure a virtual machine on MacStadium to stream from, just in case my power went out. Would you believe that when I tested things on Saturday (the day before the stream) we had a small storm and my power DID go out? I was glad to have this backup for our launch, but I did notice that the Mac version of OBS was much more prone to crashing. In the future, I would likely look into setting up a virtual Windows machine instead if the weather calls for it.

OBS Websocket Plugin

OBS actually has a WebSocket API, provided by the obs-websocket plugin, which can be used to listen for and emit events to OBS from an external app, like a website. I almost went down this rabbit hole on our project but decided to keep things as simple as possible given our timeline. However, I’m super excited at the possibility of building a controller for OBS using a platform I’m familiar with.
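For the curious: obs-websocket 4.x (current as of this writing) speaks plain JSON over a WebSocket, where each request carries a request-type and a client-chosen message-id that the response echoes back. Here’s a sketch of building a scene-switch request; request names follow the 4.x protocol, and the scene name is made up:

```javascript
// Each obs-websocket 4.x request is a JSON object with a "request-type"
// and a unique "message-id" used to match the response to the request.
let nextId = 0;
function sceneSwitchRequest(sceneName) {
  return JSON.stringify({
    'request-type': 'SetCurrentScene', // 4.x request name
    'message-id': String(++nextId),
    'scene-name': sceneName,
  });
}

// You'd send this over a WebSocket connected to ws://localhost:4444
// (the plugin's default port), or let a library like obs-websocket-js
// handle the plumbing for you.
const msg = sceneSwitchRequest('Album Playback');
```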



Being able to help broadcast Bob Marley and his family’s art was a dream. This was a major effort and great care was taken at every step including content management, art direction, and execution. Thanks to Joey for designing the incredible video for our album playback segments. Thanks to Harold, UMe, and Tuff Gong for hiring me and helping wrangle over 24 hours of Marley content. And thanks to Bob Marley, whose music and message is more relevant now than ever.

Written by

I develop websites for rock 'n' roll bands and get paid in sex and drugs. Previously Silva Artist Management, SoundCloud, and Songkick. Currently: Available
