A little over two years ago, Khruangbin and I released an application called AirKhruang which allowed you to generate a Spotify playlist for an upcoming flight. The app was a hit largely because Khruangbin themselves provided the pool of songs and they have impeccable worldly taste. Flash forward to the beginning of this year, the band and I had some preliminary conversations about building a V2 of the concept sometime in Spring in support of their now announced, upcoming new record, Mordechai. Well… I don’t need to tell you how many things, including air travel, have changed since then. So, last month I pitched a remix of the concept which instead created playlists for all of the activities we are now doing while #StayingHome. Khruangbin agreed and snapped back with the perfect title: “Shelter In Space.”
The app is now live at space.airkhruang.com. Head there to generate a Spotify or Apple Music playlist for one of fourteen activities at a duration of your choosing. Continue reading to learn about how it came together.
A New Stack
Two years ago, I was a very different developer. AirKhruang was built using Backbone and sits on a Heroku server. It has a complicated authentication system with a database for users, and the pool of songs is hard-coded into the application. 🤯 Only a few months later, I would build my first Vue.js application for Guns N’ Roses and then my first Nuxt.js app for A Quiet Place. Before the end of that same year, I would deploy my first app on Netlify. I haven’t used Backbone or Heroku since.
I now almost exclusively build my client applications using Nuxt.js and host them on Netlify. In general, the way Vue.js allows me to break all of the components of a web app down into manageable bits is a dream, as I’m usually only focused on a few key elements. I have also forgone databases in most cases and now use Spotify’s Implicit Grant flow to authenticate users without storing any of their info. Finally, I have added Apple’s MusicKit JS to the mix, bringing in another huge ecosystem of users.
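For a sense of why the Implicit Grant flow removes the need for a user database: the whole exchange happens client-side via a redirect, and the access token comes back in the URL hash. Here is a rough sketch, assuming placeholder client ID and redirect URI values and helper names of my own invention:

```javascript
// Sketch of a client-side Implicit Grant redirect to Spotify.
// buildAuthorizeUrl and parseTokenFromHash are my own helper names;
// the endpoint and response_type=token are Spotify's documented flow.
function buildAuthorizeUrl({ clientId, redirectUri, scopes }) {
  const params = new URLSearchParams({
    client_id: clientId,
    response_type: 'token', // Implicit Grant: token arrives in the URL hash
    redirect_uri: redirectUri,
    scope: scopes.join(' '),
  });
  return `https://accounts.spotify.com/authorize?${params}`;
}

// After the redirect back, the token lives in window.location.hash,
// e.g. "#access_token=...&token_type=Bearer&expires_in=3600".
function parseTokenFromHash(hash) {
  return new URLSearchParams(hash.replace(/^#/, '')).get('access_token');
}
```

Because the token only ever lives in the browser, there is nothing to persist server-side, which is what lets the app skip the database entirely.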
That hard-coded pool of songs has been replaced by Contentful so the client can maintain the songs easily, and we can also integrate them into future applications, like AirKhruang 2, when the time is right.
One thing that hasn’t changed much is how we’re handling the playlist creation algorithm. We’re still using Spotify’s track audio features to understand measurements such as tempo, danceability, and loudness for each of our songs. I added each of these features to a spreadsheet alongside all of the activities and then rated them on a relevancy scale of low (blue), neutral (gray), and high (red). For example, the activity of “Meditation” would likely have a low `loudness`. “Reading” might require high `instrumentalness` and low `speechiness` to receive tracks without singing. And you probably want a high `danceability` when “Dressing Up.” It’s an abstract and opinionated exercise, but it leaves you with a nice set of data which you can use to score and pull appropriate tracks from the pool.
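To make that scoring idea concrete, here is a minimal sketch of how low/neutral/high ratings could translate into track scores. The profiles, weights, and track data are illustrative, not the production algorithm, and the features are assumed pre-normalized to a 0–1 scale (Spotify actually reports `loudness` in dB, so it would need normalizing first):

```javascript
// Hypothetical activity profiles built from the spreadsheet ratings.
const activityProfiles = {
  meditation: { loudness: 'low', energy: 'low', instrumentalness: 'high' },
  dressingUp: { danceability: 'high', energy: 'high' },
};

// Map a low/neutral/high rating to a target value on the 0–1 feature scale.
const target = { low: 0.1, neutral: 0.5, high: 0.9 };

function scoreTrack(features, profile) {
  // A track scores higher the closer its features sit to the activity's targets.
  let score = 0;
  for (const [feature, rating] of Object.entries(profile)) {
    if (feature in features) {
      score += 1 - Math.abs(features[feature] - target[rating]);
    }
  }
  return score;
}

// Illustrative pool of tracks with pre-normalized audio features.
const tracks = [
  { name: 'Quiet Drone', features: { loudness: 0.1, energy: 0.15, instrumentalness: 0.95 } },
  { name: 'Disco Banger', features: { loudness: 0.9, energy: 0.9, instrumentalness: 0.1 } },
];

// Rank the pool for an activity; the playlist takes the best matches.
const ranked = [...tracks].sort(
  (a, b) =>
    scoreTrack(b.features, activityProfiles.meditation) -
    scoreTrack(a.features, activityProfiles.meditation)
);
```

With a scheme like this, “Meditation” would pull the quiet, instrumental end of the pool first, while “Dressing Up” would surface the danceable tracks.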
I knew the design of our app would come from old sci-fi movies or television but I wasn’t sure exactly from where. I began by mood boarding terminal screens from 2001: A Space Odyssey and Alien. That led to the first two designs you see above. My thinking was a sort of high-design future hospitality interface you might find aboard the Discovery One. I also had thoughts regarding the handling of activity images and I couldn’t stop thinking about the constellation-driven skill tree of Skyrim.
I presented my ideas to Mark from the band and he brought a sci-fi reference of his own: the 1980s BBC television adaptation of The Hitchhiker’s Guide to the Galaxy. In particular, the animations which were shown during the guide definition sequences. Whereas I was thinking sophisticated, Mark felt the absurdity and fun of the Hitchhiker sequences would fit better. I agreed!
I took quick stock of some of the key elements of the sequences:
- Old, slightly out-of-focus monitor
- Limited color palette
- Auto-typing phrases
- Vintage 3D graphics
- Simple illustrations
- Vertical scanning transitions
The real trick was going to be recreating these features while still building a highly functional, responsive web app. I took it one step at a time and started by simply applying a CSS blur to some text.
That single property really made things look like an old monitor and that gave me the confidence to start working on components.
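The original styles aren’t reproduced here, but the effect can be as small as one declaration; the class name, color, and radius below are my own illustrative values:

```css
/* A slight blur makes crisp web text read like a phosphor CRT monitor */
.crt-text {
  color: #ffb000;
  filter: blur(0.6px);
  /* a soft text-shadow glow helps where sub-pixel blur alone is too subtle */
  text-shadow: 0 0 4px rgba(255, 176, 0, 0.8);
}
```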
The captions on the Hitchhiker videos were typed out as the narrator spoke them. Typically, I would use Typed.js for something like that, but I knew I needed to separately style the last character, which is something Typed does not offer a solution for out of the box. Instead, I decided to roll my own Vue.js component which I could include anywhere I required auto-typed text. My solution uses a `setInterval` method to increment an index which is then used to compute which characters of a phrase should be shown. Once the index met the total phrase character length, I could clear the interval and stop the animation. Here’s what that looks like.
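The original snippet isn’t reproduced here, so this is a reconstruction of the approach described above: an interval increments an index, a computed slice of the phrase decides what’s visible, and the interval is cleared once the whole phrase is shown. Function and property names are my own; in the actual Vue component the slice would be a computed property and the interval would start in `mounted()`.

```javascript
// Pure helper: which part of the phrase is visible at a given index,
// with the final typed character split out so it can be styled separately
// (e.g. wrapped in a .end span in the component template).
function typedSlice(phrase, index) {
  const visible = phrase.slice(0, index);
  return {
    text: visible.slice(0, -1),
    end: visible.slice(-1),
  };
}

// Drive the animation: bump the index on each tick, report the slice,
// and clear the interval once every character has been shown.
function startTyping(phrase, onTick, speed = 50) {
  let index = 0;
  const interval = setInterval(() => {
    index += 1;
    onTick(typedSlice(phrase, index));
    if (index >= phrase.length) clearInterval(interval);
  }, speed);
  return interval;
}
```

Splitting the visible text into `text` and `end` is what makes the separately styled last character possible, which was the whole reason for skipping Typed.js.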
From here, you can style the `.end` character differently.
My favorite element of the Hitchhiker videos are the vertical scanner animations which transition between sections and images. Content is transitioned via a simultaneous vertical swipe while a scanner line brightens both the leaving and entering content at the point of intersection. Watch above to see what I’m describing.
Let’s start with the scanner line. My brain went in all sorts of directions when tackling this one, from gradient animations to pre-rendering. Then I remembered the CSS property `backdrop-filter`. The `backdrop-filter` property allows you to apply the same effects you get from the `filter` property, but to the elements behind the styled div. So, I created a `#scan` div that would represent the scanning line and gave it a `backdrop-filter` that heavily brightens anything beneath it.
This worked well because our background was pure black, so everything that wasn’t black brightened, creating a believable scanning effect.
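As a rough sketch, the scanner line looked something along these lines; the sizes, brightness factor, and timing are illustrative rather than the shipped values:

```css
/* The scanning line: a thin strip that brightens whatever renders
   beneath it as it sweeps down the screen. */
#scan {
  position: absolute;
  left: 0;
  width: 100%;
  height: 4px;
  backdrop-filter: brightness(3);
  animation: sweep 1s linear forwards;
}

/* Animate the strip vertically across the full height of the container */
@keyframes sweep {
  from { top: 0; }
  to   { top: 100%; }
}
```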
I then needed to handle the vertical swipe which would show the image from the top or hide the image from the bottom. For this, I decided to use an inset `clip-path`. The `clip-path` CSS property allows you to define clipping regions in a bunch of shapes. We just needed a plain ol’ rectangle. For example, if I wanted to clip a div 50% from the bottom, I would do this:

```css
clip-path: inset(0 0 50% 0);
```

What about 75% from the top?

```css
clip-path: inset(75% 0 0 0);
```
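Animating between those insets is what produces the vertical swipe. A sketch of the entering half, assuming transition classes applied by Vue’s `<transition>` element (the `swipe` name is mine, and the `-enter-from`/`-enter-to` suffixes are Vue 3’s naming; Vue 2 used `.swipe-enter` instead of `.swipe-enter-from`):

```css
/* Entering content is revealed top-to-bottom by shrinking the bottom inset */
.swipe-enter-active {
  transition: clip-path 0.6s linear;
}
.swipe-enter-from {
  clip-path: inset(0 0 100% 0); /* fully hidden */
}
.swipe-enter-to {
  clip-path: inset(0 0 0 0); /* fully revealed */
}
```

Running the leaving content through the mirrored animation at the same time, with the `#scan` line tracking the reveal edge, gives the simultaneous swipe-and-brighten effect from the Hitchhiker videos.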
Writing this case study, I can already see some improvements I would like to make, such as using a pseudo-element rather than requiring the extra `#scan` div. This is basically what I went live with, though.
I don’t typically introduce much 3D work into my projects but I have used Three.js in the past and knew it would be a good fit to build both the spinning wireframe globe and star field background. Thanks to a couple of informative tutorials, both of these elements came together quickly and you can check out their final executions in this CodePen.IO collection.
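For flavor, a browser-only sketch of what a spinning wireframe globe looks like in Three.js; this assumes the `three` package is available and uses illustrative sizes and colors, not the production scene:

```javascript
import * as THREE from 'three';

// Basic scene, camera, and transparent renderer filling the viewport.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer({ alpha: true, antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// A low-segment sphere with a wireframe material reads as a vintage vector globe.
const globe = new THREE.Mesh(
  new THREE.SphereGeometry(2, 24, 16),
  new THREE.MeshBasicMaterial({ color: 0xffb000, wireframe: true })
);
scene.add(globe);

// Spin the globe slowly on each animation frame.
(function animate() {
  requestAnimationFrame(animate);
  globe.rotation.y += 0.005;
  renderer.render(scene, camera);
})();
```

Keeping the segment counts low is what gives the geometry that chunky, early-CG look rather than a smooth modern sphere.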
Khruangbin is one of those bands who can truly transport you to anywhere in the world and I know that’s exactly what I needed these past few weeks. Maybe you feel the same? Thanks to Nate Utesch for getting the activity illustrations done. Thank you to Secretly Group and Dead Oceans for hiring me and making the whole process seamless. Finally, thanks to Khruangbin (especially Mark) for being incredible curators and collaborators. I can’t wait to take flight on AirKhruang 2 when it is safe to do so.