Prototyping a Video Sequencer Tool for Mediathread

The Mediathread team has been laying out plans for a video sequencing tool to be used as part of Mediathread, and now it's time to actually start putting it together.

The idea for the tool is to have two side-by-side visual tracks playing at the same time. The left track displays a video playing from start to finish. The right track displays videos, images, or pieces of text, which the user queues up in a timeline interface. YouTube's annotation editor is the closest thing I've seen on the web to what we want to build, but I'm sure there are others that I'm not aware of.

For the past few months I'd been thinking I'd just develop this with rigorously structured plain JavaScript, using a small collection of libraries instead of an actual framework. As I got closer to the prototyping stage, though, I kept thinking about all the little pieces of data and interface cues that I'll need to keep in sync, much like the WeatherRoulette game in WACEP. Obviously, I want this tool to be stable, and I've had luck with Ember in the past when working on interactive front-end applications.

I was pretty sure I didn't want to use Ember, though. I like Ember, but it's a Rails-style "opinionated" framework, which is great when starting a new project from scratch, but less so here: I want this video tool to be tightly integrated into Mediathread. I talked to someone at PyCon in Portland who reminded me that React strives to be easily integrated into any web stack. It's really just a way of organizing an application into components in a declarative fashion.
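To illustrate what "declarative components" means here, a minimal plain-JavaScript sketch of the pattern (not actual React code; the component and prop names are hypothetical stand-ins for the video tool's pieces): each component is a pure function from props to markup, and the whole view is recomputed from application state rather than patched imperatively.

```javascript
// A plain-JavaScript sketch of React's declarative component idea.
// Each "component" is a pure function from props to markup, and the
// UI is a function of application state.
// All names here (VideoTrack, TimelineTrack, Sequencer) are hypothetical.

function VideoTrack(props) {
  return '<div class="video-track">playing: ' + props.videoUrl + '</div>';
}

function TimelineTrack(props) {
  var items = props.items
    .map(function(item) { return '<li>' + item.title + '</li>'; })
    .join('');
  return '<ul class="timeline-track">' + items + '</ul>';
}

function Sequencer(state) {
  // The whole view is recomputed from state; nothing is
  // updated in place.
  return VideoTrack({videoUrl: state.videoUrl}) +
         TimelineTrack({items: state.timelineItems});
}
```

When the state changes, you call `Sequencer` again with the new state and get the updated view; React's contribution is doing this efficiently and automatically.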

So I've bitten the bullet and started experimenting with React. My learning process was:

  1. Read "Thinking in React" and skim the React documentation.
  2. Download and try out the code in the React Starter Kit to see how the pieces fit together, what's required to deploy the application, and what options I have for that process.
  3. Copy one of the example apps to a new directory, create some components for the video tool, and get everything communicating to produce the behavior I want.
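As a sketch of that last step, here's the communication pattern in plain JavaScript (standing in for React, whose setState mechanics are simplified away; all names are hypothetical): state lives in a single parent, which passes data down to children and receives changes back through callbacks.

```javascript
// Sketch of React-style parent/child communication in plain JavaScript.
// The parent owns the state; child components receive data and
// callbacks ("props") and never mutate shared state directly.
// Names (SequencerApp, addTimelineItem) are hypothetical.

function SequencerApp() {
  // Application state lives in one place, at the top.
  var state = {timelineItems: []};

  function render() {
    return state.timelineItems
      .map(function(item, i) { return (i + 1) + '. ' + item.title; })
      .join('\n');
  }

  // Callback handed down to child components. After every state
  // change the view is re-rendered from state, never patched by hand.
  function addTimelineItem(title) {
    state.timelineItems.push({title: title});
    return render();
  }

  return {render: render, addTimelineItem: addTimelineItem};
}
```

In real React the re-render would be triggered by setState rather than returned explicitly, but the shape is the same: data flows down, events flow up.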

The results of my final step are in a GitHub repo here, including tools I've so far only had peripheral knowledge of, like Babel, webpack, and JSX.

My process has been productive. I can reuse the basic structure of my prototype whether or not I stick with React for the rest of development. I've also been reminded how nice it is to work with a framework that automatically updates template variables based on the state of the system, as Ember does but Backbone doesn't. So I plan to continue with React unless I run into major problems.

I'm developing my prototype outside of Mediathread, keeping in mind that I'll need to bring it back into the larger application. In order to queue up new pieces of media, I'll need to call Mediathread's collection pop-up to display the user's current collection.

I will probably need to use Mediathread's SherdJS abstraction layer, which talks to the YouTube and Vimeo APIs through a uniform interface. Interestingly, it looks like you can, with some minor hacks, play Vimeo and YouTube videos with the `<video>` tag! I haven't tried this myself, but if it worked consistently, it would eliminate the work and complexity of connecting SherdJS to our new juxtaposition tool.

I've added an iframe to the prototype below that I'll keep up to date as I make progress.