By Daniel Ockeloen

Daniel works for an SME called Noterik B.V. in Amsterdam, where he is involved with building video applications for online use. He started programming in 1981 and moved from building games to online applications, with a passion for building tools for content creators. He worked for TV stations for about 10 years and is now involved in the European education and oral-history projects that form the core focus of the company. He was invited to OKFest because of his involvement with EUscreen.

Overview

As a programmer I am limited in how much I can build during a workshop like this, so in preparing for it I had a talk with Kati Hyyppä and Sanna Marttila and decided to show, and try to implement, an example of a topic I am interested in: researching whether we can change the way a video montage is made. In the old way, video was cut in an editing program by the editor, who would convert the raw footage into a story in a fixed way. For example, a documentary might have the following structure: opening, statement, example, example, statement, example, example, commentary and conclusion. The question we have been playing with is: instead of cutting the video in a fixed way, can we tag videos in such a way that the “montage” happens not at the moment of production but at the moment of consumption, without losing the creator’s power to tell her/his story?

Material used / input

Since some preparation was needed for the Open Video Make Session, I picked a video from Open Images called “California Dreaming” by Bregtje van der Haak. It is available under a Creative Commons license and was produced by VPRO, one of the Dutch national broadcasters I used to work for.

During the discussion phase of the open video workshop, two ideas came up that I decided to use as examples to show the basic concept of a dynamic montage. One was the theme of horror movies and moods; the other, suggested by Sanna Marttila, involved the Kallio Archive and showing the weather at a given place and time as a video montage instead of telling you the weather directly (so you watch what the weather is like). I decided to combine both, even though my preselected video didn’t really fit the topics. I hoped the ideas would come across strongly enough in the presentation at the end of the day that people would not mind.

Steps & tips

I used our own platform (called Springfield), which we are preparing for open source and use in many of our projects, to define the different tagging layers.

Chapter Tagging

I added several layers to cover the two types of request, and ended up with four tagging layers: Mood, Season, TimeOfDay and Weather. Then, with some help from the group, we defined the options each layer would need and added them to the system. The mood editor ended up like this:

Mood editor
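
Springfield’s internal layer format is not shown here, but conceptually each layer is just a name with a fixed vocabulary of options. A minimal sketch in Python (the four layer names are the ones we used; the option lists and the data structure itself are illustrative, not Springfield’s actual API):

    # Each tagging layer is a name plus a fixed vocabulary of options.
    # The layer names come from the workshop; the option lists are
    # illustrative guesses, not the exact values we entered.
    LAYERS = {
        "Mood":      ["happy", "sad", "scary", "tense", "calm"],
        "Season":    ["spring", "summer", "autumn", "winter"],
        "TimeOfDay": ["morning", "afternoon", "evening", "night"],
        "Weather":   ["sunny", "cloudy", "rain", "snow", "fog"],
    }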

Next I added the layers for the other example (weather), and we ended up with an editor that could apply time-based tags to the whole video. I put in some time tagging parts of the video this way, and we ended up with something like this:

All the layers after tagging
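
In data terms, each mark in those layers is simply a layer value with a start and end time on the video. A hypothetical sketch of what such a tag list could look like (the Tag structure and the times are my illustration, not Springfield’s internal format):

    from dataclasses import dataclass

    @dataclass
    class Tag:
        layer: str    # one of the layer names above
        value: str    # one of that layer's options
        start: float  # seconds into the source video
        end: float

    # Tags on different layers may overlap freely; the times here
    # are invented for illustration.
    tags = [
        Tag("Season",    "autumn",    12.0, 180.0),
        Tag("TimeOfDay", "afternoon", 12.0,  95.0),
        Tag("Weather",   "sunny",     12.0,  48.0),
        Tag("Mood",      "happy",     30.0,  60.0),
    ]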

As you can see, these layers can overlap, and they form the base of metadata we need to generate a montage on the fly later. The next part I built was the output filter: basically the part that, when the user hits play with a given scenario, generates the correct video playlist. So, for example, “show me the weather in Helsinki now, and I am in a good mood” would use as input 18 September 2012, Helsinki, 15:00 and the mood “happy”.

Playout with dynamic playlist
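
What follows is not Springfield’s actual playout code, just a minimal sketch of the filtering idea, reusing the hypothetical Tag list above: the scenario is first mapped to one wanted value per layer (a real version would look the weather up for the given place and time), and the playlist is then every stretch of the video where all of those values hold at once.

    def scenario_to_query(month, hour, weather, mood):
        """Map a concrete scenario to one wanted value per layer.
        (A real version would query a weather service for place/time.)"""
        season = {3: "spring", 4: "spring", 5: "spring",
                  6: "summer", 7: "summer", 8: "summer",
                  9: "autumn", 10: "autumn", 11: "autumn"}.get(month, "winter")
        time_of_day = ("morning" if hour < 12 else
                       "afternoon" if hour < 18 else "evening")
        return {"Season": season, "TimeOfDay": time_of_day,
                "Weather": weather, "Mood": mood}

    def playlist(tags, query):
        """Return (start, end) clips where every queried layer matches."""
        clips = None
        for layer, value in query.items():
            spans = [(t.start, t.end) for t in tags
                     if t.layer == layer and t.value == value]
            if clips is None:
                clips = spans
            else:  # keep only the overlap with this layer's spans
                clips = [(max(a, c), min(b, d))
                         for a, b in clips for c, d in spans
                         if max(a, c) < min(b, d)]
        return clips or []

    # 18 September 2012, Helsinki, 15:00, mood "happy"
    query = scenario_to_query(month=9, hour=15, weather="sunny", mood="happy")
    print(playlist(tags, query))  # -> [(30.0, 48.0)] with the tags above

Each resulting (start, end) pair becomes one entry in the playlist that the player then jumps through on the fly.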

Presentation

During the hackday I talked to several people about what would be possible, but since we had only a couple of minutes to present our work at the end of the day, I decided to create two screencasts showing the result. One shows the tagging tools and one the end result. You can find them here:

Tagging: http://images1.noterik.com/okfest/tagging_video.mp4
Playback: http://images1.noterik.com/okfest/playback.mp4

Now, the end result looks a little bit weird, in that we tagged randomly and the content doesn’t exactly match what we are trying to do. But you will notice that the clip is “cut” on the fly and jumps around in the original TV program (sorry about that, Bregtje, if you see this). Given correct tagging, and using for example the Kallio Archive videos, the end result would have been as Sanna Marttila suggested.

Future of this prototype

After the event I took a good look at the Kallio Archive and researched what metadata is available. I am strongly considering finishing this prototype based on their content and including it as an example when the open-source version of our platform is released. It is a good demonstration of what you can do, and the material is very nice and perfect for things like this, since it consists of short clips taken over a long period under varying conditions.

Thanks for all the fish

I would like to thank the other participants of the workshop, and Kati, Ramyah, Sanna and the sponsors for the free food, drinks and interactions. If you have any questions about Noterik, our video platform (Springfield) or this demo, you can email me at daniel@noterik.nl or daniel@xs4all.nl.