
Visual Perceptive Media gets people talking

Published: 22 December 2015
  • Ian Forrester

    Senior "firestarter" Producer

Imagine a world where the narrative, background music, colour grading and general feel of a drama are shaped in real time to suit your personality. Ian Forrester explains more about Visual Perceptive Media.

You might have heard about the Visual Perceptive Media project through earlier coverage. I’ve already written about the project before, but in this blog post I’d like to explain a little more about it.

Visual Perceptive Media is a film which changes based on the person who is watching it. Rather than drawing on sensor data to profile the environment, it focuses on the viewer themselves. It uses data from a phone application to build a profile of the user and their preferences, via their music collection and some personality questions. That data is then used to inform which assets are used, in which order, what real-time effects are applied and, ultimately, when. Cinematic effects twist the story one way or another.
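
To make that more concrete, here’s a minimal, purely illustrative JavaScript sketch of how a profile might be built from the phone app’s inputs. The questions, scoring and field names below are invented for the example; they are not the real app’s data model.

```javascript
// Illustrative sketch only: combine answers to a few personality
// questions with music-library metadata into a simple viewer profile.
// The scoring and field names are assumptions made for this example.

function buildProfile(answers, musicLibrary) {
  // answers: array of 1-5 responses to short personality questions
  const mean = answers.reduce((sum, a) => sum + a, 0) / answers.length;

  // Count how often each genre appears in the viewer's collection.
  const genreCounts = {};
  for (const track of musicLibrary) {
    genreCounts[track.genre] = (genreCounts[track.genre] || 0) + 1;
  }
  const topGenre = Object.keys(genreCounts)
    .sort((a, b) => genreCounts[b] - genreCounts[a])[0] || 'unknown';

  return {
    personalityScore: (mean - 1) / 4, // normalise 1-5 answers to 0..1
    topGenre,
  };
}

// Example: three questionnaire answers plus a tiny music collection.
console.log(buildProfile(
  [4, 5, 3],
  [{ genre: 'electronic' }, { genre: 'electronic' }, { genre: 'jazz' }]
));
```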

Object-based media

To set the project in context, Visual Perceptive Media isn’t a one-off; it’s one of a range of projects we’re working on to demonstrate the power of object-based media. Our model for end-to-end broadcasting, which will allow a live studio to run entirely on IP networks, is a big part of our object-based media ambitions. And our personalised weather forecast demo has a lot in common with Visual Perceptive Media.

All the new content experiences we’re building are about serving audiences better as individuals. All are underpinned by IP technology and involve rethinking our notion of media as solid, monolithic blocks.

You are already seeing this happen with the movement in audio. However, while audio manipulation in the open environment of the web is relatively easy via the Web Audio API, there’s no real unified equivalent for video. SMIL was meant to be that, but it was sidelined as HTML5 pushed the capabilities into browsers rather than media players. We have been working in this area and have looked at many options. In the end we started creating our own video compositor library. Without that library, the Visual Perceptive Media project would still be in our lab.

The ability to customise or even personalise media (video in this case) in a browser, using no special back-end technology or delivery mechanism, is fascinating. It’s all JavaScript, client-side technologies and standard HTTP in a modern web browser. Because of this it’s open, not proprietary, and I believe it’s scalable (the way it should be). This also means that when we do make it public, it will be accessible to the widest possible audience. It may even become possible on some mobile devices as their capabilities increase.
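
As a rough illustration of what “all client side” can mean, here is a generic sketch that draws frames of a video element onto a canvas and applies a simple colour grade in JavaScript. It uses standard browser APIs rather than our compositor library, and the grade values are made up for the example.

```javascript
// Minimal sketch of client-side video processing: draw each frame of a
// <video> element onto a <canvas> and apply a simple colour grade.
// This is generic browser API usage, not the project's own compositor.

const video = document.querySelector('video');
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

// A grade that might be chosen from a viewer's profile (values illustrative).
const grade = { rGain: 1.1, gGain: 1.0, bGain: 0.9 };

function drawFrame() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
  const data = frame.data;

  // Scale each channel - a crude "warm" look when red is boosted and blue cut.
  for (let i = 0; i < data.length; i += 4) {
    data[i]     = Math.min(255, data[i]     * grade.rGain); // red
    data[i + 1] = Math.min(255, data[i + 1] * grade.gGain); // green
    data[i + 2] = Math.min(255, data[i + 2] * grade.bGain); // blue
  }

  ctx.putImageData(frame, 0, 0);
  requestAnimationFrame(drawFrame);
}

video.addEventListener('play', () => requestAnimationFrame(drawFrame));
```

A real compositor would push this sort of per-pixel work to the GPU (for example via WebGL shaders) for performance, but the principle of real-time, client-side manipulation is the same.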

Discussion and feedback

There has been some debate around our use of personality data to shape the drama. Is there an ethical issue with a film “knowing” its audience?

Data ethics is something we have been thinking and talking about a lot while developing Visual Perceptive Media. Earlier this year we summed up some of our thoughts and gathered the opinions of some industry experts.

It’s important to say we are using personality simply as a proxy for changing things in the film. It could have been anything; as someone suggested, we could even have used shoe size. We settled on personality after meeting a company a long while ago and being impressed by their technology.

The next thing was to connect the data to changeable aspects of a film. Film makers are very good at this, and working with a film director and writer we explored the links between personality and effect. Colour grade and music were key ones and, along with shot choices, the aspects we felt were most achievable.
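
Purely as an illustration of that kind of link, here is a hypothetical mapping from a single personality dimension to grading, music and shot-choice parameters. The dimension, ranges and thresholds are assumptions made for this example, not the mapping we actually used.

```javascript
// Hypothetical sketch of linking one personality dimension to film
// parameters. Values, ranges and the trait itself are illustrative.

// Linear interpolation helper.
const lerp = (a, b, t) => a + (b - a) * t;

function mapPersonalityToFilm(extraversion /* 0..1 */) {
  return {
    // Punchier, more saturated grade at the extraverted end (illustrative).
    saturation: lerp(0.9, 1.3, extraversion),
    contrast: lerp(0.95, 1.2, extraversion),
    // Music mix leans towards the more energetic stem.
    musicEnergyMix: extraversion,
    // Shot choice: favour closer, faster cuts at the extraverted end.
    preferCloseUps: extraversion > 0.5,
  };
}

console.log(mapPersonalityToFilm(0.3));
console.log(mapPersonalityToFilm(0.9));
```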

In early December, we revealed Visual Perceptive Media to 16 people from across the industry at the conference.

It was originally meant to be a smaller number, but demand was such that we increased the numbers and the machines needed to view it. The technical challenges did cause problems, but with Anna’s help, Andy and I got some good feedback. We are still crunching the feedback, but I expect the frank discussions will be the most enlightening.

The panel discussion I took part in the following day was most useful. I was my usual firestarter self and maybe caused people to rethink quite a bit. Some of the feedback afterwards was quite amazing. I had a very productive and useful conversation with people who had insisted “This will not work!”

Whatever people think of Visual Perceptive Media, the project has certainly got people thinking and talking about the potential of object-based media.

Hopefully BBC R&D’s variety of work around object-based media will start to complete the picture of what’s possible and show the incredible value the BBC brings to the UK.

  • An earlier version of this blog post was originally published elsewhere.
