Brought to you by UBM
An audience sitting passively in front of the goggle box, bombarded by images on endless channels of TV programming, was once thought of as an unchanging fact of traditional broadcasting.
It took only a few years for Netflix and other streaming services to change that. Today’s audiences are in command, deciding when and how they consume the content they desire. And they explore, with the help of automated suggestions based on past preferences, to find new content.
In the coming years, media consumers will be empowered by technologies that broadcasters are increasingly seeking to roll out. New experiences are already starting to hint at what is on the horizon.
Artificial intelligence (AI) will enable viewers to watch different, personalised versions of a TV broadcast, much as websites today can be personalised based on a user’s previous visits.
Viewers can also expect to be more immersed in what they are watching. Virtual reality (VR), augmented reality (AR) and 360-degree videos are finally coming to the fore, enabling viewers to “see” more and experience more.
For many broadcasters, the change is nothing short of transformative. Though some of the new technologies are being tested only on selected content, many are expected to shape the way most content is delivered in future.
At the BBC, weather forecasts could be shown to viewers based on their preferences, say, where they live or where they will be travelling to. Instead of the entire programme, viewers would see a short clip on the relevant location.
Similarly, on its popular Match of the Day (MOTD) football show, viewers can automatically be directed to the segment featuring their team, so they do not need to search through the entire show to find what interests them.
The concept, in truth, is not new. For years, broadcasters have been trying to “narrowcast” to users, catering to their narrow preferences to keep them interested and engaged.
The difference now is the digital platform. When audiences consume media on digital devices – from phones to smart TVs – they also inform broadcasters of their habits. By better segmenting their content and tagging each part with metadata, broadcasters can match audiences with content more effectively.
Of course, this is easier said than done. Tagging so much content with the relevant metadata is a tedious task – one that AI can help with. Indeed, the technology is empowering broadcasters to make better use of all the data they collect today.
One company that makes this tagging easier is Ooyala. Its AI feature learns from the type of content a broadcaster puts out and works out, from previous usage, which tags matter. In time, it promises to replicate the task accurately and automatically.
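The idea of matching tagged segments to viewer preferences can be sketched in a few lines. This is a hypothetical illustration only – the segment titles, tags and `match_segments` function are invented for this example and do not reflect Ooyala’s or any broadcaster’s actual system:

```python
# Hypothetical sketch of metadata-based content matching.
# Segments, tags and preferences below are illustrative examples only.

segments = [
    {"title": "MOTD: Arsenal v Chelsea", "tags": {"football", "arsenal", "chelsea"}},
    {"title": "Weather: London", "tags": {"weather", "london"}},
    {"title": "Weather: Singapore", "tags": {"weather", "singapore"}},
]

def match_segments(segments, preferences):
    """Return segments whose tags overlap the viewer's preferences, best match first."""
    scored = [(len(seg["tags"] & preferences), seg) for seg in segments]
    return [seg for score, seg in sorted(scored, key=lambda p: -p[0]) if score > 0]

# A viewer interested in Singapore weather is served the matching clip first.
for seg in match_segments(segments, {"weather", "singapore"}):
    print(seg["title"])
```

In practice the tags would come from an AI model rather than being written by hand, but the matching step – intersecting content metadata with a viewer profile – is the same in spirit.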
Besides tagging content, AI can play a big role in making the content itself. In future, it could act as a producer at, say, a sports event, deciding which camera angle to feature to show the most dramatic image of a goal or match-winning shot.
AI can also make a video from the clips it is given. In 2016, IBM’s Watson helped create a trailer for the sci-fi horror movie Morgan, making use of the visuals it deduced were appealing to audiences.
To do so, it was fed more than 100 horror film trailers as well as the 90-minute movie itself. It identified the 10 scenes it judged most impactful, enabling human editors to quickly stitch them together. The process took only a day, compared to the usual 10 to 30 days, according to a report in Wired.
If AI is changing the way content is produced, other technologies such as VR, AR and 360-degree videos are letting people experience more as well. Though these technologies have had many false starts over the years, recent advances in cameras and software have enabled the creation of much more such content of late.
Today, it is not difficult to find 360-degree videos of, for example, a holiday destination on YouTube. News reports from big outlets such as the New York Times also increasingly rely on such formats to give people a full 360-degree view of, say, a war zone.
In future, the days when people sat passively in the living room will be remembered the way people recall black-and-white TVs, experts say. And just as over-the-top services have upended traditional broadcast, today’s transformation will lead to deep and far-reaching changes in how people consume content in the years ahead, they add.
The challenge is making sure that the technology serves the business of broadcasting. The technology itself also has obstacles to overcome at each step of the way, some of which have taken a while to resolve.
“As with every emerging technology, there are challenges to overcome before it truly takes off,” said Mongchee Chang, head of marketing for Asia-Pacific.
“For immersive technologies like VR, AR and 360-degree videos, challenges include headsets, costs, content and user experiences,” she added. “Service providers also need to work out how to monetise the services, whether through ads or other business models.”
Visit BroadcastAsia2018 at Suntec Singapore from June 26 to 28 and find out more about the transformation going on in the industry. BroadcastAsia is a part of the mega tech event ConnecTechAsia, which also encompasses CommunicAsia and the new NXTAsia.