At the Mobile World Congress last month, SK Telecom (Nasdaq: SKM) unveiled a new platform aimed at distributing 360-degree video broadcasts to mobile devices. The operator claims that this reduces bandwidth and improves video quality, both key challenges for the successful adoption of 360-degree video.
It also gave Telco Transformation a chance to meet up with Jongmin Lee, who heads the media technology team within SK Telecom's R&D division and led the development of this platform. Lee is also a much-valued member of our Video Transformation Advisory Board (VTAB). He took us through demonstrations of various technologies in the SK Telecom booth, while also sharing his thoughts on the key trends and expectations in this area.
Telco Transformation: First, let's talk about your role at SK Telecom. Tell us about your responsibilities.
Jongmin Lee: I work in the R&D division of SK Telecom, and am responsible for the development of media technology. I manage a small team that is currently focused on immersive media technology, such as 360-degree video, virtual reality (VR) and augmented reality (AR), as well as live streaming in HD quality with ultra-low latency.
I am also involved in standardization activity with MPEG (the Moving Picture Experts Group) and am developing patents in that area.
[Lee already holds a very impressive portfolio of 174 patents, to which he continues to add. Laughing, he says he's targeting 1,000 patents over the course of his career. He recently received an award from a South Korean government ministry, and just the day before I met him, a shiny new Glomo (Global Mobile Award) from the GSMA, organizer of the Mobile World Congress. It sits in the SK Telecom booth behind him.]
TT: What did you receive the award for?
JL: The award is for our "Okusu" technology. Okusu means corn in Korean, and the name was chosen because video pixels resemble small kernels of corn.
[Okusu is a mobile IPTV service jointly operated with SK Broadband (SK Telecom's operator division), offering 115 live TV streaming channels, 17,000 movie titles and exclusive original content in collaboration with partners, including some content in 360-degree video. Okusu was awarded the "Best Mobile Video, TV or Film App" for 2017.]
For Okusu, we developed a personalization engine to optimize the user experience, using big data technology. Okusu uses "T Live Streaming," a customized, low-latency version of the MPEG Media Transport (MMT) for the mobile environment, and also has other advantages such as fast channel change [less than 0.6 seconds], better bandwidth efficiency [generating up to 10% less traffic than HLS], easy ad-insertion into the video and traffic analytics. (See SK Telecom Targets VR Tech Challenges.)
TT: What are you showing at the booth right now?
JL: We are demoing live streaming of 360-degree video in 4K resolution, with ultra-low latency, using a media streaming protocol that we developed in-house. We also have a world first in the area of compression: tracking the viewer's region of interest as they watch the video.
[This is a tiling technique developed by Lee's team that delivers a defined area at the center of the viewer's field of view in high resolution, while the other parts of the 360-degree picture are delivered at a lower resolution, saving bandwidth for the operator. If the viewer shifts perspective, the resolution in the new central area improves, while the earlier region drops to a lower resolution. This allows for latency-free shifts in perspective while preserving a high-quality visual experience. The platform also uses multi-band blending technology that eliminates ghosting, and a headend that combines two video streams at different bit-rates into one, without transcoding. This is the secret sauce that enables the tiling technique.]
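[For readers curious how such viewport-dependent tiling works in principle, the sketch below shows the core idea: split the 360-degree frame into a grid of tiles, send the tiles inside the viewer's field of view at a high bitrate, and send the rest at a low bitrate. The grid size, field-of-view angles and bitrate figures here are illustrative assumptions, not details of SK Telecom's actual implementation.]

```python
# Illustrative sketch of viewport-dependent tiling for 360-degree video.
# Tile grid, FOV angles and bitrates are hypothetical values, not
# SK Telecom's real parameters.

TILE_COLS, TILE_ROWS = 8, 4        # equirectangular frame split into 8x4 tiles
HIGH_KBPS, LOW_KBPS = 4000, 500    # per-tile bitrates for the two quality layers

def tiles_in_viewport(yaw_deg, pitch_deg, h_fov=90, v_fov=60):
    """Return the set of (col, row) tiles whose centers fall inside
    the viewer's field of view, centered at (yaw_deg, pitch_deg)."""
    selected = set()
    for col in range(TILE_COLS):
        for row in range(TILE_ROWS):
            # Center of this tile in degrees (yaw: -180..180, pitch: -90..90)
            tile_yaw = (col + 0.5) * 360 / TILE_COLS - 180
            tile_pitch = 90 - (row + 0.5) * 180 / TILE_ROWS
            # Yaw wraps around at +/-180 degrees
            d_yaw = min(abs(tile_yaw - yaw_deg), 360 - abs(tile_yaw - yaw_deg))
            if d_yaw <= h_fov / 2 and abs(tile_pitch - pitch_deg) <= v_fov / 2:
                selected.add((col, row))
    return selected

def stream_bitrate_kbps(yaw_deg, pitch_deg):
    """Total delivered bitrate: high-quality tiles in view, low elsewhere."""
    in_view = tiles_in_viewport(yaw_deg, pitch_deg)
    total_tiles = TILE_COLS * TILE_ROWS
    return len(in_view) * HIGH_KBPS + (total_tiles - len(in_view)) * LOW_KBPS
```

[When the viewer turns their head, `tiles_in_viewport` is recomputed and the newly central tiles are requested at the high bitrate, which is why the total delivered bitrate stays far below sending the whole sphere at full quality.]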
TT: And what is under development with your team for the future?
JL: The future plan is to go for higher quality. We are preparing ultra-high-quality streams. The target is 8K for this year, and maybe 16K for next year. But this has to be done with ultra-low latency, maybe less than one second. Latency is very important for live content. In the past, it was maybe 15 seconds for mobile, but in the case of sports, [low latency] is very important -- in racing, for example.
[Lee's demonstration highlights the impact of 15-20 seconds of latency between the actual end of a race and the moment a viewer learns it has ended. Other distribution platforms could be showing the finish, with people celebrating, while the viewer on the mobile broadcast stream is still 20 seconds behind.]
TT: What are the main trends you are seeing at Mobile World Congress?
JL: Well, I'm only looking at the media trends -- so VR, AR, live streaming -- all related to media technology.
In my opinion, there's a lot on immersive media technology. I'm seeing applications in lots of business areas -- gaming, support, telepresence, education, connected cars -- all sorts of applications. Surveillance is another important target. You have 360-degree cameras, a black box in the car… I see a lot of possibilities in media technology, in my area, for building [applications] with high quality and low latency.
For example, if you look at in-car cameras -- you could have a 4K camera in the car. You could sync to it with ultra-low latency, less than one second on a 5G network. It can provide any kind of media streaming. It's like telepresence. You could use it for CCTV in the car, or for a self-driving car.
Initially it could be for surveillance, but you could also use it for remote access. You could use the camera to see where the car is in a parking garage, or to check the weather outside without having to go out yourself. [But] you will need to synchronize, with high quality and low latency.
— Aditya Kishore, Practice Leader, Video Transformation, Telco Transformation