Mobile media consumption has grown dramatically in recent years and is expected to keep growing, though at a slowing rate. While total time spent consuming media may be flattening, usage is shifting from other applications to video -- a very high-bandwidth application. As live streaming and over-the-top (OTT) video services become even more prominent, networks will face traffic spikes and congestion, and operators are consequently concerned about their ability to maintain quality of experience (QoE) for their subscribers. (See EE's Stagg Warns of Brewing 'Content Storms'.)
While several technologies and approaches are being evaluated today, edge caching via content delivery networks (CDNs) is one approach that has had success in the fixed-line world. But mobile networks present different challenges. Telco Transformation discussed this issue with Michael Archer, chief strategist for emerging mobile technologies at Akamai Technologies Inc. (Nasdaq: AKAM). He described a new approach Akamai is promoting, which essentially involves using the end user's mobile device as a caching node.
Telco Transformation: What challenges and trends are you seeing with mobile video delivery?
Michael Archer: What we are seeing from a market perspective is a set of challenges that are unique to cellular networks and mobile delivery. Our customers are finding that mobile delivery is different, and in some ways more difficult. Everyone is looking for a panacea -- something to make mobile faster and less challenged in terms of content delivery.
There is clearly a great desire to reach mobile audiences -- there are a lot of mobile services, and many providers are taking a "mobile first" approach to their services. But there are also big differences: different regions have different requirements. At Akamai, most of our customers are big, global players, so they have a global audience that they are concerned with reaching. That means navigating big differences in networks and devices -- especially in emerging markets, where you have different levels of technology. It's a very difficult landscape to navigate.
TT: At Mobile World Congress this year, Lior Netzer (VP & GM of the Emerging Mobile Business Unit at Akamai), talked about using mobile devices themselves as caches. Can you explain what Akamai is doing in this space?
MA: So Akamai's traditional business is content delivery -- it's a content delivery network (CDN) and it takes content and distributes it to the edge of the network. In reality though, it's really the edge of the Internet. We are trying to get deeper, closer to the consumer and into the last mile. But in mobile it's not a mile -- the last "mile" could be several hundred miles. So we developed a solution that takes network unpredictability out of the equation -- our Predictive Content Delivery (PCD) solution.
The goal is to improve the media consumption experience. We pre-position content ahead of time -- our customers can use a white-label app or an SDK. The content is then pushed onto the device, which takes any network variability out of the equation. There is no seek time, no buffering -- the experience is very positive. And we're seeing that users watch more and for longer because the experience is better.
The content is also intelligently positioned. We track and refine our algorithm based on what the user watches. When the content is downloaded depends on various factors -- time of day, network state, whether the device is plugged in or not -- we have policies set up. This also helps move traffic off the network at peak times -- content can be downloaded at night or at home, over WiFi. And the end user can access the content without network connectivity -- on a plane, in the subway, wherever -- and still have a high-quality, personalized experience with the content they want.
TT: In the past there have been questions about pre-loading, mostly centered around battery power and data usage. How does the PCD solution address those?
MA: The solution allows for configurability. Content owners can set up downloading to work in certain ways -- for example, they could say that downloading can only occur when the phone is plugged in, or only when it is connected to a WiFi network. The SDK has these capabilities built into it, so when it's integrated into the media app, end users can also configure it to work the way they want. We do appreciate that battery life is an issue and want to be cognizant of that, so we have built in this flexibility: users can set up their app to manage both battery usage and data quotas.
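As a purely illustrative sketch -- this is not Akamai's actual PCD SDK API, and every class and field name here is invented -- the kind of device-side download policy Archer describes might look like this:

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    plugged_in: bool       # is the phone charging?
    on_wifi: bool          # connected to WiFi rather than cellular?
    battery_percent: int   # remaining battery charge

@dataclass
class DownloadPolicy:
    require_charging: bool = True   # only download while plugged in
    require_wifi: bool = True       # only download over WiFi
    min_battery_percent: int = 20   # skip downloads on a low battery

def may_download(state: DeviceState, policy: DownloadPolicy) -> bool:
    """True only if the device satisfies every constraint the policy enables."""
    if policy.require_charging and not state.plugged_in:
        return False
    if policy.require_wifi and not state.on_wifi:
        return False
    return state.battery_percent >= policy.min_battery_percent

# A user who allows downloads only while charging on WiFi:
policy = DownloadPolicy()
print(may_download(DeviceState(True, True, 80), policy))   # True
print(may_download(DeviceState(False, True, 80), policy))  # False: not plugged in
```

In this sketch the content owner would set the policy defaults, and exposing the same fields on the app's settings screen is what would let end users manage battery and quota themselves.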
Also, in the US we are now seeing the return of unlimited plans, which takes some of that quota-management anxiety away. But we have tools for both operators and end users to manage the process. It also allows for management of the cache -- you can decide how much space you want to use for this.
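A minimal sketch of that cache-space control, with invented names (the interview does not describe the real SDK's cache-management interface): a fixed, user-chosen quota, with the oldest pre-positioned items evicted first when space runs out.

```python
from collections import OrderedDict

class ContentCache:
    """Device-side cache with a user-configurable size quota (in MB)."""

    def __init__(self, quota_mb: int):
        self.quota_mb = quota_mb
        self.items = OrderedDict()  # title -> size in MB, oldest first

    def used_mb(self) -> int:
        return sum(self.items.values())

    def add(self, title: str, size_mb: int) -> bool:
        if size_mb > self.quota_mb:
            return False  # item could never fit within the quota
        # Evict the oldest pre-positioned content until the new item fits.
        while self.used_mb() + size_mb > self.quota_mb:
            self.items.popitem(last=False)
        self.items[title] = size_mb
        return True

cache = ContentCache(quota_mb=10)
cache.add("Episode 1", 4)
cache.add("Episode 2", 4)
cache.add("Episode 3", 4)  # evicts Episode 1 to stay under the 10 MB quota
print(list(cache.items))   # ['Episode 2', 'Episode 3']
```

A production cache would more likely evict by watch likelihood than by age, but the quota mechanics are the same.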
It's all in a capability library in the SDK. It's up to the content owner how much control they want to pass on to the end user, but mostly they do pass it on -- it makes sense, since they want the user engagement anyway.
It also allows control by genre -- users can select which teams they want to follow, which news sites, things like that.
TT: How is PCD being distributed? How are you deploying it?
MA: Customers are a combination of content owners and mobile operators -- in fact, ideally the two do it together. There have also been some shifts in the regulatory environment, so there's the possibility of zero-rating services, which takes away end-user concerns about using up their data quotas. Operators might, for example, schedule downloading between midnight and 5 a.m. That's mostly in developing markets, where it makes sense if you can shift traffic to quiet times.
TT: And it's integrated into the content owner's app? So taking HBO Now as an example, would HBO just integrate the PCD SDK into the HBO Now app, and then the end user downloads it onto their phone?
MA: That's pretty much it, yes. We work with the content owner; they plug in our SDK, which is a completely wrapped library. Integration is a couple of lines of code, and then the customer has access to the SDK's capabilities. DRM is also pre-integrated, which is quite important for premium content -- because it's actually downloaded onto the device.
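To give a feel for what "a couple of lines of code" could mean here -- hedged heavily, since the real PCD SDK is a native mobile library whose API is not shown in the interview, and every name below is invented -- an integration might reduce to something like this sketch:

```python
class PcdClient:
    """Stand-in for the pre-positioning SDK a media app would embed."""

    def __init__(self, api_key: str, drm_enabled: bool = True):
        self.api_key = api_key
        self.drm_enabled = drm_enabled  # DRM pre-integrated, per the interview
        self.catalog = []

    def register_catalog(self, titles: list) -> None:
        """Declare which titles are eligible for pre-positioning."""
        self.catalog.extend(titles)

# The host app's entire integration might be this small:
client = PcdClient(api_key="CONTENT-OWNER-KEY")
client.register_catalog(["Series A, Ep. 1", "Series A, Ep. 2"])
```

Everything else -- scheduling, policy enforcement, DRM -- would live inside the wrapped library rather than in the host app's code.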
TT: And with operators?
MA: Similarly, it works with their app. But when an operator gets involved, they can do more with content delivery. For example, Verizon has announced support for LTE Broadcast in its Go90 app, which allows it to switch to a broadcast stream to manage large-scale delivery of content. Operators can also look at zero-rating content. So if they are involved, you can do more. It helps them as well, by moving traffic from peak delivery times into a network trough. Anything they can do to utilize their network more efficiently certainly helps -- so it's a mutually beneficial ecosystem.
TT: Are you looking at enterprises at all for this solution?
MA: It's an interesting opportunity. There's a shift toward video training modules that employees can access while they are on the road. And once they take a course, we know what the next module will be, so we can download it in advance. I've seen a couple of these -- it looks interesting.
There are also other kinds of content -- such as a product catalog that can be made available without delay, or even without a network connection. These are interesting use cases. Definitely early stages, but we are seeing them.
TT: What are your expectations -- how will this space develop?
MA: We're seeing larger media companies integrating and deploying, and we're getting interest from developing markets where operators are trying to mitigate network traffic at peak moments.
We're also looking at caching in different areas on the network. We've been looking at ways to get content closer to the end user, and this gets us right on the handset -- we can't get any closer! But we are also interested in looking at other options.
The trend towards Mobile Edge Computing (MEC) is also interesting. It gives us a different aggregation point for caching. MEC started out as an approach to caching at the base station, but we don't get efficiency there in terms of cache-hit utilization [there aren't enough people served by a single base station to make caching efficient]. So it moved back up the network, to a point aggregating 50 or so cells -- a more optimal aggregation point.
There is a shift towards fog computing principles -- a distributed mesh of caching nodes that distributes content to the most efficient parts of the network. But there's a need to find the right balance, as moving a cache further out can make it less effective. Our customers also have different needs -- for some, low latency is very important; others are more concerned with their storage footprint. So again, it's about finding the right balance to optimize delivery efficiency.
— Aditya Kishore, Practice Leader, Video Transformation, Telco Transformation