Virtualization and preparations for 5G are transforming AT&T's network, as are the desires of its business and residential customers. The bottom line is that this is a quickly evolving world in which low latency 5G applications can't be supported unless elements of the cloud are placed closer to end users and more tightly linked to the access network.
Hank Kafka, vice president of Access Architecture and Analytics at AT&T Inc. (NYSE: T), spoke to Telco Transformation about the ways in which the carrier is evolving to deal with these foundational changes.
Telco Transformation: Where is AT&T in terms of building out its 5G ecosystem?
Hank Kafka: During our current pre-standards 5G fixed wireless trials, we are testing mmWave spectrum. We're excited about what we've seen so far in our field testing and customer trials. We've observed speeds over 1 gigabit per second and latency rates well under 10 milliseconds. These real-world business and residential customer trials are helping us to better understand the characteristics of mmWave and early 5G concepts. While 5G standards are still being finalized, we're laying the foundation for tomorrow's faster wireless speeds with the launch of our 5G Evolution network in several markets throughout the U.S. We plan to start deploying standards-based mobile 5G as early as late 2018.
TT: What role does OpenStack play in your design?
HK: We began developing the AT&T Integrated Cloud Platform (AIC) three years ago. This cloud is the foundation of our network transformation and is key for us to build a cloud-based infrastructure that can run virtual network functions. Today, the AIC serves both national and international customers through more than 80 AIC zones located around the globe. The foundation for AIC is OpenStack. Rather than buying another proprietary software stack from a vendor, we wanted to harness the power of the OpenStack community. That leads to faster innovation and upgrades.
TT: What impact will the new services made possible by 5G have in the data center?
HK: One day, AR and VR [augmented reality and virtual reality] could be as widespread and game-changing as the smartphone. But, that won't happen until consumers can access high-quality, compelling AR and VR experiences without either being tied to a PC or worrying about overheating, limited battery life or app crashes on mobile chipsets. Next-gen applications like autonomous cars and AR/VR will demand massive amounts of low latency computation. To provide rich content experiences that consumers want and need, we must harness the power of cloud computing resources. Processing and data storage requirements of emerging applications such as AR/VR, autonomous vehicles, drones, etc., are rapidly outpacing the capabilities of the end devices, creating the need for ultra-low-latency access to localized cloud resources.
TT: What role will distributed edge computing play in enabling 5G applications and services?
HK: Centralized cloud computing doesn't provide acceptable latency for some types of real-time applications. It can also greatly increase OPEX for applications that require heavy data flows across large parts of the network. One solution is to shift cloud resources geographically closer to the end user and integrate them more tightly with the access network.
This would reduce network congestion and application response times. If the large centralized data centers are the "core" of the cloud, computing resources placed near towers, central offices and small cells will make up the "edge" of the cloud. Intelligence will no longer be confined to the core. High-quality AR and VR hold potential far beyond gaming and entertainment. Given the proper mobility and quality of content, they can transform healthcare, education and much more. Mobile AR/VR, for example, is an ideal use case for the projected latency and bandwidth/data rate capabilities of future 5G wireless networks.
— Carl Weinschenk, Contributing Writer, Telco Transformation