Why do we need another generation of technology?
The service providers operating the mobile networks have been battling for years to be more than just a commodity pipe carrying exciting and valuable services for other, over-the-top entities. The arrival of 5G is revolutionising the way they can deliver networks, because the operator is becoming critical to whether and how a service is delivered.
There’s a recognition that new services will disrupt the operations of industries and how individuals conduct their lives. And this will be a force for good, whether it’s managing the operations of a factory more safely and efficiently or giving individuals rich and immersive experiences to connect with friends and family. None of this can happen unless we revolutionise the way connectivity is delivered. 5G is taking significant strides towards making this a reality, and the operators play a key role in the delivery.
If 5G has a close relationship to 4G, is it just a simple upgrade?
We are asking the industry to deliver services that meet demanding quality-of-service targets, with much lower latency and response times than we have seen before, along with a higher density of devices and greater volumes of data being conveyed. While 4G technology was a significant evolution, it wasn’t up to the job of delivering what was being asked, and this spawned the 5G project. When we get into the details, we realise that to deliver higher volumes of data we must open up more spectrum, at frequencies that are less favourable for cellular radio. Along with the fact that we want to serve orders of magnitude more people, IoT devices and machines, this means that we need to better manage the spectrum we have and look to new technology to help us operate in bands that we have not previously had to contend with. Densification of the network is also part of the solution: tighter spacing between radios is one strategy, but we are unlikely to succeed unless we grapple with highly challenging technologies like massive MIMO beamforming antennae.
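A back-of-envelope calculation makes the spectrum point concrete. The standard free-space path loss formula shows the penalty for moving to higher frequencies; the 3.5 GHz and 28 GHz carriers and the 1 km distance below are illustrative values, not drawn from any specific deployment.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20log10(d) + 20log10(f) + 20log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

mid_band = fspl_db(1000, 3.5e9)  # a typical 5G mid-band carrier
mmwave = fspl_db(1000, 28e9)     # a typical 5G mmWave carrier
print(f"3.5 GHz over 1 km: {mid_band:.1f} dB")
print(f"28 GHz over 1 km:  {mmwave:.1f} dB")
print(f"extra loss at 28 GHz: {mmwave - mid_band:.1f} dB")
```

The roughly 18 dB of extra free-space loss at 28 GHz (before any blockage or rain fade) is a large part of why densification and beamforming gain become necessary rather than optional at these frequencies.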
But, it’s not just about new radio spectrum and technologies. Very low latency services need the user plane to be broken out near the user, and that means core network functions can’t just sit in a centralised data centre; they must be able to reside near to the edge. Conversely, limited real estate at the edge means that we can’t rely on being able to deploy anything but the bare minimum of infrastructure at the radio sites; anything that doesn’t absolutely have to be co-located with the radio must be capable of being hosted away from the edge.
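Simple propagation arithmetic shows why the user plane must move towards the edge. Light in fibre travels at roughly two-thirds of c, so distance alone consumes a latency budget before any processing or queuing is counted; the 1 ms budget and the distances below are illustrative assumptions.

```python
def fibre_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fibre (signal travels ~200,000 km/s in glass)."""
    speed_km_per_ms = 200.0  # ~2/3 of the speed of light
    return 2 * distance_km / speed_km_per_ms

# Against an illustrative 1 ms round-trip latency budget, propagation alone:
print(fibre_rtt_ms(300))  # centralised core ~300 km away: 3.0 ms -- budget already blown
print(fibre_rtt_ms(20))   # edge site ~20 km away: 0.2 ms -- budget is workable
```

Even before radio scheduling delay is added, a centralised user plane hundreds of kilometres away cannot meet the tightest latency targets, which is exactly why the breakout must happen near the user.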
These are some of the factors that mean that 5G needs completely new mindsets about designing, deploying and operating the new network.
Is 5G also about reducing cost?
The drive towards a more disaggregated and open RAN arises in part from the desire to make the infrastructure ecosystem more competitive and to lower the barrier to selecting new vendors, reducing cost and lock-in for operators. But I believe that is secondary to other factors: delivering a wide range of use cases across the varied environments we encounter in cellular networks is a complex problem, and it necessitates flexible systems able to adapt to delivering the specific mix of services in the environment in which they find themselves. This is also a major driver of openness, at least as important as cost.
But this doesn’t mean that cost isn’t critically important. Delivering an ever-evolving mix of services to a greater variety of customers, reliably, in the face of outages, failures, congestion and any number of other impairments that networks face, needs constant tuning. The cost of manual management and optimisation would be far too high, if it were possible at all. Hence the need to open the network to the best technologies for managing the operation and optimisation of 5G systems, so that this dynamic complexity can be delivered at a realistic cost.
And finally, let’s not forget about the energy required to deliver cellular connectivity. A denser network based on more disaggregated components, delivering orders of magnitude more data, would by default consume more energy than before, and certainly more than one would like in a world aiming for net-zero. Here again, the automation of 5G can control the energy demand. We do this by asking the network automation not only to deliver the SLAs of the services we want but also to factor energy consumption into its decisions. Blended objective functions mean that services will be delivered reliably with a minimum of energy consumption.
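A blended objective can be sketched very simply: score each candidate configuration on both SLA compliance and energy draw, weighted so that reliability dominates. The configurations, numbers and weights below are entirely hypothetical, chosen only to illustrate the trade-off.

```python
def blended_objective(sla_violation: float, energy_kwh: float,
                      sla_weight: float = 1000.0, energy_weight: float = 1.0) -> float:
    """Lower is better: heavily penalise SLA misses first, then minimise energy."""
    return sla_weight * sla_violation + energy_weight * energy_kwh

# Hypothetical candidate configurations: (fraction of SLA targets missed, energy in kWh)
candidates = {
    "all_cells_on":     (0.00, 120.0),  # safe but wasteful overnight
    "night_sleep_mode": (0.01, 80.0),   # small SLA risk, large energy saving
    "aggressive_sleep": (0.10, 60.0),   # cheapest, but too many SLA misses
}
best = min(candidates, key=lambda name: blended_objective(*candidates[name]))
print(best)  # -> night_sleep_mode
```

With reliability weighted heavily, the automation still prefers the energy-saving mode over leaving everything on, but rejects the configuration whose SLA misses would be unacceptable: the essence of the blended objective described above.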
What are the sorts of problems that we will face building out the 5G network?
We’ll face many of the same problems that come with building out a network of any flavour: confirming that the physical infrastructure and transport are installed, connected and functioning correctly. But in 5G, that physical estate is implementing technologies like massive MIMO beamforming, which is significantly more challenging than in previous generations. We need to know whether the beamforming is functioning correctly, whether it is configured appropriately for the environment, and whether it can rely on the xHaul to deliver the fronthaul packets within strict latency and packet-loss requirements. The sectorised cells of older generations could be understood with relatively coarse drive tests; the much narrower, dynamic beams of massive MIMO vary and interact on far smaller spatial scales, and understanding them demands new approaches and solutions. Time Division Duplexing (TDD) also brings with it potential synchronisation problems that must be guarded against.
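The scale difference between a sectorised cell and a massive MIMO beam can be estimated with a standard textbook approximation for a uniform linear array: half-power beamwidth ≈ 0.886·λ/(N·d) radians at broadside. The element counts below are illustrative, not tied to any specific product.

```python
import math

def ula_hpbw_deg(n_elements: int, spacing_wavelengths: float = 0.5) -> float:
    """Approximate half-power beamwidth of a uniform linear array at broadside:
    HPBW ~ 0.886 * lambda / (N * d) radians (textbook approximation)."""
    return math.degrees(0.886 / (n_elements * spacing_wavelengths))

print(f"8 elements:  {ula_hpbw_deg(8):.1f} deg")   # conventional panel scale
print(f"64 elements: {ula_hpbw_deg(64):.1f} deg")  # massive MIMO scale
```

Going from a roughly 13-degree beam to one under 2 degrees wide, and steering it dynamically per user, is why the coarse spatial sampling of a traditional drive test can no longer characterise what the network is actually doing.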
Network disaggregation means that the specific mix of components, services and network functions is unlikely to have been brought together in the same combination before. As intelligence and automation elements influence components from different vendors, their behaviours can be expected to change over time. And these must work in a wide variety of scenarios and conditions. This can sound daunting, but a well-designed approach to validation will manage the risk and ensure that the 5G network delivers the services demanded of it. The approach starts in the lab, with thorough functional testing of the individual components and verification that the system performs as a coherent whole in the various scenarios it will encounter in the field. This feeds into the deployment stage, which, for example, will check that all the infrastructure and virtual components have been successfully deployed and are operational. Then comes ongoing assurance, monitoring operation and performance to ensure that this complex and dynamic system continues to adapt, delivering flexible services to a dynamic population of users and devices in the face of an ever-changing environment.
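The deployment-stage gate described above can be sketched as a set of automated checks run against a site inventory. Everything here — the component names, the inventory shape, the check list — is a hypothetical illustration of the idea, not any real tool’s API.

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool

def run_deployment_checks(inventory: dict) -> list:
    """Verify that the declared infrastructure and virtual components are live."""
    return [
        CheckResult("all_radios_reachable", all(inventory["radios"].values())),
        CheckResult("vnfs_instantiated", all(inventory["vnfs"].values())),
        CheckResult("xhaul_links_up", all(inventory["links"].values())),
    ]

# Hypothetical site inventory, with one transport link still down:
site = {
    "radios": {"ru-1": True, "ru-2": True},
    "vnfs":   {"du": True, "cu-up": True, "cu-cp": True},
    "links":  {"fronthaul": True, "midhaul": False},
}
failures = [c.name for c in run_deployment_checks(site) if not c.passed]
print(failures)  # -> ['xhaul_links_up']
```

In practice such checks would feed the ongoing assurance stage: a failed gate blocks the site from carrying traffic until the underlying fault is resolved.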
There are big expectations for 5G. Can it really deliver on these?
The public’s first exposure to 5G will simply be a new symbol appearing in the top corner of their shiny new handsets, representing Enhanced Mobile Broadband (eMBB). But this is only the first step of a long journey towards delivering the promise of 5G. Only when the rich new services for consumers and a wide range of valuable new industrial use cases arrive, tested, validated, monitored and assured, can we say that 5G has been delivered. This is possible with the right mindset, and with it will come the promotion of the operator to a position critical to the success of this societal transformation.
To find out more about how to get to problem-free 5G, please visit viavisolutions.com/5gbootcamp, where you can view a short series of video conversations with Prof. Andy Sutton of BT and Julie Snell, Chair of the Scottish 5G Centre, and access further information at https://linktr.ee/viavi5G