Biden proposes $100bn investment in US broadband
“Americans pay too much for internet service. We’re going to drive down the price for families who have service now,” said Biden, following the unveiling of his $2.3trn federal investment proposal last week.
The plan, a White House spokesperson later confirmed, would see the US Government invest heavily in “future proofing” wireless infrastructure in underserved and marginalised areas, as well as provide support to local government, cooperatively owned and not-for-profit networks.
According to the , around 20 US states currently have legislation in place that restricts the ability of local governments and other non-corporate groups to provide local broadband services. Giving these operators a leg up in competition with the country’s major carriers, like AT&T, Verizon and T-Mobile, is expected to help create the drop in prices the Biden Administration’s plan hopes to achieve.
According to the FCC, the number of Americans with access only to sub-par internet (defined as download speeds of 25Mbps or lower and upload speeds of 3Mbps or lower) was around 14.5mn as of the end of 2019. According to additional figures released by the White House, more than 30mn Americans lack access to high-speed internet, and a significantly larger number can’t afford it even where such services are available.
With the COVID-19 pandemic continuing to raise the number of people working remotely, improving broadband access across rural and marginalised communities is increasingly becoming an essential step towards ensuring their occupants can participate in the modern digital economy.
Larry Irving, a telecom official in the Clinton administration, praised the step, saying, “The simple act of recognising that poverty is a bigger indicator of lack of access than geography is a huge statement.”
Republican politicians are already vocally opposing the plan. Cathy McMorris Rodgers of Washington, the ranking Republican member of the House Energy and Commerce Committee, said Biden’s plan would “hurt private investment in our networks without actually closing the digital divide.”
Similarly, the US telecom industry’s representatives in Washington have spoken out against the announcement. Michael Powell, head of the telecom industry trade group, said the move “risks taking a serious wrong turn in discarding decades of successful policy by suggesting that the government is better suited than private-sector technologists to build and operate the internet.”
However, the increasing role of internet access as an essential utility makes its continued privatisation an ever thornier issue. Los Angeles Times business columnist David Lazarus recently noted that “many Americans know full well that there are three things they can’t live without. Two of them are power and water. The third, I’m sure, will be obvious to all. Internet access.” Lazarus adds that it is painfully obvious that the internet has grown into a utility “and internet access should be regulated as such.”
Utilities are, by definition, necessary to function in a society, and access to them is regulated lest the private sector abuse that necessity to drive profits at the expense of human safety, comfort and dignity. Consider the US healthcare system: a service which is regulated and treated as a public utility and human right in many other parts of the world.
In the US, which has privatised healthcare, the average spent on healthcare costs per person is roughly double the global average, with the difference being funneled into the balance sheets of the private health insurance, pharmaceutical and primary care companies that provide the service without government regulation. The issue has become so globally prominent that a leading Norwegian university recently issued guidelines (which it has now re-worded following something of a public relations firestorm) for students studying abroad to return home due to the COVID-19 pandemic, especially “if you are staying in a country with poorly developed health services and infrastructure and/or collective infrastructure, for example the USA.”
As access to the internet becomes increasingly essential, so too does the need for the government to regulate its provision as an essential service. Susan Aaronson, director of the Digital Trade and Data Governance Hub at George Washington University, said recently that, “It is an essential public good and should be embedded in the law as some nations do. It is essential to equality of opportunity, access to credit, access to other public goods, access to education.”
Biden’s new $100bn plan is alarmingly light on details. However, if such an initiative has provoked this kind of viscerally negative response from corporate telecom lobbyists and members of the Republican party (on what the Associated Press affirms is a bipartisan issue), then it’s probably safe to assume it’s a step in the right direction.
How to expand the cloud-native technology workforce
The telecom market is in a state of flux. The ongoing pandemic has inflated global Internet traffic by up to 60%, increasing demand for bandwidth and adding more pressure on operators to continue to provide reliable, high-speed broadband connectivity. This has challenged operators’ assumptions about how future-ready and efficient their network infrastructure really is, leading them to question the way they have deployed and operated their networks. While telco technology has remained stagnant for decades, we have now reached the precipice of a shift towards disaggregated, cloud-native networks – with industry bodies like the Telecom Infra Project (TIP) leading the way.
The market is now seeing a move towards a cloud compute approach, and away from the traditional monolithic legacy hardware that has dominated the sector since its inception. With this comes a demand for new skillsets. Just as the dot-com boom of the 2000s brought the rise of coding bootcamps and a push towards retraining employees for the new age, the cloud-native overhaul of the 2020s will lead a push towards new skillsets within the industry. These new “cloud native engineers” will have to embrace software-centric, cloud-native and disaggregated networks, from the Radio Access Network (RAN) to the edge and 5G core. They need to be able to understand and navigate the world of cloud with ease, and to take an application from a repository through a continuous integration and delivery pipeline, and into a new operational environment.
The challenge now is that there is a skills gap for both in-house and outsourced staff. There is already a shortage of technicians who can properly install fibre, power and radio equipment on telecommunications sites, let alone engineers with the expertise to accurately navigate the new cloud native environment. So, how can we expand the next cloud native technology workforce?
Adapting to cloud native environments
In the telecom world, the term “cloud native” is used to describe various functions within networks that have been developed as software from the outset and run on independent hardware. Of course, a cloud native design like this brings many advantages, with independent microservices deployed and running in containers. If a new function or an update is required, a corresponding microservice is supplied by the software developer, which updates or adds the respective feature within milliseconds without interrupting the service. This way, route processing, updating, and restarting are 20 times faster than with conventional router operating systems. If open interfaces are also available, network operators can even develop and implement their own functions.
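The live-update behaviour described above can be sketched in miniature. The example below is an illustrative Python toy, not any vendor's actual implementation: a registry swaps a "microservice" handler atomically, so callers keep getting answers while a new version replaces the old one, mirroring how a cloud-native function can be updated without interrupting service.

```python
import threading

class ServiceRegistry:
    """Toy stand-in for a cloud-native service mesh: handlers can be
    replaced at runtime without callers seeing any interruption."""

    def __init__(self):
        self._lock = threading.Lock()
        self._handlers = {}

    def register(self, name, handler):
        # Atomically replace the handler; in-flight callers either get
        # the old version or the new one, never an outage.
        with self._lock:
            self._handlers[name] = handler

    def call(self, name, *args):
        with self._lock:
            handler = self._handlers[name]
        return handler(*args)

registry = ServiceRegistry()
registry.register("route", lambda dst: f"v1 route to {dst}")
print(registry.call("route", "10.0.0.1"))  # v1 route to 10.0.0.1

# "Deploy" an updated microservice while the service keeps running.
registry.register("route", lambda dst: f"v2 route to {dst}")
print(registry.call("route", "10.0.0.1"))  # v2 route to 10.0.0.1
```

In a real network the swap happens at the container-orchestration layer rather than inside one process, but the principle is the same: the update is a replacement of an independent unit, not a restart of the whole system.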
However, the implementation of a cloud native environment – as well as the code and processes that sit on top of it to regulate functions and management – must be done by engineers with new skills. Compared to older legacy fixed networks and hardware, cloud native engineers must understand how container architecture functions to allow microservices and APIs to work together in a loosely coupled approach for maximum flexibility and development agility. They must also possess skills pertaining to the operation of routing software that turns bare-metal switches into IP/MPLS carrier routers, often in different areas of the network, such as broadband access, edge or core. For engineers, bridging the gap to the new cloud native environment is not easy, but it can be achieved through training and experience.
New ways of building cloud native expertise
Of course, traditional routers and dynamic control systems are challenged by new concepts such as disaggregation and distributed SDNs, which promise significantly faster implementation, automated control, and a shorter time to market. For future router designs to meet these challenges, fundamentally new router hardware and software must be developed, and modern software architectures and paradigms introduced.
A cloud native engineer must have software skills, such as coding, testing, design and architecture, whilst also knowing how to adapt applications to leverage cloud platform services for maximum impact. The best way to build this wide knowledge base is through training programs and hands-on experience. Training typically includes learning about Docker and Kubernetes in production use cases, writing complex cookbooks from scratch, and transforming existing applications into cloud-native applications. Unfortunately, most training is currently focused on the legacy engineer, deployed and tasked in the field to replace radio equipment or repair newer 5G stations. Not enough is being done to promote this new cloud native path at the grassroots level – in universities or further education colleges.
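To make the Kubernetes side of that training concrete, here is a minimal Deployment manifest of the kind such courses walk through. All names and the image path are illustrative, not a real product: the point is the `RollingUpdate` strategy, which replaces pods gradually so the service stays up during an upgrade.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ran-controller            # illustrative service name
spec:
  replicas: 3                     # run three identical pods for resilience
  selector:
    matchLabels:
      app: ran-controller
  strategy:
    type: RollingUpdate           # upgrade pods one at a time
    rollingUpdate:
      maxUnavailable: 1           # keep at least two pods serving traffic
  template:
    metadata:
      labels:
        app: ran-controller
    spec:
      containers:
      - name: ran-controller
        image: registry.example.com/ran-controller:2.1   # hypothetical image
        ports:
        - containerPort: 8080
```

Pushing a new image tag and re-applying this manifest is the cloud-native equivalent of a maintenance window that never takes the network down – exactly the operational model the new workforce needs to be fluent in.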
Leading the charge in re-training the existing workforce for the cloud native future
Most operators understand the case for a cloud native approach, since the improved flexibility in deployment, roll-out of services to the field and cost-savings are plain to see. However, they’re bogged down with thousands of operational staff that – rather than looking towards the future – have been trained to solve yesterday’s problems. Imagine the electric car industry came along and said: “We’ve designed this cool electric car, but we don’t sell the engine, or the batteries that run it”. This is exactly what is happening now with the cloud native approach. Operators are not used to building networks this way, so they’re having to tap into other workforces to execute their plans.
To build talent, the first place an organisation should look is within its own ranks. Sure, some employees may balk at having to start over with a challenging skillset. But there are plenty of young, bright, hungry-to-learn engineers that would be eager to pick up new cloud native skills if given the opportunity. Also, this approach allows for a hybrid model of expertise that can be beneficial to operators, depending on the project being implemented.
Looking more broadly across the UK and Europe, investment in engineering skills is essential to giving these markets a competitive advantage for decades to come. The best way to do this is to start young – in schools, universities, colleges and through apprenticeships – and provide practical, project-based education that allows young engineers to develop both individually and operationally.