As we usher in a new technological age, we continuously increase our use of high-density content that we don't have the technical infrastructure to support. Our demand for global content is on the verge of outstripping current global bandwidth limits. Bluntly, we are at risk of Global Bandwidth Starvation.
Feeding the Ever-Hungry Data Demands
As our technology becomes more data-intensive, with ever-higher-quality video content and the rise of 3D technologies, we are ill-equipped to deal with the data demands of the future. The fixation with mobile and video content puts tremendous strain on the data pipeline. TeleGeography's Global Bandwidth Research Service notes that no continent has experienced annual growth of less than 40% over the past five years. With data traffic rising at 53% every year, this constant growth in demand has forced the industry into a constant state of innovation. On the data links connecting Africa, Asia, and Europe, traffic growth has been astronomical. As more users watch video content on both computers and mobile devices, our data consumption is only set to increase.
How is data starvation even possible?
As we've gone deeper into the technological age of the 21st century, our data consumption has increased astronomically. The rise of video content has been particularly bandwidth-intensive. Video sites like YouTube boast millions of users every day, consuming countless hours of content, and that's before we even consider the data moved by peer-to-peer services like BitTorrent. As our online video content becomes more advanced, the demand for data will begin to exceed the supply. According to Cisco, it would take more than 5 million years to watch the amount of video that will be transferred every month in 2020. The present infrastructure is simply not equipped to handle that level of consumption.
The present infrastructure is maintained through an intercontinental network of submarine cables, which transmit data across the world. Just as oil flows through a pipeline, our traffic flows through these cables from continent to continent. Like a pipeline, these cables can only cope with a certain volume of content before they reach full capacity. As future technologies like 3D content come to the fore, it's vital that we improve our infrastructure to sustain the strenuous demands of the coming age. Even now, it's noticeable that in crowded areas the transfer of data slows to a crawl. In the future, data epicenters like Europe will struggle to deal with the sheer volume of traffic coming from Africa, Asia, and the Middle East.
When talking about the root of the problem, it's worth noting that private company traffic is taking the heaviest toll: these companies consume more data than all public users combined. Content providers like Google and Facebook have been driven to construct their own internal networks to ensure they can manage traffic effectively.
Indeed, Google has recently announced the INDIGO cable system, around 9,000 km in length, connecting Singapore and Perth. According to a recent news release, INDIGO will be able to transmit the equivalent of 7.2 million HD movies simultaneously. It's clear that, in solving the infrastructure problem, future development will be in the hands of private corporations. As major content providers and users, companies like Google will be the primary investors in the infrastructure of the future.
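To put that figure in perspective, here is a rough back-of-the-envelope sketch of the aggregate capacity it implies. The ~5 Mbps per-stream bitrate for an HD movie is an assumption for illustration, not a figure from the announcement:

```python
# Rough estimate of the cable capacity implied by "7.2 million HD movies
# simultaneously". The 5 Mbps per-stream HD bitrate is an assumption.

def implied_capacity_tbps(streams: int, mbps_per_stream: float) -> float:
    """Aggregate bandwidth in terabits per second for `streams` parallel streams."""
    return streams * mbps_per_stream / 1e6  # Mbps -> Tbps

print(f"{implied_capacity_tbps(7_200_000, 5.0):.0f} Tbps")  # prints "36 Tbps"
```

Tens of terabits per second is in line with the design capacity of modern multi-fiber-pair submarine cables, which makes the "7.2 million movies" claim plausible.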
How to Solve Global Bandwidth Starvation
When it comes to global bandwidth starvation, there will be no quick fixes. The key to solving the problem lies in investing in new infrastructure and applying innovation to it. In practice, this will mean a technological arms race, with companies vying to end the data crisis. Part of this effort will simply be improving existing submarine cables, but a number of advancements in cloud-based solutions, like 5G and distributed computing, could also help solve the crisis. Companies like Amazon and Google make use of cloud computing services every day, and the case for 5G as one of the main solutions to global bandwidth starvation is quite strong.
5G: A Cloud-Based Solution
New technologies like 5G will take our data transfer speeds to the next level. 5G is the proposed fifth generation of mobile networks, offering increased speed, greater bandwidth, and reduced latency. To be classified as 5G, a network will need to deliver a 1 Gbps downlink and sub-1 ms latency. At those speeds, a full HD movie could be downloaded in under 10 seconds. Yet the ambition of 5G does not stop there: the Third Generation Partnership Project (3GPP) is aiming for 100 percent geographical coverage around the world. As we phase out the infrastructure of hard-wired connections, technologies like 5G will act as a global cloud service.
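A quick sanity check on the download claim bears this out. This is a minimal sketch; the ~1.25 GB size for a full HD movie file and a sustained 1 Gbps link rate are assumptions, not figures from the 5G specification:

```python
# Back-of-the-envelope check of the "full HD movie in under 10 seconds" claim.
# Assumptions (not from the source): a full HD movie file is ~1.25 GB and the
# link sustains its full 1 Gbps downlink rate for the whole transfer.

def download_time_seconds(file_size_gb: float, link_gbps: float) -> float:
    """Time to transfer a file of `file_size_gb` gigabytes over a `link_gbps` link."""
    bits = file_size_gb * 8e9          # gigabytes -> bits
    return bits / (link_gbps * 1e9)    # bits / (bits per second)

print(f"{download_time_seconds(1.25, 1.0):.1f} s")  # prints "10.0 s"
```

At exactly 1 Gbps the transfer takes about 10 seconds; real-world 5G peak rates above 1 Gbps would push it comfortably under that figure.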
However, the transition to 5G is not without its challenges. 5G will need to support millions of devices across the world. If complete geographical coverage is to be implemented, this will be no small feat, particularly when you consider that it will need to be done with minimal latency. As demand for cloud-based services grows, driven by users accessing content through mobile devices on the go, 5G will become extremely prominent within the data landscape.
Necessity is the Mother of Invention
As global bandwidth stretches to meet the demands of the future, vigilant and relentless tech companies around the world will be working around the clock to lay the foundations of the next technological landscape. Perhaps, rather than investing in new and improved submarine cables, the demands of the future are better met by investing in cloud-based services like 5G. As our consumption of on-demand content continues to grow, the need for 5G will be ever more pressing. Of course, such technology is not without its significant challenges, yet for the ambitious tech company, necessity is, indeed, the mother of invention. Who will come up with the most effective solution?