Scaling Up Data Volumes (From Bytes To Zettabytes)

19th June, 2019

It’s often said that we are currently living through the information age. The use of stone defined the Stone Age, just as bronze characterized the Bronze Age. In the same way, information and data define today’s era, as data volumes continue to grow at an astonishing rate.

The growth of data far exceeds the capacity of traditional computing, making the development of new IT infrastructure and systems an absolute necessity. That presents a major challenge for businesses: building and maintaining an IT infrastructure which can scale up and keep pace with the world’s spiraling data requirements.

From kilobytes to zettabytes

People are more connected today than at any other time in human history. The sheer breadth of interconnection means there are more sources of data than ever, and data volumes have reached levels that would once have been thought impossible. IDC predicted in 2017 that the world’s total digital data will have expanded to 163 zettabytes (one zettabyte being 10^21 bytes) by 2025. This represented a tenfold increase on where data volumes stood at the time of the prediction, and it doesn’t seem exaggerated two years later.
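To put these magnitudes in perspective, a quick sketch of the decimal (SI) byte-unit ladder, where each step up is a factor of 1,000 (the helper function here is purely illustrative):

```python
# Decimal (SI) byte units: each step is a factor of 1,000.
UNITS = ["B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB"]

def unit_in_bytes(unit: str) -> int:
    """Return how many bytes one of the given unit represents."""
    return 1000 ** UNITS.index(unit)

# One zettabyte is 10^21 bytes, so IDC's 163 ZB forecast works out to:
print(163 * unit_in_bytes("ZB"))  # 163000000000000000000000 bytes
```

(Storage vendors and operating systems sometimes use binary prefixes instead, where 1 KiB = 1,024 bytes, but data-growth forecasts like IDC’s are quoted in decimal units.)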

Such a marked increase should come as no surprise. A couple of decades ago, 8-bit computer games and 286 PC hard drives represented the pinnacle of domestic computing technology. Today, we have UHD streaming services, multifunctional smartphones, and – most notably – the Internet of Things (IoT). At the most basic level, the IoT is the extension of connectivity to mundane physical devices. It has been developing apace in recent years and is only going to become more ubiquitous.

The same IDC analysis two years ago also made a prediction regarding the IoT. It stated that by 2025, the average connected person will interact with connected devices close to 4,800 times a day, or once every 18 seconds. Little wonder that data volumes are expected to continue growing at a meteoric rate.

The scalability challenge

Businesses need to be as future-proof as feasibly possible, and able to meet the challenge of rising data requirements. That means building and maintaining an IT infrastructure which can cope not only with the expected growth of data but also with related challenges. They need an infrastructure that allows them to continue deriving meaning and value from the data available to them, rather than letting it languish in useless silos.

Increased computing power is a must for coping with ever-growing amounts of data. Having the power to process terabytes, petabytes, or even zettabytes isn’t, however, the only consideration. It’s also vital that any system can cope with the greater variety of data formats that are sure to become prevalent: any infrastructure must be scalable to handle both greater volumes and wider varieties of data.

There are two main options for scaling IT infrastructure to meet those requirements. The first is to scale vertically, by adding resources to existing systems over time. This is the easier option but is limited by the maximum capacity of the original system. The second option is to scale horizontally, by adding more systems and connecting them so that the workload is shared across them. It’s a more complex alternative, albeit one subject to fewer restrictions.
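The horizontal option is often implemented by sharding data across nodes, so that adding a machine adds capacity. A minimal sketch of the idea, assuming a hypothetical pool of nodes and a simple hash-based placement scheme (none of these names come from the article):

```python
import hashlib

# Hypothetical pool of nodes; horizontal scaling means growing this list.
NODES = ["node-a", "node-b", "node-c"]

def shard_for(key: str, nodes: list[str]) -> str:
    """Map a record key to a node by hashing it, spreading load evenly."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Every key deterministically lands on exactly one node:
print(shard_for("customer:42", NODES))
```

Scaling vertically would mean replacing a node with a bigger one; scaling out horizontally just appends to the pool. (Production systems typically use consistent hashing rather than this naive modulo scheme, precisely to limit how many keys move when the pool changes.)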

The solution for many businesses will be to combine the two alternatives by scaling their IT infrastructure in a multi-faceted way, to meet the twin challenges of growing data volumes and increasing variety. In the information age, intelligent thinking like this could keep forward-facing businesses ahead of the curve.