Press "Enter" to skip to content

Is the Growth of Data Pedestrian or Problematic?

So has good ol’ data storage once again gone from pedestrian to problematic? Have we created a perfect storm for chief information officers and chief financial officers with a confluence of technological advancements? Consider the increased connectivity and bandwidth from 5G, expanded computing capabilities coming from edge-to-cloud innovations, heated demand for artificial intelligence and machine learning, and the rising complexity of cybersecurity and encryption. 

The simple answer is yes. We are creating more and more data every year, opening the aperture on performance, and that data is being ingested by enterprise applications closer to the end user than ever before. At the same time, closing the aperture requires secure, encrypted and properly governed data volumes to convert that data into information.

To compound this dynamic, the rise of heterogeneous compute environments requires organizations to implement new information management strategies so they can gain critical insights from data residing in different sectors of an enterprise. That challenge will only grow as 5G increases connectivity and bandwidth for ubiquitous sensors. The 5G revolution is now reaching 10 gigabits per second, up to 100 times faster than 4G. This will bring further adoption of next-generation connectivity and drive the forecasted total amount of data from 64.2 zettabytes to more than 180 zettabytes by 2025. If you are not familiar with a zettabyte, you are not alone: one zettabyte contains one million petabytes, and one petabyte contains one million gigabytes.
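For readers who want to sanity-check those units and the implied growth rate, here is a minimal Python sketch. It assumes SI (decimal) definitions of the units, and the annual growth figure at the end is derived from the article’s numbers rather than quoted from any forecast.

    # Unit check: SI (decimal) definitions of the storage units named above.
    ZB = 10**21   # bytes in one zettabyte
    PB = 10**15   # bytes in one petabyte
    GB = 10**9    # bytes in one gigabyte

    print(ZB // PB)   # 1000000 -> one zettabyte holds one million petabytes
    print(PB // GB)   # 1000000 -> one petabyte holds one million gigabytes

    # Implied compound annual growth rate from 64.2 ZB (2020 baseline)
    # to 180 ZB (2025 forecast), i.e. five years of growth.
    cagr = (180 / 64.2) ** (1 / 5) - 1
    print(f"{cagr:.1%}")  # about 23% per year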

Two trends are now emerging that will impact enterprise IT environments with multi-petabyte data management needs, shaping the solicitations, statements of objectives and secure development operations of government customers long into the future: edge-to-cloud and software-enabled storage.

Edge-to-Cloud, or E2C, refers to moving the computing, storage and analysis of data closer to the user (or consumer) and away from the traditional data center. The consensus is that edge computing and 5G network offerings will see significant growth as major cloud service providers deploy more services in local markets and telecom providers continue 5G rollouts. The global COVID-19 pandemic reinforces this prediction with its worldwide push to remote working, learning and socializing. According to IDC’s worldwide IT predictions for 2021, COVID-19’s impact on the workforce and operational practices will be the dominant accelerator for 80 percent of edge-driven investments, a shift in business models that will play out across most industries over the next few years.

Thanks to E2C solutions and ecosystems, real-time intelligence processing is possible even for users in remote or disconnected environments. These architectures shorten the time to analyze data and cut the turnaround on mission-critical decisions from weeks or days to near real time. With E2C architectures, enterprise and mission IT operations can support advanced capabilities such as augmented and virtual reality, AI and ML, even in austere environments, enabling data exploitation and decreased time-to-value for mission-critical information. As Lt. Gen. Dennis Crall, chief information officer for the Joint Chiefs of Staff, remarked recently: “If you think about the challenge we have in AI, in amalgamating data and sharing data, what it means to take processing and move that processing requirement to the tactical edge, without a cloud, none of this [is possible].”

Data storage has come a long way from the days of punch cards and floppy disks, with invention and innovation improving performance, availability, security and reliability. But the volume and scale of data creation, combined with how storage technology has developed around hardware, point to a growing need for software-enabled storage innovations.

Software-enabled storage is gaining momentum, pushing innovation that can address random data reads while combining hardware technologies to drive more cost-effective solutions. This approach is especially important when encryption requirements impact data management strategies that previously relied upon data reduction techniques such as compression and deduplication: well-encrypted data is statistically indistinguishable from random bytes, so it neither compresses nor deduplicates well unless reduction is applied before encryption.
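As a rough illustration (a minimal Python sketch, not any vendor’s implementation), compare how a general-purpose compressor fares on repetitive plaintext versus random bytes standing in for ciphertext:

    import os, zlib

    # Repetitive plaintext, typical of logs and telemetry, compresses well.
    plaintext = b"status=ok user=svc path=/api/v1/health\n" * 10_000
    # os.urandom stands in for ciphertext: well-encrypted output looks random.
    ciphertext_like = os.urandom(len(plaintext))

    print(len(zlib.compress(plaintext)) / len(plaintext))              # far below 1.0
    print(len(zlib.compress(ciphertext_like)) / len(ciphertext_like))  # about 1.0

Deduplication suffers the same way: identical blocks encrypted under different keys or initialization vectors produce different ciphertext, so duplicates can no longer be detected downstream.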

The growth of data is certainly not pedestrian, but with innovations coming from E2C and software-enabled storage solutions, big data does not need to be problematic. As organizations scale beyond petabyte levels, supporting enterprise-level applications and multi-geographic use cases is getting closer to reality. The critical takeaway is that the intersection of cloud, edge computing and the exponential growth of data will increase the burden on organizations’ operations, especially in heavily regulated markets such as the public sector and health care. Nevertheless, this condition creates a wealth of opportunities to apply technology to effectively balance data creation, security, IT operations and governance while driving value.

Derrick Pledger is vice president and director of digital modernization at Leidos.

Brendan Walsh is senior vice president of partner programs for 1901 Group, a Leidos company.

Source: NextGov