Planning for the 44 Zettabyte World

September 2nd, 2016 | Blog

Last week, ViON had the opportunity to present our perspective on the impact of the data explosion on the records management world at the Records Management Conference @930gov 2016 in Washington, DC. For those who were unable to attend, we wanted to share some of the thoughts and insights from that presentation, “Data to Decision: Making Data an Asset through Effective Records Management,” which detailed how to implement a strategy to get from data to decision by controlling storage in an end-to-end fashion, with full analytical insight and centralized management. More specifically, it emphasized the importance of a properly tiered storage architecture, solution automation, and the flexibility to leverage different IT business models.

See Data Growth – Just Look at Your Cellphone
If you’ve been around a while, you’ve been to a rock concert and looked out upon a sea of lighters held up in tribute to the band. Flash forward to 2016, and those lighters have been replaced by hundreds, if not thousands, of cell phones. Think about what that shift implies. Those smartphones equate to substantial data creation, including copies, versions, and records stored in thousands upon thousands of locations. That’s just one example of what is driving today’s massive data buildup, which is growing faster than anyone expected. Mobility drives data growth!

Ten years ago, I was responsible for storage product strategy and development at a major storage vendor. We saw the data tsunami coming and even briefed executives on the R&D challenge to keep up with projected growth, but we, along with other experts in the field, actually missed it by a wide margin!

We estimated that data stores would grow to 50+ Exabytes by 2020. By today’s predictions, we will hit 44 Zettabytes by 2020: nearly 1,000 times what we initially thought. What if we are wrong again? What if the challenge in 2020 is 100 Zettabytes? Can we respond?

For most organizations, the current answer may well be no. Experts now estimate that in just three years, 5,500 GB of data will exist for every single person on Earth. Most of that is unstructured data, growing at a 75% Compound Annual Growth Rate (CAGR). Structured data is also growing at unprecedented rates, and it must be managed, too.
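
To put those figures in context, here is a minimal back-of-the-envelope sketch in Python; the 8-billion world population is our round assumption for illustration, not a figure from the presentation:

```python
# Back-of-the-envelope check on the growth figures above. Assumptions
# (ours, for illustration): 44 ZB of data by 2020 and ~8 billion people.

ZB = 10**21   # bytes in a zettabyte
GB = 10**9    # bytes in a gigabyte

per_person_gb = (44 * ZB) / (8 * 10**9) / GB
print(f"Data per person by 2020: {per_person_gb:,.0f} GB")   # ~5,500 GB

# Unstructured data at a 75% CAGR multiplies by 1.75**n after n years:
print(f"Three-year growth at 75% CAGR: {1.75 ** 3:.1f}x")    # ~5.4x
```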

Critical Mandates Accompany Data Growth
As data grows, so do the mandates that govern how data is managed, monitored, protected, retained, and accessed. From HIPAA, the Freedom of Information Act, CJIS compliance, OMB mandates, and protection for Personally Identifiable Information (PII) to copyright documentation, patent, and Privacy Act regulations, the list of regulations organizations face today is daunting.

Records management is critical as data volumes explode. Moreover, requirements for information use are unpredictable. Without careful data management and protection, organizations can be crippled by financial, legal, and decision- and policy-making impacts. While technology can certainly help, it can’t cover all of the bases alone, no matter how cutting-edge or innovative, particularly when emergencies, unplanned events, natural disasters, hacking, and security challenges occur. No one wants to be the organization that is unable to provide information at the most critical point of need.

Start Planning Now for the 44 Zettabyte World
The solution? Plan now for the 44 Zettabyte world – or at least your organization’s part of it. That entails much more than throwing hardware and software at data challenges.

Organizations must apply the right solution architecture for both structured and unstructured data. They must establish tiers of capability and position the data in the right tier. They should also be able to move and migrate data seamlessly from tier to tier as requirements change. But that’s not all: centralized management across locations and devices is a must, security and protection have to be automated and active 24/7/365, and it all has to scale to support future data growth. Plus, to get real value from data assets, organizations need search and content insight tools to find what they need, when they need it, and deliver it securely to the person or persons who will use it.
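
To illustrate the search and content insight point, here is a minimal sketch of a metadata-indexed record store with access-aware search; the Record fields, tags, and classification labels are hypothetical examples, not any product’s schema:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """A hypothetical record carrying the metadata needed for retrieval."""
    record_id: str
    owner: str
    classification: str               # e.g., "PII", "FOIA", "public"
    tags: set[str] = field(default_factory=set)

class RecordIndex:
    """Minimal in-memory index: look up records by tag, filtered by access."""
    def __init__(self) -> None:
        self._by_tag: dict[str, list[Record]] = {}

    def add(self, record: Record) -> None:
        for tag in record.tags:
            self._by_tag.setdefault(tag, []).append(record)

    def search(self, tag: str, allowed: set[str]) -> list[Record]:
        # Return only records the requester is cleared to see.
        return [r for r in self._by_tag.get(tag, [])
                if r.classification in allowed]

index = RecordIndex()
index.add(Record("r-001", "records-office", "FOIA", {"budget", "2016"}))
index.add(Record("r-002", "hr", "PII", {"budget"}))
print([r.record_id for r in index.search("budget", {"FOIA", "public"})])
# ['r-001'] -- the PII record is filtered out for this requester
```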

A key step in solving the data management problem is to have the discipline to manage data storage properly – and tiering requires data discipline. New technology works best within a stable environment. Even though unstructured data is by far the fastest growing part of the problem, structured data is growing and must be accounted for, particularly as structured data is often among the most critical and sensitive information to manage.

For this arena, a well-thought-out and well-implemented storage virtualization architecture is essential, which entails implementing appropriate storage tiers based on capability and requirements. Key components of this architecture are (a minimal policy sketch follows the list):

  1. Cross connected storage resources for simplified access and management
  2. Multiple storage tiers for cost competitiveness
  3. Massive scaling to meet exponential data growth
  4. Automated migration of data between tiers and across systems
  5. Non-disruptive ability to introduce and retire technology
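
To make items 2 and 4 concrete, here is a minimal sketch of an age-based, policy-driven tiering decision; the tier names, thresholds, and demotion rule are illustrative assumptions, not the policy engine of any particular product:

```python
from datetime import datetime, timedelta

# Illustrative tiers, fastest first; a real deployment would map these to
# flash, disk, object/cloud, and archive systems.
TIERS = ["flash", "disk", "object", "archive"]

# Hypothetical policy: demote data that has not been accessed recently.
DEMOTION_AGE = {
    "flash": timedelta(days=30),
    "disk": timedelta(days=180),
    "object": timedelta(days=365 * 3),
}

def target_tier(current: str, last_access: datetime, now: datetime) -> str:
    """Return the tier a dataset should occupy under the age policy."""
    tier = current
    while tier in DEMOTION_AGE and now - last_access > DEMOTION_AGE[tier]:
        tier = TIERS[TIERS.index(tier) + 1]   # demote one tier and re-check
    return tier

now = datetime(2016, 9, 2)
print(target_tier("flash", last_access=now - timedelta(days=400), now=now))
# 'object' -- stale enough to skip past 'disk' as well
```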

While architecture is important, it doesn’t stand on its own. Automation, on-board tools, and policy-driven automated actions have to be in place, built natively into the storage solutions that we deploy.

Business model flexibility is also crucial in this fast-scaling, complex data world. The storage solutions an organization chooses should be accessible and adaptable both in traditional on-premises data centers and via cloud models, with the option to scale quickly and pay only for the amount of storage an agency is actually using.
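
As a simple illustration of the pay-for-what-you-use point, the sketch below compares an owned, fixed-cost capacity model with usage-based billing; all prices and usage figures are invented for illustration:

```python
# Invented, illustrative numbers -- not ViON or cloud list prices.
FIXED_MONTHLY_COST = 10_000.0   # amortized monthly cost of owned capacity
USAGE_PRICE_PER_TB = 25.0       # pay-per-use rate per TB-month

for used_tb in [120, 200, 320, 480]:   # a growing agency's monthly usage
    pay_per_use = used_tb * USAGE_PRICE_PER_TB
    cheaper = "pay-per-use" if pay_per_use < FIXED_MONTHLY_COST else "fixed"
    print(f"{used_tb:>3} TB: pay-per-use ${pay_per_use:>9,.2f} "
          f"vs fixed ${FIXED_MONTHLY_COST:,.2f} -> {cheaper}")

# Crossover: FIXED_MONTHLY_COST / USAGE_PRICE_PER_TB = 400 TB-months.
# Below it the consumption model wins; above it, owned capacity does --
# which is why the flexibility to move between models matters as data grows.
```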

The to-do list to take on today’s data storage challenges may seem overwhelming. Organizations require a structured approach to tiering, coupled with cost-effective data placement, access controls, and scalable, content-aware tools with advanced search capability. All of this must be carefully engineered to enable fast access by authorized users, deliver advanced data analytics, and generate economic value from the data lake.

Basically, solving the storage dilemma boils down to architecture, automation, and flexible business model implementation. It takes all three to be ready for this 44 Zettabyte world. Thankfully, organizations can check off their storage to-do lists with some careful planning, enlisting the help of storage and infrastructure experts and powerful solution sets such as Hitachi Content Platform (HCP). ViON has been privileged to provide these capabilities to our customers for over 36 years, with nearly 14 years delivering solutions in various cloud business models.

Learn more about HCP here and download the Why Data Management Matters Right Now ebook to dig deeper into today’s data management challenges and best practices for solving them. 
