Preparing for the Perfect Storm of a Data Deluge

October 5th, 2016 | Blog

We are in the middle of a perfect data management storm for CIOs and CTOs in the federal government. Data is growing at an exponential rate, expected to reach 44 zettabytes globally by 2020; there are regulatory pressures around storing, accessing and securing that data; running advanced analytics on that data promises significant benefits; and in the midst of all these requirements, there is pressure to accomplish these goals with a smaller budget. Those aren't my thoughts (although I completely agree and sympathize); they were shared recently by Richard Young, CIO of the Foreign Agricultural Service at USDA, during a discussion hosted by MeriTalk on which I had the privilege of collaborating. This accurately sums up the pressures CIOs and CTOs are facing, and it underscores the need for a strategy that aligns technology with process and takes a long-term approach to meeting these challenges. I thought it would be valuable to break down our conversation and address each of these topics independently.

Managing Data Growth
It's mind-boggling, but if you break it down, a zettabyte is the equivalent of 1 million petabytes, 1 billion terabytes, or 1 trillion gigabytes. And by 2020, the world will have 44 of them to manage, protect and derive value from. When you consider the volume of data that IT organizations manage now and the three-year growth trajectory, it's a formidable task. In an ad hoc survey conducted among attendees of the MeriTalk webinar, "Preparing for the Federal Data Deluge," 40% responded that their top data challenge is consolidating and integrating data. How do you scale your existing resources and plan for future growth? The objective for any IT organization managing data growth must be to implement a data governance policy that moves data to the appropriate tier of storage, so that "active" data sits in accessible tier 1 storage and "inactive" data sits in lower-cost tier 3 and tier 4 storage. While this may seem fundamental, it's surprising how many organizations are unable to adhere to it. Technology that automatically migrates data to the appropriate storage tier, such as Hitachi Content Platform, is simplifying this process for many IT organizations by reducing complexity and centralizing management.
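To make the tiering idea concrete, here is a minimal sketch of an age-based tiering rule. It assumes a hypothetical policy (90 days of inactivity before leaving tier 1, a year before dropping to tier 4); real products such as Hitachi Content Platform apply far richer policies than this illustration.

```python
from datetime import datetime, timedelta

# Hypothetical, simplified age-based tiering rule: the windows below are
# illustrative thresholds, not a recommendation for any specific product.
TIER_1_WINDOW = timedelta(days=90)    # "active" data stays on tier 1
TIER_3_WINDOW = timedelta(days=365)   # older data drops to tier 3, then 4

def target_tier(last_accessed: datetime, now: datetime | None = None) -> int:
    """Return the storage tier a file should live on, based on last access."""
    now = now or datetime.now()
    age = now - last_accessed
    if age <= TIER_1_WINDOW:
        return 1
    if age <= TIER_3_WINDOW:
        return 3
    return 4

# Example: a file last opened 200 days ago belongs on tier 3.
print(target_tier(datetime.now() - timedelta(days=200)))  # -> 3
```

The value of automating this kind of rule is that the policy, not an administrator, decides where each object lives as it ages.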

Store, Access & Secure Data 
As Richard said in our discussion, data is an asset. IT organizations need a data governance roadmap that moves them from their "as is" state to their "to be" state. They need information to be available, but secure. And in the case of the Foreign Agricultural Service at USDA, that means managing information globally as well as domestically. Balancing security and availability is a fine line: we have open data initiatives to share information across agencies, but at the same time we must remain constantly vigilant in protecting sensitive assets like employees' personally identifiable information (PII). Centralizing the management of agency information in a secure data repository enables IT organizations to achieve both objectives simultaneously.
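As a rough illustration of "available but secure," the sketch below shows one way a central repository can serve open-data consumers while withholding PII from callers who are not cleared for it. The field names and roles are hypothetical, not drawn from any agency system.

```python
# Minimal sketch: the same central record store serves open-data consumers
# and internal users, but PII fields are stripped unless the caller's role
# is cleared for them. Field and role names are made up for illustration.
PII_FIELDS = {"ssn", "home_address", "date_of_birth"}
CLEARED_ROLES = {"hr_admin", "security_officer"}

def redact_record(record: dict, role: str) -> dict:
    """Return a copy of the record with PII removed for uncleared roles."""
    if role in CLEARED_ROLES:
        return dict(record)
    return {k: v for k, v in record.items() if k not in PII_FIELDS}

employee = {"name": "J. Doe", "agency": "USDA", "ssn": "xxx-xx-xxxx"}
print(redact_record(employee, role="open_data_portal"))
# -> {'name': 'J. Doe', 'agency': 'USDA'}
```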

Data Analytics
The explosive growth of data is both a challenge and an opportunity. Across the public and private sectors, we are at the forefront of leveraging big data analytics to improve experiences, identify threats, increase profitability, make breakthroughs in science and technology, and much more. The first step in data analytics is collecting, organizing and defining data. In the federal government, we are in the early stages of using that information for predictive analytics, as evidenced by the informal poll conducted during our webcast: 35% of respondents said they are using data analytics for forecasting and pattern recognition. There are solutions available that streamline the deployment of analytics, providing a preconfigured, validated hardware and software platform that can accelerate time to value and deliver on the promise of big data. The key objective for federal agencies is to move from a testing environment into production, so they can begin to realize the value this holds and refine the process with real-world data.
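For readers new to the forecasting use case mentioned above, here is a deliberately naive sketch: a moving-average projection over made-up monthly figures. Production analytics would run on a validated platform with real agency data, not a ten-line script; the example only shows what "forecasting" means at its simplest.

```python
# Toy forecasting example over made-up monthly figures.
monthly_values = [112, 118, 121, 130, 127, 135, 141, 138, 146, 152]

def moving_average_forecast(series: list[float], window: int = 3) -> float:
    """Forecast the next value as the mean of the last `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

print(round(moving_average_forecast(monthly_values), 1))  # -> 145.3
```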

Managing Costs
In the midst of all of these challenges, the most common barrier to implementation is budget. Fortunately, the "as a service" concept is helping to mitigate these pressures, allowing IT organizations to leverage a consumption-based, pay-as-you-go utility model. Not every application fits neatly in this bucket, but by balancing traditional CapEx spending with a cloud-based OpEx model, organizations can maximize their resources and make significant progress in addressing all of their technical challenges.
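The difference between the two spending models is easiest to see with numbers. The figures below are purely hypothetical; the point is how the spending is distributed over time, not which option is cheaper for a given agency.

```python
# Illustrative comparison of an upfront purchase (CapEx) with a
# pay-as-you-go utility model (OpEx). All figures are hypothetical.
capex_purchase = 500_000        # one-time hardware buy, paid in year 1
opex_monthly_rate = 9_000       # monthly consumption-based charge
years = 5

capex_total = capex_purchase
opex_total = opex_monthly_rate * 12 * years

print(f"CapEx: ${capex_total:,} paid up front")
print(f"OpEx:  ${opex_total:,} spread over {years} years "
      f"(${opex_monthly_rate * 12:,}/year)")
```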

Mr. Young reiterated the need to focus on the mission and to use technology as a means to achieve agency goals. Having a technology partner that understands an agency's mission, the implications of an effective data management strategy, and how to operationalize that strategy can help CIOs and CTOs navigate this deluge of data. I would invite you to check out ViON's eBook, "Why Data Management Matters Right Now," to learn more about how we can navigate this perfect storm together, or contact us at [email protected].
