The shelf life of data is growing dramatically with the proliferation of AI, ML, and advanced analytics.
Data you thought you would never touch again, tucked away in a long-term archive, is suddenly becoming relevant. Legacy applications have long depended on historical data; now organizations are looking to use that valuable data in new AI development programs to create better outcomes. The principle is similar to weather forecasting: the more data points that can be analyzed over a longer period of time, the more accurate the prediction. The same holds true for AI models. The conundrum for IT organizations is accessing this data efficiently and cost-effectively without sacrificing data quality, and that brings the question of data storage to the forefront.
Many organizations have turned to the public cloud for active and archived data storage, and for many scenarios the public cloud is an excellent option. The caveat is that while moving data into the public cloud can be fast and cost-effective, extracting it might not be. We’ve heard more than one story from customers that stored hundreds of petabytes of data in the public cloud to save money, only to be hit with expensive egress charges and operational fees when that data was extracted, negating any cost savings and in some instances putting them well over budget. As organizations look to feed AI models, the desire for a more flexible model with easy, affordable access to data stored in the cloud is becoming increasingly common.
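To make the egress problem concrete, here is a back-of-the-envelope estimate. The per-GB rate below is a hypothetical flat figure for illustration only; real pricing varies by provider, tier, region, and negotiated discounts.

```python
def egress_cost_usd(petabytes: float, rate_per_gb: float = 0.09) -> float:
    """Estimate the cost to pull `petabytes` out of a public cloud at a
    flat per-GB egress rate (decimal units: 1 PB = 1,000,000 GB).
    The default rate is a hypothetical placeholder, not a quoted price."""
    gigabytes = petabytes * 1_000_000
    return gigabytes * rate_per_gb

# Extracting 100 PB at a hypothetical $0.09/GB:
print(f"${egress_cost_usd(100):,.0f}")  # $9,000,000
```

Even at modest per-GB rates, pulling back archive-scale data can run into the millions of dollars, which is exactly the budget surprise described above.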
What is your data management strategy? We encourage you to think about a hybrid model that takes advantage of both your existing on-prem infrastructure and your public cloud investments. Data management requires a holistic strategy: review where data resides and where it can be most efficiently stored over its lifetime. As we enter this new era of advanced, data-intensive applications, we need to be vigilant about how data is managed to avoid scenarios like the one illustrated above.
ViON is tackling this challenge head-on with the ViON Forever Data Cloud, a data management solution that helps IT organizations take a strategic look at their data. Here’s how it works: a policy manager tiers the data, matching each class of data to the appropriate storage resource:
- High-performance active data resides in a high-bandwidth, low-latency SAN powered by all-flash arrays and NVMe storage
- Fast-staging inactive data is assigned to a high-capacity hybrid storage environment combining SSD and HDD
- Slower-staging inactive data is stored in a more cost-effective long-term storage resource such as tape media
Data is searchable and accessible across all storage tiers from a central location. This allows inactive data to be staged to a more performant tier when it is needed, then moved back to cheaper, less performant storage after use.
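The tiering policy above can be sketched as a simple rule table keyed on how recently a dataset was accessed. The thresholds, tier names, and `Dataset` structure below are hypothetical illustrations, not the actual ViON policy manager logic.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical policy: (max days since last access, storage tier),
# checked in order from fastest tier to cheapest.
TIERS = (
    (30,   "all-flash SAN (NVMe)"),   # high-performance active data
    (365,  "hybrid SSD/HDD"),         # fast-staging inactive data
    (None, "tape archive"),           # slower-staging inactive data
)

@dataclass
class Dataset:
    name: str
    last_accessed: datetime

def assign_tier(ds: Dataset, now: datetime) -> str:
    """Map a dataset to a storage tier based on how recently it was accessed."""
    age_days = (now - ds.last_accessed).days
    for max_age, tier in TIERS:
        if max_age is None or age_days <= max_age:
            return tier
    return TIERS[-1][1]  # fallback: coldest tier

now = datetime(2024, 1, 1)
hot = Dataset("daily-telemetry", now - timedelta(days=3))
cold = Dataset("2015-archive", now - timedelta(days=2000))
print(assign_tier(hot, now))   # all-flash SAN (NVMe)
print(assign_tier(cold, now))  # tape archive
```

A real policy manager would weigh more signals (access frequency, data size, compliance requirements), but the core idea is the same: recently used data earns fast media, and dormant data migrates toward cheaper tiers until it is requested again.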
The ViON Forever Data Cloud can integrate the policy manager with existing infrastructure and on-premises private cloud infrastructure to balance speed of access, data staging, and cost-effectiveness while maximizing the efficiency of existing resources. The solution can be augmented with public cloud storage for disaster recovery, as well as a leading perimeter cybersecurity solution for data protection. The outcome is clear: more efficient use of the customer’s existing storage, with the same experience as the public cloud and without the risk of escalating charges.
As the drive toward AI, ML, and other high-performance, data-driven applications continues to accelerate, data management will play an increasingly important role in how impactful these tools are for any organization. As we evolve further into the world of AI, the strengths and weaknesses of a data management strategy will only be magnified. To learn more about how to build your own Forever Data Cloud, visit our website at vion.com/FDC.