Today, we have three subject matter experts in the studio to discuss tools that can give federal leaders confidence in today's dangerous threat environment.
When we look back on 2020, federal information technology professionals will note the drastic increase in remote users, the move to the cloud to enable collaboration, and the serious threats that arose because of this unfortunate series of events.
Data security has taken on new urgency because of a drastic increase in ransomware attacks. In fact, March 2020 alone saw a reported 148% increase in ransomware attacks. Amid all this transition, it is easy to lose sight of some core principles of data management: specifically, data backups as the first line of defense against ransomware.
Today’s interview features two subject matter experts who place the concept of data protection into the new order of federal information technology. The transition to remote work was relatively painless; the concern now is how to make people more productive.
It makes no sense to have highly trained data management professionals doing what Jeff Phelan calls “pick and shovel” work. By that he means that in many agencies, much of the data backup is still done manually.
A best practice here is to automate what you can and save talent for the hard problems; Phelan suggests automating as much of the data backup process as possible.
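To make the idea concrete, here is a minimal sketch (not from the interview, and not any specific agency's tooling) of the kind of routine task that can be scripted instead of done by hand: producing a timestamped archive of a directory. The function name and layout are illustrative assumptions.

```python
import datetime
import pathlib
import tarfile


def backup_directory(src: str, dest_dir: str) -> pathlib.Path:
    """Create a timestamped .tar.gz archive of src inside dest_dir.

    Illustrative sketch of automating routine "pick and shovel"
    backup work; a real deployment would add scheduling, logging,
    and verification on top of something like this.
    """
    src_path = pathlib.Path(src)
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = dest / f"{src_path.name}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src_path, arcname=src_path.name)
    return archive
```

Run on a schedule (cron, systemd timers, or an orchestration tool), a script like this frees skilled staff from repetitive archive creation.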
The origin of data has changed as well. With 90% of staff remote, each person on the network is creating data independently, and endpoints are becoming the tactical edge. Is a daily backup enough? Agencies must consider managing these endpoints so that data is backed up in a more timely manner.
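One common way to back up endpoints more often than a nightly full backup is an incremental pass that copies only files changed since the last run. The sketch below is a simplified illustration of that idea (mtime-based, no deduplication or encryption); the function name and parameters are assumptions, not a real product's API.

```python
import pathlib
import shutil


def incremental_sync(src: str, dest: str, since: float) -> list[str]:
    """Copy files under src modified after `since` (epoch seconds) to dest.

    A toy incremental backup: frequent small passes like this keep
    endpoint data protected between full backups. Returns the
    relative paths that were copied.
    """
    copied = []
    src_path, dest_path = pathlib.Path(src), pathlib.Path(dest)
    for f in src_path.rglob("*"):
        if f.is_file() and f.stat().st_mtime > since:
            target = dest_path / f.relative_to(src_path)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps
            copied.append(str(f.relative_to(src_path)))
    return copied
```

Scheduling a pass like this every hour (or continuously, via filesystem-change notifications) narrows the window of data that a ransomware event could destroy.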
Michael Lamb notes that today’s data volumes mean backup systems must support search and analytics more dynamically. Modern data protection means not only protecting the data but also giving administrators the flexibility to search for specific data points and analyze what is in the stored volume.