It’s prediction season in Federal IT, and time to take bets on whether 2017 will (finally) be the year agencies find a way to speed their path to the cloud.
In the world of big data, "predictive analytics" is all the buzz. You put tons of information into a big data machine, a magical algorithm does its part and BOOM - you can suddenly predict the future, right? Well, not so fast. While the term predictive analytics has panache, it's not an exact science - it's the process of calculating probability based on the analysis of data points. Determining probable outcomes requires the synthesis of mountains of individual data points, a valid data mining methodology and a clear idea of the answers you are seeking. The data must be gathered, organized, cataloged and indexed before the analytics can begin. This less-than-glamorous side of predictive analytics makes all the difference in the success or failure of projecting outcomes.
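The pipeline described above - gather the data, organize and index it, then compute a probability - can be sketched in a few lines. This is a hypothetical illustration only: the data, field names, and the rain/pressure scenario are invented, and a real deployment would involve far larger datasets and a proper statistical model rather than a raw empirical frequency.

```python
from collections import defaultdict

# Step 1: gather raw data points (here, invented past weather observations).
raw_records = [
    {"pressure": "low", "rain": True},
    {"pressure": "low", "rain": True},
    {"pressure": "low", "rain": False},
    {"pressure": "high", "rain": False},
    {"pressure": "high", "rain": False},
]

# Step 2: organize, catalog, and index the data before any analytics run.
index = defaultdict(list)
for rec in raw_records:
    index[rec["pressure"]].append(rec["rain"])

# Step 3: the "analytics" - an empirical probability, not magic.
# Returns None when no matching data points exist.
def rain_probability(pressure):
    outcomes = index.get(pressure, [])
    return sum(outcomes) / len(outcomes) if outcomes else None

print(rain_probability("low"))   # fraction of low-pressure days with rain
```

Note that most of the code is steps 1 and 2 - the unglamorous gathering and indexing - while the prediction itself is one line, which mirrors where the real effort goes.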
While the final presidential debate is a wrap, the (slightly quieter) Federal cloud debate continues. Like the election, cloud has the potential to impact government for many years to come, from ensuring soldiers have the intelligence they need to make sound decisions on the battlefield, to giving NIH the ability to pool medical research and find new cures for disease.
Bridging the Technology Divide: How Mainframe as a Service Delivers Flexibility and Innovation While Driving Down Costs
Today’s state and local government agencies, as well as educational institutions, must be able to respond to an array of requirements and speed the delivery of mission-critical applications. To do so, they are expected to leverage leading-edge IT to provide social services and support to their citizens and communities. Otherwise, they risk failing the populations they are dedicated to serving.
With hurricane season upon us, and Hurricane Matthew dominating the headlines, I pay close attention to the weather to see if the Washington, DC area will be impacted. The European weather model sometimes says “yes,” while the U.S. weather model says “no.” That raises the question: how can two data analytics platforms looking at essentially the same information come up with such different predictions?
Companies and organizations struggling with the influx of video surveillance data can breathe a little easier today. Hitachi Data Systems just introduced its Video Management Platform (VMP), a new solution that makes video storage and organization accessible to everyone. The hardware and software components of the VMP are designed and configured specifically to support video data from a few hundred cameras up to thousands of cameras simultaneously.
We are in the middle of a perfect data management storm for CIOs and CTOs in the federal government. Data is growing at an exponential rate, expected to reach 44 zettabytes globally by 2020; there are regulatory pressures around storing, accessing and securing that data; running advanced analytics on this data promises potential benefits; and in the midst of all these requirements, there are cost pressures to accomplish these goals with less budget. Those aren’t my thoughts (although I completely agree and sympathize) - they were shared recently by Richard Young, CIO of the Foreign Agricultural Service at USDA, during a discussion hosted by MeriTalk on which I had the privilege of collaborating. This accurately sums up the pressures CIOs and CTOs are facing, and it underscores the need for an effective strategy that aligns technology with process and takes a long-term approach to meeting these challenges. I thought it would be valuable to break down our conversation and address each of these topics independently.
It seems like we hear a new story about the value of video surveillance technology almost every week. And each time, it becomes clearer that the value is measured more in lives saved than in money. Recent weeks have brought the point home in a particularly vivid way, with multiple bombing incidents and the subsequent arrest of a suspect in New York City.
Buying or upgrading a mainframe is a significant undertaking that many organizations put off for as long as possible. First and foremost, mainframe replacements or upgrades are enterprise-class IT investments that typically require a significant capital budget. In today’s economically conscious atmosphere, where every single upgrade is meticulously scrutinized, government agencies and private enterprises are extremely cost-sensitive to new capital investments. Second, mainframe hardware and software are historically very reliable - so much so that many organizations are willing to take the risk of delaying upgrades or new purchases for as long as possible.
On May 7, 2016, the NYPD responded to a disturbance in the South Bronx on East 151st Street. Multiple people were stabbed in what seemed to be a drunken street brawl. By the end of the night, Roberto Rodríguez was dead and Lázaro Martínez was in surgery fighting for his life. As the police investigated, and as the New York Times reported on August 24, this crime that seemed to be a simple street fight turned out to be much more complicated. Video surveillance of this dangerous section of New York City played an important role in the investigation, and it illustrates how visualization systems can be the critical force multiplier in the battle to prevent crime and solve cases.