Trying to Reason with Hurricane Season – Is Weather Prediction a Big Data Analytics Problem?

October 21st, 2016 | Blog

With hurricane season upon us and Hurricane Matthew dominating the headlines, I have been paying close attention to the weather to see whether the Washington, DC area will be affected. The European weather model sometimes says "yes," while the U.S. weather model says "no." This raises the question: how can two data analytics platforms looking at essentially the same information come up with such different predictions? As it turns out, meteorologists face many of the same challenges that Chief Data Officers (CDOs) and Chief Data Analytics Officers (CDAOs) face in the public and private sectors: budget, resources, access to information, and the right technology.

Discrepancies in weather prediction are not uncommon. In 2012, the European model predicted that Hurricane Sandy would have a significant impact on the mid-Atlantic, while the U.S. model had it going out to sea. According to Diana Kwon's article in Scientific American, "Are Europeans Better Than Americans at Forecasting Storms?", the consensus among meteorologists is that the European model is better at predicting weather, and in the case of Sandy this unfortunately proved true. Meteorologists continuously gather thousands of data points from roughly 5,000 weather stations, 800–1,100 upper-air stations, 2,000 ships, 600 aircraft, and various other sources that measure temperature, winds, air quality, global atmospheric composition, soil moisture, and much more. So if two models have access to much of the same information, why the different results?

The primary reason for the disparity lies in the compute power and data simulation capabilities of the different platforms. The Europeans have invested more heavily in technology, giving them greater capacity to ingest more sensor data at a higher frequency and run more sophisticated analytics on it. Why is their compute power so much better? Because they have put more resources into developing their weather models. This probably sounds familiar to anyone in charge of a budget for any type of IT deployment: "Do more with less."

Budget challenges aside, the weather models illustrate the importance of having the right technology platform in place to achieve the desired outcomes, and this is especially true in the realm of big data analytics. In previous blogs, we've talked about the difference between batch and real-time analytics. The European Centre for Medium-Range Weather Forecasts (ECMWF) uses its forecast models and data assimilation systems to "reanalyze" archived observations, creating global data sets that describe the recent history of the atmosphere, land surface, and oceans. It is able to take historical (batch) data and reanalyze it in the context of current operational (real-time) data to build a prediction model that draws from a broad spectrum of information. The European system is able to run 50 simulation cycles, whereas the American model can run only 20 – there is a direct correlation between processing capability and the quality of the resulting predictions.
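To make the "more simulation cycles" point concrete, here is a toy sketch of ensemble forecasting in Python. It is not the ECMWF's or NOAA's actual method: the perturbation sizes, the trivial "time step," and the 24-step horizon are all invented for illustration. The idea it demonstrates is real, though – each ensemble member starts from slightly perturbed initial conditions, and more members sample the range of possible outcomes more thoroughly, which is one reason raw compute capacity matters.

```python
import random


def run_ensemble(n_members, initial_temp=20.0, seed=42):
    """Run a toy forecast ensemble.

    Each member perturbs the initial condition slightly (to represent
    observation uncertainty) and steps a trivial stand-in 'model' forward.
    Returns the ensemble mean and spread (standard deviation).
    """
    rng = random.Random(seed)
    forecasts = []
    for _ in range(n_members):
        # Perturbed initial condition for this member.
        temp = initial_temp + rng.gauss(0, 0.5)
        # Trivial stand-in for 24 physics time steps.
        for _ in range(24):
            temp += rng.gauss(0.1, 0.2)
        forecasts.append(temp)
    mean = sum(forecasts) / n_members
    spread = (sum((f - mean) ** 2 for f in forecasts) / n_members) ** 0.5
    return mean, spread


# A 50-member ensemble samples the distribution of possible outcomes
# more densely than a 20-member one, at 2.5x the compute cost.
mean_20, spread_20 = run_ensemble(20)
mean_50, spread_50 = run_ensemble(50)
```

The spread is as important as the mean: a forecaster reads a wide ensemble spread as low confidence, so under-sampling the distribution with too few members can make a forecast look more (or less) certain than it should.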

How can American weather modeling close the gap in forecasting accuracy? And at a broader level, how do organizations responsible for analyzing large data sets accomplish multiple simultaneous efforts at scale without overtaxing resources? Managing the volume, variety, veracity, and velocity of big data while performing complex analytics requires significant computing power and manpower, which comes at a price. The old adage says, "you get what you pay for." There may be some truth to this, but when it comes to a data analytics solution, it's a bit more nuanced.

At ViON, we talk to many IT organizations investing in big data analytics, and the recurring challenge we hear most often is resources. Big data analytics requires a scalable platform with the capacity to manage massive volumes of information, and the data scientists, analysts, and big data experts who can derive value from that data. These organizations need to scale hardware and software systems to function under the stress of more data sources, more historical observations, and more aggressive analytics and modeling. For many, this is a time-consuming, complicated, and costly endeavor, and they lack the in-house expertise. ViON's DataAdapt solution helps address these challenges through a preconfigured big data analytics platform that integrates hardware and software and can be quickly and easily deployed in the data center, allowing analysts to begin ingesting data and performing analytics immediately. We can't promise perfect weather forecasts, but we can ensure that you have the power to handle your most complex analytics challenges with confidence.
