Big Data is Getting Bigger
In 2016, big data spending and the growth rate of investment in the big data industry are expected to hit an all-time high.
According to public sector IT distributor immixGroup, civilian big data spending could reach as high as $2 billion this coming year. Combined with defense-sector spending, the US alone could spend an annual total of $3.6 billion on big data.
“Big data” generally refers to data sets too large to be handled by traditional data processing applications. With the right IT equipment, the data can be analyzed computationally to reveal useful trends and associations, especially in business and human behavioral contexts.
Big data is commonly associated with web behavior and social network interactions, but more traditional data (regarding product transactions, financial records, etc.) also falls into the category of big data.
Big data can also be sorted into unstructured and multi-structured data.
Unstructured data derives its name from its disorganized nature and the inherent difficulty of fitting it into any pre-defined model. Unstructured data often comes in the form of text, whether it be metadata or a tweet.
Multi-structured data comes in many forms and is created by non-transactional systems such as machines, sensors, and customer interaction streams. The variety of the systems from which it originates makes it extremely expensive and difficult to use.
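The contrast between the two categories can be sketched in a few lines of Python. This is an illustrative example, not drawn from the article: a traditional transaction record fits a fixed schema directly, while unstructured text like a tweet has no pre-defined fields and needs extra processing before it yields anything useful.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    # Traditional, structured data: every record has the same fields,
    # so it slots straight into a pre-defined model (or database table).
    customer_id: int
    amount_usd: float
    timestamp: str

# Structured: fits the schema with no extra work.
record = Transaction(customer_id=42, amount_usd=19.99,
                     timestamp="2016-01-15T09:30:00")

# Unstructured: free-form text with no schema. Even a simple question
# like "what topics does this mention?" requires ad hoc parsing.
tweet = "Just upgraded my data warehouse, queries are 10x faster! #bigdata"
hashtags = [word for word in tweet.split() if word.startswith("#")]
print(hashtags)  # → ['#bigdata']
```

The field names and the hashtag heuristic here are hypothetical; the point is only that the structured record is queryable as-is, while the text is not.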
In 2013, Gartner estimated that 85% of the Fortune 500 would remain unable to exploit big data through 2015. The velocity at which data is created and the diversity of its many forms present an unprecedented problem for businesses and government agencies alike. The information is there, and it is useful, but how to sort through all of it remains a largely uncracked code.
That said, the U.S. federal government is on the case. The amount of information that federal agencies must collect, store, process, and manage is, of course, unimaginably large, and due to the government’s surprising shortage of big data talent, the private sector is getting involved.
Global IT company Unisys Federal recently sponsored a survey that found that 46% of respondents (all IT managers of US federal agencies) “plan to increase their use of third-party consultants and contractors for big data initiatives in the coming year.”
Considering the U.S. federal government now needs to take charge of exabytes (bytes in the quintillions) of data, those 46% might be onto something.
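To put "exabytes" in perspective, a quick back-of-the-envelope check (illustrative, not from the article) shows the scale involved: one exabyte is 10^18 bytes, a quintillion bytes, or a billion gigabytes.

```python
# Byte-unit scale check: an exabyte is a billion gigabytes.
GIGABYTE = 10**9   # one billion bytes
EXABYTE = 10**18   # one quintillion bytes

print(EXABYTE // GIGABYTE)  # → 1000000000 (gigabytes per exabyte)
```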
The movement towards implementing big data analysis has actually been a two-steps-forward, one-step-back progression.
In 2012, federal spending attributed to big data topped out at $832 million. By 2013, that number had fallen to $693 million. 2014 saw gains again, with federal spending climbing back up to $1 billion.
Some analysts believe that big data for private and public sector use could reach as high as $3.9 billion by 2017 and $4.2 billion by 2018.
Big data analysis has a lot of ground to cover, but clearly money and expertise (at least in the private sector) are not in short supply.