According to statistics from IDC, global data use is expected to grow as much as 44-fold, reaching some 35.2 zettabytes (ZB - a billion terabytes). At the same time, individual datasets have grown larger, demanding greater processing power to analyse and make sense of them.
Storage giant EMC notes that around 1000 of its customers currently store more than a petabyte of data on its arrays, a figure expected to grow to 100,000 by 2020. Some customers are also expected to reach a thousand times that - an exabyte or more - within the next two years.
"Every time we've estimated the growth rates, we've been wrong, and we've always been wrong in the wrong direction," president of EMC's unified storage division, Rich Napolitano, said.
"It's always growing faster. Use cases expand; it's been true for 30 years and the data types are richer and richer."
According to Melbourne IT CTO, Glenn Gore, the big data phenomenon has been on the radar for some 18 months already.