IDC: big data skills crisis needs vendor action

21.04.2012
Enterprise IT departments are struggling to take advantage of the benefits big data can bring due to a lack of skills in the marketplace, according to IDC research VP Susan Feldman.

At a big data roundtable this week, Feldman explained that the complexity of big data technology requires an advanced skillset that is quite rare amongst IT professionals.

"There aren't a lot of people who are very skilled in these new technologies. How are enterprises supposed to hire people if they aren't there?" asked Feldman.

The most common technology used by companies to analyse hundreds of terabytes, or even petabytes, of unstructured data is an open-source tool called Hadoop.

Hadoop uses a parallel-processing model called MapReduce, which allows analytics jobs to be run simultaneously across hundreds of servers, each with many disk drives. It stores data in the Hadoop Distributed File System (HDFS), in effect a flat file system that spreads data across multiple disk drives and servers.
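The map/reduce idea described above can be sketched in a few lines of plain Python. This is an illustrative toy, not the Hadoop API: a process pool stands in for Hadoop's worker nodes, the "map" step counts words independently in each chunk of input, and the "reduce" step merges the partial counts.

```python
# Minimal sketch of Hadoop-style map/reduce word counting.
# Illustrative only: a local process pool stands in for worker nodes.
from collections import Counter
from multiprocessing import Pool


def map_chunk(lines):
    """The 'map' step: count words in one chunk of the input."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts


def reduce_counts(partials):
    """The 'reduce' step: merge per-chunk counts into one total."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total


if __name__ == "__main__":
    data = ["big data big skills", "data tools", "big demand"]
    # Split the input into two chunks and map them in parallel.
    chunks = [data[0::2], data[1::2]]
    with Pool(2) as pool:
        partials = pool.map(map_chunk, chunks)
    print(reduce_counts(partials))
```

In real Hadoop the chunks would be HDFS blocks distributed across the cluster, and the framework, rather than the programmer, would handle scheduling, data locality, and fault tolerance; much of the skills gap Feldman describes lies in operating that machinery.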

However, it is widely agreed in the industry that Hadoop is an extremely complex system to master and requires intensive developer skills. The open-source offering also lacks an effective surrounding ecosystem and standards.