Google's BigQuery Offers Infrastructure to Crunch Big Data

01.05.2012
Few companies in the world have access to datasets as large as Google's, and, unsurprisingly, Google is at the forefront of Big Data analytics. Now Google plans to share the wealth by giving others access to its data-crunching infrastructure with its new Google BigQuery Service.

The BigQuery service is an online analytical processing (OLAP) system designed for terabyte-scale datasets. It lets customers run SQL-like queries against massive datasets, potentially with billions of rows, without the hardware and software costs of an on-premises solution. BigQuery has been in beta test, or what Google calls "limited preview," since last November. Now Google believes it's ready for prime time.

"The service is conceived so customers can upload their own data," says Ju-Kay Kwek, product manager for BigQuery and lead on several of Google's other Big Data efforts. "They can store it all in Google and then, either through a RESTful API or very simple Web UI, they can interrogate their data."
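To make the RESTful workflow Kwek describes concrete, the sketch below assembles the URL and JSON body a client might POST to BigQuery's v2 query endpoint. The project ID is a placeholder, the endpoint shape is assumed from Google's API conventions of the time, and nothing is sent over the network; treat it as an illustration, not the definitive client code.

```python
import json

# Placeholder project ID -- an assumption for illustration, not from the article.
PROJECT_ID = "example-project"

# A SQL-like query of the kind BigQuery accepts; the table name refers to
# one of Google's public sample datasets.
QUERY = """
SELECT word, COUNT(*) AS occurrences
FROM publicdata:samples.shakespeare
GROUP BY word
ORDER BY occurrences DESC
LIMIT 10
"""


def build_query_request(project_id: str, sql: str) -> tuple[str, str]:
    """Assemble the URL and JSON body for a BigQuery query call.

    The endpoint path follows the v2 REST pattern; in a real client this
    request would be sent with an OAuth-authorized HTTP POST.
    """
    url = f"https://www.googleapis.com/bigquery/v2/projects/{project_id}/queries"
    body = json.dumps({"query": sql.strip(), "timeoutMs": 10000})
    return url, body


url, body = build_query_request(PROJECT_ID, QUERY)
print(url)
print(body)
```

A real client would POST `body` to `url` with an OAuth 2.0 bearer token and then poll for results; the "very simple Web UI" Kwek mentions wraps the same query mechanism for interactive use.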

"Imagine a big pharmaceutical company optimizing daily marketing spend using worldwide sales and advertisement data," Kwek adds. "Or think of a small online retailer that makes product recommendations based on user clicks."

Kwek notes that one BigQuery customer, social and mobile analytics specialist Claritics, used the service to build a Web application that gives game developers real-time insight into user behavior. Another customer, Amsterdam-based analytics firm Crystalloids, built a cloud-based application to help a resort network analyze customer reservations, optimize marketing and maximize revenue.