NextAction taps Oracle for database marketing

02.05.2006
Some people choose databases from Oracle Corp. because of their cutting-edge features, but Steve Helle, chief technology officer at Denver-area database marketing firm NextAction Corp., wanted something else.

'I like stability more than features,' said Helle. And no wonder: When Helle arrived at Westminster, Colo.-based NextAction in mid-2004, the company was plagued with near-weekly crashes of its core database, which aggregates and analyzes consumer purchasing data from more than 1,100 retailers such as Home Depot and Williams-Sonoma.

NextAction uses a custom-written extract, transform and load (ETL) tool to clean up and import the data. Unlike competing consumer databases, which track only where and when a consumer shops, NextAction's can also tell what the consumer actually bought at each store, according to Helle.

In 2004, the company's 13TB data warehouse was spread across five Microsoft SQL Server 2000 databases. But within a year, the fast-growing firm's database had grown to 30TB spread across eight instances of SQL Server 2000. 'My guys were working 70 to 80 hours a week,' he said. 'We were dying on the vine.'

NextAction's data warehouse was growing too fast to easily manage on SQL Server, said Helle, and it would crash so often that the process of mining data for retailers grew from three and a half days to eight days -- a major problem, since the data mining took place weekly. 'That's what really impacted revenue, the fact that it failed quite often,' Helle said.

One of the first things Helle, a 15-year veteran of database marketing, did was convince his boss that NextAction needed to upgrade to new hardware and a new database to handle its intensive and fast-growing data needs.

Partly because the rest of NextAction was mostly a Microsoft .Net environment, Helle tested the then-beta version of SQL Server 2005 against Oracle 10g, both of which had 64-bit versions.

Microsoft treated Helle well, putting NextAction into a special program for extremely large databases that included plenty of support. But within two weeks, the results were clear to Helle. 'SQL Server has made great strides in the past couple of years. But like I told the Microsoft guys, it's hard to match [Oracle's] 15 years of evolution,' he said.

Not only was Oracle Enterprise 10g much more stable, it was also much faster, Helle said. 'We ran about 20 benchmarks. Creating applications was, on average, eight times faster on Oracle. Loading data was about 12 times faster. An updating statement that died after 1.5 days on SQL Server without completing took one hour and 10 minutes to finish on Oracle.'

The weekly customer data mining now takes only one day. That is in part because NextAction uses Oracle Partitioning to run more than 1,100 partitions, one for each customer.
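The article doesn't detail NextAction's schema, but the core idea of per-customer partitioning can be sketched in plain Python: rows are routed into a separate bucket per retailer, so a query scoped to one retailer reads only that retailer's partition rather than scanning everything. This is a minimal illustration of the routing concept, not Oracle's implementation; all names and data are made up.

```python
from collections import defaultdict

class PartitionedTable:
    """One logical table, physically split into per-retailer partitions.

    A rough analogue of list partitioning: the partition key (here,
    retailer_id) decides which physical bucket each row lands in.
    """

    def __init__(self):
        self.partitions = defaultdict(list)  # retailer_id -> list of rows

    def insert(self, row):
        # Route the row by its partition key. In Oracle this routing is
        # declared once in DDL and applied automatically on insert.
        self.partitions[row["retailer_id"]].append(row)

    def scan(self, retailer_id):
        # "Partition pruning": a query filtered on the partition key
        # touches only the matching partition, not the whole table.
        return self.partitions.get(retailer_id, [])

table = PartitionedTable()
table.insert({"retailer_id": "home_depot", "sku": "A1", "amount": 19.99})
table.insert({"retailer_id": "williams_sonoma", "sku": "B2", "amount": 54.00})
table.insert({"retailer_id": "home_depot", "sku": "C3", "amount": 7.49})

print(len(table.scan("home_depot")))  # prints 2: only that partition is read
```

With one partition per retailer, a weekly mining job for a single retailer never pays the cost of the other 1,100-plus retailers' data, which is consistent with the speedup the article describes.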

That 30TB database was shrunk to just 5TB after being rewritten -- and it was moved from eight instances to a single Oracle instance. For backup purposes, NextAction is using Oracle's Data Guard, which maintains a second, synchronized standby copy of the Oracle database.

Although the majority of databases Helle had run involved Oracle on Unix, for speed and cost reasons he chose a Hewlett-Packard Co. ProLiant DL585 server with four AMD Opteron processors running Novell Inc.'s SUSE Linux operating system. That configuration cost about US$40,000, versus $2 million for the prior SQL Server-based system.

That price doesn't include the storage, which Helle acknowledged was expensive. But it is also large: the HP StorageWorks XP12000 disk array can store 330TB, which Helle thinks should easily last him five years. Just in case, though, the HP can manage a total of 3 petabytes via auxiliary systems.

Helle is also testing Oracle Real Application Clusters (RAC) to help ease future growth.