Capacity in the Cloud

July 13, 2009

It's relatively easy to host and transfer large amounts of data from a day-to-day, user-level perspective, says Rob Walters, general manager of the Dallas office of cloud hosting company The Planet. But moving 20TB to 25TB of data in a chunk continues to daunt current systems. "The networks that we have [today] just aren't good at it. It's just a weak point right now, and everybody is looking at dealing with that," Walters says.

For enterprises, the "initial ingestion" of backup data can be handled by copying it to the cloud over a WAN or LAN link, but "that initial backup, depending on how much data you have on your server, could take weeks," Couture cautions.
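Both the 20TB "chunk" problem and the weeks-long initial backup are easy to sanity-check with back-of-the-envelope math. The sketch below estimates transfer time for a 20TB payload over a few link speeds; the speeds and the 70% link-efficiency factor are illustrative assumptions, not figures from Walters or Couture.

```python
# Rough transfer-time estimate for a bulk cloud ingestion.
# Link speeds and the 70% efficiency factor are illustrative
# assumptions, not figures from the article.

TB = 10**12  # decimal terabyte, in bytes

def transfer_days(payload_bytes: float, link_bps: float, efficiency: float = 0.7) -> float:
    """Days needed to push payload_bytes over a link running at
    link_bps, derated by an efficiency factor for protocol
    overhead and daytime contention."""
    seconds = payload_bytes * 8 / (link_bps * efficiency)
    return seconds / 86_400

payload = 20 * TB  # low end of the 20TB-25TB chunk Walters describes
for label, bps in [("10 Mbit/s", 10e6), ("100 Mbit/s", 100e6), ("1 Gbit/s", 1e9)]:
    print(f"{label:>10}: {transfer_days(payload, bps):7.1f} days")
```

At 100 Mbit/s, the 20TB move takes close to four weeks of continuous transfer, which lines up with Couture's warning; even a dedicated gigabit link needs a few days.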

Doctors' offices that hire Arvada, Colo.-based Nuvolus to create private cloud storage for their sensitive patient data don't like that data being copied and physically carried out of their offices, says Nuvolus CEO Kevin Ellis. So the company requires its health care industry clients to have "a decent Internet connection" -- typically 500Kbit/sec. -- and transfers the backup data over the pipes.

"Depending on the office, we could be looking at pretty long upload times," he says. "You're uploading overnight. We're trying to make sure we're not impacting the doctor's office during the day as well."