Why Zynga moved from public cloud to hybrid cloud

Alex Barrett

LAS VEGAS — Listening to Zynga CTO of Infrastructure Allan Leinwand talk about his company’s hybrid cloud, data center managers learned what they had suspected all along: Used at scale, the public cloud is no bargain.

In many ways, Zynga Inc., the company that created FarmVille and other popular social network games, has the public cloud to thank for its existence, Leinwand told Interop 2012 attendees in this week’s keynote.

Founded in 2007, Zynga tapped the public cloud in 2009 to help it deal with crushing demand for its new FarmVille game, and the public cloud saw it through its 2010 CityVille launch.

Despite those public cloud successes, Zynga decided to develop its own internal private cloud called zCloud that went live in 2011, Leinwand said. Today, the company uses a hybrid-cloud approach: It hosts 80% of its games internally and 20% on the public cloud – the reverse of its pre-zCloud days.

Among data center professionals, the question was: Why?

“There are only two reasons that would push Zynga to move it back in house,” said an IT director for a Midwestern hospital system who asked to remain anonymous. “Because it was cheaper, or because they were able to gather more information than they could in the public cloud.”

Leinwand alluded to the cost savings, saying that with its own zCloud, Zynga gets by with one server for every three it would have used in the public cloud.

“As they gained more maturity with their apps, they learned the optimizations they could make that were not common options in the public cloud,” said Kurt Marko, an independent IT analyst, during a panel discussion. “That’s probably what led them to that 3:1 efficiency.”

Indeed, at scale, running any application 24/7 in the public cloud costs more than running it on dedicated servers, said John Engates, CTO at Rackspace, Inc., the San Antonio hosting and cloud provider. As such, the company advises its customers to follow the “own the base; rent the spike” principle.
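A little arithmetic makes the principle concrete. The Python sketch below uses entirely hypothetical figures — the hourly rates, baseline capacity, and daily demand curve are made up for illustration and do not come from Zynga or Rackspace — to show why keeping steady-state capacity on owned hardware and renting only the overflow tends to win once servers are busy around the clock.

```python
# Hypothetical "own the base; rent the spike" cost comparison.
# All numbers below are illustrative assumptions, not real pricing.

BASELINE_SERVERS = 10               # steady-state demand, served on owned hardware
OWNED_COST_PER_SERVER_HOUR = 0.10   # amortized hardware + ops (hypothetical)
CLOUD_COST_PER_SERVER_HOUR = 0.30   # on-demand public cloud rate (hypothetical)

def hybrid_hourly_cost(demand: int) -> float:
    """Owned servers are paid for 24/7; only the spike above the base is rented."""
    spike = max(demand - BASELINE_SERVERS, 0)
    return (BASELINE_SERVERS * OWNED_COST_PER_SERVER_HOUR
            + spike * CLOUD_COST_PER_SERVER_HOUR)

def all_cloud_hourly_cost(demand: int) -> float:
    """Everything rented on demand from the public cloud."""
    return demand * CLOUD_COST_PER_SERVER_HOUR

# One day with a launch-style spike in the evening (hypothetical demand, in servers).
demand_by_hour = [10] * 18 + [25, 40, 40, 30, 20, 15]

hybrid = sum(hybrid_hourly_cost(d) for d in demand_by_hour)
cloud_only = sum(all_cloud_hourly_cost(d) for d in demand_by_hour)
print(f"hybrid (own the base, rent the spike): ${hybrid:.2f}/day")
print(f"all public cloud:                      ${cloud_only:.2f}/day")
```

With these made-up numbers the hybrid approach comes out well ahead, and the gap widens the flatter the demand curve gets — which is exactly the 24/7, high-utilization case Engates described.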

That leaves most mainstream IT organizations using the public cloud mainly for dynamic workloads such as test/dev, virtual demos and training, said Brett Goodwin, vice president of marketing and business development at Skytap, Inc., another cloud provider, based in Seattle.

“Most folks are still saying, ‘I’m not going to put my production data in the cloud,’” Goodwin said.

But that is changing. One Skytap customer, for example, operates under the “DOS” principle: Don’t Own Stuff.

“If at all possible, they don’t want to buy stuff, provision stuff, depreciate stuff, or end-of-life stuff,” Goodwin said.

For that class of customer, enterprise cloud providers at Interop busied themselves touting their hybrid-cloud capabilities. Rackspace, for one, offers a Cloud Connect service that uses load balancers to direct traffic usually bound for dedicated servers to spillover servers in its cloud. Terremark Worldwide, Inc. talked up its CloudSwitch software, which lets customers encapsulate workloads running in their private environments and port them seamlessly to its cloud.
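The spillover idea is easier to see in code. The minimal Python sketch below shows the routing decision such a load balancer makes — fill the dedicated pool first, then burst the remainder to cloud instances. The pool names, capacities, and least-loaded policy are assumptions for illustration; the article does not describe how Cloud Connect itself is implemented.

```python
# Minimal sketch of load-balancer spillover from dedicated servers to a cloud pool.
# Backend names, capacities, and policy are hypothetical.

from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    capacity: int      # max concurrent connections this backend should carry
    active: int = 0    # connections currently assigned

DEDICATED = [Backend("dedicated-1", 100), Backend("dedicated-2", 100)]
CLOUD     = [Backend("cloud-1", 100), Backend("cloud-2", 100)]

def pick_backend() -> Backend:
    """Route to the least-loaded dedicated server; spill to cloud only when all are full."""
    for pool in (DEDICATED, CLOUD):
        candidates = [b for b in pool if b.active < b.capacity]
        if candidates:
            choice = min(candidates, key=lambda b: b.active)
            choice.active += 1
            return choice
    raise RuntimeError("all backends saturated")

# Simulate a burst of 250 concurrent connections: the first 200 land on the
# dedicated pool, and the remaining 50 spill over to the cloud pool.
for _ in range(250):
    pick_backend()
for b in DEDICATED + CLOUD:
    print(b.name, b.active)
```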

True public cloud, in short, is only part of these companies’ businesses.

“If I’ve learned one thing,” said Ellen Rubin, Terremark vice president for cloud products, during a panel discussion, “it’s not to see cloud as an isolated thing.”

Let us know what you think about the story; email Alex Barrett, Executive Editor, at abarrett@techtarget.com, or follow @aebarrett on Twitter.