
Systems management in the virtual age: Out with the old

Alex Barrett, Executive Editor

Consider how Facebook has changed the fabric of society by providing new ways for people to connect. With its large, relatively new data center, the social media company and other organizations like it provide fresh ideas for IT pros working on an old problem: systems management.

Many of these large companies use open source tools and tweak them as they see fit. Stuart Radnidge, an infrastructure architect at a large, multinational financial services firm, viewed a video demonstration of Facebook engineers and their back-end management. “You never see guys like them buying from the Big Four,” Radnidge said. “They use open source tools and modify them for their massive scale.”

Indeed, in the era of the Internet and virtualization, IT managers seek inspiration on how to manage their environments from everywhere except the Big Four: IBM, Hewlett-Packard, BMC Software and CA. And for the up-and-coming generation of systems administrators, traditional systems management tools are almost anathema.

“I built a lot of big infrastructure and never once used a Big Four tool,” said Jesse Robbins, the CEO of OpsCode, who in a previous life worked as a systems administrator at Amazon.com. Generationally, systems administrators in these shops “have little knowledge of enterprise management systems and a certain amount of contempt for them,” Robbins said.

Big Four lack cloud readiness
IT pros claim that the incumbents lack the ability to innovate and tend to relabel existing technologies as “cloud products.” And for all their breadth and cost, traditional tools lack sophistication.

Radnidge recalled, for example, how for years he would receive pages from coworkers in the middle of the night because a backup job had triggered a CPU usage threshold alert. “I always assumed that there’d be some notion of behavioral-based monitoring, but the big guys never seemed to catch on to that,” he said.
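As a rough sketch of the difference, a static rule pages on any breach of a fixed line, while a behavior-based check compares tonight’s reading against what is normal for that hour. The numbers, the hourly baseline and the three-sigma rule below are all illustrative assumptions, not a description of any shipping product:

```python
# Static threshold alerting versus a behavior-based baseline check.
# All values here are illustrative assumptions.
import statistics

THRESHOLD = 90.0  # static rule: page whenever CPU exceeds 90%

# One week of observed CPU usage for the 2 a.m. hour, when the backup runs.
history_2am = [88.0, 91.0, 87.5, 90.5, 89.0, 92.0, 88.5]

def static_alert(cpu: float) -> bool:
    """Fires every night the backup pushes CPU past the fixed line."""
    return cpu > THRESHOLD

def baseline_alert(cpu: float, history: list[float], k: float = 3.0) -> bool:
    """Fires only when usage deviates from what is normal for this hour."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(cpu - mean) > k * stdev

tonight = 91.5
print(static_alert(tonight))                 # True: a 2 a.m. page
print(baseline_alert(tonight, history_2am))  # False: normal for backup hour
```

Under a scheme like this, the backup’s predictable overnight spike stays quiet, while the same reading measured against a quiet midday baseline would still page.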

Harsh words, but hardly unusual, said Jonathan Eunice, a principal IT adviser at Illuminata Inc. in Nashua, N.H. Many systems administrators who are old enough to remember the 1980s and 1990s still smart at the mere mention of the Big Four and their espoused “frameworks” philosophy, he said. Those frameworks, and the all-encompassing enterprise management systems built on top of them, earned a reputation for being complex to deploy and configure, to say nothing of being expensive.

From a cost perspective, the price premium that traditional systems management vendors charge certainly appears to have held. Joe Foran, the director of IT at FSW Inc., a nonprofit social services agency in Bridgeport, Conn., is overhauling the organization’s data center, including its systems management tooling. Out of curiosity, he explored monitoring tools from CA but discovered that they would have cost approximately three times more than Hyperic HQ, the open source systems monitoring suite he eventually selected.

Eyeing open source
These days, IT managers look to other sources for new systems management tools.

“If I could start over, I’d probably take a really hard look at what the open source world has to offer,” Radnidge said, citing examples like Ganglia, a performance management and monitoring system designed for clusters and grids, and OpsCode’s Chef, an automation and configuration management platform.
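Chef itself expresses configuration as recipes in a Ruby-based DSL; purely to illustrate the model such tools share, declare the desired state and converge a machine only when it has drifted, here is a minimal Python sketch in which the path and file contents are hypothetical:

```python
# Sketch of the "declare desired state, converge if needed" model behind
# configuration management tools. The path and contents are hypothetical.
import os

def ensure_file(path: str, content: str) -> None:
    """Idempotent resource: rewrite the file only when it has drifted."""
    current = None
    if os.path.exists(path):
        with open(path) as f:
            current = f.read()
    if current != content:
        with open(path, "w") as f:
            f.write(content)
        print(f"converged {path}")
    else:
        print(f"{path} already up to date")

# Running this twice changes nothing the second time; that idempotence
# is what makes such tools safe to rerun across thousands of nodes.
ensure_file("/tmp/example-motd", "managed by configuration management\n")
```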

While most open source software is associated with Linux, these tools can monitor an impressive breadth of systems, said FSW’s Foran. In Hyperic’s case, “I think the only operating system they can’t see is BeOS,” he quipped, surely a corner case in any modern data center.

Further, having access to the source code and a community of developers translates into new features much faster than in a monolithic, closed-source environment, said the senior product manager at a large Canadian telecommunications firm, which last year replaced traditional Big Four monitoring tools with Zenoss, a commercial open source monitoring platform.

In some cases, with traditional tools “requests for development enhancements can fall on deaf ears,” the product manager said. But with commercial open source, the firm can develop new features on its own or leverage the community, bringing new functionality online faster.

And these days, the usual knock against open source – that what you save on licensing costs you make up in personnel – doesn’t really hold true.

“The argument that it’s cheaper to buy something off the shelf than it is to do it yourself – I’m not convinced of that,” Radnidge said. When you compare the high cost of off-the-shelf tools with the cost of an engineer, “how much cheaper is that really?”

Brave new operations teams
Beyond using open source tools, many Internet companies have warmed to a new operational philosophy called DevOps, which combines the Agile practice of frequent updates to production code with a strong focus on automation, configuration management and version control to improve the effectiveness of development and operations teams.

That DevOps philosophy appears to have trickled down to the enterprise.

“DevOps is a mindset – it’s absolutely not a toolset,” Radnidge said. “It comes when there’s a recognition from sys admins and developers and application owners that there’s a massive chasm between them, and that you need to start thinking about adopting automation.”

Granted, the average IT shop looks nothing like a big Internet company: it lacks that scale and typically runs off-the-shelf applications. For that reason, DevOps is more or less a nonstarter at FSW. “We don’t do a lot of in-house development,” Foran said. “There’s no immediate need for frequent code updates, or check in and check out.”

But even slightly larger shops can certainly benefit from DevOps, said Illuminata’s Eunice, especially those that have customized large enterprise resource planning, customer relationship management and database packages.

Traditional IT shops’ modus operandi has been to hold off for as long as possible before installing an update, and then only after it has been thoroughly tested. But that approach can backfire, Eunice said.

“One of the reasons so many shops have had problems with Internet Explorer 6 is because they didn’t consider updating the components – they just sort of let it ride – and now, they’re faced with this incredibly big, pressing update,” Eunice said. In a DevOps shop, that update would have been handled “in more manageable chunks.”

For DevOps, another major focus is automation. And there too, enterprises can benefit.

“Doing things manually is error-prone and fragile,” Eunice said. “If you bring in high-level, policy-based automation, you can reduce your cost of operation and ease the impact of change.”

However, the DevOps mantra that it’s better to beg for forgiveness than to ask for permission might be too much for some organizations. “Obviously that’s a bit extreme for your average enterprise,” Radnidge said. But with good systems in place to do configuration management and automation, “You can always roll back,” he said.
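As a loose illustration of that safety net, the sketch below shows automation that records every configuration version as it applies a change, so any change can be undone; the settings and versioning scheme are hypothetical rather than drawn from any particular tool:

```python
# Hypothetical sketch: record each configuration version on every change
# so that rolling back is always possible.
class ConfigStore:
    """Keeps every applied configuration so each change is reversible."""

    def __init__(self, initial: dict):
        self.versions = [dict(initial)]  # version 0 is the starting state

    def apply(self, change: dict) -> int:
        """Apply a change on top of the current state; return its version."""
        new_state = dict(self.versions[-1])
        new_state.update(change)
        self.versions.append(new_state)
        return len(self.versions) - 1

    def rollback(self, version: int) -> dict:
        """Restore a known-good version by reapplying it as the newest state."""
        self.versions.append(dict(self.versions[version]))
        return self.versions[-1]

store = ConfigStore({"workers": 4, "timeout_s": 30})
store.apply({"workers": 8})   # automated change, recorded as version 1
store.rollback(0)             # change misbehaves? restore the old state
print(store.versions[-1])     # {'workers': 4, 'timeout_s': 30}
```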

Managing the cloud, from the cloud
Slowly but surely, IT managers have embraced a new idea: that some, or all, of their IT infrastructure will no longer be under their control but will instead run in the cloud.

“If legacy systems weren’t an issue, I would try and offload as much of my infrastructure as possible to the cloud for the ease of bringing up new infrastructure quickly,” said Kevin Armour, the CTO at Paycor, a Cincinnati-based payroll services company that is currently exploring Infrastructure as a Service offerings for test and development environments.

According to Bernd Harzog, an analyst at the Virtualization Practice, that’s the new reality of IT: highly dynamic, virtualized applications running simultaneously in private data centers and on shared, multi-tenant public clouds. Traditional systems management tools, meanwhile, were designed for static, on-premises, nonvirtualized, single-tenant environments and, to put it bluntly, “aren’t going to make the transition.”

Harzog said that forward-thinking IT architects pondering systems management need to “start over,” looking for tools designed for this reality, with a focus on the best possible products.

At the very least, IT managers say they support systems management functions running in the cloud.

“There’s a massive array of software that there’s absolutely no need to run internally,” Radnidge said. His organization, for example, subscribes to OpsCode’s Chef service, 37signals’ Web-based collaboration tools, and ThoughtWorks Studios’ Mingle, an Agile project management tool.

In their defense, traditional systems management players have awakened to the new world and responded by acquiring forward-looking newcomers. In the past year, for example, CA purchased 3Tera, maker of a cloud and grid management platform, and Nimsoft Technologies, a Software as a Service-based monitoring provider.

As new players emerge and demonstrate success, expect more acquisitions, said Michael Coté, an industry analyst at RedMonk. “Big vendors don’t chase markets with questionable revenue; that’s what startups are for: to explore new revenue models.”

Let us know what you think about the story; email Alex Barrett, Executive Editor, at abarrett@techtarget.com, or follow @aebarrett on Twitter.