Taking the Shot: When to Go Green With Your Data Center

NWF   |   April 13, 2011

guest post by Jennifer Grayson
Staff writer for campustechnology.com

Mark Twain once said, “I was seldom able to see an opportunity until it had ceased to be one.” He would have made a lousy CIO. In today’s high-pressure IT world, almost every opportunity comes hidden inside a problem. And when it comes to “greening” a data center, the problems can be especially daunting, given institutional inertia, budgetary concerns, politics, and more. For CIOs looking to notch up a win with a leaner, greener data center, the key to success often lies in understanding when those very same problems can be used to press the advantage.

“I call it the institutional readiness factor,” says Susan Malisch, vice president and chief information officer at Loyola University Chicago. “Are things lining up so that you can make a good case for what needs to happen?” Malisch and her team recently jumped on the opportunity to build a new, energy-efficient data center after being forced out of their old one. “Timing is everything in some cases,” she explains, “and for us it was certainly the impetus for us to be able to move forward with a big and expensive project.”

The Data Behind Consolidation

More than three-quarters (78 percent) of higher ed institutions either have or are developing a plan to consolidate their data centers. That’s the word from the 2010 Energy Efficient IT Report, a study performed by CDW Government that looked at energy-efficiency trends in IT at 756 organizations, including 152 colleges and universities.

Among higher ed professionals surveyed for the report, the primary motives behind the drive to consolidate data center operations are:

  • Reducing energy consumption (64%)
  • Reducing expenditures on hardware, software, and operations (60%)
  • Increasing use of new and more efficient computing platforms and technologies (46%)
  • Supporting a green initiative (42%)
  • Increasing IT security (36%)

The Heavens Aligned

A green data center was not initially top-of-mind for Loyola. When opportunity knocked, though, the IT department opened the door wide. “In our case, we had a few things lining up on campus that were, quite honestly, outside our control,” explains Malisch.

Over the years, the school’s population of Jesuits, many of whom taught on campus, had shrunk. Their residence, in a large building in the center of the school’s Lake Shore campus, was quite old, and the university decided to tear it down as part of a master campus design plan (including new academic buildings, green space, and a resized Jesuit residence). Coincidentally, the endpoint for the university’s entire voice and data network was also located in the building.

Moving that infrastructure to the data center wasn’t an option: The server room, located in a different building on campus, wasn’t much of a data center from the outset. The cooling apparatus was so poor that oscillating fans and temporary ductwork were deployed as a stopgap measure to control the room temperature. “It looked like a temporary server room that had grown to accommodate all the equipment of a data center,” says Malisch.

At about the same time, a university-wide disaster-recovery initiative required IT to examine how well equipped Loyola was to respond to a disaster scenario (avian flu was a big concern at the time). The exercise was revealing: “We really didn’t have a good story to tell around our systems and the environment that we had them in,” Malisch admits.

So the university began building a new 2,362-square-foot data center, using a renovated space in the basement of Loyola’s Dumbach Hall. Its location, across from the campus steam/chilled-water plant, allowed the school to supply the new data center with cooling via a new chilled-water network.

As luck would have it, Loyola was simultaneously constructing its first LEED-certified building, so IT staff tapped the project’s lead design engineer to assist in the design of its new energy-efficient data center, too. In addition, Dan Vonder Heide, Loyola’s director of infrastructure services, and the project manager from facilities attended a weeklong training workshop on data center design at the Uptime Institute.

The Uptime Institute

Loyola University Chicago has a reputation for being green. It boasts a forward-thinking Center for Urban Environmental Research & Policy; it has committed all its future construction to LEED Silver certification; and, last fall, it became the first school in the United States licensed to produce and sell biodiesel fuel. So when Loyola’s IT staff moved forward with the data center overhaul, it only made sense for them to seek out the most energy-efficient technology available.

The Uptime Institute, where Vonder Heide and the facilities project manager took their weeklong workshop, provides education to data center professionals and has taken on the emerging issue of data center efficiency, working alongside environmental leaders such as the Department of Energy, the EPA, and The Green Grid.

It was at the Uptime Institute that the Loyola team learned the ins and outs of leveraging local climate for data center cooling.

Susan Malisch, Loyola’s CIO, recommends a training program such as the Uptime Institute to any institution looking to redesign its data center. And since communication is key, she advises that Facilities and IT attend the training together. “The partnership, the relationship that you create [between the two departments] is huge,” she says. “Because you’re really working with a project manager that’s probably never designed a data center before. The kinds of things you need to address are different in that kind of space versus a classroom or residence halls or the typical stuff that we usually do on a college campus.”

The Uptime Institute also hosts an annual Symposium on Green Enterprise IT and recognizes groundbreaking energy-efficiency initiatives with its Green Enterprise IT Awards.

Based on this training, the university decided to install a winter-mode economizer for the new data center, in which outside air is used to cool the water feeding the data center’s air-handling units. Given Chicago’s famously frigid winter weather, it was a relatively simple decision with a big upside in cost savings.
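
Conceptually, the economizer decision reduces to a temperature comparison: when the outside air is cold enough to chill the water loop below its supply setpoint, the mechanical chillers can stand down. The Python sketch below is a minimal illustration of that logic only; the setpoint, approach margin, and function name are assumptions made for the example, not details of Loyola’s actual building controls.

    # Illustrative free-cooling (water-side economizer) decision logic.
    # All thresholds are hypothetical; a real building-management system
    # would also weigh humidity, heat-exchanger approach, and hysteresis.

    CHILLED_WATER_SETPOINT_F = 45.0  # assumed supply-water temperature target
    ECONOMIZER_APPROACH_F = 8.0      # outside air must beat the setpoint by this margin

    def use_economizer(outside_air_temp_f: float) -> bool:
        """True when outside air can cool the water loop without the chillers."""
        return outside_air_temp_f <= CHILLED_WATER_SETPOINT_F - ECONOMIZER_APPROACH_F

    print(use_economizer(20.0))  # True: a typical Chicago winter day
    print(use_economizer(60.0))  # False: mechanical cooling still required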

In the four years since, the department has virtualized more than 70 percent of its servers and upgraded to Energy Star equipment. Electronic meters installed in December 2010 will soon give Loyola a sense of the energy savings, and those data will inform future design and purchasing decisions. All the work has paid off, it seems: In 2007, the data center won an Excellence in Engineering Award from the Illinois chapter of the American Society of Heating, Refrigerating and Air-Conditioning Engineers.
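
Once data start flowing from those meters, they will let Loyola compute the efficiency yardstick promoted by The Green Grid: power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by the IT equipment itself. A minimal sketch of the calculation, using made-up readings rather than Loyola’s actual figures:

    # Power usage effectiveness: total facility kWh / IT-equipment kWh.
    # A PUE of 1.0 would mean every watt goes to computing; data centers
    # of this era commonly ran near 2.0. The readings below are invented.

    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        return total_facility_kwh / it_equipment_kwh

    print(round(pue(180_000, 100_000), 2))  # 1.8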

“The intent wasn’t greening the data center,” says Malisch, whose initial goal was simply to find new space on campus when her old building was torn down. “But if you’re going to go through all that effort, you take a look at redesigning what you have, right?”

Making a Fresh Start

Sometimes, a glaring IT problem–and a real opportunity–can become buried under the press of other projects. In such cases, it often takes a fresh pair of eyes to see what needs to be done.

That was the case at Randolph College, a liberal arts school in Lynchburg, VA, where Victor Gosnell took the helm as CTO in 2008. Prior to his arrival, the IT team had been very busy: It was in the final stages of a campuswide wireless access point rollout; it had also recently transitioned the school to a new VoIP phone system and had upgraded many lab and public-access computers. All of these projects were worthwhile and much needed. To Gosnell, though, it was immediately obvious that the school’s data center was crying out for attention. Randolph’s enrollment is around 500 students, yet the school had more than 50 physical servers–some dating back eight years–in its rack.

“It seemed like overkill, the amount of energy it was consuming,” says Gosnell, referring both to the electricity the data center gobbled up and the man-hours needed to maintain it. “The amount of time, energy, and cost to keep up all those pieces of physical equipment was exponentially unnecessary for the size of the institution and our data-processing needs.”

Fortunately, Chris Burnley, Randolph’s CFO and Gosnell’s boss, shared his concerns. “When you stepped into the data center, replete with server fans, hard drives, and an AC system to keep them all cool, you would have thought from the decibel level that the room was one jet engine shy of liftoff,” remarks Burnley.

In making his case for a new data center, Gosnell tapped into Randolph’s own focus on conservation. The college has an environmental studies program, an organic garden, and a chicken coop. Amid all that, the energy-guzzling data center stood out like a Humvee on a street full of bicycles.

But Gosnell couldn’t rely on the green factor alone. On a campus feeling the budget crunch due to recession-fueled dips in enrollment, he had to show that he could save some green, too.

So Gosnell and Burnley, together with the network and systems administrators, decided on a simple solution that would pay for itself. With an initial investment of $86,000, the school settled on Dell EqualLogic hardware and VMware software and planned to virtualize 25 of its existing servers. According to the team’s initial calculations, Randolph would save 56 percent in total ownership costs (including energy consumption) over a three-year period, as well as more than $200,000 in server refresh costs.
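
The arithmetic behind a projection like that is a straightforward side-by-side cost comparison. In the simplified sketch below, only the $86,000 outlay and the 56 percent result come from the article; every annual cost line is a placeholder chosen merely to be consistent with those reported figures.

    # Simplified three-year total-cost-of-ownership (TCO) comparison.
    # Only the $86,000 investment and the ~56 percent savings figure are
    # from the article; each annual cost below is a placeholder.

    YEARS = 3

    # Hypothetical annual costs of running ~50 aging physical servers.
    baseline_annual = {"energy": 30_000, "maintenance": 25_000, "refresh": 67_000}

    # Hypothetical annual costs after consolidating onto virtualized hosts.
    virtualized_annual = {"energy": 6_000, "maintenance": 9_000, "refresh": 10_000}
    virtualized_upfront = 86_000  # Dell EqualLogic hardware plus VMware software

    baseline_tco = YEARS * sum(baseline_annual.values())
    virtualized_tco = virtualized_upfront + YEARS * sum(virtualized_annual.values())

    print(f"Three-year TCO savings: {1 - virtualized_tco / baseline_tco:.0%}")  # 56%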

To pay for the new equipment–at least partially–Gosnell repurposed money that he already had in his budget for replacement servers. To identify additional funds, he also launched an examination of the other operational areas within IT. Gosnell ticks off some of the actions he took to reduce spending: “We eliminated a planned purchase of a software package by doing the programming in-house; we reduced the amount of Smartnet coverage on a number of our less critical Cisco switches; and we temporarily reduced the amount of money planned for conferences and travel by using webinars and online offerings.” Out of a discretionary budget of about $234,000, Gosnell uncovered a staggering $72,000 in savings.

The savings didn’t stop there: The virtualization project proved so easy to implement that the IT team virtualized 25 additional servers. Now, the setup consists of three physical servers running 50 virtual ones, and the school is anticipating $30,000 a year in electricity savings.
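
Set against the $86,000 investment, the electricity line alone implies a payback period of just under three years, before counting the $200,000-plus in avoided server refresh costs. A quick back-of-the-envelope check on the article’s own numbers:

    # Payback period from electricity savings alone, per the article.
    investment = 86_000                  # initial hardware/software outlay
    annual_electricity_savings = 30_000  # anticipated yearly savings

    print(f"{investment / annual_electricity_savings:.1f} years")  # 2.9 years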

“We all concurred that this [project] was the opposite of Murphy’s Law,” marvels Gosnell. “If anything could go right, it did.”

Going Their Separate Ways

Gosnell seized his opportunity during the honeymoon phase of his career at Randolph. Opportunities can also exist in the midst of a bitter breakup. For Doug Herrick, senior director of infrastructure services at Thomas Jefferson University (PA), divorce was the last thing on his mind prior to 2005.

TJU, an academic medical center, had shared a data center for years with Thomas Jefferson University Hospital, a separate teaching hospital with its own IT group. The setup was pretty cozy: While the consolidated data center was housed in the basement of a TJU-owned building, the hospital IT group maintained the facility, charging TJU on a cost-share basis.

As at Randolph, it was obvious that TJU’s data center was in need of an overhaul, but the politics of the situation were complex. Even so, the two groups coexisted harmoniously until TJU moved forward with an $18 million electronic medical record (EMR) project for Jefferson University Physicians, a nonprofit physician practice comprising TJU teaching faculty. Suddenly, the aging data center became an immediate concern. “Going paperless was somewhat daunting for folks,” says Herrick. “They were very concerned that, if they became dependent upon this EMR service, all of [its] systems and servers…were sitting in this obsolete data center.”

Independent consultants brought in to examine the data center–including cooling, electricity, and the site itself–concurred: The infrastructure was, indeed, a ticking time bomb, at least where EMR was concerned. What’s more, an investment of $6 million to $8 million would be necessary to bring it up to speed.

So the project team handling the university EMR implementation turned to the hospital IT group for the next move. The answer? Silence.

“We kept saying, ‘Alright, what’s the game plan here?'” recalls Herrick. “We were going to go live on this EMR in a few months and we really couldn’t have our systems there. The hospital, unfortunately, didn’t feel this was something it really needed to address.” So Jefferson University Physicians management–with the help of the IT department at TJU–decided to move on.

After examining all the possibilities, TJU’s tech group, known as Jeff IT, decided on an innovative, if unorthodox, data center solution: It would migrate its computing assets to a 100 percent outsourced Tier 4 data center and disaster-recovery facility at DBSi in nearby Valley Forge, PA. The one-time costs for the move were $357,000, which included new hardware and electronics, new servers and storage, software, construction, and more. The ongoing operational costs are comparable to what TJU had been paying the hospital to share its old data center.

“No one at our school had done this before; no one had outsourced a data center,” says Herrick. “People were saying, ‘Are you crazy?'” For Jeff IT, though, getting out of the data center operations business just made sense. “These days, with the economy, you have to ask yourself: What is your mission-critical goal? Is it to run a data center, or is it to provide services?”

Herrick credits his CIO, Bruce Metz, with flexing the political muscle necessary to bring the administration on board. “He went out on a limb to do this,” Herrick says. “When you have an organizational break like this with your partner institution, there’s a lot of risk. If something goes wrong, you’re going to have the ‘I told you so’ folks lining up left and right.”

Luckily, Metz’s political capital was spent wisely: The new, outsourced data center provides greater computing power with a 67 percent smaller footprint–thanks, in large part, to virtualization–and a reduction in energy costs of 40 percent. What’s more, the move took place in less than 18 months, with virtually no disruption in service.

So far, no one is saying, “I told you so.” According to a recent customer-satisfaction survey, 69 percent of TJU faculty and 76 percent of staff and administration think systems uptime since the move to DBSi has gotten “better or much better.”

Environmental journalist and green living expert Jennifer Grayson is The Huffington Post’s green advice columnist (Eco Etiquette) and founding editor of The Red, White and Green. For more information, check out www.jennifergrayson.com or follow her on Twitter.
