Cool Running

High Performance Computing Center Touts Energy, Security Innovations

The MGHPCC, which will open along the canals in Holyoke this fall.
For John Goodhue’s father, it took a tour through the Massachusetts Green High Performance Computing Center to understand exactly what goes into housing — and protecting — computers.
“When my dad toured the center,” said Goodhue, the center’s executive director, “he came out the other end and said, ‘I finally get it! It’s not about computers; it’s about bringing electricity in and creating a lot of heat and then removing that heat from the building.’”
Bingo.
Of course, that has been just one of the challenges — albeit a critical one — of preparing the MGHPCC to open in Holyoke later this year. The $95 million facility is a joint venture among UMass, MIT, Harvard University, Boston University, and Northeastern University, along with technology giants EMC Corp. and Cisco Systems Inc., to create a high-tech research center.
To create that all-important cooling effect, the facility will use a continuous water loop in and out of the building. A chilling system will cool the water, which will then be pumped into air-conditioning units placed beside the computers; the heat generated by the equipment will then be exhausted outside, and the process begins again. Constantly.
“The cooling took an enormous amount of effort,” Goodhue said, explaining that the two major techniques used for the process involve air and water, respectively. After six weeks debating which technique to use, architects and builders decided on the chilled-water option. “And we’re bringing it quite close to the computers; for every two racks full of computers, right next to them is a little air-conditioning rack. It takes water into it and cools the air around the computers, and takes the water out.
“Water is actually much better at carrying heat and absorbing heat than air is,” he continued. “A very small volume of water, relatively speaking, can carry the same heat as a much larger volume of air, and it’s one of the things that allows us to run the center more efficiently. The cooling system really allowed us to cool these computers that are very, very hot. Some machines pack a considerable amount of electronics into a very small space, and we have to be extra vigilant about cooling — and water is better at doing that.”
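To put rough numbers on that comparison, the back-of-the-envelope sketch below uses standard textbook values for the volumetric heat capacity of water and air; the flow rate and temperature rise are illustrative assumptions, not figures from the MGHPCC design.

```python
# Rough comparison of water vs. air as a heat-transfer medium.
# Physical constants are standard textbook values; the flow and
# temperature numbers below are illustrative, not MGHPCC figures.

WATER_HEAT_CAPACITY = 4180.0  # kJ per cubic meter per degree C (approx.)
AIR_HEAT_CAPACITY = 1.2       # kJ per cubic meter per degree C (approx.)

# How much more heat a given volume of water carries than the same volume of air
ratio = WATER_HEAT_CAPACITY / AIR_HEAT_CAPACITY
print(f"Water carries roughly {ratio:,.0f}x the heat of the same volume of air")

# Illustrative example: heat removed by 1 liter/second of chilled water
# warming by 10 degrees C as it passes a rack (hypothetical numbers).
flow_m3_per_s = 0.001          # 1 L/s expressed in cubic meters per second
delta_t = 10.0                 # temperature rise across the rack, degrees C
heat_kw = flow_m3_per_s * WATER_HEAT_CAPACITY * delta_t
print(f"That small water flow absorbs about {heat_kw:.0f} kW of heat")
```

With those textbook values, a given volume of water holds on the order of 3,500 times as much heat as the same volume of air, which is the point Goodhue is making about efficiency.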
But that raises challenges regarding energy efficiency — another goal of the computing center’s leaders. Meanwhile, designers were also faced with protecting sensitive equipment and data from more than heat, so decisions about building security were high on the priority list as well.
For this issue, BusinessWest delves into some of these questions, and how the MGHPCC is proving to be an innovative facility long before going online this fall.

Green for a Reason
From the start, the Holyoke center was designed to be energy-efficient, Goodhue told BusinessWest. “One of the things that drew us to Holyoke is that the power came principally from renewable energy. Holyoke Gas & Electric generates 70% of its power from renewable sources — primarily the dam, but it also has the largest solar array in the state, and also has ideas about adding other resources to their portfolio.”
Holyoke’s dam on the Connecticut River generates hydroelectricity that is then sold to industrial users for about 8 cents per kilowatt-hour, compared to a state average of more than 12 cents, according to the U.S. Energy Department.
That’s good, because data centers tend to suck up a lot of energy — partly because they never shut down, partly because of the power the equipment uses. “There has been a trend in recent years toward operating computers and servers at higher and higher temperatures,” Goodhue noted.
In fact, according to a 2011 Stanford University report, data centers account for about 2% of the nation’s energy consumption, and many use electricity generated by coal-fired power plants, not exactly a clean energy source. Because of its power supply and design, the MGHPCC is expected to use at least 25% less energy than the typical data center.
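As a rough illustration of what that rate difference means, the sketch below compares annual electricity costs at the two published rates; the facility load is a hypothetical placeholder, since the article does not state the center’s actual power draw.

```python
# Back-of-the-envelope cost comparison using the per-kWh rates cited above.
# The facility load is a hypothetical assumption for illustration only.

HOLYOKE_RATE = 0.08      # dollars per kWh (industrial rate cited in the article)
STATE_AVG_RATE = 0.12    # dollars per kWh ("more than 12 cents")

HOURS_PER_YEAR = 24 * 365
assumed_load_mw = 5.0    # hypothetical continuous load in megawatts

annual_kwh = assumed_load_mw * 1000 * HOURS_PER_YEAR
holyoke_cost = annual_kwh * HOLYOKE_RATE
state_avg_cost = annual_kwh * STATE_AVG_RATE

print(f"Annual energy use: {annual_kwh:,.0f} kWh")
print(f"Cost at Holyoke rate:  ${holyoke_cost:,.0f}")
print(f"Cost at state average: ${state_avg_cost:,.0f}")
print(f"Difference:            ${state_avg_cost - holyoke_cost:,.0f} per year")
```

For a facility that never shuts down, even a few cents per kilowatt-hour compounds into millions of dollars a year, which is why the power source mattered so much in siting the center.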
Goodhue said the computing center has applied to be a Leadership in Energy and Environmental Design (LEED) project, aiming for Gold status — the second-highest accreditation — from the national recognition program run by the U.S. Green Building Council.
“Again, that’s by paying attention to hundreds of details, from how we manage stormwater to the white reflecting roof; from what landscaping materials we use to the chemical basis for our paints,” he explained, noting that the paint must not contain what are known as volatile organic compounds, or VOCs; when breathed in, these are not acutely toxic, but can cause long-term health effects.
“There are dozens of small things that, taken individually, add up to a very different way of designing and building this center, so that it has a much lower environmental impact,” Goodhue continued. “The good part about it is, people have thought very carefully about the environmental impact, so you don’t have to reinvent the wheel — just follow the best practices you know, none of which are crazy or over the top. They just make good design sense.”

The Springfield Data Center, currently under construction on the former Technical High School site.
Some of the same focus on energy efficiency is evident at the Springfield Data Center (SDC), set to open in 2013 on the site of the former Technical High School. The facility will be one of the state’s two primary data centers, backing up and supporting the Massachusetts Information Technology Center in Chelsea.
New York-based Skanska USA, the contractor for the SDC, has also incorporated a number of energy-efficient elements in aiming for Silver certification under LEED. “This is one of the most energy-efficient buildings of its type in the United States right now,” said Steve Eustis, senior vice president and project executive for Skanska.
The design includes selecting materials that are energy- and water-efficient and incorporates ‘daylight harvesting,’ which uses sensors in the lighting system to shut off the lights when there is sufficient daylight; 90% of the occupants will have daylight views. The roof will also be a reflective white, and HVAC systems were designed with energy conservation in mind. In fact, the air-conditioning system that cools the computers will capture the waste heat and reuse it.
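The daylight-harvesting idea amounts to a simple control rule: a sensor reads ambient light, and the fixtures stay off whenever daylight alone meets the target level. The sketch below is a generic illustration with made-up thresholds, not the SDC’s actual lighting-control logic.

```python
# Minimal sketch of daylight-harvesting logic: keep artificial lighting
# off when ambient daylight already meets the target level.
# Threshold values are illustrative, not from the SDC design.

TARGET_LUX = 500      # desired light level at the work surface (hypothetical)
HYSTERESIS_LUX = 50   # buffer to avoid rapid on/off cycling (hypothetical)

def lights_should_be_on(measured_daylight_lux: float, currently_on: bool) -> bool:
    """Decide whether fixtures should run, given the daylight sensor reading."""
    if currently_on:
        # Keep lights on until daylight comfortably exceeds the target.
        return measured_daylight_lux < TARGET_LUX + HYSTERESIS_LUX
    # Turn lights back on only once daylight falls clearly below the target.
    return measured_daylight_lux < TARGET_LUX - HYSTERESIS_LUX

# Example sensor readings over a morning (values are made up):
state = True  # assume the lights start on before dawn
for reading in [120, 300, 480, 560, 620, 540, 460]:
    state = lights_should_be_on(reading, currently_on=state)
    print(f"daylight={reading:4d} lux -> lights {'ON' if state else 'OFF'}")
```

The hysteresis band is there so a passing cloud does not flip the lights on and off every few seconds; real building-automation systems typically add dimming rather than a simple on/off switch.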

Securing the Data
Of course, protecting computers from heat damage while keeping energy costs low is only one balancing act a data center must perform. Another is keeping data private while not hindering the ability of the facility’s users to conduct and share their research.
“Security is of critical importance. If you’re doing medically oriented research, for instance, you might have sensitive patient data in the center, and it’s very important to protect that,” Goodhue said. “At the same time, this is a research center, and it’s very important to give people as much flexibility as possible to share data.
“So we have these two conflicting constraints, and we handle that in two ways,” he continued. “One has to do with the physical infrastructure. There are maps that label every room as a security zone, with relatively small lists of people who are allowed to go into each room in the building, and that drives our keycard-access system. Your ID will let you in some doors and won’t let you in others.
“So, if you’re an electrician servicing the transformer,” he went on, “you probably don’t need to go into the computer room, so that person’s card will let him into the transformer room and maybe one or two other adjacent rooms. Similarly, if you’re operating a computer in the computer room, you probably have no business hanging around the transformers.”
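A minimal way to picture that zone-based scheme is a lookup from security zone to the set of badge holders allowed in, with every swipe checked against it. The sketch below is a simplified illustration with invented room names and badge IDs; the center’s actual access-control system is a commercial product, not this code.

```python
# Simplified model of zone-based keycard access: each room belongs to a
# security zone, and each zone has a short list of authorized badges.
# All names, zones, and badge IDs here are invented for illustration.

ROOM_TO_ZONE = {
    "transformer-room": "electrical",
    "switchgear-room": "electrical",
    "computer-room-1": "compute",
    "computer-room-2": "compute",
    "loading-dock": "general",
}

ZONE_ACCESS = {
    "electrical": {"badge-017"},            # e.g., the electrician
    "compute": {"badge-042", "badge-108"},  # e.g., systems operators
    "general": {"badge-017", "badge-042", "badge-108", "badge-230"},
}

def card_opens_door(badge_id: str, room: str) -> bool:
    """Return True if this badge is authorized for the room's security zone."""
    zone = ROOM_TO_ZONE.get(room)
    return zone is not None and badge_id in ZONE_ACCESS.get(zone, set())

# The electrician's badge opens the transformer room but not the computer room.
print(card_opens_door("badge-017", "transformer-room"))  # True
print(card_opens_door("badge-017", "computer-room-1"))   # False
```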
The other element is how networking is handled, Goodhue continued.
“Every institution that uses the facility — Harvard, MIT, UMass, and so forth — already has well-developed methods of protecting data when it flows across their networks,” he said. “So, imagine that, on the floor, there’s one network that we’ve arranged so it’s an extension of the MIT campus network, and one we’ve arranged as an extension of the Harvard campus network, and so on. Each exists here in its own parallel universe; they don’t see each other and are kept separate. If you’re at MIT, you can think of the building as just another building on campus, but farther away, and BU folks can see it the same way.
“That gets you the protection,” he noted. “So how do you get flexibility?”
For that, the center uses what Goodhue called a “meet-me switch,” which allows two or more users from different networks to exchange data. “Again, it’s the balance between access and flexibility and making sure the data is protected and controlled.”
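One way to picture how the separate campus networks and the ‘meet-me switch’ fit together: traffic stays within each institution’s own network by default, and data can cross only where both sides have explicitly agreed to a connection. The sketch below is a conceptual model with invented peerings, not a description of the center’s actual networking equipment.

```python
# Conceptual model of isolated per-institution networks plus a "meet-me"
# point where two parties can opt in to exchange data. The peerings shown
# are invented examples for illustration.

from itertools import combinations

# Each institution's extension of its campus network, isolated by default.
NETWORKS = {"MIT", "Harvard", "UMass", "BU", "Northeastern"}

# Peerings explicitly configured at the meet-me switch (unordered pairs).
MEET_ME_PEERINGS = {
    frozenset({"MIT", "UMass"}),     # e.g., a hypothetical joint project
    frozenset({"Harvard", "BU"}),
}

def can_exchange(net_a: str, net_b: str) -> bool:
    """Data crosses networks only through an explicitly configured peering."""
    if net_a == net_b:
        return True  # traffic within one campus network is unrestricted
    return frozenset({net_a, net_b}) in MEET_ME_PEERINGS

for a, b in combinations(sorted(NETWORKS), 2):
    status = "allowed" if can_exchange(a, b) else "isolated"
    print(f"{a:<12} <-> {b:<12} {status}")
```

The design choice is deny-by-default: researchers get flexibility by requesting a specific connection, rather than every network being able to see every other one from the start.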
In addition, each rack of computers has its own set of keys, so only authorized people can access each one. “This isn’t like the movies where you see these places with barbed wire and armed guards and so forth,” he said. “We are a high-tech facility, and we’re very careful about protecting it, but we’ll put on a slightly friendlier face than what you see in the movies.”

Little Things
Goodhue said the MGHPCC will open on time and under budget, but that’s far from the only positive aspect of it.
“People often ask me, ‘what’s one unique thing about the data center that makes it the best in some way?’” he told BusinessWest. “But there’s not just one thing. It’s lots of attention paid to literally hundreds of details that gets you there.”
And that’s when the real excitement — the research itself — begins.

Joseph Bednar can be reached at [email protected]