Data center design study: Cool consideration of a hot issue
Over the next two years, researchers at Binghamton University and partner institutions will be helping to protect life as we know it. While the claim might sound extreme, keep in mind that they will be working to improve the design and energy efficiency of data centers.
Data centers. Thousands of them. All processing vital information, critically important to much that drives our daily lives: world financial markets, government and military operations, business and industry, worldwide shipping and transportation, health and human services, entertainment, even organized athletics and religion.
Keeping data centers in business by optimizing their design, energy efficiency and information processing efficacy is the goal of a $437,270 two-year project backed by a $247,533 contract from the New York State Energy Research and Development Authority.
Peter R. Smith, NYSERDA’s president, said the data center project is one of several high-tech, but fundamentally important research projects sponsored by the public benefit corporation. “These data centers are high electric-demand nerve centers whose utility service is large. They require stable and secure electric power for machine operation and cooling. Considering New York’s prime financial center role, NYSERDA seeks to find ways to serve these centers efficiently and securely, and then replicate those designs at universities and other large computing power centers around New York.”
The project, “Optimizing Airflow Management Protocols in New York Data Centers,” will team researchers at Binghamton University, Georgia Tech, Lawrence Berkeley Lab and IBM. The project will focus on surveying, modeling, and then testing design improvements to an existing Manhattan data center, all in hopes of devising new design strategies that can be employed the world over.
Just now approaching their adolescence, data centers have already become the heart and central nervous system of the information age. Without these data storage and processing strongholds, the Internet would be reduced, at least temporarily, to a grown-up, digitized version of the old Campbell’s soup-can network, where end users hunker down and share sketchy information available to few and meaningful to even fewer. Search engines would die with nary a sputter. Encyclopedia salesmen would be cruising middle-class neighborhoods by tomorrow. “Googling” someone, unless quickly redefined and with a possible jail term attached, would be out of the question.
Not to worry. Data centers, like most adolescents, probably aren’t going to leave us anytime soon. They will, instead, grow stronger. Hopefully, they will also grow smarter. Like it or not, they will reproduce. And like adolescents everywhere, data centers, at least for the foreseeable future, will continue to consume fuel at rates guaranteed to raise eyebrows and empty wallets.
Bahgat Sammakia, interim vice president for research at Binghamton University and director of the University’s renowned Integrated Electronics Engineering Center, has spent much of his 30-year career working to improve thermal management strategies in electronics packaging. That means devising ways to keep computers and other electronics from spontaneously combusting the moment they are turned on.
“Imagine the heat generated by a 100-watt light bulb,” Sammakia explained. “Now take that same amount of power and double it, so you have the equivalent of a 200-watt light bulb in an area the size of a computer chip. Now you have extremely high heat density. When you turn your machine on, the temperature shoots up to 200, 300 or even 400 degrees Celsius.”
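To put Sammakia's comparison in rough numbers, the sketch below works out the heat flux of a 100-watt bulb spread over its glass envelope versus 200 watts concentrated on a chip-sized area. This is an illustrative back-of-the-envelope calculation, not a figure from the project; the surface areas are assumed values chosen only to show the scale of the difference.

```python
# Back-of-the-envelope heat-flux comparison (illustrative assumptions only).
# The surface areas below are rough guesses, not measurements from the article.

bulb_power_w = 100.0     # a 100-watt incandescent bulb
bulb_area_cm2 = 120.0    # assumed surface area of the glass envelope

chip_power_w = 200.0     # "double it" -- 200 watts, per Sammakia's example
chip_area_cm2 = 4.0      # assumed 2 cm x 2 cm processor die

bulb_flux = bulb_power_w / bulb_area_cm2   # roughly 1 W/cm^2
chip_flux = chip_power_w / chip_area_cm2   # roughly 50 W/cm^2

print(f"Bulb heat flux: {bulb_flux:.1f} W/cm^2")
print(f"Chip heat flux: {chip_flux:.1f} W/cm^2")
print(f"Concentration factor: {chip_flux / bulb_flux:.0f}x")
```

The point of the comparison is the concentration: the same order of power packed into a far smaller area is what drives the temperature spikes Sammakia describes.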
Sammakia’s point is this: Without adequate thermal management, the electricity coursing through your computer would reduce it to a puddle of melted solder and burnt plastic before you could pull the power cord.
That’s the kind of thermal management challenge Sammakia and other electronics packaging researchers have become accustomed to dealing with at the device and packaging level. The issue becomes even more heated, however, when hundreds of pieces of electronic equipment, drawing thousands of kilowatts of power 24 hours a day, seven days a week, 52 weeks a year, are housed together in a data center, he said.
If the computers had their way, data center energy costs would be significantly higher. For all the heat they generate, computers actually thrive on cold. Most electronic computing devices run 30 to 40 percent faster at subfreezing temperatures, Sammakia said. Human interaction and monitoring of data center equipment, however, is a round-the-clock enterprise, so optimum temperatures in the range of 60 to 70 degrees Fahrenheit are maintained not because they best serve the needs of the machines, but for the practical purposes of human comfort and survival. While the computers might prefer an Arctic clime in data centers, achieving temperatures even as cool as a late spring day is a tall order, Sammakia said.
“We’re talking about rooms the size of basketball courts, where you have row after row of mainframes and servers, all dissipating heat, and the entire room is designed with the sole purpose of sustaining this equipment and maintaining it at the right temperature,” said Sammakia. “These data centers, and there are hundreds of them in New York City alone, consume massive amounts of power. By making computations more cost- and energy-efficient, by reducing total energy consumption, and by passing on the environmental benefits of those savings, any energy efficiency can make a very significant difference.”
Simply discovering the best location for cold-air delivery vents could easily mean million-dollar savings once incorporated as a standard strategy in data center designs, Sammakia said.
The research team will be striving to achieve whatever energy efficiencies it can. Sammakia thinks improvements on the order of 20 to 25 percent are “very doable.” To reach that goal, Binghamton University researchers will build numerical computer models of the Manhattan data center during the first year of the project. Their models will then be used to enhance the team’s ability to predict the efficacy of design changes proposed throughout the remainder of the project.
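As a rough illustration of why gains in that range matter financially, the sketch below estimates the annual electricity bill for a data center drawing a few thousand kilowatts around the clock and the savings a 20 to 25 percent improvement would imply. The load and electricity price are hypothetical assumptions chosen for illustration, not figures from the NYSERDA project.

```python
# Hypothetical annual-savings estimate for a data center efficiency improvement.
# The load, electricity price, and efficiency gains are illustrative assumptions,
# not numbers from the study.

load_kw = 2000.0           # assumed continuous IT-plus-cooling load, in kilowatts
hours_per_year = 24 * 365  # round-the-clock operation
price_per_kwh = 0.15       # assumed commercial electricity rate, in dollars per kWh

annual_kwh = load_kw * hours_per_year
annual_cost = annual_kwh * price_per_kwh
print(f"Baseline annual electricity cost: about ${annual_cost:,.0f}")

for improvement in (0.20, 0.25):
    savings = annual_cost * improvement
    print(f"{improvement:.0%} efficiency gain -> roughly ${savings:,.0f} saved per year")
```

Even under these modest assumptions, a 20 to 25 percent improvement works out to savings on the order of half a million dollars a year for a single facility, which is consistent with the million-dollar scale Sammakia describes once such strategies are applied across many data centers.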
Researchers at Georgia Tech will confirm the accuracy of the modeling by building an actual room at scale and taking appropriate measurements. Finally, in year two, the team will send researchers to take measurements in the actual data center and will write a design guide to help improve energy efficiency in all data centers. IBM, an established leader in energy metrics, will mentor modeling and measurements, while Lawrence Berkeley Lab, already a prominent name in data center research and design in California, will help to benchmark the Manhattan data center.