

Article title

Improving energy efficiency of supercomputer systems through software-aided liquid cooling management

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Many fields of modern science rely increasingly on the immense computing power of supercomputers. Modern systems with thousands of nodes can consume megawatts of electrical energy in a highly uneven manner, straining both the power and the cooling infrastructure of a data center. In the traditional approach to infrastructure management, each subsystem of a data center (e.g. cooling) operates independently of the others and relies only on its local sensors. Given the erratic nature of the computing load in a large data center, this approach is suboptimal. In this paper we show that by challenging the traditional split between the infrastructure and the computing equipment, one can gain a significant boost in the energy efficiency of the entire ecosystem. A solution that predicts cooling power demand based on information from the supercomputer resource manager, and then sets the parameters of the cooling loop accordingly, is presented along with the potential benefits in terms of reduction of the power draw.
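The abstract describes coupling the cooling loop to the resource manager's view of the workload. As an illustration only, the sketch below (Python, assuming a SLURM installation whose squeue and sinfo commands are on the PATH) estimates the near-term heat load from the number of nodes occupied by running jobs and derives a coolant inlet setpoint. The per-node power figures, the linear setpoint formula, the 500 kW maximum load and the output handling are illustrative assumptions, not values or methods taken from the paper.

#!/usr/bin/env python3
"""Minimal sketch: estimate near-term cooling demand from SLURM job data
and derive a cooling-loop setpoint. All numeric constants and the
setpoint formula are illustrative assumptions, not the paper's method."""

import subprocess

WATTS_PER_BUSY_NODE = 350.0   # assumed average draw of a loaded node
WATTS_PER_IDLE_NODE = 120.0   # assumed idle draw
COOLANT_BASE_C = 30.0         # assumed warm-water inlet baseline
COOLANT_MIN_C = 24.0          # assumed lowest allowed inlet temperature

def busy_nodes() -> int:
    """Sum the node counts of all RUNNING jobs reported by squeue."""
    out = subprocess.run(
        ["squeue", "--states=RUNNING", "-h", "-o", "%D"],
        capture_output=True, text=True, check=True).stdout
    return sum(int(n) for n in out.split())

def total_nodes() -> int:
    """Total node count reported by sinfo (may overcount nodes that
    appear in several partitions; acceptable for a rough sketch)."""
    out = subprocess.run(
        ["sinfo", "-h", "-o", "%D"],
        capture_output=True, text=True, check=True).stdout
    return sum(int(n) for n in out.split())

def predicted_heat_load_kw() -> float:
    """Crude heat-load estimate: busy nodes at full draw, the rest idle."""
    busy = busy_nodes()
    idle = max(total_nodes() - busy, 0)
    return (busy * WATTS_PER_BUSY_NODE + idle * WATTS_PER_IDLE_NODE) / 1000.0

def coolant_setpoint_c(load_kw: float, max_load_kw: float) -> float:
    """Lower the inlet temperature linearly as the predicted load rises."""
    frac = min(load_kw / max_load_kw, 1.0)
    return COOLANT_BASE_C - frac * (COOLANT_BASE_C - COOLANT_MIN_C)

if __name__ == "__main__":
    load = predicted_heat_load_kw()
    # In a real deployment this value would be pushed to the cooling-loop
    # controller (e.g. via a BMS interface); here it is only printed.
    print(f"predicted load: {load:.1f} kW, "
          f"setpoint: {coolant_setpoint_c(load, max_load_kw=500.0):.1f} °C")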
Keywords
Year
Pages
89--103
Physical description
Bibliography: 14 items, figures, tables.
Authors
  • Poznań Supercomputing and Networking Center (PSNC), Poznań, Poland
author
  • Institute of Computing Science, Poznań University of Technology, Poznań, Poland
author
  • Institute of Computing Science, Poznań University of Technology, Poznań, Poland
Bibliography
  • [1] Greenberg S., Mills E., Tschudi B., Rumsey P., Myatt B., Best Practices for Data Centers: Lessons Learned from Benchmarking 22 Data Centers, 2006 ACEEE Summer Study on Energy Efficiency in Buildings, 2006.
  • [2] HPL: High Performance Linpack, optimized Intel HPL implementation used: https://software.intel.com/en-us/articles/intel-mkl-benchmarks-suite.
  • [3] iDataPlex: https://lenovopress.com/tips0878-idataplex-dx360-m4.
  • [4] Iyengar M., David M., Parida P., Kamath V., Kochuparambil B., Graybill D., Schultz M., Gaynes M., Simons R., Schmidt R., Chainer T., Server liquid cooling with chiller-less data center design to enable significant energy savings, 28th Annual IEEE Semiconductor Thermal Measurement and Management Symposium (SEMI-THERM), 2012.
  • [5] Januszewski R., Gilly L., Yilmaz E., Auweter A., Svensson G., Cooling - making efficient choices, PRACE (Partnership for Advanced Computing in Europe) Whitepaper, 2016.
  • [6] Januszewski R., Meyer N., Nowicka J., Evaluation of the impact of direct warm-water cooling of the HPC servers on the data center ecosystem, Proceedings of ISC 2014 Conference, 2014.
  • [7] Nguyen C.T., Roy G., Gauthier C., Galanis N., Heat transfer enhancement using Al2O3-water nanofluid for an electronic liquid cooling system, Applied Thermal Engineering, 27, 2006, 1501-1506.
  • [8] Novec: https://www.3m.com/3M/en_US/novec-us.
  • [9] Open Compute Project: www.opencompute.org.
  • [10] PSNC: Poznan Supercomputing and Networking Center, http://www.man.poznan.pl/online/en.
  • [11] Rasmussen N., Implementing Energy Efficient Data Centers, Schneider Electric Whitepaper, 2014.
  • [12] Semenov O., Vassighi A., Sachdev M., Impact of Self-Heating Effect on Long-Term Reliability and Performance Degradation in CMOS Circuits, IEEE Transactions on Device and Materials Reliability, 6, 2006, 17-27.
  • [13] SLURM: https://slurm.schedmd.com.
  • [14] Steinberg D.S., Cooling techniques for electronic equipment, New York, Wiley-Interscience, 1980.
Notes
Record compiled under agreement 509/P-DUN/2018 from the funds of the Ministry of Science and Higher Education (MNiSW) allocated to science dissemination activities (2018).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-62673743-e093-4d83-b22e-ef299e3cc7bc