
Applied Thermal Engineering 114 (2017) 1224–1239


Research Paper

A review of current status of free cooling in datacenters


Hafiz M. Daraghmeh, Chi-Chuan Wang ⇑
Department of Mechanical Engineering, National Chiao Tung University, Hsinchu 300, Taiwan

Article info

Article history: Received 14 March 2016; Revised 12 October 2016; Accepted 14 October 2016; Available online 20 October 2016

Keywords: Datacenter; Free cooling; Air economizers; Water economizers; Heat pipe

Abstract

In this study, an overview of the current status of free cooling technologies applicable to datacenters is presented, including airside economizers, waterside economizers, and heat pipe technology. By introducing free cooling technologies, the compressor loading of the refrigeration system can be partially or completely relieved. Utilization of airside or waterside free cooling relies strongly on the ambient conditions, yet either airside or waterside free cooling may be integrated with other systems such as absorption, solar, adsorption, geothermal, and evaporative cooling to extend its performance. On the other hand, thermosiphon heat exchangers and pulsating heat pipes, which feature a unique ability to transfer heat at small temperature differences, are quite promising for datacenter free cooling.

© 2016 Elsevier Ltd. All rights reserved.

1. Introduction

In recent years, the electrical power consumed in data centers has increased dramatically, and it is expected to pose a critical influence on information technology (IT) now and in the coming future [1]. The energy demand of datacenters has surged to more than 100 times that of related infrastructures such as typical buildings [2]. Datacenters normally comprise buildings, rooms, and facilities that contain enterprise servers, server communication equipment, cooling equipment, and power equipment, and had an average energy consumption of 872 kW h/m2 by 2011 [3]. Furthermore, from 2006 to 2011 the energy consumed by data center servers and related infrastructure equipment doubled in the United States and worldwide [4]. In 2006, the US Environmental Protection Agency (UPA) predicted that the costs of all US data centers, then 4.5 billion dollars, would reach 7.4 billion dollars by 2011 [5]. Moreover, the average power density of datacenters is currently about 6 kW per rack. According to the power trend of datacenters, with the rapid growth of datacenter infrastructures accompanied by the development of higher power density server components, the average power density is expected to reach 50 kW per rack by 2025 [6,7].

Meanwhile, the energy consumption of the traditional cooling systems of datacenter infrastructures normally takes up about 50% of the total energy consumption, and can be even more severe in some situations [8–10]. A typical breakdown of the current datacenter total energy consumption is shown in Fig. 1 [1]. In this regard, implementing efficient cooling systems, strategies, and methods that can help to reduce the energy costs of traditional cooling systems is imperative for datacenters. Traditional cooling systems, which depend on the vapor compression cycle, may consume large quantities of electrical power for several reasons. Firstly, this cycle needs to work all year round, even during the winter season when the ambient temperature is low. Secondly, cold and hot air streams mix due to the lack of airflow control devices. Thirdly, datacenter cooling systems, especially waterside cooling systems, require mechanical piping systems, which consume a lot of energy through pumps and fans to transport cold water or air, and long distance transportation can result in appreciable losses [11]. The efficiency of datacenter cooling systems can therefore be improved through elimination or reduction of long distance transportation. Previous studies have proposed various solutions for these problems, such as rack backdoor coolers [12], ceiling coolers [13], inverter controlled fans [14], optimization of the perforated tile structure [15–18], supply and return modes [19–21], and particular rack arrangements [22,23]. These methods have been used to ease the second and third aforementioned problems. For resolving the first problem, a common and widely used energy efficiency strategy is the so-called free cooling technology, also known as the economizer cycle. By employing this concept, utilization of either computer room air conditioner (CRAC) units or chiller plants can be reduced or potentially eliminated, resulting in substantial savings in the total cooling energy requirement [24,25].

⇑ Corresponding author at: EE474, 1001 University Road, Hsinchu 300, Taiwan. E-mail address: [email protected] (C.-C. Wang).
http://dx.doi.org/10.1016/j.applthermaleng.2016.10.093
1359-4311/© 2016 Elsevier Ltd. All rights reserved.

Nomenclature

AHU     air handling unit
ASHRAE  American Society of Heating, Refrigerating and Air-Conditioning Engineers
CES     cold energy storage
COP     coefficient of performance
CRAC    computer room air conditioner
CRAH    computer room air handler
DX      direct expansion
HE      heat exchanger
IBM     International Business Machines Corporation
ISMT    integrated system of mechanical refrigeration and thermosiphon
IT      information technology
NSIDC   National Snow and Ice Data Center
PHP     pulsating heat pipe
PUE     power usage effectiveness
RH      relative humidity
TES     thermal energy storage
TS      thermosiphon heat exchanger
UK      United Kingdom
UPA     US Environmental Protection Agency
UPS     uninterruptible power supply
US      United States of America
VFD     variable frequency drive

The economizer cycle makes use of the ambient climate, especially in mild and cold climate areas, to relieve the need for the CRAC/chiller plant inside datacenter facilities. This is because the outside temperature in the vast majority of regions is normally lower than that inside datacenters. However, when the outside temperature is high, the refrigeration cycle must be employed, and the ambient dry-bulb temperature and relative humidity may play an essential role in the refrigeration system. Therefore, under certain favorable conditions and with careful manipulation, the compressor can be bypassed, which can lead to considerable energy savings. Many studies have examined the potential of using free cooling economizers in datacenter infrastructures [3,4,8–10]. The economizers currently utilized in datacenters can be divided into three categories, namely airside economizers, waterside economizers, and heat pipe cooling systems, the last of which has recently been introduced to datacenters. Airside free cooling cools the datacenter either by directly bringing cold air into the datacenter or indirectly through some heat exchange equipment; hence some filters or heat exchangers (e.g. fin-and-tube heat exchangers, rotary wheels, air-to-air heat exchangers, and the like) are required. Waterside free cooling adopts a cooling tower, water pump, and relevant infrastructure to circulate colder water into the CRAC.

Fig. 1. Typical energy use breakdown of datacenter energy consumption [1].

However, there are three remarkable factors that impact the utilization of economizers. The first factor is the geographic location: an airside economizer can be easily implemented in mild and cold climate areas, while it is hard to apply in hot and/or humid areas and in areas of rapidly changing climate. The second factor is the allowable operating range. Before 2008, the allowed range for ambient temperature and humidity was comparatively narrow according to the ASHRAE regulation, limiting the embodiment of the airside economizer. However, in 2011 ASHRAE relaxed the operating conditions, such as the dry-bulb temperature, humidity ranges, and maximum dew point [26]. Accordingly, this appreciably boosted the use of airside economizers for more operational hours during the whole year, resulting in substantial energy and efficiency savings. Table 1 and Fig. 2 depict the associated ASHRAE regulations for environmental classes [27]. In order to achieve higher reliability of economizers, the datacenter operator must choose an appropriate class to operate in the most energy efficient mode. The third factor that affects economizers is the arrangement in the datacenter. The most crucial aspect is how to avoid mixing of the hot and cold air streams, and this can be achieved through efficient separation of the cold air supply from the hot air return. Two types of commonly used cold- and hot-aisle containments are shown in Fig. 3 [28]. In the past, economizers were used as a supplemental mode of operation that could gain some energy savings and efficiencies during certain times of the year. Nowadays, the use of economizers is becoming the primary mode of operation to ease the loading of mechanical refrigeration-based systems, since it can reduce or even eliminate the runtime of the compressor, thereby yielding highly efficient datacenter cooling systems. The purpose of this paper is to provide an overview of recently used datacenter economizers, modes, and options, and to develop guidelines for selecting the proper economizer to match different situations and conditions of datacenter infrastructures.

Table 1
2011 ASHRAE environmental classes for data center applications [27].

Range        Class  Dry-bulb temperature  Humidity range (non-condensing)
Recommended  All A  64.4–80.6 °F          41.9 °F DP to 60% RH and 59 °F DP
Allowable    A1     59–89.6 °F            20–80% RH
             A2     50–95 °F              20–80% RH
             A3     41–104 °F             10.4 °F DP and 8% RH to 85% RH
             A4     41–113 °F             10.4 °F DP and 8% RH to 90% RH
             B      41–95 °F              8–80% RH
             C      41–104 °F             8–80% RH
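As a rough illustration of how the class limits in Table 1 translate into economizer opportunity, the short sketch below (added here, not from the original paper) counts the hours in a yearly weather record whose dry-bulb temperature and relative humidity fall inside a chosen allowable envelope; the class limits are taken from Table 1, while the hourly weather list and the simplified RH-only humidity check are illustrative assumptions.

```python
# Hypothetical sketch: estimate annual airside-economizer hours by checking
# hourly outdoor conditions against an ASHRAE allowable envelope (Table 1).
# The simplified humidity check (RH bounds only, no dew-point limit) and the
# example weather data are assumptions for illustration.

ASHRAE_CLASSES_F = {            # dry-bulb range (F), RH range (%)
    "A1": ((59.0, 89.6), (20.0, 80.0)),
    "A2": ((50.0, 95.0), (20.0, 80.0)),
}

def economizer_hours(weather, cls="A2"):
    """weather: iterable of (dry_bulb_F, rh_percent), one entry per hour of the year."""
    (t_lo, t_hi), (rh_lo, rh_hi) = ASHRAE_CLASSES_F[cls]
    return sum(1 for t, rh in weather if t_lo <= t <= t_hi and rh_lo <= rh <= rh_hi)

if __name__ == "__main__":
    # Toy 3-hour record: two hours inside the A2 envelope, one too hot.
    sample = [(68.0, 45.0), (97.0, 30.0), (55.0, 70.0)]
    print(economizer_hours(sample, "A2"))   # -> 2
```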

Fig. 2. ASHRAE environmental classes for datacenter economizers [26].

Fig. 3. The commonly used cold-aisle and hot-aisle containments in datacenters [28].

A detailed listing and comparison of the most recently used economizers is tabulated in Table 4, and further discussions on the relevant studies are addressed subsequently.

2. Airside economizers

Airside economization can be obtained by drawing colder outside air into datacenters. In cold or mild climates, using outside air cooling can lead to an extremely high efficiency, while it is not so usable in hot, humid, and rapidly changing weather areas. Typical airside free cooling can take the form of direct airside cooling, indirect airside cooling, or evaporative cooling (direct, indirect, or multi-stage).

2.1. Direct airside free cooling

Direct airside cooling is the simplest economization method and is what datacenter operators usually think of first. It is an effective method to improve the datacenter cooling system that takes advantage of colder outside air. A typical direct airside economizer consists of air ducts and a set of fans, louvers, filters, vents, and dampers. These components can effectively use outside air to replace refrigeration cooling entirely or partially. Examples of direct air economizers are shown in Fig. 4(a) [10]. The heat generated inside datacenter facilities can be removed when colder outside air is brought into the datacenter directly. This strategy has been widely employed by datacenter operators; in fact, as much as 40% of all free cooling economizer deployments take this form [8]. Many researchers have discussed the potential of using free cooling economizers in data centers [29,30]. For example, Lee and Chen [29] numerically investigated the potential savings of using direct airside free cooling economizers for 17 climate zones. Their results showed significant energy savings for economizers used in mixed-humid, warm, and warm-marine climate zones, whereas they are comparatively inefficient in dry as well as humid climate zones due to the need for high-energy fans, humidifiers, and dehumidifiers, whose usage can offset the benefits of free cooling. The results also showed that a well-controlled datacenter is essential, since a 2 °C drop in the indoor temperature reduced the energy savings by 2.8–8.5%. Siriwardana et al. [30] studied the use of direct air economizers in different Australian states with different climates, and in some states the results showed substantial energy savings. In 2008, the Intel company adopted the same strategy to launch a 10-megawatt (MW) data center, and reported a huge annual energy saving of USD 2.87 million [31]. In order to make use of outside air in low-humidity and dry-hot areas, direct evaporative cooling has been used to cool data center facilities. An evaporative cooler consists of a large fan that draws warm air through a moisture filler; as the water in the moisture filler evaporates, the air is cooled before entering the datacenter [32].

Fig. 4. Some schemes of direct airside free cooling utilized in datacenters: (a) common datacenter direct airside cooling [10]; (b) direct evaporative cooling [33].
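The cooling achievable by such an evaporative stage is commonly characterized by a wet-bulb effectiveness, T_supply = T_db − ε (T_db − T_wb); the small sketch below (not part of the original review) applies this standard relation with an assumed effectiveness and example inlet conditions.

```python
# Illustrative sketch of direct evaporative cooling: the supply temperature
# approaches the outdoor wet-bulb temperature according to a wet-bulb
# effectiveness. The effectiveness value and inlet conditions are assumptions.

def evap_cooler_supply_temp(t_dry_bulb_c, t_wet_bulb_c, effectiveness=0.85):
    """Supply air temperature (deg C) leaving a direct evaporative cooler."""
    return t_dry_bulb_c - effectiveness * (t_dry_bulb_c - t_wet_bulb_c)

if __name__ == "__main__":
    # Example: hot-dry outdoor air at 35 C dry bulb / 20 C wet bulb.
    t_supply = evap_cooler_supply_temp(35.0, 20.0, effectiveness=0.85)
    print(f"Supply air ~ {t_supply:.1f} C")   # ~22.2 C
```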

The supply temperature can be controlled by adjusting the airflow of the cooler, and a schematic of this arrangement is shown in Fig. 4(b) [33]. However, the direct airside free cooling method has some remarkable problems to be solved, such as climate change, air quality, high humidity, and maintenance issues.

Udagawa et al. [34] conducted a study in Japan to compare free cooling-based systems with a packaged air conditioning system. Their simulation considered the different climate conditions at four locations. They claimed that high-efficiency control was attainable and that the coefficient of performance (COP) tends to increase only when the outside temperature is low. Oró et al. [3] studied the potential use of direct air free cooling combined with a thermal energy storage (TES) strategy. The study was performed in five European cities subject to different climate conditions in association with the ASHRAE recommended guidelines of total annual hours. As tabulated in Table 2, the simulation suggested that Stockholm and Amsterdam revealed the highest potential for using outside air cooling, with more than 6500 h of operation annually. The air quality of direct free air cooling is another concern that should be taken into account: drawing a large amount of outside air into the data center may bring in a lot of contaminants. This may raise the risk of degrading data center facilities, and this challenge brings up the remarkable need for pretreatment and filtration. Using filters may affect the required airflow volume for effective cooling. Dai et al. [11,35] reported the drawbacks of contaminants brought into datacenter facilities by direct airside economizers. They reported that a suitable algorithm can be used to monitor equipment parameters in order to provide early failure warnings and avoid sudden downtime [11]. Furthermore, Shehabi et al. [36–38] discussed the concentration of particle pollutants. They found that the increment of pollutant concentration in the outside air introduced into the datacenter can be neglected as long as it does not exceed the ASHRAE specified limits. They suggested that a high-performance filter can be very beneficial to overcome the high humidity problem; with such filters, annual savings were estimated to be within the range of 60–80 MW h/year. In summary of the foregoing discussions, an ideal direct airside free cooling mode must take into account the variation of environmental conditions to maximize economizer utilization hours. Hence, it is imperative to incorporate sensors and some feedback controls as far as successful implementation of direct airside free cooling is concerned.

Table 2
Total annual utilization of direct airside free cooling in some European countries [3].

Location   Climate zone               Region 1 (h)  Region 2 (h)  Total (h)
Amsterdam  Oceanic climate            6611          67            6678
Barcelona  Mediterranean climate      4549          293           4842
Frankfurt  Temperate oceanic climate  6120          219           6339
London     Temperate oceanic climate  6441          97            6538
Stockholm  Humid continental climate  7077          165           7242

2.2. Indirect airside free cooling

The idea of the indirect airside free cooling economizer has been employed to solve the aforementioned problems of the direct airside economizer. The indirect air economizer takes advantage of favorable outdoor conditions without introducing outside air into the datacenter. The outside air, which cools the heated air, is circulated back to the environment through an independent loop. Since the outside air is not brought directly into the data center, these economizers allow closer control over humidity and air quality. Thus, the outdoor conditions cast less influence on datacenter operation. In addition, this also helps to prevent unwanted contaminants like dust and smoke from entering the datacenter. In addition to filters, the heat exchanger also can help to get rid of contaminants [39]. A further comparison between direct and indirect airside cooling is tabulated in Table 3 [39]. Furthermore, this kind of economizer has much less effect on reducing the effective airflow volume. One of the most common embodiments used in the datacenter is the air-to-air heat exchanger configuration, as shown in Fig. 5(a). The air-to-air heat exchanger provides full separation between the outside air and the treated air within the datacenter [40]. Moreover, in warm ambient conditions this type of economizer can be used along with evaporative cooling to maximize the economizer operation hours [39]. This kind of economizer includes an indirect evaporative cooler, pumps, valves, fans, water spray nozzles (with drift eliminator), sensors, and direct expansion (DX) assistance. Another well-known indirect sensible cooling implementation that can separate the outside air from the datacenter is the heat wheel, as shown in Fig. 5(b) [41]. This innovative design consists of a rotary wheel that acts as a regenerative air-to-air heat exchanger; the heat wheel typically has a very large diameter of about 8–12 ft. The hotter inside air from the datacenter passes through the wheel and heats up the wheel passages (made of special corrugated aluminum material), while the hotter passages then gradually rotate to the outdoor side to give up heat to the colder air, lowering the passage temperature accordingly. Efficient high-volume, low-pressure fans are utilized to move air across the wheel. This design can incorporate an internal DX system with multiple cooling stages for high ambient conditions. However, both air-to-air heat exchangers and heat wheels require constant cleaning or regular replacement, because the large quantities of contaminants in the air passing through the heat exchanger will build up over time and reduce the effectiveness of the heat exchangers [39].

Table 3
Comparison between direct airside cooling and indirect airside cooling [39].

Efficiency — Direct: the refrigeration compression cycle is not needed when the outside conditions are favorable, but fan power will increase as filters become clogged. Indirect: lower efficiency due to one more heat exchanger between outdoor and indoor air.
Capital cost — Direct: lower capital cost for prefabricated systems. Indirect: these systems normally cost more than direct airside cooling.
Operating cost — Direct: higher energy cost for geographies with high temperature and humidity, and higher maintenance cost due to filter replacement. Indirect: lower operating and maintenance cost since the outdoor air is isolated completely by the air-to-air heat exchanger, which results in longer full and partial economizer mode hours.
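To make the role of the air-to-air heat exchanger or heat wheel concrete, the sketch below (an illustration added here, not from the paper) estimates the supply temperature delivered to the cold aisle from a sensible effectiveness, T_supply = T_return − ε (T_return − T_outdoor); the effectiveness value and the example temperatures are assumptions.

```python
# Illustrative sketch: indirect airside economizer (air-to-air HX or heat wheel).
# Supply temperature follows from a sensible effectiveness; the effectiveness
# and the example return/outdoor temperatures below are assumed values.

def indirect_supply_temp(t_return_c, t_outdoor_c, effectiveness=0.75):
    """Cold-aisle supply temperature (deg C) after the air-to-air exchanger."""
    return t_return_c - effectiveness * (t_return_c - t_outdoor_c)

if __name__ == "__main__":
    # Hot-aisle return at 35 C, outdoor air at 10 C, assumed effectiveness 0.75.
    print(f"Supply ~ {indirect_supply_temp(35.0, 10.0):.1f} C")  # ~16.3 C
```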

2.3. Multi-stage evaporative cooling systems

The indirect evaporative cooling economizer can be improved by adding a sensible cooling stage to the system. This modification of the design can enhance the cooling efficiency significantly and relax the limitation on outdoor wet-bulb temperature. Adding this sensible cooling stage is essential when the outside humidity is relatively high. It can easily be obtained by mixing the outside air with the exhaust air from the conditioned space before bringing it to the evaporative cooler. Obviously, heat exchange is possible since the outside temperature is much higher than the exhaust temperature. A multistage evaporative cooler is designed to obtain the maximum possible benefits accompanied by improved system performance. For example, it can operate even when the inlet air temperature is lower than the wet-bulb temperature of the outside air; hence, it can be used in high-humidity areas. In 2011, the National Snow and Ice Data Center (NSIDC) designed a new cooling system consisting of an air handling unit (AHU) powered by a 7.5-kW (10-horsepower) fan motor via a variable frequency drive (VFD) and eight multi-stage indirect evaporative cooling air conditioners, along with a hot-aisle containment design. Fig. 6 shows a schematic of the cooling system designed by NSIDC. The results showed substantial energy savings and a reduction in PUE as compared to a typical CRAC system: the CRAC system had an annual average PUE of 2.01, while the NSIDC's new system had an average PUE of 1.55 [42].

3. Waterside economizers

Waterside economization can be obtained by directly pumping water into datacenters or indirectly with the help of cooling towers or dry coolers. The working principle of this economizer depends on using cooling towers to partially or completely precool the return water in a chilled water loop [4]. In essence, this economization process requires a changeover from free cooling mode to mechanical cooling mode to achieve the required heat rejection. According to Brown [43], water economizers are considered one of the best practices in reducing the power usage effectiveness (PUE) to 1.5.

Table 4
Representative research in association with free cooling.

Source | Free cooling strategy | Operating conditions | Benefits
Niemann et al. [40] | Direct air | 1 MW data center | 718,159 kW h/year; PUE = 1.14
Atwood and Miner [31] | Direct air | 10 MW data center | 2.87 million USD (annual)
Oró et al. [3] | Direct air | 1125 kW data center | Cooling electricity cost reduced by up to 51%; highest cooling potential = 6500 h/year
Shehabi et al. [38] | Direct airside | – | Annual savings at Center 8 estimated within the range of 60–80 MW h/year
Niemann et al. [40] | Indirect airside (evaporative HE) | 1 MW data center | 466,518 kW h/year; PUE = 1.09
Niemann et al. [40] | Indirect airside (heat wheel) | 1 MW data center | 503,999 kW h/year; PUE = 1.10
Potts [64] | Indirect airside (Kyoto wheel) | – | COP of 8–10 can be achieved
Longbottom [44] | Direct water | – | Removes 60% of the heat from a 33 kW rack
Niemann et al. [40] | Chiller-dry cooler | 1 MW data center | 846,039 kW h/year; PUE = 1.16
Choi et al. [65] | Chiller-dry cooler | – | COP of 3.2–9 can be achieved
Niemann et al. [40] | Waterside (chiller-cooling tower) | 1 MW data center | 728,195 kW h/year; PUE = 1.10
NSIDC [42] | Multi-stage indirect evaporative cooling | AHU powered by 7.5 kW fan motor | Substantial energy savings and reduction in PUE; PUE = 1.55
Brown [43] | Direct water | – | Can reduce PUE to 1.5
Carlson [46] | Waterside (chiller-cooling tower) | – | When the difference in temperatures is not very large, a change of only a few degrees can bring substantial gains in efficiency
Hamann et al. [47] | Waterside (chiller-cooling tower)/solar cooling | Tin = 66 °F, Tout = 80 °F | Chiller power consumption reduced by 31%
Qian et al. [50] | Heat pipe as free cooling device | Used when Tin < Tout | Significant energy savings of 38.9%
Samba et al. [51] | Heat pipe | Working fluid: n-pentane; filling ratio about 9.2% | Maximum equipment heating load of 250 W without TS versus 600 W with TS (higher operating temperature)
Jouhara and Meskimmon [52] | Heat pipe | Conducted in a typical region in the UK | Potential energy savings of up to 75%
Han et al. [56] | ISMT | High heat density mobile base station | About 34.3–36.9% energy savings
Zhang et al. [57] | ISMT | Experimentally investigated in four different climate zones in China | Annual energy savings of up to 47%
Tian et al. [59] | ISMT | High density data centers based on the least dissipation theory and property-matched heat exchange principle | Improved indoor thermal environment and annual cooling cost reduction of approximately 46%
Zheng et al. [62] | CES | – | Saves capital and operating expenses of up to $2668/day and $825/day, respectively, for a data center with 17,920 servers
Singh et al. [61] | CES using heat pipe | – | 3 million USD/year energy savings; heat output capacity = 8800 kW
Lu and Jia [63] | PHP | – | Lower start-up temperature, more uniform temperature distribution, temperature reduction of hotspots from 36.7 to 32.9 °C
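Since many of the entries in Table 4 are reported as PUE values, a small worked example (added here for illustration, with an assumed IT load) shows how a PUE improvement translates into facility energy: for a fixed IT load, facility energy scales directly with PUE, so dropping from 2.01 to 1.55, as in the NSIDC retrofit, removes roughly 23% of the total facility energy.

```python
# Illustrative arithmetic: PUE = total facility energy / IT energy, so for a
# fixed IT load the facility energy saved by a PUE improvement is
# IT_load * (PUE_old - PUE_new) * hours. The 1 MW IT load below is an assumed
# example; the PUE values are those reported for the NSIDC retrofit [42].

def annual_facility_energy_kwh(it_load_kw, pue, hours=8760):
    return it_load_kw * pue * hours

if __name__ == "__main__":
    it_kw = 1000.0                          # assumed 1 MW of IT equipment
    before = annual_facility_energy_kwh(it_kw, 2.01)
    after = annual_facility_energy_kwh(it_kw, 1.55)
    print(f"Saved ~ {(before - after) / 1e6:.2f} GWh/year")   # ~4.03 GWh/year
```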

Fig. 5. Indirect airside free cooling technology: (a) air-to-air heat exchanger [39,40]; (b) Kyoto wheel [10,41].

Fig. 6. Schematic cooling system designed by NSIDC [42].



Fig. 7. Datacenter cooled by geothermal mechanism [45].

Fig. 8. An example of an indirect cooling tower cooling system [10,40].

Fig. 9. Integrated cooling tower-chiller system by Carlson [46].



Fig. 10. Free water cooling system combined with solar cooling cycle [47].

Generally, waterside free cooling economizers can be categorized into three designs: the direct waterside cooling system, the cooling tower system, and the integrated dry cooler-chiller system; their detailed implementations and operational principles are given below.

3.1. Direct waterside cooling system

The direct waterside free cooling system is widely used since it is comparatively much more efficient than airside free cooling. Many companies have used direct waterside free cooling for their commercial products. Based on the report of IBM, such a system can remove 60% of the heat from a 33 kW high-density rack [44]. James and Rubenstein [45] invented a geothermal cooling mechanism, in which data center facilities are cooled through a thermally conductive interface with heat pipes buried beneath the earth for heat dissipation. This mechanism is depicted in Fig. 7. Such heat pipes can transfer the heat from datacenter facilities to the earth without any additional electrical power consumption. The design concept, of course, can be slightly modified by replacing the passive heat pipes with ground heat exchangers and water pumps. However, the initial cost and fouling concerns may jeopardize actual implementations.

Fig. 11. Integrated dry cooler-chiller system [10,40].



3.2. Cooling tower system

This system consists of two cycles: a chiller cycle, in which water is circulated through the CRACs, and a cooling tower cycle, in which hot waste water from the datacenter is cooled through evaporative cooling. Obviously, cooling towers are more applicable in cold and mild weather, when the outside air has a comparatively low wet-bulb temperature and low humidity. The chiller cycle stops running when the outside temperature is cold enough, and cold water from the cooling tower is then directly fed into the datacenter. Therefore, such systems require some algorithm for changeover from economizer mode to compressor refrigeration mode, as depicted in Fig. 8 [10,40]. Note that the installation cost and space should be taken into account. Udagawa et al. [34] suggested that the optimal capacity of the cooling tower system should be estimated in order to maximize the system-wide effect. Carlson et al. [46] developed a cooling system including a water cooling source, such as a cooling tower, and a plurality of on-floor cooling units, each unit configured to cool a portion of the datacenter racks, as shown in Fig. 9.

Much attention is currently paid to the combination of a cooling tower system and an absorption refrigeration cycle whose heat source is either solar energy or waste heat generated from the datacenter. This technology can be advantageous in most datacenter infrastructures regardless of datacenter location and climate conditions, since solar energy can be used for cooling in hot weather and the cooling towers or chiller plants can be eliminated, resulting in substantial energy savings. Hamann et al. [47] proposed a similar cooling design that combines free water cooling and an absorption solar refrigerator, as shown in Fig. 10. During the hot season, heat generated by the solar collector system powers the absorption refrigerator to supply chilled water circulating in the internal water loop. Conversely, in the cold season, the datacenter cooling system takes advantage of the cooling tower system. However, the downside of the cooling tower system is that large quantities of contaminants in the outside air will accumulate in the heat exchanger over time; accordingly, it requires frequent maintenance.

Fig. 12. Two-phase thermosiphon loop for telecommunication equipment [51].

Fig. 13. ISMT for datacenter application proposed by Suenaga and Ichimura [53].
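As a concrete illustration of the changeover algorithm mentioned above, the sketch below (added here, not from the paper) switches between free cooling, partial pre-cooling, and full mechanical modes by comparing the outdoor wet-bulb temperature with the chilled-water setpoint and a dead band; the setpoint, tower approach, and dead-band values are assumed example values.

```python
# Hypothetical changeover logic for a waterside economizer: compare the outdoor
# wet-bulb temperature with the chilled-water setpoint (minus a cooling-tower
# approach) and apply a dead band to avoid rapid mode cycling. All thresholds
# below are assumed example values, not figures from the paper.

def select_mode(t_wet_bulb_c, chw_setpoint_c=15.0, approach_c=4.0, dead_band_c=1.0):
    free_limit = chw_setpoint_c - approach_c          # tower can meet setpoint alone
    if t_wet_bulb_c <= free_limit - dead_band_c:
        return "free cooling (chiller off)"
    if t_wet_bulb_c <= chw_setpoint_c:
        return "partial free cooling (tower pre-cools, chiller trims)"
    return "mechanical cooling (chiller only)"

if __name__ == "__main__":
    for twb in (6.0, 13.0, 20.0):
        print(twb, "->", select_mode(twb))
```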

contaminants in the outside air will accumulate in the heat datacenter economizers being used, water-side economizers are
exchanger over time. Accordingly, it requires frequent mainte- the most widely used in datacenters [48]. Moreover, water-side
nance. Furthermore, cooling towers can be operated only in cooling systems are commonly used in large datacenter systems,
relative cold and mild climates in which the corresponding wet- suggesting enormous requirements of gigantic water supply. Miller
bulb temperature of outside air is low enough. Note that the idea [49] mentioned that a 15 MW datacenter can consume as much as
of integration of absorption can be easily switched to adsorption 360,000 gallons of water per day. Hence it may raise serious con-
system (either solid adsorption or liquid adsorption). Adsorption cerns of the shortage of water supply and suggested that cooling
systems can further remove all the fluid machineries, yielding a methods in association with air-side free cooling may be more
smaller PUE. responsible as far as environment is concerned [49].

3.3. Integrated dry cooler-chiller system (water to air dry coolers) 4. Heat pipe technology

Dry cooler is a fin-and-tube heat exchanger that is used to cool For effective removal of heat transfer and shorter distance
the circulated hot water from datacenter when the outside temper- transportation, some recent studies adopted heat pipe technology
ature is low enough. Dry cooler can be used to circulate chilled for thermal management of the datacenter. Using heat pipe heat
water in the CRAH to bypass the packaged chiller with same oper- exchanger such as thermosiphon can help to minimize the thermal
ational principles such as aforementioned cooling tower cycle as load on traditional cooling system as well as reducing greenhouse
shown in Fig. 11(a), or it can be connected with a second external gas emission. Note that heat pipes feature some unique character-
fin-and-tube heat exchanger to cool condenser water in the CRAC istics such as passive operation and extremely high effective ther-
unit as shown in Fig. 11(b). Dry cooler can be used as a partial mal conductivity (still in function even when the temperature
mode operation. Apparently, the dry cooler is connected in series difference is small between indoor and outdoor) which makes it
to the packaged chiller unit and it can be incorporated with evap- quite perfect for some thermal managements. Qian et al. [50] intro-
orative cooling to chill water in dry locations [10,40]. However, this duced heat pipe cooling system as the free cooling source, which
kind of economization system is not suitable for all year operation was implemented in Beijing, China and was operated when the
since it requires regularly switching to the traditional DX cooling outside temperature is lower than that of environmental set tem-
system. Meanwhile, according to ‘‘green grid 2011” survey about perature. By introducing heat pipe heat exchanger, they reported a

(a) Integrated system: vapor compression mode and thermosiphon mode

(b) Developed self-operated three-way valve


Fig. 14. ISMT proposed by Han et al. [56].

Samba et al. [51] presented an experimental investigation of a thermosiphon loop, as shown in Fig. 12. n-Pentane was used as the working fluid, and different working fluid filling ratios were tested. The optimal filling ratio obtained from the results was about 9.2%, which corresponds to the minimum operating temperature. The maximum heating load of the telecommunication equipment without and with the thermosiphon loop was about 250 W and 600 W, respectively, meaning that the allowable heating load of the equipment can be significantly increased by the thermosiphon loop. Jouhara and Meskimmon [52] conducted a case study in the UK on using heat pipe technology as a solution for high energy cost datacenters. They constructed a theoretical model based on the established, proven performance characteristics of heat pipe technologies and the weather data for a typical region in the UK. Their results showed an appreciable potential energy saving of up to 75% by employing a heat pipe based free cooling solution. Currently, the heat pipe cooling technology used in datacenters can be classified into three categories, namely the integrated system of mechanical refrigeration and thermosiphon (ISMT), the cold energy storage system (CES), and the pulsating heat pipe (PHP). Further details on these technologies are addressed in the following.

4.1. Integrated system of mechanical refrigeration and thermosiphon (ISMT)

Utilizing the ISMT in datacenters is considered an ideal solution for energy savings. Suenaga and Ichimura [53] proposed an ISMT consisting of a thermosiphon system and a vapor compression system, both of which shared one indoor unit fan, as shown in Fig. 13. The air is precooled by the thermosiphon first, and then the vapor compression cycle distributes the air to the required place. The result showed a substantial energy saving of the vapor compression cycle with the help of the thermosiphon loop. However, this system has some problems to be solved in regard to installation space, fluctuating cooling capacity, maintenance, investment, and control. Okazaki and Seshimo [54] and Lee et al. [55] developed ISMTs in which the evaporator and condenser were shared. Real integration and better control of the installation space were achieved. However, the test results showed a smaller cooling capacity in the thermosiphon loop for two reasons, i.e. the absence of a mechanical circulation device and the small diameter of the connection pipe between the evaporator and condenser, which affected the pressure drop greatly. In order to solve this problem, Han et al. [56] developed a new ISMT for high heat density mobile phone base stations. The integrated system combines the traditional vapor compression cycle and the thermosiphon cycle, which can save space and energy, as shown in Fig. 14(a). In order to reduce the pressure drop and enhance the performance of the system, a self-operated three-way valve was developed, as shown in Fig. 14(b). This valve connects the compressor and condenser in the vapor compression mode, and bypasses the compressor, connecting the condenser and evaporator directly, in the thermosiphon mode. The developed ISMT was tested in different mobile datacenters in China, and the results showed that the improved ISMT consumed about 34.3–36.9% less energy than traditional cooling systems. Zhang et al. [57] proposed an integrated system of mechanical refrigeration and thermosiphon (ISMT) that includes two circulation loops: a mechanical refrigeration loop and a thermosiphon loop. They used a three-fluid heat exchanger to connect the two loops. The system can work in three modes, mechanical refrigeration mode, thermosiphon mode, and dual mode, as shown in Fig. 15. The thermosiphon free cooling mode is activated only when the outdoor temperature is favorable. The performance of the integrated system was experimentally investigated in four different climate zones in China, and the results showed an annual energy saving of up to 47.3%. Zhang et al. [58] investigated the performance of the ISMT in thermosiphon mode and studied the effects of air flow rate and geometric parameters on the performance. The results showed that the cooling capacity and circulation flow rate increased with increasing thermosiphon riser diameter and air flow rate but decreased with increasing tube length. Tian et al. [59] proposed a cooling solution for high density data centers based on the least dissipation theory and the property-matched heat exchange principle. The cooling system is a combination of an internally cooled rack with two-stage heat pipe loops and a combined water loop with some serially connected cold sources, as shown in Fig. 16. The proposed combined cooling system could help eliminate the undesired mixing of hot and cold air, thereby yielding a uniform distribution of indoor temperature. A comparative measurement was performed to validate the effectiveness of the system, and the results showed an improved indoor thermal environment with an annual cooling cost reduction of approximately 46%.

Fig. 15. Schematic diagram of the ISMT proposed by Zhang et al. [57]: (a) thermosiphon mode; (b) refrigeration mode; (c) dual mode.
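The three-mode operation described for the ISMT lends itself to a simple supervisory rule; the sketch below (an added illustration, not the controller of Zhang et al. [57]) picks a mode from the difference between indoor and outdoor temperature, with the two thresholds chosen as assumed example values.

```python
# Hypothetical mode selection for an ISMT (mechanical refrigeration loop plus
# thermosiphon loop). The thermosiphon can carry the load only when the outdoor
# air is sufficiently colder than the indoor return air; the 5 K and 15 K
# thresholds below are assumed example values, not from Zhang et al. [57].

def ismt_mode(t_indoor_c, t_outdoor_c, dual_dt=5.0, free_dt=15.0):
    dt = t_indoor_c - t_outdoor_c
    if dt >= free_dt:
        return "thermosiphon mode (compressor off)"
    if dt >= dual_dt:
        return "dual mode (thermosiphon pre-cooling + mechanical refrigeration)"
    return "mechanical refrigeration mode"

if __name__ == "__main__":
    for t_out in (5.0, 20.0, 30.0):
        print(t_out, "->", ismt_mode(27.0, t_out))
```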

Fig. 16. Schematic of the two-stage heat pipe loop combined with a water loop proposed by Tian et al. [59].

4.2. Cold energy storage system (CES) using heat pipe

One of the aforementioned problems of the ISMT is the unstable cooling capacity of the heat pipe system due to changeable weather. CES is an innovative method that can be used along with the heat pipe in the data center to take advantage of cold weather by storing cold energy from the ambient in ice or water. With such cold storage, the chiller cooling system in the datacenter can be partially replaced in order to obtain significant energy savings. Wu et al. [60] and Singh et al. [61] proposed a CES thermosiphon system. The proposed system can easily be connected with traditional cooling systems without major design changes, as shown in Fig. 17. It should be noted that a cooling tower and chiller system is added to provide sufficient cooling capacity; this is often necessary to support the system in the hot seasons. The cold storage can be simple water storage or ice storage, depending on the geographical location. It provides chilled water through a highly effective plate heat exchanger, which also helps to avoid contamination of the liquid-cooled heat sink. Since the typical thermosiphon is a gravity-assisted design, it can only operate when the ambient temperature is lower than the storage temperature. If the sized capacity of the heat pipe system is not enough and the ambient temperature is high, the chiller-cooling tower system is connected to the cold storage to provide the required cooling capacity. In essence, the aforementioned systems, with the help of heat pipe technology, showed significant energy savings compared to traditional cooling systems. Zheng et al. [62] proposed a generalized power shaving framework that exploits UPS batteries and a new knob, thermal energy storage (TES) tanks, in datacenters, as depicted in Fig. 18. Zheng et al. [62] modeled and discussed the characteristics of different kinds of TES and their effects on the datacenter cooling system. Furthermore, they designed the framework with different strategies to shave both cooling-side and server-side power based on the different characteristics of TES and UPS. They reported an appreciable saving of capital and operating expenses of up to $2668/day and $825/day, respectively, for a datacenter with 17,920 servers. In summary of the foregoing discussions on the existing solutions to reduce the power consumption of datacenters, heat pipe technology offers great thermal control owing to its ability to transfer heat at a small temperature difference.

Fig. 17. Cold energy storage system (CES) using heat pipe proposed by Wu et al. [60] and Singh et al. [61].
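To indicate how such a cold store might be sized, the sketch below (added here as an illustration; the load, ride-through duration, and allowed temperature rise are assumed values) applies the standard sensible-storage relation V = Q·t / (ρ·cp·ΔT) for a chilled-water tank.

```python
# Illustrative sizing of a sensible (chilled-water) cold energy store:
# required volume V = Q * t / (rho * cp * dT). The 200 kW load, 4 h ride-through
# and 8 K allowed temperature rise are assumed example values.

RHO_WATER = 1000.0      # kg/m^3
CP_WATER = 4186.0       # J/(kg*K)

def water_storage_volume_m3(load_kw, hours, delta_t_k):
    energy_j = load_kw * 1000.0 * hours * 3600.0
    return energy_j / (RHO_WATER * CP_WATER * delta_t_k)

if __name__ == "__main__":
    vol = water_storage_volume_m3(load_kw=200.0, hours=4.0, delta_t_k=8.0)
    print(f"~{vol:.0f} m^3 of chilled water")   # ~86 m^3
```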

4.3. Pulsating heat pipe (PHP)

Pulsating heat pipes (PHPs) are made of small diameter tubes in a vertically serpentine configuration, partially filled with working fluid. Through this arrangement, PHPs are able to operate by means of the expansion of vapor slugs at the evaporator and their contraction in the condenser, with unevenly distributed liquid slugs and vapor plugs within the whole tube. Unlike conventional heat pipes, which make use of a capillary wick structure to complete the flow circulation, the wickless design of the PHP significantly eases the capital cost of the manufacturing process, and it is quite effective for larger scale operation/circulation. In addition, PHPs also hold many superior features, such as high effective thermal conductivity, flexibility, a large maximum heat transfer rate, and long distance transport capability. Hence, they are especially viable for future implementation of heat pipe alternatives in datacenter applications. Most recent studies have focused on the factors that influence the start-up phenomenon and the heat transfer performance of the PHP. Lu and Jia [63] proposed a rack cooling system combined with a pulsating heat pipe; the schematic diagram of the system is shown in Fig. 19. They experimentally studied the start-up performance of the pulsating heat pipe, the influence of the heating load of the server in the rack, and the influence of the chilled air velocity. The results showed that the start-up of the PHP lowers the temperature and makes the temperature distribution more uniform in the rack. Compared to the conventional cooling system, the cooling system combined with the PHP could reduce the temperature of some hot spots from 36.7 °C to 32.9 °C. A higher heating power produced a shorter start-up time and faster heat transfer. In our laboratory, we also conducted similar experiments to replace the traditional thermosiphon heat exchanger with a PHP, as schematically shown in Fig. 20. With the same surface area and operated under the same heat source and operating velocities, our test results also indicate that the PHP can lower the case temperature by a further 4–5 °C as compared to the traditional thermosiphon heat exchanger.

Fig. 18. Schematic of the data center power shaving framework exploiting UPS/TES tanks proposed by Zheng et al. [62].

Fig. 19. Experimental setup of the rack cooling system based on a pulsating heat pipe [63].
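A convenient way to compare the two heat exchangers in Fig. 20 is through an overall thermal resistance R = (T_case − T_air)/Q; the sketch below (an added illustration) shows how the reported 4–5 °C case-temperature reduction would translate into a resistance change, with the heat load and baseline temperatures taken as assumed example values.

```python
# Illustrative comparison of thermosiphon vs. PHP heat exchangers via an
# overall thermal resistance R = (T_case - T_air) / Q. The 500 W heat load and
# the baseline temperatures are assumed example values; only the 4-5 C case
# temperature reduction reflects the in-house result quoted above.

def thermal_resistance(t_case_c, t_air_c, heat_w):
    return (t_case_c - t_air_c) / heat_w     # K/W

if __name__ == "__main__":
    q = 500.0                                 # assumed heat load, W
    r_ts = thermal_resistance(45.0, 25.0, q)  # thermosiphon baseline (assumed)
    r_php = thermal_resistance(40.5, 25.0, q) # PHP case ~4.5 C cooler
    print(f"TS: {r_ts:.3f} K/W, PHP: {r_php:.3f} K/W")  # 0.040 vs 0.031 K/W
```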

Fig. 20. Photos of a typical thermosiphon heat exchanger (a) and a pulsating heat pipe heat exchanger (b).

5. Conclusions

In the past, cooling system economizer modes were not seriously considered in most datacenters. Nowadays, the energy demand of datacenters is increasing rapidly year after year, and it is expected that the energy consumption of datacenters will become even worse in the upcoming years. Currently, the amount of energy spent on cooling datacenters accounts for about 50% of the total energy consumption in the datacenters. Thus, effective energy management, especially for relaxing the cooling load carried by refrigeration systems, is urgently required. As a result, free cooling technology, including airside economizers, waterside economizers, and heat pipe technology, plays an essential role in the partial or complete relief of compressor operation. In the present overview, the benefits, shortcomings, and limitations of these three technologies have been summarized and discussed. Among them, indirect airside coolers, such as the air-to-air heat exchanger system and the heat wheel, show very high efficiency, and they can be used in almost any climate. Either airside free cooling or waterside free cooling may be integrated with other systems (e.g. absorption, solar, adsorption, evaporative cooling, geothermal, and the like) to claim even higher performance. On the other hand, thermosiphon heat exchangers, which feature a unique ability to transfer heat at a small temperature difference, are quite promising for datacenter free cooling, while pulsating heat pipes, which can transport heat over even longer distances and outperform the traditional thermosiphon heat exchanger, reveal even more promising features.

Acknowledgements

The authors are indebted to the financial support from the Ministry of Science and Technology of Taiwan under contract 104-2221-E-009-184-MY3.

References

[1] K. Tupper, Making big cuts in data center energy use, Rocky Mountain Institute, 2012. Available at: <http://blog.rmi.org/blog_making_big_cuts_in_data_center_energy_use> (May 2012).
[2] E. Oró, V. Depoorter, A. Garcia, J. Salom, Energy efficiency and renewable energy integration in data centres, strategies and modelling review, Renew. Sustain. Energy Rev. 42 (2015) 429–445.
[3] E. Oró, V. Depoorter, N. Pflugradt, J. Salom, Overview of direct air free cooling and thermal energy storage potential energy savings in data centres, Appl. Therm. Eng. 85 (2015) 100–110.
[4] Y.Y. Lui, Waterside and airside economizers design considerations for data center facilities, ASHRAE Trans. 116 (1) (2010) 98–108.
[5] F. Zhou, T.X. Tian, G.Y. Ma, Investigation into the energy consumption of a data center with a thermosyphon heat exchanger, Chin. Sci. Bull. 56 (2011) 2185–2190.
[6] J. Verge, J. Smith, What will the data center of 2025 look like?, Data Center Knowledge, 2014. Available at: <http://www.datacenterknowledge.com/archives/2014/04/29/will-data-center-2025-look-like/> (April 29, 2014).
[7] J. Pouchet, I. Bitterlin, Data center density: will it soar or stall?, Emerson Electric Co., 2015.
[8] E. Khosrow, G.F. Jones, A.S. Fleischer, A review of data center cooling technology: operating conditions and the corresponding low-grade waste heat recovery opportunities, Renew. Sustain. Energy Rev. 31 (2014) 622–638.
[9] S.F. Hassan, M. Ali, U. Perwez, A. Sajid, Free cooling investigation of RCMS data center, Energy Proc. 75 (2015) 1249–1254.
[10] H. Zhang, S.Q. Shao, H.B. Xu, H.M. Zou, C.Q. Tian, Free cooling of data centers: a review, Renew. Sustain. Energy Rev. 35 (2014) 171–182.
[11] J. Dai, D. Das, M. Pecht, Prognostics-based risk mitigation for telecom equipment under free air cooling conditions, Appl. Energy 99 (2012) 423–429.
[12] A. Almoli, A. Thompson, N. Kapur, J. Summers, H. Thompson, G. Hannah, Computational fluid dynamic investigation of liquid rack cooling in data centres, Appl. Energy 89 (2012) 150–155.
[13] C.D. Patel, C.E. Bash, C. Belady, Computational fluid dynamics modeling of high compute density data centers to assure system inlet air specifications, in: Proceedings of the Pacific Rim/ASME International Electronic Packaging Technical Conference and Exhibition, Kauai, Hawaii, 2001.
[14] T.D. Boucher, D.M. Auslander, C.E. Bash, Viability of dynamic cooling control in a data center environment, Center for the Built Environment, UC Berkeley. Available at: <http://escholarship.org/uc/item/0wj7r61r> (January 1, 2006).
[15] R. Schmidt, E. Cruz, Cluster of high-powered racks within a raised-floor computer data center: effect of perforated tile flow distribution on rack inlet air temperatures, J. Electron. Packag. 126 (2004) 510–518.
[16] K.C. Karki, A. Radmehr, S.V. Patankar, Use of computational fluid dynamics for calculating flow rates through perforated tiles in raised-floor data centers, HVAC&R Res. 2 (2003) 153–166.
[17] K.C. Karki, S.V. Patankar, A. Radmehr, Techniques for controlling airflow distribution in raised-floor data centers, Adv. Electron. Packag. 2 (2003) 621–628.
[18] J. Rambo, Y. Joshi, Supply air distribution from a single air handling unit in a raised floor plenum data center, in: Proceedings of the Sixth ISHMT-ASME Heat and Mass Transfer Conference and Seventeenth National Heat and Mass Transfer Conference, Kalpakkam, India, 2004.
[19] B. Sammakia, R. Schmidt, M. Iyengar, Comparative analysis of different data center airflow management configurations, in: Proceedings of IPACK, San Francisco, CA, 2005.
[20] J. Cho, T. Lim, B.S. Kim, Measurements and predictions of the air distribution systems in high compute density (internet) data centers, Energy Build. 41 (2009) 1107–1115.
[21] H.E. Khalifa, D.W. Demetriou, Energy optimization of air-cooled data centers, J. Therm. Sci. Eng. Appl. 2 (041005) (2010) 1–13.
[22] C.D. Patel, R. Sharma, C.E. Bash, A. Beitelmal, Thermal considerations in cooling large scale high compute density data centers, in: Proceedings of the Eighth Intersociety Conference on Thermal and Thermomechanical Phenomena in Electronic Systems, San Diego, California, 2002.
[23] R. Schmidt, E. Cruz, Raised floor computer data center: effect on rack inlet temperatures when adjacent racks are removed, Adv. Electron. Packag. 2 (2003) 481–493.
[24] ASHRAE, ASHRAE Fundamentals Handbook, Atlanta, 2001.
[25] M. Pawlish, A.S. Varde, Free cooling: a paradigm shift in data centers, in: Proceedings of the 5th International Conference on Information and Automation for Sustainability, Colombo, Sri Lanka, 2010.
[26] ASHRAE T.C. 9.9, 2011 Thermal guidelines for data processing environments – expanded data center classes and usage guidance, 2011, pp. 1–45.
[27] R. Steinbrecher, R. Schmidt, Data center environments: ASHRAE's evolving thermal guidelines, ASHRAE J. 53 (2011) 42–49.
[28] J. Niemann, K. Brown, V. Avelar, Impact of hot and cold aisle containment on data center temperature and efficiency, APC White Paper 135, 2011.

[29] K.P. Lee, H.L. Chen, Analysis of energy saving potential of air-side free cooling for data centers in worldwide climate zones, Energy Build. 64 (2013) 103–112.
[30] J. Siriwardana, S. Jayasekara, S.K. Halgamuge, Potential of air-side economizers for data center cooling: a case study for key Australian cities, Appl. Energy 104 (2013) 207–219.
[31] D. Atwood, J.G. Miner, Reducing data center cost with an air economizer, Intel Information Technology, 2008.
[32] TechTarget, Adiabatic cooling, 2014. Available at: <http://whatis.techtarget.com/definition/adiabatic-cooling> (July 2014).
[33] C. Longbottom, Get comfortable with an adiabatic cooling system in the data center, TechTarget, 2014. Available at: <http://searchdatacenter.techtarget.com/feature/Get-comfortable-with-an-adiabatic-cooling-system-in-the-data-center> (May 2014).
[34] Y. Udagawa, S. Waragai, M. Yanagi, W. Fukumitsu, Study on free cooling systems for data centers in Japan, in: 32nd Annu. Int. Telecommun. Energy Conf. (INTELEC), 2010, pp. 1–5.
[35] J. Dai, D. Das, M. Pecht, A multiple stage approach to mitigate the risks of telecommunication equipment under free air cooling conditions, Appl. Energy 64 (2012) 424–432.
[36] A. Shehabi, A. Horvath, W. Tschudi, A.J. Gadgil, W.W. Nazaroff, Particle concentrations in data centers, Atmos. Environ. 42 (2008) 5978–5990.
[37] A. Shehabi, S. Ganguly, L.A. Gundel, A. Horvath, T.W. Kirchstetter, M.M. Lunden, W. Tschudi, A.J. Gadgil, W.W. Nazaroff, Can combining economizers with improved filtration save energy and protect equipment in data centers?, Build. Environ. 45 (2010) 718–726.
[38] A. Shehabi, W. Tschudi, A. Gadgil, Data center economizer contamination and humidity study, LBNL-2424E, 2010.
[39] P. Lin, J. Niemann, L. Long, Choosing between direct and indirect air economization for data centers, White Paper 215, 2015.
[40] J. Niemann, J. Bean, V. Avelar, Economizer modes of data center cooling systems, White Paper 132, 2011.
[41] P. Jones, KyotoCooling: catching the tailwind. Available at: <http://archive.datacenterdynamics.com/focus/archive/2011/04/kyotocooling-catching-tailwin> (April 2011).
[42] NSIDC, Energy reduction strategies: airside economization and unique indirect evaporative cooling, White Paper, 2012, pp. 1–8.
[43] R. Brown, Report to Congress on server and data center energy efficiency, Public Law 109-431. Available at: <https://escholarship.org/uc/item/74g2r0vg> (June 2008).
[44] C. Longbottom, Water cooling vs. air cooling: the rise of water use in data centres, TechTarget, 2011. Available at: <http://www.computerweekly.com/tip/Water-cooling-vs-air-cooling-The-rise-of-water-use-in-data-centres> (August 2011).
[45] S.M. James, B.A. Rubenstein, Renewable energy based datacenter cooling, US Patent 20140368991 A1.
[46] A.B. Carlson, Data center cooling, US Patent 8113010 B2, 2012.
[47] H.F. Hamann, M.K. Iyengar, T.G. Kessel, Cooling infrastructure leveraging a combination of free and solar cooling, US Patent 8020390 B2, 2011.
[48] J. Kaiser, J. Bean, T. Harvey, M. Patterson, J. Winiecki, Survey results: data center economizer use, The Green Grid White Paper #41, 2011, pp. 1–19.
[49] R. Miller, Data centers move to cut water waste, Data Center Knowledge, 2009. Available at: <http://www.datacenterknowledge.com/archives/2009/04/09/data-centers-move-to-cut-water-waste/> (April 2009).
[50] X.D. Qian, Z. Li, H. Tian, Application of heat pipe system in data center cooling, Sustain. Energy Technol. 2 (2014) 609–620.
[51] A. Samba, H. Louahlia-Gualous, S.L. Masson, D. Norterhauser, Two-phase thermosyphon loop for cooling outdoor telecommunication equipments, Appl. Therm. Eng. 50 (2013) 1351–1360.
[52] H. Jouhara, R. Meskimmon, Heat pipe based thermal management systems for energy-efficient data centres, Energy 77 (2014) 265–270.
[53] T. Suenaga, M. Ichimura, Air-cooled packaged air conditioner utilizing thermosyphon system, Trans. SHASEJ 60 (12) (1986).
[54] T. Okazaki, Y. Seshimo, Cooling system using natural circulation for air conditioning, Trans. Jpn. Soc. Refrig. Air Condit. Eng. 25 (3) (2008) 239–251.
[55] S. Lee, H. Kang, Y. Kim, Performance optimization of a hybrid cooler combining vapor compression and natural circulation cycles, Int. J. Refrig. 32 (2009) 800–808.
[56] L. Han, W. Shi, B. Wang, P. Zhang, X. Li, Development of an integrated air conditioner with thermosyphon and the application in mobile phone base station, Int. J. Refrig. 36 (2013) 58–69.
[57] H. Zhang, S. Shao, H. Xu, H. Zou, C. Tian, Integrated system of mechanical refrigeration and thermosyphon for free cooling of data centers, Appl. Therm. Eng. 75 (2015) 185–192.
[58] H. Zhang, S. Shao, C. Tian, Simulation of the thermosyphon free cooling mode in an integrated system of mechanical refrigeration and thermosyphon for data centers, Energy Proc. 75 (2015) 1458–1463.
[59] H. Tian, Z. He, Z. Li, A combined cooling solution for high heat density data centers using multi-stage heat pipe loops, Energy Build. 94 (2015) 177–188.
[60] X.P. Wu, M. Mochizuki, K. Mashiko, Thang Nguyen, Tien Nguyen, V. Wuttijumnong, G. Cabusao, R. Singh, A. Akbarzadeh, Cold energy storage systems using heat pipe technology for cooling data centers, Front. Heat Pipes 2 (2011).
[61] R. Singh, M. Mochizuki, K. Mashiko, T. Nguyen, Heat pipe based cold energy storage systems for datacenter energy conservation, Energy 36 (2011) 2802–2811.
[62] W.L. Zheng, K. Ma, X.R. Wang, Exploiting thermal energy storage to reduce data center capital and operating expenses, IEEE, 2014, 1143607.
[63] Q.Y. Lu, L. Jia, Experimental study on rack cooling system based on a pulsating heat pipe, J. Therm. Sci. 25 (2016) 60–67.
[64] Z. Potts, Free cooling technologies in data centre applications, SUDLOWS White Paper, Manchester, 2011.
[65] J. Choi, J. Jeon, Y. Kim, Cooling performance of hybrid refrigeration system designed for telecommunication equipment rooms, Appl. Therm. Eng. 27 (2007) 2026–2032.
