LCS Value Proposition
LiquidCool Solutions
November 2014
Data Center Trends 2014
• Improving Data Center Power Efficiency & Reducing Carbon Footprint
  • Initiatives for raising data center temperatures
  • Data center upgrades for cooling and power management
  • Low power server alternatives (ARM) – HP Moonshot, etc.
  • DCIM offerings for efficient management of DC infrastructure
• Increasing power density & core counts for HPC clusters with a focus on energy efficiency
• “Green 500” configurations with pervasive adoption of GPGPU and Intel Phi cores for compute
• Moves to higher voltage power inputs to reduce copper and power distribution challenges
• Adoption of alternatives to traditional air cooling schemes
  • Cold plates utilizing water and ethylene glycol/water fluids
  • Heat pipes used in conjunction with water-based heat exchangers
  • Single-phase and phase-change submersion cooling employing dielectric fluids
• Market growth for modular approaches to add Data Center capacity
• Attractive economics, flexibility, and lead time compared to brick and mortar options
• Modular Data Center Market projected by 451 Research to be $2.5 Billion annually by 2015
LCS Cooling Complements Trends
• Improving Data Center Power Efficiency
  • Reductions in “Power to Cool” at the device level of up to 98% (traditional air vs. LCS)
  • Typical reductions in overall data center power consumption of 40%*
  • Inherent ability to recover and utilize waste heat for facility purposes
  • Potential for achieving “true PUEs” as low as 1.02
• Increasing Power Density
• Vastly superior physics for heat transfer – volumetric heat capacity ~1500x that of air (see the sketch at the end of this slide)
• More compact packaging options – no need for fans
• Cooling system components can be remotely configured to increase IT density
• Modular Data Center Configurations
  • Lower cost infrastructure for fluid distribution and heat exchange
  • More compact footprints than with typical air-cooled options
  • Lower operating costs and maintenance requirements than air-cooled configurations
  • No considerations or provisions required for local air quality – sealed systems
* Based on comparison to Data Center Study of 2012 ASHRAE Report Findings
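The ~1500x figure is consistent with generic fluid properties. A minimal sketch comparing volumetric heat capacities, assuming textbook values for air and a typical dielectric coolant (not LCS-specific data):

```python
# Volumetric heat capacity = density * specific heat.
# Property values are generic textbook figures, not LCS fluid data.
air_rho, air_cp = 1.2, 1005.0    # kg/m^3 and J/(kg*K) for air at ~25 °C
oil_rho, oil_cp = 800.0, 2000.0  # assumed values for a dielectric coolant

air_vhc = air_rho * air_cp       # ~1.2e3 J/(m^3*K)
oil_vhc = oil_rho * oil_cp       # ~1.6e6 J/(m^3*K)

# The ratio lands in the same order of magnitude as the ~1500x cited above.
print(f"Dielectric fluid stores ~{oil_vhc / air_vhc:.0f}x more heat per unit volume than air")
```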
Fans are a Problem
Fans Waste Energy
• 15% of total data center energy is used to move air
• Additionally, fans in the chassis can use up to 20% of IT power at the device level
• Fans are inefficient and generate heat that must be removed
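For a sense of scale, a sketch applying these percentages to a hypothetical facility (the 250 kW IT load and 500 kW total draw are assumptions, not measurements):

```python
# Illustrative fan-energy estimate using the percentages above.
it_load_kw = 250.0    # assumed IT load
facility_kw = 500.0   # assumed total facility draw

chassis_fans_kw = 0.20 * it_load_kw   # up to 20% of IT power at the device level
air_movement_kw = 0.15 * facility_kw  # 15% of total data center energy moves air

print(f"Chassis fans: up to {chassis_fans_kw:.0f} kW")
print(f"Facility air movement: ~{air_movement_kw:.0f} kW")
```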
Fans Waste Space
• Racks need room to breathe
• CRAC units require space around the racks
Fans Reduce Reliability
• Fans fail
• Thermal fluctuations drive solder joint failures
• Structure-borne vibration frets electrical contacts
• Fans expose electronics to air
  • Oxidation/corrosion of electrical contacts
  • Exposure to electrostatic discharge events
  • Sensitivity to ambient particulate, humidity, and temperature conditions
Benefits from Eliminating Fans
Cool High Power Electronics
- Lower operating temperatures result in lower leakage current
- All internal components are kept within normal operating temperature ranges
Save Energy
- For LCS, device power-to-cool can be reduced by up to 98% vs. air-cooled devices
- For LCS, the “true” cooling PUE is ~1.03
- Waste heat can easily be recovered for other uses
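Pairing the 98% figure with the chassis-fan fraction from the previous slide gives a back-of-envelope estimate of LCS cooling overhead at the device level (an illustrative calculation, not a measured LCS result):

```python
# Back-of-envelope: device-level power-to-cool, air vs. LCS.
fan_fraction = 0.20                       # chassis fans, air-cooled (previous slide)
lcs_fraction = fan_fraction * (1 - 0.98)  # 98% reduction claimed above

print(f"LCS power-to-cool: ~{lcs_fraction:.1%} of IT power")  # ~0.4%
```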
Save Space
- Higher rack density because there is no need for hot aisles or to circulate air in the racks
- For LCS, 64 IT devices in a 42U rack or 72 devices in a 48U rack
Increase Reliability
- Sealed fluid circuits prevent failures from corrosion and contamination
- Liquid submersion reduces thermal fatigue of solder joints
Operate Silently - Fan noise is eliminated
Five Technologies Eliminate Fans
1. Clustered Systems
2. Green Revolution Cooling
3. Iceotope
4. SGI
5. LiquidCool Solutions
Clustered Systems
• Clustered Systems circulates a refrigerant through an external cold plate mounted on top of each server in the rack.
• Mechanical pressure is used to create contact between the cold plate and the top of the server.
• Heat is transferred from the server to the refrigerant by conduction.
• The system requires a refrigerant temperature below 30°C.
Green Revolution Cooling
• Green Revolution Cooling’s CarnotJet™ system circulates a mineral-oil-based dielectric fluid through a tank containing submerged IT devices.
• The system resembles a rack tipped over on its back, with modified servers inserted vertically into slots in the tank.
• Each 42U tank is filled with roughly 250 gallons of the mineral oil.
• Maintenance access is through the top of the tank.
Iceotope
• Iceotope mounts off-the-shelf motherboards inside sealed hot-swappable cartridges that are flooded with a dielectric fluid, 3M Novec.
• Novec, which remains a liquid, moves heat to a hot plate mounted on the side of the cartridge.
• Water circulates in a secondary circuit through the hot plate, and heat is transferred from Novec to water in the hot plate by conduction.
• Aster Capital, an investment group backed by Alstom, Schneider Electric, and Solvay, recently invested $10 million in Iceotope.
• Along with the investment, Schneider announced that it intends to commercialize Iceotope technology.
SGI
• SGI uses a two-phase (evaporative) immersion cooling system.
• Electronic components are submerged in a bath of Novec, an expensive dielectric refrigerant.
• Boiling occurs on the surface of the heat-generating devices, and vapor passively rises to the top of the enclosure, where it condenses on water-cooled coils and falls back into the tank.
LiquidCool Solutions
Patented Directed-Flow Technology
• No fans or other moving parts in the chassis
• Total liquid submersion in an eco-friendly dielectric fluid
• Rack-mounted devices are easy to maintain
• Within a device, “cool” liquid is circulated directly to the components with the highest power density
• The remaining components are cooled by bulk flow as the dielectric fluid is drawn through the unit to a return manifold
• Electronics are decoupled from the environment
How it Works
Heat Dissipation
LCS Cooling System Elements
• Pump supplies “cool” dielectric liquid to multiple IT racks
• If there is no energy recycling option, “hot” fluid is circulated to an evaporative fluid cooler
• Incoming “cool” fluid can be as warm as 45°C for most applications
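For scale, the single-phase heat balance Q = ṁ·cp·ΔT indicates how much coolant flow a rack needs. A sketch assuming generic fluid properties and a 10 K supply-to-return rise (neither is an LCS specification):

```python
# Single-phase heat balance: Q = m_dot * c_p * delta_T, solved for m_dot.
rack_heat_w = 21_000.0  # 21 kW per rack (figure used later in this deck)
cp = 2000.0             # J/(kg*K), assumed dielectric coolant
rho = 800.0             # kg/m^3, assumed density
delta_t = 10.0          # K, assumed supply-to-return rise

m_dot = rack_heat_w / (cp * delta_t)    # kg/s
flow_lpm = m_dot / rho * 1000.0 * 60.0  # liters per minute

print(f"~{m_dot:.2f} kg/s (~{flow_lpm:.0f} L/min) per 21 kW rack")
```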
Reliability
LCS technology decouples electronics from the room, eliminating the root causes of failure:
• Dramatic reduction in thermal fluctuations, which drive solder joint failures
• Much lower operating temperatures for the board and components (typically 20–30°C cooler device temperatures than with air)
• No oxidation/corrosion of electrical contacts
• No fretting corrosion of electrical contacts induced by structural vibration
• No moving parts within the device enclosure (fan failures are among the most common service triggers for electronics)
• No exposure to electrostatic discharge events
• No sensitivity to ambient particulate, humidity, or temperature conditions
When maintenance is required to upgrade components:
• The swap-out procedure takes less than 2 minutes with no measurable loss of fluid
• An IT device can be removed from a rack, drained, opened, serviced, reassembled, refilled, and reinstalled within a 15-minute turnaround window
Robust IP Portfolio
20 Issued & 19 Pending Patents
• Liquid tight server case with dielectric liquid for cooling electronics
• Extruded server case used with liquid coolant of electronics
• Computer with fluid inlets and outlets for direct-contact liquid cooling
• Case and rack system for liquid submersion cooling of an array of electronic components
• Computer case for liquid submersion cooled components
• Liquid submersion cooled computer with directed flow
• Gravity assisted directed liquid cooling of electronics
Additional LCS Benefits
Each benefit below is paired with the competing technologies that lack it:
• Any Shape or Size – Clustered Systems, Green Revolution, SGI, Iceotope
• No Water – Iceotope
• Scalable – Green Revolution, SGI
• Easy to Swap Devices – Green Revolution, SGI
• Easy to Maintain Devices – Green Revolution, SGI, Iceotope
• Harsh Environment Deployments – Green Revolution, SGI
• Costs Less than Air – Clustered Systems, SGI, Iceotope
Liquid Submerged Computer Operating Underwater in an Aquarium
Any Shape or Size
8 servers with liquid-to-liquid cooling distribution unit
Industrial and embedded computing applications
Any Shape or Size
Liquid Submerged Computer with Passive Radiator
Liquid Submerged Computer with Stacked Boards
64-Server Configuration
Connection to remote CDU
Low Cost “Clamshell” Server
• Motherboard sandwiched between two sealed enclosures
• Rack-mountable
• I/O connectors remain outside the liquid enclosure
Clamshell Server System
Example – Modular Data Center
Air-Cooled: Input Power 500 kW, IT Power 250 kW
LiquidCool: Input Power 320 kW, IT Power 250 kW
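The PUE implied by these figures follows directly from the definition (total input power divided by IT power):

```python
# PUE = total facility input power / IT power, using the figures above.
def pue(input_kw: float, it_kw: float) -> float:
    return input_kw / it_kw

print(f"Air-Cooled: PUE = {pue(500, 250):.2f}")           # 2.00
print(f"LiquidCool: PUE = {pue(320, 250):.2f}")           # 1.28
print(f"Input power reduction: {(500 - 320) / 500:.0%}")  # 36%
```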
1MW Hybrid HPC Module
Overall dimensions 12’ x 12’ x 42’
200 kW Hybrid HPC Module
Overall dimensions: 12’ x 12’ x 20’
• Air cooled section for data storage & switches
• Liquid cooled section for compute
Fluid Distribution for LSS 220 Rack
250 kW Liquid-to-Liquid CDU
Fully Redundant Heat Exchangers, Pumps, and Pump Control Units
Approximate size: 48” wide x 48” deep x 30” high
Federal Data Center Upgrade
Current Facility Layout (Air-Cooled)
• 50’ x 100’ Facility Space
• 115 racks of air cooled IT equipment
• Estimated IT compute power consumption of 500 kW
• 7 Air Handling Units fully operational at capacity
Facility Space Reduction using LCS Servers
• 50’ x 100’ Facility Space
• 24 liquid-cooled 48U racks with 72 servers per rack
• 500+ kW of IT compute capacity at 21 kW/rack
• Approximate footprint = 450 ft²
• Air Handling Units (shown in red) may be decommissioned
• Remainder of space available for repurposing or expansion
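The capacity figures follow directly from the per-rack numbers; a quick arithmetic check:

```python
# Quick check on the LCS retrofit figures above.
racks, servers_per_rack, kw_per_rack = 24, 72, 21.0

total_it_kw = racks * kw_per_rack         # 504 kW -> the "500+ kW" cited
total_servers = racks * servers_per_rack  # 1,728 servers
room_ft2 = 50 * 100                       # 5,000 ft^2 facility
lcs_ft2 = 450                             # approximate LCS rack footprint

print(f"{total_it_kw:.0f} kW across {total_servers:,} servers")
print(f"LCS racks occupy {lcs_ft2 / room_ft2:.0%} of the floor space")
```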
TROPEC
Transformative Reductions in Operational Energy Consumption
TROPEC Objective – Allow enterprise servers and communication/network equipment to continuously operate in tropical environments with no mechanical cooling
• LiquidCool submitted a proposal for “Modular System for High-Efficiency Electronics Cooling at Expeditionary Base Camps” in September 2013
• A comprehensive assessment of the LiquidCool system has been completed at Lawrence Berkeley National Laboratory
• US Navy (PACOM) has recommended that the LiquidCool system be moved forward to the field assessment phase
• Independent testing of the TROPEC system successfully cooled an HPC server and cooling unit at an ambient temperature of 101°F for 24 hours, achieving a true PUE of 1.019
LCS Value Proposition
• Any electronics can be cooled
• There is no water near electronics
• Rack-mounted devices are hot swappable and easy to maintain
• There are no moving parts or heat exchange barriers in the IT device chassis
• There is no boiling or condensing
• The dielectric liquid is inexpensive, eco-friendly, and never needs to be replaced
• There are no microchannels to clog
• Electronics are not exposed to air pollution
• Blade and rack fans are eliminated
• There is no noise, vibration, or extreme temperature fluctuation
• Air handlers and CRAC units are eliminated
• Mechanical refrigeration is eliminated
• There is no need for humidification or dehumidification
• Raised floors and high ceilings are eliminated
• Heat is recovered in a convenient form for recycling
…and LCS cooling costs less too!
More Information
Herb Zien, CEO
herb.zien@liquidcoolsolutions.com
414-803-6010
Jay Ford, VP Sales & Marketing
jay.ford@liquidcoolsolutions.com
847-370-7296
Rick Tufty, VP Engineering
rick.tufty@liquidcoolsolutions.com
507-535-5829
Steve Einhorn, Chairman
seinhorn@capitalmidwest.com
414-453-4488