Post 4/10 in the series
Well, as we saw in Trend #3, Energy Efficiency, the worst energy consumer inside a DC is cooling, so the task is to reduce the need to chill the IT equipment. Manufacturers today deliver equipment that can work in the mid-30s °C, and ASHRAE has released new temperature and humidity parameters. So it is easy, we can just turn up the temperature, can't we?
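To see why cooling dominates this discussion, a quick sketch of PUE (Power Usage Effectiveness, total facility power divided by IT power) with made-up illustrative numbers:

```python
# Illustrative PUE arithmetic -- all kW figures below are assumptions,
# not measurements from any real facility.

it_load_kw = 1000.0        # power drawn by the IT equipment itself
cooling_kw = 450.0         # assumed cooling load (often the largest overhead)
other_overhead_kw = 150.0  # UPS losses, lighting, etc. (assumed)

total_facility_kw = it_load_kw + cooling_kw + other_overhead_kw
pue = total_facility_kw / it_load_kw
print(f"PUE = {pue:.2f}")

# If raising the cooling set point cut the cooling load by one third:
improved_pue = (it_load_kw + cooling_kw * 2 / 3 + other_overhead_kw) / it_load_kw
print(f"Improved PUE = {improved_pue:.2f}")
```

With these assumed numbers, trimming only the cooling load moves the PUE from 1.60 toward 1.45, which is why the set point question matters so much.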

Well, yes and no. Why?
Because it is true that with the NEW equipment we can set a higher set point in our computer room; however, a typical DC contains several IT equipment "generations", and the "old" ones (4-6 years) cannot perform their task at higher temperatures. So what can be done?
Well, today's technologists advise us to "segregate" that equipment into classes. How? By differentiating, by temperature and humidity, its place inside the data centre. Some facilities today are prepared to separate equipment with different temperature and humidity requirements, but every DC has its own set of characteristics, never equal to any other.
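The segregation idea can be sketched in a few lines. This is a minimal illustration, not any ASHRAE tool: the device names, temperature limits, and zone thresholds below are all hypothetical, and each site would set its own.

```python
# Hypothetical sketch: group mixed-generation IT equipment into cooling
# zones by its maximum allowable inlet temperature, loosely following the
# idea of thermal classes. All names and numbers are invented examples.
from collections import defaultdict

# Assumed inventory: (device, max allowable inlet temperature in deg C)
inventory = [
    ("legacy-san-01", 25),    # older storage, needs cooler air
    ("blade-2010-03", 27),
    ("server-2023-12", 35),   # newer gear tolerates warmer inlet air
    ("switch-2022-07", 32),
]

# Assumed zone set points for this illustration.
zones = {"cold (<=25C)": 25, "standard (<=27C)": 27, "warm (<=35C)": 35}

def assign_zone(max_inlet_c):
    """Put the device in the warmest zone it can still tolerate."""
    eligible = [(t, name) for name, t in zones.items() if t <= max_inlet_c]
    if not eligible:
        return "needs review"
    return max(eligible)[1]

layout = defaultdict(list)
for device, max_c in inventory:
    layout[assign_zone(max_c)].append(device)

for zone in sorted(layout):
    print(zone, "->", layout[zone])
```

The point of the sketch: newer equipment lands in the warm zone, where cooling is cheapest, while the old generations keep the cold rooms small.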

So the DC operator must find out whether they can separate the IT equipment into different halls or rooms, or whether they need to separate it with containerized solutions, cabinet by cabinet or row by row; these solutions can solve the challenge for most of today's DCs.

But what if you are planning a new one? You must ask yourself whether this facility will be cloud (colo/host), private, government, or...? Whatever you find out regarding your business, you will have to analyse IF you are going to buy a full set of new IT equipment (computing/storage/telecomm). If so, and that is VERY RARE, you will be able to apply almost every kind of Energy Efficiency cooling technique and technology.

However, what if you cannot afford all-new equipment (or your business is colo/host)? Then a very good team of technology experts has to be assembled BEFORE a single drawing is on the screen, because in this case your team will have to understand the newest alternatives for cooling a DC:

  • In-row cooling
  • Air-side / water-side economizers
  • Enthalpy solutions (like Tokyo Cooling or similar)
  • Passive cooling
  • Non-raised floor
  • Aisle containment (hot / cold)
  • Liquid delivered directly INTO the servers
  • Direct FREE air to the computing room
  • Cooling towers
  • etc., etc., etc.
And which ones MUST your project use: one, two, several, none of them?

This is a very INITIAL decision when you are going to design a facility, and of course it must be taken once the whole design team understands what YOU (operator/investor/project owner), the organization that will put up the money, NEED and WANT.

Why? Because every decision flows from the Operation / Availability / Business-impact requirements that you, as the DC's owner/operator, have to comply with in a certain market or industry.

Of course I like the technologies that give us sustainability and Energy Efficiency; however, there are people who DO NOT want to hear about PV solar cells or sea tides to generate energy and keep the cooling equipment working. This is like defining your home concept: with or without a chimney, a swimming pool, one floor or two.

These are human decisions. Of course we use business sense to make them, but I have witnessed many decisions made for a brand or for purely financial reasons, not technical, sustainable, or energy-efficient ones. In the mid-'80s in Mexico we had a saying: "NOBODY gets fired for buying IBM's stuff", and it really worked.

So which technology should we use in our DCs? My opinion is that the design team must evaluate every side of the equation every time, for each project and each customer. What is good for a financial organization may drive a retail one out of business.

But what is your opinion? 

roberto sanchez, RCDD


