Tuesday, August 26, 2008

DC CRACs, Liquid, Passive?

This entry is open to your comments. I have always been an infrastructure integrator and designer, so when I walk into a DC environment and the customer asks how to solve hot spots, of course I try to give the "traditional" answer (hot/cold aisles, no open RUs, seal the raised-floor openings, and so on). After new temperature and airflow measurements, most of the time it becomes a better DC environment.

However, when you are designing a brand-new critical telecom/data space such as a DC, you have already built all of these "best practices" into the design. Then you realize that other approaches are popping up in the industry and the market, such as passive cooling, liquid cooling, or simply reloaded CRAC-type cooling.

Right now, the idea of cabinets with internal cooling is quite amazing (I know the older mainframe DCs were like that), because I have to learn how to design such a solution. Another option is to switch to cabinets with a chimney-type duct; it sounds reasonable, but migrating servers from the existing cabinets to the new ones will be a challenge in most current DCs, where no "service downtime windows" are available. Hearing Liebert or APC present their proposals is quite interesting; however, will the servers installed at the top of the rack still suffer a hot spot? Well, what do you think about this?
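Before comparing any of those cooling proposals, it helps to know how much air a cabinet actually needs. A minimal sketch using the standard rule of thumb Q(BTU/hr) = 1.08 x CFM x dT(F) (the 5 kW load and 20 F rise are illustrative assumptions, not figures from any vendor):

```python
def required_airflow_cfm(heat_load_w: float, delta_t_f: float = 20.0) -> float:
    """Rule-of-thumb airflow needed to remove a cabinet's heat load.

    Uses Q(BTU/hr) = 1.08 * CFM * dT(F) and 1 W = 3.412 BTU/hr,
    which works out to roughly CFM = 3.16 * W / dT.
    """
    return 3.412 * heat_load_w / (1.08 * delta_t_f)

# A 5 kW cabinet with a 20 F air-side temperature rise:
print(round(required_airflow_cfm(5000)))  # 790
```

If the raised floor simply cannot deliver that much air to one tile position, that is the point where chimney ducts or liquid/internal cooling stop being exotic and start being the only option.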

roberto sanchez, RCDD

Saturday, August 16, 2008

Data Centre cabling options

Cabling has always been an important issue in every building, but in a Data Centre it is a decision that could force a new investment within a few years if the wrong choice is made. Let's take TIA-942 as the reference for the cabling; there you will find that the cabling architecture is quite simple (the same as the Telecommunications Rooms in a building). The key is to lay the cables that run from the rack row where the routers, switches and modems are deployed to the cabinets where the servers or SANs will sit. This cabling will remain with no MACs (moves, adds and changes) and could be copper or fibre; so if we have planned for the cabling to be useful for at least five to seven years, we MUST choose the latest generation of copper systems (Category 6A, or Class F in the ISO designation).

But which kind of Category 6A: F/UTP or plain UTP? There are a bunch of manufacturers that offer both types, and some that only have UTP. You can review the technical arguments that position one or the other as the best for the job of transmitting 10 Gbps (Fibre Channel will use this Ethernet option quite soon).

Another point to consider is the pathways. As the new SCS moves to Cat 6A or Class F, the trays or ladders have to be as wide as the raised-floor pedestal grid permits, so the elevation of that grid becomes an issue to take into consideration when you are designing the data centre cabling.

Cabling performance is the main concern when you choose an SCS. Nowadays 40 and 100 Gbps transmission equipment is knocking at the door, so the cabling should be prepared for it. Most of us might think, "When in heaven are we going to need such transfer speeds?" Well, when I started in this industry, ARCnet was the LAN protocol hitting the market (1985) and Token Ring (4 Mbps) was the new wave with IBM's PCs. So what I can figure is that 100 Gbps could catch up with us in four to five years.
If we have to design a Data Centre's infrastructure to be ready for such speeds, I suggest we look at those SCS that give us the chance to switch from one speed to a higher one within a few weeks. Look at Class F; and of course, fibre optics is "another tale"...
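Since Cat 6A cables are noticeably thicker than Cat 5e, the tray-width point above is worth a quick sanity check at design time. A minimal sketch of the arithmetic (the 0.35 in cable diameter, 4 in tray depth and 40 % fill ratio are illustrative assumptions, not values taken from TIA-942):

```python
import math

def tray_width_in(cable_count: int,
                  cable_dia_in: float = 0.35,  # typical Cat 6A O.D. (assumed)
                  tray_depth_in: float = 4.0,
                  max_fill: float = 0.40) -> float:
    """Estimate the tray width needed so the cable bundle stays
    under the chosen fill ratio (cross-sectional area basis)."""
    cable_area = cable_count * math.pi * (cable_dia_in / 2) ** 2
    return cable_area / (tray_depth_in * max_fill)

# 300 Cat 6A runs in a 4 in deep tray at 40 % fill:
print(round(tray_width_in(300), 1))  # 18.0
```

An 18 in tray for 300 runs is exactly the kind of number that collides with the pedestal grid spacing under the raised floor, which is why the grid elevation matters so early in the design.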


roberto sanchez, RCDD

Monday, August 4, 2008

DC refurbishing

One of my current activities is to review and prepare proposals for refurbishing Data Centres. Most of the time I have found that our customers do not have a very clear picture of what they want as the outcome once the project becomes a reality.

Here are five ideas to keep in mind when you start such a task:

1st. Look very carefully at the project time frame. Most projects need more than one year to be finished, from planning up to production. If you are required to do the planning in one or two months, you might as well shoot yourself in the foot. Planning a refurbishment is considerably more complex than doing the same task for a new location.
Why? Simply because you have to produce a very good inventory of the equipment that is going into the SAME space.

2nd. Doing the inventory. A lot of customers say, "We have 180 servers, 3 SANs, 25 LAN/WAN switches, 14 rout..."
But most of the time they do not have the real information updated or available (power consumption, model, brand, cards, and so on).
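One way to force that missing data to surface is to make every field mandatory in the inventory record itself, and to aggregate the one number the whole refurbishment hinges on: total power draw. A minimal sketch (the record fields and sample devices are my own illustration, not a standard):

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    brand: str
    model: str
    rack_units: int
    power_w: float  # nameplate or, better, measured draw

def total_load_kw(inventory: list[Device]) -> float:
    """Aggregate IT power draw; the cooling and UPS sizing for the
    refurbished room will hinge on this number."""
    return sum(d.power_w for d in inventory) / 1000.0

# Hypothetical sample entries:
inventory = [
    Device("srv-01", "HP", "DL380", 2, 450.0),
    Device("san-01", "EMC", "CX3", 8, 1200.0),
]
print(total_load_kw(inventory))  # 1.65
```

If the customer cannot fill in `power_w` for a device, that gap is itself a finding worth reporting before the design starts.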

3rd. Observe the current site. Is it a traditional computer or equipment room with all of the facility's structured cabling running into it? Is it placed along the perimeter fence? Does it have only one entrance? Is it located in the basement, or does it have a couple of water pipes inside the place?
All of these situations ALWAYS lead to several costly investments to change them, so keep an eye on where the site is.

4th. How long will the DC be in commission? This is a very difficult decision to make when the planning starts, because an investment of a dozen million is not made for only four or five years. The TCO has to be taken into consideration: will it be a GREEN DC? Could the space be resized in the near future, and if so, what would the cost be?
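To make that decision less abstract, it helps to put the TCO on paper early. A minimal sketch of the arithmetic (the capex, IT load, PUE, energy price and opex figures are placeholder assumptions, purely to illustrate the calculation):

```python
def simple_tco(capex: float,
               it_load_kw: float,
               pue: float,
               kwh_price: float,
               annual_opex: float,
               years: int) -> float:
    """Capex plus energy and other operating costs over the DC's life.
    Facility energy = IT load * PUE, assumed running 24x7 (8760 h/yr)."""
    annual_energy_cost = it_load_kw * pue * 8760 * kwh_price
    return capex + years * (annual_energy_cost + annual_opex)

# 12 M capex, 500 kW IT load, PUE 1.8, 0.10/kWh, 300 k/yr staff + maintenance:
ten_year = simple_tco(12e6, 500, 1.8, 0.10, 300e3, 10)
print(f"{ten_year / 1e6:.1f} M")  # 22.9 M
```

Note how, even with these made-up numbers, the energy term approaches the capex over ten years; that is why the "GREEN DC" question belongs in the planning stage and not at the end.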

5th. The design team. This is one of the most important resources to consider from the very moment the idea of refurbishing (or building a new DC) is raised. You MUST have experienced people doing what they have learned and studied; consider not only the fancy or famous names, but those who have rolled up their sleeves to carry out such jobs.

roberto sanchez, RCDD

Friday, August 1, 2008

Refurbishment of the SCJN Data Centre

Well, yesterday we finished the first stage of the refurbishment of the Data Centre of the Suprema Corte de Justicia de la Nacion, located at the "Zocalo" in downtown Mexico City. The challenge of this refurbishment was that it was carried out "live", so the demolition of the ceiling and the other dust-generating activities were complex.

The images show some aspects of the execution. This first phase included removing the false ceiling, installing new LED-based lighting, setting the steel cable tray, and arranging the cabinets in hot and cold aisles.

The second stage will be to raise the raised floor to 60 cm (currently 30 cm), enclose the processing area, brick up the windows, set a separation for the UPS and cooling, install Category 6A cabling for 10 Gbps, and finally put the false ceiling back in place.