Stay out of my server room!


This is for computers, not your reams of paper!


We always design our data centres and server rooms with the best of intentions. A nice, enclosed space with excellent ventilation and air conditioning, sufficient power points to ensure circuits are never overloaded, and enough space to get at both the front and back sides of the rack – ideally all cordoned off into its own room so nobody has to listen to the noise or be perpetually chilled to the bone.

Of course, space is at a premium in today's economy. Leases are far from cheap, and many SMBs would like to use every extra square inch of space for products or raw materials instead of spare parts for IT.

For many, having a dedicated room for their IT infrastructure is nothing more than a pipe dream. They're working right next to their servers in the IT office, or they've got them stashed away in a closet and are constantly pushing their luck with the thermal profile.

The march of progress gives us everything from new CPUs to hyperconvergence – letting us do more with less – but getting our hands on those costs money. Sometimes you've just gotta make do with what you've got on hand.

Management and IT like to butt heads over everything, and physical space is no exception. When a logistics droid takes a gander around the office, he or she is looking for idle merchandise. Anything that isn't going into something being sold or going on display had better have a good reason for being there.

Those spare hard drives you ordered to go with your servers the last time your department got a decent amount of budget? They're clearly not being used, so they've got to go. We're putting $materials on that shelf now!

Administrators spend a great deal of time doing preventative maintenance. Keeping the servers running doesn't mean putting out fires as they come; it means planning for hypothetical scenarios with the resources available. This type of work doesn't immediately present a benefit, and when the time comes to cut some chaff, perception is key.

Management droids who've never experienced the pain of an outage might not have the same respect for having the hardware on hand as you and I do, and the blame cannon is somehow never pointed at the penny-pincher who thought doing without a support contract was an acceptable risk.

Battling at the temple gates
IT operations types view the data centre as sacred ground. Woe unto those who set foot into the holy chamber of information. To the untrained eye, these rooms are nothing but a bunch of noisy boxes sitting around doing nothing. To us, they're the very lifeblood of the business.

So when our logistics droid gets tired of arguing about space and starts hauling $materials into the data centre, we get rightfully upset. Doing something as simple as plopping a box in front of a rack can be a risk.

Air flow is of critical concern in a data centre. Access to both sides of the chassis is usually needed to keep cold air moving in and hot air moving out – especially if your air conditioner is struggling with the BTU output of the servers (and the odd fleshbag walking around inside swapping cables!).

Cables aren't always neatly packed away and bundled up, either. Not all server rooms are cable porn. Some are – well... if you're reading this, you know.

In many businesses, as server rooms edge close to refresh time, years of "hair on fire" firefighting have meant no time to crimp a fresh cable to the perfect length or feed it through the conduit. Fibre is notoriously fragile. RJ45 connectors break and sit inside switch ports held only by the force of friction. When cables are strewn about the floor, walking on them or dropping boxes could easily take down a production cluster.

The icing on the cake here is that the data centre is always IT's responsibility. If someone walks into the DC and breaks something, the outrage over any resulting outage will be directed at IT. If there's a security breach (maybe the logistics droid plugged his phone into a server to charge it), IT must fix it and report back to management.

The best solution, of course, is not to let unauthorised personnel into the data centre. Unfortunately, IT doesn't always have that kind of pull. Moore's Law and the perpetually shrinking server footprint have saved me, but that only works when the IT needs of a company are relatively static.