We in the IT field (and, to some extent, the general working public) sometimes have to walk a fine line between “if” the company should do something and “how” we’re going to make that thing happen. Thinking about this takes me back to 2004 or 2005, when I worked at an auto finance company. As this was the early 2000s, there was no cloud yet; there were hosting companies and co-location facilities, but we had our own data center on-premises. The office I worked in was in California. We had an office in Texas with a few IT systems, but we didn’t have disaster recovery for our websites or any customer-facing systems set up there.

The power company told us there was going to be a 12-to-24-hour power outage for the block our building was on. A transformer needed to be replaced, and this was the window in which they were going to replace it. Since everything on the block was businesses that would be closed over the weekend, they were going to cut power from early Saturday until early Sunday.

Now, the company didn’t have a generator, and it didn’t have a concrete pad to put a generator on. But a friend of one of our C-levels owned a rental company and would let us borrow a generator if we wanted one. I’ll point out that the IT team didn’t know about any of this yet.

There was a meeting where the IT team got together and planned a graceful shutdown of our data center. We figured out the order to shut things down in: app servers, web servers, domain controllers, SANs, and network switches, along with which doors needed to be propped open, since the servers that ran things like the badge readers on the doors would be offline. As the meeting was wrapping up, our director asked, “How hard would it be to just connect a generator and keep everything online?” When the company had moved in years prior, someone had the foresight to install a power transfer switch on the main power line so that a generator could be connected and switched over to, taking the building off of city power. So the hookup itself was going to be easy.

Our first question to the building’s facilities team was how the air conditioners in our data center were powered. Were they connected to our office power or to the data center power (which was behind the transfer switch I mentioned above)? The answer was office power, so they weren’t on the transfer switch, and that wasn’t something that could be easily changed in the two days we had, if at all. That left us with a big problem: our data center was on the 4th floor of a 6-story building (we leased floors 1, 2, 3, 4, and 6), and we were now talking about running it without air conditioning for 12-24 hours. That wasn’t going to work, as the equipment would shut down at some point because of the heat.

At this point, everyone was pointing out all the problems with this plan, specifically the heat issue, and arguing that it would be much safer for the equipment to shut everything off for the weekend and have people come in late Saturday or early Sunday to power everything back up. We were still in the “if” part of the decision-making process: should we turn off the data center or not? Eventually, our boss got a text from his boss and told us the bad news: the C-suite had decided that we were keeping the data center on, and we had to figure out how to make it happen.

This brings us from the “if” part of the decision-making process to the “how” part. In this case: how the hell were we going to make this happen? Thankfully, everyone on the IT team (there were nine of us) was able to switch modes pretty quickly. Management had made a decision, a horribly bad decision, but we were told this was the decision, and we had to figure it out.

As we tried to figure out how to keep the data center cool, we discussed a few ideas, from kicking out the windows (don’t forget we were on the 4th floor) to directing the heat from our floor up to the 6th floor through one stairwell while drawing fresh air in through another stairwell or through the elevator shafts.

We decided to go with the send-the-heat-up-the-stairs plan, mostly because there was rain in the forecast and we didn’t know how long it would take the building to get broken windows replaced. We sent our NOC manager to the hardware store for all the duct tape and plastic sheeting he could find, and we spent Friday sealing off the stairwells with plastic so the hot air wouldn’t come up the wrong staircase and would instead go straight up to the 6th floor. The one thing we had going for us was that on the 6th floor, in each corner of the building, there was a patio with big sliding doors connecting it to the offices on either side. We opened those doors so the hot air would vent out of the building.

The big downside to our plan was that the data center had to vent through the door between the IT team’s desks and the data center itself, so it was going to get hot by our desks. We were guessing low 90s Fahrenheit (around 32 Celsius). The saving grace was that the NOC had a small AC unit connected to the data center’s power, and it had a TV, so we at least had something to do while we waited.

The important part of the entire exercise was that, as a team, we could switch from “Should the company be doing this?” to “How are we going to make this happen?” once our bosses had decided that we would do it.

Did we agree with their decision? Oh God, no. The risks were way too high. But we rallied and made management’s poor decision happen.

Denny

