Insider Redux: Data Barn in a Farm Town

I thought I would start my first post by addressing the second New York Times article first. Why? Because it specifically mentions activities and messages sourced from me from the time when I was responsible for running the Microsoft data center program. I will try to track the timeline mentioned in the article against my specific recollections of the events. As Paul Harvey used to say, so you can know 'the REST of the STORY.'

I remember my first visit to Quincy, Washington. It was a bit of a road trip for me and a few other key members of the Microsoft site selection team. We had visited a few of the local communities and public utility districts, doing our due diligence on the area at large. Our 'heat map' process had led us to Eastern Washington state, not very far (just a few hours) from the 'mothership' of Redmond, Washington. It was a bit of a crow-eating exercise for me, as just a few weeks earlier I had proudly exclaimed that our next facility would not be located on the West Coast of the United States. We were developing an interesting site selection model that would categorize and weight areas around the world. It took in FEMA disaster data, fault zones, airport and logistics information, location of fiber-optic and carrier presence, workforce distributions, regulatory and tax data, water sources, and power. This was going to be the first real construction effort undertaken by Microsoft. The cost of power was definitely a factor, as the article calls out, but just as important was the generation mix of the power in the area: in this case, a predominance of hydroelectric, with a low-to-no carbon footprint (rivers, it turns out, actually give off carbon emissions, as I came to find out). Regardless, the generation mix was and would continue to be a hallmark of the program's site selection while I was there. The crow-eating began when we realized that the 'greenest' area per our methodology was actually located in Eastern Washington along the Columbia River.
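To make the 'heat map' idea concrete, here is a minimal sketch of how such a weighted site-selection score could work. The factor names, weights, and Quincy values below are purely illustrative assumptions of mine, not the actual criteria or weightings used in the Microsoft program.

```python
# Illustrative sketch of a weighted site-selection "heat map" score.
# Factor names, weights, and values are hypothetical examples only,
# NOT the real Microsoft site-selection model.

FACTOR_WEIGHTS = {
    "power_cost": 0.20,         # normalized $/kWh attractiveness
    "generation_mix": 0.20,     # share of low-carbon generation
    "disaster_risk": 0.15,      # FEMA / fault-zone exposure
    "fiber_presence": 0.15,     # carrier and fiber-optic density
    "workforce": 0.10,          # available workforce distribution
    "tax_climate": 0.10,        # regulatory and tax friendliness
    "water_availability": 0.10, # water sources for cooling
}

def site_score(factors: dict[str, float]) -> float:
    """Each factor is pre-normalized to 0..1 (1 = best). Returns a 0..1 composite."""
    return sum(FACTOR_WEIGHTS[name] * value for name, value in factors.items())

# Hypothetical scoring for a hydro-rich Eastern Washington site.
quincy = {
    "power_cost": 0.95, "generation_mix": 0.98, "disaster_risk": 0.80,
    "fiber_presence": 0.70, "workforce": 0.60, "tax_climate": 0.75,
    "water_availability": 0.85,
}
print(f"Composite score: {site_score(quincy):.3f}")  # prints "Composite score: 0.831"
```

A real model would of course normalize raw data (rate schedules, FEMA datasets, carrier maps) into those 0..1 inputs, and the weighting itself is where the proprietary judgment lives.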

We had a series of meetings with real estate folks, the local Grant County PUD, and the economic development folks of the area. Back in those days the secrecy around who we were was paramount, so we kept our identities and that of our company secret, like geeky secret agents on an information-gathering mission. We would not answer questions about where we were from, who we were, or even our names. We 'hid' behind third-party agents who took everyone's contact information and acted as brokers of information. That was early days. The cloak and dagger would soon be dropped from the process, as being known became a more advantageous tool in tax negotiations with local and state governments.

During that trip we found the perfect parcel of land: 75 acres with great proximity to local substations, just down the line from the dams on the nearby Columbia River. It was November 2005. As we left that day and headed back, it was clear that we felt we had found site selection gold. As we started to prepare a purchase offer, we got wind that Yahoo! was planning a trip out to the area as well. Since the local folks seemingly thought we were a bank or large financial institution, they wanted to let us know that someone on the Internet was interested in the area too. This acted like a lightning rod, and we raced back to the area and locked up the land before Yahoo had a chance to leave the Bay Area. In those early days the competition was fierce. I have tons of interesting tales of cloak-and-dagger intrigue between Google, Microsoft, and Yahoo. While it was work, there was definitely an air of something big on the horizon, a sense that we were all at the beginning of something. In many ways the technology professionals involved, regardless of company, forged deep relationships and rivalries with each other.

Manos on the bean field, December 2005

The article talks about how the 'gee-whiz moment faded pretty fast.' While I am sure that it faded in time (as all things do), I also recall the huge increase in local business as thousands of construction workers descended upon this wonderful little town, the tours we would give local folks and city council dignitaries, and a spirit of truly working together. Then of course there was the ultimate reduction in property taxes resulting from even our first building, and an increase in home values to boot. It's an oft-missed benefit that I am sure the town of Quincy and Grant County have continued to enjoy as the data center cluster added Yahoo, Sabey, IAC, and others. I warmly remember the opening-day ceremonies and ribbon cutting, and a sense of pride that we did something good. Corny? Probably, but that was the feeling. There was no talk of generators. There were no picket signs; in fact, the Washington State Department of Ecology had no idea how to deal with a facility of this size, and I remember openly working in partnership with them. That of course eventually gave way to the realities of life. We had a business to run, the city moved on, and concerns eventually arose.

The article calls out a showdown between Microsoft and the public utility district (PUD) over a fine for missing a capacity forecasting target. As this happened well after I left the company, I cannot really comment on that specific matter. But I can see how that forecast could miss. Projecting power usage months ahead is more than a bit of science mixed with art. It gets into the complexity of understanding capacity planning in your data centers. How big will certain projects grow? Will they meet expectations or fall short? New product launches can be duds or massive successes. All of these things go into a model to try to forecast the growth. If you think this is easy, I would submit that NO ONE in the industry has mastered the crystal ball. I would also submit that most small companies haven't been able to figure it out either. At least at companies like Microsoft, Google, and others you can start using the law of large numbers to get close. But you will always miss, either too high or too low. Guess too low and you impact internal budgeting figures and run rates. Not good. Guess too high and you could fall short of minimum contracts with utility companies and be subject to fines.
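The flavor of that forecasting problem can be sketched with a toy probability-weighted model. The project names, probabilities, and megawatt figures below are invented for illustration; a real capacity model blends far more signals than this.

```python
# Hypothetical sketch of probability-weighted demand forecasting for a
# data center. All names and numbers are invented for illustration.

projects = [
    # (name, new load in MW if the project ships, estimated probability it ships)
    ("search-expansion", 4.0, 0.9),
    ("new-product-launch", 6.0, 0.4),  # could be a dud or a massive success
    ("storage-migration", 2.5, 0.7),
]

base_load_mw = 20.0  # load already committed and running

# Expected value: the single number you'd hand a utility forecast.
expected = base_load_mw + sum(mw * p for _, mw, p in projects)

# Bracketing cases show how wide the honest uncertainty band is.
low = base_load_mw                                      # every project slips
high = base_load_mw + sum(mw for _, mw, _ in projects)  # everything lands

print(f"forecast: {expected:.2f} MW (range {low:.1f}-{high:.1f} MW)")
```

The gap between `low` and `high` is exactly why you "always miss": the utility wants one number, but the truthful answer is a distribution, and which tail you land in determines whether you blow the budget or blow the minimum-take contract.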

In the case mentioned in the article, the approach taken, if true, would not be the smartest method, especially given the monthly electric bill for these facilities. It's a cost of doing business and largely inconsequential at the level of consumption these buildings draw. Again, if true, it was a PR nightmare waiting to happen.

At this point the article breaks out and talks about how the Microsoft experience would feel more like dealing with old-school manufacturing rather than ‘modern magic’ and diverts to a situation at a Microsoft facility in Santa Clara, California.

The article references that this situation is still being dealt with inside California, so I will not go into any detailed specifics, but I can tell you something does not smell right in the state of Denmark, and I don't mean the diesel fumes. Microsoft purchased that facility from another company. As usage of the facility ramped up to the levels it was certified to operate at, operators noticed a pretty serious issue developing. While the building was rated to run at a certain load, it was clear that the underground feeders were undersized, and the by-product could have polluted the soil and gotten into the water system. This was an inherited problem, and Microsoft did the right thing and took the high road to remedy it. It is my recollection that all sides were clearly in the know about the risks and agreed to the generator usage whenever needed while the larger issue was fixed. If this has come up as an 'air quality issue,' I would personally guess that there are politics at play. I'm not trying to be an apologist, but if true, it goes to show that no good deed goes unpunished.

At this point the article cuts back to Quincy. It's a great town, with great people. To some degree it won the Internet jackpot lottery because of the natural tech resources it sits on. I thought the figures quoted around taxes were an interesting component missed in much of the reporting I read.

“Quincy’s revenue from property taxes, which data centers do pay, has risen from $815,250 in 2005 to a projected $3.6 million this year, paying for a library and repaved streets, among other benefits, according to Tim Snead, the city administrator.”

As I mentioned in yesterday's post, my job is ultimately to get things done and deliver results. When you are in charge of a capital program as large as Microsoft's was at the time, your mission is clear: deliver the capacity and start generating value for the company. As I was presented the last crop of beans harvested from the field at the ceremony, we still had some way to go before all construction and capacity were ready. One of the key missing components was the delivery and installation of a transformer for one of the substations required to bring the facility up to full service. The article notes that I was upset that the PUD was slow to deliver the capacity. Capacity, I would add, that was promised along a certain set of timelines; commitments were made, and money was exchanged based upon those commitments. As you can see from the article, the money exchanged was not insignificant. If Mr. Culbertson felt that I was a bit arrogant in demanding follow-through on promises and commitments after monies and investments were made in a spirit of true partnership, my response would be 'Welcome to the real world.' As far as being cooperative: by April, construction had already progressed 15 months since its start. Hardly a surprise, and if it was, perhaps the 11-acre building and large construction machinery driving around town could have been a clue to the sincerity of the investment and timelines. Harsh? Maybe. Have you ever built a house? If so, then you know you need to make sure the process is tightly managed and controlled to ensure you make the delivery date.

The article then goes on to talk about the permitting for the diesel generators. By the Department of Ecology's own admission, "At the time, we were in scramble mode to permit our first one of these data centers." Additionally, it also states that:

Although emissions containing diesel particulates are an environmental threat, they were not yet classified as toxic pollutants in Washington. The original permit did not impose stringent limits, allowing Microsoft to operate its generators for a combined total of more than 6,000 hours a year for "emergency backup electrical power" or unspecified "maintenance purposes."

At the time all this stuff was so new that everyone was learning together. I simply don't buy that this was some kind of Big Corporation versus Little Farmer thing. I cannot comment on the events of 2010, when Microsoft asked to be disconnected from the grid. Honestly, that makes no sense to me even if the PUD was working on the substation, and on that point I would agree with the article's 'experts.'

Well, that's my take on my recollection of events during those early days of the Quincy build-out as it relates to the articles. Maybe someday I will write a book, as the process and adventures of those early days of the birth of Big Infrastructure were certainly exciting. The bottom line is that the data center industry is amazingly complex, and the forces in play are as varied as technology, politics, and people, and everything in between. There is always a deeper story. More than meets the eye. More variables. Decisions are never black and white and are always weighed against a dizzying array of forces.

\Mm

Sites and Sounds of DataCentre2012: My Presentation, Day 2, and Final Observations


Today marked the closing sessions of DataCentres2012 and my keynote to the attendees. After sitting through a series of product, technology, and industry trend presentations over the last two days, I felt that my talk would at the very least be something different. Before I get to that, I wanted to share some observations from the morning.

It all began with an interesting run-down of data center and infrastructure industry trends across Europe from Steve Wallage of The BroadGroup. It contained some really compelling information and highlighted some interesting divergences between the European and US markets in terms of infrastructure adoption and trends. It looks like they have a way for those interested to get their hands on the detailed data (for purchase). The part I found particularly interesting was the significant slowdown of the wholesale data center market across Europe while colocation providers continued to do well. Additionally, the percentage changes within those providers' customer bases by category were compelling and demonstrated a fundamental shift of content-related customers across the board.

This presentation was followed by a panel of European thought leaders made up mostly of those same colocation providers. Given the presentation by Wallage, I was expecting some interesting data points to emerge. While there was a range of ideas and perspectives represented by the panel, I have to say it really got me worked up, and not in a good way. In many ways I felt the responses from many (not all) on the panel highlighted a continued resistance to change in thinking around everything from efficiency to technology approach. It represented the things I despise most about our industry at large: the slow adoption of change, the warm embrace of the familiar, the outright resistance to new ideas. At one point, a woman in the front row, whom I believe was from Germany, got up to ask whether the panelists had any plans to move their facilities outside of the major metros. She referenced Christian Belady's presentation around the idea of data as energy and remote locations like Quincy, Washington, or Lulea, Sweden, and referred to the overall approach and different thinking as quite visionary. Now, the panel could easily have noted that companies like Microsoft, Google, and Facebook have much greater software-level control than a colo provider could offer. Or perhaps they could have pointed out that most of their customers are limited by distance to existing infrastructure deployments due to inefficiencies in commercial or custom internally deployed applications: databases architected for in-rack or in-facility response times. They did at least mention that most customers tend to be server huggers and want their infrastructure close by.

Instead the initial response was, to my mind, quite strange. It was to attack the characterization of these ideas as "innovative" and to imply that nothing was really innovative about what Microsoft had done, and that the fact that they built a "mega data center" in Dublin shows nothing innovative is really happening. Really? The adoption of 100% air-side economization is something everyone does? The deployment of containerized compute capacity is run of the mill? The concept of the industrialization of compute is old hat? I had to do a mental double take and question whether they were even listening during ANY of the earlier sessions. Don't get me wrong, I am not trying to be an apologist for the Microsoft program; in fact, there are some tenets of the program I find myself not in agreement with. However, you cannot deny that they are doing VERY different things. It illustrated an interesting undercurrent I felt during the entire event (and maybe even across our industry): a growing gap between users' requirements, forward roadmaps, and desires, and what manufacturers and service providers are delivering. This panel, and a previous panel on modularization, highlighted these gulfs pretty demonstrably. At a minimum, I definitely walked away with an interesting new perspective on some of the companies represented.

It was then time for me to give my talk. Every discussion up until this point had focused on technology or industry trends. I was going to talk about something else, something more important, the one thing seemingly missing from the entire event: the people attending. All the technology in the world and all the understanding of industry trends are nothing unless the people in the room are willing to act, willing to step up and take active roles in their companies to drive strategy. As I have said before: to get out of the basement and into the penthouse. The pressures on our industry and our job roles have never been more complicated. So I walked through regulations, technologies, and cloud discussions, using the work we did at AOL as a backdrop and example to drive my main point: that our industry, specifically the people doing all the work, is moving to a role of managing a complex portfolio of technologies, contracts, and a continuum of solutions. Gone are the days when we could hide sheltered in our data center facilities. We need to evolve past our resistance to change, or the industry will evolve around us. I walked through specific examples of how AOL has had to broaden its own perspective and approach to this widening view of our work roles at all levels. I even pre-announced something we are calling Data Center Independence Day: an aggressive adoption of modularized compute capacity, which we call MicroData Centers, to help solve many of the issues we are facing as a business, along with the rough business case for why it makes sense for us to move to this model. I will speak more of that in the weeks to come with a greater degree of specificity, but I stressed again the need for a wider perspective, managing a large portfolio of technologies and approaches, to be successful in the future.

In closing, the event was fantastic. The opportunity it provides to network with leaders and professionals across the industry was first rate. If I had any real constructive feedback, it would be to either lengthen sessions or reduce panel sizes to encourage more active and lively conversations. Or both!

Perhaps, at the end of the day, it's truly the best measure of a good conference if you walk away wishing more time could be spent on the topics. As for me, I am headed back Stateside to dig into the challenges of my day job. To the wonderful host city of Nice, I say adieu!


\Mm

Site Selection, Data Center Clustering, and Their Interaction

I have written many times on the importance of site selection for data centers, an importance that only grows when one considers the regulatory and legislative efforts underway globally. Those who make their living in this space know that these efforts are going to have a significant impact on the future landscape of these electronic bit factories. The ongoing long-term operational costs over the life of the facility, the use of natural resources (such as power), and what the facilities house and protect (PII, or Personally Identifiable Information) are even now significantly impacting this process for many large global firms, and this is making its way into the real estate community. It is requiring a series of crash courses in information security, power regulation and rate structures, and other complex issues for many in that community.

In speaking to a number of friends on the real estate side of the business, I thought it might be interesting to take a few of these standard criteria head-on in an open discussion. For this post I will take on two of the elemental ones in data center site selection: one major item, and one currently considered minor factor that is quickly rising in overall importance. Namely power and water, respectively.

Watts the Big Deal?

Many think that power cost alone is the primary driver for data centers, and while it is always a factor, there are many other facets that come into play under that broader category of power. While site selection factors are always considered highly confidential, I thought I might highlight some of the wider arcs in this category.

One such category getting quite a bit of attention is the power generation mix. The generation mix is important because it describes the energy sources from which an area or region gets its power. Despite what politicians would lead you to believe, once an electron is on the grid it is impossible to tell from which source it came. So 'green energy,' in its multitude of definitions, is primarily determined by the mix of energy sources for a given region. A windmill, for example, does not generate an electron with a tiny label saying it is sourced from 'green' or 'renewable' sources. Understanding the generation mix of your power will allow you to forecast and predict the potential carbon output of your data center. The Environmental Protection Agency in the US produces a metric called the Carbon Emission Factor, based upon the generation mix of the areas you are looking to select in, which can be applied to your consumption to calculate your carbon output. Whether you are leasing or building your own facility, you will likely find yourself facing mandatory compliance reporting for this kind of thing.
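The arithmetic of applying an emission factor to consumption is straightforward, as the sketch below shows. The factor values here are illustrative round numbers of my own, not official EPA figures; look up the published factor for your actual region before reporting anything.

```python
# Sketch: carbon output = consumption x regional emission factor.
# The factors below are illustrative assumptions, NOT official EPA values.

def annual_co2_tonnes(annual_kwh: float, factor_kg_per_kwh: float) -> float:
    """Convert annual consumption and a regional emission factor to metric tonnes of CO2."""
    return annual_kwh * factor_kg_per_kwh / 1000.0  # kg -> metric tonnes

facility_kwh = 10_000_000  # a hypothetical facility drawing 10 GWh/year

hydro_heavy_factor = 0.05  # kg CO2 per kWh, hydro-dominated mix (illustrative)
coal_heavy_factor = 0.90   # kg CO2 per kWh, coal-dominated mix (illustrative)

print(f"hydro-heavy region: {annual_co2_tonnes(facility_kwh, hydro_heavy_factor):,.0f} t CO2/yr")
print(f"coal-heavy region:  {annual_co2_tonnes(facility_kwh, coal_heavy_factor):,.0f} t CO2/yr")
```

The same building, drawing the same load, can differ by an order of magnitude in reportable carbon purely because of where it sits, which is exactly why generation mix belongs in site selection.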

So you might be thinking, 'Great, I just need to find the areas that have cheap power and a good Carbon Emission Factor, right?' The answer is no. Many site selection processes that I see emerging in the generic space start and stop right at this line. I would, however, advocate taking the next logical step: looking at the relationship of these factors together and over a long period of time.

The generation mix has long been considered a 'forever' kind of thing: the generation sources within a region have rarely changed over time. But that is of course changing significantly in the new era we live in.

Let's take the interplay (both historical and going forward) of power cost and its relationship with the generation mix. As humans we like to think in simplistic terms: power costs for a specific region are 'so many cents per kilowatt-hour,' varying by whether you are measured on a residential, commercial, or industrial rate schedule. The rate schedule is a function of how much power you ultimately consume, or promise to consume, from the local utility. The reality of course is much more complicated. Power rates fluctuate constantly based upon the overall mix, and natural disasters, regulation, and the like can have a significant impact on power cost over time. Therefore it's generally wise to look at generation mix price volatility through the longer-term lens of history and see how a region's power costs oscillate through these types of events. However you decide to capture or benchmark it, it is a factor that should be considered.

This is especially true when you take this volatility factor and apply it to the changing requirements of carbon reporting and its impacts. While the United States is unlikely to adopt a law similar to the UK's CRC (Carbon Reduction Commitment), it will see legislation and regulation impacting the energy producers.

You might be asking yourself, 'Who cares if they go after those big bad energy companies and force them to put more green power in their mixes?' Well, let's think about the consequences of those actions for you, the user, and why they matter to your site selection activity.

As the energy producers are regulated to bring a 'greener' mix into their systems, two things will happen. The first, of course, is that rates will rise. The energy producers will need to sink large amounts of capital into these technologies, plants, and research and development to come into alignment with the legal requirements being imposed on them. This effect will be uneven, as many areas around the globe have quite a disparate mix of energy from region to region. It will also mean that 'greener' power will likely be 'more expensive' power. Assessing an area for its exposure to these kinds of changes is definitely important in a data center build scenario, as you likely want to ensure that your facility has the longest possible life, which could span a couple of decades. The second thing, which may be a bit harder to guess at, is which technology a given region is likely to pick and its resulting carbon impact. While I have a definite approach to thinking through such things, this is essentially the beginning of the true secret sauce of site selection expertise, and the help you may require if you don't have an internal group to work through this kind of data and modeling. This is going to have an interesting impact on the 'clustering' effect that happens in our industry at large.

We have seen many examples, like Quincy, Washington, and San Antonio, Texas, where the site selection process has led many data center providers to locate in the same area to benefit from this type of analysis (even if not directly exposed to the criteria). There is a story (I don't know whether it's true) that in the early days, when a new burger chain was looking to expand, it used the footprint of its main competitor as its guide for placing restaurants. The thinking was that the competitor probably had a very scientific method for site selection, and they would receive the same ancillary benefit without the cost and effort. Again, not sure if that is true, but it's definitely the kind of thing likely to happen in our industry.

In many markets these types of locations are in high demand. Ascent Corporation, out of St. Louis, is in the process of building a modern facility just down the street from the Microsoft mega-facility near Chicago. While Ascent was a part of the original Microsoft effort to build at that location, there has been an uptick in interest in being close to that facility for the same reasons I have outlined here. The result is that their CH2 facility is literally a stone's throw from the Microsoft behemoth. The reasons? Proximity to power, fiber, and improved water infrastructure, all already there in abundance. The facility even boasts container capabilities, just like its neighbor. The Elmhurst electrical substation sits directly across the highway from the facility, with the first set of transmission poles within easy striking distance.

Elmhurst Electrical Yard

The generation mix of that area has a large nuclear component, which has little to no carbon impact and lends long-term stability to power costs. According to Phil Horstmann, President of Ascent, there is tremendous interest in the site, and one of the key draws is the proximity of its neighbor. In the words of one potential tenant: 'It's like the decision to go with IBM in the 80s. It's hard to argue against a location where Microsoft or Google has placed one of its facilities.'

This essentially dictates that there will be increasing demand for areas where this analysis has been done, or is perceived to have been done. This is especially true where colocation and hosting providers can align their interests with commercial locations where there is market demand. While those that follow first movers will definitely benefit from these decisions (especially those without dedicated facility requirements), first movers continue to have a significant advantage if they can get this process right.

Tying into the power conversation is water. With the significant drive for economization (whether water-based or air-based), water continues to be a factor. What many people don't understand is that in many markets the discharge water is too clean to dump into the sewage system and too 'dirty' to discharge to retention ponds. This causes all kinds of potential issues, and understanding the underlying water landscape is important. The size of the metropolitan sewage systems, the ability to dig your own wells, the local water table and aquifer issues, your intended load and resulting water requirements, and how the local county, municipality, or region views discharge in general (which chemicals, and in what quantities) are all important to think about today. As water comes under increasing environmental scrutiny, it is quickly rising on the site selection radar of many operators and those with long-term holds.

I hope this brief talk was helpful. I hope to post on a few other key factors, along with a more general discussion, in the near future.

\Mm