A Well-Deserved Congratulations on the Microsoft Dublin DC Launch

Today Microsoft announced the launch of its flagship data center facility in Dublin, Ireland.  This is a huge achievement in many ways and from many angles.  While some will try to compare this facility to other ‘chiller-less’ facilities, I can assure you it is unique in many ways.  But that is a story for others to tell over time.

I wanted to personally congratulate the teams responsible for delivering this marvel and acknowledge the incredible amount of design, engineering, and construction work that went into making it a reality.  To Arne and the rest of my old team at Microsoft in DCS – Way to go!

\Mm

PS – I bet there is much crying and gnashing of teeth as the unofficial limerick collection now comes to a close.  But here is a final one from me:

 

A Data Centre from a charming green field did grow,

With energy and server lights did it glow

Through the lifting morning fog,

An electrical Tir Na Nog,

To its valiant team – Way to Go!

Modular Evolution, Uptime Resolution, and Software Revolution

It’s a little-known fact, but software developers are costing enterprises millions of dollars, and I don’t think either party realizes it in many cases.  I am not referring to the actual purchase cost of the programs and applications, or even the resulting support costs.  Those are easily calculated and can be hard-bounded by budgets.  But what of the resulting costs of the facility in which that software resides?

The Tier System introduced by the Uptime Institute was an important step for our industry in that it gave us a common language, or nomenclature, with which to begin a real dialog on the characteristics of the facilities being built.  It created formal definitions and classifications from a technical perspective, grouped redundancy and resiliency targets, and ultimately defined a hierarchy in which to talk about the facilities designed to those targets.  For its time it was revolutionary, and to a large degree the body of work is still relevant today.

There is a lot of criticism that its relevancy is fading fast due to the model’s greatest weakness: its lack of any significant treatment of the application.  The basic premise of the Tier System is essentially to take your most restrictive and constrained application requirements (i.e. the application that is least robust) and augment that resiliency with infrastructure and what I call big iron.  If only 5% of your applications are that restrictive, the other 95% of your applications, which might be able to live with less resiliency, will still reside in the castle built for the minority of needs.  But before you call out an indictment of the Uptime Institute or this “most restrictive” design approach, you must first look at your own organization.  The Uptime Institute was coming at this from a purely facilities perspective.  The mysterious workload and wizardry of the application is a world mostly foreign to them.  Ask yourself this question – ‘In my organization, how often do IT and facilities talk to one another about end-to-end requirements?’  My guess, based on asking this question hundreds of times of customers and colleagues, is that the answer ranges from not often to not at all.  But the winds of change are starting to blow.
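
To put some rough numbers behind that 5%/95% point, here is a back-of-the-envelope sketch.  The facility size and build costs per kW are hypothetical figures chosen only to illustrate the arithmetic, not industry benchmarks.

```python
# Hypothetical illustration of "design to the most restrictive application."
# Cost-per-kW build figures below are made up purely for the arithmetic.
COST_PER_KW = {"Tier II": 10_000, "Tier IV": 22_000}  # illustrative USD per kW of critical load

total_load_kw = 1_000          # assumed total critical IT load
restrictive_share = 0.05       # the 5% of applications that truly need Tier IV resiliency

# Option A: build the entire facility to the most restrictive requirement
cost_all_tier4 = total_load_kw * COST_PER_KW["Tier IV"]

# Option B: right-size -- Tier IV capacity for the 5%, Tier II for everything else
cost_mixed = (total_load_kw * restrictive_share * COST_PER_KW["Tier IV"]
              + total_load_kw * (1 - restrictive_share) * COST_PER_KW["Tier II"])

print(f"All Tier IV:              ${cost_all_tier4:,.0f}")
print(f"Mixed tiers:              ${cost_mixed:,.0f}")
print(f"Premium paid for the 95%: ${cost_all_tier4 - cost_mixed:,.0f}")
```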

In fact, I think the general assault on the Tier System really represents a maturing of the industry, a willingness to look at our problem space with more combined wisdom.  I often laughed at the fact that human nature (or at least management human nature) used to hold that a Tier 4 data center was better than a Tier 2 data center, effectively because the number was higher and it was built with more redundancy.  More redundancy essentially equaled a better facility.  A company might not have had the need for that level of physical systems redundancy (if one were to look at it from an application perspective), but Tier 4 was better than Tier 3, therefore we should build the best.  It’s not better, just different.

By the way, that’s not a myth the design firms and construction firms were all that interested in dispelling, either.  Besides having the higher number and more redundancy, Tier 4 also cost more to build, required significantly more engineering, and took longer to work out the kinks.  So the myth of Tier 4 being the best has propagated for quite a long time.  I’ll say it again: it’s not better, it’s just different.

One of the benefits of the recent economic downturn (there are not many, I know) is that the definition of ‘better’ is starting to change.  With capital budgets frozen or shrinking, the willingness of enterprises to re-define ‘better’ is also changing significantly.  Better today means a smarter, more economical approach.  This has given rise to the boom in the modular data center approach, and it’s not surprising that this approach begins with what I call an application-level inventory.

This application-level inventory looks first and specifically at the makeup and resiliency of the software and applications within the data center environment.  Does this application need the level of physical fault tolerance that my enterprise CRM needs?  Do servers that support testing or internal labs need the same level of redundancy?  This is the right behavior, and the one I would argue should have been used from the beginning.  The data center doesn’t drive the software; it’s the software that drives the data center.
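
By way of a sketch, such an inventory can be as simple as tagging each workload with the resiliency it actually needs and summing the capacity required at each level.  The application names and load figures below are hypothetical, purely for illustration.

```python
from collections import defaultdict

# Hypothetical inventory: (application, required resiliency level, IT load in kW).
# Names and figures are invented for illustration, not from any real environment.
inventory = [
    ("Enterprise CRM",      "Tier IV", 120),
    ("Financial reporting", "Tier IV", 80),
    ("Intranet portal",     "Tier II", 60),
    ("Test / lab servers",  "Tier I",  200),
    ("Batch analytics",     "Tier II", 150),
]

# Sum the load that actually needs each resiliency level.
capacity_by_tier = defaultdict(int)
for app, tier, load_kw in inventory:
    capacity_by_tier[tier] += load_kw

# The result tells you how much modular capacity to build at each level,
# rather than building everything to the most restrictive requirement.
for tier, load_kw in sorted(capacity_by_tier.items()):
    print(f"{tier}: {load_kw} kW")
```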

One interesting and welcome side effect of this is that enterprise firms are now pushing harder on the software development firms.  They are beginning to ask some very interesting questions that the software providers have never been asked before.  For example, I sat in one meeting where an end customer asked their financial systems application provider a series of questions about the inter-server latency requirements and transaction timeout lengths for database access across their solution suite.  The reason behind this line of questioning was a setup for the next series of questions.  Once the numbers were provided, it became abundantly clear that this application would only truly work from one location, from one data center, and could not be made redundant across multiple facilities.  This led to questions about the provider’s intentions to build more geo-diverse, extra-facility capabilities into their product.  I am now even seeing these questions in official Requests for Information (RFIs) and Requests for Proposal (RFPs).  The market is maturing and is starting to ask an important question – why should your sub-million dollar (euro) software application drive tens of millions in capital investment by me?  Why aren’t you architecting your software to solve this issue?  The power of software can be brought to bear to solve this issue, and my money is on this becoming a real battlefield in software development in the coming years.
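
As a rough illustration of why those latency and timeout numbers matter, here is a small feasibility check.  The site distance, timeout budget, and fiber propagation figures are assumptions made up for the example, not numbers from that meeting or from any vendor.

```python
# Rough feasibility check: can a synchronous database call span two facilities?
# All figures below are assumptions for illustration, not vendor specifications.
FIBER_KM_PER_MS = 200                  # light in fiber covers roughly 200 km per millisecond

site_distance_km = 800                 # hypothetical distance between the two data centers
round_trip_ms = 2 * site_distance_km / FIBER_KM_PER_MS   # ~8 ms, best case, ignoring equipment delay

app_timeout_ms = 5                     # hypothetical inter-server / database timeout budget
calls_per_transaction = 1              # even a single synchronous round trip per transaction

if round_trip_ms * calls_per_transaction > app_timeout_ms:
    print("Cannot split across sites: "
          f"{round_trip_ms:.1f} ms round trip exceeds the {app_timeout_ms} ms budget")
else:
    print("Geo-diverse deployment is at least latency-feasible")
```

Real applications typically make many such round trips per transaction, which only amplifies the constraint, and is exactly why the answers pinned the application to a single facility.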

Blending software expertise with operational and facility knowledge will, in my opinion, be at the center of a whole new line of software development, one that really doesn’t exist today.  Given the dollar amounts involved, I believe it will be a very impactful and fruitful line of development as well.  But it has a long way to go.  Most programmers coming out of universities today rarely question the impact of their code outside of the functions they are providing, and the number of colleges and universities that teach a holistic approach can be counted on less than one hand’s worth of fingers worldwide.  But that’s up a finger or two from last year, so I am hopeful.

Regardless, while work will continue on data center technologies at the physical layer, there is a looming body of work facing the development community that has yet to be tackled.  Companies like Oracle, Microsoft, SAP, and a host of others will be thrust into the fray to solve these issues as well.  If they fail to adapt to the changing face and economics of the data center, they may just find themselves as an interesting footnote in the data center texts of the future.

 

\Mm

Must-Have Swag…

I try not to post much business-related stuff (a la Digital Realty Trust) on Loosebolts, as it’s my own place to rant and rave.  To be clear: none of the things I say on here represent the views of the company whatsoever.  But sometimes there are things that come along that really make me smile, and I have to comment on them.

As you know, I am a huge fan of modularization in the data center.  Modularization in construction, modularization in operation: modularization is just all-around goodness, from the technical perspective through the business side of things.  That’s why the newest marketing campaign from Digital has me smiling ear to ear.  The new Data Center Construction kit brings back memories from when I was a kid and built giant structures for my little people to generally live, die, and party in.  It was of course a modular approach that led to endless hours of fun and imagination.  Applying those fond remembrances of youth and combining them with the modular data center movement and a bit of general fun will make this the MUST-HAVE piece of swag in the industry.  Data Center Knowledge posted a video about the toys a few weeks ago.  I can definitely tell you it will lead to hours of fun and wasted time at work putting it together.  I should know: my completed “data center” sits proudly in my office!

After all we are all just kids at heart, aren’t we?

\Mm

Speaking on Container Solutions and Applications at Interop New York

I have been invited to speak at and chair a panel at Interop New York (November 16-20, 2009), giving a talk that explores the hype and reality surrounding data center containers and Green IT in general.

The goal of the panel discussion is to help data center managers evaluate and approach containers by understanding their economics, key considerations, and real-life customer examples.  It’s going to be a great conversation.  If you are attending Interop this year, I would love to see you there!

 

\Mm

More Chiller Side Chat Redux….

I have been getting continued feedback on the Chiller Side Chat that we did live on Monday, September 14th.  I wanted to take a quick moment to discuss a recurring theme in the emails I have been receiving: data center site selection and the decisions that result at the intersection of data center technology, process, and cost.  One of the key things we technical people often forget is that the data center is first and foremost a business decision.  The business (whatever kind of business it is) has a requirement to improve efficiency through automation, store information, or do whatever it is that represents the core function of that business.  The data center is at the heart of those technology decisions and the ultimate place where those solutions will reside.

As the primary technical folks in an organization, whether we represent IT or Facilities, we can find ourselves getting deeply involved with the technical aspects of the facility – the design, the construction or retrofit, the amount of power or cooling required, the amount of redundancy we need, and the like.  Those in upper management, however, view this in a substantially different way.  It’s all about business.  As I have gotten a slew of these emails recently, I decided to try and post my own response.  As I thought about how to go about this, I kept coming back to Chris Crosby’s discussion at Data Center Dynamics about two years ago.  As you know, I was at Microsoft at the time, and I felt that he did an excellent job of outlining the way the business person views data center decisions.  So I went digging around and found this video of Chris talking about it.  Hopefully this helps!  If not, let me know and I am happy to discuss further or more specifically.

\Mm

Miss the “Live” Chiller Side Chat? Hear it here!

The folks who recorded the “Live” Chiller Side Chat have sent me a link to the recording.  If you were not able to make the event live but are still interested in hearing how it went, feel free to have a listen at the following link:

 

LIVE CHILLER SIDE CHAT

 

\Mm

Live Chiller Side Chat Redux

I wanted to take a moment to thank Rich Miller of Data Center Knowledge, and all of the folks who called in and submitted questions today during the Live Chiller Side Chat.  It was incredibly fun for me to get a chance to answer questions directly from everyone.  My only regret is that we did not have enough time!

When you have a couple of hundred people logged in, it’s unrealistic and all but impossible to answer all of the questions.  However, I think Rich did a great job bouncing around to the key themes he saw emerging from the questions.  One thing is for sure: we will try to do another one of these, given the number of unanswered questions.  I have already been receiving some great ideas on how to structure these moving forward.  Hopefully everyone got some value or insight out of the exercise.  As I warned before the meeting, you may not get the right answer, but you will definitely get my answer.

One of the topics we touched on briefly during the call, and one that went a bit under-discussed, was regulation associated with data centers, or more correctly, regulation and legislation that will affect our industry.  For those of you who are interested, I recently completed an executive primer video on the subject of data center regulation.  The link can be found here:

Data Center Regulation Video.

Thanks again for spending your valuable time with me today, and I hope we can do it again!

\Mm