The Cloud Cat and Mouse Papers – The Primer


Cat and Mouse with Multi-national Infrastructure – The Participants to Date

There is an ever-changing game of cat and mouse developing in the world of cloud computing. It's not a game that you as a consumer might see, but it is there, an undercurrent that has been there from the beginning. It pits Technology Companies and multi-national infrastructure against local and national governments. For some years this game of cat and mouse has been quietly played out in backrooms, in development and technology roadmap re-works, and across negotiation tables, and only in the rarest of cases have these encounters come out for industry scrutiny or visibility. To date the players have been limited to the likes of Google, Microsoft, Amazon, and others who have scaled their technology infrastructure across the globe, and in large measure those are the players that governments have moved against in an ever more intricate chess game. I myself have played a part in the measure/counter-measure give and take of this delicate dance.

The primary issues in this game have to do with the realization of revenue for taxation purposes, Safe Harbor and issues pertaining to personally identifiable information, ownership of Big Data, and the nature of what is stored, how it is stored, and where it is stored: the intersection where politics and technology meet, a place where social issues and technology collide. You might call them storm clouds just out of sight, but there is thunder on the horizon that you, the consumer and/or potential user of global cloud infrastructure, will need to be aware of, because eventually the users of cloud infrastructure will become players in the game as well.

That is not to say that the issues I tease out here are all gloom and doom. In fact, they are great opportunities for potential business models, additional product features, and even cloud eco-system companies or niche solutions unto themselves: a way to drive significant value for all sides. I have been toying with more than a few of these ideas myself over the last few months.

To date these issues have mostly manifested in the global build-up of infrastructure for the big Internet platforms: the products and services the big guys in the space use as their core money-making platforms or primary service delivery platforms. Rarely if ever do these companies use this same infrastructure for their Infrastructure as a Service (IaaS) or Platform as a Service (PaaS) offerings. However, as you will see, the same challenges will and do apply to these offerings as well. In some cases they are even more acute and problematic in multi-tenant situations, with the potential to put even more burden on future cloud users.

If I may be blunt about this, there is an interesting lifecycle to this food chain whereby the Big Technology companies consistently have the upper hand and governmental forces, through the use of their primary tools of regulation and legislation, are constantly playing catch-up. This lifecycle is unlikely to change for at least five reasons.

  • The Technology Companies will always have the lens of the big picture of multi-national infrastructure. Individual countries, states, and locales generally only have jurisdiction or governance over the territory or population base that is germane to their authority.
  • Technology Companies can focus with near singular purpose on a very deep technical capability, and can bring to bear much more concentrated "brain power" to evolve measures and counter-measures that address changes in the socio-political landscape.
  • By and large, Governments rely upon technologies and approaches becoming mainstream before there is enough of a base understanding of the developments and their impacts for them to act. This generally places them in a reactionary position.
  • Governmental forces generally rely upon "consultants" or "industry experts" to assist in understanding these technologies, but very few of these industry experts have ever really dealt with multi-national infrastructure, and fewer still have had to strategize and evolve plans around these types of changes. The expertise at that level is rare and almost exclusively retained by the big infrastructure providers.
  • Technology Companies have the ability to force a complete change to the rules and reality of the game by swapping out the technology used to deliver their products and services, or by changing development and delivery logic and/or methodology, to effect an almost complete negation of the previous method of governance, making it obsolete.

That is not to say that governments are unwilling participants in this process, forced into a subservient role in the lifecycle. In fact they are active participants in attracting, cultivating, and even subsidizing these infrastructural investments in areas under their authority and jurisdiction. Tools like tax breaks, real estate and investment incentives, and private-public partnerships do have both initial and ongoing benefits for the Governments as well. In many ways these are "golden handcuffs" for Technology Companies who enter into this cycle, but like any kind of constraint, positive or negative, the planning and strategy to unfetter themselves begins almost immediately.

Watson, The Game is Afoot

Governments, Social Justice, Privacy, and Environmental forces have already begun to force changes in the Technology landscape for those engaged in multi-national infrastructure.  There are tons of articles freely available on the web which articulate the kinds of impacts these forces have had and will continue to have on the Technology Companies.  The one refrain through all of the stories is the resiliency of those same Technology Companies to persevere and thrive despite what might be crucial setbacks in other industries.

In some cases the technology changes and adapts to meet the new requirements; in some cases new approaches emerge, or vacating "un-friendly" environs across any of these spectrums becomes an option; and in some cases there is a not insignificant bet that any regulatory or compulsory requirements will be virtually impossible or too technically complex to enforce or even audit.

Let's take a look at a couple of the examples that have been made public that highlight this kind of thing. Back in 2009, Microsoft migrated substantial portions of its Azure Cloud Services out of Washington State to its facilities located in San Antonio, Texas. While the article specifically talks about certain aspects of tax incentives being held back, there were of course other factors involved. One doesn't have to look far to understand that Washington State also has a B&O Tax (Business and Occupation Tax), which is defined as a gross receipts tax: it is measured on the value of products, gross proceeds of sale, or gross income of the business. As you can imagine, interpreting this kind of tax as it relates to online and cloud income could be very tricky, and regardless would be a complex and technical problem to solve. It could have the undesired impact of placing any kind of online business at an interesting disadvantage, or at another level place an unknown tax burden on its users. I am not saying this was a motivating factor in Microsoft's decision, but you can begin to see the potential exposure developing. In this case, the technology could rapidly change and the locale of the hosted environments could move to minimize the exposure, thus thwarting any governmental action. At least for the provider. But what of the implications if you were a user of the Microsoft cloud platform and found yourself with an additional or unknown tax burden? I can almost guarantee that back in 2009 this level of end user impact (or revenue potential, from a state tax perspective) had not even been thought about. But as with all things, times change, and we are already seeing examples of exposure occurring across the game board that is our planet.

We are already seeing interpretations or laws getting passed in countries around the globe where, for example, a server is a taxable entity. If revenue for a business is derived from a computer or server located in that country, it falls under the jurisdiction of that country's tax authority. Imagine yourself as a company using this wonderful global cloud infrastructure, selling your widgets, products, or services, and finding yourself with an unknown tax burden and liability in some "far flung" corner of the earth. The Cloud providers today mostly provide infrastructure services. They do not go far enough up the stack to be able to effectively manage your entire system, let alone determine your tax liability. The burden of proof today would reside, to a large degree, on the individual business running inside that infrastructure.

In many ways those adopting these technologies are the least capable of dealing with these kinds of challenges. They are small to mid-sized companies who admittedly don't have the capital or operational sophistication to build out the kind of infrastructure needed to scale that quickly. They are unlikely to have technologies such as robust configuration management databases to track virtual instances of their products and services: to tell what application ran, where it ran, how long it ran, and how much revenue was derived during the length of its life. And this is just one example (server as a taxable entity) of a law or legislative effort that could impact global users. There are literally dozens of these kinds of bills/legislative initiatives/efforts (some well thought out, most not) winding their way through legislative bodies around the world.
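
To make that record-keeping burden concrete, here is a minimal sketch of the kind of ledger such configuration management tooling would need to maintain under a "server as a taxable entity" regime. Every name and figure below is hypothetical, and real revenue attribution would be far messier:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class InstanceRecord:
    """One run of a virtual instance: what ran, where, for how long, earning what."""
    app: str
    country: str       # jurisdiction where the server physically ran
    hours_run: float
    revenue: float     # revenue attributed to this instance over its life

def revenue_by_jurisdiction(records):
    """Aggregate attributed revenue per country: the figure a tax authority would reach for."""
    totals = defaultdict(float)
    for r in records:
        totals[r.country] += r.revenue
    return dict(totals)

if __name__ == "__main__":
    ledger = [
        InstanceRecord("widget-store", "DE", hours_run=720.0, revenue=12500.00),
        InstanceRecord("widget-store", "SG", hours_run=310.5, revenue=4200.00),
        InstanceRecord("widget-api", "DE", hours_run=95.0, revenue=800.00),
    ]
    print(revenue_by_jurisdiction(ledger))  # {'DE': 13300.0, 'SG': 4200.0}
```

Even this toy version assumes the hard part is solved: attributing revenue to a particular instance in a particular place, which is exactly the data most small adopters do not have.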

You might think that you can circumvent some of this by limiting your product or service deployment to the country closest to home, wherever home is for you. However, there are other efforts winding their way through legislatures, or to a large degree already passed, that impact the data you store, what you store, whose data you are storing, and the like. In most cases these initiatives are unrelated to the developing revenue legislation, but combined they can deliver an interesting one-two punch. For example, many countries are requiring that for Safe Harbor purposes all information for any nationals of "Country X" must be stored in "Country X", to ensure that its citizenry is properly protected and under the jurisdiction of the law for those users. In a cloud environment, with customers potentially from almost anywhere, how do you ensure that this is the case? How do you ensure you are compliant? If you balance this requirement with the "server as a taxable entity" example I just gave above, there is an interesting exposure and liability for companies to prove where and when revenue is derived. Similarly, there are some laws that are enacted as reactions against legislation in other countries.
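
As a sketch of what "ensuring compliance" might even mean in practice, consider a toy placement check like the one below. The rules table, nationality codes, and region names are all invented for illustration; real residency law is far more nuanced than a string prefix match:

```python
# Hypothetical residency policy: nationals of a country must have their data stored in-country.
RESIDENCY_RULES = {"DE": "DE", "FR": "FR", "CA": "CA"}  # nationality -> required storage country

def compliant_storage_region(nationality, available_regions):
    """Pick a storage region satisfying the (assumed) residency rule for this user.

    Raises when the provider has no region in the required country: the exposure
    described above, where the burden lands on the tenant rather than the cloud.
    """
    required = RESIDENCY_RULES.get(nationality)
    if required is None:
        return available_regions[0]  # no known rule: use the default region
    matches = [r for r in available_regions if r.startswith(required)]
    if not matches:
        raise RuntimeError(f"No region in {required}; storing this user's data "
                           f"elsewhere may violate {required} residency law.")
    return matches[0]

print(compliant_storage_region("DE", ["US-east", "DE-frankfurt", "SG-1"]))  # DE-frankfurt
```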

In the post-9/11 era within the United States, the US Congress enacted the Patriot Act. Due to some of the information search and seizure aspects of the law, Canada in response forbade Canadian citizens' data from being stored in the United States. To the best of my knowledge only a small number of companies actually even acknowledge this requirement and have architected solutions to address it; the fact remains the rest are not in compliance with Canadian law. Imagine you are a small business owner, using a cloud environment to grow your business, and suddenly you begin to grow your business significantly in Canada. Does your lack of knowledge of Canadian law excuse you from your responsibilities there? No. Is this something that your infrastructure provider is offering to you? Today, no.

I am only highlighting certain cases here to make the point that there is a world of complexity coming to the cloud space. Thankfully these impacts have not been completely explored or investigated by most countries of the world, but it's not hard to see a day when this becomes a very real thing that companies and the cloud eco-system in general will have to address. At the most basic level these are areas of potential revenue streams for governments, and as such the likelihood of their eventual day in the sun only increases. I am currently personally tracking over 30 different legislative initiatives around the world (read: pre-de-facto laws) that will likely shape this Technology landscape for the big providers and potential cloud adopters some time in the future.

What is to come?

This first article was really just to bring out the basic premise of the conversation and the topics I will be discussing, and to lay the groundwork to a very real degree. I have not even begun to touch on the extra-governmental social and environmental impacts that will likely change the shape of things even further. This interaction of Technology, "The Cloud", and Political and Social Issues exists today, and although largely masked by the fact that the eco-system of the cloud is not fully developed or matured, it is no less a reality. Any predictions I make are extensions of existing patterns I see in the market already and do not necessarily represent a foregone conclusion, but rather the most likely developments based upon my interactions in this space. As this Technology space continues to mature, the only certainty is uncertainty, modulated against the backdrop of a world where increasingly geo-political forces will continue to shape the Technology of tomorrow.

\Mm

Cloud Détente – The Cloud Cat and Mouse Papers


Over the last decade or so I have been lucky enough to be placed in a fairly unique position: working internationally, deploying global infrastructure for cloud environments. This work has spanned some very large companies with a very dedicated focus on building out global infrastructure and managing through those unique challenges. Strategies may have varied, but the challenges faced by them all had some very common themes. One of the more complex interactions when going through this process is what I call the rolling Cat and Mouse interactions between governments at all levels and these global companies.

Having been a primary player in these negotiations and in the development of measures and counter-measures that resulted from these interactions, I have come to believe there are some interesting potential outcomes that cloud adopters should think about and understand. The coming struggle and complexity of managing, regulating, and policing multi-national infrastructure will not solely impact the large global players; in a very real way it will begin to shape how their users will need to think through these socio-political and geo-political realities: the potential impacts on their business, their adoption of cloud technologies, their resulting responsibilities, and just how aggressively they look to the cloud for the growth of their businesses.

These observations and predictions are based upon my personal experiences. So, for whatever it's worth (good or bad), this is not the perspective of an academic writing from some ivory tower; rather, these are the observations of someone who has been there and done it. I probably have enough material to write an entire book on my personal experiences and observations, but I have committed myself to writing a series of articles highlighting what I consider the big things that are being missed in the modern conversation of cloud adoption.

The articles will highlight (with some personal experiences mixed in) the ongoing battle between Technocrats and Bureaucrats. I will try to cover a different angle on many of the big topics out there today, such as:

  • Big Data versus Big Government
  • Rise of Nationalism as a factor in Technology and infrastructure distribution
  • The long struggle ahead for managing, regulating, and policing clouds
  • The Business, end-users, regulation and the cloud
  • Where does the data live? How long does it live? Why Does it Matter?
  • Logic versus Reality – The real difference between Governments and Technology companies.
  • The Responsibilities of data ownership
    • … regarding taxation exposure
    • … regarding PII impacts
    • … Safe Harbor

My hope is that this series and the topics I raise, while maybe a bit raw and direct, will cause you to think a bit more about the coming impacts on the Technology industry at large, the potential impacts to small and medium size businesses looking to adopt these technologies, and the developing friction and complexity at the intersection of technology and government.

\Mm

Olivier Sanche, My Dear Friend, Adieu

The data center industry has suffered a huge loss this holiday weekend with the passing of Olivier Sanche, head of Apple's Data Center program. He was an incredibly thoughtful man, a great father and husband, and very sincerely a great friend. As I got off the phone with his brother and his wife in France, who gave me this devastating news, I could not help but remember my first encounter with Olivier. At the time he worked for eBay, and we were both invited to speak and debate at an industry event in Las Vegas. As we sat in a room full of "experts" to discuss the future of our industry, the conversation quickly turned controversial. Passions were raised and I found myself standing side by side with this enigmatic French giant on numerous topics. His passion for the space coupled with his cool logic were qualities that greatly endeared the man to me. We were comrades in ideas, and soon became fast friends.

Olivier was the type of person who could light up a room with his mere presence. It was as if he embraced the entire room in one giant hug, even if they were strangers. He could sit quietly mulling a topic, pensively going through his calculations, then explode into the conversation and rigorously debate everyone. That passion never impeded his ability to learn, to adapt, to incorporate new thinking into his persona. Through the years we knew each other I saw him forge his ideas through debate, always evolving. Many people knew the public Olivier, the Olivier they saw at press conferences, speaking engagements, and the like. Some of us got to know Olivier much better. The data center industry is small indeed, and those of us who have had the pleasure and terror of working in the world's largest infrastructures share a special kind of bond. We routinely meet off-hours and have dinner and drinks. It's a small cadre of names you probably know, or have heard about, joined in the fact that we have all dealt with or are dealing with challenges most data center environments will never see. In these less formal affairs, company positions melted away, technological challenges came to the fore, and most importantly the real people behind these companies emerged. In these forums, you could always count on Olivier to be a warm and calming force. He was incredibly intelligent, and although he might disagree, you could count on him to champion the free discussion of ideas.

It was in those types of forums that I truly met Olivier: the man who was so dedicated to his family, and to the light of his life, little Emilie. His honesty and direct-to-the-point style made it easy to understand where you stood, and where he was coming from.

More information about memorial services and the like will be coming out shortly, as his family tries to get the word out to all of his friends.

The world has lost a great mind, Apple has lost a visionary, his family has lost their world, and I have lost a good friend.

Adieu, Dear Olivier, You and your family will be in my thoughts and prayers.

Your friend,

Mike Manos

\Mm

Site Selection, Data Center Clustering, and Their Interaction

I have written many times on the importance of site selection for data centers, an importance that only grows when one considers the regulatory and legislative efforts underway globally. Those who make their living in this space know that this is going to have a significant impact on the future landscape of these electronic bit factories. The ongoing long-term operational costs over the life of the facility, their use of natural resources (such as power), and what they house and protect (PII, or Personally Identifiable Information) are even now significantly impacting this process for many large global firms, and the topic is making its way into the real estate community. This is requiring a series of crash courses in information security, power regulation and rate structures, and other complex issues for many in the Real Estate community.

After speaking to a bunch of friends on the Real Estate side of the business, I thought it might be interesting to take a few of these standard criteria head-on in an open discussion. For this post I will take on two of the elemental ones in data center site selection: one major item, and one item currently considered a minor factor but quickly rising in overall importance. Namely, Power and Water, respectively.

Watts the Big Deal?

Many think that power cost alone is the primary driver for data centers, and while it is always a factor, there are many other facets that come into play underneath the broader category of Power. While site selection factors are always considered highly confidential, I thought I might highlight some of the wider arcs in this category.

One such category getting quite a bit of attention is the Power Generation Mix. The Generation Mix is important because it is essentially the set of energy sources responsible for how an area or region gets its energy. Despite what politicians would lead you to believe, once an electron is on the grid it is impossible to tell from which source it came. So "Green Energy", in its multitude of definitions, is primarily determined by the mix of energy sources for a given region. A windmill, for example, does not generate an electron with a tiny label saying it is sourced from "green" or "renewable" sources. Understanding the generation mix of your power will allow you to forecast and predict the potential carbon output resulting from your data center's consumption. The Environmental Protection Agency in the US produces a metric called the Carbon Emission Factor, based upon the generation mix of the areas you are looking to site-select in, which can be applied to your consumption to assist you in calculating your carbon output. Whether you are leasing or building your own facility, you will likely find yourself falling into mandatory compliance in terms of reporting for this kind of thing.
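
The arithmetic itself is simple, as the sketch below shows. Note that the emission factors here are made up for illustration and are not actual EPA figures:

```python
# Illustrative emission factors only: not real EPA eGRID values.
EMISSION_FACTOR_KG_PER_KWH = {
    "hydro-heavy-region": 0.15,  # generation mix dominated by hydro
    "coal-heavy-region": 0.85,   # generation mix dominated by coal
}

def annual_carbon_kg(it_load_kw, pue, factor_kg_per_kwh):
    """Carbon output = total facility energy (IT load x PUE x hours/year) x regional emission factor."""
    hours_per_year = 8760
    facility_kwh = it_load_kw * pue * hours_per_year
    return facility_kwh * factor_kg_per_kwh

# The same 1 MW IT load at PUE 1.5 in two different generation mixes:
for region, factor in EMISSION_FACTOR_KG_PER_KWH.items():
    tonnes = annual_carbon_kg(1000, 1.5, factor) / 1000
    print(f"{region}: {tonnes:,.0f} tonnes CO2/yr")
```

With these placeholder factors, the same facility reports roughly 2,000 tonnes a year in one mix and over 11,000 in the other, which is why the generation mix, and not just the rate, belongs in the site selection model.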

So you might be thinking, "Great, I just need to find the areas that have cheap power and a good Carbon Emission Factor, right?" The answer is no. Many site selection processes that I see emerging in the generic space start and stop right at this line. I would however advocate that one take the next logical step, which is to look at the relationship of these factors together, and over a long period of time.

The Generation Mix has long been considered a "forever" kind of thing: the generation sources within a region have rarely changed over time. But that is of course changing significantly in the new era we live in.

Let's take the interplay (both historical and moving forward) of Power Cost and its relationship with the Generation Mix. As humans we like to think in simplistic terms: power costs for a specific region are "so many cents per kilowatt-hour", and this changes based upon whether you are measured at a residential, commercial, or industrial rate schedule. The rate schedule is a function of how much power you ultimately consume, or promise to consume, to the local utility. The reality of course is much more complicated than that. Power rates fluctuate constantly based upon the overall mix. Natural disasters, regulation, and the like can have a significant impact on power cost over time. Therefore it's generally wise to look at Generation Mix price volatility through the longer-term lens of history and see how a region's power costs oscillate between these types of events. However you decide to capture or benchmark this, it is a factor that should be considered.
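
One possible benchmark, offered purely as a sketch with invented rate histories, is the coefficient of variation (standard deviation over mean) of a region's historical rates:

```python
from statistics import mean, stdev

def rate_volatility(rate_history_cents_per_kwh):
    """Coefficient of variation of historical power rates: higher means the
    region's price has swung more over the window, whatever the cause."""
    return stdev(rate_history_cents_per_kwh) / mean(rate_history_cents_per_kwh)

# Hypothetical ten-year industrial rate histories for two regions (cents/kWh):
stable = [4.1, 4.2, 4.2, 4.3, 4.3, 4.4, 4.4, 4.5, 4.5, 4.6]
volatile = [4.0, 5.2, 4.4, 6.1, 5.0, 7.3, 5.8, 6.9, 6.0, 8.1]
print(f"stable region:   {rate_volatility(stable):.3f}")    # ~0.04
print(f"volatile region: {rate_volatility(volatile):.3f}")  # ~0.22, several times higher
```

Two regions can quote the same rate today and still carry very different exposure over a facility's multi-decade life.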

This is especially true when you take this volatility factor and apply it to the changing requirements of carbon reporting and its impacts. While the United States is unlikely to have a law similar to the CRC (Carbon Reduction Commitment) in the UK, it will see legislation and regulation impacting the energy producers.

You might be asking yourself, "Who cares if they go after those big bad energy companies and force them to put more green power in their mixes?" Well, let's think about the consequences of these actions to you, the user, and why it's important to your site selection activity.

As the energy producers are regulated to bring a more "green" mix into their systems, two things will happen. The first of course is that rates will rise. The energy producers will need to sink large amounts of capital into these technologies, plants, research and development, and so on, to come into alignment with the legal requirements being imposed upon them. This effect will be uneven, as many areas around the globe have quite a disparate mix of energy from region to region. It will also mean that "greener" power will likely mean "more expensive" power. Assessing an area for its potential exposure to these kinds of changes is definitely important in a data center build scenario, as you likely have a desire to ensure that your facility has the longest possible life, which could span a couple of decades. The second thing, which may be a bit harder to guess at, is which technology a given region or area is likely to pick and its resulting carbon output impact. While I have a definite approach to thinking through such things, this is essentially the beginning of the true secret sauce of site selection expertise, and of the help you may require if you don't have an internal group to go through this kind of data and modeling. This is going to have an interesting impact on the "clustering" effect that happens in our industry at large.

We have seen many examples, like Quincy, Washington and San Antonio, Texas, where the site selection process has led many data center providers to locate in the same area to benefit from this type of analysis (even if not directly exposed to the criteria). There is a story (that I don't know to be true or not) that in the early days, when a new burger chain was looking to expand where it would place its restaurants, it used the footprint of its main competitor as its guide. The thinking was that the competitor probably had a very scientific method for that selection, and they would receive that same ancillary benefit without the cost and effort. Again, not sure if that is true or not, but it's definitely something likely to happen in our industry.

In many markets these types of selections are in high demand. Ascent Corporation out of St. Louis is in the process of building a modern facility just down the street from the Microsoft mega-facility near Chicago. While Ascent was a part of the original Microsoft effort to build at that location, there has been an uptick in interest in being close to that facility for the same reasons I have outlined here. The result is that their CH2 facility is literally a stone's throw from the Microsoft behemoth. The reasons? Proximity to power, fiber, and improved water infrastructure, all already there in abundance. The facility even boasts container capabilities, just like its neighbor. The Elmhurst Electrical Substation sits directly across the highway from the facility, with the first set of transmission poles within easy striking distance.

Elmhurst Electrical Yard

The generation mix of that area has a large nuclear component, which has little to no carbon impact and generates long-term stability in terms of power cost fluctuations. According to Phil Horstmann, President of Ascent, there is tremendous interest in the site, and one of the key draws is the proximity of its nearby neighbor. In the words of one potential tenant: "It's like the decision to go to IBM in the 80s. It's hard to argue against a location where Microsoft or Google has placed one of its facilities."

This essentially dictates that there will be increasing demand in areas where this analysis is done, or is perceived to have been done. This is especially true where co-location and hosting providers can align their interests with those commercial locations where there is market demand. While those that follow first movers will definitely benefit from these decisions (especially those without dedicated facility requirements), first movers continue to have a significant advantage if they can get this process correct.

Tying into the power conversation is water. With the significant drive for economization (whether water-based or air-based), water continues to be a factor. What many people don't understand is that in many markets the discharge water is clean enough to dump into the sewage system yet too "dirty" to discharge to retention ponds. This causes all kinds of potential issues, and understanding the underlying water landscape is important. The size of the metropolitan sewage environment, the ability to dig your own wells, the local water table and aquifer issues, your intended load and resulting water requirements, and how the local county, municipality, or region views discharge in general (and which chemicals, in what quantities) are all important to think about today. As water use comes under increasing environmental scrutiny, it is quickly rising on the site selection radar of many operators and those with long-term holds.

I hope this brief talk was helpful. I plan to post a few other key factors and a general discussion in the near future.

\Mm

CO2K Doubter? Watch the Presidential Address Today

Are you a Data Center professional who doubts that carbon legislation is going to happen, or who believes this initiative will never get off the ground? This afternoon President Obama plans to outline his intention to assess a cost for carbon consumption at a conference highlighting his economic accomplishments to date. The backdrop of this, of course, is the massive oil rig disaster in the Gulf.

As my talk at the Uptime Institute Symposium highlighted, this type of legislation will have a big impact on data center and mission critical professionals. Whether you know it or not, you will be front and center in assisting with the response, collection, and reporting required to react to this kind of potential legislation. When I questioned the audience in attendance during my talk, it was quite clear that most of those in the room were vastly ill-prepared and ill-equipped for this kind of effort.

If passed, this type of legislation is going to cause a severe reaction inside organizations to ensure that they are in compliance, and will likely lead to a huge increase in spending on collecting and reporting energy information. For many organizations the outlay will be significant.

The US House of Representatives has already passed a version of this known as the Waxman-Markey bill. You can bet that there will be a huge amount of pressure to get a Senate version passed and out the door in the coming weeks and months.

This should be a clarion call for data center managers to step up, raise awareness within their organizations about this pending legislation, and take a proactive role in establishing a plan for a corporate response. Take an inventory of your infrastructure and assess what you will need to begin collecting this information. It might even be wise to get a few quotes for a ballpark cost of what it might take to bring your organization up to the task. It's probably better to start doing this now than to be told by the business to get it done.
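
For a first rough pass before any vendor quotes arrive, even a back-of-envelope model helps frame the conversation. Every unit cost in this sketch is a placeholder, not a real market price; swap in the numbers from your own quotes:

```python
def metering_ballpark(branch_circuits, pdus,
                      meter_cost=350.0,          # placeholder: branch-circuit meter, installed
                      pdu_gateway_cost=1200.0,   # placeholder: networked PDU metering/gateway
                      software_per_point=25.0):  # placeholder: reporting software per metered point
    """Back-of-envelope capital estimate for instrumenting a room for energy reporting."""
    points = branch_circuits + pdus
    hardware = branch_circuits * meter_cost + pdus * pdu_gateway_cost
    software = points * software_per_point
    return hardware + software

# A modest room: 400 branch circuits and 40 PDUs
print(f"${metering_ballpark(400, 40):,.0f}")  # $199,000 with the placeholder costs
```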

\Mm

Reflections on Uptime Symposium 2010 in New York

This week I had the honor of being a keynote speaker at the Uptime Institute's Symposium event in New York City. I also participated in some industry panels, which is always tons of fun. Having keynoted the first Symposium a few years back, it was an interesting experience to come back and see how it has changed and evolved over the intervening years. This year my talk was about the coming energy regulation and its impact on data centers, and more specifically what data center managers and mission critical facilities professionals could and should be doing to get their companies ready for what I call CO2K. I know I will get a lot of pushback on the CO2K title, but I think my analogy makes sense. First, companies are generally not aware of the impact that their data centers and energy consumption have. Second, most companies are dramatically unprepared and do not have the appropriate tools in place to collect the information, which will of course lead to the third item: lots of reactionary spending to get this technology and software in place. While Y2K was generally a flop and a lot of noise, if legislation is passed (and let's be clear about the very direct statements the Obama administration has made on this topic) this work will lead to a significant change in reporting and management responsibilities for our industry.

Think we are ready for this legislation?

That brings me back to my first reflection on Symposium this year. I was joking with Pitt Turner just before I went on stage that I was NOT going to ask the standard three questions I ask of every data center audience. Let's face it, I thought, that shtick had gotten old; I have been asking those same three questions for at least the last three years at every conference I have spoken at (which is a lot). However, as I got on stage talking about the topic of regulation, I had to ask. It was like a hidden burning desire I could not quench. So there I went: "How many people are measuring for energy consumption and efficiency today?" "Raise your hand if, in your organization, the CIO sees the power bill." And then finally, "How many people in here today have the appropriate tooling in place to collect and report energy usage in their data centers?" It had to come out. I saw Pitt shaking his head. What was more surprising was the number of people who raised their hands: about 10% of the audience. Don't get me wrong, 10% is about the highest I have seen that number at any event. But for those of you uninitiated into the UI Symposium lore, you need to understand something important: Symposium represents the hardest of the hard core data center people. This is where all of us propeller heads geek it out in mechanical and electrical splendor; we dance and raise the "floor" (data center humor). Yet this amazing collection of the best of the best had only a 10% penetration on monitoring in their environments. When this regulation comes, it's going to hurt. I think I will do a post at a later time on my talk at Symposium and what you as a professional can do to start raising awareness. But for now, that was my first big startle point.

My second key observation this year was the number of people. Symposium is truly an international event: there were over 900 attendees for the talks and, if memory serves, about 1300 for the exhibition hall. I had heard that 20 of the world's 30 time-zones had representatives at the conference. It was especially good for one of the key recurring benefits of this event: networking. The networking opportunities were first rate, and by the looks of the impromptu meetings and hallway conversations this continued to be a key driver of the event's success. As fun as making new friends is, it was also refreshing to spend some time on quick catch-ups with old friends like Dan Costello and Sean Farney from Microsoft, Andrew Fanara, Dr. Bob Sullivan, and a host of others.

My third observation, and perhaps the one I was most pleased with, was the diversity of thought in the presentations. It's fair to say that I have been critical of Uptime for some time for a seemingly droningly dogmatic recurring set of themes and a particular bent of thinking. While those topics were covered, so too were a myriad of what I will call counter-culture topics. Sure, there were still a couple of the salesy presentations you find at all of these kinds of events, but the diversity of thought and approach this time around was striking. Many of the presentations addressed larger business issues; the impact, myths, and approaches of cloud computing; virtualization; and decidedly non-facilities-related material affecting our worlds. This might have something to do with the purchase by the 451 Group and its related data center think tank organization Tier 1, but it was amazingly refreshing and they knocked the ball out of the park.

My fourth observation was that the amount of time allotted to the presentations was too short. While I have been known to completely abuse any allotted timeslot in my own talks, due to my desire to hear myself talk, I found that many presentations had to end due to time just as things were getting interesting. Many of the hallway conversations were continuations of those presentations, and it would have been better to keep the groups in the presentation halls.

 

My fifth observation revolved around the quantity, penetration, and maturation of container and containment products, presentations, and services. When we first went public with the approach while I was at Microsoft, the topic was so avant-garde and against the grain of common practice that it got quite a reception (mostly negative). This was followed by quite a few posts (like Stirring Anthills) which got lots of press attention, with resulting industry experts stating that containers and containment were never going to work for most people. If the presentations, products, and services represented at Uptime were any indication of industry adoption and embrace, I guess I would have to make a childish gesture with thumb to nose, wiggle my fingers and say.... Nah Nah. 🙂

 

I have to say the event this year was great and I enjoyed my time thoroughly.  A great time and a great job by all. 

\Mm

Open Source Data Center Initiative

There are many in the data center industry who have repeatedly called for change in this community of ours: change in technology, change in priorities, change for the future. Over the years we have seen those changes come very slowly, and while they are starting to move a little faster now (primarily due to the economic conditions and scrutiny over budgets more so than a desire to evolve our space), our industry still faces challenges and resistance to forward progress. There are lots of great ideas and lots of forward thinking, but moving this work to execution, and educating business leaders as well as data center professionals to break away from those old stand-by accepted norms, has not gone well.

That is why I am extremely happy to announce my involvement with the University of Missouri in the launch of a not-for-profit data-center-specific organization. You might have read the formal announcement by Dave Ohara, who launched the news via his industry website, GreenM3. Dave is another of those industry insiders who has long been perplexed by the lack of movement and initiative we have had despite some great ideas and stand-outs doing great work. More importantly, it doesn't stop there. We have been able to put together quite a team of industry heavy-weights to get involved in this effort. Those announcements are forthcoming, and when they arrive, I think you will get a sense of the type of sea-change this effort could potentially bring.

One of the largest challenges we have with regard to data centers is education. Those of you who follow my blog know that I believe some engineering and construction firms are incented "not to change" or not to implement new approaches. The cover of complexity allows customers to remain in the dark while innovation is stifled. Those forces who desire to maintain an aura of black-box complexity around this space, and who repeatedly speak of the arcane arts of building out data center facilities, have been at this a long time. To them, the interplay of systems requiring one-off monumental temples to technology on every single build is the norm. It's how you maximize profit and keep yourself in a profitable position.

When I discussed this idea briefly with a close industry friend, his first question naturally revolved around how this work would compete with that of the Green Grid, the Uptime Institute, Data Center Pulse, or the other industry groups; essentially, was this going to be yet another competing thought-leadership organization? The very specific answer to this is no, absolutely not.

These groups have been out espousing best practices for years. They have embraced different technologies and they have tried to educate the industry. They have been pushing for change (for the most part). They do a great job of highlighting the challenges we face, but for the most part they have waited around for universal good will and monetary pressures to make the changes happen. It dawned on us that there was another way. You need to build something that gains mindshare, that gets the business leadership's attention, that causes a paradigm shift. As we put the pieces together we realized that the solution had to be credible, technical, and above all have a business case around it. It seemed to us the parallels to the Open Source movement and the applicability of that approach were a perfect match.

To be clear, this Open Source Data Center Initiative is focused on execution. It's focused on putting together an open and free engineering framework upon which data center designs, technologies, and the like can be quickly put together, and moreover on standardizing the approaches with which both end-users and engineering firms come at the data center industry.

Imagine, if you will, a base framework upon which engineering firms, or even individual engineers, can propose technologies and designs, and specific solution vendors can pitch technologies for inclusion and highlight their effectiveness. More than all of that, it will remove much of the mystery behind the work that happens in designing facilities and normalize the conversations.

If you think of the Linux movement, and all of those who actively participate in submitting enhancements and features, or even pulling together specific build packages for distribution, one can see the same things emerging in the data center engineering realm. In fact, with the myriad of emerging technologies assisting in greater energy efficiency, greater densities, differences in approach to economization (air or water), and the use or non-use of containers, it's easy to see the potential for this component-based design.

One might think that we are effectively trying to put formal engineering firms out of business with this kind of work. I would argue that this is definitely not the case. While it may have the effect of removing some of the extra profit that results from the current "complexity" factor, this initiative should specifically drive common requirements, lead to better-educated customers, drive specific standards, and result in real-world testing and data from the manufacturing community. Plus, as anyone knows who has ever actually built a data center, the devil is in the localization and details. And as this is an open-source initiative, we will not be formally signing the drawings from a professional engineering perspective.

Manufacturers could submit their technologies and sample applications of their solutions, and have those designs plugged into a "package" or "RPM", if I may steal a term from the Red Hat Linux nomenclature. Moreover, we will be able to start driving true visibility of costs, both upfront and operating, and associate those costs with the set designs, with differences and trending from regions around the world. If it's successful, it could be a very good thing.
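
As a purely illustrative sketch of what such a "package" might declare (none of these field names or figures come from the initiative itself; they are hypothetical), consider:

```python
# Hypothetical manifest for one component "package" in an open design framework.
cooling_package = {
    "name": "air-economizer-cooling",
    "version": "0.1",
    "provides": "cooling",
    "depends_on": ["hot-aisle-containment"],  # other packages this design assumes
    "design_parameters": {
        "max_kw_per_rack": 12,
        "climate_zones": ["ASHRAE 4A", "ASHRAE 5B"],  # where the design applies
    },
    "costs": {  # placeholder figures: the hook for cost visibility and trending
        "capex_per_kw_usd": 2500,
        "opex_pue_contribution": 0.15,
    },
    "submitted_by": "example-engineering-firm",
    "validation_data": "link to real-world test results",
}
```

A catalog of such manifests is what would let an end-user compose and compare designs the way a Linux user composes a distribution.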

We are not naive about this, however. We certainly expect there to be some resistance to this approach out there, and in fact some outright negativity from those firms that make the most of the black-box complexity components.

We will have more information on the approach and what it is we are trying to accomplish very soon.  

 

\Mm

CIO Magazine Data Center Roundtable

On Wednesday, January 13th, I will be co-hosting a roundtable dinner with Chicago-area CIOs on the topic of data centers and the data center industry at large. The event is sponsored by CIO Magazine and is likely to be a wide-ranging conversation given the mix of executives slated to come. The group will be made up of technology leadership from a diverse set of industries including universities, manufacturing, financial institutions, and hospitality.

I am betting the topics will range across data center legislation, the impact of the cloud, technologies, and key trends.

I am looking forward to some good mid-western steak, great conversation, and walking away from the meeting with more important perspectives on what we are facing as an industry. 

I will try and post a summary of topics discussed later this week. 

 

\Mm

Kickin’ Dirt


I recently got an interesting note from Joel Stone, the Global Operations Chief at Global Switch. As some of you might know, Joel used to run North American Operations for me at Microsoft. I guess he was digging through some old pictures and found this old photo of our initial site selection trip to Quincy, Washington.

As you can see, the open expanse of farmland behind me ultimately became Microsoft's showcase facilities in the Northwest. In fact, you can even see some farm equipment just behind me. It got me reminiscing about that time and how exciting and horrifying that experience can be.

At the time, Quincy, Washington was not much more than a small agricultural town whose leaders did some very good things (infrastructurally speaking), and which benefitted from the presence of large amounts of hydro-power. When we went there, there were no other active data centers for hundreds of miles and no other technology firms present, and discussions around locating a giant industrial-technology complex there seemed as foreign as landing on the moon might have sounded during World War Two.

Yet if you fast-forward to today, companies like Microsoft, Yahoo, Sabey, Intuit, and others have all located technology parks in this one-time agricultural hub. Data Center Knowledge recently did an article on the impacts to Quincy.

Many people I speak to at conferences generally think that the site selection process is largely academic: find the right intersection of a few key criteria and locate areas on a map that seem to fit those requirements. In fact, the site selection strategy that we employed took many different factors into consideration, each with its own weight, leading ultimately to a "heat map" with which to investigate possible locations.

Even with some of the brightest minds and substantial research involved, it's interesting to me that ultimately the process breaks down into something I call "Kickin' Dirt". Those ivory tower exercises ultimately help you narrow down your decisions to a few locations, but the true value of the process comes when you get out to the location itself and kick the dirt around. You get a feel for the infrastructure, the local culture, and those hard-to-quantify factors that no modeling software can tell you about.

Once you have gone out and kicked the dirt, it's decision time. The decision you make, backed by all the data and process in the world, backed by personal experience of the locations in question, ultimately nets out to someone making a call. My experience is that this rarely works well if left up to committee. At some point someone needs the courage and conviction, and in some cases outright insanity, to make the call.

If you are someone with this responsibility in your job today: do your homework, kick the dirt, and make the best call you can.

To my friends in Quincy: you have come a long way, baby! Merry Christmas!

 

\Mm

Data Center Regulation Awareness Increasing, Prepare for CO2K

This week I had the pleasure of presenting at the Gartner Data Center Conference in Las Vegas, NV. This was my first time presenting at the Gartner event, and it represented an interesting departure from my usual conference experience in a few ways; I came away with some new observations and thoughts. As always, the greatest benefit I personally get from these events is the networking opportunity with some of the smartest people across the industry. I was surprised by both the number of attendees (especially given the economic drag and the almost universal slow-down in corporate travel) and the quality of questions I heard in almost every session.

My talk centered around the coming Carbon Cap and Trade regulation and its specific impact on IT organizations and the data center industry. I started my talk with a joke about how excited I was to be addressing a room of tomorrow's eco-terrorists. The joke fell flat, and the audience maintained a fairly serious demeanor. This was reinforced when I asked how many people in the audience thought that regulation was a real and coming concern for IT organizations. Their response startled me.

I was surprised because nearly 85% of the audience raised their hands. If I contrast that with the response to the exact same question asked three months earlier at the Tier One Research Data Center Conference, where only about 5% of the audience raised their hands, it's clear that this is a message that is beginning to resonate, especially in the larger organizations.

In my talk, I went through the Carbon Reduction Commitment legislation passed in the UK and the negative effects it is having upon the data center and IT industry there, including the negative impact on site selection activity: it is causing firms by and large to skip investing data center capital in the UK. I also went through the specifics of the Waxman-Markey bill in the US House of Representatives and the most recent thinking on the various Senate-based initiatives on this topic. I have talked here about these topics before, so I will not rehash those issues in this post. Most specifically, I talked about the potential cost impacts to IT organizations and data center operations, and the complexity of managing carbon reporting along with both the direct and indirect costs resulting from these efforts.

While I was pleasantly surprised by the increased awareness among senior IT and business managers and Data Center Operators of the coming regulatory impacts, I was not surprised by the responses I received with regard to their level of preparedness for reacting to these initiatives. Less than 10% of the room had the technology in place to even begin to collect the needed base information for such reporting, and roughly 5% had begun a search for software, or initiated development efforts, to aggregate and report this information.

With this broad lack of infrastructural systems in place, let alone software for reporting, I predict we are going to see a phenomenon similar to the Y2K craziness in the next two to three years. As the regulatory efforts here in the United States and across the EU begin to crystallize, organizations will need to scramble to get the proper systems and infrastructure in place to ensure compliance. I call this coming phenomenon CO2K. Regardless of what you call it, I suspect the coming years will be good for those firms with power management infrastructure and reporting capabilities.

\Mm