Industry Impact : Brothers from Different Mothers and Beyond…


My reading material and video watching habits these past two weeks have brought me some incredible joy and happiness. Why? Because Najam Ahmad of Facebook is finally getting some credit for the amazing work that he has done and been doing in the world of Software Defined Networking. In my opinion Najam is a Force Majeure in the networking world. He is passionate. He is focused. He just gets things done. Najam and I worked very closely at Microsoft as we built out and managed the company’s global infrastructure. So closely, in fact, that we were frequently referred to as brothers from different mothers. Wherever Najam was, I was not far behind, and vice versa. We laughed. We cried. We fought. We had a lot of fun while delivering some pretty serious stuff. To find out that he is behind the incredible Open Compute Project advances in networking is not surprising at all. Always a forward-thinking guy, he has never been satisfied with the status quo.
If you have missed any of that coverage, I strongly encourage you to have a read at the links below.


This got me to thinking about the legacy of the Microsoft program on the Cloud and Infrastructure industry at large. Data Center Knowledge had an article covering the impact of some of the Yahoo alumni a few years ago. Many of those folks are friends of mine and deserve great credit. In fact, Tom Furlong now works side by side with Najam at Facebook. The purpose of my thoughts is not to take away from their achievements and impacts on the industry but rather to highlight the impact of some of the amazing people and alumni from the Microsoft program. It’s a long overdue acknowledgement of the legacy of that program and how it has been a real driving force in large scale infrastructure. The list of folks below is by no means comprehensive, and it doesn’t cover the talented people Microsoft maintains in their deep stable who continue to drive the innovative boundaries of our industry.

Christian Belady of Microsoft – Here we go, first person mentioned and I already blow my own rule. I know Christian is still there at Microsoft, but it’s hard not to mention him as he is the public face of the program today. He was an innovative thinker before he joined the program at Microsoft and was a driving thought leader and thought provoker while I was there. While his industry-level engagements have been greatly sidelined as he steers the program into the future, he continues to be someone willing to throw everything we know and accept today into the wind to explore new directions.
Najam Ahmad of Facebook – You thought I was done talking about this incredible guy? Not in the least; few people have solved network infrastructure problems at scale like Najam has. With his recent work on the OCP front finally coming to the fore, he continues to drive forward the capabilities of what is possible. I remember long meetings with network vendors where Najam tried to influence capabilities and features with the box manufacturers within the paradigm of the time, and his work at Facebook is likely to land him in a position where he is both loved and reviled by the industry at large. If that doesn’t say you’re an industry heavyweight…nothing does.
James Hamilton of Amazon – There is no question that James continues to drive deep thinking in our industry. I remain an avid reader of his blog and follower of his talks. Back in my Microsoft days we would sit and argue philosophical issues around the approach to our growth, towards compute, towards just about everything. Those conversations either changed or strengthened my positions as the program evolved. His work in the industry while at Microsoft and beyond has continued to shape thinking around data centers, power, compute, networking and more.
Dan Costello of Google – Dan Costello now works at Google, but his impacts on the Generation 3 and Generation 4 data center approaches and the modular DC industry direction overall will be felt for a very long time to come, whether Google goes that route or not. Incredibly well balanced in his approach between technology and business, his ideas and talks continue to shape infrastructure at scale. I will spare people the story of how I hired him away from his previous employer, but if you ever catch me at a conference, it’s a pretty funny story. Not to mention the fact that he is the second best break dancer in the Data Center industry.
Nic Bustamonte of Google – Nic is another guy who has had some serious impact on the industry as it relates to innovating the running and operating of large scale facilities. His focus on the various aspects of the operating environments of large scale data centers, monitoring, and internal technology has shifted the industry and really set the infancy of DCIM in motion. Yes, BMS systems have been around forever, and DCIM is the next iteration and blending of that data, but his early work here has continued to influence thinking around the industry.
Arne Josefsberg of ServiceNow – Today Arne is the CTO of ServiceNow, focusing on infrastructure and management for enterprises and the big players alike, and if their overall success is any measure, he continues to impact the industry through results. He is *THE* guy who had the foresight to build an organization to adapt to this growing change of building and operating at scale. He is the architect of an amazing team that would eventually change the industry.
Joel Stone of Savvis/CenturyLink – Previously the guy who ran global operations for Microsoft, he has continued to drive excellence in operations at Global Switch and now at Savvis. An early adopter and implementer of blending facilities and IT organizations, he mastered issues a decade ago that most companies are still struggling with today.
Sean Farney of Ubiquity – Truly the first data center professional who ever had to productize and operationalize data center containers at scale. Sean has recently taken on the challenge of diversifying data center site selection and placement at Ubiquity, repurposing old neighborhood retail spaces (Sears, etc.) in the industry. Given the general challenges of finding places with a confluence of large scale power and network, this approach may prove to be quite interesting as markets continue to drive demand.
Chris Brown of Opscode – One of the chief automation architects in my time at Microsoft, he has moved on to become the CTO of Opscode. Everyone on the planet who is adopting and embracing DevOps has heard of, and is probably using, Chef. In fact, if you are doing any kind of automation at large scale you are likely using his code.
None of these people would be comfortable with the attention, but I do feel credit should be given to these amazing individuals who are changing our industry every day. I am so very proud to have worked in the trenches with these people. Life is always better when you are surrounded by those who challenge and support you, and in my opinion these folks have taken it to the next level.
\Mm

The Soft Whisper that Big Data, Cloud Computing, and Infrastructure at Scale should treat as a Clarion Call.

The Cloud Jail

On Friday, August 23rd, the Chinese government quietly released Shi Tao from prison.   He was released a full fifteen months before his incarceration was supposed to end.  While certainly a relief to his family and friends, it’s likely a bittersweet ending to a sour turn of events.

Just who is Shi Tao and what the heck does he have to do with Big Data?  Why is he important to Cloud Computing and big infrastructure?   Is he a world-class engineer who understands technology at scale?  Is he a deep thinker of all things cloud?  Did he invent some new technology poised to revolutionize and leap frog our understanding?

No.  He is none of these things.  

He is a totem of sorts. A living parable and a reminder of realities that many in the Cloud Computing industry, and those who deal with Big Data, rarely if ever address head on. He represents the cautionary tale of what can happen if companies and firms don’t fully vet the real-world impacts of their technology choices. The site selection of their data centers. The impact of how data is stored. Where that data is stored. The methods used to store the data. In short, a responsibility for the full accounting and consideration of their technological and informational artifacts.

To an engineering mind that responsibility generally means the most efficient storage of data with the least amount of cost.  Using the most direct method or the highest performing algorithm.  In short…to continually build a better mouse trap.  

In site selection for new data centers, it would likely be limited to just the basic real estate and business drivers. What is the power cost? What is the land cost? What is my access to water? Is there sufficient network nearby? Can I negotiate tax breaks at the country and/or local levels?

In selecting a cloud provider, it’s generally about avoiding large capital costs and paying for what I need, when I need it.

In the business landscape of tomorrow, these thoughts will prove short-sighted and may expose your company to significant cost and business risks it has not contemplated, or worse.

Big Data is becoming a dangerous game. To be fair, content and information in general have always been a bit of a dangerous game. In technology, we just go on pretending we live under a Utopian illusion that fairness ultimately rules the world. It doesn’t. Businesses take on an inherent risk in collecting, storing, analyzing, and using the data that they obtain. Does that sound alarmist or jaded? Perhaps, but it’s spiced with some cold hard realities that are becoming more present every day and that you ignore at your own peril.

Shi was arrested in 2004 and sentenced to prison the following year on charges of disclosing state secrets.  His crime? He had sent details of a government memo restricting news coverage to a human rights group in the United States.  The Chinese government demanded that Yahoo! (his mail provider) turn over all mail records (Big Data) to the authorities. Something they ultimately did.  

Now before you go and get your Western Democracy sensibilities all in a bunch and cry foul, that ugly cold hard reality thing I was talking about plays a real part here. As Yahoo was operating as a business inside China, they were bound to comply with Chinese law, no matter how hard the action was for them to stomach. Around that time Yahoo sold most of its stake in the Chinese market to Alibaba, and as of the last month or so Yahoo has left China altogether.

Yahoo’s adventure in data information risk and governmental oversight was not over, however. They were hauled before the US Congress over accusations of Human Rights violations, placing them once again into a pot of boiling water from a governmental concern closer to home.

These events took place almost seven years ago, and I would argue that the world of information, big data, and scaled infrastructure has actually gotten more convoluted and tricky to deal with. With the advent of Amazon AWS and other cloud services, and a lack of understanding of regional and local Safe Harbor practices amongst enterprises and startups alike, concepts like chain of custody and complicated, recursive ownership rights can be obfuscated to the point of insanity if you don’t have a program to manage it. We don’t have to use the example of China either; similar complexities are emerging across, and internal to, Europe. Is your company really thinking through Big Data? Do you fully understand ownership in a clouded environment? Who is responsible for taxation for your local business hosted internationally? What if your cloud servers, with your data, hosted on a cloud platform, were confiscated by local and regional governments without your direct involvement? Are you strategically storing data in a way that protects yourself? Do you even have someone looking at these risks to your business?
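To make those questions concrete, here is a minimal sketch of the kind of residency audit a "program to manage it" might start with. Every jurisdiction, data class, and policy rule below is a hypothetical example, not a statement of any actual law:

```python
# Illustrative sketch: checking where data actually lives against a residency
# policy. All jurisdictions, data classes, and rules are hypothetical.

# Policy: for each data class, the set of regions where it may be stored.
RESIDENCY_POLICY = {
    "customer_pii": {"EU", "US"},            # e.g. bound by Safe Harbor-style rules
    "financial_records": {"US"},             # e.g. must stay under home-country law
    "public_content": {"EU", "US", "APAC"},  # unrestricted content
}

def audit_placements(placements):
    """Return a list of (data_class, region) pairs that violate policy.

    `placements` maps each data class to the regions where copies actually live.
    """
    violations = []
    for data_class, regions in placements.items():
        allowed = RESIDENCY_POLICY.get(data_class, set())
        for region in regions:
            if region not in allowed:
                violations.append((data_class, region))
    return violations

# A geo-replicated cloud deployment may quietly place copies everywhere.
current = {
    "customer_pii": {"EU", "US", "APAC"},  # replication added an APAC copy
    "financial_records": {"US"},
}
print(audit_placements(current))  # -> [('customer_pii', 'APAC')]
```

The point is not the ten lines of Python; it is that without even this trivial level of bookkeeping, nobody in the organization can answer the questions above.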

As a recovering network engineer I am reminded of an old joke referring to the OSI Model. The OSI Model categorizes all functions of a communication system into seven logical layers. It makes internetworking clear, efficient, and easily categorized. Of course, as every good network engineer knows, it doesn’t account for layers 8 and 9. But wait! You said there were only 7! Well, Layers 8 and 9 are Politics and Religion. These layers exist in Cloud Computing and Big Data too, and are potentially more impactful to the business overall.

All of these scenarios do not necessarily lend themselves to the most direct or efficient path, but it’s pretty clear that you can save yourself a whole lot of time and heartache if you think about them strategically. The infrastructure of tomorrow is powerful, robust, and ubiquitous. You simply cannot manage this complex ecosystem the same way you have in the past, and just like the technology, your thinking needs to evolve.

\Mm

Insider Redux: Data Barn in a Farm Town

I thought I would start my first post by addressing the second New York Times article first. Why? Because it specifically mentions activities and messages sourced from me at the time when I was responsible for running the Microsoft Data Center program. I will try to track the timeline mentioned in the article with my specific recollections of the events, so that, as Paul Harvey used to say, you can know ‘the REST of the STORY’.

I remember my first visit to Quincy, Washington. It was a bit of a road trip for me and a few other key members of the Microsoft site selection team. We had visited a few of the local communities and power utility districts doing our due diligence on the area at large. Our ‘heat map’ process had led us to Eastern Washington state. Not very far (just a few hours) from the ‘mothership’ of Redmond, Washington. It was a bit of a crow-eating exercise for me, as just a few weeks earlier I had proudly exclaimed that our next facility would not be located on the West Coast of the United States. We were developing an interesting site selection model that would categorize and weight areas around the world. It would take in FEMA disaster data, fault zones, airport and logistics information, location of fiber optic and carrier presence, workforce distributions, regulatory and tax data, water sources, and power. This was going to be the first real construction effort undertaken by Microsoft. The cost of power was definitely a factor, as the article calls out. But just as important was the generation mix of the power in the area, in this case a predominance of hydroelectric. Low to no carbon footprint (though rivers, it turns out, actually give off carbon emissions, as I came to find out). Regardless, the generation mix was and would continue to be a hallmark of site selection for the program while I was there. The crow-eating began when we realized that the ‘greenest’ area per our methodology was actually located in Eastern Washington along the Columbia River.
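A "heat map" model of the sort described can be sketched as a simple weighted score per candidate region. The criteria, weights, and scores below are invented purely for illustration; the actual Microsoft model was far richer and its weightings were never public:

```python
# Illustrative weighted site-scoring sketch; all weights and scores invented.

WEIGHTS = {
    "power_cost": 0.20,
    "generation_mix": 0.20,   # share of low-carbon generation, e.g. hydro
    "fiber_presence": 0.15,
    "disaster_risk": 0.15,    # FEMA data, fault zones (higher score = safer)
    "tax_climate": 0.15,
    "water_access": 0.10,
    "workforce": 0.05,
}

def site_score(scores):
    """Weighted sum of per-criterion scores, each normalized to 0..1."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Hypothetical scores for a Quincy-like site: cheap hydro power, good fiber.
quincy = {"power_cost": 0.95, "generation_mix": 0.95, "fiber_presence": 0.8,
          "disaster_risk": 0.7, "tax_climate": 0.8, "water_access": 0.9,
          "workforce": 0.6}
print(round(site_score(quincy), 3))  # -> 0.845
```

Rank every candidate region this way and the "greenest" site falls out of the arithmetic, which is exactly how a West Coast skeptic can end up eating crow on the Columbia River.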

We had a series of meetings with real estate folks, the local Grant County PUD, and the economic development folks of the area. Back in those days the secrecy around who we were was paramount, so we kept our identities and that of our company secret. Like geeky secret agents on an information gathering mission, we would not answer questions about where we were from, who we were, or even our names. We ‘hid’ behind third party agents who took everyone’s contact information and acted as brokers of information. Those were early days…the cloak and dagger would soon be dropped from the process, as it became more advantageous to be known in tax negotiations with local and state governments.

During that trip we found the perfect parcel of land: 75 acres with great proximity to local substations, just down-line from the dams on the nearby Columbia River. It was November 2005. As we left that day and headed back, it was clear that we felt we had found site selection gold. As we started to prepare a purchase offer, we got wind that Yahoo! was planning on taking a trip out to the area as well. As the local folks seemingly thought that we were a bank or large financial institution, they wanted to let us know that someone on the Internet was interested in the area as well. This acted like a lightning rod, and we raced back to the area and locked up the land before Yahoo had a chance to leave the Bay Area. In those early days the competition was fierce. I have tons of interesting tales of cloak and dagger intrigue between Google, Microsoft, and Yahoo. While it was work, there was definitely an air of something big on the horizon. That we were all at the beginning of something. In many ways, many of the technology professionals involved, regardless of company, forged some deep relationships and rivalries with each other.

Manos on the bean field, December 2005

The article talks about how the ‘Gee-Whiz moment faded pretty fast’. While I am sure that it faded in time (as all things do), I also seem to recall the huge increase in local business as thousands of construction workers descended upon this wonderful little town, the tours we would give local folks and city council dignitaries, a spirit of truly working together. Then of course there was the ultimate reduction in property taxes resulting from even our first building, and an increase in home values to boot at the time. It’s an oft-missed benefit that I am sure the town of Quincy and Grant County have continued to enjoy as the data center cluster added Yahoo, Sabey, IAC, and others. I warmly remember the opening day ceremonies and ribbon cutting and a sense of pride that we did something good. Corny? Probably – but that was the feeling. There was no talk of generators. There were no picket signs; in fact, the EPA of Washington state had no idea how to deal with a facility of this size, and I remember openly working in partnership with them. That of course eventually wore off to the realities of life. We had a business to run, the city moved on, and concerns eventually arose.

The article calls out a showdown between Microsoft and the Power Utility District (PUD) over a fine for missing a capacity forecasting target. As this happened well after I left the company, I cannot really comment on that specific matter. But I can see how that forecast could miss. Projecting power usage months ahead is more than a bit of science mixed with art. It gets into the complexity of understanding capacity planning in your data centers. How big will certain projects grow? Will they meet expectations? Fall short? New product launches can be duds or massive successes. All of these things go into a model to try and forecast the growth. If you think this is easy, I would submit that NO ONE in the industry has been able to master the crystal ball. I would also submit that most small companies haven’t been able to figure it out either. At least at companies like Microsoft, Google, and others you can start using the law of large numbers to get close. But you will always miss, either too high or too low. Guess too low and you impact internal budgeting figures and run rates. Not good. Guess too high and you could fall victim to missing minimum contracts with utility companies and be subject to fines.
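To see why the crystal ball is so hard, here is a toy forecast (all figures invented) that rolls uncertain per-project estimates into a demand range. The width of that range is exactly the too-high/too-low trap: commit to the utility at the expected number and reality can land thousands of kilowatts away in either direction:

```python
# Toy capacity forecast: total power demand is the sum of many uncertain
# project estimates. All figures are invented for illustration.

def forecast_power(projects):
    """Sum per-project power estimates into (low, expected, high) totals in kW."""
    low = sum(p["kw"] * p["low_factor"] for p in projects)
    expected = sum(p["kw"] for p in projects)
    high = sum(p["kw"] * p["high_factor"] for p in projects)
    return low, expected, high

projects = [
    {"name": "search index", "kw": 4000, "low_factor": 0.9, "high_factor": 1.1},
    # A new product launch can be a dud or a massive success:
    {"name": "new product launch", "kw": 2000, "low_factor": 0.2, "high_factor": 2.5},
    {"name": "steady-state mail", "kw": 6000, "low_factor": 0.95, "high_factor": 1.05},
]

low, expected, high = forecast_power(projects)
print(round(low), expected, round(high))  # -> 9700 12000 15700
```

One speculative project with a wide factor range dominates the total uncertainty, which is why aggregating many workloads (the law of large numbers at Microsoft- or Google-scale) narrows the band but never closes it.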

In the case mentioned in the article, the approach taken, if true, would not be the smartest method, especially given the monthly electric bill for these facilities. It’s a cost of doing business and largely inconsequential at the level of consumption these buildings draw. Again, if true, it was a PR nightmare waiting to happen.

At this point the article breaks out and talks about how the Microsoft experience would feel more like dealing with old-school manufacturing rather than ‘modern magic’ and diverts to a situation at a Microsoft facility in Santa Clara, California.

The article references that this situation is still being dealt with inside California, so I will not go into any detailed specifics, but I can tell you something does not smell right in the state of Denmark, and I don’t mean the diesel fumes. Microsoft purchased that facility from another company. As the usage of the facility ramped up to the levels it was certified to operate at, operators noticed a pretty serious issue developing. While the building was rated to run at a certain load size, it was clear that the underground feeders were undersized, and the by-product could have polluted the soil and gotten into the water system. This was an inherited problem, and Microsoft did the right thing and took the high road to remedy it. It is my recollection that all sides were clearly in the know about the risks, and agreed to the generator usage whenever needed while the larger issue was fixed. If this has come up as an ‘air quality issue’, I personally would guess that there are politics at play. I’m not trying to be an apologist, but if true, it goes to show that no good deed goes unpunished.

At this point the article cuts back to Quincy. It’s a great town, with great people. To some degree it was the winner of the Internet jackpot lottery because of the natural tech resources it sits on. I thought the figures quoted around taxes were an interesting component missed in much of the reporting I read.

“Quincy’s revenue from property taxes, which data centers do pay, has risen from $815,250 in 2005 to a projected $3.6 million this year, paying for a library and repaved streets, among other benefits, according to Tim Snead, the city administrator.”

As I mentioned in yesterday’s post, my job is ultimately to get things done and deliver results. When you are in charge of a capital program as large as Microsoft’s was at the time, your mission is clear: deliver the capacity and start generating value for the company. As I was presented the last crop of beans harvested from the field at the ceremony, we still had some ways to go before all construction and capacity was ready to go. One of the key missing components was the delivery and installation of a transformer for one of the substations required to bring the facility up to full service. The article denotes that I was upset that the PUD was slow to deliver the capacity. Capacity, I would add, that was promised along a certain set of timelines; commitments were made and money was exchanged based upon those commitments. As you can see from the article, the money exchanged was not insignificant. If Mr. Culbertson felt that I was a bit arrogant in demanding follow-through on promises and commitments after monies and investments were made in a spirit of true partnership, my response would be ‘Welcome to the real world’. As far as being cooperative, by April the construction had already progressed 15 months since its start. Hardly a surprise, and if it was, perhaps the 11-acre building and large construction machinery driving around town could have been a clue to the sincerity of the investment and timelines. Harsh? Maybe. Have you ever built a house? If so, then you know you need to make sure that the process is tightly managed and controlled to ensure you make the delivery date.

The article then goes on to talk about the permitting for the diesel generators. By the Department of Ecology’s own admission, “At the time, we were in scramble mode to permit our first one of these data centers.” Additionally, it also states that:

Although emissions containing diesel particulates are an environmental threat, they were not yet classified as toxic pollutants in Washington. The original permit did not impose stringent limits, allowing Microsoft to operate its generators for a combined total of more than 6,000 hours a year for “emergency backup electrical power” or unspecified “maintenance purposes.”

At the time all this stuff was so new; everyone was learning together. I simply don’t buy that this was some kind of Big Corporation versus Little Farmer thing. I cannot comment on the events of 2010 where Microsoft asked for itself to be disconnected from the grid. Honestly, that makes no sense to me even if the PUD was working on the substation, and on that I would agree with the article’s ‘experts’.

Well, that’s my take on my recollection of events during those early days of the Quincy build-out as they relate to the articles. Maybe someday I will write a book, as the process and adventures of those early days of the birth of Big Infrastructure were certainly exciting. The bottom line is that the data center industry is amazingly complex, and the forces in play are as varied as technology, politics, people, and everything in between. There is always a deeper story. More than meets the eye. More variables. Decisions are never black and white and are always weighed against a dizzying array of forces.

\Mm

Patent Wars may Chill Data Center Innovation

Yahoo may have just sent a cold chill across the data center industry at large and begun a stifling of data center innovation. In a May 3, 2012 article, Forbes did a quick and dirty analysis of the patent wars between Facebook and Yahoo. It’s a quick read but shines an interesting light on the potential impact something like this can have across the industry. The article, found here, highlights that:

In a new disclosure, Facebook added in the latest version of the filing that on April 23 Yahoo sent a letter to Facebook indicating that Yahoo believes it holds 16 patents that “may be relevant” to open source technology Yahoo asserts is being used in Facebook’s data centers and servers.

While these types of patent infringement cases happen all the time in the corporate world, this one could have far greater ramifications for an industry that has only recently emerged into the light of sharing ideas. While details remain sketchy at the time of this writing, it’s clear that the specific call-out of data centers and servers is an allusion to more than just server technology or applications running in their facilities. In fact, there is a specific call-out of data centers and infrastructure.

With this revelation, one has to wonder about its impact on the Open Compute Project, which is being led by Facebook. It leads to some interesting questions. Has their effort to be more open in their designs and approaches to data center operations and design led them to a position of legal risk and exposure? Will this open the floodgates for design firms to become more aggressive around functionality designed into their buildings? Could companies use their patents to freeze competitors out of colocation facilities in certain markets by threatening colo providers with these types of lawsuits? Perhaps I am reaching a bit, but I never underestimate litigious fervor once the proverbial blood gets in the water.

In my own estimation, there is a ton of “prior art”, to use an intellectual property term, out there to settle this down long term, but the question remains: will firms go through that lengthy process to prove it out, or opt to re-enter their shells of secrecy?

After almost a decade of fighting to open up the collective industry to share technologies, designs, and techniques, this is a very disheartening move. The general Glasnost that has descended over the industry has led to real and material change.

We have seen the mental shift of companies moving from measuring facilities purely around “uptime” to a focus primarily on efficiency as well. We have seen more willingness to share best practices and find like-minded firms to share in innovation. One has to wonder: will this impact the larger “greening” of data centers in general? Without that kind of pressure, will people move back to what is comfortable?

Time will certainly tell. I was going to make a joke about the fact that until time proves it out I may have to “lawyer up” just to be safe. It’s not really a joke, however, because I’m going to bet other firms do something similar, and that, my dear friends, is how the innovation will start to freeze.

 

\Mm

The Cloud Politic – How Regulation, Taxes, and National Borders are shaping the infrastructure of the cloud

Most people think of ‘the cloud’ as a technical place defined by technology, the innovation of software leveraged across a scale of immense proportions, and ultimately a belief that its decisions are guided by some kind of altruistic technical meritocracy. At some levels that is true; at others, one needs to remember that the ‘cloud’ is ultimately a business. Whether you are talking about the Google cloud, the Microsoft cloud, the Amazon cloud, or Tom and Harry’s Cloud Emporium, each is a business that ultimately wants to make money. It never ceases to amaze me that in a perfectly solid technical or business conversation around the cloud, people will begin to wax romantic and lose sight of common sense. These are very smart, technically or business savvy people, but for some reason the concept of the cloud has been romanticized into something almost philosophical, a belief system, something that actually takes on the wispy characteristics the term conjures up.

When you try to bring them down to the reality that the cloud is essentially large industrial buildings full of computers, running applications that have achieved regional or even global geo-diversity and redundancy, you place yourself in a tricky spot that at best labels you a killjoy and at worst a blasphemer.

I have been reminded of late of a topic that I have been meaning to write about. As foreshadowed by my introduction above, some may find it profane; others will choose to ignore it, as it will cause them to come crashing to the ground. I am talking about the unseemly and terribly disjointed intersection of government regulation, taxes, and the cloud. This also loops in “the privacy debate”, which is almost a separate conversation all to itself. I hope to touch on privacy, but only as it touches these other aspects.

As many of you know, my roles past and present have focused on the actual technical delivery and execution of the cloud. The place where pure software developers fear to tread. The world of large scale design, construction, and operations specifically targeted at a global infrastructure deployment and its continued existence into perpetuity. Perpetuity, you say? That sounds a bit too grandiose, doesn’t it? My take is that once you have this kind of infrastructure deployed, it will become an integral part of how we as a species continue to evolve in our communications and our technological advances. Something this cool is powerful. Something this cool is a game changer. Something this cool will never escape the watchful eyes of the world’s governments, and in fact it hasn’t.

There was a recent article at Data Center Knowledge regarding Microsoft’s decision to remove its Azure cloud platform from the State of Washington and relocate it (whether virtually or physically) to be run in the state of Texas. Other articles have highlighted similar conversations with Yahoo and the state of Washington, or Google and the state of North Carolina. These decisions all have to do with state-level taxes and their potential impact on the upfront capital costs or long term operating costs of the cloud. You are essentially seeing the beginning of a cat and mouse game that will last for some time on a global basis. States and governments are currently using their blunt, imprecise instruments of rule (regulations and taxes) to try and regulate something they do not yet understand but know they need to play a part in. It’s no secret that technology is advancing faster than our society can gauge its overall impact or potential effects, and the cloud is no different.

In my career I have been responsible for the creation of at least three different site selection programs. These programs defined the criteria, and the relative weightings of those criteria, used to decide where cloud and data center infrastructure would reside. Through example and practice I have been able to deconstruct competitors' criteria and weightings, at least in comparison to my own, and a couple of things jump out very quickly at anyone truly studying this space. While most people can guess the need for adequate power and communications infrastructure, many are surprised that tax and regulation play such a significant role in even the initial siting of a facility. The reason is pure economics over the total lifetime of an installation.
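To make the idea concrete, here is a minimal sketch of how such a program's scoring might work. The criteria names, weights, and candidate sites below are all invented for illustration; real programs weigh far more factors, but the point survives: a site rich in power and fiber can still lose to one with a better tax and regulatory outlook.

```python
# Hypothetical site-selection scorer. Criteria and weights are
# illustrative only, not the actual weightings of any real program.
CRITERIA_WEIGHTS = {
    "power_cost": 0.25,       # $/kWh outlook, scored 0-10 (higher is better)
    "fiber_capacity": 0.20,   # available long-haul routes, scored 0-10
    "tax_climate": 0.30,      # sales/property tax exposure over facility life
    "regulatory_risk": 0.25,  # pending legislation, permitting friction
}

def score_site(scores: dict) -> float:
    """Weighted sum of normalized 0-10 criterion scores."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

# Two fictional candidate sites: site_a has the better power and fiber
# story, but site_b's tax and regulatory outlook carries the day.
site_a = {"power_cost": 9, "fiber_capacity": 9, "tax_climate": 3, "regulatory_risk": 4}
site_b = {"power_cost": 7, "fiber_capacity": 7, "tax_climate": 8, "regulatory_risk": 8}

print(round(score_site(site_a), 2))  # 5.95
print(round(score_site(site_b), 2))  # 7.55 -- site_b wins
```

The interesting knob is the weight mass sitting on tax and regulation: here more than half the score, which matches how heavily those factors figure in practice.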

I cannot tell you how often economic development councils or business development firms come to me to tell me about the next "great" data center location. Rich in both power infrastructure and telecommunications, close to institutions of higher learning, and so on. Indeed, there are some really great places that would seem ideal for data centers if one allowed them to dwell in the "romanticized cloud." What they fail to note, or understand, is that there may be legislation or regulation already on the books, or perhaps winding its way through the system, that could make the location inhospitable, or at least slightly less welcoming, to a data center. As someone responsible for tens of millions, hundreds of millions, or even billions of dollars' worth of investment, you find yourself in a role where you are reading and researching legislation often. Many have noted my commentary on the Carbon Reduction Commitment in the UK, or my more recent talks about the progress and data center impacts of the Waxman-Markey bill in the US House of Representatives. You pay attention because you have to pay attention. Your initial site selection is supremely important because you not only need to look for the "easy stuff" like power and fiber, but you need to look longer term at the overall commitment of a region to support this kind of infrastructure. Very large business decisions are being made against these "bets," so you had better get them right.

To be fair, the management teams at many of these cloud companies are learning as they go as well. Most of these firms are software companies that have now been presented with the dilemma of managing large scale capital assets. It's no longer just about intellectual property; it's about physical property, and there are some significant learning curves associated with that. Add to the mix that this whole cloud thing is entirely new.

One must also keep in mind that even with the best site selection program and the most robust upfront due diligence, people change, governments change, and rules change, and when that happens it can and will have very large impacts on the cloud. This is not something cloud providers are ignoring either. Whether it's through their software, their infrastructure, or a more modular approach, they are trying to solve for the eventuality that things will change. Think about the potential impacts from a business perspective.

Let's pretend you own a cloud and have just sunk $100M into a facility to house part of your cloud infrastructure. You spent lots of money on site selection and upfront due diligence to find the very best place to put a data center. Everything is going great: after five years you have a healthy population of servers in that facility and you have found a model to monetize your service. But then the locale where your data center lives changes the game a bit. It passes a law stating that servers engaged in the delivery of a service are a taxable entity. Suddenly that place becomes very inhospitable to your business model, and you have to worry about what that does to your business. It could be quite disastrous. Additionally, if you conclude that such a law would instantly impact your business negatively, you have the small matter of a $100M asset sitting in a region where you cannot use it. Again, a very bad situation. So how do you architect around this? It's a challenge that many people are trying to solve. Whether you want to face it or not, the "Cloud" will ultimately need to be mobile in its design. Just like its vapory cousins in the sky, the cloud will need to be on the move, even if it's a slow move. Because just as there are forces looking to regulate and control the cloud, there are also forces in play where locales are interested in attracting and cultivating it. It will be a cycle that repeats itself over and over again.
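To put rough numbers on that scenario, a back-of-the-envelope model makes the pain obvious. Every figure here is hypothetical, chosen only to show the shape of the problem: a modest per-server tax, compounded across a fleet and a decade, can dwarf the original build cost.

```python
# Hypothetical economics of the "servers become taxable" scenario.
# All figures are invented for illustration.
FACILITY_CAPEX = 100_000_000     # sunk cost, already spent
SERVERS = 50_000                 # fleet housed in the facility
ANNUAL_OPEX_PER_SERVER = 600     # power, cooling, staff ($/yr)
NEW_TAX_PER_SERVER = 250         # the new per-server levy ($/yr)
YEARS_REMAINING = 10             # remaining useful life of the site

def remaining_cost(tax_per_server: int) -> int:
    """Total operating cost over the facility's remaining life."""
    return SERVERS * (ANNUAL_OPEX_PER_SERVER + tax_per_server) * YEARS_REMAINING

before = remaining_cost(0)
after = remaining_cost(NEW_TAX_PER_SERVER)
tax_burden = after - before
print(tax_burden)  # 125000000: the tax alone exceeds the original $100M build
```

The asymmetry is the point: the law costs you more than the building did, yet the building is the part you cannot move.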

So far we have looked at this mostly from a taxation perspective, but there are other regulatory forces in play. I will use the example of Canada, our friendly, frosty neighbor in the great white north above the United States. It's safe to say that Canada and the US have historically had wonderful relations with one another. However, when one looks through the "Cloud"-colored looking glass, some things jump to the fore.

In response to the Patriot Act legislation after 9/11, the Canadian government became concerned with the rights given to the US government with regard to the seizure of online information. It in turn passed a series of Safe-Harbor-like laws stating that no personally identifiable information of Canadian citizens could be housed outside of Canadian borders. Other countries have passed, or are in the process of passing, similar laws. This means that at least some aspects of the cloud will need to be anchored regionally or within specific countries. A boat can drift even if it's anchored, and so can components of the cloud; its infrastructure and design will need to accommodate this. This touches on the privacy issue I mentioned before. I don't want to get into the more esoteric conversations of information and where it is allowed to live; I try to stay grounded in the fact that, whether my romantic friends like it or not, this type of thing is going to happen and the cloud will need to adapt.
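Here is a sketch of what that anchoring might look like in placement logic. The region names, country codes, and rule table are hypothetical; the point is that residency-bound data gets a pinned set of legal homes while everything else can roam.

```python
# Hypothetical residency-aware placement rule. Region names and the
# rule table are invented; only the Canadian-style constraint is real.
RESIDENCY_RULES = {
    "CA": {"ca-central"},             # Canadian PII must stay in Canada
    "DE": {"eu-west", "eu-central"},  # illustrative EU-style constraint
}
DEFAULT_REGIONS = {"us-east", "us-west", "eu-west", "ca-central"}

def allowed_regions(citizenship: str, data_class: str) -> set:
    """Regions where a record may legally be stored.

    PII is pinned to the citizen's residency jurisdictions; everything
    else (logs, clickstream, caches) may drift anywhere in the fleet.
    """
    if data_class == "pii" and citizenship in RESIDENCY_RULES:
        return RESIDENCY_RULES[citizenship]
    return DEFAULT_REGIONS

print(allowed_regions("CA", "pii"))          # {'ca-central'}
print(allowed_regions("CA", "clickstream"))  # the full default set
```

The anchored-boat metaphor falls out naturally: the PII record is moored to `ca-central`, but the rest of that user's footprint can still move with the cloud.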

It's important to note that none of this legislation focuses on "the cloud" or "data centers" just yet. Even though the Waxman-Markey bill and the CRC in the UK don't specifically call out data centers, those laws will still have significant impacts on the infrastructure and shape of the cloud itself.

There is an interesting chess match developing between technology and regulation. They are inexorably intertwined, and each will shape the form of the other in many ways. A giant cat and mouse game on a global level. Almost certainly, this evolution won't produce the most "technically superior" solution. In fact, these complications will make the cloud a confusing place at times. If you wanted to build your own application using only cloud technology, would you subscribe to a service that lets the cloud providers handle these complications for you? Would you and your application be liable for regulatory failures in the storage of Azerbaijani nationals' data? It's going to be an interesting time for the cloud moving forward.

One can easily imagine personally identifiable information housed in countries of origin, while the technology evolves so that users' actions on the web are held elsewhere, perhaps regionally where the actions take place. You would then see new legislation emerge to combat even that strategy, and so the cycle would continue. Likewise, you might see certain types of compute or transaction work moving around the planet to align with more technically savvy or advantageous locales. Just as the concept of "Follow the Moon" has emerged as a potential energy savings strategy, moving load around based on the lowest cost energy, it might someday be followed by a program that similarly moves information or work to more "friendly" locales. The modularity movement in data center design will likely grow as well, reducing the overall exposure cloud firms have in any given market or region.
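A toy version of that "Follow the Moon"-style placement loop might look like the following. Every region name, energy price, and friendliness score below is invented; the interesting part is that the regulatory filter can override the pure energy-cost answer.

```python
# Hypothetical regulatory-aware "follow the moon" placement.
# Region names, prices ($/kWh), and friendliness scores are invented.
REGIONS = {
    "quincy":      {"energy_cost": 0.030, "friendliness": 0.6},  # cheap power,
                                                                 # but pending legislation
    "san_antonio": {"energy_cost": 0.045, "friendliness": 0.9},
    "dublin":      {"energy_cost": 0.060, "friendliness": 0.8},
}

def place_batch_work(min_friendliness: float = 0.7) -> str:
    """Pick, each scheduling interval, the cheapest locale that clears
    a regulatory-friendliness bar."""
    candidates = {
        name: info for name, info in REGIONS.items()
        if info["friendliness"] >= min_friendliness
    }
    # Among acceptable locales, chase the lowest energy price.
    return min(candidates, key=lambda name: candidates[name]["energy_cost"])

print(place_batch_work())     # san_antonio: quincy is cheaper but filtered out
print(place_batch_work(0.5))  # quincy: relax the bar and cheap power wins
```

Run this loop every few hours with live prices and legislative risk scores and you get exactly the slow-moving, regulation-chasing cloud described above.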

On this last note, I am reminded of one of my previous posts. I am firm in my belief that data centers will ultimately become the Sub-Stations of the information utility. In that evolution they will become more industrial and more commoditized, with more intelligence at the software layer to account for all these complexities. As my own thoughts and views evolve around this, I have come to my own strange epiphany.

Ultimately the large cloud providers should care less and less about the data centers they live in. These will be software layer attributes to program against; business level modifiers on code distribution. Data centers should be immaterial to the cloud providers: nothing more than containers or folders in which to drop their operational code. Today they are burning through tremendous amounts of capital believing that these facilities will ultimately give them strategic advantage. Those advantages will be fleeting and short-lived. They will soon find themselves in a place where the facilities themselves become a drag on their balance sheets or force them to keep investing in aging assets.
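If data centers really do become mere folders to drop code into, the business layer might program against them roughly like this. Every class, field, and value below is invented to illustrate the idea, not drawn from any real provider's API.

```python
# Sketch of the "data centers as folders" idea: deployment targets are
# anonymous bundles of attributes, not named buildings.
from dataclasses import dataclass

@dataclass
class Container:
    """A data center reduced to the attributes the business layer cares about."""
    region: str
    cost_index: float   # relative cost per unit of work
    residency_ok: bool  # may hold residency-regulated data

def deploy(residency_bound: bool, containers: list) -> Container:
    """Drop code into the cheapest container that satisfies the
    workload's business-level modifiers."""
    eligible = [c for c in containers if c.residency_ok or not residency_bound]
    return min(eligible, key=lambda c: c.cost_index)

fleet = [
    Container("us-east", 1.0, residency_ok=False),
    Container("ca-central", 1.3, residency_ok=True),
]
print(deploy(True, fleet).region)   # ca-central: the modifier overrides cost
print(deploy(False, fleet).region)  # us-east: pure cost wins
```

Notice that no workload ever names a building: swap a container out of the fleet list and the code redistributes itself, which is exactly why the facility stops being a strategic asset.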

Please don't get me wrong: the cloud providers have been instrumental in pushing this lethargic industry into thinking differently and evolving. For that you need to give them appropriate accolades. At some point, however, this is bound to turn into a losing proposition for them.

How’s that for Blasphemy?

\Mm