Industry Impact: Brothers from Different Mothers and Beyond…


My reading material and video watching habits these past two weeks have brought me some incredible joy and happiness. Why? Because Najam Ahmad of Facebook is finally getting some credit for the amazing work that he has done and continues to do in the world of Software Defined Networking. In my opinion Najam is a force majeure in the networking world. He is passionate. He is focused. He just gets things done. Najam and I worked very closely at Microsoft as we built out and managed the company's global infrastructure. So closely, in fact, that we were frequently referred to as brothers from different mothers. Wherever Najam was, I was not far behind, and vice versa. We laughed. We cried. We fought. We had a lot of fun while delivering some pretty serious stuff. To find out that he is behind the incredible Open Compute Project advances in networking is not surprising at all. Always a forward-thinking guy, he has never been satisfied with the status quo.
If you have missed any of that coverage, I strongly encourage you to have a read at the links below.



This got me to thinking about the legacy of the Microsoft program on the cloud and infrastructure industry at large. Data Center Knowledge ran an article covering the impact of some of the Yahoo alumni a few years ago. Many of those folks are friends of mine and deserve great credit. In fact, Tom Furlong now works side by side with Najam at Facebook. The purpose of my thoughts is not to take away from their achievements and impacts on the industry, but rather to highlight the impact of some of the amazing people and alumni from the Microsoft program. It's a long overdue acknowledgement of the legacy of that program and how it has been a real driving force in large scale infrastructure. The list of folks below is by no means comprehensive, and it doesn't cover the talented people Microsoft maintains in their deep stable who continue to push the innovative boundaries of our industry.

Christian Belady of Microsoft – Here we go: first person mentioned and I already blow my own rule. I know Christian is still there at Microsoft, but it's hard not to mention him as he is the public face of the program today. He was an innovative thinker before he joined the program at Microsoft and was a driving thought leader and thought provoker while I was there. While his industry-level engagements have been greatly sidelined as he steers the program into the future, he continues to be someone willing to throw everything we know and accept today into the wind to explore new directions.
Najam Ahmad of Facebook – You thought I was done talking about this incredible guy? Not in the least; few people have solved network infrastructure problems at scale like Najam has. With his recent work on the OCP front finally coming to the fore, he continues to push the boundaries of what is possible. I remember long meetings with network vendors where Najam tried to influence capabilities and features with the box manufacturers within the paradigm of the time, and his work at Facebook is likely to land him in a position where he is both loved and reviled by the industry at large. If that doesn't say you're an industry heavyweight…nothing does.
James Hamilton of Amazon – There is no question that James continues to drive deep thinking in our industry. I remain an avid reader of his blog and follower of his talks. Back in my Microsoft days we would sit and argue philosophical issues around the approach to our growth, towards compute, towards just about everything. Those conversations either changed or strengthened my positions as the program evolved. His work in the industry, while at Microsoft and beyond, has continued to shape thinking around data centers, power, compute, networking and more.
Dan Costello of Google – Dan Costello now works at Google, but his impacts on the Generation 3 and Generation 4 data center approaches, and on the modular DC industry direction overall, will be felt for a very long time to come whether Google goes that route or not. Incredibly well balanced in his approach between technology and business, his ideas and talks continue to shape infrastructure at scale. I will spare people the story of how I hired him away from his previous employer, but if you ever catch me at a conference, it's a pretty funny story. Not to mention the fact that he is the second best break dancer in the data center industry.
Nic Bustamonte of Google – Nic is another guy who has had some serious impact on the industry as it relates to innovating the running and operating of large scale facilities. His focus on the various aspects of the operating environments of large scale data centers, monitoring, and internal technology has shifted the industry and really set the infancy of DCIM in motion. Yes, BMS systems have been around forever, and DCIM is the next iteration and blending of that data, but his early work here has continued to influence thinking around the industry.
Arne Josefsberg of ServiceNow – Today Arne is the CTO of ServiceNow, focusing on infrastructure and management for enterprises and big players alike, and if their overall success is any measure, he continues to impact the industry through results. He is *THE* guy who had the foresight of building an organization to adapt to this growing change of building and operating at scale. He is the architect of building an amazing team that would eventually change the industry.
Joel Stone of Savvis/CenturyLink – Previously the guy who ran global operations for Microsoft, he has continued to drive excellence in operations at Global Switch and now at Savvis. An early adopter and implementer of blending facilities and IT organizations, he mastered issues a decade ago that most companies are still struggling with today.
Sean Farney of Ubiquity – Truly the first data center professional who ever had to productize and operationalize data center containers at scale. Sean has recently taken on the challenge of diversifying data center site selection and placement at Ubiquity, repurposing old neighborhood retail spaces (Sears, etc.) in the industry. Given the general challenges of finding places with a confluence of large scale power and network, this approach may prove to be quite interesting as markets continue to drive demand.
Chris Brown of Opscode – One of the chief automation architects during my time at Microsoft, he has moved on to become the CTO of Opscode. Everyone on the planet who is adopting and embracing DevOps has heard of, and is probably using, Chef. In fact, if you are doing any kind of automation at large scale you are likely using his code.
None of these people would be comfortable with the attention, but I do feel credit should be given to these amazing individuals who are changing our industry every day. I am so very proud to have worked the trenches with these people. Life is always better when you are surrounded by those who challenge and support you, and in my opinion these folks have taken it to the next level.
\Mm

Google Purchase of Deep Earth Mining Equipment in Support of 'Project Rabbit Ears' and Worldwide WIFI Availability…

(10/31/2013 – Mountain View, California) – Close examination of Google's data center construction-related purchases has revealed the procurement of large scale deep earth mining equipment. While the actual need for the deep mining gear is unclear, many speculate that it has to do with a secretive internal project that has come to light, known only as Project Rabbit Ears.

According to sources not at all familiar with Google technology infrastructure strategy, Project Rabbit Ears is the natural outgrowth of Google's desire to provide ubiquitous infrastructure worldwide. On the surface, these efforts seem consistent with other incorrectly speculated projects such as Project Loon, Google's attempt to provide Internet services to residents in the upper atmosphere through the use of high altitude balloons, and a project that has only recently become visible and the source of much public debate, known as 'Project Floating Herring', in which a significantly sized floating barge with modular container-based data centers has been spied sitting in the San Francisco Bay.

“You will notice there is no power or network infrastructure going to any of those data center shipping containers,” said John Knownothing, chief Engineer at Dubious Lee Technical Engineering Credibility Corp.  “That’s because they have mastered wireless electrical transfer at the large multi-megawatt scale.” 

Real estate rates in the Bay Area have increased almost exponentially over the last ten years, making the construction of large scale data center facilities an expensive endeavor. During the same period, the Port of San Francisco has unfortunately seen a steady decline of its import/export trade. After a deep analysis it was discovered that docking fees in the Port of San Francisco are considerably undervalued and will provide Google with an incredibly cheap real estate option in one of the most expensive markets in the world.

It will also allow them to expand their use of renewable energy through the use of tidal power generation built directly into the barge's hull. "They may be able to collect as much as 30 kilowatts of power sitting on the top of the water like that," continues Knownothing, "and while none of that technology is actually visible, possible, or exists, we are certain that Google has it."

While the technical intricacies of the project fascinate many, the initiative does have its critics, like Compass Data Center CEO Chris Crosby, who laments the potential social aspects of this approach: "Life at sea can be lonely, and no one wants to think about what might happen when a bunch of drunken data center engineers hit port." Additionally, Crosby mentions the potential for a backslide in human rights: "I think we can all agree that the prospect of being flogged or keelhauled really narrows down the possibility for those outage-causing human errors. Of course, this sterner level of discipline does open up the possibility of mutiny."

However, the public launch of Project Floating Herring will certainly need to await the delivery of the more shrouded Project Rabbit Ears for various reasons. Most specifically, the primary reason for the development of this technology is so that Google can ultimately drive the floating facility out past twelve miles into international waters, where it can then dodge all national, regional, and local taxation, as well as the safe harbor and privacy legislation of any country or national entity on the planet that would use its services. In order to realize that vision, in the current network paradigm, Google would need exceedingly long network cables to attach to Network Access Points and Carrier Connection Points as the facilities drive through international waters.

This is where Project Rabbit Ears becomes critical to the Google strategy. Making use of the deep earth mining equipment, Google will be able to drill deep into the Earth's crust, into the mantle, and ultimately build a large Network Access Point near the Earth's core. This planetary WIFI solution will be centrally located to cover the entire earth without the use of regional WIFI repeaters. Google's floating facilities could then gain access to unlimited bandwidth and provide yet another consumer-based monetization strategy for the company.

Knownothing also speculates that such a move would allow Google to make use of enormous amounts of free geothermal power and almost singlehandedly become the greenest power user on the planet. Speculation also abounds that Google could then sell that power through its as-yet-un-invented large scale multi-megawatt wireless power transfer technology, as unseen on its floating data centers.

Much of the discussion around this kind of technology innovation driven by Google has been given credible amounts of veracity and been discussed by many seemingly intelligent technology news outlets and industry organizations who should intellectually know better, but prefer not to acknowledge the inconvenient lack of evidence.

 

\Mm

Editor's Note: I have many close friends in the Google infrastructure organization and firmly believe that they are doing some amazing, incredible work in moving the industry along, especially solving problems at scale. What I find simply amazing is how often, in the search for innovation, our industry creates things that may or may not be there and convinces itself so firmly that they exist.

2014: The Year Cloud Computing and Internet Services Will Be Taxed. A.K.A. Je déteste dire ça. Je vous l'avais dit. ("I hate to say it. I told you so.")

 


It's one of those times I really hate to be right. As many of you know, I have been talking about the various grassroots efforts afoot across many of the EU member countries to start driving a more significant tax regime on Internet-based companies. My predictions over the last few years have been more cautionary tales, based on what I saw happening from a regulatory perspective on a much smaller scale, country to country.

Today's Wall Street Journal has an article discussing France's movements to begin taxation of Internet-related companies who derive revenue from users and companies across the entirety of the EU, holding those companies responsible to the tax base in each country. This means such legislation is likely to become quite fractured and tough for Internet companies to navigate. The French proposition is asking the European Commission to draw up proposals by the spring of 2014.

This is likely to have a very interesting impact (read: cost increases) across just about every aspect of Internet and cloud computing resources. From a business perspective this is going to increase costs, which will likely be passed on to consumers in small but interesting ways. Internet advertising will need to be differentiated on a country by country basis, and advertisers will end up having different cost structures. Cloud computing companies will DEFINITELY need to understand where customer instances were running and whether or not they were making money. Potentially more impactful, customers of cloud computing may be held to account for taxation obligations they did not know they had! Things like data center site selection are likely to become even more complicated from a tax analysis perspective, as countries with higher populations may become no-go zones (perhaps) or require the passage of even more restrictive laws.

It's not like the seeds of this haven't been around since 2005; I think most people just preferred to turn a blind eye to the fact that the seed was sprouting into a full-fledged tree. Going back to my Cat and Mouse Papers from a few years ago…the cat has caught the mouse; it's now the mouse's move.

\Mm

 

Author's Note: If you don't have a subscription to the WSJ, All Things Digital did a quick synopsis of the article here.

Familiarity Breeds "Unsent" – The Modern Email Problem at Scale

Everyone has heard the old saying "familiarity breeds contempt", so I thought it apropos to introduce it as a less understood concept when it comes to large scale Internet mail challenges. The "spark" for me was a copy of a TechCrunch article entitled "Why No one has tamed e-mail" by Gentry Underwood that was forwarded to me this morning. It's a really good article and highlights some of the challenges the mail space is going through / has been through / continues to go through. Not sure if it really matters, but for full disclosure TechCrunch is an AOL company, although the content was generated by the guest author.

The article highlights many of the challenges with mail today and how it is broken, or more correctly how it has become broken over time: the challenges around SPAM, the content of mail being long and convoluted, most e-mail clients being absolutely terrible, and the like. The author also correctly highlights that it's a hard problem to solve from a technology, design, and business perspective. As good a primer as I think this article is, there are some larger issues at play here to truly fix the mail problem.

 

First Problem – This is not an Email Problem

E-Mail is just a symptom of a larger set of challenges. I would posit that the problem space itself has evolved and morphed from e-mail into a "Digital Lifestyle Organization" (DLO) issue. Solving issues relating to e-mail only solves part of the problem space. The real challenge is that users are now being bombarded by messages, information, news, and the like from tons of different sources. The noise is increasingly drowning out the signal, and the result is that people's frustrations grow increasingly caustic. Yes, there is Spam. But there is an equal amount of opt-in notes and messages, social updates from various social media platforms, RSS feeds, news updates, and a list of other categories. It can all be a bit mind-boggling and off-putting. The real question should not be how to solve the e-mail problem, but rather how to solve the Digital Lifestyle Organization problem. Mail is simply a single element in the larger problem.
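To make the aggregation problem a bit more concrete, here is a minimal, purely illustrative sketch of what a DLO-style unified stream might look like. The sources, fields, and scoring below are invented assumptions for illustration only, not a description of any actual product:

```python
# A toy "Digital Lifestyle Organization" aggregator: merge messages from
# several sources into one stream and rank them by a crude relevance score.
# All sources, fields, and scoring rules here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Item:
    source: str      # "email", "social", "rss", ...
    sender: str
    subject: str
    is_opt_in: bool  # newsletters / notifications the user subscribed to

def relevance(item: Item, close_contacts: set) -> float:
    """Crude scoring: personal mail first, then opt-in, then the firehose."""
    score = 1.0
    if item.sender in close_contacts:
        score += 2.0
    if item.source == "email" and not item.is_opt_in:
        score += 1.0
    return score

inbox = [
    Item("email", "mom@example.com", "Dinner Sunday?", False),
    Item("social", "acquaintance", "liked your photo", True),
    Item("rss", "technews", "Top 10 gadgets", True),
    Item("email", "deals@example.com", "50% off!", True),
]

contacts = {"mom@example.com"}
for item in sorted(inbox, key=lambda i: relevance(i, contacts), reverse=True):
    print(f"{relevance(item, contacts):.1f}  [{item.source}] {item.subject}")
```

The hard part, of course, is not the merge itself but getting the relevance model right for very different kinds of users, which is exactly where the next problem comes in.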

That's not to say the points in the article are wrong or can't be applied to this wider definition. You still face a huge design, interface, and workflow challenge trying to organize this kind of data. The client, in whatever form it takes, must be easy to use and intuitive; a quality that is elusive in the e-mail space to be sure, and likely to be even rarer in this new "DLO" world.

Second Problem – There is comfort in an old pair of Shoes

One of the interesting things I have learned here at AOL since taking on the role of CTO is that there are actually strata of different kinds of users. I would place myself in the category of power user, willing to try and jump platforms for better look/feel, even for minor differentiated features and capabilities. This layer of "Technorati" is constantly looking for the next best thing. There are other strata, however, that don't embrace that kind of rapid change, or tolerate only moderate change, and in fact there are layers that don't like any change at all! It's a complicated set of issues and engineering challenges. At a rough business level, you can lose users because you change too much or don't change at all. Some of my friends in the space consider this a timing issue of when to perfectly time these transitions. The facts are, however, that these different strata change at their own pace, and we should understand that certain strata will never change. In essence you have a few options: introduce a platform that changes regardless of the user base's desires, stay stagnant and never change, or try to find some kind of hybrid solution knowing that it may ultimately increase some of the cost structures around supporting different kinds of environments. Unless, of course, you build something new that is so compelling as to change the game entirely.

Candidly speaking, this is an issue that AOL has struggled with longer than just about anyone else in the space. Historically speaking, we have oscillated across all of these solutions over close to 30 years with varying degrees of success. While we can debate what 'success' looks like, there is no denying that we have been looking at the science of e-mail for a long time. With a loyal user base of tens of millions to hundreds of millions of users, it provides us with a rich data set to analyze behaviors, usage patterns, user feedback, costs, trends, and the like. It's this data that highlights the different usage patterns and strata of users. It's data that is impossible to get as a start-up, and so immensely massive that categorization is no trivial task.

The process we use in terms of understanding these strata could best be described as 'electronic ethnography'. There are some interesting differentiations in how mail, and the Digital Lifestyle in aggregate, is used across a variety of variables. Things like age, gender, location, locale, friends, social circles, and a host of others all play a role in defining the strata. Put simply, there are some strata that just don't want to change. Others are very comfortable with their current e-mail experience and don't see a driving need to change, etc.

 

An example of Electronic Ethnography

This essentially nets out to the fact that e-mail and information aggregation is a very complex space. We must face the fact that there will be segments of users that will simply not change just because something cooler has come along or some features were added or modified. In my personal opinion the only way to truly impact and change these behaviors is to come up with something so compelling, so different, that it changes the game and solves the DLO issues.
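As a purely illustrative aside, strata-finding of this kind can be framed as a clustering problem over per-user behavioral features. The features, numbers, and the choice of k-means below are invented for illustration and do not reflect the actual variables or methods behind our analysis:

```python
# A toy sketch of "electronic ethnography": clustering users into strata
# based on behavioral features. Feature names and values are made up;
# a real pipeline would use far richer signals and more careful methods.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-user features: [logins/week, avg session minutes,
# fraction of mail read on mobile, years on the platform]
users = np.array([
    [21, 45, 0.90, 2],   # heavy, mobile-first, newer user
    [3, 10, 0.10, 15],   # light, desktop, long-tenured user
    [14, 30, 0.50, 8],
    [2, 8, 0.00, 18],
    [25, 50, 0.95, 1],
    [4, 12, 0.05, 16],
])

# Normalize each feature so no single scale dominates the distance metric.
normalized = (users - users.mean(axis=0)) / users.std(axis=0)

# Ask for three strata; in practice you would sweep k and inspect clusters.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(normalized)
for label in sorted(set(kmeans.labels_)):
    members = users[kmeans.labels_ == label]
    print(f"Stratum {label}: {len(members)} users, "
          f"mean logins/week = {members[:, 0].mean():.1f}")
```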

 

Third Problem – BIG and Complex

While this was called out by Gentry Underwood in his article, it cannot be stated enough. Whether you focus on mail, the larger DLO space, or any kind of personal information aggregation, there are a host of factors, variables, and challenges to really solve this space. It also drives big infrastructure, operations, and the like. It's not going to be easy. As the TechCrunch article headlines: Go Big or Go Home. It's a huge problem space and a huge opportunity, and half measures may help, but in and of themselves they won't do much to move the needle. Mail, and what I call the DLO space, is a huge opportunity in the future of our usage of the Internet medium; in fact it may be the biggest. There will likely continue to be many casualties trying to solve it.

 

Fourth Problem – Monetization

From a pure business perspective, it's hard to make money off of mail. The most common way to monetize mail (or aggregated information) is advertising. However, advertising has a direct negative impact on the overall user experience and is a key driver of user loss. You can easily see this in the overall reduction of "advertising" in mail across a number of key players. Another method is tying it to an overall paid user subscription. But this is challenging as well: are the features and overall "stickiness" of your product something that customers will see continued value in? At AOL we have both models in use. Interestingly, we have users in both models that fall into the strata that consider "change" bad. As mentioned in the third problem, mail is a big problem, and it will require some kind of monetization scheme to justify the effort. While the larger players have existing user bases to help with this challenge, it's a real issue for some of the more innovative ideas coming out of smaller start-ups, and is likely a key reason for their potential demise. The person or firm who comes up with a non-ad/non-subscription based monetization strategy will truly change the game in this space.

 

With Google’s purchase of Sparrow, the re-design of Microsoft’s Outlook product, some interesting announcements that we have coming out, and a small explosion of start-ups in this space – Things are starting to get interesting.  Hard.  But interesting for sure.  May have to post more on this in the near future.

 

\Mm

Rolling Clouds – My Move into the Mobile Cloud

As many of you saw in my last note, I have officially left Digital Realty Trust to address some personal things. While I get those things in order, I am not sitting idly by. I am extremely happy to announce that I have taken a role at Nokia as their VP of Service Operations. In this role I will have global responsibility for the strategy, operation, and run of the infrastructure aspects of Nokia's new cloud and mobile services platforms.

It's an incredibly exciting role, especially when you consider that mobile handhelds around the world are increasingly becoming the interface by which people consume information. Whether through navigation-based applications or other content-related platforms, your phone is becoming your gateway to the world.

I am also very excited by the fact that there are some fierce competitors in this space. Once again I will be donning my armor and doing battle with my friends at Google. Their Android platform is definitely interesting and it will be fascinating to see how it develops. I have a great amount of respect for Urs Hölzle, and their cloud platform is something I am fairly familiar with. I will also be doing battle with the folks from Apple (and, interestingly, my good friend Olivier Sanche). Apple definitely has the high-end handheld market here in the US, but its experience in cloud platforms and operations is not very sophisticated just yet. On some levels I guess I am even competing against the infrastructure and facilities I built out at Microsoft, at least as it relates to the mobile world. Those are some meaty competitors, and as you have seen before, I love a good fight.

In my opinion, Nokia has some very interesting characteristics that position it extremely well, if not atop the fray, in this space. First, there is no arguing about Nokia's penetration of handheld devices across the world, especially in markets like India, China, South America, and other emerging Internet-using populations. Additionally, these emerging economies are skipping past ground-based wired technologies straight to wireless connectivity. As a result, Nokia already has an incredible presence in those markets. Their OVI platform today already has a significant population of users (measured at least in the tens of millions), so scale at the outset is definitely there. When I think about the challenge that Google has in getting device penetration out there, or Apple's high-end (and mostly US-only) approach, you can see the opportunity. I am extremely excited to get going.

Hope you will join me for an incredible ride!

\Mm

The Cloud Politic – How Regulation, Taxes, and National Borders are shaping the infrastructure of the cloud

Most people think of 'the cloud' as a technical place defined by technology, the innovation of software leveraged across a scale of immense proportions, and ultimately a belief that its decisions are guided by some kind of altruistic technical meritocracy. At some levels that is true; at others one needs to remember that the 'cloud' is ultimately a business. Whether you are talking about the Google cloud, the Microsoft cloud, the Amazon cloud, or Tom and Harry's Cloud Emporium, each is a business that ultimately wants to make money. It never ceases to amaze me that in a perfectly solid technical or business conversation around the cloud, people will begin to wax romantic and lose sight of common sense. These are very smart, technically or business savvy people, but for some reason the concept of the cloud has been romanticized into something almost philosophical, a belief system, something that takes on the wispy characteristics the term itself conjures up.

When you try to bring them down to the reality that the cloud is essentially large industrial buildings full of computers, running applications that have achieved regional or even global geo-diversity and redundancy, you place yourself in a tricky spot that at best labels you a killjoy and at worst a blasphemer.

I have been reminded of late of a topic that I have been meaning to write about. As suggested by my introduction above, some may find it profane; others will choose to ignore it, as it will cause them to come crashing to the ground. I am talking about the unseemly and terribly disjointed intersection of government regulation, taxes, and the cloud. This also loops in 'the privacy debate', which is a separate conversation almost all its own. I hope to touch on privacy, but only as it touches these other aspects.

As many of you know, my roles past and present have focused on the actual technical delivery and execution of the cloud; the place where pure software developers fear to tread. The world of large scale design, construction, and operations, specifically targeted at a global infrastructure deployment and its continued existence into perpetuity. Perpetuity, you say? That sounds a bit too grandiose, doesn't it? My take is that once this kind of infrastructure is deployed, it will become an integral part of how we as a species continue to evolve in our communications and our technological advances. Something this cool is powerful. Something this cool is a game changer. Something this cool will never escape the watchful eyes of the world's governments, and in fact it hasn't.

There was a recent article at Data Center Knowledge regarding Microsoft's decision to remove its Azure cloud platform from the State of Washington and relocate it (whether virtually or physically) to be run in the state of Texas. Other articles have highlighted similar conversations involving Yahoo and the state of Washington, or Google and the state of North Carolina. These decisions all have to do with state-level taxes and their potential impact on the upfront capital costs or long-term operating costs of the cloud. You are essentially seeing the beginning of a cat and mouse game that will last for some time on a global basis. States and governments are currently using their blunt, imprecise instruments of rule (regulations and taxes) to try to regulate something they do not yet understand but know they need to play a part of. It's no secret that technology is advancing faster than our society can gauge its overall impact or its potential effects, and the cloud is no different.

In my career I have been responsible for the creation of at least three different site selection programs. Upon these programs were based the criteria and decisions of where cloud and data center infrastructure would reside. Through example and practice, I have been able to deconstruct competitors' criteria and their relative weightings, at least in comparison to my own, and a couple of things jump out very quickly at anyone truly studying this space. While most people can guess the need for adequate power and communications infrastructure, many are surprised that tax and regulation play such a significant role in even the initial siting of a facility. The reason is pure economics over the total lifetime of an installation.
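As a purely illustrative sketch, a site selection program of this kind can be boiled down to a weighted scorecard rolled up into a single score per candidate site. The criteria, weights, and scores below are invented for illustration and are not the actual weightings of any real program:

```python
# A toy site-selection scorecard: weighted criteria rolled up into one
# score per candidate. Criteria, weights, and scores are entirely made up;
# real programs weigh many more factors over the facility's lifetime.

CRITERIA_WEIGHTS = {
    "power_cost_and_capacity": 0.30,
    "network_connectivity": 0.25,
    "tax_and_regulatory_climate": 0.25,  # often surprisingly heavy
    "construction_and_land_cost": 0.20,
}

# Each candidate scored 0-10 per criterion by the diligence team.
candidates = {
    "Site A": {"power_cost_and_capacity": 9, "network_connectivity": 7,
               "tax_and_regulatory_climate": 4, "construction_and_land_cost": 8},
    "Site B": {"power_cost_and_capacity": 7, "network_connectivity": 8,
               "tax_and_regulatory_climate": 9, "construction_and_land_cost": 6},
}

def weighted_score(scores: dict) -> float:
    """Roll per-criterion scores into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

for name, scores in sorted(candidates.items(),
                           key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

Note how a site rich in power and fiber can still lose to one with a friendlier regulatory climate once the weights reflect lifetime economics.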

I cannot tell you how often economic development councils or business development firms come to me to tell me about the next 'great' data center location: rich in both power infrastructure and telecommunications, close to institutions of higher learning, etc. Indeed there are some really great places that would seem ideal for data centers if one allowed them to dwell in the "romanticized cloud". What they fail to note, or understand, is that there may be legislation or regulation already on the books, or perhaps currently winding its way through the system, that could make a place inhospitable, or at least slightly less welcoming, to a data center. As someone responsible for tens of millions, or hundreds of millions, or even billions of dollars' worth of investment, you find yourself in a role where you are reading and researching legislation often. Many have noted my commentary on the Carbon Reduction Commitment in the UK, or my most recent talks about the current progress and data center impacts of the Waxman-Markey bill in the US House of Representatives. You pay attention because you have to pay attention. Your initial site selection is supremely important because you not only need to look for the "easy stuff" like power and fiber, but you need to look longer term at the overall commitment of a region or an area to support this kind of infrastructure. Very large business decisions are being made against these "bets", so you had better get them right.

To be fair, the management infrastructure in many of these cloud companies is learning as it goes as well. Most of these firms are software companies who have now been presented with the dilemma of managing large scale capital assets. It's no longer just about intellectual property; it's about physical property, and there are some significant learning curves associated with that. Add to the mix that this whole cloud thing is something entirely new.

One must also keep in mind that even with the best site selection program and the most robust upfront due diligence, people change, governments change, and rules change, and when that happens it can and will have very large impacts on the cloud. This is not something cloud providers are ignoring either. Whether it's through their software, through infrastructure, or through a more modular approach, they are trying to solve for the eventuality that things will change. Think about the potential impacts from a business perspective.

Let's pretend you own a cloud and have just sunk $100M into a facility to house part of your cloud infrastructure. You spent lots of money in your site selection and upfront due diligence to find the very best place to put a data center. Everything is going great: after five years you have a healthy population of servers in that facility and you have found a model to monetize your service. But then the locale where your data center lives changes the game a bit. It passes a law that states that servers engaged in the delivery of a service are a taxable entity. Suddenly that place becomes very inhospitable to your business model, and you have to worry about what that does to your business. It could be quite disastrous. Additionally, if you conclude that such a law would instantly impact your business negatively, you have the small matter of a $100M asset sitting in a region where you cannot use it. Again, a very bad situation. So how do you architect around this? It's a challenge that many people are trying to solve. Whether you want to face it or not, the 'cloud' will ultimately need to be mobile in its design. Just like its vapory cousins in the sky, the cloud will need to be on the move, even if it's a slow move. Because just as there are forces looking to regulate and control the cloud, there are also forces in play where locales are interested in attracting and cultivating the cloud. It will be a cycle that repeats itself over and over again.

So far we have looked at this mostly from a taxation perspective, but there are other regulatory forces in play. I will use the example of Canada, the United States' friendly, frosty neighbor in the great white north. It's safe to say that Canada and the US have had historically wonderful relations with one another. However, when one looks through the 'cloud'-colored looking glass, some things jump to the fore.

In response to the Patriot Act legislation after 9/11, the Canadian government became concerned with the rights given to the US government with regards to the seizure of online information. They in turn passed a series of Safe-Harbor-like laws stating that no personally identifiable information of Canadian citizens could be housed outside of Canadian borders. Other countries have passed, or are in the process of passing, similar laws. This means that at least some aspects of the cloud will need to be anchored regionally or within specific countries. A boat can drift even if it's anchored, and so must components of the cloud; its infrastructure and design will need to accommodate this. This touches on the privacy issue I talked about before. I don't want to get into the more esoteric conversations of information and where it's allowed to live and not live; I try to stay grounded in the fact that, whether my romantic friends like it or not, this type of thing is going to happen and the cloud will need to adapt.

It's important to note that none of the legislation focuses on 'the cloud' or 'data centers' just yet. But just as the Waxman-Markey bill or the CRC in the UK doesn't specifically call out data centers, those laws will have significant impacts on the infrastructure and shape of the cloud itself.

There is an interesting chess board developing between technology and regulation. They are inexorably intertwined with one another, and each will shape the form of the other in many ways; a giant cat and mouse game on a global level. Almost certainly, this evolution won't produce the most "technically superior" solution. In fact, these complications will make the cloud a confusing place at times. If you desired to build your own application using only cloud technology, would you subscribe to a service that lets the cloud providers handle these complications? Would you and your application be liable for regulatory failures in the storage of Azerbaijani nationals' data? It's going to be an interesting time for the cloud moving forward.

One can easily imagine personally identifiable information housed in countries of origin, with the technology evolving so that records of users' actions on the web are held elsewhere, perhaps even regionally where the actions take place. You would then see new legislation emerging to combat even that strategy, and so the cycle will continue. Likewise, you might see certain types of compute load or transaction work moving around the planet to align with more technically savvy or advantageous locales. Just as the concept of Follow the Moon has emerged as a potential energy savings strategy, moving load around based on the lowest cost energy, it might someday be followed by a program that similarly moves information or work to more "friendly" locales. The modularity movement in data center design will likely grow as well, trying to reduce the overall exposure the cloud firms have in any given market or region.
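As a purely illustrative sketch, here is what such a placement pass might look like in code: chase the cheapest energy, but respect residency pins. Every region, price, and law below is invented for illustration:

```python
# A toy "follow the moon" placement pass: move relocatable work to the
# cheapest-energy region while pinning residency-constrained data to its
# home country. Regions, prices, and laws here are invented for illustration.

regions = {
    "us-west": {"energy_usd_per_kwh": 0.06, "country": "US"},
    "quebec":  {"energy_usd_per_kwh": 0.04, "country": "CA"},
    "dublin":  {"energy_usd_per_kwh": 0.11, "country": "IE"},
}

workloads = [
    # (name, must_stay_in_country or None)
    ("web-render-batch", None),     # free to chase cheap power
    ("ca-citizen-profiles", "CA"),  # pinned by a residency law
    ("eu-ad-analytics", "IE"),      # pinned by a hypothetical EU rule
]

def place(pin_country):
    """Pick the cheapest region that satisfies the residency constraint."""
    eligible = {name: r for name, r in regions.items()
                if pin_country in (None, r["country"])}
    return min(eligible, key=lambda name: eligible[name]["energy_usd_per_kwh"])

for name, pin in workloads:
    print(f"{name} -> {place(pin)}")
```

The interesting tension is visible even in this toy: the pinned workloads never get to chase the cheapest power, which is exactly the kind of regulatory drag on the cloud described above.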

On this last note, I am reminded of one of my previous posts. I am firm in my belief that data centers will ultimately become the substations of the information utility. In that evolution they will become more industrial and more commoditized, with more intelligence at the software layer to account for all these complexities. As my own thoughts and views evolve around this, I have come to my own strange epiphany.

Ultimately the large cloud providers should care less and less about the data centers they live in. These will be software-layer attributes to program against; business-level modifiers on code distribution. Data centers should be immaterial components for the cloud providers, nothing more than containers or folders in which to drop their operational code. Today they are burning through tremendous amounts of capital believing that these facilities will ultimately give them strategic advantage. Ultimately those advantages will be fleeting and short-lived. They will soon find themselves in a place where the facilities themselves become a drag on their balance sheets or force them to invest more in aging assets.

Please don't get me wrong: the cloud providers have been instrumental in pushing this lethargic industry into thinking differently and evolving. For that you need to give them appropriate accolades. At some point, however, this is bound to turn into a losing proposition for them.

How’s that for Blasphemy?

\Mm