Industry Impact: Brothers from Different Mothers and Beyond…


My reading material and video watching habits these past two weeks have brought me some incredible joy and happiness. Why? Because Najam Ahmad of Facebook is finally getting some credit for the amazing work that he has done and continues to do in the world of Software Defined Networking. In my opinion Najam is a force of nature in the networking world. He is passionate. He is focused. He just gets things done. Najam and I worked very closely at Microsoft as we built out and managed the company’s global infrastructure. So closely, in fact, that we were frequently referred to as brothers from different mothers. Wherever Najam was, I was not far behind, and vice versa. We laughed. We cried. We fought. We had a lot of fun while delivering some pretty serious stuff. To find out that he is behind the incredible Open Compute Project advances in networking is not surprising at all. Always a forward-thinking guy, he has never been satisfied with the status quo.
If you have missed any of that coverage, I strongly encourage you to have a read at the links below.



This got me to thinking about the legacy of the Microsoft program on the Cloud and Infrastructure industry at large. Data Center Knowledge had an article covering the impact of some of the Yahoo alumni a few years ago. Many of those folks are friends of mine and deserve great credit. In fact, Tom Furlong now works side by side with Najam at Facebook. The purpose of my thoughts is not to take away from their achievements and impacts on the industry but rather to highlight the impact of some of the amazing people and alumni from the Microsoft program. It’s a long-overdue acknowledgement of the legacy of that program and how it has been a real driving force in large scale infrastructure. The list of folks below is by no means comprehensive and doesn’t touch on the talented people Microsoft maintains in their deep stable who continue to push the innovative boundaries of our industry.

Christian Belady of Microsoft – Here we go, first person mentioned and I already blow my own rule. I know Christian is still there at Microsoft, but it’s hard not to mention him as he is the public face of the program today. He was an innovative thinker before he joined the program at Microsoft and was a driving thought leader and thought provoker while I was there. While his industry-level engagements have been greatly sidelined as he steers the program into the future, he continues to be someone willing to throw everything we know and accept today into the wind to explore new directions.
Najam Ahmad of Facebook – You thought I was done talking about this incredible guy? Not in the least; few people have solved network infrastructure problems at scale like Najam has. With his recent work on the OCP front finally coming to the fore, he continues to push the boundaries of what is possible. I remember long meetings with network vendors where Najam tried to influence capabilities and features with the box manufacturers within the paradigm of the time, and his work at Facebook is likely to land him in a position where he is both loved and reviled by the industry at large. If that doesn’t say you’re an industry heavyweight…nothing does.
James Hamilton of Amazon – There is no question that James continues to drive deep thinking in our industry. I remain an avid reader of his blog and follower of his talks. Back in my Microsoft days we would sit and argue philosophical issues around the approach to our growth, towards compute, towards just about everything. Those conversations either changed or strengthened my positions as the program evolved. His work in the industry while at Microsoft and beyond has continued to shape thinking around data centers, power, compute, networking and more.
Dan Costello of Google – Dan Costello now works at Google, but his impacts on the Generation 3 and Generation 4 data center approaches and the modular DC industry direction overall will be felt for a very long time to come, whether Google goes that route or not. Incredibly well balanced in his approach between technology and business, his ideas and talks continue to shape infrastructure at scale. I will spare people the story of how I hired him away from his previous employer, but if you ever catch me at a conference, it’s a pretty funny story. Not to mention the fact that he is the second best break dancer in the Data Center Industry.
Nic Bustamonte of Google – Nic is another guy who has had some serious impact on the industry as it relates to innovating the running and operating of large scale facilities. His focus on the various aspects of the operating environments of large scale data centers, monitoring, and internal technology has shifted the industry and really set DCIM in motion in its infancy. Yes, BMS systems have been around forever, and DCIM is the next iteration and blending of that data, but his early work here has continued to influence thinking around the industry.
Arne Josefsberg of ServiceNow – Today Arne is the CTO of ServiceNow, focusing on infrastructure and management for enterprises and the big players alike, and if their overall success is any measure, he continues to impact the industry through results. He is *THE* guy who had the foresight of building an organization to adapt to this growing change of building and operating at scale. He is the architect of building an amazing team that would eventually change the industry.
Joel Stone of Savvis/CenturyLink – Previously the guy who ran global operations for Microsoft, he has continued to drive excellence in operations at Global Switch and now at Savvis. An early adopter and implementer of blending facilities and IT organizations, he mastered issues a decade ago that most companies are still struggling with today.
Sean Farney of Ubiquity – Truly the first data center professional who ever had to productize and operationalize data center containers at scale. Sean has recently taken on the challenge of diversifying data center site selection and placement at Ubiquity, repurposing old neighborhood retail spaces (Sears, etc.) in the industry. Given the general challenges of finding places with a confluence of large scale power and network, this approach may prove to be quite interesting as markets continue to drive demand.
Chris Brown of Opscode – One of the chief automation architects in my time at Microsoft, he has moved on to become the CTO of Opscode. Everyone on the planet who is adopting and embracing DevOps has heard of, and is probably using, Chef. In fact, if you are doing any kind of automation at large scale you are likely using his code.
None of these people would be comfortable with the attention, but I do feel credit should be given to these amazing individuals who are changing our industry every day. I am so very proud to have worked in the trenches with these people. Life is always better when you are surrounded by those who challenge and support you, and in my opinion these folks have taken it to the next level.
\Mm

The Soft Whisper that Big Data, Cloud Computing, and Infrastructure at Scale should treat as a Clarion Call.

The Cloud Jail

On Friday, August 23rd, the Chinese government quietly released Shi Tao from prison.   He was released a full fifteen months before his incarceration was supposed to end.  While certainly a relief to his family and friends, it’s likely a bittersweet ending to a sour turn of events.

Just who is Shi Tao and what the heck does he have to do with Big Data?  Why is he important to Cloud Computing and big infrastructure?   Is he a world-class engineer who understands technology at scale?  Is he a deep thinker of all things cloud?  Did he invent some new technology poised to revolutionize and leapfrog our understanding?

No.  He is none of these things.  

He is a totem of sorts. A living parable and a reminder of realities that many in the Cloud Computing industry, and those who deal with Big Data, rarely if ever address head on. He represents the cautionary tale of what can happen if companies and firms don’t fully vet the real-world impacts of their technology choices. The site selection of their data centers. The impact of how data is stored. Where that data is stored. The methods used to store the data. In short, a responsibility for the full accounting and consideration of their technological and informational artifacts.

To an engineering mind that responsibility generally means the most efficient storage of data with the least amount of cost. Using the most direct method or the highest performing algorithm. In short…to continually build a better mousetrap.

In site selection for new data centers it would likely be limited to just the basic real estate and business drivers. What is the power cost? What is the land cost? What is my access to water? Is there sufficient network nearby? Can I negotiate tax breaks at the country and/or local levels?

In selecting a cloud provider it’s generally about avoiding large capital costs and paying for what I need, when I need it.

In the business landscape of tomorrow, these thoughts will prove short-sighted and may expose your company to significant cost and business risks it is not contemplating, or worse.

Big Data is becoming a dangerous game. To be fair, content and information in general have always been a bit of a dangerous game. In Technology, we just go on pretending we live under a Utopian illusion that fairness ultimately rules the world. It doesn’t. Businesses have an inherent risk in collecting, storing, analyzing, and using the data that they obtain. Does that sound alarmist or jaded? Perhaps, but it’s spiced with some cold hard realities that are becoming more present every day and that you ignore at your own peril.

Shi was arrested in 2004 and sentenced to prison the following year on charges of disclosing state secrets.  His crime? He had sent details of a government memo restricting news coverage to a human rights group in the United States.  The Chinese government demanded that Yahoo! (his mail provider) turn over all mail records (Big Data) to the authorities. Something they ultimately did.  

Now before you go and get your Western Democracy sensibilities all in a bunch and cry foul, that ugly cold hard reality thing I was talking about plays a real part here. As Yahoo was operating as a business inside China, it was bound to comply with Chinese law, no matter how hard the action was to stomach. Around that time Yahoo sold most of its stake in the Chinese market to Alibaba, and as of the last month or so Yahoo has left China altogether.

Yahoo’s adventure in data and information risk and governmental oversight was not over, however. The company was called before the US Congress to answer for alleged human rights violations, placing it once again into a pot of boiling water, this time from a governmental concern closer to home.

These events took place almost seven years ago, and I would argue that the world of information, big data, and scaled infrastructure has actually gotten more convoluted and tricky to deal with. With the advent of Amazon AWS and other cloud services, a lack of understanding of regional and local Safe Harbor practices amongst enterprises and startups alike, and concepts like chain of custody and complicated, recursive ownership rights, things can be obfuscated to the point of insanity if you don’t have a program to manage it. We don’t have to use the example of China either; similar complexities are emerging across, and internal to, Europe. Is your company really thinking through Big Data? Do you fully understand ownership in a clouded environment? Who is responsible for taxation for your local business hosted internationally? What if your cloud servers, with your data, hosted by a cloud platform, were confiscated by local and regional governments without your direct involvement? Are you strategically storing data in a way that protects yourself? Do you even have someone looking at these risks to your business?

As a recovering network engineer I am reminded of an old joke referring to the OSI Model. The OSI Model categorizes all functions of a communication system into seven logical layers. It makes internetworking clear, efficient, and easily categorized. Of course, as every good network engineer knows, it doesn’t account for layers 8 and 9. But wait! You said there were only 7! Well, Layers 8 and 9 are Politics and Religion. These layers exist in Cloud Computing and Big Data too, and are potentially more impactful to the business overall.
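For the non-network folks, here is a minimal sketch (Python, purely illustrative) of the model the joke hangs on: the seven standard OSI layers, plus the two "layers" that never made it into the spec but show up on every real project.

```python
from enum import IntEnum

class OsiLayer(IntEnum):
    """The seven standard OSI layers, plus the two 'unofficial' ones from the old joke."""
    PHYSICAL = 1
    DATA_LINK = 2
    NETWORK = 3
    TRANSPORT = 4
    SESSION = 5
    PRESENTATION = 6
    APPLICATION = 7
    POLITICS = 8   # not in any spec, but very real in practice
    RELIGION = 9   # ditto

if __name__ == "__main__":
    for layer in OsiLayer:
        status = "official" if layer <= OsiLayer.APPLICATION else "unofficial"
        print(f"Layer {layer.value}: {layer.name.replace('_', ' ').title()} ({status})")
```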

All of these scenarios do not necessarily lend themselves to the most direct or efficient path, but it’s pretty clear that you can save yourself a whole lot of time and heartache if you think about them strategically. The infrastructure of tomorrow is powerful, robust, and ubiquitous. You simply cannot manage this complex eco-system the same way you have in the past, and just like the technology, your thinking needs to evolve.

\Mm

The Cloud Cat and Mouse Papers – The Primer


Cat and Mouse with Multi-national Infrastructure – The Participants to Date

There is an ever-changing game of cat and mouse developing in the world of cloud computing. It’s not a game that you as a consumer might see, but it is there. An undercurrent that has been there from the beginning. It pits Technology Companies and multi-national infrastructure against local and national governments. For some years this game of cat and mouse has been quietly played out in backrooms, in development and technology roadmap re-works, across negotiation tables, and only in the rarest of cases has it come out for industry scrutiny or visibility. To date the players have been limited to the likes of Google, Microsoft, Amazon, and others who have scaled their technology infrastructure across the globe, and in large measure those are the players that governments have moved against in an ever more intricate chess game. I myself have played a part in the measure/counter-measure give and take of this delicate dance.

The primary issues in this game have to do with realization of revenue for taxation purposes, Safe Harbor and issues pertaining to personally identifiable information, ownership of Big Data, the nature of what is stored, how it is stored, and where it is stored; in short, the intersection where politics and technology meet. A place where social issues and technology collide. You might call them storm clouds, just out of sight, but there is thunder on the horizon that you, the consumer and/or potential user of global cloud infrastructure, will need to be aware of, because eventually the users of cloud infrastructure will become players in the game as well.

That is not to say that the issues I tease out here are all gloom and doom.  In fact, they are great opportunities for potential business models, additional product features, and even cloud eco-system companies or niche solutions unto themselves.  A way to drive significant value for all sides.   I have been toying with more than a few of these ideas myself here over the last few months.

To date these issues have mostly manifested in the global build-up of infrastructure for the big Internet platforms: the products and services the big guys in the space use as their core money-making platforms or primary service delivery platforms. Rarely if ever do these companies use this same infrastructure for their Infrastructure as a Service (IaaS) or Platform as a Service (PaaS) offerings. However, as you will see, the same challenges will and do apply to these offerings as well. In some cases they are even more acute and problematic in multi-tenant situations, with the potential to put even more burden on future cloud users.

If I may be blunt about this, there is an interesting lifecycle to this food chain whereby the Big Technology companies consistently have the upper hand and governmental forces, through the use of their primary tools of regulation and legislation, are constantly playing catch up. This lifecycle is unlikely to change for at least five reasons.

  • The Technology Companies will always have the lens of the big picture of multi-national infrastructure. Individual countries, states, and locales generally only have jurisdiction or governance over the territory or population base that is germane to their authority.
  • Technology Companies can focus with near singular purpose on a very technical depth of capability, and can bring to bear much more concentrated “brain power” to evolve measures and counter-measures as the socio-political landscape changes.
  • By and large, Governments rely upon technologies and approaches becoming mainstream before there is enough of a base understanding of the developments and impacts for them to act. This generally places them in a reactionary position.
  • Governmental forces generally rely upon “consultants” or “industry experts” to assist in understanding these technologies, but very few of these industry experts have ever really dealt with multi-national infrastructure, and fewer still have had to strategize and evolve plans around these types of changes. The expertise at that level is rare and almost exclusively retained by the big infrastructure providers.
  • Technology Companies have the ability to force a complete game-change to the rules and reality by completely changing out the technology used to deliver their products and services, or by changing development and delivery logic and/or methodology, effecting a near-complete negation of the previous method of governance and making it obsolete.

That is not to say that governments are unwilling participants in this process, forced into a subservient role in the lifecycle. In fact they are active participants in attracting, cultivating, and even subsidizing these infrastructural investments in areas under their authority and jurisdiction. Tools like tax breaks, real estate and investment incentives, and private-public partnerships do have both initial and ongoing benefits for the Governments as well. In many ways these are “golden handcuffs” for Technology Companies who enter into this cycle, but like any kind of constraint, positive or negative, the planning and strategy to unfetter themselves begins almost immediately.

Watson, The Game is Afoot

Governments, Social Justice, Privacy, and Environmental forces have already begun to force changes in the Technology landscape for those engaged in multi-national infrastructure.  There are tons of articles freely available on the web which articulate the kinds of impacts these forces have had and will continue to have on the Technology Companies.  The one refrain through all of the stories is the resiliency of those same Technology Companies to persevere and thrive despite what might be crucial setbacks in other industries.

In some cases the technology changes and adapts to meet the new requirements; in some cases, changing approaches or even vacating “un-friendly” environs across any of these spectrums becomes an option; and in some cases, there is a not insignificant bet that any regulatory or compulsory requirements will be virtually impossible or too technically complex to enforce or even audit.

Let’s take a look at a couple of the examples that have been made public that highlight this kind of thing. Back in 2009, Microsoft migrated substantial portions of its Azure Cloud Services out of Washington State to its facilities located in San Antonio, Texas. While the article specifically talks about certain aspects of tax incentives being held back, there were of course other factors involved. One doesn’t have to look far to understand that Washington State also has a B&O Tax (Business and Occupation Tax), which is defined as a gross receipts tax. It is measured on the value of products, gross proceeds of sale, or gross income of the business. As you can imagine, interpreting this kind of tax as it relates to online and cloud income could be very tricky, and regardless would be a complex and technical problem to solve. It could have the undesired impact of placing any kind of online business at an interesting disadvantage, or at another level place an unknown tax burden on its users. I am not saying this was a motivating factor in Microsoft’s decision, but you can begin to see the potential exposure developing. In this case, the technology could rapidly change and move the locale of the hosted environments to minimize the exposure, thus thwarting any governmental action. At least for the provider; but what of the implications if you were a user of the Microsoft cloud platform and found yourself with an additional or unknown tax burden? I can almost guarantee that back in 2009 this level of end user impact (or revenue potential from a state tax perspective) had not even been thought about. But as with all things, time changes, and we are already seeing examples of exposure occurring across the game board that is our planet.

We are already seeing interpretations or laws getting passed in countries around the globe where, for example, a server is a taxable entity. If revenue for a business is derived from a computer or server located in that country, it falls under the jurisdiction of that country’s tax authority. Imagine yourself as a company using this wonderful global cloud infrastructure to sell your widgets, products or services, and finding yourself with an unknown tax burden and liability in some “far flung” corner of the earth. The Cloud providers today mostly provide infrastructure services. They do not go up the stack far enough to be able to effectively manage your entire system, let alone be able to determine your tax liability. The burden of proof to a large degree today would reside on the individual business running inside that infrastructure.

In many ways those adopting these technologies are the least capable of dealing with these kinds of challenges. They are small to mid-sized companies who admittedly don’t have the capital or operational sophistication to build out the kind of infrastructure needed to scale that quickly. They are unlikely to have technologies such as robust configuration management databases to track virtual instances of their products and services: to tell what application ran, where it ran, how long it ran, and how much revenue was derived during the length of its life. And this is just one example (server as a taxable entity) of a law or legislative effort that could impact global users. There are literally dozens of these kinds of bills/legislative initiatives/efforts (some well thought out, most not) winding their way through legislative bodies around the world.
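To make that concrete, here is a minimal sketch, in Python with entirely made-up names and numbers, of the kind of lineage record a configuration management database would need to keep before you could even begin to answer a “server as a taxable entity” question.

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class WorkloadRun:
    """One record of a virtual instance run: the lineage a CMDB would need to retain."""
    instance_id: str
    application: str
    country: str            # where the server that did the work physically sits
    started: datetime
    ended: datetime
    revenue_usd: float      # revenue attributed to this run, however the business allocates it

def revenue_by_jurisdiction(runs):
    """Roll up attributed revenue per country: the number such a law would ask for."""
    totals = defaultdict(float)
    for run in runs:
        totals[run.country] += run.revenue_usd
    return dict(totals)

# Hypothetical usage: two runs of the same app in different countries.
runs = [
    WorkloadRun("i-001", "storefront", "IE", datetime(2013, 3, 1, 8), datetime(2013, 3, 1, 20), 1250.0),
    WorkloadRun("i-002", "storefront", "SG", datetime(2013, 3, 1, 9), datetime(2013, 3, 2, 9), 430.0),
]
print(revenue_by_jurisdiction(runs))   # {'IE': 1250.0, 'SG': 430.0}
```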

You might think you can circumvent some of this by limiting your product or service deployment to the country closest to home, wherever home is for you. However, there are other efforts winding their way through legislatures, or in large degree already passed, that impact the data you store, what you store, whose data you are storing, and the like. In most cases these initiatives are unrelated to the revenue legislation developing, but taken together they can deliver an interesting one-two punch. For example, many countries are requiring that, for Safe Harbor purposes, all information for any nationals of ‘Country X’ must be stored in ‘Country X’ to ensure that its citizenry is properly protected and under the jurisdiction of the law for those users. In a cloud environment, with customers potentially from almost anywhere, how do you ensure that this is the case? How do you ensure you are compliant? If you balance this requirement with the ‘server as a taxable entity’ example I just gave above, there is an interesting exposure and liability for companies to prove where and when revenue is derived. Similarly, there are some laws that are enacted as reactions against legislation in other countries.
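As a thought experiment, here is a minimal sketch of what residency-aware placement logic might look like. The regions, country codes, and residency rules below are illustrative assumptions only, not a statement of any provider’s actual capability or any country’s actual law.

```python
# Hypothetical deployment with three storage regions and a rule that a citizen's
# personal data must stay in their home country when a residency law applies.

AVAILABLE_REGIONS = {"US": "us-east", "DE": "eu-frankfurt", "CA": "ca-central"}

# Countries (made-up list) assumed to require nationals' data to be stored in-country.
RESIDENCY_REQUIRED = {"DE", "CA"}

def pick_storage_region(citizenship: str, default_region: str = "us-east") -> str:
    """Return a region that satisfies an in-country residency requirement, if one applies."""
    if citizenship in RESIDENCY_REQUIRED:
        region = AVAILABLE_REGIONS.get(citizenship)
        if region is None:
            # No compliant footprint: better to fail loudly than quietly store out of country.
            raise ValueError(f"No in-country region available for {citizenship}")
        return region
    return default_region

print(pick_storage_region("DE"))   # eu-frankfurt
print(pick_storage_region("BR"))   # us-east (no residency rule assumed for BR here)
```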

In the post-9/11 era within the United States, the US Congress enacted a series of laws called the Patriot Act. Due to some of the information search and seizure aspects of the law, Canada responded by forbidding Canadian citizens’ data from being stored in the United States. To the best of my knowledge only a small number of companies actually even acknowledge this requirement and have architected solutions to address it, but the fact remains the rest are not in compliance with Canadian law. Imagine you are a small business owner, using a cloud environment to grow your business, and suddenly you begin to grow your business significantly in Canada. Does your lack of knowledge of Canadian law excuse you from your responsibilities there? No. Is this something that your infrastructure provider is offering to you? Today, no.

I am only highlighting certain cases here to make the point that there is a world of complexity coming to the cloud space. Thankfully these impacts have not been completely explored or investigated by most countries of the world, but it’s not hard to see a day when this becomes a very real thing that companies and the cloud eco-system in general will have to address. At its most base level these are areas of potential revenue streams for governments, and as such the likelihood of their eventual day in the sun only increases. I am currently personally tracking over 30 different legislative initiatives around the world (read: pre-de-facto laws) that will likely shape this Technology landscape for the big providers and potential cloud adopters some time in the future.

What is to come?

This first article was really just to bring out the basic premise of the conversation and topics I will be discussing, and to lay the groundwork to a very real degree. I have not even begun to touch on the extra-governmental social and environmental forces that will likely change the shape even further. This interaction of Technology, “The Cloud”, and political and social issues exists today, and although largely masked by the fact that the eco-system of the cloud is not fully developed or matured, it is no less a reality. Any predictions that are made are extensions of existing patterns I see in the market already and do not necessarily represent a foregone conclusion, but rather the most likely developments based upon my interactions in this space. As this Technology space continues to mature, the only certainty is uncertainty, modulated against the backdrop of a world where increasingly geo-political forces will continue to shape the Technology of tomorrow.

\Mm

Cloud Détente – The Cloud Cat and Mouse Papers


Over the last decade or so I have been lucky enough to be placed into a fairly unique position, working internationally to deploy global infrastructure for cloud environments. This work has spanned some very large companies with a very dedicated focus on building out global infrastructure and managing through those unique challenges. Strategies may have varied, but the challenges they all faced had some very common themes. One of the more complex dynamics in this process is what I call the rolling cat and mouse interaction between governments at all levels and these global companies.

Having been a primary player in these negotiations and in the development of measures and counter-measures that resulted from these interactions, I have come to believe there are some interesting potential outcomes that cloud adopters should think about and understand. The coming struggle and complexity of managing, regulating, and policing multi-national infrastructure will not solely impact the large global players; in a very real way it will begin to shape how their users need to think through these socio-political and geo-political realities: the potential impacts on their business, their adoption of cloud technologies, their resulting responsibilities, and just how aggressively they look to the cloud for the growth of their businesses.

These observations and predictions are based upon my personal experiences. So for whatever it’s worth (good or bad), this is not the perspective of an academic writing from some ivory tower; rather, these are the observations of someone who has been there and done it. I probably have enough material to write an entire book on my personal experiences and observations, but I have committed myself to writing a series of articles highlighting what I consider the big things that are being missed in the modern conversation of cloud adoption.

The articles will highlight (with some personal experiences mixed in) the ongoing battle between Technocrats and Bureaucrats. I will try to cover a different angle on many of the big topics out there today, such as:

  • Big Data versus Big Government
  • Rise of Nationalism as a factor in Technology and infrastructure distribution
  • The long struggle ahead for managing, regulating, and policing clouds
  • The Business, end-users, regulation and the cloud
  • Where does the data live? How long does it live? Why Does it Matter?
  • Logic versus Reality – The real difference between Governments and Technology companies.
  • The Responsibilities of data ownership
    • … regarding taxation exposure
    • … regarding PII impacts
    • … Safe Harbor

My hope is that this series and the topics I raise, while maybe a bit raw and direct, will cause you to think a bit more about the coming impacts on the Technology industry at large, the potential impacts to small and medium size businesses looking to adopt these technologies, and the developing friction and complexity at the intersection of technology and government.

\Mm

ATC Ribbon Cutting

[Photo: grand opening]

In my previous post I mentioned how extremely proud I was of the technology teams here at AOL for delivering a truly state-of-the-art data center facility with some incredible groundbreaking technology. As I mentioned, the facility was actually in production use faster than we could get the ribbon cutting ceremony scheduled. I thought I would share a small slice of the pictures from the internal ribbon cutting event.

[Photo: Manos and Gounares with a cloud engineer]

Alex Gounares, fellow former Microsoft alum and AOL CTO, and I presided over the celebration. In this photo, Alex and I talk over some of the technologies used in our cloud with one of our cloud engineers. As the facility is based upon pre-racked technologies and modular facility and network build components, it allows for significant cost and capital optimization: we build only when demand and growth dictate the need. All machines in the background are live and have been live for a few weeks.

[Photo: the ribbon cutting]

After receiving two very large scissors, which were remarkably sharp and precise for their size, we were ready to go. A few short words about the phenomenal job our teams performed, and it was time for some ribbon to kiss raised floor.

 

 


At the end of the day, the reason this project was such a success really comes down to the team responsible for this incredible win. An effort like this took incredibly smart people from different organizations working together to make it a reality. The achievement is even more impressive in my mind when you consider that in many cases our 90-day-to-live timeframe included design and execution on the go! My guess is our next one may be significantly faster without all that design time. The true heroes of ATC are below!

[Photo: the ATC team]

 

\Mm

(Special thanks goes out to Krysta Scharlach for the permission and use of her pictures in this post)

Private Clouds – Not just a Cost and Technology issue, It’s all about trust, the family jewels, corporate value, and identity

I recently read a post by my good friend James Hamilton at Amazon regarding Private Clouds. James and I worked closely together at Microsoft, and he was always a good source for out-of-the-box thinking and challenging the status quo. While James’ post, found here, speaks to the Private Cloud initiative being what amounts to an evolutionary dead end, I would have to respectfully disagree.

James’ post starts out by correctly pointing out that at scale the large cloud players have the resources and incentive to achieve some pretty incredible cost savings. From an infrastructure perspective he is dead on. But I don’t necessarily agree that this innovation will never reach the little guy. In my role at Digital Realty Trust I think I have a pretty unique perspective on infrastructure developments, both at the “big” guys and in terms of what most corporate enterprises have available to them from a leasing or commercial perspective.

Companies like Digital Realty Trust, Equinix, Terremark, Dupont Fabros, and a host of others in the commercial data center space are making huge advancements in this space as well. The free market economy has now placed an importance on low-PUE, highly efficient buildings. You are starting to see these firms commission buildings with PUEs below 1.4. Compared to most existing data center facilities this is a huge improvement. Likewise, these firms are incented to hire mechanical and electrical experts. This means that this same expertise is available to the enterprise through leasing arrangements. Where James is potentially correct is at that next layer of IT-specific equipment.
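For anyone newer to the metric: PUE is simply total facility power divided by the power delivered to the IT equipment, so a PUE of 1.4 means the facility draws another 40% on top of the IT load for cooling, power conversion, lighting, and so on. A tiny sketch with made-up numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,400 kW overall to power 1,000 kW of IT load:
print(pue(1400.0, 1000.0))   # 1.4 -> 40% of the IT draw again goes to overhead
```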

This is an area where there is an amazing amount of innovation happening at Amazon, Google, and Microsoft. But even here in this space there are firms stepping up to provide solutions to bring extensive virtualization and cloud-like capabilities to bear. Companies like Hexagrid have software solution offerings that are being marketed to typical co-location and hosting firms to do the same thing. Hexagrid and others are focusing on the software and hardware combinations to deliver full service solutions for companies in this space. In fact (as some comments on James’ blog mention) there is a lack of standards and a fear of vendor lock-in in choosing one of the big firms. It’s an interesting thought to ponder whether a software-plus-hardware solution offered to the hundreds of co-location players and hosting firms might be more of a universal solution without that fear of lock-in. Time will tell.

But this brings up one of the key criticisms: this is not just about cost and technology. I believe what is really at stake here is much more than that. James makes great points on the greater resource utilization of the big cloud players and how much more efficient they are at utilizing their infrastructure. To which I will snarkily (and somewhat tongue-in-cheek) say, “SO WHAT!” 🙂 Do enterprises really care about this? Do they really optimize for this? I mean, if you pull back that fine veneer of politically correct answers and “green-suitable” responses, is that what their behavior in REAL LIFE is indicative of? NO.

This was a huge revelation for me when I moved into my role at Digital. When I was at Microsoft, I optimized for all of the things that James mentions because it made sense to do so when you owned the whole pie. In my role at Digital I have visibility into tens of data centers, across hundreds of customers that span just about every industry. There is not, nor has there been, a massive move (or any move for that matter) to become more efficient in the utilization of their resources. We have had years of people bantering about how wonderful, cool, and revolutionary a lot of this stuff is, but worldwide data center utilization levels have remained abysmally low. Some providers bank on this. Over-subscription of their facilities is part of their business plan. They know companies will lease and take down what they think they need, and never actually use it all in REALITY.

So if this technology issue is not a motivating factor, what is? Well, cost is always part of the equation. The big cloud providers will definitely deliver cost savings, but private clouds could deliver cost savings as well. More importantly, however, private clouds will allow companies to retain their identity and uniqueness, and keep what makes them competitively them – Them.

I don’t so much see it as a Private Cloud or Public Cloud kind of thing, but more of a Private Cloud AND Public Cloud kind of thing. To me it looks more like an exercise in data abstraction. The Public offerings will clearly offer infrastructure benefits in terms of cost, but will undoubtedly lock a company into that single solution. The IT world has been bitten before by putting all its eggs in a single basket, and the need for flexibility will remain key. Therefore you might begin to see systems integrators, co-location and hosting firms, and others build their own platforms, or, much more likely, build platforms that umbrella over the big cloud players to give enterprises the best of both worlds.

Additionally we must keep in mind that the biggest resistance to the adoption of the cloud is not technology or cost, but RISK and TRUST. Do you, Mr. CIO, trust Google to run all of your infrastructure? Your applications? Do you, Mrs. CIO, trust Microsoft or Amazon to do the same for you? The answer is not a blind yes or no. It’s a complicated set of minor yes responses and no responses. They might feel comfortable outsourcing mail operations, but not the data warehouse holding decades of customer information. The Private Cloud approach will allow you to spread your risk. It will allow you to maintain those aspects of the business that are core to the company.

The cloud is an interesting place today. It is dominated by technologists: extremely smart engineering people who like to optimize and solve for technological challenges. The actual business adoption of this technology set has yet to be fully explored. Just wait until the “Business” side of the companies gets its hooks into this technology set and starts placing other artificial constraints, or optimizations around other factors. There are thousands of different motivators out in the world. Once that starts to happen in earnest, I think what you will find is a solution that looks more like a hybrid than the pure plays we dream about today.

Even if you think my ideas and thoughts on this topic are complete BS, I would remind you of something that I have told my teams for a very long time: “There is no such thing as a temporary data center.” This same mantra will hold true for the cloud. If you believe that the Private Cloud will be a passing and temporary thing, just keep in mind that there will be systems and solutions built to this technology approach, thus imbuing it with a very, very long life.

\Mm

Panel at Data Center Dynamics – London

On November 10th and 11th I will be speaking on two panels at the Data Center Dynamics event in London. The theme for the two day event is Carbon: Risk or Opportunity. In the morning, on Day One, I am speaking on a panel entitled The Data Center Efficiency Schism…New Realities in Design with Ed Ansett from HP/EYP and my old friend Lex Coors from Interxion. The afternoon has me on another panel with Liam Newcombe of the British Computer Society entitled ‘The Shape of the Cloud to Come’, moderated by Data Center Dynamics CTO Stephen Worn. Liam and I have passion for this space, and our past conversations on this topic in particular and other related topics have been quite entertaining (or so I have been told). To top it off, this panel is moderated by Stephen, who is not known for being timid either, so I am really looking forward to the discussion there.

The entire event should be quite super-charged, especially given the recent Carbon Reduction Commitment legislation in the UK. For those of you keeping a close eye on the emerging impact of carbon legislation across the world, this event is likely to surface a number of lightning-rod issues and thought leadership worth watching.

If you have not signed up and will be in London, I would strongly encourage you to do so.   As always if you happen to see me wandering about, please feel free to stop and chat awhile. 

\Mm

Changing An Industry of Cottage Industries

If you happen to be following the news around Digital Realty Trust you may have seen the recent announcement of our Pod Architecture Services (PAS) offering. Although the response has been deafening, there seem to be a lot of questions and confusion around what it is, what it is not, and what this ultimately means for Digital Realty Trust and our place in the industry.

First, a simple observation: the Data Center Industry as it stands today is in actuality an industry of cottage industries. It’s an industry dominated by boutique firms in specialized niches, all in support of the building out of these large, technically complex facilities. For the initiated it’s a world full of religious arguments like battery versus rotary, air-side economization versus water-side economization, raised floor versus no raised floor. To the uninitiated it’s an industry characterized by mysterious wizards of calculus and fluid dynamics and magical electrical energies. It’s an illusion the wizards of the collective cottage industries are well paid and incented to keep up. They ply their trade in ensuring that each facility’s creation is a one-off event, and likewise, so is the next one. It’s a world of competing general contractors, architecture firms, competing electrical and mechanical firms, of specialists in all sizes, shapes and colors. Ultimately, in my mind there is absolutely nothing wrong with this. Everyone has the right to earn a buck, no matter how inefficient the process.

After all, there is a real science to most of the technologies and design applications involved in data centers, and the magical mysteries they control are real. They are all highly trained professionals educated in some very technical areas. But if we are honest, each generally looks at the data center world from their own corner of the eco-system, and while they solve their own issues and challenges quite handily, they stumble when having to get out of their comfort zone. When they need to cooperate with other partners in the eco-system and solve more far-reaching issues, it almost universally results in solutions being applied from a one-off or per-job perspective. I can tell you that leveraging consistency across a large construction program is difficult at best even with multiple projects underway, let alone a single project.

The challenge of course is that in reality the end-user/owner/purchaser does not view the data center as an assembly of different components but rather as a single data center facility. The complexity in design and construction is a must-have headache for the business manager, who ultimately just wants to sate his need for capacity for some application or business solution.

Now into that background enters our Pod Architecture Services offering. In a nutshell, it allows customers who do not necessarily want to lease a facility (or cannot due to regulatory or statutory reasons) to use their own capital in building a facility without all the complexity associated with these buildings.

Our PAS offering is ultimately a way for a company to purchase a data center product. Leveraging Digital’s data center product approach, a company can simply select the size and configuration of their facility using the same “SKUs” we use internally in building out our own facilities. In effect we license the use of our design to these customers so that they enjoy the same benefits as the customers of our turn-key data center facilities.

This means that customers of our PAS product can leverage our specific designs, which are optimized across the four most crucial aspects of the data center lifecycle. Our facility design is optimized around cost, mission, long term simplicity in operability, and efficiency. This is anchored in a belief that a design comprises both upfront first-time costs and the lifetime costs of the facility. This is an “owner’s” perspective, which is the only perspective we have. As the world’s largest data center REIT and wholesaler, we need to take the full picture into account. We will own these facilities for a very long time.

Many designs like to optimize around the technology, or around the upfront facility costs, or drive significant complexity in design to ensure that every possible corner case is covered in the facility. But the fact is, if you cut corners up front, you potentially pay for it over the life of the asset; if you look for the most technologically advanced gear or provide lots of levers, knobs, and buttons for the ultimate flexibility, you open yourself up to more human error or drive more costs in the ongoing operation of the facility. The owner’s perspective is incredibly important. Many times companies allow these decisions to be made by their partner firms (the cottage industries) and this view gets clouded. Given the complexity of these buildings and the fact that customers do not build them often, it is hard to maintain that owner’s perspective without dedicated and vigilant attention. PAS changes all that, as the designs have already been optimized and companies can simply purchase the product they most desire with the guarantee that what they receive at the other end of the process is what they expected.

Moreover, PAS includes the watchful eye, experience, and oversight of our veteran data center construction management. These are highly qualified program managers who have built tens of data centers and act as highly experienced owner’s representatives. The additional benefit is that they have built this product multiple times and have become very good at delivering these types of facilities based upon our standardized product. In addition, our customers can leverage our significant supply chain of construction partners, parts, and equipment, which allows for incredible benefits in the speed of delivery of these buildings along with reductions in upfront costs due to our volume purchasing.

This does not mean that Digital is going to become an architectural or engineering firm and stamp drawings. This does not mean we will become the general contractor. This simply means that we will leverage our supply chain to deliver our designs and facilities, based upon our best practices, on behalf of the customers, in a process that is completely managed and delivered by experienced professionals. We will still have general contractors, A&E firms, and the like that have experience in building our standardized product. We are driving towards standardization. If you believe there is value in having each facility be a one-off design, more power to you. We have a service there too; it’s called Build to Suit. PAS is just another step in the formal definition of a standard data center product. It’s a key element in the modularization of capacity. It is through standardization that we can begin to have a larger impact on efficiency and other key “Green” initiatives.

I have often referred to my belief that data centers are simply the sub-stations of the Information Utility. This service allows commercial companies to start benefiting from the same standardization and efficiency gains that we are making in the wholesale space, and to enjoy the same cost factors.

Hope that makes things a bit clearer!

 

\Mm

The Cloud Politic – How Regulation, Taxes, and National Borders are shaping the infrastructure of the cloud

Most people think of ‘the cloud’ as a technical place defined by technology, the innovation of software leveraged across a scale of immense proportions, and ultimately a belief that its decisions are guided by some kind of altruistic technical meritocracy. At some levels that is true; at others, one needs to remember that the ‘cloud’ is ultimately a business. Whether you are talking about the Google cloud, the Microsoft cloud, the Amazon cloud, or Tom and Harry’s Cloud Emporium, each is a business that ultimately wants to make money. It never ceases to amaze me that in a perfectly solid technical or business conversation around the cloud, people will begin to wax romantic and lose sight of common sense. These are very smart, technically or business savvy people, but for some reason the concept of the cloud has been romanticized into something almost philosophical, a belief system, something that actually takes on the wispy characteristics that the term conjures up.

When you try to bring them down to the reality that the cloud is essentially large industrial buildings full of computers, running applications that have achieved regional or even global geo-diversity and redundancy, you place yourself in a tricky position that at best labels you a kill-joy and at worst a blasphemer.

I have been reminded of late of a topic that I have been meaning to write about. As defined by my introduction above, some may find it profane; others will choose to ignore it, as it will cause them to come crashing to the ground. I am talking about the unseemly and terribly disjointed intersection of government regulation, taxes, and the cloud. This also loops in “the privacy debate”, which is a separate conversation almost unto itself. I hope to touch on privacy, but only as it touches these other aspects.

As many of you know, my roles past and present have focused on the actual technical delivery and execution of the cloud. The place where pure software developers fear to tread. The world of large scale design, construction and operations specifically targeted at a global infrastructure deployment and its continued existence into perpetuity. Perpetuity, you say? That sounds a bit too grandiose, doesn’t it? My take is that once you have this kind of infrastructure deployed, it will become an integral part of how we as a species continue to evolve in our communications and our technological advances. Something this cool is powerful. Something this cool is a game changer. Something this cool will never escape the watchful eyes of the world’s governments, and in fact it hasn’t.

There was a recent article at Data Center Knowledge regarding Microsoft’s decision to move its Azure Cloud platform out of the State of Washington and relocate it (whether virtually or physically) to be run in the state of Texas. Other articles have highlighted similar conversations with Yahoo and the state of Washington, or Google and the state of North Carolina. These decisions all have to do with state level taxes and their potential impact on the upfront capital costs or long term operating costs of the cloud. You are essentially seeing the beginning of a cat and mouse game that will last for some time on a global basis. States and governments are currently using their blunt, imprecise instruments of rule (regulations and taxes) to try to regulate something they do not yet understand but know they need to play a part in. It’s no secret that technology is advancing faster than our society can gauge its overall impact or its potential effects, and the cloud is no different.

In my career I have been responsible for the creation of at least three different site selection programs. Upon these programs were based the criteria and decisions of where cloud and data center infrastructure would reside. Through example and practice, I have been able to deconstruct other competitors’ criteria and their relative weightings, at least in comparison to my own, and a couple of things jump out very quickly at anyone truly studying this space. While most people can guess the need for adequate power and communications infrastructure, many are surprised that tax and regulation play such a significant role in even the initial siting of a facility. The reason is pure economics over the total lifetime of an installation.
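While I won’t share any real program’s criteria or weightings, the bones of every one of these programs look something like the weighted scorecard below. The criteria, weights, and candidate scores are purely illustrative assumptions, not the actual numbers used in any program I have built.

```python
# A minimal sketch of the weighted-criteria scoring a site selection program boils down to.
WEIGHTS = {
    "power_cost": 0.25,
    "fiber_access": 0.20,
    "tax_and_regulatory_climate": 0.30,   # often weighted more heavily than newcomers expect
    "water_availability": 0.10,
    "natural_hazard_risk": 0.15,
}

def site_score(scores: dict) -> float:
    """Weighted sum of per-criterion scores (each on a 0-10 scale)."""
    return sum(WEIGHTS[criterion] * scores.get(criterion, 0.0) for criterion in WEIGHTS)

candidate_a = {"power_cost": 9, "fiber_access": 7, "tax_and_regulatory_climate": 4,
               "water_availability": 8, "natural_hazard_risk": 6}
candidate_b = {"power_cost": 6, "fiber_access": 8, "tax_and_regulatory_climate": 9,
               "water_availability": 7, "natural_hazard_risk": 7}

print(round(site_score(candidate_a), 2))   # strong power story, weak regulatory climate
print(round(site_score(candidate_b), 2))   # more balanced; the tax weighting carries it
```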

I cannot tell you how often I have economic development councils or business development firms come to me to tell me about the next ‘great’ data center location. Rich in both power infrastructure and telecommunications, proximity to institutions of higher learning, etc. Indeed there are some really great places that would seem ideal for data centers if one allowed them to dwell in the “romanticized cloud”. What they fail to note, or understand, is that there may be legislation or regulation already on the books, or perhaps legislation currently winding its way through the system, that could make it an inhospitable place, or at least slightly less welcoming to a data center. As someone responsible for tens of millions, or hundreds of millions, or even billions of dollars’ worth of investment, you find yourself in a role where you are reading and researching legislation often. Many have noted my commentary on the Carbon Reduction Commitment in the UK, or my most recent talks about the current progress and data center impacts of the Waxman-Markey bill in the US House of Representatives. You pay attention because you have to pay attention. Your initial site selection is supremely important because you not only need to look for the “easy stuff” like power and fiber, but you need to look longer term; you need to look at the overall commitment of a region or an area to support this kind of infrastructure. Very large business decisions are being made against these “bets” so you had better get them right.

To be fair, the management infrastructure in many of these cloud companies is learning as it goes as well. Most of these firms are software companies who have now been presented with the dilemma of managing large scale capital assets. It’s no longer about intellectual property, it’s about physical property, and there are some significant learning curves associated with that. Add to the mix that this whole cloud thing is something entirely new.

One must also keep in mind that even with the best site selection program and the most robust up front due diligence, people change, governments change, rules change, and when that happens it can and will have very large impacts on the cloud. This is not something cloud providers are ignoring either. Whether it’s through their software, through infrastructure, or through a more modular approach, they are trying to solve for the eventuality that things will change. Think about the potential impacts from a business perspective.

Let’s pretend you own a cloud and have just sunk $100M into a facility to house part of your cloud infrastructure. You spent lots of money in your site selection and up front due diligence to find the very best place to put a data center. Everything is going great: after 5 years you have a healthy population of servers in that facility and you have found a model to monetize your service. But then the locale where your data center lives changes the game a bit. It passes a law that states that servers engaged in the delivery of a service are a taxable entity. Suddenly that place becomes very inhospitable to your business model. You now have to worry about what that does to your business. It could be quite disastrous. Additionally, if you conclude that such a law would instantly impact your business negatively, you have the small matter of a $100M asset sitting in a region where you cannot use it. Again, a very bad situation. So how do you architect around this? It’s a challenge that many people are trying to solve. Whether you want to face it or not, the ‘Cloud’ will ultimately need to be mobile in its design. Just like its vapory cousins in the sky, the cloud will need to be on the move, even if it’s a slow move. Because just as there are forces looking to regulate and control the cloud, there are also forces in play where locales are interested in attracting and cultivating the cloud. It will be a cycle that repeats itself over and over again.

So far we have looked at this mostly from a taxation perspective. But there are other regulatory forces in play. I will use the example of Canada, the friendly, frosty neighbor in the great white north of the United States. It’s safe to say that Canada and the US have had historically wonderful relations with one another. However, when one looks through the ‘Cloud’-colored looking glass, there are some things that jump to the fore.

In response to the Patriot Act legislation after 9-11, the Canadian government became concerned with the rights given to the US government with regard to the seizure of online information.  They in turn passed a series of Safe-Harbor-like laws stating that no personally identifiable information of Canadian citizens could be housed outside of Canadian borders.    Other countries have passed, or are in the process of passing, similar laws.   This means that at least some aspects of the cloud will need to be anchored regionally or within specific countries.    A boat can drift even if it's anchored, and so can components of the cloud; its infrastructure and design will need to accommodate this.  This touches on the privacy issue I talked about before.   I don’t want to get into the more esoteric conversations about information and where it's allowed to live and not live; I try to stay grounded in the fact that, whether my romantic friends like it or not, this type of thing is going to happen and the cloud will need to adapt.
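As a small illustration of what that anchoring might look like in practice, here is a hedged sketch, again in Python, of routing personally identifiable information to an in-country store while everything else drifts to a global one.  The country codes, residency rules, and store endpoints are all made up for the example.

```python
# Hypothetical sketch of residency-aware storage routing.
# Country codes, store endpoints, and rules are illustrative assumptions only.

RESIDENCY_RULES = {
    "CA": "CA",   # Canadian citizens' PII must stay on Canadian soil
    "DE": "EU",   # e.g. an EU-wide rule; purely an assumption for illustration
}

STORES = {
    "CA": "pii-store.toronto.example.internal",
    "EU": "pii-store.amsterdam.example.internal",
    "GLOBAL": "pii-store.global.example.internal",
}

def store_for(user_country: str) -> str:
    """Pick the storage endpoint that satisfies the user's residency rule (if any)."""
    zone = RESIDENCY_RULES.get(user_country, "GLOBAL")
    return STORES[zone]

# A Canadian user's PII is anchored in-country; everyone else drifts to the global store.
assert store_for("CA") == "pii-store.toronto.example.internal"
assert store_for("US") == "pii-store.global.example.internal"
```

The software side is the easy part; the physical consequence is that a facility has to exist inside each of those borders for the rule to be satisfiable at all.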

It's important to note that none of this legislation focuses on ‘the cloud’ or ‘data centers’ just yet.   But just as the Waxman-Markey bill and the CRC in the UK don’t specifically call out data centers yet will still have significant impacts on them, these laws will shape the infrastructure and form of the cloud itself. 

There is an interesting chess board developing between technology and regulation.   They are inexorably intertwined with one another, and each will shape the form of the other in many ways.   A giant cat and mouse game on a global level.   Almost certainly, this evolution won't produce the most “technically superior” solution.  In fact, these complications will make the cloud a confusing place at times.   If you wanted to build your own application using only cloud technology, would you subscribe to a service that lets the cloud providers handle these complications for you?  Would you and your application be liable for regulatory failures in the storage of data belonging to Azerbaijani nationals?  It's going to be an interesting time for the cloud moving forward. 

One can easily imagine personally identifiable information housed in countries of origin, but the technology evolving so that records of users’ actions on the web are held elsewhere, perhaps regionally where those actions take place.  You would then see new legislation emerging to combat even that strategy, and so the cycle would continue.  Likewise you might see certain types of compute load or transaction work moving around the planet to align with more technically savvy or advantageous locales.  Just as the concept of Follow the Moon has emerged as a potential energy savings strategy, moving load around based on the lowest cost energy, it might someday be followed by a similar program that moves information or work to more “friendly” locales.     The modularity movement in data center design will likely grow as well, trying to reduce the overall exposure cloud firms have in any given market or region.   
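A Follow the Moon style scheduler for this kind of movable work could be as simple as a scoring function over regions.  The sketch below is purely illustrative, with invented energy prices and "regulatory friendliness" scores, but it shows how cost and locale friendliness might be traded off programmatically rather than through a one-time siting decision.

```python
# Hypothetical "follow the moon" style scheduler sketch.
# Energy prices ($/kWh) and friendliness scores are made-up numbers for illustration.

REGIONS = {
    "region-north": {"energy_price": 0.04, "regulatory_friendliness": 0.9},
    "region-south": {"energy_price": 0.07, "regulatory_friendliness": 0.8},
    "region-east":  {"energy_price": 0.03, "regulatory_friendliness": 0.4},
}

def score(region: dict, price_weight: float = 0.6, friendly_weight: float = 0.4) -> float:
    """Higher friendliness raises the score; higher energy price lowers it."""
    return (friendly_weight * region["regulatory_friendliness"]
            - price_weight * region["energy_price"] * 10)

def place_batch_work(regions: dict) -> str:
    """Send movable batch work to the best-scoring region for this scheduling window."""
    return max(regions, key=lambda name: score(regions[name]))

# region-east has the cheapest power but loses on friendliness, so the work lands
# in region-north: cheap energy alone isn't enough if the locale is hostile.
print(place_batch_work(REGIONS))
```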

On this last note, I am reminded of one of my previous posts. I am firm in my belief that Data Centers will ultimately become the Sub-Stations of the information utility.  In that evolution they will become more industrial, more commoditized, with more intelligence at the software layer to account for all these complexities.  As my own thoughts and views evolve around this, I have come to a strange epiphany of my own.  

Ultimately the large cloud providers should care less and less about the data centers their services live in.  These will be software layer attributes to program against, business level modifiers on code distribution.   Data Centers should be immaterial components for the Cloud providers, nothing more than containers or folders in which to drop their operational code.  Today they are burning through tremendous amounts of capital believing that these facilities will ultimately give them strategic advantage.   Those advantages will be fleeting and short-lived.  They will soon find themselves in a place where the facilities themselves become a drag on their balance sheets or force them to invest even more in aging assets. 

Please don’t get me wrong, the cloud providers have been instrumental in pushing this lethargic industry into thinking differently and evolving.   For that you need to give them appropriate accolades.  At some point, however, this is bound to turn into a losing proposition for them.  

How’s that for Blasphemy?

\Mm

Forecast Cloudy with Continued Enterprise


This post is a portion of my previous post that I broke into two.   It more clearly defines where I think the market is evolving and why companies like Digital Realty Trust will be at the heart of the change in our industry.

 

The Birth of Utilities and why a century ago matters today and going forward…

I normally kick off my vision-of-the-future talk with a mention of history first (my former Microsoft folks are probably groaning at this moment if they are reading this).   I am a huge history buff.  In December of 1879, Thomas Edison harnessed the power of electricity for the first time to light a light bulb.  What’s not apparent is that this “invention” was in itself not complete.  To get it from that point to large scale, commercial application required a host of other things to be invented as well.   While much ado is made about the successful kind of filament used to ensure a consistent light source, there were no fewer than seven other inventions needed to make electric light (and ultimately the electric utility) practical for everyone: the parallel circuit, an actually durable light bulb, an improved dynamo, underground conductor networks, devices to maintain constant voltage, insulating materials and safety fuses, the light socket and on/off switch, and a bunch of other minor things.   Once all of these were solved, the first public electric utility could be created.  In September of 1882, the first commercial power station, located on Pearl Street in lower Manhattan, opened its doors and began providing light and electrical power to all customers within the “massive” area of one square mile.   This station was a marvel of technology, staffed with tens of technicians maintaining the complex machinery to exacting standards.   The ensuing battle between Direct Current and Alternating Current then began, and in some areas it continues today. More on this in a bit.

A few years earlier, a host of people were working on what would eventually become known as the telephone.   In the United States this work is attributed to Alexander Graham Bell, and it's that story I will focus on here for a second.  Through trial and error, Bell and his compatriot Watson accidentally stumbled across a system to transfer sound in June of 1875.  After considerable work on refinement the product launched (there is an incredibly interesting history of this at SCRIBD), and after even more trial and error the first telephonic public utility was created, with the very first central office coming online in January of 1878 in New Haven, Connecticut.  This first central office was a marvel to behold: again, extremely high tech equipment for its day, with a host of people ensuring that the telephone utility was always available and calls were switched appropriately.  Interestingly, by 1881 only 9 cities with populations above 10,000 were without access to the telephone utility, and only 1 above 15,000!  That is an adoption rate that remains mind-boggling even by today’s standards.  

These are significant moments in time that truly changed the world and the way we live every day.   Today we are witnessing the birth of another such utility: the Information Utility.   Many people I have spoken to claim this “Information Utility” is something different. It’s more of a product, they say, because it uses existing utility services. Some maintain that it's truly not revolutionary because it's not leveraging new concepts.   But the same can be said of those earlier utilities as well.   The communications infrastructure we use today, whether telephone or data, has its very roots in the telegraph.  The power utilities have a lot to thank the gas-lamp utilities of the past for solving early issues as well.   Everything old is new again, and everything gets refined into something newer and better.  Some call this new Information Utility the “cloud”, others the Information-sphere, others just call it the Internet.  Regardless of what you call it, access to information is going to be at your fingertips more today and tomorrow than it has ever been before. 

Even though this utility is built upon existing services, it too will have its infrastructure.  Just as the electric utility has its sub-stations and distribution yards, and the communication utilities have central offices, so too will data centers become the distribution mechanism for the Information Utility.   We still have a lot of progress to make.   Not everything is invented or understood yet.  Just as Edison had to invent a host of other items to make electricity practical, and Bell and Watson had to develop not just the telephone but also the telephone ringer (or more correctly, thumper or buzzer), so too does our Information Utility have a long way to go.  In some respects it's even more complicated than its predecessors, as their early development was not burdened with the legislation and government involvement that the “Cloud” faces today.

And that innovation does not always come from a select few.  Westinghouse and his alternating current eventually won out over direct current because it found its killer app and business case.   Alternating current was clearly the technically superior approach and better for distribution; Westinghouse had even demonstrated generating power at Niagara Falls and successfully transmitting it all the way to Buffalo, New York, something direct current was unable to do.  In the end, Westinghouse worked with appliance manufacturers to create devices that used alternating current.  By driving those killer apps (things like refrigerators), Westinghouse prevailed and Edison eventually lost out.  So too will the cloud have its killer apps.  The pending software and services battle will be interesting to watch.  What is interesting to me, however, is that it was the business case that drove adoption and evolution here, and it also modified how the utility was used and designed.   DC substations gave way to AC substations, and what used to take scores of people to support has dwindled to occasional visits and pre-scheduled preventative maintenance.   At the data center level, we cannot afford to think that these killer applications will not change our world.    Our killer applications are coming, and they will forever change how our world does business.  Data Centers and their evolution are at the heart of our future. 

On Fogs, Mist, and the Clouds Ahead . . .

After living in Seattle for close to 10 years, you learn that you become an expert in three things: clouds, rain, and more clouds.   Unlike the utilities of the past, this new Information Utility is going to be made up of lots of independent cloudlets full of services.  The Microsofts, Googles and Amazons of the world will certainly provide a large part of the common platforms used by everyone, but the applications, products, content, customer information, and key development components will continue to have a life in facilities and infrastructure owned or controlled by the companies providing those services.    In addition, external factors are already beginning to have a huge influence on cloud infrastructure.  Despite the growing political trend of trans-nationalism, where countries give up some of their sovereign rights to participate in more regionally-aware economics and like-minded political agendas, that same effect does not seem to be taking place in the area of taxation and regulation of cloud and information infrastructure, specifically as it relates to electronic or intellectual property entities that derive revenue from infrastructure housed in those particular countries, or that derive revenue from the online activity of citizens of those nations.

There are numerous countries today that have been, or are, seriously engaged in establishing and managing their national boundaries digitally online.  What do I mean by that?  There is a host of legislation across the globe that is beginning to govern the protection and online management of citizens’ data in accordance with each nation’s own laws.   This is having (and will continue to have) a dramatic impact on how infrastructure and electronic products and services are deployed, where that data is stored, and how revenue from that activity can and will be taxed by the local country.  This level of state-exercised control can be economically, politically, or socially motivated, and cloud services providers need to pay attention to it.   A great example of this is Canada, which has passed legislation in response to the U.S. Patriot Act.   That legislation forbids personally identifiable information (PII) of Canadian citizens from being housed outside the boundaries of Canada, or perhaps more correctly, forbids its storage in the United States.    There are numerous laws and pieces of legislation making their way across Europe and Asia as well.   That puts an interesting kink in the idea of a worldwide federated cloud user-base where information will be stored “in the cloud”.  From an infrastructure perspective it will mandate that there are facilities in each country to house that data.  While data storage and retention is an interesting software problem to solve, the physical fact that the data will need to remain in a local geography will require data centers and components of cloud infrastructure to be present there.  I expect this to continue as governments become more technically savvy and understand the rate of change being caused by this technology evolution. Given that data centers are extremely capital intensive, only a few players will be able to deploy private global infrastructures.  This means that the “information sub-station” providers will have an even more significant role in driving the future standards of this new Information Utility. One might think that this could ultimately be something provided by the large cloud providers as a service.   That could be a valid assumption; however, there is an interesting wrinkle developing around taxation, or more correctly exposure to double or multiple-country taxation, that those large providers will face.   In my opinion a federation of “information substation” providers will provide the best balance of offsetting taxation issues while still providing a very granular and regionally acceptable way to serve customers. That is where companies like Digital Realty Trust are going to come in and drive significant value and business protection.

I watch a lot of these geo-political and economic developments pretty closely as they relate to Data Center and infrastructure legislation, and will continue to do so.  But even outside of these issues, the “cloud”, or whatever term you like, will continue to evolve, and the “channels” created by this paradigm will continue to drive innovation at the products and services level.  It's at this level where the data center story will continue to evolve as well.   To start we need to think about the business version of the IT “server-hugging” phenomenon. For the uninitiated, “Server Huggers” are those folks in an IT department who believe that the servers have to be geographically close in order to be worked on. In some cases it's the right mentality; in others, where the server is located truly doesn’t matter.   It’s as much a psychological experiment as a technical one.   At a business level, there is a general reluctance to house the company jewels outside of corporate-controlled space.  Sometimes this is regulated (as with banks and financial institutions); most often it's because those resources (proprietary applications, data sets, information stores, etc.) are crucial to the success of the company, and in many instances ARE the company. Not something you necessarily want to hand over to others to control.  Therefore wholesale adoption of cloud resources is still a very, very long way off.  That is not to say that this infrastructure won't get adopted into solutions that companies ultimately use to grow their own businesses.  This is going to drive tons of innovation as businesses evolve their applications, create new business models, and join together in mutually beneficial alliances that will change the shape, color, and feel of the cloud.  In fact, the cloud or “Information Utility” becomes the ultimate channel distribution mechanism.

The first grouping I can see evolving is fraternal operating groups, or FOGs.  This is essentially a conglomeration of like-minded or related industry players coming together to build shared electronic compute exchanges or product and service exchanges.  These applications and services will be highly customized to a particular industry; they will never be sated by the solutions the big players will be putting into play, because they are too specialized.  This infrastructure will likely not sit within individual company data centers but is likely to be located in common facilities or leased facilities with some structure for joint ownership.   Whether large or small, business to business, or business to consumer, I see this as an evolving sector.  There will definitely be companies looking to do this on their behalf, but given the general capital requirements to get into this type of business, these FOG agreements may be just the answer for finding a good trade-off between capital investment and return on the compute/service.

The next grouping builds off of the “company jewels” mindset and how it could blend with cloud infrastructure.  To continue the overly used metaphor of clouds, I will call them Managed Instances Stationed Territorially, or MISTs.   There will likely be a host of companies that want to take advantage of the potential savings of cloud-managed infrastructure, but want the warm and fuzzy feeling of knowing it's literally right in their backyard.  Imagine servers and infrastructure deployed at each customer data center, but centrally managed by cloud service providers.   Perhaps it's owned by the cloud provider; perhaps the infrastructure has been purchased by the end-user company.   One can imagine container-based server solutions being dropped into container-ready facilities or jury-rigged in the parking lot of a corporate owned or leased facility.  This gives companies the ability to structure their use of cloud technologies and map them into their own use case scenarios, whatever makes the most sense for them.  The recent McKinsey paper talked about how certain elements of the cloud are more expensive than managing the resources through traditional means; this is potentially a great hybrid scenario where companies can integrate those services only as they need to.  One could even see Misty FOGs or Foggy MISTs.  I know the analogy is getting old at this point, but hopefully you can see that the future isn’t as static as some would have you believe.  This ability to channelize the technologies of the cloud will have a huge impact on business costs, operations, and technology.   It also suggests that mission critical infrastructure is not going to go away but will become even more important and potentially more varied.  This is why I think the biggest infrastructure impact will occur at the “information substation provider” level.  Data Centers aren’t going away; they might actually grow in terms of demand, and the one thing that is certain is that they are evolving today and will continue to evolve as this space matures.  Does your current facility allow for this level of interconnectivity?  Do you have the ability to host a mix of solution management providers in your facility?  Lots of questions, lots of opportunities to develop answers.
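To sketch what a MIST might look like on paper, here is a hypothetical deployment descriptor in Python.  Every field name, site label, and endpoint is invented; the idea is only that the gear sits in the customer's backyard while the day-to-day management plane lives with the provider.

```python
# Hypothetical sketch of a MIST-style deployment descriptor: customer floor space
# (or parking lot), provider-managed control plane. All values are illustrative.

from dataclasses import dataclass

@dataclass
class MistDeployment:
    site: str            # where the containers physically sit
    owned_by: str        # "customer" or "provider" capital
    managed_from: str    # the remote control plane doing patching/monitoring
    workloads: tuple     # what runs locally, close to the company jewels

deployment = MistDeployment(
    site="customer-hq-parking-lot-container-01",
    owned_by="customer",
    managed_from="controlplane.cloudprovider.example.com",
    workloads=("erp-database", "proprietary-analytics"),
)

# The data never leaves the customer's backyard, but day-to-day operations are remote.
print(f"{deployment.site} is managed from {deployment.managed_from}")
```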

The last grouping is potentially an evolution of modern content delivery infrastructure or edge computing capabilities.  I will quit with the cutesy cloud names and call this generically Near Cloud Content Objects.   Given that products, services, and data will remain the domain of the entities owning them, and given a general reluctance to store them wholesale in someone else’s infrastructure, one can see this proprietary content leveraging the global cloud infrastructures through regional gateways, where the owners will be able to maintain ownership and control of their assets.  This becomes even more important when you factor in the economic and geo-political aspects emerging in cloud compute.

In the end the cloud approach is going to significantly drive data center demand and cause it to evolve even further.  It will not, as some would like to project, end the need for corporate data centers.  Then there is that not-so-little issue of the IT applications and internal company services we use every day.  This leads me into my next point . . .

The Continued and Increasing Importance of Enterprise Data Centers

This post has concentrated a lot on the future of cloud computing, so I will probably tick off a bunch of cloud-fan-folk with this next bit, but the need for corporate data centers is not going away.  They may change in size, shape, efficiency, and the like, but there is a need to continue to maintain a home for those company jewels and to serve internal business communities.  The value of any company is the information and intellectual property developed, maintained, and driven by its employees.   Concepts like FOGs and MISTs still require an ultimate home or location for that work to terminate into or for results to be sent to.  Additionally, look at the suite of software each company has in its facilities today supporting its business.  We are at least a decade or more away from those applications being migrated to a distributed, cloud-based infrastructure.  Think about the migration costs of any particular application you have, then compound that with the complexity of having your data stored in those cloud environments as well.  Are you then locked into a single cloud provider forever? It obviously requires cloud interoperability, which doesn’t exist today with the exception of half-hearted, non-binding efforts that don’t actually include any of the existing cloud providers.   If you believe as I do that the “cloud” will actually be many little and large channelized solution cloudlets, you have to believe that the corporate data center is here to stay.  The mix of applications and products in your facilities may differ in the future, but you will still have them.  That’s not to say the facilities themselves will not have to evolve.  They will.  With changing requirements around energy efficiency and green reporting, along with the geo-political and other regulations coming through the pipeline, the enterprise data center will still be an area full of innovation as well.  

/Mm