Google Purchases Deep Earth Mining Equipment in Support of ‘Project Rabbit Ears’ and Worldwide Wi-Fi Availability…

(10/31/2013 – Mountain View, California) – Close examination of Google’s data center construction-related purchases has revealed the procurement of large-scale deep earth mining equipment.  While the actual need for the deep mining gear is unclear, many speculate that it has to do with a secretive internal project that has recently come to light, known only as Project: Rabbit Ears.

According to sources not at all familiar with Google technology infrastructure strategy, Project Rabbit Ears is the natural outgrowth of Google’s desire to provide ubiquitous infrastructure worldwide.  On the surface, these efforts seem consistent with other incorrectly speculated projects such as Project Loon, Google’s attempt to provide Internet services to residents in the upper atmosphere through the use of high-altitude balloons, and a project that has only recently become visible and the source of much public debate – known as ‘Project Floating Herring’ – in which a significantly sized floating barge carrying modular container-based data centers has been spied sitting in the San Francisco Bay.

“You will notice there is no power or network infrastructure going to any of those data center shipping containers,” said John Knownothing, Chief Engineer at Dubious Lee Technical Engineering Credibility Corp.  “That’s because they have mastered wireless electrical transfer at the large multi-megawatt scale.”

Real estate rates in the Bay Area have increased almost exponentially over the last ten years, making the construction of large-scale data center facilities an expensive endeavor.  During the same period, the Port of San Francisco has unfortunately seen a steady decline in its import/export trade.  After a deep analysis it was discovered that docking fees in the Port of San Francisco are considerably undervalued and will provide Google with an incredibly cheap real estate option in one of the most expensive markets in the world.

It will also allow them to expand their use of renewable energy through tidal power generation built directly into the barge’s hull.  “They may be able to collect as much as 30 kilowatts of power sitting on the top of the water like that,” continues Knownothing, “and while none of that technology is actually visible, possible, or exists, we are certain that Google has it.”

While the technical intricacies of the project fascinate many, the initiative does have its critics, like Compass Data Center CEO Chris Crosby, who laments the potential social aspects of this approach: “Life at sea can be lonely, and no one wants to think about what might happen when a bunch of drunken data center engineers hit port.”  Additionally, Crosby mentions the potential for a backslide in human rights: “I think we can all agree that the prospect of being flogged or keelhauled really narrows down the possibility for those outage-causing human errors. Of course, this sterner level of discipline does open up the possibility of mutiny.”

However, the public launch of Project Floating Herring will certainly need to await the delivery of the more shrouded Project Rabbit Ears, for various reasons.  Most specifically, the primary reason for the development of this technology is so that Google can ultimately drive the floating facility out past twelve miles into international waters, where it can then dodge all national, regional, and local taxation, along with the safe harbor and privacy legislation of any country or national entity on the planet that would use its services.  In order to realize that vision, in the current network paradigm, Google would need exceedingly long network cables to attach to Network Access Points and Carrier Connection points as the facilities drive through international waters.

This is where Project Rabbit Ears becomes critical to the Google strategy.  Making use of the deep earth mining equipment, Google will be able to drill deep into the Earth’s crust, into the mantle, and ultimately build a large Network Access Point near the Earth’s core.  This Planetary Wi-Fi solution will be centrally located to cover the entire Earth without the use of regional Wi-Fi repeaters.  Google’s floating facilities could then gain access to unlimited bandwidth and provide yet another consumer-based monetization strategy for the company.

Knownothing also speculates that such a move would allow Google to make use of enormous amounts of free geothermal power and almost single-handedly become the greenest power user on the planet.  Speculation also abounds that Google could then sell that power through its as-yet-uninvented large-scale multi-megawatt wireless power transfer technology, as unseen on its floating data centers.

Much of the discussion around this kind of technology innovation attributed to Google has been given a credible amount of veracity and discussed by many seemingly intelligent technology-based news outlets and industry organizations who should intellectually know better, but prefer not to acknowledge the inconvenient lack of evidence.

 

\Mm

Editor’s Note: I have many close friends in the Google Infrastructure organization and firmly believe that they are doing some amazing, incredible work in moving the industry along, especially in solving problems at scale.  What I find simply amazing is how often, in the search for innovation, our industry creates things that may or may not be there and convinces itself so firmly that they exist.

2014: The Year Cloud Computing and Internet Services Will Be Taxed. A.K.A. I hate to say it. I told you so.

 


It’s one of those times I really hate to be right.  As many of you know, I have been talking about the various grassroots efforts afoot across many of the EU member countries to start driving a more significant tax regime on Internet-based companies.  My predictions for the last few years have been more cautionary tales based on what I saw happening from a regulatory perspective on a much smaller scale, country to country.

Today’s Wall Street Journal has an article discussing France’s movement to begin taxation of Internet-related companies who derive revenue from users and companies across the entirety of the EU, while holding those companies responsible to the tax base in each country.  This means such legislation is likely to become quite fractured and tough for Internet companies to navigate.  The French proposition is asking the European Commission to draw up proposals by the spring of 2014.

This is likely to have a very interesting effect (read: cost increases) across just about every aspect of Internet and Cloud Computing resources.  From a business perspective this is going to increase costs, which will likely be passed on to consumers in small but interesting ways.  Internet advertising will need to be differentiated on a country-by-country basis, and advertisers will end up having different cost structures.  Cloud Computing companies will DEFINITELY need to understand where customer instances were located, and whether or not they were making money there, as sketched below.  Potentially more impactful, customers of cloud computing may be held to account for taxation obligations that they did not know they had!  Things like Data Center Site Selection are likely going to become even more complicated from a tax analysis perspective, as countries with higher populations will likely become no-go zones (perhaps) or require the passage of even more restrictive laws.
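To make that operational burden a bit more concrete, here is a minimal sketch, assuming the kind of per-country usage tracking described above; the record layout, customer names, and tax rates are entirely hypothetical and purely illustrative.

```python
# A minimal sketch of per-country usage and revenue roll-up for tax exposure.
# All customer IDs, figures, and rates below are made up for illustration.
from collections import defaultdict

# (customer_id, country, instance_hours, revenue_eur) usage records
usage_records = [
    ("cust-1", "FR", 1200, 340.00),
    ("cust-1", "DE",  300,  85.00),
    ("cust-2", "FR",  450, 120.00),
]

hours_by_country = defaultdict(float)
revenue_by_country = defaultdict(float)
for _, country, hours, revenue in usage_records:
    hours_by_country[country] += hours
    revenue_by_country[country] += revenue

# If each member state sets its own rate, exposure must be computed per country.
hypothetical_tax_rates = {"FR": 0.03, "DE": 0.02}  # invented rates, not real law

for country, revenue in revenue_by_country.items():
    exposure = revenue * hypothetical_tax_rates.get(country, 0.0)
    print(f"{country}: {hours_by_country[country]:.0f} instance-hours, "
          f"EUR {revenue:.2f} revenue, est. tax exposure EUR {exposure:.2f}")
```

Even a toy roll-up like this shows why fractured, country-by-country rules raise the bookkeeping bar for providers and their customers alike.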

It’s not as if the seeds of this haven’t been around since 2005; I think most people just preferred to turn a blind eye while the seed sprouted into a full-fledged tree.  Going back to my Cat and Mouse Papers from a few years ago…  the Cat has caught the mouse, and it’s now the mouse’s move.

\Mm

 

Author’s Note: If you don’t have a subscription to the WSJ, All Things Digital did a quick synopsis of the article here.

Budget Challenged States, Data Center Site Selection, and the Dangers of Pay to Play

Site Selection can be a tricky thing.  You spend a ton of time upfront looking for that perfect location: the confluence of dozens of criteria, digging through fiber maps, looking at real estate, income, and other state taxes.  Even the best-laid plans and most thoughtful approaches can be waylaid by changes in government, the emergence of new laws, and other regulatory changes which can put your selection at risk.  I was recently made aware of yet another cautionary artifact you might want to pay attention to: Pay to Play laws and budget-challenged states.

As many of my frequent readers know, I am from Chicago.  In Chicago, and in Illinois at large, “Pay to Play” has much different connotations than the topic I am about to bring up right now.  In fact, the Chicago version broke out into an all-out national and international scandal.  There is a great book about it if you are interested, aptly entitled Pay to Play.

The Pay to Play that I am referring to is an emerging set of regulations and litigation techniques that require companies to pay tax bills upfront (without any kind of recourse or mediation), which then forces companies to litigate to try to recover those taxes if they are unfair.  Increasingly I am seeing this in states where the budgets are challenged and governments, looking for additional funds, are targeting Internet-based products and services.  In fact, I was surprised to learn that AOL has been going through this very challenge.  While I will not comment on the specifics of our case (it’s not specifically related to data centers anyway), it may highlight potential pitfalls and longer-term items to take into account when performing Data Center Site Selection.  You can learn more about the AOL case here, if you are interested.

For me it highlights that a lack of understanding of Internet services by federal and local governments, combined with a lack of inhibition in aggressively pursuing revenue despite that lack of understanding, can be dangerous and impactful to companies in this space.  These can pose real dangers, especially with regard to where one selects a site for a facility.  These types of challenges can come into play whether you are building your own facility, selecting a colocation facility and hosting partner, or, if stretched, eventually where your cloud provider may have located their facility.

It does beg the question as to whether or not you have checked into the financial health of the States you may be hosting your data and services in.   Have you looked at the risk that this may pose to your business?  It may be something to take a look at!

 

\Mm

The Cloud Cat and Mouse Papers – Site Selection Roulette and the Insurance Policies of Mobile Infrastructure


It’s always hard to pick exactly where to start in a conversation like this, especially since this entire process really represents a changing life-cycle.  It’s more of a circular spiral that moves out (or evolves) as new data is introduced than a traditional life-cycle, because new data can fundamentally shift the technology or approach.  That being said, I thought I would start our conversations at a logical starting point: where does one place one’s infrastructure?  Even in its embryonic “idea phase,” the intersection of government and technology begins its delicate dance to a significant degree. These decisions will ultimately have an impact on more than just where a company’s capital investments are located.  They affect the products and services it offers and, as I propose, ultimately the customers that use the services at those locations.

As I think back to the early days of building out a global infrastructure, the Site Selection phase started at a very interesting place.  In some ways we approached it with a level of sophistication that has yet to be matched today, and in other ways we were children playing a game whose rules had not yet been defined.

I remember sitting across numerous tables from government officials talking about making an investment (largely just land purchase decisions) in their local community.  Our Site Selection methodology had brought us to these areas: a Site Selection process which continued to evolve as we got smarter, and as we started to truly understand the dynamics of the system we were being introduced to.  In these meetings we always sat stealthily behind a third-party real estate partner.  We never divulged who we were, nor were they allowed to ask us that directly.  We would pepper them with questions, and they in turn would return the favor.  It was all cloak and dagger, with the real estate entity taking all action items to follow up with both parties.

Invariably during these early days, these locales would always walk away with the firm belief that we were a bank or financial institution.  When they delved into our financial viability (for things like power loads, commitment to capital build-out, etc.) we always stated that any capital commitments and longer-term operational cost commitments were not a problem.  In large part the cloak-and-dagger aspect was to keep land costs down (as we matured, we discovered this was quite literally the last thing we needed to worry about), as we feared that once our name became attached to the deal our costs would go up.  These were the early days of seeding global infrastructure, and it was not just us.  I still laugh at the fact that one of our competitors bound a locality up so much in secrecy that the community referred to the data center as Voldemort, he who shall not be named, in reference to the Harry Potter book series.

This of course was not the only criterion that we used.  We had over 56 criteria, with various levels of importance and weighting, by the time I left that particular effort.  Some Internet companies today use fewer, some about the same, and some don’t use any; they ride on the backs of others who have trail-blazed a certain market or locale.  I have long called this effect Data Center Clustering.  The rewards for being a first mover are big; less so if you follow, but ultimately still positive.

If you think about most of the criteria used to find a location, they almost always focus on current conditions, with some acknowledgement in some of the criteria of the look forward.  This is true, for example, when looking at power costs.  Power costs today are important to siting a data center, but so is understanding the generation mix of that power, the corresponding price volatility, and modeling that ahead to predict (as best as possible) longer-term power costs.

What many miss is understanding the more subtle political layer that emerges once a data center has been placed or a cluster has developed. Specifically, the political and regulatory landscape can change very quickly (in relationship to the life of a data center facility, which is typically measured in 20-, 30-, or 40-year lifetimes).  It’s a risk that places a large amount of capital assets potentially in play and vulnerable to these kinds of changes.  It’s something that is very hard to plan or model against.  That being said, there are indicators and clues that one can use to at least weigh risk factors against, or, as some are doing, to ensure that the technology they deploy limits their exposure.  In cloud environments the question remains open – how much are companies using cloud infrastructure in these facilities at risk?  We will explore this a little later.

That’s not to say that this process is all downside either.  As we matured in our approach, we came to realize that the governments (local or otherwise) were strongly incented to work with us on getting us a great deal and in fact competed over this kind of business.  Soon you started to see the offers changing materially.  It was less about the land or location and quickly evolved into what types of tax incentives, power deals, and other mechanisms could be put in play.  You saw (and continue to see) deals structured around sales tax breaks, real estate and real estate tax deals, economic incentives around breaks in power rates, specialized rate structures for Internet and cloud companies, and the like.  The goal here, of course, was to create the public equivalent of “golden handcuffs” for the tech companies and try to marry them to a particular region, state, or country.  In many cases, all three.  The benefits here are self-apparent.  But can they (or more specifically, will they) be passed on in some way to small companies who make use of cloud infrastructure in these facilities? While definitely not part of the package deals done today, I could easily see site selection negotiations evolving to incent local adoption of cloud technology in these facilities, or provisions being put in place tying adoption and hosting to tax breaks and other deal structures in the mid to longer timeframe for hosting and cloud companies.

There is still a learning curve out there, as most governments mistakenly try to tie these investments to job creation.  Data centers, operations, and the like represent the cost of goods sold (COGS) to the cloud business.  Therefore there is a constant drive towards efficiency and the reduction of the highest-cost components to deliver those products and services.  Generally speaking, people are the primary targets in these environments.  Driving automation in these environments is job one for any global infrastructure player.  One of the big drivers for us investing in and developing a 100% lights-out data center at AOL was eliminating those kinds of costs.  Those governments that highlight job creation targets over other types typically don’t get the site selection.  After having commissioned an economic study following a few of my previous big data center builds, I can tell you that the value to a region or a state does not come from the upfront jobs the data center employs.  After a local radio station called into question the value of having such a facility in their backyard, we used an internationally recognized university to perform a third-party “neutral” assessment of the economic benefits (sans direct people), and the numbers were telling.  We surrendered all construction costs and other related material to them, and over the course of a year, through regional interviews and the like, they investigated what the direct impacts of a data center were on the local community, and the overall impact of the addition.  The results of that study are owned by a previous employer, but I can tell you with certainty that these facilities can be beneficial to local regions.

No one likes constraints, and as such you are beginning to see technology companies use their primary weapon – technology – to mitigate their risks even in these scenarios.  One cannot deny, for example, that while container-based data centers offer some interesting benefits in terms of energy and cost efficiencies, there is also a certain mobility to that kind of infrastructure that has never been available before.  Historically, data centers are viewed as large capital anchors to a location.  Once in place, hundreds of millions to billions (depending on the size of the company) of dollars of capital investment are tied to that region for its lifespan.  It’s as close to permanent in the tech industry as building a factory was during the industrial revolution.

In some ways modularization of the data center industry is/can/will have the same effect as the shipping container did in manufacturing.  All puns intended.  If you are unaware of how the shipping container revolutionized the world, I would highly recommend the book “The Box” by Marc Levinson; it’s a quick read and very interesting if you read it through the lens of IT infrastructure and the parallels of modularization in the data center industry at large.

It gives the infrastructure companies more exit options and mobility in the future than they would have had in the past under large capital build-outs.  It’s an insurance policy, if you will, against potential changes in legislation or regulation that might negatively impact the technology companies over time.  Just another move in the cat and mouse games that we will see evolving here over the next decade or so in terms of the interactions between governments and global infrastructure.

So what about the consumers of cloud services?  How much of a concern should this represent for them?  You don’t have to be a big infrastructure player to understand that there are potential risks in where your products and services live.  Whether you are building a data center or hosting inside a real estate or co-location provider, these are issues that will affect you.  Even in cases where you only use the cloud provisioning capabilities within your chosen provider, you will typically be given options for what region or area you would like your gear hosted in.  Typically this is done for performance reasons – reaching your customers – but perhaps this information might cause you to think of the larger ramifications to your business.  It might even drive requirements into the infrastructure providers to make this more transparent in the future.

These evolutions in the relationship between governments and technology, and the technology options available to them, will continue to shape site selection policy for years to come.  How they will ultimately affect those that use this infrastructure, whether directly or indirectly, remains to be seen.  In the next paper we will explore this interaction more deeply as it relates to the customers of cloud services and the risks and challenges specifically for them in this environment.

\Mm

Site Selection, Data Center Clustering, and Their Interaction

I have written many times on the importance of site selection for data centers and its growing importance when one considers the regulatory and legislative efforts underway globally.  Those who make their living in this space know that this is going to have a significant impact on the future landscape of these electronic bit factories.  The ongoing long-term operational costs over the life of the facility, their use of natural resources (such as power), and what they house and protect (PII, or Personally Identifiable Information) are even now significantly impacting this process for many large global firms, and it is making its way into the real estate community.  This is requiring a series of crash courses in information security, power regulation and rate structures, and other complex issues for many in the real estate community.

In speaking to a bunch of friends on the real estate side of the business, I thought it might be interesting to take a few of these standard criteria head-on in an open discussion.  For this post I think I will take on two of the elemental ones in data center site selection.  We will look at one major item, and one item that is currently considered a minor factor but is quickly rising in overall importance: namely power and water, respectively.

Watts the Big Deal?

Many think that power cost alone is the primary driver for data centers, and while it is always a factor, there are many other facets that come into play underneath that broader category of power.  While site selection factors are always considered highly confidential, I thought I might highlight some of the wider arcs in this category.

One such category getting quite a bit of attention is the Power Generation Mix.  The generation mix is important because it essentially describes the energy sources from which an area or region gets its power.  Despite what politicians would lead you to believe, once an electron is on the grid it is impossible to tell from which source it came.  So ‘Green Energy’ and its multitude of definitions is primarily determined by the mix of energy sources for a given region.  A windmill, for example, does not generate an electron with a tiny label saying it is sourced from ‘green’ or ‘renewable’ sources.  Understanding the generation mix of your power will allow you to forecast and predict your data center’s potential carbon output.  The Environmental Protection Agency in the US produces a metric called the Carbon Emission Factor, based upon the generation mix of the areas you are looking to site select in, which can be applied to your consumption to assist you in calculating your carbon output.  Whether you are leasing or building your own facility, you will likely find yourself falling under mandatory reporting compliance for this kind of thing.
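As a rough illustration of that math, here is a minimal sketch; the regional emission factors, PUE, and load figures below are made-up placeholders for illustration, not real EPA values.

```python
# A minimal sketch of the emission-factor math described above.
# The factors below are illustrative placeholders, not real EPA eGRID values.

REGION_EMISSION_FACTORS_KG_PER_MWH = {
    "region_a": 450.0,   # hypothetical fossil-heavy grid
    "region_b": 120.0,   # hypothetical mix with a large nuclear/hydro share
}

def annual_carbon_output_tonnes(it_load_mw: float, pue: float,
                                region: str, hours: float = 8760.0) -> float:
    """Estimate annual CO2 output (metric tonnes) for a facility.

    total energy (MWh) = IT load (MW) * PUE * hours in a year
    carbon (kg)        = energy (MWh) * regional emission factor (kg CO2 / MWh)
    """
    energy_mwh = it_load_mw * pue * hours
    carbon_kg = energy_mwh * REGION_EMISSION_FACTORS_KG_PER_MWH[region]
    return carbon_kg / 1000.0

# Example: a 10 MW IT load at a PUE of 1.3 in each hypothetical region.
for r in REGION_EMISSION_FACTORS_KG_PER_MWH:
    print(r, round(annual_carbon_output_tonnes(10, 1.3, r)), "tonnes CO2/yr")
```

The point of the toy example is simply that the same facility, with the same load, can report a very different carbon footprint depending on where it sits.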

So you might be thinking, ‘Great, I just need to find the areas that have cheap power and a good Carbon Emission Factor, right?’  The answer is no.  Many site selection processes that I see emerging in the generic space start and stop right at this line.  I would, however, advocate that one take the next logical step, which is to look at the relationship of these factors together and over a long period of time.

Generation Mix has long been considered a ‘forever’ kind of thing.  The generation sources within a region rarely changed, or have rarely changed, over time.  But that is of course changing significantly in the new era that we live in.

Let’s take the interplay (both historical and moving forward) of power cost and its relationship with the generation mix.  As humans we like to think in simplistic terms.  Power costs for a specific region are ‘so many cents per kilowatt-hour’; this changes based upon whether you are measured at a residential, commercial, or industrial rate schedule.  The rate schedule is a function of how much power you ultimately consume, or promise to consume, to the local utility.  The reality of course is much more complicated than that.  Power rates fluctuate constantly based upon the overall mix.  Natural disasters, regulation, etc. can have a significant impact on power cost over time.  Therefore it’s generally wise to look at Generation Mix Price Volatility through the longer-term lens of history and see how a region’s power costs oscillate between these types of events.  However you decide to capture or benchmark this, it is a factor that should be considered.
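One hedged way to benchmark that volatility, assuming you can assemble a monthly price history per candidate region (the figures below are invented purely for illustration), is a simple coefficient-of-variation comparison:

```python
# A minimal sketch of benchmarking regional power-price volatility from a
# monthly price history (cents/kWh). The data below is made up for illustration.
from statistics import mean, stdev

monthly_price_history = {
    "region_a": [6.1, 6.0, 6.3, 7.9, 8.4, 6.2, 6.1, 6.0],   # spike after a hypothetical event
    "region_b": [7.2, 7.1, 7.3, 7.2, 7.4, 7.3, 7.2, 7.3],   # pricier but steadier
}

def price_volatility(prices: list[float]) -> float:
    """Coefficient of variation: standard deviation relative to the mean price."""
    return stdev(prices) / mean(prices)

for region, prices in monthly_price_history.items():
    print(f"{region}: mean={mean(prices):.2f} c/kWh, "
          f"volatility={price_volatility(prices):.2%}")
```

A region with slightly higher average rates but far lower volatility may well be the better long-term bet for a facility with a multi-decade life.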

This is especially true when you take this volatility factor and apply it to the changing requirements of carbon reporting and its impacts.  While the United States is unlikely to have a law similar to the CRC (Carbon Reduction Commitment) in the UK, it will see legislation and regulation impacting the energy producers.

You might be asking yourself, ‘Who cares if they go after those big bad energy companies and force them to put more green power in their mixes?’  Well, let’s think about the consequences of these actions for you, the user, and why they are important to your site selection activity.

As the energy producers are regulated to bring a more ‘green’ mix into their systems, two things will happen.  The first, of course, is that rates will rise.  The energy producers will need to sink large amounts of capital into these technologies, plants, research and development, etc. to come into alignment with the legal requirements they are being regulated to.  This effect will be uneven, as many areas around the globe have quite a disparate mix of energy from region to region.  It will also mean that ‘greener’ power will likely mean ‘more expensive’ power.  Assessing an area for the potential impacts of these kinds of changes is definitely important in a data center build scenario, as you likely have a desire to ensure that your facility has the longest possible life, which could span a couple of decades.  The second thing, which may be a bit harder to guess at, is which technology a given region or area is likely to pick and its resulting carbon output impact.  While I have a definite approach to thinking through such things, this is essentially the beginning of the true secret sauce of site selection expertise and the help you may require if you don’t have an internal group to go through this kind of data and modeling.  This is going to have an interesting impact on the ‘clustering’ effect that happens in our industry at large.

We have seen many examples, like Quincy, Washington and San Antonio, Texas, where the site selection process has led many data center providers to locate in the same area to benefit from this type of analysis (even if they were not directly exposed to the criteria).  There is a story (that I don’t know to be true or not) that in the early days, when a new burger chain was looking to expand where it would place its restaurants, it used the footprint of its main competitor as its guide. The thinking was that the competitor probably had a very scientific method for that selection, and they would receive the same ancillary benefit without the cost and effort.  Again, not sure if that is true or not, but it’s definitely something likely to happen in our industry.

In many markets these types of selections are in high demand.  Ascent Corporation out of St. Louis is in the process of building a modern facility just down the street from the Microsoft mega-facility near Chicago.  While Ascent was a part of the original Microsoft effort to build at that location, there has been an uptick in interest in being close to that facility for the same reasons I have outlined here.  The result is that their CH2 facility is literally a stone’s throw from the Microsoft behemoth.  The reasons? Proximity to power, fiber, and improved water infrastructure, all already there in abundance.  The facility even boasts container capabilities just like its neighbor.  The Elmhurst Electrical Substation sits directly across the highway from the facility, with the first set of transmission poles within easy striking distance.

[Image: Elmhurst Electrical Yard]

The generation mix of that area has a large nuclear component, which has little to no carbon impact and provides long-term stability in terms of power cost fluctuations.  According to Phil Horstmann, President of Ascent, there is tremendous interest in the site, and one of the key draws is the proximity of its nearby neighbor.  In the words of one potential tenant: ‘It’s like the decision to go to IBM in the 80s.  It’s hard to argue against a location where Microsoft or Google has placed one of its facilities.’

This essentially dictates that there will be increasing demand for areas where this analysis is done, or has been perceived to be done.  This is especially true where co-location and hosting providers can align their interests with those commercial locations where there is market demand.  While those that follow first movers will definitely benefit from these decisions (especially those without dedicated facility requirements), first movers continue to have a significant advantage if they can get this process correct.

Tying into the power conversation is that of water.  With the significant drive for economization (whether water-based or air-based), water continues to be a factor.  What many people don’t understand is that in many markets the discharge water is too clean to dump into the sewage system and too ‘dirty’ to discharge to retention ponds.  This causes all kinds of potential issues, and understanding the underlying water landscape is important.  The size of the metropolitan sewage environment, the ability to dig your own wells, the local water table and aquifer issues, your intended load and resulting water requirements, and how the local county, municipality, or region views discharge in general (and which chemicals, in what quantities) are all important to think about today.  And as the use of water comes under greater environmental scrutiny, water is quickly rising on the site selection radar of many operators and those with long-term holds.

I hope this brief talk was helpful.  I hope to post a few other key factors and a general discussion in the near future.  

\Mm

CO2K Doubter? Watch the Presidential address today

Are you a data center professional who doubts that carbon legislation is going to happen, or who believes this initiative will never get off the ground?   This afternoon President Obama plans to outline his intention to assess a cost for carbon consumption at a conference highlighting his economic accomplishments to date.   The backdrop to this, of course, is the massive oil rig disaster in the Gulf.

As my talk at the Uptime Institute Symposium highlighted, this type of legislation will have a big impact on data center and mission-critical professionals.  Whether you know it or not, you will be front and center in assisting with the response, collection, and reporting required to react to this kind of potential legislation.  When I questioned the audience in attendance during my talk, it was quite clear that most of those in the room were vastly ill-prepared and ill-equipped for this kind of effort.

If passed, this type of legislation is going to cause a severe reaction inside organizations to ensure that they are in compliance, and will likely lead to a huge increase in spending in an effort to collect and report energy information.  For many organizations this spending will be significant.

The US House of Representatives has already passed a version of this known as the Waxman-Markey bill.   You can bet that there will be a huge amount of pressure to get a Senate version passed and out the door in the coming weeks and months.

This should be a clarion call for data center managers to step up, raise awareness within their organizations about this pending legislation, and take a proactive role in establishing a plan for a corporate response.   Take an inventory of your infrastructure and assess what you will need to begin collecting this information (a rough sketch of that exercise follows below).  It might even be wise to get a few quotes to get an idea or ballpark cost of what it might take to bring your organization up to the task.  It’s probably better to start doing this now than to be told by the business to get it done.
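As a minimal sketch of that inventory step, assuming nothing more than a list of the electrical and mechanical assets you operate; the equipment names and fields below are hypothetical examples, not any standard.

```python
# A minimal sketch: inventory your infrastructure and flag what is not yet
# metered for energy data. Asset names and fields are illustrative only.
infrastructure = [
    {"name": "utility-feed-A", "type": "utility meter", "metered": True},
    {"name": "generator-1",    "type": "generator",     "metered": False},
    {"name": "ups-1",          "type": "UPS",           "metered": True},
    {"name": "crac-row-3",     "type": "cooling",       "metered": False},
    {"name": "pdu-row-3",      "type": "PDU",           "metered": True},
]

unmetered = [item for item in infrastructure if not item["metered"]]

print(f"{len(unmetered)} of {len(infrastructure)} assets have no energy metering:")
for item in unmetered:
    print(f"  - {item['name']} ({item['type']}) -> needs a meter or an estimation method")
```

The gaps that fall out of an exercise like this are exactly what those ballpark quotes should cover.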

\Mm

Open Source Data Center Initiative

There are many in the data center industry who have repeatedly called for change in this community of ours: change in technology, change in priorities, change for the future.  Over the years we have seen those changes come very slowly, and while they are starting to move a little faster now (primarily due to the economic conditions and scrutiny over budgets more so than a desire to evolve our space), our industry still faces challenges and resistance to forward progress.   There are lots of great ideas and lots of forward thinking, but moving this work to execution, and educating business leaders as well as data center professionals to break away from those old standby accepted norms, has not gone well.

That is why I am extremely happy to announce my involvement with the University of Missouri in the launch of a not-for-profit, data center specific organization.   You might have read the formal announcement by Dave Ohara, who launched the news via his industry website, GreenM3.   Dave is another of those industry insiders who has long been perplexed by the lack of movement and initiative we have had despite some great ideas and standouts doing great work.  More importantly, it doesn’t stop there.  We have been able to put together quite a team of industry heavyweights to get involved in this effort.  Those announcements are forthcoming, and when they come, I think you will get a sense of the type of sea change this effort could potentially bring.

One of the largest challenges we have with regard to data centers is education.   Those of you who follow my blog know that I believe some engineering and construction firms are incented not to change or implement new approaches.  The cover of complexity allows customers to remain in the dark while innovation is stifled. Those forces who desire to maintain an aura of black-box complexity around this space, and who repeatedly speak to the arcane arts of building out data center facilities, have been at this a long time.  To them, the interplay of systems requiring one-off monumental temples to technology on every single build is the norm.  It’s how you maximize profit and keep yourself in a profitable position.

When I discussed this idea briefly with a close industry friend, his first question naturally revolved around how this work would compete with that of The Green Grid, the Uptime Institute, Data Center Pulse, or the other industry groups.  Essentially, was this going to be yet another competing thought-leadership organization?  The very specific answer to this is no, absolutely not.

These groups have been out espousing best practices for years.  They have embraced different technologies, they have tried to educate the industry, and they have been pushing for change (for the most part).  They do a great job of highlighting the challenges we face, but for the most part they have waited around for universal goodwill and monetary pressures to make changes happen.  It dawned on us that there was another way.   You need to ensure that you build something that gains mindshare, that gets business leadership’s attention, that causes a paradigm shift.   As we put the pieces together we realized that the solution had to be credible, technical, and above all have a business case around it.   It seemed to us the parallels to the Open Source movement and the applicability of the approach were a perfect match.

To be clear, this Open Source Data Center Initiative is focused on execution.   It’s focused on putting together an open and free engineering framework upon which data center designs, technologies, and the like can be quickly assembled, and moreover on standardizing the approaches that both end-users and engineering firms take to the data center industry.

Imagine, if you will, a base framework upon which engineering firms, or even individual engineers, can propose technologies and designs, and specific solution vendors can pitch technologies for inclusion and highlight their effectiveness.  More than all of that, it will remove much of the mystery behind the work that happens in designing facilities and normalize conversations.

If you think of the Linux movement, and all of those who actively participate in submitting enhancements and features, even pulling together specific build packages for distribution, one could see the same things emerging in the data center engineering realm.   In fact, with the myriad of emerging technologies assisting in greater energy efficiency, greater densities, differences in approach to economization (air or water), and the use or non-use of containers, it’s easy to see the potential for this component-based design.

One might think that we are effectively trying to put formal engineering firms out of business with this kind of work.  I would argue that this is definitely not the case.  While it may have the effect of removing some of the extra profit that results from the current ‘complexity’ factor, this initiative should specifically drive common requirements, lead to better-educated customers, drive specific standards, and result in real-world testing and data from the manufacturing community.  Plus, as anyone knows who has ever actually built a data center, the devil is in the localization and the details.  And as this is an open-source initiative, we will not be formally signing the drawings from a professional engineering perspective.

Manufacturers could submit their technologies and sample applications of their solutions, and have those designs plugged into a ‘package’ or ‘RPM’, if I can steal a term from the Red Hat Linux nomenclature (a purely illustrative sketch of what such a package might look like follows below).  Moreover, we will be able to start driving true visibility of costs, both upfront and operating, and associate those costs with the set designs, with differences and trending from regions around the world.  If it’s successful, it could be a very good thing.
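To make the idea a bit more tangible, here is a minimal, hypothetical sketch of what a design ‘package’ manifest could look like; every field name, component, and number below is an invented placeholder, not anything produced by the initiative itself.

```python
# A hypothetical sketch of a design "package" manifest for an open framework.
# Field names, components, and figures are illustrative placeholders only.
from dataclasses import dataclass, field

@dataclass
class DesignPackage:
    name: str
    submitted_by: str                  # engineering firm, vendor, or individual engineer
    cooling_approach: str              # e.g. "air-side economization", "water-side economization"
    design_it_load_mw: float
    rated_pue: float                   # as claimed/tested by the submitter
    components: list[str] = field(default_factory=list)
    est_capex_per_mw: float = 0.0      # upfront cost, normalized per MW of IT load
    est_opex_per_mw_year: float = 0.0  # operating cost, normalized per MW-year

    def summary(self) -> str:
        return (f"{self.name} ({self.cooling_approach}), "
                f"{self.design_it_load_mw} MW IT load, PUE {self.rated_pue}, "
                f"capex ${self.est_capex_per_mw:,.0f}/MW")

pkg = DesignPackage(
    name="modular-airside-1.0",
    submitted_by="example-engineering-firm",
    cooling_approach="air-side economization",
    design_it_load_mw=1.2,
    rated_pue=1.25,
    components=["containerized IT modules", "evaporative assist", "medium-voltage switchgear"],
    est_capex_per_mw=7_000_000,
    est_opex_per_mw_year=900_000,
)
print(pkg.summary())
```

The value is less in any single manifest than in normalizing the fields so that designs, costs, and regional variations can be compared side by side.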

We are not naive about this, however.  We certainly expect there to be some resistance to this approach out there, and in fact some outright negativity from those firms that profit the most from the black-box complexity components.

We will have more information on the approach and what it is we are trying to accomplish very soon.  

 

\Mm