Google Purchase of Deep Earth Mining Equipment in Support of ‘Project Rabbit Ears’ and Worldwide Wi-Fi Availability…

(10/31/2013 – Mountain View, California) – Close examination of Google’s data center construction-related purchases has revealed the procurement of large-scale deep earth mining equipment. While the actual need for the deep mining gear is unclear, many speculate that it has to do with a secretive internal project that has come to light, known only as Project: Rabbit Ears.

According to sources not at all familiar with Google technology infrastructure strategy, Project Rabbit Ears is the natural outgrowth of Google’s desire to provide ubiquitous infrastructure worldwide. On the surface, these efforts seem consistent with other incorrectly speculated projects such as Project Loon, Google’s attempt to provide Internet services to residents of the upper atmosphere through the use of high-altitude balloons, and a project that has only recently become visible and the source of much public debate, known as ‘Project Floating Herring’, in which a significantly sized floating barge with modular container-based data centers has apparently been spied sitting in San Francisco Bay.

“You will notice there is no power or network infrastructure going to any of those data center shipping containers,” said John Knownothing, Chief Engineer at Dubious Lee Technical Engineering Credibility Corp. “That’s because they have mastered wireless electrical transfer at the large multi-megawatt scale.”

Real estate rates in the Bay Area have increased almost exponentially over the last ten years, making the construction of large-scale data center facilities an expensive endeavor. During the same period, the Port of San Francisco has unfortunately seen a steady decline in its import/export trade. After deep analysis, it was discovered that docking fees in the Port of San Francisco are considerably undervalued and will provide Google with an incredibly cheap real estate option in one of the most expensive markets in the world.

It will also allow Google to expand its use of renewable energy through tidal power generation built directly into the barge’s hull. “They may be able to collect as much as 30 kilowatts of power sitting on top of the water like that,” continues Knownothing, “and while none of that technology is actually visible, possible, or known to exist, we are certain that Google has it.”

While the technical intricacies of the project fascinate many, the initiative does have its critics, like Compass Datacenters CEO Chris Crosby, who laments the potential social aspects of this approach: “Life at sea can be lonely, and no one wants to think about what might happen when a bunch of drunken data center engineers hit port.” Additionally, Crosby mentions the potential for a backslide into human rights violations: “I think we can all agree that the prospect of being flogged or keelhauled really narrows down the possibility for those outage-causing human errors. Of course, this sterner level of discipline does open up the possibility of mutiny.”

However, the public launch of Project Floating Herring will certainly need to await the delivery of the more shrouded Project Rabbit Ears for various reasons. Most specifically, the primary reason for the development of this technology is so that Google can ultimately drive the floating facility out past twelve miles into international waters, where it can then dodge all national, regional, and local taxation, along with the safe harbor and privacy legislation of any country or national entity on the planet that would use its services. In order to realize that vision in the current network paradigm, Google would need exceedingly long network cables to attach to Network Access Points and carrier connection points as the facilities drive through international waters.

This is where Project Rabbit Ears becomes critical to the Google strategy. Making use of the deep earth mining equipment, Google will be able to drill deep into the Earth’s crust, into the mantle, and ultimately build a large Network Access Point near the Earth’s core. This planetary Wi-Fi solution will be centrally located to cover the entire Earth without the use of regional Wi-Fi repeaters. Google’s floating facilities could then gain access to unlimited bandwidth and provide yet another consumer-based monetization strategy for the company.

Knownothing also speculates that such a move would allow Google to make use of enormous amounts of free geothermal power and almost single-handedly become the greenest power user on the planet. Speculation also abounds that Google could then sell that power through its as-yet-uninvented large-scale multi-megawatt wireless power transfer technology, as unseen on its floating data centers.

Much of the discussion around this kind of technology innovation driven by Google has been given credible amounts of veracity and discussed by many seemingly intelligent technology-based news outlets and industry organizations who should intellectually know better, but prefer not to acknowledge the inconvenient lack of evidence.


\Mm

Editor’s Note: I have many close friends in the Google infrastructure organization and firmly believe that they are doing some amazing, incredible work in moving the industry along, especially in solving problems at scale. What I find simply amazing is how often, in the search for innovation, our industry creates things that may or may not be there and convinces itself so firmly that they exist.

Best of Luck to a Great Guy…

I just read that Chris Crosby has announced his departure from Digital Realty Trust. As an alumnus of that great firm, I can definitely tell you that Chris’s handprints are all over that company. I had the pleasure of interacting with him heavily, both before I joined and in my role there, and have maintained our relationship since leaving. Chris was an influential force in defining how that company ran, operated, and ultimately succeeded in dominating the wholesale data center market. Not to mention that he was always a charismatic tour de force as one of its primary faces and pitchmen.

I don’t know what Chris is up to next, but I wish him the greatest success and happiness. If his near-term goal is a little time off, Lord knows he has earned it. I do know that his tornado-like energy won’t keep him out of the fray for long.

\Mm

More Chiller Side Chat Redux….

I have been getting continued feedback on the Chiller Side Chat that we did live on Monday, September 14th. I wanted to take a quick moment to discuss one of the recurring themes in the emails I have been receiving on the topic of data center site selection and the decisions that result at the intersection of data center technology, process, and cost. One of the key things that we technical people often forget is that the data center is first and foremost a business decision. The business (whatever kind of business it is) has a requirement to improve efficiency through automation, store information, or whatever it is that represents the core function of that business. The data center is at the heart of those technology decisions and the ultimate place where those solutions will reside.

As the primary technical folks in an organization, whether you represent IT or Facilities, we can find ourselves getting deeply involved with the technical aspects of the facility – the design, the construction or retrofit, the amount of power or cooling required, the amount of redundancy we need, and the like. Those in upper management, however, view this in a substantially different way. It’s all about business. As I have gotten a slew of these emails recently, I decided to post my own response. As I thought about how I would go about this, I kept going back to Chris Crosby’s discussion at Data Center Dynamics about two years ago. As you know, I was at Microsoft at the time and felt that he did an excellent job of outlining the way the business person views data center decisions. So I went digging around and found this video of Chris talking about it. Hopefully this helps! If not, let me know and I am happy to discuss further or more specifically.

\Mm

Data Center Dynamics – San Francisco July 17th


For those of you interested, I will be at Data Center Dynamics in San Francisco on July 17th. I am scheduled to be on a panel moderated by James Staten of Forrester Research at 11:05 AM, along with Tom Furlong of Facebook, John Haas of Intel, and Bill Mazzetti of Rosendin Electric, entitled ‘The Data Center Efficiency Schism…New Realities in Design’. I may also do a guest appearance during the Chris Crosby keynote. When I am not speaking, I am likely to be found wandering around the event from session to session, so please feel free to stop me and say hi.

I am also planning on attending a Green Grid event at the conference later in the day.

If you would like more information or are looking to register, it can be found here. See you there!

\Mm

Schneider / Digital New York Speaking Engagement


Just in case anyone wanted to connect, I wanted to highlight that I will be co-presenting the keynote at the Schneider Symposium at the Millennium Broadway Hotel in New York City with Chris Crosby of Digital Realty Trust. I will also be giving a talk on practical applications of energy efficiency and sitting on an energy efficiency panel led by Dan Golding from Tier One Research. The program kicks off at 8 AM on Wednesday. Feel free to stop and say hi!

\Mm

Forecast Cloudy with Continued Enterprise


This post is a portion of my previous post that I broke into two. It more clearly defines where I think the market is evolving and why companies like Digital Realty Trust will be at the heart of the change in our industry.


The Birth of Utilities and Why a Century Ago Will Matter Today and Forward…

I normally kick off my vision-of-the-future talk with a mention of history first (my former Microsoft folks are probably groaning at this moment if they are reading this). I am a huge history buff. In December of 1879, Thomas Edison harnessed the power of electricity for the first time to light a light bulb. What’s not apparent is that this “invention” was in itself not complete. To get this invention from that point to large-scale commercial application required a host of other things to be invented as well. While much ado is made about the successful kind of filament used to ensure a consistent light source, there were no fewer than seven other inventions required to make electric light (and ultimately the electric utility) practical for everyone: things like the parallel circuit, an actually durable light bulb, an improved dynamo, underground conductor networks, devices to maintain constant voltage, insulating materials and safety fuses, the light socket, the on/off switch, and a bunch of other minor things. Once all these things were solved, the first public electricity utility could be created. In September of 1882, the first commercial power station, located on Pearl Street in lower Manhattan, opened its doors and began providing light and electrical power to all customers within the “massive” area of one square mile. This substation was a marvel of technology, staffed with tens of technicians maintaining the complex machinery to exacting standards. The ensuing battle between direct current and alternating current then began, and in some areas it still continues today. More on this in a bit.

A few years earlier, a host of people were working on what would eventually become known as the telephone. In the United States this work is attributed to Alexander Graham Bell, and it’s that story I will focus on here for a second. Through trial and error, Bell and his compatriot Watson accidentally stumbled across a system to transfer sound in June of 1875. After considerable work on refinement the product launched (there is an incredibly interesting history of this at SCRIBD), and after still more trial and error the first public telephone utility was created, with the very first central office coming online in January of 1878 in New Haven, Connecticut. This first central office was a marvel to behold: again, extremely high-tech equipment, with a host of people ensuring that telephone service was always available and calls were transferred appropriately. Interestingly, by 1881 only nine cities with populations above 10,000 were without access to the telephone utility, and only one above 15,000! That is an adoption rate that remains boggling even by today’s standards.

These are significant moments in time that truly changed the world and the way we live every day. Today we are witnessing the birth of another such utility: the Information Utility. Many people I have spoken to claim this “Information Utility” is something different. It’s more of a product, they say, because it uses existing utility services. Some maintain that it’s truly not revolutionary because it’s not leveraging new concepts. But the same can be said of those earlier utilities as well. The communications infrastructure we use today, whether telephone or data, has its very roots in the telegraph. The power utilities have a lot to thank the gas-lamp utilities of the past for solving early issues as well. Everything old is new again, and everything gets refined into something newer and better. Some call this new Information Utility the “cloud”, others the Information-sphere, others just call it the Internet. Regardless of what you call it, access to information is going to be at your fingertips more today and tomorrow than it has ever been before.

Even though this utility is built upon existing services, this utility too will have its infrastructure. Just as the electric utility has its substations and distribution yards, and the communication utilities have central offices, so too will data centers become the distribution mechanism for the Information Utility. We still have a lot of progress to make as well. Not everything is invented or understood yet. Just as Edison had to invent a host of other items to make electricity practical, and Bell and Watson had to develop the telephone and the telephone ringer (or more correctly, thumper or buzzer), so too does our Information Utility have a long way to go. In some respects it’s even more complicated than its predecessors, as they were not burdened with the legislation and government involvement that would affect their early development. The “Cloud” is.

And that innovation does not always come from a select few. Westinghouse and his alternating current eventually won out over direct current because it found its killer app and business case. Alternating current was clearly technically superior and better for distribution. Westinghouse had even demonstrated generating power at Niagara Falls and successfully transferred that power all the way to Buffalo, New York, something direct current was unable to do. In the end, Westinghouse worked with appliance manufacturers to create devices that used alternating current. By driving his killer apps (things like refrigerators), he ensured that Edison eventually lost out. So too will the cloud have its killer apps. The pending software and services battle will be interesting to watch. What is interesting to me, however, is that it was the business case that drove adoption and evolution here. This also modified how the utility was used and designed. DC substations gave way to AC substations, and what used to take scores of people to support has dwindled to occasional visits and pre-scheduled preventive maintenance. At the data center level, we cannot afford to think that these killer applications will not change our world. Our killer applications are coming, and they will forever change how our world does business. Data centers and their evolution are at the heart of our future.

On Fogs, Mist, and the Clouds Ahead . . .

After living in Seattle for close to 10 years, you become an expert in three things: clouds, rain, and more clouds. Unlike the utilities of the past, this new Information Utility is going to be made up of lots of independent cloudlets full of services. The Microsofts, Googles, and Amazons of the world will certainly provide a large part of the common platforms used by everyone, but the applications, products, content, customer information, and key development components will continue to have a life in facilities and infrastructure owned or controlled by the companies providing those services. In addition, external factors are already beginning to have a huge influence on cloud infrastructure. Despite the growing political trend of trans-nationalism, in which countries give up some of their sovereign rights to participate in more regionally aware economics and like-minded political agendas, that same effect does not seem to be taking place in the taxation and regulation of cloud and information infrastructure, specifically as it relates to electronic or intellectual property entities that derive revenue from infrastructure housed in particular countries, or that derive revenue from the online activity of those nations’ citizens.

There are numerous countries today that have seriously engaged, or are seriously engaging, in establishing and managing their national boundaries digitally online. What do I mean by that? There is a host of legislation across the globe that is beginning to govern the protection and online management of citizens through mandates in accordance with each country’s own laws. This is having (and will continue to have) a dramatic impact on how infrastructure and electronic products and services will be deployed, where that data is stored, and how revenue from that activity can and will be taxed by the local country. This level of state-exercised control can be economically, politically, or socially motivated, and cloud services providers need to pay attention to it. A great example of this is Canada, which has passed legislation in response to the U.S. Patriot Act. This legislation forbids personally identifiable information (PII) of Canadian citizens from being housed outside the boundaries of Canada or, perhaps more correctly, forbids its storage in the United States. There are numerous laws and pieces of legislation making their way across Europe and Asia as well.

That puts an interesting kink in the idea of a worldwide federated cloud user base where information will be stored “in the cloud”. From an infrastructure perspective, it will mandate that there are facilities in each country to house that data. While the data storage and retention challenge is an interesting software problem to solve, the physical fact that the data will need to remain in a local geography will require data centers and components of cloud infrastructure to be present. I expect this to continue as governments become more technically savvy and understand the impact of the rate of change being caused by this technology evolution.

Given the fact that data centers are extremely capital intensive, only a few players will be able to deploy private global infrastructures. This means that the “information sub-station” providers will have an even more significant role in driving the future standards of this new Information Utility. One might think that this could be a service ultimately provided by the large cloud providers. That could be a valid assumption; however, there is an interesting wrinkle developing around taxation, or more correctly exposure to double or multiple-country taxation, that those large providers will face. In my opinion, a federation of “information substation” providers will provide the best balance of offsetting taxation issues while still providing a very granular and regionally acceptable way to serve customers. That is where companies like Digital Realty Trust are going to come in and drive significant value and business protection.
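To make the residency constraint concrete, here is a minimal sketch of the kind of placement check such legislation forces onto infrastructure. It is purely illustrative: the rule table, region names, and the choose_storage_region helper are all hypothetical, not any real provider’s API or any country’s actual statute.

```python
# A minimal, purely illustrative sketch of the data-residency constraint
# described above. The rule table, region names, and helper function are
# hypothetical -- not any real provider's API.

RESIDENCY_RULES = {
    # citizenship -> regions where that citizen's PII may be stored
    "CA": {"ca-east", "ca-west"},        # the Canadian example above
    "DE": {"eu-central", "eu-west"},     # an EU-style in-region rule
}

# Regions available when no residency rule applies.
DEFAULT_REGIONS = {"us-east", "us-west", "eu-west", "ca-east"}


def choose_storage_region(citizenship, preferred):
    """Return a region allowed to hold this citizen's PII, falling back
    to a compliant region if the preferred one would violate the rules."""
    allowed = RESIDENCY_RULES.get(citizenship, DEFAULT_REGIONS)
    if preferred in allowed:
        return preferred
    return sorted(allowed)[0]   # deterministic compliant fallback


if __name__ == "__main__":
    # A Canadian user's record cannot land in a U.S. region.
    print(choose_storage_region("CA", "us-east"))   # -> ca-east
    print(choose_storage_region("DE", "eu-west"))   # -> eu-west
```

The point of the sketch is simply that residency rules turn storage placement from a pure cost/latency optimization into a legal constraint, which is exactly why in-country facilities become mandatory.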

I watch a lot of these geo-political and economic developments pretty closely as they relate to data center and infrastructure legislation, and I will continue to do so. But even outside of these issues, the “cloud” or whatever term you like will continue to evolve, and the “channels” created by this paradigm will continue to drive innovation at the products and services level. It’s at this level where the data center story will continue to evolve as well. To start, we need to think about the business version of the IT “server-hugging” phenomenon. For the uninitiated, “server huggers” are those folks in an IT department who believe that the servers have to be geographically close in order to work on them. In some cases it’s the right mentality; in others, where the server is located truly doesn’t matter. It’s as much a psychological experiment as a technical one.

At a business level, there is a general reluctance to house the company jewels outside of corporate-controlled space. Sometimes this is regulated (as with banks and financial institutions); most often it’s because those resources (proprietary applications, data sets, information stores, etc.) are crucial to the success of the company, and in many instances ARE the company. They are not something you necessarily want to outsource control of to others. Therefore wholesale adoption of cloud resources is still a very, very long way off. That is not to say that this infrastructure won’t get adopted into solutions that companies ultimately use to grow their own businesses. This is going to drive tons of innovation as businesses evolve their applications, create new business models, and join together in mutually beneficial alliances that will change the shape, color, and feel of the cloud. In fact, the cloud or “Information Utility” becomes the ultimate channel distribution mechanism.

The first grouping I can see evolving is fraternal operating groups, or FOGs. This is essentially a conglomeration of like-minded or related industry players coming together to build shared electronic compute exchanges or product and service exchanges. These applications and services will be highly customized to a particular industry. They will never be sated by the solutions that the big players will be putting into play; they are too specialized. This infrastructure will likely not sit within individual company data centers but is likely to be located in common-ground facilities or leased facilities with some structure for joint ownership. Whether large or small, business-to-business, or business-to-consumer, I see this as an evolving sector. There will definitely be companies looking to do this on their behalf, but given the general capital requirements to get into this type of business, these FOG agreements may be just the answer for finding a great trade-off between capital investment and return on the compute/service.

The next grouping builds off of the “company jewels” mindset and how it could blend with cloud infrastructure. To continue the overly used metaphor of clouds, I will call them Managed Instances Stationed Territorially, or MISTs. There will likely be a host of companies that want to take advantage of the potential savings of cloud-managed infrastructure, but want the warm and fuzzy feeling of knowing it’s literally right in their backyard. Imagine servers and infrastructure deployed at each customer data center, but centrally managed by cloud service providers. Perhaps it’s owned by the cloud provider; perhaps the infrastructure has been purchased by the end-user company. One can imagine container-based server solutions being dropped into container-ready facilities or jury-rigged in the parking lot of a corporate-owned or leased facility. This gives companies the ability to structure their use of cloud technologies and map them into their own use-case scenarios, whatever makes the most sense for them. The recent McKinsey paper talked about how certain elements of the cloud are more expensive than managing the resources through traditional means. This is potentially a great hybrid scenario where companies can integrate as they need to using those services. One could even see Misty FOGs or Foggy MISTs. I know the analogy is getting old at this point, but hopefully you can see that the future isn’t as static as some would have you believe. This ability to channelize the technologies of the cloud will have a huge impact on business costs, operations, and technology. It also suggests that mission-critical infrastructure is not going to go away but will become even more important and potentially more varied. This is why I think that the biggest infrastructure impact will occur at the “information substation provider” level. Data centers aren’t going away; they might actually be growing in terms of demand, and one thing is certain: they are evolving today and will continue to evolve as this space matures. Does your current facility allow for this level of interconnectivity? Do you have the ability to host a mix of solution management providers in your facility? Lots of questions, lots of opportunities to develop answers.
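As a rough illustration of the MIST shape, here is a minimal sketch of a central management plane tracking pods that physically live in customers’ own facilities. Every name in it (MistPod, CentralManager, the heartbeat scheme) is hypothetical; it simply shows the “deployed locally, managed centrally” pattern described above, not any real product.

```python
# Hypothetical sketch of the MIST idea: pods that sit in a customer's
# facility but report to a central management plane run by the provider.

from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class MistPod:
    """A container/pod of servers deployed in a customer's own data center."""
    pod_id: str
    site: str        # the customer facility where the pod physically sits
    owner: str       # "provider" or "customer" -- either ownership model works
    last_heartbeat: datetime = field(default_factory=datetime.utcnow)


class CentralManager:
    """The provider's side: remotely tracks pods it does not physically host."""

    def __init__(self, stale_after=timedelta(minutes=5)):
        self.pods = {}               # pod_id -> MistPod
        self.stale_after = stale_after

    def register(self, pod):
        self.pods[pod.pod_id] = pod

    def heartbeat(self, pod_id):
        # Called when a pod phones home from the customer site.
        self.pods[pod_id].last_heartbeat = datetime.utcnow()

    def unhealthy(self):
        """Pods that have stopped reporting; candidates for a site visit."""
        cutoff = datetime.utcnow() - self.stale_after
        return [p.pod_id for p in self.pods.values()
                if p.last_heartbeat < cutoff]


if __name__ == "__main__":
    mgr = CentralManager()
    mgr.register(MistPod("pod-001", site="acme-parking-lot", owner="provider"))
    mgr.heartbeat("pod-001")
    print(mgr.unhealthy())   # [] while the pod keeps phoning home
```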

The last grouping is potentially an evolution of modern content delivery infrastructure or edge computing capabilities. I will quit with the cutesy cloud names and call this generically Near Cloud Content Objects. Given that products, services, and data will become the domain of the entities owning them, along with a general reluctance to wholesale store them in someone else’s infrastructure, one could see this proprietary content leveraging the global cloud infrastructures through regional gateways, where owners will be able to maintain ownership and control of their assets. This becomes even more important when you factor in the economic and geo-political aspects emerging in cloud compute.

In the end, the cloud approach is going to significantly drive data center demand and cause it to evolve even further. It will not, as some would like to project, end the need for corporate data centers. Then there is that not-so-little issue of the IT applications and internal company services we use every day. This leads me into my next point . . .

The Continued and Increasing Importance of Enterprise Data Centers

This post has concentrated a lot on the future of cloud computing, so I will probably tick off a bunch of cloud-fan-folk with this next bit, but the need for corporate data centers is not going away. They may change in size, shape, efficiency, and the like, but there is a need to continue to maintain a home for those company jewels and to serve internal business communities. The value of any company is the information and intellectual property developed, maintained, and driven by its employees. Concepts like FOGs and MISTs still require an ultimate home where that work terminates or to which results are sent. Additionally, look at the suite of software each company may have in its facilities today supporting its business. We are at least a decade or more away from being able to migrate those to a distributed cloud-based infrastructure. Think about the migration costs of any particular application you have, then compound that with the complexity of having your data stored in those cloud environments as well. Are you then locked into a single cloud provider forever? It obviously requires cloud interoperability, which doesn’t exist today with the exception of half-hearted, non-binding efforts that don’t actually include any of the existing cloud providers. If you believe as I do that the “cloud” will actually be many little and large channelized solution cloudlets, you have to believe that the corporate data center is here to stay. The mix of applications and products in your facilities may differ in the future, but you will still have them. That’s not to say the facilities themselves will not have to evolve. They will. With changing requirements around energy efficiency and green reporting, along with the geo-political and other regulations coming through the pipeline, the enterprise data center will still be an area full of innovation as well.

\Mm

Starting something new….


This post was an interesting struggle for me. What should my first post since my departure from Microsoft be about? I have a great number of topics that I definitely want to talk about regarding the distance and gap from the executive suite to Information Technology to the data center floor, and why there continue to be challenges in this space across the industry. In fact, I probably have a whole series of them. I am thinking of calling them “Chiller-Side Chats”, aimed at priming both sides for conversations with the other. There are some industry-wide, metric-related topics that I want to take on, interesting trends I see developing, and literally a host of other things ranging from technology to virtualization. While at Microsoft I maintained Loosebolts and an internal Microsoft blog, which as it turns out was quite a bit of work. I now have time to focus my energies in one place here at Loosebolts, and unfortunately I may subject everyone reading this to even more of my wild ramblings. But to talk about any of these technical issues, business issues, or industry issues would be ignoring the gigantic, purple-spotted white elephant in the room. In fact, by the time I finished the original version of this post it was six pages long and ran far afield on what I think is fundamentally changing in the data center space. Instead of subjecting you to one giant blog post, I was counseled by close friends to cut it down a bit into different sections. So I will chop it up into two separate posts. The first question, of course, is: why did I leave Microsoft for Digital Realty Trust?

I accomplished a great deal at Microsoft and I am extremely proud of my work there. I have an immense amount of pride in the team that I developed there and the knowledge that it continues to drive that vision within the company. Rest assured Microsoft has a great vision for where things are going in that space, and the program is on rails, as they say. My final goodbye post talks more about my feelings there. Within it, however, are some of the seeds (to continue that farming analogy even further!) of my departure. First we need to pull our heads out of the tactical world of data centers and look at the larger emerging landscape in which data centers sit. Microsoft, along with Google, Amazon, and a few others, is taking aim at cloud computing and is designing, building, and operating a different kind of infrastructure with different kinds of requirements, specifically building ubiquitous services around the globe. In my previous role, I was tasked with thinking about and building this unique infrastructure in concert with hundreds of development groups taking aim at building a core set of services for the cloud. A wonderful blend of application and infrastructure. It’s a great thing. But as my personal thought processes on this topic matured and deepened, flavored with what I was seeing as emerging trends in business, technology, and data center requirements, I had a personal epiphany. The concept of large monolithic clouds ruling the Information-sphere was not really complete. Don’t get me wrong, they will play a large and significant role in how we compute tomorrow, but instead of an oligarchy of the few, I realized that enterprise data centers are here to stay, and additionally an explosion of different cloud types is on the horizon.

In my opinion it is here, in this new emerging space, that the Information Utility will ultimately be born and defined, and where true innovation in our industry (data center-wise) will take place. This may seem rather unintuitive given the significant investments being made by the big cloud players, but it is really not. We have to remember that today, any technology must sate basic key requirements. First and foremost among these is that it must solve a particular business problem. Technology for technology’s sake will never result in significant adoption, and the big players are working to perfect platforms that will work across a predominance of applications being specifically developed for their infrastructure. In effect, they are solving for their own issues, issues that most of those looking to leverage cloud or shared compute will not necessarily match in either scale or standardization of server and IT environments. There will definitely be great advances in technology, process, and a host of other areas as a result of this work, but their leveragability is ultimately minimized because those environments, while they look like each other’s, will not easily map into the enterprise, near-enterprise, or near-cloud space. The NASA space program has produced thousands of great solutions, and some of them have been commercialized for the greater good. I see similar things happening in the data center space. Not everyone can get sub-1.3 average PUE numbers, but they can definitely use those learnings to better their own efficiency in some way. While these large platforms in conjunction with enterprise data centers will provide key and required services, the innovation and primary requirement drivers in the future will come from the channel.
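For readers less familiar with the metric just mentioned: PUE (Power Usage Effectiveness, the Green Grid metric) is total facility power divided by the power delivered to IT equipment, so 1.0 is the theoretical ideal. A quick sketch, with numbers invented purely for illustration:

```python
# PUE = total facility power / IT equipment power. The closer to 1.0,
# the less overhead (cooling, power conversion, lighting) the facility
# burns per useful IT watt. Sample numbers below are made up, not
# measurements from any real facility.

def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness for a facility."""
    return total_facility_kw / it_equipment_kw


# A large cloud facility hitting the "sub-1.3" mark mentioned above:
print(pue(total_facility_kw=13000, it_equipment_kw=10000))   # 1.3

# A typical legacy enterprise room of the era might run closer to 2.0:
print(pue(total_facility_kw=2000, it_equipment_kw=1000))     # 2.0
```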

So Why Digital Realty Trust?

Innovation can happen everywhere in any situation, but it is most often born under the pressure of constraints. While there are definitely some constraints that the big players have in evolving their programs, the real focus and attention in the industry will be at the enterprise and Information Sub-Station provider layer. This is the part of the industry that is going to feel the biggest pinch as the requirements evolve. Whether they be political, economic, social, or otherwise, these requirements will define what most of the data center industry looks like. It is at this layer that a majority of companies around the world will be. It is this layer that will be the most exciting for me personally. The Moon missions were great, but they were not about bringing space travel to the masses. There were definitely some great learnings there that can be leveraged, but commercializing and solving the problem for the masses is different, perhaps bigger, and in my opinion more challenging. At the end of the day it has to be economical and worthwhile. We have to solve that basic business need and use case, or it will remain an interesting scientific curiosity, much like electricity was viewed before the light bulb.

In Digital Realty Trust I found the great qualities I was looking for in any company. First, they are positioned to provide either “information substation” or “enterprise” solutions and will need to solve for both. They are effectively right in the middle of solving these issues, and they are big enough to have a dramatic impact on the industry. Secondly, and perhaps more importantly, they have a passionate, forward-looking management team with whom I have interacted in the industry for quite some time. Let me reiterate that passionate point for a moment: this is not some real estate company looking to make a quick buck on mission-critical space. I have seen enough of those in my career. This is a firm focused on educating the market, driving innovation in the application of technology, and a near-zealot commitment to driving efficiencies for their customers. Whether it’s their frequent webinars, their industry speaking engagements, or personal conversations, they are dedicated to this space and dedicated to informing their customers. Even when we have disagreed on topics or issues in the past, it’s always been a great, respectful conversation. In a nutshell, they GET IT. Another key piece that probably needs some addressing is that bit about application of technology. We are living in some interesting times, with data center technologies in a wonderful and terrible period of evolution. The challenge for any enterprise is making heads or tails of which technologies will be great for them: what works, what doesn’t, what’s vaporware versus what is truly going to drive value. The understanding and application of that technology is an area that Digital knows very well, and the scale of their deployments allows them to learn the hard lessons before their clients have to. Moreover, they are implementing these technologies and building solutions that will fit for everyone, today!

Another area where there is significant alignment between my own personal beliefs and those of Digital Realty Trust is around speed of execution and bringing capacity online just in time. It’s no secret that I have been an active advocate of moving from big build and construction to a just-in-time production model. These beliefs have long been espoused by Chris Crosby, Jim Smith, and the rest of the Digital team, and they are very clearly articulated in the POD ARCHITECTURE approach that they have been developing for quite a few years. Digital has done a great job of bringing this approach to the market for enterprise users and wants to drive it even faster! One of my primary missions will be to develop the ability to deliver data center capacity start to finish in 16 weeks. You cannot get there without a move to standardizing the supply chain and driving your program toward production rather than pure construction. Data center planning and capacity planning is the single largest challenge in this industry. The typical business realizes too late that it needs to add data center capacity, and these efforts typically result in significant impacts to its own business needs through project delays or cost. As we all know, data center capacity is not ubiquitous, and getting capacity just in time is either very expensive or impossible in most markets. You can solve this problem by trying to force companies to do a better job of IT and capacity planning (i.e., boiling the ocean) or you can change how that capacity is developed, procured, and delivered. This is one of my major goals and something I am looking forward to delivering.

In the end, my belief is that it will be companies like Digital Realty Trust at the spearhead of driving the design, physical technology application, and requirements for the global Information Utility infrastructure. They will clearly be situated closest to those changing requirements for the largest number of affected groups. It is going to be a huge challenge, and a challenge that I, for one, am extremely excited about and can’t wait to dig into.

\Mm