Google has spent more than $1 billion to buy and renovate a former paper mill in Finland to store its user data. Nestled in the caves of a Norwegian mountain, a regional IT company runs a facility built by a local investment group and cooled by a fjord. Microsoft says it will spend $250 million to construct a data center in Finland to manage its cloud services, as part of its agreement to acquire Nokia’s device business. And in Luleå, a city of 50,000 on the banks of the Lule River less than 70 miles south of the Arctic Circle, Facebook fired up a 300,000-square-foot data center last year.
Google, an early convert to Puppet, uses its products to rapidly update the software running its servers and employee PCs. So does music-streaming service Spotify. Whether on servers or PCs, “it would have been impossible to grow and manage the Spotify infrastructure without a configuration management tool like Puppet,” says company engineer Johan Haals. GE Capital uses Chef to manage its servers and network switches, and chief engineer Justin Arbuckle says he can use it to distribute a new app across a network in a couple of hours. “In the past, it would have literally taken weeks and weeks,” he says.
"We have brought the storage into the cluster, and we have
commoditized it," says Ting, and this is why the US government,
financial services firms, healthcare companies, and educational
institutions are running a lot of proofs of concepts with the NX-3000
series of appliances.
About 25 per cent of the iron is going to Uncle Sam, which is in many
cases putting server clusters into vehicles to get image and signal
processing at command posts or into the field in Humvees, in some cases.
Virtual desktop infrastructure was the obvious early-use case for
Nutanix machines, and it is still driving a lot of deals, but Ting says
the company is seeing companies dump Microsoft workloads such as
Exchange Server and SQL Server on the boxes, and has just closed a deal
this month with a Global 2000-class company for 1,500 server nodes to
run an analytics workload."
Nice summary of the innovations Microsoft has implemented in data center design, energy and water sourcing, and efficiency over the last decade. I cataloged Microsoft’s data center efficiencies in The New Polymath, and Facebook’s and Google’s in The New Technology Elite, and it may be time to revisit Microsoft for my next book, especially for the energy innovations described on the blog:
Biomass generation project in Europe that would operate on waste fuel
Large photovoltaic solar project in the Southwestern U.S.
Fuel cell installation that would improve reliability and eliminate the need for back-up diesel generators in situations where the power grid goes down
Combined heat and power (CHP) projects that capture waste heat for reuse
Too late for the current Colorado mess (false-color satellite image below), but the US Forest Service is finally upgrading its firefighting fleet, which averages 50+ years of service. The new planes, including the Neptune BAe-146 in the photo, will be “turbine powered, can carry a minimum of 2,400 gallons of retardant, and have a cruise speed of at least 300 knots when fully loaded.”
In addition, the Forest Service plans to acquire 6,000 SPOT devices, which are “rugged, pocket-sized satellite-based personal trackers that work in places where cell phones and two-way radios fail.”
Photo credit: DigitalGlobe. Overview of the High Park Fire near Fort Collins, Colorado. In this image, red areas are healthy vegetation, blue areas are burnt, and white is fire smoke.
“In the next five years a new way of thinking about, constructing, and operating IT will emerge. Data centers are no longer the size of mini-marts but instead are mega-marts like Rob Roy’s 2.2 million square foot Switch data center in Las Vegas (photo below). Servers are no longer the unit of computing, but instead are being taken completely apart or are a mere component in the new data-center-sized computer, a trend being pushed by Frank Frankovsky at Facebook and at the Open Compute Foundation.
The walls between data centers will also matter less and less as software-defined networks help create secure, flexible bandwidth between data centers and eventually continents, which folks like Martin Casado of Nicira are working on. This new era sees infrastructure as a service and the hardware becomes a fungible element, supporting a river of data and applications that flow on top of it. The U.S. government is certainly taking advantage of this shift with its Digital Government Strategy, led by U.S. CIO Steve VanRoekel.”
The NSA Chief has contradicted the story below in a Congressional hearing, but this Wired story on the agency’s technological reach makes for interesting reading.
“Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.””