The security at NY4 bears this out. To get from the parking lot to a spot where you could touch one of the servers you’d have to go through five checkpoints. One of them is a so-called man trap with two automatic steel doors that never open at the same time. Your palm print is required twice in addition to your PIN code. A wall of video monitors captures every nook and cranny of the 338,000-square-foot building.
Once you’re in, the space is enveloped by a rush of white noise from the thousands of computer fans whirring away to keep the servers cool. To help maintain the temperature, the ceiling is 45 feet high, roughly four stories up. It’s barely visible—not just because of its height, but also thanks to all of the suspended trays of cables and cooling ducts running overhead. All this goes toward one statistic: Equinix says in its annual filing that it kept its facilities up and running 99.9999 percent of the time in 2015.
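Six nines of uptime is a stricter target than it might sound. A quick back-of-the-envelope calculation (a sketch, not anything from Equinix's filing) shows how little downtime that figure permits:

```python
# How much downtime does 99.9999 percent uptime allow in a year?
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # ignoring leap years

uptime = 0.999999
downtime_seconds = (1 - uptime) * SECONDS_PER_YEAR
print(f"Allowed downtime: {downtime_seconds:.1f} seconds per year")
# roughly half a minute of total outage across an entire year
```

In other words, the claim amounts to about 31.5 seconds of cumulative downtime over all of 2015.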
Microsoft says underwater data centers would be ideal in many ways. First, they can help cut the cooling costs and emissions of a conventional data center by taking advantage of the lower temperature of the surrounding water, though the capsule doesn’t actually consume any water for cooling.
An underwater capsule can also be built and deployed within 90 days. That’s a great turnaround time if your cloud service needs extra help during a major event like the Super Bowl, when tons of users want to access their data in a specific location. A Project Natick vessel could also be dropped off the coast of a disaster zone to enable faster access to data when it counts.
Finally, about half the world’s population lives within 124 miles (200 kilometers) of the ocean. Placing data centers offshore brings them closer to more users, which in turn would dramatically reduce latency (the time it takes data to travel from a server to your PC or smartphone).
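The distance-to-latency relationship is easy to estimate. Assuming light in optical fiber travels at roughly two-thirds its vacuum speed (about 200,000 km/s, a standard approximation not taken from the article), the best-case round-trip time scales directly with distance:

```python
# Rough best-case propagation delay over fiber, ignoring routing,
# switching, and queuing delays, which add considerably in practice.
FIBER_SPEED_KM_PER_S = 200_000  # ~2/3 the vacuum speed of light

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

print(round_trip_ms(200))   # 2.0 ms for an offshore capsule 200 km out
print(round_trip_ms(4000))  # 40.0 ms for a data center a continent away
```

Even this idealized figure shows why shaving thousands of kilometers off the path to a server matters.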
Hundreds of millions of particle collisions take place every second at the heart of the LHC’s detectors. The sensors generate about one petabyte of data every second, an amount no computing system in the world could store if it were produced for any prolonged period.
Most of the data is discarded quickly, as sophisticated systems select what could be of interest to scientists and filter out the clutter. Then tens of thousands of processor cores go even further, keeping just one percent of the remaining events, information that is then stored and later analyzed by physicists.
The data center can save 6 GB of data per second at the LHC’s peak rate. However, this gigantic machine doesn’t run 24/7. "We're expecting about 30 petabytes per year of LHC run two - that would represent something like 250 years of high-definition video," Frédéric Hemmer, IT department head, told ZDNet.
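Hemmer's comparison checks out arithmetically. Assuming an HD video stream of about 30 megabits per second (my assumption, not a figure from the article), 30 petabytes does translate to roughly 250 years of footage:

```python
# Sanity-check: 30 petabytes expressed as years of HD video.
PETABYTE = 10**15        # bytes (decimal petabyte)
HD_BITRATE_BPS = 30e6    # 30 Mbit/s HD stream, an assumed bitrate

seconds_of_video = 30 * PETABYTE * 8 / HD_BITRATE_BPS
years_of_video = seconds_of_video / (365 * 24 * 3600)
print(f"{years_of_video:.0f} years of HD video")  # ≈ 254 years
```

The result lands within a couple of percent of the quoted 250 years, so the quote's mental model is a stream in the low tens of megabits per second.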
Apple today announced a €1.7 billion plan to build and operate two data centres in Europe, each powered by 100 percent renewable energy. The facilities, located in County Galway, Ireland, and Denmark’s central Jutland, will power Apple’s online services, including the iTunes Store, App Store, iMessage, Maps and Siri, for customers across Europe.
The two data centres, each measuring 166,000 square metres, are expected to begin operations in 2017 and include designs with additional benefits for their communities. For the project in Athenry, Ireland, Apple will recover land previously used for growing and harvesting non-native trees and restore native trees to Derrydonnell Forest. The project will also provide an outdoor education space for local schools, as well as a walking trail for the community. In Viborg, Denmark, Apple will eliminate the need for additional generators by locating the data centre adjacent to one of Denmark’s largest electrical substations. The facility is also designed to capture excess heat from equipment inside the facility and conduct it into the district heating system to help warm homes in the neighboring community.
The Altoona, Iowa, facility is the first in Facebook’s fleet to feature a building-wide network fabric, an entirely new approach to intra-data-center networking devised by the company’s infrastructure engineers.
The social network is moving away from the approach of arranging servers into multiple massive compute clusters within a building and interconnecting them with each other. Altoona has a single network fabric whose scalability is limited only by the building’s physical size and power capacity.
As America’s retailers struggle to keep up with online shopping, the Internet is starting to settle into some of the very spaces where brick-and-mortar customers used to shop. The shift brings welcome tenants to some abandoned stretches of the suburban landscape, though it doesn’t replace all the jobs and sales-tax revenue that local communities lost when stores left the building.
Venyu Solutions LLC, a data-center operator that is renovating the former department store in Jackson, sees more opportunity for conversions because of the sheer amount of distressed retail property. “Who else wants them?” said Brian Vandegrift, the company’s executive vice president of sales. “You’re not competing with people in substantial businesses who want those spaces.”
Google has spent more than $1 billion to buy and renovate a former paper mill in Finland that can store its user data. Nestled in the caves of a Norwegian mountain, a regional IT company uses a facility built by a local investment group and cooled by a fjord. Microsoft says it will spend $250 million to construct a data center in Finland to manage its cloud services, as part of its agreement to acquire Nokia’s device business. In Luleå, a city of 50,000 on the banks of the Lule River less than 70 miles south of the Arctic Circle, Facebook fired up a 300,000-square-foot data center last year.