
Google Reincarnates Dead Paper Mill as Data Center of Future

Google's Finland data center is the ultimate metaphor for the Internet Age (Photos: Google)

Joe Kava found himself on the southern coast of Finland, sending robotic cameras down an underground tunnel that stretched into the Baltic Sea. It’s not quite what he expected when he joined Google to run its data centers.

In February of 2009, Google paid about $52 million for an abandoned paper mill in Hamina, Finland, after deciding that the 56-year-old building was the ideal place to build one of the massive computing facilities that serve up its myriad online services. Part of the appeal was that the Hamina mill included an underground tunnel once used to pull water from the Gulf of Finland. Originally, that frigid Baltic water cooled a steam generation plant at the mill, but Google saw it as a way to cool its servers.

Joe Kava, Google's head of data center operations and construction

Those robotic cameras — remote-operated underwater vehicles that typically travel down oil pipelines — were used to inspect the long-dormant tunnel, which ran through the solid granite bedrock sitting just beneath the mill. As it turns out, all 450 meters of the tunnel were in excellent condition, and by May 2010, it was moving sea water to heat exchangers inside Google’s new data center, helping to cool down thousands of machines juggling web traffic. Thanks in part to that granite tunnel, Google can run its Hamina facility without the energy-sapping electric chillers found in the average data center.

“When someone tells you we’ve selected the next data center site and it’s a paper mill built back in 1953, your first reaction might be: ‘What the hell are you talking about?’” says Kava. “‘How am I going to make that a data center?’ But we were actually excited to learn that the mill used sea water for cooling…. We wanted to make this as green a facility as possible, and reusing existing infrastructure is a big part of that.”

Kava cites this as a prime example of how Google “thinks outside the box” when building its data centers, working to create facilities that are both efficient and kind to the world around them. But more than that, Google’s Hamina data center is the ideal metaphor for the internet age. Finnish pulp and paper manufacturer Stora Enso shut down its Summa Mill early in 2008, citing a drop in newsprint and magazine-paper production that led to “persistent losses in recent years and poor long-term profitability prospects.” Newspapers and magazines are slowly giving way to web services along the lines of, well, Google, and some of the largest services are underpinned by a new breed of computer data center — facilities that can handle massive loads while using comparatively little power and putting less of a strain on the environment.

Google was at the forefront of this movement, building new-age facilities not only in Finland, but in Belgium, Ireland, and across the U.S. The other giants of the internet soon followed, including Amazon, Microsoft and Facebook. Last year, Facebook opened a data center in Prineville, Oregon that operates without chillers, cooling its servers with the outside air, and it has just announced that it will build a second facility in Sweden, not far from Google’s $52-million Internet Metaphor.

The heat exchanger room in Google's Hamina data center

The Secrets of the Google Data Center

Google hired Joe Kava in 2008 to run its Data Center Operations team. But this soon morphed into the Operations and Construction team. Originally, Google leased data center space inside existing facilities run by data center specialists, but now, it builds all its own facilities, and of late, it has done so using only its own engineers. “We used to hire architecture and engineering firms to do the work for us,” Kava says. “As we’ve grown over the years and developed our own in-house talent, we’ve taken more and more of that work on ourselves.”

Over those same years, Google has said precious little about the design of the facilities and the hardware inside them. But in April 2009, the search giant released a video showing the inside of its first custom-built data center — presumably, a facility in The Dalles, Oregon — and it has since lifted at least part of the curtain on newer facilities in Hamina and in Saint-Ghislain, Belgium.

According to Kava, both of these European data centers operate without chillers. Whereas the Hamina facility pumps cold water from the Baltic, the Belgium data center uses an evaporative cooling system that pulls water from a nearby industrial canal. “We designed and built a water treatment plant on-site,” Kava says. “That way, we’re not using potable water from the city water supply.”

For most of the year, the Belgian climate is mild enough to keep temperatures where they need to be inside the server room. As Kava points out, server room temperatures needn’t be as low as they traditionally are. As recently as August 2008, the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) recommended that data center temperatures range from 68 to 77 degrees Fahrenheit — but Google was advising operators to crank the thermostat to above 80 degrees.

“The first step to building an efficient data center…is to just raise the temperature,” Kava says. “The machines, the servers, the storage arrays, everything — they run just fine at much, much more elevated temperatures than the average data center runs at. It’s ludicrous to me to walk into a data center that’s running at 65 or 68 degrees Fahrenheit or less.”

There are times when it gets so hot inside the data centers that Google will order employees out of the building — but keep the servers running. “We have what we call ‘excursion hours’ or ‘excursion days.’ Normally, we don’t have to do anything [but] tell our employees not to work in the data center during those really hot hours and just catch up on office work.”
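
To make the numbers concrete, here is a minimal sketch of how an operator might classify cold-aisle readings against the figures quoted above. It is purely illustrative, not Google's actual tooling: the 68-to-77-degree range and the 80-degree setpoint come from the article, while the 95-degree "excursion" threshold, the function name, and the sample readings are assumptions.

```python
# Illustrative sketch only, not Google's actual tooling.
ASHRAE_2008_LOW_F = 68.0    # lower end of the 2008 recommended range
ASHRAE_2008_HIGH_F = 77.0   # upper end of the 2008 recommended range
ELEVATED_SETPOINT_F = 80.0  # the warmer setpoint Google was advising
EXCURSION_LIMIT_F = 95.0    # hypothetical point at which staff leave the floor


def classify_hour(inlet_temp_f: float) -> str:
    """Label one hour of cold-aisle inlet temperature readings."""
    if inlet_temp_f > EXCURSION_LIMIT_F:
        return "excursion hour: staff out of the building, servers keep running"
    if inlet_temp_f >= ELEVATED_SETPOINT_F:
        return "in the 80-plus band Google was advocating"
    if inlet_temp_f > ASHRAE_2008_HIGH_F:
        return "above the 2008 ASHRAE range, still fine for the hardware"
    if inlet_temp_f >= ASHRAE_2008_LOW_F:
        return "within the 2008 ASHRAE recommended range"
    return "cooler than it needs to be (wasted cooling energy)"


for reading_f in (66.0, 72.0, 78.0, 83.0, 97.0):
    print(f"{reading_f:5.1f} F -> {classify_hour(reading_f)}")
```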

At sites like Belgium, however, there are days when it’s too hot even for the servers, and Google will actually move the facility’s work to one of its other data centers. Kava did not provide details, but he did acknowledge that this data center shift involves a software platform called Spanner. This Google-designed platform was discussed at a symposium in October 2009, but this is the first time Google has publicly confirmed that Spanner is actually in use.

“If it really, really got [hot] and we needed to reduce the load in the data center,” Kava says, “then, yes, we have automatic tools and systems that allow for that, such as Spanner.”

According to the presentation Google gave at that 2009 symposium, Spanner is a “storage and computation system that spans all our data centers [and that] automatically moves and adds replicas of data and computation based on constraints and usage patterns.” This includes constraints related to bandwidth, packet loss, power, resources, and “failure modes” — i.e. when stuff goes wrong inside the data center.
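
That one-sentence description is all the detail Google has given, so the sketch below is only a toy illustration of the general idea of constraint-driven placement of work across sites, not anything Google has published about how Spanner works. Every site name, field, and threshold here is hypothetical.

```python
# Toy sketch of shifting load between data centers based on constraints such as
# power headroom, temperature, and network health. NOT Google's Spanner; all
# names and numbers are made up for illustration.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Site:
    name: str
    free_power_kw: float    # headroom left in the facility's power budget
    inlet_temp_f: float     # current cold-aisle temperature
    packet_loss_pct: float  # network health toward this site


def pick_site(sites: list[Site], load_kw: float,
              max_temp_f: float = 95.0, max_loss_pct: float = 1.0) -> Site | None:
    """Return the coolest site that can absorb load_kw within the constraints."""
    candidates = [s for s in sites
                  if s.free_power_kw >= load_kw
                  and s.inlet_temp_f <= max_temp_f
                  and s.packet_loss_pct <= max_loss_pct]
    return min(candidates, key=lambda s: s.inlet_temp_f, default=None)


sites = [
    Site("hamina", free_power_kw=400.0, inlet_temp_f=75.0, packet_loss_pct=0.2),
    Site("st-ghislain", free_power_kw=50.0, inlet_temp_f=97.0, packet_loss_pct=0.1),
    Site("the-dalles", free_power_kw=900.0, inlet_temp_f=80.0, packet_loss_pct=0.4),
]
target = pick_site(sites, load_kw=300.0)
print("shift load to:", target.name if target else "no site available")
```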

The platform illustrates Google’s overall approach to data center design. The company builds its own stuff and will only say so much about that stuff. It views technology such as Spanner as a competitive advantage. But one thing is clear: Google is rethinking the data center.

The approach has certainly had an effect on the rest of the industry. Like Google, Microsoft has experimented with data center modules — shipping containers prepacked with servers and other equipment — that can be pieced together into much larger facilities. And with Facebook releasing the designs of its Prineville facility — a response to Google’s efforts to keep its specific designs a secret — others are following the same lead. Late last year, according to Prineville city engineer Eric Klann, two unnamed companies, codenamed “Maverick” and “Cloud,” were looking to build server farms based on Facebook’s chillerless design, and it looks like Maverick is none other than Apple.

Large Data Centers, Small Details

This month, in an effort to show the world how kindly its data centers treat the outside world, Google announced that all of its custom-built US facilities have received ISO 14001 and OHSAS 18001 certification — internationally recognized certifications that rate the environmental kindness and safety not only of data centers but of all sorts of operations.

This involved tracking everything from engineering tools to ladders inside the data center. “You actually learn a lot when you go through these audits, about things you never even considered,” Kava says. His point is that Google pays attention to even the smallest details of data center design — in all its data centers. It will soon seek similar certification for its European facilities as well.

In Finland, there’s a punchline to Google’s Baltic Sea water trick. As Kava explains, the sea water is just part of the setup. On the data center floor, the servers give off hot air. This air is transferred to water-based cooling systems sitting next to the servers. And Google then cools the water from these systems by mixing it with the sea water streaming from the Baltic. When the process is finished, the cold Baltic water is no longer cold. But before returning it to the sea, Google cools it back down — with more cold sea water pulled from the Baltic. “When we discharge back to the Gulf, it’s at a temperature that’s similar to the inlet temperature,” Kava says. “That minimizes any chance of environmental disturbance.”
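
A rough back-of-envelope sketch of that tempering step is below. Kava gives no temperatures or flow rates, so every number in it is an assumption; the point is only that blending the warm return water with extra cold seawater pulls the discharge temperature back toward the inlet temperature, even though the total amount of heat returned to the Gulf is unchanged.

```python
# Back-of-envelope sketch of the tempering step; all numbers are assumptions.
SEAWATER_INLET_C = 8.0       # assumed Baltic inlet temperature
RETURN_WATER_C = 20.0        # assumed temperature after the heat exchangers
RETURN_FLOW_KG_S = 100.0     # assumed flow of warm return water
TEMPERING_FLOW_KG_S = 300.0  # assumed extra seawater mixed in before discharge
CP_WATER_KJ_PER_KG_K = 4.18  # specific heat of water

# The mixed discharge temperature is a mass-weighted average of the two streams.
total_flow = RETURN_FLOW_KG_S + TEMPERING_FLOW_KG_S
discharge_c = (RETURN_FLOW_KG_S * RETURN_WATER_C
               + TEMPERING_FLOW_KG_S * SEAWATER_INLET_C) / total_flow

# The heat carried back to the Gulf (in kW) is the same with or without
# tempering; mixing only spreads it through more water.
heat_kw = RETURN_FLOW_KG_S * CP_WATER_KJ_PER_KG_K * (RETURN_WATER_C - SEAWATER_INLET_C)

print(f"discharge temperature: {discharge_c:.1f} C "
      f"(untempered return: {RETURN_WATER_C:.1f} C, inlet: {SEAWATER_INLET_C:.1f} C)")
print(f"heat rejected to the Gulf: {heat_kw:.0f} kW either way")
```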

According to Kava, the company’s environmental permits didn’t require that it temper the water. “It makes me feel good,” he says. “We don’t do just what we have to do. We look at what’s the right thing to do.” It’s a common Google message. But Kava argues that ISO certification is proof that the company is achieving its goals. “If you’re close to something, you may believe you’re meeting a standard. But sometimes it’s good to have a third party come in.”

The complaint, from the likes of Facebook, is that Google doesn’t share enough about how it has solved particular problems that will plague any large web outfit. Reports, for instance, indicate that Google builds not only its own servers but its own networking equipment, but the company has not even acknowledged as much. That said, over the past few years, Google has certainly been sharing more.

We asked Joe Kava about the networking hardware, and he declined to answer. But he did acknowledge the use of Spanner. And he talked and talked about that granite tunnel and the Baltic Sea. He even told us that when Google bought that paper mill, he and his team were well aware that the purchase made for a big fat internet metaphor. “This didn’t escape us,” he says.

Cade Metz is the editor of Wired Enterprise. Got a NEWS TIP related to this story -- or to anything else in the world of big tech? Please e-mail him: cade_metz at wired.com.

Comments

  • brenro
    If nothing else I bet it smells better than a paper mill now.
  • Olov Candell
    It would be even more green to locate the datacenter where the hot water could be used, e.g. for district heating in an urban area, instead of wasting it into the sea...
  • pacocornholio
    Good point.  

    I remember flying into L'viv in the Ukraine in February (yeah, bad move) and seeing a lot of large buildings with their top floor windows wide open on a freezing day.  Someone finally explained that the Soviets had installed central heating for the city's major buildings - turned on in October, off in April - and that nobody could control the amount of heat they received so opening the windows was the only way they could regulate building temperatures.
  • Nathan Galen
    build an apartment that uses steam heating right next door lol
  • erth
    there is this place called the great lakes in north america. how about repopulating this area?
  • maddcribbage
    Google is the most global company on Earth, they need offices and data centers everywhere.
  • pacocornholio
    Now I know why all my Google searches have been returning information about herring.
  • Eez Feedz
    Why aren't we running data servers in the far north?  No one bothers us here in Canada and we have plenty of cold to go around.
  • Petzl
    On the data center floor, the servers give off hot air. This air is transferred to water-based cooling systems sitting next to the servers. And Google then cools the water from these systems by mixing it with the sea water streaming from the Baltic. When the process is finished, the cold Baltic water is no longer cold. But before returning it to the sea, Google cools it back down — with more cold sea water pulled from the Baltic. “When we discharge back to the Gulf, it’s at a temperature that’s similar to the inlet temperature.


    Anyone find this rather strange?  To recapitulate: 1. Hot server air transfers heat to cool Baltic seawater. 2. Heated Baltic seawater transfers heat to cool more Baltic seawater. 3. Heated Baltic seawater discharged back into Baltic. But it's suddenly cool now?  Why do you need the Baltic-to-Baltic step in the first place? And just how is this 2nd round of Baltic seawater not heated?  Did they install cooling towers for it to be discharged to the air? You'd think google, of all corporations that are people, would know not to bullsh-t the technical community, and they'd have a more sensible-sounding explanation.
  • dragoneer59
    Oh dear. 

    DQ = MC (DT).  M = mass, C = heat capacity, DT = temperature change, DQ = change in thermal heat energy.

    So, as an *example*, since they tell us nothing about the mass mixing ratio or the temperatures:

    Take one part water at, say, T1 = 60 Celsius (from cooling down servers), mix with, say, 3 parts raw sea water in the "tempering building" at, say, T2 = 10 Celsius, and what do you have as your final temperature, T3?

    T1 + 3T2 = 4T3  (T1 = 60, T2 = 10, T3 = resulting temperature of mixing)

    T3 = 22.5 C

    The guy directly says they send the hot water to a "tempering building" where they directly mix in raw cold water from the sea, before discharging THAT final result back into the sea.
  • Jeff W
    Not really BS if you've got any actual real-world experience dealing with thermal exchange.


    When I make beer, for example, I can cool a pot of boiling wort down to ~70 degrees F with just over twice the volume of the boiling wort of 'room temp' tap water, using nothing more than a coil of copper and a consistent, low-pressure/volume flow. 

    I imagine that, having dropped the wort down approx 130 degrees F, it wouldn't take much more volume of similarly chilled water to reduce the temperature of the original cooling water in just the same fashion. Of course, I'm only talking about 15 US gallons, and I'm not concerned about the temp of the water too much (except I don't want to boil the roots of my garden when I use it for watering) but I've done this so many times it makes absolute sense on a bigger scale, like here.
    I find it astonishing when someone in 'the technical community' cannot take the time to research some basic science before spouting disbelief of a process you initiate yourself anytime you put ice in your drink.
  • Petzl
    Sorry, I forgot that google is god to some freshmen who brew beer in their frat's garage and missed that class on the laws of thermodynamics.  In any case, google's explanation makes it sound like somehow they are having a zero net effect on the river, which is obviously untrue. There's heat energy that has to be accounted for. Unless they have cooling towers, all that heat is going into the river.  All they're doing is diluting it a lot. Instead of diluting it a little.  I guess the "shock" effect is less than if they directly ejected it, but all that heat energy is still going in. Half a mile downstream, the effect is still going to be the same.
  • numberswonk
    I'm curious how much cooling a normal data center requires that they're replacing? How many tons of air conditioning? What % of that is water cooling and what % is intense refrigerant like freon? I wouldn't think that's a company secret or unique to Google.

    Some years ago I saw a proposal for a data center to be built near Reno, Nevada due to proximity to the power grid, a power plant, plus a series of gas turbine peaker plants. Big fiber ran along I-80, too. I prepared an analysis for wind energy on the site, but at that time it didn't make the cut.
  • mihaelb
    takes a LOT.  one place where I worked had just 15 servers, and they needed an air conditioning unit the size of a large washing machine (think the ones you see in laundromats).  That thing made a huge racket, and I doubt it was economical.  a data center like the ones Google has is many thousands of times bigger
  • L. M. Johnson
    I love to see old things used in new ways. Gives me a strange high.
  • Fire Angel
    Great ideas.  A lot of companies could do with reducing their impact on the environment, and I like the way they went further than the law required them to do by re-cooling the water before returning it to the sea.

    Also building that centre next to the sea is fine, it will be quite a few years before it has to move even if global warming has worse than expected effects on sea level.  Finished paper doesn't like water either, so I think it's safe to assume that the building keeps out the weather.
  • Jamoke Balaa
    Building a data center next to the sea is insane.  Water and electronics don't mix.  There is a phenomenon called a storm that could cause havoc.  Apparently Google does not believe in the validity of global warming either (which is more severe in the Northern Latitudes).  Wealthy idiots!!!
  • Joachim Sjöblom
    It's in Finland, where there's a sizeable amount of islands, which're brilliant for breaking up waves, between the coast and the open sea. Not to mention that the Gulf of Finland is too narrow and shallow for bigger waves to develop. There's also all kinds of weather up there (Temperature difference between summer and winter can be as much as 70-80 degrees centigrade), so the buildings are quite...versatile. Also, I'm fairly sure global warming is the reason Google does shit like this
  • Eez Feedz
    Tons of computers use liquid cooling.... Plus if this building has survived for 50+ years, methinks it'll be fine for the next little while.  It's probably already been through all kinds of weather over that time span anyway. 

    lol global warming.... I'm not touching that one.
  • keithschm
    "Global warming"? Really?????   I have a bridge to sell you in Brooklyn if you are interested?
  • Hunter Shoptaw
    Yes, Global warming, really. If you don't believe in it, then you just haven't read the facts.
  • keithschm
    Global Warming is a political tool. Climate change happens as it has for millions of years. Pollution and Stupidity is an issue.
  • Chris Hansford
    Is your bridge located somewhere on this great flat expanse that we call Earth?  You know, the center of all the universe, around which revolves the cosmos?
  • b1313536
    It's nice that they keep their impact to a minimum by cooling the water down... 
    But imagine a future where there are bays of new species of flora and fauna... basking in the warm waters coming from datacenters. Think eerie beauty like Icelandic or Japanese thermal springs, or think of the conditions that helped "cook" Cueva de los Cristales... Only this time it's people connecting over the Internet, and computers doing work for them, that is creating those micro climates! Processing sustaining life as a side effect...



    (Hopefully this weird future will not last long. Heat = inefficiency)
  • mihaelb
    Wow!  nice!

    They even bothered with cooling the water down a bit before returning it to the ocean.  Very environmentally-conscious. A lot of companies can and should follow this example.  Maybe Google should outsource consulting services like these?
  • Witchdoc59
    Dear Google;

    I know of another nice abandoned paper mill.   Plenty of power,  on ocean front property.  World class salmon fishing and scuba diving, great weather.
  • R R
    Nice.  My company (in a warm Pacific Island climate) cools the building so cold that workers then put space heaters under their desks!  Crazy.

    Question: Do they also serve sardines in the employee cafeteria?
  • woodyeagle
    What do they use to cool the water before returning it to the sea?
  • 3weight
    you're kidding, right?
  • aworldnervelink
    This explanation doesn't make a whole lot of sense. Sure, by mixing in more sea water the temperature of the discharge water is lowered... but the total amount of heat discharged remains the same, it's just spread out over a greater volume of water.
  • debasser
    Obviously 3weight you can't read: "When the process is finished, the cold Baltic water is no longer cold. But before returning it to the sea, Google cools it back down — with more cold sea water pulled from the Baltic. “When we discharge back to the Gulf, it’s at a temperature that’s similar to the inlet temperature,” Kava says. “That minimizes any chance of environmental disturbance.”"
  • 3weight
    debaser, trust me when I tell you that you're a complete imbecile:

    my reply was to woodyeagle (take all the time you need to figure it out...)

    now go put your helmet back on and finish your chocolate milk.
  • bertbopper
    Too bad other brands than Nikon don't have auto-vignetting correction for wide angle lenses...
  • Knowles2
    Much of Facebook's infrastructure is rumoured to run on many of the ideas Google developed and released early in its life in the form of research papers, which the open source community used to develop their own versions, which Facebook and the like now use in their own facilities. Considering Facebook is now a major rival of Google, it's no wonder that Google is keeping everything so close to its chest. By the time we learn anything substantial about Spanner they will probably have developed Spanner 2.
