‘There are many outdated server rooms in Estonia.’ How to get in shape for a hot summer?

Remember the heatwaves of last summer? While beach holidays are getting more and more enjoyable in Estonia, high temperatures threaten the most business-critical part of our digital society – the IT infrastructure on which the digital state, banking, cloud services of all kinds, apps, e-commerce, etc. are built.

Photo: Pixabay

Last July, temperatures of more than 40 degrees knocked out Google's data centers in London – the cooling systems proved inadequate against the elements, and the outage also affected WordPress-hosted websites across Europe. The September heatwave then hit Twitter's data center in Sacramento, on the west coast of the US, really hard. Inside the company, the incident was described as ‘outrageous’ – the heat rendered all the physical equipment there inoperable.

If even data centers specifically designed to house servers cannot withstand heatwaves, imagine what this means for organisations that keep servers on their own premises, for example in office buildings without suitable conditions. ‘There are many server rooms and buildings in Estonia with outdated solutions. This is especially noticeable in view of rising energy prices,’ says Allan Suurkask, CEO of BVT Partners OÜ, a company selling ventilation and cooling equipment.

Should we fear heatwaves in Estonia too?

Estonia’s heat record dates back to 1992, when 35.6 °C was recorded at the Võru station. The past two summers have also been exceptionally hot – the summer of 2022 was the second hottest on record. According to the Environment Agency, two prolonged heatwaves reached Estonia, while southern Estonia experienced three heatwaves in total. The only summer hotter than last year’s was the one before it, in 2021.

Tõnu Grünberg, head of Greenergy Data Centers (GDC) in Hüüru near Tallinn, the most powerfully cooled data center in the Baltics, warns that climate change, felt here most acutely as summer heat, will primarily threaten servers in modestly cooled office buildings and in older data centers. A typical office building is simply not designed for this kind of load, and its cooling usually has no backup, so if the equipment stops due to overload, that’s it. Older data centers, in turn, may fail to keep up with newer IT equipment, and tired cooling machinery at the end of its life cycle is a risk factor in itself.

‘Compared to general construction, for example, building for the IT sector is a very young and still emerging field. Solutions that worked 10–15 years ago are now obsolete. Note that this is not just an Estonian problem,’ explains Suurkask. ‘Existing server rooms aside, it is even more unfortunate when new construction or renovation projects fail to include good cooling solutions. Estonia is small and, unfortunately, we are slow to accumulate experience and knowledge. Due to the small size of the market, there are few specialists who can dedicate themselves exclusively to the cooling and air conditioning systems of data centers and servers,’ says the head of BVT Partners.

Even big global players are not ready for heatwaves

Data centers are an integral part of our digital society. Everything we do online physically resides on devices and servers, most of them housed in data centers. Servers need cool temperatures to run, so heat is their biggest foe. The equipment that receives, processes, and transmits data generates heat as it works anyway, so a heatwave is like pouring fuel on the fire.

‘First of all, there is certainly a threat to the most important thing – the reliability and longevity of the servers. The other major problem is energy consumption, and I can honestly say that a large number of people do not realise how much energy is wasted due to old or poorly designed cooling systems,’ explains Suurkask. ‘It often happens that a cooling system is built too powerful and is therefore inefficient. Over time, however, more servers are added and the cooling system can no longer provide the necessary cooling. Sometimes even an overpowered system cannot provide adequate cooling. A tractor is not suitable for pulling a wheelbarrow either, even though it has plenty of power for it,’ notes Suurkask.

Perhaps most surprisingly, even though heatwaves have become more common with every passing year, major companies like Twitter are still not prepared for extreme weather. Twitter’s former head of security revealed last August that it would not take much to shut Twitter down for weeks or months. All it would take is the temporary but simultaneous outage of some data centers, Gizmodo reported.

Valuable lessons emerge from painful experiences

However, there is a cure for this increasingly common IT headache. Grünberg recalls a story from his own experience that offers a clue to how today’s problems can be tackled. ‘I remember an incident from the distant past when mobile communication equipment in a data center overheated. Services were disrupted for thousands of people and fixing the problem took days,’ says the current CEO of GDC. At the time, there were no proper data centers in Estonia, and the problem was caused by a failure of the cooling equipment. ‘As there was no backup unit, the room was no longer cooled and the temperature of the equipment kept rising. We only became aware of the problem when the equipment itself sent an alert; there were no separate sensors in the room. Before we could react, the equipment shut down, and repairs were complicated. Ultimately, the lesson was that relying on a single cooling unit is not a reliable plan,’ says Grünberg.

Others have similar experiences to share. ‘I recall an incident from a previous job where the air conditioner in the server room shut down due to voltage fluctuations. There was no backup device, so an alert reached me soon after,’ says Aivar Karu, CIO of GDC. ‘I was at home at the time and, as the workplace was close by, I arrived quite quickly. By then, however, most servers had already performed a thermal shutdown. I remember that the motherboard of one server had warped in the heat so badly that the memory DIMMs had fallen out,’ says Karu.

Both examples highlight the core of the problem: in each case, the company was relying on a single device. To answer the question in the title, the key word here is redundancy of support systems, meaning having more of something than the bare minimum requires. While redundancy is often a nuisance elsewhere, it is indispensable for business-critical infrastructure.
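To make the lesson concrete, here is a minimal monitoring sketch in Python: an independent polling loop that raises an alert when the room temperature climbs or when cooling redundancy is lost. The thresholds, sensor hooks, and notification function are illustrative assumptions only, not a description of any system mentioned in this article.

```python
# A minimal, illustrative sketch: poll an independent room sensor and the
# cooling units, and alert well before servers reach thermal shutdown.
# All thresholds and hook functions below are assumptions for illustration.
import time

WARN_TEMP_C = 27.0       # assumed early-warning threshold for the room
CRITICAL_TEMP_C = 32.0   # assumed point where servers risk thermal shutdown
MIN_HEALTHY_UNITS = 2    # assumed requirement: at least two cooling units running

def read_room_temperature() -> float:
    # Placeholder: replace with a reading from a dedicated room sensor,
    # independent of the servers themselves.
    return 24.0

def healthy_cooling_units() -> int:
    # Placeholder: replace with a status check of the actual cooling units.
    return 2

def alert(message: str) -> None:
    # Placeholder: replace with an SMS, e-mail, or on-call notification.
    print(message)

def monitor_once() -> None:
    temp = read_room_temperature()
    units = healthy_cooling_units()
    if units < MIN_HEALTHY_UNITS:
        alert(f"Cooling redundancy lost: only {units} unit(s) healthy")
    if temp >= CRITICAL_TEMP_C:
        alert(f"CRITICAL: server room at {temp:.1f} °C")
    elif temp >= WARN_TEMP_C:
        alert(f"Warning: server room at {temp:.1f} °C and rising")

if __name__ == "__main__":
    while True:
        monitor_once()
        time.sleep(60)  # poll once a minute
```

The point is not the specific script but the principle both stories illustrate: the warning has to come from something independent of the equipment that is overheating, and before a single point of failure takes everything down.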

How are servers protected from the heat in the largest data center in the Baltics?

It pays to learn from the best. GDC’s cooling systems are built on the logic that both the shared data room and the private rooms have backup devices that switch on if any of the units normally keeping the temperature in range stops working. Risks must be mitigated as early as the design stage. ‘Our server rooms are designed and built so that the power and cooling equipment has capacity in reserve and, in addition, there are backups for everything. That means we are ready for a greater workload, and if something were to happen, the backup systems take over,’ Grünberg points out, detailing the advantages of a building constructed specifically as a data center.

Location also plays a role. The GDC data center sits on the outskirts of Tallinn, on a limestone cliff 43 metres above sea level, with the cooling equipment itself at 60 metres above sea level. This means the building is exposed to sea winds that provide constant relief, and on average only around 300 hours of additional cooling are required per year.

The automation of the cooling systems is entrusted to artificial intelligence created by Siemens, i.e. a self-learning computer program that operates the devices maintaining the temperature in the server rooms. This is an innovative solution even by global standards and delivers significant energy savings.

Seemingly far-fetched solutions are also used in other parts of the world. Microsoft, for example, has been experimenting with underwater data centers, where heat is not an issue, for nearly a decade. As part of the pilot project, the first prototype underwater data center, with a diameter of 2.5 metres, was deployed at a depth of nine metres in the Pacific Ocean in 2015. There has also been talk of building a nuclear-powered data center on the Moon, which would allow new cooling methods to be used.

However, what suggestions do we have for those who do not wish to move to the Moon, go underwater, or even use the services of the most reliable data center in the Baltics, but seek to use their own server room or office building instead? How can we ensure the necessary conditions so that we do not have to watch the weather forecast nervously all summer? Suurkask recommends the following:

  • Map your needs as accurately as possible in terms of power, redundancy, temperature, and humidity; a rough sizing estimate, like the sketch after this list, can be a starting point. Keep in mind that ‘just in case’ overprovisioning usually means not only a higher investment but also higher running costs.

  • Pay attention to the airflow in the room – the ultimate goal is not to cool the room but the servers in it.

  • If possible, consider ‘free cooling’ (for systems that are large enough). Our climate offers opportunities for energy savings.

  • Heat recovery is technically more complex than free cooling, but there are nowadays many ways to reuse the heat produced by servers.
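For the first point on this list, mapping your needs, a rough back-of-the-envelope estimate is a useful starting point. The sketch below uses entirely illustrative numbers and assumes that practically all of the electrical power drawn by IT equipment ends up as heat the cooling system must remove, then adds headroom for growth and a simple N+1 redundancy margin.

```python
# Back-of-the-envelope cooling capacity estimate (illustrative numbers only).
import math

it_load_kw = 20.0        # assumed total power draw of the servers, in kW
growth_headroom = 0.3    # assumed 30% allowance for adding servers later
unit_capacity_kw = 10.0  # assumed capacity of one cooling unit, in kW
n_plus_one = True        # keep one spare unit so a single failure is survivable

# Nearly all electrical power consumed by IT equipment turns into heat,
# so the cooling system must be able to remove roughly the same amount.
design_heat_load_kw = it_load_kw * (1 + growth_headroom)

units_needed = math.ceil(design_heat_load_kw / unit_capacity_kw)
if n_plus_one:
    units_needed += 1

print(f"Design heat load: {design_heat_load_kw:.1f} kW")
print(f"Cooling units to install: {units_needed} x {unit_capacity_kw:.0f} kW")
```

With these example figures, a 20 kW server load would call for four 10 kW units: three to cover the design heat load including growth headroom, and one spare, which is exactly the kind of redundancy the stories above argue for.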

In short, there are many options. According to Grünberg, summer presents challenges for organisations operating in the digital field, but they can be overcome: ‘Just as it is worth putting on protective sunscreen before going to the beach on a hot day, it is wise to proactively take measures to ensure the reliability of your IT equipment to avoid interruptions caused by the heat’.
