Do we pollute when we store our information in the cloud?

25 January 2022
Database

Ana Freire
Director of the Operations, Technology & Science Department

Why is a floppy disk still used as the icon for saving our digital documents? Hardly anyone under the age of 30 has ever used or even seen this storage device; some users have even mistaken the icon for a vending machine. Not only is the floppy disk no longer in use, but the meaning of the icon has also changed: it has gone from representing the few megabytes a floppy disk could hold to giving us access to “unlimited” storage: the cloud.

The cloud is made up of hundreds or thousands of heat-dissipating computers housed in industrial buildings that require large infrastructures to maintain their temperature

Some people attribute ethereal qualities to this cloud, but it is something entirely tangible and earthly. The cloud is made up of computers that store and process data of all kinds: our work documents, e-mails, photos, the movies we watch online, or the pages we search for on Google. This adds up to huge amounts of information that cannot be stored on a single device. Hence, the cloud consists of hundreds, or even thousands, of computers housed in industrial buildings (data centres) belonging to the companies behind the applications we use every day, such as Instagram, Amazon, or Netflix. To give an idea of the size of the “cloud”, Google alone has data centres in more than 20 locations around the world.

Energy cost of storing in the cloud

Let us look at a much smaller scale: our personal computer or our smartphone. Who has not noticed at some point that these devices overheat under excessive use? The heat dissipated by a laptop when we rest it on our lap is perfectly perceptible, especially when it is performing a demanding task. If the temperature soars, the device may stop working for a short period of time.

Back to the cloud. Hundreds or thousands of computers dissipating heat in a single space, even a large one, can raise the room temperature considerably. For this reason, all data centres have infrastructures to maintain the temperature of these spaces at levels that allow the machines to function properly. Air conditioning can account for up to 40% of the energy consumption of these centres.

A widely used strategy is to locate data centres in cold climate zones or to cool them with sea water, as Google does at one of its centres in Hamina (Finland)

As early as 2016, it was estimated that Google consumed more energy than the entire city of San Francisco. Since this has a direct economic impact, companies like Google are making considerable efforts to reduce their energy consumption. One widely used strategy is to locate data centres in cold climate zones in order to cool them with outside air, or even with sea water, as is done at a Google data centre in Hamina (Finland).

Artificial Intelligence is also helping to improve the energy efficiency of these data centres. Deep learning algorithms are trained with data from sensors placed throughout these large infrastructures to predict the PUE (Power Usage Effectiveness), the ratio between the total energy consumed by a data centre and the energy consumed by its IT equipment alone. This makes it possible to estimate the energy used by the rest of the equipment, such as the cooling systems.
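To make the metric concrete, the short sketch below computes the PUE from two annual energy figures. The numbers are invented for the example and do not come from any real data centre:

```python
# Minimal illustration of the PUE (Power Usage Effectiveness) metric.
# The figures below are hypothetical, expressed in megawatt-hours per year.

total_facility_energy_mwh = 12_000  # servers + cooling + lighting + power losses
it_equipment_energy_mwh = 8_000     # servers, storage and network gear only

# PUE = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every kilowatt-hour goes to computing;
# real data centres are always above that.
pue = total_facility_energy_mwh / it_equipment_energy_mwh
print(f"PUE = {pue:.2f}")  # -> PUE = 1.50

# The difference is the overhead spent on everything that is not IT,
# typically dominated by cooling.
overhead_mwh = total_facility_energy_mwh - it_equipment_energy_mwh
print(f"Non-IT overhead = {overhead_mwh} MWh per year")  # -> 4000 MWh per year
```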

How to control the technological carbon footprint

Artificial Intelligence might seem to be a solution to the high energy consumption of large data centres. However, it has recently been pointed out that it can itself generate a considerable carbon footprint. This is largely due to the innovative natural language processing models it is being used to build. These models, based on Artificial Intelligence, seek to understand human language in a very advanced way in order to develop, for example, more complex conversational agents than today's Siri or Alexa virtual assistants. To do so, they are trained on huge amounts of text written by humans (for example, the entire set of Wikipedia pages).

AI is an agent capable of generating a high carbon footprint: a recent study estimated that training the BERT algorithm consumes as much energy as an airplane flying across the US

It is easy to understand that processing these texts is no simple task and involves a heavy computational load. A study by the University of Massachusetts estimated that BERT, the algorithm designed by Google to optimize its search engine by better understanding its users’ queries, requires as much energy to train as an airplane flying across the United States from San Francisco to New York. Systems like this will be in great demand, both for assistance robots and for automating customer service tasks, so we can expect their use to grow in the immediate future.

Social networks, mailboxes, chatbots... all these technologies demand energy without the end user noticing. Accelerated digitization and the growing use of technology, especially after the COVID-19 pandemic, are generating a carbon footprint that needs to be kept under control. As end users, let us contribute to reducing these emissions by making responsible use of the technology we use every day: freeing up storage space in our cloud applications, uploading less multimedia content to social networks, or exchanging links instead of attaching files. Let us be aware that the save icon hides an energy cost that impacts the other side of the planet.
