The energy cost of mining cryptocurrencies can match or exceed the cost of mining an equivalent value of physical metals, reports a paper published online this week in Nature Sustainability.
In the context of cryptocurrencies, ‘mining’ is the competitive activity in which computers run intensive calculations to confirm new transactions and add them to the currency’s shared public ledger - called the blockchain. New coins are awarded to the first computer(s) to complete each calculation successfully. Cryptocurrency coins therefore have not only a real-world value - in that they can be traded or spent online - but also a real-world cost: the energy consumed by the computers competing to create them.
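The competitive calculation described above can be illustrated with a simplified proof-of-work sketch. This is not the paper's methodology or real Bitcoin mining (which uses double SHA-256 against a far harder numeric target); it is a minimal toy showing why mining burns energy: the computer must try nonce after nonce until a hash meets an arbitrary difficulty condition.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce whose SHA-256 hash starts with
    `difficulty` zero hex digits. The only way to succeed is brute-force
    trial - which is what makes mining computationally (and energetically)
    expensive."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# Each extra zero of difficulty multiplies the expected work by 16.
nonce = mine("example transactions", 4)
print(nonce)
```

Raising `difficulty` by one hex digit multiplies the expected number of hash attempts by sixteen, which is (in highly simplified form) how real networks keep block times constant as more mining hardware joins.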
Max Krause and Thabet Tolaymat calculate the average energy consumed to create one US dollar’s worth of the cryptocurrencies Bitcoin, Ethereum, Litecoin and Monero between 1 January 2016 and 30 June 2018: 17, 7, 7 and 14 megajoules (MJ), respectively. They compare these values with the energy costs of mining one US dollar’s worth of various metals, including aluminium (122 MJ), copper (4 MJ), gold (5 MJ), platinum (7 MJ) and the rare earth oxides used in mobile phones and other electronics (9 MJ). With the exception of aluminium, the authors find that the energy costs of mining cryptocurrencies are comparable to, or more intensive than, those of mining the physical metals - and they expect these costs to rise as more people use, buy and mine cryptocurrencies.
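The per-dollar figures quoted above can be set side by side directly. The sketch below uses only the values reported in the paper, with gold as the comparison point; the variable names are illustrative, not the authors'.

```python
# Average energy intensity (MJ per US$1 mined), 1 Jan 2016 - 30 Jun 2018,
# as reported by Krause and Tolaymat.
crypto = {"Bitcoin": 17, "Ethereum": 7, "Litecoin": 7, "Monero": 14}
metals = {"aluminium": 122, "copper": 4, "gold": 5,
          "platinum": 7, "rare earth oxides": 9}

# Which cryptocurrencies cost at least as much energy per dollar as gold?
at_least_gold = [name for name, mj in crypto.items()
                 if mj >= metals["gold"]]
print(at_least_gold)  # → ['Bitcoin', 'Ethereum', 'Litecoin', 'Monero']
```

All four cryptocurrencies clear the gold threshold, while aluminium (122 MJ) remains the outlier that exceeds them all, matching the paper's conclusion.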
The authors also estimate that cryptocurrency mining generated between 3 and 15 million tonnes of CO2 emissions over the period studied. For example, a coin mined in China generated four times more CO2 than one mined in Canada, owing to differences in how the two countries produce electricity.