Scientists from the U.S. Department of Energy’s Oak Ridge Institute for Science and Education conducted a study evaluating how much energy is consumed by mining cryptocurrencies compared to mining aluminium, copper, gold, platinum and rare earth oxides.
The scientists reviewed the period from Jan. 1, 2016, to June 30, 2018, and found that mining Bitcoin (BTC), Ethereum (ETH), Litecoin (LTC) and Monero (XMR) consumed an average of 17, 7, 7 and 14 megajoules (MJ), respectively, to generate one U.S. dollar of value.
In comparison, mining aluminium, copper, gold, platinum and rare earth oxides consumed 122, 4, 5, 7 and 9 MJ, respectively, to generate the same value. These findings indicate that, with the exception of aluminium, mineral mining draws less energy per dollar than Bitcoin or Monero mining, with only platinum and rare earth oxides matching or exceeding the figures for Ethereum and Litecoin.
Crypto mining power consumption compared to ‘real’ mining. Source: nature.com
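To make the comparison concrete, the study’s per-dollar figures can be tabulated and ranked directly. The following is a minimal illustrative sketch in Python, using only the averages quoted above; the variable and label names are this article’s, not the paper’s:

```python
# Average energy consumed to generate one U.S. dollar of value,
# Jan. 1, 2016 - June 30, 2018, in megajoules (figures from the study).
energy_mj_per_usd = {
    "Bitcoin (BTC)": 17,
    "Ethereum (ETH)": 7,
    "Litecoin (LTC)": 7,
    "Monero (XMR)": 14,
    "Aluminium": 122,
    "Copper": 4,
    "Gold": 5,
    "Platinum": 7,
    "Rare earth oxides": 9,
}

# Rank from least to most energy-intensive per dollar generated.
for name, mj in sorted(energy_mj_per_usd.items(), key=lambda kv: kv[1]):
    print(f"{name:<18} {mj:>3} MJ/$")
```

Ranked this way, only aluminium sits above every cryptocurrency in the sample.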
Moreover, the dataset for the study shows that the energy consumption of three of the four digital currencies (BTC, ETH and LTC) tends to grow from year to year. For instance, in 2016 BTC mining needed 17 MJ to generate one U.S. dollar, but by the end of the study period it consumed 19 MJ.
The report states that the energy requirements per dollar will continue to increase. The scientists conclude that, over the period covered by the study, mining was responsible for 3–15 million tons of carbon dioxide (CO2) emissions.
High energy consumption is considered by some to be an “Achilles’ heel” for major cryptocurrencies. According to a February report, crypto mining in Iceland was expected to consume more energy than the country’s households in 2018.
In May, economist Alex de Vries, who published an article titled “Bitcoin’s Growing Energy Problem,” claimed that BTC mining would use 0.5 percent of the world’s electricity by the end of 2018.
However, U.S. clean energy expert Katrina Kelly later challenged those predictions, stating that the debate was “oversimplified.” Kelly noted that Iceland, for example, relies mostly on renewable energy sources, meaning that bitcoins mined there would have a nearly neutral carbon footprint.