What do hydrogen, quantum computers and satellites have in common? 🫧💻🛰️
They're all key technologies to reach climate neutrality by 2050.
Our #StrategicForesight report identifies 10 areas for action to maximise synergies between the green and digital transitions in the EU.
@EU_Commission These claims go entirely against the consensus in the field.
The projected growth in AI, blockchain, and IoT will lead to a massive rise in emissions, not a reduction. And contrary to your claim, none of these technologies is essential to reducing emissions.
Quantum computing is unlikely to be mainstream by 2050 and currently shows no promise of energy efficiency; there are far more promising compute technologies. Space-based services cause emissions in the upper atmosphere, which warm those layers and make global warming worse.
Please check with experts before posting things like this. (fwiw, I am an expert in low-carbon and sustainable computing, so the green & digital transition is my area.)
@wim_v12e @EU_Commission These proposed 'solutions' seem like more tech solutionism to me. There are also no points on reducing the emissions of current activities.
Do you have references to publications for your claim that this is the consensus? I'd be curious to read up on it.
@frox In terms of how much consensus there is on this: I was reviewing a draft whitepaper on this topic (what they call the "Twin Transition", green + digital) by the organisation grouping twenty-one of Europe's most distinguished research-intensive universities in sixteen countries, and they echo the views in that article (and, fwiw, in my own article https://wimvanderbauwhede.github.io/articles/frugal-computing/).
@wim_v12e Does performance per Watt really not increase exponentially anymore? My main computer is now 3x as fast (when clocked down to consume the same 30W of energy) as my previous home server was. Processors in notebooks have reduced their consumption and are much better at clocking down when not needed. @frox
@ArneBab @frox Good point, thank you, I need to clarify that. Performance per Watt is still growing, but no longer as a pure exponential: the time required to double it gets longer and longer, so on a semi-log plot the curve bends below a straight line. The projection is that it will saturate within the next two decades. I will amend that in the article. In practice it does not change the conclusion, as the projected growth in computation is higher than the projected rate of improvement in performance per Watt.
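To make that concrete, here is a toy model of a doubling time that keeps stretching (all numbers are my own illustrative assumptions, not measurements): performance per Watt keeps rising, but each doubling takes longer than the one before, which is what approaching saturation looks like.

```python
# Toy model: assume doublings took ~1.6 years around 2000 and each
# subsequent doubling takes longer. Purely illustrative numbers.
def doubling_time(year):
    return 1.6 + 0.2 * (year - 2000)

perf, year = 1.0, 2000.0
while year < 2040:
    year += doubling_time(year)   # the gap between doublings keeps growing
    perf *= 2.0
    print(f"{year:6.1f}: performance/Watt x{perf:.0f}")
```

The printed gaps between doublings widen from under two years to nearly a decade, so the gains keep coming, just ever more slowly.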
@ArneBab @frox Yes, on a global scale. For global warming that is the only scale that matters.
It is not as if any country already has excess renewable energy, and global energy demand is growing. The scale at which computing is projected to grow could easily gobble up all that renewable energy and leave the rest of the world to burn fossil fuels. This is very much the strategy of Google, Microsoft, etc.: they can claim they're green at the expense of the rest of the world.
@ArneBab @frox In the end, it is about utility. The problem with the current system is that utility is only considered in terms of money, so a company's utility of computing is determined by the profit it generates, not by its benefit to society. There is no incentive to take the externalities (ecological as well as social) into account.
The ideal situation is one where it is profitable to be green, and similarly for social responsibility. But to get there, we can't keep the status quo. I think we need a combination of consumer demand and government intervention.
But for both of these, the awareness needs to be there that unlimited growth in computing is disastrous. That is my main message.
On the tech/science side, I strongly believe that we can achieve computational tasks at a fraction of the current energy cost; there has simply been no incentive to get there. But doing that on its own is not enough, because demand would then simply increase, offsetting the gains (the rebound effect).
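A back-of-the-envelope example of that offsetting, with purely illustrative numbers (nothing here is measured data):

```python
# Rebound arithmetic: a 2x efficiency gain is wiped out by 3x demand growth.
tasks_before, tasks_after = 100, 300       # assumed 3x growth in demand
energy_per_task_before = 1.0               # arbitrary units
energy_per_task_after = 0.5                # assumed 2x efficiency gain

total_before = tasks_before * energy_per_task_before   # 100.0
total_after = tasks_after * energy_per_task_after      # 150.0
print(total_before, total_after)  # total energy use still rises by 50%
```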
@wim_v12e I like your long answer. As for ways to get there: optimizing hot code paths, either by working in a fast language from the start or by delegating them to a fast language, can easily make a factor of 10 to 100 difference, according to the benchmarksgame: https://benchmarksgame-team.pages.debian.net/benchmarksgame/box-plot-summary-charts.html. You could say "never use Python", until one pulls out Cython and gets the speed of C for 95% of the runtime while writing 95% of the code in fewer lines (lines that consume high-powered developer compute). @frox
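For concreteness, a minimal sketch of that Cython pattern (the module name hotpath, the dot function, and the build command are my own assumptions; it needs the cython package installed): the hot loop gets static C types and compiles down to plain C, while everything else stays ordinary Python.

```python
# hotpath.pyx -- hypothetical module holding one hot code path.
# Build in place with: cythonize -i hotpath.pyx

def dot(double[:] xs, double[:] ys):
    """Dot product over typed memoryviews; the loop compiles to plain C."""
    cdef double acc = 0.0
    cdef Py_ssize_t i
    for i in range(xs.shape[0]):
        acc += xs[i] * ys[i]
    return acc
```

From the caller's side nothing changes: import hotpath and pass it any buffer of doubles (array.array('d', ...) or a NumPy array); the other 95% of the program stays plain Python.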