I wrote an article about the need for low-carbon and sustainable computing and the path towards zero-carbon computing.

tl;dr:

** The problem:
* By 2040, emissions from computing alone will be close to half the emissions level acceptable to keep global warming below 2°C. This growth in computing emissions is unsustainable: it would make it virtually impossible to stay within that warming limit.
* The emissions from the production of computing devices far exceed the emissions from operating them, so even if devices become more energy efficient, producing more of them will make the emissions problem worse. Therefore we must extend the useful life of our computing devices.
** The solution:
As a society we need to start treating computational resources as finite and precious, to be utilised only when necessary, and as effectively as possible. We need frugal computing: achieving the same results for less energy.
** The vision: please read the article, I'm out of characters.

wimvanderbauwhede.github.io/ar

@wim_v12e this is really important! thank you for spreading the word

@categorille Thank you for reading it, and for saying so. I really hope I'll be able to make a contribution with this.

@wim_v12e I like your broad view on #greenIT!

I have been reading sustainability reports of some hosting providers, and mostly they talk only about efficiency (= fewer idle servers, newer servers, and less cooling)... but #sustainability also encompasses consistency (which servers do I buy, which energy do I consume, which buildings do I have) and sufficiency ("less is more"/use only when necessary)

The reports I liked the most, though:
- hostsharing.net/ziele/digitale (it's German only; they explicitly say that they run their hardware as long as possible, and when they do buy new hardware, the decision is also based on sustainability aspects)
- ungleich.ch (I believe their message, but they are small...)
see also
thegreenwebfoundation.org/
and
lite.framacalc.org/green-webho
(maybe you have some other good examples)

But here again, hosting providers are just the tip of the iceberg, relying on other resources like networks, buildings, hardware vendors, cooling...
- pad.hacc.space/heat-producing-
- pad.hacc.space/green-hardware-

@greenfediverse FYI

@aligyie @wim_v12e @greenfediverse Buying new hardware as rarely as possible can be very bad for energy efficiency. Updating a rack of 2010 servers to 2020 ones can easily cut the power consumption in half while increasing the computing power at the same time.

@dmbaturin @aligyie @greenfediverse
In general for computing devices, servers as well as desktops or laptops, the total emissions from production exceed those from operation over a currently typical lifetime. So upgrading too soon to newer servers results in net higher emissions.
Of course the economic argument is different: it will reduce your power bill.
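
A rough back-of-the-envelope sketch of that tradeoff, as a small Python script; every number in it is a placeholder I made up for illustration, not a figure from the article:

# Embodied vs operational emissions for a server upgrade (illustrative only;
# all constants are assumed placeholder values, not measured data).
EMBODIED_NEW_KG = 1500.0   # assumed embodied CO2e of one new server, in kg
OLD_POWER_KW = 0.4         # assumed average draw of the old server, in kW
NEW_POWER_KW = 0.2         # the replacement draws half, per the claim above
GRID_KG_PER_KWH = 0.25     # assumed grid carbon intensity, kg CO2e per kWh
HOURS_PER_YEAR = 24 * 365

# Operational CO2e saved per year by running the more efficient server
saved_per_year = (OLD_POWER_KW - NEW_POWER_KW) * HOURS_PER_YEAR * GRID_KG_PER_KWH

# Years of operation before those savings repay the embodied emissions
break_even_years = EMBODIED_NEW_KG / saved_per_year
print(f"operational savings: {saved_per_year:.0f} kg CO2e/year")
print(f"break-even after {break_even_years:.1f} years")

With these placeholder numbers the upgrade saves about 438 kg CO2e per year but only breaks even after roughly 3.4 years, and the payback period gets longer, not shorter, as the grid decarbonises.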

@dmbaturin @greenfediverse
4 problems I see when replacing hardware too often:
1. the replaced hardware will still be used by some third parties. So the overall consumption keeps growing and growing, which we see in the graphs in Wim's article.
2. recycling of computer waste is still a big problem. There are many movies out there, like "Sodom", that go into more detail. We are also missing Cradle to Cradle hardware; there is a lot to do to get to a circular economy here. Fairphone did a great job for mobile phones, but a great part of the IT world is lacking a similar solution. HP is one of the better ones, but lobbies against the right to repair...
3. production of computer hardware is highly complex, requires big fabs and is far from being sustainable, see @wim_v12e 's recent comment.
4. for power consumption we have at least a theoretical solution: wind/solar energy + hydrogen for UPS - but this definitely does not work if we build datacenters like crazy, as described in Wim's article.

@aligyie @greenfediverse @wim_v12e Oh, I'm not saying people should replace the hardware often. More that there are tradeoffs, and sometimes they can be big enough to justify that.

@aligyie @dmbaturin @greenfediverse That's right, the current uptake of renewables is too slow to cover the projected rise in demand for computing. Even nuclear doesn't help; it takes too long to build the extra capacity. So the only current solution is not to use more power. And that will be the case for the next two decades at least, but most likely even longer.

@dmbaturin @aligyie @wim_v12e @greenfediverse Buying new hardware is not better: most of the energy consumed by IT equipment comes from its production, not its use. So using your hardware for as long as possible is better for the planet.

@aligyie @greenfediverse

Thanks for reading, and also for the references! I'll check them out!

@wim_v12e I'm not a fan of replacing computers every 2 years and I like the idea of aiming to increase computers' lifespan, but the part about eventually stopping the production of new computers sounds pretty dangerous.
If we stop making computers, we will soon forget how to make them. The only people capable of doing it will die, the infrastructure will fall apart or get sold for scrap. We will end up relying on a technology nobody understands. That's very fragile IMO.

@wolf480pl That is a very interesting point. It is not quite as dramatic as that, though. With my assumptions it would take several centuries to get there. And I am not saying that we should not make computers anymore, only that once you've made one, it could be expected to last forever. Of course there will still be devices that fail and can't be repaired.
And I don't quite agree that not making something means the knowledge will be forgotten. The fact that we need to be able to repair them means we need to know how they work.

@wolf480pl In particular, CPUs and memory don't last forever, there are physical limits (electromigration etc.), so we'll have to keep producing ICs, only fewer of them, and use them for much longer. And we'll need devices where we can replace the ICs, like in the good old days of yore ^_^

@wim_v12e @wolf480pl another option I see is that by the time we have developed this magical "everlasting hardware", we will probably also have developed better ways of recycling. It will still take a huge amount of energy, but the failed or outdated hardware will hopefully provide a big part of the needed resources which are currently ripped out of the earth and have other, non-CO2 impacts on the environment.

@wim_v12e thanks for writing this eye(further)opening article!
the environmental impact of computing is one of the reasons why I struggle with pursuing an "IT career"
On one side I am amazed at what is possible and like to tinker around with hard- and software, but on the other side I am just scared by how carelessly most IT folks show off with their always-bleeding-edge hardware, literally throwing it around or abusing it with totally useless, inefficient software.

@glowl Thanks for reading it! There is really a huge need for a change in attitude on all sides: manufacturers, tech companies, IT people and end users. I am a computing scientist, so I work on the technical side, but the sociological and economic aspects are actually more important.

@wim_v12e if I find the time I would like to translate the article into German; I know a few people who could benefit from reading it but aren't particularly fluent in English.

would that be ok for you?

@glowl Yes of course. My German is not good enough to write it but I'll be able to read the translation just fine. Thanks!

@wim_v12e Very nice article :) I especially liked the structure and that you gave a summary in the beginning. This makes reading much easier for me.

@maxi Thank you for saying so, I'm glad the "key points" at the start were of help to you.

@wim_v12e I definitely agree that we should be trying to make computing more environmentally sustainable, but I just can’t quite get behind the wording of this article. It comes off too much as “computer bad, kill it” for me, and I’m not sure how I feel about that.

@wim_v12e I think what threw me off was “to be utilised only when necessary, and as effectively as possible.” Not so much the effectively part, since we should strive to have software work effectively and efficiently, but more the utilization part. Kinda came off as a “don’t use it, ever” tone.

I personally enjoy programming, especially in areas like app development and game development. I usually strive to make efficient stuff, and I do regret making Hyperspace Desktop with Electron years ago (we’ve since been looking to move on in areas like Starlight and maybe a Flutter client, for various reasons). I also like exploring and integrating machine learning when it’s relevant and has benefits, though I wouldn’t necessarily pull out deep learning from the get-go, because that’s expensive.

It’s likely just me that has an issue with the wording, though I do agree with you on making stuff more efficient and environmentally sustainable, such as moving away from Electron apps, preferring native toolkits (or native-performing cross-platform toolkits), not upgrading devices every year, and focusing on computational complexity.

@wim_v12e With that said, I’ll be sure to share this with my fellow CS peers to make them at least aware of it. I know my ML class last semester looked into the computational and environmental costs of deep learning and agreed we shouldn’t necessarily use it for everything and try other algorithms first.

@marquiskurt I suppose you can read it that way. My view is more that computing until now has been treated effectively as an unlimited resource, and we should treat it as a limited resource. And any limited resource must be used frugally.

@wim_v12e Of course, I think we’re beginning to see that as the chip shortages continue, along with other companies trying to recycle and reuse the materials to make the next product. I suppose a good start would be warning people to not upgrade their device every year, hardware-wise, and pushing to make more native apps over stuff like Electron apps. Over time, we should try to tackle bigger things such as making games (especially triple-A titles) more energy efficient and getting companies to support their devices for much longer. I anticipate the latter will be a bit easier for Apple since they tend to support devices for quite a while, but we should push for other companies to go beyond one or two years of support. Also continuing to support projects like Ubuntu Touch can help to bring new life to older devices, granted that they are still supported (or make a port to make them supported again).

@marquiskurt I think so too, change is already happening, but it is too slow right now. What matters most is to get the message out to as many people as possible, so a change in consumer attitude leads to more pressure on companies, and also that developers see the importance of reducing emissions and see ways to do it. On the whole I am still positive.

@wim_v12e Are you planning on publishing an academic paper on this topic or have a link to a pre-print? I appreciate you.

@theruran I think that I can't compete with the existing published papers at the moment. For example this one [1] has a very similar message, so I don't think I'd have much to add.

But I am now officially leading the "Low Carbon and Sustainable Computing" activity in my department, so eventually I expect to publish on this topic.

[1] cell.com/patterns/fulltext/S26

@theruran I want to focus more on outreach though, giving talks about the article I wrote. So if you know of a virtual venue, let me know ^_^

@theruran My main Low Carbon activity at the moment is as a participant in a project to propose recommendations to achieve a net zero digital research infrastructure for the UK by 2040 or sooner. I will be one of the co-authors of that report.

net-zero-dri.ceda.ac.uk/

I don't like the concept of "net zero" but in this project, what it means is that after we have applied all possible ways to reduce emissions from this infrastructure, there will still be some emissions, e.g. from manufacturing. So to achieve actual zero, offsetting will be needed.

I am very critical of offsetting and even more of carbon capture/sequestration and storage though. I think if we do all the rest, we might not need the offsetting part anyway.

The idea is that our recommendations will be applicable to industry and organisations as well, because of course the UK digital research infrastructure is a small contributor even to the UK's emissions. The purpose is to serve as an example of what can be done.

@wim_v12e thanks for the reference, it's very comprehensive! and good luck in your new role 😊

The idea is yet another example of "first world problem" expediency.


@wim_v12e great article. I don’t think consumers understand how energy intensive chip manufacturing is. Even improving production yield at SMT and assembly is a huge issue, let alone the waste once the product actually reaches a consumer.

@kiwiguy Quite so, and not just the chips. For example, for IoT devices it's the plastic in their casing and packaging that has the highest manufacturing emissions.

@wim_v12e the article is magnificent, thank you for it. I'll be sharing it everywhere.

@m2m Oh, thank you, please do! I'm also going to give talks about it, so if you know of a community that wants to hear this, let me know.

@wim_v12e 25 years! I thought I was doing great with my 10-year-old desktop and 7-year-old laptop (both of which I do not intend to replace before 2025 unless I have a compelling reason to do so).
A computer that is 25 years old today would be an old-world Mac or a PC built for Windows 98. I agree with the article but woof! That’s a big thing to ask.

@reinderdijkhuis The rate of increase in memory, CPU and number of cores has slowed down a lot over the past 10 years though, so that there is effectively not much difference between a computer from 10 years ago and today, whereas as you point out, there is a huge difference with 20 years ago. I would gladly have kept on using my 10-year-old computer for another 10 years. Making that possible should be our goal.

@wim_v12e Agree entirely. By the way, as I wrote up the post for my secret link log, I realized that maybe some people are getting hung up on the definition of scarcity that is implicit in the article. You mean computer resources are scarce-as-in-oil, but some people, including in this thread, read it as scarce-as-in-gold. Is this a good way to think about it?

@reinderdijkhuis What is the difference between scarce-as-in-oil and scarce-as-in-gold? I am not familiar with those terms.

I did not say that compute resources are scarce, I said that we should treat them as finite and precious.

Finite does not necessarily mean they have to be scarce, that depends on the demand vs availability. For example, there is a finite supply of air in the world, but as yet air is not scarce. (It could become scarce if we keep on polluting it of course.)

My ideal is definitely to do more with less, so that the finite amount of compute bounded by the need for sustainability lets us do all that is necessary.

But it may very well be that if we truly limit our computing to what is sustainable, it will become more scarce and therefore more expensive.

Can you elaborate what people are hung up about? What is the problem they see?

Never mind, forget I said anything. Sorry to waste your time.

@wim_v12e A possible positive side-effect of frugal computing could be that it would discourage the trend of making everything around us ‘smart’. A return to ‘dumb’ as the norm would mean more privacy and security for all.

@TomEtty The most frugal computing is no computing, so I agree with that. It would be nice. The businesses that push "smart" would not like it, though. Still, things like the right to repair in particular might give a push in that direction. "Smart" all too often means "impossible to repair".

@wim_v12e Thank you for this great article. Read it on the train and directly discussed it.

I would love a solarpunk future where my device stays with me for at least 25 years. I thought about learning micro-SMD soldering to repair my devices with minor defects (e.g. the charging connector), but that wouldn't solve the problem of software support ending. Policies would help.

@philipp Thank you for reading and discussing it! Spreading the message is my main aim.
