** The problem:
* By 2040, emissions from computing alone will be close to half of the emissions level acceptable to keep global warming below 2°C. This growth in computing emissions is unsustainable: it would make it virtually impossible to meet the 2°C warming limit.
* The emissions from producing computing devices far exceed the emissions from operating them, so even if devices become more energy efficient, producing more of them will make the emissions problem worse. Therefore we must extend the useful life of our computing devices.
** The solution:
As a society we need to start treating computational resources as finite and precious, to be utilised only when necessary, and as effectively as possible. We need frugal computing: achieving the same results for less energy.
** The vision: please read the article, I'm out of characters.
@wim_v12e I think what threw me off was “to be utilised only when necessary, and as effectively as possible.” Not so much the effectively part, since we should strive to have software work effectively and efficiently, but more the utilization part. Kinda came off as a “don’t use it, ever” tone.
I personally enjoy programming, especially in areas like app development and game development. I do usually strive to make efficient stuff, and I do regret making Hyperspace Desktop with Electron years ago (we’ve since been looking to move on with projects like Starlight and maybe a Flutter client, for various reasons). I also do like exploring and integrating machine learning when it’s relevant and has benefits, though I wouldn’t necessarily pull out deep learning from the get-go, because that’s expensive.
It’s likely just me that has an issue with the wording, though I do agree with you on making stuff more efficient and environmentally sustainable, such as weaning off Electron apps, preferring native toolkits (or native-performing cross-platform toolkits), not upgrading devices every year, and focusing on computational complexity.
@wim_v12e With that said, I’ll be sure to share this with my fellow CS peers to make them at least aware of it. I know my ML class last semester looked into the computational and environmental costs of deep learning and agreed we shouldn’t necessarily use it for everything and try other algorithms first.
@marquiskurt I suppose you can read it that way. My view is more that computing until now has been treated effectively as an unlimited resource, and we should treat it as a limited resource. And any limited resource must be used frugally.
@wim_v12e Of course, I think we’re beginning to see that as the chip shortages continue, along with other companies trying to recycle and reuse the materials to make the next product. I suppose a good start would be warning people to not upgrade their device every year, hardware-wise, and pushing to make more native apps over stuff like Electron apps. Over time, we should try to tackle bigger things such as making games (especially triple-A titles) more energy efficient and getting companies to support their devices for much longer. I anticipate the latter will be a bit easier for Apple since they tend to support devices for quite a while, but we should push for other companies to go beyond one or two years of support. Also continuing to support projects like Ubuntu Touch can help to bring new life to older devices, granted that they are still supported (or make a port to make them supported again).
@marquiskurt I think so too. Change is already happening, but it is too slow right now. What matters most is getting the message out to as many people as possible, so that a change in consumer attitude leads to more pressure on companies, and so that developers see the importance of reducing emissions and find ways to do it. On the whole I am still positive.