developers having powerful development hardware is actively harmful to usable software existing
there is literally an order of magnitude of power difference between the machines many developers use to create their software and the machines that their users will have - and that's assuming the users have brand new (but average) hardware
today's developers write software pretending that there will always be infinite resources, because the computers we have *let* them do that without consequences
i tend to move all my compiling tasks to the cloud nowadays, not for performance but to be able to reproduce builds across teams.
@ky0ko this is a good thread about a serious issue that I had never considered before. Thanks for posting.
@confusedcharlot honestly even many 7 year old laptops still have features that let devs get away with this
a 7 year old laptop, if it is 64 bit, can still build modern compiler tools
an arm laptop manufactured today, if it is 32 bit, cannot
@ky0ko tbh yeah. I purposefully switched to a cheaper, less powerful Chromebook partially because of this (also because, so fucking portable and battery good!!!!) and it's cheaper! and waterproof and the screen is better!!! like wtf!!!
@ky0ko I remember John Romero saying how useful it is to have a more powerful machine to develop on (hence why #Doom and #Quake were developed on NeXT), but he also said that every Dev should have a minimum-spec consumer machine right next to their workstation, to make sure it still works even on the lowest hardware.
But instead, we have the current versions of #Microsoft Office that can't even run properly on a 2 GHz laptop with 8 GB of RAM. >:(
@ky0ko I was just looking at the "standard developer PC" at work... it's an 8-core Xeon, 32GB RAM, 500GB SSD, decent NVidia card... they haven't given me anything like that good, unfortunately, but then our web app is expected to run on cheap old Android tablets. I suppose I should thank them for keeping me focused on performance.
@ky0ko as a developer, i have enough problems 😑
there's plenty of tooling for simulating smaller screens, worse network connections, and limited CPU/GPU resources, but it's rare to see it used outside the games industry
also, anywhere with a remotely competent QA group is going to catch this stuff: testers rarely get issued hardware that matches dev hardware. sometimes they get old hardware on purpose, especially mobile.
the underlying problem is we as an industry barely QA anything now…
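(for the curious: a minimal sketch of the kind of "worse network connection" throttling that tooling does, using only the Python stdlib — the class name and the 16 KiB/s figure are made up for illustration, not from any real tool:)

```python
import io
import time

class ThrottledReader:
    """Wrap a file-like object and cap read throughput,
    roughly simulating a slow network link."""

    def __init__(self, raw, bytes_per_sec=16 * 1024):
        self.raw = raw
        self.bps = bytes_per_sec

    def read(self, n=-1):
        data = self.raw.read(n)
        # pay for the bytes at the simulated link speed
        time.sleep(len(data) / self.bps)
        return data

# pull 32 KiB through a fake 16 KiB/s link: takes about 2 seconds
payload = io.BytesIO(b"x" * (32 * 1024))
slow = ThrottledReader(payload, bytes_per_sec=16 * 1024)
start = time.monotonic()
while slow.read(4096):
    pass
print(f"32 KiB at 16 KiB/s took ~{time.monotonic() - start:.1f}s")
```

real tools (browser devtools network throttling, `tc netem` on Linux) do this at the socket or interface level, but the principle is the same: make the dev feel the bytes.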
@VyrCossont none of this comes close to helping when the issue is with the development tools themselves
@VyrCossont this is all too often the case in some form or another, whether it's compiler tools, middleware libraries, or something else
@ky0ko oh yeah there's room for improvement there, but that's a much higher bar than expecting software targeted at *end users* to be usable on what end users are using.
IDEs are fantastic, but they're heavy in a league far past the average runaway webapp. enabling stuff like whole-codebase symbol indexing and type checking and fast incremental updates to that state isn't exactly cheap; you're running the compiler basically all the time…
@ky0ko it's not rare for me to have to have Xcode, IntelliJ, the iOS simulator, Charles or Wireshark, and Docker with a few dozen VMs running simultaneously while working on a feature, on one laptop.
at my first job i had a similar level of tooling deployed, but it wouldn't run comfortably on one machine, so they gave me *four* Xeon workstations, and a separate antique box to check my mail…
@VyrCossont imo that indicates a serious problem with xcode, intellij, the ios simulator, charles, wireshark, and the software running in the docker vms, that needs to be addressed
@ky0ko okay. yeah. that solves everything. i could dig up a copy of CodeWarrior from the mid-90s and try to backport this iOS app to the Newton, but somehow i don't think our users are going to be very happy with the handwriting recognition 😁
@VyrCossont which reminds me i need to finish getting modern ssl working on classic mac os and working on that project i was going to work on...
@ky0ko plus an external departmental compile farm/object cache for build acceleration.
are there some tools that are absolute clunkers and that ignore basic good practices like caching and parallelized I/O? yeah, absolutely, and they should be named and shamed. (SBT. absolute winner.)
but in general human time is worth more than machine time when it comes to doing my actual job of shipping usable software on time, and i'll throw silicon at that problem any day.
@ky0ko and to add to that, modern apps, frameworks, and web-connected stuff don't scale with performance the way software used to. When software was a single binary that maybe made a couple of linear web requests, slowing down execution slowed down the user experience linearly. But now, with the hundreds of linked-in components and web backends that time out or error unpredictably if your machine isn't 'good enough', it behaves totally erratically.
@ky0ko I’ve long been saying that developers should write their software using the following hardware:
Windows: VIA C7-M ULV, Eden, or Eden ULV at 1.0 GHz, VX855 chipset, 1 GiB RAM, 4200 RPM hard drive. (You could go slower on the storage, but a 4200 RPM HDD is painful enough. Otherwise, everything’s the slowest hardware that ever existed that meets Windows 10 32-bit’s stated minimum requirements. And, fun fact: a lot of thin clients from the mid to late 2000s used almost exactly this hardware, except for the HDD.)
Linux: If x86 is needed, use the VIA box or slower. Otherwise, use a Raspberry Pi 1 or 0, with a cheap SD card.
macOS: Fuck, thanks to Apple’s cutoffs being so damn high… both a MacBookAir5,1 with the i5-3317U, 4 GiB RAM, and 64 GB SSD, and a Macmini6,1 with the base 4 GiB RAM and 500 GB 5400 RPM HDD? That gets you the slowest supported CPU and the slowest HDD, although in two different machines. And both of those are from 2012, and are honestly fast enough CPUs.
@ky0ko Agree and disagree.
Fast compile times and responsiveness like hot reloading are pretty invaluable when developing. That's a separate question from whether the environment the app actually runs in has its processors throttled to mimic users' machines.
On a related note for me: internet speed/latency and the assumption of unlimited data (like preloading pages so they're quicker IF I click on them, which is wasteful).
@toastal the people developing the development tools should be targeting those to be able to work usably on low end consumer hardware too.
development tools not being responsive enough or even not working / being able to build at all on low end hardware is part of what prompted me to make this post.
@ky0ko Hmm.. I see your point. It *does* hurt the accessibility of contributions, and usability. I think it might be a bit impractical on the lowest-end hardware in some cases--you can only make things compile so fast. It makes sense to demand *some* higher specs because it's a professional tool, but the bar can certainly be lowered. Expecting everything to run on a RaspPi doesn't work for all workloads. If anything, optimizing for low-end hardware would lower power consumption and increase battery life even on high-end machines.
@toastal @ky0ko I think part of the issue is that a lot of development tooling is made for software shops targeting niche industries with a high bankroll, or originated in the mainframe era, where everybody was plugged into one giant machine.
It's as much a cultural artifact as it is a design issue. There's a completely different ethos when you're part of a big org's R&D dept versus somebody who's trying to hack at open source code.
@toastal @ky0ko framing it as a cultural conflict highlights that different industries tend toward different development goals. Which I think is a separate (but interrelated) issue from the inherently costly aspects of development tooling.
We can start to frame the problem as a form of institutional gatekeeping, one that arose out of the historical context of how software dev has two different focuses: IBM mainframe vs. IBM PC.
Only gotten worse since then.
@toastal @ky0ko I say this because I'm investigating the raspberry pi thing as my formal tooling. There's this strange collaborative compute-sharing model that a lot of places would use, which allows individuals to scale compute-intensive problems. But we only really see it in things like render farms and other workloads easily amenable to parallelism. Part of this is a lack of interest in developing multi-threaded compilation, largely because there's no industry demand.
@ky0ko I do web dev. Constantly tested on low spec stuff. The problem on the web isn't always the app or page itself. Our shit ran 80% faster with ads and analytics blocked. The monetization industry fucking sucks
@ky0ko also: the more powerful hardware gets, the easier it is to break existing encryption algorithms. when new tougher encryption algorithms replace the old ones, older machines become less capable of functioning online and using https-based services.
@ky0ko i can't agree more.
To me Libre Software also has no ethical meaning if it can only be used by the richest 30% on the planet. It should run without lag or serious handicap on a single core at 1 GHz with 100 MB of RAM, or it shouldn't exist (I'm talking about regular, everyday software, not professionally oriented tools).
To me there is only a handful of web browsers complying with what I'd call "reasonable" software; #dillo is one of them and I can't help feeling sad that it's not updated anymore :(
@ky0ko Also it's an insult to ecology, and a prop for capitalism, to drive people to "buy a new PC" every now and then.
Using a Raspberry Pi with a single core at 0.7 GHz as my everyday laptop for a few months was an eye-opener for me.
In the sense that you can do 99% of what people do on the web with a crappy computer - but the vast majority of "well known software" isn't fit for it. Also the vast majority of websites become unwelcoming (like Mastodon or Diaspora) because of heavy JS usage.
@ky0ko Although there are some workarounds, like the bitlbee hack to get the most out of Mastodon and such. Or piping youtube-dl into a video player to get embedded videos.
What I get out of this is that internet access disparity is not a real problem. It's a socially engineered one; we made the web and software bloated. But even the oldest Intel chips can do 99% of what people do.
@ky0ko agreed. When I was a team lead on a Mac product I made sure I had the worst Mac our product was meant to run on, so I had to suffer what our users would. If it performed okay for me, it'd be more than good for most of our users.
@ky0ko partial disagree; having decent gear makes this behaviour easier but at the end of the day people who behave like this just aren't doing the work imo
performance/resource issues are defects and should be treated as such
@ky0ko That is what test machines are for. You kinda need somewhat powerful boxes for all the extra debugging overhead...
developers with powerful computers
I don't understand what you're advocating - that developers compile and test a big C program on a Raspberry Pi? Doesn't that seem agonizingly slow?
I know it's terrible when, "The program runs fine on a $400 CPU, SSD storage, and 32GB of RAM." ==> "Sell it to someone with a $35 CPU, spinning-platter storage, and 2GB of RAM mostly taken up by other programs."
But fast developer machines make the most sense to me.
developers with powerful computers
if they need something bigger to run the builds, and their product isn't the build tools themselves, they can get a central build server
if their product *is* the build tools, sucks for them, they should figure out how to make them faster
I hate to wait during development
@ky0ko but those are different things.
Development machines have to be as powerful as possible to be able to do proper work (imo).
QA should simply be done with resource-limited virtual machines (or if it's a separate department with slow(er) machines, preferably different ones).
Additionally trying to code as efficiently as possible and keeping all target devices in mind doesn't have anything to do with the hardware the developer is using.
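(a cheap way to get some of that resource-limited-VM effect without a VM at all: on Linux/Unix a process can cap its own memory with the stdlib `resource` module, so allocations fail the way they would on a low-RAM box. the 1 GiB cap here is just an illustrative number, not a real target spec:)

```python
import resource

# Cap this process's address space at 1 GiB (illustrative figure),
# so big allocations fail like they would on a low-RAM machine.
cap = 1 << 30  # bytes
resource.setrlimit(resource.RLIMIT_AS, (cap, cap))

try:
    buf = bytearray(2 << 30)  # try to grab 2 GiB, well over the cap
    print("allocation succeeded")
except MemoryError:
    print("MemoryError: the failure a low-end user's machine would hit")
```

containers and VMs do the same thing more thoroughly (CPU shares, I/O, the lot), but even this one-liner catches the "works on my 32GB box" class of bug early.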