So the mention of DOOM 1 vs. webpage sizes reminded me that, until not that long ago, all of my software had to fit in under 16MB (megabytes!) of RAM, and it didn't have much trouble doing so.
We could fit a fully functional app in a couple of kilobytes (!!) on PalmOS, but Android apps take dozens of megabytes like it's nothing.
I get that bigger resolutions mean bigger art assets (unless you use vector art, and you probably should), but with the exception of games and media... why the heck is everything so heavy nowadays?
As a developer, I have limited time (and incentive). I could implement something in, say, 100 lines of code with four hours of effort, or just import a 1MB library from which I might call one or two functions. Importing will increase the size of the package delivered to the end user, though some packaging-time optimizations can trim that.
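A toy sketch of that trade-off (names hypothetical): often the one function you'd actually call from the big library fits in a few stdlib-only lines.

```python
# Toy example: instead of importing a heavyweight text-processing
# dependency just to normalize whitespace, the single function we
# need is a few lines over the standard library.
import re

def squeeze_spaces(text: str) -> str:
    """Collapse runs of whitespace into single spaces and trim the ends."""
    return re.sub(r"\s+", " ", text).strip()

print(squeeze_spaces("  hello   world \n"))  # hello world
```

Whether four hours of writing and maintaining that beats one `pip install` is exactly the incentive question.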
In some scripting languages like Ruby or Python, you install libraries at the system level. Combined with a good package manager (e.g. apt on Debian), this keeps your software small on disk, since the libraries are shared with other programs.
It doesn't make much difference to size at runtime, though.
It's mostly a developer-incentive issue, more than a framework or OS one.
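To make the system-level sharing concrete, a small Python sketch: every program run by the same interpreter resolves `import json` to one shared location on disk rather than bundling its own copy.

```python
# Sketch: imports are resolved against a machine-wide search path, so a
# library installed at the system level is stored once and shared by
# every program that uses it.
import importlib.util
import sys

spec = importlib.util.find_spec("json")
print(spec.origin)    # one shared on-disk location for every program
print(sys.path[0:3])  # the search path those programs have in common
```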
@polychrome An aspect I found interesting was debugging hooks. As I've now seen across several projects, compiling without debugging hooks results in an executable of ~1MB, while compiling with them blows the executable up to ~100MB, all so the program can report what broke, when, and how. I'm not sure this is the full story and I'm far from understanding why, but it's a pattern that seems to be consistent.
@polychrome HARD AGREE
i do a lot of coding in 4KiB ROM / 128 bytes of RAM for the atari 2600
it CAN be done
the level of complete isolation between disciplines is dangerous now
i do know there was an enormous software problem in the 70s (the "software crisis"), and that software engineers were worth more, so capitalism probably plays into it
i think the way we measure "productivity" is also flawed. there's a difference between "value" and "number of products/components produced"
@polychrome Mostly due to graphical operating systems and all the guff that goes along with them (massive numbers of libraries linked in, etc.), and coding laziness.
When you *had* to make sure your code worked on a machine with limited resources, you made sure your code was tight. When faced with seemingly limitless resources, programmers got lazy.
@polychrome People tend to overlook the code their compiler produces, often forget their assembler, and sometimes don't even know what their linker is doing. Leaving everything to the #automation in your #IDE gives too much control to that environment.
It's like photography: unenlightened people think they're creating a properly balanced #photograph when the camera is on #FULL #Auto. All they've really done is make a #composition, usually a lousy one, like towering over the kid they shoot. Take control of your code!
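In that spirit of taking control, one cheap way to look at what your compiler actually produces (a Python sketch here, but the idea applies to any toolchain) is the stdlib `dis` module:

```python
# dis prints the bytecode the CPython compiler emits for a function,
# letting you inspect the instructions rather than trusting the
# toolchain blindly.
import dis

def add(a, b):
    return a + b

dis.dis(add)  # instruction names vary by CPython version (e.g. BINARY_ADD vs BINARY_OP)
```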
@polychrome Amen. A lot has to do with a kind of presentation spectacle that we have been trained to expect: fancy fades, transitions, spinners, videos, animated gimmicks, etc. The core functionality of most "apps" could be done with a boring HTML form or green-screen text application but that does not feed into our conception of ourselves as "high-tech". The result is that a huge amount of coding and hardware expense goes into maintaining this pretense of technological progress.
@polychrome I should note that the most dangerous win32/64 malicious code is still the programs written in assembly by those blackhats. Remember the one that gave your BIOS a checksum error, essentially bricking your motherboard? That happened in the 1990s, and the program was less than 4096 bytes in a .exe.