[Return] [Catalog]

1 guest@cc 1969-12-31T17:00:00
File: Macintosh-Plus.jpg (JPEG, 81.3 KB, 1000x892)
Does making software for weaker machines ever pay off?
I feel like modern software is bloated, but on the other hand, if I made lightweight software, very few people would appreciate it.
»
2 guest@cc 1969-12-31T17:00:00
market = pay off
less market > less pay
less market more likely looking at real established professionals for stuff
»
3 guest@cc 1969-12-31T17:00:00
That depends on how far you're willing to go back. Software intended for weaker machines runs even faster on modern machines. Don't be lazy.
»
4 guest@cc 1969-12-31T17:00:00
Actually, I think a great deal of people would appreciate it. The majority of Windows machines worldwide still run Windows XP, Vista, or 7 because hardware limitations prevent them from moving to 8.x or 10, so there is very much a market for light software. There's a similar pattern in the mobile market, with millions of people preferring the "Lite" version of an app for a certain social networking website spanning the entire globe.

tl;dr just make software for the developing world.
»
5 guest@cc 1969-12-31T17:00:00
Lightweight programming and optimization truly are becoming a lost art.
Just take a look at Android: it needs at least 2GB of RAM to function properly and not kill applications all the time from running out of memory, while back in the day Windows XP could run on only 512MB of RAM, no problem.
Same goes for the web: 'modern' websites need mountains of javascript to 'function', making browsers hog memory like a madman, yet actual functionality has only decreased.

The abundance of RAM and CPU power has made developers lazy, has taken away the need to program efficiently and optimize, and has thereby irreversibly changed everything for the worse.
»
6 guest@cc 1969-12-31T17:00:00
>>5
Abundance of memory and speed has not made developers lazy. Laziness isn't even a bad thing. The problem is developers who don't care about what they make.
A lazy developer can still use good algorithms.
»
7 guest@cc 1969-12-31T17:00:00
>hardware limitations preventing...
>8.x and 10

>>4 you're joking right, my dude?
»
8 guest@cc 1969-12-31T17:00:00
>>7
8.x and 10 run like garbage on old hardware. That much is known. There was a myth circulating when Windows 10 came out that it was lighter than 7 and would run fine on old hardware, but that was just that - a myth.
»
9 guest@cc 1969-12-31T17:00:00
it's kind of roundabout, but I'd meant to imply that people concerned with their technology wouldn't upgrade
I'm kind of stupid, sorry
»
10 guest@cc 1969-12-31T17:00:00
The true loss from bloated software is the loss of simplicity, and therefore understandability, I think.
Not being bloated isn't necessarily the same as being simple, though, but being bloated is certainly not being simple.
»
11 guest@cc 2019-02-06T08:42:52
just give me back my netscape internet
»
12 guest@cc 2019-02-12T09:51:08
File: xp-tan.jpg (JPEG, 71.24 KB, 750x563)
If you still support Windows XP, you're a pretty cool guy.
»
13 guest@cc 2019-02-18T01:57:37
nice nipples
»
14 guest@cc 2019-02-21T14:02:13
>>13
I think they are pretty nice.
»
15 guest@cc 2019-04-15T07:38:18
Not entirely on topic, but a big part of why newer software is slower and requires more memory is the increased memory bandwidth demanded by higher resolutions, frame rates, and color palettes. Even with dedicated hardware and VRAM, video is a glutton for resources. A 1984 Macintosh was way less powerful than a modern machine, but it also only needed to draw one bit per pixel, and had a fraction of the pixels to draw.
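The bit-depth arithmetic here is easy to sketch. A minimal back-of-the-envelope calculation, using the original Macintosh's 512x342 one-bit display and a typical modern 1920x1080 32-bit desktop as assumed reference points:

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Size in bytes of one full frame at the given bit depth."""
    return width * height * bits_per_pixel // 8

# 1984 Macintosh: 512x342 pixels, 1 bit per pixel
mac_1984 = framebuffer_bytes(512, 342, 1)
# Typical modern desktop: 1920x1080 pixels, 32 bits per pixel
modern = framebuffer_bytes(1920, 1080, 32)

print(mac_1984)            # 21888  (~21 KB per frame)
print(modern)              # 8294400  (~8 MB per frame)
print(modern // mac_1984)  # 378  (~380x more data per frame)
```

That factor ignores frame rate and compositing entirely, so the real bandwidth gap is even wider, but it gives a feel for why video alone eats so many resources.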

Hardware's also way less specific now. The PC-compatible standard eventually expanded into the Wintel standard, and that then gradually became even broader. A PC used to be a very particular set of hardware that you could write more or less directly to, and early software was often written directly in assembly in order to make optimal use of resources. That's not really possible anymore, since PC basically just means "hardware that can run Windows and/or Linux and is mostly compatible with hardware for other similar systems". Instead, you're writing to high-level APIs like OpenGL. This means you can't optimize your code as well, since you don't know everything about the platform it's going to be running on.

Basically, my point is that the increased quality and quantity of hardware makes software worse by default.
»
16 guest@cc 2019-04-28T00:50:31
>>10
Yes!
People like the GNU guys forget that the foundational stone at the core of all this is understandability of your system. Everything can be fully open and libre, but if it can't be understood by the average user (where a user isn't a mere luser), there's no point. The system as a whole, and ideally the individual programs, should be understandable. After all, the user is the master of the system, not vice versa. This reminds me of Alan Kay and the whole idea of computers as a **tool** to aid the human mind vs. computers as things that think for the human: https://www.fastcompany.com/40435064/what-alan-kay-thinks-about-the-iphone-and-technology-now
Overall the whole trend is devastatingly against such a perspective. Of course, it's much easier to just bloat your software with unneeded dependencies and such, and it does make economic sense: people will end up buying into it more readily than into software whose model of the user is a thinking being. After all, there are more stupid people than smart ones, and overall they tend to buy into useless stuff more easily. So the system itself rewards shitty software. Worst of all, nobody is to blame for this. The whole thing is generated organically; it's just a convergence of interests.
Those of us who value software that doesn't insult your intelligence will have to keep to our cliques and use the non-shit software made by our people.
»
17 guest@cc 2019-04-28T18:47:49
>>16
>No one is to blame for this ... [it's] being generated organically

You just described it as a consequence of tech firms pursuing the profit motive.
> After all, there are more stupid people than smart,

This isn't really the answer.
Using any system requires learning it first, since all computer systems are arbitrary inventions that do not reflect reality closely in function or interface.
Once one has learned some system, learning any other will seem onerous, not worth doing. Think of learning a second language after you already know your first one.
People learn systems that act like 'sophisticated TV' (as Alan Kay put it) first, because phone companies, Microsoft, and Apple market their products.
Additionally, all existing, viable software systems that serve as tools (or, more accurately, toolboxes) that I know of have a few features that give them a higher initial learning curve. The systems we're maligning here can be used in a rudimentary capacity by knowing only a few things about the system: how to use a mouse and press buttons on a keyboard. From there, someone whose brain isn't totally calcified can learn more by fucking around and seeing which buttons and knobs do what, up to a (fairly low) ceiling of skill. And in all that time, they're able to make rudimentary use of the system.
All tool-like systems I know of, on the other hand, are more opaque at lower levels of proficiency, and require more particular skills to be discovered in the same way. Namely, they all emphasize a language-based interface, which makes using the system with insufficient knowledge of its grammar and vocabulary like trying to speak a foreign tongue. And learning the system requires reading documentation, which is a skill that has to be learned itself. Ideally most tools should be so trivial and straightforward that this is no trouble at all, but that's not realistic for /all/ useful tools in one's toolbox (though we should try to use the fizzbuzz-tier program wherever possible for this reason). If someone is unused to reading text to learn new things, they'll have trouble learning any of these systems. That doesn't necessarily mean they're stupid, just that they haven't had to in the past.
If we want to live in a world where everyone uses sensible computer systems that empower them, rather than just enable them to do whatever the tech-clergy imagined/decided they want to, it has to be something that's taught in school like reading and writing.
Reading and writing are hard to learn, but we collectively agree that it's worthwhile to teach everyone these skills, that a society in which everyone can read and write is better than one that's mostly illiterate. We'd have to make the same kind of agreement: that a society in which everyone can do a good portion of their own computing is better than one in which they can't.
»
18 guest@cc 2019-04-29T16:47:03
I wish we lived in a world where Lisp machines took off.
»
19 guest@cc 2019-05-01T19:49:02
>>18
GuixSD might make that a reality, though. At least, it's about as close to a Lisp/Scheme machine as it gets, if I'm not mistaken.
It uses Shepherd as its init system which is apparently super "hackable" and written in Guile Scheme. It uses Guix as the package manager which is also written in Guile.

The whole GNU community seems really into scheme and lisp, which I guess isn't surprising considering Emacs uses lisp.
There's a lot of talk on the guix mailing list about getting Hurd running on it, which uses "C and LISP as system languages". I'm not entirely sure what that means, but it almost sounds like at that point you'd have a (micro)kernel, init system, and package manager all hackable through lisp or scheme.
And I also sometimes see discussion about other tools or system components possibly getting scheme alternatives. Like a scheme shell, which I think would be awesome.
»
20 guest@cc 2019-05-01T21:36:03
>>19
UNIX with as much lisp as possible thrown in is not a lisp OS, any more than windows with cygwin is UNIX or any OS running a web browser is a javascript system.
»
21 guest@cc 2019-05-01T22:01:54
>>20
Then I wonder if there are any somewhat modern kernels out there which are written in lisp
»
22 guest@cc 2019-05-01T23:09:43
>>21
>Then I wonder if there are any somewhat modern kernels out there which are written in lisp

No, the problem with that first suggestion isn't that the linux kernel isn't written in lisp. A UNIX kernel written in lisp wouldn't be a 'lisp OS' in the sense that anon was pining for either.
Or else, if you weren't implying UNIX, the question is meaningless, like asking about an engine without a car while the topic is the car.
»
23 guest@cc 2019-05-02T01:38:32
Might as well try using emacs as an OS.
»
24 guest@cc 2019-05-11T11:20:16
>>21
https://github.com/froggey/Mezzano
