Do you think millennials who grew up with the early Internet and home computers will be as bad with future technology as boomers are with current technology?

jcrabapple@dmv.pub to Ask Lemmy@lemmy.world

My wife and I started talking about this after she had to help an old lady at the DMV figure out how to use her iPhone to scan a QR code. We're in our early 40s.


I'm an "Oregon Trail" older millennial, whose first computer was a Tandy 286 running DOS (and Tandy Deskmate GUI shell). My parents bought me the thing but had no idea how to use it themselves, so six?-year-old me had to figure it out entirely on my own. I had to understand things like IRQ conflicts, "high memory", the difference between CGA, EGA and VGA, and (to some extent) how to use the DOS command line just in order to get my games to work.

Without that formative experience, I would not have become the Linux-using, self-hosting software engineer I am today.

Frankly, I fear for the zoomers. I'm currently trying to figure out how to give my own kids at least a taste of a similar experience, because the last thing I want is for them to be slaves to whatever "easy"-but-exploitative technology the FAANGs of the world are constantly trying to shove down the throats of everybody who doesn't know better.

I remember playing in the low moby. I think our generation benefited a bit in our understanding because we had to deal with "bad" tech: finicky, difficult to use, stuff you had to understand because it made no effort to understand you. Playing with a Presario 486 back in the '90s, and every computer of ours after it, seems to have given me a certain knack that I offhandedly refer to as "speaking Machine". You develop an intuition for what a computer is likely trying to do, which individual steps it would need to complete in order to do that, and which of those steps fucked up, based on what it did instead of what it was trying to do.

As tech got better, the level of understanding required to use it decreased. That's good overall, but it means fewer low-level hackers, and it means that even people who build things on computers don't fundamentally understand how they do what they do, because it's all abstracted away. I'm a Java coder now, but I'm also the only coder I know who has ever worked in assembly. Outside specialized embedded hardware applications, why would you bother?
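To put a concrete face on "speaking Machine", here's a toy sketch: a completely ordinary Java loop, followed by roughly what `javap -c` shows for it (lightly abridged). Nothing here is exotic, and that's the point: most working Java devs never look at the bottom half even once.

```java
// An utterly ordinary Java loop.
public class SumDemo {
    public static void main(String[] args) {
        int sum = 0;
        for (int i = 0; i < 10; i++) {
            sum += i;
        }
        System.out.println(sum); // prints 45
    }
}

// And (lightly abridged) what `javap -c SumDemo` shows for main():
//
//    0: iconst_0             // sum = 0
//    1: istore_1
//    2: iconst_0             // i = 0
//    3: istore_2
//    4: iload_2              // top of loop: is i < 10?
//    5: bipush        10
//    7: if_icmpge     20     // no -> jump past the loop
//   10: iload_1              // sum = sum + i
//   11: iload_2
//   12: iadd
//   13: istore_1
//   14: iinc          2, 1   // i++
//   17: goto          4      // back to the test
//   20: getstatic ...        // System.out, then the println call
```

"Speaking Machine" is basically being able to guess the bottom half from the top half, even when the machine in question is a misbehaving 486, and debugging is working out which numbered step went sideways.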

> I'm a Java coder now, but I'm also the only coder I know who has ever worked in assembly. Outside specialized embedded hardware applications, why would you bother?

What's really crazy is that even $1 microcontrollers are faster than that 486 these days, and I'm pretty sure some of them are capable of running Java themselves. I was going to say something like "you have to go really small, e.g. ATtiny, to find something you need to use C on," but nope, even those can (sort of) run Java!

Not to mention the fact that you can run an entire desktop OS much more sophisticated than Windows 95 was on a $5 Raspberry Pi Zero...

Memba when we named our loop variables shit like "i" to save space in the symbol table? Uphill, both ways, in 3 feet of snow?