Redkey

@Redkey@programming.dev
0 Posts – 119 Comments
Joined 1 year ago

I once had a manager hand me a project brief and ask me how quickly I thought I could complete it. I was managing my own workload (it was a bad situation), but it was a very small project and I felt that I had time to put everything else on hold and focus on it. So, I said that I might be able to get it done in four days, but I wouldn't commit to less than a week just to be sure.

The manager launched into this half-threatening, half-disappointed rant about how the project had a deadline set in stone (in four days' time), and how the head of the company had committed to it in public (which in hindsight was absolute rot). I was young and nervous, but fortunately for me every project brief had a timeline of who had seen it and, more importantly, when they had received it. I noticed that this brief had originated over three months prior and had been sitting on this manager's desk for almost a month. I was the first developer in the chain. That gave me the guts to say that my estimate was firm, and that if anyone actually came down the ladder looking for heads to set rolling (one of the manager's threats), they could come to me and I would explain.

In the end nothing ever came of it because I managed to get the job done in three days. They tried to put the screws to me over that small of a project.


This is a short, interesting video, but there's really nothing here for any competent programmer, even a fresh graduate. It turns out that they update the software by sending the update over radio (/s). The video hardly goes any deeper than that, and it also makes a couple of very minor, layman-level flubs.

There is a preservation effort for the old NASA computing hardware from the missions in the 50s and 60s, and you can find videos about it on YouTube. They go into much more detail without requiring much prior knowledge about specific technologies from the period. Here's one I watched recently about the ROM and RAM used in some Apollo missions: https://youtu.be/hckwxq8rnr0?si=EKiLO-ZpQnJa-TQn

One thing that struck me about the video was how the writers expressed surprise that the hardware was still working and so adaptable. And my thought was, "Well, yeah, it was designed by people who knew what they were doing, with a good budget, led by managers whose goal was to make excellent equipment rather than maximize short-term profits."


Yeah, I'm sure that almost all of us have felt this way at one time or another. But the thing is, every team behind every moronic, bone-headed interface "update" that you've ever hated also sees themselves in the programmer's position in this meme.

"If you wish to be a writer, write."

Epictetus delivered this burn over 1900 years ago.


It's a persistent dynamic memory allocation that's accessed by multiple processes! :)


Thr34dN3cr0 wrote (14:12 5/17/2019):

Does anyone have a way to fix this in the latest version? I've been looking all day but none of the answers I've found work.

Thr34dN3cr0 wrote (14:48 5/17/2019):

nvm figured it out.

Re: the Acceptance stage.

Years ago I worked at a family-run business with a good working environment. The staff were once told a story of how, earlier in the company's history, a manager made a mistake that caused the company a substantial monetary loss.

The manager immediately offered their resignation, but the owner said to them, "Why would I let you go now? I've just spent all this money so you could learn a valuable lesson!"

So yeah, generally, most managers' reaction to accidentally deleting vital data from production is going to be to fire the developer as a knee-jerk "retaliation", but if you think about it, the best response is to keep that developer; your data isn't coming back either way, but this developer has just learned to be a lot more careful in the future. Why would you send them to a potential competitor?

There's nothing wrong with wanting to stick to original hardware, if you already have it or can afford to buy it.

Setting up a Pi or other single-board system as a dedicated retro game emulator is also an absolutely valid choice IMO. It's a fun, generally affordable little project that you can tinker with forever, e.g. changing cases and controllers, UI tweaks, ROM file organization, per-game settings optimization. But I don't think that it's ever been the "best" emulation option for anyone who didn't already have their heart set on "doing something fun and interesting with a Pi".

The smartphone you already have, dedicated retro gaming handhelds, Android TV boxes or sticks, and cheap/secondhand/already-owned PCs (desktop, notebook, or kiosk) all arguably match or exceed the performance and value-for-money of any Pi-based system.

Yet in any thread where someone new to emulation is asking for advice, there's always a flock of folks who suggest getting a Pi like it's the only game in town. It honestly baffles me a little, especially because almost all of them are just running a pretty frontend over RetroArch, and RetroArch is available for virtually every modern consumer computing platform (and so are a lot of pretty frontends, if that's a selling point).

For context, I've got a dozen or so retro systems, but I prefer to emulate as much as possible.

Let me know if you find one that uses AI to find groupings of my search terms in its catalogues instead of using AI to reduce my search to the nearest common searches made by others, over some arbitrary popularity threshold.

Theoretical search: "slip banana peel 1980s comedy movie"
Expected results in 2010: Pages about people slipping on banana peels, mostly in comedy movies, mostly from the 80s.
Expected results in 2024: More than I ever wanted to know about buying bananas online, the health impacts of eating too many or not enough bananas, and whatever "celebrities" have recently said something about them. Nothing about movies from the 80s.


People are writing a lot of things that I agree with, but I want to chime in with two points.

The first, which one or two other commenters have touched on, is that in 2024 we have approximately 50 years of content already in existence. There's no need to limit ourselves to what's been released in the last 12 months. Classic books, music, plays, and movies stay popular for decades or centuries. Why feel shamed out of playing old games by 12-year-olds and the megacorps?

The second thing is, yes, try indie games, and IMO the best place to find them is for PCs on itch.io. Forget 95% of what's marketed as "indie" on consoles.

This isn't a slight against you, OP, or this game, but I'm just suddenly struck by the way that "aside from the first few hours," or more commonly, "it gets better a couple of hours in," has become a fairly common and even somewhat acceptable thing to say as part of a game recommendation.

As I get older I'm finding that I actually want my games to have a length more akin to a movie or miniseries. If a game hasn't shown me something worthwhile within an hour or so, I'm probably quitting it and never coming back.


Problem is, you could pirate every single game on dreamcast. Just get a legit copy of the game (renting, buying and returning, borrow from a friend), and have a CD burner.

Then you could make a 1:1 copy of the game in roughly an hour.

You make it sound trivial. While Sega left a security hole open for games to be loaded from a regular CD, the official games were released on GD-ROMs: a dual-density disc format with a 1.2 GB capacity.

So first off, you couldn't read them completely in a regular CD-ROM or even DVD-ROM drive. (I'm not counting the "swap" method because it's failure-prone and involves partially dismantling the drive and fiddling with it during operation.) You had to connect your console to a computer and use some custom software to read the GD-ROM on the console, and send the data over.

Once you had the data, you then had the problem of trying to fit a potentially 1.2 GB GD-ROM image onto a regular CD-ROM. A handful of games were actually small enough to fit already, and 80-minute and 99-minute CD-Rs would work in the DC and could store larger games. But for many games, crackers had to modify the game files to make them fit.

Often they would just strip all the music first, because that was an easy way to save a decent amount of space. Then if that wasn't enough, they would start stripping video files, and/or re-encoding audio and textures at lower fidelity.

Burning a CD-R from a downloaded file was easy, but ripping the original discs and converting them to a burnable image generally was not.

I reserve further comments until I know whether you posted this in this community: a) deliberately but seriously, b) deliberately and sarcastically, or c) by accident.


Whatever OOP may have become in later years, Alan Kay, who is often called "the father of object-oriented programming", originally outlined message passing as the main concept he was driving at.

He also says that he probably misnamed it.

Here's a discussion in which the man himself makes a (small) appearance: https://softwareengineering.stackexchange.com/questions/46592/so-what-did-alan-kay-really-mean-by-the-term-object-oriented

Many special editions of the Dreamcast were released over the years, in a variety of different colours. However, every special edition Dreamcast that I've seen has had some other visual change besides just the colour.

Looking at the pictures, I suspect that your console has just been put into a replacement aftermarket shell. However, if the bottom half of the console is solid grey, and there's a pale yellow limited edition number sticker on the back, then you have a quite rare Code: Veronica (Claire version) console. I'm guessing that if the sticker was there, though, you wouldn't be here asking about it, as it's a real giveaway.

That being said, it still looks cool, but I don't think it's going to command an especially high price or anything. The controllers were always available in a wide range of colours, although it's funny that the beige controller on the right has the blue European swirl. I guess someone got a deal.


As a half-joking response to this half-joking admission, I got started with the Usborne programming books as a kid, and they laid some excellent foundations for my later study. They're all available online for free these days, so grab an emulator and user manual for your 80s 8-bit home computer of choice, and dive in!

When I go to that URL on a stock, direct FF install, I still see that notice.


Some of the things you mentioned seem to belong more properly in the development environment (e.g. the code editor), and there are plenty of those that offer all kinds of customization and extensibility. Some other things are kind of core to the language, and you'd really be better off switching languages than trying to shoehorn something in where it doesn't fit.

As for the rest, GCC (and most C/C++ compilers) generates intermediate files at each of the steps that you mentioned. You can also have it perform those steps individually. So, if you wanted to perform some extra processing at any point, you could create your own program to do so by working with those intermediate files, and automate the whole thing with a makefile.

You could be on to something here, but few people seem to take advantage of the possibilities that already exist, and most newer languages/compilers deliberately remove these intermediate steps. Taken together, that suggests to me that whatever problems this situation causes may already have other solutions.

I don't know much about them myself, but have you read about the LLVM toolchain or compiler-compilers like yacc? If you haven't, it might answer some questions.


It really depends on your expectations. Once you clarified that you meant parity with current consoles, I understood why you wrote what you did.

I'm almost the exact opposite of the PC princesses who can say with a straight face that running a new AAA release at anything less than high settings at 4K/120fps is "unplayable". I stopped watching/reading a lot of PC gaming content online because it kept making me feel bad about my system even though I'm very happy with its performance.

Like a lot of patient gamers, I'm also an older gamer, and I grew up with NES, C64, and ancient DOS games. I'm satisfied with medium settings at 1080/60fps, and anything more is gravy to me. I don't even own a 4K display. I'm happy to play on low settings at 720/30fps if the actual game is good. The parts in my system range from 13 to 5 years old, much of it bought secondhand.

The advantage of this compared to a console is that I can still try to run any PC game on my system, and I might be satisfied with the result; no-one can play a PS5 game on a PS3.

Starfield is the first released game that (judging by online performance videos) I consider probably not worth trying to play on my setup. It'll run, but the performance will be miserable. If I were really keen to play it I might try to put up with it, but fortunately I'm not.

You could build a similar system to mine from secondhand parts for dirt cheap (under US$300, possibly even under US$200) although these days the price/performance sweet spot would be a few years newer.


I love low-level stuff and this still took me a little while to break down, so I'd like to share some notes on the author's code snippet that might help someone else.

The function morse_decode is meant to be called iteratively by another routine, once per Morse "character" c (dot, dash, or null) in a stream, while feeding its own output back into it as state. As long as the function returns a negative value, that value represents the next state of the machine, and the Morse stream hasn't yet been resolved into an output symbol. When the return value is positive, it represents the decoded letter, and the next call to morse_decode should use a state of 0. If the return value is 0, something has gone wrong with the decoding.

state is just a negated index into the array t, which is actually two arrays squeezed into one. The first 64 bytes are a binary heap of bytes in the format nnnnnnlr, each corresponding to one node in the Morse code trie. l and r are single bits that represent the existence of a left or right child of the current node (i.e. reading a dot or dash in the current state leading to another valid state). nnnnnn is a 6-bit value that, when shifted appropriately and added to 63, becomes an index into the second part of the array, which is a list of UTF-8/ASCII codes for letters and numbers for the final output.

That was my first take as well, coming back to C++ in recent years after a long hiatus. But once I really got into it I realized that those pointer types still exist (conceptually) in C, but they're undeclared and mostly unmanaged by the compiler. The little bit of automagic management that does happen is hidden from the programmer.

I feel like most of the complex overhead in modern C++ is actually just explaining in extra detail about what you think is happening. Where a C compiler would make your code work in any way possible, which may or may not be what you intended, a C++ compiler will kick out errors and let you know where you got it wrong. I think it may be a bit like JavaScript vs TypeScript: the issues were always there, we just introduced mechanisms to point them out.

You're also mostly free to use those C-style pointers in C++. It's just generally considered bad practice.

Assuming C/C++, dare we even ask what this teacher uses instead of switch statements? Or are her switch statements unreadable rat's nests of extra conditions?

This is a good life lesson. We're all idiots about certain things. Your teacher, me, and even you. It's even possible to be a recognized expert in a field yet still be an idiot about some particular thing in that field.

Just because some people use a screwdriver as a hammer and risk injuring themselves and damaging their work, that's not a good reason to insist that no-one should ever use a screwdriver under any circumstances, is it?

Use break statements when they're appropriate. Don't use them when they're not. Learn the difference from code that many other people recommend, like popular open-source libraries and tutorials. If there's a preponderance of break statements in your code, you may be using a suboptimal approach.

But unfortunately, for this course, your best bet is to nod, smile, and not use any break statements. Look at it as a personal learning experience; by forcing yourself to sit down and reason out how you can do something without break statements, you might find some situations where they weren't actually the best solution. And when you can honestly look back and say that the solution with break statements is objectively better, you'll be able to use that approach with greater confidence in the future.


I am currently working on a game for the Atari 2600, and you just gave a good outline of my code. And I love it.

Did you read all the way to the end of the article? I did.

At the very bottom of the piece, I found that the author had already expressed what I wanted to say quite well:

In my humble opinion, here’s the key takeaway: just write your own fucking constructors! You see all that nonsense? Almost completely avoidable if you had just written your own fucking constructors. Don’t let the compiler figure it out for you. You’re the one in control here.

The joke here isn't C++. The joke is people who expect C++ to be as warm, fuzzy, and forgiving as JavaScript.

I'd argue that you do need to be good at math to be an effective programmer, it's just that that doesn't mean what a lot of people think it means. You don't need to know all the ins and outs of quadratics, integrals, and advanced trigonometry, but I think you do need to have a really solid, gut-level understanding of basic algebra and a bit of set theory. If you're the sort of person whose head starts to swim when you see "y=3x+2", you're going to find programming difficult at best.

You're quite right! I didn't notice that.


Back in the olden days, when we used kerosene-powered computers and it took a three day round trip to get IP packets via the local stagecoach mail delivery, we still had games even though Steam didn't exist yet. :b

We used to transfer software on these things called disks. Some of them were magnetic, and some of them used lasers (you could tell them apart because for the laser ones it was usually spelled "disc" with a "c").

Anyway, those dis(k/c)s mostly still work, and we still have working drives that can read them, and because the brilliant idea of making software contact the publisher to ask if it was OK to run had only just been invented, we can generally still play games from the period that way. Some people kept their old games, but others sell them secondhand, which I believe the publishers still haven't managed to lobby successfully to be made illegal, unless I missed a news report.

Even if you can't get the original physical media for a game, sites like GOG sell legal digital downloads of many old games, which are almost always just the actual old software wrapped in a compatibility layer of some kind that is easy to remove, so you can usually get the games running natively on period hardware/software. Finally, some nicer developers and publishers have officially declared some of their old games as free for everyone to play.

There are still legal options for playing old games on old systems.


Similar concepts (i.e. connect to random strangers' devices when in close physical proximity, and trade mini profiles/trading tokens/whatever) have been tried at least half a dozen times, both before and after Nintendo, but somehow they never seem to stick. StreetPass may have been the most successful iteration that I'm aware of. I think it's hard to reach critical mass: users are excited at first when they set things up, but after a few days or weeks of not getting any hits, they tend to lose interest and turn the service off to save battery life.

I wonder when he last sat down and really played a game for an afternoon. Now that everything is plugged in and ready to go at the push of a few (spreadsheet-tracked) buttons, he has finally overcome all the difficulty of switching consoles and can now play through all the games he's been wanting to play. Right? Right?

One thing that wasn't mentioned in the article is default settings. In so many CLI programs (and FOSS in general), there seems to be some kind of allergy to default settings. I don't know whether it's a fear of doing the wrong thing, or a failure to sympathize with users who haven't spent the last three months up to their elbows in whatever the program does. But so often I've come to a new program, and if I've managed to stick with it, come to the realization later that at least half of the settings I needed to research for hours and enter manually every time could have been set to static defaults or easily-derived assumptions based on other settings in 99% of cases. Absolutely let your users override any and all defaults, but please use defaults.

I'd also be interested in the overlap between people saying, "LOL just get gud" about using the command line, and people who are terrified of using C++.

I played this on the PS2 and it's a fantastic experience.

Interestingly, the PAL version (and probably the Japanese version, too) has content that wasn't in the NA version. There's an extra puzzle, a semi-hidden alternative "funny/happy" coda after the main ending if you play through a second time, and some extra in-game options that are unlocked after you finish the game for the first time, including understandable subtitles for ALL characters, even ones that are normally speaking an unknown language. I'm not sure if the hidden weapon you can get in the middle of the game becomes a light saber on the second playthrough in the NA version as it does in the PAL version, but it may.

This was before video streaming sites, so there were many arguments on forums about how these things are in the game, no they aren't you trolls, yes they are here's a picture, that's obviously fake... and so on. It was interesting that once people figured out that the NA and PAL versions were different, there was a vocal core of NA players still insisting that it was all fake for quite a long time afterward.

The definition of the Date object explicitly states that any attempt to set the internal timestamp to a value outside of the maximum range must result in it being set to "NaN". If there's an implementation out there that doesn't do that, then the issue is with that implementation, not the standard.

This game is criminally unknown even in Japan. I first learned about it years ago and wish I had paid the high price to buy a copy then; now it sells for 10 times as much, and that's if you can even find a copy. It's the only game missing from my physical collection that I know I want.

I've tried a couple of times to get into it over the years, but the language barrier was always too high for me and broke immersion. Hopefully I and others can now give this game the attention that I think it deserves.

Edit: A link to the patch: https://www.romhacking.net/translations/7187/

There was already a Spanish patch a few years ago, as well.

Edit 2: And now it's been flagged as "noncompliant". I'm not sure what that means, but it's not available from that site for the moment.

Unfortunately I think that Sega themselves weren't the only group lacking confidence in the Dreamcast. In fact, I feel like they put up a valiant fight, with marketing and first-party titles.

Critics and consumers all had an extremely "wait and see" attitude that I think took the theoretical advantage of the incredibly early launch and turned it into a huge liability. People didn't want to commit to buying their next console without seeing what the other offers were going to be. So Sega had to work hard for about two years to keep the real and actually available Dreamcast positioned high in the market while their competitors had the luxury of showing jaw-dropping demos of "potential" hardware (i.e. "Here is some video produced on $50,000 graphics workstation hardware that is made by the same company that's currently in talks to produce our GPU.")

Third-party publishers also didn't want to put any serious budget toward producing games for the Dreamcast, because they didn't want to gamble real money on the install base increasing. This resulted in several low-effort PS1 ports that made very little use of the Dreamcast hardware, which in turn lowered consumer opinion of the console. When some of these games were later ported to PS2 as "upgraded" or "enhanced" versions, that only further entrenched the poor image of the Dreamcast.

I have owned all four major consoles of that generation since they were still having new games published for them. And if I had to choose only one console to keep from that group, it'd be the PlayStation 2, because of the game library. It's huge and varied. I have literally hundreds of games for it, while I only have a few dozen games for the others. But looking at the average quality of the graphics and sound in the games for those systems, I'd also rank the PS2 in last place, even behind the DC.

Sony was a massive juggernaut in the console gaming market at the time. The PlayStation 1 had taken the worldwide market by storm and become the de facto standard console. It's easy to forget that the console launches for this generation were unusually spaced out over a four-year period, and Sony was the company best positioned to turn that to their favour. People weren't going to buy a DC without seeing the PS2, but once they did, many were happy to buy a PS2 without waiting for Nintendo or Microsoft to release their consoles. The added ability to play DVDs at exactly the time when that market was hitting its stride (and more affordably than many dedicated DVD players) absolutely boosted their sales in a big way. Nintendo's GameCube didn't do that, and by the time the original Xbox came to market, it wasn't nearly as much of a consideration.

If you want to emulate those systems, then yes, you're going to need a fairly beefy computer. You could, as others have suggested, buy a good secondhand system and upgrade it with a GPU and more/better RAM.

But I want to pass on a warning as someone who also loves emulation and wishes they could "have everything in one place": a lot of emulators just aren't there yet, but some people are eager to kid themselves and others that they are.

16-bit systems and before typically have outstanding emulators available. Some systems from the next couple of generations are also very reliable (e.g. PS1, Dreamcast), while others mostly work well with minimal tinkering and only a small handful of exceptions (e.g. N64, Saturn). But after those, the reliability of emulators drops off fairly smoothly. Even the venerable PCSX2, for example, will run almost every known PS2 game in some fashion, but many games outside the biggest hits still have problems that make them terrible. And I don't mean picky things like, "Three notes in the bassline on this background music are slightly off," I mean, "The walls aren't rendered in most areas."

I really recommend having a good look at the compatibility lists for emulators you're interested in before you dive too deep down this hole. It's one thing to have a powerful PC already and think, "Why not give it a go?", but another thing to build a new (to you) PC specifically for emulating these systems. I suspect that you may have been spoiled a bit by the fact that even the RP4 only has enough power to run those more stable emulators for older systems.

That XKCD reminds me of the case a year or three ago where some solo dev that no-one had ever heard of was maintaining a library that a couple of other very popular and major libraries depended on. Something somewhere broke for some reason, and normally this guy would've been all over it before most people even realized there had been a problem, but he was in hospital or jail or something, so dozens of huge projects that indirectly relied on his library came crashing down.

What upset me most was reading the community discussion. I didn't see a single person saying, "How can we make sure that some money gets to this guy and not just the more visible libraries that rely so heavily on his work?", even though the issue was obliquely raised in several places, but I did see quite a few saying, "How can we wrest this code out of this guy's hands against his will and make multiple other people maintain it (but not me, I'm too busy) so we don't have a single point of failure?"

  • GBA or GBA SP, you'll get a backlit screen as well (around the same budget?)

Careful! The original (landscape) GBA had a similar unlit, reflective screen to the Game Boy Colour, and even the GBA SP was frontlit for most of its run. Only the later GBA SPs had the bright, backlit screens.

I modded an original GBA with an aftermarket frontlighting kit back in the day. I didn't like the GBA SP as it made my hands start to cramp up after only a few minutes.

"If you were making food, would you use onion powder?"

I thought that this was going to be a play on the phrase, "Don't trust anyone over 30," but it's just a very short piece about Dunning-Kruger aimed specifically at some C++ concepts.
