Rule

Persona3Reload@lemmy.blahaj.zone to 196@lemmy.blahaj.zone – 585 points –

Implying human managers are held accountable.

Only when the basic grunt scapegoats fail to repel the call for accountability is a management-level unit sacrificed.

A computer can never find out. That is why a computer must never fuck around.

Canadian airline is complaining about finding out when their computer fucked around.

I disagree.

Machines are made by humans, and can be built responsibly.

Yeah, but even machines that are "built responsibly" (whatever that may mean) can make mistakes. Correction: they will make mistakes, because decision making isn't linear, and afaik computers are only good at linear tasks, such as calculations. And once they do, who should be held accountable? The AI's creator? The person/company who accepted whatever decision the AI made? Or nobody? When people are deciding, it's a bit easier to know who to blame. But how do you do it when the decision is by an algorithm?

And sure, maybe AIs can help in decision making, but shouldn't decisions be made by people in the end?

It depends on what decision you're looking at.

What does being held accountable even mean? Punishment? Or just removal of their services, because machines can be removed from service?

If a machine stops working, we hold it "accountable" by servicing it or throwing it out. Plus there are plenty of machines that make decisions, and we use them without much fuss.

Exactly. Responsibility doesn't mean that you go to jail. It means that you fix your mistakes.