Back in uni, my colleagues and I had something we called "default mode" -- the idea that all technology had an inherent desire to kill all humans or otherwise be as destructive to life and property as possible. "Default mode" had to be actively prevented by careful engineering -- e.g. all devices are assumed to be maximally harmful until you engineer them to be otherwise with a high degree of confidence.
We also had something we called "destructive optimization". This was essentially the elimination of an object so poorly fitted to its purpose that it made the intended task actively harder -- like smashing a tool so bad that the job is easier to accomplish without it. Often, these tools would be inherited from graduating grad students on the instruction of a well-meaning supervisor. For example, an overly complex and poorly documented robotic arm with weird bugs inherent to the design, iterated on a dozen times -- less work to redo than to fix!
The terms are best used in tandem, e.g. "it entered default mode, and had to be destructively optimized".
Nearly two decades later, I still think in these terms and laugh about it (while also taking them seriously). I now own an engineering company. My focus is still firmly on preventing "default mode". I also make OK money "destructively optimizing" software tools sometimes.
That gave me a chuckle. I'd probably work with you. :)
TIL the default mode is my default mode
Yeah human beings are pretty good at default mode :(
Often, I think our machines contain the best of us. Maybe that's the real reason people seem afraid of AI.
Ya terminated!