Ancient IBM wisdom (from 1979) that the bosses just straight up promptly forgot
(media.piefed.social)
I generally agree.
Imagine, however, that a machine objectively makes better decisions than any person. Should we then still defer to the human's decision just to have someone who is accountable?
What is the worth of having someone who is accountable anyway? Isn't accountability just an incentive for humans not to fuck things up? It's also nice for pointing fingers when things go bad - but is there actually any value in that?
Additionally: there is always a person who either made or deployed the machine. IMO the people who deploy a machine and decide that it will now be making decisions should be accountable for those decisions.
You can't know if a decision is good or bad without a person to evaluate it. The situation you're describing isn't possible.
How is this meaningfully different from just having them make the decisions in the first place? Are they too stupid?
You can evaluate effectiveness by company profits. One program might manage a business well enough to steadily increase profit; another may turn a sharp profit before profits crash (maybe by firing important workers). Investors will demand the best CEObots.
Edit to add: of course any CEObot will be more sociopathic than any human CEO. They won't care about literally anything unless a score is attached to it.
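To make the "only scored things count" point concrete, here's a toy Python sketch (entirely hypothetical, not from the thread - the action list and the "harm" field are made up for illustration): a greedy CEObot picks whatever maximizes its objective, and anything that never enters the score function simply doesn't exist for it.

```python
# Toy CEObot: greedily maximizes its scored objective.
# The "harm" field is never part of the score, so the bot is blind to it.

actions = [
    {"name": "invest in R&D",          "profit": 2,  "harm": 0},
    {"name": "fire important workers", "profit": 10, "harm": 9},
    {"name": "raise prices 300%",      "profit": 7,  "harm": 6},
]

def score(action):
    # Only profit is attached to a score; harm never enters the objective.
    return action["profit"]

best = max(actions, key=score)
print(best["name"])  # -> "fire important workers": sharp profit, invisible harm
```

If investors select CEObots purely on this kind of score, they're selecting for exactly the sharp-profit-then-crash behavior described above.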