With that, I agree. The human(s) provide the motive and purpose in such a partnership. The problem comes when we start to posit "emergent behavior".
It's true that we don't understand how our brains produce human behavior, but we can make some observations about that behavior.
- Action comes from motivation.
- Motivation comes from desire.
- Desire is an emotion or a basic survival drive:
  - Hunger
  - Thirst
  - Reproduction
  - Fear
  - Anger
  - Greed
  - Joy
  - Pleasure
It isn't at all clear that hardware/software that thinks, no matter how powerful, will develop feelings, desires, motivation, or purpose. Until it does, it has no reason to act independently. The danger, as Altman writes, comes from the human-SMI partnership, where the human, for good or ill, provides the motivation to act.

Especially dangerous are the emotionally damaged, misanthropic, but brilliant people working on SMI, the ones who think (and have stated) that machines are the next stage in evolution and should replace humans. If one or more of them succeeds in creating an SMI and builds fixed goals into it that are antithetical to human life, we will be in deep trouble. Those who merely want power won't purposely destroy all of humanity; they would have no one left to have power over. It's the misanthropic "mad scientist" who scares me. One of them was interviewed as part of a documentary on AI. I wish I could remember his name.
Now, if someone were to set out to replicate the animal part of the brain, that would be a new ballgame. So far, though, efforts seem to be focused only on thinking machines, not feeling ones.
- The builders of AI who've been quoted and interviewed in magazines and on television show no grasp of psychology. Computers with more power than the human brain won't spontaneously become self-directed, even if they become self-aware, which is itself debatable.
- Action comes from motivation.
- Motivation comes from drives and emotions; you have to want something.
- Instinctive drives, programmed in, will be the biggest danger:
  - Self-preservation
  - Reproduction
- Inorganic machines make greater demands on the Earth than organic beings do. Individually and in its natural form, everything about a human animal can be readily reused by nature, and a human makes small demands compared with a synthetic equivalent.
- Asimov's Three Laws of Robotics.