Sentience and AI

Pyro Pilots Lounge. For all topics *not* covered in other DBB forums.

Moderators: fliptw, roid

sdfgeoff
DBB Ace
Posts: 498
Joined: Wed Jan 12, 2011 1:07 am
Location: Low Earth Orbit

Sentience and AI

Post by sdfgeoff »

I came up with a theorem a couple weeks ago:
Sentience is a matter of opinion
I also see it as a bit of a pun.

The idea is that if an entity has its own ideas on some topic, then it is sentient. This can be expanded to: "If an AI can present an original, reasoned argument for why you shouldn't turn it off, don't turn it off."
Turning off any existing AI? No problem: it doesn't come up with its own ideas. Current AIs are mostly genetic algorithms and neural networks. No matter how complex they are, they are technically unable to come up with opinions and thus (by my definition) will never become sentient.
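For anyone unfamiliar with the term, here is a toy sketch of what "genetic algorithm" means in this context. It is a classic "OneMax" example (all names and parameters are illustrative, not from any real system): evolve bitstrings toward all ones. Note that the fitness function is fixed by the programmer, which is exactly the point being made above — the system optimises a goal it was handed, it doesn't form one.

```python
import random

def fitness(bits):
    # The "goal" is hard-coded by the programmer: count the 1s.
    return sum(bits)

def evolve(length=20, pop_size=30, generations=100, mutation=0.02):
    # Start with a random population of bitstrings.
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # selection: keep the fittest half
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)        # single-point crossover
            child = a[:cut] + b[cut:]
            # Mutation: flip each bit with small probability.
            child = [bit ^ (random.random() < mutation) for bit in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # typically at or near 20 (the maximum)
```

However clever the selection and mutation get, the algorithm only ever climbs toward the fitness function it was given.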

Thus we need a new approach to AI. Anyone got any ideas for how to build AIs that can formulate opinions?
Eh?
Spidey
DBB Grand Master
Posts: 10724
Joined: Thu Jun 28, 2001 2:01 am
Location: Earth

Re: Sentience and AI

Post by Spidey »

If you want a machine to give you an “opinion” just program in some random errors…that is how humans come up with opinions. :wink:

Although I don’t understand why you would want an opinion from a machine, I would think you would always want the correct answer instead, or at least as close as the programming can get to a correct answer.
sdfgeoff
DBB Ace
Posts: 498
Joined: Wed Jan 12, 2011 1:07 am
Location: Low Earth Orbit

Re: Sentience and AI

Post by sdfgeoff »

There is a big difference between opinions and errors. I may want an opinion but I can't think of a single case where I'd want an error.

Why would you want an opinion:
- Sometimes there are no correct answers (e.g., is option A better than option B when both will fulfill the requirements?)
- Sometimes there isn't enough information, not even for an approximation, or maybe there isn't the time to get a complete solution.
- Sometimes it's good to have a second opinion on a topic - perhaps even more so if it comes from a completely different thought process than a regular human's (chat to any engineer)
- Anything involving philosophy or psychological assessment requires opinions. (e.g., why does the universe bother to exist? Is person X nice?)
- Anything involving art or music requires opinions, which is why there are no good procedural music generators yet.
etc.
Eh?
Spidey
DBB Grand Master
Posts: 10724
Joined: Thu Jun 28, 2001 2:01 am
Location: Earth

Re: Sentience and AI

Post by Spidey »

An opinion is a subjective conclusion derived mostly from lack of complete information, coupled to a faulty analyzing process. (the human brain)

Machines shouldn’t have opinions, but if you did want an opinion, then you would have to duplicate the reasons why humans can’t produce absolutely correct answers.

An opinion is a conclusion drawn from an imperfect biological machine, but if you are looking to create that spark of life that separates us from the rocks…I’m not sure that is possible.

Looking at your list perhaps you mean personal preference/taste rather than opinion, again…same problem.
Tunnelcat
DBB Grand Master
Posts: 13309
Joined: Sat Mar 24, 2007 12:32 pm
Location: Pacific Northwest, U.S.A.

Re: Sentience and AI

Post by Tunnelcat »

Spidey wrote:If you want a machine to give you an “opinion” just program in some random errors…that is how humans come up with opinions. :wink:
It's not random errors that humans use to form an opinion. It's heuristics. :wink:

https://en.wikipedia.org/wiki/Heuristic ... processing
Cat (n.) A bipolar creature which would as soon gouge your eyes out as it would cuddle.
sdfgeoff
DBB Ace
Posts: 498
Joined: Wed Jan 12, 2011 1:07 am
Location: Low Earth Orbit

Re: Sentience and AI

Post by sdfgeoff »

Yes, opinions are derived from incomplete information and (sometimes) incomplete analysis, but they fill a gap where information cannot be accurately determined.
If you ask a computer "What music should I listen to?", it will suggest a combination of what you have previously listened to and enjoyed, mixed with other things other people have liked. The chances you'll like the music are moderately high. But what if you want something new? You can't ask the OPAC in the library to "recommend me a good science fiction book," but you can ask a librarian.
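The "what other people have liked" recommender described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the listening histories are made-up data, not a real service's algorithm): score each unheard item by how much the taste of the people who liked it overlaps with yours.

```python
def recommend(me, others):
    """Rank items liked by users whose taste overlaps mine."""
    scores = {}
    for history in others:
        overlap = len(me & history)        # shared likes = crude similarity
        if not overlap:
            continue                       # ignore users with no taste in common
        for item in history - me:          # only suggest things I haven't heard
            scores[item] = scores.get(item, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)

me = {"Bach", "Vangelis"}
others = [
    {"Bach", "Vangelis", "Jarre"},         # strong overlap with my taste
    {"Bach", "Handel"},                    # some overlap
    {"Metallica", "Slayer"},               # none
]
print(recommend(me, others))  # → ['Jarre', 'Handel']
```

Notice the limitation the post is pointing at: the sketch can only ever surface things *somebody similar* already liked. It has no way to volunteer something genuinely outside the data, the way a librarian might.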

For a robotic car, deciding whether to overtake is also based on incomplete information. You cannot predict the future, so a robotic car makes the assumptions the programmer made (that the other driver and oncoming traffic will behave sensibly). I'd rather my robotic car looked at how the guy in front of it had behaved, looked at how traffic behaves in reality (normally 10 km/h above the speed limit), and made its assumptions based on that.
Can this behaviour be programmed? Yes, but doing so is long and complicated and requires the programmer to have thought of nearly everything.
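The "learn how traffic actually behaves" idea can be sketched with a running estimate that starts from the programmer's assumption and drifts toward what is actually observed. The class name, parameters, and numbers below are all invented for illustration:

```python
class TrafficModel:
    """Keep a running estimate of how fast traffic really moves."""

    def __init__(self, speed_limit):
        self.speed_limit = speed_limit
        self.estimate = float(speed_limit)   # start from the built-in assumption

    def observe(self, measured_speed, alpha=0.2):
        # Exponential moving average: recent observations pull the estimate.
        self.estimate += alpha * (measured_speed - self.estimate)

    def expected_speed(self):
        return self.estimate

model = TrafficModel(speed_limit=100)
for s in [108, 112, 110, 109]:               # cars observed ~10 km/h over the limit
    model.observe(s)
print(round(model.expected_speed()))         # drifts above 100 toward reality
```

This is only the easy part, of course; deciding what to *do* with the estimate (overtake or not) is where the programmer still has to have thought of nearly everything.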


Heuristics are great. Teaching a machine unbounded heuristics is, well, an interesting challenge. It's almost as bad as teaching it reasoning...
Eh?
Tunnelcat
DBB Grand Master
Posts: 13309
Joined: Sat Mar 24, 2007 12:32 pm
Location: Pacific Northwest, U.S.A.

Re: Sentience and AI

Post by Tunnelcat »

Once we figure out how to define and program base emotional logic into a machine, we're all screwed. Self-driving cars will then be capable of road rage. :lol:
Cat (n.) A bipolar creature which would as soon gouge your eyes out as it would cuddle.