Robots learn how to "lie"


Postby Momo-P » Sat Aug 22, 2009 9:54 am

Source
With the development of killer drones, it seems like everyone is worrying about killer robots.
Now, as if that wasn't bad enough, we need to start worrying about lying, cheating robots as well.

In an experiment run at the Laboratory of Intelligent Systems at the École Polytechnique Fédérale de Lausanne in Switzerland, robots that were designed to cooperate in searching out a beneficial resource and avoiding a poisonous one learned to lie to each other in an attempt to hoard the resource. Picture a robo-Treasure of the Sierra Madre.

The experiment involved 1,000 robots divided into 10 different groups. Each robot had a sensor, a blue light, and its own 264-bit binary code "genome" that governed how it reacted to different stimuli. The first generation robots were programmed to turn the light on when they found the good resource, helping the other robots in the group find it.

The robots got higher marks for finding and sitting on the good resource, and negative points for hanging around the poisoned resource. The 200 highest-scoring genomes were then randomly "mated" and mutated to produce a new generation of programming. Within nine generations, the robots became excellent at finding the positive resource, and communicating with each other to direct other robots to the good resource.

However, there was a catch. A limited amount of access to the good resource meant that not every robot could benefit when it was found, and overcrowding could drive away the robot that originally found it.

After 500 generations, 60 percent of the robots had evolved to keep their light off when they found the good resource, hogging it all for themselves. Even more telling, a third of the robots evolved to actually look for the liars by developing an aversion to the light, the exact opposite of their original programming!
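
For anyone curious what that evolutionary loop looks like in code, here is a minimal sketch in Python. The population of 1,000 robots, the 264-bit genomes, the selection of the 200 highest-scoring genomes, and the 500 generations come from the article; the mutation rate, the single-point crossover, and the fitness function are placeholder assumptions, and the division into 10 groups and the physical trials are glossed over.

[code]
import random

GENOME_BITS = 264     # each robot's "genome" is a 264-bit string (per the article)
POP_SIZE = 1000       # 1,000 robots per experiment
ELITE = 200           # the 200 highest-scoring genomes get "mated"
MUTATION_RATE = 0.01  # assumed; the article doesn't give the actual rate

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_BITS)]

def crossover(a, b):
    # Single-point crossover of two parent genomes (one common choice).
    point = random.randrange(1, GENOME_BITS)
    return a[:point] + b[point:]

def mutate(genome):
    # Flip each bit with a small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

def evolve(fitness, generations=500):
    # fitness(genome) -> score, e.g. time spent on the good resource minus
    # time spent near the poisoned one (a stand-in for the real robot trials).
    population = [random_genome() for _ in range(POP_SIZE)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        elite = ranked[:ELITE]
        # Randomly "mate" and mutate the elite to produce the next generation.
        population = [mutate(crossover(*random.sample(elite, 2)))
                      for _ in range(POP_SIZE)]
    return population
[/code]

Everything interesting, the "lying" included, happens inside the fitness evaluation, which in the real experiment was a foraging trial with physical or simulated robots rather than a simple function.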


It's kind of debatable whether or not you really wanna consider it lying, I mean...they ARE robots for heaven's sake, but still...pretty interesting.

Postby Technomancer » Sat Aug 22, 2009 11:44 am

Momo-P (post: 1340320) wrote:Source


It's kind of debatable whether or not you really wanna consider it lying, I mean...they ARE robots for heaven's sake, but still...pretty interesting.


They are robots, but they still are evolving a form of deception (and a strategy for detecting deception). Given more variables, it would be interesting to see what other behaviours might evolve, and how the robots would interact.
"The scientific method," Thomas Henry Huxley once wrote, "is nothing but the normal working of the human mind." That is to say, when the mind is working; that is to say further, when it is engaged in correcting its mistakes. Taking this point of view, we may conclude that science is not physics, biology, or chemistry—is not even a "subject"—but a moral imperative drawn from a larger narrative whose purpose is to give perspective, balance, and humility to learning.

Neil Postman
(The End of Education)

Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that my ignorance is just as good as your knowledge.

Isaac Asimov

Postby Tsukuyomi » Sat Aug 22, 2009 12:01 pm

Even if they are AI, they are learning ^^ Even if it is through glitches and whatnot ^^

That is pretty interesting ^^ I just now imagined a group of robots going, "I didn't find anything. Nope, nothing at all." XDDD

Postby sharien chan » Sat Aug 22, 2009 3:06 pm

That's weird. Kind of cool...but mostly weird. Who would have thought? Though sometimes I wonder if people realize that maybe there are some experiments or inventions we shouldn't do.

Postby SnoringFrog » Sat Aug 22, 2009 7:53 pm

That's definitely interesting, and I agree with Technomancer: it'd be interesting to see how this would go with extra variables involved.
UC Pseudonym wrote:For a while I wasn't sure how to answer this, and then I thought "What would Batman do?" Excuse me while I find a warehouse with a skylight...
Cobalt Figure 8
DeviantArt || Myspace || Facebook || Greasemonkey Scripts || Stylish Userstyles

Postby blkmage » Sat Aug 22, 2009 8:33 pm

The article links to another article which links to a more detailed article.

At the beginning, the light had no significance. The robots then learned to shine a light whenever they found resources. They then learned that if a light was shining, that meant there were resources there. Pretty soon, they realized that resources were scarce, so some of them learned not to shine lights when they found resources. After that, some robots learned not to depend on the light to find resources. As a result, those who continued to shine lights weren't in as bad shape as before.

Also interesting is that the attraction to lights and the notification through lights aren't binary. That is, the robots weren't always attracted to lights or always ignoring them. A lot of them would be attracted sometimes, and the same goes for the decision to shine a light when resources were discovered.
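
To make the "not binary" point concrete, here is a toy sketch (in Python) of how a genome could encode graded tendencies rather than fixed rules. The 8-bit gene layout and the two probabilities are assumptions for illustration; the real genome also has to encode the low-level sensor-to-motor behaviour.

[code]
import random

def bits_to_prob(bits):
    # Decode a group of genome bits (0/1 ints) into a probability in [0, 1].
    return int("".join(map(str, bits)), 2) / (2 ** len(bits) - 1)

class RobotPolicy:
    """Toy decoding of a genome into two graded tendencies: how likely the
    robot is to light up when it finds food, and how likely it is to steer
    toward another robot's light."""
    def __init__(self, genome):
        self.p_signal = bits_to_prob(genome[0:8])   # assumed gene positions
        self.p_follow = bits_to_prob(genome[8:16])  # assumed gene positions

    def emits_light(self, at_good_resource):
        return at_good_resource and random.random() < self.p_signal

    def approaches_light(self, sees_light):
        return sees_light and random.random() < self.p_follow
[/code]

A robot whose p_signal drifts toward zero over the generations is the "liar" (or at least the non-discloser), and one whose p_follow drifts toward zero is the robot that has stopped trusting lights.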

Postby Fish and Chips » Sat Aug 22, 2009 9:39 pm

"Killbot, you're not thinking of annihilating humanity are you?"
"No. Sir."

Greatest scientific achievement.

Postby Solid Ronin » Sat Aug 22, 2009 9:43 pm

Hey, Andrew won't you believe in him...


Postby Warrior4Christ » Sat Aug 22, 2009 10:59 pm

Knowing a bit about AI, I'm pretty sure they pre-programmed all the behaviour in, including lying, lie-detection, turning the light on or off, etc., and then the 'gene' it has determines its behaviour at runtime. So it's no coincidence that they turned to lying, since that existed in their original programming.
Everywhere like such as, and MOES.

"Expect great things from God; attempt great things for God." - William Carey

Postby Technomancer » Sun Aug 23, 2009 5:39 am

Warrior4Christ (post: 1340453) wrote:Knowing a bit about AI, I'm pretty sure they pre-programmed all the behaviour in, including lying, lie-detection, turning light on or off, etc.


No, it wasn't. Only the basic aspects of the hardware control were programmed in. The actual program that controlled the robot's behaviour was the genome itself (in other words, a genetic program). So for example, in the case of optical signalling, while the capability always existed in the robot, the question of when and why signalling should occur was completely controlled by the adaptation of the genetic program.
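
As a rough picture of what "the genome is the program" can mean here, the sketch below decodes the 264 bits into 33 eight-bit genes (which matches the bit count quoted in the article) and treats them as connection weights of a simple controller. The actual network structure, sensor set, and decoding used in the study aren't given in the article, so everything beyond the 33 x 8 split is an assumption.

[code]
def decode_weights(genome, n_genes=33, bits_per_gene=8):
    # Map a 264-bit genome (33 genes x 8 bits) onto weights in [-1, 1].
    weights = []
    for i in range(n_genes):
        bits = genome[i * bits_per_gene:(i + 1) * bits_per_gene]
        value = int("".join(map(str, bits)), 2)   # 0..255
        weights.append(2.0 * value / 255.0 - 1.0)
    return weights

def act(weights, sensors):
    # One control step: sensor readings in, motor and light commands out.
    # A single linear layer stands in for whatever controller was used;
    # with up to 11 sensor readings, the three banks fit in 33 weights.
    n = len(sensors)
    left_motor  = sum(w * s for w, s in zip(weights[0:n], sensors))
    right_motor = sum(w * s for w, s in zip(weights[n:2 * n], sensors))
    light_on    = sum(w * s for w, s in zip(weights[2 * n:3 * n], sensors)) > 0.0
    return left_motor, right_motor, light_on
[/code]

Evolution never touches the control loop itself; it only rewrites the bit string, which is why behaviours like withholding the light can appear without anyone coding them in explicitly.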

Postby shooraijin » Sun Aug 23, 2009 8:03 am

The more interesting part might be the algorithm that "improved" the internal behaviour.
"you're a doctor.... and 27 years.... so...doctor + 27 years = HATORI SOHMA" - RoyalWing, when I was 27
"Al hail the forum editting Shooby! His vibes are law!" - Osaka-chan

I could still be champ, but I'd feel bad taking it away from one of the younger guys. - George Foreman

Postby Kaligraphic » Sun Aug 23, 2009 8:47 am

Of course, the people conducting the experiment had to have been looking for that sort of response. If they'd been looking for the highest total score, they'd have added a bonus based on the group's total score, rather than just using individual scores.

So when a killbot goes berserk because of this sort of programming, for all the weapons and ammo it will have stockpiled, at least it'll also be attacking other killbots as potential competitors. Sporadic, individual killbot murder sprees might be enough to keep a killbot operator out of jail ("It must have just been one of those crazy killbots. Genetic algorithm inbreeding and all that.") but they won't spark an uprising. They'll all be waiting for the opportune moment.

See, it's when they learn to cooperate that they'll have a chance of exterminating us.



More seriously, though, if the study operated as blkmage says, then it isn't lying at all. Rather, it would simply be a lack of active cooperation.
The cake used to be a lie like you, but then it took a portal to the deception core.

Postby Technomancer » Sun Aug 23, 2009 9:12 am

Kaligraphic (post: 1340496) wrote:Of course, the people conducting the experiment had to have been looking for that sort of response. If they'd been looking for the highest total score, they'd have added a bonus based on the group's total score, rather than just using individual scores.


True, the fitness criterion will define the kind of solutions arrived at. A more complex simulation (or different scoring) might very well have favoured the evolution of cooperative behaviour. A more interesting approach, though, would be to still focus on individual rewards, but allow for the possibility that the average personal reward might be higher in the presence of cooperation. Such a system might also have room for the evolution of "cheating" strategies, as well as various counter-strategies, etc.

Incidentally, a bit of background on the underlying method can be found here:
http://en.wikipedia.org/wiki/Genetic_programming
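
A minimal way to see how the scoring choice steers the outcome: the sketch below mixes an individual score with a share of the group's score. With the blending weight at zero it reduces to the purely individual scoring described in the article; the weight and the exact terms are illustrative assumptions, not the study's values.

[code]
def fitness(time_at_food, time_at_poison, group_time_at_food, alpha=0.0):
    # alpha = 0.0 -> purely individual score (reward for sitting on the good
    # resource, penalty for time near the poisoned one), as in the article.
    # alpha > 0.0 -> part of the score comes from the whole group's success,
    # one way a different criterion could start to favour honest signalling.
    individual = time_at_food - time_at_poison
    return (1.0 - alpha) * individual + alpha * group_time_at_food
[/code]

Even with a group term, cheaters can still prosper if their individual gain outweighs their share of the group loss, which is exactly the sort of strategy and counter-strategy arms race described above.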

Postby minakichan » Sun Aug 23, 2009 1:06 pm

This is AWESOOOOOOOOOOOOOME!

Or would be if this was the precursor to a robot uprising. =(

Postby Whitefang » Sun Aug 23, 2009 5:41 pm

I think the robots would have evolved quite differently if they had to seek out mates with compatible scores and would "die" or be decommissioned when their own scores became too low. I would also like to know if the robots ever tried to fool the other robots by turning on their light near no source or the bad source.
"It's not easy to act in the name of justice."

"Justice is not the only right in this world"

Postby ich1990 » Sun Aug 23, 2009 8:01 pm

These self optimizing AI systems are really cool. I can already envision people selectively "breeding" robots for gladiator style competitions.
Where an Eidolon, named night, on a black throne reigns upright.

Postby SnoringFrog » Sun Aug 23, 2009 8:05 pm

Whitefang (post: 1340617) wrote:I think the robots would have evolved quite differently if they had to seek out mates with compatible scores and would "die" or be decommissioned when their own scores became too low. I would also like to know if the robots ever tried to fool the other robots by turning on their light near no source or the bad source.


I believe one of the articles did mention that they would sometimes turn their lights on at no resource, or at the bad one, since it also mentioned how some robots developed a lack of dependence on other robots' lights for locating the good resource.

These self optimizing AI systems are really cool. I can already envision people selectively "breeding" robots for gladiator style competitions.
I'm seeing BattleBots on steroids--er...genetic programming.

Postby Fish and Chips » Sun Aug 23, 2009 10:21 pm

ich1990 (post: 1340673) wrote:These self optimizing AI systems are really cool. I can already envision people selectively "breeding" robots for gladiator style competitions.
"I am Sp@rticus."

Postby Maokun » Mon Aug 24, 2009 6:19 am

Fish and Chips (post: 1340443) wrote:"Killbot, you're not thinking of annihilating humanity are you?"
"No. Sir."

Greatest scientific achievement.


Fish and Chips (post: 1340709) wrote:"I am Sp@rticus."


I lol'd. You are on a roll, sir.

As for the article, I believe that while the "lying" mechanism wasn't programmed in from the beginning, the experiment was conditioned to obtain that result. Personally, I found the fact that the robots lied less interesting than the fact that they developed a sense of ownership over their findings. Lying was just a device to protect it.

I'd be worried about the unavoidably upcoming "Day of the Machines" if I weren't rooting for a "Night of the Living Dead" scenario myself. Now, if they turned out to be the day and the night of the same day, things could get ugly.

Postby shooraijin » Mon Aug 24, 2009 6:21 am

Fish and Chips (post: 1340709) wrote:"I am Sp@rticus."


Except that this lot would probably say, "Yeah! He's Sp@rt@cus. Kill him!"
"you're a doctor.... and 27 years.... so...doctor + 27 years = HATORI SOHMA" - RoyalWing, when I was 27
"Al hail the forum editting Shooby! His vibes are law!" - Osaka-chan

I could still be champ, but I'd feel bad taking it away from one of the younger guys. - George Foreman
User avatar
shooraijin
 
Posts: 9927
Joined: Thu Jun 26, 2003 12:00 pm
Location: Southern California

Postby NarutoAngel221 » Thu Aug 27, 2009 7:55 am

Well, it seems like robots are getting innovative right now. I just hope they won't become too powerful, and I hope they can make a robot with a heart like a human's.
Naruto Forever. Can't wait for a Naruto MMORPG

Postby ich1990 » Thu Aug 27, 2009 8:38 am

NarutoAngel221 (post: 1341738) wrote:and hope they can make a robot with a heart like a human's


That would probably be a bad idea.

Postby KagayakiWashi » Thu Aug 27, 2009 1:04 pm

Nooooooo! If it gets much worse than that, videogames will start to cheat and lie to the player even more!
"To be a good listener, you must acquire a musical culture...you must be familiar with the history and development of music, you must listen...to receive music you have to open your ears and wait for the music, you must believe that it is something you need ...to listen is an effort, and just to hear has no merit. A duck hears also." - Igor Stravinsky
Are you hurting? Struggling with something? Need an ear? Check out The Hopeline! https://www.thehopeline.com/CSDefault.aspx
The Blog! http://kagayakiwashi.livejournal.com/

Postby Bobtheduck » Sun Aug 30, 2009 2:27 pm

KagayakiWashi (post: 1341824) wrote:Nooooooo! If it gets much worse than that, videogames will start to cheat and lie to the player even more!


More than programmers tell them to in order to secure a false sense of challenge? I doubt it.
https://www.youtube.com/watch?v=evcNPfZlrZs Watch this movie なう。 It's legal, free... And it's more than its premise. It's not saying Fast Food is good food. Just watch it.
Legend of Crying Bronies: Twilight's a Princess

Postby WhiteMage212 » Mon Aug 31, 2009 8:20 pm

This is getting pretty scary. Robots will soon replace low-skill and physical labor jobs at this rate. The only reason I would love advanced robots that would rebel is because I could carry a chain gun and shoot them all up.
In the beginning, God created HTML...- R. Zion
Men cry not for themselves, but for their comrades. -FF7 Crisis Core
"If it's not the gun that takes you down, it's the pen." - myself
Know God, No fear.
If it doesn't fit, you must edIT! MOES. http://www.christiananime.net/showthread.php?t=43825[/URL].

