
View Full Version : Would an Intelligient Construct disobey you?



Sims
2011-03-07, 10:58 PM
I like the idea of a "Smart" Iron Golem. But if he were created (by any means necessary), would he refuse to listen? Or would it be as easy as just ordering him around?

Callista
2011-03-07, 11:01 PM
D&D 3.5?

He would be capable of understanding the concept of disobedience, but he would probably not be able to choose to disobey you. Like all constructs, though, he does have a chance of going berserk, and giving him orders he does not want to obey may increase this chance.

It is possible to create constructs that are entirely free-willed and capable of disobeying even their creators. Warforged are one example.

Jack_Simth
2011-03-07, 11:16 PM
I like the idea of a "Smart" Iron Golem. But if he were created (by any means necessary), would he refuse to listen? Or would it be as easy as just ordering him around?

Assuming D&D 3.5? Depends on how he became intelligent.

By default, a golem obeys its creator and its master.

Certain types of golems have a chance of going berserk, and becoming uncontrolled (the iron golem is not one of them, so this line doesn't matter).

The Awaken Construct spell (Spell Compendium) specifically gives the critter free will - so it'll obey you, or not, as much as any other person would (well, it is friendly, initially).

However, if the method of granting intelligence doesn't specify in that regard, the default rules for the critter in question dominate - so if, for example, you used the Intelligent Item (http://www.d20srd.org/srd/magicItems/intelligentItems.htm) creation rules (such as they are) to give your iron golem intelligence, then it *will* obey you, as nothing in the intelligent item rules overrides the golem rules of obedience (although it may occasionally demand a concession...).

So exactly as the rules are written, it depends on how it became intelligent. Make sense?

Sims
2011-03-07, 11:25 PM
Assuming D&D 3.5? Depends on how he became intelligent.

By default, a golem obeys its creator and its master.

Certain types of golems have a chance of going berserk, and becoming uncontrolled (the iron golem is not one of them, so this line doesn't matter).

The Awaken Construct spell (Spell Compendium) specifically gives the critter free will - so it'll obey you, or not, as much as any other person would (well, it is friendly, initially).

However, if the method of granting intelligence doesn't specify in that regard, the default rules for the critter in question dominate - so if, for example, you used the Intelligent Item (http://www.d20srd.org/srd/magicItems/intelligentItems.htm) creation rules (such as they are) to give your iron golem intelligence, then it *will* obey you, as nothing in the intelligent item rules overrides the golem rules of obedience (although it may occasionally demand a concession...).

So exactly as the rules are written, it depends on how it became intelligent. Make sense?

So I guess the trapped Soul of King Chibi isn't gonna work here XD but yeah, I see what you mean

Pyrite
2011-03-08, 04:52 AM
Remember that even if your intelligent golem still has to follow your orders, it has gained the ability to interpret those orders, or to make its own choices within their parameters. Of course, this doesn't mean it can interpret the order to "protect me" as "kill me" unless it's gone completely delusional (or unless it's in some freaky situation, like an approaching sphere of annihilation, where death really would be the best protection), but it does mean that "stop that thief" can have a variety of outcomes depending on the golem's beliefs, predilections, or mood. You might end up with an unconscious thief, a dead thief, etc.

Gnoman
2011-03-08, 07:53 AM
Also, several of Asimov's stories discuss ways in which robots could get around hard-coded blocks in their programming, including the Three Laws. In one story, for example, robots killed thousands of people.

Jack_Simth
2011-03-08, 07:57 AM
So I guess the trapped Soul of King Chibi isn't gonna work here XD but yeah, I see what you mean
If you have a method of making it intelligent that is homebrewed and/or houseruled, then you'll need to ask the question of the designer (person who made the method) or adjudicator (DM).

And, of course, don't forget interpretations, as Pyrite noted.

FMArthur
2011-03-08, 09:10 AM
If it's totally free-willed, without fixed behaviour 'coding', this is essentially the same question as "would a child you raised disobey you?" The answer is absolutely, at some point; whether it does so in a 'bad' way, and whether you're still friends afterward, depends on how you raised it and on its personality. If you treat it right, you'll probably be on good terms at the very least. If you demand obedience rather than simply asking for help, it might react much as a human would under those circumstances: it might disobey right then and there, or obey and grow resentful over time.

Otherworld Odd
2011-03-08, 09:29 AM
I know homunculi go insane if you die, but they would never disobey the person who created them. If you made them not to disobey you, they wouldn't disobey you. I disagree with this question being the same as "would a child you raised disobey you?", as children are humans (or otherwise) with their own feelings and completely free will. You (the parent) didn't craft their brain for a specific purpose.

FMArthur
2011-03-08, 10:16 AM
Hey, check out that first clause in the first sentence of my post. Pretty cool, isn't it? What you might not know is that it's actually made up of words that mean things. :smalltongue:

Otherworld Odd
2011-03-08, 10:21 AM
Hey, check out that first clause in the first sentence of my post. Pretty cool, isn't it? What you might not know is that it's actually made up of words that mean things. :smalltongue:

Okay, I'll admit I missed that part but you're coming off kind of rude to me.

Cartigan
2011-03-08, 10:24 AM
Also, several of Asimov's stories discuss ways in which robots could get around hard-coded blocks in their programming, including the Three Laws. In one story, for example, robots killed thousands of people.

Was that an Asimov story or an "Asimov" story? I can't think of any story where robots managed to kill anyone, though there were those with modified Laws that made them rather dangerous.

And isn't this what happened to the warforged in Eberron?

The_Jackal
2011-03-08, 01:27 PM
No, but it would probably correct your spelling. :P

Seriously, anyone who's ever written code will tell you that getting even a simple algorithm to work can be fiendishly difficult. Why should 'programming' a construct be any different? Generally, the more sophisticated the application, the more potential there is for errors to creep in, so it seems to me that a more intelligent automaton would be far more likely to do something unexpected.
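
To illustrate the point (a toy sketch in Python, made up for this post, nothing to do with the actual golem rules): even a three-line "order interpreter" can fail in a non-obvious way.

    # A hypothetical construct that works through a priority list of standing orders.
    # The bug is subtle: on the first order whose condition does NOT apply, the loop
    # returns instead of moving on, so the construct stands idle whenever any
    # higher-priority order happens not to be relevant.
    def choose_action(orders, situation):
        for condition, action in orders:
            if not condition(situation):
                return None   # bug: should be 'continue'
            return action
        return None

    orders = [
        (lambda s: s["intruder"], "attack intruder"),
        (lambda s: s["master_in_danger"], "shield master"),
    ]

    # Master is in danger but there is no intruder -> the golem does nothing.
    print(choose_action(orders, {"intruder": False, "master_in_danger": True}))

Your master hands you that order list in good faith, and you faithfully execute it, and he still ends up dead. Now imagine the order list is a personality.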

Cartigan
2011-03-08, 01:29 PM
No, but it would probably correct your spelling. :P

Seriously, anyone who's ever written code will tell you that getting even a simple algorithm to work can be fiendishly difficult. Why should 'programming' a construct be any different? Generally, the more sophisticated the application, the more potential there is for errors to creep in, so it seems to me that a more intelligent automaton would be far more likely to do something unexpected.
Cause it's magic. Duh.

Pyrite
2011-03-08, 01:35 PM
Cause it's magic. Duh.

Magic only works that way in favor of the writer/GM.