    #57
    NecromancerGuy
    Titan in the Playground
    Join Date: Jul 2013
    Re: [Thought experiment] If alignments are objective how do we know what they represent

    Quote Originally Posted by Segev View Post
    Okay. So the objectively right answer to any (alignment-related) question about what you should do is always "the moral thing." This is tautological and circular, which makes it not a very useful ANSWER, but it still is a useful TERM in this case.
    Yes, it is a term. A term used to reference the right answer to the overall question and to other questions of moral relevance.

    Quote Originally Posted by Segev View Post
    The answer to the question of, "What is the moral thing for me to do?" when posed a situation will always, then, be, "whatever your (target) alignment dictates."
    This is not a shared premise. Why would doing what your target alignment dictates be the right answer to "What ought one do?"? That would be assuming the conclusion.

    Quote Originally Posted by Segev View Post
    The part I bolded is, if I am parsing you correctly, the point I've been trying to get at. You seem to me to be trying to say that "the moral thing to do" is always the same thing, and I'm asking "why?" I'm challenging the hidden implied purpose of your statement, "You ought to do X." But I think this next bit is actually moving us closer, so I will address that and hope that this bit discussed here above is not necessary:
    Well, as was established (imperfect word choice?) below (in your post), "moral" is a term for the right answer to the question.
    And the question does not presume a purpose. Assuming a purpose would be begging the question*. For any purpose you can imagine, I can question it by asking "But, ought one follow that purpose?". It may be hard to believe, but there is no hidden implied purpose to qualify the question.

    *Asking "What ought one do if we assume X is the answer to 'What ought one do?' ?" is circular logic or an unfounded premise.


    Quote Originally Posted by Segev View Post
    It feels strange to us, who live in a world and society where everyone at least thinks of "good" as the alignment to which to aspire, to say "it is moral to do evil," but in an objective alignment setting where there are people who actively want to adhere to alignments other than Good, that is a perfectly sensible statement, given the definition of "moral" you, OldTrees1, have given me. (I am not saying it's "your" definition; I believe you are citing other philosophers and philosophies. But I am trying to be very precise that I am using it by that specific definition, and not a definition that, for example, says "moral == good.")
    I struck out some unneeded qualifiers.

    In game, evil labels an alignment. It is not strange to discard unrelated moral statements about its namesake.* I can readily imagine a campaign where it is moral to do what is labeled evil. I can also readily imagine a campaign where alignments are amoral. I know some can readily imagine a campaign where the moral character of the alignments follows moral relativism, although I will admit I can't personally readily imagine moral relativism in any context.

    *Apologies, but a more absurd example popped into my head:
    Assume that, IRL, choosing between various ice cream flavors were an amoral choice.
    Assume that a game was made with alignments named vanilla, rocky road, chocolate, and mint.
    There is no reason to assume, unless stated, that moral statements about ice cream flavors IRL are in any way related to statements about the alignments in that game. Two things sharing the same name does not make them necessarily the same.

    So statements like "It is moral to do evil" are not inherently self-contradictory in a context where "evil" is not merely another word for "immoral". The statement might be false, or true, or mu, or "it depends"; which of these holds depends on context not presumed at this time. For example, Moral Universalism would declare the statement could only be false xor true. Moral Error Theory would say it is mu. Moral Relativism might say it depends.

    PS: I apologize for being the cause of the linguistic gymnastics you had to do in that qualifier around "the definition of 'moral' you, OldTrees1, have given me".
    Last edited by OldTrees1; 2021-03-05 at 01:22 PM.