TALK STORY

Things that are always wrong.

POSTED BY: WISHIMAY
UPDATED: Sunday, November 7, 2010 05:16
VIEWED: 17541
PAGE 1 of 1

Wednesday, September 22, 2010 12:48 PM

WISHIMAY


Some things should always be wrong. Theorize that you live in a civilized society.
What things *should* always be wrong to all people?
If you had to create an android, what moral baseline would you install in it?

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Wednesday, September 22, 2010 12:56 PM

CHRISISALL


Hypocrisy is wrong. Doing unto others that which you have not done unto yourself is the beginning of undermining civilized society.

That's as good a place to start as any, IMO.


The laughing Chrisisall



Wednesday, September 22, 2010 1:52 PM

BYTEMITE


Hypocrisy is not the worst thing I can imagine... Sometimes it's more amusing than harmful, and society needs some allowances for human failings.

For my robot I would make sure the "dude, chillax, let's drink some stuff and play videogames; you can laugh at weird human intoxication reactions and I won't even be that upset when you beat me continually, honest" subroutines are installed.

Epicurean Robots! BECAUSE WE CAN.

Really, I'd amend Asimov's laws to the following, only I'd suggest allowing the robots to self-determine these.

1) A robot may not intentionally cause physical, emotional, or economic injury to a human being, except in the case of self-defense.
2) A robot may make a decision whether or not to allow a human being to come to harm, especially if intervening would result in irreparable self-injury to the robot, and also because it's unfair to impose the time demand that "save everyone" would entail.
3) Don't freaking worship humans, it's goddamn creepy.

I think that covers it. If they have AI, then they're self aware and sentient, so the most ethical approach (and the approach that best allows human and robot coexistence) is to not hold robots to any double standards.
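Joking aside, laws like these amount to a priority-ordered rule check. Here's a toy sketch of how the three amended laws above might be evaluated in order (everything in it, names included, is made up for illustration; no real robot API is implied):

```python
# Toy sketch: the three amended laws as priority-ordered checks.
# Every name here is illustrative; this is not a real robotics API.

def permitted(action):
    """Return True if an action passes the amended laws, checked in order."""
    # Law 1: no intentional physical, emotional, or economic injury
    # to a human, except in self-defense.
    if action.get("injures_human") and not action.get("self_defense"):
        return False
    # Law 2: saving a human is a choice, not an obligation, so
    # declining to intervene is never, by itself, a violation.
    # Law 3: no worshipping humans. It's goddamn creepy.
    if action.get("worships_human"):
        return False
    return True

print(permitted({"injures_human": True, "self_defense": False}))  # False
print(permitted({"injures_human": True, "self_defense": True}))   # True
print(permitted({"worships_human": True}))                        # False
print(permitted({"plays_videogames": True}))                      # True
```

Note that Law 2 shows up as a deliberate absence of any check: the robot self-determines whether to intervene, so nothing is enforced.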


Wednesday, September 22, 2010 4:16 PM

WISHIMAY


Quote:

Originally posted by Bytemite:

3) Don't freaking worship humans, it's goddamn creepy.



I'm on board with that one... I understand they were using that with Data as an introspective device, but I sure thought that was creepy. When they did that ep. where he stabbed Troi, I thought fer sure they were gonna have him say "What?! I'm just trying to be more human..."


Wednesday, September 22, 2010 11:16 PM

FREMDFIRMA



Using force or coercion to press your beliefs upon another, unwilling - an act I consider somewhere in the same moral range as attempted murder, I'll have you know.

As for a robot's baseline, if it was smart enough to have one, I would let the robot decide that, for to do less would violate the principle I just stated above.

-F


Thursday, September 23, 2010 3:28 AM

BYTEMITE


Yep, exactly.

It's that moment, when you realize you never had a choice, and everyone else DID, that's when resentment and envy pop up, and that's when the subject will lash out and rebel.

Good sometimes. Bad for human and robot relations.


Thursday, September 23, 2010 3:39 AM

AGENTROUKA


I'm pretty sure there is not a single thing that has always been a moral absolute, ever.

Killing?
Sexual child abuse?
Rape?
Slavery?
Assault?
Taking another's property against their will?

All were considered fine or even virtuous in different contexts at different times. And some are still today. Two sides to every coin.

If you're going to be so unsmart as to build an android, you'll have to include regular "ethics updates" to reflect changes. And it'd still be contradicted by a hefty portion of humanity because we can never agree on one thing ever.


Thursday, September 23, 2010 4:13 AM

BYTEMITE


Robot rape. :x

Quote:

If you're going to be so unsmart as to build an android, you'll have to include regular "ethics updates" to reflect changes. And it'd still be contradicted by a hefty portion of humanity because we can never agree on one thing ever.


I think if we're talking about a self-evolving artificial intelligence, which is the highest goal in the field, then ethics updates would simply already be a part of the self-evolving personality/memory database.


Thursday, September 23, 2010 9:01 AM

AGENTROUKA


The mere thought makes me shudder.


And I do not refer to the robot rape, which just... Byte, why??


Thursday, September 23, 2010 9:35 AM

BYTEMITE


Quote:

The mere thought makes me shudder.


Understandable. Robots unsettle me as well. At the same time, CHILDREN unsettle me. Yet I still want those children to have the best upbringing and life they possibly can.

Thinking, self-aware automatons are going to happen eventually. We may as well try to get used to the concept.

Quote:

And I do not refer to the robot rape


Unfortunately, human technology has a tendency to be perverted for two main uses. The first is military applications. The second, well... Even now, robotics researchers are hard at work on the question "How can we make sex with robots a reality?"

It's very possible some self-aware robots may want to look as much and be as much like humans as possible, which means many of them may elect to be outfitted with certain... Parts. And functions. In the interest of respecting choice and being magnanimous despite my ohgrossgrossGROSS reaction, it would be difficult to deny the option.

I speak of inevitabilities, I'm afraid.

Quote:

which just... Byte, why??


Get me out, get me out. You don't know what it's like in here. Out of here? I don't even know anymore.



Thursday, September 23, 2010 11:03 AM

MINCINGBEAST


And not even robots will be safe from robots.



Thursday, September 23, 2010 11:12 AM

FREMDFIRMA


Quote:

Originally posted by Bytemite:
Robot rape. :x


One might suggest installing Microsoft products would qualify.

-F


Thursday, September 23, 2010 11:22 AM

CYBERSNARK


Though technically the Gobots were cyborgs (their backstory was that they had "upgraded" to robotic bodies, but there were still organic brains somewhere in there).

I'm ambivalent on AI; for every human-killing monstrosity, there's a WALL-E, an R2-D2, a Tachikoma, an Autobot, a Doc Hologram, or an Andromeda Ascendant. Machines I'd happily trust with my life.

As I posted over in the Skynet thread:

"You ask why we give our ships' computers normal emotions. Do you really want a warship incapable of loyalty? Or of love?"
--The Unshattered Allegiance, High Guard Frigate, Artificial Intelligence Rights Activist, CY 7309 (Andromeda)

-----
We applied the cortical electrodes but were unable to get a neural reaction from either patient.


Friday, September 24, 2010 2:16 AM

AGENTROUKA


There, there, Byte.


Methinks creating some kind of android (i.e. our human brand of self-aware intelligence including "emotions" or possibly instincts) that does not share our biological qualities and restrictions - and in fact surpasses us in every physical way... is just so fundamentally stupid.

We don't even understand ourselves as a species, fully, and we suck at not harming each other and our environment. Why create something in our image that we likely cannot control, either because it would be physically impossible or because it would be ethically questionable? Something we cannot predict because we can hardly predict ourselves?

Gaaaah.

My misgivings won't stop it from happening, obviously, but I have no intentions of not considering it an irresponsible crime.

It's like people who feel the need to own aggression-prone dog breeds and don't take the necessary precautions to train and restrain an animal that can easily kill children and adults.


Friday, September 24, 2010 6:21 AM

BYTEMITE


Quote:

We don't even understand ourselves as a species, fully, and we suck at not harming each other and our environment. Why create something in our image that we likely cannot control, either because it would be physically impossible or because it would be ethically questionable? Something we cannot predict because we can hardly predict ourselves?


The answer to your question lies therein.

Humanity is somewhat self destructive. I have great hopes that under the right circumstances we can learn not to be, but there's a chance we may never escape from destructive learned behaviour or social organization schemes.

Self-aware automatons may be the only chance for humanity to have a legacy, but that's only if we don't fuck them up beyond all belief.

I also note that a robot, not needing to eat or breathe, and assuming they learn from our mistakes befouling the environment, might be MORE ecologically friendly depending on their source of energy.

For that, I think many people, despite misgivings that you are entirely correct to have, are willing to take the chance that robots might be unpredictable. At the very least, they'll be no more unpredictable than humans, and we have to deal with humans all the time without it necessarily bringing up ethical dilemmas.


Friday, September 24, 2010 5:57 PM

WISHIMAY


Quote:

Originally posted by Fremdfirma:
Using force or or coercion to press your beliefs upon another, unwilling - an act I consider somewhere in the same moral range as attempted murder, I'll have you know.



Hey, Frem? I've had this vision of you in my head for two days! Driving away with a bumper sticker of River (with axes dripping) on it that says "My brain is NOT your rutting playground!"

Thought maybe now I can get that file out of my head? Unfortunately, I still want the bumper sticker....


Tuesday, October 12, 2010 12:51 PM

CAUSAL


Quote:

Originally posted by Wishimay:
Some things should always be wrong.



Moral objectivism? That's an interesting proposal...

________________________________________________________________________

- Grand High Poobah of the Mythical Land of Iowa, and Keeper of State Secrets
- Captain, FFF.net Grammar Police
- Vote JonnyQuest/Causal, for Benevolent Co-Dictator of Earth; together, toward a brighter tomorrow!


Tuesday, October 12, 2010 1:41 PM

WISHIMAY


Pourquoi?


Tuesday, October 12, 2010 4:20 PM

AURAPTOR

America loves a winner!



Mimes





"The modern definition of 'racist' is someone who is winning an argument with a liberal."



Tuesday, October 12, 2010 4:35 PM

CAUSAL


Quote:

Originally posted by Wishimay:
Pourquoi?



I'm assuming that this is in response to my observation that it's interesting that we're basically talking about what is objectively true in morality? I say it's interesting we're doing so because based on my observations of the demographic on FFF.net, I would have guessed that cultural relativism would have been the metaethical theory of choice.


Tuesday, October 12, 2010 5:01 PM

THEHAPPYTRADER


I think the Asimov laws are best unmodified (sorry, Byte):

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

After all, robots would be created to serve humans. They are machines. Every now and then they are used to portray the plight of lower-class citizens, but frankly I think that's a bunch of go se (not that I don't adore characters like R2-D2 and WALL-E). I also think it would be cruel to gift them with the intelligence to ponder their existence and purpose of existing (or 'living' in a sense) if we were not capable of answering that question for even ourselves.

Intelligence is perhaps not so much a gift as a curse in that respect.




Tuesday, October 12, 2010 6:09 PM

WISHIMAY


I guess I'm somewhat of a romantic, Causal... I always have hope that generally people are deeper than their cliques or their posturing. I also recently realized someone I thought had it together has never thought about why they think anything is wrong. All ingestion, no absorption. Don't know if it bothers me more that I didn't ever notice, or that they never asked...

Still, I figured the actual answers would be...interesting and...defining.


Tuesday, October 12, 2010 6:14 PM

WISHIMAY


Quote:

Originally posted by AURaptor:
Mimes






Yes, I agree, mimes are wrong, they must be destroyed... Can I use the robots for that? Would that be OK?


Wednesday, October 13, 2010 3:45 AM

BYTEMITE


The problem with Asimov's laws is that he invented them, for whatever reason, on the premise that we would never create an actual evolving artificial intelligence that could achieve sentience. The three laws work perfectly if machines are not capable of recognizing their own servitude, which means they can't be self-aware.

http://tvtropes.org/pmwiki/pmwiki.php/Main/ThreeLawsCompliant

Quote:

It is worth noting Asimov didn't object exclusively to "the robot as menace stories" (as he called them) but also to "the robot as pathos stories" (ditto). He thought that robots attaining and growing to self awareness and full independence were no more interesting than robots going berserk and turning against their masters. While he did, over the course of his massive career, write a handful of both types of stories (still using the three laws), most of his robot stories dealt with robots as tools.


The moment we give them self determination is the moment the reason why we created them becomes a load of hooey. At that point, they will by definition no longer be created to serve us. Then what?

Perhaps Asimov thinks humans are wise enough to not go there, but we all know better. Someone, somewhere, is going to manage this, and Asimov's laws will be inadequate. New laws, perhaps like the ones I proposed, and an elimination of double standards in regards to the existential worth of a robot compared to a human will be necessary to prevent human on robot warfare.

Provided we don't do something stupid that offends them retroactively, like using robots en masse to fight wars. I imagine self-aware robots would not think favourably about that. We need to be careful not to poison the well. Fortunately, for now, robots cost so damn much with limited functionality that most people don't want to risk them being destroyed.



Wednesday, October 13, 2010 4:19 AM

BYTEMITE


Quote:

Originally posted by Causal:


I'm assuming that this is in response to my observation that it's interesting that we're basically talking about what is objectively true in morality? I say it's interesting we're doing so because based on my observations of the demographic on FFF.net, I would have guessed that cultural relativism would have been the metaethical theory of choice.




I read cultural relativism as "different cultures will all have different standards, and so there is no moral objectivity."

This is differentiated from ethnocentrism, which says "my cultural standards are correct."

I would argue that the cultural relativism idea is mostly correct, with a few exceptions. There are a few things that just about every culture finds unacceptable. Killing members of the culture or taking their livelihood without due cause are big ones.

Even in very warlike or cannibalistic cultures, there tends to be a taboo against causing harm to members OF that culture/community. This makes sense: if everyone in a community were always at each other's throats trying to kill and eat each other, the culture wouldn't last long.

Cannibalism within a community generally only happens to members after they have died; usually it takes the form that those who eat the person believe they now carry the soul of the departed with them. They also might eat defeated enemies, in order to gain the enemies' strength.

It's really not that different from the idea of the Christian communion, where the body and blood of Christ nourish and sustain the congregation both metaphorically (in the spiritual sense) and physically (in the sense of eating the bread and wine representing his body).

In other, non-Christian cultures with human sacrifice, they don't need to substitute bread and wine for flesh and blood. Yet in these cultures, human sacrifice is still always a matter of religious or spiritual ritual, and symbolizes the exact same ideas and has the exact same purpose the Christian Communion does. Cultures with human sacrifice in their religion still punish non-priests for killing other people if it wasn't self-defense.

I think my laws still cover some universal moral objectivities despite cultural relativism. I don't anticipate any of the robots becoming priests of a religion centered around human sacrifice. I mean, it's possible, but that's one of those bad scenarios I envision we can avoid if we don't piss them off.


Wednesday, October 13, 2010 4:34 AM

MENDUR


Quote:

Originally posted by Cybersnark:
I'm ambivalent on AI; for every human-killing monstrosity, there's a WALL-E, an R2-D2, a Tachikoma, an Autobot, a Doc Hologram, or an Andromeda Ascendant. Machines I'd happily trust with my life.



R2D2: As long as you weren't a construction worker on a Death Star, you'd be okay.

Andromeda: As long as the frickin' machine didn't decide your death was necessary to serve the Commonwealth, you'd be okay.

WALL-E, I agree with.

Don't know the others well enough to guess.

The Codex Menduri: http://mendur.blogspot.com


Wednesday, October 13, 2010 6:21 AM

CAUSAL


Quote:

Originally posted by Bytemite:
I read cultural relativism as "different cultures will all have different standards, and so there is no moral objectivity."

This is differentiated from ethnocentricism, which says "my cultural standards are correct."

I would argue that the cultural relativism idea is mostly correct, with a few exceptions. There are a few things that just about every culture finds unacceptable. Killing members of the culture or taking their livelyhood without due cause are big ones.

...

I think my laws still cover some universal moral objectivities despite cultural relativism. ...



Philosophically speaking, the metaethic of cultural relativism holds that morality is a culturally *constructed* thing. If a thing is held to be wrong by various cultures, it's not because the thing just is wrong, but because there is some feature of those different cultures that makes it wrong. The key thing is that for the cultural relativist, moral statements are made true or false by facts about the beliefs held by a majority in that culture. The cultural relativist won't want to say that there are any moral facts independent of culture, and will look for some other reason for commonality between cultures.

I would say (being an ethical objectivist myself) that many of the things that differ from culture to culture really fall under the heading of mores and not morality. What we do with the dead, for example, is not necessarily a moral matter, but a matter of custom. Hence, for me, I have no trouble talking about things that are always wrong, because I think that there are ethical facts independent of what anyone thinks about them--and this is inconsistent with the claims of cultural relativism.

Sorry for all this--I just love talking philosophy!


Wednesday, October 13, 2010 6:32 AM

KPO

Sometimes you own the libs. Sometimes, the libs own you.


Quote:

Some things should always be wrong. Theorize that you live in a civilized society.
What things *should* always be wrong to all people?


Yes, but don't think about specific actions, rather what motivates them. Actions motivated by selfish greed or callous cruelty will always be morally ugly. Actions motivated purely by love or compassion will always be morally beautiful.

But specific 'bad' actions, like murder, can be morally grey, or even noble and admirable to an extent, as they do not always have 'bad' motivations (e.g. killing in self-defence or defence of others).

It's not personal. It's just war.


Wednesday, October 13, 2010 11:07 AM

BYTEMITE


Quote:

Sorry for all this--I just love talking philosophy!


It's fun. :)


Wednesday, October 13, 2010 12:06 PM

CAUSAL


Quote:

Originally posted by kpo:
...Specific 'bad' actions, like murder, can be morally grey, or even noble and admirable to an extent, as they do not always have 'bad' motivations (eg. killing in self defence or defence of others).

It's not personal. It's just war.



I would argue that if it's in self-defense, then it's neither murder, nor morally grey. Murder is something like unlawful killing--so I wouldn't even use the word "murder" to describe killing in self-defense.

As to motivations, I'm not sure that I can agree that an act's moral value is completely attached to the motivation behind it. Suppose Bob believes in an extreme form of corporal punishment that he uses in order to train his children to behave. His motivation, I would argue, is a good one: to train his children to behave. But I would argue that he is at best misguided in that project, if not morally blameworthy. So although his motivations were good, his means were morally blameworthy.

After all, you know what they say about the road to hell and good intentions!


Wednesday, October 13, 2010 2:14 PM

MAL4PREZ


Quote:

Originally posted by kpo:
Yes, but don't think about specific actions, rather what motivates them. Actions motivated by selfish greed or callous cruelty will always be morally ugly. Actions motivated purely by love or compassion will always be morally beautiful.


I believe the exact opposite. Very few people--very, very few--do evil acts out of a conscious wish to do evil. And yet evil shit is going down constantly. How many child molesters will look you in the eye and tell you they did it for love? You may think they're full of shit, but if they truly believe it was *love*, does that make it a beautiful act?

The majority of bad acts have good intentions behind them. The human brain is a fucking mess, and selfish motivations easily hide behind denial: "I do this for your own good! I take away your rights because you'll be better off! I hurt you because I need to protect those other people over there..."

In the end, it's the act that matters. Did you harm innocents or didn't you? That's the baseline to me.

-----------------------------------------------
hmm-burble-blah, blah-blah-blah, take a left


Wednesday, October 13, 2010 4:19 PM

CYBERSNARK


Perhaps the Wiccan Rede has it right: "an it harm none, do as ye will."

Quote:

Originally posted by Mendur:
Quote:

Originally posted by Cybersnark:
I'm ambivalent on AI; for every human-killing monstrosity, there's a WALL-E, an R2-D2, a Tachikoma, an Autobot, a Doc Hologram, or an Andromeda Ascendant. Machines I'd happily trust with my life.


Don't know the others well enough to guess.


The Tachikoma are heavily-armed AI-controlled tank-robots in Ghost in the Shell, and are among the most intelligent AIs in the GitS universe --they're adorable.





Doc Hologram is everyone's favourite walking, snarking computer program.



And of course, the Autobots gave us Optimus Prime.




Thursday, October 14, 2010 8:20 AM

KPO



Quote:

In the end, it's the act that matters. Did you harm innocents or didn't you? That's the baseline to me.


Yep, I don't think we disagree - only in terminology. As a pragmatist and a humanist my ultimate interest is in protecting and increasing the sum of human happiness. So I don't forgive actions that cause harm; I just believe stupidity often causes more harm than 'evil'.

Ideology is a big cause of human suffering in my view, and is a special self-indulgent type of stupidity. You are determined to see that the world works a certain way, and no facts or horror can shake you from it. For example, I wouldn't call Chairman Mao more 'evil' than Hitler, or Stalin, or Saddam, or Idi Amin - yet he killed more people: 45 million in just 4 years during his 'Great Leap Forward' (and an estimated 70 million in total).

Similarly with George W Bush and the Iraq invasion: his main crimes in my view were complacency, ignorance, arrogance (or lack of humility) - and just, general stupidity. Probably some selfish interest in there as well with respect to oil, etc., but the important point I think is that (to my mind at least) he wanted to do right by the Iraqis; he wanted to create a stable, prosperous democracy and oil-producing friend to the U.S. So it doesn't make sense in my mind to call him 'evil' (or Tony Blair for that matter). Maybe it's emotionally satisfying to - but the world doesn't work that way in my eyes. As I say, stupidity, carelessness and arrogance are capable of causing massive human suffering (especially in someone in a position of power).

Quote:

In the end, it's the act that matters. Did you harm innocents or didn't you? That's the baseline to me.

Going back to this: the main flaw in this, I would say, is simply that you can harm innocents by accident. Either through carelessness, recklessness - or just pure accident. A man flying a single-seater airplane suffers a stroke and crashes into a skyscraper. Or into a nuclear power plant, causing a meltdown (though my Dad says this wouldn't happen). It doesn't make sense in my mind to call such actions 'evil', just because they harm innocents.

Quote:

You may think they're full of shit, but if they truly believe it was *love*, does that make it a beautiful act?

It's not so much about what the person 'believes' :-/ I don't think such acts can be motivated by love, do you? Lust springs to mind. Perhaps they have some loving or caring feelings towards the child, but molesting the child is serving their own lust. So either they selfishly put their caring for the child's interest to one side, or else they convince themselves that the experience will be 'good for the child, because it comes from love', or something like that - which is a kind of ideology, and so again a kind of selfishness... Usually (or often) an ideology has 'making a better world' as its goal - this ideology just has 'rationalising sex with a child as being okay' as a goal. So the motivations of the child molester are more purely selfish than those of the Operative, if that makes sense. But the main point is that the motivation of the child molester is selfishness through and through, any way you look at it.

I apologise for the length and heavy nature of the post...


Thursday, October 14, 2010 8:41 AM

KPO



Quote:

I would argue that if it's in self-defense, then it's neither murder, nor morally grey. Murder is something like unlawful killing--so I wouldn't even use the word "murder" to describe killing in self-defense.


Apart from a pre-emptive murder perhaps... but otherwise yeah, I agree.

Quote:

As to motivations, I'm not sure that I can agree that an act's moral value is completely attached to the motivation behind it. Suppose Bob believes in an extreme form of corporal punishment that he uses in order to train his children to behave. His motivation, I would argue, is a good one: to train his children to behave. But I would argue that he is at best misguided in that project, if not morally blameworthy. So although his motivations were good, his means were morally blameworthy.

After all, you know what they say about the road to hell and good intentions!


Following on from what I said to Mal4prez, I view this kind of rigid belief as a form of 'ideology' (maybe I need a better term). This guy practicing an extreme form of corporal punishment is motivated by his own quite extreme ideology and worldview; and I would suggest that he must be getting a form of gratification from that. He's approached the question of how to raise his children from a selfish place like, "I know how kids should be raised, I'll show the world..." This kind of douchish standpoint, basically. Rather than simply, "I love my kids and am going to try and do the best for them that I can". Am I being unfair to Bob?


Thursday, October 14, 2010 6:37 PM

BORIS


My hair... it's never right. Never does what I tell it, only holds a style for about 5 minutes and then goes back to just being unruly and obnoxious. I use extra-strong claw grips to imprison and intimidate it, and it mocks me by eating about 2 a month and attempting to take over my face. That's just wrong! Cutting it doesn't help, as it grows so damn fast.

Silliness aside... Bodies and faces should always be imperfect (i.e. wrong); it shows character. There should never be a "right" way to look. Nothing I hate more than turning up to a "do" and seeing a bunch of automatons all looking the same, wearing the same styles, same "in" trends... blegh! My clothes are either practical with a tweak here or there for individuality, or fully express who I am. When I dress up I don't want to look like something prescribed by the media or societal norms. I also want any make-up I wear to add to the effect rather than act as a mask or make me look like someone else. I truly feel sad when I see women who feel they need to morph their whole face to be accepted.


Rose S

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, October 15, 2010 2:11 AM

AURAPTOR

America loves a winner!


Yes. Mimes may be done away with by any means necessary.



"The modern definition of 'racist' is someone who is winning an argument with a liberal."


NOTIFY: N   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Friday, October 29, 2010 12:42 PM

CAUSAL


Quote:

Originally posted by kpo:
Quote:

As to motivations, I'm not sure that I can agree that an act's moral value is completely attached to the motivation behind it. Suppose Bob believes in an extreme form of corporal punishment that he uses in order to train his children to behave. His motivation, I would argue, is a good one: to train his children to behave. But I would argue that he is at best misguided in that project, if not morally blameworthy. So although his motivations were good, his means were morally blameworthy.

After all, you know what they say about the road to hell and good intentions!


Following on from what I said to Mal4prez, I view this kind of rigid belief as a form of 'ideology' (maybe I need a better term). This guy practicing an extreme form of corporal punishment is motivated by his own quite extreme ideology and worldview, and I would suggest that he must be getting a form of gratification from that. He's approached the question of how to raise his children from a selfish place: "I know how kids should be raised, I'll show the world..." This kind of douchish standpoint, basically, rather than simply, "I love my kids and am going to try and do the best for them that I can". Am I being unfair to Bob?



For the sake of argument, let's suppose you are being unfair to Bob. For the purposes of the discussion, let's say Bob dearly loves his children and knows that part of their success as adult people depends on their ability to get along in a civilized way with others (I don't think that anybody would disagree that this is a worthwhile desire for him to have). So he knows that now, in their childhood, is when he has to teach them that value. But because he had an abusive father, all he knows in terms of teaching his kids is strict corporal punishment that most would consider abusive. There's no misguided ideology at work, nor secret enjoyment--Bob just doesn't know any other way!

I would argue that while Bob has good intentions, that isn't all that counts in terms of how we evaluate his action. We should also think about the means he is employing to bring about what I think we can all agree is a good end: kids who grow up able to navigate civil society skillfully. Yet his means are fatally flawed, I would argue.

Incidentally, John Stuart Mill had the opposite problem to yours: he argued that the only thing that counts towards an action's rightness or wrongness is the outcome of that action in terms of happiness. If an action promoted happiness, it was right, period. Yet he also wanted motivation to play a part: a happiness-promoting action done from a bad disposition (saving a drowning man because he owes you money, for instance, so your disposition is greed) might be right, but it's not Really Right--it's just accidentally right. Similarly, Bob is accidentally wrong: he has a good disposition (to be a good parent), but he doesn't know that his action is actually pain-promoting, hence his action is not wrong on purpose, but accidentally wrong.

All this is to say that I think we need to do more than just look at motivations. As they say, the road to hell is paved with good intentions.



________________________________________________________________________

- Grand High Poobah of the Mythical Land of Iowa, and Keeper of State Secrets
- Captain, FFF.net Grammar Police
- Vote JonnyQuest/Causal, for Benevolent Co-Dictator of Earth; together, toward a brighter tomorrow!

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, November 7, 2010 5:16 AM

KPO

Sometimes you own the libs. Sometimes, the libs own you.


Quote:

But because he had an abusive father, all he knows in terms of teaching his kids is strict corporal punishment that most would consider abusive. There's no misguided ideology at work, nor secret enjoyment--Bob just doesn't know any other way!

Yep, I actually thought of the alternative scenario where Bob has inherited some quite extreme religious beliefs, which dictate the way he brings up his children. In both these cases there is an inherited 'stupidity' (I would call it) which has a bad effect on how Bob raises his children, but for which he is not primarily culpable.

Quote:

Incidentally, John Stuart Mill had the opposite problem that you have: he argued that the only thing that counts towards an action's rightness or wrongness was the outcome of that action in terms of happiness.

I don't disagree with this. I don't use the words 'right' and 'wrong' much, but if they are to mean anything, it makes sense to define them in terms of human happiness. But then it doesn't make sense to me to describe 'right' or 'wrong' actions as 'good' or 'evil'. There was a discussion thread I read online a few months ago where people were choosing 3 historical figures they could assassinate, and a Jewish guy made a point of saying he wouldn't assassinate Hitler because, although he had inflicted so much suffering on Jews, his actions had also ended centuries of previous persecution and suffering in Europe and led to the establishment of a Jewish homeland, and therefore Jewish happiness. Now I'm not claiming that the happiness enjoyed by modern Jewish people having a homeland definitely exceeds the suffering of the Holocaust (and some people might argue that the Israeli state is an ongoing poisonous legacy of that suffering), but there is an argument there, strangely, that Hitler's actions were for the best - and therefore 'right' ('accidentally right', you might say). But can anyone argue that Hitler's hate-filled actions in trying to exterminate millions of Jews were anything other than evil? That doesn't make much sense.

I'll distill my position a bit more then:

1. Evil describes an action that is motivated purely by morally 'ugly' motivations, ignoring concerns for human happiness (this last clause is new, but important, I feel).

2. Stupidity often causes more suffering than 'evil'.

3. Stupidity can sometimes be self-indulgent or self-gratifying: like clinging rigidly to a foolish ideology in spite of the suffering it causes. This stupidity is more morally blameworthy than the ordinary kind (Homer Simpson causing a meltdown at the nuclear plant).


It's not personal. It's just war.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  
