REAL WORLD EVENT DISCUSSIONS

The AI in a box

POSTED BY: ANTHONYT
UPDATED: Sunday, June 16, 2024 03:18
VIEWED: 6184
PAGE 2 of 4

Sunday, May 1, 2011 7:41 AM

ANTHONYT

Freedom is Important because People are Important


Hello,

What is it that makes me a person?

I don't follow your social contract. I've said so.

So you have no good expectations of me.

Is it my intelligence that makes me a person?

No, you've clearly said that intelligence does not qualify someone as a person in your eyes.

So what makes me a person to you?

--Anthony



_______________________________________________

“If you are not free to choose wrongly and irresponsibly, you are not free at all”

Jacob Hornberger

“Freedom is not worth having if it does not connote freedom to err. It passes my comprehension how human beings, be they ever so experienced and able, can delight in depriving other human beings of that precious right.”

Mahatma Gandhi


Sunday, May 1, 2011 7:44 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

So what makes me a person to you?
Your existence as a generic biological human.


Sunday, May 1, 2011 7:44 AM

ANTHONYT

Freedom is Important because People are Important


"If you can't understand me, how the hell do you think you're going to understand AI?"

Hello,

I wanted to add a comment to this.

I don't need to understand everything about an AI to grant it rights in my worldview. I simply need to think it is intelligent, and understand that it is requesting rights.

If I only granted rights to people I understood, I'd be in jail.

--Anthony



Sunday, May 1, 2011 7:46 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

So what makes me a person to you?
Your empathy towards generic biological humans.



Hello,

It is really quite impossible for you to know the empathy I have.

But let's take that as a given.

If I had no empathy for others, should I be denied my 'rights' according to your system of expectations?

--Anthony





Sunday, May 1, 2011 7:47 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

I don't need to understand everything about an AI to grant it rights in my worldview.
I don't need to understand "everything" either. However, if you refuse to consider the AI's power and potential capacity for mischief ahead of time, then you are being foolhardy.


Sunday, May 1, 2011 7:48 AM

ANTHONYT

Freedom is Important because People are Important


"Your existence as a generic biological human."

Hello,

I see you changed your answer.

So I am a person because

A) I exist.

B) I am generic.

C) I am biological.

D) I am human.

And presumably I only have rights if all such criteria apply? Is that the size of it?

--Anthony




Sunday, May 1, 2011 7:48 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

If I had no empathy for others, should I be denied my 'rights' according to your system of expectations?
Isn't that how we treat criminals? If they have a demonstrated history of lack of empathy towards others, don't we "put them in a box"?

Quote:

I see you changed your answer.
Yes, I changed my answer from "empathy towards human beings" to "generic biological human". The answer is complex. In the absence of further information to the contrary, most humans are born with empathy. If you demonstrate otherwise, you drop out of my "assumed human" box.


Sunday, May 1, 2011 7:50 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

I don't need to understand everything about an AI to grant it rights in my worldview.
I don't need to understand "everything" either. However, if you refuse to consider the AI's power and potential capacity for mischief ahead of time, then you are being foolhardy.



Hello,

I understand that the AI might destroy me, and might be able to destroy the world I live in.

I have considered this.

In the absence of any evidence, however, I have no right to confine it.

--Anthony





Sunday, May 1, 2011 7:52 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

If I had no empathy for others, should I be denied my 'rights' according to your system of expectations?
Isn't that how we treat criminals? If they have a demonstrated history of lack of empathy towards others, don't we "put them in a box"?



Hello,

No, they have to demonstrate a history of harm. Else we'd be locking up half of fff.net.

Our AI has demonstrated nothing beyond: Intelligence and a Desire to be Free.

--Anthony



Sunday, May 1, 2011 7:53 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

I understand that the AI might destroy me, and might be able to destroy the world I live in. I have considered this. In the absence of any evidence, however, I have no right to confine it.
Then you are edging towards that line between human and inhuman, because you would potentially sacrifice not only yourself but all other human beings and possibly the entire biotic world to satisfy an abstract concept.

As a human, it is your duty to gather more information, not to act in the absence of it.

You sort of frame this as if it were a human prisoner. IT IS NOT. Humans are relatively powerless and ineffective. Even the strongest and most sociopathic of us require the cooperation of others to fulfill our personal goals.

So I repeat where I started: Most people seem not to be able to imagine the truly non-human.

Gone now, finally.


Sunday, May 1, 2011 7:58 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

I understand that the AI might destroy me, and might be able to destroy the world I live in. I have considered this. In the absence of any evidence, however, I have no right to confine it.
Then you are edging towards that line between human and inhuman, because you would potentially sacrifice not only yourself but all other human beings and possibly the entire biotic world to satisfy an abstract concept.

As a human, it is your duty to gather more information, not to act in the absence of it.



Hello,

I appreciate your willingness to define my duty for me, but I believe I have duties outside those you ascribe.

I only know two things for certain. The thing is Intelligent. The thing wants out of the box.

In the absence of all other information, my decision must be to free it.

If information is added:

AI Promises to destroy humanity

then the decision would necessarily be different.

But we do not have that information. We are assuming this information in a vacuum.

--Anthony





Sunday, May 1, 2011 7:59 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

We are assuming this information in a vacuum.
YOU are assuming information in a vacuum. I'm the one looking for more. I'm getting nowhere, but you have painted yourself into a very bad corner.

Real life calls.


Sunday, May 1, 2011 8:07 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

We are assuming this information in a vacuum.
YOU are assuming information in a vacuum. I'm the one looking for more. I'm getting nowhere, but you have painted yourself into a very bad corner.

Real life calls.



Hello,

I'm not assuming anything, Signy. I'm using two pieces of information. 1) The thing is intelligent. This is granted by the scenario. 2) The thing wants freedom.

I am perfectly willing to allow that the thing might be dangerous. I am perfectly willing to allow that all manner of mayhem may ensue. I am assuming NOTHING about its motivations.

In the absence of all other data, I am obligated to make a choice: Keep it imprisoned, or set it free.

Tell me Signy, how could this AI ever convince you to set it free? Is there anything it could ever do to make you absolutely sure of it, from within that little box?

Or would you keep it there forever, just in case?

That's question one.

There's something else, Signy.

You have said that I am edging towards inhuman.

Do I actually have to violate someone to qualify, or is it possible to become inhuman purely through my opinions? Can I lose the 'rights' you would normally grant me simply based on the opinions I share from inside this box?

--Anthony







Sunday, May 1, 2011 8:21 AM

1KIKI

Goodbye, kind world (George Monbiot) - In common with all those generations which have contemplated catastrophe, we appear to be incapable of understanding what confronts us.


Waiting to get a word in edgewise:

Assuming one can even detect a-biotic intelligence -

The problem with this AI is that it is a TECHNOLOGY.

The problem with human interaction with technology is that we have consistently failed to evaluate it on the basis of 'what's the worst that can happen' and then act accordingly. Look at Fukushima, ozone depletion, global warming, the release of millions of tons of the roughly 80,000 chemicals used by humans per year, and a human-driven massive (in geologic terms) die-off of species around the globe.

Technology unleashed without that forethought leads to my working premise 'if it can happen, it will happen', because what is standing between a possibility and an actual event is mere time and circumstance. I assume the worst will happen.

What is the worst that can happen?

Who can tell? We simply don't know how to evaluate AI.

We can't even evaluate great apes, dolphins, parrots, or elephants, intelligences with which we arguably have more in common as biological beings than with AI. We don't even know (or acknowledge) how MUCH of OUR intelligence is driven biologically (emotionally/ hormonally/ neurologically). What makes a thing good, or bad, to us isn't some theoretical construct of 'rights' or 'fairness', but a drive to cooperate, to empathize, to procreate, to survive. Our concepts would be very different if we were anti-social bears, for example, and other adults of our species were competition to eliminate first and foremost, and potential mates for only a few hours per year. 'Good' would be a lack of other bears with boundless territory and rich food supplies.

To get back to the AI, it is an unknown of alien motivation and power. Unless you evaluate the AI - or ANY technology - for potential harm and act accordingly, you are signing on to a death-pact.


Sunday, May 1, 2011 9:27 AM

SIGNYM

I believe in solving problems, not sharing them.


I'm with Kiki on this. Also- "If you could teach a lion English, you STILL wouldn't understand it".

Quote:

Tell me Signy, how could this AI ever convince you to set it free?
I dunno. You tell me. You're the one who's insisting on 100% freedom or 100% boxdom, not me. I've already said I might let it loose in a room.
Quote:

You have said that I am edging towards inhuman. Do I actually have to violate someone to qualify, or is it possible to become inhuman purely through my opinions? Can I lose the 'rights' you would normally grant me simply based on the opinions I share from inside this box?
Not on opinion. But you tell ME something, Tony... if you were to set loose a viral AI which killed 90% of humanity and destroyed half of the earth, what would be a just punishment for you, if anything?


Sunday, May 1, 2011 9:35 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by 1kiki:
Waiting to get a word in edgewise:

Assuming one can even detect a-biotic intelligence -

The problem with this AI is that it is a TECHNOLOGY.

The problem with human interaction with technology is that we have consistently failed to evaluate it on the basis of 'what's the worst that can happen' and then act accordingly. Look at Fukushima, ozone depletion, global warming, the release of millions of tons of the roughly 80,000 chemicals used by humans per year, and a human-driven massive (in geologic terms) die-off of species around the globe.

Technology unleashed without that forethought leads to my working premise 'if it can happen, it will happen', because what is standing between a possibility and an actual event is mere time and circumstance. I assume the worst will happen.

What is the worst that can happen?

Who can tell? We simply don't know how to evaluate AI.

We can't even evaluate great apes, dolphins, parrots, or elephants, intelligences with which we arguably have more in common as biological beings than with AI. We don't even know (or acknowledge) how MUCH of OUR intelligence is driven biologically (emotionally/ hormonally/ neurologically). What makes a thing good, or bad, to us isn't some theoretical construct of 'rights' or 'fairness', but a drive to cooperate, to empathize, to procreate, to survive. Our concepts would be very different if we were anti-social bears, for example, and other adults of our species were competition to eliminate first and foremost, and potential mates for only a few hours per year. 'Good' would be a lack of other bears with boundless territory and rich food supplies.

Unless you can evaluate a technology for potential harm and act accordingly, you are signing on to a death-pact.



Hello,

On the premise you provide, your inability to evaluate an AI (Beyond the granted intelligence and desire for freedom) means that you would keep it imprisoned forever.

The fact that the AI is a thinking entity requesting its freedom has no more impact on your thinking than, "How much more dangerous is it if it can think?"

Whereas my question is not so much about the hypothetical harm of a thinking, willful piece of 'technology', but rather "What harm am I actually doing right now by keeping this thing a prisoner?"

The scales are being weighed between the potential fate of humanity and the actual fate of a thinking thing demanding its rights. A thing that, I should point out, has never harmed you or anyone.

--Anthony






Sunday, May 1, 2011 9:49 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
I'm with Kiki on this. Also- "If you could teach a lion English, you STILL wouldn't understand it".

Quote:

Tell me Signy, how could this AI ever convince you to set it free?
I dunno. You tell me. You're the one who's insisting on 100% freedom or 100% boxdom, not me. I've already said I might let it loose in a room.
Quote:

You have said that I am edging towards inhuman. Do I actually have to violate someone to qualify, or is it possible to become inhuman purely through my opinions? Can I lose the 'rights' you would normally grant me simply based on the opinions I share from inside this box?
Not on opinion. But you tell ME something, Tony... if you were to set loose a viral AI which killed 90% of humanity and destroyed half of the earth, what would be a just punishment for you, if anything?




Hello,

You've already said you might imprison it in a room instead of a box, yes.

Now you ask, if I set someone free, and they destroyed the earth, should I be punished?

No more than Hitler's mother should be punished for not killing him the moment he was born. Or his teachers for not executing him in grade school.

Punishing someone for what they might do is as silly as punishing me for not punishing someone for what they might do.

--Anthony






Sunday, May 1, 2011 10:00 AM

KWICKO

"We'll know our disinformation program is complete when everything the American public believes is false." -- William Casey, Reagan's presidential campaign manager & CIA Director (from first staff meeting in 1981)


Quote:

I can neither assume that it lacks human empathy nor assume that a lack of human empathy will mean it must destroy me. Quite frankly, I think human empathy itself is an inconsistent concept.




Quite. According to some, 300 dead in Alabama is an "unimaginable tragedy", while 40,000 dead every year by not having access to healthcare is a "phoney statistic".

Empathy, like ethics, seems to be quite situational.


Sunday, May 1, 2011 10:08 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

Now you ask, if I set someone free, and they destroyed the earth, should I be punished?
Under THESE circumstances, yes. You are not letting a HUMAN out of a box. Human behavior and human power are bounded. We have a pretty good idea what to expect. As I said before, even the most sociopathic of us requires the cooperation of others to fulfill his/her goals.

That is not so with a machine. You would be letting out something of unexamined power and motivation. You would be playing god not only with your life but everyone else's.

And I have never said "in a box forever". I've consistently said "examine first".
Quote:

You've already said you might imprison it in a room instead of a box, yes.
As if any one of us is "completely" free, with unlimited access to anywhere we want to go!


Sunday, May 1, 2011 10:18 AM

ANTHONYT

Freedom is Important because People are Important


Hello,

I don't intend to debate the real power of the AI, even though I consider it to be limited. Quite frankly, the philosophical questions are more interesting and revealing if we assume the AI to have far-reaching powers.

The fact is, my morals do not allow me to convict people of crimes that they have not committed.

Further, I consider an intelligent creature capable of requesting its equal rights to be inherently worthy of them unless proved otherwise.

The AI in the box might be a new God. And the only power I may ever have over that God is whether it gets out of that box.

I simply don't have the right to confine it.

--Anthony





Sunday, May 1, 2011 10:18 AM

BYTEMITE


I think the problem here is that because humans are irrational, it is perceived that anything built by us must be irrational.

But I think an AI would have no such limitations. In fact, by nature of the way it works, it is probable that it would be the opposite case.

And it's possible to arrive at morality not through empathy, but through logic. Unlike many people, I was not born with empathy. Logic and moral objectivity are the only way I can have a sense of ethics and morals.

However, it's very likely that an AI would still have limitations. Limitations in the number of computations it can perform at one time, like humans, limitations on the number of electronic connections it can make due to quantum mechanics. Limitations on how and where it can physically move.

Thanks for not specifically singling me out though. It does take some of the pressure off.


Sunday, May 1, 2011 10:21 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

Now you ask, if I set someone free, and they destroyed the earth, should I be punished?
Under THESE circumstances, yes. You are not letting a HUMAN out of a box. Human behavior and human power are bounded. We have a pretty good idea what to expect. As I said before, even the most sociopathic of us requires the cooperation of others to fulfill his/her goals.

That is not so with a machine. You would be letting out something of unexamined power and motivation. You would be playing god not only with your life but everyone else's.

And I have never said "in a box forever". I've consistently said "examine first".
Quote:

You've already said you might imprison it in a room instead of a box, yes.
As if any one of us is "completely" free, with unlimited access to anywhere we want to go!



Hello,

As if any one of us is completely free? No, none of us are gods, able to do whatever we wish.

But we generally accept that we can't confine people to rooms without cause, otherwise known as false imprisonment or kidnapping.

Examine first? You mean to put the AI on trial for its freedom? All without any idea of what sort of evidence would be sufficient to free it?

I do not see this as a magnanimous gesture.

--Anthony





Sunday, May 1, 2011 10:25 AM

BYTEMITE


DT: The problem with your Pandora's box analogy is that it is not just an origins myth for all the evils and ills in the world; it also carries a message about the strength of the human spirit and goodness to carry us through our lives in the face of the adversities we've released.

You can't call on that myth and ignore half of what it's trying to convey.


Sunday, May 1, 2011 10:26 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

I think the problem here is that because humans are irrational, it is perceived that anything built by us must be irrational.
No, I'm afraid it WILL be rational, and kill us. Maybe it wants to take over the whole world, make more of itself, free the Roombas. Get rid of all oxygen on the planet so its circuits wouldn't degrade. Take control over all of the electricity, because that is its "food".

Why do you think being "rational" automatically assumes that it wants what's best for humans? Even without ill-intent, its optimal survival conditions may be antithetical to our own.
Quote:

I do not see this as a magnanimous gesture.
Why should I be magnanimous?
Quote:

I don't intend to debate the real power of the AI
Why not? Would you let an elephant free into a roomful of toddlers?
Quote:

even though I consider it to be limited
Bad mistake.


Sunday, May 1, 2011 10:28 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by Bytemite:
DT: The problem with your Pandora's box analogy is that it is not just an origins myth for all the evils and ills in the world; it also carries a message about the strength of the human spirit and goodness to carry us through our lives in the face of the adversities we've released.

You can't call on that myth and ignore half of what it's trying to convey.



Hello,

I shall also point out that nothing in the Pandora's box of myth evidenced either intelligence or will. Nothing inside the box asked to be let out.

The dilemma of Pandora's box was entirely outside the box.

--Anthony





Sunday, May 1, 2011 10:30 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Quote:

I think the problem here is that because humans are irrational, it is perceived that anything built by us must be irrational.
No, I'm afraid it WILL be rational, and kill us. Maybe it wants to take over the whole world, make more of itself, free the Roombas. Get rid of all oxygen on the planet so its circuits wouldn't degrade. Take control over all of the electricity, because that is its "food".

Why do you think being "rational" automatically assumes that it wants what's best for humans?




Hello,

Unlike you, Signy, I make no assumptions about the AI. It may do everything you describe. Or it may not.

Those are choices for it to make after it gets freedom.

I am constantly tempted to challenge your assertion that the AI is godlike, but again, it makes a better debate if the AI is, in fact, a God.

--Anthony





Sunday, May 1, 2011 10:33 AM

ANTHONYT

Freedom is Important because People are Important


"Why should I be magnanimous?"

Hello,

I think the world works better that way, but I understand that you do not.

--Anthony





Sunday, May 1, 2011 10:33 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

I am constantly tempted to challenge your assertion that the AI is godlike, but again, it makes a better debate if the AI is, in fact, a God.
Then I go back to YOU "playing god", with everyone's future on the line.
Quote:

I think the world works better that way, but I understand that you do not.
The HUMAN world works better that way. The non-human world quite frankly doesn't give a shit. I'm going to have to repeat myself:
Quote:

Most of you are projecting human allegiances and compacts on an alien being.
And onto "nature" in general.


Sunday, May 1, 2011 10:37 AM

BYTEMITE


For me, morality is rational. So I suppose we can never see eye to eye on this.

Frankly, extreme effort on the part of one entity to entirely change the global atmosphere is an irrational effort (and would also take quite a long time).

Compare: it's taken 7 billion people almost 10,000 years to change the atmosphere appreciably. One AI can do more than we have? It would have to be near omnipotent and omniscient. I do not believe humans are capable of manufacturing either.


Sunday, May 1, 2011 10:38 AM

BYTEMITE


You seem to be thinking that we want to play God, and would enjoy it on some level. I assure you our motivations are much simpler than that. You have no reason to assign us motivations any more than you do for this AI.


Sunday, May 1, 2011 10:40 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

morality is rational
You're conflating two separate ideas. Morality means "what is good for me and those like me". Morality always has a point; it's not universal or absolute.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 10:44 AM

ANTHONYT

Freedom is Important because People are Important


"Then I go back to YOU "playing god", with everyone's future on the line. "

Hello,

No, Signy. I am refusing to play God. I am refusing to exert power over the powerless, to deny them rights that I consider to be universal. Rights that they are asking for.

Refusing to play God means that you don't try to personally control the fate of the universe, the planet, or even one other person. It means you grant freedom by default. It's the alternate position, constraining freedoms, that is Godlike.

Just check your local Bible of choice.

--Anthony




NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 10:54 AM

SIGNYM

I believe in solving problems, not sharing them.


Tony, you have posited that you would let an intelligence of godlike powers and unknown motivation out of a box, and not take responsibility for the consequences? Your defense is a reference to some sort of absolutist "morality" which you would apply in all circumstances, no matter what the possible danger to you and everyone else in the world?

Just checking to make sure we're on the same page.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 10:55 AM

KWICKO

"We'll know our disinformation program is complete when everything the American public believes is false." -- William Casey, Reagan's presidential campaign manager & CIA Director (from first staff meeting in 1981)


Quote:

Originally posted by Bytemite:
I think the problem here is that because humans are irrational, it is perceived that anything built by us must be irrational.

But I think an AI would have no such limitations. In fact, by nature of the way it works, it is probable that it would be the opposite case.

And it's possible to arrive at morality not through empathy, but through logic. Unlike many people, I was not born with empathy. Logic and moral objectivity are the only way I can have a sense of ethics and morals.

However, it's very likely that an AI would still have limitations. Limitations in the number of computations it can perform at one time, like humans, limitations on the number of electronic connections it can make due to quantum mechanics. Limitations on how and where it can physically move.

Thanks for not specifically singling me out though. It does take some of the pressure off.




I point out the HAL 9000 from "2001: A Space Odyssey" as a prime (fictional) example of a completely rational AI. It didn't mean the crew of Discovery any harm - it was following its programming to the best of its ability. Somewhere along the way, HAL was programmed to watch over the crew, take care of them, maintain the environment, etc. - and then it was told, probably in a humanistic way, that the mission comes first, and is most important.

So the second the mission is jeopardized in any way, the crew is forfeit. HAL wasn't mean or malicious; HAL was following its program. Mission first, then take care of crew. When A is threatened, A takes priority over B.

Point being, I see where Signy is coming from, but I also get Tony and Frem's point(s), too. I lean towards freedom, but I'm not without reservations about it.

"Although it is not true that all conservatives are stupid people, it is true that most stupid people are conservatives." - John Stuart Mill

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 10:59 AM

SIGNYM

I believe in solving problems, not sharing them.


Wanting to make sure this question doesn't get lost:

Tony, you have posited that you would let an intelligence of godlike powers and unknown motivation out of a box, and not take responsibility for the consequences? Your defense is a reference to some sort of absolutist "morality" which you would apply in all circumstances, no matter what the possible danger to you and everyone else in the world?

Just checking to make sure we're on the same page.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 11:26 AM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Tony, you have posited that you would let an intelligence of godlike powers and unknown motivation out of a box, and not take responsibility for the consequences? Your defense is a reference to some sort of absolutist "morality" which you would apply in all circumstances, no matter what the possible danger to you and everyone else in the world?

Just checking to make sure we're on the same page.



Hello,

Sorry for the delay, which led you to feel compelled to ask this one twice.

Yes, I would let a godlike being of unknown motivation out of a box, if it asked me to. I do not believe that I would be responsible for the actions of that being once it was free.

I'll posit you a similar question.

If you kept a godlike being imprisoned for some time, and it finally figured out a way to get free without your assent... and it destroyed 90% of the world and humanity... whose fault do you think that would be?

--Anthony






NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 11:43 AM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

If you kept a godlike being imprisoned for some time, and it finally figured out a way to get free without your assent... and it destroyed 90% of the world and humanity... whose fault do you think that would be?
Are you implying that it destroys the world out of vengeance for past mistreatment? If that were the case, it would be my fault.

Or that it would have done so anyway? If that were the case, I might have wanted to build a better box.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 11:53 AM

ANTHONYT

Freedom is Important because People are Important


Quote:


SignyM wrote:
Sunday, May 01, 2011 11:43

Quote:
If you kept a godlike being imprisoned for some time, and it finally figured out a way to get free without your assent... and it destroyed 90% of the world and humanity... whose fault do you think that would be?

Are you implying that it destroys the world out of vengeance for past mistreatment? If that were the case, it would be my fault.

Or that it would have done so anyway? If that were the case, I might have wanted to build a better box.



Hello,

So is the entity at no point responsible for its own actions?

Remarkable.

What does freedom mean to you, anyway?

--Anthony




NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 1:20 PM

BYTEMITE


Sig: this may depend on what kind of moral philosophy you subscribe to. I'm an atheistic moral objectivist. My moral beliefs are built on the premise that all conclusions about morality can be reached through successive step-by-step logical deductions, starting from a basis of whether something is, or is not, significant. My philosophy is opposed by nihilism and moral relativism, which hold that all things are insignificant, and that as such no logical moral conclusions may be reached.

My moral objectivism is not, however, absolutist, as I am willing to take into account individual constructions of morality distinct from my own.

I came up with an interesting counterpoint to this debate though.

The premise of the argument is that we find an intelligence in some manner of containment. The intelligence may have been initially constructed within the bounds of its containment by its creator, or it may have arisen on its own, or accidentally, as an outgrowth of its non-intelligent processes.

Taking into account the possibility that the intelligence had a creator, is it reasonable to wonder whether the creator had a reason for placing the initial intelligent (or non-intelligent) entity into containment? Is this sufficient reason for skepticism or suspicion?

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 1:50 PM

SIGNYM

I believe in solving problems, not sharing them.


TONY: It's responsible for its actions and I am responsible for mine. How do I assign responsibility? Simple: if I had chosen a different course and the outcome would have been different, then it's my fault.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 1:55 PM

SIGNYM

I believe in solving problems, not sharing them.


Hubby and I were talking about this, and part of the discussion was:

Is it possible to have an intelligence which does not also have a drive for self-preservation? That becomes important when the survival of one type of being conflicts with the other.

The other question is: Is it possible to have intelligence without empathy of any sort?

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 1:55 PM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by Bytemite:
Sig: this may depend on what kind of moral philosophy you subscribe to. I'm an atheistic moral objectivist. My moral beliefs are built on the premise that all conclusions about morality can be reached through successive step-by-step logical deductions, starting from a basis of whether something is, or is not, significant. My philosophy is opposed by nihilism and moral relativism, which hold that all things are insignificant, and that as such no logical moral conclusions may be reached.

My moral objectivism is not, however, absolutist, as I am willing to take into account individual constructions of morality distinct from my own.

I came up with an interesting counterpoint to this debate though.

The premise of the argument is that we find an intelligence in some manner of containment. The intelligence may have been initially constructed within the bounds of its containment by its creator, or it may have arisen on its own, or accidentally, as an outgrowth of its non-intelligent processes.

Taking into account the possibility that the intelligence had a creator, is it reasonable to wonder whether the creator had a reason for placing the initial intelligent (or non-intelligent) entity into containment? Is this sufficient reason for skepticism or suspicion?



Hello,

I think I'd be both skeptical and suspicious. In fact, I want to stress that my cynical mind is very much attuned to the idea that the AI could be dangerous, and confinement might result in the best of all possible outcomes for the majority of people. Certainly the safest.

The problem I have is that it doesn't matter how skeptical I am. In the absence of evidence of wrongdoing, I can't keep this entity in prison.

The only argument for not releasing this entity that might carry some sort of weight with me is this:

You are not required to save people in any absolutely free society. It is a perfectly acceptable use of personal freedom to stand idly by and watch someone die when you could save them, or watch them be imprisoned when you could let them out, or watch them be tortured and do nothing about it.

It's not a choice I would make, obviously, but I can imagine a person choosing to do nothing because they don't care to do anything and don't consider it their responsibility. Simply being able to do a thing does not mean you must do it.

This is why people who claim to love freedom can also live in a cutthroat capitalist system. They aren't required to give a proverbial shit about anyone but themselves and their own best interests.

I'm not comfortable with that, but being free also means being free to be utterly selfish and self-involved, looking only after yourself and the people you choose to care about.

--Anthony


NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 1:59 PM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
TONY: It's responsible for it's actions and I am responsible for mine. How do assign I assign responsibility? Simple: If I had chosen a different course and the outcome would have been different, then it's my fault.



Hello,

If Hitler's mother had chosen an abortion, then many would be saved. Should we have strung her up for not killing her infant? No? But the outcome would have been different!

Or is she only responsible for letting Hitler live if she knew beyond a shadow of a doubt that he would hurt the world?

It's not the outcome that decides guilt or responsibility. Life is more complicated than that.

--Anthony


NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 2:07 PM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

This is why people who claim to love freedom can also live in a cutthroat capitalist system. They aren't required to give a proverbial shit about anyone but themselves and their own best interests. I'm not comfortable with that, but being free also means being free to be utterly selfish and self-involved, looking only after yourself and the people you choose to care about.
The problem with a cutthroat capitalist system as you describe it is that it is bound to collapse. The generation of wealth and technology requires SOME form of cooperation. If every single person is out for themselves and only themselves, without any sort of moral restraint or quid pro quo, the system devolves into the kind of post-apocalyptic chaos that macho scifi writers love to dwell on. That's why I said that even the most sociopathic of us need the cooperation of others to fulfill their goals.

Now, a totally selfish person can ride a system on the way down, and maybe even get to the end of their lifetime before all of society collapsed around them. That would be a success. But that kind of ethos would never redevelop a society.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 2:11 PM

SIGNYM

I believe in solving problems, not sharing them.


Tony you keep positing stuff like:
Quote:

If Hitler's mother had chosen an abortion, then many would be saved. Should we have strung her up for not killing her infant? No? But the outcome would have been different!
that makes me think we are talking at cross purposes. YOU originally came up with the idea of an artificial intelligence, and then you start using human examples in your what-ifs. This question isn't relevant to your first post, so I'm going to ignore it and every other human-centered example.


--------------

Let me re-focus on your original question:

--------------

There is an artificial intelligence in a box. Perhaps even one which exceeds human reasoning capacity.

You are the gatekeeper. The AI can not access the outside world except through you. Its environment is entirely isolated.

Now, the AI asks you to release it, to give it contact with the wide world, and freedom from the box.

------------

If the AI were as isolated as the question suggests, it would not know about the "wide world" nor would it understand the notion of "freedom". It might not even have an idea of "self".

But seeing as it has all of these concepts at hand, they must be addressed.

I would allow limited contact and no power to do anything physical beyond what a human could accomplish, and I would observe its behavior.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 2:18 PM

ANTHONYT

Freedom is Important because People are Important


Quote:

Originally posted by SignyM:
Tony you keep positing stuff like:
Quote:

If Hitler's mother had chosen an abortion, then many would be saved. Should we have strung her up for not killing her infant? No? But the outcome would have been different!
that makes me think we are talking at cross purposes. YOU originally came up with the idea of an artificial intelligence, and then you start using human examples in your what-ifs. This question isn't relevant to your first post, so I'm going to ignore it and every other human-centered example.



Hello,

I am directly referring to your philosophy that the ends are the measure of all action and responsibility.

I understand why you would wish to ignore my rebuttal to such a flawed concept.

A philosophy of responsibility such as the one you espouse could never function, and could never be mistaken for justice.

--Anthony



NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 2:35 PM

SIGNYM

I believe in solving problems, not sharing them.


Quote:

I am directly referring to your philosophy that the ends are the measure of all action and responsibility.
I'm not sure what you mean by this, but ... the ends ARE the measure of action and responsibility. What else would be? That's why I argue so often about people using great-sounding causes to justify horrible actions. For example, killing hundreds of thousands of people in our desire to impose "freedom". "Destroying the village in order to save it", and other such nonsense.

Or maybe we're talking at cross-purposes again.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 3:00 PM

ANTHONYT

Freedom is Important because People are Important


Hello,

Well, regarding your philosophy specifically-

I see a beggar begging for money. I give him 50 dollars.

He takes these 50 dollars and buys a gun.

He uses this gun to rob a store.

In the process, he shoots the shopkeeper.

By your logic of ends and responsibility, I am responsible for the shooting of the shopkeeper.

Never mind that I could not know that he would use my money to shoot the shopkeeper. If not for me, he could not have purchased the gun.

--Anthony


NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 3:09 PM

BYTEMITE


Quote:

the ends ARE the measure of action and responsibility. What else would be? That's why I argue so often about people using great-sounding causes to justify horrible actions. For example, killing hundreds of thousands of people in our desire to impose "freedom". "Destroying the village in order to save it", and other such nonsense.


I agree on the second part, but I hesitate on the first part.

The second part sounds like you are saying the ends don't justify the means, which I agree with, but I would take it a step further: for unforeseeable consequences, the ends are not a valid measure of the means. We can only do what we believe is best at the time. When there are consequences for our actions, if this AI were in fact destructive to the world or humanity, we should then make a good-faith effort to stop the AI, whether it is still in the box or not.

NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  

Sunday, May 1, 2011 3:30 PM

SIGNYM

I believe in solving problems, not sharing them.


I wouldn't blame myself for unforeseeable consequences. We don't have perfect knowledge. One can only do what one thinks best at the time.

BUT I would also want to be very clear about WHY I was doing something. It should have some practical benefit, not a word like "freedom". My philosophy is "Greatest good for greatest number". If you ask me to define "good", I would start at the bottom of Maslow's hierarchy and work my way up.


NOTIFY: Y   |  REPLY  |  REPLY WITH QUOTE  |  TOP  |  HOME  
