General Discussions

Cloned Humans – CheeseStorm

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
What is your opinion on cloning and how do you think cloned humans should be treated?
Brandon

Member

Posts: 594
From: Kansas City, Mo, USA
Registered: 02-02-2004

I don't see much difference between cloned humans and humans born the way God intended. All the "cloner" is doing is taking what God created and putting it together in a different way. In other words, the "cloner" hasn't actually created anything.

Get a pile of dust and form it into a human, then I'll be impressed.

------------------
"Before Abraham was... I Am." - Jesus Christ

3rd Day Studios

Simon_Templar

Member

Posts: 330
From: Eau Claire, WI USA
Registered: 10-25-2004
Well, if cloning were done, the cloned human should be treated like any other human.

My opinion is that cloning humans should not be done. First, I think it is dangerous because it involves us toying with things we don't understand well enough, and that could very easily come back to bite us. Start toying with the genetic code and you might inadvertently breed out immunity to a disease, or just generally lower immunities. This has been known to happen simply with the breeding of animals and plants to produce certain genetic traits, and I think the possibility is even greater when you start tinkering with cloning people.

Secondly, I don't think there is a good reason for cloning. Every reason I've heard so far to clone people is either unrealistic (like replacing children who die) or simply a terrible idea, like cloning people for a work force (essentially slave labor) or cloning people to harvest organs or genetic material (possibly the most inhuman reason I've heard). There simply is no good reason to do it other than because we can, and it carries a lot of dangers and a lot of possibility of abuse.

------------------
-- All that is gold does not glitter,
Deep roots are not touched by the frost,
The old that is strong does not wither,
Not all those who wander are lost.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Alright, what about a robot with human intelligence? Do they need God's spark of life to be considered truly alive, or would the fact that they were conscious and clever make them our equals?
CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
I doubt that'll happen, especially not within my lifetime. The AI field tends to be overly optimistic. I've seen a lot of attempts, but all fail to be truly intelligent enough to carry on an intelligent-sounding conversation with me. I've tried some chat bots and other experimental stuff. They usually can't even do simple reasoning.

The way they work is usually also not good enough to be intelligent. A.L.I.C.E., for example, is just a specialized search engine (not very different from Google), and the responses are canned.

Attempting to replicate the brain with a neural network is out of the question - we have nowhere near the technology to create even a fraction of the computational power required for such a network. And even if we could, we know nothing about how to program/train it to be intelligent.

Also, tasks that are extremely simple for humans are very difficult for computers, such as recognizing a randomly placed and oriented object in a cluttered environment. I can recognize and name any item on my desk almost instantly, regardless of how it's placed. I can even recognize it if it's got other objects in front of it, or if it's distorted by being seen through a glass. A computer, even a supercomputer, has trouble with stuff like that.

Also, take the game of Chess - a computer can calculate millions of moves a second, a human about three. Yet only recently has a computer beaten a Chess champion, and the computer has to calculate at an astronomically faster rate than the human to beat him! We cannot seem to create a computer that thinks as slowly as a human, yet can beat him at Chess!

Computers are also, as far as I can tell, incapable of figuring out stuff (high-level reasoning or cognition, I think it's called). I can sit down and pull apart a piece of code or even a simple machine and answer questions like "what does it do?" and "why create a piece of code to do that?" As far as I know, that's totally beyond what we can do with what we have currently. Not to mention I don't think we've ever created anything with an imagination.

There's also questions like self awareness, conscious decisions vs unconscious decisions (reflexes, instincts, etc), etc.

Not to mention the possible role of emotions.

And on top of that, there's the debate over whether non-physical minds (ie souls) exist.

There's a lot of material to go over and think about in this field.

In conclusion: I'll believe it when I see it. Otherwise, I remain skeptical.

BTW, an interesting link:
http://www.christianity.ca/faith/weblog/2004/8.03.html

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

[This message has been edited by CobraA1 (edited December 29, 2004).]

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Uhh... I kinda just wanted to know how you would treat an "artificial person"...

But since this is my favorite topic... computer performance is improving exponentially and that's all there is to it. Right now? It sucks. Later? Just don't piss them off...

Brandon

Member

Posts: 594
From: Kansas City, Mo, USA
Registered: 02-02-2004
I used to be into all the sci-fi AI and artificial life... I think I was around 16 years old at the time, but that's when Jesus came into my life and saved me from it all! And I knew when it happened that to know Him was much better than playing around with 1's and 0's. AI is still very interesting to me; now I just use it in games. Games that bring glory to His name, in fact. As I said in another post, Jesus doesn't remove our heart's desire, but he fulfills it.

Nice link btw CobraA1.

------------------
"Before Abraham was... I Am." - Jesus Christ

3rd Day Studios

CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
quote:
Uhh... I kinda just wanted to know how you would treat an "artificial person"...

I don't know. I guess it depends on how it treats me.

quote:
But since this is my favorite topic... computer performance is improving exponentially and that's all there is to it. Right now? It sucks. Later? Just don't piss them off...

Simply shoving speed at the problem won't solve it. My PC can calculate more moves a second than Garry Kasparov can, but it can't beat him at Chess. All a computer does is whatever it's programmed to do. We have to solve the problem of how to create AI first.
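To show what pure brute-force game search looks like, here's a toy Python sketch using the simple take-away game of Nim instead of Chess (this is an illustration of exhaustive search, not how any real chess engine is written). The program "plays perfectly" without understanding anything - it just tries every line of play:

```python
# Brute-force game search: a position in single-pile Nim is winning if
# SOME legal move (take 1, 2, or 3 stones; taking the last stone wins)
# leaves the opponent in a losing position. Pure search, no insight.
from functools import lru_cache

@lru_cache(maxsize=None)
def can_win(stones):
    return any(not can_win(stones - take)
               for take in (1, 2, 3) if take <= stones)

def best_move(stones):
    # Pick any move that leaves the opponent in a losing position.
    for take in (1, 2, 3):
        if take <= stones and not can_win(stones - take):
            return take
    return 1  # every move loses; take the minimum and stall

# The search "discovers" the classic pattern: multiples of 4 are lost
# for the player to move.
print([n for n in range(1, 13) if not can_win(n)])  # [4, 8, 12]
```

The program beats any human at this game, yet it clearly doesn't "know" anything - which is the point being made about chess computers above.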

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

GUMP

Member

Posts: 1335
From: Melbourne, FL USA
Registered: 11-09-2002
If you look at the mindset of the people who believe in the Singularity concept, you'll find an ingrained belief that if a computer reaches a critical level of storage capacity and performance, "somehow" an intelligence will evolve out of the disparate components. This idea is often repeated in SciFi, but even from the serious researchers I really haven't seen a feasible explanation for how this could occur.

While it's true that there is an AI programming problem, you also have to consider the hardware. Obviously the "wetware" running non-artificial life is very different in design. While it's possible we could emulate thought functions on typical processors, they weren't designed for the task. A comparison might be a CPU trying to do graphics functions. It's not only a performance issue; the actual design of the hardware might be integral to allowing "intelligence". But if you copy the brain's design, then you lose the benefits of our computer design.

If you managed to overcome all these design difficulties it might still be possible to combine the best of both worlds by merging an intelligence processor and a generic processor. Communications translations via a bridge between the two different processing protocols would be interesting, to say the least.

This brings up the subject of computerized brain inserts... but if I start on that I could go on for pages.

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
Well, back to the idea of how to treat an artificial person. Are you thinking that we would turn them into slaves or servants even if they became intelligent? Bicentennial Man was a good movie, but no robot will ever be human; it will never be in the image of God. Being the image of God does not necessarily mean that we look like God, but rather that we are to behave according to his will. He expects us to live good Christian lives - to be like him, blameless, or at least to try. I also don't believe that a robot could ever understand what it means to "believe". A robot would never take a leap of faith because (correct me if I'm wrong) most of what it does is based on logic. It would seem illogical to believe in something we cannot see, when the only proof we have is in a book. The day I see a robot attending church I'll believe it. Besides, what do we need robots for? To make us lazier than we already are? I mean, I'm sure they have some uses, but they would eventually steal jobs and make life too easy for us. Enough rambling; I'm sure someone has a bone to pick with me.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
I'm not saying that once machines match the storage/performance of our brains (which are machines in themselves) they'll suddenly become human-like. As our technology gets better, we'll understand our own brains better as we learn how they work. Then we'll be able to put some fast-as-light computers to the real test. And yes, it'll be a couple of decades before then... obviously our modern PCs can't hold a candle to our brains.

I agree, back to how we would treat a robot with the same capabilities as us. I've only read bits and pieces of them, but "The Age of Spiritual Machines" and "Are We Spiritual Machines?" are pretty interesting. Your 'robot going to church' idea is pretty cool. Besides the different materials, wouldn't an artificial mind identical to ours have the same beliefs? HMMM... maybe we'd end up with robot internet forums where spiritual machines could argue with atheist robots... Then yeah, my job would be gone. Think how efficiently a robot could get its points across, instead of rambling on like I do! To avoid becoming obsolete, I'd have to merge with that robot, but that's another story.

So yeah, if we model their brains after our own, as long as we don't worry about what the robot is made of (that'd be like racism, wouldn't it?), wouldn't they have to be treated as other humans? Maybe they'd get mad if we banned them from the Olympics and stuff cause they'd be so fast and strong... or would they realize how unfair that'd be? Oh well I'd better cut this rant off and go eat some microwave burritos, keep the wetware humming along...

MadProf
Member

Posts: 181
From: Larnaka, Cyprus
Registered: 01-24-2001
If you believe humans are only machines (the atheistic view) - only a physical body, no soul, no spirit, only impulses and memories - then obviously computers could one day be made intelligent, and when they are, they should have the same rights as humans. But "rights" are a fundamentally flawed concept in this worldview altogether. There are no absolutes, as absolutes are merely illegitimate offspring ideas from flawed thinking machines and superstition. Do whatever you want. Kill people, steal, rape; it doesn't matter. You are an accident. So are other humans. So are robots. If you feel like enforcing your own "laws" and "democracy", do so. If it makes other accidental biological creatures have feelings of safety and happiness, so what? If you put them in a position where what happens to you depends on what they do, so what? It doesn't matter.

If, however, you believe humans are composite ((mind &| body) and spirit), then robots will never be able to be the same as humans, as they have no spirit, and cannot have spirit. They will always be just tools.

And that's it, really. (IMHO).

Dan "Just ignore me" MadProf

------------------
7 days without prayer makes one weak.

ArchAngel

Member

Posts: 3450
From: SV, CA, USA
Registered: 01-29-2002
Well, personality, thought, etc. are mainly physical functions of the brain, so yes, we can make an entity that emulates that. Not now, but someday, hopefully.
However, as MadProf pointed out, I believe we have a soul, which we cannot emulate in a machine.
So we can only copy us to a degree.

------------------
Soterion Studios

CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
To be honest, I'd probably never buy a machine with these controversial abilities. I'd probably be more interested in something like the computer that runs the starships in Star Trek - it only does what it's programmed to do, and has no personality. I'm sure the AI experts will keep trying to make something that works like a human, but I think that society as a whole will be fine with plain old computerized robots.

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Awww, but then they'd keep advancing without us, and we'd be dumb in comparison.
CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
I'll leave that to the AI labs. Their primary purpose is to serve us, not to be our intellectual superiors anyways. In the end, I'll want a tool that can help me, not a brainy toy.

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
As they get smarter, they'll probably get promoted in status, from tools and toys, to pets and servants, to equals and superiors.
Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
yes, I think it would be best if we left the robots as tools. I thought the movie Artificial Intelligence was sad, and I hope it never comes to that! Yes, tools/machines they should stay. Why would we want to create something smarter than us? It just doesn't make sense.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

ArchAngel

Member

Posts: 3450
From: SV, CA, USA
Registered: 01-29-2002
They only go as far as we make them.
The AI you're talking about is hypothetical.

------------------
Soterion Studios

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
The way I see it, the point of making them as smart as us is so we can hitch a ride with them as they improve. I'm not saying this is what everyone wants, but I don't see technology grinding to a halt once we've got faster PCs and servant Furbies scrubbing the floors. But I can't argue with the coolness of having a pack of Furbies running around the house.

The main thing to keep in mind (if you agree that yes, hypothetically, they could someday match our intelligence) is that the machines would probably seek to become more efficient. As I. J. Good argued (Vernor Vinge quotes him on this), an ultraintelligent machine would be the last invention mankind would ever need to make, because it would improve upon itself without our help.

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
So what is your point?

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
It is unlikely that technology's progress will stop, even when humans are no longer necessary for inventing things.
Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
I see, well, I don't think any of us have to worry about that. You are more than likely correct. Although I think that human ingenuity will always be a part of progression and growth. I think God will take us home before any of this comes to pass.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

Simon_Templar

Member

Posts: 330
From: Eau Claire, WI USA
Registered: 10-25-2004
I guess I don't believe that intelligence can arise randomly, and I don't believe it's possible for something to be made more intelligent than its maker.
In some cases we know how to do something better, but we ourselves are not capable of doing it because of physical limitations, so we design machines to do it. Thinking is different, though, because it isn't by nature a mechanistic process, and we don't know how to do it any better than we already do. I don't think we will ever discover a way to think better than we do and yet not be physically capable of implementing it.

------------------
-- All that is gold does not glitter,
Deep roots are not touched by the frost,
The old that is strong does not wither,
Not all those who wander are lost.

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
Agreed and good point

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Can an organism make something smarter than itself?
Well that's where evolution comes in, but you don't believe in that either.
In this case, it's technological improvements - and you'll actually be able to see this evolution taking place.

How isn't thinking a mechanistic process? Our brains are made of the same things as the rest of the universe.

To-Do List:
Improve crappy human memory.
Think together wirelessly over great distances.
Increase calculation speed.

Computers have beaten us at all of those, and since our wetware can't compete, we must keep on using and improving our computers.

CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
quote:
As they get smarter, they'll probably get promoted in status, from tools and toys, to pets and servants, to equals and superiors.

Only if we promote them that way. We do, after all, control every step of the process. In my house, they will always remain tools and toys - and I think it will stay that way in most households.

Right now, a computer/robot capable of increasing its own technology is only hypothetical.

quote:
I think God will take us home before any of this comes to pass.

I agree - I don't really see God allowing us to create such stuff. Nevertheless, it is interesting to talk about.

quote:

How isn't thinking a mechanistic process? Our brains are made of the same things as the rest of the universe.

This comes from the idea of a soul - remember, we are Christians, we do believe in the spiritual.

quote:
Computers have beaten us at all of those, and since our wetware can't compete

Strangely enough, we compete quite well despite our limitations - remember the chess example! Pure processing power alone does not create intelligence.

quote:
Well that's where evolution comes in, but you don't believe in that either.

Evolution, regardless of whether we believe in it, is irrelevant here. A computer is not made by randomly mutating silicon chips and picking the best survivors.

No, our progress is controlled, thought out, designed, and intentional. It cannot be compared to evolutionary processes.

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
The technological Singularity will likely occur in the next 20-30 years. So God had better get cracking on this Judgement Day or whatever you're expecting/hoping will stop our inventions.

If a man-made machine that thought exactly as we did (and therefore proved that thinking is a mechanical process) wanted to be a Christian, would you tell it that this would be impossible simply because you believe that it needs a soul?

NO - we do not compete quite well with the points I listed. You know, where it says "computers have beaten us at all of THOSE", I was referring to THOSE points. And as for chess, what difference does it make if the computer needs to make more calculations? They take less time to make up their mind, and they still beat us. And if you want a fair competition, you can turn down the difficulty level. ^_~

Does that make the chess program smarter than its human opponent? Of course not. That stupid program can't even tell the difference between a bagel and a cactus. It does what it's programmed to do, and does it well.

Evolution: "A gradual process in which something changes into a different and usually more complex or better form."

It would be fair to say that our computers are gradually (and this rate is accelerating) becoming more complex and more efficient.

Stalks-the-Night

Member

Posts: 18
From: L'anse, MI
Registered: 11-10-2004
I don't like the idea of cloning humans. I think its just wrong and
if this is done then man is opening a pandora's box that it will never close.

------------------
Luke 9:23
1 John 2:6

CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
quote:
The technological Singularity will likely occur in the next 20-30 years.

I seriously doubt we'll have AI comparable to humans that fast. I'd say we're looking at hundreds of years, if at all.

And you make a nice definition of evolution, but many people use it to refer to some form of biological evolution:

quote:
Biology.

1. Change in the genetic composition of a population during successive generations, as a result of natural selection acting on the genetic variation among individuals, and resulting in the development of new species.
2. The historical development of a related group of organisms; phylogeny.


------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
Computers don't evolve, they simply are improved upon by HUMANS.

Robots are built using logic, correct? Right-or-wrong choices, such as the ones many people face daily, are not always logical. It might be logical to keep the wallet you found - a robot would - but a human/Christian would look for its owner. Robots can only do what they are programmed to do. If you make a robot that can think, that is all it will ever do. So many daily choices require thought, and many require a soul. I firmly believe that we will never create anything "better" than ourselves.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Computer performance per cost is improving exponentially, and I think waiting a few centuries for the Singularity is out of the question, unless we are set back by World War III or something horrible.
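The arithmetic behind "exponential" is simple enough to sketch in a few lines of Python. Note the 18-month doubling period is an assumption borrowed from the popular reading of Moore's law, not a measured constant, and the real figure is debatable:

```python
import math

def years_to_gain(factor, doubling_months=18):
    # Under an ASSUMED fixed doubling period, multiplying
    # performance-per-dollar by `factor` takes log2(factor) doublings.
    return math.log2(factor) * doubling_months / 12

# A million-fold improvement takes roughly 30 years on this assumption...
print(round(years_to_gain(1_000_000), 1))                      # 29.9
# ...but stretch the doubling period to 3 years and it takes about 60.
print(round(years_to_gain(1_000_000, doubling_months=36), 1))  # 59.8
```

That's why the "decades vs. centuries" argument comes down entirely to what doubling period you believe in, and whether you believe the exponential trend continues at all.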

Evolution of computers - We try new methods of building our computers. If the method works, we keep it, and a new 'species' of computer is developed.

What use would a robot have for a wallet?

"If you make a robot that can think, that is all they will ever do."
Well we can think, and that's all we ever do... what's the problem?

What are these 'many daily choices' that require a soul?

GUMP

Member

Posts: 1335
From: Melbourne, FL USA
Registered: 11-09-2002
The Singularity concept is the belief that various technological advancements will eventually culminate in an event or a discovery that changes society/humanity in a profound manner. In other words, it doesn't HAVE TO BE a computer AI becoming self-aware. Personally I don't see the advent of an intelligent AI as a big deal in comparison to other possibilities. Brain inserts, superconductors, nanotechnology, superluminal travel and/or communication, wormholes, and even the more realistic possibility of physical rejuvenation have a greater potential for change.

Oh, and the brain is capable of self-modification. It's fully possible to train your brain in order to acquire an eidetic memory.

[This message has been edited by Gump (edited December 30, 2004).]

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
Wallets are to put money in.

You mean you didn't type that in? Who did? Did you sit around thinking and it magically posted itself? So obviously you are breathing and typing, so you aren't always thinking.

Whether or not to steal. How to treat another person. You seem to me to be someone who would kick a dog. How many choices a day affect other people? They all require something more than a little bit of hardware.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
What use would a robot have for money? Intelligent machines would probably work together towards a common goal and there would be no need for personal gain (wealth). I guess that is one of the main differences between us and them.

I own two dogs and I do not kick them. This does not mean that I have a soul. I don't steal or treat other people badly (unless the occasion calls for it) because this would create more problems for society. There is nothing spiritual about that decision. If a soul was required to make such decisions, why is there so much crime and hate? If you're going to say "LUCIFER DID IT!", then is every bad decision the result of Satan influencing someone? Furthermore, what the hell would I gain from kicking an animal?

Alright, we do things other than think, but this doesn't mean we do them one at a time. We are always thinking, even while typing. However, evidence would suggest that your brain shuts down while typing. Oh, burn.

I agree, Gump, the arrival of superhuman intelligence doesn't have to be AI 'waking up'. A network (futuristic internet?) could wake up, or maybe we'll just improve ourselves mechanically (merge with machines?) or biologically. This would probably be a lot safer than letting a superhuman intelligence improve itself and quickly make us obsolete (or kick our butts for whatever reason - like when we pissed them off by nuking them in the Animatrix ^_^). I'm mostly just *borrowing* ideas from here: http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html

However, wouldn't it help a lot if we had/were a smarter intelligence to study all those exciting fields you mentioned?

GUMP

Member

Posts: 1335
From: Melbourne, FL USA
Registered: 11-09-2002
You're assuming the base requirements for self-awareness won't have restrictions inherent to the design. Considering the current stage of research it's very early to jump to the conclusion an intelligent computer would be "smarter". If your main complaint is processing and storage limitations caused by the dimensions of the skull, there is always genetic engineering. Combined with brain inserts it might be possible to extend your basic awareness and offload certain calculations to your own little "brain mainframe".

[This message has been edited by Gump (edited December 30, 2004).]

CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
quote:
What use would a robot have for money?

Unless the robot can create an infinite supply of energy and matter, it will need a way to manage the distribution of finite resources. Economic systems are very efficient means of doing so. If we expect the robot to be able to create new technologies (which is required for your theory), it will need access to resources.

In fact, probably the best way to control what a superintelligent computer can do would be to control its access to resources.

There's also the idea of wants and desires - why would a computer superintelligence wish for the control and/or extinction of the human race? What if we shaped its wants and desires so that it wanted to help us instead of harming us? The idea that such a machine would wish us harm just because it's more intelligent does not seem to have a logical basis.

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Wouldn't we continue replacing old-school brain parts with better inserts until nothing remained of the original structure? Then we'd end up with a Best Brain Version 2.0 all the same.

If the robots were identical, they would have the same desires (example - knowledge). They would cooperate without bribing each other to help with the work. If they needed to build more robots, they would share resources, because more help = more progress towards the goal.

I don't mean that the robots would kick our butts just because they could. But if we pissed them off enough (let's say we get in the way of progress by blowing up some of their labs/factories) they would probably view us as a hindrance rather than a curious ape. It's hard to make progress with distractions, like doing homework in front of the TV.

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
If you blew up my factory, I'd be mad.

What you are thinking of is a communist society: everybody pulls their own weight and helps each other out, and there's no need for money because we all simply take what we need. I can totally understand the desire for a utopian society, but if humans can't make a true communist society, what makes us think that we can make robots that are truly communist?

"However, evidence would suggest that your brain shuts down while typing. Oh, burn."
Wow, good one. I think I'll go cry.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Blowing up factory = anger. That was my point.

I suggested that the soul is not required to make decisions, and you decided that this would make me the kind of person that participates in animal abuse. I think my insult was justified and even tame in comparison.

Communism doesn't work because of our desire for personal gain - greed. Robots, as I already said, would have no need for personal gain - they would have the same goals!

bennythebear

Member

Posts: 1225
From: kentucky,usa
Registered: 12-13-2003
2 words... I, Robot

------------------
proverbs 17:28
Even a fool, when he holdeth his peace, is counted wise: and he that shutteth his lips is esteemed a man of understanding.

www.gfa.org - Gospel for Asia

www.persecution.com - Voice of the Martyrs

CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
Is it worth seeing? It looked alright.
CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
quote:
Robots, as I already said, would have no need for personal gain

So much for robots wanting to take over humanity. If they don't want personal gain, why take over humanity?

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

ArchAngel

Member

Posts: 3450
From: SV, CA, USA
Registered: 01-29-2002
maybe communal gain?

we could program for personal gain. a great way to have them self-advance

------------------
Soterion Studios

nfektious
Member

Posts: 408
From:
Registered: 10-25-2002
It appears a key assumption has been overlooked or ignored: that these robots must be self-sustaining - that is, they'd have no need to perform self-maintenance, wouldn't be affected by weather, wouldn't be injured by working with something outside their strength tolerance, etc.
CheeseStorm
Member

Posts: 521
From:
Registered: 11-28-2004
If fusion power is the way of the future, there'd be plenty of energy. Nanobots would be good for tiny, delicate repairs. Nanobots could replace all other robots, once computers were small enough.

I don't know where this 'take over humanity' thing is coming from - whether you mean slavery or being kept as pets. If we piss them off, they could destroy us. They wouldn't need to keep us around - but as long as we didn't get in the way, I don't think they'd decide to kill us.

MadProf
Member

Posts: 181
From: Larnaka, Cyprus
Registered: 01-24-2001
quote:
It appears a key assumption has been overlooked or ignored: that these robots must be self-sustaining - that is, they'd have no need to perform self-maintenance, wouldn't be affected by weather, wouldn't be injured by working with something outside their strength tolerance, etc.

I don't get why this is an assumption?

I think the idea is that this AI "thing" will by its nature want to improve itself. Something along the lines of "lines 920 to 1029 are being executed very slowly, I wonder if I can make that faster..." which then develops into
"these chips are working very slowly, I wonder if I can design faster ones" - which it then proceeds to do. Then it finds that humans are the limiting factor on its calculations, so it asks the humans for what it needs to be self-supporting, or it reads books on (psychology, hypnotism, etc.) and uses them to make the humans make it self-supporting. Or it works out ways to hack around the humans' firewalls, and then orders itself whatever it wants on eBay.
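To make that first step concrete, here's a toy sketch (in Python, not m++!) of a program timing two candidate implementations of the same function and keeping the faster one - the tiniest seed of self-optimization. All the function names here are invented for illustration:

```python
import timeit

def slow_sum(n):
    """Naive O(n) loop - the 'lines being executed very slowly'."""
    total = 0
    for i in range(n):
        total += i
    return total

def fast_sum(n):
    """Closed-form O(1) rewrite the optimizer might discover."""
    return n * (n - 1) // 2

def pick_fastest(candidates, arg, trials=200):
    # Time each candidate on the same input and keep the quickest.
    timings = {f.__name__: timeit.timeit(lambda: f(arg), number=trials)
               for f in candidates}
    return min(timings, key=timings.get)

best = pick_fastest([slow_sum, fast_sum], 10_000)
```

A real self-improving system would also have to *generate* the candidate rewrites and prove they compute the same thing, which is the hard part - this sketch only does the measuring.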

Anyway.

Happy new year, I guess.

Dan "Who? Me?" MadProf

------------------
7 days without prayer makes one weak.

MadProf
Member

Posts: 181
From: Larnaka, Cyprus
Registered: 01-24-2001
Oh, by the way, there is an excellent short story - by Asimov, I think - about computers/robots not doing a direct takeover, but slowly making humans less and less in control: because the robots are programmed to protect humans, and humans are not completely rational and therefore cannot be trusted with (sharp objects, government, cars, food, making choices, etc.), the robots stop the humans from being able to do these things, for their own good.

Very good story. I shall have to go look it up...

Dan

------------------
7 days without prayer makes one weak.

nfektious
Member

Posts: 408
From:
Registered: 10-25-2002
Thanks Dan...made me laugh a good bit. I labeled it an assumption because it was never really discussed. I've not known any human being to build or design something that does not require maintenance at some point.

But yeah - Happy New Year

CobraA1

Member

Posts: 926
From: MN
Registered: 02-19-2001
madprof, that's essentially the premise of the "I, robot" movie that recently came out. Well, in this case it was a more direct, faster takeover, but the same premise: Humans can't be trusted to protect themselves, so the robots, being programmed to protect humans, decide to do it for them.

There are still too many unanswered questions, IMHO, to decide what kind of behavior a superintelligent robot might exhibit. Unfortunately, I doubt we'll ever find out, as it probably won't happen within our own lifetimes, and God might even come back for the end times first.

------------------
"The very idea of freedom presupposes some objective moral law which overarches rulers and ruled alike." -- C. S. Lewis (1898 - 1963), "The Poison of Subjectivism" (from Christian Reflections; p. 108)

Switch Mayhem now available! Get it here
Codename: Roler - hoping to get more done over the holidays . . .

bennythebear

Member

Posts: 1225
From: kentucky,usa
Registered: 12-13-2003
yeah it was a pretty cool movie

------------------
proverbs 17:28
Even a fool, when he holdeth his peace, is counted wise: and he that shutteth his lips is esteemed a man of understanding.

www.gfa.org - Gospel for Asia

www.persecution.com - Voice of the Martyrs

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
How would the robots be able to distinguish between someone threatening their master and an old friend who playfully hits him on the shoulder? I don't know, so if someone does, or if it's possible, lemme know please.
Can discernment be programmed, beyond selecting choices like in chess?

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

MadProf
Member

Posts: 181
From: Larnaka, Cyprus
Registered: 01-24-2001
quote:
Originally posted by max:
How would the robots be able to distinguish between someone threatening their master and an old friend who playfully hits him on the shoulder? I don't know, so if someone does, or if it's possible, lemme know please.
Can discernment be programmed, beyond selecting choices like in chess?

Certainly.


/* (simplified version, cross-translated from
positron meta-programming circuit-dump to m++,
in human-readable form. pmpcddump version
0.0.8.79-ALPHA - GNU/GPL 9.7) */

/*
*
* WARNING, THIS CODE IS NOT GUARANTEED FOR
* ANY PURPOSE WHATSOEVER, AND IS NOT CERTIFIED
* BY ANYTHING, AND IS NOT DESIGNED TO BE RUN
* ON ANY STANDARD ELECTRONIC DEVICE.
*
* FOR EDUCATIONAL PURPOSES ONLY.
*
*/

/* 490021:11:008 */
if ( master.get_emotion() & ( EMO_FEAR | EMO_WORRY | EMO_HATRED ) )
{
    /* ... */ /* 490021:11:330 */
    for ( int i = 0; i < master.local.others.count; i++ )
    {
        /* ... */ /* 490021:11:391 */
        if ( master.local.others.get_other(i).get_emotion() & ( EMO_FEAR | EMO_WORRY | EMO_HATRED ) )
        {
            /* ... */ /* 490021:11:402 */
            master.local.others.get_other(i).set_concern( master, 9 );
            master.set_defense( DEF_RED );
            /* ... */ /* 490021:11:459 */
        }
        /* ... */ /* 490021:11:699 */
    }
    /* ... */ /* 490021:11:811 */
}

I need to polish my c...

Dan "What?! waddya mean 'syntax error?'" MadProf

PS - It's a lot more complex than that, but that's the main idea. We're not talking about chess here; we're talking about highly complex machines capable of reading human emotion, faces, and body language to a high and competent level, with great understanding and knowledge of human humor, jokes, customs, etc.

Hope this helps :-)

[This message has been edited by madprof (edited January 01, 2005).]

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
I don't understand any of it, but oh well. It seems it would need to be able to think, wouldn't it?

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

MadProf
Member

Posts: 181
From: Larnaka, Cyprus
Registered: 01-24-2001
"think" as in

"take in data, process it, and make conclusions from it."

yep. robots will have to be able to do that.

Dan "Who? Me?" MadProf

------------------
7 days without prayer makes one weak.

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
is it that simple? i mean, I'm not trying to argue or insult you, I just would like to understand what would be needed.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.

MadProf
Member

Posts: 181
From: Larnaka, Cyprus
Registered: 01-24-2001
yeah, I think so. In theory it is simple. In practice, it's very hard!
There are so many details.

how do you know the difference between an old friend punching someone in fun and that person being attacked? It's not "just a feeling" or "an aura of hate" or anything like that; there are many little signs. And a machine could be trained to understand those signs and interpret them.
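For a flavor of how those "many little signs" could be combined, here's a minimal weighted-score sketch in Python. The signs, weights, and threshold are all invented for illustration - a real system would learn them from data, not hard-code them:

```python
# Hypothetical learned weights: positive pushes toward "attack",
# negative toward "friendly". A trained system would fit these.
WEIGHTS = {
    "raised_fist": 1.0,       # ambiguous on its own: friends punch shoulders too
    "shouting": 1.5,
    "target_flinches": 2.0,
    "smiling": -2.0,
    "laughing": -1.5,
    "known_friend": -3.0,
}

def threat_score(signs):
    # Sum the weights of every observed sign; unknown signs count as 0.
    return sum(WEIGHTS.get(s, 0.0) for s in signs)

def looks_like_attack(signs, threshold=1.0):
    return threat_score(signs) > threshold

playful = looks_like_attack(["raised_fist", "laughing", "known_friend"])
hostile = looks_like_attack(["raised_fist", "shouting", "target_flinches"])
```

The point is that no single sign decides it - a raised fist plus laughter from a known friend scores well below the threshold, while the same fist plus shouting and a flinch scores well above it.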

Anyway. I'm tired.

Goodnight.

Dan


max, buddy, don't worry about insulting me. it's almost impossible, certainly online. :-)

------------------
7 days without prayer makes one weak.

Max

Member

Posts: 523
From: IA
Registered: 09-19-2004
Hey, thanks prof. Maybe some day we will all see it, but who knows? Anyhoo, I'm really hard to insult as well. Just doesn't mean anything to me, ha ha.

------------------
* Eagles may soar, but weasels aren't sucked in jet engines.