05-21-2006, 03:36 PM
|
#16
|
I Live Here
Join Date: Feb 2005
Posts: 23,211
|
Quote:
RenaissanceMan wrote
Quote:
a different tim wrote
Quote:
RenaissanceMan wrote
I don't think so, it would not be the SAME self, it would be an IDENTICAL self.
|
But I'm not an identical self to the one ten minutes ago anyway. Sterny and fishface are on this as well. I'm sorry but I really don't see the difference (give or take some unfeasible technology and stuff). Neither "self" (presuming the original isn't destroyed in the process of uploading or something) is identical to the one just before the cloning process, because time has passed. What sterny said.
|
The detail that they're not identical-identical is irrelevant. The point was that the ORIGINAL would be destroyed after the copy was made. You can't do that because of the moral implications.
In the original post, the 'original' would be asleep when it was killed to make room for the duplicate. My point is that, sleeping or awake, killing the original is killing a sentient being... in fact, the one that originally agreed (stupidly, it now realizes) to the procedure.
The duplicate, of course, has the memories of the original, but knows it's a duplicate almost immediately. (Assuming it's not being lied to) You haven't transferred the 'mind' to another being, you've created a second one.
If you allow the killing of the first one because the duplicate is the same guy, then you open the moral floodgates for creating armies of duplicates to act as throw away workers or soldiers.
For this reason, creation of duplicates.... even in a future where we could all be cyborgs, will be HIGHLY illegal.
Clones too. If you could create 10 genetically identical people, you could get away with ANY crime because there would be no way to conclusively prove who was the actual perpetrator... even if they left copious DNA evidence and were seen by several eyewitnesses. You can't do that with identical twins... one can't be the alibi for the other AND themselves at the same time.
|
I hate to differ with your last position, but brothers, and especially identical twins (where it's difficult to ID the perpetrator), have indeed been able to "alibi" in reverse. One brother admits to the crime in such a way that the court is convinced he is lying to save his brother, whom they believe to be the true culprit. Being perceived as a loyal innocent, he is acquitted. Then, after the verdict and protected by double jeopardy, both he and his brother confirm that he really did do it. The second brother, who didn't commit the crime, is acquitted, and the first brother cannot be retried.
The most common form of cloning, good for up to six or so copies per event, does not require the death of the original. The split in making identical twins doesn’t use the original as a pattern, it grows mass and duplicates individual chromosomes. Once there are enough chromosomes and cell mass, it splits evenly, each half getting a randomly selected set from the duplicated chromosomes, a crap-shoot we all experience. Each resulting being is a twin, not a copy. There is no original.
About your earlier point concerning the death of the "original", cloning may or may not involve the death of the original. Many ethical issues still arise even if such death is not part of the equation which is why I and others gave death a secondary position in the discussion. As we have seen with Dolly, death is not implicit in cloning. That said, some kinds of cloning, Star Trek’s transporter, for instance, DO involve the physical death of the original even if a viable version of the person arrives at the destination. The act of performing such a detailed analysis, necessarily down to the subatomic level, must destroy the model.
“If you allow the killing of the first one because the duplicate is the same guy, then you open the moral floodgates for creating armies of duplicates to act as throw away workers or soldiers.” Here you may have shown strongly that, in fact, there is no extra-physical sanctity to human life any more than for a squid. Supposing you had a record of some person’s complete body/brain structure. You don’t know if that person is currently alive or not. For purposes of creating a clone, should you care? If you make a clone and then find out the original was alive after all, what makes it wrong to erase the clone (remember, no soul is involved)? If you have such a pattern and clone-building technology, are you not duty-bound to make as many clones as possible to, in effect, bring the original back to life?
"Those who most loudly proclaim their honesty are least likely to possess it."
"Atheism: rejecting all absurdity." S.H.
"Reality, the God alternative"
|
|
|
05-21-2006, 04:09 PM
|
#17
|
Guest
|
Quote:
Sternwallow wrote
“If you allow the killing of the first one because the duplicate is the same guy, then you open the moral floodgates for creating armies of duplicates to act as throw away workers or soldiers.” Here you may have shown strongly that, in fact, there is no extra-physical sanctity to human life any more than for a squid. Supposing you had a record of some person’s complete body/brain structure. You don’t know if that person is currently alive or not. For purposes of creating a clone, should you care? If you make a clone and then find out the original was alive after all, what makes it wrong to erase the clone (remember, no soul is involved)? If you have such a pattern and clone-building technology, are you not duty-bound to make as many clones as possible to, in effect, bring the original back to life?
|
They're not clones, they're duplicates. And sentient. If you create a sentient, it's a sentient and you are bound by law to respect its rights.
You are NOT duty bound to make duplicate sentients in the effort to 'bring someone back to life' any more than you are duty bound to throw yourself into a robbery to protect the innocent. You may need to think about that one really really hard to see why...
Now! This is where it gets interesting. We've gone full circle. Is inception 'creating a sentient'? If it is, then abortion is murder. Critical here is the legal definition of 'sentient'. At what age or level of development is a developing, sentience-capable organism considered to be 'sentient'?
|
|
|
05-21-2006, 10:42 PM
|
#18
|
I Live Here
Join Date: Feb 2005
Posts: 23,211
|
Quote:
RenaissanceMan wrote
They're not clones, they're duplicates. And sentient. If you create a sentient, it's a sentient and you are bound by law to respect its rights.
|
If you mean twins as I discussed, of course they are clones in every respect. They are just accidental. It is possible to induce cloning by separating the first pair of cells following conception. Then you get intentional twins/clones. You have not substantiated your claim that something about sentience confers a right to life. The law may be mistaken. Why are you sure that it is not?
Quote:
You are NOT duty bound to make duplicate sentients in the effort to 'bring someone back to life' any more than you are duty bound to throw yourself into a robbery to protect the innocent. You may need to think about that one really really hard to see why...
|
Granting that one need not risk his own life to save someone else, as in your robbery scenario: when there is no particular personal risk, are you not duty- (or morally) bound to save a life if you have the talent and resources to do so? Would not this duty extend to bringing someone back to life soon after their death, if you could do it? By simple extension, then, shouldn't you expend the necessary resources to bring a person back to life even after they had completely disintegrated, if it were possible? And, extending again, even if you do not know whether they are dead or not, shouldn't you morally make a clone of them if you can? Isn't creating a sentient life an important and a good thing?
Quote:
Now! This is where it gets interesting. We've gone full circle. Is inception 'creating a sentient'? If it is, then abortion is murder. Critical here is the legal definition of 'sentient'. At what age or level of development is a developing, sentience-capable organism considered to be 'sentient'?
|
I think you meant "conception". You are right that it is a sticky question since, right after conception, what you have is not sentient, but it has a pretty good (no certainty here) chance of becoming sentient. Again, this is only a puzzle if there is something inherent in sentience that confers rights. Can you fill in that gap, please? The law is irrelevant to the moral question in this case. The law can permit, indeed can mandate, capital punishment.
"Those who most loudly proclaim their honesty are least likely to possess it."
"Atheism: rejecting all absurdity." S.H.
"Reality, the God alternative"
|
|
|
05-22-2006, 01:35 AM
|
#19
|
Guest
|
I think we've gotten a little off track here. The aim of the Raelians isn't to create a clone that is separate from oneself. They don't want to just copy themselves, they want to inhabit a new body. It is their aim to live forever by transferring their minds to new bodies. When they're old, they'll create a younger, zombie version of themselves. This mindless zombie's physical development will be accelerated. It will reach the age of 24 or so. Then their minds are downloaded into a computer and then uploaded into the zombie's brain. So now, the old man has a healthy new body and he's extended his life for another 60 or so years at the end of which he can repeat the process.
Obviously this kind of technology is well beyond our capabilities and we don't even know if it is possible. But I just wanted to address the philosophical issue of whether this new person is the "same" person. At first I thought that, from the perspective of the person being cloned, he wouldn't take on a new life... the clone would be a new cognitive entity. But now, after reading what some other posters have written, that may be an irrelevant question. For all we know, we might constantly be turning into new cognitive entities. The "me" that I am right now may be a very different "me" from two years ago. But I don't know; all I know is that I retain the same memories, traits etc.
|
|
|
05-22-2006, 02:02 AM
|
#20
|
Guest
|
To get back on track as Metman suggested, the central question seems to be, What would it feel like to have your mind cloned? Or what would it feel like to have your brain downloaded into a computer?
I don't think it's ever possible to know unless you do it yourself. Let's say you make a copy of Joe's brain in a computer. Then you ask Joe, What happened? Joe would say, "I was just lying here, then you did the procedure, and I'm still lying here." Now you ask the computer version of Joe what happened. "Well, I was lying over there, and then it felt like I went to sleep, and I woke up here in this computer." Both versions have the same memories up until the point of transfer, so there's no question you could ask that could tell you which is the *real* continuation of Joe's consciousness.
That doesn't really answer the question of what it would feel like to have it done, though. I'm almost certain you would feel like Human Joe, not Computer Joe. After all, copying information from your brain doesn't disrupt it, so why would you feel like you went into the computer? Of course, you would have created another consciousness, Computer Joe, who says that's exactly what happened. So in other words, I don't think copying your mind, either into another body or a computer, would give you the feeling of extending your consciousness.
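The fork in memories can be sketched in code. A deep copy shares every memory up to the moment of copying, then the two diverge independently (a toy illustration only - the dict standing in for a "mind" is an assumption for the sketch, not a model of consciousness):

```python
import copy

# A toy "mind": nothing more than a list of accumulated memories.
joe = {"name": "Joe", "memories": ["childhood", "lying on the table"]}

# The "upload": a deep copy with an identical history.
computer_joe = copy.deepcopy(joe)

# Both copies share every memory up to the point of transfer...
assert joe["memories"] == computer_joe["memories"]

# ...and diverge the moment each has a new experience.
joe["memories"].append("still lying here")
computer_joe["memories"].append("woke up in the computer")

print(joe["memories"][-1])           # still lying here
print(computer_joe["memories"][-1])  # woke up in the computer
```

Neither copy's memory list is more "original" than the other's; each simply contains the shared prefix plus its own continuation.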
|
|
|
05-22-2006, 02:05 AM
|
#21
|
Guest
|
This is my first post so be gentle with me. First off, it's nice to see a debate on this site rather than the usual theist bashing or name-calling. Anyway - I'd like to pose the opposite question. I can't take the credit for it - it was actually posed by a neurologist (can't remember who, sorry). The point is:
A computer company develops a small chip that is able to exactly replicate the behaviour of a neuron. This is used to replace a neuron that dies in a patient's brain. However, another neuron dies and needs replacing. This continues over a number of years. Given that the replacements are able to EXACTLY duplicate the thought processes of the original neurons, at what point does the patient stop being human?
A further point... at what point does the 'belief' neuron get replaced?
This has nothing to do with the return of the Cybermen on Dr Who...Honest!
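The scenario can be sketched as a toy simulation - threshold units standing in for neurons, which is a huge simplification and purely an assumption for illustration. If each chip computes exactly the same function as the neuron it replaces, the "brain" behaves identically at every stage of replacement:

```python
def biological_neuron(inputs, weights, threshold=1.0):
    """A crude threshold model standing in for a real neuron."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def chip_neuron(inputs, weights, threshold=1.0):
    """The replacement chip: by stipulation, EXACTLY the same function."""
    return biological_neuron(inputs, weights, threshold)

# A tiny "brain": each slot holds either a biological neuron or a chip.
weights = [0.6, 0.5, 0.3]
brain = [biological_neuron] * 10

for step in range(10):                      # replace one neuron per "year"
    brain[step] = chip_neuron
    outputs = [n([1, 1, 0], weights) for n in brain]
    assert outputs == [1] * 10              # behaviour identical at every stage

print("All-chip brain behaves identically to the all-biological one.")
```

By stipulation the observable behaviour never changes, which is exactly why "at what point does the patient stop being human?" has no behavioural answer.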
|
|
|
05-22-2006, 03:22 AM
|
#22
|
Guest
|
Quote:
BornOK1st wrote
This is my first post so be gentle with me. First off, it's nice to see a debate on this site rather than the usual theist bashing or name-calling. Anyway - I'd like to pose the opposite question. I can't take the credit for it - it was actually posed by a neurologist (can't remember who, sorry). The point is:
A computer company develops a small chip that is able to exactly replicate the behaviour of a neuron. This is used to replace a neuron that dies in a patient's brain. However, another neuron dies and needs replacing. This continues over a number of years. Given that the replacements are able to EXACTLY duplicate the thought processes of the original neurons, at what point does the patient stop being human?
A further point... at what point does the 'belief' neuron get replaced?
This has nothing to do with the return of the Cybermen on Dr Who...Honest!
|
Heh. I haven't seen Dr. Who for ages.
First, we must analyze what is meant by 'being human'. Is a 'human' a sentient native to the planet Earth? Or is it the result of a specific type of evolutionary process where the most generic animal is the one to eventually have the most advanced brain? In that case, there could be humans on other planets with greatly different pasts.
Further, would the term 'human' be used as a derogatory term against sentient machines? I.e. "Well, you're not even human..."
We think of being human as the pinnacle of evolution... the pimpest of the pimp. The top of the food chain. And, in our current state of ignorance, we are.
Anyway, back to your question...
Being 'human' should be defined as having a human mind, one that is largely deterministic and operates as a network of neurons that largely conforms to a standard set by... well, I suppose a committee will have to be set up.
So yes, your cybernetically modified person will still be human. Even if the implants augment the original mind to a small degree... he or she would still be human.
Even if the implants completely replace the brain. Still a human. I would expect that you don't get to reclassify the mind until it's augmented to well beyond human ability.
But if you build it without an original... and construct a psyche to put in it... even a gestalt of several human minds... you might be getting into a different classification of being. What? I don't know.
|
|
|
05-22-2006, 04:01 AM
|
#23
|
Guest
|
You should check out the new series - David Tennant rules.
Thanks for the response. Your argument seems to be along the lines of 'if it looks like a duck, etc.'. If so, to use the cyberman analogy again (or possibly the replicants in Blade Runner would be better), at what point do you stop being human during the replacement process? If your legs are replaced there is no argument; arms similarly. Your heart, kidneys, liver - pretty much any internal organ or external body part could be replaced and you would still qualify as human. However, there must come a point when being human (as opposed to displaying 'humanity') is passed and you become an intelligent machine.
It may be that that is simply what we are and, certainly biologically, there is a strong argument for that view.
Would there then be a difference between a human who has had his entire body replaced by cybernetics and therefore has human memories and developed emotions and a cloned human body that has no life experiences to fall back on but that is capable of learning and feeling emotion?
I suggest that although the logical answer might be 'no', should such a thing become reality there would be a very different social differentiation between the two.
Is it really Monday morning? I must have been on some really strong coffee last night to be thinking this stuff at the beginning of a week!
|
|
|
05-22-2006, 04:50 AM
|
#24
|
Obsessed Member
Join Date: Oct 2005
Location: Oxford, UK.
Posts: 2,330
|
I'm working today so don't have time to discuss it, so (as I sometimes do) I'm going to recommend an SF short. This is pretty much a science fiction topic anyway.
http://www.fictionwise.com/ebooks/eBook876.htm is, if nothing else, interesting and raises this issue of duplication in a way that rings true as far as the neurological and computing aspects are concerned. I'm pretty sure the Raelians take their SF way too seriously.
"You care for nothing but shooting, dogs and rat-catching, and will be a disgrace to yourself and all your family"
|
|
|
05-22-2006, 03:18 PM
|
#25
|
I Live Here
Join Date: Feb 2005
Posts: 23,211
|
Quote:
Metman07 wrote
I think we've gotten a little off track here. The aim of the Raelians isn't to create a clone that is separate from oneself. They don't want to just copy themselves, they want to inhabit a new body. It is their aim to live forever by transferring their minds to new bodies. When they're old, they'll create a younger, zombie version of themselves. This mindless zombie's physical development will be accelerated. It will reach the age of 24 or so. Then their minds are downloaded into a computer and then uploaded into the zombie's brain. So now, the old man has a healthy new body and he's extended his life for another 60 or so years at the end of which he can repeat the process.
Obviously this kind of technology is well beyond our capabilities and we don't even know if it is possible. But I just wanted to address the philosophical issue of whether this new person is the "same" person. At first I thought that, from the perspective of the person being cloned, he wouldn't take on a new life... the clone would be a new cognitive entity. But now, after reading what some other posters have written, that may be an irrelevant question. For all we know, we might constantly be turning into new cognitive entities. The "me" that I am right now may be a very different "me" from two years ago. But I don't know; all I know is that I retain the same memories, traits etc.
|
Right, but even (especially) your memories and mine, the very foundation of our personhood, also change and degrade over time. Do you vividly remember that party for your fifth birthday (feel free to substitute one of your own vivid memories from a similarly distant past)? Actually, science has learned that we only retain bits and snippets of such a "memory" and the details are filled in like patching the holes in a moth-eaten blanket. Contrary to recent popular belief, the human brain is not a video recorder, in principle able to recall every single sensory input for our whole life.
I understand and appreciate your point about the Raelian approach to cloning. I'm saying that the ability to clone people, with or without working brains, is very near if not already here. Duplicating a single memory, much less a whole personality, will not be possible for centuries at the present growth rate of our understanding of the brain's internal mechanisms and how to engineer them. I could envision a whole-brain, pound-for-pound transfer into the new body. IMV brain transplants will happen long before we can read, store and impress a mind upon living gray matter.
"Those who most loudly proclaim their honesty are least likely to possess it."
"Atheism: rejecting all absurdity." S.H.
"Reality, the God alternative"
|
|
|
05-22-2006, 03:52 PM
|
#26
|
I Live Here
Join Date: Feb 2005
Posts: 23,211
|
Quote:
Gathercole wrote
To get back on track as Metman suggested, the central question seems to be, What would it feel like to have your mind cloned? Or what would it feel like to have your brain downloaded into a computer?
I don't think it's ever possible to know unless you do it yourself. Let's say you make a copy of Joe's brain in a computer. Then you ask Joe, What happened? Joe would say, "I was just lying here, then you did the procedure, and I'm still lying here." Now you ask the computer version of Joe what happened. "Well, I was lying over there, and then it felt like I went to sleep, and I woke up here in this computer." Both versions have the same memories up until the point of transfer, so there's no question you could ask that could tell you which is the *real* continuation of Joe's consciousness.
That doesn't really answer the question of what it would feel like to have it done, though. I'm almost certain you would feel like Human Joe, not Computer Joe. After all, copying information from your brain doesn't disrupt it, so why would you feel like you went into the computer? Of course, you would have created another consciousness, Computer Joe, who says that's exactly what happened. So in other words, I don't think copying your mind, either into another body or a computer, would give you the feeling of extending your consciousness.
|
I agree. Even the part about going to sleep, or feeling as though one has briefly been asleep, seems unnecessary.
Is it your position, as earlier expressed by some on this thread, that sentience once created constitutes a person who has a right to live?
Would the consciousness in the computer prevent you from turning it off? If, being a computer with non-volatile memory, you could turn it off and later turn it back on again without loss to the consciousness, wouldn't that be all right? If so, how long could it morally be left turned off - years? Would it be OK to offload one consciousness to backup off-site storage and load another "person" in its place? Would a backup copy be yet another sentience we must preserve? Indeed, forgetting cloning for a moment, given the technology you describe for moving conscious entities between living brains and future computers, would it be OK or would it be multiple murders to time-share a body and brain among, say, three separate and distinct people in eight-hour shifts? Each person, unaware of anything during their “downtime”, not even dreaming, would feel as though they were in continuous operation of the host body. They would feel as though they were able to function all day, every day with no breaks needed, but their friends would feel that they were around only eight out of each twenty-four hours.
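The off/on and time-sharing scenarios map onto ordinary state serialization. A sketch in Python, with `pickle` standing in for the non-volatile storage (no claim that minds actually work this way - the dict "mind" is purely an assumption for illustration):

```python
import pickle

# A toy conscious state.
mind = {"memories": ["born", "uploaded"], "awake": True}

# "Turning it off": snapshot the complete state to non-volatile storage.
mind["awake"] = False
snapshot = pickle.dumps(mind)
del mind                      # the running instance ceases to exist

# Years later, "turning it back on": the state resumes unchanged.
restored = pickle.loads(snapshot)
restored["awake"] = True
restored["memories"].append("resumed with no sense of elapsed time")

print(restored["memories"])
```

From the inside, the restored state is continuous with the old one; only outside observers register the gap - which is the time-share situation exactly.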
Perhaps the mind transfer issue is separate from the clone issue. Let’s just clone brainless bodies and use them for spare parts and let those bodies never have a sentience to begin with. We could replace puppy farms with non-person farms. That may sound horrid (yes, it does), but remember that the bodies are non-persons because they have no possibility of sentience, not by some decree of a tyrant. Just like steers for burgers, these bodies would only come into existence as a source of parts for the real person they sprang from. If you think this is outrageous, please tell me precisely why you think so.
"Those who most loudly proclaim their honesty are least likely to possess it."
"Atheism: rejecting all absurdity." S.H.
"Reality, the God alternative"
|
|
|
05-23-2006, 02:15 PM
|
#27
|
Guest
|
I do think sentience should be the basis for "human" rights. After all, on what basis is a species assigned human rights? Let's say we discover a sentient alien race: Wookies (from Star Wars). Should we assign Wookies human rights? If human rights are just for humans, then obviously we shouldn't give them to Wookies. But I think most people would feel that Wookies should get human rights, based on their being sentient.
I think the computer thought experiment is very cool, because it shows that we really do assign human-like rights based on sentience, rather than something like having human DNA (which some opponents of abortion would say). I wonder what abortion opponents would say about cancelling a partial download of a person's consciousness, before the download became conscious? Would that be murder, because the download has the *potential* to become conscious, even though it didn't? :)
Sternwallow, to answer your time-sharing question, I think people would have to depend on their moral intuition, which is not really equipped for a problem where minds and bodies can be separate. But my intuition tells me that if you download a mind onto a computer, you should give that mind all the rights it would have if it were in a meat body; and that means you can't force it to become unconscious any more than you could forcefully anesthetize someone without reason.
To give my opinion on the "backup" question, I would say that you can't back up a mind. Copying a person's mind onto a computer creates a new mind. It's important to remember that computers cannot "move" files. When you tell a computer to move a file, what it's really doing is copying the file and then deleting the original. Where minds are concerned, we should be careful about copying if we have a problem with deletion.
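The copy-then-delete point can be made concrete. Strictly, a "move" on the same filesystem is just a rename (a metadata update); the copy-plus-delete behaviour appears when moving across filesystems, which is the fallback Python's `shutil.move` uses. Here those two steps are performed explicitly:

```python
import os
import shutil
import tempfile

# "Moving" a file across devices is really copy-then-delete under the
# hood (shutil.move falls back to copy2 + unlink when os.rename fails).
# We perform those steps explicitly to make the point visible.
work_dir = tempfile.mkdtemp()
src = os.path.join(work_dir, "mind.dat")
dst = os.path.join(work_dir, "mind_moved.dat")

with open(src, "w") as f:
    f.write("Joe's connectome")

shutil.copy2(src, dst)   # step 1: duplicate the contents
os.unlink(src)           # step 2: delete the original

print(os.path.exists(src))   # False -- the "original" is gone
print(open(dst).read())      # Joe's connectome
```

If deletion is morally loaded, then a "move" by this route is really a copy followed by a deletion of the original.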
|
|
|
05-23-2006, 08:01 PM
|
#28
|
Guest
|
The real question is, who the fuck wants to live forever anyway?
|
|
|
05-24-2006, 03:52 AM
|
#29
|
Guest
|
Quote:
Gathercole wrote
I do think sentience should be the basis for "human" rights. After all, on what basis is a species assigned human rights? Let's say we discover a sentient alien race: Wookies (from Star Wars). Should we assign Wookies human rights? If human rights are just for humans, then obviously we shouldn't give them to Wookies. But I think most people would feel that Wookies should get human rights, based on their being sentient.
I think the computer thought experiment is very cool, because it shows that we really do assign human-like rights based on sentience, rather than something like having human DNA (which some opponents of abortion would say). I wonder what abortion opponents would say about cancelling a partial download of a person's consciousness, before the download became conscious? Would that be murder, because the download has the *potential* to become conscious, even though it didn't? :)
Sternwallow, to answer your time-sharing question, I think people would have to depend on their moral intuition, which is not really equipped for a problem where minds and bodies can be separate. But my intuition tells me that if you download a mind onto a computer, you should give that mind all the rights it would have if it were in a meat body; and that means you can't force it to become unconscious any more than you could forcefully anesthetize someone without reason.
To give my opinion on the "backup" question, I would say that you can't back up a mind. Copying a person's mind onto a computer creates a new mind. It's important to remember that computers cannot "move" files. When you tell a computer to move a file, what it's really doing is copying the file and then deleting the original. Where minds are concerned, we should be careful about copying if we have a problem with deletion.
|
Right. And this would mean that the characters in Star Trek are actually being deleted every time they teleport. Basically they are being rebuilt in another location. The new Rikers, Picards, Crushers etc. are in fact new cognitive entities. But they have the same memories etc. as the people prior to transport. So these people feel that they've never "died".
But this may mean that we are constantly becoming new cognitive entities ourselves but we do not know it! Although brain cells stop dividing quite early in comparison to other cells, early on they do divide. Our bodies are constantly being rebuilt. The "me" of now is possibly a different consciousness from the "me" at age 2. But I have no way of knowing. If I were becoming a new consciousness every second, the current consciousness would never know it. The old consciousness would not know that it died and the new consciousness wouldn't know that it's new. I find that pretty fascinating.
But it makes me wonder, if we ever developed teleportation technology like they have in Star Trek (very unlikely in our lifetime), would I - as in my current consciousness- want to be transported using the method?
|
|
|
05-24-2006, 04:17 AM
|
#30
|
Obsessed Member
Join Date: Oct 2005
Location: Oxford, UK.
Posts: 2,330
|
I think we're basically in the same place now. Looking back on my earlier posts I may have expressed myself badly.
My feeling on it:
You duplicate a person by whatever means. Before the duplication there was one sentient entity; after, there are two. My point was that they both have a good claim to be the original. It's as if the person divided - neither is more "original" than the other. Both have consciousness.
However, if "the old consciousness would not know that it had died and the new consciousness would not know that it was new", then I would argue that by any non-theistic definition of consciousness they would be the same consciousness. Thus, after a clone or whatever, both would be a continuation of the original, although they would differ from each other.
As far as taking backups is concerned I see no ethical problem with storing a backup so long as the backup has not been switched on. The moment it is, it starts to acquire its own experiences and is a separate sentient being.
I personally would not be worried about transporters - all these technologies depend on an accurate mathematical or algorithmic version of an individual consciousness being possible (to be turned into a pattern and beamed across space or whatever). (There are non-theistic claims that consciousness is not algorithmic - Penrose etc. - but I'm dubious.) This being so, surely the rules of mathematical identities would apply - two identical algorithms are in fact the same algorithm? Or have I misunderstood the maths implications?
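The algorithm-identity point can be illustrated with a toy sketch (whether it transfers to consciousness is exactly the open question): two textually identical algorithms compute the same mathematical function, yet once instantiated they are distinct objects that can diverge:

```python
# Two separately constructed copies of the "same" algorithm.
def successor_a(n):
    return n + 1

def successor_b(n):
    return n + 1

# Extensionally they are one function: identical output for every input...
assert all(successor_a(n) == successor_b(n) for n in range(1000))

# ...but as runtime entities they are distinct objects with separate state.
assert successor_a is not successor_b
successor_a.history = ["called once"]        # one copy acquires "experience"
assert not hasattr(successor_b, "history")   # the other is unaffected
```

Mathematically there is one successor function; at runtime there are two instances, and only one of them acquires experiences - which mirrors the backup that has not yet been switched on.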
"You care for nothing but shooting, dogs and rat-catching, and will be a disgrace to yourself and all your family"
|
|
|