
View Full Version : Mind upload & resleeving



Salbazier
2013-01-18, 10:10 PM
Anyone interested to talk about this topic?

Do you think these technologies can feasibly be realized in the future? What kind of impact would they have on society if they were realized? Are these technologies, or the kind of society they would create, desirable to you?

I was introduced to the terms and their implications by Eclipse Phase. I was aware of the concept previously but never thought much about its serious implications. Personally, never mind using them: even living in a society where they are accepted, or merely exist, is undesirable to me. Thinking about it, resleeving is probably the most scary/disturbing imagined future tech for me. Even the singularity, grey goo, and posthuman modification are not that frightening or disturbing (at least as topics of discussion; being confronted by some nanobot predator swarm would certainly be scary).

Concerning feasibility, mind uploading I can see being possible. The problems are understanding how the brain actually works and building a computer powerful enough to simulate it. That may take a very long time, but progress is continually being made on both fronts, so it does seem like a logical progression. Resleeving/mind download into a biological brain is another matter. Rewiring a whole brain seems like it would require replicator-grade tech.

Grinner
2013-01-18, 10:23 PM
Ah. Eclipse Phase and its edgy terminology. How I love that game. :smallsigh:

Here's the problem. Eclipse Phase never actually explains how cortical stacks, uploading, or resleeving works. There is some mention of Ego bridges, but that's about as far as it goes.

The "proper" term for this practice is digital immortality (https://en.wikipedia.org/wiki/Digital_immortality), sometimes virtual immortality. The thing about digital immortality is that it's generally regarded as useless for extending the human lifespan. Why? Because currently envisioned techniques only copy and simulate the brain. You're not transferring your consciousness into a machine; you're birthing a digital copy of yourself.

Edit: Here's the kicker. Our currently practicable techniques require that the brain be sliced and copied, layer by layer. In other words, we must destroy the brain in order to copy it at the right resolution.

huttj509
2013-01-18, 10:32 PM
Ah. Eclipse Phase and its edgy terminology. How I love that game. :smallsigh:

Here's the problem. Eclipse Phase never actually explains how cortical stacks, uploading, or resleeving works. There is some mention of Ego bridges, but that's about as far as it goes.

The "proper" term for this practice is digital immortality (https://en.wikipedia.org/wiki/Digital_immortality), sometimes virtual immortality. The thing about digital immortality is that it's generally regarded as useless for extending the human lifespan. Why? Because currently envisioned techniques only copy and simulate the brain. You're not transferring your consciousness into a machine; you're birthing a digital copy of yourself.

Edit: Here's the kicker. Our currently practicable techniques require that the brain be sliced and copied, layer by layer. In other words, we must destroy the brain in order to copy it at the right resolution.

Ah, the Star Trek mass murder issue. If you are destroyed, then a consciousness is created that, from its perspective, was you right up until you were destroyed.

Well, to mix references, are you the man in the theater, or the man in the tank? (The Prestige covered the issue rather well)

Soras Teva Gee
2013-01-18, 10:40 PM
Yeah, unless one supposes as fact some existence beyond simple substance (a soul), there seems to be no such thing as mind uploading. As far as we know the brain is not like a hard drive, so any process that did not destroy the brain would have to, y'know, leave it intact; ergo you still exist in it. An "uploaded" mind is a copy.

With a soul it is philosophically possible, but such a concept is too mystical for most sci-fi, which is where you see the idea most.

Salbazier
2013-01-19, 12:09 AM
Ah. Eclipse Phase and its edgy terminology. How I love that game. :smallsigh:

Here's the problem. Eclipse Phase never actually explains how cortical stacks, uploading, or resleeving works. There is some mention of Ego bridges, but that's about as far as it goes.

The "proper" term for this practice is digital immortality (https://en.wikipedia.org/wiki/Digital_immortality), sometimes virtual immortality. The thing about digital immortality is that it's generally regarded as useless for extending the human lifespan. Why? Because currently envisioned techniques only copy and simulate the brain. You're not transferring your consciousness into a machine; you're birthing a digital copy of yourself.

Edit: Here's the kicker. Our currently practicable techniques require that the brain be sliced and copied, layer by layer. In other words, we must destroy the brain in order to copy it at the right resolution.

I'm pretty sure mind uploading is quite popular in other settings besides EP, and I don't know a 'proper' term for 'resleeving' other than that one. Yeah, sorry if bringing up EP annoys you. God knows that setting has problems.

Anyway, I actually agree that DI/MU is useless for extending the life of the actual/original person (which is one of the reasons why I wouldn't care to participate in it if it ever existed, destructive brain scan or not). Still, some people apparently claim that destruction of the old and creation of the new at the same time is sufficient for continuity of existence. Same with Star Trek teleporters, which I think is what huttj509 was referring to.

From the perspective of the copy, this may all be irrelevant, though. Because (ignoring the psychological issues that may arise from the realization of being a copy) the copy is experiencing a continuous existence.


Yeah, unless one supposes as fact some existence beyond simple substance (a soul), there seems to be no such thing as mind uploading. As far as we know the brain is not like a hard drive, so any process that did not destroy the brain would have to, y'know, leave it intact; ergo you still exist in it. An "uploaded" mind is a copy.

With a soul it is philosophically possible, but such a concept is too mystical for most sci-fi, which is where you see the idea most.

Um, not really. As far as I understand it, 'uploading the mind' has exactly the same meaning as 'copying the mind'. Whether or not the mind still exists in the previous medium determines whether it is a duplication of the self or a moving of the self, but that's a separate issue from the upload itself.

Coidzor
2013-01-19, 12:21 AM
Considering that human morality does not advance nearly as quickly as our technology, I see a lot more ill than good for quite some time with such technologies.

The only non-frivolous use that's not fairly immoral (see: making copies of egos for slave labor in Eclipse Phase, a criminal activity common enough to come up throughout the core rulebook; also the entire idea of forced uploading) would be effective insurance against permanent death from accidents and the like.

willpell
2013-01-19, 12:33 AM
On a primitive reptile-brain level, the idea of mind upload does scare me. But logically, I strongly suspect that the "soul" is in no danger. Not sure if I can discuss why without bumping up against the forum's religion ban; I have a very strongly developed spiritual philosophy which has nothing to do with any existing church/cult/sect/etc. of which I'm aware.

In any event, if this technology existed, I'd force myself to overcome my childish reluctance and be one of the second or third adopters - not an early guinea pig, but as soon as it was even a little proven I'd be all over the bleeding edge of the new wave. It's just like how I forced myself to get on a plane even though I was deeply apprehensive before, terrified during, and exhausted by the ordeal after... ultimately it was good for me, and it needed to be done so that I could go have easily the most unique and profound experience of my last several years.

We're always scared of change, but change is inevitable; we must be prudent and sensible in assessing risks, but we shouldn't let ourselves be paralyzed into inaction by our superstitions. Otherwise, we're like someone who is so afraid of the dentists' drill that they spend their whole life getting cavities, and end up with far more pain than they would have suffered in the first place. (Granted they can also spend that time psyching themselves up for that pain, so it's not a complete loss, but in general it's much more sensible to take the efficient path of preventative maintenance.)

Grinner
2013-01-19, 12:48 AM
We're always scared of change, but change is inevitable; we must be prudent and sensible in assessing risks, but we shouldn't let ourselves be paralyzed into inaction by our superstitions. Otherwise, we're like someone who is so afraid of the dentists' drill that they spend their whole life getting cavities, and end up with far more pain than they would have suffered in the first place. (Granted they can also spend that time psyching themselves up for that pain, so it's not a complete loss, but in general it's much more sensible to take the efficient path of preventative maintenance.)

Did you just compare a dentist's drill to existential uncertainty? :smallconfused:

Soras Teva Gee
2013-01-19, 01:32 AM
Um, not really. As far as I understand it, 'uploading the mind' has exactly the same meaning as 'copying the mind'. Whether or not the mind still exists in the previous medium determines whether it is a duplication of the self or a moving of the self, but that's a separate issue from the upload itself.

Exactly so. It all simply means "you" as a particular person cannot do it, so it's sheer sloppy language to speak of something that doesn't actually occur.

You copy yourself. Since your copy would be equally a person, they would be entitled to all the same rights and privileges as a person; ergo there's no purpose that's both ethical and practical to be achieved by it.

No secret doubles, no faux immortality, etc. As such it's just a novel means of asexual reproduction, like cloning. Destroying the original adds a dose of murder/suicide to the mix.

I brought up souls because with an immaterial self that has a material vessel, one can philosophically replace that vessel. Of course this could be easily proven right or wrong by a non-destructive means of "uploading": should the original remain as themselves, it would kind of defeat the demonstration. However, for fiction it's perfectly fine for souls to be verifiable phenomena.

Salbazier
2013-01-19, 08:49 AM
Exactly so. It all simply means "you" as a particular person cannot do it, so it's sheer sloppy language to speak of something that doesn't actually occur.



If 'uploading' were defined as 'copying into a medium' instead of 'moving into a medium', then it would be proper to say 'I was uploaded' the same way it is proper to say 'I was copied' or 'the file was uploaded'. Maybe that's not the basic meaning of the word, but when a digital file is uploaded/downloaded, isn't that what actually happens? A copy of the original file is created at the new storage location from the sent/received information.
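The file analogy can be made concrete with a quick sketch (plain Python; the file names are just stand-ins): an "upload" reads the original's bytes and writes a new, bit-identical copy elsewhere, while the original stays put.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

# An "original" file standing in for the thing being uploaded.
src_dir = Path(tempfile.mkdtemp())
original = src_dir / "mind.dat"
original.write_bytes(b"memories, preferences, aptitudes")

# "Upload" it: read the bytes, write them to a new location.
remote = src_dir / "server_copy.dat"
shutil.copyfile(original, remote)

# The original still exists, untouched...
assert original.exists()

# ...and the copy is bit-for-bit identical to it.
digest = lambda p: hashlib.sha256(p.read_bytes()).hexdigest()
assert digest(original) == digest(remote)

print(original.exists(), remote.exists())  # True True
```

Deleting the original afterwards doesn't change what the copy is; it only adds a deletion on top of the copy.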




You copy yourself. Since your copy would be equally a person, they would be entitled to all the same rights and privileges as a person; ergo there's no purpose that's both ethical and practical to be achieved by it.

No secret doubles, no faux immortality, etc. As such it's just a novel means of asexual reproduction, like cloning. Destroying the original adds a dose of murder/suicide to the mix.



Somewhat agree. Without even going into rights and such, there is no personal benefit for me in being uploaded, since the actual me will cease to exist regardless. There is at least one possible purpose, though. If you have something you want done even after your death, and that something requires your presence, or at least someone with exactly your aptitudes and preferences, then mind uploading seems like a good way to ensure it. If the copy is perfect, he would likely do things as you want them done without any push; if he would not, then with you dead, they would not get done anyway. Leaving a copy so the next generations can know you personally (approximately) is another.

@willpell

I am not really scared for myself. Being scanned (non-destructively), uploaded, and archived is not scary by itself. It may even be a novel and interesting experience. Seeing a copy of me come to life and interacting with it would be disconcerting, but the real scare is for my copy. He is the one who would really feel a crisis of existence. I don't want to experience, or inflict on others (including and especially my own copy), that kind of mental burden.

If the scan were destructive, or resleeving were possible, another scare would be living in a society whose members regularly practice casual suicide.

KillianHawkeye
2013-01-19, 08:51 AM
Well I just finished watching the first season of Dollhouse, so if the process is anywhere near as easy as it is on that show, sign me up to be transferred into a new body ASAP.

Kato
2013-01-19, 09:07 AM
Semantics aside, I think this might be feasible... let's say sooner than we think. Like 100 years or so? Faster? I wouldn't say it couldn't be faster, if it's possible at all. (It might be.)
(On a side note, I wish I could believe in a soul, but from a scientific point of view... I don't. So for the sake of this post, there's no such thing as a soul.)

And... I'm not sure how I feel about it. It's just an idea that's really hard to grasp. It's, as people have said, something between immortality and cloning. I have no idea where this would go... Maybe we get limitations, like people being allowed to make backups of themselves but nothing more.
Creating possibly trillions of "new humans" in digital form would... I have no idea what that would encompass. Would they be able to vote? Would they be required to work? (Probably.) What rights, if any, would you give them?
If we just make backups and nobody breaks the law, we are still immortal, unless there's a power-down or something (but then you could always back up the backup). And while having some people's minds forever available would certainly do a service to society, it seems really weird if there are no "real" deaths anymore. Kind of...

Really, I don't feel I could possibly come up with a judgement apart from: it's really weird and different from anything we know (and I kind of don't want to think about it, which might mean I dislike it).

Hylleddin
2013-01-23, 07:46 PM
Yeah, unless one supposes as fact some existence beyond simple substance (a soul), there seems to be no such thing as mind uploading. As far as we know the brain is not like a hard drive, so any process that did not destroy the brain would have to, y'know, leave it intact; ergo you still exist in it. An "uploaded" mind is a copy.

With a soul it is philosophically possible, but such a concept is too mystical for most sci-fi, which is where you see the idea most.

It's quite philosophically possible without a soul, if any perfect copy of you is you.

It just depends on whether you are the pattern or the meat.

Das Platyvark
2013-01-23, 09:30 PM
Long story short, I'm not a fan.
I'm very much a futurist, but my personal hope for the far future is a physical one. At the advent of our technology, my biggest wish is to see a world in which one's physical form is based on their consciousness and not the other way round. It's my only real foray into transhumanism. I just don't like the idea of leaving behind the physical world when all this lovely stuff goes on here, even and especially if I couldn't tell the difference between this and the digital world. The point is not that digital me be ok with the transition; the point is that I be willing to make the transition, and I am not.

warty goblin
2013-01-25, 11:23 AM
Just remember, in the land of the digitally uploaded, the guy in meatspace with his hand on the circuit breaker is king.

The Glyphstone
2013-01-25, 11:34 AM
Just remember, in the land of the digitally uploaded, the guy in meatspace with his hand on the circuit breaker is king.

Not really any different from now, then, except in the means/tool used to kill people, and possibly the ease of doing so.

And even then - cutting the power to a current-day computer doesn't erase its files, you just lose any unsaved progress. If there's another meatspace person to turn everything back on afterwards, it's just a reversion back to a previous 'save point'.
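The 'save point' idea is just checkpointing, which can be sketched in a few lines (plain Python, with illustrative names): a power cut loses only whatever accumulated since the last save.

```python
import copy

# A mind-state stand-in: mutable state plus a persisted checkpoint.
state = {"memories": ["childhood", "first job"], "tick": 0}

# Save a checkpoint (think: written to non-volatile storage).
checkpoint = copy.deepcopy(state)

# Keep running: new experiences accumulate in volatile state.
state["memories"].append("tuesday's lunch")
state["tick"] = 1000

# Power cut: volatile state is gone; reload from the checkpoint.
state = copy.deepcopy(checkpoint)

print(state["memories"], state["tick"])  # ['childhood', 'first job'] 0
```

Everything after the last checkpoint is simply missing from the restored state, which is exactly the 'unsaved progress' being described.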

Surrealistik
2013-01-25, 11:41 AM
The achievement of clinical immortality is something I hope happens in our lifetime, as I would like to live as long as is possible; consciousness is too enjoyable to cease. That aside, oblivion is likewise too terrifying to contemplate, an eternity of nothing, even though I logically know that it is not something I would be able to 'experience', nor be cognizant of.

As has been mentioned before, many forms of immortality really aren't, featuring the creation of a copy rather than the transference of the original, so these aren't for me.

Elderand
2013-01-25, 12:12 PM
We are so far from any such thing it's not even funny.

You want mind uploading? You're pretty much forced to simulate a brain completely: all its chemistry, the quantum state of every single particle in it, to near-infinite precision. Which is actually impossible.

Grinner
2013-01-25, 12:21 PM
You want mind uploading? You're pretty much forced to simulate a brain completely: all its chemistry, the quantum state of every single particle in it, to near-infinite precision. Which is actually impossible.

Hey! There's something we hadn't thought of: working some quantum magic with the brain's electrical impulses and entangling them into a non-organic replication of the uploadee's brain.

It doesn't bypass the Swampman/Star Trek murder problem and might not address the problem of memory fully, but it's a start.

Flickerdart
2013-01-25, 12:24 PM
Not really any different from now, then, except in the means/tool used to kill people, and possibly the ease of doing so.

And even then - cutting the power to a current-day computer doesn't erase its files, you just lose any unsaved progress. If there's another meatspace person to turn everything back on afterwards, it's just a reversion back to a previous 'save point'.
So what happens to the rest of the world when everyone starts save scumming?

Ravens_cry
2013-01-25, 12:29 PM
Mind uploading could make an utter shambles of democracy as we know it.
Now, in a lot of fiction, copying an uploaded mind is impossible or at least dangerous, but I don't see any real reason this would be so. Data is data after all. What do you do when you can just Ctrl-C a mind, a vote?

Grinner
2013-01-25, 12:32 PM
Mind uploading could make an utter shambles of democracy as we know it.
Now, in a lot of fiction, copying an uploaded mind is impossible or at least dangerous, but I don't see any real reason this would be so. Data is data after all. What do you do when you can just Ctrl-C a mind?

Seems to me that it would require an awful lot of hardware. I don't disagree that a prevalence of mind uploading technology would create a massive shift in society, but there would still be limitations for the general populace.

warty goblin
2013-01-25, 01:14 PM
Mind uploading could make an utter shambles of democracy as we know it.
Now, in a lot of fiction, copying an uploaded mind is impossible or at least dangerous, but I don't see any real reason this would be so. Data is data after all. What do you do when you can just Ctrl-C a mind, a vote?

Data is data, once it's been gathered. The gathering stage, however, seems fairly fraught for digitizing a brain. Generally, the more precise the measurement, the harder it is to make, and extracting information from a brain would take a lot of very precise physical measurements. There seem to be plenty of opportunities for things to go pear-shaped when that vast number of operations is occurring in your skull.

Ravens_cry
2013-01-25, 01:18 PM
Seems to me that it would require an awful lot of hardware. I don't disagree that a prevalence of mind uploading technology would create a massive shift in society, but there would still be limitations for the general populace.
All the more dangerous.
Sure, it would initially require pretty massive hardware, but look at the incredible changes in computing over the 20th century and the early 21st.
If quantum computing ever keeps its promises, we could see even greater leaps forward.
After all, your brain is not much over a kilogram, and that's not much hardware at all.

Data is data, once it's been gathered. The gathering stage, however, seems fairly fraught for digitizing a brain. Generally, the more precise the measurement, the harder it is to make, and extracting information from a brain would take a lot of very precise physical measurements. There seem to be plenty of opportunities for things to go pear-shaped when that vast number of operations is occurring in your skull.
That's rather irrelevant. Copying the mind repeatedly wouldn't be copying the wetware brain over and over, but rather the data collected initially, which you agree is simply data, "once it's gathered."

Elderand
2013-01-25, 01:24 PM
All the more dangerous.
Sure, it would initially require pretty massive hardware, but look at the incredible changes in computing over the 20th century and the early 21st.
If quantum computing ever keeps its promises, we could see even greater leaps forward.
After all, your brain is not much over a kilogram, and that's not much hardware at all.

Congratulations, you've just vastly underestimated how a brain works.
Computer hardware and even programs are laughably simple compared to a brain. You want something closer? Try the internet, the whole of it: not just the hardware and software but even the interactions between the people who use it, and you get closer to what a brain does.

Grinner
2013-01-25, 01:32 PM
All the more dangerous.
Sure, it would initially require pretty massive hardware, but look at the incredible changes in computing over the 20th century and the early 21st.
If quantum computing ever keeps its promises, we could see even greater leaps forward.
After all, your brain is not much over a kilogram, and that's not much hardware at all.

I'd be more concerned about the existence of quantum processors than mind uploading, frankly.

warty goblin
2013-01-25, 01:34 PM
That's rather irrelevant. Copying the mind repeatedly wouldn't be copying the wetware brain over and over, but rather the data collected initially, which you agree is simply data, "once it's gathered."

Yes. However, if there's a fairly non-negligible chance that a person doesn't survive the process of uploading, it's still not going to be a widely adopted technology.

Flickerdart
2013-01-25, 01:36 PM
Mind uploading could make an utter shambles of democracy as we know it.
Now, in a lot of fiction, copying an uploaded mind is impossible or at least dangerous, but I don't see any real reason this would be so. Data is data after all. What do you do when you can just Ctrl-C a mind, a vote?
Does copying a mind make it a distinct individual with its own vote? Seems to me that duplicates would still be one "person" for that purpose.

warty goblin
2013-01-25, 01:39 PM
Does copying a mind make it a distinct individual with its own vote? Seems to me that duplicates would still be one "person" for that purpose.

In which case you mindclone yourself ten billion times right before the election, vote a bunch, terminate the unnecessary duplicates, and hold a firesale on lightly used digi-brains.

Flickerdart
2013-01-25, 01:40 PM
In which case you mindclone yourself ten billion times right before the election, vote a bunch, terminate the unnecessary duplicates, and hold a firesale on lightly used digi-brains.
You still get one vote, not ten billion.

warty goblin
2013-01-25, 01:43 PM
You still get one vote, not ten billion.

I still do. The ten billion brain-clones of me all just happen to vote the same way. The election now goes to the party with the most hardware.

At that point you may as well just let people sell their votes on an open market. It'd be a more effective form of wealth redistribution at least.

Flickerdart
2013-01-25, 01:45 PM
I still do. The ten billion brain-clones of me all just happen to vote the same way. The election now goes to the party with the most hardware.

At that point you may as well just let people sell their votes on an open market. It'd be a more effective form of wealth redistribution at least.
No, you get one vote total. The other ten billion of you are still you.

Grinner
2013-01-25, 01:46 PM
You still get one vote, not ten billion.

But they think and feel just like you. Why shouldn't they get their own vote? You could apply the same argument to a clone. The question really is "What defines a person?"

warty goblin
2013-01-25, 01:50 PM
But they think and feel just like you. Why shouldn't they get their own vote? You could apply the same argument to a clone. The question really is "What defines a person?"

And it gets worse the longer your digi-brain exists, since it will inevitably diverge from its source material. The day after you digitize yourself, you and your binary buddy are probably fairly similar. But a year later, when you're off eating hamburgers and all that other meatspace stuff while your digital copy is doing...whatever the hell digitized brains do?

You're going to be very different people.

Flickerdart
2013-01-25, 01:52 PM
But they think and feel just like you. Why shouldn't they get their own vote? You could apply the same argument to a clone. The question really is "What defines a person?"
Why shouldn't you be able to buy one copy of Photoshop and then install it on every single computer in the world? Because of licensing. You only have one citizenship license, regardless of how many of you there are.
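The licensing idea amounts to tallying votes by a license ID rather than by individuals; a toy sketch (plain Python; the license IDs and parties are made up):

```python
from collections import Counter

# Each ballot is (citizenship_license_id, choice). All copies of one
# person carry the same license ID.
ballots = [
    ("license-42", "Party A"),   # the original
    ("license-42", "Party A"),   # digital copy #1
    ("license-42", "Party A"),   # digital copy #2
    ("license-99", "Party B"),   # an unrelated voter
]

# Count only the first ballot cast per license; later ones are ignored.
seen = set()
counted = []
for license_id, choice in ballots:
    if license_id not in seen:
        seen.add(license_id)
        counted.append(choice)

tally = Counter(counted)
print(tally)  # Counter({'Party A': 1, 'Party B': 1})
```

This only encodes the policy, of course; whether copies ought to share one license is the question being argued.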

Elderand
2013-01-25, 01:53 PM
And it gets worse the longer your digi-brain exists, since it will inevitably diverge from its source material. The day after you digitize yourself, you and your binary buddy are probably fairly similar. But a year later, when you're off eating hamburgers and all that other meatspace stuff while your digital copy is doing...whatever the hell digitized brains do?

You're going to be very different people.

It's worse than that. Given that to copy a brain you need to know its quantum state perfectly, which is impossible, right from the start you and your copies are going to be different. In fact you never get a copy of your mind, you just get an approximation, and every copy will be approximated differently. And the more time passes, the larger the divergence will get, even between copies kept in perfectly similar conditions.
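The divergence claim can be illustrated with any chaotic system (this is a toy, not a brain model): seed two 'copies' with a difference of one part in a billion and iterate a chaotic map, and the gap grows until the states are macroscopically different.

```python
# Logistic map in its chaotic regime: x -> r * x * (1 - x).
r = 3.9
original = 0.400000000
copy_ = 0.400000001      # imperfect "scan": off by one part in a billion

max_gap = 0.0
for step in range(100):
    original = r * original * (1 - original)
    copy_ = r * copy_ * (1 - copy_)
    max_gap = max(max_gap, abs(original - copy_))

# The initially negligible error has grown to macroscopic size.
print(max_gap > 0.1)  # True
```

Whether the brain amplifies small errors this aggressively is an open question, but any sensitivity at all means approximate copies drift apart over time.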

The Glyphstone
2013-01-25, 01:59 PM
I still do. The ten billion brain-clones of me all just happen to vote the same way. The election now goes to the party with the most hardware.

At that point you may as well just let people sell their votes on an open market. It'd be a more effective form of wealth redistribution at least.

I'd be more worried about the ten billion counts of first-degree murder you'd potentially be charged with. If they're legally distinct persons from you, then you did just 'kill' them, and if they're not, they can't cast separate votes.

Ravens_cry
2013-01-25, 02:05 PM
Congratulations, you've just vastly underestimated how a brain works.
Computer hardware and even programs are laughably simple compared to a brain. You want something closer? Try the internet, the whole of it: not just the hardware and software but even the interactions between the people who use it, and you get closer to what a brain does.
The assumption is that the technology exists to gather that data, and once the data is gathered, exactly what is stopping that data from being duplicated?

Does copying a mind make it a distinct individual with its own vote? Seems to me that duplicates would still be one "person" for that purpose.
Which is a whole other can of worms. From the perspective of each copy, they are a distinct person with goals and desires that are their own. Why shouldn't they get a vote of their own?

Yes. However, if there's a fairly non-negligible chance that a person doesn't survive the process of uploading, it's still not going to be a widely adopted technology.
If the choice were that or certain death, which is certain for everyone in the end, I bet many would choose uploading if it were available. All a level of unreliability would mean is a reason to delay the process until the alternatives look worse.

Grinner
2013-01-25, 02:07 PM
Why shouldn't you be able to buy one copy of Photoshop and then install it on every single computer in the world? Because of licensing. You only have one citizenship license, regardless of how many of you there are.

You know, I had been thinking about that too, and I began working on a list of qualifiers and their implications.


DNA - Twins are considered different people.
Legal status - What about undocumented immigrants?
Possession of a body - It's dismissive of the mind, but not all people are of sound mind. Maybe that could work? It would apply to clones as well.
Possession of a mind - Cogito ergo sum, yes? The problem is that of citizenship. Perhaps if they demonstrated the ability to live on their own, but not all people can do that. Perhaps if someone vouched for them?
History - Does it have a history of its own? Often, the thing that defines a person is not what they are, but what they have been, their place in society.
Acceptance - Existence as a human is often part of a group effort, right? So maybe acknowledgement as a person by the community is all that it would need?
Edit: Biological pedigree - Well, yeah, that could work, but you would end up tossing out a bunch of sentient minds. I suppose that would work like "Possession of a body", but without the possibility of upgrading a mind to personhood.



Let me know if you think of any more.

Elderand
2013-01-25, 02:15 PM
The assumption is that the technology exists to gather that data, and once the data is gathered, exactly what is stopping that data from being duplicated?

Because part of the "data" is tied at a fundamental level to the quantum state of molecules in the brain. That means it is impossible to get an accurate copy, because of the uncertainty principle. You can approximate it, but if you do, it's not you anymore, just an eerily similar person.

So due to the laws of physics, you can't have a perfect copy of this data, no matter what technology is involved. And we are nowhere near having the technology or knowledge necessary to get even that in the foreseeable future.

Ravens_cry
2013-01-25, 02:29 PM
Because part of the "data" is tied at a fundamental level to the quantum state of molecules in the brain. That means it is impossible to get an accurate copy, because of the uncertainty principle. You can approximate it, but if you do, it's not you anymore, just an eerily similar person.

So due to the laws of physics, you can't have a perfect copy of this data, no matter what technology is involved. And we are nowhere near having the technology or knowledge necessary to get even that in the foreseeable future.
Sounds like the Penrose theory of mind, which has never had much actual evidence.
We've actually created working simulations of a portion of a rat brain. While this is a far cry from a whole rat brain, much less a human one, it's the bare beginnings of a start, and it seems to point against the idea that such information is quantum and therefore cannot be copied.
Even if it is, and you can't create an exact copy, 'eerily similar' would be enough like you to create the issues I have mentioned.

Elderand
2013-01-25, 02:45 PM
It's not the Penrose theory of the mind.

What I mean is that by nature you cannot know everything there is to know about something. It's the uncertainty principle; it's a real thing. The functionality of a brain, and by extension the mind, is dependent on billions of interactions of many different kinds. The outcome of these interactions depends on their initial state, but you cannot perfectly know this initial state for even one of those factors, let alone billions of them.

Even if it is possible to build a simulation of a brain, which is not a given, it is impossible to "copy" a brain.
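For reference, the uncertainty principle being invoked here is Heisenberg's relation between position and momentum:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

with ħ ≈ 1.055 × 10⁻³⁴ J·s. Whether this bound actually matters for copying a mind depends on whether cognition is sensitive to state below the much coarser chemical and synaptic level, which is exactly what's in dispute here.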

Ravens_cry
2013-01-25, 02:55 PM
You may be able to know enough, to make a working copy of a mind without having to take into account the quantum state of every atom and subatomic particle of the cells.
That is the conceit of this thread, and I am pointing out some of the implications of it.

warty goblin
2013-01-25, 06:23 PM
We've actually created working simulations of a portion of a rat brain. While this is a far cry from a whole rat brain, much less a human one, it's an interesting bare beginnings of a start and seems to point against the idea that such information is quantum and therefore can not be copied.


Simulating is very different from copying though. Like an entirely different class of problem different. By definition, a simulation just needs to closely resemble or track whatever it is built to simulate. It can do this via abstractions and simplifications that allow a high degree of mimicry without actually being a direct one-to-one mapping. A copy needs to be a one-to-one mapping.

And the amount of information it would take to copy a brain is stupendous. At a very minimum you'd probably need the exact location of every molecule in the thing, and quite possibly detailed information on things like charge at the atomic level as well. Whether or not needing to know the energy levels of all those ionic compounds the brain throws around is enough for the uncertainty principle to bite a person in the ass I don't know.
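To put a rough number on "stupendous": even a far coarser, synapse-level map (never mind every molecule) is already enormous. The neuron and synapse counts below are commonly cited order-of-magnitude figures; the bytes-per-synapse value is purely an assumption for illustration.

```python
# Back-of-envelope estimate of the data needed for a synapse-level
# brain copy. Neuron/synapse counts are commonly cited figures; the
# bytes-per-synapse value is an assumed placeholder for illustration.
neurons = 86e9             # ~86 billion neurons
synapses_per_neuron = 7e3  # order-of-magnitude average
bytes_per_synapse = 64     # assumed: connectivity + weight + chemistry

total_synapses = neurons * synapses_per_neuron
total_bytes = total_synapses * bytes_per_synapse
print(f"synapses: {total_synapses:.1e}")         # prints 6.0e+14
print(f"storage:  {total_bytes / 1e15:.0f} PB")  # prints 39 PB
```

And that's tens of petabytes for a map that stops at the synapse; a molecule-level copy would be many orders of magnitude beyond it.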

Regardless, the level and accuracy of data capture necessary renders this extremely implausible.

Ravens_cry
2013-01-25, 06:39 PM
Perhaps, or perhaps not. That's to copy the brain. What brain uploading would mean is copying the software. It's possible that the function can be mapped without going into that level of physical detail. After all, a simulated mind won't need things like cellular repair and respiration functions. On a higher level, much of the automated functions can be stripped away. You don't need the systems that regulate the beating of the heart or the inhaling and exhaling of the lungs.
I'm not saying it will be easy, we are basically nowhere near it at this point, but it might just be possible.

Carry2
2013-01-25, 08:04 PM
Now this is what I wanted to discuss on the Singularity thread.


We are so far from any such thing it's not even funny.

You want mind upload? You're pretty much forced to simulate a brain completely, all its chemistry, the quantum state of every single particle in it to near infinite precision. Which is actually impossible.
ORLY? (http://en.wikipedia.org/wiki/Blue_Brain_Project) (This is a little like saying that you can't simulate weather without drilling down to individual water molecules, which we know to be impossible in practice and useless even in theory (http://en.wikipedia.org/wiki/Chaos_theory). There are such things as useful higher-level holistic approximations.)

The timeframe for this kind of stuff is really hard to gauge. It's almost certainly going to involve development of Strong AI first, as otherwise you won't have the theoretical model for simulating human intelligence outside of a human brain. Otherwise, the only real requirement is extremely precise brain-scanning, down to the synaptic and perhaps biochemical level. (Not that you need to simulate every atom, but a rough estimate of neurotransmitter levels would be helpful.) The latter is probably not that far away, the former is a bit of a crap shoot.

The question of copying vs. destruction/recreation is an interesting one. I don't think destructive scanning would be such an ethical problem for volunteers among the terminally ill, if you performed the process just after natural death (or possibly euthanasia. And in a certain sense, we're all going to be 'terminally ill' at some point.)

Psychologically speaking, the idea of some kind of gradual, organic transition from one state to another is easier to accept, but that's probably much harder to engineer (and may not be philosophically distinct from destruction/recreation.) In theory, you could jam a lot of finely-integrated microprocessor infrastructure into the brain to interface with machine storage/AI-simulation, and combine that with longevity treatments to keep the biological components in good working order (or just let them atrophy away.) But so far as we can tell, any 'upload/download' process would actually be a relatively slow process, taking days, weeks or months, because the brain can't just rewire all its synapses at the flick of a switch. More like accelerated learning, really. And the hardware needed to do this verges on magical-do-anything-nanotech.


Mind uploading could make an utter shambles of democracy as we know it.
Now, in a lot of fiction, copying an uploaded mind is impossible or at least dangerous, but I don't see any real reason this would be so. Data is data after all. What do you do when you can just Ctrl-C a mind, and with it a vote?
If you think DRM is contentious now, wait until the Pirate Bay starts distributing Turing-test-capable AIs. ...For sex droids. ...Modelled after Lena Olin. ...After her cryogenically frozen remains go suspiciously missing.

All I can suggest is that, in principle, higher intelligence should be more capable of devising defences against potential abuses of such technology, as well as more capable of abusing the technology.

I'd be more worried about the ten billion first-degree murder charges you'd potentially be charged with. If they're legally distinct persons from you, then you did just 'kill' them, and if they're not, they can't cast separate votes.
Yeah, I'm puzzled by how you register 10 billion suspiciously-similar people to vote without the authorities raising a few eyebrows.

Ravens_cry
2013-01-25, 09:36 PM
All I can suggest is that, in principle, higher intelligence should be more capable of devising defences against potential abuses of such technology, as well as more capable of abusing the technology.

Since when did higher intelligence get involved?:smallconfused:

Coidzor
2013-01-25, 10:23 PM
Since when did higher intelligence get involved?:smallconfused:

Strong AI always brings out the singularity which brings out the deus ex machina.

Ravens_cry
2013-01-25, 10:28 PM
Strong AI always brings out the singularity which brings out the deus ex machina.

Perhaps, but mind uploading isn't Strong AI, it's human intelligence transferred to a different medium.

Coidzor
2013-01-25, 11:41 PM
Perhaps, but mind uploading isn't Strong AI, it's human intelligence transferred to a different medium.

As I read it, part of the assumptions of the post was that such would be basically impossible before we had Strong AI to use as a model anyway and the general consensus of such things seems to be that if one makes a Strong AI then you've got the Singularity on your hands.


Now this is what I wanted to discuss on the Singularity thread.


It's almost certainly going to involve development of Strong AI first, as otherwise you won't have the theoretical model for simulating human intelligence outside of a human brain.

kitep
2013-01-26, 03:36 AM
In which case you mindclone yourself ten billion times right before the election, vote a bunch, terminate the unnecessary duplicates, and hold a firesale on lightly used digi-brains.

If they're people enough to vote, terminating them would be mass murder. And since they all know your plan, it may not be so simple...

Selrahc
2013-01-26, 04:13 AM
The voting thing is a non-issue. At least within a British legal framework.

In order to vote, you don't *just* need to be a person. You need to be a registered voter within the country. In order to be a registered voter within the country, you need to be accepted as a citizen of the country. Birth certificate, passport, etc. If the mind clones are all created in the country, that might be doable. But it will be a lengthy procedure, particularly as you completely overwhelm it by making 10 billion demands on it. Next you need to register to vote, which is another lengthy procedure. 100 years later, when the civil servants have gone through all ten billion applications, twice....

Next step is the fact that the British (and American) systems don't work on a proportional representation system. They're tied to locality. So your network of computers must be evenly distributed across electoral districts, or you're just going to end up with one incredibly secure candidate/state around the region of your network hub. But in order to register to vote somewhere you need to prove a local connection. This process would be easier in America, where you just need one computer hub in 50 states to decide the presidential election, but if you also want to control the house and the senate.... you're upping the number of locations you need to cover. If you can prove a local connection, you can send in a postal vote, but since these are brand new citizens, the only real way to do it is residency.

The amount of hoops you need to jump through to make this work are staggering.

Ravens_cry
2013-01-26, 09:40 AM
As I read it, part of the assumptions of the post was that such would be basically impossible before we had Strong AI to use as a model anyway and the general consensus of such things seems to be that if one makes a Strong AI then you've got the Singularity on your hands.
Or maybe it's the other way around. We won't have Strong AI until we have a mind we know works that we can study in detail, the kind of detail only mind 'uploading' would allow.

Kato
2013-01-26, 10:16 AM
Or maybe it's the other way around. We won't have Strong AI until we have a mind we know works that we can study in detail, the kind of detail only mind 'uploading' would allow.

Somehow this makes slightly more sense to me, to be honest. Yeah, we need a certain level of technology to copy a brain but reading a mind and then analyzing it seems much easier than coming up with it on the spot... Then again you need high level technology to copy a brain and maybe AI will appear out of nowhere once we get that.

warty goblin
2013-01-26, 10:58 AM
Now this is what I wanted to discuss on the Singularity thread.

ORLY? (http://en.wikipedia.org/wiki/Blue_Brain_Project) (This is a little like saying that you can't simulate weather without drilling down to individual water molecules, which we know to be impossible in practice and useless even in theory (http://en.wikipedia.org/wiki/Chaos_theory). There are such things as useful higher-level holistic approximations.)


As we say in statistics, all models are wrong, some are useful. You can get away with high-level abstractions for predicting weather because you don't need to be exactly right. If you predict a quarter inch of rain next Tuesday, and it only rains 3/16ths of an inch, your prediction is wrong, but still quite useful.

If you are simulating a generic brain it's quite plausible this would work as well. After all your goal is just to be usefully close to what a human brain would do; it doesn't need to match up with anybody specifically.

But when you go to upload a specific brain, you're playing a whole different, much harder, ballgame. Your room for 'usefully wrong' goes way down, because every time your computer model of my brain gets it wrong, you get somebody other than me. If your goal is to actually have my particular brain and its attendant consciousness in a box, that's non-usefully wrong.

(In other words, you've got a whole lot more acceptable variance in the generic brain model.)
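The generic-vs-specific gap can be sketched numerically. Everything below is invented for illustration: a "generic" simulation only has to land somewhere inside the population's range of responses, while a "specific" upload has to reproduce one particular individual's response.

```python
import random
random.seed(1)

# Toy illustration: a generic brain model passes if it lands anywhere
# in the observed human range; a specific upload must match one
# individual's value closely. All numbers here are made up.
population = [random.gauss(100, 15) for _ in range(1000)]  # some trait score
target_person = 87.3  # the one (hypothetical) brain we want to copy

def valid_generic(sim):
    # anywhere in the observed human range counts as "a person"
    return min(population) <= sim <= max(population)

def valid_specific(sim, tol=0.5):
    # must reproduce *this* person, within a tight tolerance
    return abs(sim - target_person) <= tol

sims = [random.gauss(100, 15) for _ in range(1000)]  # naive model outputs
print(sum(valid_generic(s) for s in sims))   # nearly all pass
print(sum(valid_specific(s) for s in sims))  # only a handful do
```

The same simulator is a fine "person" almost every time and the right person almost never, which is the variance point in miniature.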

Ravens_cry
2013-01-26, 11:12 AM
I'm not so sure. A generic working mind simulation would have to have the same granularity as a specific working mind simulation, would it not?
What is a generic brain anyway?

warty goblin
2013-01-26, 11:27 AM
I'm not so sure. A generic working mind simulation would have to have the same granularity as a specific working mind simulation, would it not?
What is a generic brain anyway?

The same granularity sure, but nowhere near the specificity. So if you fired up your brain simulation and it decided its favorite painting was the Mona Lisa, that's a valid answer. If you uploaded my brain, and the digital me said that, it's not a valid answer because the Mona Lisa doesn't do all that much for the meatspace me. The digital me is not, in fact, me.

Looked at more broadly, if you want is a piece of software that responds like a person, you've got a very wide range of acceptable outcomes because people are themselves diverse. If you ask a bunch of people about their favorite paintings, you'll get a range of answers. There's variance in humanity, so your simulation needs to end up somewhere in the range of typical human responses in order to successfully simulate a person. It can still be wrong, obviously, but right is a fairly big target.

An individual however has exceedingly low variability. So your simulation has to be vastly more precise in order to meaningfully approximate that person to the level of having virtualized their mind.

It's like if you asked somebody to go find you a dog, vs. asking them to go find your favorite dog Banjo. There's a lot of solutions to the first problem, and only one to the second. And yes, this simile has problems, so let's not obsess over it.

Ravens_cry
2013-01-26, 11:37 AM
Really? If it's working, it will have to be very specifically generic, with the same number of variables. If we can reach the granularity to create one, why not the other?
To go further with the dog simile, it's actually like making a dog from scratch, and I do mean from scratch. Creating a 'generic' dog is just as complicated as creating a specific dog. You can use boilerplate data for the former, but you will still need the same level of detail of data, the granularity.

warty goblin
2013-01-26, 12:01 PM
Really? If it's working, it will have to be very specifically generic, with the same number of variables. If we can reach the granularity to create one, why not the other?
To go further with the dog simile, it's actually like making a dog from scratch, and I do mean from scratch. Creating a 'generic' dog is just as complicated as creating a specific dog. You can use boilerplate data for the former, but you will still need the same level of detail of data, the granularity.

I'm guessing you don't do a lot of mathematical modeling work, do you?

Firstly we're talking about using a higher level simulation. This gets around the impractically large amount of information it would take to build a virtual brain from scratch by using a bunch of carefully chosen simplifications and approximations. Your approximations can be (and will be) wrong, but still produce something useful for the 'generic brain' case if it results in something that behaves close to how a brain acts. So if your model produces results in the typical range for pre-frontal cortexes, it's a good pre-frontal cortex simulator.

For simulating Mr. Johnson's specific brain however, you need to do better. Just getting results typical for pre-frontal cortexes at large doesn't hack it anymore. You need to produce pretty much the exact same results as Mr. Johnson's specific pre-frontal cortex. So you need to know how Mr. Johnson's cortex is different from your generic case, and how to model those differences. It's a harder problem.

(It gets harder when you realize the more variables you add to your model, the less certainty you have that it's close to the truth.)

Ravens_cry
2013-01-26, 12:21 PM
A working generic brain is *exactly* the same level of simulation as a working non-generic brain, otherwise it won't work.

Mando Knight
2013-01-26, 01:04 PM
I have a different problem with mind uploading and resleeving: stream of consciousness. It's going to be a/the copy that gets resleeved in most of the scenarios... the one who wanted to try the resleeving doesn't get to engage in it unless it's a cut-and-paste job rather than a copy-and-paste one.

Kato
2013-01-26, 01:59 PM
A working generic brain is *exactly* the same level of simulation as a working non-generic brain, otherwise it won't work.
I'm not warty goblin but I think I can come up with a pretty simple simile for your discussion:
Task (generic): Make a painting. Difficulty: easy.
Task (specific): Paint the Mona Lisa. Difficulty: hard.

Yeah, it's not a perfect metaphor but I think you get the idea. It's way harder to make a specific item than something that meets just a few specifications. Oh, creating a digital mind is still way out of our league, but creating any digital mind given the possible tools would still be easier than making a copy of an existing one (probably, once we know how a mind works. To use the former metaphor: if you have no idea what a painting is, copying the Mona Lisa might just be easier)

Ravens_cry
2013-01-26, 04:14 PM
I am sorry, but to me, it's a terrible metaphor. A working brain that is average in every respect (the only definition of 'generic' that can fit the human brain) still has all the complexity of a specific working brain.
Now, there is the difficulty of getting the information on specific variables and connections of a specific brain, but the working simulation of either will still have exactly as many variables.

Grinner
2013-01-26, 04:34 PM
As far as I can tell, you all are advocating two entirely different approaches.

Ravens_cry's amounts to cutting and pasting several "brainscans" together and running the result through a simulation.

warty goblin's method revolves around mathematically abstracting certain functions of the brain as part of the simulation.

Either way, they're each perfectly suitable for the original issue, the development of strong AI, and they both have their respective flaws. In the worst-case scenario, Ravens_cry's AI would be riddled with neurological disorders arising from flaws in the modelling of the virtual brain. warty goblin's AI would be able to pass a Turing test, but some may regard it as a p-zombie.

Actually, the p-zombie thing is likely to always be a problem.

Ravens_cry
2013-01-26, 04:46 PM
Eh, the p-zombie is a problem for everyday existence, if you think about it.
Who is the 'me' that is the gestalt of this multi-trillion cell colony that forms my body, and do others share this property? Oh, others say they do, but by what observation can I be certain they speak the truth?
To the reader: you may, or may not, feel the same way about me.
Frankly, I would not describe what warty goblin is describing as a 'generic' simulated human brain, but 'simplified' or 'abstracted'.

warty goblin
2013-01-26, 05:06 PM
Frankly, I would not describe what warty goblin is describing as a 'generic' simulated human brain, but 'simplified' or 'abstracted'.

A simulation is always built on simplifications and approximations of some sort. The question is how many of them you can get away with before you start to erode its predictive efficacy.

The alternative to using a higher level mathematical system is an extremely detailed physical* simulation. The computational overhead for this is really quite high, and comes with the added drawback of needing absolute gobs of extremely precise information on the actual physical construction of the subject's brain to pull off.

*Which itself is of course an approximation. Even for 'simple' problems like objects in space, a computer uses very frequently updated timeslice approximations. It can't handle the literally infinite calculational load of doing things with perfect precision.
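The footnote's point is easy to demonstrate: even a one-line differential equation gets integrated in discrete timeslices, and the answer is always an approximation whose error depends on the step size. A minimal sketch:

```python
import math

# Timeslice approximation in action: integrate y' = -y, y(0) = 1 with
# explicit Euler steps and compare against the exact answer e^-1.
def euler_decay(dt, t_end=1.0):
    """Step y' = -y forward in fixed timeslices of width dt."""
    y, t = 1.0, 0.0
    while t < t_end - 1e-12:
        y += dt * (-y)  # one timeslice update
        t += dt
    return y

exact = math.exp(-1.0)  # true value of y(1)
err_coarse = abs(euler_decay(0.1) - exact)
err_fine = abs(euler_decay(0.001) - exact)
print(err_coarse, err_fine)  # the finer timeslice is ~100x more accurate
```

Neither result is exact; you just pay more computation to make the error small enough to ignore, which is the trade a brain simulation would be making at every level.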

Flickerdart
2013-01-27, 01:17 AM
P-zombie isn't really a serious concern because there's no way to determine whether or not anything is a p-zombie.

kitep
2013-01-27, 03:58 AM
I'm not warty goblin but I think I can come up with a pretty simple simile for your discussion:
Task (generic): Make a painting. Difficulty: easy.
Task (specific): Paint the Mona Lisa. Difficulty: hard.

And just for fun, I'll give a counter example :)

Printing a copy of the Mona Lisa: easy
Printing a copy of a stick figure: easy

As far as the operator is concerned, both tasks are identically difficult.

Or it could be worse. If you don't give your printer a specific picture to print, but instead try to issue a "print a generic portrait" command, the computer will issue a "does not compute" error. So in this case, the specific is easy, the generic is hard.

As for duplicating a brain, which is easier depends on how it's accomplished. If you start with hand generated code, then generic is easier. If you start with a brain scan, then specific is easier.

Kato
2013-01-27, 08:27 AM
And just for fun, I'll give a counter example :)

Printing a copy of the Mona Lisa: easy
Printing a copy of a stick figure: easy

As far as the operator is concerned, both tasks are identically difficult.

Or it could be worse. If you don't give your printer a specific picture to print, but instead try to issue a "print a generic portrait" command, the computer will issue a "does not compute" error. So in this case, the specific is easy, the generic is hard.

As for duplicating a brain, which is easier depends on how it's accomplished. If you start with hand generated code, then generic is easier. If you start with a brain scan, then specific is easier.
Sorry, but I feel you're missing the point to some extent. Yeah, the painting thing was an oversimplified metaphor.

It depends on the device you use. The discussion is concerned with "creating a mind from scratch", as far as I can gather. So there is nothing to copy, or rather you don't have a scanner/printer to work with. Which means, you can more easily make the stick figure in paint than the Mona Lisa. But that's still not quite the problem. The thing is, even making something that closely resembles the Mona Lisa with some minor changes is much easier than making a perfect copy. This is true for pretty much anything, except a "copy paste" process.

@raven: Nobody's saying it's simple to make a working brain, but making a specific brain is still harder (unless copy-paste, bla).
(Heck, even making an imperfect copy is most of the time easier than making a perfect one.)

Tiki Snakes
2013-01-27, 11:08 AM
The voting thing is a non-issue. At least within a British legal framework.

In order to vote, you don't *just* need to be a person. You need to be a registered voter within the country. In order to be a registered voter within the country, you need to be accepted as a citizen of the country. Birth certificate, passport, etc. If the mind clones are all created in the country, that might be doable. But it will be a lengthy procedure, particularly as you completely overwhelm it by making 10 billion demands on it. Next you need to register to vote, which is another lengthy procedure. 100 years later, when the civil servants have gone through all ten billion applications, twice....

Next step is the fact that the British (and American) systems don't work on a proportional representation system. They're tied to locality. So your network of computers must be evenly distributed across electoral districts, or you're just going to end up with one incredibly secure candidate/state around the region of your network hub. But in order to register to vote somewhere you need to prove a local connection. This process would be easier in America, where you just need one computer hub in 50 states to decide the presidential election, but if you also want to control the house and the senate.... you're upping the number of locations you need to cover. If you can prove a local connection, you can send in a postal vote, but since these are brand new citizens, the only real way to do it is residency.

The amount of hoops you need to jump through to make this work are staggering.

Not to mention you'd need to somehow provide council tax etc for the billions of new citizens and so on. Now there's a bill you don't want to find on your doormat in the morning. :smallsmile:

kitep
2013-01-27, 12:31 PM
Sorry, but I feel your missing the point to some extent. Yeah, the painting thing was an oversimplified metaphor.

It depends on the device you use. The discussion is concerned with "creating a mind from scratch", as far as I can gather.

Actually, it looks like we agree. If you're creating from scratch, then generic is way easier than specific.

But since the title of the thread is "mind upload" and not "creating strong AI from scratch", I can see a scenario where it's accomplished by duplicating all the brain connections and whatnot, and it being easier to copy a specific existing brain than to copy a non-existing generic brain.

Ah, topic drift. Sometimes I can't keep up :)

Coidzor
2013-01-28, 01:26 PM
Actually, the p-zombie thing is likely to always be a problem.

Huh. And here, I'd thought that the whole p-zombie thing was just a bit of absurdist philosopher humor. :smallconfused:

Carry2
2013-01-28, 04:29 PM
If you think DRM is contentious now, wait until the Pirate Bay starts distributing Turing-test-capable AIs. ...For sex droids. ...Modelled after Lena Olin. ...After her cryogenically frozen remains go suspiciously missing.
BTW, I can't really claim this plot is original (http://en.wikipedia.org/wiki/I_Dated_a_Robot).


As we say in statistics, all models are wrong, some are useful. You can get away with high-level abstractions for predicting weather because you don't need to be exactly right. If you predict a quarter inch of rain next Tuesday, and it only rains 3/16ths of an inch, your prediction is wrong, but still quite useful.

...Your room for 'usefully wrong' goes way down, because every time your computer model of my brain gets it wrong, you get somebody other than me. If your goal is to actually have my particular brain and its attendant consciousness in a box, that's non-usefully wrong.
Ehhh. This question might not really be answerable short of actually building a human replicant and seeing what happens. It just seems like the stuff that happens on the quantum-mechanical level is so unstable and erratic that it's difficult to imagine it making a lasting contribution to personality except in the aggregate.

Grinner
2013-01-28, 05:07 PM
Huh. And here, I'd thought that the whole p-zombie thing was just a bit of absurdist philosopher humor. :smallconfused:

Really? I've always thought it was a very interesting idea.

I've been thinking about what I wrote though, and maybe "p-zombie" wasn't quite the term I had been looking for. "Uncanny valley" is closer to what I meant.

The human mind is a fascinating thing. It's also a sometimes unstable thing. Between the ups, downs, and assorted psychoses, it's hard to pin down what "normal" is. A mathematically abstracted system, however, provides fewer fault points and therefore less room for deviation. Something wholly and unnaturally consistent could result.

Carry2
2013-01-28, 06:09 PM
The human mind is a fascinating thing. It's also a sometimes unstable thing. Between the ups, downs, and assorted psychoses, it's hard to pin down what "normal" is. A mathematically abstracted system, however, provides fewer fault points and therefore less room for deviation. Something wholly and unnaturally consistent could result.
Well, fractals and genetic algorithms are perfectly well-defined as mathematical abstractions. They still exhibit all kinds of discontinuous or unpredictable behaviour.

I find the p-zombie idea is one of those overrated (http://www.youtube.com/watch?v=X8aWBcPVPMo) philosophical concepts (basically the complement of the brain-in-a-jar.) Something very similar was used to argue that higher animals have no form of conscious experience or perception of pain. I incline more to the view that if it looks like a duck, swims like a duck and quacks like a duck, it's a duck.

warty goblin
2013-01-28, 06:39 PM
The human mind is a fascinating thing. It's also a sometimes unstable thing. Between the ups, downs, and assorted psychoses, it's hard to pin down what "normal" is. A mathematically abstracted system, however, provides less fault points in the system and therefore less room for deviation. Something wholly and unnaturally consistent could result.

To paraphrase von Neumann, show me exactly what you think a computer cannot do, and I'll make it do that. You can bend functions to do just about anything conceivable, if you can specify the problem accurately enough.

I suspect the problem with accurately simulating a person's brain is that you need a very good understanding of how brains work in general, and a very good understanding of how that particular person's brain works. An understanding so good it's probably utterly unobtainable.