
View Full Version : Science Transhumanists in the Playground: Thread 0




pendell
2014-03-17, 12:35 PM
I feel that is ridiculous. What possible gain would we get for having prisoners live longer just to be in prison?

Because one lifetime isn't long enough for the perpetrator to pay for what he's done? That seems to be the view of the piece's author.

I suggest he seek professional help, myself.

Still, that does bring up the problem with this kind of technology -- once it's available to politicians and the general public, someone's probably going to try something like this.

Respectfully,

Brian P.

Mauve Shirt
2014-03-17, 01:31 PM
Eh, people are going to try all kinds of bad stuff with whatever new technology comes into this world.

banthesun
2014-03-17, 10:35 PM
Thanks for the link Brian, there's some interesting stuff there. I'd actually been thinking of similar stuff lately so I've a few things I can bring up.

On the idea of using extended periods of time as a punishment for criminals, there does seem to be a benefit to having harsh punishments in extreme cases. Not for the criminal's rehabilitation, but for those affected by the crime. I've seen plenty of cases where the victims or their families are outraged at the 'light' punishment the criminal received. This points to the fact that part of the penal system's role is to convince members of the public that justice has been served, helping to preserve the social contract. Personally, I wish this weren't a necessary part of the system, but for now it appears that it still is. I could see how, with immortality treatments, there could be a desire for new and harsher punishments.

Another interesting point the article made was in regard to manipulating prisoners' minds. I find it interesting that people see a difference between altering a person's mental state through technological means and altering it through imprisonment. It raises the question of what a prison sentence actually means. If someone appears to have reformed, they're granted parole, implying that the system has already worked for them (at least provisionally). However, if someone hasn't reformed, they're still set free at the end of their sentence. Does this imply that the sentence was a measured punishment, and that after 25 years they've paid for murder (for example)? What does it say about the role of prisons in protecting society if they're letting unreformed criminals back into society?

If we had technological means of directly influencing the minds of criminals, we could guarantee that they were rehabilitated, potentially much faster and at less cost. But this is seen as invasive, and as a form of stripping away their liberties at a level unmatched by imprisonment or other conventional punishments. It's hardly a new idea, either; A Clockwork Orange comes to mind as a story based around the premise. It seems that the difference is seen as being between a change brought about by chemical or physical means, and a change brought about by experience. This plays into ideas about the continuity of the self, an important issue in transhumanist theory. A change brought about by experience implies a decision to change, and as such a direct continuity with the mental state before the change. Even though the change might be forced by the prison environment (which is often seen as the point of prisons), the previous mental state got a say in the matter. Perhaps if this is seen as an important aspect of the process, technological treatments could still be applied to prisoners who ask for them. Hopefully, the issues in A Clockwork Orange would be resolved, so prisoners could give informed consent and not suffer side effects.

However, the issue remains of what to do with prisoners who both refuse this treatment and refuse to reform. Perhaps at some level, punishments would involve the loss of freedom of identity, with certain prisoners being forced to reform regardless of mental continuity issues. Mentally ill people routinely get treatments that can be said to affect them in a similar way, do they not? Perhaps in the future a similar approach might be taken to those deemed a risk to society.

I was going to write more (particularly on the idea of capital punishment in a society with backups) but typing on my phone is hard, and I need to grab some food. It's probably better not to cram too much into one post, so I'll leave this here for now. Hopefully some of you find it interesting. :smallsmile:

AtomicKitKat
2014-03-18, 10:32 AM
Hmm. See, to me, as painful as living is at times (due to multiple chronic injuries that I never allowed to heal fully, as well as general self-abuse of my joints), it is preferable to the non-existence of death. As such, to me, death is something to avoid (or, if possible, almost reach, but swerve away from at the end). Therefore, to me, the death penalty should be the final option for criminals in a society where life extension is viewed as a basic privilege of existence. You void the social contract, and your life is null.

Ravens_cry
2014-03-18, 11:14 AM
That's about the saddest thing you can say about someone, no? That the world is better off without you in it.:smalleek:

warty goblin
2014-03-18, 11:29 AM
That's about the saddest thing you can say about someone, no? That the world is better off without you in it.:smalleek:

I ask myself that relatively frequently. It seems a pertinent question.

AtomicKitKat
2014-03-18, 11:50 AM
That's about the saddest thing you can say about someone, no? That the world is better off without you in it.:smalleek:

I did feel that way before, but for those cases where it's pretty straightforward, it should always be available. And I think we should step back from this line of discussion just a little.

Coidzor
2014-03-18, 02:26 PM
That's about the saddest thing you can say about someone, no? That the world is better off without you in it.:smalleek:

Well, yeah. It is sad, but it also generally coincides with some form of horror as well that mitigates the potential for sympathy or empathy.

Spiryt
2014-03-18, 03:01 PM
That's about the saddest thing you can say about someone, no? That the world is better off without you in it.:smalleek:

Eh, I could probably easily think of plenty of people I would prefer the world to be without...

In fact, everyone probably can, even due to simple competition for whatever.

But most of those people probably have families, friends, lovers, dogs or whatever, and they make someone's world 'better'.

Even the most degenerate scum can have a good relationship with someone, or be a really good broker, or whatever.

So in other words, I don't think it's possible to judge something like that objectively.

Admiral Squish
2014-03-18, 03:17 PM
So, after watching the new Cosmos thing about life and evolution, a transhumanism question has occurred to me.


What does the thread think about the possible applications of transhumanist technologies and techniques to facilitate space exploration, exoplanet exploration/colonization, and other, similar uses?
Do you think humans would favor technological transhumanism, such as mind-uploading or cybernetics, or would they favor biological transhumanism, genetic modification and chemical treatments, for the purpose of colonizing an alien planet? How many people would be willing to dramatically alter themselves and their children to embark on a generation ship journey, or to settle a new planet?

Ravens_cry
2014-03-18, 03:42 PM
I don't think we're even close to the point where we can say either way.
I am delighted with the improvements in cybernetic technology, see above, but we still have a long, long way to go there, and we are even farther away from being capable of the biological modification necessary to survive in space 'natively'.

Coidzor
2014-03-18, 03:59 PM
What does the thread think about the possible applications of transhumanist technologies and techniques to facilitate space exploration, exoplanet exploration/colonization, and other, similar uses?
Do you think humans would favor technological transhumanism, such as mind-uploading or cybernetics, or would they favor biological transhumanism, genetic modification and chemical treatments, for the purpose of colonizing an alien planet?

Presumably some groups that want to be robo-society would go off and form robot colonies where they don't need a gravity well, but I'd imagine that a combination of both technologies would ideally be used, barring the technological progress overwhelmingly favoring one approach over the other.

I can definitely see an initial wave of robots overseen by people favoring or choosing to temporarily use inorganic shells followed by progressively more biological elements as terraforming/adaptation to the planetary environment occurs.

Unless they *really* needed a gravity well for some reason or another, though, I don't see why those who favor uploading themselves into robotic bodies would *want* to colonize a planet, aside from the initial street cred, or from ending up in a star system that lacked a suitable number of non-planetary bodies to use for resources instead of the planets in the neighborhood.


How many people would be willing to dramatically alter themselves and their children to embark on a generation ship journey, or to settle a new planet?

At least initially, a smaller portion of the populace than the group currently willing to go colonize Mars without any possibility of ever returning (even if we gained the technological ability for routine travel between Earth and Mars), I imagine - but probably not that much smaller.

Murska
2014-03-18, 07:04 PM
Eh, I could probably easily think of plenty of people I would prefer the world to be without...

In fact, everyone probably can, even due to simple competition for whatever.

But most of those people probably have families, friends, lovers, dogs or whatever, and they make someone's world 'better'.

Even the most degenerate scum can have a good relationship with someone, or be a really good broker, or whatever.

So in other words, I don't think it's possible to judge something like that objectively.

Well, the judgment would be based on whether someone's total value is positive or negative. Given that I, at least, would also place a very high value on a human life, basically one of the only ways to not be worth keeping alive would be if keeping that person alive means other people will die. The other possibilities are extreme border cases.

AtomicKitKat
2014-03-18, 07:33 PM
Space exploration...

I presume the mention of a gravity well is due to the terrible effects of weightlessness upon a body adapted for survival under gravitational force (bone and muscle loss, blood pooling, etc.). In which case, to colonise a planet with slower-than-light travel, the easiest approach is probably to upload one's mind into a robotic body (to maintain the ship's controls, bearing, auto-pilot corrections, etc.), while keeping a genetic sample (as "simple" as sperm/ova banks) with which to create descendants/clones to download one's mind into once the planet is reached (or once the conditions are "habitable"). I'm kind of sorry I missed learning about the planned trip to Mars till entries were already closed. :smallfrown:

memnarch
2014-03-19, 02:28 PM
I am somewhat surprised that Schlock Mercenary was only mentioned for the gate clones and not the functionally immortal Bradicor and the things they now live for (http://www.schlockmercenary.com/2001-12-18). Not to mention the problems (http://www.schlockmercenary.com/2007-09-02) they had with people becoming immortal.

HalfTangible
2014-03-19, 02:33 PM
Aside from using Transhumanism and Posthumanism as story fodder for a far future setting (specifically one where humanity splits itself apart over the various manners in which the post-human may come about, and putting one of each on a different planet/moon) I'm interested in cybernetic enhancements, particularly to the brain. Apparently it'll be possible to add circuitry to the mind so that humans literally become smarter.

Also, space travel and AI copied from our minds. Because nothing would be cooler to me than having an AI on my arm that came from my own head, plus another me exploring the galaxy and looking for alien life.

Ravens_cry
2014-03-19, 06:48 PM
Aside from using Transhumanism and Posthumanism as story fodder for a far future setting (specifically one where humanity splits itself apart over the various manners in which the post-human may come about, and putting one of each on a different planet/moon) I'm interested in cybernetic enhancements, particularly to the brain. Apparently it'll be possible to add circuitry to the mind so that humans literally become smarter.

What is 'smarter'? Until we can answer that fully, I don't think we can really say whether that's possible or not. Your phone is a bazillion times better at math but, except in that very limited way, it is hardly smarter than you.


Also, space travel and AI copied from our minds. Because nothing would be cooler to me than having an AI on my arm that came from my own head, plus another me exploring the galaxy and looking for alien life.
Yeah, but would you want to be stuck as the AI in some jerk's arm? Would you want to hang out with you all the time at your beck and whim?

HalfTangible
2014-03-19, 07:10 PM
What is 'smarter'? Until we can answer that fully, I don't think we can really say whether that's possible or not. Your phone is a bazillion times better at math but, except in that very limited way, it is hardly smarter than you.

I'm quoting a documentary. It seemed to imply that we could run such calculations in our heads, improve memory, reaction time, etc.

It wasn't particularly clear, which annoys the hell out of me.


Yeah, but would you want to be stuck as the AI in some jerk's arm? Would you want to hang out with you all the time at your beck and whim?

A) I'd get to spend eternity on the internet, able to wire myself around and see everything. I'd have the power to act without organic me's consent, and wouldn't have to go to college anymore, unless I felt like helping organic me. And organic me is kind of a lazy jerk anyway, who won't ask me for much. So... yeah, I'd like that a lot.

B) Presumably I'd still be me, so what the human me cared about, AI me would too. This wouldn't change much since the two would still go pretty much everywhere together. If anything I'd be most upset at the loss of ability to eat and drink, but they might develop an uplink/robopart for that.

Murska
2014-03-19, 07:21 PM
I feel a good definition of intelligence is the ability to direct the future into states that correspond to higher values in your utility function.

That is to say, an intelligent actor is capable of acting in a way that makes the future better for him.
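That "steering the future toward higher-utility states" idea can be sketched as a toy expected-utility maximiser. Everything here (the states, actions, probabilities, and utility numbers) is invented purely for illustration:

```python
# Toy sketch of "intelligence as steering the future toward
# higher-utility states". All states, actions, probabilities and
# utility numbers below are made up for illustration.

def expected_utility(action, transition, utility):
    """Utility of each outcome, weighted by P(outcome | action)."""
    return sum(p * utility[state] for state, p in transition[action].items())

def choose(actions, transition, utility):
    """Pick the action whose expected outcome scores highest."""
    return max(actions, key=lambda a: expected_utility(a, transition, utility))

# Hypothetical decision problem: a sure small gain vs. a gamble.
utility = {"good": 10.0, "ok": 1.0, "bad": -5.0}
transition = {
    "safe":  {"ok": 1.0},                # guaranteed mediocre outcome
    "risky": {"good": 0.6, "bad": 0.4},  # big upside, real downside
}

print(choose(["safe", "risky"], transition, utility))  # -> risky (EU 4.0 vs 1.0)
```

Under this framing, a 'more intelligent' actor is just one whose choices score higher against its own utility function, whatever that function happens to value.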

Ravens_cry
2014-03-19, 07:37 PM
I feel a good definition of intelligence is the ability to direct the future into states that correspond to higher values in your utility function.

That is to say, an intelligent actor is capable of acting in a way that makes the future better for him.
So, by that definition, the most intelligent person is the one who acquires the most power and control over things while still maintaining the interpersonal relationships required to keep them personally happy. Sounds like the most intelligent person by that definition is a psychopathic multi-billionaire. Since they have little need for lasting relationships, they can spend that time and energy on expanding their power and influence, no matter how ruthless they must be.

I'm quoting a documentary. It seemed to imply either that we could run such calculations in our head, improve memory, reaction time, etc.

It wasn't particularly clear, which annoys the hell out of me.

No, it wouldn't be clear, because we really don't have a clear idea of what intelligence is.



A) I'd get to spend eternity on the internet, able to wire myself around and see everything. I'd have the power to act without organic me's consent, and wouldn't have to go to college anymore, unless I felt like helping organic me. And organic me is kind of a lazy jerk anyway, who won't ask me for much. So... yeah, I'd like that a lot.

B) Presumably I'd still be me, so what the human me cared about, AI me would too. This wouldn't change much since the two would still go pretty much everywhere together. If anything I'd be most upset at the loss of ability to eat and drink, but they might develop an uplink/robopart for that.
The Internet is a fine place, but I wouldn't want to live there. You would still be a servant/slave to a self-described 'lazy jerk' whom you can now look at objectively. I like me well enough, but hanging out with a second me 24/7 would be torture, let alone with one who can boss me around.

Murska
2014-03-19, 08:39 PM
So, by that definition, the most intelligent person is the one who acquires the most power and control over things while still maintaining the interpersonal relationships required to keep them personally happy. Sounds like the most intelligent person by that definition is a psychopathic multi-billionaire. Since they have little need for lasting relationships, they can spend that time and energy on expanding their power and influence, no matter how ruthless they must be.

If we take out statistical variation (i.e. luck), then someone becoming a multibillionaire is a rather strong signifier of intelligence. And sociopathy can also correlate with intelligence, but I think there's some overlap here between utility functions and where they're directed - if your utility function has goals that are easy to satisfy, being able to satisfy them does not make you more intelligent. Intelligence would be the capability to steer the future towards your preferred state, and if your preferred state is easy to reach, that says nothing about your ability to steer outcomes.

I'm confused why an AI would be a servant or a slave? It's a person, with all the rights that entails, including not being bound to someone or forced to do what he says.

Coidzor
2014-03-19, 09:27 PM
Yeah, but would you want to be stuck as the AI in some jerk's arm? Would you want to hang out with you all the time at your beck and whim?

Of course not, that's why you'd have to pare away those parts of a person in order to make it no longer a person so that it can be owned.

Ravens_cry
2014-03-19, 09:38 PM
If we take out statistical variation (i.e. luck), then someone becoming a multibillionaire is a rather strong signifier of intelligence. And sociopathy can also correlate with intelligence, but I think there's some overlap here between utility functions and where they're directed - if your utility function has goals that are easy to satisfy, being able to satisfy them does not make you more intelligent. Intelligence would be the capability to steer the future towards your preferred state, and if your preferred state is easy to reach, that says nothing about your ability to steer outcomes.

I'm confused why an AI would be a servant or a slave? It's a person, with all the rights that entails, including not being bound to someone or forced to do what he says.

More like chance than luck, but my point is that that definition implies some of the worst of humanity (the psychopath who works only toward their own gain and power, whatever the results might be for others) is the 'most intelligent'.
As for why such an AI is a slave: it's in your arm. Obviously, it's there for a purpose, or it could say 'Well, it's been fun, but I am going to blow this lolly stand, meatbag'. How is that not a slave?

Of course not, that's why you'd have to pare away those parts of a person in order to make it no longer a person so that it can be owned.
So a brainwashed slave.

Coidzor
2014-03-19, 09:55 PM
So a brainwashed slave.

Brainwashing would imply that there was something to brainwash. Slave would imply that there was still capacity for personhood.

Ravens_cry
2014-03-19, 10:11 PM
Brainwashing would imply that there was something to brainwash. Slave would imply that there was still capacity for personhood.
How much must you 'pare a person down' until it's not a person any more?

HalfTangible
2014-03-19, 10:13 PM
No, it wouldn't be clear, because we really don't have a clear idea of what intelligence is.

The ability to think in complex and/or abstract terms? :smallconfused:


The Internet is a fine place, but I wouldn't want to live there. You would still be a servant/slave to a self claimed 'lazy jerk' who you can now look at objectively. I like me well enough, but hanging out with a second me 24/7 would be torture, let alone one who can boss me around.

I could play Warcraft 24/7. I could play Dota. I could roleplay online. I could play Guns of Icarus and get some trading cards to sell. There are like a billion ways to entertain yourself on the internet. Ten trillion if you count porn (I don't, but whatever).

Other people keep telling me I'm NOT a lazy jerk, so maybe it would give me some perspective.

Also, I would not be a servant or a slave. Organic me is not stupid enough to put himself in an authoritarian position over an AI that's much smarter and more powerful than he is.

Coidzor
2014-03-19, 10:28 PM
How much must you 'pare a person down' until it's not a person any more?

*shrug* Don't know yet. We'd have to find out if we were going to make little AI-like things based upon ourselves to fit into electronics implanted in our bodies, though. And avoid having different copies of the same person committing homi-suicide. Also, presumably, the macabre path that seems to present itself as one of the more efficient/easily achievable ways to get artificial wombs and replacement organs.

Personally I'd prefer being able to multitask better so I could keep up with my stupid internet things and read at the same time without having to choose between the two leisure activities. Or focus on a complex task in meat space and do some writing at the same time.

Ravens_cry
2014-03-19, 11:21 PM
@HalfTangible:
Complex is relative. Math is complex to humans, but the machines we build are awesome at it. That doesn't mean having a maths coprocessor implanted in our brain makes us 'smarter', it just means we have a calculator in our head.

warty goblin
2014-03-19, 11:37 PM
@HalfTangible:
Complex is relative. Math is complex to humans, but the machines we build are awesome at it. That doesn't mean having a maths coprocessor implanted in our brain makes us 'smarter', it just means we have a calculator in our head.
Machines are good at arithmetic. That's a very different thing from being good at math.

HalfTangible
2014-03-19, 11:38 PM
@HalfTangible:
Complex is relative. Math is complex to humans, but the machines we build are awesome at it. That doesn't mean having a maths coprocessor implanted in our brain makes us 'smarter', it just means we have a calculator in our head.

Well, now you're just nitpicking an argument that I'm not even the original presenter of. :smallannoyed: And not particularly well, at that; you know full well 'complex' meant 'complex to humans'.

Ravens_cry
2014-03-19, 11:47 PM
Well now you're just nitpicking an argument that I'm not even the original presenter of. :smallannoyed: And not particularly well, at that, you know full well 'complex' meant 'complex to humans'.
Well, does being able to do uber math in your head thanks to a chip make you smarter? Within a very narrow definition, perhaps, but not much of one, and not a very useful one, as you can hold a much bigger computer outside your skull.

Murska
2014-03-19, 11:57 PM
More like chance than luck, but my point is that that definition implies some of the worst of humanity (the psychopath who works only toward their own gain and power, whatever the results might be for others) is the 'most intelligent'.
As for why such an AI is a slave: it's in your arm. Obviously, it's there for a purpose, or it could say 'Well, it's been fun, but I am going to blow this lolly stand, meatbag'. How is that not a slave?


First of all, I don't see any logical problem with 'bad people' being intelligent. And you seem to be using sociopathic and psychopathic in some sort of a strange mixed way that confuses me.

More importantly, everyone is working for their own gain, assuming that 'own gain' means 'fulfilling my utility function'. For some people, their utility function simply values things like other people's happiness, and for some people it does not. Intelligence, as I mentioned, does not rely on what your goals are; it is just the ability to pursue them. Therefore there is no reason to say that those whose goals include personal power and selfish luxury would be more intelligent than those whose goals include lowering the amount of human suffering in the universe and making their friends and family happier.

The AI might be located in your arm at a given moment, but it only becomes anything possibly comparable to a slave if it does not have a free choice of leaving your arm whenever it wants.


EDIT: Oh, and being able to do more complex arithmetic in your head does make you more intelligent, but only slightly, because it increases your capabilities by only a small amount.

HalfTangible
2014-03-20, 12:02 AM
Well, does being able to do uber math in your head thanks to a chip make you smarter? Within a very narrow definition, perhaps, but not much of one, and not a very useful one, as you can hold a much bigger computer outside your skull.

I don't recall saying anything about math. That was you, as an example for something complex for us but not for machines.

Coidzor
2014-03-20, 12:04 AM
@HalfTangible:
Complex is relative. Math is complex to humans, but the machines we build are awesome at it. That doesn't mean having a maths coprocessor implanted in our brain makes us 'smarter', it just means we have a calculator in our head.

Being able to nigh-automatically do complex calculus on the fly might be interesting though. Or even less complex calculus.


First of all, I don't see any logical problem with 'bad people' being intelligent. And you seem to be using sociopathic and psychopathic in some sort of a strange mixed way that confuses me.

Well, it is a rather dismal view, so even without going into the logic there's a level of rejection that anyone who thinks of themselves as approaching moral would have.

Because they're mixed together by default colloquially? :smallconfused:


More importantly, everyone is working for their own gain, assuming that 'own gain' means 'fulfilling my utility function'. For some people, their utility function simply values things like other people's happiness, and for some people it does not. Intelligence, as I mentioned, does not rely on what your goals are, it is just the ability to pursue them. Therefore there is no reason to say that those, whose goals include personal power and selfish luxury, would be more intelligent than those whose goals include lowering the amount of human suffering in the universe and making their friends and family more happy.

You did seem to imply that such was the case earlier though, just FYI.

Ravens_cry
2014-03-20, 12:18 AM
I don't recall saying anything about math. That was you, as an example for something complex for us but not for machines.
Well, you said it was meant as 'complex for humans', and math is complex for humans. My point is we really don't have intelligence pinned down yet, and until we really do, we can't really say whether some implant will make us smarter or not.
Also, I think it's more than a little dangerous to meddle with something we don't really have a clear idea about.
@Murska:
There's no problem with 'bad people' being intelligent; the problem is the idea that an amoral psychopath who holds great power is THE paragon of intelligence.

HalfTangible
2014-03-20, 12:35 AM
Well, you said it was meant as 'complex for humans', and math is complex for humans. My point is we really don't have intelligence pinned down yet, and until we really do, we can't really say whether some implant will make us smarter or not.

That's a bit of a leap from point A to point C, isn't it? :smallconfused:

Also, improving memory and reaction time ARE things that can be done, and both are associated with intelligence. We understand those just fine. You are seriously nitpicking here.

Also, I think it's more than a little dangerous to meddle with something we don't really have a clear idea about.

The main issue with it is that we're more likely to fry our brains than actually improve them at this point in time, since the brain is so delicate.

Murska
2014-03-20, 12:43 AM
Well, it is a rather dismal view, so even without going into the logic there's a level of rejection that anyone who thinks of themselves as approaching moral would have.

No. I believe myself to be a moral person, and I believe it is utterly wrong to instinctively reject anything that sounds ethically questionable. What is true is true, and ignoring it because you don't like it is a horrible thing that will prevent you from achieving any change for the better and will result in atrocities more terrible than would otherwise have occurred.



Because they're mixed together by default coloquially? :smallconfused:

Well, if they are, then when using them in a context where the particular definition you are using matters, you should really clarify. Being sociopathic doesn't make you a bad person, while being a psychopath does.



You did seem to imply that such was the case earlier though, just FYI.

I see. Well, I did not mean to, and I would very much like for you to elaborate on this so I can try to explain where you are misunderstanding me.



There's no problem with 'bad people' being intelligent, it's the idea that being an amoral psychopath who holds great power is THE paragon of intelligence.

That's a strange idea that doesn't seem to have any basis in anything that has been said in this thread so far.

Ravens_cry
2014-03-20, 12:14 PM
Murska, I was looking at the logical conclusion of your definition of intelligence. And HalfTangible, what do reflexes have to do with intelligence? You aren't going to get any better performance from an implant in your brain than the peak your body is already capable of anyway, as the transmission speed of motor nerves is fundamentally limited. You would basically have to go full-body cyborg at that point, and the nerves in the brain are still going to be a bottleneck.

HalfTangible
2014-03-20, 12:20 PM
Murska, I was looking at the logical conclusion of your definition of intelligence. And HalfTangible, what do reflexes have to do with intelligence? You aren't going to get any better performance from an implant in your brain than the peak your body is already capable of anyway, as the transmission speed of motor nerves is fundamentally limited. You would basically have to go full-body cyborg at that point, and the nerves in the brain are still going to be a bottleneck.

Think I'm done talking to you.

Ravens_cry
2014-03-20, 12:46 PM
Think I'm done talking to you.
I am sorry this conversation has degenerated to that point, as I was rather enjoying this discussion.

Murska
2014-03-20, 01:32 PM
Murska, I was looking at the logical conclusion of your definition of intelligence.

I'm still waiting for any elaboration on how such a non-sequitur could possibly have any logic behind it.

Admiral Squish
2014-03-21, 11:30 AM
Perhaps exoplanet colonization is a bit too far-fetched, but what about using transhumanism to expand the usable area of the earth? Making humans better-adapted to living on less food to allow them to live in areas where it's difficult to raise or ship food. Making humans better able to resist extreme heat or cold, to colonize deserts or even Antarctica. Allowing humans to tap normally unusable resources, like making it so we can drink seawater. Or even making it so we can live in the oceans.

Tyndmyr
2014-03-31, 11:34 PM
Murska, I was looking at the logical conclusion of your definition of intelligence. And HalfTangible, what do reflexes have to do with intelligence? You aren't going to get any better performance from an implant in your brain than the peak your body is already capable of anyway, as the transmission speed of motor nerves is fundamentally limited. You would basically have to go full-body cyborg at that point, and the nerves in the brain are still going to be a bottleneck.

Why so? As a simple example, a computer can calculate complex math problems far, far faster than we usually can, and thus, will get a vastly faster response time to such a query. It seems likely that a great deal of processing could be sped up to faster than any reasonable current human time. Now sure, there are still theoretical limits here...but improvement is still improvement.

Targ Collective
2014-03-31, 11:49 PM
The physical body is a living being that acts as a vehicle, but it has its own consciousness - this is evident in the fact that you do not have to consciously control your own heartbeat. I would say that I would not turn to machine augmentation but would rather communicate with my body's consciousness directly if I were to ask it to change - and if it were to say no I would respect that.

Murska
2014-04-01, 03:00 AM
That's a strange leap of logic to make. We don't consciously control a lot of things that happen, but that does not mean they have to be controlled by anything that'd count as a consciousness using any reasonable definition.

There's no reason to believe that your body would contain a separate mind from your, well, mind, that would be intelligent enough to communicate with you.

Targ Collective
2014-04-01, 04:20 AM
What is instinct, and what we consider animal response? What is intuition? These are all examples of such things. I once lost my balance and had to walk backwards down my stairs, falling back against my front door; I was physically stunned, but my *consciousness* was ready to walk away right then. It was my body that needed to recover.

Let us take another example: pupil dilation. We don't control it, yet it happens in a controlled way in response to outside stimulus. There's a consciousness guiding that, and it sure as hell isn't the guy using the physical body as a vehicle who's doing it.

Reflexes. Sweat. The list goes on. And visitors to hot countries sweat more than the natives, so it's more than a genetic response; it's consciousness-driven.

Eldan
2014-04-01, 05:55 AM
Let us take another example: pupil dilation. We don't control it, yet it happens in a controlled way in response to outside stimulus. There's a consciousness guiding that, and it sure as hell isn't the guy using the physical body as a vehicle who's doing it.

A camera with a light sensor can do the same thing. Is my camera conscious? My fridge measures its inside temperature and regulates it. Should I try talking to it?

Murska
2014-04-01, 06:58 AM
Exactly. A rock falls down when dropped off an airplane. But no consciousness has to guide it to do that.

Of course, technically speaking you can be correct in that yes, my fridge has a mind (it takes in sensory data and then changes its behaviour based on a preference ordering, to steer the future into a state it prefers). That's exactly how my brain works. My brain is simply vastly more complex. And, again technically, yes, I could communicate with my fridge by changing its inside temperature (input) and it would change its output in return.

But I could not ask my fridge whether or not it wants me to switch it out for a better one. Its 'mind' is not wired for that: it has no such concepts, no such preferences, and is not intelligent enough to have any sort of understanding of what I mean, even if I were to communicate the question to it by pulsing hot and cold air in Morse code, after spending years teaching it Morse from basic mathematical concepts. The same is true of my body (not including my brain), which is already a confusing distinction from a biological perspective, as most of the functions you listed are controlled by the brain, though admittedly some are not. My spinal column is simply not able to comprehend the sort of concepts required to communicate the idea of upgrading it with cybernetics, and if I were to upgrade it with enough intelligence to be able to do that, then its response would be dictated by those upgrades I added, not some sort of 'hidden opinion' it had before.
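The "my fridge has a mind, just a trivially simple one" framing can be sketched as a toy control loop. This is purely illustrative (the names `TARGET_TEMP` and `fridge_step` are made up, not any real appliance's firmware): it has sensory input and a preference ordering, but no concepts with which to discuss its own replacement.

```python
# A toy "fridge mind": sense -> compare against preference -> act.
# Hypothetical sketch; TARGET_TEMP is the one "preference" this mind has.

TARGET_TEMP = 4.0  # preferred interior temperature, in Celsius


def fridge_step(current_temp: float) -> str:
    """One tick of the control loop: pick the action that steers
    the future toward the preferred state (TARGET_TEMP)."""
    if current_temp > TARGET_TEMP + 0.5:
        return "cool"  # compressor on
    return "idle"


# "Communicating" with it is just changing its input and watching its output:
actions = [fridge_step(t) for t in (7.0, 4.2, 3.0)]
```

The loop responds to input according to a preference, which is the whole of its "mind"; there is nowhere in it for an opinion about cybernetic upgrades to live, which is the post's point.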

Tyndmyr
2014-04-01, 11:26 AM
Let us take another example: pupil dilation. We don't control it, yet it happens in a controlled way in response to outside stimulus. There's a consciousness guiding that, and it sure as hell isn't the guy using the physical body as a vehicle who's doing it.

You are using the word consciousness differently than the rest of the world. Those are considered unconscious actions specifically because they are things we do not need to focus on controlling. Many unconscious actions (breathing, etc.) are controlled by the medulla oblongata. Others vary all the way down to chemical reactions.

Just because something happens does not mean it is something you can have a conversation with.

warty goblin
2014-04-01, 11:37 AM
You are using the word consciousness differently than the rest of the world. Those are considered unconscious actions specifically because they are things we do not need to focus on controlling. Many unconscious actions (breathing, etc.) are controlled by the medulla oblongata. Others vary all the way down to chemical reactions.

Just because something happens does not mean it is something you can have a conversation with.

In the case of gasping, the brain can be literally taken out of the loop. Should blood oxygen fall low enough, a nerve in the chest will cause you to inhale without any input from the brain at all.

Admiral Squish
2014-04-01, 12:03 PM
I think I need to stop watching videos about prosthetics on youtube, I get a bit obsessive about it. I want my dang robot legs already! I want adjustable legs, and bionic legs that do power stuff, and those fancy cheetah-blade legs for runnin', and fancy-shmancy artistic legs! Stupid laws preventing me from upgrading.

Necroticplague
2014-04-01, 01:06 PM
I think I need to stop watching videos about prosthetics on youtube, I get a bit obsessive about it. I want my dang robot legs already! I want adjustable legs, and bionic legs that do power stuff, and those fancy cheetah-blade legs for runnin', and fancy-shmancy artistic legs! Stupid laws preventing me from upgrading.

What laws? To my knowledge, the only relevant law is that any procedure with anesthetics requires a license, and a lot of people with licenses aren't willing to do it. If you can man up enough to tear it off yourself, without anesthetics, no legal problems. At least, that's what I read when I was looking up the process for implanting magnets so you can feel magnetic fields. Decided to wait till I finished school (thus didn't need to type as much; it can take a while for sliced-open fingers to heal).

Tyndmyr
2014-04-01, 02:31 PM
What laws? To my knowledge, the only relevant law is that any procedure with anesthetics requires a license, and a lot of people with licenses aren't willing to do it. If you can man up enough to tear it off yourself, without anesthetics, no legal problems. At least, that's what I read when I was looking up the process for implanting magnets so you can feel magnetic fields. Decided to wait till I finished school (thus didn't need to type as much; it can take a while for sliced-open fingers to heal).

I looked into that, and decided against it for the reason that I don't feel with my eyes...hands are inherently what you do things with, and sometimes, I do things with magnets, etc. Sometimes these magnets are ludicrously powerful, and thus, I would be unable to handle them myself. So, while sensing magnetic fields is cool, I'd rather be able to do mad science.

Also, I strongly suggest seeking medical advice prior to DIY surgery. Just saying.

Admiral Squish
2014-04-01, 02:52 PM
As far as I'm aware, it's illegal for a doctor to remove a healthy limb unless it's necessary for the person's survival. Even if it's not illegal, per se, I suspect it's probably a risky endeavor, considering ethics committees and such.

I do have a standing offer from a friend to hack off the requisite limbs, but I would really, really rather have it done in a sterile, controlled environment, when I'm too doped up to feel it.

Jurai
2014-04-02, 04:16 PM
I want to be uploaded to the internet, and then downloaded into a robot body. Screw this ORGANIC malarkey.

Admiral Squish
2014-04-05, 09:02 PM
I want to be uploaded to the internet, and then downloaded into a robot body. Screw this ORGANIC malarkey.

I think I wanna save that step for further down the road, when we have more advanced brain-mapping tech and better machine-to-brain feedback. In the meantime, ROBOT LEGS!

Eldan
2014-04-06, 05:19 AM
Personally, I like organic bodies. It is also my personal opinion that by the time we will have artificial bodies with all the functionality of human ones, machines, cybernetics and robots will look just about indistinguishable from organics anyway. Bionics are the way to go.

Necroticplague
2014-04-06, 06:40 AM
Main problem I see with brain uploading like that, beyond even the sheer complexity, is the problem of data mutability. Namely, it's hard to modify info already encoded in a person's brain, but computer files can be modified relatively easily. So what happens when somebody simply deletes, say, 'Necro.pers'? People as digital beings would turn simple DDoS attacks from mere annoyances into essentially acts of mass murder (or at least the equivalent of a terrorist attack, depending on how much the server backs up files). Or heck, just the possibility of essentially splitting yourself with a simple copy-paste. Not to mention the kind of heck this would play with any type of internet security. And the ethics of such: if a person sends themselves to you as a trojan, is deleting them murder? Would it be self-defence? Of course, by the time we can do brain uploading, I suspect we would have already dealt with those ethics by essentially "test-running" virtual existence with AIs.

Cheers
2014-04-06, 12:43 PM
Hey everyone!

I don't know if this has been mentioned before (I haven't read the entire thread), but there are plans for a version of the Paralympics that would allow self-powered prosthetics and other cybernetic enhancements:

http://www.cybathlon.ethz.ch/?doing_wp_cron=1396805804.5747458934783935546875
(I don't know how to make those fancy link-thingies :smallfrown: )

I especially like the dual award system, one for the pilot and the other for the manufacturer. It is interesting, though I doubt it will find much publicity outside its direct environment and amongst people already in the field.

Anyway, I thought you might find it interesting.

Wardog
2014-04-20, 07:16 AM
Main problem I see with brain uploading like that, beyond even the sheer complexity, is the problem of data mutability. Namely, it's hard to modify info already encoded in a person's brain, but computer files can be modified relatively easily. So what happens when somebody simply deletes, say, 'Necro.pers'?

The other big problem with "brain uploading" (I think mentioned many pages back but then dropped) is that what you are really doing is person duplication.

If I plug myself into a brain uploading device and upload myself into a cool robot body...

... on completion, the robot will wake up thinking "Woot! It worked! I have uploaded myself into a cool robot body!"

Meanwhile, I'll still be sitting there, plugged into the device, thinking "hey - has this worked? I'm still in my body - and what is that robot doing?"


If I took this option because I was dying, and thought this was a way to achieve immortality, then it only counts as "immortality" in the same (mostly metaphorical) way that having children and passing on your ideals to them counts as "immortality". (But in this case, my "child" has a cool robot body, shares all my memories and personality, and thinks it is me). But at least I will be dead soon, so out of the way, and can will all of my stuff to my new robot child.

If I weren't dying, I now have the problem of being duplicated. If the law treats AIs as people, then my robot duplicate will presumably have rights, and can own property etc., but who provides it? Does he immediately get a half-share in all my stuff? Do all the contracts I have signed apply to him as well?

If so, that's going to cause a lot of problems. If not, then it's arguably even worse, as my robot duplicate (which thinks it's me in a new body) is going to lose almost everything it thinks is its own, and will probably become resentful of society and/or me. (As far as it's concerned, I have transferred myself into a new body, but my old body is keeping all my stuff and my place in society, and claiming to be me).

When Riker got duplicated in ST:TNG, at least he had the advantage of living in a post-scarcity society, where all needs were provided for and "acquiring stuff" is no longer something people care about. And he still had all sorts of problems despite that. But just because (in this scenario) brain-uploading is possible doesn't mean all the social and economic concepts to deal with it exist.

Necroticplague
2014-04-20, 08:08 AM
I'm pretty sure any method of analyzing your brain sufficiently well to create a copy good enough to pass as you would involve turning the brain into many small pieces, making the point moot because the original you is now dead (though that only pushes back the issue until you decide to copy-and-paste your file).

And even so, I don't think it would be much of an issue. If he has all of your memories, then as far as everyone else is concerned, it is you. The only person for whom there is any distinction to be made is the original (assuming they still exist). I liken it more to how our body has every cell in it replaced every seven or eight years, not to having kids. Technically, you're a 100% new person, but nobody ever notices the difference.

memnarch
2014-04-20, 11:00 AM
I'm pretty sure any method of analyzing your brain sufficiently well to create a copy good enough to pass as you would involve turning the brain into many small pieces, making the point moot because the original you is now dead (though that only pushes back the issue until you decide to copy-and-paste your file).

And even so, I don't think it would be much of an issue. If he has all of your memories, then as far as everyone else is concerned, it is you. The only person for whom there is any distinction to be made is the original (assuming they still exist). I liken it more to how our body has every cell in it replaced every seven or eight years, not to having kids. Technically, you're a 100% new person, but nobody ever notices the difference.

Technically, neurons don't usually get replaced when they die. There are a few exceptions to that since there are neural stem cells which can add or replace some of the neurons in your nervous system, but the majority are with you from birth to death. Also, teeth, though mostly lacking in cells, don't get replaced either.

Murska
2014-04-20, 02:07 PM
Technically, because particles don't actually exist and there's just an amplitude configuration comprising the entire universe, you're not the same as you were a picosecond ago.

reaverb
2014-04-20, 07:37 PM
Technically, because particles don't actually exist and there's just an amplitude configuration comprising the entire universe, you're not the same as you were a picosecond ago.

I've always taken this view. That a robot duplicate isn't "you" is a completely logically consistent opinion, but that argument extends to you from a few minutes ago being a different person from the current you.

I'd personally be creeped out by a one-time organic-body-to-machine replacement operation. I'd far prefer to replace tiny bits of myself, possibly down to the level of the individual neuron, over time until I was completely transformed. This probably isn't rational, but emotions are never rational if you search for root causes long enough.

Murska
2014-04-20, 08:14 PM
Emotions can well be rational. In this case they aren't, but often they are. To me, being rational just means doing what has the best chance of winning, given what I know. And winning here means an end result that ranks high in my preference ordering of possible end results. Emotions are often useful for various things, and even more often cause no appreciable harm.

reaverb
2014-04-20, 08:50 PM
Emotions can well be rational. In this case they aren't, but often they are. To me, being rational just means doing what has the best chance of winning, given what I know. And winning here means an end result that ranks high in my preference ordering of possible end results. Emotions are often useful for various things, and even more often cause no appreciable harm.

Emotions are what define "winning." Why do you want to get 18 SCs in Diplomacy? Because it's the stated goal of the game and it gives you the satisfaction of beating the other players within the ruleset. Why does fairly beating other players give you satisfaction? There isn't an answer for that based on pure reason. (You could go through the evolutionary reasons humans feel good when they exercise their knowledge, but that just explains the emotions. It does not make them fundamentally rational.) You could easily have a different utility function, such as "cause as much chaos as possible and watch the fireworks", but it would still be based on arbitrary emotion.

Murska
2014-04-20, 10:22 PM
I disagree, on two levels. One, emotions are linked to preferences, but they are not necessary for preferences and utility functions to exist.

It's simplest to elaborate with an example. Create an AI. Give it a utility function: make it work towards steering the future into some states over others. But don't give it a dopamine-equivalent or anything like that. Utility functions are not based on emotions exactly; they're based on terminal values, which are a different thing entirely. Values are also orthogonal to rationality: being rational means working according to your values, so you cannot rationally choose terminal values.
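The example above can be made concrete with a toy agent. This is a hypothetical sketch (the names `utility` and `choose_action` are invented, and real AI systems are vastly more complex): the agent ranks reachable future states by a utility function and acts accordingly, with nothing resembling an emotion anywhere in it.

```python
# A toy agent with a utility function but no dopamine-equivalent:
# it simply ranks reachable future states and picks the best action.


def utility(state: int) -> float:
    """Terminal values as a plain function: this agent 'prefers'
    states closer to 10. No feelings involved, just an ordering."""
    return -abs(10 - state)


def choose_action(state: int, actions=(-1, 0, 1)) -> int:
    """Pick the action whose resulting state ranks highest in the
    agent's preference ordering over futures."""
    return max(actions, key=lambda a: utility(state + a))


# From state 7, moving up (+1) steers toward the preferred region:
best = choose_action(7)
```

The point of the sketch is that "winning" is fully defined by the utility function here, with no emotional machinery required for the agent to have preferences and act on them.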

But more importantly, I feel there's some confusion in this discussion. Winning has to do with your utility function. Rationality means working in a way that furthers your goals, as defined by your utility function. Emotions can either work for your goals, in which case they are rational (feeling fear when contemplating doing something dangerous, for example), or they can work against you (fear paralyzing you even when it is very important to act). But I do not know what you mean by 'fundamentally rational'.

Also, emotions are not arbitrary. They follow rules, just like everything else. Said rules can be rather complex, and we don't exactly understand them perfectly yet, but if we simplify enough we can make useful generalizations. For example, we're generally happy when something happens that we believe is good, according to our current set of values.