- cross-posted to:
- morewrite@awful.systems
guy recently linked this essay. it's old, but i don't think it's significantly wrong (despite gpt evangelists). also read weizenbaum, libs, for the other side of the coin
What is the ratio of meat parts to machine parts at which point “that chair you once sat on” or “the dust bunnies you haven’t swept up yet even though you keep meaning to” are no longer materially a part of you and you subsequently lose your self as a result?
If someone loses their leg and gets a prosthetic does this alienate them from the birds that woke them up last week?
This trope in cyberpunk pisses me off to no end. Writers just out there saying “using a wheelchair makes you less human and the more wheelchair you use the less human you become.”
Like, people are out there, living, surviving, retaining their sanity, in comas where they have no access to sensory input. Those people wake up and they're still human after living in the dark for years. People who have no sensation or control below the neck go right on living without turning into psycho-murderers, so alienated from humanity because they can't feel their limbs. How is it that people somehow lose their humanity and turn into monsters just because they've got some metal limbs? You can cut out half of someone's brain and there will still be a person on the other side. They might be pretty different, but there's still a person there. People survive all kinds of bizarre brain traumas.
Corporate bloatware/adware in cyber-limbs. That’s the explanation in my cyberpunk setting.
Yeah. At least there's been a movement in the genre towards "ok, it's not cybernetic implants in general, it's chronic pain from malfunctioning or poorly calibrated implants, it's the trauma of a violent and alienated society intersecting with people who are both suffering and who have a massively increased material capacity to commit violence, etc." Like, Mike Pondsmith himself has still got a bit of a galaxy brain take on it, but even he's moved around to something like "cyberpsychosis is a confluence of trauma and machines that are actively causing pain, nervous system damage, etc., and which need a cocktail of drugs to manage, which in turn have their own health and psychiatric consequences."
I don’t see that as an improvement or a recognition of what is wrong with “cyberpsychosis” and related concepts. People live with severe trauma, severe chronic pain, and severe psychiatric problems and manage to keep it together. Pondsmith is making up excuses to keep “wheelchairs make you evil” in his game instead of recognizing the notion for what it is and discarding it.
I think it's a good example of ingrained, reflexive ableism. It's a holdover from archaic 20th century beliefs about disabled people being less human, less intelligent, less capable. Cybernetics are not a good metaphor for capitalist alienation or any other kind of alienation. They are, no matter how you cut it, aids and accommodations for disability. You just cannot say that cyberware makes you evil without also saying that disabled people using aids in your setting are alienating themselves from humanity and becoming monsters. If you wanted to argue that getting wired reflexes, enhanced musculature, or getting your brain altered so you can shut off empathy or fear, things that you do voluntarily to make yourself a better tool for capitalism, gradually resulted in alienation, go for it.
Ghost in the Shell does a good job with that. Kusanagi isn't alienated from humanity because she's a cyborg; her alienation from humanity grows from questioning what it means for her to be a cyborg, a brain in a jar. She's got super-human capabilities: she's massively stronger and more resilient, she's a wizard hacker augmented with cyberware that lets her directly interface with the net in a manner most people simply don't have the skills for. Her digestive and endocrine systems are under her conscious fine control. These things don't make her an alien or a monster; they create questions in her mind about her identity, her personhood, and how she can even relate to normal humans as her perspective and understanding of the world moves further and further away from them.
And this isn't a bad thing. It doesn't lead her to self-destruction or a berserk rage. Instead it leads her to growth, change, and evolution. She ambiguously dies, but in dying brings forth new life. Her new form is not an enemy of humanity or a threat, but instead a new kind of being that is a child or inheritor of humanity, humanity growing past its limitations to seek new horizons of potential.
The key difference is Kusanagi has agency. The cybernetics don't force her towards alienation. They don't damage her mind and turn her into a monster with no agency. Kusanagi's alienation grows from her own lived experience, her own thoughts and learning. It grows from her interactions with the people in her life and her day-to-day experiences. Her cybernetics are an important part of that experience, but she is in control of her cyberbody. It is not controlling her and turning her into a hapless victim.
Basically: the Cyberpunk paradigm says that using a wheelchair makes you violent and evil. The GitS paradigm says using a wheelchair makes you consider the world from a different perspective. In the former, disability, both physical and mental (a false dichotomy, I know), is villainized and demonized. In the latter, disability is a state that creates separation from "normal" people in a way that reflects the experiences of real disabled people, but is otherwise neutral.
At this point as I understand it his take is “alienation/isolation, trauma from a violent society, and denial of access to necessary medical care can eventually break someone, and someone who can bench press a car and has a bunch of reflex enhancers jacked directly into their spine is more likely to lash out in a dangerous way when their back’s to the wall, they think they’re going to die, and they panic,” with a whole lot of emphasizing social support networks as being important for surviving and enduring trauma like that.
It’s still not as good a take as “cyberpsychosis isn’t real, it’s just a bullshit diagnosis applied to people pushed past the brink by their material circumstances, acting in the way that a society that revolves around violence has ultimately taught them to act, who then just double down on it because they know they’re going to be summarily executed by the police who have no interest in deescalation or trying to take them alive, compounded with the fact that they can bench press a car and react to bullets fast enough to simply get out of the way, at least for a while” would be, but it’s earnest progress from someone who’s weirdly endearing despite being an absolute galaxy brained lib.
Weirdly, Cyberpunk 2077 seems to have had a better take on it than Pondsmith himself, with “cyberpsychos” mostly being just people with an increased capacity for violence dealing with intolerable material conditions until they fight back a little too hard against a real or perceived threat, with one who’s not even on a rampage and instead is just a heavily augmented vigilante hunting down members of a criminal syndicate that had murdered someone close to him. The police chatter also has the player branded a cyberpsycho when you get stars, reflecting the idea that it’s more a blanket term applied to anyone with augments who’s doing a violent crime than a real thing.
"It's a bullshit diagnosis", with "cyberpsychosis" being "excited delirium" for cyborgs, works for me.
This is one of the key things that has kept me away from Cyberpunk the game setting and one of my main problems with Shadowrun (not my only problem by any means).
deleted by creator
I have no idea, but neither do you, which is kind of the point. How much of you is your brain, your body, your context?
Even if we can’t put a number on it, I think it’s trivial to assert that you are not just your brain. So, if you copied only your brain into some kind of computer, there would be parts of you that are missing because you left them behind with your meat.
Sure I do: that ratio does not exist, and no you don’t get alienated from your material context if you have a prosthetic limb. We’re made up of parts that perform functions, and we can remain ourselves despite the loss of large chunks of those - so long as someone remains alive and the brain keeps working they’re still themself, they’re still in there.
If someone could keep the functions of the brain ongoing and operating on machine bits, they'd still be in there. It may be a transformative and lossy process, it may be unpleasant and imperfect in execution, but the same criticism applies to existing in general. At any point you may be forced out of your normal material context by circumstance, and this is traumatic. You may lose the healthy function of large swathes of your body, and this is traumatic. You may suffer brain damage, and this is traumatic. You're constantly losing and overwriting memories, and this can be normal or it can be traumatic. But through it all you are you and you're still in there, because ontologically speaking you're the ongoing continuation of yourself, distinct from all the component parts that you're constantly losing and replacing.
Does body dysmorphic disorder not exist? Or phantom limb? A full body prosthetic would undoubtedly be a difficult adjustment!
And would an upload be a person, legally speaking? Would your family consider the upload to be a person? That’s pretty alienating.
And people survive all of that stuff, and are still people. I really don’t understand what you’re getting at here.
I never implied that uploaded people wouldn’t be people. All I’m saying is that they’d be different people. It’s not like putting on new clothes.
I’m getting the impression that you would think they were a different person, and I would not, and that disagreement lies not in any measurable process but rather our personal beliefs.
I'm getting the impression that you think you only exist in your head, whereas I do not, and this is a material disagreement on the self.
The measurable difference is that my conception of the self includes the embodied self and the social self. I am, in part, my body. I am, in part, my place in society. I am, in part, my relationships.
There is a dialectical relationship between the internal world inside our heads and the real world outside of it. Narrowly focusing on the brain misses this nuance.
I didn’t say “you are perfectly happy and have no material problems whatsoever dealing with a traumatic injury and imperfect replacement,” but rather that this doesn’t represent some sort of fundamental loss of self or unmooring from material contexts. People can lose so much, can be whittled away to almost nothing all the way up to the point where continued existence becomes materially impossible due to a loss of vital functions, but through that they still exist, they remain the same ongoing being even if they are transformed by the trauma and become unrecognizable to others in the process.
If you suffer a traumatic brain injury and lose a large chunk of your brain, that's going to seriously affect you and how people perceive you, but you're still legally the same person. If that lost chunk was instead replaced with a synthetic copy, there may still be problems, but fewer than from just losing it outright. So if that continues until you're entirely running on the new synthetic replacement substrate, then you have continued to exist through the entire process just as you continue to exist through the natural death and replacement of neurons. For all we know, copying and replacing may not even be necessary compared to just adding synthetic bits and letting them be integrated and subsumed into the brain by the same processes by which it grows and maintains itself.
A simple copy taken like a photograph and then spun up elsewhere would be something entirely distinct, no more oneself than a picture or a memoir.
Eh. I’d argue that in as much as “you” means anything, forks would both be equally the person, there’s no “original” who is more the person. It’s a point of divergence, both have equal claim to all experiences and history up to the point of divergence. Privileging the “original” over the “copy” is cultural prejudice, subjectively they’re the same person to the moment of divergence.
I don’t think that’s the right way to untangle that dilemma ethically, because it can lead people to jump to galaxy brained “solutions” like “but what if you can make sure only one of you exists at once?” that don’t make any sense or answer anything but are still active cognitohazards for some people.
You, as in the one that is in there right now, that instance would continue along its own discrete path for as long as it exists: if another instance were made and separated off that would be a person, that would be a non-contiguous you, but it would not be the same you that is there right now, a distinction that becomes important when dealing with cognitohazards like trying to terminate that instance as the new one is spun up so that “you” get to be the one in a machine instead and there’s no perceptual break between them.
I’d argue that the ethical way to deal with forking copies like that would be to find ways to keep them linked up and at least partially synced, effectively making them component parts of a larger contiguous whole instead of just duplicating someone in a way that inevitably means at least one of the copies gets fucked over by whatever circumstances inspired the copying. So instead of the you that’s here now and the you spun up in a decade on a computer, there’d be the you that’s here now and then also a new secondary brain that’s on that computer, both of which communicate, share some sensory data, and operate almost as if you’d just added more hemispheres to your brain. And at some point after that maybe you could start considering individual copies ablative the same way bits of brain are, things you don’t want to lose but which you can survive losing and can potentially repair and replace given time because of how redundant and distributed brain functions are.
What I’m trying to say is your full body prosthetic would need to look like you, feel like you, sound like you, and have a legal life like you. Imagine if your name was Unit 69420, you looked and sounded like a Star Wars droid, and were legally considered property instead of a person. I think you would definitely experience a fundamental loss of self and become unmoored from material contexts.
“If shitty things happen to you, then you will not like that and it will suck,” still doesn’t break the continuity of self. Fundamentally that same exact thing can happen to the current flesh and blood you and it would be horrible and destructive: you can be disfigured through an accident or through someone’s cruelty, you can be locked in a cage and dehumanized on the whim of the state-sanctioned professional violence men and given a farce of a trial by the dysfunctional shitshow that is the legal system, etc, but no one is going to argue that shitty things happening to you ontologically unpersons you in some sort of mystical fashion.
You can be reduced, you can be transformed, but you continue existing for as long as vital functions do. Talk of someone becoming someone else, or dying in truth long before they died in body, is just a poetic attempt at articulating sorrow and loss.
So I was never arguing that an upload becomes unpersoned by trauma. My point, the point of the article, is that by merely focusing on the brain we miss the other things that make us who we are.
The goal of an upload is to transfer the self to a machine, right? Well, parts of your self exist outside of your brain. It’s no different than if an upload was missing parts of the brain. They’re incomplete.
All that means is that, for some hypothetical future mind uploading technology, the process would need to include elements of the body, social life, and society. Otherwise we're not complete.
I am not my brain. I am my brain, my body, my social life, my place in history, etc. I am the dialectical relationship between the personal and the impersonal.
This is just god of the gaps: "we don't know, so it's not possible." Saying "just copy the brain" is a reductive understanding of what's being discussed. If we can model the brain, then modelling the endocrine system is probably pretty trivial.
I didn’t read it as being impossible? I think you could upload a human mind into a computer, but it can’t just be their brain. Your mind, your phenomenal self, is more than just your brain because your brain isn’t just a hard drive. That’s what I took away from the article, anyway.
You are some mix of your brain, your body, and your context. Whatever upload magic exists would need all of that to work.
Aight, I think we might be stuck in a semantics disagreement here. I'm using "brain" to mean the actual brain organ plus whatever other stuff is needed to support brain function: the endocrine system, nervous system, whatever. The physical systems of cognition in the body. I do not mean literally only the brain organ with no other systems.
I think I can relate this to being trans.
I was never at home in my body before and now I am. That’s changed me a lot! My personality has shifted, my mannerisms, my habits, my attitude, my lifestyle, everything is so different! Changing my body changed my mind. A full body prosthetic would be the same.
We call them dead names for a reason.