Thursday, April 12, 2012

Ethics of Strong VR: Experience Machines and Curated Dreams

Both of the topics that I work on - suicide and human extinction - function as reminders that each of us will die. Death reminders - mortality salience inductions, in the technical jargon - can increase thoughts and behaviors that interfere with open-minded thinking and learning. This may contribute to the poor quality of public discussion of death-related matters like suicide. In particular - and most disturbing, given my project - being reminded that one will die causes people to desire more children (Study 1, Study 2).

How can we explore ethical issues related to existence and death without reminding people of death and thereby triggering a worldview defense response?

I have a proposal: let's think about the ethical implications of something similar (in interesting ways) to life-creation and life-ending, but without the scariness: immersive virtual reality experiences. Virtual reality, unlike birth and death, is fun to think about, and does not trigger worldview defense. Simulated experiences are not yet institutionalized and are wide open for ethical investigation.

There are many differences between entering a virtual reality simulation and coming into existence, and between exiting a virtual reality simulation and death. What I do NOT propose to do is to make "gotcha" arguments along the lines of "oh, you'd want to CONSENT to entering a VR simulation? THAT PROVES ANTINATALISM!!!" What I DO propose to do is to explore our attitudes toward simulated experiences, and in the process, explore how good a metaphor they are for life creation and ending. Rather than assuming no difference, we should identify and explore the differences and their import.

A survey (respond to some or all of these in the comments):
  1. What would it take to get YOU to sign up for an immersive virtual reality experience, of the kind that would give sensory information to all your senses and feel just like reality? I'm interested both in contract terms you'd demand and in evidence you'd accept that the contract terms would be honored.
  2. Would you require the ability to exit the VR experience as a precondition? Is there anything that would convince you to give up this "off switch" capability, such as the hope that the experience might help you learn something or be more meaningful?
  3. If you entered a VR experience with an off switch, what would you think about the possibility of deciding in-game to give up your off switch? Would such consent be meaningful, or would it be problematic?
  4. What would you think about the possibility for extremely negative experiences within the VR story, such as being gang raped for hours or kidnapped and tortured for months, as sometimes happens in real life? Would you want these experiences to be impossible? Would the (small) possibility of these experiences give meaning to the story for you?
  5. What about children? Should children be allowed to have VR experiences? Should special safeguards against especially bad experiences be in place for them?
  6. Is there something special about base reality that makes strong VR experiences immoral or undesirable?

Of course, please also help me brainstorm other related questions and topics.

As an aside - in fact, we enter immersive virtual reality every night for several hours, with no obvious off switch. These dreams are chosen by the monkey brain, presumably based on what it thinks will most benefit our reproductive fitness in our waking life (frequently featuring unpleasant content and negative affect), rather than by our conscious selves. Shall we begin to curate our dreams - and, perhaps, our other experiences as well?




As suggested by estnihil below, if you're interested and feeling social, feel free to post some version of these questions elsewhere, like on forums and stuff - cut and paste verbatim or put them in your own words, and you certainly don't have to link to this or credit my walking-mortality-salience-induction self. Post a link in the comment thread or send me a link, or don't. <3

14 comments:

  1. 1. Fun, in the broad sense. (May be a game, may be a good social environment, may be weird like drug trips.) I would require standards similar to those for tricky psychedelic drugs or security-related software right now: technically proficient dealers I personally know and trust, or access to source and self-assembly. Also explicit consent, even though I'm likely to agree anyway. Retroactive or inferred consent is never acceptable.

    And there must be a way to deal with outside-people, either by taking them with me (with their consent etc.), or by limiting the VR experience (as with drugs or games). No matter how fun the experience, I wouldn't leave my loved ones behind.

    2. As long as my loved ones are taken care of (either by following me, by being dead already or some agreed-upon goodbye), I don't care. I'm only worried about getting stuck in a painful hell, so I would require some guarantee this won't happen, but otherwise, I'm fine. The VR would merely have to be better than this world for me to switch.

    3. Perfectly acceptable either way, provided previous conditions are fulfilled. Consent can be given in any state, and it's only experiences that matter, not "metaphysical levels" like being in the "base" level, simulation, dream, etc. I don't consider these distinctions meaningful at all.

    There are also lots of worlds I would consent to, even for infinity. Heaven, for example. Even this world, with some not-too-major changes.

    4. No. I can accept unavoidable suffering, but would never actively include suffering. Meaning doesn't need pain, but might need frustration. (Losing is often fun.) As for certainty, see 1).

    5. No. Children can consent just fine. Babies might be tricky to communicate with and get consent from, though. Suffering just sucks and the same standard applies to everyone. (I tend to think of adults as under-protected, not children as over-protected.)

    6. No. I don't think that "base reality" is a meaningful concept at all. There is an uninterrupted conscious experience. Our mental map has levels, not the experience itself. The common objection that virtual worlds aren't "really" meaningful is nonsense.

    I also don't see any actual difference between dreams or "being awake", and consider the waking state "constrained dreaming", as LaBerge calls it. Feels exactly the same to me (except for availability of memories and attention size), so I treat them the same.

    Personally, I found VR very discomforting for years because I feared I might eventually wake up from "normal" life and return to something very unpleasant, or go insane in the process. I also got afraid of more active lucid dreaming because I occasionally "woke up" while already awake, with some weird depersonalization experiences, strange alien "memories" and so on.

    I got used to it eventually, mostly by noticing that the waking state is constructed exactly like a dream, only with less drift because there are fewer feedback loops, and that becoming lucid-while-awake is really just a temporary overriding of external input and totally harmless.

    My fear of insanity was not about "reality", but my psychological reaction to it, and what I might do or not do. I fixed myself and the fear went away, regardless of "where" I am. I don't have a sense of "I'm awake" or "I'm dreaming" at all anymore.

    (I'm personally much more bothered by the common claim that people go "unconscious" at night. I definitely don't, ever.)

  2. 1. I do not have the wherewithal to answer this question in detail, but an approximation of my true answer is probably: a lot of joy with little to no suffering. There are many other important things I would need, of course, but I'd at least consider any proposed virtual reality with that one constraint.

    2. No, if my answer to question 1 ended up being good enough. Else, yes.

    3. If I'm an agent, then the off switch is a choice, and choices tend to have value. Assuming I am sufficiently agenty to know how, I'd choose to give up the switch if I got something of equal value in return. Else, no. (I suspect this is a question a lot of people would direly overcomplicate.)

    4. Unacceptable. No, suffering doesn't create "meaning".

    5. Why not? And no, protect them from harm exactly as much as adults.

    6. No.

  3. 1. I'd have to know some parameters of the content first to prevent me from being stuck in a boring or otherwise unsuitable virtual environment. I'd like to keep awareness that it's a VR world unless non-awareness is an integral part of the experience and I understand why. Ideally, I'd like some convenience functions like internet access, emails etc. from within the VR world. I'd require strong evidence that the machine doesn't malfunction and torture people. Proof of concept for all these points could be that it had already run successfully for many others. I'd also expect it to cost money and maybe have opportunity costs (other VR worlds?), and I would need some kind of evidence it's worth the price. Under these conditions, I would LOVE to have the opportunity!

    2. In case I keep reality awareness I'd prefer to be able to switch back and forth any time, but it wouldn't be a deal-breaker if I couldn't. Both loss of reality awareness and inability to leave the VR world could give meaning to an experience if it's well-designed. It could be like a pilgrimage, a sexual submission experience or a challenge, like a computer game in hard-core mode. Generally, I'd prefer the off-switch but consider myself "losing" if I use it. It is possible that some stories only have meaning if you forget who you were before, maybe in order to find it out by yourself from clues and quests in the game world. In this case, the off-switch would do more harm than good.

    3. I can't think of any use case that isn't better covered by the cases outlined in 2, except maybe that spending some time in-world to get a feel for it before giving the off-switch up could make sense.

    4. I'd say they should be possible but reduced in their impact. For instance, the violence of imprisonment, torture or assault should technically be possible as a form of content, but the pain and distress experience would have to be adjusted to mere entertainment value. So yes to the general situations, including loss of control, but no to the strong negative affect that would accompany them in the real world.

    5. Children should be able to be part of the world just like adults, but the content should be adjusted for their extra distress vulnerability. Ideally, the machine should be able to measure distress and affect biomarkers and adjust stimuli dynamically.

    6. No.

    Replies
    1. A remark on the logical relationship between the VR questions and the existential questions usually discussed in this blog: We assume here that entering the VR is not a requirement for mere existence - you could decline and still have a life outside. Successful antinatalism or suicide means there is no life at all. So someone who would decline VR torture possibilities might not decline all life in order to avoid torture.

  4. As a further idea, if you want to get closer to the relevance for antinatalism and suicide, maybe you could ask people what they think about non-consensual VR: making people experience the VR even if they didn't say yes, maybe if they couldn't give consent, or if they said no but it's plausible they didn't fully understand the value of the VR. Also whether parents should have a right to sign their children up without consent. Also whether it should be OK to make people stay in the game if they couldn't return once they left, to prevent them from giving up the rest of the experience. Currently, the big questions of consent are left out.

  5. That was an unbelievably ingenious solution to this problem, Sister Y. I just blanked out and assumed the fatalist pessimist's "Ha, I knew we were all doomed in the end, so nothing you can say can hurt me la la la I'm not listening". Also, someone should post these questions on some sort of neutral forum (I won't do it because I am semi-socially anxious about these things).

    1. Probably the terms that suffering is entirely under my control (I tend to like nightmares and being afraid, so I wouldn't entirely eliminate it) and that the world is barely anything like the one we live in. I'm talking, not even the laws of physics applying, because even those are depressingly restrictive. Essentially I would like a lucid dream. I would also ask for a robotic version of myself to replace me so as to give my loved ones peace of mind (along with me). These sound like ridiculous constraints, which is probably why I REALLY don't like life.

    2. Off-switch capability in-built so that I can log off. I wouldn't want to, but just as a general rule in case I get spooked and end up wanting to get away.

    3. I'd want it so I could never pick that option, ever. Such consent doesn't sound particularly meaningful - the consent to give up consent? Intuitively, it seems like the answer is no to that one being meaningful, though I could be wrong about that.

    4. Might give story A LITTLE meaning, but I still wouldn't have them in there - I sometimes think my life being so horrible is a little bit beautiful, but most of the time it doesn't give me any meaning at all.
    Would want these experiences user-controlled only and ONLY CONSENSUAL (I am opted out of them by default), so my curious mind could see what things are like, without anything getting intense.

    5. Young children should not be allowed as they cannot give meaningful consent, but older children that can should be allowed. However, they should not be allowed to cause themselves to suffer until older, when they can understand the ramifications of this.

    6. No, reality is perceived through our senses, which can be altered by mental illness and drugs. No difference from using a virtual reality machine as far as I'm concerned. Even when we are sick with a bad flu, reality appears altered. A concrete reality may exist, but we don't experience anything like it due to the filters our brain uses to sort through the garbage.

    Replies
    1. Good idea - added an addendum to the post.

  6. David Cronenberg films, particularly Videodrome and eXistenZ, are creepy enough to keep me out of VR. Familiar with them, Sister Y?

  7. And how about a question like this: If the majority of people experiencing VR say they are glad they did it, is that sufficient justification for you to be forced into VR?

  8. "How can we explore ethical issues related to existence and death without reminding people of death, hence triggering a worldview defense response? "

    By metaphorizing nonexistence as a kind of (better) state of existence ("negative bliss").

    Replies
    1. Referring to nonexistence as a state makes the philosopher in me twitch.

  9. Hi.

    Two Parts to this post:

    A) The South African Journal of Philosophy "Anti-Natalism" special issue (Vol. 31, No. 1, 2012):

    http://www.ajol.info/index.php/sajpem

    Anyone read the content? Is there anything new that might make it worth my while to pay for it? I'm sorry if this is old news but I haven't found any opinion pieces on it so far.

    B) VR question answers.

    I was very tempted to answer the questions in great detail but my response was getting quite long. I post the shorter answers here.

    1) I would require fail-safes of a higher standard than any I have been provided with for this normal existence. A sense of control over the circumstances of my own death is important to me, so I'd need to be sure that VR wasn't a further barrier to suicide-as-an-escape-plan. I suspect that I would only "opt in" to VR as a consequence of not wanting to be left behind by others going into it; such a decision might make suicide seem like an attractive alternative depending on what sort of VR is available.

    2 & 3 & 4. Any commitment to scenarios that can go very wrong are of no interest to me(hence my disinterest in living generally). This could get complicated because the best VR would be addictive and might imply that I would have to consent to the consequences of not wanting to leave VR prior to entering.

    I’ve heard that VR experiences might be able to limit the intensity of certain types of pain:

    http://www.gq.com/news-politics/newsmakers/201202/burning-man-sam-brown-jay-kirk-gq-february-2012

    If this is true I'd be tempted to commit, but only if I had assurances that I wouldn't suddenly become disengaged from the distractions of the VR world. I suspect that the therapeutic effect would be less effective at masking primal (survival-related) pain, and this would be another reason to stick with the real world.

    5. I'd have to say yes, but we know that the decisions of children tend to be overruled. Communal VR would presumably involve the parents having to agree to go into VR with the child, and this would leave the final say with the parents. Solitary (non-communal) VR experiences would seem a lot like suicide to parents, and it would take an understandable reason for a parent to consent to it.

    6. Besides the points I’ve made:

    Many people would consider full-time VR an insult to their God(s)' creation.

    VR could be seen as a threat to the sense of personal identity (more of a worry to people who generally don't consider how much our environment shapes our personalities).

    Vulnerability of the VR world: the outside world might pose threats to the sustainability of quality experiences.

  10. 1. Since I hate my real life and have little hope of it ever improving to an acceptable level, I'd jump at pretty much any chance to escape to a better virtual world.

    2. As long as I have a guarantee that the VR is better than this reality, I would be fine with not having an off switch if that's what it took. Of course I'd still prefer to have that off switch just in case.

    3. Assuming my personality and mental faculties remained unchanged in the VR, I see no ethical problem with having the option of making that decision in the VR.

    4. I'd want such experiences to be impossible, but I'd want my VR self to think some negative experiences are possible. After all, there's no satisfaction in overcoming challenges in the VR if you already know you're going to succeed. For example, if I were to design a VR scenario of seducing a woman, obviously I'd want to feel like failure was an option and I only succeeded because I'm so desirable and all that crap.

    5. I see no reason not to let children experience VR, and I don't think children need any more protection because NO ONE deserves bad experiences any more than others in my view.

    6. I see no reason to think the real world is more desirable and I see no moral problems whatsoever with VR, in fact it's infinitely more moral to let people act out their sick fantasies in a virtual world where they're not actually hurting others. Just ask yourself whether you'd want a rapist out in the real world or stuck in a virtual one.

    Hope these answers are helpful.

