Review published: 19 February 2016. doi: 10.3389/frobt.2016.00003

Real Virtuality: A Code of Ethical Conduct. Recommendations for Good Scientific Practice and the Consumers of VR-Technology

Michael Madary* and Thomas K. Metzinger
Johannes Gutenberg-Universität Mainz, Mainz, Germany

Edited by: Nadia Magnenat Thalmann, University of Geneva, Switzerland
Reviewed by: Jean-Marie Normand, Ecole Centrale de Nantes, France; HyungSeok Kim, Konkuk University, South Korea; Roland Blach, Fraunhofer Institut für Arbeitswirtschaft und Organisation, Germany

The goal of this article is to present a first list of ethical concerns that may arise from research and personal use of virtual reality (VR) and related technology, and to offer concrete recommendations for minimizing those risks. Many of the recommendations call for focused research initiatives. In the first part of the article, we discuss the relevant evidence from psychology that motivates our concerns. In Section “Plasticity in the Human Mind,” we cover some of the main results suggesting that one’s environment can influence one’s psychological states, as well as recent work on inducing illusions of embodiment. Then, in Section “Illusions of Embodiment and Their Lasting Effect,” we go on to discuss recent evidence indicating that immersion in VR can have psychological effects that last after leaving the virtual environment. In the second part of the article, we turn to the risks and recommendations. We begin, in Section “The Research Ethics of VR,” with the research ethics of VR, covering six main topics: the limits of experimental environments, informed consent, clinical risks, dual use, online research, and a general point about the limitations of a code of conduct for research. Then, in Section “Risks for Individuals and Society,” we turn to the risks of VR for the general public, covering four main topics: long-term immersion, neglect of the social and physical environment, risky content, and privacy.
We offer concrete recommendations for each of these 10 topics, summarized in Table 1.

Keywords: ethics, virtual reality, augmented reality, substitutional reality, depersonalization disorder, derealization, informed consent, dual use

*Correspondence: Michael Madary, madary@uni-mainz.de

Specialty section: This article was submitted to Virtual Environments, a section of the journal Frontiers in Robotics and AI.

Received: 05 December 2015; Accepted: 08 February 2016; Published: 19 February 2016

Citation: Madary M and Metzinger TK (2016) Real Virtuality: A Code of Ethical Conduct. Recommendations for Good Scientific Practice and the Consumers of VR-Technology. Front. Robot. AI 3:3. doi: 10.3389/frobt.2016.00003

Frontiers in Robotics and AI | www.frontiersin.org | February 2016 | Volume 3 | Article 3

PRELIMINARY REMARKS

Media reports indicate that virtual reality (VR) headsets will be commercially available in early 2016, or shortly thereafter, with offerings from, for example, Facebook (Oculus), HTC and Valve (Vive), Microsoft (HoloLens), and Sony (Morpheus). There has been a good bit of attention devoted to the exciting possibilities that this new technology and the research behind it have to offer, but there has been less attention devoted to novel ethical issues or the risks and dangers that are foreseeable with the widespread use of VR. Here, we wish to list some of the ethical issues, present a first, non-exhaustive list of those risks, and offer concrete recommendations for minimizing them. Of course, all this takes place in a wider sociocultural context: VR is a technology, and technologies change the objective world. Objective changes are subjectively perceived, and may lead to correlated shifts in value judgments. VR technology will eventually change not only our general image of humanity but also our understanding of deeply entrenched notions, such as “conscious experience,” “selfhood,” “authenticity,” or “realness.” In addition, it will transform the structure of our life-world, bringing about entirely novel forms of everyday social interactions and changing the very relationship we have to our own minds. In short, there will be a complex and dynamic interaction between “normality” (in the descriptive sense) and “normalization” (in the normative sense), and it is hard to predict where the overall process will lead us (Metzinger and Hildt, 2011).

Before beginning, we should quickly situate this article within the larger field of the philosophy of technology. Brey (2010) has offered a helpful taxonomy dividing the philosophy of technology into the classical works from the mid-twentieth century, on the one hand, and more recent developments that follow an “empirical turn” by focusing on the nature of particular emerging technologies, on the other hand. We intend the present article to be a contribution to the latter kind of philosophy of technology. In particular, we are investigating foundational issues in the applied ethics of VR, with a heavy emphasis on recent empirical results. Both authors have been participants in the collaborative project Virtual Embodiment and Robotic Re-Embodiment (VERE), a 5-year research program funded by the European Commission.¹ Despite this explicit focus, we do not mean to imply that the issues investigated here will not find fruitful application to themes from classical twentieth-century philosophy of technology (see Franssen et al., 2009). Consider, for instance, Martin Heidegger’s influential treatment of the way in which modern technology distorts our metaphysics of the natural world (Heidegger, 1977; also Borgmann, 1984), or Herbert Marcuse’s prescient account of industrial society’s ongoing creation of false needs that undermine our capacities for individuality (Marcuse, 1964). As should become clear from the examples below, immersive VR introduces new and dramatic ways of disrupting our relationship to the natural world (see Neglect of Others and the Physical Environment). Likewise, the newly created “need” to interact using social media will become even more psychologically ingrained as the interactions begin to take place while we are embodied in virtual spaces (see The Effects of Long-Term Immersion and O’Brolcháin et al., 2016). In sum, the fact that connections with classical philosophy of technology will remain largely implicit in this article should not be taken to suggest that they are not of great importance.

The main focus will be on immersive VR, in which subjects use a head-mounted display (HMD) to create the feeling of being within a virtual environment. Although our main topic involves the experience of immersion, some of the concerns raised, such as neglect of the physical environment (see Neglect of Others and the Physical Environment), can be applied to extended use of an HMD even when users do not experience immersion, such as when merely using the device for 3D viewing. Many of our points are also relevant for other types of VR hardware, such as CAVE projection. One central area of concern has to do with illusions of embodiment, in which one has the feeling of being embodied other than in one’s actual physical body (Petkova and Ehrsson, 2008; Slater et al., 2010). In VR, for instance, one might have the illusion of being embodied in an avatar that looks just like one’s physical body. Or one might have the illusion of being embodied in an avatar of a different size, age, or skin color. In all of these cases, insight into the illusory nature of the overall state is preserved. The fact that VR technology can induce illusions of embodiment is one of the main motivations behind our investigation into the new risks generated by the use of VR by researchers and by the general public. Traditional paradigms in experimental psychology cannot induce these strong illusions. Similarly, watching a film or playing a non-immersive video game cannot create the strong illusion of owning and controlling a body that is not your own. Although our main focus will be on VR (see Figure 1), many of the risks and recommendations can be extended to augmented reality (Azuma, 1997; Metz, 2012; Huang et al., 2013) and substitutional reality (Suzuki et al., 2012; Fan et al., 2013). In augmented reality (AR, see Figure 2), one experiences virtual elements intermixed with one’s actual physical environment.

Following Milgram and colleagues (Milgram and Kishino, 1994; Milgram and Colquhoun, 1999), it may be helpful here to consider augmented reality along the Reality–Virtuality Continuum. The real environment is located at one extreme of the continuum and an entirely virtual environment is located at the other extreme. Displays can be placed along the continuum according to whether they primarily represent the real environment while including some virtual elements (augmented reality) or primarily represent a virtual environment while including some real elements (augmented virtuality). Much of the following discussion will focus on entirely virtual environments, but readers should keep in mind that many of the concerns raised will also apply to environments all along the Reality–Virtuality Continuum.

It is foreseeable that there will be ever new extensions and special cases of VR. We return to this theme with some philosophical remarks at the end of the article. For now, let us at least note that the very distinction between the real and the virtual is ripe for further philosophical investigation. One example of such a special, recent extension of VR that does not in itself form a distinct new category is “substitutional reality” (SR, see Figure 3), in which an omni-directional video feed gives one the illusion of being in a different location in space and/or time, and insight may not be preserved. Readers should keep in mind that VR headsets will likely enable users to toggle between virtual, augmented, and substitutional reality, and to adjust their location on the Reality–Virtuality Continuum, thus somewhat blurring the boundaries between kinds of immersive environments.

We divide our discussion into two main areas. First, we will address the research ethics of VR. Then we will turn to issues arising with the use of VR by the general public for entertainment and other purposes. To be clear upfront, we are not calling for general restrictions on an individual’s liberty to spend time (and money) in VR. In open democratic societies, such regulations must be based on rational arguments and available empirical evidence, and they should be guided by a general principle of liberalism: in principle, the individual citizen’s freedom and autonomy in dealing with their own brain and in choosing their own desired states of mind (including all of their phenomenal and cognitive properties) should be maximized. As a matter of fact, we would even argue for a constitutional right to mental self-determination (Bublitz and Merkel, 2014), somewhat limiting the authority of the government, because the above-mentioned values of individual freedom and mental autonomy seem to be absolutely fundamental to the idea of a liberal democracy involving a separation of powers. However, once such a general principle has been clearly stated, the much more interesting and demanding task lies in helping individuals exercise this freedom in an intelligent way, in order to minimize potential adverse effects and the overall psychosocial cost to society as a whole (Metzinger, 2009a; Metzinger and Hildt, 2011).

Similarly, we fully support ongoing research using VR – indeed, we argue below that there are ethical demands to do more research using it, research that is motivated in part by the goal of mitigating harm for the general public. But we do think that it is prudent to anticipate risks, and we wish to spread awareness of how possibly to avoid, or at least minimize, those risks.² Before entering into the concrete details, we are going to make the case for being especially concerned about VR technology in contrast, say, to television or non-immersive video games. We do so in two steps. First, in Section “Plasticity in the Human Mind,” we cover some of the relevant discoveries from psychology in the past decades, including the scientific foundation for illusions of embodiment. Then, in Section “Illusions of Embodiment and Their Lasting Effect,” we cover the more recent experimental work that has begun to reveal the lasting psychological effects of these illusions. Then, in Section “Recommendations for the Use of VR by Researchers and Consumers,” we will cover the research ethics of VR followed by the risks for the general public.

¹ The project, as well as the current publication, is funded under the EU 7th Framework Program, Future and Emerging Technologies (Grant 257695). VERE aimed at dissolving the boundary between the human body and surrogate representations in immersive virtual reality and physical reality, giving people the illusion that their surrogate representation is their own body. See http://www.vereproject.eu/ for more. We thank members of the VERE consortium for discussing many of the issues in this article during our VERE Ethics Workshops in February 2013 and September 2015.

² Behr et al. (2005) have addressed similar themes about practical issues in VR research and applications. Here, we wish to address concerns that go beyond their initial treatment of the topic. More recently, O’Brolcháin et al. (2016) have covered ways in which the conjunction of VR with social networks might raise threats to privacy and autonomy. We will engage with some of their concerns at various points below.

FIGURE 1 | Illusory ownership of an avatar in virtual reality. Here, a subject is shown wearing a head-mounted display and a body-tracking suit. The subject can see his avatar in VR moving in synchrony with his own movements in a virtual mirror. In this case, the avatar is designed to replicate Sigmund Freud in order to enable subjects to counsel themselves, thus creating what Freud may have called an instance of avatar-introjection! (Image used with kind permission from Osimo et al., 2015.)

FIGURE 2 | An augmented reality hand illusion. Here, augmented reality is used to show the subject a virtual hand in a biologically realistic location relative to his own body. This case differs from virtual reality due to the fact that the subject sees the virtual hand embedded in his own physical environment rather than in an entirely virtual environment. (Image used with kind permission from Keisuke Suzuki.)

FIGURE 3 | Immersion in the past using substitutional reality. In this example, substitutional reality is used to allow switching between a live view of the scene and a panoramic recording of that scene from the past. Note that SR could also be used to provide live (or recorded) panoramic input from a distant location, creating the illusion that one is “present” somewhere else. (Image used with kind permission from Anil Seth.)

PLASTICITY IN THE HUMAN MIND

One central result of modern experimental psychology is that human behavior can be strongly influenced by external factors while the agent is totally unaware of this influence. Behavior is context sensitive and the mind is plastic, which is to say that it is capable of being continuously shaped and re-shaped by a host of causal factors. These results, some of which we present below, suggest that our environment, including technology and other humans, has an unconscious influence on our behavior. Note that the results do not conflict with the manifest fact that most of us have relatively stable character traits over time. After all, most of us spend our time in relatively stable environments. And there may be many aspects of the functional architecture underlying the neurally realized part of the human self-model [for example, of the body model in our brain, e.g., Metzinger (2003), p. 355] that are largely genetically determined. However, we also want to point out that human beings possess a large number of epigenetic traits, that is, stably heritable phenotypes resulting from changes in a chromosome without alterations in the DNA sequence.

Context-Sensitivity All the Way Down

The way in which our behavior is sensitive to environmental features is especially relevant here due to the fact that VR introduces a completely new type of environment, a new cognitive and cultural niche, which we are now constructing for ourselves as a species. New technologies like VR open a vast space of potential actions. This space has to be constrained in a rational and evidence-based manner.

    It is not excluded that extended interactions with VR environments may lead to more fundamental changes, not only on a psychological, but also on a biological level.

Some of the most famous experiments in psychology reveal the context sensitivity of human behavior. These include the Stanford Prison Experiment, in which normal subjects playing roles as either prison guards or inmates began to show pathological behavioral traits (Haney et al., 1973), Milgram’s obedience experiments, in which subjects obeyed orders that they believed to cause serious pain and to be immoral (Milgram, 1974), and Asch’s conformity experiments, in which subjects gave obviously incorrect answers to questions after hearing confederates all give the same incorrect answers (Asch, 1951). For a more recent result showing the unconscious impact of environment on behavior, the amount of money placed in a collection box for drinks in a university break room was measured under a condition in which the image of a pair of eyes was posted above the collection box. With the eyes “watching,” coffee drinkers placed three times as much money in the box compared to the control condition with no eyes (Bateson et al., 2006). Effects like this one may be particularly relevant in VR, because the subjective experience of presence and of being there is determined not only by functional factors like the number and fidelity of sensory input and output channels and the ability to modify the virtual environment but also, importantly, by the level of social interactivity, for example, in terms of actually being recognized as an existing person by others in the virtual world (Heeter, 1992; Metzinger, 2003). As investigations into VR have interestingly shown, a phenomenal reality as such becomes more real – in terms of the subjective experience of presence – as more agents recognizing one and interacting with one are contained in this reality. Phenomenologically, ongoing social cognition enhances both this reality and the self in their degree of “realness.” This principle will also hold if the subjective experience of ongoing social cognition is of a hallucinatory nature.

Potential for Deep Behavioral Manipulation

Whether physical or virtual, human behavior is situated and socially contextualized, and we are often unaware of the causal impact this fact has on learning mechanisms as well as on occurrent behavior. It is plausible to assume that this will be true of novel media environments as well. Importantly, unlike other forms of media, VR can create a situation in which the user’s entire environment is determined by the creators of the virtual world, including “social hallucinations” induced by advanced avatar technology. Unlike physical environments, virtual environments can be modified quickly and easily with the goal of influencing behavior.

    The comprehensive character of VR plus the potential for the global control of experiential content introduces opportunities for new and especially powerful forms of both mental and behavioral manipulation, especially when commercial, political, religious, or governmental interests are behind the creation and maintenance of the virtual worlds.

However, the plasticity of the mind is not limited to behavioral traits. Illusions of embodiment are possible because the mind is plastic to such a degree that it can misrepresent its own embodiment. To be clear, illusions of embodiment can arise from normal brain activity alone, and need not imply changes in underlying neural structure. Such illusions occur naturally in dreams, phantom limb experiences, out-of-body experiences, and Body Integrity Identity Disorder (Brugger et al., 2000; Metzinger, 2009b; Hilti et al., 2013; Ananthaswamy, 2015; Windt, 2015), and they sometimes include a shift in what has been termed the phenomenal “unit of identification” in consciousness research (UI; Metzinger, 2013a,b), the conscious content that we currently experience as “ourselves” (please note that in the current paper “UI” does not refer to “user interface,” but always to the specific experiential content of “selfhood,” as explained below). This may be the deepest theoretical reason why we should be cautious about the psychological effects of applied VR: this technology is unique in beginning to target and manipulate the UI in our brain itself.

Direct UI-Manipulation

The UI is the form of experiential content that gives rise to autophenomenological reports of the type “I am this!” For every self-conscious system, there exists a phenomenal unit of identification, such that the system possesses a single, conscious model of reality; the UI is a part of this model; at any given point in time, the UI can be characterized by a specific and determinate representational content, which in turn constitutes the system’s phenomenal self-model (PSM, Metzinger, 2003) at t. Please note how the UI does not have to be identical with the content of the conscious body image or a region within it (like a fictitious point behind the eyes). For example, the UI can be moved out of and behind the head, as phenomenally experienced in a repeatable and controllable fashion, by direct electrical stimulation, while preserving the visual first-person perspective with its origin behind the eyes (de Ridder et al., 2007). For human beings, the UI is dynamic and can be highly variable. There exists a minimal UI, which likely is constituted by pure spatiotemporal self-location (Blanke and Metzinger, 2009; Windt, 2010; Metzinger, 2013a,b); and in some configurations (e.g., “being one with the world”), there is also a maximal UI, likely constituted by the most general phenomenal property available, namely, the integrated nature of phenomenality per se (Metzinger, 2013a,b).

    VR technology directly targets the mechanism by which human beings phenomenologically identify with the content of their self-model.

The rubber hand illusion (RHI) is a simple localized illusion of embodiment that can be induced by having subjects look at a visually realistic rubber hand in a biologically realistic position (Botvinick and Cohen, 1998; Tsakiris and Haggard, 2005). When the rubber hand is stroked synchronously with the subject’s physical hand (which is hidden from view), subjects experience the rubber hand as their own.³ While the rubber hand can be used to create a partial illusion of embodiment, the same basic idea can be used to create the full-body illusion, on a global level. Subjects look through goggles through which they see a live video feed of their own bodies (or of a virtual body) located a short distance in front of their actual location. When they see their bodies being stroked on the back, and feel themselves being stroked at the same time, subjects sometimes feel as if the body that they see in front of them is their own (Lenggenhager et al., 2007; see Figure 4). This illusion is much weaker and more fragile than the RHI, but it has given us valuable new insights into the bottom-up construction of our conscious, bodily self-model in the brain (Metzinger, 2014). In more recent work, Maselli and Slater (2013) have found that tactile feedback is not required for an illusion of embodiment. They found that a virtual arm with a realistic appearance co-located with the subject’s actual arm is sufficient to induce the illusion of ownership of the virtual arm. In addition to visual and tactile signals, recent work suggests that manipulations of interoceptive signals, such as heartbeat, can also influence our experience of embodiment (Aspell et al., 2013; Seth, 2013).

The results sketched in these three sections reveal not only categories of risks but also three ways in which the human mind is plastic. First, there is “context-sensitivity all the way down,” which may involve hitherto unknown kinds of epigenetic trait formation in new environments. Second, there is evidence that behavior can be strongly influenced by environment and context, and in a deep way. Third, illusions of embodiment can be induced fairly easily in the laboratory, directly targeting the human UI itself. These results can be taken together as empirical premises for an argument stating not only that there may be unexpected psychological risks if illusions of embodiment are misused, or used recklessly, but also that, if we are interested in minimizing potential damage and future psychosocial costs, these risks are themselves ethically relevant. In the following section, we review initial evidence that connects the three strands of evidence we have just presented. That is, we review initial evidence that illusions of embodiment can be combined with a change in environment and context in order to bring about lasting psychological effects in subjects.

³ For a version of this illusion using a virtual hand in augmented reality (rather than a rubber hand), see Figure 2 above.

FIGURE 4 | Creating a whole-body analog of the rubber-hand illusion. (A) Participant (dark blue trousers) sees through a HMD his own virtual body (light blue trousers) in 3D, standing 2 m in front of him and being stroked synchronously or asynchronously at the participant’s back. In other conditions, the participant sees either (B) a virtual fake body (light red trousers) or (C) a virtual non-corporeal object (light gray) being stroked synchronously or asynchronously at the back. Dark colors indicate the actual location of the physical body or object, whereas light colors represent the virtual body or object seen on the HMD. (Image used with kind permission from M. Boyer.)

ILLUSIONS OF EMBODIMENT AND THEIR LASTING EFFECT

In the last several years, a number of studies have found a psychological influence on subjects while immersed in a virtual environment. These studies suggest that VR poses risks that are novel, that go beyond the risks of traditional psychological experiments in isolated environments, and that go beyond the risks of existing media technology for the general public. A first important result from VR research involves what is known as the virtual pit (Meehan et al., 2002). Subjects are given a HMD that immerses them in a virtual environment in which they are standing at the edge of a deep pit. In one kind of experiment involving the pit, they are instructed to lean over the edge and drop a beanbag onto a target at the bottom. In order to enhance the illusion of standing at the edge, the subject stands on the ledge of a wooden platform in the lab that is only 1.5″ from the ground. Despite their belief that they were in no danger because the pit was “only” virtual, subjects nonetheless show increased signs of stress through increases in heart rate and skin conductance (ibid.). In a variation of the virtual pit, subjects may be told to walk across the pit over a virtual beam. In the lab, a real wooden beam is placed where subjects see the virtual beam. As one might expect, this version of the pit also elicits strong feelings of stress and fear.⁴ More recently, an experiment reproducing the famous Milgram obedience experiments in VR found that subjects reacted as if the shocks they administered were real, despite believing that they were merely virtual (Slater et al., 2006).

In addition to a strong emotional response from immersion, there is evidence that experiences in VR can also influence behavioral responses. One example of a behavioral influence from VR has been named the Proteus Effect by Nick Yee and Jeremy Bailenson. This effect occurs when subjects “conform to the behavior that they believe others would expect them to have” based on the appearance of their avatar (Yee and Bailenson, 2007, p. 274; Kilteni et al., 2013). They found, for example, that subjects embodied in a taller avatar negotiated more aggressively than subjects in a shorter avatar (ibid.). Changes in behavior while in the virtual environment are of ethical concern, since such behavior can have serious implications for our non-virtual physical lives – for example, as financial transactions take place in a non-physical environment (Madary, 2014).

But perhaps even more concerning for our purposes is evidence that behavior while in the virtual environment can have a lasting psychological impact after subjects return to the physical world. Hershfield et al. (2011) found that subjects embodying avatars that look like aged versions of themselves show a tendency to allocate more money for their retirement after leaving the virtual environment. Rosenberg et al. (2013) had subjects perform tasks in a virtual city. Subjects were allowed to fly through the city either using a helicopter or by their own body movements, like Superman. They found that subjects given the superpower were more likely to show altruistic behavior afterwards – they were more likely to help an experimenter pick up spilled pens. Yoon and Vargas (2014) found a similar result, although not using fully immersive VR. They had subjects play a video game as either a superhero, a supervillain, or a neutral control avatar. After playing the game, subjects were given a tasting task that they were told was unrelated to the gaming experiment. Subjects were given either chocolate or chili sauce to taste, and then told to measure out the amount of food for the subsequent subject to taste. Those who played as heroes poured out more chocolate, while those who played as villains poured out more chili.

The psychological impact of immersive VR has also been explored in a beneficent application. Peck et al. (2013) gave subjects an implicit racial bias test at least 3 days before immersion and then immediately after the immersion. In the experiment, subjects were embodied in an avatar with either light skin, dark skin, or purple skin, or they were immersed in the virtual world with no body. They found that subjects who were embodied in the dark-skinned avatar showed a decrease in implicit racial bias, at least temporarily.

⁴ For some nice anecdotal accounts of experiences with the virtual pit, see Blascovich and Bailenson (2011), pp. 38–42.

RECOMMENDATIONS FOR THE USE OF VR BY RESEARCHERS AND CONSUMERS

With the results from the first section of the paper in mind as illustrative examples, we now move on to make concrete recommendations for VR in both scientific research (see The Research Ethics of VR) and consumer applications (see Risks for Individuals and Society). Our main recommendations are italicized and listed together in Table 1.

The Research Ethics of VR

In this section, we cover questions about the ethics of conducting research either on VR or, perhaps more interestingly, research using VR as a tool. For example, it is plausible to assume that in the future there will be many experiments combining real-time fMRI and VR, or ones using animal subjects in VR (Normand et al., 2012), which are not only about understanding or improving VR itself but merely use it as a research tool. To begin with a short example, Behr et al. (2005) have covered the research ethics of VR from a practical perspective, emphasizing that the risk of motion sickness must be minimized and that researchers ought to assist subjects as they leave the virtual environment and readjust to the real world. In this part of the article, we indicate new issues in the research ethics of VR that were not covered in Behr et al.’s initial treatment. In particular, we will raise the following six issues:

• the limits of experimental environments,
• informed consent with regard to the lasting psychological effects of VR,
• risks associated with clinical applications of VR,
• the possibility of using results of VR research for malicious purposes (dual use),
• online research using VR, and
• a general point about the inherent limitations of a code of conduct for research.

For each of these issues, we offer concrete recommendations for researchers using VR as well as for ethics committees charged with evaluating the permissibility of particular experimental paradigms using VR.

Ethical Experimentation

What are the limits to what we can ethically do in experiments in VR? We recommend, at the very least, that researchers ought to follow the principle of non-maleficence: do no harm. This principle is a central component of research ethics on human subjects, where it is often discussed with the accompanying principle of beneficence: maximize well-being for the subjects. Note how such a principle applies to all sentient beings capable of suffering, like non-human animals or even potential artificial subjects of experience in the future (Althaus et al., 2015, p. 10). We will return to the principle of beneficence in VR in the following section. The principle of non-maleficence can be found in the codes of ethical conduct of both the American Psychological Association (General Principle A)⁵ and the British Psychological Society (Principle 2.4). The British Psychological Society offers the following recommendation:

    Harm to research participants must be avoided. Where risks arise as an unavoidable and integral element of the research, robust risk assessment and management protocols should be developed and complied with. Normally, the risk of harm must be no greater than that encountered in ordinary life, i.e., participants should not be exposed to risks greater than or additional to those to which they are exposed in their normal lifestyles (The British Psychological Society, 2014, p. 11).

Following this recommendation in the case of VR might raise some novel challenges due to the entirely new nature of the technology. For instance, a well-known domain of application for the principle of non-maleficence has been in clinical trials for new pharmacological agents. Although this domain of research ethics still faces important and controversial issues (Wendler, 2012), thinkers in the debate can avail themselves of the history of medical technology. In many cases, precedents can be quoted. In the case of VR, there is as yet no history that we can use as a source for insight. On the contrary, what is needed is a rational, ethically sound process of precedence-setting.

In its general form, the principle of non-maleficence for VR environments should be applied in the sense that experiments should not be conducted if the outcome involves foreseeable harm to the subjects. On the other hand, the same principle implies a sustained striving for rational, evidence-based minimization of risks in the more distant future. We, therefore, suggest that careful experiments designed with the beneficent intention of discovering the psychological impact of immersion in VR are ethically permissible.

In order to adhere to the principle of non-maleficence, researchers (and ethics committees) will need to utilize their knowledge of experimental psychology as well as their knowledge of results specific to VR. The kinds of results sketched in Sections “Plasticity in the Human Mind” and “Illusions of Embodiment and Their Lasting Effect” will be directly relevant for evaluating whether a line of experimentation violates non-maleficence. Similarly, the selection of subjects for VR experiments must be done with special care. New methods of prescreening for individuals with high risk factors must be incrementally developed, and funding for the development of such new methodologies needs to be allocated. We, therefore, urge careful screening of subjects to minimize the risks of aggravating an existing psychological disorder or an undetected psychiatric vulnerability (Rizzo et al., 1998; Gregg and Tarrier, 2007). Many experiments using VR currently seek to treat existing psychiatric disorders.

⁵ http://www.apa.org/ethics/code/
The screening process for such experiments has can be expressed as follows: the goal of selecting subjects who exhibit signs and symptoms of an existing condition. The screening process should also include No experiment should be conducted using virtual reality exclusion criteria specific to possible risks posed by VR. Ideally, with the foreseeable consequence that it will cause serious the VR research community will seek to establish an empirically or lasting harm to a subject. motivated standard set of exclusion criteria. As we will discuss in Section “The Effects of Long-Term Immersion” below, of Although recommending adherence to this principle is noth- particular concern are vulnerabilities to disorders that could ing new, implementing this principle in VR laboratories may be potentially become aggravated by prolonged immersion and illu- challenging for the following reason. Attempts to apply non- sions of embodiment, such as Depersonalization/Derealization maleficence in VR can encounter a dilemma of sorts. On the one Disorder (DDD; see American Psychiatric Association (2013), hand, a goal of the research ought to be, as we suggest below, to DSM-5: 300.14). Standard exclusion criteria may involve, for gain a better understanding of the risks posed for individuals instance, scoring above a particular threshold on scales testing using VR. For instance, does the duration of immersion pose for dissociative experiences (Bernstein and Putnam, 1986) or a greater risk for the user? Might some virtual environments depersonalization (Sierra and Berrios, 2000). Of course, there be more psychologically disturbing than others? VR research may be cases in which experimenters seek to include subjects should seek to answer these and similar questions. 
In particular, with experiences of dissociation in order to investigate ways open-ended longitudinal studies will be necessary to assess the in which VR might be used to treat the underlying conditions, risk of long-term usage for the general population, just like with such as treating post-traumatic stress disorder (PTSD) through new substances for pharmaceutical cognitive enhancement exposure therapy in VR (Botella et al., 2015). In those special or medical treatments more generally. On the other hand, it cases, it is important to implement alternative exclusion criteria, is difficult to assess these risks without running experiments such as Rothbaum et al. (2014), who excluded subjects with a that generate those possible risks, thus raising worries about history of psychosis, bipolar disorder, and suicide risk. non-maleficence. A strict adherence to non-maleficence would require avoid- ing all experiments using virtual environments for which the Informed Consent risk is unknown. We suggest that this strict interpretation of The results presented above clearly suggest that VR experiences non-maleficence is not optimal, because substantial ethical can have lasting psychological impact. This new knowledge about assessments should always be evidence-based and necessarily the lasting influence of experiences in VR must not be withheld involve the investigation of greater time-windows and larger from subjects in new VR experiments. populations. 
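The prescreening procedure urged above can be made concrete in software. The following Python sketch is purely illustrative: the scale names, score ranges, and cutoff values are hypothetical placeholders, not validated clinical thresholds, which would have to be drawn from the psychometric literature on dissociative experiences and depersonalization and approved by an ethics committee.

```python
# Illustrative prescreening filter for a VR study.
# All instruments, score ranges, and cutoffs below are HYPOTHETICAL
# placeholders; real exclusion criteria must be empirically validated.

HYPOTHETICAL_CUTOFFS = {
    "dissociative_experiences": 30.0,  # e.g., a DES-style 0-100 scale
    "depersonalization": 70.0,         # e.g., a severity scale for depersonalization
}

def screen_candidate(scores, history):
    """Return (eligible, reasons).

    `scores` maps scale name -> questionnaire score;
    `history` is a set of self-reported clinical history flags.
    """
    reasons = []
    # Exclude candidates scoring above a cutoff on any risk scale.
    for scale, cutoff in HYPOTHETICAL_CUTOFFS.items():
        if scores.get(scale, 0.0) > cutoff:
            reasons.append(f"{scale} score above cutoff ({cutoff})")
    # Alternative exclusion criteria of the kind used by Rothbaum et al. (2014).
    for flag in ("psychosis", "bipolar_disorder", "suicide_risk"):
        if flag in history:
            reasons.append(f"reported history of {flag}")
    return (len(reasons) == 0, reasons)
```

A candidate with low scores and no reported history would be eligible, while each failed criterion is returned as an explicit reason that can be documented for the ethics committee; studies that deliberately include dissociative subjects would swap in alternative criteria rather than reuse these defaults.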
VR researchers could and should provide a valuable service by informing the public and policy makers of the possible We recommend that informed consent for VR experi- risks of spending large amounts of time in unregulated virtual ments ought to include an explicit statement to the effect Frontiers in Robotics and AI | www.frontiersin.org 8 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct that immersive VR can have lasting behavioral influences of the research, rather than motivated by false hope or even on subjects, and that some of these risks may be presently desperation. unknown. VR researchers aiming at new clinical applications should Subjects should be made aware of this possibility out of respect therefore work slowly and carefully, in close collaboration for their autonomy (as included, for example, in the American with physicians who may be better situated to make Psychological Association General Principle D)6. That is, if an informed judgments about the suitability of particular experiment might alter their behavior without their awareness patients for new trials. of this alteration, then such an experiment could be seen as a threat to the autonomy of the subject. A reasonable way to Therapeutic and clinical applications should be investigated preserve autonomy, we suggest, is simply to inform subjects of only in the presence of certified medical personnel. possible lasting effects. Please again note the principled problem Another relevant concern here is the way in which the general that research animals are not able to give informed consent, their public keeps informed of new developments in science through the interest needs to be represented by humans. Also note that we popular media. 
Members of the general public with less interest in are not suggesting that subjects ought to be informed about the science may have a more difficult time gleaning scientific knowl- particular effects that are being investigated in the experiment. edge from the media than those with more interest (Takahashi Thus, our recommendation should not raise the concern that and Tandoc, 2015). When considering their responsibility as informing subjects may compromise researchers’ abilities to test scientists to communicate new results to the public (Fischhoff, for particular behavioral effects. 2013; Kueffer and Larson, 2014), VR researchers working in clini- cal applications must be careful to avoid language that might give Practical Applications: False Hope and Beneficence false hope to patients. Another concern has to do with various applications of VR. One of We should also note here that there are other practical concerns many promising applications for VR research is in the treatment about the use of VR for medical interventions. For instance, once of disease, damage, and other health-related issues, especially the technology is available for patients to use, who will pay for mental health.7 For instance, researchers found that immersing it? Should medical insurance pay for HMDs and new software? burn victims in an icy virtual environment can mitigate their How do we achieve distributive justice and avoid a situation where experience of pain during medical procedures (Hoffman et al., only privileged members of society benefit from technological 2011). Here, we wish to raise some concerns about applications of advances? We make no recommendation here, but flag this ques- VR. The first concern is that patients may develop false hope with tion as something that needs to be considered by policy makers. regard to clinical applications of VR. 
The second concern is that Similarly, HMDs, CAVE immersive displays, and motion-tracking applications of VR may encounter a tension between beneficence technology may have to be reclassified as medical devices. and autonomy. One risk when performing the research necessary for develop- Patients may believe that treatment using VR is better than ing such applications is that the patients involved may develop traditional interventions merely due to the fact that it is a new a false sense of hope due to the non-traditional nature of the technology, or an experimental application of existing tech- intervention. As this kind of research progresses, scientists must nology. This sense of false hope is known as the “therapeutic continue to be honest with patients so as not to generate false misconception” in the literature on the ethics of clinical research hope. There is also an overlap between media ethics and the ethics (Appelbaum et al., 1987; Kass et al., 1996; Lidz and Appelbaum, of VR technology: a related example is that many of the early 2002; Chen et  al., 2003). Researchers using VR for clinical experiments on full-body illusions (Ehrsson, 2007; Lenggenhager research must be aware of established techniques for combating et al., 2007) have been falsely overreported as creating full-blown the therapeutic misconception in their subjects. For example, “out-of-body experiences” (Metzinger, 2003, 2009a,b), and scien- one established guideline for investigating new clinical applica- tists have perhaps not done enough to correct this misrepresenta- tions is that of “clinical equipoise,” which is the requirement that tion of their own work in the media.8 While incremental progress there be genuine uncertainty in the medical community as to the has clearly been made, large parts of the public still falsely believe best form of treatment (Freedman, 1987). 
It is important that that scientists “have created OBEs in the lab.” researchers communicate their own sense of this uncertainty in a clear manner to volunteer subjects. Similarly, as Chen et  al. Overall, scientists and the media need to be clear and (2003) note, physicians who have a lasting relationship with their honest with the public about scientific progress, especially patients may be better suited to form a judgment as to whether in the area of using VR for medical treatment. the patient is motivated by a clear understanding of the nature The second concern about applications of VR has to do with the well-known tension between autonomy and beneficence in 6 http://www.apa.org/ethics/code/ 7 VR has been used to treat a wide range of mental health issues, including eat- ing disorders (Ferrer-Garcia et al., 2015), acrophobia (Emmelkamp et al., 2001), 8 For some examples of the full-body illusion being misrepresented in the media, see: agoraphobia (Botella et al., 2004), arachnophobia (Carlin et al., 1997), and PTSD http://www.nytimes.com/2007/08/24/science/24body.html http://news.bbc.co.uk/2/ (Rothbaum et al., 2001). See Parsons and Rizzo (2008) for a meta-analysis of these hi/health/6960612.stm http://www.sciencedaily.com/releases/2007/0 8/070823141057. kinds of treatment. htm Frontiers in Robotics and AI | www.frontiersin.org 9 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct applied ethics (Beauchamp and Childress, 2013: Chapter 6). As As the embodiment in avatars and physical robots may be func- the results surveyed in the first part of this article suggest, VR ena- tionally shallow and may provide only weaker and less stable forms bles a powerful form of non-invasive psychological manipulation. 
of self-control (for example, with regard to spontaneously arising One obvious application of VR, then, would be to perform such aggressive fantasies, see Metzinger, 2013c for an example), it is manipulations in order to bring about desirable mental states and not clear how such PSM-actions mediated via brain–computer behavioral dispositions in subjects. Indeed, early experiments in interfaces should be assessed in terms of accountability and ethical VR have done just that, making subjects willing to save more responsibility.9 for their retirement (Hershfield et al., 2011), perform better on Just as VR can be used to increase empathy, it can conceiv- tests for implicit racial bias (Peck et al., 2013), and behave in a ably be used to decrease empathy. Doing so would have obvious more environmentally conscious manner (Ahn et al., 2014). In military applications in training soldiers to have less empathy for a paternalistic spirit, such as that of the UK Behavioral Insights enemy combatants, to feel no remorse about doing violence. We Team, one might urge that beneficent VR applications such as will not go further into the difficult issues regarding the use of these should be put in place among the general populace, perhaps new technology in warfare, but we note this possible alternative as a new form of “public service announcement” for the twenty- application of the technology. Apart from increasing or decreas- first century. Here, we wish to note that doing so may generate ing empathy, the power of VR to induce particular kinds of emo- another case of conflict between beneficence and autonomy. If tions could be used deliberately to cause suffering. Conceivably, individuals do not seek to alter their psychological profile in the the suffering could be so extreme as to be considered torture. 
ways intended by the beneficent VR interventions, then such Because of the transparency of the emotional layers in the human interventions may be considered a violation of their autonomy. self-model (Metzinger, 2003), it will be experienced as real, even if it is accompanied by cognitive-level insight into the nature of Dual Use the overall situation. Powerful emotional responses occur even Dual use is a well-known problem in research ethics and the ethics when subjects are aware of the fact that they are in a virtual of technology, especially in the life sciences (Miller and Selgelid, environment (Meehan et al., 2002). 2008). Here, we use it to refer to the fact that technology can be used for something other than its intended purpose, in particular Torture in a virtual environment is still torture. The fact to military applications. In the context of VR technology, one that one’s suffering occurs while one is immersed in a will immediately think not only of drone warfare, teleoperated virtual environment does not mitigate the suffering itself. weapon systems, or “virtual suicide attacks,” but also of interroga- tion procedures and torture. It is not in the power of the scientists and engineers who develop the technology to police its use, but VR Research and the Internet we can raise awareness about potential misuses of the technology A final concern for the research ethics portion of this article as a way of contributing to precautionary steps. has to do with the use of the internet in conjunction with VR Here is an example. One possible application of VR would be research. For instance, scientists may wish to observe the pat- to rehabilitate violent offenders by immersing them in a virtual terns of behavior for users under particular conditions. It is clear environment that induces a strong sense of empathy for their that the internet will play a main role in the adoption of VR for victims. We see no problem at all with voluntary participation personal use. 
Users will be able to inhabit virtual environments in such a promising use of the technology. But it is foreseeable with other users through their internet connections, and perhaps that governments and penal systems adopt mandatory treatment enjoy new forms of avatar-based intersubjectivity. As O’Brolcháin using similar techniques, calling to mind Anthony Burgess’ A et al. (2016) suggest, we will soon see a convergence of VR with Clockwork Orange. We will not comment on the moral acceptabil- online social networks. The overall ethical risks of this imminent ity of such a practice, noting that the details of implementation development have been covered in detail by O’Brolcháin et  al. may be an important – and more controllable – unknown factor. (2016); in this section, we will incorporate and expand on their Virtual embodiment constitutes historically new form of act- discussion with a focus on questions of research ethics. ing. Metzinger (2013c) introduced the notion of a “PSM-action” to There is a sizable body of literature covering the main issues describe this new element more precisely. PSM-actions are those of internet research ethics (Ess and Association of Internet actions in which a human being exclusively uses the conscious Researchers Ethics Working Committee, 2002; Buchanan and self-model in her brain to initiate an action, causally bypassing the non-neural body (as in Figure 5). Of course, there will have to be feedback loops for complex actions, for instance, when seeing 9 It is important to note that teleoperated weapon systems are used in an illegal through the camera eyes of a robot, perhaps adjusting a grasping manner today, and it would not be rational to assume that the introduction of military VR-technology in combination with brain–computer interfaces could lead movement in real-time (which is still far from possible today). But to a change in this deplorable situation. 
With German support, the United States the relevant causal starting point of the entire action is no longer of America execute citizens of and in other sovereign states (e.g., Yemen, Somalia, the body made of flesh and bones, but the conscious self-model Pakistan) without charge, trial, or final judgment (the so-called “extrajudicial kill- in our brain. We simulate an action in the self-model, in the inner ings”), thereby violating international law (under which lethal force may be used image of our body, and a machine performs it. PSM-actions are outside armed conflict zones only as a last resort to prevent imminent threats; see Melzer, 2008 for background and discussion) as well as national law, human almost purely “mental,” put they may have far-reaching causal rights, and humanitarian laws. The potential for further illegal or unethical military consequences in the real world, for example, in combat situations. applications of VR is high, and one of our major concerns. Frontiers in Robotics and AI | www.frontiersin.org 10 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct FiGURe 5 | “PSM-actions”: a test subject lies in a nuclear magnetic resonance tomograph at the weizmann institute in israel. With the aid of data goggles he sees an avatar, also lying in a scanner. The goal is to create the illusion that he is embodied in this avatar. The test subject’s motor imagery is classified and translated into movement commands, setting the avatar in motion. After a training phase, test subjects were able to control a remote robot in France “directly with their minds” via the Internet, while they were able to see the environment in France through the robot’s camera eyes. (Image used with kind permission from Doron Friedman and Ori Cohen, cf. Cohen et al., 2014.) Ess, 2008, 2009). Here, we address the following question: Let us begin with the question of privacy. 
It is widely accepted how should these existing issues of internet research ethics be that researchers have an ethical obligation to treat confidentially approached for cases of internet research with the use of VR? The any information that may be used to identify their subjects (see, two main issues that we will cover here are privacy and obtaining for example, European Commission, 2013, p. 12). This obliga- informed consent. We will consider the internet both as a tool and tion is based on the general right to privacy outside of a research a venue for research (Buchanan and Zimmer, 2015), while noting context (Universal Declaration of Human Rights, article 12, 1948; that virtual environments may place pressure on the distinction European Commission Directive 95/46/EC). Practicing this between internet as research tool and internet as research venue. confidentiality may involve, for instance, erasing, or “scrubbing,” Frontiers in Robotics and AI | www.frontiersin.org 11 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct personally identifiable information from a data set (O’Rourke, navigation bar. In addition, and more interesting, it is also fore- 2007; Rothstein, 2010). seeable that HMDs will incorporate simultaneous combinations As O’Brolcháin et  al. note, immersive virtual environments of augmented, substitutional, and VR, with the user being able to will involve the recording of new kinds of personal information, toggle between elements of the three. Such a situation would add such as “eye-movements, emotions, and real-time reactions” ambiguity, and perhaps confusion, for attempts to determine the (2015, p. 8). We would like to add that immersive VR could even- user’s location in cyberspace. 
This ambiguity raises the likelihood tually incorporate motion capture technology in order to record that users may give consent for data collection in a particular the details of users’ bodily movements for the purpose of, for virtual context but then become unaware of the continued data example, representing their avatar as moving in a similar fashion. collection as the user changes context. Such a situation might Although implementing this scenario may be beyond the capa- occur if users of HMDs are able to toggle between, say, an entirely bilities of the forthcoming commercial hardware, it is plausible virtual gaming environment, a look out of the window to the busy and rational to assume that the technology may evolve quickly to street below presented through augmented reality, and a family include such options. Data regarding the kinematics of users will gathering hundreds of kilometers away using substitutional real- be useful for researchers from a range of disciplines, especially ity through an omni-directional camera set up at the party. This those interested in embodied cognition (Shapiro, 2014). On the worry can be addressed by giving users continuous reminders plausible assumption that one’s kinematics is very closely related (after, of course, they have given informed consent) that their to one’s personality and the deep functional structure of bodily behavior is being recorded for research purposes. Perhaps the self-consciousness – only your body moves in precisely this man- visual display could include a small symbol for the duration of ner – there will a highly individual “kinematic fingerprint.” This the time in which data are being collected. kind of data collection presents a special threat to privacy. O’Brolcháin et al. 
(2016) recommend protecting the privacy We leave the implementational details open, but urge the for users of online virtual environments through legislation and scientific community to take steps to avoid the abuse of through incentives to develop new ways of protecting privacy. As informed consent with this technology, especially in the a complement to these recommendations, we wish to highlight interest of preserving public trust. the threat to privacy created by motion capture technology. Unlike eye-movements and emotional reactions, one’s kinemat- A Note on the Limitations of a Code of Ethics for ics may be uniquely connected with one’s identity, as indicated Researchers above. Researchers collecting such data must be aware of its We would like to conclude our discussion of the research ethics sensitive nature and the dangers of its misuse. In addition, com- of VR by noting that the proposed (incomplete) code of conduct mercial providers of cloud-based VR-technology will frequently is not intended to be sufficient for guaranteeing ethical research have an interest of “harvesting,” storing, and analyzing such data in this domain. What we mean here is that following this code and users should be informed about such possibilities and give should not be considered to be a substitute for ethical reasoning explicit consent to them. on the part of researchers, reasoning that must always remain A second main concern in the ethics of internet research is sensitive to contextual and implementational details that cannot that of informed consent. In contrast to informed consent for be captured in a general code of conduct. We urge researchers to traditional face-to-face experiments, internet researchers may conceive of our recommendations here as an aid in their ongoing obtain consent by having subjects click “I agree” after being reflections about the ethical implications and permissibility of presented with the relevant documentation. 
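The "small symbol" reminder discussed above could be wired to a simple per-context consent model. The sketch below is one possible design, not an existing API: the context names are invented for illustration, and a real headset client would bind the indicator flag to its display layer. The invariant it enforces is the one suggested in the text: data may be collected only in a context for which the user has explicitly consented, and the on-screen indicator is shown exactly when collection is active.

```python
# Minimal sketch of per-context consent tracking for a mixed-reality display.
# Context names and the API are hypothetical; only the invariant matters:
# no consent for the current context => no recording and no indicator.

class ConsentTracker:
    def __init__(self):
        self.consented = set()  # contexts for which explicit consent was given
        self.context = None     # current context, e.g., "vr_game"

    def give_consent(self, context):
        """Record explicit, per-context informed consent."""
        self.consented.add(context)

    def switch_context(self, context):
        """Called when the user toggles between VR, AR, etc."""
        self.context = context

    def recording_allowed(self):
        """Data may be collected only in a context with explicit consent."""
        return self.context in self.consented

    def indicator_visible(self):
        """The on-screen symbol is shown exactly when data are being recorded."""
        return self.recording_allowed()
```

For example, a user who consents only in a virtual gaming environment and then toggles to an augmented-reality street view would have recording_allowed() return False, so collection must stop rather than silently continue across the context change.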
There are a number their own work, and to proactively support us in developing this of concerns and challenges regarding the practice of gaining ethics code into more detailed future versions. As we emphasized consent for research using the internet as a venue (Buchanan and in the beginning of the article, this work is only intended as a Zimmer, 2015, see section Privacy, below), including, of course, first list of possible issues in the research ethics of VR and related the fact that actually reading internet privacy policies before technologies. We intend to update and revise this list continu- accepting them would take far more time than we are willing to ously as new issues arise, although the venue for future revisions allocate – on one estimate, it would take each of us 244 h per year is undecided. In any case, we wish to open an invitation for (McDonald and Cranor, 2008). constructive input from researchers in this field regarding issues We suggest that immersive VR will add further complications that should be added or reformulated. to these existing issues due to its manipulation of bodily location and its dissolution of boundaries between the real and the virtual. Scientists must understand that following a code of ethics Consider that entering a new internet venue, say a chatroom or a is not the same as being ethical. A domain-specific ethics forum, involves a fairly well-defined threshold at which informed code, however consistent, developed, and fine grained consent can be requested before one enters the venue. Due to future versions of it may be, can never function as a the centrality of the URL for using the web, one’s own location substitute for ethical reasoning itself. in cyberspace is fairly easy to track. 
With VR, by contrast, it is foreseeable that one's movement through various virtual environments will be controlled by one's bodily movements, through facial gestures, or simply by the trajectory of visual attention, in a way unlike internet navigation using a mouse, keyboard, and monitor.

Frontiers in Robotics and AI | www.frontiersin.org | February 2016 | Volume 3 | Article 3

RISKS FOR INDIVIDUALS AND SOCIETY

Now consider possible issues that may arise with widespread adoption of VR for personal use. Once the technology is available to the general public for entertainment (and other) purposes, individuals will have the option of spending extended periods of time immersed in VR – in a way this is already happening with the advent of smartphones, social networks, increasing time online, etc. Some of the risks and ethical concerns that we have already encountered in the early days of the internet¹⁰ will reappear, though with the added psychological impact enabled by embodiment and a strong sense of presence. We all know that internet technology long ago began to change our self-models and consequently our very own psychological structure. The combination with technologies of virtual and robotic re-embodiment may greatly accelerate this development.

For instance, consider the infamous case of virtual rape in LambdaMOO, the text-based multi-user dungeon (MUD). In that virtual world, a player's character known as "Mr. Bungle" used a "voodoo doll" program to control the actions of other characters in the house. He forced them to perform a range of sexual acts, some of which are especially disturbing (Dibbell, 1993). Users of LambdaMOO were outraged, and at least one user whose character was a victim of the virtual rape reported suffering psychological trauma (ibid.). The relevant point to keep in mind here is that this entire virtual transgression occurred in a world that was entirely text based. We will soon be fully immersed in virtual environments, actually embodying – rather than merely describing – our avatars. The results sketched above in Section "Illusions of Embodiment and Their Lasting Effect" suggest that the psychological impact of full immersion will be great, likely far greater than the impact of text-based role-playing.

We must now take steps in order to help users avoid suffering psychological trauma of various kinds. To this end, we will discuss four kinds of foreseeable risks:

• long-term immersion;
• neglect of embodied interaction and the physical environment;
• risky content;
• privacy.

We will offer several concrete recommendations for minimizing all four of these kinds of risks to the general public, a number of which call for focused research initiatives.

The Effects of Long-Term Immersion
First, and perhaps most obviously, we simply do not know the psychological impact of long-term immersion. So far, scientific research using VR has involved only brief periods of immersion, typically on the order of minutes rather than hours. Once the technology is adopted for personal use, there will be no limits on the time users choose to spend immersed. Similarly, most research using VR has been conducted using adult subjects. Once VR is available for commercial use, young adults and children will be able to immerse themselves in virtual environments. The risks that we discuss below are especially troublesome for these younger users, who are not yet psychologically and neurophysiologically fully developed.

    In order to better understand the risks, we recommend longitudinal studies: further research into the psychological effects of long-term immersion.

Of course, these studies must be conducted according to the principles of informed consent, non-maleficence, and beneficence outlined in Section "The Research Ethics of VR." There are several possible risks that can be associated with long-term immersion: addiction, manipulation of agency, unnoticed psychological change, mental illness, and lack of what is sometimes vaguely called "authenticity" (Metzinger and Hildt, 2011, p. 253). The risks that are discovered through longitudinal studies must be directly and clearly communicated to users, preferably communicated within VR itself.

Psychologists have long expressed concern about internet use disorder (Young, 1998), and it is a topic of ongoing research (Price, 2011).¹¹ This area of research must now expand in order to include concerns about addiction to immersive VR, both online and offline. Doing so will require monitoring users who prefer to spend long periods of time immersed (see Steinicke and Bruder, 2014 for a first self-experiment). There are two relevant open questions here. First, how might the diagnostic criteria for addiction to VR differ from the established criteria for internet use disorder and related conditions? Note that the neurophysiological underpinnings of VR addiction may differ from those of internet use disorder (Montag and Reuter, 2015), due to the prolonged illusion of embodiment created by VR technology, and because it implies causal interaction with the low-level mechanisms constituting the UI. Second, can we make use of the recommended treatments for internet use disorder for the purpose of helping individuals with VR addiction? For instance, Gresle and Lejoyeux (2011, p. 92) recommend informing users how much time they have spent playing an online game, and including non-player characters in the game to urge players to take breaks. It is plausible that these strategies would be effective for immersive VR as well, but focused research is needed.

A second concern about long-term immersion has to do with the fact that immersive VR can manipulate the user's sense of agency (Gallagher, 2005). In order to generate a strong illusion of ownership for the virtual body, the VR technology must track the self-generated movements of the user's real body and render the virtual body as moving in a similar manner.¹² When things are working well, users experience an illusion of ownership of the virtual body (the avatar is my body), as well as an illusion of agency (I am in control of the avatar). Importantly, the sense of agency in VR is always indirect; control of the avatar is always mediated by the technology. To be more precise, the virtual body representation has been causally coupled with and temporarily embedded into the currently active conscious self-model in the user's brain – it is not that some mysterious "self" leaves the physical body and "enters" the avatar, but rather a novel functional configuration in which two body representations dynamically interact with each other. However, the causal loop in principle enables bidirectional forms of control, or even unnoticed involuntary influence.

The fact that the user's sense of agency in VR is always continuously maintained by the technology is an important one for at least two reasons. First, the technology could be used to manipulate users' sense of agency. Second, as we discuss in the general context of mental health below, long-term immersion could cause low-level, initially unnoticeable psychological disturbances involving a loss of the sense of agency for one's physical body.

VR technology could manipulate users' sense of agency by creating a false sense of agency for movements of the avatar that do not correspond to the actual body movements of the user. The same could be true for "social hallucinations," i.e., the creation of the robust subjective impression of ongoing social agency, of engaging in a real, embodied form of social interaction, which, however, in reality is only interaction with an unconscious AI or with complex software controlling the simulated social behavior of an avatar. Using only a computer screen, a modified mouse, and headphones, a false sense of agency was created in Daniel Wegner's well-known "I Spy" experiments (Wegner and Wheatley, 1999; Wegner, 2002). In those experiments, subjects reported that they felt themselves to be in control of a cursor selecting an icon on a computer screen when in fact the cursor was being controlled by someone else. The illusion of control was induced by auditory priming – subjects heard a word through headphones that had a semantic association with the icon that was subsequently selected by the cursor. It is reasonable to think that Wegner's method can be implemented rather easily in VR. While immersed in VR, subjects can receive continuous audio and visual cues intended to influence their psychological states. Future experimental work can determine the conditions under which subjects will experience a sense of agency for movements of the avatar that deviate from the subject's actual body movements (as during an OBE or in the dream state; see Kannape et al., 2010 for an empirical study). Important parameters here will likely be the timing of the false movement, the degree to which the false movement deviates from the actual position of the body, and the context of the movement within the virtual environment (including, for instance, the attentional state of the subject).

Creating a false sense of agency in VR is a clear violation of the user's autonomy, a violation that becomes especially worrisome as users spend longer and longer periods of time immersed. Here, we will not insist that all cases of violating autonomy in this manner are ethically impermissible, noting that some such violations may be subtle and beneficent, a kind of virtual "nudge" in the right direction (Thaler and Sunstein, 2009). In addition, human beings often willfully choose to decrease their autonomy, as in drinking alcohol or playing games. But we do claim that creating a false sense of agency in VR is an unacceptable violation of individual autonomy when it is non-beneficent, such as when it is done out of avarice, for example. Manipulating the sense of agency for users in VR is a topic that deserves attention from regulatory agencies.

A third concern that we wish to raise about long-term immersion is that of risks for mental health. As stated above, we simply do not know whether long-term immersion poses a threat for mental health. Future research ought to investigate whether factors such as the duration of immersion, the content of the virtual environment (including the user's own avatar, or the way in which the software controls the automatic behavior, facial gestures, or gaze of other avatars), and the user's pre-existing psychological profile might have lasting negative effects on the mental health of users. As mentioned above (see Ethical Experimentation), we suspect that heavy use of VR might trigger symptoms associated with Depersonalization/Derealization Disorder (DSM-5 300.14). Overall, the disorder can be characterized as having chronic feelings or sensations of unreality. In the case of depersonalization, individuals experience an unreality of the bodily self, and in the case of derealization, individuals experience the external world as unreal. For instance, those suffering from the disorder report feeling as if they are automata (loss of the sense of agency), and feeling as if they are living in a dream (see Simeon and Abugel, 2009 for illustrative reports from individuals suffering from depersonalization).¹³ Note that Depersonalization/Derealization Disorder involves feelings of unreality but not delusions of unreality: there is a dissociation of the low-level phenomenology of "realness" from high-level cognition. That is, someone suffering from depersonalization may lose the sense of agency, but will not thereby form the false belief that they are no longer in control of their own actions.

Depersonalization/Derealization Disorder is relevant for us here because VR technology manipulates the psychological mechanisms involved in generating experiences of "realness," mechanisms similar or identical to those that go awry for those suffering from the disorder. Even though users of VR do not believe that the virtual environment is real, or that their avatar's body is really their own, the technology is effective because it generates illusory feelings as if the virtual world is real (recall the virtual pit from Section "Illusions of Embodiment and Their Lasting Effect" above). What counts is the variable degree of transparency or opacity of the user's own conscious representations (Metzinger, 2003). Our concern is that long-term immersion could cause damage to the neural mechanisms that create the feeling of reality, of being in immediate contact with the world and one's own body. Heavy users of VR may begin to experience the real world and their real bodies as unreal, effectively shifting their sense of reality exclusively to the virtual environment.¹⁴

    We recommend focused longitudinal studies on the impact on mental health of long-term immersion in VR. These studies should especially investigate risks for dissociative disorders, such as Depersonalization/Derealization Disorder.

¹⁰ See Gregory Lastowka, 2010 for a thoughtful treatment of some of the relevant issues.
¹¹ Internet use disorder is listed as an area requiring further research in the DSM-5, but it is not (yet) an official disorder according to the manual.
¹² If the real body is not in motion, then co-location of the virtual body with the real body as seen from the first-person perspective can be sufficient for the illusion of ownership (Maselli and Slater, 2013).
¹³ There is a sizeable literature on depersonalization/derealization. Some of the central works include Steinberg and Schnall, 2001; Radovic and Radovic, 2002; Simeon and Abugel, 2009; Sierra, 2012.
¹⁴ We should be clear here that we are only speculating about a possible causal connection between long-term immersion and experiences of depersonalization/derealization. The etiology of the disorder is still not well understood. It is well-known that episodes of depersonalization/derealization can be triggered by stress, panic attacks, and the use of some drugs (Simeon, 2004). One prominent theory suggests that chronic depersonalization/derealization may be caused by childhood trauma (ibid.), though see Marshall et al. (2000).
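To make the agency-manipulation scenario discussed above more concrete, the following is a minimal, purely hypothetical sketch (not from the article, and not a real VR API) of how experimental software could decouple a rendered avatar from the user's tracked body. It is parameterized by the factors the text identifies as important: the timing of the false movement (onset), the degree of deviation (drift rate), and a cap on the mismatch. All function names and numeric values are illustrative assumptions.

```python
# Hypothetical sketch: injecting a controlled deviation between the user's
# tracked hand position and the avatar hand that is rendered to the user.
# Parameters mirror those named in the text: onset (timing of the false
# movement), drift_rate (degree of deviation per second), and max_offset
# (cap on the mismatch). All names and numbers are illustrative only.

def render_avatar_x(tracked_x, t, onset=2.0, drift_rate=0.01, max_offset=0.15):
    """Return the avatar's rendered x-coordinate (meters) at time t (seconds).

    Before `onset`, the avatar mirrors the tracked body exactly. Afterwards,
    a lateral offset grows by `drift_rate` meters per second, capped at
    `max_offset`. Whether users still report a sense of agency under a given
    (onset, drift_rate, max_offset) setting is exactly the kind of open
    empirical question the authors say future work must answer.
    """
    if t <= onset:
        return tracked_x
    offset = min((t - onset) * drift_rate, max_offset)
    return tracked_x + offset
```

On this sketch, an experiment would sweep the three parameters across trials and record whether participants still report control over the avatar; as argued above, any such deviation from tracked movement would require explicit informed consent.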
A final concern for long-term immersion stems from the fact that some may consider experiences in the virtual environment to be "inauthentic," because those experiences are artificially generated. This concern may remind some readers of Robert Nozick's well-known thought experiment about an "Experience Machine" that can provide users with any experience they desire (Nozick, 1974, p. 42–45). Nozick uses the thought experiment to raise a problem for utilitarianism, urging his readers to consider reasons why one might not wish to "plug in" to the machine, claiming that "something matters to us in addition to experience" (Nozick, 1974, p. 44). The interesting question, of course, now becomes what would happen if this "additional something" can be added to the experience itself, for example by advanced VR-technology creating the phenomenal quality of "authenticity," of direct reference to something "meaningful," for example by a more robust version of naïve realism on the level of subjective experience itself or by manipulation of the user's emotional self-model. While Nozick suggests that many of us would not wish to plug in to the experience machine for the reason just stated, recent work by Felipe de Brigard suggests otherwise. De Brigard (2010) presented students with several variations on the thought experiment, all with an important twist on Nozick's original version. In de Brigard's version, we are told that we are already plugged in to an experience machine and we are asked if we would like to unplug in order to return to our "real" lives. Many of de Brigard's students replied that they would not wish to unplug, leading de Brigard to suggest that our reactions to the thought experiment are influenced more by the status quo bias (Samuelson and Zeckhauser, 1988) than by our valuing of something more than experience. It is the status quo bias, de Brigard suggests, that gives us pause about plugging in to the machine (in Nozick's version of the thought experiment), just as it is the status quo bias that gives us pause about unplugging (in de Brigard's version).

Overall, de Brigard's results offer initial reasons to be skeptical about Nozick's supposition that we would not plug in because we value factors beyond experience alone. Even with this skepticism, though, many of us may still feel that there is something false, "inauthentic," or undesirable about living large portions of one's life in an entirely artificial environment, such as VR. Apart from the dubious essentialist metaphysics lurking behind the vague and sometimes ideologically charged notion of an "authentic self," it is important to note how such intuitions are historically plastic and culturally embedded: they may well change over time as larger parts of the population begin to use advanced forms of VR technology. As an example, please note how already today we find a considerable number of people who are not able to grasp the difference between "friendship" and "friendship on Facebook" any more. Fully engaging with the issue of losing "authenticity" in virtual environments would likely require entering into some deep philosophical waters, and we are unable to do so here, though we will touch on some of the relevant issues below. Apart from the deeper philosophical issues, there is one important point that we wish to make before moving on.

The point has to do with the way in which we imagine the possibilities of VR for personal use. One reason behind an assertion that long-term immersion would be an inauthentic way of spending one's time is that one might assume that the content of immersive VR would be unedifying, making people more shallow as they retreat from society in favor of an artificial social world in which their decisions are made for them. O'Brolcháin et al. raise this sort of concern:

    With little exposure to "higher" culture, to great works of art and literature; and without the skills (and maybe the attention spans) to enjoy them; people would be less able to engage with the world at a deep level. People without exposure to great works and ideas might find that [their] inner lives are shaped to a large degree by market-led cultural products rather than works of depth and profundity. (2015, p. 20)

We agree that such a scenario would be undesirable, but wish to counterbalance this concern by reminding readers that it is not unique to VR. It is a concern that can be applied in various degrees to other media technology as well, going all the way back to worries about the written word in Plato (Phaedrus 274d–275e). The printing press, for example, can enable one to disseminate great works of literature, but it can also enable the dissemination of vulgarity – and it certainly changed our minds. Readers with vulgar tastes can "immerse" themselves as they wish. The same goes for photography and motion pictures. The important point is as follows. There is no reason to doubt that works of great depth and profundity can be produced by artists who choose VR as their medium. Just as film emerged as a new predominant art form in the twentieth century, so might VR in the twenty-first century. We predict that immersive VR-technology will gradually lead to the emergence of completely new forms of art (or even architecture; see Pasqualini et al., 2013), which may be hard to conceive today, but which will certainly have cultural consequences, perhaps even in our understanding of what an artistic subject and esthetic subjectivity really are.

Neglect of Others and the Physical Environment
As users spend increasing time in virtual environments, there is also a risk of their neglecting their own bodies and physical environments – just as, for many people today, posing and engaging in disembodied social interactions via their Facebook account has become more important than what was called "real life" in the past. In extreme cases, individuals refuse to leave their homes for extended periods of time, behavior categorized as "Hikikomori" by the Japanese Ministry of Health. VR will enable us to interact with each other in new ways, not through disembodied interaction, as in the texts, images, and videos of current social media, but rather through what we have called the illusion of embodiment. We will interact with other avatars while embodied in our own avatars. Or perhaps we will use augmented reality through omni-directional cameras that allow us to enjoy the illusion of being in the presence of someone who is far away in space and/or time. To put it more provocatively, we may soon, as Norbert Wiener anticipated many years ago, have the ability to "telegraph" human beings (Wiener, 1954, p. 103–104). Telepresence is likely to become a much more accessible, immediate, comprehensive, and embodied experience.

Our general recommendation on this theme is for focused research into the following question:

    What, if anything, is lost in cases of social interactions that are mediated using advanced telepresence in VR? If such losses were unnoticed, what negative effects for the human self-model could be expected?

This question has been a major theme in some of Hubert Dreyfus' work on the philosophy of technology. Dreyfus has emphasized that mediating technologies may not capture something of what is important for real-time interactions in the flesh, what, following Merleau-Ponty, he calls "intercorporeality" (Dreyfus, 2001, p. 57). When we are not present in the flesh with others, the context and mood of a situation may be difficult to appreciate – if only because the bandwidth and the resolution of our internal models are much lower. Perhaps more importantly, there is a concern that mediating technologies will not allow us to pick up on all of the subtle bodily cues that appear to play a major role in social communication through unconscious entrainment (Frith and Frith, 2007), cues that involve ongoing embodied interaction (Gallagher, 2008; de Jaegher et al., 2010).

In addition to the concerns about losing embodied signaling for communication, we might also consider what is lost from the sense modalities that are not (yet) integrated into VR. As Sherry Turkle puts it, when these kinds of technology "keep grandparents from making several-thousand-mile treks to see their grandchildren in person (and there is already evidence that they do), children will be denied something precious: the starchy feel of a grandmother's apron, the smell of her perfume up close, and the taste of her cooking" (Harmon, 2008; Turkle, 2011, p. 342). Advances in technology could conceivably address Turkle's point about other perceptual modalities, but there remains a question about what may be lost even if we can create virtual content for other sense modalities.¹⁵ One recent finding that should raise concern here is that depression is more likely in older adults who have less social contact in person, regardless of their amount of telephone, written, and email contact (Teo et al., 2015). Apart from this troubling finding, even if the technology eventually enables rich social interaction through telepresence, the concern remains that heavy use of such technology will lead to neglect or even animosity toward one's actual physical and social environment. The recurring tragedies of parents with "gamer rage" who have injured and killed their children because the children disrupt their playing indicate that this concern is valid and serious.¹⁶

Clark (2003) takes a notably different approach to these kinds of issues, raising the point that, instead of treating VR and related technologies as a replacement for in-the-flesh interaction, we should think of them as providing opportunities for new and perhaps enhanced modes of human interaction. Rather than unsatisfactory reproductions of familiar modes of interaction, the technology should be developed with an eye toward "expanding and reinventing our sense of body and action" (2003, p. 111). Consider, for example, using a combination of substitutional and augmented reality to see a representation of some of the physiological states of your partner who is many miles away – such as a soft flash over the body in synchrony with the heartbeat (as in Aspell et al., 2013). That and similar uses of the technology could plausibly enhance embodied (though mediated) social interaction. As with many other topics addressed here, future research will be crucial for our understanding of which uses of the technology will be best for enabling positive forms of (mediated) social interaction.

Clark's recommendation that we use the technology as an enhancer rather than a replacement does have some appeal. However, what counts as an "enhancement," and what as therapy or a mere life-style decision, has been a topic of ethical debates for a long time, for instance, in assessing the correct use of pharmaceutical cognitive enhancement (Metzinger and Hildt, 2011; Metzinger, 2012). We should also note that his recommendation may not entirely address the concerns raised by Dreyfus, Turkle, and others. The foreseeable problem is that the general public simply will not share Clark's vision, choosing to use the technology as a de facto replacement for traditional modes of interaction (as Turkle notes in the passage above). Are "Facebook-friendships" social enhancements or social disabilities? In such a situation, we must remain mindful of what may be lost, especially when the technology may encourage less frequent "in-the-flesh" visits to the infirm and immobile.

We wish to close this discussion of the ways in which VR might attenuate our contact with others and with our physical environments by revisiting a point briefly made in the previous section on a loss of authenticity during long-term immersion. As noted above in the discussion of Nozick's experience machine, many readers might have the intuition that spending long periods of time in virtual environments is "somehow inauthentic." Yet, what counts for the applied ethics of VR are not intuitions, but rather rational arguments and empirical evidence. We would like to note that a likely relevant factor here may be whether those long periods of immersion involve forms of intersubjective engagement with others that are subjectively experienced as meaningful, and how this experience is integrated into our culture. Along these lines, one may suggest that the artificial nature of the virtual environment is not as important compared to whether or not the environment affords intersubjective engagement experienced as meaningful (see Bostrom, 2003, p. 245–55 and Chalmers, 2005 for similar ideas). This, of course, opens the possibility that ultimately shallow or even largely meaningless social interactions (once again, think of today's Facebook-"friendships" and "likes") are experienced as substantial by users who are really only overwhelmed by the possibilities of future VR-technology, and which are subsequently described as meaningful. A shallow form of social interaction could then become culturally assimilated and thereby "normalized" (Metzinger and Hildt, 2011, p. 247). Normalization is a complex sociocultural process by which certain new norms become accepted in societal practice, a process that is often mediated by the availability of new technologies, a process that changes our very own minds and which, therefore, carries the risk of unnoticed self-deception. Here, we cannot explore this rich (and controversial) philosophical territory, but note that it may be relevant for grappling with the worry of "inauthenticity" in virtual environments.

Risky Content
Another main concern for users of VR is that of virtual content. One might begin with the general rule of thumb that red lines not to be crossed in reality should be the default red lines in VR. One obvious problem, though, is that users will almost certainly seek out VR as a way of crossing red lines with impunity. A second possible problem is that this rule of thumb would make VR even more subjectively real. One main issue here is whether some particular kinds of content in VR should be discouraged in various ways. Obvious candidates for such content would be sex (virtual pedophilia, virtual rape) and violence. But there are perhaps less obvious kinds of content that should be considered, such as content encouraging and reinforcing undesirable personality traits, including those identified as the "dark triad" (Paulhus and Williams, 2002). The dark triad refers to narcissism, Machiavellianism, and psychopathy. Individuals may find it appealing to spend time in virtual worlds designed to reward characters that exhibit traits associated with the dark triad. For example, the MMORPG EVE Online is known for fostering a style of play that involves manipulating and deceiving other players. The VR version of EVE Online, EVE: Valkyrie, has been described as "[u]ndoubtedly the most heavily anticipated virtual reality game."¹⁷ Based on some of the empirical results surveyed above (see "Illusions of Embodiment and Their Lasting Effect"), there is cause for concern about behavioral patterns rewarded in immersive games such as EVE: Valkyrie having a lasting influence on the psychological profile of users.

Apart from the behaviors encouraged by particular virtual environments, there are concerns about the content that can be created when users will have the freedom to create and design their own avatars. For instance, one goal of our own project VERE is to create software that enables untrained users to generate an avatar that resembles any human being with fairly little time and effort. This application would in principle allow for "body swapping," in which users enter the bodies of others (Petkova and Ehrsson, 2008). It is also worth noting that these avatars will be available for use after their human model is dead. Thus, we will be able to "resurrect" the dead in VR. The ability to body-swap and to interact with the dead in this way may offer great opportunity for therapy in the hands of the beneficent, but it could easily lead to profound trauma, especially in the hands of characters such as Mr. Bungle, mentioned above.

These considerations raise difficult questions about which regulatory actions would facilitate the best overall outcome. On the one hand, there are good reasons for taking a fairly restrictive approach to avatar ownership. On the other hand, there are also reasons for allowing individuals maximum freedom in their creation and use of avatars. Of course, one's approach to such questions will likely reflect whether one's political philosophy has more paternalistic or libertarian leanings. We will consider the reasons for each approach in turn.

A reasonable starting point on this issue would be to treat avatars in a manner analogous to personality rights relating to the publication of photos. They are public representations of persons. Interestingly, societies and legal systems exhibit considerable differences in their underlying moral intuitions here. One important conceptual issue here may be determining the relevant degree of similarity between an avatar and a human person. Just as many accept the right of an individual to control the commercial use of his or her name, image, or likeness, one might, for example, interpret the "right to my own avatar" as a property right as opposed to a personal right. Therefore, the validity of the right of publicity could be taken to survive the death of the biological individual. There will be new questions about the ownership (and individuation) of avatars. The likeness between a person and their avatar may or may not be an important factor. Instead of likeness, we might individuate avatars by a unique proper name that can be represented in the virtual space, as in many video games. How does one assign an unequivocal identity to the virtual representation of a body or a person? Could there be something like a chassis plate number, a license plate, or a "virtual vehicle identification number" (VVIN)? We already have digital object identifiers (DOIs) for electronic documents and other forms of content, a form of persistent identification, with the goal of permanently and unambiguously identifying the object with which a given DOI is associated. But what about an avatar that is currently used by a human operator, namely one who is functionally and phenomenologically identifying with it? Should we dynamically associate a "digital subject identifier" (DSI) with it? There will also be questions about whether some kinds of virtual activities should be censored. Examples of such activities having to do with sex and violence are left to the reader's imagination. Another kind of content worth considering may be the use of virtual environments for indoctrination into extremist groups.

With these initial thoughts in place, now consider the reasons for taking a fairly strict regulatory stance on the ownership of one's own avatar. After mentioning the pressure from social networks such as Facebook for users to use their real identities, O'Brolcháin et al. recommend the development of technologies similar to digital watermarking that would ensure "that only the genuine owner of an avatar can use it" (2015, p. 22). Would this perhaps have to be a "DSI," as we proposed above? They suggest that such technology would help protect the autonomy and privacy of users. Along the same lines, if one were to identify strongly with one's own avatar, the "theft" and use of that avatar by another may be extremely disturbing. Importantly, avatar theft may also create completely new opportunities for impersonation and fraud, for example also using physical robots.

From a more theoretical point of view, we might distinguish between internal and external self-models: the internal self-model is in the brain of the user, and it is grounded in his or her body (Metzinger, 2014), whereas external person- and body-models can be created in virtual environments. Here, the specific, historically new kind of action that needs to be ethically assessed and legally regulated takes place when a user identifies with a potential external model of the self by dynamically integrating it with the internal model of the self already active in his or her brain. The core question seems to be what consequences for legal notions of ownership we should draw from the potential for phenomenological ownership. Virtual identification can cause real suffering, and real suffering is relevant for the law.

Without denying the value of protecting avatar ownership, we would now like to consider two reasons for taking a less restrictive, more libertarian, approach. First, implementing control over the use of particular avatars may be impractical. So far, attempts to curb digital piracy using technology have not been very successful, and there is no reason to think that things will be different for avatars. In fact, regulation and control may be even more difficult with avatars due to questions raised above having to do with avatar individuation and degrees of similarity. Say a user creates an avatar that is similar but not pixel-for-pixel identical to another user's avatar.¹⁸ Where precisely should we draw the line between theft and acceptable similarity? Protecting avatar ownership might lead to a regulatory quagmire. Even if the appearance of the avatar is not highly relevant for ownership, we would need to establish a widely accepted alternative method of individuation, such as a unique proper name that cannot be easily forged. The second reason for taking a less restrictive approach would be out of concern for individual creative freedom. As noted above, VR holds the promise of being a powerful new artistic medium – the creative possibilities are astonishing. The fact that regulations on avatar ownership may restrict those possibilities must be taken into consideration.

    Avatar ownership and individuation will be an important issue for regulatory agencies to consider.

Privacy
Privacy is, of course, a major concern with contemporary information technology (van den Hoven et al., 2014), and there are further concerns about privacy with the foreseeable convergence between VR and social networks (O'Brolcháin et al., 2016). Here, we wish to offer only a few quick remarks on this topic, noting that this issue deserves further attention. Commercial applications of virtual environments introduce new possibilities for targeted advertising or "neuromarketing," thus attacking the individual's mental autonomy (Metzinger, 2015). By tracking the details of one's movements in VR, including eye movements, involuntary facial gestures, and other indicators of what researchers call low-level intentions or "motor intentions" (Riva et al., 2011), private agencies will be able to acquire details about one's interests and preferences in completely new ways (Coyle and Thorson, 2001). If avatars themselves should in the future be used as "humanoid interfaces," consumers can be influenced and manipulated by real-time feedback of the avatar's own facial and eye movements (for example, via automatic and unconscious responses in their mirror-neuron system; Rizzolatti and Craighero, 2004). Commercials in VR could even feature images of the target audience himself or herself using the product. The use of big data to "nudge" users ("Big Nudging") combined with VR could have long-lasting effects, perhaps producing changes in users' mental mechanisms themselves.

    Users ought to be made aware that there is evidence that advertising tactics using embodiment technology such as VR can have a powerful unconscious influence on behavior.

SUMMARY

In this article, we have considered some of the risks that may arise with the commercial and research use of VR. We have offered some concrete recommendations and noted areas in which further ethical deliberation will be required. One main theme of the article is that there are several open empirical questions that should be urgently addressed in a beneficent research environment in order to mitigate risks and raise awareness for users of VR in the general public. More research is needed. Here, one of our main goals was to provide a first set of ethical recommendations

¹⁵ All of these concerns bring up the question of whether the problem is merely a shortcoming in the technology, or something more fundamental. That is, should we only be concerned about losing important information through mediated interactions, information such as bodily cues and tactile sensations? If so, then advances in technology can conceivably address that concern. Or is there something else that is lost when not present in the flesh with others? It seems that thinkers such as Dreyfus wish to suggest that there is something else that is lost when we lose "intercorporeality," something that cannot be captured with better and better technology. Still, it remains somewhat difficult to articulate what that "something else" might be. One possibility is that social interactions that are mediated by advanced technology lose some form of "authenticity," as discussed above. It is also worth noting that our epistemic limitations may be relevant: in the case of VR, we do not yet know the way in which social interaction will be altered.
¹⁶ For a list of examples, see: http://movingtolearn.ca/2013/gamer-rage-child-abuse-a-growing-problem-deserving-our-attention (retrieved 1 December 2015).
¹⁷ http://www.craveonline.com/culture/878953-top-10-virtual-reality-games-will-convince-strap-vr-headset#/slide/10 (retrieved 30 September 2015).
There are strong as a platform for future discussions, a set of normative starting reasons to place restrictions on the way in which avatars points that can be continuously refined and expanded as we go can be used, such as protecting the interests and privacy of along (see Table 1). individuals who strongly identify with their own particu- Let us end by making one more general point, an observation lar avatar on social networks. On the other hand, these which is of a more philosophical nature. VR is the representation restrictions may prove impractical to implement and may of possible worlds and possible selves, with the aim of making unnecessarily limit personal creative freedom. them appear as real as possible – ideally, by creating a subjective sense of “presence” in the user. Interestingly, some of our best theories of the human mind and conscious experience itself describe it in a very similar way: leading current theories of brain 18 The importance of personal identity for moral philosophy is well-known (Parfit, 1984; Shoemaker, 2014). The considerations here introduce the additional compli- dynamics (Friston, 2010; Hohwy, 2013; Clark, 2015) describe it as cation of identity for virtual representations of persons. See Vallor (2010, especially the constant creation of internal models of the world, predictively pp. 166–167) and Rodogno (2012) for insightful discussions. generating hypotheses – virtual neural representations – about Frontiers in Robotics and AI | www.frontiersin.org 18 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct TABLe 1 | veRe code of conduct for the ethical use of vR in research and by the general public. ReCOMMeNDATiONS FOR THe ReSeARCH eTHiCS OF vR 1. Non-maleficence a. No experiment should be conducted using virtual reality with the foreseeable consequence that it will cause involuntary suffering or serious or lasting harm to a subject. b. 
A rational, evidence-based identification and minimization of risks (also those pertaining to a more distant future) ought to be a part of research itself. 2. informed consent a. Informed consent for VR experiments ought to include an explicit statement to the effect that immersive VR can have lasting behavioral influences on subjects, and that some of these risks may be presently unknown. b. Experimental VR research should not be carried out on subjects incapable of informed consent. 3. Transparency and media ethics a. In experimental work developing new clinical applications, researchers should be careful not to create false hopes in patients by repeatedly reminding them of the merely experimental nature of the research. b. VR researchers aiming at new clinical applications should work in close collaboration with physicians who may be better situated to make informed judgments about the suitability of particular patients for new trials. c. Scientists and the media need to be clear and honest with the public about scientific progress, and not only in the area of using VR for medical treatment. d. In interacting with the media, scientists should cultivate a proactive attitude, especially if they are the first to become aware of novel types of risks through their own work. Communication with the public, if needed, should be self-initiated, an act of taking control and acting in advance of a future situation, rather than just reacting. 4. Dual use a. Potential military applications of VR, AR, and SR should be closely monitored by policy makers and funding agencies alike. b. Torture in a virtual environment is still torture. The fact that one’s suffering occurs while one is immersed in a virtual environment does not mitigate the suffering itself. c. Policy makers should aim at international arrangements among countries to add VR, AR, and SR in a process to harmonize lists of dual-use technologies to be controlled. 5. internet research a. 
The scientific community has to take steps to avoid the abuse of informed consent with this technology, especially in the interest of preserving public trust. b. The ability to toggle between VR, AR, and SR may create situations in which users are not able to maintain an understanding of when their informed consent to share information is in effect. Users should be repeatedly reminded within VR that they have given informed consent. 6. The Limitations of a Code of Conduct a. Scientists must understand that following a code of ethics is not the same as being ethical. A domain-specific ethics code, however consistent, developed, and fine-grained future versions of it may be, can never function as a substitute for ethical reasoning itself. b. Such reasoning must be conducted in a way that is sensitive to the contextual and implementational details of particular experimental paradigms, details that cannot be captured by a general code of conduct. ReCOMMeNDATiONS FOR THe USe OF vR BY THe GeNeRAL PUBLiC 1. Long-term immersion a. Longitudinal studies and further research into the psychological effects of long-term immersion are needed. b. Users must be made aware that these studies are seriously limited in that they will, due to ethical constraints, exclude users who may be most vulnerable (such as children or those with latent mental illness). Some of these vulnerabilities may be unknown to science and unknown to the users themselves. 2. increasing virtualization of social interactions – we call for focused research, large longitudinal studies, into the following questions: a. What, if anything, is lost in cases of social interactions that are mediated using advanced telepresence in VR? b. If such losses were unnoticed, what negative effects for the human self-model could be expected? 3. Risky content a. 
As compared to the viewing of traditional movies containing graphic violence or pornography, the impact of full immersion settings and the associated risk of users suffering psychological trauma will steadily increase as VR technology advances. Users have to be made aware of this possibility. b. VR technology holds the potential to create robust social hallucinations, to directly manipulate the sense of agency, to modulate personality traits via identification with virtual characters, or to causally interact with deeper levels of self-consciousness (UI-manipulation). Users have to be made aware of this possibility. c. Avatar ownership will be an important issue for regulatory agencies to consider. There are strong reasons to place restrictions on the way in which avatars can be used, such as protecting the interests and privacy of individuals who strongly identify with their own particular avatar on social networks. On the other hand, these restrictions may prove impractical to implement and may unnecessarily limit personal creative freedom. Regulators must strike a rational balance between these concerns. 4. Privacy a. Users ought to be made aware that there is evidence that advertising tactics using embodiment technology, such as VR, can have a powerful unconscious influence on behavior. For example, a combination of “Big Nudging” strategies (collecting big data for the purposes of nudging the general public) with VR technology could have long-lasting effects, which might also affect underlying mental mechanisms themselves. b. Data protection: users ought to be made aware of new risks involving surveillance, such as reading out “motor intentions” or a “kinematic fingerprint” during avatar use. Frontiers in Robotics and AI | www.frontiersin.org 19 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct the hidden causes of sensory input through probabilistic infer- are just beginning to understand. 
It is this complex convolution ence. Slightly earlier, some philosophers (Revonsuo, 1995, p. 55; that makes it so important to think about the Ethics of VR in a Revonsuo, 2009, p. 115.; Metzinger, 2003, p. 556; 2008, 2009a, p. critical, evidence-based, and rational manner. 6, 23, pp. 104–108; Westerhoff, 2015 for overview and discussion) have pointed out at length how conscious experience exactly is a AUTHOR CONTRiBUTiONS virtual model of the world, a dynamic internal simulation, which in standard situations cannot be experienced as a virtual model MM drafted the initial outline of the article and then both MM because it is phenomenally transparent – we “look through it” as and TM contributed written content. if we were in direct and immediate contact with reality. What is historically new, and what creates not only novel psychological ACKNOwLeDGMeNTS risks but also entirely new ethical and legal dimensions, is that one VR gets ever more deeply embedded into another VR: the The authors wish to thank Mel Slater, Patrick Haggard, Angelika conscious mind of human beings, which has evolved under very Peer, and all members of the VERE project for critical discussion specific conditions and over millions of years, now gets causally and detailed proposals. We are also grateful to the three reviewers coupled and informationally woven into technical systems for of this contribution for recommending substantial improvements representing possible realities. Increasingly, it is not only cultur- to the original draft. ally and socially embedded but also shaped by a technological niche that over time itself quickly acquires a rapid, autonomous FUNDiNG dynamics and ever new properties. This creates a complex convo- lution, a nested form of information flow in which the biological This work was funded by FP7 257695 Virtual Embodiment and mind and its technological niche influence each other in ways we Robotic Re-Embodiment (VERE). ReFeReNCeS Blascovich, J., and Bailenson, J. 
(2011). Infinite Reality. Avatars, Eternal Life, New Worlds, and the Dawn of the Virtual Revolution, 1st Edn. New York: William Morrow.

Ahn, S. J., Bailenson, J., and Park, D. (2014). Short- and long-term effects of embodied experiences in immersive virtual environments on environmental locus of control and behavior. Comput. Human Behav. 39, 235–245. doi:10.1016/j.chb.2014.07.025

Althaus, D., Erhardt, J., Gloor, L., Hutter, A., Mannino, A., and Metzinger, T. (2015). "Künstliche Intelligenz: Chancen und Risiken," in Diskussionspapiere der Stiftung für Effektiven Altruismus, Vol. 2, 1–17. Available at: http://ea-stiftung.org/s/Kunstliche-Intelligenz-Chancen-und-Risiken.pdf

American Psychiatric Association. (2013). Diagnostic and Statistical Manual of Mental Disorders. DSM-5, 5th Edn. Washington, DC: Author.

Ananthaswamy, A. (2015). The Man Who Wasn't There. Investigations into the Strange New Science of the Self. London: Dutton.

Appelbaum, P. S., Roth, L. H., Lidz, C. W., Benson, P., and Winslade, W. (1987). False hopes and best data: consent to research and the therapeutic misconception. Hastings Cent. Rep. 17, 20–24. doi:10.2307/3562038

Asch, S. (1951). "Effects of group pressure upon the modification and distortion of judgment," in Groups, Leadership and Men: Research in Human Relations, ed. H. Guetzkow (Oxford: Carnegie Press), 177–190.

Aspell, J., Heydrich, L., Marillier, G., Lavanchy, T., Herbelin, B., and Blanke, O. (2013). Turning body and self inside out: visualized heartbeats alter bodily self-consciousness and tactile perception. Psychol. Sci. 24, 2445–2453. doi:10.1177/0956797613498395

Azuma, R. (1997). A survey of augmented reality. Presence 6, 355–385. doi:10.1162/pres.1997.6.4.355

Bateson, M., Nettle, D., and Roberts, G. (2006). Cues of being watched enhance cooperation in a real-world setting. Biol. Lett. 2, 412–414. doi:10.1098/rsbl.2006.0509

Beauchamp, T., and Childress, J. (2013). Principles of Biomedical Ethics, 7th Edn. New York: Oxford University Press.

Behr, K.-M., Nosper, A., Klimmt, C., and Hartmann, T. (2005). Some practical considerations of ethical issues in VR research. Presence 14, 668–676. doi:10.1162/105474605775196535

Bernstein, E. M., and Putnam, F. W. (1986). Development, reliability, and validity of a dissociation scale. J. Nerv. Ment. Dis. 174, 727–735. doi:10.1097/00005053-198612000-00004

Blanke, O., and Metzinger, T. (2009). Full-body illusions and minimal phenomenal selfhood. Trends Cogn. Sci. 13, 7–13. doi:10.1016/j.tics.2008.10.003

Borgmann, A. (1984). Technology and the Character of Contemporary Life. A Philosophical Inquiry. Chicago: University of Chicago Press.

Bostrom, N. (2003). Are we living in a computer simulation? Philos. Q. 53, 243–255. doi:10.1111/1467-9213.00309

Botella, C., Serrano, B., Baños, R., and Garcia-Palacios, A. (2015). Virtual reality exposure-based therapy for the treatment of post-traumatic stress disorder: a review of its efficacy, the adequacy of the treatment protocol, and its acceptability. Neuropsychiatr. Dis. Treat. 11, 2533–2545. doi:10.2147/NDT.S89542

Botella, C., Villa, H., García Palacios, A., Quero, S., Baños, R., and Alcaniz, M. (2004). The use of VR in the treatment of panic disorders and agoraphobia. Stud. Health Technol. Inform. 99, 73–90.

Botvinick, M., and Cohen, J. (1998). Rubber hands 'feel' touch that eyes see. Nature 391, 756. doi:10.1038/35784

Brey, P. (2010). Philosophy of technology after the empirical turn. Tech. Res. Philos. Technol. 14, 36–48. doi:10.5840/techne20101416

Brugger, P., Kollias, S. S., Müri, R. M., Crelier, G., Hepp-Reymond, M. C., and Regard, M. (2000). Beyond re-membering: phantom sensations of congenitally absent limbs. Proc. Natl. Acad. Sci. U.S.A. 97, 6167–6172. doi:10.1073/pnas.100510697

Bublitz, J. C., and Merkel, R. (2014). Crimes against minds: on mental manipulations, harms and a human right to mental self-determination. Crim. Law Philos. 8, 51–77. doi:10.1007/s11572-012-9172-y

Buchanan, E., and Ess, C. (2008). "Internet research ethics. The field and its critical issues," in The Handbook of Information and Computer Ethics, eds K. Himma and H. Tivani (Hoboken, NJ: Wiley), 273–292.

Buchanan, E., and Zimmer, M. (2015). "Internet research ethics," in Stanford Encyclopedia of Philosophy, ed. E. Zalta. Available at: http://plato.stanford.edu/archives/spr2015/entries/ethics-internet-research/

Buchanan, E. A., and Ess, C. M. (2009). Internet research ethics and the institutional review board. SIGCAS Comput. Soc. 39, 43–49. doi:10.1145/1713066.1713069

Carlin, A. S., Hoffman, H. G., and Weghorst, S. (1997). Virtual reality and tactile augmentation in the treatment of spider phobia: a case report. Behav. Res. Ther. 35, 153–158. doi:10.1016/S0005-7967(96)00085-X

Chalmers, D. (2005). "The matrix as metaphysics," in Philosophers Explore the Matrix, ed. C. Grau (Oxford, UK: Oxford University Press), 132–176.

Chen, D., Miller, F., and Rosenstein, D. (2003). Clinical research and the physician-patient relationship. Ann. Int. Med. 138, 669–672. doi:10.7326/0003-4819-138-8-200304150-00015

Clark, A. (2003). Natural-Born Cyborgs. Minds, Technologies, and the Future of Human Intelligence. Oxford, NY: Oxford University Press.

Clark, A. (2015). Surfing Uncertainty. Prediction, Action, and the Embodied Mind. Oxford: Oxford University Press.

Cohen, O., Koppel, M., Malach, R., and Friedman, D. (2014). Controlling an avatar by thought using real-time fMRI. J. Neural Eng. 11, 035006. doi:10.1088/1741-2560/11/3/035006

Coyle, J., and Thorson, E. (2001). The effects of progressive levels of interactivity and vividness in web marketing sites. J. Advert. 30, 65–77. doi:10.1080/00913367.2001.10673646

de Brigard, F. (2010). If you like it, does it matter if it's real? Phil. Psychol. 23, 43–57. doi:10.1080/09515080903532290

de Jaegher, H., Di Paolo, E., and Gallagher, S. (2010). Can social interaction constitute social cognition? Trends Cogn. Sci. 14, 441–447. doi:10.1016/j.tics.2010.06.009

de Ridder, D., van Laere, K., Dupont, P., Menovsky, T., and Van de Heyning, P. (2007). Visualizing out-of-body experience in the brain. N. Engl. J. Med. 357, 1829–1833. doi:10.1056/NEJMoa070010

Dibbell, J. (1993). "A rape in cyberspace," in The Village Voice. Available at: http://www.villagevoice.com/news/a-rape-in-cyberspace-6401665

Dreyfus, H. (2001). On the Internet. London: Routledge.

Ehrsson, H. (2007). The experimental induction of out-of-body experiences. Science 317, 1048. doi:10.1126/science.1142175

Emmelkamp, P. M., Bruynzeel, M., Drost, L., and van der Mast, C. A. (2001). Virtual reality treatment in acrophobia: a comparison with exposure in vivo. Cyberpsychol. Behav. 4, 335–339. doi:10.1089/109493101300210222

Ess, C., and Association of Internet Researchers Ethics Working Committee. (2002). Ethical Decision-Making and Internet Research. Recommendations from the AoIR Ethics Working Committee; Association of Internet Researchers. Available at: http://aoir.org/reports/ethics.pdf

European Commission. (2013). Ethics for Researchers. Facilitating Research Excellence in FP7. Brussels. Available at: http://ec.europa.eu/research/participants/data/ref/fp7/89888/ethics-for-researchers_en.pdf

Fan, K., Izumi, H., Sugiura, Y., Minamizawa, K., Wakisaka, S., Inami, M., et al. (2013). "Reality jockey: lifting the barrier between alternate realities through audio and haptic feedback," in the SIGCHI Conference, eds W. Mackay, S. Brewster, and S. Bødker (Paris: ACM SIGCHI), 2557.

Ferrer-Garcia, M., Gutierrez-Maldonado, J., Treasure, J., and Vilalta-Abella, F. (2015). Craving for food in virtual reality scenarios in non-clinical sample: analysis of its relationship with body mass index and eating disorder symptoms. Eur. Eat. Disord. Rev. 23, 371–378. doi:10.1002/erv.2375

Fischhoff, B. (2013). The sciences of science communication. Proc. Natl. Acad. Sci. U.S.A. 110(Suppl. 3), 14033–14039. doi:10.1073/pnas.1213273110

Franssen, M., Lokhorst, G.-J., and van de Poel, I. (2009). "Philosophy of technology," in Stanford Encyclopedia of Philosophy, ed. E. Zalta. Available at: http://plato.stanford.edu/entries/technology/

Freedman, B. (1987). Equipoise and the ethics of clinical research. N. Engl. J. Med. 317, 141–145. doi:10.1056/NEJM198707163170304

Friston, K. (2010). The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11, 127–138. doi:10.1038/nrn2787

Frith, C., and Frith, U. (2007). Social cognition in humans. Curr. Biol. 17, R724–R732. doi:10.1016/j.cub.2007.05.068

Gallagher, S. (2005). How the Body Shapes the Mind. Oxford, NY: Clarendon Press.

Gallagher, S. (2008). Inference or interaction: social cognition without precursors. Phil. Explor. 11, 163–174. doi:10.1080/13869790802239227

Gregg, L., and Tarrier, N. (2007). Virtual reality in mental health. A review of the literature. Soc. Psychiatry Psychiatr. Epidemiol. 42, 343–354. doi:10.1007/s00127-007-0173-4

Gregory Lastowka, F. (2010). Virtual Justice. The New Laws of Online Worlds. New Haven, CT: Yale University Press.

Gresle, C., and Lejoyeux, M. (2011). "Phenomenology of internet addiction," in Internet Addiction, ed. O. P. Hannah (Hauppauge, NY: Nova Science Publisher's, Inc), 85–94.

Haney, C., Banks, W., and Zimbardo, P. (1973). Study of prisoners and guards in a simulated prison. Nav. Res. Rev. 9, 1–17.

Harmon, A. (2008). "Grandma's on the computer screen," in The New York Times. Available at: http://www.nytimes.com/2008/11/27/us/27minicam.html

Heeter, C. (1992). Being there: the subjective experience of presence. Presence 1, 262–271. doi:10.1162/pres.1992.1.2.262

Heidegger, M. (1977). The Question Concerning Technology, and Other Essays, 1st Edn. New York: Harper & Row (Harper Colophon Books).

Hershfield, H., Goldstein, D., Sharpe, W., Fox, J., Yeykelis, L., Carstensen, L., et al. (2011). Increasing saving behavior through age-progressed renderings of the future self. J. Mark. Res. 48, S23–S37. doi:10.1509/jmkr.48.SPL.S23

Hilti, L., Hänggi, J., Vitacco, D., Kraemer, B., Palla, A., Luechinger, R., et al. (2013). The desire for healthy limb amputation: structural brain correlates and clinical features of xenomelia. Brain 136(Pt 1), 318–329. doi:10.1093/brain/aws316

Hoffman, H., Chambers, G., Meyer, W., Arceneaux, L., Russell, W., Seibel, E., et al. (2011). Virtual reality as an adjunctive non-pharmacologic analgesic for acute burn pain during medical procedures. Ann. Behav. Med. 41, 183–191. doi:10.1007/s12160-010-9248-7

Hohwy, J. (2013). The Predictive Mind, 1st Edn. Oxford: Oxford University Press.

Huang, Z., Hui, P., Peylo, C., and Chatzopoulos, D. (2013). Mobile augmented reality survey: a bottom-up approach. The Computing Research Repository (CoRR). Available at: http://arxiv.org/abs/1309.4413

Kannape, O. A., Schwabe, L., Tadi, T., and Blanke, O. (2010). The limits of agency in walking humans. Neuropsychologia 48, 1628–1636. doi:10.1016/j.neuropsychologia.2010.02.005

Kass, N. E., Sugarman, J., Faden, R., and Schoch-Spana, M. (1996). Trust, The fragile foundation of contemporary biomedical research. Hastings Cent. Rep. 26, 25–29. doi:10.2307/3528467

Kilteni, K., Bergstrom, I., and Slater, M. (2013). Drumming in immersive virtual reality: the body shapes the way we play. IEEE Trans. Vis. Comput. Graph. 19, 597–605. doi:10.1109/TVCG.2013.29

Kueffer, C., and Larson, B. M. H. (2014). Responsible use of language in scientific writing and science communication. Bioscience 64, 719–724. doi:10.1093/biosci/biu084

Lenggenhager, B., Tadi, T., Metzinger, T., and Blanke, O. (2007). Video ergo sum: manipulating bodily self-consciousness. Science 317, 1096–1099. doi:10.1126/science.1143439

Lidz, C., and Appelbaum, P. (2002). The therapeutic misconception: problems and solutions. Med. Care 40(9 Suppl.), V55–V63. doi:10.1097/01.MLR.0000023956.25813.18

Madary, M. (2014). Intentionality and virtual objects: the case of Qiu Chengwei's dragon sabre. Ethics Inf. Technol. 16, 219–225. doi:10.1007/s10676-014-9347-4

Marcuse, H. (1964). One-Dimensional Man. Studies in the Ideology of Advanced Industrial Society, 2nd Edn. London: Routledge.

Marshall, R. D., Schneier, F. R., Lin, S. H., Simpson, H. B., Vermes, D., and Liebowitz, M. (2000). Childhood trauma and dissociative symptoms in panic disorder. Am. J. Psychiatry 157, 451–453. doi:10.1176/appi.ajp.157.3.451

Maselli, A., and Slater, M. (2013). The building blocks of the full body ownership illusion. Front. Hum. Neurosci. 7:83. doi:10.3389/fnhum.2013.00083

McDonald, A., and Cranor, L. F. (2008). The cost of reading privacy policies. I/S J. Law Policy Inform. Soc. 4.

Meehan, M., Insko, B., Whitton, M., and Brooks, F. (2002). Physiological measures of presence in stressful virtual environments. ACM Trans. Graph. 21, 645–652. doi:10.1145/566654.566630

Melzer, N. (2008). Targeted Killing in International Law. Oxford, NY: Oxford University Press (Oxford Monographs in International Law).

Metz, R. (2012). Augmented reality is finally getting real. MIT Technol. Rev. 2.

Metzinger, T. (2003). Being No One. The Self-Model Theory of Subjectivity. Cambridge, MA: MIT Press.

Metzinger, T. (2008). Empirical perspectives from the self-model theory of subjectivity: a brief summary with examples. Prog. Brain Res. 168, 215–278. doi:10.1016/S0079-6123(07)68018-2

Metzinger, T. (2009a). The Ego Tunnel. The Science of the Mind and the Myth of the Self. New York: Basic Books.

Metzinger, T. (2009b). Why are out-of-body experiences interesting for philosophers? The theoretical relevance of OBE research. Cortex 45, 256–258. doi:10.1016/j.cortex.2008.09.004

Metzinger, T. (2012). Zehn Jahre Neuroethik des pharmazeutischen kognitiven Enhancements – aktuelle Probleme und Handlungsrichtlinien für die Praxis. Fortschr. Neurol. Psychiatr. 80, 36–43. doi:10.1055/s-0031-1282051

Metzinger, T. (2013a). The myth of cognitive agency: subpersonal thinking as a cyclically recurring loss of mental autonomy. Front. Psychol. 4:931. doi:10.3389/fpsyg.2013.00931

Metzinger, T. (2013b). Why are dreams interesting for philosophers? The example of minimal phenomenal selfhood, plus an agenda for future research. Front. Psychol. 4:746. doi:10.3389/fpsyg.2013.00746

Metzinger, T. (2013c). "Two principles for robot ethics," in Robotik und Gesetzgebung, eds E. Hilgendorf and J.-P. Günther (Baden-Baden: Nomos), 247–286.

Metzinger, T. (2014). "First-order embodiment, second-order embodiment, third-order embodiment," in The Routledge Handbook of Embodied Cognition, ed. L. Shapiro (London: Routledge), 272–286.

Metzinger, T. (2015). M-autonomy. J. Conscious. Stud. 22, 11–12.

Metzinger, T., and Hildt, E. (2011). "Cognitive enhancement," in The Oxford Handbook of Neuroethics, eds J. Illes and B. J. Sahakian (Oxford, NY: Oxford University Press (Oxford Library of Psychology)), 245–264.

Milgram, P., and Colquhoun, H. (1999). "A taxonomy of real and virtual world display integration," in Mixed Reality: Merging Real and Virtual Worlds, eds Y. Ohta and H. Tamura (New York: Springer), 5–30.

Milgram, P., and Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. E77-D, 1321–1329.

Milgram, S. (1974). Obedience to Authority. An Experimental View. London: Tavistock.

Miller, S., and Selgelid, M. (2008). Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences. New York: Springer.

Montag, C., and Reuter, M. (2015). Internet Addiction. Neuroscientific Approaches and Therapeutical Interventions (Studies in Neuroscience, Psychology and Behavioral Economics). New York: Springer.

Normand, J.-M., Sanchez-Vives, M., Waechter, C., Giannopoulos, E., Grosswindhager, B., Spanlang, B., et al. (2012). Beaming into the rat world: enabling real-time interaction between rat and human each at their own scale. PLoS ONE 7:e48331. doi:10.1371/journal.pone.0048331

Nozick, R. (1974). Anarchy, State, and Utopia. New York: Basic Books.

O'Brolcháin, F., Jacquemard, T., Monaghan, D., O'Connor, N., Novitzky, P., and Gordijn, B. (2016). The convergence of virtual reality and social networks: threats to privacy and autonomy. Sci. Eng. Ethics 22, 1–29. doi:10.1007/s11948-014-9621-1

O'Rourke, P. (2007). Report of the Public Responsibility in Medicine and Research. Human Tissue/Specimen Banking Working Group. Available at: http://www.primr.org/workarea/downloadasset.aspx?id=936

Osimo, S., Pizarro, R., Spanlang, B., and Slater, M. (2015). Conversations between self and self as Sigmund Freud – a virtual body ownership paradigm for self counselling. Sci. Rep. 5, 13899. doi:10.1038/srep13899

Parfit, D. (1984). Reasons and Persons. Oxford: Oxford University Press.

Parsons, T., and Rizzo, A. (2008). Affective outcomes of virtual reality exposure therapy for anxiety and specific phobias: a meta-analysis. J. Behav. Ther. Exp. Psychiatry 39, 250–261. doi:10.1016/j.jbtep.2007.07.007

Pasqualini, I., Llobera, J., and Blanke, O. (2013). "Seeing" and "feeling" architecture: how bodily self-consciousness alters architectonic experience and affects the perception of interiors. Front. Psychol. 4:354. doi:10.3389/fpsyg.2013.00354

Paulus, D., and Williams, K. (2002). The dark triad of personality. Narcissism, Machiavellianism, and psychopathy. J. Res. Pers. 36, 556–563. doi:10.1016/S0092-6566(02)00505-6

Peck, T., Seinfeld, S., Aglioti, S., and Slater, M. (2013). Putting yourself in the skin of a black avatar reduces implicit racial bias. Conscious. Cogn. 22, 779–787. doi:10.1016/j.concog.2013.04.016

Petkova, V., and Ehrsson, H. (2008). If I were you: perceptual illusion of body swapping. PLoS ONE 3:e3832. doi:10.1371/journal.pone.0003832

Price, H. (ed.)

Rizzo, A. A., Wiederhold, M., and Buckwalter, J. G. (1998). Basic issues in the use of virtual environments for mental health applications. Stud. Health Technol. Inform. 58, 21–42.

Rizzolatti, G., and Craighero, L. (2004). The mirror-neuron system. Annu. Rev. Neurosci. 27, 169–192. doi:10.1146/annurev.neuro.27.070203.144230

Rodogno, R. (2012). Personal identity online. Philos. Technol. 25, 309–328. doi:10.1007/s13347-011-0020-0

Rosenberg, R., Baughman, S., and Bailenson, J. (2013). Virtual superheroes: using superpowers in virtual reality to encourage prosocial behavior. PLoS ONE 8:e55003. doi:10.1371/journal.pone.0055003

Rothbaum, B., Price, M., Jovanovic, T., Norrholm, S., Gerardi, M., Dunlop, B., et al. (2014). A randomized, double-blind evaluation of D-cycloserine or alprazolam combined with virtual reality exposure therapy for posttraumatic stress disorder in Iraq and Afghanistan war veterans. Am. J. Psychiatry 171, 640–648. doi:10.1176/appi.ajp.2014.13121625

Rothbaum, B. O., Hodges, L. F., Ready, D., Graap, K., and Alarcon, R. D. (2001). Virtual reality exposure therapy for Vietnam veterans with posttraumatic stress disorder. J. Clin. Psychiatry 62, 617–622. doi:10.4088/JCP.v62n0808

Rothstein, M. (2010). Is deidentification sufficient to protect health privacy in research? Am. J. Bioethics 10, 3–11. doi:10.1080/15265161.2010.494215

Samuelson, W., and Zeckhauser, R. (1988). Status quo bias in decision making. J. Risk Uncertainty 1, 7–59. doi:10.1007/bf00055564

Seth, A. (2013). Interoceptive inference, emotion, and the embodied self. Trends Cogn. Sci. 17, 565–573. doi:10.1016/j.tics.2013.09.007

Shapiro, L. (2014). The Routledge Handbook of Embodied Cognition, 1st Edn. London: Routledge, Taylor & Francis Group (Routledge Handbooks).

Shoemaker, D. (2014). "Personal Identity and Ethics," in Stanford Encyclopedia of Philosophy, ed. E. Zalta. Available at: http://plato.stanford.edu/archives/spr2014/entries/identity-ethics/

Sierra, M. (2012). Depersonalization. A New Look at a Neglected Syndrome. Cambridge: Cambridge Medicine.

Sierra, M., and Berrios, G. E. (2000). The Cambridge depersonalization scale: a new instrument for the measurement of depersonalization. Psychiatry Res. 93, 153–164. doi:10.1016/S0165-1781(00)00100-1

Simeon, D. (2004). Depersonalisation disorder: a contemporary overview. CNS Drugs 18, 343–354. doi:10.2165/00023210-200418060-00002

Simeon, D., and Abugel, J. (2009). Feeling Unreal. Depersonalization Disorder and the Loss of the Self. Oxford, NY: Oxford University Press.

Slater, M., Antley, A., Davison, A., Swapp, D., Guger, C., Barker, C., et al. (2006). A virtual reprise of the Stanley Milgram obedience experiments. PLoS ONE 1:e39. doi:10.1371/journal.pone.0000039

Slater, M., Spanlang, B., Sanchez-Vives, M., and Blanke, O. (2010). First person experience of body transfer in virtual reality. PLoS ONE 5:e10564. doi:10.1371/journal.pone.0010564

Steinberg, M., and Schnall, M. (2001). The Stranger in the Mirror. Dissociation: the Hidden Epidemic. New York: Cliff Street Books.

Steinicke, F., and Bruder, G. (2014). "A self-experimentation report about long-term use of fully-immersive technology," in The 2nd ACM Symposium, eds W. Andy, S. Frank, S. Evan, and S. Wolfgang (Honolulu, HI: ACM), 66–69.

Suzuki, K., Wakisaka, S., and Fujii, N. (2012). Substitutional reality system: a novel experimental platform for experiencing alternative reality. Sci. Rep. 2, 459. doi:10.1038/srep00459

Takahashi, B., and Tandoc, E. (2015). Media sources, credibility, and perceptions of science: learning about how people learn about science. Public Underst. Sci. doi:10.1177/0963662515574986

Teo, A., Choi, H., Andrea, S., Valenstein, M., Newsom, J., Dobscha, S., et al. (2015).
(2011). Internet Addiction. Hauppauge, NY: Nova Science Publisher’s, Inc. Does mode of contact with different types of social relationships predict depres- Radovic, F., and Radovic, S. (2002). Feelings of unreality: a conceptual and phe- sion in older adults? Evidence from a nationally representative survey. J. Am. nomenological analysis of the language of depersonalization. Phil. Psychiatr. Geriatr. Soc. 63, 2014–2022. doi:10.1111/jgs.13667 Psychol. 9, 271–279. doi:10.1353/ppp.2003.0048 Thaler, R., and Sunstein, C. (2009). Nudge. Improving Decisions about Health, Revonsuo, A. (1995). Consciousness, dreams and virtual realities. Phil. Psychol. 8, Wealth, and Happiness. New York: Penguin Books. 35–58. doi:10.1080/09515089508573144 The British Psychological Society. (2014). Code of Human Research Ethics. Available Revonsuo, A. (2009). Inner Presence. Consciousness as a Biological Phenomenon. at: http://www.bps.org.uk/system/files/Public%20files/code_of_human_research_ Cambridge, MA: MIT Press. ethics_ dec_2014_inf180_web.pdf Riva, G., Waterworth, J., Waterworth, E., and Mantovani, F. (2011). From intention Tsakiris, M., and Haggard, P. (2005). The rubber hand illusion revisited: visuotac- to action: the role of presence. New Ideas Psychol. 29, 24–37. doi:10.1016/j. tile integration and self-attribution. J. Exp. Psychol. Hum. Percept. Perform. 31, newideapsych.2009.11.002 80–91. doi:10.1037/0096-1523.31.1.80 Frontiers in Robotics and AI | www.frontiersin.org 22 February 2016 | Volume 3 | Article 3 Madary and Metzinger Real Virtuality: A Code of Ethical Conduct Turkle, S. (2011). Alone Together. Why We Expect More from Technology and Less Windt, J. (2015). Dreaming. A Conceptual Framework for Philosophy of Mind and from Each Other. New York: Basic Books. Empirical Research. Cambridge, MA: MIT Press. Vallor, S. (2010). Social networking technology and the virtues. Ethics Inf. Technol. Yee, N., and Bailenson, J. (2007). The proteus effect. 
The effect of transformed 12, 157–170. doi:10.1007/s10676-009-9202-1 self-representation on behavior. Hum. Commun. Res. 33, 271–290. van den Hoven, J., Blaauw, M., Pieters, W., and Warnier, M. (2014). “Privacy doi:10.1111/j.1468-2958.2007.00299.x and information technology,” in Stanford Encyclopedia of Philosophy, ed. Yoon, G., and Vargas, P. (2014). Know thy avatar: the unintended effect E. Zalta. Available at: http://plato.stanford.edu/archives/win2014/entries/ of virtual-self representation on behavior. Psychol. Sci. 25, 1043–1045. it-privacy/ doi:10.1177/0956797613519271 Wegner, D. (2002). The Illusion of Conscious Will. Cambridge, MA: MIT Press. Young, K. (1998). Internet addiction: the emergence of a new clinical disorder. Wegner, D. M., and Wheatley, T. (1999). Apparent mental causation. Cyberpsychol. Behav. 1, 237–244. doi:10.1089/cpb.1998.1.237 Sources of the experience of will. Am. Psychol. 54, 480–492. doi:10.1037/0003-066X.54.7.480 Conflict of Interest Statement: The authors declare that the research was con- Wendler, D. (2012). “The ethics of clinical research,” in Stanford Encyclopedia ducted in the absence of any commercial or financial relationships that could be of Philosophy, ed. E. Zalta. Available at: http://plato.stanford.edu/archives/ construed as a potential conflict of interest. fall2012/entries/clinical-research/ Westerhoff, J. (2015). What it means to live in a virtual world generated by our Copyright © 2016 Madary and Metzinger. This is an open-access article distributed brain. Erkenn. 1–22. doi:10.1007/s10670-015-9752-z under the terms of the Creative Commons Attribution License (CC BY). The use, Wiener, N. (1954). The Human Use of Human Beings. Cybernetics and Society. New distribution or reproduction in other forums is permitted, provided the original York, NY: Da Capo Press (The Da Capo series in science). author(s) or licensor are credited and that the original publication in this journal Windt, J. (2010). 
The immersive spatiotemporal hallucination model of dreaming. is cited, in accordance with accepted academic practice. No use, distribution or Phenom. Cogn. Sci. 9, 295–316. doi:10.1007/s11097-010-9163-1 reproduction is permitted which does not comply with these terms. Frontiers in Robotics and AI | www.frontiersin.org 23 February 2016 | Volume 3 | Article 3