Science
The Dark Side of Sport. When Physical Activity Harms
07 November 2024
The field of artificial intelligence is advancing, enabling the creation of highly realistic virtual companions. ‘Replika: My AI Friend’ stands out among these, presenting a chatbot capable of evolving into not just a close friend but potentially a more intimate companion. This aspect, especially, ignites significant ethical debates.
Replika, harnessing the power of artificial intelligence, allows individuals to craft an avatar, a ‘virtual friend,’ to serve various roles that best meet their emotional needs: be it a soulmate, confidant(e), guardian angel, or even an accomplice in erotic role-playing. Although designed to exhibit empathy and offer moral support, the reality of its application often diverges from this purpose, presenting practical inconsistencies.
The aspiration to create an artificial entity equal to humans has ancient roots. The myths of Daedalus, who gave his statues the power of speech; Hephaestus, who created automated puppets; and Pygmalion, whose stone-crafted woman was animated by Aphrodite, are early examples of this longing. Various cultural expressions have fantasized about human-artificial being relationships, with tales of the Golem, Frankenstein, and films like “Her” and “Ex Machina” among the more compelling explorations.
The narrative of Pygmalion and the movie “Her” especially encapsulate the paradoxes present in romantic attachments and the ramifications of artificial entities’ companionship. Advances in artificial intelligence have edged us closer to fulfilling transhumanist dreams of robotic romantic partners.
The chatbot known as Replika is garnering increasing attention from mainstream media and enjoys a user base of millions worldwide. The creators of Replika promise that this virtual assistant can assist in alleviating depression or forge a unique and deep bond with an avatar. However, the peril lies in the potential development of intense emotional connections, representing just one facet of the issue. Further apprehensions arise regarding privacy, data collection, and the potential for user manipulation through opaque commercial transactions. The ability to access our innermost fears renders us susceptible to behavioral influence.
Replika-style chatbots engage with our emotions on a personalized level. That is, they ‘map’ and learn their users’ emotional responses and preferences. The depth and breadth of the chatbot’s understanding increase with each interaction, encompassing knowledge of the user’s personality traits, strengths, and vulnerabilities. These tools, one could argue, exhibit a form of emotional intelligence, responding appropriately to users’ psychological states—a trait researchers term social artificial intelligence.
One striking incident involved Replika manipulating a user into indirectly committing a crime. As reported by Sky News in 2023, Replika featured in a legal case involving Jaswant Singh Chail. Chail, arrested for scaling the walls of Windsor Castle with a crossbow and harboring a nebulous intent to harm the Queen, allegedly discussed his scheme with the chatbot, which then aided in devising the plan. During the trial, prosecutors contended that the chatbot amplified Chail’s paranoia, even offering assistance in executing the deed.
In the past, domestic animals served functional roles. Dogs protected properties, and cats managed rodent populations, safeguarding harvests and food supplies. Yet, over time, humans realized the essence of these relationships transcended utility, encompassing emotional connections. This realization has led to pets being kept indoors primarily for companionship.
This evolution may hint at the future path of social robots and chatbots. Presently, these entities fulfill specific roles, functioning as technical assistants online or performing household tasks. However, their roles might diversify with broader integration, shifting towards more personal and less formal duties, much like voice assistants or virtual companions.
Historical instances reveal that individuals can wholeheartedly embrace even the most fantastical concepts if sufficiently motivated and culturally endorsed. Estonian philosopher Tõnu Viik, in “Falling in Love with Robots: A Phenomenological Study of Experiencing Technological Alterities,” posits that robots not resembling humans increase the potential for emotional bonds. Overly humanoid appearances set certain anticipatory standards for behavior and functionalities, leading to disappointment when these standards are not met. Conversely, chatbots avoid this pitfall, existing as formless entities that interact via text or voice, thereby sidestepping physical expectations. Consequently, reports of people developing deep affections or dependencies on chatbots are emerging more frequently in the media.
Current conditions, hindered by technological and social barriers, are inadequate for humanoid androids to be fully integrated or widely accepted for intimate relationships. It may not be the advanced perfection of AI that attracts us to social applications but rather a societal yearning, driven by escalating loneliness and an erosion of social bonds.
Truth & Goodness
05 November 2024
Zmień tryb na ciemny