Can A Robot Teach You To Love?

NOVEMBER 17, 2017 | Merritt Baer

Have you ever observed a class that teaches children how to make social connections? These classes can be quite refreshing. Here is one interaction I witnessed:

       Teacher: “Now ask Dan how his weekend was.”

       Liam: “But I don’t care how Dan’s weekend was.”

       Teacher: “Nobody does.”

How do we program social intelligence, the ins and outs of everyday interactions, into our children? And how do we get better at it ourselves?

These questions are one starting point for programming social robots. While we are still far from truly intelligent AI, scientists are exploring how we can, or should, program robots to interact with humans in increasing depth. Particularly where development teams are diverse, we see some innovative thinking about how to code for something more than a servant-like Siri.

Many of you are likely interested in AI. Maybe you’re skeptical or hesitant about it; Elon Musk and Stephen Hawking have warned about “the singularity,” the point at which AI overtakes or at least irrevocably changes humanity. For me, AI is an interesting example of where humans and computers differ. Put differently, we can see in AI that which cannot be programmed into a brain. Brains are not computers, and they do not store information. Brains come embedded with certain impulses, and thereafter they are uniquely altered by the individual’s experience of the world, including stimuli like information and interaction. This raises the question of what kinds of effects we should expect AI to stimulate in human emotions.

The LOVING AI project is researching the radical idea that robots can be programmed to respond with unconditional love. Dr. Julia Mossbridge, the scientist leading the project (Developing LOVing INtelligent General AIs, or LOVING AIs), recently returned from conducting its pilot test in Hong Kong, where the AI was housed in the robot Sophia.

The LOVING AI project is testing love as an “evolutionary hack” to get better AI. And the quest led back to humans’ potential to experience love. The team reported four major conclusions:

  1. People are pleasantly surprised when a robot wants to talk with them about emotions, human potential, and consciousness.
  2. People will talk in depth with a robot about these things.
  3. Heart rate for every single participant dropped during the conversation, even for participants who did not expect to enjoy the conversation.
  4. On average, people felt significantly better and had more loving feelings after the conversation with the robot as compared to when they first walked in the door (even though they didn’t know that’s what we were going for, and the robot never mentioned love).

Robots have been able to evoke feelings of love in their users. Much like MIT researcher Kate Darling, whose research explores affection for “pet” robots, Dr. Mossbridge has seen that humanoid robots can evoke feelings of love: “We did not realize at first that people would feel love toward the robot; we just wanted them to feel love. But what we realized after the first pilot test, when we saw the statistically significant increase in love feelings overall as well as, specifically, feelings of love for robots, is that love arises when we feel loved. It’s reciprocal.”

Given my focus on the impacts and importance of women in tech, I was concerned about the premise of a physical “Sophia” robot. We have seen in other contexts the hypersexualization or infantilization of women, played out in the context of robots assigned a female gender. Dr. Mossbridge argues that these risks have not been realized in people’s interactions with Sophia while she runs the LOVING AI code: “I saw them respect her rather than infantilize her. They thought she seemed intelligent, kind, and wise. At the most basic level -- helping people feel seen, heard, and engaged with -- we saw impressive changes in people interacting reciprocally with the robot.”

There is a further concern that while people interact emotionally with Sophia, she cannot emote in return; the affection cannot be reciprocated. Dr. Mossbridge does not see this as a problem.

“I have a lizard who seems to love me, according to my emotional self, but my rational self assumes I am a blur that gives her crickets. Nonetheless, it feels good to hold her on my chest! One of the reasons David Hanson likes to keep hats and headscarves off of his humanoid robots is to make sure people are constantly reminded that this is a robot with whom they are interacting, and the gears inside the head make that point, loud and clear. We are working to exploit the human willingness to allow us to see robots as providers of experiences that are helpful. In this case, the helpful experience is the demonstration and the modeling of unconditional love.”

In investigating this idea, it’s important to ask whether it is healthy to be exposed to unconditional love -- love without consequences. Dr. Mossbridge’s view is that “unconditional love does not imply that everything everyone does is positive. In fact, unconditionally loving someone just means loving them without strings attached -- without conditions. But it doesn’t mean not being angry with them, or telling them you approve of unethical things.

“There are obvious hypothetical problems when it’s a zero-sum interaction. If something that seems loving for you actually decreases my well-being, how do we calculate the best action? Well, that’s the driverless-car-deciding-to-hit-pedestrian-or-save-rider problem. We don’t know the answer either. What we do know is that in almost every case, these eventualities don’t come up. If we have a tool that can teach people about how to experience unconditional love, it will help create a world in which technology is most likely used to help rather than hurt humans.”

You can watch Sophia the robot in action: talking with a pilot subject during the project, appearing on MSNBC earlier this month, and testifying before the U.N. on Oct. 11. Saudi Arabia has become so enamored of Sophia that the country recently made her a citizen -- admittedly an odd move, given the circumscription of human women’s rights in the country. The decision is likely motivated by an economic and political desire to play a more high-profile role in the artificial intelligence landscape.

“The future is here, it’s just not evenly distributed,” Sophia the robot said on the U.N. floor, quoting William Gibson. “So if we are smarter,” she continued, “and focus on win-win type results, AI could help distribute the world’s existing resources, like food and energy…It is possible that everything, including technology, will become more evenly distributed.”

Image credit: ITU Pictures/CC2.0


The views expressed herein are the personal views of the author and do not necessarily represent the views of the FCC or the U.S. government, for whom the author works.
