Talking to a perpetrator with the help of a deepfake: it has become possible with an application that professor of Clinical Psychology Agnes van Minnen has used in trauma processing therapy. After a pilot with two patients, funding has now been requested for a more extensive study.
‘I hope you die.’ Meg slams her laptop shut with a bang. She has just had a conversation on her computer with the perpetrator who abused her as a child. Meg is now 48 years old and suffers from post-traumatic stress disorder. Afterward, she says that she felt in charge of the situation for the first time. The conversation gave her the feeling that she could stand up for herself and voice her anger.
Meg (a fictitious name) was not actually speaking to her perpetrator. A deepfake (a digitally generated video of a person, ed.) was created from a photograph of the perpetrator, whose mouth and eyes can be made to move, while a therapist speaks on his behalf. The dialogue is included in Van Minnen’s article on the use of deepfakes for trauma processing, published in the journal Frontiers in Psychiatry.
Restorative mediation
The deepfake therapy was piloted with two patients of the PsychoTrauma Expertise Center PSYTREC, where Van Minnen is the director of the treatment program. Funding has been requested to treat ten patients with the therapy and to conduct a controlled trial exploring its applicability.
‘Restorative mediation already exists’, Van Minnen says about the treatment. ‘It is a conversation between the perpetrator and the victim, in which it is important that the perpetrator shows remorse and apologizes to the victim. However, this is not always possible: for instance, if the perpetrator shows no insight, if the perpetrator is a stalker, or if no report has been filed. In many cases such a conversation is not advisable. A deepfake can then be a valuable alternative.’
‘It can be helpful for a victim if a deepfake tells them it was not their fault’
Van Minnen was approached by Theo Gevers, professor of Computer Vision at the University of Amsterdam, who had brought the software to the commercial market. Gevers wanted the technique to be used for societal purposes as well. Van Minnen immediately thought of it as an option for treatment, because many victims are left with feelings of guilt.
‘Most victims were very young at the time of the abuse’, Van Minnen explains. ‘They often wonder why it happened to them and sometimes think that they brought the situation on themselves. It can be helpful for a victim if a deepfake tells them it was not their fault. It also allows them to rise above the perpetrator, partly because they are older now than they were when the abuse occurred.’
What the therapist says as the voice of the deepfake is based on clinical experience and on what works in restorative mediation, Van Minnen explains. ‘We still need to explore what else can be helpful. Most important is that the perpetrator apologizes and that the victim can explain the impact the abuse has had on their life.’
The news coverage of the deepfake therapy in de Volkskrant sparked a discussion. Van Minnen: ‘There is a group of people who question new forms of therapy. Some people have ethical concerns. For instance, the perpetrators are unaware that their pictures are being used, and they may be people who have never been convicted. Is their privacy sufficiently protected? Someone also asked whether the therapy could be dangerous if it makes a person less cautious around a potentially dangerous perpetrator. Yet that risk seems small to me. People are well aware that they have been talking to a deepfake perpetrator.’
The head hardly moves
One of the pilot participants had completed her treatment, but the perpetrator still lived in the same town. She avoided the street where he lived and the shop where he worked. After the deepfake therapy, she felt more comfortable going to that street and the shop.
‘The patients instantly froze when they saw any movement on the screen’
Although the deepfake is not very realistic – the voice is different and the head hardly moves – the victims perceive it as real. Van Minnen: ‘When we were starting up and only the picture of the perpetrator could be seen on the screen, the patients were relatively relaxed, but they instantly froze as soon as anything on the screen moved.’ At the end of the session, the patients reported the same experience and asked to do it again sometime.
Many therapies already use pictures or imagination to converse with a perpetrator, but the effect is often not as strong as with a deepfake. ‘That is really remarkable’, Van Minnen says. She compares the deepfake to virtual reality, where people also react strongly to images despite knowing that they are not real.
Dangers
Van Minnen has requested funding for new research with ten patients in the final phase of trauma processing. She hopes that this new study will yield concrete results. ‘I am especially interested in the potential dangers of this therapy. We are very enthusiastic, but we need thorough research first.’
There are more ways to incorporate deepfakes in therapy. They are already being used in grief counseling, where people can talk to a deceased person. ‘Perhaps in a few years everyone will have an app on their phone with which we can talk to a deepfake’, Van Minnen says. ‘There are concerns that people will use deepfakes too much, but I wonder what counts as too much. Is visiting someone’s grave weekly also too much?’
Van Minnen expects deepfake technology to be useful when you need closure with someone and are struggling because of it. ‘That person can also be an ex who treated you badly, or a bully. Someone you hesitate to confront in real life. In those situations the application can be very helpful.’