Survivor Safety: Supporting Survivors of Nonconsensual Image Sharing

Jan 13th, 2022

By Grace Rupp

Technology is an invaluable part of modern life and an essential tool for social communication, especially during the ongoing Covid-19 pandemic. Most Americans, across identity markers, actively use both the Internet and mobile devices, with 95 percent of Americans owning a cell phone (Anderson, 2015) and 89 percent using the Internet (Pew Research Center, 2018). While technology offers countless ways to connect, it has also created new ways to perpetrate sexual and intimate partner violence. A growing area of technology-facilitated abuse is image-based sexual abuse, which includes abusive behaviors like deepfakes, doxing, ‘upskirting,’ cyber flashing, sexualized photoshopping, and nonconsensual image sharing, also referred to as revenge porn. Many of these behaviors also involve the sharing of survivors’ personally identifying information. The inclusion of such information, along with sexually explicit content, is not only a violation of personal privacy but also poses a significant risk of further abuse, stalking, and harassment.

One particularly prevalent and increasingly pervasive form of image-based abuse is nonconsensual image sharing, also referred to as ‘revenge porn,’ defined as the distribution of nude, intimate, or sexually explicit images or videos of another person without their explicit consent or stated permission. While the term ‘revenge porn’ is commonly used to describe nonconsensual image sharing, other terminology better recognizes the harm experienced by survivors of this type of sexual violence. ‘Revenge porn’ is the language of Maryland law, but the term can shift blame to victims rather than emphasizing the harm committed.

Similar to other forms of sexual violence, revenge porn’s purpose is to intimidate, control, coerce, and even humiliate survivors. Using the language of revenge porn implies consent, suggesting that images were created for public consumption, rather than focusing on the abusive behavior of the perpetrator. The term also suggests there is a justification for sharing intimate images without consent, that there is something to be avenged, implying that survivors have done something to warrant or provoke such abuse. Other language, such as nonconsensual image sharing and image-based abuse, better recognizes the harm done to the survivor, capturing the seriousness and potential trauma that results from a violation of someone’s personal sexual privacy.

Nonconsensual image sharing was becoming increasingly commonplace even before the pandemic, and rates of image-based abuse have risen since. Conservative estimates suggest that one in twenty-five Americans experiences threats or posts of intimate images without their permission (Lenhart, 2016). Other estimates suggest that between 18 and 37 percent of adult Americans experience online harassment, which includes threats of physical violence, sexual harassment, stalking, and image-based abuse (Anti-Defamation League, 2019; Duggan, 2017). More specifically, 1 in 12 adult US social media users have experienced nonconsensual image sharing (Ruvalcaba and Eaton, 2020); young adults are overrepresented among those experiencing technology-facilitated abuse, with 45 percent of people aged 18 to 29 having experienced severe online harassment (Anti-Defamation League, 2019; Duggan, 2017). Recent high-profile cases in Maryland include Cambridge Mayor Andrew Bradshaw, arrested on November 15, 2021, and charged with fifty counts of violating the Maryland law that bans revenge porn (Gaskill, 2021).

Nonconsensual image sharing is sexual violence, and the trauma experienced by survivors is pervasive and similar to that experienced by survivors of sexual violence in the physical world. People who are experiencing technology-facilitated abuse might not recognize that they have been victimized or even know where to go or who to contact for help. Survivors of nonconsensual image sharing and technology-facilitated abuse report experiencing symptoms of serious psychological distress and trauma, including feelings of isolation, guilt, anger, and worthlessness (Short et al., 2015). Survivors also report suffering from depression, anxiety, and PTSD symptoms, and engaging in self-harming behaviors at higher rates than the rest of the population (Bates, 2017; International Centre for Missing and Exploited Children, 2018; U.S. Department of Justice, 2016). While technology-facilitated abuse and nonconsensual image sharing are traumatic examples of sexually violent behavior, help and support are available across the state for survivors.

Nonconsensual image sharing is traumatic, and seeking out supportive measures and professional help is one way of managing safety and beginning the healing process. One possible avenue of support is reaching out to professionals across the state, including rape crisis centers and MCASA’s Sexual Assault Legal Institute (SALI). Rape crisis centers across the state have trained hotline staff available 24/7 to answer questions or just to listen to your concerns; advocates can also provide emotional and technical support, specifically around safety planning, documenting image-based abuse, and the actions required to remove harmful images. SALI advocates and attorneys can also provide support through the process of documenting image-based abuse, safety planning, and possibly taking criminal or civil action.

There are also trained professionals and advocacy organizations who specialize in supporting survivors of technology-facilitated abuse, specifically the Cyber Civil Rights Initiative and the Coalition Against Stalkerware, both of which provide technology-specific support. In addition to professional support, many survivors of nonconsensual image sharing also benefit from other social support, including seeking out friends, family, or other supportive people to help in the aftermath of image-based abuse. Given the trauma produced by such abusive behavior, survivors may feel a range of emotions, including confusion; having a trusted support person who can offer empathy and understanding may prove invaluable.

While professional support may prove invaluable for some survivors, preventative and response measures can also be taken independently and privately. More specifically, documenting and reporting nonconsensual image sharing is one possible response. Evidence of digital abuse, specifically nonconsensual image sharing, is frequently time sensitive, easily deleted, and difficult to track; as such, documenting and reporting abuse is one way of preserving evidence, which may be helpful should legal action or police involvement be necessary. Frequently, nonconsensual image sharing also violates the terms of service of the host site, meaning that takedown requests reported to the site may result in the removal of abusive content.

Although it can be a lengthy process, in-app reporting may result in the removal of sexually explicit content. For example, most social media sites and online spaces do not allow nudity or harassing content; as such, reporting abusive content and users may result in its removal. In addition to in-app reporting, survivors can also submit requests to Google to remove links to abusive content from its search results. While documenting abuse and submitting takedown requests can be lengthy processes, they are one way of removing abusive content.

It is important to note that the use of technology is not the problem; abusive behavior is the problem. While technology can be used to abuse, it can also be used to enhance and maintain safety, decrease isolation, and support virtual communities. The following additional resources are available to support survivors of nonconsensual image sharing and image-based abuse:





Anderson, Monica, “Technology Ownership: 2015,” Pew Research Center, October 29, 2015. 

Anti-Defamation League, “Online Hate and Harassment: The American Experience,” 2019.

Bates, Samantha, “Revenge Porn and Mental Health: A Qualitative Analysis of the Mental Health Effects of Revenge Porn on Female Survivors,” Feminist Criminology, Vol. 12, No. 1, 2017, pp. 22–42.

Duggan, Maeve, Online Harassment 2017, Washington, D.C.: Pew Research Center, July 11, 2017.

Eaton, Asia A., Holly Jacobs, and Yanet Ruvalcaba, 2017 Nationwide Online Study of Nonconsensual Porn Victimization and Perpetration: A Summary Report, Coral Gables, Fla.: Cyber Civil Rights Initiative, July 2017.

Gaskill, H., “Cambridge Mayor Facing Revenge Porn Distribution Charges,” Maryland Matters, November 15, 2021.

International Centre for Missing and Exploited Children, Studies in Child Protection: Sexual Extortion and Nonconsensual Pornography, Alexandria, Va., 2018.

Lenhart, Amanda, “Nonconsensual Image Sharing,” Center for Innovative Public Health Research, March 2016.

Pew Research Center, “Social Media Use in 2018,” March 2018.

Ruvalcaba, Yanet, and Asia A. Eaton, “Nonconsensual Pornography Among U.S. Adults: A Sexual Scripts Framework on Victimization, Perpetration, and Health Correlates for Women and Men,” Psychology of Violence, Vol. 10, No. 1, 2020, pp. 68–78.

Short, Emma, Andrew Guppy, Jacqui A. Hart, and James Barnes, “The Impact of Cyberstalking,” Studies in Media and Communication, Vol. 3, No. 2, 2015, pp. 23–37.

U.S. Department of Justice, The National Strategy for Child Exploitation Prevention and Interdiction, Washington, D.C., April 2016.
