3. Current Approaches to Countering Gendered Disinformation and Addressing Gender Dimensions of Disinformation

Updated: April 2, 2021


The field of gender-sensitive counter-disinformation programming is still emerging, and programming that explicitly centers the problem of gendered disinformation and the gendered impacts of disinformation is rare. Across the democracy, gender, and technology sectors, awareness and understanding of the nuanced and varied ways in which disinformation and gender programming intersect is limited, albeit growing. To illustrate the variety of ways in which a gender lens can be brought to bear on counter-disinformation programming, programmatic examples that include gender elements are mainstreamed throughout the thematic sections of this guidebook. To complement those examples, this section applies what works in related programming areas to outline ways in which gender can be further integrated into counter-disinformation programming. For example, promising practices for gender-sensitive counter-disinformation programming can be drawn from good practices in development and humanitarian aid programs focused on gender-based violence and gender equity.

Focused on direct attacks and online gender-based violence

Existing programming to counter gendered disinformation is largely focused on preventing, identifying, and responding to direct attacks that target women or people with diverse sexual orientations and gender identities as the subjects of gendered disinformation. These programs often focus narrowly on women politicians and journalists as the targets of these attacks. This type of programming includes a variety of responses, such as reporting and removal of content from platforms, fact-checking or myth-busting, digital safety and security training and skills-building, and media and information literacy for women, girls, and LGBTI communities. Similarly, the existing body of research on gendered disinformation is largely centered on diagnosing these direct attacks, the motivations of their perpetrators, and the harms such attacks cause. While these remain critical areas for programming and research funding, such interventions are necessary but not sufficient. Donors and implementers must also pursue programming that addresses other dimensions of gender and disinformation.

To better inform the design and delivery of effective and sustainable interventions to counter gendered disinformation, as well as to mitigate the gendered impacts of disinformation more broadly, researchers must also broaden their focus to investigate such topics as: 

  • The different ways in which women, girls, men, boys, and people with diverse sexual orientations and gender identities engage with the digital information ecosystem
  • The risk factors for and protective factors against perpetrating or being targeted by gendered disinformation
  • Women as perpetrators of—or otherwise complicit parties to—disinformation, hate speech, and other forms of harmful online campaigns

Informative programming in this space might include digital landscape mapping, gender and technology assessments to identify gaps in access and skills, focus group discussions, community engagement, and public opinion research. This type of programming will enable practitioners to better understand the diverse ways in which these different groups interact with the digital information space, may be vulnerable to being targeted by disinformation or susceptible to perpetrating disinformation, and are affected by the impacts of disinformation. 

 

More reactive than proactive, more ad hoc than systematic

As noted in other sections of the guidebook, one way to characterize counter-disinformation programming is to categorize approaches as either proactive or reactive.

Proactive programming refers to interventions that seek to prevent the creation and spread of gendered disinformation before it enters the digital information space. It may also include efforts to strengthen the resilience of those likely to be targeted by gendered disinformation or those susceptible to becoming perpetrators of it. This can include a broad array of interventions, such as media and information literacy; confidence- and resilience-building; gender equality programming; civic and political participation programming; and education, workforce development, and livelihoods programming.

Reactive programming might include interventions that respond to gendered disinformation after it has already been disseminated, such as reporting content to platforms or law enforcement for removal or investigation, or fact-checking and responsive messaging to counter false or problematic content.

Some gender-sensitive counter-disinformation interventions are both reactive and proactive: they respond to discrete cases of gendered disinformation as they emerge and also aim to deter would-be perpetrators. Examples include platform- or industry-wide policies and approaches for identifying, tagging, or removing content; legislation to criminalize hate speech, online gender-based violence, and other harmful or problematic content; and regulation of platform responses to gendered disinformation.

Reactive approaches tend to be more ad hoc and immediate or short-term by nature, attempting to stamp out discrete disinformation campaigns or attacks as they emerge. Some proactive approaches are also ad hoc, such as programs built around one-off training sessions, classes, mobile games, or other toolkits for digital safety and security or media and information literacy. However, many proactive approaches (and some responses that are both reactive and proactive) are more systematic or long-term, aiming to transform gender norms, increase democratic participation, create long-term social and behavior change, create safer online spaces for women, girls, and people with diverse sexual orientations and gender identities, and build the resilience of individuals, communities, and societies to withstand disinformation attacks and campaigns.

Much of the existing programming to counter gendered disinformation is reactive and ad hoc, designed to respond to gendered disinformation and address its impacts after it has already been pushed into the digital environment. Reactive interventions, such as content tagging or removal, fact-checking, myth-busting, or otherwise correcting the record in response to direct attacks, are generally insufficient to reverse the harms caused by gendered disinformation, which range from reputational damage and self-censorship to withdrawal from public and digital spaces and the sowing of distrust and discord.

Indeed, as scholars and practitioners in this field note, much of the damage has already been done by the time responses to gendered disinformation are deployed.6

As with most gender-related programming, both reactive and proactive approaches have important uses in countering gendered disinformation. To ensure that disinformation prevention and response programming is both effective and sustainable, however, it is imperative that the donor and implementer communities invest in proactive, not just reactive, gender-sensitive counter-disinformation programming. A major challenge is that the results of gender-transformative programming, and of programming designed to strengthen protective factors against disinformation, are typically measured in generational shifts rather than the two- to five-year periods of most donor funding streams. Accommodating this more holistic approach would require donors to rethink the typical structure of their funding mechanisms and reporting requirements.

Footnotes

6. As discussed during the Center for Democracy and Technology’s closed Research Workshop on Disinformation: Understanding the Impacts in Terms of Race and Gender, September 25, 2020.