7. Monitoring and Reporting Inside Closed Groups and Messaging Apps
Written by: Claire Wardle
Claire Wardle leads the strategic direction and research for First Draft, a global non-profit that supports journalists, academics and technologists working to address challenges relating to trust and truth in the digital age. She has been a Fellow at the Shorenstein Center for Media, Politics and Public Policy at Harvard's Kennedy School, the Research Director at the Tow Center for Digital Journalism at Columbia University's Graduate School of Journalism and head of social media for UNHCR, the United Nations Refugee Agency.
In March 2019, Mark Zuckerberg talked about Facebook’s “pivot to privacy,” which meant the company would emphasize Facebook groups, recognizing that people are increasingly drawn to communicating with smaller numbers of people in private spaces. Over the last few years, the importance of smaller groups for social communication has been clear to those of us working in this space.
In this chapter, I will explain the different platforms and applications, talk about the challenges of monitoring these spaces, and end with a discussion of the ethics of doing this type of work.
Different platforms and applications
Recent research by We Are Social shows the continuous dominance of Facebook and YouTube, but the next three most popular platforms are WhatsApp, FB Messenger and WeChat.
In many regions around the world, chat apps, particularly WhatsApp, are now the dominant source of news for many consumers, for example in Brazil, India and Spain.
Certainly, WhatsApp and FB Messenger are popular globally, but in certain countries, alternatives are dominant. For example, in Iran, it is Telegram. It’s Line in Japan, KakaoTalk in South Korea and WeChat in China.
All these sites have slightly different functionality, in terms of encryption, group or broadcast features, and additional options such as in-app commerce opportunities.
Closed Facebook groups
There are three types of Facebook Groups: Open, Closed and Hidden.
- Open groups can be found in search and anyone can join.
- Closed groups can be found in search but you have to apply to join.
- Hidden groups cannot be found in search and you have to be invited to join.
Increasingly, people are congregating in Facebook groups, partly because Facebook’s algorithm is pushing them there, but also because people are choosing to spend time with people they already know, or with people who share their perspectives or interests.
According to Statista, in July 2019, Discord had 250 million monthly active users (for comparison, Snap had 294 million, Viber had 260 million and Telegram had 200 million). Discord is popular with the gaming community, but in recent years, it has also become known as a site where people congregate in “servers” (a form of group in Discord) to coordinate disinformation campaigns.
One aspect of Discord and some closed Facebook groups is that you will be asked questions before you are accepted into that group. These questions might be about your profession, your religion, your political beliefs or your attitudes toward certain social issues.
Encryption, groups and channels
One reason these platforms and applications have become so popular is that they offer different levels of encryption. WhatsApp and Viber are currently the most secure, offering end-to-end encryption by default. Others, like Telegram, FB Messenger and Line, offer encryption only if you turn it on.
Certain apps have groups or channels where information is shared with large numbers of people. The largest WhatsApp group can hold 256 people. FB Messenger groups hold 250. In Telegram, a group can be private or publicly searchable, and can hold 200. Once it hits that number it can be converted into a supergroup, which up to 75,000 people can join. Telegram also has channels, a broadcast capability inside the app. You can subscribe to a channel and see what’s being posted there, but you can’t post your own content in response.
There is no doubt that misinformation circulates on closed messaging apps. It is difficult to independently assess whether there is more misinformation on these platforms than on social media sites, because there is no way of seeing what is being shared. But we know it is a problem, as high-profile cases from India, France and Indonesia have shown us. And in the U.S., during the shootings in El Paso and Dayton in August 2019, there were examples of rumors and falsehoods circulating on Telegram and FB Messenger.
The question is whether journalists, researchers, fact-checkers, health workers and humanitarians should be in these closed groups to monitor for misinformation. If they should be in these groups, how should they go about their work in a way that is ethical and keeps them safe?
While there are significant challenges to doing this work, it is possible. However, keep in mind that many people who use these apps are doing so specifically so they will not be monitored. They use them because they are encrypted. They expect a certain level of privacy. This should be central to anyone working in these spaces. Even though you can join and monitor these spaces, it’s paramount to be aware of the responsibility you have to the participants in these groups, who often do not understand what is possible.
Techniques for searching
Searching for these groups can be difficult, as there are different protocols for each. For Facebook groups, you can search for topics within Facebook search and filter by groups. If you want to use more advanced Boolean search operators, search on Google using your keywords and then add site:facebook.com/groups.
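To illustrate the technique above, here is a small sketch that builds a Google query combining Boolean operators with the site:facebook.com/groups filter. The function name and the sample keywords are hypothetical, chosen just for this example; adapt the terms to your own story.

```python
from urllib.parse import quote_plus

def facebook_group_query(keywords, excluded=None):
    """Build a Google search URL restricted to Facebook group pages.

    Keywords are OR-combined in quotes; excluded terms get a '-' prefix,
    Google's exclusion operator.
    """
    parts = ["site:facebook.com/groups"]
    if keywords:
        parts.append("(" + " OR ".join(f'"{k}"' for k in keywords) + ")")
    for term in excluded or []:
        parts.append(f"-{term}")
    # quote_plus percent-encodes the query so it is safe in a URL
    return "https://www.google.com/search?q=" + quote_plus(" ".join(parts))

# Example: groups discussing vaccination, filtering out job-listing noise
print(facebook_group_query(["vaccine", "vaccination"], excluded=["jobs"]))
```

The same query string can, of course, be typed directly into Google; the script just makes it repeatable when you are monitoring several keyword sets.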
For Telegram, you can search inside the app on an Android phone, but not on an iPhone. There are also web directories such as https://www.telegram-group.com/. Similarly, for Discord there are sites such as https://disboard.org/search.
Decisions around joining and participating
As mentioned, some of these groups will ask questions before granting entry. Before trying this, you should talk to your editor or manager about how to answer these questions. Will you be truthful about who you are and why you are in the group? Is there a way to join while staying deliberately vague? If not, how can you justify the decision to hide your identity? (This might be necessary if identifying yourself as a journalist in a particular group could jeopardize your safety.) If you gain access, will you contribute in any way, or just “lurk” to find information you can corroborate elsewhere?
Decisions about automatically collecting content from groups
It is possible to find “open” groups by searching for links that are posted to other sites. These then appear in search engines. It is then possible to use computational methods to automatically collect the content from these groups. Researchers monitoring elections in Brazil and India have done this, and I know anecdotally of other organizations doing similar work.
This technique allows organizations to monitor multiple groups simultaneously, which is often impossible otherwise. A key point is that only a small percentage of groups are findable this way, and they tend to be groups desperate for wide membership, so are not representative of all groups. It also raises ethical flags for me personally. However, there are guardrails that can be employed by securing the data, not sharing with others, and de-identifying messages. We need cross-industry protocols about doing this type of work.
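One of those guardrails, de-identifying messages before storing them, can be sketched in a few lines. This is an illustrative approach rather than an established standard: it replaces each sender identifier with a salted hash, so analysts can still group messages by sender without ever seeing who the sender is, and it scrubs phone numbers from message text. The function names and the regular expression are assumptions for this sketch.

```python
import hashlib
import re

# A secret salt prevents anyone from re-deriving identities by hashing
# known phone numbers. Keep it out of version control and shared datasets.
SALT = b"replace-with-a-secret-random-salt"

def pseudonymize(identifier: str) -> str:
    """Map a phone number or username to a stable, non-reversible pseudonym."""
    digest = hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()
    return "user-" + digest[:12]

# Rough pattern for phone-number-like strings inside message text
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def deidentify_message(sender: str, text: str) -> dict:
    """Return a record safe to store: pseudonymized sender, scrubbed text."""
    return {
        "sender": pseudonymize(sender),
        "text": PHONE_RE.sub("[phone removed]", text),
    }

record = deidentify_message("+55 11 91234-5678", "Call me on +55 11 91234-5678")
print(record)
```

Because the hash is stable, the same sender always maps to the same pseudonym, which preserves the analytical value of the data (how many messages came from one account) while removing the identity itself.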
The other technique is to set up a tipline, where you encourage the public to send you content. The key to a tipline is having a simple, clear call to action, and explaining how you intend to use that content. Is it simply for monitoring trends, or are you going to reply with a debunk once you’ve investigated what they’ve sent you?
Returning to the ethical questions, which impact so much around working with closed messaging apps, it’s important that you’re not just “taking” content, or in other words being extractive. And putting ethics aside for one minute, all the research shows that if audiences don’t know how their tips are being used, they are significantly less likely to keep sending them in. People are more willing to help if they feel like they’re being treated like partners.
The other aspect, however, is how easy it is to game tiplines by sending in hoax content, or by one individual or small group sending in lots of the same content to make it appear to be a bigger problem than it is.
Ethics of reporting from closed messaging groups
Once you’ve found content, the question is how to report on it. Should you be transparent about how you found it? As part of their community guidelines, many groups ask that what is discussed in the group not be shared more widely. If the group is full of disinformation, what will be the impact of your reporting on it? Can you corroborate what you have found in other groups or online spaces? If you report, might you put your own safety, or that of your colleagues or family, at risk? Remember that doxxing journalists and researchers (or worse) is part of the playbook for some of the darker groups online.
Reporting from and about closed messaging apps and groups is full of challenges, yet those sources will become increasingly important as spaces where information is shared. As a first step, think about the questions outlined in this chapter, talk to your colleagues and editors, and if you don’t have guidelines in your newsroom about this type of reporting, start working on some. There are no standard rules on how to do this. It depends on the story, the platform, the reporter and a newsroom’s editorial guidelines. But it is important that all the details are considered before you start this kind of reporting.