Crowdsourcing

Conversations with Data: #3

Hi there!

It’s time for another data conversation.

In case you missed our first conversation: in each edition, we invite you to share your thoughts, or ask an expert questions, about anything and everything data journalism.

Each edition is essentially a mini crowdsourcing project. So we thought we’d help ourselves out a little and crowdsource some advice about crowdsourcing from you.

What you said

It seems that success is all in how you ask the crowd for help.

Start by giving people simple, straightforward tasks, suggests Eliot Higgins of Bellingcat: "for example, Europol's 'Trace an Object' Stop Child Abuse campaign asked people to simply identify objects, a clear task with an obvious ending point. The biggest failures have been when large groups of people approach a complicated task with no oversight, the Reddit Boston Marathon Bombing investigation being the prime example of this".

Andrew Losowsky, Project Lead at The Coral Project, had similar thoughts.

He reminded us that "most crowdsourcing fails, and that's ok. If you want to create the most likely conditions for success, make sure it's well publicised, easy to understand, clearly connected to a compelling result, has an easy way for people to be notified when that result appears, and includes ways for you to address abuse by bad actors. It might still fail. Keep trying".

Timing is another factor to consider.

From Mozilla’s Vojtech Sedlak: "when soliciting ideas or opinions from our community, we found it important to identify an inflection point that attracts attention to the issue we want to learn about. Instead of just asking people out of the blue, we waited for a media story or an event that garnered public attention and only then launched our surveys".

Some other suggestions for enticing the crowd to contribute include partnering with civil society (Flor Coelho, LA NACION Data) and appealing to the crowd’s competitive instincts by presenting the data collection process as a game (Myf Nixon, mySociety’s Gender Balance project).

But be wary, says Stu Shulman: "one challenge is knowing when more is not helping".

It’s also important to remember that the way you ask for data can have implications for your analysis further down the line.

One of the frustrations with crowdsourcing, says ProPublica’s Peter Gosselin, is that "while it provides loads of sources, you can't use the group for quantitative purposes because of self-selection and non-randomness". If he could change anything about ProPublica’s IBM project, he’d restructure the questionnaires to cope with these problems.

Likewise, if Clare Blumer, from the ABC’s investigation into Australian aged care, could go back in time, she’d ‘fork’ the survey questions, so that people who were satisfied with their aged care experience and people who were dissatisfied would each be asked a different set of questions.

Because the data wasn’t separated in this way, the ABC struggled with the wrangling process. In the end, the data was duplicated into smaller spreadsheets so that journalists could read all the material that was submitted.
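To picture what that separation looks like in practice, here’s a minimal sketch in pandas that splits responses on a yes/no screening question and writes each group to its own spreadsheet. The file and column names are hypothetical, not the ABC’s actual data:

```python
# A minimal sketch: split crowdsourced survey responses on a
# screening question so each group can be read on its own terms.
# File and column names below are made up for illustration.
import pandas as pd

responses = pd.read_csv("aged_care_survey.csv")

# "Fork" the data on the satisfaction question.
satisfied = responses[responses["satisfied"] == "yes"]
dissatisfied = responses[responses["satisfied"] == "no"]

# Write each group to its own spreadsheet so reporters can read
# every submission without wading through the other group's answers.
satisfied.to_csv("responses_satisfied.csv", index=False)
dissatisfied.to_csv("responses_dissatisfied.csv", index=False)
```

Of course, as Blumer suggests, it’s far easier to build this fork into the survey itself than to untangle the answers afterwards.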

Looking for more tips? Check out this report from the Tow Center for Digital Journalism. It’s Roberto Rocha’s go-to whenever his newsroom at CBC Montreal wants to do a crowdsourced project.

Our next conversation

For our next conversation, we’re giving you the opportunity to ask a selection of Journalism 360 ambassadors questions about immersive storytelling. It’s a great time to think about experimenting in the area, with plenty of grants up for grabs.

Until next time,

Madolyn from the EJC Data team
