SICSA Remote Collaboration Activities – Recruiting Participants

13 September 2022, by Isa Inuwa-Dutse from the University of Hertfordshire and Salma ElSayed from Abertay University

Introduction/Overview

Crowdsourcing is the practice of engaging a large, diverse group of individuals to work on a paid or voluntary activity submitted by requesters. The individuals, or participants, are not constrained by geographical location. Prolific is an online crowdsourcing platform that enables the recruitment of participants to engage in a paid activity or task.

The Support for Remote Collaboration Activities fund provided the means to reward participants in our studies, which positively impacted the turnout. We would like to share our experiences with Prolific as a crowdsourcing platform for recruiting participants and how this facilitated our research projects. For a comprehensive understanding, see the quick guide offered by Prolific.

How you engage with participants varies depending on the nature of the study. We share our experiences (as researchers) and relevant tips for recruiting participants. We refer to the standard approach as the classic way of recruiting N participants via Prolific to take part in a study over a given period. The multi-participant approach requires some degree of coordination because the study participants are required to engage in the study activity simultaneously.

Standard Study – Isa Inuwa-Dutse, University of Hertfordshire.

This study falls within explainable AI (XAI), and the goal is to study the impact of explanation on social loafing. As a crowdsourcing platform, Prolific offers a rich array of features to enable the recruitment of a diverse range of participants. One of the commonly used approaches is the classic way of recruiting N participants to engage in an activity (asynchronously) over a given period. We refer to this method as the standard approach in this report. The main areas to focus on include:

  • Task description

Typically, a study or task is designed elsewhere, such as on Qualtrics, and a link to the task is shared with the recruited participants. On completion of the task, participants are returned to Prolific for payment, pending a successful review by the researcher(s). It is useful to describe the task succinctly to the participants and to state what is expected of them.
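
To make the hand-off concrete, here is a minimal Python sketch of how an externally hosted study could pick up the participant ID passed along in the study URL and then send participants back via a completion link. The URL template, parameter names, and completion code below are illustrative assumptions, not a definitive recipe, and should be checked against Prolific's current documentation and your own study settings.

```python
# Minimal sketch of the Prolific hand-off, assuming the study URL is
# configured to pass the participant ID as a query parameter and that
# participants are redirected back via a completion URL with a
# study-specific completion code. All values here are placeholders.
from urllib.parse import urlparse, parse_qs

# Hypothetical study URL as it might be entered in the study setup.
STUDY_URL_TEMPLATE = (
    "https://example.qualtrics.com/jfe/form/SV_XXXX"
    "?PROLIFIC_PID={{%PROLIFIC_PID%}}&SESSION_ID={{%SESSION_ID%}}"
)

# Hypothetical completion redirect used at the end of the survey.
COMPLETION_URL = "https://app.prolific.co/submissions/complete?cc=ABC123"

def extract_prolific_id(incoming_url: str) -> str | None:
    """Pull the participant ID out of the URL a participant arrives with."""
    params = parse_qs(urlparse(incoming_url).query)
    return params.get("PROLIFIC_PID", [None])[0]

if __name__ == "__main__":
    # Example of a URL a participant might arrive with.
    url = "https://example.qualtrics.com/jfe/form/SV_XXXX?PROLIFIC_PID=pid42&SESSION_ID=s1"
    print(extract_prolific_id(url))            # -> pid42
    print("Redirect on completion:", COMPLETION_URL)
```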

  • Task duration, bonus, review and making payment

Participants should be informed about the average completion time of the activity beforehand. This is useful, especially when reviewing the responses from the participants. For instance, if the average completion time is 15 minutes and a participant spent under 5 minutes, you would want to carefully ascertain whether that participant heeded the instructions. The inclusion of attention-check questions is also crucial in this regard.
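
As an illustration, the following Python sketch flags submissions whose recorded completion time is well below the expected average. The column names are hypothetical stand-ins for a Prolific CSV export; the actual export headers should be checked before use.

```python
# A minimal sketch of a completion-time sanity check, assuming submissions
# were exported as a CSV with (hypothetical) columns "Participant id" and
# "Time taken" (in seconds). Flagged submissions still need manual review.
import csv

EXPECTED_MINUTES = 15      # average completion time communicated to participants
FLAG_THRESHOLD_MIN = 5     # anything faster than this gets a closer look

def flag_fast_submissions(path: str) -> list[str]:
    """Return participant IDs whose completion time looks suspiciously short."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            minutes = float(row["Time taken"]) / 60
            if minutes < FLAG_THRESHOLD_MIN:
                flagged.append(row["Participant id"])
    return flagged

if __name__ == "__main__":
    for pid in flag_fast_submissions("prolific_export.csv"):
        print(f"Review submission from {pid}: completed well under {EXPECTED_MINUTES} mins")
```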

  • Restarting and adding participants to a study

Sometimes, long after a study has completed, the need to recruit additional participants may arise. You may then encounter a no-response scenario, receiving no responses simply because the task is no longer visible to participants. The easiest way to resolve this is to contact the researcher help centre to restart the study and make it visible to participants again.

  • Miscellaneous

Multi-participant Study – Salma ElSayed, Abertay University, Dundee.

To study the impact of affective non-player characters in multi-player games, two participants had to play a web game slice at the same time and then fill in a questionnaire. We tried different approaches.

At first, we advertised the study using word of mouth and social media. This was before any funding, and we managed to recruit 12 testers in three months. It was well into COVID and the lockdowns, which heavily impacted running the experiments. Also, finding two subjects who were available at the same time was cumbersome and involved several rounds of communication, even with the use of Doodle for booking. The collected data was not enough for analysis.

It was then decided to pursue a small grant and try crowdsourcing with Prolific. The SICSA Remote Collaboration Activities fund enabled paying participants, and the experiment design was altered to accommodate the use of the platform. We created two studies on Prolific; the first had general information about the research, consent statements, and a Calendly link to book a slot from the available times. Participants were then matched on time slots and sent a Zoom link on the day/time they selected. They would anonymously join the call to play the web game with another participant. The second study was released to them after the play session and included a Qualtrics link to the survey and debriefing information. The use of Prolific was an attempt to reach more participants and collect more responses; however, the drop-off rate was high, and too many candidates would complete the consent and booking but never show up to play the game, even though confirmations were sent a day before the event. In two months, only 9 participants had properly completed the study.
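
For studies like this, the matching step can at least be partially automated. Below is a minimal Python sketch of how bookings might be paired per time slot; the data format (participant ID plus booked slot) is an illustrative assumption rather than anything exported directly by Prolific or Calendly.

```python
# A minimal sketch of matching bookings into pairs per time slot, assuming
# each booking is a (participant_id, slot) tuple. Participants left without
# a partner in a slot would need to be rebooked or substituted.
from collections import defaultdict

def pair_by_slot(bookings: list[tuple[str, str]]):
    """Group bookings by slot and return (pairs, unmatched participants)."""
    by_slot: dict[str, list[str]] = defaultdict(list)
    for participant, slot in bookings:
        by_slot[slot].append(participant)

    pairs, unmatched = [], []
    for slot, participants in by_slot.items():
        # Pair participants two at a time; an odd one out needs rebooking.
        for i in range(0, len(participants) - 1, 2):
            pairs.append((slot, participants[i], participants[i + 1]))
        if len(participants) % 2:
            unmatched.append((slot, participants[-1]))
    return pairs, unmatched

if __name__ == "__main__":
    bookings = [("p1", "Mon 10:00"), ("p2", "Mon 10:00"),
                ("p3", "Tue 14:00"), ("p4", "Mon 10:00")]
    pairs, unmatched = pair_by_slot(bookings)
    print(pairs)      # [('Mon 10:00', 'p1', 'p2')]
    print(unmatched)  # [('Mon 10:00', 'p4'), ('Tue 14:00', 'p3')]
```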

Given how slowly the process was going, and how this phase was exhausting its timeline, we decided it was best to pursue another path in parallel. We advertised again within the school but used part of the SICSA fund to offer Amazon vouchers as rewards. We managed to recruit 30 testers in a week, and since most were students or staff, the drop-off rate was marginal and it was easier to verify the booking on Calendly and arrange the Zoom call. We finally had a reasonable amount of data to analyse!

Prolific provides integration with several survey platforms, such as Gorilla, Google Forms, and MS Forms. Our surveys were hosted on Qualtrics, and it was a smooth process to have participants navigate between the two platforms. We faced some issues during the lifetime of the two studies, ranging from technical blips and participant drop-off to processing refunds. Prolific's customer service was an immense help. They have a very quick response rate and provide practical information and guidance. Being part of the Prolific Research Community also allowed for useful exchanges with peers around experiment design and Prolific features.

Overall, the Prolific experience was informative and provided good guidelines for designing and running studies and minimising the participant drop-off rate. The support pages are very organised and helpful. However, we feel that multi-participant scenarios require a non-standard approach. It was tedious to make sure the testers were available or to find a substitute tester on the fly. The game prototype currently accommodates two players, which facilitated arrangements, but for more than two players we expect that an elaborate scheme would need to be in place to handle the communication and coordination burden.

The unique scenario of needing several participants simultaneously should be considered and a more solid framework proposed.

We would like to thank SICSA for the Support for Remote Collaboration Activities funding, which facilitated recruiting participants and collecting data. We hope sharing our experiences and reflections will inspire and motivate fellow researchers.
