ABSTRACT
Micro-task platforms provide a marketplace for hiring people to do short-term work for small payments. Requesters often struggle to obtain high-quality results, especially on content-creation tasks, because work cannot be easily verified and workers can move on to other tasks without consequence. Such platforms also give workers little opportunity to reflect on and improve their task performance. Timely, task-specific feedback can help crowd workers learn, persist, and produce better results. We analyze the design space for crowd feedback and introduce Shepherd, a prototype system for visualizing crowd work, providing feedback, and promoting workers into shepherding roles. This paper describes our current progress and our plans for system development and evaluation.