ABSTRACT
Preparing complex jobs for crowdsourcing marketplaces requires careful attention to workflow design, the process of decomposing jobs into multiple tasks, which are solved by multiple workers. Can the crowd help design such workflows? This paper presents Turkomatic, a tool that recruits crowd workers to aid requesters in planning and solving complex jobs. While workers decompose and solve tasks, requesters can view the status of worker-designed workflows in real time; intervene to change tasks and solutions; and request new solutions to subtasks from the crowd. These features lower the threshold for crowd employers to request complex work. During two evaluations, we found that allowing the crowd to plan without requester supervision is partially successful, but that requester intervention during workflow planning and execution improves quality substantially. We argue that Turkomatic's collaborative approach can be more successful than the conventional workflow design process and discuss implications for the design of collaborative crowd planning systems.
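The recursive decompose-and-solve loop described above can be sketched as follows. This is a minimal illustration, not the actual Turkomatic implementation: the function names, the `Task` structure, and the simulated crowd workers are all hypothetical, standing in for real HITs posted to a microwork marketplace.

```python
# Hypothetical sketch of a Turkomatic-style recursive workflow:
# workers either decompose a task into subtasks or solve it directly,
# and solutions to subtasks are merged back up the tree.
# All names here are illustrative, not from the real system.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Task:
    prompt: str
    subtasks: List["Task"] = field(default_factory=list)
    solution: str = ""


def run_task(task: Task,
             decompose: Callable[[str], List[str]],
             solve: Callable[[str], str],
             merge: Callable[[List[str]], str],
             depth: int = 0,
             max_depth: int = 2) -> str:
    """Ask the (simulated) crowd to split the task or solve it directly."""
    parts = decompose(task.prompt) if depth < max_depth else []
    if not parts:
        # Simple enough: a worker solves it directly.
        task.solution = solve(task.prompt)
    else:
        # Complex: recurse on worker-proposed subtasks, then merge.
        task.subtasks = [Task(p) for p in parts]
        for sub in task.subtasks:
            run_task(sub, decompose, solve, merge, depth + 1, max_depth)
        task.solution = merge([s.solution for s in task.subtasks])
    return task.solution


# Simulated crowd workers (placeholders for real marketplace tasks):
decompose = lambda p: [f"{p} (part {i})" for i in (1, 2)] if "essay" in p else []
solve = lambda p: f"draft text for {p!r}"
merge = lambda sols: " ".join(sols)

result = run_task(Task("write an essay"), decompose, solve, merge, max_depth=1)
```

In the real system each of these steps would be a HIT answered by a different worker, and a requester could inspect the tree of subtasks while it grows, edit a poor decomposition, or repost a subtask for a new solution.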