ABSTRACT
Crowdsourcing platforms are changing the way people work and earn money. The worker population on these platforms already numbers in the millions and keeps growing. Workers on these platforms face several usability challenges, which we identify in this work by running two surveys on the CrowdFlower platform. Our surveys show that the majority of workers spend more than 25% of their time searching for tasks to work on. Limitations in the current user interface of the task listing page prevent workers from devoting more of that time to task execution. In this work we present the design and implementation of a dedicated task-listing user interface that helps workers spend less time searching for tasks and navigate among them more easily.
Toward effective tasks navigation in crowdsourcing