Abstract
Microtask crowdsourcing holds great potential as an employment opportunity, offering the flexibility and anonymity that individuals with disabilities may require. Although prior research has explored the accessibility of crowd work, the lived experiences of the broader community of crowd workers with disabilities remain largely under-explored, especially how those experiences compare with those of workers without disabilities. In this work, we aim to develop a deeper understanding of the microtask crowdsourcing experience for people with disabilities, particularly their financial and social experiences of participating in crowd work and the benefits and challenges they encounter through this work. Specifically, we first surveyed 1,200 crowd workers, both with and without disabilities, about their experiences on the Amazon Mechanical Turk platform; the differences we found informed the design of a follow-up survey to gain a deeper understanding of the crowd work experience of workers with disabilities. Our findings reveal that workers with disabilities derive unique benefits from crowd work, such as a greater sense of purpose, but also encounter many challenges, such as completing tasks on time and earning a livable wage, leading them to turn to online communities for assistance. Although many of these challenges are not unique to crowd workers with disabilities, such workers may be disproportionately affected by them. Based on our findings, we offer implications for crowd platforms, and for the gig economy as a whole, that promote greater consideration of workers with a diverse range of conditions and a more valuable work experience for them.
Index Terms
- Understanding the Microtask Crowdsourcing Experience for Workers with Disabilities: A Comparative View