The following schedule is tentative and subject to change.

Date | Topic | Readings | Assignments & Project
Aug 20 Introduction & Course overview

Lecture
Aug 22 Crowdsourcing:
Background and applications

Lecture

Required

Howe. The Rise of Crowdsourcing. WIRED, June 2006.

Optional

von Ahn et al. reCAPTCHA: Human-Based Character Recognition via Web Security Measures. Science, September 2008

von Ahn and Dabbish. Labeling Images with a Computer Game. CHI'04

Cooper et al. Predicting Protein Structures with a Multiplayer Online Game. Nature, August 2010.

Assignment 1: Due before next class (Aug 26)
Aug 27 Crowd workers

Lecture

Required

Difallah et al. Demographics and Dynamics of Mechanical Turk Workers. WSDM'18

Yin et al. The Communication Network Within the Crowd. WWW'16

Optional

Martin et al. Being a Turker. CSCW'14

Gray et al. The Crowd is a Collaborative Network. CSCW'16

Aug 29 Crowdsourcing platform:
Amazon Mechanical Turk

Lecture

Creating a HIT on MTurk (an illustrative boto3 sketch follows this entry)
No required readings. Bring laptops to class.

Assignment 2: Due on Sep 9

Sign up for presentations!
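
For the Aug 29 hands-on session, here is a minimal sketch of publishing a simple HIT programmatically with the boto3 MTurk client against the requester sandbox. It is illustrative only and not part of the course materials: the title, reward, question HTML, and preview link are placeholder choices, and it assumes AWS credentials are already configured locally.

```python
# Minimal sketch: publish one simple HIT to the MTurk requester sandbox with boto3.
# Assumes AWS credentials are configured locally (e.g., via `aws configure`);
# the title, reward, and question HTML below are illustrative placeholders.
import boto3

SANDBOX_ENDPOINT = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

mturk = boto3.client("mturk", region_name="us-east-1", endpoint_url=SANDBOX_ENDPOINT)
print("Sandbox balance:", mturk.get_account_balance()["AvailableBalance"])

# An HTMLQuestion wraps the HTML page workers see inside the MTurk task frame.
# The form must post the assignmentId back to MTurk's externalSubmit endpoint.
QUESTION_XML = """
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html><body>
      <form action="https://workersandbox.mturk.com/mturk/externalSubmit" method="post">
        <p>Is the following sentence positive or negative?</p>
        <p><em>"The instructions were clear and the task was easy."</em></p>
        <input type="hidden" name="assignmentId" id="assignmentId" value="">
        <label><input type="radio" name="sentiment" value="positive"> Positive</label>
        <label><input type="radio" name="sentiment" value="negative"> Negative</label>
        <p><input type="submit" value="Submit"></p>
      </form>
      <script>
        // MTurk passes assignmentId as a URL parameter; echo it back on submit.
        document.getElementById("assignmentId").value =
          new URLSearchParams(window.location.search).get("assignmentId") || "";
      </script>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>
"""

response = mturk.create_hit(
    Title="Label the sentiment of a sentence (sandbox demo)",
    Description="Read one short sentence and mark it positive or negative.",
    Keywords="sentiment, labeling, demo",
    Reward="0.05",                    # USD per assignment, passed as a string
    MaxAssignments=3,                 # number of distinct workers per HIT
    LifetimeInSeconds=3600,           # how long the HIT stays listed
    AssignmentDurationInSeconds=300,  # time a worker has to finish one assignment
    Question=QUESTION_XML,
)
hit = response["HIT"]
print("HIT ID:", hit["HITId"])
print("Preview:", "https://workersandbox.mturk.com/mturk/preview?groupId=" + hit["HITTypeId"])
```

The sandbox charges nothing and is the safer place to experiment in class; pointing endpoint_url, the form action, and the preview link at the production hosts would publish a real, paid HIT.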
Sep 3 No class (Labor Day)
Sep 5 Crowdsourcing platform:
Tasks and dynamics

Lecture

Required

Difallah et al. The Dynamics of Micro-Task Crowdsourcing: The Case of Amazon MTurk. WWW'15

Optional

Gadiraju et al. A Taxonomy of Microtasks on the Web. HT'14

Vakharia and Lease. Beyond Mechanical Turk: An Analysis of Paid Crowd Work Platforms. iConference'15

Jain et al. Understanding Workers, Developing Effective Tasks, and Enhancing Marketplace Dynamics: A Study of a Large Crowdsourcing Marketplace. VLDB Endowment, March 2017

Sep 10 Crowdsourcing:
Opportunities and challenges

Lecture

Required

Kittur et al. The Future of Crowd Work. CSCW'13

Optional

Quinn and Bederson. Human Computation: A Survey and Taxonomy of a Growing Field. CHI'11

Gadiraju et al. Human Beyond the Machine: Challenges and Opportunities of Microtask Crowdsourcing. IEEE Intelligent Systems, July 2015

Sep 12 Incentive design:
Financial incentives

Required

Ho et al. Incentivizing High Quality Crowdwork. WWW'15

Optional

Mason and Watts. Financial Incentives and the "Performance of Crowds". HCOMP'09

Yin et al. The Effects of Performance-Contingent Financial Incentives in Online Labor Markets. AAAI'13

Harris. The Effects of Pay-to-Quit Incentives on Crowdworker Task Quality. CSCW'15

Sep 17 Incentive design:
Other incentives

Required

Rogstadius et al. An Assessment of Intrinsic and Extrinsic Motivation on Task Performance in Crowdsourcing Markets. ICWSM'11

Shaw et al. Designing Incentives for Inexpert Human Raters. CSCW'11

Optional

Nov et al. Scientists@Home: What Drives the Quantity and Quality of Online Citizen Science Participation? PLOS ONE, April 2014

Feyisetan and Simperl. Social Incentives in Paid Collaborative Crowdsourcing. TIST, September 2017

Sep 19 Incentive design:
Intelligent management

Required

Yin and Chen. Bonus or Not? Learn to Reward in Crowdsourcing. IJCAI'15

Optional

Gao and Parameswaran. Finish Them!: Pricing Algorithms for Human Computation. VLDB Endowment, October 2014

Feyisetan et al. Improving Paid Microtasks through Gamification and Adaptive Furtherance Incentives. WWW'15

Final project: Guidelines out; proposal due on Oct 3
Sep 24 Task assignment and recommendation

Required

Ho and Vaughan. Online Task Assignment in Crowdsourcing Markets. AAAI'12

Difallah et al. Pick-a-Crowd: Tell Me What You Like, and I'll Tell You What To Do. WWW'13

Optional

Lin et al. Signals in the Silence: Models of Implicit Feedback in a Recommendation System for Crowdsourcing. AAAI'14

Mavridis et al. Using Hierarchical Skills for Optimized Task Assignment in Knowledge-Intensive Crowdsourcing. WWW'16

Sep 26 Quality assurance:
Empirical methods

Required

Dow et al. Shepherding the Crowd Yields Better Work. CSCW'12

Optional

Huang and Fu. Enhancing Reliability Using Peer Consistency Evaluation in Human Computation. CSCW'13

Sampath et al. Cognitively Inspired Task Design to Improve User Performance on Crowdsourcing Platforms. CHI'14

Doroudi et al. Toward a Learning Science for Complex Crowdsourcing Tasks. CHI'16

Oct 1 Quality assurance:
Algorithmic approaches

Required

Whitehill et al. Whose Vote Should Count More: Optimal Integration of Labels from Labelers of Unknown Expertise. NIPS'09

Optional

Oyama et al. Accurate Integration of Crowdsourced Labels Using Workers' Self-Reported Confidence Scores. IJCAI'13

Sunahase et al. Pairwise HITS: Quality Estimation from Pairwise Comparisons in Creator-Evaluator Crowdsourcing Process. AAAI'17

Wang et al. Obtaining High-Quality Label by Distinguishing between Easy and Hard Items in Crowdsourcing. IJCAI'17

Oct 3 Quality assurance:
Intelligent management

Required

Kamar et al. Combining Human and Machine Intelligence in Large-Scale Crowdsourcing. AAMAS'12

Optional

Bragg et al. Optimal Testing for Crowd Workers. AAMAS'16

Gurari and Grauman. CrowdVerge: Predicting If People Will Agree on the Answer to a Visual Question. CHI'17

Oct 8 No class (October break)
Oct 10 Final project: Pitch
Oct 15 Engagement control:
Empirical methods

Required

Yu et al. A Comparison of Social, Learning, and Financial Strategies on Crowd Engagement and Output Quality. CSCW'14

Law et al. Curiosity Killed the Cat, but Makes Crowdwork Better. CHI'16

Optional

Preist et al. Competing or Aiming to be Average?: Normification as a Means of Engaging Digital Volunteers. CSCW'14

Dai et al. And Now for Something Completely Different: Improving Crowdsourcing Workflows with Micro-Diversions. CSCW'15

Oct 17 Engagement control:
Intelligent management

Required

Segal et al. Intervention Strategies for Increasing Engagement in Crowdsourcing: Platform, Predictions, and Experiments. IJCAI'16

Optional

Segal et al. Optimizing Interventions via Offline Policy Evaluation: Studies in Citizen Science. AAAI'18

Oct 22 Workflow design:
Use-case specific workflows

Required

Bernstein et al. Soylent: A Word Processor with a Crowd Inside. UIST'10

Chilton et al. Cascade: Crowdsourcing Taxonomy Creation. CHI'13

Optional

Noronha et al. Platemate: Crowdsourcing Nutritional Analysis from Food Photographs. UIST'11

Kim et al. Crowdsourcing Step-by-Step Information Extraction to Enhance Existing How-to Videos. CHI'14

Oct 24 Workflow design:
General workflows

Required

Little et al. TurKit: Human Computation Algorithms on Mechanical Turk. UIST'10

Optional

Kittur et al. CrowdForge: Crowdsourcing Complex Work. UIST'11

Kulkarni et al. Collaboratively Crowdsourcing Workflows with Turkomatic. CSCW'12

Oct 29 Workflow design:
Intelligent management

Required

Dai et al. Decision-theoretic Control of Crowd-sourced Workflows. AAAI'10

Optional

Lin et al. Dynamically Switching between Synergistic Workflows for Crowdsourcing. AAAI'12

Bragg et al. Crowdsourcing Multi-Label Classification for Taxonomy Creation. HCOMP'13

Tran-Thanh et al. Crowdsourcing Complex Workflows under Budget Constraints. AAAI'15

Oct 31 Beyond independence:
Cooperative work examples

Required

Drapeau et al. MicroTalk: Using Argumentation to Improve Crowdsourcing Accuracy. HCOMP'16

Optional

Suzuki et al. Atelier: Repurposing Expert Crowdsourcing Tasks as Micro-Internships. CHI'16

Chang et al. Revolt: Collaborative Crowdsourcing for Labeling Machine Learning Datasets. CHI'17

Nov 5 Final project: Peer feedback (Session 1)
Nov 7 Final project: Peer feedback (Session 2)
Nov 12 Cooperative work:
Complex tasks

Required

Retelny et al. Expert Crowdsourcing with Flash Teams. UIST'14

Optional

Hahn et al. The Knowledge Accelerator: Big Picture Thinking in Small Pieces. CHI'16

Valentine et al. Flash Organizations: Crowdsourcing Complex Work by Structuring Crowds As Organizations. CHI'17

Nov 14 Cooperative work:
Intelligent management

Required

Salehi et al. Huddler: Convening Stable and Familiar Crowd Teams Despite Unpredictable Availability. CSCW'17

Optional

Singla et al. Learning to Hire Teams. HCOMP'15

Zhou et al. In Search of the Dream Team: Temporally Constrained Multi-Armed Bandits for Identifying Effective Team Structures. CHI'18

Nov 19 Crowd-Powered Systems

Required

Lasecki et al. Real-Time Captioning by Groups of Non-Experts. UIST'12

Kokkalis et al. MyriadHub: Efficiently Scaling Personalized Email Conversations with Valet Crowdsourcing. CHI'17

Optional

Vashistha et al. Respeak: A Voice-based, Crowd-powered Speech Transcription System. CHI'17

Nguyen et al. An Interpretable Joint Graphical Model for Fact-Checking from Crowds. AAAI'18

Nov 21 No class (Thanksgiving)
Nov 26 Crowdsourcing: Future Ideas

Required

Whiting et al. Crowd Guilds: Worker-led Reputation and Feedback on Crowdsourcing Platforms. CSCW'17

Optional

Morris et al. Subcontracting Microwork. CHI'17

Vaish et al. Crowd Research: Open and Scalable University Laboratories. UIST'17

Nov 28 No class (Project day)
Dec 3 Final project presentation (Session 1)
Dec 5 Final project presentation (Session 2)