Below is an incomplete list of books and papers in the field of crowdsourcing and social computing, grouped by topic, for interested readers. It supplements the required and optional papers discussed in class.

Books

Law and von Ahn. Human Computation. 2011

Understanding Crowd Workers: Demographics, Size, Work Experience, Concerns and More

Ross et al. Who are the Crowdworkers?: Shifting Demographics in Mechanical Turk. CHI'10 Extended Abstracts

Downs et al. Are Your Participants Gaming the System?: Screening Mechanical Turk Workers. CHI'10

Horton and Chilton. The Labor Economics of Paid Crowdsourcing. EC'10

Suri et al. Honesty in an Online Labor Market. HCOMP Workshop @ AAAI'11

Berinsky. Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk. Political Analysis, 2012

Paolacci and Chandler. Inside the Turk: Understanding Mechanical Turk as a Participant Pool. Current Directions in Psychological Science, 2014

Pavlick et al. The Language Demographics of Amazon Mechanical Turk. Transactions of the ACL, 2014

Chandler et al. Nonnaïveté among Amazon Mechanical Turk Workers: Consequences and Solutions for Behavioral Researchers. Behavior Research Methods, March 2014

Bartneck et al. Comparing the Similarity of Responses Received from Studies in Amazon's Mechanical Turk to Studies Conducted Online and with Direct Recruitment. PLOS ONE, 2015

Huff and Tingley. "Who are These People?" Evaluating the Demographic Characteristics and Political Preferences of MTurk Survey Respondents. Research and Politics, 2015

Brawley and Pury. Work Experience on MTurk. Computers in Human Behavior, January 2016

McInnis et al. Taking a HIT: Designing around Rejection, Mistrust, Risk, and Workers' Experiences in Amazon Mechanical Turk. CHI'16

Xia et al. "Our Privacy Needs to be Protected at All Costs": Crowd Workers' Privacy Experiences on Amazon Mechanical Turk. CSCW'18

Hara et al. A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk. CHI'18

Crowdsourcing Incentive Design and Control

Harris. You're Hired! An Examination of Crowdsourcing Incentive Models in Human Resource Tasks. CSDM Workshop @ WSDM'11

Kaufmann et al. More Than Fun and Money. Worker Motivation in Crowdsourcing--A Study on Mechanical Turk. AMCIS'11

Faridani et al. What's the Right Price? Pricing Tasks for Finishing on Time. HCOMP Workshop @ AAAI'11

Chandler and Kapelner. Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets. Journal of Economic Behavior & Organization, 2013

Chen et al. Optimistic Knowledge Gradient Policy for Optimal Budget Allocation in Crowdsourcing. ICML'13

Huang and Fu. Don't Hide in the Crowd!: Increasing Social Transparency between Peer Workers Improves Crowdsourcing Outcomes. CHI'13

Tran-Thanh et al. Efficient Budget Allocation with Accuracy Guarantees for Crowdsourcing Classification Tasks. AAMAS'13

Mao et al. Volunteering Versus Work for Pay: Incentives and Tradeoffs in Crowdsourcing. HCOMP'13

Raddick et al. Galaxy Zoo: Motivations of Citizen Scientists. 2013

Singer and Mittal. Pricing Mechanisms for Crowdsourcing Markets. WWW'13

Singla and Krause. Truthful Incentives in Crowdsourcing Tasks Using Regret Minimization Mechanisms. WWW'13

Difallah et al. Scaling-Up the Crowd: Micro-Task Pricing Schemes for Worker Retention and Latency Improvement. HCOMP'14

Rokicki et al. Competitive Game Designs for Improving the Cost Effectiveness of Crowdsourcing. CIKM'14

Teodoro et al. The Motivations and Experiences of the On-Demand Mobile Workforce. CSCW'14

Tran-Thanh et al. BudgetFix: Budget Limited Crowdsourcing for Interdependent Task Allocation with Quality Guarantees. AAMAS'14

Yin et al. Monetary Interventions in Crowdsourcing Task Switching. HCOMP'14

Dergousoff and Mandryk. Mobile Gamification for Crowdsourcing Data Collection: Leveraging the Freemium Model. CHI'15

Harris. The Effects of Pay-to-Quit Incentives on Crowdworker Task Quality. CSCW'15

Radanovic and Faltings. Learning to Scale Payments in Crowdsourcing with PropeRBoost. HCOMP'16

Ho et al. Adaptive Contract Design for Crowdsourcing Markets: Bandit Algorithms for Repeated Principal-Agent Problems. Journal of Artificial Intelligence Research, 2016

Ikeda and Bernstein. Pay It Backward: Per-Task Payments on Crowdsourcing Platforms Reduce Productivity. CHI'16

Vaughan. Incentives and the Crowd. XRDS, Fall 2017

Xia and Muthukrishnan. Revenue-Maximizing Stable Pricing in Online Labor Markets. HCOMP'17

Ye et al. When Does More Money Work? Examining the Role of Perceived Fairness in Pay on the Performance Quality of Crowdworkers. ICWSM'17

Task Assignment, Routing and Recommendation

Yuen et al. Task Recommendation in Crowdsourcing Systems. CrowdKDD'12

Celis et al. Adaptive Crowdsourcing for Temporal Crowds. WWW'13

Heidari and Kearns. Depth-Workload Tradeoffs for Workforce Organization. HCOMP'13

Ho et al. Adaptive Task Assignment for Crowdsourced Classification. ICML'13

Bragg et al. Parallel Task Routing for Crowdsourcing. HCOMP'14

Goel et al. Mechanism Design for Crowdsourcing Markets with Heterogeneous Tasks. HCOMP'14

Tran-Thanh et al. BudgetFix: Budget Limited Crowdsourcing for Interdependent Task Allocation with Quality Guarantees. AAMAS'14

Tran-Thanh et al. Efficient Crowdsourcing of Unknown Experts Using Bounded Multi-Armed Bandits. Artificial Intelligence, September 2014.

Assadi et al. Online Assignment of Heterogeneous Tasks in Crowdsourcing Markets. HCOMP'15

Kobren et al. Getting More for Less: Optimized Crowdsourcing with Dynamic Tasks and Goals. WWW'15

Roy et al. Task Assignment Optimization in Knowledge-Intensive Crowdsourcing. VLDB Journal, August 2015

Zheng et al. QASCA: A Quality-Aware Task Assignment System for Crowdsourcing Applications. SIGMOD'15

Schnitzer et al. Perceived Task Similarities for Task Recommendation in Crowdsourcing Systems. WWW'16

Goncalves et al. Task Routing and Assignment in Crowdsourcing based on Cognitive Abilities. WWW'17

Pilourdault et al. Motivation-Aware Task Assignment in Crowdsourcing. EDBT, March 2017

Quality Assurance

Raykar et al. Supervised Learning from Multiple Experts: Whom to Trust When Everyone Lies a Bit. ICML'09

Ipeirotis et al. Quality Management on Amazon Mechanical Turk. HCOMP Workshop @ KDD'10

Karger et al. Iterative Learning for Reliable Crowdsourcing Systems. NIPS'11

Hansen et al. Quality Control Mechanisms for Crowdsourcing: Peer Review, Arbitration, & Expertise at FamilySearch Indexing. CSCW'13

Joglekar et al. Evaluating the Crowd with Confidence. KDD'13

Kajino et al. Clustering Crowds. AAAI'13

Mao et al. Better Human Computation Through Principled Voting. AAAI'13

Venanzi et al. Trust-Based Fusion of Untrustworthy Information in Crowdsourcing Applications. AAMAS'13

Aydin et al. Crowdsourcing for Multiple-Choice Question Answering. AAAI'14

Jagabathula et al. Reputation-Based Worker Filtering in Crowdsourcing. NIPS'14

Li et al. The Wisdom of Minority: Discovering and Targeting the Right Group of Workers for Crowdsourcing. WWW'14

Venanzi et al. Community-Based Bayesian Aggregation Models for Crowdsourcing. WWW'14

Kazai and Zitouni. Quality Management in Crowdsourcing using Gold Judges Behavior. WSDM'16

Augustin et al. Bayesian Aggregation of Categorical Distributions with Applications in Crowdsourcing. IJCAI'17

Gadiraju et al. Using Worker Self-Assessments for Competence-Based Pre-Selection in Crowdsourcing Microtasks. ACM Transactions on Computer-Human Interaction (TOCHI), September 2017

Daniel et al. Quality Control in Crowdsourcing: A Survey of Quality Attributes, Assessment Techniques, and Assurance Actions. ACM Computing Surveys, February 2018.

Tian et al. Selective Verification Strategy for Learning from Crowds. AAAI'18

Engaging the Crowd

Mao et al. Why Stop Now? Predicting Worker Engagement in Online Crowdsourcing. HCOMP'13

Dontcheva et al. Combining Crowdsourcing and Learning to Improve Engagement and Performance. CHI'14

Eveleigh et al. Designing for Dabblers and Deterring Drop-Outs in Citizen Science. CHI'14

Segal et al. Improving Productivity in Citizen Science through Controlled Intervention. WWW'15

Crowdsourcing Workflow Design and Control

Law and Zhang. Towards Large-Scale Collaborative Planning: Answering High-Level Search Queries Using Human Computation. AAAI'11

Ambati et al. Collaborative Workflow for Crowdsourcing Translation. CSCW'12

Willett et al. Strategies for Crowdsourcing Social Data Analysis. CHI'12

André et al. Crowd Synthesis: Extracting Categories and Clusters from Complex Data. CSCW'14

Verroios and Bernstein. Context Trees: Crowdsourcing Global Understanding from Local Views. HCOMP'14

Goto et al. Understanding Crowdsourcing Workflow: Modeling and Optimizing Iterative and Parallel Processes. HCOMP'16

Gebru et al. Scalable Annotation of Fine-Grained Categories Without Experts. CHI'17

Crowd-Powered Systems

Lasecki et al. Chorus: A Crowd-Powered Conversational Assistant. UIST'13

Ipeirotis and Gabrilovich. Quizz: Targeted Crowdsourcing with a Billion (Potential) Users. WWW'14

Luther et al. Crowdlines: Supporting Synthesis of Diverse Information Sources through Crowdsourced Outlines. HCOMP'15