What price should be offered to a worker for a task in an online labor market? How can workers be enabled to express the amount they wish to receive for completing a task? Designing optimal pricing policies and determining the right monetary incentives is central to maximizing the requester's utility and the workers' profits. Yet current crowdsourcing platforms give requesters only limited control over pricing policies, and tasks are often priced by rules of thumb. This can result in inefficient use of the requester's budget or in workers losing interest in the task. In this paper, we address these questions and present mechanisms based on regret minimization in online learning. We exploit a link between procurement auctions and multi-armed bandits to design mechanisms that are budget feasible, achieve near-optimal utility for the requester, are incentive compatible (truthful) for workers, and make minimal assumptions about the distribution of workers' true costs. Our main contribution is a novel, no-regret posted-price mechanism, BP-UCB, for budgeted procurement in stochastic online settings. We prove strong theoretical guarantees for our mechanism and evaluate it extensively in simulations as well as on real data from the Mechanical Turk platform. Compared to the state of the art, our approach leads to a 180% increase in utility.
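The core idea (a posted-price mechanism that treats candidate prices as bandit arms under a budget constraint) can be illustrated with a minimal sketch. This is a simplified, hypothetical illustration of the general approach, not the paper's exact BP-UCB algorithm: each candidate price is an "arm", its reward is whether an arriving worker accepts (i.e., the worker's private cost is at most the posted price), and we post the price with the best upper confidence bound on acceptances per dollar, stopping when the budget is exhausted. The function name `bp_ucb_sketch` and all parameters are illustrative assumptions.

```python
import math

def bp_ucb_sketch(prices, budget, cost_sampler, rounds):
    """Simplified UCB-style posted-price loop (illustrative only, not the
    paper's exact BP-UCB mechanism). Posting a price is truthful for the
    worker: accepting iff private cost <= price is a dominant strategy."""
    counts = {p: 0 for p in prices}    # times each price was posted
    accepts = {p: 0 for p in prices}   # accepted offers at each price

    spent, tasks_done = 0.0, 0
    for t in range(1, rounds + 1):
        def ucb(p):
            # Upper confidence bound on the acceptance rate, per dollar spent.
            if counts[p] == 0:
                return float("inf")    # try every price at least once
            rate = accepts[p] / counts[p]
            bonus = math.sqrt(2.0 * math.log(t) / counts[p])
            return min(rate + bonus, 1.0) / p

        # Budget feasibility: only consider prices we can still afford.
        affordable = [p for p in prices if spent + p <= budget]
        if not affordable:
            break

        p = max(affordable, key=ucb)
        counts[p] += 1
        if cost_sampler() <= p:        # worker accepts iff private cost <= price
            accepts[p] += 1
            spent += p
            tasks_done += 1
    return tasks_done, spent
```

Over time the exploration bonus shrinks and the loop concentrates on the price with the best empirical acceptances-per-dollar ratio, while the affordability check keeps total payments within the budget by construction.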
Truthful Incentives in Crowdsourcing Tasks using Regret Minimization Mechanisms
A. Singla, A. Krause
In International World Wide Web Conference (WWW), 2013
Bibtex Entry:
@inproceedings{singla2013truthful,
	author = {Adish Singla and Andreas Krause},
	booktitle = {International World Wide Web Conference (WWW)},
	month = {May},
	title = {Truthful Incentives in Crowdsourcing Tasks using Regret Minimization Mechanisms},
	year = {2013}}