Artificial Intelligence Algorithm Beats Crowdsourcing

How a machine learning algorithm beat the assembled masses of Amazon’s Mechanical Turk.

Amazon’s Mechanical Turk is the ultimate in nearly anonymous outsourcing: any task that can be completed online can be accomplished by the combination of an automated marketplace and human labor. Those who sign up to complete tasks – Turkers – are paid wages as low as pennies per chore to do everything from data entry to folk art.

Mechanical Turk is designed to complete tasks that are easy for humans and hard for machines, such as categorizing or identifying the content of images. The problem for Amazon and all its imitators, however, is that machines are getting better at many tasks, while the humans on Mechanical Turk, for reasons I’ll explore in tomorrow’s post, are getting worse.

* Only 79 out of 4,660 human applicants could pass a basic multiple-choice test
* A Bayes classifier correctly identified the category of a business a third more often than the humans. In the automotive category, the computer was twice as likely as the assembled masses to correctly identify a business.
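The classifier the paper describes is a Naïve Bayes model trained on user-contributed review text to predict a business’s category. The paper does not publish its implementation, but the core technique can be sketched in a few lines; the tiny training set and word counts below are invented purely for illustration.

```python
# Minimal multinomial Naive Bayes text classifier with Laplace smoothing.
# A sketch of the general technique, not the paper's actual implementation;
# the example documents and categories below are made up.
import math
from collections import Counter, defaultdict

def train(docs):
    """docs: list of (text, category) pairs. Returns model parameters."""
    class_counts = Counter()               # documents per category
    word_counts = defaultdict(Counter)     # word frequencies per category
    vocab = set()
    for text, cat in docs:
        class_counts[cat] += 1
        for word in text.lower().split():
            word_counts[cat][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def classify(model, text):
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_cat, best_score = None, float("-inf")
    for cat in class_counts:
        # log prior plus summed log likelihoods, with add-one smoothing
        score = math.log(class_counts[cat] / total_docs)
        cat_total = sum(word_counts[cat].values())
        for word in text.lower().split():
            score += math.log((word_counts[cat][word] + 1) /
                              (cat_total + len(vocab)))
        if score > best_score:
            best_cat, best_score = cat, score
    return best_cat

# Invented stand-ins for review text in two business categories.
docs = [
    ("oil change brakes tires fast service", "automotive"),
    ("engine repair transmission mechanic honest", "automotive"),
    ("great pizza friendly staff tasty pasta", "restaurant"),
    ("delicious burgers fries cold beer", "restaurant"),
]
model = train(docs)
print(classify(model, "mechanic fixed my brakes"))  # → automotive
```

Given enough reviews per business, word statistics like these are evidently a stronger categorization signal than the median Turker’s judgment.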

Something about Mechanical Turk itself is broken: either the incentive system or its mechanisms for policing quality.

Yelp paper – Towards Building a High-Quality Workforce with Mechanical Turk (5 pages)

Online crowdsourcing services provide an inexpensive and scalable platform for large-scale information verification tasks. We present our experiences using Amazon’s Mechanical Turk (AMT) to verify over 100,000 local business listings for an online directory. We compare the performance of AMT workers to that of experts across five different types of tasks and find that most workers do not contribute high-quality work. We present the results of preliminary experiments that work towards filtering low-quality workers and increasing overall workforce accuracy. Finally, we directly compare workers’ accuracy on business categorization tasks against a Naïve Bayes classifier trained from user-contributed business reviews and find that the classifier outperforms our workforce. Our report aims to inform the community of empirical results and cost constraints that are critical to understanding the problem of quality control in crowdsourcing systems.
