Crowdsourcing task work is a delicate decision: it pulls hours from an in-house team, needs to integrate with in-house workflows, demands a very well-defined data specification, and requires a certain amount of trust in data quality. When it works, crowdsourcing microwork can be brilliant. When it doesn’t, it’s a train wreck. Before entering into a contract for microwork, it’s critical to nail down the specs, do your homework (and check lots of references), and give as much power as possible to the project leads so they can make the process run as efficiently as possible.
Every crowdsourcing marketplace has its own rules and specialty, but generally, they break down into three categories:
1. Contests - Contest marketplaces solicit responses through an open call and generally choose just one response as the winner.
2. Open Markets - Marketplaces like oDesk, eLance, Guru, and Mechanical Turk allow employers to post nearly any job at any price, leading to a wide range of offers, from 20-minute typing assignments to complex, multi-week software development projects.
3. Microwork - Microwork marketplaces break up large, repetitive projects into very small, discrete chunks that are managed by highly automated software.
Founded in 2010, the industry website Crowdsourcing.org is a neutral organization dedicated solely to crowdsourcing and crowdfunding. One of the most influential and credible authorities in the space, it is recognized worldwide for its expertise in crowdsourcing and crowdfunding practice and its unbiased thought leadership.