The case for and against crowdsourcing (part 2)

In the first part of this article I described the types of crowdsourcing that currently exist and the benefits they offer. In part two, I explore some of the drawbacks of crowdsourcing.

There are a number of key issues to consider when conducting a crowdsourcing project that have the potential to materially affect the outcome. The potential drawbacks of crowdsourcing often stem from the fact that the work is executed by online volunteers rather than by a company’s own staff, and the use of such ‘outside’ workers can present challenges when working with crowds.

I will examine some of the potential drawbacks associated with each of the following categories:

    •    Collaboration tools
    •    Effectiveness and efficiency
    •    Intellectual property rights
    •    Motivation and drivers
    •    Management overhead

 

Drawbacks of crowdsourcing




Collaboration tools


Organizations manage staff in the workplace in a way that enables individuals to learn and gain from the knowledge and experience of colleagues. Staff are given access to the information they need for their jobs and to any systems or procedures required to carry out their responsibilities. Crowdsourcing initiatives do not work this way. Participants are limited to providing their individual contributions, without the benefit of review by others, and they forfeit the opportunity to enrich their contribution with the relevant information they would normally pick up from collaborating within a team. In addition, limited or no tools are made available to crowdsourcing volunteers, which restricts their contributions to what they can achieve by their own means.


If we take the example of the design tools available in an R&D lab, a staff member can undertake a much broader set of activities than a volunteer on a crowdsourcing project, who might be limited, for example, to providing a textual description of a product innovation. To the extent that crowdsourcing platforms develop tools to help volunteer workers perform their activities, the volunteers’ involvement in the R&D process can be extended.


A good example of an enterprise using crowdsourcing that has integrated software into its platform to help participants is Lego, where customers can design their own toys using Lego software that enables virtual stacking of Lego building blocks. When tools are made available to crowdsourcing volunteers, they can create more value for the enterprise that uses crowdsourcing as part of its strategy. In addition, collaboration tools that enable greater control and consistency of work, or that enable collaboration on or hand-over of others’ contributions, will increase the quality of contributions.


 

(Image credit: Lego DesignByMe.com)

 

Effectiveness and efficiency

Staff members are hired on the basis of their specific or unique knowledge, skills and working experience. Crowdsourcing models, however, allow anyone to participate, with selection based solely on the quality of contributions and payment made only for work that meets the required standard. It can therefore be questioned whether all participants are in fact qualified for the activity. Could you call a crowdsourcing initiative a success when it receives a large number of contributions that are generally of very low quality? In my research on NU.nl, the most popular online news site in the Netherlands, the majority of the news photos uploaded by visitors turned out to be of very low quality. An expert panel estimated that 86% of the 750+ photos they assessed were of insufficient quality, with only one photo from the whole batch considered to be of professional quality.


Large volumes of low-quality contributions also cause practical problems: sifting through all contributions to identify the useful ones takes time and effort, and therefore clearly increases the cost of a crowdsourcing project. In cases where the crowdsourcing endeavor results in highly complex submissions, as with innovation competitions like Cisco’s “I-Prize” and IBM’s “Innovation Jam”, it takes substantial effort to spot the ‘diamonds’ among thousands of submissions. This is described in an MIT Sloan Management Review article on IBM’s Innovation Jam (Bjelland and Wood, 2008): “senior executives and others spent weeks sifting through tens of thousands of postings – gigabytes of often aimless Jam conversations”.

Actually, a firm should aim to leave the filtering of good-quality contributions to the crowd. There are already many moderation and rating systems available in online communities. It is unclear, however, whether these systems really detect the best-quality contributions or are in fact popularity meters, showing how well a person uses Twitter, Facebook or other social media to influence voting behavior.
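
As a rough illustration of what leaving the filtering to the crowd could look like, the sketch below ranks contributions by the lower bound of the Wilson score interval on their approval rate rather than by raw vote counts, so a contribution that attracts a burst of mobilized upvotes does not automatically outrank work that is rated consistently well. The data and names are hypothetical; this is not a description of any system used by the sites discussed here.

```python
import math

def wilson_lower_bound(upvotes: int, downvotes: int, z: float = 1.96) -> float:
    """Conservative estimate of a contribution's approval rate.

    Returns the lower bound of the Wilson score interval for the observed
    up/down votes (z = 1.96 corresponds to roughly 95% confidence).
    """
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    return (p + z * z / (2 * n)
            - z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)) / (1 + z * z / n)

# Hypothetical contributions: (id, upvotes, downvotes)
contributions = [
    ("photo_a", 40, 35),  # heavily promoted: many votes, mixed reception
    ("photo_b", 9, 1),    # few votes, but consistently rated well
    ("photo_c", 3, 0),    # too few votes to judge yet
]

by_popularity = sorted(contributions, key=lambda c: c[1], reverse=True)
by_confidence = sorted(contributions,
                       key=lambda c: wilson_lower_bound(c[1], c[2]), reverse=True)

print([c[0] for c in by_popularity])  # ['photo_a', 'photo_b', 'photo_c']
print([c[0] for c in by_confidence])  # ['photo_b', 'photo_c', 'photo_a']
```

Such a ranking still measures only how voters respond; it does not by itself separate genuine quality from coordinated voting, which is exactly the concern raised above.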

Companies can influence the quality of crowdsourced contributions by using standard formats and clear rules. However, not all crowdsourcing activities are suited to a fixed format, and practice shows that rules are not read carefully. A large proportion of respondents in the NU.nl study were unaware that they would receive a revenue share when their news photo was sold to external media. This revenue sharing was announced in the General Terms and Conditions, which every visitor had to accept before uploading a news photo, and contributors also had to provide their bank account number. So when offering rewards, the reward has to be emphasized repeatedly and communicated together with the reward criteria. In these reward criteria, a firm can specify the desired quality level. My study of contributors to the NU.nl site shows that people who were driven by the opportunity to receive a financial reward performed better on the aspects that the reward criteria emphasized.

Intellectual Property Rights (IPR)


In positions of employment, employees receive a salary for their contribution and time, and the firm owns any intellectual property developed by the employee during the employment period. In crowdsourcing, people participate voluntarily, and unless the position on IP is made clear and explicitly stated (i.e. a condition of the right to participate is acceptance that the IP transfers to the event sponsor), the IP remains available for exploitation by the contributor.

Motivation and Drivers

Several scientists have wondered about the willingness of crowdsourcing participants to provide their contributions for free. Eric von Hippel gives two explanations for this phenomenon, known as ‘freely revealing’. Firstly, not all participants are able to exploit their contribution commercially: inventing a new recipe for beer (as in the ‘open beer’ project) may be easy, but setting up a brewery plus marketing and distribution channels is a different story. Secondly, von Hippel points out that people are motivated to work selflessly on a ‘public good’, a facility that will benefit a larger group of people.

I am not convinced by the second argument because, in many crowdsourcing initiatives, there is no intention to produce a public good; most initiatives are driven by ordinary commercial considerations. Even initiatives that can be classified as serving public interests, e.g. Wikipedia, show that altruistic motives do not influence contributors’ behaviour (Nov, 2007). It can therefore be questioned whether the second argument holds. In my opinion, the first argument will not hold for long either. In many cases, contributors are not aware of the value that they create for the firm, but I expect that with the rise of new licensing forms, such as Creative Commons, people will become more aware of opportunities to benefit from their own contributions. I therefore believe that, in time, commercial use of crowdsourced contributions without compensation will become harder.


Another issue with crowdsourced contributions is the uncertainty over whether the contributor really is the producer or inventor and whether the contribution is not in fact owned by another organization or person. It can take a lot of time to investigate whether a person is the owner of his or her contribution.

Management Overhead

For the operator of a crowdsourcing platform there is also the overhead associated with managing an active and growing community. For example, the Dutch social network Hyves employs 25 moderators who continuously monitor and improve the quality of the content, while the site GO Supermodel, which targets an audience of teenage girls, has 70 moderators checking the content of the site 24 hours a day, 7 days a week. Through experiments with new moderation tools, staffing levels may be reduced.


What does the future hold?


So what does the future hold for crowdsourcing? Crowdsourcing clearly has its advantages as well as its limitations. As we continue to experiment and learn, we will gain experience that helps us use it wisely and take advantage of the value it offers. At the same time, we will become better at mitigating the associated risks and will learn how to deal with the operational issues; by developing (web-based) tools, the applicability of crowdsourcing may be extended to new types of activities.

By Irma Borst


Irma Borst, PhD, is a Principal Consultant at Logica Consulting and an Associate Researcher at RSM Erasmus University, Management of Technology and Innovation.

Irma Borst’s PhD dissertation, "Understanding Crowdsourcing: Effects of Motivation and Rewards on Participation and Performance in Voluntary Online Activities", was published on 23 December 2010. The research challenge was to study the effects of intrinsic and extrinsic motivation of online volunteers and how rewards influence this relationship.

 
