
Why Does Crowdsourced Testing Work?

Editor's Note: This is the third in a series of posts from our friends at passbrains, who will be walking us through a primer on crowdsourced testing. So far we've defined crowdsourced testing and looked at exactly how it works. In this part, we cover why it works.

To understand why crowdsourced testing works, it helps to understand a bias that affects most testers and test managers around the world: the "Curse of Knowledge," a phrase coined in a 1989 paper in The Journal of Political Economy. Once subject-matter experts have acquired a body of knowledge (the set of concepts, beliefs and scenarios they know or predict), they find it nearly impossible to imagine anything beyond it. As a result, it is particularly challenging for a tester to think outside the box and conceive of the various ways a typical end user might actually use a piece of software.

This phenomenon was demonstrated empirically in a well-known experiment by Elizabeth Newton, then a graduate student in psychology at Stanford University. She illustrated it with a simple game in which people were assigned one of two roles: tappers and listeners. Each tapper selected a well-known song, such as "Happy Birthday," and tapped its rhythm on a table; the listeners had to guess the song from the taps. Before the listeners guessed, the tappers were asked to predict the probability that a listener would guess correctly; they predicted 50 percent. Over the course of the experiment, 120 songs were tapped out, but listeners guessed only three of them correctly, a success rate of merely 2.5 percent.

The explanation is as follows: when tappers tap, they cannot help hearing the tune playing along with their taps, while all the listeners hear is a kind of bizarre Morse code. The problem is that once we know something, we find it impossible to imagine the other party not knowing it.

Extrapolating this experiment to software testing: most testers run a battery of tests that they believe is representative and captures the set of end-user scenarios for how the software will be used. The reality is far from this. Any expert tester will attest that it is impossible to capture the complete set of scenarios an end user may throw at a software system. As a result, critical code paths go untested under certain scenarios, which leads to software malfunctions, production system crashes, customer escalations, long hours of meetings, debugging sessions and so on.
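As a hypothetical illustration of this gap (not an example from the original article), consider a tiny function and the tests its own author might write. The author's tests cover only the scenarios the author imagined; a crowd tester, free of those assumptions, immediately tries input the author never considered:

```python
def initials(full_name: str) -> str:
    """Return a person's initials, e.g. "Ada Lovelace" -> "A.L."."""
    parts = full_name.split()
    return ".".join(p[0].upper() for p in parts) + "."

# The in-house test suite exercises only the cases its author imagined:
assert initials("Ada Lovelace") == "A.L."
assert initials("Grace Murray Hopper") == "G.M.H."

# A crowd tester with no knowledge of the implementation tries a
# whitespace-only name, a path the original tests never reached.
# The function silently returns "." instead of reporting bad input:
assert initials("   ") == "."
```

The point is not this particular bug but the pattern: the author's "curse of knowledge" makes the untested path invisible to them, while an outsider stumbles onto it in minutes.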

Crowdsourced testing circumvents these headaches by bringing a comprehensive set of code-coverage mechanisms and end-user scenarios into the design and development stages of software engineering, when the cost of modification is still low. Critical use cases are identified early and contingencies provided for, which reduces software maintenance costs during and after production deployment. Beyond broader code coverage, testing gains quality and depth across vital software modules, which ultimately results in higher code quality, among other benefits.

What does the future hold?

Crowdsourced testing clearly has both advantages and limitations. It cannot be considered a panacea for all testing requirements, and the power of the crowd should be employed diligently. The key to avoiding failure is to use crowdsourcing prudently, depending on the tactical and strategic needs of the organization seeking crowdsourced testing services. The organization must embrace the correct model, identify the target audience, offer the right incentives and have a suitable workforce to manage the task results.

Next in the series, we'll look at some key concerns to keep in mind when crowdsourcing testing.

- The above was originally written by Mithun Sridharan and adapted by Dieter Speidel of passbrains. Disclosure: passbrains is a client of massolution.
