Last month, Boston-based uTest got a nice early gift for the holidays--$17 million in series D funding to add to $20 million received in three previous funding rounds. The crowdsource software testing company claims "year-over-year growth of 250% over the past three years," and today taps into a crowd of over 40,000 registered testers spread across the globe.
uTest's success is part of the current groundswell of excitement surrounding a number of crowdsourcing ventures--as evidenced by our recent breakdown of VC dollars poured into such efforts in 2011--but there's also a bit of carpe diem in the uTest story. The firm occupies an ideal position at this particular moment. The demand for quality assurance for the thousands of applications available across a plethora of platforms means it's boom time for software testing. While uTest's model is adaptable to just about any kind of software or platform, one area in particular has driven much of the growth of the past few years.
With hundreds of thousands of new mobile devices being activated around the globe every day, thousands of attendant applications are in constant need of quality assurance work. By their very nature, mobile applications require testing in a practically unlimited number of user environments, and uTest's huge crowd is the next best thing to the real world (uTest likes to refer to it as testing "in the wild").
Of course, uTest is much bigger than just a network of software geeks tinkering with new iPhone apps from little studios. In the four years it's been around, the company has provided services to some huge names including Google, Intuit, Microsoft and Skype, as well as startups like Trulia and Vlingo. And those services go beyond basic function tests. As co-founder Doron Reuveni wrote after announcing the $17 million funding round, "we recently dove into other types of testing services, including security, load, localization and usability, so the latest round of investment will help us further refine and expand our offerings."
Recently, Crowdsourcing.org's Carl Esposti has looked into how uTest has managed so much success with its model. Last May he reported on three new tools uTest was deploying to help motivate its army of testers, and how the effort helps surface some of the qualities you might expect from a more traditional workforce--like organizational discipline--from within its distributed crowd. Those tools are just the latest in uTest's continuing effort to do something else that sets it apart from other crowdsourcing firms--offloading much of the responsibility for quality assurance to the crowd itself. uTest manages itself--including its crowd--as a community-based business, identifying, developing and rewarding the talent that exists within its own ranks. Community-driven teaching, mentoring and vetting push uTest's crowd towards becoming a more self-sufficient and self-policing crowdsourcing model that should be able to scale easily for quite some time.
So while uTest is certainly in the right place at the right time, it has also built the right tools, providing access to needed testing capacity on demand and the ability to deploy it rapidly.
"Unlike traditional outsourcing models that have some inherent limitations when it comes to being totally and immediately variable, with uTest's crowdsourcing model you've got the ability to turn up capacity and turn it back down very quickly. It's truly 100% variable," says Esposti. "You've got the ability to identify and deploy, only for the time needed, every configuration of user, device and application, when you need it, where you need it, and test the applications in the real world. You can't do this as rapidly, efficiently or as cost effectively with in-house teams or outsourced teams."
Another theme emerges the more you learn about uTest: the company is keeping pace, rapidly learning and evolving as it grows to get the best results out of its crowd.
"With uTest in 2008, there was plenty of unpredictability and plenty of variability in the model," the company's marketing chief Matt Johnston told me. "And as we started working with enterprises like Google and Microsoft and AOL, their tolerance for that sort of stuff was zero, and it forced us to grow up in a hurry."
Johnston explains that uTest uses a few key methods to make sure quality is baked in.
"One is the quality of the profile that people fill out. Two is the tester rating algorithm has gotten generations more sophisticated." He says those tools help separate good from great testers and are a major predictor of future success. uTest has invested heavily in its matching algorithms that pair testers with projects and in the tester rating algorithms. "Those become a lot of the secret sauce for us," says Johnston.
Gianni Giacomelli, senior vice president for new product innovation at Genpact--one of the leading global providers of technology and business process services, which has done plenty of testing via a more traditional outsourcing model--says crowdsourcing was a "bit of a blind spot" for some. He doesn't see a model like uTest's as radically different from traditional in-house or outsourced models; rather, he sees all three existing on a sort of continuum, with in-house sourcing to the far left, outsourcing on the right, and crowdsourcing pushing outsourcing a little further to the right.
"Outsourcing does have a crowd, when compared to an in-house group," he told me in an interview, adding that with a crowdsourcing model like uTest's "your crowd is even bigger now. It can be very specialized if you want, it can be very generic if you want; certainly it can be cost-efficient. To me this is not an entirely new model, it is just an extension of what has been happening already."
Giacomelli does see some limitations to a full crowd model, naturally. He notes that it can be more difficult to monitor and control some of the processes that take place between assigning tasks to a crowd and their completion, and he also cites issues around intellectual property and confidentiality that are often incompatible with a crowd model.
uTest's Matt Johnston says he's often heard charges that a crowdsourcing model can be chaotic or lacking accountability, but he says it's just a matter of building the needed control into the process. And he agrees with the notion that crowdsourcing is re-treading a trail that was first blazed by outsourcing models.
"I think we have taillights to follow because I think anything that could be outsourced traditionally, will be able to be crowdsourced, eventually."
Rather than crowdsourcing simply taking over areas that were once the domain of more traditional outsourcing, Giacomelli sees models like uTest's maturing as they merge with those traditional practices. He points to the example of Amazon, which he says brought a lot of new ideas to the table, but also does a lot of traditional retail marketing and has come to dominate by iterating on tried-and-true models rather than completely replacing them.
While uTest has managed to grow at a remarkable clip in its early years, thanks in part to the mobile explosion, Esposti says the company has just begun to scratch the surface of its potential markets. He thinks uTest could continue to scale for a while, but if it wants to become a top tier provider it's going to need to figure out how to provide application testing for more enterprise systems.
"It's going to have to work out how it can compete with internal operations or traditional outsourcing providers for maintaining the big ERP systems that allow the major enterprises to run their businesses," says Esposti.
If uTest is interested in bringing its crowd deeper into the enterprise world, Matt Johnston isn't showing that card just yet. He sees the natural limits of crowdsourcing as varying from area to area--the ceiling could be quite different for graphic design versus software testing, for example. "Designing a business card is one thing, but designing internal inventory applications could be quite another."
He sees the same principle holding true for more traditional outsourcing as well.
"There are some things that are just too far inside the firewall to be sourced outside."
It's certainly true for now that not everything can be crowdsourced, but if investors continue to grow uTest's pool of resources, it's likely we'll start to see more of a crowd hanging out in some of the deeper levels of enterprise services.
- Eric Mack is a contributing editor for Crowdsourcing.org. He also currently contributes to CNET. In the past, his work has been featured by NPR, Wired, the New York Times and other outlets. You can contact him at firstname.lastname@example.org. Find him on Twitter and Google+. Also be sure to follow Crowdsourcing.org on Twitter.