Crowdsourcing is becoming an integral part of how enterprises conduct their research, product development, execution and marketing. Over the past twenty years most companies have had at least one eye on making their processes more efficient.
We shouldn't get so caught up in the buzz around crowdsourcing that we forget to treat it as a process that needs refining. So how do we go about making crowdsourcing more efficient, more productive or just plain better?
There is a way to do this – more than one way, no doubt.
First, we need to think of the parameters.
Crowdsourcing ideas for an innovation platform is a different game from crowdsourcing a function like checking signatures on bank checks. Both are different from engaging the crowd in a customer ecosystem, the way Dell and Nike do, or the way that companies like Global Dawn do in marketing.
We need to think of ways to improve each area of crowdsourcing and to validate new work processes — but it's likely, as with all processes, that there is a common denominator.
1:9:90 – making crowdsourcing better
I said last time out that the 1:9:90 rule gives us a way to leverage the behaviour of crowds. This, I believe, is the common denominator, the principle for making crowdsourcing function well and then for improving it.
For a long time, observers of the web were in agreement that only about 1% of a crowd actually creates content; a further 9% or so might contribute in some way – by commenting or sharing, for example; and the vast majority lurk, present on a site without taking action.
This is significantly different from the old Pareto rule that said 20% of your customers provide 80% of the value, but both underline the fact that crowds, groups, ecosystems perform because a minority carries most of the weight. Our expectation should be that we can influence and encourage the proportion of people who make positive contributions – so, for example, we can convert some of the 90% into contributors, as well as enhance the performance of 1s, 9s and 90s.
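As a concrete illustration of that segmentation, here is a minimal sketch of how a site's event log could be bucketed into 1s, 9s and 90s. The action names (`post`, `comment`, `view` and so on) are hypothetical, not drawn from any particular platform.

```python
from collections import Counter

def segment_users(events):
    """Classify each user as a '1' (creator), '9' (contributor) or '90'
    (lurker) from a log of (user_id, action) events. Action names are
    illustrative stand-ins for whatever a real site records."""
    CREATE = {"post", "upload", "submit_entry"}
    CONTRIBUTE = {"comment", "share", "vote"}
    roles = {}
    for user, action in events:
        if action in CREATE:
            roles[user] = "1"          # creating anything makes you a 1
        elif action in CONTRIBUTE and roles.get(user) != "1":
            roles[user] = "9"          # contributing makes you a 9, unless already a 1
        else:
            roles.setdefault(user, "90")  # everyone else is a lurker
    return roles

events = [
    ("ann", "post"), ("bob", "comment"), ("cat", "view"),
    ("dan", "view"), ("bob", "view"), ("eve", "share"),
]
roles = segment_users(events)
print(Counter(roles.values()))  # one creator, two contributors, two lurkers
```

Once users are segmented this way, the proportions can be tracked over time to see whether design changes really do convert 90s into 9s.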
A second issue for crowds is that crowdsourced projects are often non-collaborative. For example, in the Netflix Prize project only a small number of the 41,000 teams actually shared data to bring a crowd perspective to the puzzle of how to improve video recommendations. We need a clearer view of what behaviour contributes best in any given crowdsourced project. It has been argued elsewhere that Netflix set the wrong boundary conditions, which in turn limited its upside to 10% when it could have been 200% or 300%.
Taken together the lessons are: keep boundary conditions flexible, and segment the crowd so that we can promote and improve the participation of those involved.
How do you do that? The answer is to create the right data framework to understand the behaviour of people who come to crowdsourced or participatory sites.
I want to touch on two aspects of improvement. The first is the nature of crowdsourcing. The second is the data.
The narrative web
Crowdsourcing is essentially game-like behaviour. It is competitive and collaborative, just like team sports.
The web is rapidly evolving towards a game-like, contest-driven environment. This is good. The static web to date has not been good for participation – though it has worked well as a 'read web'.
Think of websites with traditional static navigation, augmented in some cases with a handful of user recommendations (your average brand site!).
These sites have no timeline, no narrative and few compelling hooks that draw people in as participants or keep them coming back.
Contest- or challenge-driven sites, by contrast, like all games, have drama, a timeline, a beginning and end, and plenty of opportunities to engage people in real-life stories. The web is moving to a more powerful narrative form. We're not so aware of it but it is happening. And it is all part of a move towards a more challenge-driven economy that crowdsourcing absolutely typifies.
And we are rapidly moving towards more social game formats. Games – the new underlying dynamic of websites – are driven by role adoption: people take on roles and engage in some form of game, relationship or play.
The new role of data
Challenges, contests, games are the typical setting for engaged crowds. They also yield much more user data than static websites do because people take on a variety of roles within contests and because they come back to look into the narrative and its development, for example how a competition is playing out, who is scoring most points, who is taking a lead.
The web-as-challenge is moving in a fascinating direction. There is one problem, though: to date we have tended to measure behaviour on these new sites in much the same way we did in the past. We measure views, time on site, Facebook fans and tweets. In other words, we measure volume.
These volume metrics try to replicate traditional brand visibility metrics – in effect they ask: how many people can I infer like me?
This does nothing to help site sponsors, contest organisers, and crowdsource users to change the behaviour on a site. And change is desirable – we want people to be more effective, in the sense that we want more 1s and more 9s.
The way forward is to create web metrics that help us get at behaviours we can influence and improve. Volume metrics don't do this. We should instead be measuring the activities that 1s and 9s engage in on a crowdsourcing site, and the behaviour that prevents 90s from becoming 9s.
We want the answers to questions like: Do 9s pass the word around about the contest, challenge or game? Do they create content for the site or share ideas? Do they create original content or do they tend to bring content in from other sites? What kind of messages on the site do they respond to? Which elements of design, narrative and hook prevent them from creating or sharing more?
And how about the 90s? Do they navigate the site well or do they have difficulty in locating information such as registration or upload buttons? Do they struggle with purpose? Is the site design preventing them from being 9s or is it the nature of the rewards? What data will help us make them more effective?
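Questions like these translate directly into behavioural metrics computed from the site's event log. The sketch below assumes a simple `(user, action)` log; the action names (`share`, `register_start`, `register_done`) are invented for illustration, not taken from any real analytics API.

```python
from collections import Counter, defaultdict

def behaviour_metrics(events):
    """Behavioural (not volume) metrics from a (user, action) event log.
    Action names here are hypothetical stand-ins for whatever a real
    crowdsourcing site would record."""
    per_user = defaultdict(Counter)
    for user, action in events:
        per_user[user][action] += 1

    # Which users started registering but never finished? (a hook/design problem
    # that keeps 90s from becoming 9s)
    started = [u for u, c in per_user.items() if c["register_start"]]
    finished = [u for u in started if per_user[u]["register_done"]]
    # Do contributors pass the word around?
    sharers = [u for u, c in per_user.items() if c["share"]]

    return {
        "registration_dropoff": (1 - len(finished) / len(started)) if started else None,
        "share_rate": len(sharers) / len(per_user),
    }

events = [
    ("ann", "share"), ("bob", "register_start"), ("bob", "register_done"),
    ("cat", "register_start"), ("dan", "view"),
]
m = behaviour_metrics(events)
print(m)  # half of registration attempts abandoned; a quarter of users share
```

Metrics of this kind point at something a site owner can actually change – a registration flow, a reward, a hook – rather than a number to admire.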
It is possible to capture all of this type of data and to process it in real time. That means crowdsourced projects — or indeed any online project — can adapt to their users in ways that make it much easier to promote positive activity: to help 90s become 9s, to support 9s in spreading the word or engaging with content, and to make 1s more effective.
Data and the narrative web
We do have a tendency to think that a website — like most IT — has to be pre-planned in a profound sense, and that once it is up, the opportunities for adaptation are limited. This is essentially an IT/software view of the world – anticipation is all. That used to be necessary, but it is no longer true.
It is perfectly possible to post-design sites – that is, to create an environment where the site becomes a learning process, and where important elements of design are adapted as the narrative of the contest or challenge unfolds.
It is not necessary to be running challenge-like projects to do that but it becomes more powerful in a contest or challenge-driven environment because people do more, take on more roles and reveal more about themselves.
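One simple way to implement post-design is a bandit-style loop: keep serving whichever design variant best converts lurkers into contributors, while still exploring the alternatives. This is a minimal epsilon-greedy sketch; the variant names are invented, and a real site would use richer conversion signals.

```python
import random

class AdaptiveDesign:
    """Epsilon-greedy sketch of 'post-design': the site mostly serves the
    variant with the best observed conversion rate (e.g. 90s becoming 9s),
    and occasionally explores the others. Variant names are hypothetical."""

    def __init__(self, variants, epsilon=0.1):
        self.epsilon = epsilon
        self.stats = {v: [0, 0] for v in variants}  # [conversions, impressions]

    def rate(self, v):
        conv, seen = self.stats[v]
        return conv / seen if seen else 0.0

    def choose(self):
        # Mostly exploit the best-performing variant, sometimes explore.
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))
        return max(self.stats, key=self.rate)

    def record(self, variant, converted):
        self.stats[variant][1] += 1
        self.stats[variant][0] += int(converted)

# Usage: epsilon=0.0 makes the choice deterministic for illustration.
site = AdaptiveDesign(["big_upload_button", "leaderboard_first"], epsilon=0.0)
for converted in (True, False, False):
    site.record("big_upload_button", converted)
for converted in (True, True, False):
    site.record("leaderboard_first", converted)
print(site.choose())  # leaderboard_first converts better (2/3 vs 1/3)
```

The point is not the particular algorithm but the loop: the site's design keeps changing in response to what the unfolding contest reveals about its users.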
In summary, the way to improve crowdsourcing is:
• Be flexible with boundary conditions and let the crowd define some element of the outcome
• Recognise the emerging narrative nature of the web and use narrative techniques to engage people
• Build post-design adaptation into the project, so that sites adapt as the narrative unfolds
• Segment (1:9:90)
• Use the data to assist the 90s and 9s and promote their participation
• Support the 1s to grow and improve as a group by understanding what tools they need to be great creators.
Narrative, data and adaptation will typify the web within three years. We will all be looking at how we adopt the tools that made movies and novels compelling, and as we do we will have the inestimable advantage of having great, continuous real-time data on how those tools work.
By Jonathan Lakin
Jonathan Lakin is the CEO of Global Dawn, the customer engagement platform. He's also an angel investor and catalyst for new enterprises in data and marketing. Jonathan's company Global Dawn is due to launch a new platform in October which significantly increases the efficiency of engaging customers in co-creation activities in marketing and talent development.