By Ram Rampalli, July 12, 2011

Crowdsourcing Thought Leadership: Building a successful portfolio of crowdsourcing projects (Part 1)

This is part of a series of guest posts by Ram Rampalli, our crowdsourcing partner at eBay.
Part I – Assessment Stage
Part II – Pilot Stage
Part III – Analysis Stage
Part IV – Production Stage

About the author: Ram Rampalli created and leads the crowdsourcing program within the Selling & Catalogs team at eBay Inc. You can follow him on Twitter (@ramrampalli).


Like many other organizations, eBay has leveraged offshore teams in several low-cost destinations to complete many routine operational processes. While many of these projects have been quite successful, outsourcing continues to pose several key challenges, especially around scalability and cost. As eBay began looking at several alternatives to address these challenges, crowdsourcing emerged as an attractive option.


At eBay we have found crowdsourcing to be an incredibly powerful tool for completing large-scale information collection and annotation tasks. Still, as with any new paradigm, there is a learning curve to incorporating crowdsourcing successfully into a larger business process.

In this four-part series, I will introduce a suggested methodology for building a successful portfolio of crowdsourcing projects. Hopefully it will help those new to crowdsourcing get projects up and running with minimal frustration.

I suggest a four-stage model for building a successful portfolio of crowdsourcing projects:

  1. Assessment
  2. Pilot
  3. Optimization and Analysis
  4. Production

The first step – Assessment – has two components: assessing the platform and assessing the task.


The first component is the platform. Different crowdsourcing platforms take different approaches to creating and assessing micro-tasks from given inputs. Things to consider when assessing a crowdsourcing platform:

  • Scalability – Can this platform provide enough annotators to handle the volume of information that you need assessed?
  • Quality – How does this platform ensure accurate results? Does it provide Gold Standard units? Does it offer redundancy (multiple judgments per data point)? Does it offer automated workflows that have been tested at scale?
  • Ease of Integration – How easy is it to feed information to the platform? Once a project is up and running, how easy is it to adjust to changing business rules?

For the purposes of this series, I will use CrowdFlower as the example platform.


The second component is the task. I have created a simple questionnaire that will help you vet your task: if you have a task in mind, simply answer the questions and you'll get an instant assessment.


If you answered “No” to any of these questions, that is a red flag. If your project passed the initial test, you can proceed to the next stage – Pilot.

You may not have a truly binary answer to one or more questions posed above and therefore may be unsure if your task is crowdsourceable. If that is the case, contact CrowdFlower, and a platform specialist will help you assess the task more fully.