By Ram Rampalli, July 20, 2011

Crowdsourcing Thought Leadership: Building a successful portfolio of crowdsourcing projects (Part 2)

This is part of a series of guest posts by Ram Rampalli, our crowdsourcing partner at eBay.
Part I – Assessment Stage
Part II – Pilot Stage
Part III – Analysis Stage
Part IV – Production Stage

About the author: Ram Rampalli created and leads the crowdsourcing program within the Selling & Catalogs team at eBay Inc. You can follow him on Twitter (@ramrampalli).

A quick aside before jumping into Part 2: This series lays out a methodology for compiling a successful portfolio of high-accuracy, deterministic crowdsourcing projects done through the CrowdFlower platform. It is not an absolute methodology for all possible crowdsourcing projects.

In the first part of this series, we discussed the Assessment stage. If your proposed project passed the initial assessment, it graduates to the next stage: Pilot.

The first recommended step in the pilot stage is the pilot requirements specification. In my experience, some ideas that passed the assessment stage were later disqualified after further review with the CrowdFlower team. This prompted us to create a pre-pilot document, in which we ask project sponsors to complete the pilot set-up information below; the document also serves as a second checkpoint:

  • Name
    • Choose a simple and descriptive name.
  • Objective
    • Define clear and concise objectives.
    • Example: Improve the recommendation engine algorithm accuracy from 65% to 90% by end of 2011.
  • Description
    • Describe the task that you expect the worker to perform. If it’s difficult to describe in a few sentences, you may need to refine the task.
    • Often projects are rejected when the sponsor cannot provide a simple job description.
  • Sample Data
    • Provide sample data that is representative of the source data for the ongoing project.
    • This process often results in project sponsors uncovering bottlenecks in the data generation process.
  • Gold Units
    • The project sponsor should either present sample gold units (preferable) or have a plan in place for the gold units.
  • Volume
    • Define how often the test is to be run and the volume for each run.
  • Success Metrics
    • Define the metrics to evaluate the success of this project.
    • If this project is currently run through outsourcing or other means, list the current performance metrics.
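To make the checkpoint concrete, the checklist above could be captured as a simple structured template that flags incomplete items before the pilot is designed. The sketch below is purely illustrative; the field names and example values are hypothetical, not an official CrowdFlower or eBay format.

```python
# Hypothetical sketch: the pre-pilot document as a structured template.
# Field names mirror the checklist above; values are illustrative only.

REQUIRED_FIELDS = [
    "name", "objective", "description",
    "sample_data", "gold_units", "volume", "success_metrics",
]

def missing_fields(spec):
    """Return the checklist items the sponsor has not yet completed."""
    return [f for f in REQUIRED_FIELDS if not spec.get(f)]

spec = {
    "name": "Product category validation",
    "objective": "Improve recommendation accuracy from 65% to 90% by end of 2011",
    "description": "Given a product title, pick the best-matching category.",
    "sample_data": "sample_units.csv",
    "gold_units": "",  # sponsor still needs a plan for gold units
    "volume": "5,000 units per weekly run",
    "success_metrics": "95% agreement with expert-labeled gold units",
}

print(missing_fields(spec))  # -> ['gold_units']
```

A blank entry, like the missing gold-unit plan here, is exactly the kind of gap this second checkpoint is meant to catch before any worker sees the task.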

If the project satisfies the pilot specification document, we work with the CrowdFlower team to design the pilot, which the team tests thoroughly prior to launch.

Many projects are launched in two stages.

Stage I: Controlled Launch

In the first stage, we do a controlled launch of a subset of all units. We monitor the task closely to make sure that it is on track to achieve the success metrics outlined previously. If necessary, we calibrate and optimize the task to improve performance.
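The controlled-launch check can be reduced to a simple comparison of observed quality against the target success metric. The sketch below assumes gold-unit accuracy is the chosen metric; the numbers and threshold are illustrative, not values from the original pilots.

```python
# Hypothetical sketch of the controlled-launch check: compare accuracy
# on gold units so far against the target success metric.

def on_track(correct_gold, total_gold, target_accuracy=0.90):
    """Return True when gold-unit accuracy meets the target metric."""
    if total_gold == 0:
        return False
    return correct_gold / total_gold >= target_accuracy

print(on_track(88, 100))  # calibrate the task before opening it up
print(on_track(94, 100))  # on track for a full launch
```

When the check fails, that is the cue to calibrate the task (instructions, gold units, pay, qualification) before moving to Stage II.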

Stage II: Full Launch

Once we are sure the project is running smoothly, we open up the task at full scale. Upon completion, CrowdFlower sends a copy of the results.

To save time, agree on the structure of the results file up front, so that you receive the data in its most usable format.
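One lightweight way to enforce that agreement is to check the header of the results file against the agreed column layout before anyone starts analysis. The column names below are hypothetical, not an actual CrowdFlower export format.

```python
# Hypothetical sketch: verify a results file matches the column layout
# agreed upon before launch. Column names are illustrative only.
import csv
import io

EXPECTED_COLUMNS = ["unit_id", "worker_judgments", "agreed_answer", "confidence"]

def header_matches(csv_text):
    """Return True if the results file has exactly the agreed columns."""
    header = next(csv.reader(io.StringIO(csv_text)))
    return header == EXPECTED_COLUMNS

sample = (
    "unit_id,worker_judgments,agreed_answer,confidence\n"
    "101,3,Electronics,0.92\n"
)
print(header_matches(sample))  # -> True
```

A mismatch here, caught at hand-off rather than mid-analysis, is where the time savings come from.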

This sets you up for the next phase – Optimization and Analysis.