CrowdSearcher can be characterized as a multi-platform, reactive, expertise-based, and social-networking-based crowdsourcing approach.
The main aim of CrowdSearcher is to provide an effective way of controlling the crowd in crowdsourcing campaigns.
Controlling means adapting the behaviour of the crowdsourcing systems in response to the quantity and timing of completed tasks, the quality of responses and task results, and the profile, availability and reliability of performers.
To this end, we bring together several ingredients:
- crowdsourcing: we define an abstract model of crowdsourcing activities in terms of elementary task types (such as labelling, liking, commenting, sorting, and grouping) performed upon a data set, and we then define a crowdsourcing task as an arbitrary composition of these task types, according to the model below.
- social networking: we show how social platforms, such as Facebook or Twitter, can be used for crowdsourcing search-related tasks, side by side with traditional crowdsourcing platforms.
- multi-platform integration: we allow deployment of abstract crowdsourcing tasks to several invitation and execution platforms, including Twitter, Facebook, and Amazon Mechanical Turk, and we collect and aggregate results from every platform. Crowd performance depends on the execution platform (e.g., Facebook and Twitter immediately collect many responses, but more professional platforms such as Doodle or LinkedIn outperform them over time), on the task type (simpler tasks receive responses more frequently), and on the posting time, topic, and language of tasks. The choice of execution platform can therefore influence the time required to get answers from the crowd (Facebook features lower latency, but Doodle brings in more answers in the long term).
- expertise finding: we analyze how performer profiling can be enriched with the social activity of performers themselves and of their friends or social connections. Experiments show how different profiling options impact the quality and efficiency of crowdsourcing campaigns.
- reactive rules: reactive control is obtained through rules that are formally defined and whose properties (e.g., termination) can be easily proved. Rules are defined on top of data structures derived from the model of the application. Rules are written in reactive style, according to the ECA (Event-Condition-Action) paradigm, and allow making decisions about the production of results, the classification of performers (e.g., identification of spammers), the early termination and re-planning of tasks based on performance measures, the dynamic definition of micro-tasks, and so on. Simple changes in the declarative reactive rules can significantly impact task quality and cost. Reactive rules allow significant savings in execution time and number of executions, as well as improvements in the precision of results.
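The abstract task model described above can be made concrete with a minimal sketch in Python. The names here (`TaskType`, `MicroTask`, `CompositeTask`) are illustrative assumptions, not the actual CrowdSearcher schema:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class TaskType(Enum):
    """Elementary task types defined by the abstract model."""
    LABEL = auto()
    LIKE = auto()
    COMMENT = auto()
    SORT = auto()
    GROUP = auto()

@dataclass
class MicroTask:
    """One elementary operation applied to a set of data items."""
    task_type: TaskType
    objects: list  # the data items shown to the performer

@dataclass
class CompositeTask:
    """A crowdsourcing task as an arbitrary composition of task types."""
    name: str
    steps: list = field(default_factory=list)  # ordered MicroTask steps

    def add(self, task_type, objects):
        self.steps.append(MicroTask(task_type, objects))
        return self  # chaining lets a campaign be composed fluently

# Example: first like-filter a set of images, then label them.
images = ["img1.jpg", "img2.jpg", "img3.jpg"]
task = (CompositeTask("image-triage")
        .add(TaskType.LIKE, images)
        .add(TaskType.LABEL, images))
```

A composite task built this way can then be deployed, step by step, to any of the supported execution platforms.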
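The reactive control loop can be illustrated with a toy ECA engine in Python. The specific rule contents (a majority quorum of 3, early closing of a labelling task) are illustrative assumptions, not CrowdSearcher's built-in rules:

```python
from collections import Counter

class Rule:
    """An Event-Condition-Action rule: on event, if condition holds, act."""
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

def run_rules(rules, event, state):
    """Fire every rule registered for this event whose condition holds."""
    for rule in rules:
        if rule.event == event and rule.condition(state):
            rule.action(state)

# Hypothetical campaign state: labels collected so far for one object.
state = {"labels": [], "closed": False, "majority": None}

def has_majority(state, quorum=3):
    # Condition: some label has reached the agreement quorum.
    counts = Counter(state["labels"])
    return bool(counts) and counts.most_common(1)[0][1] >= quorum

def close_task(state):
    # Action: produce the result early and stop collecting answers.
    state["majority"] = Counter(state["labels"]).most_common(1)[0][0]
    state["closed"] = True

rules = [Rule("answer_received", has_majority, close_task)]

# Each incoming answer raises an "answer_received" event.
for label in ["cat", "cat", "dog", "cat"]:
    if state["closed"]:
        break
    state["labels"].append(label)
    run_rules(rules, "answer_received", state)
```

Here the fourth answer triggers early termination: the task closes with "cat" as the majority result, saving the executions that would otherwise have been spent on further answers. Termination of such a rule set is easy to argue because each action only moves the campaign state forward (toward `closed`).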
CrowdSearcher is implemented as a cloud service, where crowdsourcing campaigns are configured through a Web user interface or through APIs.