Commit 8a5daa86ff2e2e2e11b6f2931eaf5066aa7c332e

Authored by waldispuhl
1 parent a7614cca29
Exists in master

update abstract

Showing 1 changed file with 3 additions and 3 deletions

CHIPLAYpaper/MarketPaper.tex
... ... @@ -125,8 +125,8 @@
125 125  
126 126 \begin{abstract}
127 127 {\color{red} Crowdsourcing with human-computing games is now a well-established approach to help solve difficult computational problems (e.g. Foldit, Phylo). The strategies currently used to distribute problems among participants are limited to (i) delivering the full problem to every single user and asking them to explore the complete search space (e.g. Foldit), or (ii) decomposing the initial problem into smaller sub-problems and aggregating the solutions returned by gamers (e.g. Phylo). The second approach could be used to explore a larger solution space and harness collective intelligence, but interactions between players might lead to groupthink phenomena. Therefore, popular crowdsourcing platforms such as Amazon Mechanical Turk deliberately forbid communication between participants.\\
128   -In this paper, we design a novel multi-player game-with-a-purpose and analyze the impact of multiple game mechanisms on the performance of the system. We present a highly collaborative human-computing game that uses a market, skills, and a challenge system to help the players solve a graph problem.} The results obtained during 12 game sessions of 10 players show that the market helps players build larger solutions. We also show that a skill system and, to a lesser extent, a challenge system can be used to influence and guide the players towards producing better solutions.\\
129   -Our collaborative game-with-a-purpose is open-source and aims to serve as a universal platform for further independent studies.
  128 +In this paper, we design a novel multi-player game-with-a-purpose and analyze the impact of multiple game mechanisms on the performance of the system. We present a highly collaborative human-computing game that uses a market, skills, and a challenge system to help the players collectively solve a graph problem. The results obtained during 12 game sessions of 10 players show that the market helps players build larger solutions. We also show that a skill system and, to a lesser extent, a challenge system can be used to influence and guide the players towards producing better solutions.\\
  129 +Our collaborative game-with-a-purpose is open-source and aims to serve as a universal platform for further independent studies.}
130 130 \end{abstract}
131 131  
132 132 \keywords{Game-with-a-purpose; Human computing; Collaboration; Crowdsourcing; Graph problem; Market; Trading game; Skills; Challenges.}
... ... @@ -134,7 +134,7 @@
134 134 \section{Introduction}
135 135 Human computation and crowdsourcing are now perceived as valuable techniques to help solve difficult computational problems. In order to make the best use of human skills in these systems, it is important to be able to characterize the expertise and performance of humans as individuals and, even more importantly, as groups.
136 136  
137   -Currently, popular crowd-computing platforms such as Amazon Mechanical Turk (AMT) \cite{Buhrmester01012011, Paolacci} or Crowdcrafting \cite{Crowdcrafting} are based on similar divide-and-conquer architectures, where the initial problem is decomposed into smaller sub-tasks that are distributed to individual workers, and the results are then aggregated to build a solution. {\color{red} This is also the case in popular scientific human-computing games such as Phylo [CITE PHYLO]. Importantly,} these systems prevent any interaction between workers in order to avoid groupthink phenomena and bias in the solution \cite{Lorenz:2011aa}.
  137 +Currently, popular crowd-computing platforms such as Amazon Mechanical Turk (AMT) \cite{Buhrmester01012011, Paolacci} or Crowdcrafting \cite{Crowdcrafting} are based on similar divide-and-conquer architectures, where the initial problem is decomposed into smaller sub-tasks that are distributed to individual workers, and the results are then aggregated to build a solution. {\color{red} This is also the case in popular scientific human-computing games such as Phylo \cite{Kawrykow:2012aa,Kwak:2013aa}. Importantly,} these systems prevent any interaction between workers in order to avoid groupthink phenomena and bias in the solution \cite{Lorenz:2011aa}.
138 138  
139 139 However, such constraints necessarily limit the capacity of the system to harness the cognitive power of crowds and take full advantage of collective intelligence. For instance, iterative combinations of crowdsourced contributions can help enhance creativity \cite{DBLP:conf/chi/YuN11}. The usefulness of parallelizing workflows has also been suggested for tasks that accept a broad variety of answers \cite{DBLP:conf/chi/Little10}.
140 140