Salience Effect in Crowdsourcing Contests


Speaker


Abstract

Online platforms typically allow users to contribute freely to the community. Without appropriate controls, however, the behavior of the online community may not align with the platform's design objectives, leading to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform, together with a systematic bias among crowdsourcing workers, affects crowdsourcing outcomes. Specifically, we examine the role of one such systematic bias, the salience effect, in shaping the performance of a crowdsourcing platform (Kaggle), and how the number of crowdsourcing workers moderates the impact of the salience effect through two mechanisms: a parallel path effect and a competition effect. Our results suggest that the salience effect influences the performance of contestants, including contest winners. Furthermore, the parallel path effect cannot completely eliminate the impact of the salience effect, although it attenuates it to some extent. By contrast, the competition effect tends to amplify the impact of the salience effect. Our findings have important implications for crowdsourcing firms and platform designers.