Affiliations 

  • Universiti Tenaga Nasional

Abstract

Crowdsourcing gathers software engineering experts on a specific subject matter and allows organisations and individuals to employ the combined effort of these ‘experts’ to accomplish the software task at hand. However, the knowledge of experts cannot be leveraged without online crowdsourcing platforms, which make such communication possible. This study evaluates the performance of four Crowdsourced Software Engineering (CSE) platforms (TopCoder, InnoCentive, AMT and Upwork) against the criteria of the Web of System Performance (WOSP) model: functionality, usability, security, extendibility, reliability, flexibility, connectivity and privacy. Findings from the analyses showed that the four CSE platforms vary across all of these features and that all of them fall short on the flexibility criterion. The results provide insight into the current status of CSE platforms and highlight the gaps inherent in these platforms, offering a more complete picture of their capabilities. This study contributes to work on enhancing the design of current and future platforms.