While organizations are seeking new approaches to improving system development, methods are often designed and tailored in a less than rigorous manner. There is a tendency to blindly adopt the latest fad in methodology and to measure success in terms of adherence to these methods, without understanding why they are better or how they create value. Principles, practices, and tools are often introduced without explaining what to expect from these new methods or considering their limits. The project’s over-arching goal was to use a design science research (DSR) approach to systems development design to encourage a move toward a more evidence-based assessment of these methods. The study’s artifacts were flow tools and practices customized and redesigned in TechCo, rather than new artifacts that were developed and are unavailable elsewhere. We found that DSR addressed the problem, at least in part, by forcing academic and industry participants to expand on ‘satisficed’ secondary knowledge and engage with ambiguities head-on. The study yields five recommendations: (i) apply DSR-appropriate standards of rigour to evaluating information systems development (ISD) methods; (ii) design and evaluate ISD methods before ISD method components; (iii) design clear and discriminatory metrics for ISD methods; (iv) consider temporal issues when designing and evaluating ISD methods; and (v) be wary of self-referencing metrics when evaluating ISD methods. More fundamentally, we found that both academic and industry participants were operating under evolving conditions of bounded rationality.
Title of host publication: Design Science Research: Cases
Editors: Jan vom Brocke, Alan Hevner, Alexander Maedche
Number of pages: 23
Place of publication: Cham
Publication status: Published - 2020
Series: Progress in IS