Mon 14 - Fri 18 November 2022 Singapore
Mon 14 Nov 2022 11:15 - 11:30 at SRC LT 52 - ESEC/FSE 20 Community Chair(s): Everton Guimaraes

Background. Artifact evaluation was introduced into the software engineering and programming languages research community with a pilot at ESEC/FSE 2011 and has since enjoyed healthy adoption throughout the conference landscape.
Objective. In this qualitative study, we examine the expectations of the community toward research artifacts and their evaluation processes.
Method. We conducted a survey of all members of artifact evaluation committees of major conferences in the software engineering and programming languages field since the first pilot and compared their answers to the expectations set by calls for artifacts and reviewing guidelines.
Results. While we find that some expectations exceed the ones expressed in calls and reviewing guidelines, there is no consensus on quality thresholds for artifacts in general.
We observe very specific quality expectations for particular artifact types, both for review and for later use, but find that these expectations are rarely communicated in calls.
We also find problematic inconsistencies in the terminology used to express artifact evaluation's most important purpose – replicability.
Conclusion. We derive several actionable suggestions that can help mature artifact evaluation in the studied community and aid its introduction into other communities in computer science.

Mon 14 Nov

Displayed time zone: Beijing, Chongqing, Hong Kong, Urumqi

11:00 - 12:30
ESEC/FSE 20 Community (ESEC/FSE 2020) at SRC LT 52
Chair(s): Everton Guimaraes Pennsylvania State University, USA
A Theory of the Engagement in Open Source Projects via Summer of Code Programs
Jefferson Silva PUC-SP, Brazil, Igor Wiese Federal University of Technology - Paraná (UTFPR), Daniel M. German, Christoph Treude University of Melbourne, Marco Gerosa Northern Arizona University, USA, Igor Steinmacher Northern Arizona University, USA
Community Expectations for Research Artifacts and Evaluation Processes
Ben Hermann TU Dortmund, Stefan Winter LMU Munich, Janet Siegmund Chemnitz University of Technology
DOI Media Attached
Heard it Through the Gitvine: An Empirical Study of Tool Diffusion Across the npm Ecosystem
Hemank Lamba, Asher Trockman Carnegie Mellon University, USA, Daniel Armanios Carnegie Mellon University, USA, Christian Kästner Carnegie Mellon University, Heather Miller Carnegie Mellon University, USA, Bogdan Vasilescu Carnegie Mellon University, USA