How do empirical software engineering researchers assess the credibility of practitioner–generated blog posts?

Ashley Williams, Austen Rainer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Background: Blog posts offer potential benefits for research, but also present challenges. The use of blog posts in SE research is contentious for some members of the community. Also, there are no guidelines for evaluating the credibility of blog posts.

Objective: To empirically investigate SE researchers' opinions on the credibility of blog posts, and identify criteria for evaluating blog posts.

Method: We conduct an online survey of software engineering researchers (n=43), to gather opinions on blog-post credibility and credibility criteria.

Results: There is diversity of opinion. The majority of researchers provide a qualified response to the credibility of blog posts: essentially, it depends. Several credibility criteria are valued by researchers, such as Reasoning, Clarity of writing, Reporting empirical data and Reporting methods of data collection. Approximately 60% of respondents thought the criteria generalised to other practitioner- and researcher-generated content.

Conclusion: The survey constitutes the first empirical benchmark of the credibility of blog posts in SE research, and presents an initial set of criteria for evaluating the credibility of blog posts. The study would benefit from independent replication and evaluation.

Original language: English
Title of host publication: Proceedings of EASE 2019 - Evaluation and Assessment in Software Engineering
Publisher: Association for Computing Machinery
Pages: 211-220
Number of pages: 10
ISBN (Electronic): 9781450371452
DOIs: https://doi.org/10.1145/3319008.3319013
Publication status: Published - 15 Apr 2019
Event: 23rd Evaluation and Assessment in Software Engineering Conference, EASE 2019 - Copenhagen, Denmark
Duration: 14 Apr 2019 - 17 Apr 2019

Conference

Conference: 23rd Evaluation and Assessment in Software Engineering Conference, EASE 2019
Country: Denmark
City: Copenhagen
Period: 14/04/2019 - 17/04/2019

Keywords

  • Argumentation
  • Blogs
  • Credibility
  • Evidence
  • Evidence-based software engineering
  • Experience

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Software


Cite this

    Williams, A., & Rainer, A. (2019). How do empirical software engineering researchers assess the credibility of practitioner–generated blog posts? In Proceedings of EASE 2019 - Evaluation and Assessment in Software Engineering (pp. 211-220). Association for Computing Machinery. https://doi.org/10.1145/3319008.3319013