Macquarie University

File(s) stored somewhere else

Please note: linked content is not stored at Macquarie University, and we cannot guarantee its availability, quality, or security, nor accept any liability for it.

Data from: The extent and consequences of p-hacking in science

Dataset posted on 2022-06-10, 02:54, authored by Megan L. Head, Luke Holman, Rob Lanfear, Andrew T. Kahn, Michael D. Jennions
A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as “p-hacking,” occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses.
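As a rough illustration of the kind of test for p-hacking described above, the Python sketch below runs a one-sided binomial test for an excess of p-values just below 0.05 (a "p-hacking bump"). The bin edges (0.040-0.045 versus 0.045-0.050) and the helper name phacking_bump_test are assumptions made for this sketch; the actual procedures and bins used by the authors are documented in the read-me and analysis files inside the archive.

    # Illustrative sketch only: a one-sided binomial test for an excess of
    # p-values just below 0.05. Bin edges and function name are assumptions,
    # not taken from the deposited analysis files.
    from scipy.stats import binomtest

    def phacking_bump_test(p_values):
        # Count p-values in the lower bin (0.040, 0.045] and the upper bin
        # (0.045, 0.050). An over-full upper bin is consistent with p-hacking,
        # whereas a right-skewed p-curve predicts the opposite pattern.
        lower = sum(1 for p in p_values if 0.040 < p <= 0.045)
        upper = sum(1 for p in p_values if 0.045 < p < 0.050)
        n = lower + upper
        if n == 0:
            return None  # no p-values fall in the tested window
        # One-sided test: does the upper bin hold more than half the values?
        return binomtest(upper, n, p=0.5, alternative="greater")

    # Example with made-up p-values
    result = phacking_bump_test([0.041, 0.046, 0.048, 0.049, 0.044, 0.047])
    if result is not None:
        print(f"Binomial test p-value for upper-bin excess: {result.pvalue:.3f}")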

Usage Notes

This zip file consists of three parts:

1. Data obtained from text-mining and the associated analysis files.
2. Data obtained from previously published meta-analyses and the associated analysis files.
3. Analysis files used to conduct meta-analyses of the data.

Read-me files are contained within the zip file.

FILES_FOR_DRYAD.zip


FAIR Self Assessment Rating

  • Unassessed

Data Sensitivity

  • General

Source

Dryad
