The Seven Deadly Sins of Psychology: A Manifesto for Reforming the Culture of Scientific Practice

Princeton University Press, 2017 - 274 pages
1 review

Why psychology is in peril as a scientific discipline--and how to save it

Psychological science has made extraordinary discoveries about the human mind, but can we trust everything its practitioners are telling us? In recent years, it has become increasingly apparent that a lot of research in psychology is based on weak evidence, questionable practices, and sometimes even fraud. The Seven Deadly Sins of Psychology diagnoses the ills besetting the discipline today and proposes sensible, practical solutions to ensure that it remains a legitimate and reliable science in the years ahead.

In this unflinchingly candid manifesto, Chris Chambers draws on his own experiences as a working scientist to reveal a dark side to psychology that few of us ever see. Using the seven deadly sins as a metaphor, he shows how practitioners are vulnerable to powerful biases that undercut the scientific method, how they routinely torture data until it produces outcomes that can be published in prestigious journals, and how studies are much less reliable than advertised. He reveals how a culture of secrecy denies the public and other researchers access to the results of psychology experiments, how fraudulent academics can operate with impunity, and how an obsession with bean counting creates perverse incentives for academics. Left unchecked, these problems threaten the very future of psychology as a science--but help is here.

Outlining a core set of best practices that can be applied across the sciences, Chambers demonstrates how all these sins can be corrected by embracing open science, an emerging philosophy that seeks to make research and its outcomes as transparent as possible.

What others are saying

User review

tl;dr: great book. Read.
The “Seven Sins” is concerned with the validity of psychological research. Can we, at all or to what degree, be certain about the conclusions reached in psychological
research? Recently, replication efforts have cast doubt on our confidence in psychological research (1). In a similar vein, a recent paper states that in many research areas researchers mostly report “successes”, in the sense that they report that their studies confirm their hypotheses, with psychology leading in the proportion of supported hypotheses (2). Too good to be true? In light of all this unease, Chambers’ book addresses some of the (possible) roots of the problem of the (un)reliability of psychological science. Specifically, Chambers names seven “sins” that the psychological research community appears to be guilty of: confirmation bias, data tuning (“hidden flexibility”), disregard of direct replications (and related problems), failure to share data (“data hoarding”), fraud, lack of open access publishing, and fixation on impact factors.
Chambers is not alone in speaking out about some dirty little (or not so little) secrets and tricks of the trade. Discomfort with the status quo is gaining momentum (3, 4, 5, 6); see also the work of psychologists such as J. Wicherts, F. Schönbrodt, D. Bishop, J. Simmons, S. Schwarzkopf, R. Morey, or B. Nosek, to name just a few. For example, the German Psychological Society (DGPs) recently opened up (more) towards open data (7). However, a substantial number of prominent psychologists oppose this more open approach towards higher validity and legitimacy (8). Thus, Chambers’ book has hit a nerve with many psychologists. True, a lot is at stake (9, 10, 11), and a train wreck may be in the making. Chambers’ book knits together the most important aspects of the replicability (or reproducibility) debate; it is the first “umbrella book” on the topic, as far as I know. Personally, I feel that only one point would merit more scrutiny: the unchallenged assumption that psychological constructs are metric (12, 13, 14). Measurement is the bedrock of any empirical science. Without precise measurement, it appears unlikely that any theory will advance. Still, psychologists sadly turn a deaf ear to this issue. Simply assuming that my sum score possesses a metric level of measurement is not enough (15).
The book is well written, a pleasure to read, and suitable for a number of couch evenings (as in my case). Although it is methodologically sound, as far as I can tell, no special statistical knowledge is needed to follow and benefit from the exposition.
The last chapter is devoted to solutions (“remedies”); arguably, this is the most important chapter in the book. Again, Chambers manages to pull together the most important trends, concrete ideas, and more general, far-reaching avenues. For him, the most important measures are a) preregistration of studies, b) judging journals by their replication quota and strengthening the replication effort as such, c) open science in general (see the Openness Initiative and the TOP guidelines), and d) new ways of conceiving the role of journals. Well, maybe he does not focus so much on that last part, but I find the point quite sensible. One could argue that publishers such as Elsevier have managed to suck far too much money out of the system, money that is ultimately paid by taxpayers and by the research community. Basically, scientific journals do two things: host manuscripts and steer peer review. Remember that journals do not do the peer review themselves; it is provided for free by researchers. As hosting is very cheap nowadays, and peer review comes with little input from the publishers, why not come up with new, more cost-efficient, and more reliable ways of publishing? One may think that money is not of primary concern for science, but that truth is. However, science, like most societal endeavors, depends entirely on the trust and confidence of the wider public. Wasting that trust destroys the funding base. Hence, science cannot
 


About the author (2017)

Chris Chambers is professor of cognitive neuroscience in the School of Psychology at Cardiff University and a contributor to the Guardian science blog network.
