Open Science and research quality at the German conference on psychology (DGPs congress in Leipzig)

From September 17th to 22nd, the 50th anniversary congress of the German Psychological Society (DGPs) takes place in Leipzig. At previous conferences in Germany over the last two or three years, the topic of the credibility crisis and research transparency has sometimes been covered and sometimes been completely ignored.


I am therefore quite happy that this topic now has a prominent place at the current conference. Here’s a list of talks and events focusing on Open Science, research transparency, and what a future science could look like – see you there!


Sunday, Sep 18: Pre-conference workshop “How to do Open Science: Chancen, Möglichkeiten, Standards” [Opportunities, Possibilities, Standards] (Susann Fiedler, Kai Jonas, Erich Weichselgartner)


Sunday, Sep 18: Pre-conference workshop “Dos and Don’ts of data analysis: Lessons from the replication crisis” (Felix Schönbrodt)


Tuesday, Sep 20, 10-12: Invited symposium “Reproducibility and trust in psychological science”, chaired by Jelte Wicherts (Tilburg University)

From the abstract:

In this symposium we discuss issues related to reproducibility and trust in psychological science. In the first talk, Jelte Wicherts will present some empirical results from meta-science that perhaps lower the trust in psychological science. Next, Coosje Veldkamp will discuss results bearing on actual public trust in psychological science and in psychologists from an international perspective. After that, Felix Schönbrodt and Chris Chambers will present innovations that could strengthen reproducibility in psychology. Felix Schönbrodt will present Sequential Bayes Factors as a novel method to collect and analyze psychological data and Chris Chambers will discuss Registered Reports as a means to prevent p-hacking and publication bias. We end with a general discussion.


  • Reproducibility problems in psychology: what would Wundt think? (Jelte Wicherts)
  • Trust in psychology and psychologists (Coosje Veldkamp)
  • Never underpowered again: Sequential Bayes Factors guarantee compelling evidence (Felix Schönbrodt)
  • The Registered Reports project: Three years on (Chris Chambers)


For details on the talks, see here.


Tuesday, Sep 20, 13:30: Keynote by Brian Nosek: “Addressing the Reproducibility of Psychological Science”

The currency of science is publishing.  Producing novel, positive, and clean results maximizes the likelihood of publishing success because those are the best kind of results.  There are multiple ways to produce such results: (1) be a genius, (2) be lucky, (3) be patient, or (4) employ flexible analytic and selective reporting practices to manufacture beauty.  In a competitive marketplace with minimal accountability, it is hard to resist (4).  But, there is a way.  With results, beauty is contingent on what is known about their origin.  With methodology, if it looks beautiful, it is beautiful. The only way to be rewarded for something other than the results is to make transparent how they were obtained.  With openness, I won’t stop aiming for beautiful papers, but when I get them, it will be clear that I earned them.


Tuesday, Sep 20, 14:30: Panel discussion: “Assuring the Quality of Psychological Research”

Moderation: Manfred Schmitt
Discussants: Manfred Schmitt, Andrea Abele-Brehm, Klaus Fiedler, Kai Jonas, Brian Nosek, Felix Schönbrodt, Rolf Ulrich, Jelte Wicherts

When replicated, many findings seem to either diminish in magnitude or to disappear altogether, as, for instance, recently shown in the Reproducibility Project: Psychology. Several reasons for false-positive results in psychology have been identified (e.g., p-hacking, selective reporting, underpowered studies) and call for reforms across the whole range of academic practices. These range from (1) new journal policies promoting an open research culture to (2) hiring, tenure, and funding criteria that reward credibility and replicability rather than sexiness and quantity to (3) actions for increasing transparent and open research practices within and across individual labs. Following Brian Nosek’s (Center for Open Science) keynote, titled “Addressing the Reproducibility of Psychological Science”, this panel discussion aims to explore the various ways in which our field may take advantage of the current debate. That is, the focus of the discussion will be on effective ways of improving the quality of psychological research in the future. Seven invited discussants provide insights into different current activities aimed at improving scientific practice and will discuss their potential. The audience will be invited to contribute to the discussion.

I will present the German Psychological Society’s new guidelines for data management. They will be published soon, but here’s the gist: open by default (raw data are an essential part of a publication), and exceptions must be justified. Furthermore, we define norms for data reuse. Stay tuned on this blog for more details!


Tuesday, Sep 20, 18:00: Positionsreferat (position talk) – Towards Evidence-Based Peer Review (Malte Elson)


Wednesday, Sep 21, 10:45: Beyond Registered Experiments – The Foundations of Cumulative Empirical Research

  • What Does it Mean to Replicate? (Christoph Klauer)
  • About the Recognizable Reality in Measures of Latent Psychological Attributes (Florian G. Kaiser)
  • The Stochastic Theory of Causal Effects and its Implications for Assuring the Quality of Quasi-Experimental Research (Rolf Steyer)
  • Diagnosticity and A-Priori Theorizing (Klaus Fiedler)

Wednesday, Sep 21, 14:45: Reproducible Paper Writing (Sebastian Sauer)

[…] In any case, a reanalysis of the data must result in similar or identical results.
[…] In this talk, we present a method that is well suited for writing reproducible academic papers: a combination of LaTeX, R, knitr, Git, and Pandoc. These software tools are robust, well established, and no more than reasonably complex. Additional approaches, such as word processors (MS Word), Markdown, or online collaborative writing tools (Authorea), are presented briefly. The presentation is based on a practical software demonstration. A GitHub repository for easy reproducibility is provided.
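As a rough illustration of how these tools typically fit together (this sketch is not from the talk itself, and the file names `paper.Rnw`/`paper.tex` are hypothetical): knitr executes the R chunks embedded in a LaTeX source file and emits plain LaTeX, which is then compiled to PDF; a Makefile ties the steps together so that anyone who clones the Git repository can rebuild the paper with a single command.

```makefile
# Hypothetical minimal pipeline: knitr runs the R code chunks in
# paper.Rnw and writes paper.tex, which pdflatex then compiles.
paper.pdf: paper.tex
	pdflatex paper.tex

paper.tex: paper.Rnw
	Rscript -e 'knitr::knit("paper.Rnw")'

# Remove generated files; only paper.Rnw (and the data) stay under
# version control.
.PHONY: clean
clean:
	rm -f paper.tex paper.pdf paper.aux paper.log
```

Because the numbers and figures in the PDF are regenerated from the raw data on every build, a reanalysis that yields different results would immediately change the manuscript – which is exactly the reproducibility guarantee the talk describes.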



These are not all sessions on the topic. Unfortunately, the congress CMS does not allow direct linking to individual sessions, so go to http://www.dgpskongress.de/frontend/index.php# and search for “ASSURING THE QUALITY OF PSYCHOLOGICAL RESEARCH” to find all associated sessions yourself.


Want to meet me at the conference? Write me an email, or send me a PM on Twitter.

© 2017 Felix Schönbrodt | Impressum | Datenschutz | Contact