I’m honoured that the Berkeley Initiative for Transparency in the Social Sciences (BITSS) chose me for one of the 2016 Leamer-Rosenthal Prizes for Open Social Science! This award comes with a prize of $10,000 and “recognizes important contributions by individuals in the open science movement”. For my contributions to Open Science, see this website. For more details about the prize, which is generously donated by the John Templeton Foundation, see here.
BITSS has become one of the central hubs of the global open science movement and does a great job of providing open educational resources (e.g., “Tools and Resources for Data Curation” or “How to Write a Pre-Analysis Plan”), awarding grants, running the open science catalysts program, and hosting its annual meeting in San Francisco.
Other recipients of the prize were Eric-Jan Wagenmakers, Lorena Barba, Zacharie Tsala Dimbuene, Abel Brodeur, Elaine Toomey, Graeme Blair, Beth Baribault, Michèle Nuijten, and Sacha Epskamp.
I am optimistically looking forward to a future of credible, reproducible, and transparent research. Stay tuned for news from our work at the department’s Open Science Committee at LMU Munich!
From 17th to 22nd September, the 50th anniversary congress of the German Psychological Society (DGPs) takes place in Leipzig. At previous conferences in Germany over the last two or three years, the topic of the credibility crisis and research transparency was sometimes covered and sometimes completely ignored.
Therefore, I am quite happy that this topic now has a really prominent place at the current conference. Here’s a list of some talks and events focusing on Open Science, research transparency, and what the science of the future could look like – see you there!
From the abstract:
In this symposium we discuss issues related to reproducibility and trust in psychological science. In the first talk, Jelte Wicherts will present some empirical results from meta-science that perhaps lower the trust in psychological science. Next, Coosje Veldkamp will discuss results bearing on actual public trust in psychological science and in psychologists from an international perspective. After that, Felix Schönbrodt and Chris Chambers will present innovations that could strengthen reproducibility in psychology. Felix Schönbrodt will present Sequential Bayes Factors as a novel method to collect and analyze psychological data and Chris Chambers will discuss Registered Reports as a means to prevent p-hacking and publication bias. We end with a general discussion.
For details on the talks, see here.
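As an aside, here is a rough sketch of the logic behind a Sequential Bayes Factor design (my own illustration, not code from the talk; the sample sizes, threshold, and simulated effect are made up): data are collected in small batches, and testing stops as soon as the Bayes factor provides sufficiently strong evidence in either direction.

```
# Hypothetical sketch of a Sequential Bayes Factor (SBF) design in R
library(BayesFactor)

set.seed(1)
n_min     <- 20    # minimum sample size before the first look at the data
batch     <- 5     # number of participants added per step
threshold <- 10    # stop when BF10 > 10 (evidence for H1) or BF10 < 1/10 (for H0)
n_max     <- 200   # safety cap on the total sample size

x <- rnorm(n_min, mean = 0.4)   # simulated observations; real data would go here

repeat {
  bf <- extractBF(ttestBF(x, mu = 0))$bf   # BF10 for a one-sample t-test against 0
  if (bf > threshold || bf < 1 / threshold || length(x) >= n_max) break
  x <- c(x, rnorm(batch, mean = 0.4))      # collect the next batch of participants
}

cat("Stopped at n =", length(x), "with BF10 =", round(bf, 2), "\n")
```

Unlike optional stopping with p-values, this kind of sequential monitoring is unproblematic for Bayes factors, which is what makes the design attractive for efficient data collection.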
The currency of science is publishing. Producing novel, positive, and clean results maximizes the likelihood of publishing success because those are the best kind of results. There are multiple ways to produce such results: (1) be a genius, (2) be lucky, (3) be patient, or (4) employ flexible analytic and selective reporting practices to manufacture beauty. In a competitive marketplace with minimal accountability, it is hard to resist (4). But, there is a way. With results, beauty is contingent on what is known about their origin. With methodology, if it looks beautiful, it is beautiful. The only way to be rewarded for something other than the results is to make transparent how they were obtained. With openness, I won’t stop aiming for beautiful papers, but when I get them, it will be clear that I earned them.
Moderator: Manfred Schmitt
Discussants: Manfred Schmitt, Andrea Abele-Brehm, Klaus Fiedler, Kai Jonas, Brian Nosek, Felix Schönbrodt, Rolf Ulrich, Jelte Wicherts
When replicated, many findings seem to either diminish in magnitude or disappear altogether, as, for instance, recently shown in the Reproducibility Project: Psychology. Several reasons for false-positive results in psychology have been identified (e.g., p-hacking, selective reporting, underpowered studies) and call for reforms across the whole range of academic practices. These range from (1) new journal policies promoting an open research culture, to (2) hiring, tenure, and funding criteria that reward credibility and replicability rather than sexiness and quantity, to (3) actions for increasing transparent and open research practices within and across individual labs. Following Brian Nosek’s (Center for Open Science) keynote, titled “Addressing the Reproducibility of Psychological Science”, this panel discussion aims to explore the various ways in which our field may take advantage of the current debate. That is, the focus of the discussion will be on effective ways of improving the quality of psychological research in the future. Seven invited discussants will provide insights into different current activities aimed at improving scientific practice and will discuss their potential. The audience will be invited to contribute to the discussion.
I will present the German Psychological Society’s new guidelines for data management. They will be published soon, but here’s the gist: open by default (raw data are an essential part of a publication); exceptions should be justified. Furthermore, we define norms for data reuse. Stay tuned on this blog for more details!
[…] In any case, a reanalysis of the data must yield similar or identical results.
[…] In this talk, we present a method that is well suited for writing reproducible academic papers. The method is a combination of LaTeX, R, knitr, Git, and Pandoc. These software tools are robust, well established, and no more than reasonably complex. Additional approaches, such as word processors (MS Word), Markdown, or online collaborative writing tools (Authorea), are presented briefly. The presentation is based on a practical software demonstration. A GitHub repository for easy reproducibility is provided.
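To give a rough idea of what such a knitr/LaTeX pipeline looks like (a minimal sketch of my own, not material from the talk; the file name, chunk label, and simulated data are made up), a reproducible manuscript embeds the analysis directly in the source document:

```
% paper.Rnw -- hypothetical minimal example of a reproducible manuscript
\documentclass{article}
\begin{document}

<<analysis, echo=FALSE>>=
set.seed(42)                 # fixed seed so the simulated "data" are reproducible
x <- rnorm(100, mean = 0.3)  # stand-in for the real data set
m <- mean(x)
@

The mean of our variable is \Sexpr{round(m, 2)}. Because this number is
computed when the document is compiled, text and analysis cannot drift apart.

\end{document}
```

Running knitr::knit("paper.Rnw") produces paper.tex, which is then compiled with pdflatex; Pandoc can convert the output for co-authors who prefer other formats, and the whole pipeline (source, data, history) lives in a Git repository.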
These are not all sessions on the topic, and the congress CMS does not allow direct links to individual sessions – go to http://www.dgpskongress.de/frontend/index.php# and search for “ASSURING THE QUALITY OF PSYCHOLOGICAL RESEARCH” to find all sessions associated with this topic.
Want to meet me at the conference? Write me an email, or send me a direct message on Twitter.
“If you torture the data long enough, it will confess.”