ReproducibiliTea

About ReproducibiliTea

Join us at the ReproducibiliTea journal club for an informal discussion about how we can improve transparency, reproducibility and openness in research – with tea (and other refreshments)!

We welcome staff and research students working in any discipline, regardless of experience or expertise.

The programme of events is set out below, together with the article underpinning each month's theme.

We meet on the third Thursday of each month from 12.00 to 13.00 in the Philip Robinson Library Committee Room (unless indicated otherwise). Participants unable to attend in person are welcome to join us via Zoom.

You're welcome to drop in to any of our ReproducibiliTea meetings, but if you want to join online or stay up to date with these and other events, please join us on Teams or on our mailing list.

Programme for 2022/23

Meetings will take place 12.00 to 13.00 in the Philip Robinson Library Committee Room (unless indicated otherwise). Participants unable to attend in person are welcome to join us via Zoom; details will be sent out via the UKRN Newcastle mailing list and Teams channel.

Date | Theme | Paper
20/10/22 | Analytical flexibility illustrated | Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine, 2(8), e124. https://doi.org/10.1371/journal.pmed.0020124
17/11/22 | Scientific originality through the historical lens | Wagenmakers, E.-J., Dutilh, G., & Sarafoglou, A. (2018). The Creativity-Verification Cycle in Psychological Science: New Methods to Combat Old Idols. Perspectives on Psychological Science, 13(4), 418–427. https://doi.org/10.1177/1745691618771357
15/12/22 | Replicating landmark studies | Klein, R. A., et al. (2018). Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490. https://doi.org/10.1177/2515245918810225
Postponed due to UCU strike action, new date TBC | Best practices in using technology in research | Wilson, G., Bryan, J., Cranston, K., Kitzes, J., Nederbragt, L., et al. (2017). Good enough practices in scientific computing. PLOS Computational Biology, 13(6). https://doi.org/10.1371/journal.pcbi.1005510
20/04/23 | Replication: important or not? | Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, B. (2018, August 1). Making Replication Mainstream. Preprint. https://doi.org/10.31234/osf.io/4tg9c
18/05/23 | The case for Open Access | Tennant, J. P., Waldner, F., Jacques, D. C., et al. (2016). The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research, 5, 632. https://doi.org/10.12688/f1000research.8460.3
15/06/23 | Measurement crisis in the making | Hussey, I., & Hughes, S. (2018, November 19). Hidden invalidity among fifteen commonly used measures in social and personality psychology. Preprint. https://doi.org/10.31234/osf.io/7rbfp
20/07/23 | Responding to errors in a fallible science | Bishop, D. V. M. (2018). Fallibility in Science: Responding to Errors in the Work of Oneself and Others. Advances in Methods and Practices in Psychological Science, 1(3), 432–438. https://doi.org/10.1177/2515245918776632 (accepted manuscript)