Everything Hertz

  • Author: Various
  • Narrator: Various
  • Publisher: Podcast
  • Duration: 168:44:34


Synopsis

A podcast by scientists, for scientists. Methodology, scientific life, and bad language. Co-hosted by Dr. Dan Quintana (University of Oslo) and Dr. James Heathers (Northeastern University).

Episodes

  • 44: Who’s afraid of the New Bad People? (with Nick Brown)

    19/05/2017 Duration: 01h08min

    James and Dan are joined by Nick Brown (University of Groningen) to discuss how the New Bad People — also known as shameless little bullies, vigilantes, the self-appointed data police, angry nothings, scientific McCarthyites, second-stringers, whiners, the Stasi, destructo-critics, and wackaloons* — are trying to improve science. Here’s what they cover: - Power imbalances in academia - Publication bias - Euphemisms for people who are publicly critical of science - How to go about questioning the scientific record - Peer-reviewed criticism vs. blog posts - Making meta-analysis easier - Data recycling - Well-being and genomics - Popular science books and conflicts of interest - The ‘typical’ response to a Letter to the Editor - What Dan and James do during the breaks - Why don’t people report descriptive statistics anymore? - Priming studies - Science in the media - What Nick has changed his mind about Links: Nick on Twitter - @sTeamTraen Nick’s blog - http://steamtraen.blogspot.no * This list is from one of James’ blog posts https://medium.c

  • 43: Death, taxes, and publication bias in meta-analysis (with Daniel Lakens)

    05/05/2017 Duration: 01h02min

    Daniel Lakens (Eindhoven University of Technology) joins James and Dan to talk meta-analysis. Here’s what they cover: - Daniel’s opinion on the current state of meta-analysis - The benefit of reporting guidelines (even though hardly anyone actually follows them) - How fixing publication bias can fix science - Meta-analysis before and after that Bem paper - How to correct for publication bias - Whether meta-analyses are just published for the citations - The benefits of pre-registering meta-analyses - How we get people to share their data - How sharing data doesn’t just benefit others - it also helps you replicate your own analyses later - Success is tied to funding, no matter how “cheap” your research is - How people can say “yes” to cumulative science, but “no” to sharing data - Responding to mistakes - How to find errors in your own papers before submission - We ask Daniel: i) If he could show one slide to every introductory psychology lecture in the world, what would it say?, ii) What has he changed his mind about in the last few y

  • 42: Some of my best friends are Bayesians (with Daniel Lakens)

    21/04/2017 Duration: 01h07min

    Daniel Lakens (Eindhoven University of Technology) drops in to talk statistical inference with James and Dan. Here’s what they cover: - How did Daniel get into statistical inference? - Are we overdoing the Frequentist vs. Bayes debate? - What situations better suit Bayesian inference? - The over-advertising of Bayesian inference - Study design is underrated - The limits of p-values - Why not report both p-values and Bayes factors? - The “perfect t-test” script and the difference between Student’s and Welch’s t-tests - The two one-sided tests (TOST) procedure - Frequentist and Bayesian approaches to stopping procedures - Why James and Dan started the podcast - The worst bits of advice that Daniel has heard about statistical inference - Dan discusses a new preprint on Bayes factors in psychiatry - Statistical power - Excel isn’t all bad… - The importance of accessible software - We ask Daniel about his research workflow: how does he get stuff done? - Using blog posts as a way of gauging interest in a topic - Chris Chambers’ new book: The seven deadly sins of
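
    A brief, hedged sketch of the Student's vs. Welch's distinction mentioned above: Student's t-test assumes equal group variances, Welch's does not, and scipy's ttest_ind switches between the two with its equal_var argument. The simulated data below are purely illustrative.

```python
# Student's vs. Welch's t-test: same data, different variance assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.normal(0.0, 1.0, size=30)   # group with SD = 1
b = rng.normal(0.5, 3.0, size=20)   # group with SD = 3 (unequal variances)

t_student, p_student = stats.ttest_ind(a, b, equal_var=True)   # Student's t-test
t_welch, p_welch = stats.ttest_ind(a, b, equal_var=False)      # Welch's t-test

print(f"Student: t = {t_student:.2f}, p = {p_student:.3f}")
print(f"Welch:   t = {t_welch:.2f}, p = {p_welch:.3f}")
```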

  • 41: Objecting to published research (with William Gunn)

    07/04/2017 Duration: 01h07min

    In this episode, Dan and James are joined by William Gunn (Director of Scholarly Communications at Elsevier) to discuss ways in which you can object to published research. They also cover: - What differentiates an analytics company from a publishing company? - How scientific journals are one of the last areas to fully adopt the dynamic nature of the internet - Data repositories - How to make a correction in a journal - The benefits of Registered Reports - When everyone asked Elsevier for a journal of negative results but no one submitted to it - How the unit of publication isn’t really indicative of science as a process - Altmetrics and gaming the system - How to appeal to a journal about a paper - Citation cartels: the dumbest crime - William’s switch from research to publishing and his shift in perspective - The crackpot index - James’ flowchart on how to contact an editor - The copyediting process - Elsevier’s approach to open peer review: should junior researchers be worried? - The one thing William thinks that everyone else thinks is c

  • 40: Meta-research (with Michèle Nuijten)

    24/03/2017 Duration: 49min

    Dan and James are joined by Michèle Nuijten (Tilburg University) to discuss 'statcheck', an algorithm that automatically scans papers for statistical tests, recomputes p-values, and flags inconsistencies. They also cover: - How Michèle dealt with statcheck criticisms - Psychological Science’s pilot of statcheck for journal submissions - Detecting data fraud - When should a journal issue a correction? - Future plans for statcheck - The one thing Michèle thinks that everyone else thinks is crazy - Michèle's most worthwhile career investment - The one paper that Michèle thinks everyone should read Links: Michèle's website: https://mbnuijten.com Michèle's twitter account: https://twitter.com/michelenuijten Statcheck: https://statcheck.io Tilburg University meta-research center: http://metaresearch.nl Guardian story on detecting science fraud: https://www.theguardian.com/science/2017/feb/01/high-tech-war-on-science The paper Michèle thinks everyone should read: http://opim.wharton.upenn.edu/DPlab/papers/publishedP
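
    The consistency check described above can be sketched in a few lines. This is not the statcheck package itself, just an illustration of the idea under simple assumptions: recompute a p-value from a reported test statistic and its degrees of freedom, then flag the result if it disagrees with the reported p-value at the reported rounding precision (the function name and two-decimal rounding assumption are ours).

```python
# Illustrative statcheck-style consistency check for a reported t-test result.
from scipy import stats

def check_t_test(t_value, df, reported_p, decimals=2, two_tailed=True):
    """Recompute the p-value for a reported t-test and flag inconsistencies."""
    p = stats.t.sf(abs(t_value), df)
    if two_tailed:
        p *= 2
    # Treat the reported value as rounded to `decimals` places (an assumption
    # about reporting precision) and compare against the recomputed value.
    consistent = round(p, decimals) == round(reported_p, decimals)
    return p, consistent

# "t(28) = 2.20, p = .04" recomputes to p ≈ .036, which rounds to .04: consistent.
# A reported "p = .02" for the same statistic would be flagged.
recomputed, ok = check_t_test(2.20, 28, 0.04)
print(f"recomputed p = {recomputed:.3f}, consistent = {ok}")
```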

  • 39: Academic hipsters

    10/03/2017 Duration: 54min

    We all know hipsters. You know, like the guy that rides his Penny-farthing to the local cafe to write his memoirs on a typewriter - just because it’s more ‘authentic’. In this episode, James and Dan discuss academic hipsters. These are people who insist you need to use specific tools in your science like R, Python, and LaTeX. So should you start using these trendy tools despite the steep learning curve? Other stuff they cover: - Why James finally jumped onto Twitter - A new segment: 2-minute hate - The senior academic that blamed an uncredited co-author for data anomalies - An infographic ranking science journalism quality that’s mostly wrong - When to learn new tools, and when to stick with what you know - Authorea as a good example of a compromise between "easy" and "reproducible" Links: The science journalism infographic http://www.nature.com/news/science-journalism-can-be-evidence-based-compelling-and-wrong-1.21591 Facebook page www.facebook.com/everythinghertzpodcast/ Twitter account www.twitter.com/hertzpodcast M

  • 38: Work/life balance - Part 2

    24/02/2017 Duration: 01h02min

    Dan and James continue their discussion on work/life balance in academia. They also suggest ways to get your work done within a sane number of hours, as well as how to pick the right lab. Some of the topics covered: - Feedback from our last episode - Why the podcast started in the first place - The "Red Queen" problem - Does the "70 hour lab" produce better work? - Some experiments aren't suited to a 9-5 schedule - More tips for anonymously skiving off at work - What are the cognitive limits of focused work? - Do early career researchers even earn the minimum wage when you factor in the hours worked? - How James gets things done: work on one thing at a time until it's done, and protect your time - How Dan gets things done: Pomodoros (40 mins work, 10 minute break), blocking social/news websites - How do you pick a lab to work in? Links: Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.twitter.com/hertzpodcast

  • 37: Work/life balance in academia

    17/02/2017 Duration: 56min

    In this episode, we talk work/life balance for early career researchers. Do you need to work a 70-hour week to be a successful scientist, or can you actually have a life outside the lab? Some of the topics covered: - An update on "the postdoc that didn't say no" story - Brian Wansink's response - De-identifying data in research - The perils of public criticism - Criticising the research vs. criticising the person - Some sage advice from a senior academic on "Making science the centre of your life" - Look for a boss that won't make insane demands of your time - How much good work is really coming out of a 70-hour week? - An old hack Dan used to do to pretend he was working on data when he was really just on Twitter Links: GRIM test calculator http://www.prepubmed.org/grim_test/ Jordan's follow-up post https://medium.com/@OmnesRes/the-donald-trump-of-food-research-49e2bc7daa41#.me8e97z51 Brian Wansink's response http://www.brianwansink.com/phd-advice/statistical-heartburn-and-long-term-lessons The "Making science the centre of yo

  • 36: Statistical inconsistencies in published research

    27/01/2017 Duration: 50min

    In episode 34 we covered a blog post that highlighted questionable analytical approaches in psychology. That post mentioned four studies that resulted from this approach, which a team of researchers took a closer look into. Dan and James discuss the statistical inconsistencies that the authors reported in a recent preprint. Some of the topics covered: - Trump (of course) - A summary of the preprint - The GRIM test to detect inconsistencies - The researchers that accidentally administered the equivalent of 300 cups of coffee to study participants - How do we prevent inconsistent reporting? - The 21-word solution for research transparency - Journals mandating statistical inconsistency checks, such as 'statcheck' Links: The preprint https://peerj.com/preprints/2748/ 'The grad student that didn't say no' blog post http://www.brianwansink.com/phd-advice/the-grad-student-who-never-said-no The caffeine study http://www.bbc.com/news/uk-england-tyne-38744307 Tobacco and Alcohol Research Group lab handbook (see page 6 for open science pr
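
    The GRIM test mentioned above (published by Nick Brown and James Heathers) checks whether a mean reported from integer data, such as Likert responses, is mathematically possible given the sample size. A minimal sketch of the idea follows; it skips the rounding edge cases the published test handles, and the example numbers are invented for illustration.

```python
# Simplified GRIM check: with n integer scores, the true mean must be a
# multiple of 1/n, so a reported mean that cannot round to such a value
# is inconsistent with the reported sample size.
def grim_consistent(reported_mean, n, decimals=2):
    """Check whether a mean reported to `decimals` places is possible for n integer scores."""
    nearest_possible = round(reported_mean * n) / n  # closest achievable mean
    return round(nearest_possible, decimals) == round(reported_mean, decimals)

print(grim_consistent(5.19, 28))  # False: no sum of 28 integers gives a mean of 5.19
print(grim_consistent(5.18, 28))  # True: 145 / 28 ≈ 5.179, which reports as 5.18
```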

  • 35: A manifesto for reproducible science

    20/01/2017 Duration: 50min

    Dan and James discuss a new paper in the inaugural issue of Nature Human Behaviour, "A manifesto for reproducible science". Some of the topics covered: - What's a manifesto for reproducibility doing in a Nature group journal? - Registered reports - The importance of incentives to actually make change happen - What people should report vs. what they actually report - A common pitfall of published meta-analyses - The reliance on metrics in hiring decisions and the impact of open science practices - Tone police - How do we transition to open science practices? - SSRN preprints being bought by Elsevier - Authors getting gouged by copyediting costs (and solutions) - Does being 'double-blind' extend to doing your analysis blind? - Trial monitoring is expensive Links: The paper http://www.nature.com/articles/s41562-016-0021 Our paper on reporting standards in heart rate variability http://www.nature.com/tp/journal/v6/n5/full/tp201673a.html Equator guidelines http://www.equator-network.org Facebook page https://www.facebook.com/everythi

  • 34: E-health (with Robin Kok)

    22/12/2016 Duration: 01h11s

    Dan and James have their very first guest! For this episode they're joined by Robin Kok (University of Southern Denmark) to talk e-health. They also cover a recent blog post that inadvertently highlighted questionable research practices in psychology. Some of the topics covered: - The grad student who never said no - Postdoc work/life balance - Questionable research practices - Torturing data (with rattan sticks) - Using the GRIM test to assess data accuracy - Unpaid internships - Saying 'yes' to opportunities that come your way - The Myers-Briggs test is rubbish - What is e-health? - Are e-health interventions efficacious? - E-health intervention implementation issues - The poor quality of psych intervention smartphone apps - Using "Facebook Live" to broadcast conference presentations - The future of e-health Links: Robin's twitter account https://www.twitter.com/robinnkok "The grad student who never said no" blog post http://www.brianwansink.com/phd-types-only/the-grad-student-who-never-said-no The Buzzfeed quiz on 'Which Disney prince

  • 33: Zombie theories

    16/12/2016 Duration: 43min

    Dan and James discuss zombie theories, which are scientific ideas that continue to live on in the absence of evidence. Why do these ideas persist, and how do we kill them for good? Some of the topics covered: - Why do some ideas live on? - Zombie theories in heart rate variability research - Reasons why zombie theories proliferate more in the social sciences - Attractiveness and simplicity - Theories become brands - Oxytocin zombie theories - The power of shaming - Ideas are corrected more quickly in smaller fields - James' new interest in cow ECG - People using science as a weapon to open up hip pockets - How do we kill these zombies for good? - Manual vs. automated PubMed comments - What's the impact of paper retraction on future citations? - How do you correct the scientific record? Links: Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.twitter.com/hertzpodcast

  • 32: Can worrying about getting sick make you sicker?

    01/12/2016 Duration: 43min

    Dan and James discuss a new population study that linked health anxiety data with future heart disease. Some of the topics covered: - Web MD and health anxiety - How would health anxiety contribute to heart disease? - A summary of the study - Ischemic heart disease = coronary artery disease - Do people with health anxiety take better care of their health? - Don't be fooled by the percentage increase in risk for something that's rare - There are some things you can't just randomize - The pros and cons of big data collection Links: The paper http://bmjopen.bmj.com/content/6/11/e012914.full Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.twitter.com/hertzpodcast
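
    A quick worked example of the point about percentage increases in risk for rare outcomes; the numbers below are invented for illustration, not taken from the paper discussed.

```python
# Relative vs. absolute risk for a rare outcome (made-up numbers).
baseline_risk = 0.005      # 0.5% chance of the outcome without the exposure
relative_increase = 0.70   # a scary-sounding "70% higher risk"

exposed_risk = baseline_risk * (1 + relative_increase)
extra_per_1000 = (exposed_risk - baseline_risk) * 1000

print(f"Absolute risk rises from {baseline_risk:.2%} to {exposed_risk:.2%}, "
      f"about {extra_per_1000:.1f} extra cases per 1,000 people.")
```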

  • 31: Discover your psychiatric risk with this one weird trick

    16/11/2016 Duration: 54min

    Dan and James discuss a recent study of over one million Swedish men that found that a higher resting heart rate in late adolescence was associated with an increased risk for subsequent psychiatric illness. Some of the topics covered: - How did these authors get such an enormous dataset? - The benefits of testing so many people - What we liked about the study (hint: lots of things) - Measuring cardiovascular efficiency using a cycle ergometer - The pitfalls of self-reported physical activity - How the media covered this study - Contextual factors: does the testing environment induce anxiety? - Co-morbidity in psychiatry - What would James do with 200,000 ECG strips? Links: The paper http://jamanetwork.com/journals/jamapsychiatry/fullarticle/2569454 The Daily Mail story http://www.dailymail.co.uk/health/article-3875062/Why-heartbeat-teenager-affect-later-life-Boys-high-blood-pressure-risk-mental-health-problems-adults.html?linkId=30382089 Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.

  • 30: Authorship

    02/11/2016 Duration: 49min

    Dan and James discuss authorship in the biomedical sciences.

  • 29: Learning new skills

    16/10/2016 Duration: 48min

    Dan and James talk about how they learn new things. Some of the topics discussed: - Internet memes - Consolidating old ideas rather than learning new ones - Why learn a new skill when you can just get someone else to do it? - A lesson on not having a good understanding of statistical software... - James and Dan butt heads about meta-analysis (again) - Learning new things is interesting - How did people learn things before the internet? - How to follow things on Twitter without being on Twitter Links: Bayes factor paper with 'primer' paper matrix https://alexanderetz.com/2016/02/07/understanding-bayes-how-to-become-a-bayesian-in-eight-easy-steps/ Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.twitter.com/hertzpodcast

  • 28: Positive developments in biomedical science

    30/09/2016 Duration: 49min

    Pre-registration, p-hacking, power, protocols. All these concepts are pretty mainstream in 2016 but were hardly discussed 5 years ago. In this episode, James and Dan talk about these ideas and other developments in biomedical science. Some of the topics discussed: - James loves: blinded reviews, Sci-Hub - Dan loves: protocols, learning stats through social media, reproducible science Links: The COMPARE initiative - http://compare-trials.org "Give me the F-ing PDF" Chrome extension https://chrome.google.com/webstore/detail/give-me-the-f-ing-pdf/iekjpaipocoglamgpjoehfdemffdmami/related Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.twitter.com/hertzpodcast

  • 27: Complaints and grievances

    23/09/2016 Duration: 52min

    Dan and James discuss complaints and grievances. Stay tuned for part 2, where things get positive. Some of the topics discussed: - Conflicts of interest - Criticism in psychology - Why does there seem to be so much bad blood in psychology? - Retracted papers: fraud or sloppiness? - Authors not acknowledging your peer-review remarks - The short-term nature of research - The benefits of 'centers of excellence' Links: The 'vibe of the thing' scene from 'The Castle' (great Aussie film) https://www.youtube.com/watch?v=wJuXIq7OazQ Facebook page https://www.facebook.com/everythinghertzpodcast/ Twitter account https://www.twitter.com

  • 26: Interpreting effect sizes

    09/09/2016 Duration: 45min

    When interpreting the magnitude of group differences using effect sizes, researchers often rely on Cohen's guidelines for small, medium, and large effects. However, Cohen originally proposed these guidelines as a fallback for when the distribution of effect sizes was unknown. Despite the hundreds of available studies comparing heart rate variability (HRV), Cohen's guidelines are still used for interpretation. In this episode, Dan discusses his recent preprint describing an effect size distribution analysis of HRV studies. Some of the topics discussed: - A summary of Dan's preprint - What is an effect size? - What can an effect size distribution tell us? - How effect sizes can inform study planning - How close are Cohen's guidelines to the distribution of HRV effect sizes? - What sample sizes are appropriate? - Pre-publication review vs. post-publication review - Statcheck Links: The preprint article http://www.biorxiv.org/content/early/2016/08/31/072660 Statcheck https://mbnuijten.com/statcheck/ Facebook page https://www.fa
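
    For listeners unfamiliar with the effect size being discussed, here is a minimal sketch of Cohen's d for two independent groups, the statistic conventionally read against Cohen's 0.2 / 0.5 / 0.8 benchmarks for small, medium, and large effects. The HRV numbers are simulated for illustration and are not from Dan's preprint.

```python
# Cohen's d for two independent groups, using the pooled standard deviation.
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference between two independent groups."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)
hrv_group_a = rng.normal(50, 10, size=40)  # hypothetical HRV scores, not real data
hrv_group_b = rng.normal(45, 10, size=40)

print(f"d = {cohens_d(hrv_group_a, hrv_group_b):.2f}")  # ~0.5 would be a 'medium' effect
```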

  • 25: Misunderstanding p-values

    27/08/2016 Duration: 55min

    P-values are universal, but do we really know what they mean? In this episode, Dan and James discuss a recent paper describing the failure to correctly interpret p-values in a sample of academic psychologists. Some of the topics discussed: - Common p-value misconceptions - James tests Dan on his p-value knowledge - p-values vs. effect sizes - The problem of sample size in p-value interpretation - The Facebook mood manipulation study - Data peeking - Equivalent p-values do not represent equivalent results - Meta-analytical thinking - Using significance as a categorical factor - Statistical vs. clinical significance - Clinical trial registration and 'secondary outcome creep' - Dan and James answer listener questions - Science communicator vs. scientist - Grant titles and the 'Pub test' - NASA and social media Links: The article http://journal.frontiersin.org/article/10.3389/fpsyg.2016.01247/full Geoff Cumming's book (we got the name completely wrong - sorry Geoff!) http://www.amazon.com/Understanding-The-New-Statistics-Meta-Analysis-eb
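
    To make the "equivalent p-values do not represent equivalent results" point concrete: an independent-samples t-test that just reaches p = .05 implies a very different effect size depending on sample size. The sketch below uses the standard relationship d = t * sqrt(1/n1 + 1/n2); the numbers are illustrative, not taken from the paper discussed.

```python
# The same two-tailed p = .05 corresponds to very different effect sizes at
# different sample sizes (independent-samples t-test, equal group sizes).
from scipy import stats

def d_at_p05(n_per_group):
    """Cohen's d implied by a result that just reaches two-tailed p = .05."""
    df = 2 * n_per_group - 2
    t_critical = stats.t.isf(0.025, df)           # critical t for two-tailed p = .05
    return t_critical * (2 / n_per_group) ** 0.5  # d = t * sqrt(1/n1 + 1/n2)

print(f"n = 10 per group:   d ≈ {d_at_p05(10):.2f}")    # roughly 0.94
print(f"n = 1000 per group: d ≈ {d_at_p05(1000):.2f}")  # roughly 0.09
```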
