
Distrust

By Gary Smith

This book argues that our growing distrust of science is fuelled by tools scientists themselves created, as technological advances and developments in data analysis have led to disinformation, data torturing, and data mining. Smith examines these issues and offers solutions for restoring the credibility of the scientific community.

“Whenever I hear about provocative research, my default assumption is that it is wrong.”

The media, from journals to social media and everything in between, are filled with lies. Gary Smith, who seems to live to debunk fraudsters, takes them all on in Distrust, then settles in on scientific studies and the abuses researchers regularly employ to deceive and to achieve fame. Not to put too fine a point on it: 70% of psychologists don't trust the psychology studies they themselves read. This is Smith's chosen world, and he has penetrated it to a remarkable, and most helpful, extent.

The book is great fun. It's lovely to watch Smith demolish the fraud in every medium. Many of the examples will prove familiar to readers, from Russian bots to ChatGPT, passing through crypto along the way. The chapters all end not with a conclusion but with a short section titled The Irony, usually to the effect that people don't trust this medium, which they enjoy thanks to the tireless work of scientists whom they also don't trust. Given Smith's career-long history of precision-bombing the fraudsters, the book itself is, ironically, trustworthy. Because he suffers along with us.

The structure is simple, which makes the book easy to read. Smith breaks the fraud into three distinct flavors: disinformation, tortured data, and data mining. Several chapters then follow each of them straight downhill.

Disinformation is the flavor everyone is most confronted with, and Americans are a large part of their own problem. Nearly 75% believe in the paranormal, despite all the debunking and proof against it. Every year, more Americans believe the 2020 election was actually won by Donald Trump. Meanwhile, 20-30% continue to believe the American moon landing was faked. The National UFO Reporting Center has received an average of five sightings every day since 1974, with California the favorite state for aliens, most often during the week of the 4th of July, just so you know. This is disinformation at work. From hackers to bots (which might actually be the majority of social media accounts), the pressure to confuse is huge. And users are nothing if not gullible.

Even scientists citing each other’s work cite the debunked studies more often than the truthful ones, he shows. Everybody loves a good story. It’s the outrageous, bogus ones that go viral. In other words, everybody’s doing it, and Science is no more saintly than hackers and trolls.

Science is getting beaten up every which way it turns. For all the inventions, services, and unprecedented living standards science has given mankind, it is regarded with ever more suspicion. Smith says: “To the extent that scientific research is used by governments to justify policies that people dislike, science is viewed as part of the problem.” But it's also not nearly that simple.

He has delightful stories from all kinds of sources, but he concentrates the bulk of the book on the professionals, the scientists themselves. Their deceit, their trickery, their out-and-out fraud is worse than garden-variety internet lies, if only because science has the structure, rules, checks, and respect meant to prevent such things.

In the search for reasons why scientists go rogue, Smith cites Goodhart’s Law that when a measure becomes a target, it ceases to be a good measure. Think of teaching to the test, because nothing matters more than the test score. Both the teaching and the test score are severely devalued by it.

So it is with scientists and their p-value, which measures statistical significance. When it is 5% or less, the result counts as statistically significant, and scientists will therefore do seemingly anything to hit that number. They will ignore date ranges and age categories and discount outliers, anything to make their data show the results they want, with a p-value of .05. Not .051 or, God forbid, .06 or more, but .05. Smith titles one chapter in the data-torturing section Squeezing Blood From Rocks.
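The arithmetic behind this kind of p-hacking is easy to see. A hypothetical sketch (my illustration, not from the book): under a true null hypothesis a p-value is uniformly distributed, so an analyst who quietly tries twenty variations of an analysis has roughly a 64% chance that at least one comes out "significant" at .05 by luck alone.

```python
import random

random.seed(1)

def chance_of_false_positive(num_tests, alpha=0.05, trials=100_000):
    """Monte Carlo estimate of the chance that at least one of
    num_tests null-hypothesis tests yields p < alpha. Under the null,
    each p-value is uniform on [0, 1], so each test 'succeeds' with
    probability alpha."""
    hits = 0
    for _ in range(trials):
        if any(random.random() < alpha for _ in range(num_tests)):
            hits += 1
    return hits / trials

# Compare the simulation with the exact answer, 1 - (1 - alpha)^k.
for k in (1, 5, 20):
    print(k, round(chance_of_false_positive(k), 3), round(1 - 0.95**k, 3))
```

With one honest test the false-positive rate stays at 5%; with twenty tries it climbs past 60%, which is why only the one "winning" analysis ever reaches the journal.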

The cheating goes to astonishing lengths. Half of all drug studies are not replicable, because researchers tortured the data to fit their goal of showing a drug's efficacy (and then everyone is surprised when the drug doesn't work as advertised, except for the massive, unexpected side effects). The vaunted journals that publish the studies often don't even read them, and neither, it appears, do the journal reviewers, the backstop of the whole validation system.

Half the journals themselves are fraudulent; they will publish anything if the authors send money. There is so much scientific publishing going on that “half of all articles are not read by anyone other than the author, journal editor, and journal reviewers.” But it goes on the résumé, and that is really all that matters in a publish-or-perish community. Scientists gaming the system have coalesced into author machines, with hundreds or even thousands claiming authorship of a single paper. Smith says the author list can run several dozen pages preceding a six-page paper. There aren't enough words in it for each of them to have contributed one.

Contrary to the contracts they sign, many researchers will refuse to give other scientists their data so the study can be repeated and verified. Unfortunately, it can take years for a fraudulent paper to be retracted by the journal, though that seems to be happening more and more often. In the meantime, whole careers are made: TED talks given, books published, speaking tours extended, and thousands influenced by untruths. Then, very often, after the truth comes out, the authors' careers and reputations suffer little or no damage and the fraudulent claims continue to circulate as facts. Is it any wonder that people distrust scientists? Smith himself says, “Whenever I hear about provocative research, my default assumption is that it is wrong.”

Added recently to this mix is the app, mostly thanks to “smart” watches. All kinds of software are being pre-certified by the FDA without randomized controlled testing, then simply released to a gullible public. No one knows whether any of them have any healthcare value, but they come with implicit FDA approval. Smith calls them digital snake oil.

He has a delightful section on the British Medical Journal's annual Christmas issue, which profiles the most ridiculous of the ridiculous studies. And they really are laughable. Sadly, real money was authorized to conduct them, and scientist-authors work to promote them to the scientific community. Readers will see that to some audiences they could prove believable, if they weren't so transparently idiotic. They should make readers think twice before spreading internet facts like the claims that hurricanes with female names do more damage or that the Chinese tend to die around the fourth day of the month. (These were real studies.) The Ig Nobel Prizes, awarded by the Annals of Improbable Research, get in on the act, honoring the worst of the worst. That's what has become of science. As Bob Saget said, “No, seriously, I read it. I wrote it down and I read it. It must be true. I believe everything I read!”

Data mining is the gift of computers. They can classify unlimited information, determine patterns between and among classifications, and spit out relationships from the data all day long. It has come to the point where scientists don't need a theory they seek to prove. Just let the computers rip: the data will provide interesting correlations that lead to a theory. That the theory is totally meaningless is of no concern. It is a publishable theory, with a huge bank of data behind it.

The relationships that data mining produces could never be found by human workers; there are so many classifications and variables that only a computer could evaluate and match them. Smith says there are so many that they are worthless: “If the number of true relationships yet to be discovered is limited, while the number of coincidental patterns is growing exponentially with the accumulation of more and more data, then the probability that a randomly discovered pattern is real inevitably approaches zero.” And in a lovely vicious circle, he adds, “The fundamental issue is not that Internet data are flawed (which they are), but that data mining is flawed. The real problem with Internet data is that they encourage people to data mine.”
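Smith's point about coincidental patterns is easy to demonstrate with a toy simulation (again my own sketch, not the book's): generate fifty columns of pure random noise, correlate every pair, and "strong" relationships pour out anyway.

```python
import random
import statistics

random.seed(42)

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# 50 variables of pure noise, 20 observations each: 1,225 pairs to mine.
n_vars, n_obs = 50, 20
data = [[random.gauss(0, 1) for _ in range(n_obs)] for _ in range(n_vars)]

strong = sum(
    1
    for i in range(n_vars)
    for j in range(i + 1, n_vars)
    if abs(pearson(data[i], data[j])) > 0.5
)
print(f"{strong} 'strong' correlations found in pure noise")
```

None of these correlations mean anything, yet each would make a publishable headline; and the more variables you add, the more of them the mining turns up.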

Then there is artificial intelligence (AI), the hot fad of the moment. Smith agrees with other authors I have read that computers, and AI, are stupid. He says, and cites others claiming, that AI simply spouts BS all day long. The degree of intelligence demonstrated by AI is nil.

They have no way of evaluating anything. They only identify and match. With enough data behind them, they can plausibly appear intelligent. But the mistakes they make would not be made by a six-year-old human. They routinely misidentify objects because the angle is different, or the lighting is different, or a drawing doesn't have the heft of a photo. Smith's clear drawing of a child's wagon with red and white stripes is identified, with absolute confidence, as a candy cane.

AI will answer the same question three different ways, sometimes incoherently, because it doesn't actually master a language; it classifies words. It also randomizes plausible answers so its responses don't appear to be a canned script. AI is far from being able to take over the other side of a conversation. Smith has spent time working with the state-of-the-art ChatGPT and finds it untrustworthy, to put it mildly: “Computers are autistic savants and their stupidity makes them dangerous.”

But some of the same stupidity shows up in human observers. Internet surveys show that people believe the most technical jobs will be the first to go to AI. Surgeons, who must make split-second decisions and act on unexpected conditions, will, they believe, be the first to be replaced. Yet menial jobs like cooking, cleaning, maintenance, and service will remain human domains, survey participants say. This is precisely the opposite of what will happen: rote tasks will fall to AI, while highly technical jobs like financial evaluators, genomic advisors, and surgeons will remain human responsibilities. They might employ AI systems to aid them, but AI cannot and will not replace them.

I have read most of the stories Smith tells elsewhere, but his context, framing them in terms of disinformation, data torturing, and data mining, gives them new perspective. Knowing what to fear, and why, is valuable.

David Wineberg | Apr 11, 2023
À propos | Contact | LibraryThing.com | Respect de la vie privée et règles d'utilisation | Aide/FAQ | Blog | Boutique | APIs | TinyCat | Bibliothèques historiques | Critiques en avant-première | Partage des connaissances | 207,006,154 livres! | Barre supérieure: Toujours visible