Media Predictably Concluded A Study Showed Sexism. Here’s What It Actually Showed.

Whenever a study shows even a slight difference in the treatment of boys and girls, or men and women, the media will run with it, crying that sexism has been proven.

The most recent example comes from a study of participants’ reactions to a video of a child crying while getting blood drawn. Participants were told the child was named either “Samantha” or “Samuel” and asked to rate his or her pain based on what they saw in the video. Overall, “Samantha’s” pain was rated at 45.9 on a 100-point scale and “Samuel’s” at 50.4 on the same scale.

Cue the dramatic outrage.

“Why Are We Still Dismissing Girls’ Pain,” wrote Laurie Edwards, a “science writer” at the New York Times.

“A new study finds Americans take the pain of girls less seriously than that of boys,” was the headline from CNN.

“Study shows gender bias starts early with girls’ pain taken less seriously,” wrote USA Today.

“Parents Take Young Boys’ Pain More Seriously Than Young Girls’ Pain,” said the headline from Fatherly.

“Hmm, Apparently People Don’t Care As Much When Girls Cry As When Boys Do,” wrote Cosmopolitan, naturally.

And on it goes.

Brian D. Earp, one of the researchers for the paper, took to Twitter to point out how the media got his study so wrong.

“From gender bias to media bias? A thread on how our study looking at adult perceptions of children’s pain got misconstrued. A reminder of the importance of taking media coverage with a grain of salt, reading original studies when possible, and guarding against confirmation bias,” Earp wrote.

He then provided screenshots of the headlines listed above and explained how the media completely invented the notion that the results showed “parents” or “Americans” didn’t care about the pain of young girls.

“First, the effect in our study was observed *only* in female participants–not ‘Americans’ or ‘parents’ or ‘people’ in general. In fact, male participants rated girl pain *higher* than boy pain (albeit not to a statistically significant degree),” Earp tweeted.

Also, according to Earp, the study “did not measure ‘sexism’ or ‘credibility’ or the extent to which adult raters ‘cared’ about the pain of boys vs. girls–or even whether they took the pain of boys ‘more seriously’ than that of girls,” and it didn’t “show that pain is ‘often missed’ in girls.”

Further, while the overall difference in participants’ ratings of the child’s pain was “statistically significant,” the researchers don’t yet know its practical or clinical significance and do not speculate on it. But here’s the real kicker, from Earp:

“But even more importantly, we go on to note that the difference was driven entirely by female participants: when you look at men only, there is NO statistical difference in ratings; when you look at women, it’s 45.7 for Samantha vs. 53.1 for Samuel (effect size d = .34).”
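For readers unfamiliar with the notation, Cohen’s d is simply the gap between two group means divided by their pooled standard deviation. The means below (53.1 and 45.7) are the female raters’ figures Earp reports; the pooled standard deviation is a hypothetical value back-solved for illustration, not a number from the paper:

```python
# Cohen's d: standardized difference between two group means.
# The two means come from the reported results; the pooled standard
# deviation is a hypothetical illustration (not reported in the tweet
# thread), chosen so the numbers line up with the reported d = .34.

def cohens_d(mean_a, mean_b, pooled_sd):
    """Standardized mean difference (Cohen's d)."""
    return (mean_a - mean_b) / pooled_sd

samuel_mean = 53.1    # female raters' mean pain rating for "Samuel"
samantha_mean = 45.7  # female raters' mean pain rating for "Samantha"
assumed_sd = 21.8     # hypothetical pooled SD, for illustration only

d = cohens_d(samuel_mean, samantha_mean, assumed_sd)
print(round(d, 2))  # → 0.34
```

By Cohen’s rough conventional benchmarks (0.2 “small,” 0.5 “medium”), d = .34 falls between a small and a medium effect, which is consistent with the researchers declining to claim any practical or clinical significance.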

This means it was women who judged boys’ pain as more significant than girls’ pain, which completely blows the sexism argument out of the water. Earp says he and the other researchers “did not predict this” and “are not sure how to explain the observed female-only bias in ratings of child pain based on perceived gender.”

He did offer one possible explanation: That the idea that “boys don’t cry” affected the participants’ views toward “Samuel’s” pain level.

“The inference might go something like this: ‘Boys in our culture are taught to act tough & strong when they are in pain & not to cry out, so if a boy DOES cry out… & express he is in pain, then perhaps he is experiencing quite a lot of pain–enough to behave in a way that boys are stereotypically taught not to behave,’” Earp tweeted. “If that is the right explanation, it would say nothing about not taking girls’ pain (as) seriously, dismissing their pain, caring less about their pain, etc.”

Earp eventually concluded that this is a “complex” issue in need of further study, but his Twitter thread is an important read on how badly the media reports the findings of studies. Most reporters never read the full study, relying instead on the executive summary. I’ve documented several instances of this before, and they almost always involve studies from which reporters can wring a “sexism” headline.

via Daily Wire