Why Discuss When You Can Just Censor?
Daily newspapers are hemorrhaging readers, and no wonder: they focus on covering up information rather than putting it out there and disputing it if need be.
Sometimes I wonder why they even call them newspapers anymore -- perhaps out of nostalgia -- instead of cover-up papers or avoid-controversy papers.
Kevin Roderick posted on LAObserved that the LA Times had promoted a woman to be a science reporter:
Eryn Brown will be the paper's general assignment science reporter, though she won't be one of those newsroom specialists who brings expertise to a beat. Brown has been, most recently, the paper's letters editor, but was a writer before that.
Roderick linked to the LAT's blog item of the announcement:
Eryn Brown will be joining the Health/Science team as a general assignment science reporter. She'll be reporting both on large scientific discoveries and on the practical science behind current events. As The Times' Letters editor, a position she's held since 2008, Eryn has been distilling the often passionate and personal views of L.A. Times readers. In her new job, she'll be distilling the often passionate but scientific work of researchers and scientists. Her beat will be a broad one, covering science as it touches an array of disciplines and departments. Eryn previously worked at Fortune magazine in New York, writing features about technology, dot-com culture and heavy industry. She moved to Los Angeles in 2002, where she freelanced for The Times (including the Los Angeles Times Magazine), the New York Times, Wired and other publications. She joined The Times' editorial board in January 2006, where she wrote about the economy, water policy and healthcare.
Notice any science or science reporting background in there?
I left a comment on the blog item -- one which remained unposted. I check back on the LA Times' comments I leave, when I remember, because I suspect they'll just ditch ones they don't like.
Well, that's exactly what they did.
And they did it to the wrong girl.
I left another comment (along the lines of "Where's my comment?") which also wasn't posted.
And then I wrote to the LA Times Reader's Rep to ask why it wasn't posted:
Subject: comment not published on your entry
http://latimesblogs.latimes.com/readers/2010/07/sports-designer-movie-editor-science-writer-named.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+readersblog+(Readers%27+Representative+Journal)
Why wasn't it?
Do you still have the comment?
I had an experience with another paper where they got a really racist letter in response to an article I was quoted in, and their response was not to respond to it, rather than print it, expose the racism, and have a discussion about it.
I took time to write down information from an epidemiologist who trains me in how to read studies. I'm super-irritated that the comment hasn't been posted. Criticism not welcomed, just pretend-welcomed? -Amy Alkon
PS Note that I have to be suspicious about my comment and check. Yuck.
The Reader's Rep wrote back:
Ms. Alkon,
Thank you for following up on your comment. I had thought it was inaccurate in that Eryn Brown is not a young, inexperienced reporter, and I regret that the truncated posting I published made it sound that way. I also think it's a bit unfair to assume that she is unqualified; why not read her articles and then make that call?
However, I've thought about the comment some more, and you're right that it should be published. It doesn't violate the terms of service, and I am an advocate of transparency. I am posting it now.
Best regards,
Deirdre Edgar
Readers' representative
Los Angeles Times
http://latimes.com/readers
Twitter: @LATreadersrep
She's "an advocate of transparency"? When caught, it seems.
From her e-mail, it seems she didn't really read my post. Reporting on science takes an understanding of science and scientific methodology -- knowing that studies have limitations, and that authors sometimes misrepresent their data or don't even understand it, just for starters.
I've recently had an experience in researching a column where I found that a study that had been chronicled in numerous books as if it were a good study had sample sizes of, I think, 9 and 26. Not cool. Then, the follow-up studies -- since the 70s! -- really didn't have sufficient evidence, or they were mucked up by problems with the study design. I spent three weeks agonizing over the column, trying to dig up studies with solid evidence, and I just couldn't find them. Yet, again, if you look in numerous books that are supposedly science books, written by people considered solid researchers, the study and follow-up studies aren't approached at all critically.
But, back to my comment -- now on the site, and which I've posted below:
It's extremely disturbing that the LAT is promoting a woman to science reporter who lacks a science background. It's no small thing to report on science, and to know when studies have limitations that make their findings invalid. A recent example in the media -- a study out of the University of Texas on "cougar" sex: I've read the study, and every single report in the media I read on it had sloppy errors in it.
I know times are tough, but papers need to invest in training science reporters, not just promote any diligent young reporter with a notebook and a pen and a few years' experience in other areas.
Here's one thing you can pass on to your reporter, from an epidemiologist who helps me be rigorous in my assessment of studies: "There is no such thing as a perfect study -- every study of humans has major flaws that handicap any attempt to draw general conclusions from it (often, however, one can draw specific conclusions, like that the authors are incompetent). Some studies just have fewer or smaller flaws than others on their topic -- so learn to think along a continuum, not just 'good data' or 'fudge.'"
Also, you don't just read one study on a topic -- you look at a body of work and see if they have similar findings.
To learn stats -- jeez, or start -- pick up a book an evolutionary psychologist/university prof recommended to me: Biostatistics: The Bare Essentials.
Really, it's just shameful that you're promoting this woman out of nowhere to report on science. It's not like reporting on cute cat stories or quoting some Wall Street guy.
To the reporter: If you want guidance on how to do what you need to do to train yourself, please contact me, and I mean that in a helpful way, not a condescending one.
Posted by: Amy Alkon | July 21, 2010 at 11:40 AM
Well, that's the time stamp from when I submitted it; their Reader's Rep actually posted it much later.
This is right on, but I can't help thinking that you'll be pigeonholed: "What? An advice columnist? What could she know, what doily to put under the teacups?"
But one can hope, even though the media won't call Patrick Smith or Phil Plait when they can get some schlock to sound frantic and sell Kleenex.
Radwaste at July 23, 2010 2:58 AM
I think I may have said this here before, but... Some years ago I was on an airplane flight, and I got to talking to my seat-mate about how poorly the mainstream media covers science and technology issues. I was griping about a news story I'd seen that day concerning an airplane crash, in which they had identified the aircraft as a 737 when it was actually a Fokker 100, which is not even remotely similar. He told me that he was a lawyer and that it was a truism in his profession that newspapers could always be counted on to get basic facts about courtroom trials wrong; he noted an AP article he'd seen recently concerning a criminal trial that he had been following, and they had mis-identified a prosecution witness as being a defense witness.
Cousin Dave at July 23, 2010 6:41 AM
Here's what happens when reporters without science experience report on studies, not knowing the difference between an observational (cohort) study and a randomized, double-blind, placebo-controlled trial, the kind that has a chance of measuring something specific:
http://www.aolnews.com/surge-desk/article/study-meat-eaters-pack-on-pounds-regardless-of-calories/19561807
Katie Drummond, of AOLNews, most stupidly writes that "Meat Eaters Pack on More Pounds, Regardless of Calories"
Seven years of exhaustive research by investigative science reporter Gary Taubes say otherwise.
Utter idiocy. Utter idiocy.
This reminds me of the study that was reported as "breast implants cause suicide," because they found that more women who had breast implants committed suicide.
Correlation is not causation.
Katie should stay away from scientific topics.
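To make "correlation is not causation" concrete, here's a toy simulation -- all numbers are hypothetical and have nothing to do with the actual implant study. A hidden factor (call it distress) raises both the chance of getting implants and the risk of suicide, while the implants themselves do nothing; the implant group still shows the higher rate.

```python
import random

def simulate(n=200_000, seed=1):
    """Hypothetical model: a latent 'distress' factor raises both the
    probability of implants and the probability of suicide; implants
    themselves have ZERO causal effect on suicide in this model."""
    random.seed(seed)
    suicides_with = suicides_without = 0
    n_with = n_without = 0
    for _ in range(n):
        distressed = random.random() < 0.10
        p_implant = 0.08 if distressed else 0.02
        p_suicide = 0.02 if distressed else 0.002  # driven only by distress
        has_implant = random.random() < p_implant
        suicide = random.random() < p_suicide
        if has_implant:
            n_with += 1
            suicides_with += suicide
        else:
            n_without += 1
            suicides_without += suicide
    return suicides_with / n_with, suicides_without / n_without

rate_with, rate_without = simulate()
print(f"rate with implants:    {rate_with:.4f}")
print(f"rate without implants: {rate_without:.4f}")
# The "with" group shows a markedly higher rate -- with no causal link at all.
```

An observational study that ignored the confounder would report exactly the kind of headline Amy describes.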
Amy Alkon at July 23, 2010 7:19 AM
Dr. Eades had an excellent post a while back on cohort studies:
http://www.proteinpower.com/drmike/statistics/observational-studies-2/
Amy Alkon at July 23, 2010 7:27 AM
Taubes on it here:
http://www.nytimes.com/2007/09/16/magazine/16epidemiology-t.html?_r=1&pagewanted=all
Amy Alkon at July 23, 2010 7:28 AM
Incompetence, inexperience, and even blatant liars are the norm these days when it comes to mediatypes and politicians. One only needs to be familiar with a teleprompter to be considered credible, regardless of content or agenda.
jksisco at July 23, 2010 7:48 AM
Augh!
It doesn't matter how many years she has as a reporter or writer; if she has no science training, then she is unqualified. Period.
People like her getting science-journalist gigs are one of the biggest problems with the level of scientific knowledge in this country. Knowing the most accepted theories, or thinking scientific knowledge is great, is not enough; a good science reporter must understand the philosophy of science just as much, and why we rely on the scientific method.
Only then will they understand that every sensational headline they print, and every article they write from nothing but a paper's abstract, actually does harm to their readers.
plutosdad at July 23, 2010 7:51 AM
> it was a truism in his profession that newspapers
> could always be counted on to get basic facts
> about courtroom trials wrong
Not just law, ANYTHING.
Remember when you were a kid, and you found something that interested you enough to learn more about it than other people knew? Some hobby, or industrial process, or esoteric phenomenon that most people didn't see every day? And your family thought it was great that you liked it so much, even if they kind of glazed over when you tried to share your enthusiasm? And then one day someone from the local paper came by and you were kind of excited to think you'd soon be able to share conversation about your interest with the entire community?
But then, two weeks later, the article came out, riddled with inaccuracies and warpage of both details and major themes?
It's like that for EVERYTHING. It's like that ALL THE TIME. These people aren't especially bright, and are often flatly inattentive.
But most who learned to enjoy newspapers eventually set aside that distrust, because reading a paper is an indulgence, after all. It's something to do with our hands while we're waiting for the eggs at the diner, and in the olden days it was what we were counting on for information... This wasn't the fifteen minutes of the day when we wanted to be most cynical.
But we should have been, and we still should be.
Crid [CridComment at gmail] at July 23, 2010 8:06 AM
Hey, let's not go nuts about the term "qualified," either. It's a form of the "appeal to authority" fallacy to assume that the facts of any article are correct just because of the background of the reporter. What credentials really do is allow the reporter to get to the heart of a complex matter better -- and you know that some geniuses can't string simple sentences together.
Look at Amy. Unless you did some digging, you'd never guess she's spent so much time in evo psych; what, to pound somebody being a bozo in a daily?
"He told me that he was a lawyer and that it was a truism in his profession that newspapers could always be counted on to get basic facts about courtroom trials wrong..."
Which is why so many think OJ "got off".
One more thing about "the news": never forget that the events presented to you were covered by fewer than six people at the scene most of the time. Keep that in mind.
Radwaste at July 23, 2010 8:14 AM
Skepticism is underrated.
MarkD at July 23, 2010 8:19 AM
Unless you did some digging, you'd never guess she's spent so much time in evo psych
I started out giving free advice on the street corner as a joke, but I've worked very hard to learn scientific methodology and to put science in my column. I continue to take a humble approach to it -- for example, asking David Buss, in my posting about the study he and his students did recently, if I got anything wrong. He gave me a more nuanced view of self-reported sex data than I previously had, and I posted it. My goal isn't to look like a genius, but to tell the truth.
Amy Alkon at July 23, 2010 8:19 AM
Quoting MarkD: "Skepticism is underrated."
Highly. It's one of my best tools.
Amy Alkon at July 23, 2010 8:32 AM
Journalism and teaching have the same problem. They require two sets of knowledge to be done well. First, you need to know how to report or teach. Second, you need to know something about what you are reporting or teaching.
You get your degree in journalism or teaching and the coursework teaches you how to do the job, but doesn't give you any knowledge of the related subject matter.
A reporter learns how to report, but requires no more than the standard undergraduate-level instruction in history, economics, politics, law, science, mathematics, etc. in order to graduate.
A teacher learns to teach, but is taught very little about the subject area in which this teaching will done. My biology teacher in high school was a 300-lb physical education major. Her lectures consisted of reading the textbook aloud.
Personally, I think journalism and education should be minors and graduates in those fields should be required to major in something more substantive.
Conan the Grammarian at July 23, 2010 8:55 AM
On Twitter, via @ee4ee
Amy Alkon at July 23, 2010 9:07 AM
"A teacher learns to teach, but is taught very little about the subject area in which this teaching will done. My biology teacher in high school was a 300-lb physical education major. Her lectures consisted of reading the textbook aloud."
Conan, I think you give both professions too much credit. I have taken the education curriculum at a major university, and it consists of platitudes and current educational theories, none of which stand the test of time. J majors today are being similarly shortchanged, and I suspect that most of their education, like teachers', consists of political indoctrination and self-esteem exercises designed to convince them that they are the best and the brightest, regardless of their total lack of any substantial education and their generally low SAT scores.
Isabel1130 at July 23, 2010 10:30 AM
The value of a scientific result is that it predicts the future. You must be able to collect data, apply the theory, and predict what will happen to some degree of certainty.
Everything else is only association. B follows A, or B is associated with A in the past. So what? This is often useful to identify areas for more research, but says nothing in itself. You need at least an observed, repeatable, predictable relationship and a plausible mechanism.
Retrospective studies have a subtle but disastrous flaw. Consider this example.
Divide an area on the floor into a square grid of 100 squares each 2" on a side. Take 200 dried beans and drop them onto the grid to scatter nicely. On average, each square will contain two beans. But, almost always, you will find a square that contains (say) six beans. The chance of that is less than 1%. You can look at that grid and say "There is less than a 1% chance that this square would contain 6 beans. This relationship cannot be ignored, and we recommend that all people either seek out or avoid that square".
The chance that a particular square will contain 6 beans is less than 1%, and you will be quickly bored by attempting to predict which one, like playing the gambling game Roulette. But, you fool yourself by looking for that square after the fact, then assuming it must be significant because its occurrence has a low probability.
It doesn't matter if some event has a low probability (say 1%) if you screen 100 cases to find it, and cannot predict what causes it.
Unfortunately, the people who do these studies are often ignorant of these facts. They take data sets collected for various other purposes. Much of the data was not carefully collected under controlled conditions. They feed this data into standardized statistical computer programs, look for associations, and publish the results.
An association says nothing without repeatable, predictable causation.
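The bean-grid example above is easy to check with a quick simulation (a sketch; the 200-beans-on-100-squares numbers are from the comment): a square chosen in advance rarely collects six or more beans, but after the fact you can almost always find some square that did.

```python
import random

def drop_beans(n_beans=200, n_squares=100):
    """Scatter beans uniformly over the grid; return the count per square."""
    counts = [0] * n_squares
    for _ in range(n_beans):
        counts[random.randrange(n_squares)] += 1
    return counts

def simulate(trials=10_000, threshold=6, seed=42):
    random.seed(seed)
    fixed_hits = 0  # a square picked BEFORE the drop reaches the threshold
    any_hits = 0    # SOME square, found after the fact, reaches it
    for _ in range(trials):
        counts = drop_beans()
        if counts[0] >= threshold:
            fixed_hits += 1
        if max(counts) >= threshold:
            any_hits += 1
    return fixed_hits / trials, any_hits / trials

p_fixed, p_any = simulate()
print(f"P(pre-chosen square gets >= 6 beans): {p_fixed:.3f}")  # rare
print(f"P(some square gets >= 6 beans):       {p_any:.3f}")    # routine
```

This is the multiple-comparisons trap in miniature: screen enough squares (or enough variables in a data set) for a "rare" pattern and you are nearly guaranteed to find one.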
Andrew_M_Garland at July 23, 2010 10:47 AM
Amy, please.
You may NOT criticize the selection, nor question the qualifications, of the new science editor because ... she is a woman -- a woman gamely struggling to make her way in a male-dominated arena! Indeed, she is heroically battling the rigid, structured, and thus inherently misogynistic nature of science itself!! With her woman's "different way of knowing", however, she will overcome any lack of what you (and The Dreaded Patriarchy) call "experience" and "expertise".
(turns aside to gag and choke back stomach contents)
This descent into banality and mediocrity is to be expected from a MSM institution permeated by feminism, which itself depends on relentless and unquestioned reporting of the existence of a gender coin with only one side. My regard for the integrity of MSM news organizations as unbiased reporters of fact disappeared long ago as the feminists' "Lace Curtain" was being imposed by "progressive" (i.e., smarter than us regular folks) editors and publishers.
Amy, as I recall, you can comment from personal experience on the pernicious effects of this pervasive MSM mind-set. You should know better by now! ;)
Jay R at July 23, 2010 11:13 AM
"rigid, structured, and thus inherently misogynistic nature of science itself"
Rigid? Structured? My God! Why didn't I see it before?! The feminists are right ... science IS like an oh-so-threatening phallus!
With the new editor, it appears the LA Times is helping to get at the "root" of the problem, eh?
Jay R at July 23, 2010 11:22 AM
When reading the news, keep in mind the Gell-Mann Amnesia Effect. From Michael Crichton:
"Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know."
lsomber at July 23, 2010 12:03 PM
More from Crichton:
"That is the Gell-Mann Amnesia effect. I’d point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court, there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all.
But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper. When, in fact, it almost certainly isn’t. The only possible explanation for our behavior is amnesia."
lsomber at July 23, 2010 12:08 PM
In reply to lsomber:
Recently, I found all these articles quoting David Buss on the cougars/sexual desire study he did with his students. Only, none of these people actually talked to him; they just quoted from the results of the study and attributed it to Buss. They saw that he was the famous dude on the study, and don't understand that the fact that he was listed fourth meant something. He didn't write the study -- Judy Easton and one of the other authors listed before Buss did most of the writing, and he did some. If you get any experience -- like the slightest experience -- you know that being "first author" on a study usually means something.
Amy Alkon at July 23, 2010 12:09 PM
"It's one of my best tools."
Which is why you should go to DragonCon and participate in the Skeptrak.
"After the highly successful 2009 year at Dragon*Con, we are proud to announce the 3rd year of "Skeptrack", providing a full complement of live entertainment and discussion panels for the legions of free thinkers, science enthusiasts and skeptics helping to put the Science in Science Fiction at Dragon*Con.
Our list of guests have included many of the world's most prominent skeptics from the world of television, podcast and print media: 'The Amaz!ng' James Randi, Skeptic Society Jr. Skeptic Editor Daniel Loxton, Benjamin Radford, Joe Nickell, Alison Smith, Dr. Phil Plait, and Margaret Downey, to name just a few.
To monitor updates as they unfold, to join in the discussion groups, and for more information about Skepticism, critical thinking and paranormal investigation visit Skeptrack on the web at Skeptrack. (www.skeptrack.org)"
Radwaste at July 23, 2010 1:02 PM
Conan and Isabel are absolutely right, but the point that is missing is: These journalists actually convince themselves that they are experts in whatever subject that they are covering! You can ask them, and they will tell you. You can then quiz them about their subject knowledge, and watch them fall all over themselves making up half-truths and excuses, but then you ask them the question again, and they will still insist that they are experts!
This is all the result of a nonsensical conceit that arose in the liberal arts back around 1970. It went like this: liberal arts knowledge is universal and applicable to all subjects, whereas profession-specific training is simply a matter of memorizing a few rules and terms. I had a (conceited) high school teacher explain it to me in these derogatory terms: "You're going to school for engineering. But I graduated with a degree in liberal arts, which means I can learn any subject. I can read a few books and become as good an engineer as you. But I can also read books and become a banker, a doctor, or a lawyer if I feel like it. Whereas, you will never be anything other than an engineer." The ultimate outcome of this was the plague of smiley-face droids that HR managers had to put up with in the '80s, who walked into job interviews and answered every question with, "I want a job where I work with people!"
Cousin Dave at July 23, 2010 1:26 PM
Quoting Cousin Dave: "These journalists actually convince themselves that they are experts in whatever subject that they are covering!"
This is where you get in trouble. I always worry that I haven't investigated enough, don't know enough, am missing something, have fucked up. This leads me to do what I did on the cougars entry -- e-mail one of the authors to ask to have the flaws in my posting, if any, pointed out to me. I love criticism (from people actually qualified to give it) -- it makes me and my output better.
Amy Alkon at July 23, 2010 1:46 PM
Don't forget that the greatest weakness of that plague was that they were workaholics. And they left their last job because they wanted a "new challenge."
However, most of the HR managers with whom you seem to sympathize in that statement were actually part of that plague. Today's HR specialists insist they can gauge whether a candidate is qualified for a job based on a few criteria from the hiring manager.
Conan the Grammarian at July 23, 2010 1:50 PM
When a popular-media report cites a piece of academic research, people who are genuinely interested in the topic would often like to read the original paper itself. This is usually impossible unless you are a subscriber to the publication in which it appeared (unlikely and *very* expensive) or pay something like $30 for a one-time access fee.
IMSHO, this should not be allowed for publicly-funded research, the results of which should be required to be posted on the web for access at no charge. If the counterargument is that this will destroy the scientific journals, so be it: we are past the paper-only age, and whatever reviewing/refereeing/editing needs to be done can be done by scholars themselves, without the need to pay $$$ to a publishing empire.
david foster at July 23, 2010 3:30 PM
Conan, regarding the current crop of HR managers, you're absolutely right. I'm talking about an old school of HR managers (back when their department was called "Personnel"), of whom there were still a few left when I got my first big-company job in 1983.
Cousin Dave at July 23, 2010 3:33 PM
CalTech, JPL--don't these places deal in science? Can't the LAT find someone who might be able to write, report and understand all that tech talk? Or is Erin some editor's friend?
KateC at July 23, 2010 4:08 PM
All good points. Plus: your typical liberal arts major has actively avoided any math, statistics or science. The courses they may be required to take in these areas are always specialty courses, watered down so that they have a chance of passing them.
For some odd reason I, as an engineer, wound up required to take a business math course. The liberal arts students were struggling. After the course was over, they thought that they had passed a real math course. They had no clue that it was a watered-down joke that repeated material from junior high school.
In other words, not only do they not understand math and science - they don't even know enough to understand that they don't understand.
bradley13 at July 23, 2010 4:15 PM
This is why so much junk science gets printed and left unquestioned. I once wrote back questioning the premise of a report on nutrition, and the LA Times writer's response made it clear that he didn't know what he was talking about.
Tony at July 23, 2010 7:51 PM
> Or is Erin some editor's friend?
Are you saying Ehrenreich has another daughter?
Crid [CridComment at gmail] at July 23, 2010 9:19 PM
Quoting Crid:
> Or is Erin some editor's friend?
Are you saying Ehrenreich has another daughter?
Hah - great.
Amy Alkon at July 23, 2010 11:25 PM