My father has occasionally told this story:
On July 20, 1969, he and my mother were watching the Moon landing. As the first transmissions from another world came back to Earth, my mother reportedly remarked, “The picture is terrible.” My father looked at her and said, “Yes, but it’s coming from the Moon.” (Yes, I was raised by Burns and Allen.) I was apparently also watching but, being just over a month shy of my second birthday, I can’t say I have any recollection of it, although I certainly wish I did.
Look what we can do.
While I may be out of the will for sharing that story, I found myself in a similar circumstance just last week.
At the Science Writing Workshop at Ghost Ranch in Santa Fe, one of the educational sessions was co-presented by a writer who was physically present and another who was videoconferenced in via Skype, visible to all of us on the main screen at the front of the room. The co-presentation worked pretty well, except that every 8–10 minutes or so the Skype connection would be lost (gee, that never happens) and we would have to interrupt the presentation to call her back. It was a tad frustrating, but, as New York Times science writer George Johnson, who was sitting behind me, remarked, “That this works at all is pretty amazing.” And he was right; we often get so wrapped up in what are essentially new technologies that we expect them to work flawlessly, when only a few years earlier they scarcely worked at all. Sure, videoconferencing is nothing really new, but here it was taking place on an everyday computer, with free software, physically connected to nothing except a projector (and wouldn’t it be great if those could be wireless?).
Skype kept getting disconnected because the Ghost Ranch’s WiFi was not the most robust in the world; I recall getting frustrated with its slowness myself, forgetting that as recently as five years ago I was on the road struggling with dial-up over hotel phone lines and the 56K modem in my Apple iBook. I cursed it at the time, also forgetting that 15 years before that, if I had suggested accessing the Internet from a hotel room, it would have been as if I had suggested beaming up to the Starship Enterprise.
The evolution of, and our reliance on, technology was just one of many recurring mealtime topics over the course of the week.
The Science Writing Workshop was attended by about 40 people, predominantly professional scientists (neuroscience and evolutionary biology were heavily represented), with a few professional writers as well. (By my estimate, it was about 75% scientists, 25% writers.) We either dined together in the Ghost Ranch’s dining room or assembled posses to go out and forage for food in downtown Santa Fe. (I personally will have a hard time looking at Southwestern cuisine for a while.) Mealtime conversation was intensely stimulating, and I learned a great deal; I was also able to hold my own much better than I thought I would.
Another recurring topic was the often contentious relationship between scientists and science journalists. While the two have a symbiotic relationship, each side has its own agenda, and those agendas often work at cross-purposes. Scientists have a responsibility to the facts, to the data they have collected, and to the results of their studies. Journalists have a responsibility to their editors, publishers, and readers, and so are often looking for “a good story” and gravitate toward the sensational, even when the results of a given study don’t lend themselves to the kind of extrapolation that makes for a compelling narrative.
This is not unique to science writing, of course.
Many of the scientists present admitted that they avoid talking to the press for fear of being misquoted or misrepresented. It’s a fair concern, but I don’t think avoidance is the best strategy. Perhaps sound beatings would be a better approach.
A good example of this inadvertently presented itself in the project we had to complete for the workshop: on Tuesday, we went up to the Santa Fe Institute (SFI), an interdisciplinary think tank in the hills above the city, listened to several scientific presentations, and then had to write and workshop a story about some aspect of one of them. (I’ll post my own as soon as I can find it... it’s on the other computer somewhere...)
The first presentation was by Caroline Buckee, a postdoctoral fellow at SFI researching malaria transmission, and she spoke at length about the parasite that causes the disease. One of the reasons malaria is such a problem, and has evolved resistance to the drugs used to treat it, is that the parasite recombines genetically “like crazy” (a technical term). (After workshopping about a half dozen stories about malaria, I think I know all this by heart!) At one point, Dr. Buckee mentioned in passing (and I want to stress, in passing) that because the genetic structure of malaria is a moving target, the likelihood that there will ever be a vaccine against the disease is very slim.
During the Q&A, one audience member seized on that one passing statement and tried to get her to commit one way or the other: will there ever be a vaccine for malaria? She was obviously not comfortable making a definitive statement either way; there is still so much that is unknown, and who knows what discoveries await us? But, not having much experience with the press (or even the wannabe press that we were), she was eventually cornered into saying that, no, there probably won’t ever be a malaria vaccine. And yet, if a story were written with that as its primary thrust (and several were), anyone who had been in the room would know that it did not fairly represent the talk.
And we wonder why the public gets frustrated when one study seems to contradict an earlier one.
Kariena Dill, an Albuquerque-based writer in my group, expanded on this theme with a more egregious example involving the sequencing of the woolly mammoth genome. A summary of the research appeared in the journal Nature, and it bears mentioning that, according to the paper:
A major reason for sequencing the woolly mammoth is to identify functionally important amino-acid differences between mammoth and elephant.
That’s probably not the sexiest application in the world. Wouldn’t something like Jurassic Park be far cooler? Sure enough, an editorial in the same issue of Nature (sorry, I don’t have a link) suggested that scientists were only a few million dollars away from actually cloning a woolly mammoth and bringing the species back from extinction. Nowhere in the original study was this even suggested, but guess which angle the rest of the science media picked up.
Nicholas Wade’s lede in the Science Times was:
Scientists are talking for the first time about the old idea of resurrecting extinct species as if this staple of science fiction is a realistic possibility, saying that a living mammoth could perhaps be regenerated for as little as $10 million.
No, the scientists were not talking about it at all. Science journalists were. Nowhere in the story does any scientist suggest that the completely sequenced mammoth genome would be used to clone an actual mammoth, though one does appear to have been coaxed into speculating about the idea.
It probably sold a few papers (I mean, come on, woolly mammoths capture the human imagination like nothing else), but is it an accurate representation of the work? And where would we put a herd of mammoths, anyway? Because one would get rather lonely, I should think. Could we effectively reconstruct their ecosystem, which, after all, may also be extinct? And, perhaps most importantly, what would mammoth meat taste like, and what wine would go best with it? (I am the sole proponent of the field of Gustatory Taxonomy, or classifying animals and plants according to how they taste.)
Joking aside, is there a solution to this contentious relationship between reporters and the reported-upon? Part of it could involve scientists getting better training in talking to the press: preparing a list of talking points and sticking to them religiously, the way politicians do. Another part would be a greater understanding of science on the part of the people who cover it, and a curbing of the urge to go straight for the lurid and sensational (good luck with that).
But then it also comes down to the public. Science writing, to my mind, should educate the public, not just about the details of any given topic, but about science in general. And, by the same token, the public should look to science writing to be educated about a topic, rather than entertained as if it were a Michael Crichton novel. (Well, okay, bad example...)
In much science writing, there is a tendency to, as one of my lunch tablemates put it, present science “as a product rather than a process.” Studies that seemingly contradict earlier ones are good fodder for stand-up comedians, sure, but the people who throw up their hands in frustration over this fail to recognize that science is a process: new data arise that shed new light on, clarify, or, yes, even contradict and disprove earlier findings.
Rather than see that as science’s failing, we should recognize it as science’s greatest strength. It’s what science is supposed to do. Science is about understanding nature and the universe through empirical observation and experiment; it’s not about handing down a written-in-stone law that explains everything, at least not right off the bat. (It’s nice when that can happen, though.) Even the well-established laws of physics, chemistry, and biology that everyone today accepts were arrived at only after many attempts, missteps, errors, and contradictions. Unfortunately, this “changing nature of reality” is one reason many people turn exclusively to religion; they like the idea of immutable laws that never change or get contradicted. (And for those who do contradict them, there is excommunication or execution.) Reality, however, doesn’t work that way.
I started reading George Johnson’s excellent book The Ten Most Beautiful Experiments on the flight home, and nowhere is this better illustrated: the fundamental scientific laws and principles we now take for granted were arrived at only after much thought, observation, and experiment. There were many missteps along the way, and while the famous names from the history of science (Newton, Galileo, Lavoisier, and so on) got all the credit, there were many who came before them.
In this day and age, it’s hard to convey to people why learning about science is important. I would make the following cases:
Understanding science is increasingly necessary to secure a decent job in a high-tech job market. Research and development of new technologies is how we achieve economic growth and how we get better jobs, and understanding science is a major part of that. Even if you are not in actual R&D, knowing basic science is invaluable. A friend of mine is a research scientist at a company that makes high-tech products, and very often sales and marketing ask the lab to develop products that would violate basic laws of physics.
Understanding science (and math) is also important for understanding many of the political, social, and medical issues facing the world today (and tomorrow). Things like climate change, stem-cell research, and many other “controversial” topics can only be effectively discussed and understood with a strong scientific underpinning.
But the most important reason I would give for learning about science is the thrill of discovery. When it comes right down to it, we are, as a species, explorers. We came out of the cave, looked up at the sky or down into the ocean, and started asking questions and seeking answers. We built better and better tools to get at those answers; we launched probes into space; we have robots wandering around on Mars and flying through the rings of Saturn. What enriches the human spirit more than the excitement of landing on another world, or exploring the bottom of the ocean, or... well, you name it? After all, there has to be more to life than just buying and selling stuff.
I would love to have experienced the thrill Americans felt on that July day nearly 40 years ago, lousy picture and all, when Neil Armstrong spoke of a “giant leap for mankind,” saying, essentially, “Look what we can do.”