Friday, August 8, 2008

The future of science, gradical change, and tools for the people

Maybe you've felt it - the buzz in a room, the tension in the air, the accelerating pace at which people are connecting and the realization that we're all in this together, even if we don't quite know what "this" is. At least in my small pocket of the world (wide web), something is brewing.

That something is The Future of Science. Michael Nielsen has written about this at length in preparation for his forthcoming book of the same name, with a lively discussion following in the comments. At BioBarCamp this past weekend (many thanks to John Cumbers and Attila Csordas for organizing!), the future of science became a recurring theme, with an impromptu discussion on open science the first day and spirited sessions on open science, web 2.0, the data commons, change in science, science "worship", and redefining "impact" and "failure" the second. Each of these topics could be its own blog series, and, in fact, many of them are. Even if people didn't always agree on the details, it was clear that everyone there (a biased group, inarguably) agreed that change is necessary, and inevitable. The question is, what will that change look like, and how will we get there?

The creators of Labmeeting.com put forth the following thesis:
Science relies on trust. Trust only remains intact when change occurs through consensus. Change through consensus is inherently gradual. (Therefore change in science must be gradual to succeed.)
Though you could agree or disagree with each statement, there are two things I'd like to discuss in particular. One is the issue of trust. Science relies on trust, right? I would say instead that science could be built on trust, if people weren't so worried about it! The most popular argument made against radical openness in science is based on the fear that other people will not act in good faith, i.e., if you make your lab notebook public, you could get scooped. And yet it is exactly this current climate of secrecy and cutthroat competition that encourages scooping and offers little recourse when it happens. If all research were open, digital, and timestamped, there would be an indisputable record of work and ideas that could be used to establish precedence.
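(As a concrete illustration of what "open, digital, and timestamped" could mean in practice - this is just a minimal sketch, not any existing platform's scheme, and the file and function names here are made up - you could hash each notebook entry and log the hash alongside a UTC timestamp; anyone holding the original file can later recompute the hash and confirm the entry existed, unchanged, at that time. In a real system the log would need to be countersigned by a trusted third party, such as a public repository, for the timestamp to be indisputable.)

```python
import hashlib
import json
from datetime import datetime, timezone

LOG_PATH = "precedence_log.jsonl"  # hypothetical append-only log file

def record_entry(notebook_path: str) -> dict:
    """Hash a notebook entry and append a timestamped record of it."""
    with open(notebook_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    record = {
        "file": notebook_path,
        "sha256": digest,
        "recorded_utc": datetime.now(timezone.utc).isoformat(),
    }
    with open(LOG_PATH, "a") as log:
        log.write(json.dumps(record) + "\n")
    return record

def verify_entry(notebook_path: str, record: dict) -> bool:
    """Recompute the hash to confirm the file matches the logged record."""
    with open(notebook_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == record["sha256"]
```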

Of course, this all starts to sound a little chicken and egg after a while. How do we assuage the fear of scooping enough for things to get sufficiently open so that scooping really isn't a problem? This brings us to the next point - that change must be gradual. Let me add the session leaders' conclusion to this: "the first step is to create incentives for scientists to voluntarily start doing the same everyday things on the same web platform." I think this is a valuable statement to keep in mind as more and more web 2.0 tools and platforms crop up - that in some sense, the best way to enact lasting change is to make it beneficial to individual researchers, and allow them to discover this on their own terms. Scientists are a skeptical lot by training; the fact that they are also generally time-strapped and resource-starved makes them, ironically, reluctant to experiment, at least with the way they do their work. They neither need, nor want, another social networking tool.

The key that some groups have discovered (Labmeeting, Epernicus, and OpenWetWare among them) is to figure out what people need, and then build something they will want. For Labmeeting, it is online paper management; for Epernicus, effective question answering and resource finding (no more wild goose chases looking for someone who can help you with a specific problem); and for OWW, tools for managing group websites and sharing protocols. Although Epernicus does rely on there being a social/professional network in place, the other two provide services that are useful even if you're the only one using them; the online community therefore can build itself without pressure. All three recognize that in order to be successful among scientists, you need to provide them with something useful. In other words, you need to make tools for the people, rather than tools that need people.

So what about change? How will it happen and when? Well, I'm hoping Michael's book will tell us. ;) But I have a feeling it will be "gradical" - gradual at first, and then...

12 comments:

Anonymous said...

I'm always confounded by the notion that someone "scooping" someone else's research involves being unethical. That's not the real danger; as you point out, if someone else claims your results, you can hopefully right that wrong.

The real problem is that exposing your half-baked data to the world opens you up to someone else realizing what it means before you do. The fear isn't someone stealing your results; the fear is someone seeing them and making the intellectual leap before you do. The classic example is Rosalind Franklin and Watson and Crick. They saw her data in its half-baked state, and realized the structure of the double helix. She had the data and eventually would have figured things out. But, because someone else saw it before that could happen, she isn't credited with that achievement. While many find the whole affair a bit sordid, it seems to me that "open science" would lead to a continuous stream of this. You'd look at your competitor's data and perhaps see something they hadn't noticed yet. You'd probably cite their work in your paper, but you, not the person who generated the actual data, would rightfully be given credit for the discovery.

I guess the positive side is that science would move forward faster. But a lot of people are going to end up unhappy and scooped. And no, timestamping won't help in this circumstance, unless the point you want to prove is that you did the legwork and someone else smarter than you figured out what it meant first.

Anonymous said...

David,

I take a very different view, partly because I subscribe to the maxim, "wherever you are, the smarter people are somewhere else" (via Bill Joy).

If you have data and someone can come up with an idea faster than you can, all power to them. There is no guarantee that you can come up with every idea from your own data.

What is important is trust and credit. I would argue that if you make your data open, then if someone uses it to realize an idea, they pretty much have to give you credit, since your name is associated with it. If we always assume we can do more with our data, we live in a world where science is less important than our own egos. Why are we so afraid of others being smarter than us?

Yes, there will be some scooping, but if you weigh the risks against the benefits, there is far more good in openness than harm from a few negative examples.

Anonymous said...

Deepak, I think that's a very noble attitude, but one I would have difficulty maintaining as I was telling my students, postdocs and technicians that they'd have to seek employment elsewhere, or telling my family we wouldn't be able to afford food or rent for a while because the NIH had given away the grant money in my area of research to someone else who made a breakthrough using my data.

Yes, you'd get some credit, but no, you wouldn't be recognized for the achievement. Science is about making the discovery, not collecting the data and failing to reach a conclusion. Franklin was cited by Watson and Crick, and that's all the credit she received. If offered the choice, I'd take the Nobel Prize and the prestige, wealth and fame given to Watson and Crick over the obscurity awarded to Franklin, who is remembered mainly as a cautionary example.

Open Science is, like many things, a wonderful idea in the abstract, but it gets bogged down in the real world. Human nature makes us competitive beasts. Some scientists would always be less open than others, hoping for an edge. More importantly, since funding, jobs and tenure are all competitive struggles, giving away your work before you've managed to get something out of it seems counterproductive.

Now, if you've got a system to remake the way we award funding and jobs, one which allows for Open Science, then I'd love to hear it. But calling for Open Science before that new system is in place is unlikely to gain much traction.

But back to my original point--for most scientists, the fear of being scooped isn't someone misrepresenting themselves as having done your experiments. It's someone having the "eureka moment" based on your data before you do. There's nothing unethical at all about that. Open Science requires you to be okay with this. And I think that's going to be a rare attitude, at least among ambitious scientists trying to make a career for themselves.

shwu said...

It would be wonderful if every time a Rosalind Franklin - Watson/Crick situation arose, Watson/Crick said, "hey, she's really on to something here, and I think we can help figure it out. Let's collaborate!" It doesn't happen as much as it should, but it does happen. I agree that the reward system needs to change before this becomes accepted, though - a system that recognizes all your myriad contributions rather than mostly single blockbuster publications. Like David said, calling for open science prematurely won't work, just as building tools that need people won't work unless the tool is extremely useful. And calling for a world of openness and collaboration won't work unless you actually reward openness and collaboration.

Deepak said...

David,

We differ in philosophy. I actually would love it if someone did something cool with data that I collected. All power to them.

Alternatively, we need to figure out how to apply copyleft licenses; but locking your data up, for your eyes only, is limiting yourself to the confines of a broken system.

Bill Hooker said...

If Watson/Crick hadn't acted badly ("sordid", indeed), that example would have ended very differently -- as per Shirley's comment about collaboration.

David, you seem to agree that the general idea is sound -- that collaboration is more efficient and productive than dog-eat-dog competition. Would you consider, then, opening up some aspect(s) of your research, while keeping closed enough of your ideas that your family would not starve? Perhaps if enough people dipped a cautious toe into Open waters, the kind of cultural change Shirley describes could begin to occur.

Bill Hooker said...

Something else that doesn't sit right with me:

timestamping won't help in this circumstance, unless the point you want to prove is that you did the legwork and someone else smarter than you figured out what it meant first.

Why is it *necessarily* someone smarter than me? In my experience of the reverse situation, when I realize what someone else's data mean before they do, it's simply because I happen to know something they don't -- usually an odd, unexpected connection with my own work. It's got nothing to do with being smarter -- just as it would be ridiculous to claim that Watson and Crick demonstrated by their actions that they were *smarter* than Franklin.

Confronted with someone else's data and the realization that you know something about it that they don't, you have a choice: you can share, or you can try to grab as much of the credit as possible. How are you harmed by sharing? If the data are the Big Deal, you ride the coat-tails; if your intellectual leap is the crucial factor, you get the spotlight.

To put it another way, since Watson/Crick/Franklin is "the classic example": should Rosalind Franklin not have been included as an author on that famous paper? And in what way would Watson and Crick have suffered if she had been?

Anonymous said...

shwu--
Well, as I recall, there was some collaboration between Watson, Crick and Maurice Wilkins (Franklin's boss)--Wilkins and Franklin chose to publish their paper separately from Watson & Crick's in the same issue of Nature (and W&C's paper cites Franklin's work, both published and unpublished).

Good points though on trying to introduce tools where there's no obvious need. There's a nice recent set of articles on this subject here, here, and here.

deepak--
Again, I'm not so sure how happy I would be if I were shuttering my own lab while someone else was using my data and winning funding and prizes. I just may not be as noble as you, but then again, I've got a family to feed.

Bill--
Don't get me wrong, I'm a big fan of collaboration--the majority of the papers I published in my non-illustrious career at the bench were collaborative efforts with other labs. But I do think there needs to be some balance. I think one needs to exploit one's own work first before turning it over to the rest of the world. I don't see anything wrong with collecting enough data to write a paper, then opening things up to everyone. Sure, it may take a little longer, but it keeps people employed and rewards them for their hard work. And one should collaborate, but there's a big difference between sharing data amongst collaborators and throwing the doors open to everyone before you've thoroughly analyzed your own results.

Perhaps "smarter" is a loaded word. Let's instead say "someone who had the insight that you missed/lacked at that particular moment." Again, Franklin was offered coauthorship as I recall, but chose instead to let credit fall to those who had the insight she lacked at the time. My opinion is that if she continued on, and Watson and Crick hadn't seen her data, perhaps she would have reached that insight herself, and been able to rightfully claim the credit for both. The world got the double helix a little faster, and she became a footnote, rather than getting the spotlight. Good for the world, good for Watson and Crick, not so good for Franklin.

And I don't think people are wrong in looking out for themselves, protecting their careers, their labs, their employees and students. Would you rather be a great scientist or a cog in the machine? Most people I know would choose the best chance at greatness.

Also, there are always going to be people who look to take advantage of the system, particularly an open one--that's human nature. Take a look at the gaming that's going on with arXiv. Not everyone is going to play fair, and Open Science gives an advantage to the unscrupulous, and to big labs that can afford the manpower to quickly replicate others' data. Timestamp or no, one could always claim never to have seen the grad student in question's open notebook (quite plausible, given the flood of information which would occur if everyone suddenly went open and online), and that it was just coincidental that they were doing the same experiments at the same time. This happens all the time in the journal biz--lab head A gets sent a paper by lab head B to peer review, and realizes they're about to get scooped and quickly puts out their own paper while delaying the review process.

You guys may just have more faith in human nature than I do. But every scientist I've mentioned the idea to scoffs at it. They want to keep their employees and students working, they want to keep the lights on and keep their families fed. This probably slows the course of science, but then again, it makes it a viable career.

Anonymous said...

Franklin was offered coauthorship as I recall, but chose instead to let credit fall to those who had the insight she lacked at the time

Never heard this, and it would change my view of the situation considerably; can you point me to a source? (Pre-emptive: Watson's book doesn't count!)


Would you rather be a great scientist or a cog in the machine?

False dichotomy. Like 99.9999999% of everybody, including scientists, I'm muddling around in the middle somewhere; choosing to be "great" doesn't enter into it. Delusions of intellectual grandeur are behind most of the type-A assholery I see in science.

When you talk about getting credit, it sounds like you're thinking about discovering the double helix, when the vast majority of advances are incremental. We're not talking about seeing Photo 51 and choosing to bogart the Nobel; we're talking about making some pedestrian addition to the knowledge base and choosing not to be an asshole about it. What I don't get is, what's the incentive NOT to share credit?

Anonymous said...

Is Wikipedia a believable source? I have my doubts, but here's their article on the subject:
"On the completion of their model, Francis Crick and James Watson had invited Maurice Wilkins to be a co-author of their paper describing the structure.[84][85] Wilkins turned down this offer, as he had taken no part in building the model.[86] Maurice Wilkins later expressed regret that greater discussion of co-authorship had not taken place as this might have helped to clarify the contribution the work at King's had made to the discovery.[87] There is no doubt that Franklin's experimental data were used by Crick and Watson to build their model of DNA in 1953 (see above). That she is not cited in their original paper outlining their model may be a question of circumstance, as it would have been very difficult to cite the unpublished work from the MRC report they had seen.[88] It should be noted that the X-ray diffraction work of both Wilkins and William Astbury are cited in the paper, and that the unpublished work of both Franklin and Wilkins are acknowledged in the paper.[1] Franklin and Raymond Gosling's own publication in the same issue of Nature was the first publication of this more clarified X-ray image of DNA."

Delusions of grandeur do indeed lead to assholery, but then again, most of the great scientists I've met (and I've worked directly with half a dozen or so Nobel Prize winners) have pretty big egos (and had them before they won the prize). Science doesn't select for friendliness or personality, it selects for achievement, and being a driven jerk is a selective advantage. It would be a lovely world if everyone made cupcakes for one another, but I don't see it happening any time soon.

What is the incentive not to share credit? Well, funding and jobs are limited commodities. The more credit one has for oneself, the more of those things one is likely to garner. There's also the matter of being right, being the actual one to make a discovery and getting the credit for that. In the case we're discussing, Franklin is rightly credited with collecting a key piece of data but not with discovering the double helix because she didn't discover it.

Unknown said...

"Science doesn't select for friendliness or personality, it selects for achievement, and being a driven jerk is a selective advantage."

But when discussing "the future of science", I think we should be allowed to assume that science could select for something else. In my view, what you describe is not really "science" selecting, but rather the old-world institutions around science. The "Nature paper" credit of today is a relic of when dead-tree publishing was a physical necessity, and hiring based on Nature-paper authorship is a relic of the limited information available in a dead-tree-limited world.

In my view this is why the brainstorming around openness, credit, timestamping, incremental contribution etc. is important. The goal is not to make open science work inside a world limited by dead-tree distribution of information. The goal is to think up a system that takes advantage of digital information to improve how science is done. Right?

Bill Hooker said...

@Anders: exactly so.