I’ve repeatedly heard that every academic, at some stage in their
career, has that one project that just won’t behave itself and become a nice
little publishable package. Experimental results lead to more questions than
answers. Reviewers say the ideas are interesting but are unconvinced by the
conclusions, or have issues with the reliability of the method. You're rejected but
encouraged to re-submit. The whole thing drags on for years, quietly ticking
away in the background, while research avenues with more promise for short-term
gain are chased instead. But the nemesis project never dies; it stays there at
the back of your mind. Too much time and effort has already been invested; you’re
in too deep to give up on it now.
My own mini-version of this has just ended. My nemesis project
has just been published in Astrobiology (link, and link to non-paywalled
pre-print), 3 years after my supervisor scribbled down a ‘cool idea which won’t
take much time to test’ (I may be paraphrasing there, it was a long time ago).
This is not my usual post-publication summary of my work (I will hopefully
write that soon); instead, this is a story of how much behind-the-scenes failure
can go into one small, relatively insignificant, successfully published
paper.
The idea for the project came out of the Planetary Protection of the Outer Solar System project, which I have written about before
(link). In one of the early meetings back in 2016 my supervisor, clearly paying
full attention to whatever was being discussed at the time, scribbled a vague
experimental idea onto a scrap of paper. The idea was to see if we could use a
well-established environmental sampling technique (solid phase micro extraction,
SPME) to test spacecraft hardware surfaces for organic contamination. This was interesting because organic contamination
is a big issue in planetary protection: we don’t want to send dirty spacecraft
with highly sensitive instruments to the (currently) pristine icy moons of the
outer solar system. We’d end up either detecting muck from Earth and getting
all excited over nothing, misinterpreting it as evidence of alien life, or
missing a genuinely interesting extra-terrestrial signal, lost in the background
noise from the contaminants. Current methods for detecting contaminants
on surfaces tend to be time-consuming and complicated, and may involve multiple
solvents in the process – themselves potential contaminants.
The premise was therefore simple: get some stainless steel
to use as a budget stand-in for a spacecraft surface, contaminate it, and see if
SPME (coupled with GC-MS) is sensitive enough to detect contaminants at the levels
of cleanliness required for life-detection missions.
The first version of this study just involved leaving some
stainless steel L-shaped brackets (bought from a hardware store) out in the lab
to collect fallout contamination from the air, and handling them with and
without gloves to see if they picked up anything detectable from hand transfer.
To be scientifically valid, a study like this must be reproducible,
so many repetitions of everything being tested are needed: simple but
monotonous, time-consuming work – perfect for an undergraduate summer
internship! Georgios, a 2nd-year undergraduate and now co-author on
the final paper, gave up 6 weeks of his summer in 2017 to do this, creating
loads of data for me to work up afterwards. Initially we thought this first
version of the study was pretty good; the reviewers, however, had other thoughts.
Reject, but encourage re-submission.
The issues basically boiled down to our method being a bit
woolly and bullsh#t (again, paraphrasing). How could we know what was on the
surface to detect, and therefore how sensitive the method was, if we hadn’t
specifically contaminated it ourselves at a known concentration? We’d basically
skipped the proof-of-concept stage and gone straight to real-world
testing (well, as real as you can get without an actual spacecraft). So yeah, fair enough.
Not having scope to dedicate 6 weeks of lab time to
completely redoing the experimental side of this study myself, I had to shelve
the project until the following summer (2018), when I could get a second student,
Yuting, who was keen to get some experience of the mind-numbing, soul-destroying
boredom of repetitive lab work.
In the meantime, I took the reviewer comments, which despite
being rather critical were all very valid and helpful, and developed a whole
new method for testing the sensitivity of the technique. This was to be much
more scientific: creating a whole range of solutions of astrobiologically relevant
contaminants to apply to a surface with much better-defined properties (although
it was still basically just a steel nut).
Once again, the student project seemed a success. Yuting
produced a shed-load of nice replicate data over the summer, which I turned
into a completely new manuscript. None of the data from Georgios’s original
experimental run even made it into the new work, and after a few weeks of
tidying up and writing we re-submitted.
Again, however, the reviewers didn’t quite agree. While they
did think the method was now (mostly) sound, they didn’t find the results
as promising as we did, and wanted more and better data. Almost annoyingly, this
wasn’t a rejection this time; there was now time pressure involved, with a
re-submission deadline. I could have ignored it and waited another year, but
there was an end in sight, a way to kill this thing. Jon just needed to work
some magic to tweak the mass spec settings to decrease the noise and make the
data more convincing. Unfortunately, this meant I now had to repeat all the experimental
work with the new settings myself, replicating a whole summer student project
in about 3 weeks.
This was not fun, but it worked.
Now, at the end of it all, it is clear that without the
multiple knock-backs and the intervening periods to just think about how
to improve the methodology, this study would’ve been pretty rubbish. This is definitely
a case where the review process has greatly improved an original idea, and it has shown me that rejections don’t always have to be a bad thing; they can be an opportunity.
However, this one only took 3 years to take down; I’m not sure how I’d feel if it
had grown into some 5- or 10-year, or even career-spanning, monster.
Maybe that’ll be the next quick project…