Subzero sequencing


Tl;dr: The Russell Glacier outlet of the Greenland Ice Sheet. Some 200,000 square kilometres of ice within the Russell catchment drain to here. The Ice Sheet is vast, and the most important contributor to sea level rise. Working 30 km inland of this point in winter means sequencing DNA in an environment that teeters daily between sublime and violent.

As I write this I am thawing out in Copenhagen after participating in an expedition on the Greenland Ice Sheet. This is the fifth time I’ve worked on the Ice Sheet, but the trip had a major twist: we would be working outside of the melt season. This meant camping in temperatures of -10ºC to -30ºC, with the potential for hurricane-force winds coalescing to create significant windchill and risk of frostbite. Our task necessitated working in these conditions: one simply does not go exploring inside drainage shafts in ice sheets while they are actively carrying meltwater. It would make for a very short inquest.

The overall story, results and images from the ice camp are under embargo, so beautiful aurora and tales of folk crawling from tents trashed in hurricane-force storms will have to wait.

Here I’ll write about some of the technical considerations and challenges encountered.

Working with PhD students Melanie Hay and Aliyah Debbonaire and science leader Joseph Cook, my goal was to lead the characterization of microbial communities sampled on the trip, using nanopore DNA sequencing. Of course, we’ve done this kind of thing before, including in field camps on Greenland as part of the Ice Alive filming. Nevertheless, transferring our experience to sub-zero conditions in a very remote, on-ice camp would present some challenges. Ewan Birney recently wrote that analysing genomes is routine in the way that the US Navy routinely lands planes on aircraft carriers: highly trained people with the right kit make it achievable, but it is not easy.

Now try landing the plane in a whiteout, without reliable power, in temperatures as cold as your freezer at home.

So, in preparing for this project I identified three main obstacles:

1. The cold.

I had to anticipate temperatures as extreme as -30°C.  This poses two specific problems.

Regular laptops do not like such temperatures, and nanopore-specification computers are not cheap enough to be considered disposable. While there are ruggedized laptops, the high spec required to run a MinION (1 TB SSD, i7, >16 GB RAM etc.) lies in what I thought was a non-overlapping part of the Venn diagram. Fortunately, with the help of Gwen and Simon from Aberystwyth University’s Information Services department, we found and procured a mil-spec, ONT-spec laptop. Capable of operating at -29°C and storage at -50°C, and running Windows 10 with its Linux subsystem, this laptop would drive our MinION and be our on-ice bioinformatics platform (running albacore, canu, porechop, sintax, kaiju, centrifuge, prokka and others). I’m typing this blog post on it now, and it did its day job really well.

Critically, MinION flow cells cannot be allowed to freeze: doing so lyses the membranes holding the nanopores, which is why they must be stored at +2-8°C. From experience with ambient-shipped flow cells I knew there’s some tolerance at the upper end, but it would be game over should the flow cells drop just a few degrees beneath their storage envelope. After weeks of trials in our -25ºC cold lab at Aberystwyth with ever-thicker styrofoam boxes and the chemical warmers used for shipping tropical fish, I remained dissatisfied. The exothermic iron-oxidation reaction in the warmers provides steady ~38ºC warmth for 72 hours in normal conditions. However, as nearby temperatures reach freezing, the chemical reaction fizzles out. This gives the system a cascading failure mode which can only be overcome by piling in too many heat packs, which would cook the flow cells. So, taking inspiration from the Sourdough prospectors, we went low-tech.

The only sustainable source of uninterrupted warmth for storing these flow cells on a subzero expeditionary project turns out to be humans. When wearing down insulation gear, there is a gradient from skin temperature to freezing or below. So, in the chest pocket of a Buffalo smock I carried a mini Peli case with five nanopore flow cells, monitored by a temperature logger run via Bluetooth from a smartphone. This typically afforded temperatures of +8-16ºC in the daytime when covered by a down jacket. For comfort at night I would store the box loose inside an expedition sleeping bag. After half-hourly checks of temperature the first night in camp, I found I could store the flow cells confidently at +16-22ºC, warm enough to survive the occasional filling of the pee bottle in night-time temperatures of -15ºC and likely colder. Although uncomfortable and unseemly (not ideal for a photoshoot for a major fashion company…), this system worked well, as I had a vested interest in staying warm and fed. The only time low-temperature alarms were triggered was on the final morning, while digging the camp out in severe windchill. By then it was time to get on the helicopter and much of the project was in the bag.
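In software terms, the logger’s job is a simple threshold watchdog. A minimal sketch, assuming a temperature reading handed in from the Bluetooth logger’s app (the +2-8°C envelope is the real spec; the tolerated warm band reflects our field experience, not a datasheet):

```python
# Threshold watchdog for flow-cell storage temperature.
# The input would come from the Bluetooth logger's app; here it is
# just a number passed in.

FLOW_CELL_MIN_C = 2.0   # below this, freezing (membrane lysis) is game over
FLOW_CELL_MAX_C = 8.0   # nominal upper storage bound

def check_flow_cells(temp_c):
    """Classify a logged temperature against the storage envelope."""
    if temp_c < FLOW_CELL_MIN_C:
        return "ALARM: too cold"
    if temp_c > FLOW_CELL_MAX_C:
        # chest-pocket storage ran +8-22 C: above spec, but tolerated
        return "warm: above spec, tolerated in practice"
    return "OK"
```

The asymmetry is the point: the warm side degrades gracefully, the cold side fails terminally, so only the low threshold gets an alarm.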

MinION runs themselves did not suffer unduly from the cold. Placing the MinION on a thawed cold pack with one heat pad inside a small NEB styrofoam box maintained 20°C nicely as an ambient temperature, while the ASIC had no trouble holding a steady 34.0°C.

Furthermore, freezing temperatures did not affect any of the lab procedures. We used PowerSoil kits as described in our preprint, thawing reagents with warm water. We also used the freeze-dried field sequencing kit gifted to us by Oxford Nanopore, but I’m confident that ligation sequencing kits would have worked well too. After all, we store all our kits and enzymes at -20ºC normally – gentle thawing with warm water or hands would suffice.

Sadly, the one bit of kit which did not work was our miniPCR cycler. It detected that the ambient temperature was too low and decided not to play, which is a shame as it’s an otherwise useful bit of kit. So, we resorted to low tech again: holding the transposase reaction by hand to bring it to room temperature, then killing it with hot water in an insulated mug, made for good libraries.

Metagenomic library prep using freeze-dried consumables.


2. Mains power

Power in camp would be provided by two 1 kW generators. The supply of fuel, and as it turned out their reliability, meant we could not operate on mains power throughout. For DNA extraction this meant carrying a TerraLyzer and a Dremelfuge as contingencies should the generators fail.

While our laptop was good for 6-8h battery time running a MinION in -25ºC conditions, we sought a contingency which could permit sequencing with real-time basecalling from battery power. We’re very grateful to the team at Oxford Nanopore Technologies for providing access to the MinIT, which uses a GPU to support fast basecalling. Our MinIT arrived at the last minute before departure but performed really well.


Kangerlussuaq: Even a former Cold War bomber base can look nice in the evening sunlight.

Initiating Rolex Laureate Joseph Cook into the nanoporati back in the “Kanger Centre” (a kitchen table in Kangerlussuaq) after our return, using both the MinIT and freeze-dried library preps, was fun. Joe has just been listed on San Miguel’s Rich List of people rich in life experiences – surely metagenomics in a hostel must be up there…


3. Logistical fail

At the heart of it, using a MinIT, MinION, iPhone, Dremelfuge, TerraLyzer, PowerAdd battery and lyophilized library prep kits, it’s possible to do DNA extraction and sequencing with battery-powered kit that fits in a handbag (a MinBAG? Sorry – it’s been a long week…). Counting just the gear needed for sequencing, this weighs 2.2 kg.


Is a MinBAG a thing?

But possible does not mean sensible or sustainable. To achieve this project, Melanie and I travelled with >100 kg of gear on a variety of trains, planes and helicopters. This joined well over 700 kg of gear required to sustain eleven people on the Greenland Ice Sheet for two weeks.

Of course our haul includes things like ice corers and sleeping bags and wetwipes, but packing up the equipment, consumables, computers and camping gear chosen for the job in the first week of the academic year is not to be undertaken lightly. This became apparent in two ways.

Rock > Paper > Scissors > Cello > Ice corer. Firstly, no sooner were we at the departure gate for our flight from Birmingham than it transpired that a good few of the other seats on our small plane were occupied by an orchestra. We could see a tuba, cello and xylophone being loaded, but the bags of all passengers without a connecting flight were dumped on the tarmac. We left for Greenland with only hand luggage and dented faith in the airline. Fortunately, our lab gear and most of our field clothing arrived in Greenland in the gap day scheduled between arrival and insertion. However, my cold weather boots did not: they were scheduled to arrive at 1005 local, while we were due to be inserted by helicopter from 0940 local.

With only a pair of gaitered summer mountaineering boots to wear and one pair of hiking socks, I stood to lose a toe or two. Personally, that is a price I might be prepared to pay in the line of duty, but prior occurrence of frostbite counts very badly in polar medical clearance exams. To deploy could be career ending.

Air Greenland worked a quiet miracle to hold the helicopter back, getting the bag containing these boots to the helicopter flightline within fourteen minutes of the Airbus reaching the terminal. My toes and I are grateful.

This served as a good reminder that no matter how carefully planned a trip is, with overnight layovers between flights and contingency gap days, factors beyond your control can still mess up the logistics. A field project can be doomed to fail before it’s even left.

The other logistical slip up was well within my responsibility. Before departure, I had ordered a fresh Qubit DNA quantification kit. Fluorometric quantification using Qubit is the easiest way of verifying DNA extraction, and of confirming acceptable yields in ligation sequencing prep steps after bead cleanups. On my checklist of 164 items to be packed and transported for this project, it was listed as “Qubit hsDNA kit”. Unfortunately, different storage conditions and the inclusion in the kit list of SYBRSafe (a similar orange, DMSO-dissolved stain in an identical tube) meant I did not spot that the Quant-iT reagent was missing. This became apparent at the end of the first batch of DNA extractions.

We worked the problem – substituting the Quant-iT with SYBRSafe, visualizing the assay cocktails on the BlueGel – to no avail. In the end I decided to sacrifice the flow cell with the lowest number of active pores, a five-month veteran with <600 active pores brought as pore fodder, to estimate the DNA concentration. From the low occupancy it was clear we could do with concentrating the DNA, so the following morning we did, with wonderful results. Attention to detail is essential, but with many competing cognitive demands before deploying, cross-checking inventories is vital. In future I should inventory at the component level for any item which comes from an opened packet.
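The component-level check is easy to express in code. A sketch using set differences, with illustrative kit contents rather than the real 164-item manifest:

```python
# Component-level inventory check: compare what a kit should contain
# against what was actually packed. Item names here are illustrative.

REQUIRED_COMPONENTS = {
    "Qubit hsDNA kit": {"Quant-iT reagent", "dilution buffer", "DNA standards"},
}

def missing_components(kit_name, packed_items):
    """Return the required components of a kit absent from the packed set."""
    return REQUIRED_COMPONENTS[kit_name] - set(packed_items)

# SYBRSafe looks the part (identical tube) but is not the assay reagent
packed = {"dilution buffer", "DNA standards", "SYBRSafe"}
print(missing_components("Qubit hsDNA kit", packed))
# {'Quant-iT reagent'}
```

An item-level checklist would pass this kit as present; only the component-level comparison flags the missing reagent before departure rather than in camp.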


So, in summary: While it is possible to fit everything you need for in-field DNA sequencing into a £7.99 insulated cool bag from Tesco with room to spare, doing so reliably in remote environments requires considerable preparation and contingency.


Acknowledgements: We are grateful to the support provided by Zoe, Rich and Rosemary at ONT, our primary sponsors Moncler, and Aberystwyth University’s Interdisciplinary Centre for Environmental Microbiology’s “MoulinEx” grant. I thank expedition leader Francesco Sauro and his Miles Beyond exploring team; Joseph Cook at the University of Sheffield; Melanie Hay, Aliyah Debbonaire, Alun Hubbard, Ian Keirle and Mike Rose at Aberystwyth University.

From 16S to the bigger picture

Content warning: This post contains strongly mixed feelings about 16S. Also a COI disclosure at the end. May contain traces of kitome.


Tl;dr: Our view of genomic diversity in a microbial community (left) subject to 16S analysis is limited to high resolution of a sub-region (centre) using standard workflows, but Nanopore sequencing of 16S (and indeed the whole rRNA operon) offers agile, error-prone long read data. Hopefully we can gain a (slightly) more transparent view of the bigger picture from using Nanopore to sequence 16S.

I’m responding to interest from Peter van Heusden and Sophie Nixon in using Nanopore sequencing for 16S analyses, in reply to our Radio 4 Nanopore exploits.

A few tweets will not cover this issue in sufficient detail, so here’s a rantpiece. Hopefully I will address the Betamax/VHS or Bluray/DVD issue, but there are bigger questions to address in this area.

(For avoidance of doubt, we didn’t perform 16S for the Today programme. We did a shotgun metagenomic profile. Of course, 16S is not metagenomics. This is consequential for what follows. I will also use “16S” as shorthand for “high throughput amplicon sequencing of a well-loved marker locus, for example the 16S rRNA gene or its product, 16S rRNA”.)

We need to talk about 16S. It’s an old friend we think we know intimately, but we make some bad assumptions about it, lie about it to ourselves, and trash it endlessly to our peers. The problem is ours, not 16S’s. I did ponder whether I should tag this post NSFW – Not Safe for Woese – but I figured that would be crossing from cheeky to rude. I should acknowledge that there is so much we have learned about Archaea and Bacteria – even the fundamental existence of the Archaea alone – from 16S analyses that there will always be value in looking at 16S.

In recent months, though, microbiome product vendors have offered us wildly varying opinions, ranging from the negging of 16S to its uncritical praise from “the leader in microbial genomics“. What’s a PI to think – let alone a member of the general public parting with their money?

Fortunately some in the academic community offer us a more nuanced interpretation. I have attended a brilliant (and funny) talk by Alan Walker on the myths and truths of microbiome analyses. Mick Watson’s team have published a series of robust recommendations on the technical choices to be made when confronted by the madness of the microbiome.

That many of their recommendations (e.g. “treat all samples alike”, “yeah, have controls”, “think about the power required for your experiment”, “be wary of interpreting across studies with different methods”, “work with the best quality samples you can”) are the bread and butter of training new scientists in any field at the undergraduate or high school level in How To Do A Science makes me wonder if the madness lies not in the microbiome, but in ourselves. I fully expect to be backed up by a study, somewhere, showing a correlation between Bradyrhizobium relative abundance in the brain microbiome of PIs and the tendency to skip fundamental aspects of experimental design.

So, on the basis that we should be doing those things anyway (or providing equally robust, well-supported arguments as to why they would be inappropriate in your specific experiment) I will instead focus on why we perform 16S analyses. I guess a catch-all is that we wish to obtain a picture of our desired microbial community. From that picture we may wish to infer its phylogeny and make some (educated?) guesses about what the community does.

In my limited time (2006-2018) looking at microbial communities we have considered the following assumptions appropriate for achieving this aim with 16S:

  • That resolving the denaturing points of fragments of 16S with different %GC content, thrashed through a urea-soaked sliver of delinquent jellyfish capable of resolving maybe the 30 most abundant denaturing points – then picked up from a dirty lab floor, jigsawed back together and scanned in – was OK.
  • That PCR amplifying 16S genes, cloning and sequencing a few dozen to a few thousand of them to represent the estimated ten-to-the-Brian-Cox number of genotypes in a sample was OK.
  • That PCR amplifying 16S genes, cutting them with an enzyme or three and then running them through a capillary to see the size of the first fragment in an amplicon was OK.
  • That PCR amplifying a little bit of 16S and “massively parallel sequencing” it a few thousand times per sample was OK.
  • That PCR amplifying a different bit of 16S and Illumina sequencing it a few more thousand times per sample was more OK.
  • That not providing experimental replicates for our samples because our budgets were smaller than our ambitions was OK.
  • That publishing papers describing microbiomes from low-biomass samples comprising taxa described in Table 1 of Salter et al (2014)  was OK.
  • That ever more intensive data processing to make the leap from correlations between 16S relative abundances to causation driven by genes in unsequenced regions of volatile genomes was OK.
  • That sequencing cDNA of 16S to provide a caveat-free view of an active community (whatever “active” meant) was OK.
  • That highly sophisticated data processing will provide high-resolution insights into not just the whole 16S gene but presumably the parent genome, from exactly determining the sequence of a few hundred bases, will be OK.

Over that decade, I expect each of these assumptions has jammed the desks of editors at any number of microbial science journals with papers based on it, followed by papers critiquing it, and then another raft of papers highlighting newer, better methods. This is what we think of as progress. Papers, and hence careers, are built on it.

The funny thing (to me) is that I have a sample set, obtained in 2006, which has provided congruent results on DGGE, T-RFLP, clone libraries, 454, and most recently Nanopore sequencing. The results are congruent with analyses of other samples from the same habitats done using Ion Torrent V1-V3 and Illumina V4. It even includes some of the Salter et al (2014) dirty-dozen taxa, for they like oligotrophic, UV-stressed cool waters irrespective of whether they lie in a bottle of EB or the Arctic. (Orthogonal analyses, e.g. FISH and culture, support that claim.)

This, to me, highlights some deeply shocking key truths:

  • Every method has its limitations.
  • Every method has its uses.
  • Our job is to apply these methods while recognizing their limitations.
  • In many cases, the limitations of 16S override the limitations of the method used to study it.

And in that final point, it is worth remembering that 16S as a locus is far from ideal. It’s slow evolving, changing slower than virtually any process we’ve linked changes in 16S to. In the time taken to achieve a 3% divergence used to call an OTU, I expect we have gone from arguing about how to use stone tools to how to sequence stools. The 16S gene is also sometimes incongruent with taxonomy. Universal primers…aren’t. And so on. But, thanks to the pioneering work of Woese and all those inspired by him, if we are to pick a single locus to sequence, in 16S we are afforded well established tools and databases to work with. So even as we make a transition towards routine genome-resolved metagenomics it has its place.

After all, until we obtain single-contig genomes from every taxon (individual?) across orders of magnitude changes in abundance typical in uneven, complex microbial communities – now there’s an obvious challenge for PromethION – we are in the business of making extrapolative statements from incomplete data. We need to handle that uncertainty confidently.

So, what about Nanopore 16S sequencing? (Finally! The point!)

Caveat: I have nothing in the peer reviewed space on this, as I’ve yet to resubmit a paper rejected on the grounds of insufficient novelty – the core body of work is summarized in this talk at the Nanopore Community Meeting in New York*.

Well, it’s just the latest in the string of methods which will be used and abused in the name of 16S sequencing. There are some game-changers though. This is not VHS/Betamax or Bluray/DVD. It will prove to be either VHS/DVD or Betamax/Bluray (you decide what that means).

Specifically, Nanopore 16S offers conspicuous advantages:

  • It is portable. Your 16S lab fits in your daysack.
  • It is fast. (Our first 16S run totally trashed live basecalling on our ONT-spec laptop. While a MinION can do 2.3 Mb long reads without breaking a sweat, sequencing short, clean 1.2 kbp amplicons at 450 bases per second is turbocharged.)
  • It can provide tens of thousands of useable full length 16S reads per barcode in a simple, kit-based multiplexed run within minutes/hours.
  • It does not require access to capital infrastructure or service providers.
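The speed claim above is simple arithmetic. A back-of-envelope sketch using the quoted 450 bases per second translocation speed and a ~1.2 kbp amplicon; the pore count is an assumption, and pore occupancy and turnaround time are ignored, so treat this as an upper bound:

```python
# Back-of-envelope throughput for full-length 16S amplicons on a MinION.
BASES_PER_SECOND = 450   # quoted translocation speed per pore
AMPLICON_BP = 1200       # ~full-length 16S amplicon
ACTIVE_PORES = 500       # assumed; real occupancy/turnaround reduce this

reads_per_pore_per_hour = BASES_PER_SECOND * 3600 / AMPLICON_BP
upper_bound_reads_per_hour = reads_per_pore_per_hour * ACTIVE_PORES

print(f"{reads_per_pore_per_hour:.0f} reads/pore/hour")           # 1350
print(f"~{upper_bound_reads_per_hour:,.0f} reads/hour upper bound")
```

Even discounting that upper bound heavily for occupancy, tens of thousands of full-length reads per barcode within the hour is entirely plausible.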

But what of its limitations? Insurmountable? Nope.

  • Error rate (will someone please think of the error rate! Somebody?!)
  • Established bioinformatic tools can’t handle ONT data. I have to treat every read as an OTU. (Here’s looking at the progressive, clever community of people that are developing e.g. QIIME, Mothur, vsearch).
  • The current 16S barcoded kit from ONT only has 12 barcodes. (There is capacity for more, and DIY barcodes work well in our experience.) For off-the-shelf users this limits the size of your study and, given the quick throughput, the cost-effectiveness of what could be a much longer flow cell life.
  • Reviewer 3 problems. See my comments above regarding the rate of change and opposition in the 16S analysis community.

These four challenges are, in my estimation, soluble: the phylogenetic signal obtained by sequencing full-length 16S helps amortize the impact of error rate, and hopefully there is ample motivation to provide bioinformatic solutions to the rest. If you have the dry skills to solve this and need wet data, please contact me if you’re interested in collaborating.

At the moment I’ve aggregated 16S data to class/family, but there is clear potential for genus/species-level resolution. However, is this problem unique to Nanopore? How many papers present the same five figures – map/study design, phylum/class stacked barchart, PCoA, CCA/RDA, network diagram – using Illumina data, focusing on such coarse taxonomic resolution?
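Rolling per-read assignments up to class or family is a one-liner once each read carries a lineage. A sketch with made-up lineages standing in for a classifier’s output:

```python
from collections import Counter

# Aggregate per-read taxonomic assignments to a chosen rank.
# The lineages below are illustrative stand-ins for classifier output.
reads = [
    ("read_1", {"phylum": "Proteobacteria", "class": "Alphaproteobacteria"}),
    ("read_2", {"phylum": "Acidobacteria",  "class": "Acidobacteriia"}),
    ("read_3", {"phylum": "Proteobacteria", "class": "Alphaproteobacteria"}),
]

def aggregate_to_rank(assignments, rank):
    """Count reads per taxon at the requested rank."""
    return Counter(lineage.get(rank, "unclassified")
                   for _, lineage in assignments)

print(aggregate_to_rank(reads, "class"))
```

Because each read is treated independently (every read its own OTU, as lamented above), the rank chosen for aggregation is purely a reporting decision, not a clustering one.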

Think outside the box: and here is the final, potential game changer. With Nanopore, you need not stick to 16S. The question is not “which region of 16S?” or even “can you do full-length 16S?” or even “why not do ITS too?” – it is “wouldn’t it be rude not to do 23S while we’re at it?” This paper by Lee Kerkhof’s team illustrates this, and the value of single-amplicon 16S-ITS data is highlighted in a real-world, Sanger-based study I coauthored.

So – 16S is dead, long live 16S?


*Conflict of Interest – disclosure: My registration and travel costs to present this work were covered by ONT, and ONT have kindly provided reagents for outreach work including 16S analyses.





Zits & buried bodies – Nanopore for Radio


A couple of weeks ago a colleague asked me if we could do something on Nanopore sequencing for BBC Radio 4’s Today programme as it was due to broadcast from Aberystwyth. Thanks to prior experience with sequencing for the media, I said yes on the basis I could pick what we should sequence. Here I’ll summarize something of the process, outcome and implication.

Our brief was to extract DNA, prepare libraries, sequence and analyse soil metagenomes from two soil samples. This aligned well with the programme’s themes: nestling among serious and timely issues of student mental health, University funding and Brexit lay discussion of the implications of Brexit for food security. Soil health is an important aspect of that discussion. My intention was to shine a light on the typically hidden microbial diversity of soil as a serious-yet-not-so-serious way of highlighting the potential of portable metagenomics. We were to compare garden soil samples provided by the host, Justin Webb, and by my Vice Chancellor.

What did we do?

In about 15 minutes of sequencing per sample, we generated 70-74,000 raw reads, which were basecalled and profiled using kaiju. Kaiju is an online and standalone taxonomic classifier that uses protein-level matches between all potential reading frames in your DNA read and the reference database. We’ve found it to be fairly accurate when used with Nanopore data, and both quick and reliable in returning results – important considerations for taking DNA sequencing to the flagship programme of a national radio station. In all, from sample to insight took ca. 2h35 from the programme’s start, including DNA extraction, library prep, sequencing and analysis.
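The downstream tallying is trivial once kaiju has run. A sketch assuming kaiju’s tab-separated per-read output, where the first column flags classified (C) or unclassified (U) reads and the third column holds the NCBI taxon ID:

```python
from collections import Counter

def profile_kaiju(lines):
    """Tally classified reads per taxon ID from kaiju per-read output."""
    taxa = Counter()
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0] == "C":   # keep classified reads only
            taxa[fields[2]] += 1
    return taxa

# Illustrative lines, not real run output
example = ["C\tread_0001\t1883\n", "U\tread_0002\t0\n", "C\tread_0003\t1883\n"]
print(profile_kaiju(example))
# Counter({'1883': 2})
```

From a Counter like this, the headline numbers below (taxa per sample, reads per taxon) fall straight out of `len(taxa)` and `taxa.most_common()`.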

You can hear me talk about this at 1h44 and 2h57 here.

What did we learn?

Some headlines:

In each sample, we could assign the reads to 6,369 taxa (Aberystwyth) and 6,461 (Justin Webb) from ca. 74,000 and 70,000 raw reads respectively. These range from species-level classifications (as defined by Kaiju) to reads left assigned at phylum level. This means Justin’s compost has marginally more detectable biodiversity than soil scant metres from a UNESCO Man and Biosphere Reserve. To my eye, this raises an important point about microbial biodiversity, its biogeography and its conservation: what we think we know from plants and animals doesn’t always fare well in translation.

Thanks to a 3 AM soil pH checkup last night, I knew we had soil samples with pH 4.62 (Aberystwyth) and pH 5.57 (Justin Webb). There’s a litany of papers describing soil pH as a potential key driver of soil bacterial diversity, through a range of direct and indirect mechanisms. Here’s a fantastic example from Rob Griffiths at the Centre for Ecology and Hydrology: a map of the UK like no other. It was no surprise that the Aberystwyth soil was about twice as rich in assignments to the Acidobacteria phylum as Justin’s compost. Notably, there were 60 discrete taxa within just one of the classes of Acidobacteria. There’s still a lot we don’t know about Acidobacteria. I think I would be oversimplifying if I just said they are good at tolerating acidic conditions; perhaps it is more to do with dealing well with other things that come in the wake of low pH.

On air I described Justin’s soil as being home to many “common or garden bacteria”. By this I mean there were many members of Alphaproteobacteria, specifically Bradyrhizobiales. In that regard his soil was home to some of the most prevalent bacteria in soils. These taxa are often the bane of low-biomass studies, but here we were dealing with >1.5 micrograms of soil DNA, so it seems legit.

Justin seemed strangely preoccupied with the idea I could tell where the bodies were hidden in his garden. There is some science to this idea. I think his secrets remain safe, but I could identify that almost 1% of the DNA sequences from his samples were assigned to Propionibacterium acnes. As a typical skin commensal of adolescents and adults this may well reflect the unorthodox sample container Justin’s compost arrived in…

One of the nice things about helping to shine a light on a poorly-lit area of microbiology is that you can hop between the inane – zits and such – and the serious. I could also flag the presence of Streptomycetes in both samples. Hardly unexpected but worth sharing the knowledge that these are powerful sources of many of our antibiotics. We desperately need more antibiotics, and it was great to wrap up by bringing the great plate count anomaly and the potential of metagenomics for antibiotic discovery to an audience of rush hour travellers across the UK.


This may well be the first time anyone’s tried to broadcast DNA sequencing live. It’s a niche pursuit, but one with strong outreach potential. If you’re thinking of trying this, or a broadcaster approaches you, here are some things to consider.

Ensure you are very familiar with the  technology and protocols you will use. Either stick to what you use in other contexts, or trial extensively. Don’t change too many variables in one go. With the recent update to the highly attractive MinKNOW2 we opted to roll back to MinKNOW 1 late last night to stay comfortable.

Don’t lose control of the overall science. Be explicit about what you can and can’t do for technical, ethical or taste reasons. You will need to identify a decent narrative, and that’s easier to do if what you did isn’t the distant cousin of a meaningful analysis.

Conversely, recognize that what you are about to do is science communication, not a research protocol. This may mean compromises, oversimplifications and behaviour generally not conducive to Reviewer 3. I once had great advice from colleague Professor Alun “Mr Frozen Planet” Hubbard in the context of Greenland filming: put aside any notion you are doing anything scientifically worthy when doing stuff with the media. Anything you do manage (e.g. in-camp metagenomics while filming ICE ALIVE) is a bonus.

Try to take with good grace any scientist who then thinks they can play Reviewer 3 in your timeline in response to a technical or scientific choice you’ve made. Enlightened colleagues know better than to do this.

Simple is good. David Eccles rightly points out that at the simplest, strings of ACGT are neat. As scientists we may obsess about N50s, this and that. Nobody else cares. Show us the story. The zits, not the fastqs.

Contingency is great.  We tried to have a contingency for all steps and two contingencies for critical steps.  I had two PhD students with prior experience of performance sequencing as helpers. This meant we could parallelize our task and provide mutual support. We did a complete run-through from sample to insight last night, in slow time. It meant a late night for Aliyah and Andre and an all nighter for me, but it gave us confidence we could deliver in real time.  We could also switch to Blue Peter mode if everything failed, thus achieving the primary goal of communicating some science about portable soil metagenomics at the expense of technical authenticity.

Doing a slow-time run through flagged an issue with RAD004 rapid library prep and our first sample, and so we kept with the one-pot ligation protocols with ridiculously short incubations today to maintain good strand/pore ratios.  On the day the only major changes were to the times of the interviews.

16th of May, afternoon
Get garden soil samples (collected from our VC and from Justin Webb)
Set up working area as an exhibition stall in the recording venue (our arts centre)
16th of May, evening into night – Arwyn’s lab
Extract metagenomic DNA from 6×0.25 grams of each soil sample, quantify and pool as needed. This should give us buckets of DNA to play with.
Prepare Josh-style one pot ligation libraries in duplicate for each sample (1h), run one per MinION. Stop at a minimum of 50,000 reads per sample.
Convert to fastq.gz and upload to generate metagenomic profile (estimated time: 30 minutes)
Prepare a briefing document for Justin on interesting species/stories

17th of May- BROADCAST DAY -Arrive at Arts Centre at 0500h for 0600h show start.
050X: Helpers (Aliyah Debbonaire, Andre Soares) to do rapid soil extraction on 1 sample each (ET: 35-40 minutes)
Note: BBC happy for ambient noise, but TerraLyzer is ridiculously loud. Do bead-beating outside.
Arwyn to prime flow cells, qubit, help etc.
Backup: if DNA extraction poor, use pre-extracted material.
0650: Arwyn on air with Justin. Explain about microbiomes and portable sequencing.
0700: RAD004 library prep and load flow cells.
0715-0800: Sequence!
Backup: use aliquot of LSK library.
Backup2: we will have the lab processed data.
0800: stop, use live called fastq for kaiju (ET: 25-30 minutes).
Backup: Andre will have Kraken on site in event of network trouble.
Backup 2: We will have the lab processed data.
0840: stop, prepare for return to Justin.
0850: Reveal on air. Pick out a few species of interest.
0900: Show off air.
0901: Chill out and clear up.

Your schedule for achieving technical steps will not figure heavily in the priorities of a producer. Plan accordingly. Provide a contingency step-in for when someone has to be interviewed, for example. Recognize hurry-up-and-wait is inevitable.

Finally, enjoy it. This is not a life-or-death application of the technology, so relax.

Thanks to: Justin Webb and BBC Radio 4 for making this really easy for us, Oxford Nanopore Technologies for kindly providing the key consumables needed in very short order, Aberystwyth University’s comms team for NASA-like can-do and efficiency, and in particular Aliyah Debbonaire and Andre Soares for maintaining the best of humour while working very long hours. I’m a lucky PhD supervisor.

The gist of “From glaciers to genomes”


This week I’ve been at the Microbiology Society Annual Conference in Birmingham and had the opportunity to present in the Microbial Diversity & Interactions in the Environment session on Tuesday. My theme was From glaciers to genomes…and back again, built around exploits with the Oxford Nanopore MinION in the last year. Cedric Laczny asked me to provide a summary of what was said and done for non-attendees – so here it is.

So far, we’ve mainly used the MinION for in-field metagenomics, but I decided to mention this only as a scene-setter. Our larger problem is that Earth has 70% of its freshwater stashed in glacial ice occupying 11% of its surface area, and the genomic diversity of the microbial inhabitants of this (by volume) massive freshwater ecosystem is very poorly mapped. I believe we have fewer than 10 public bacterial genomes, fewer than five cyanobacterial genomes, no eukaryote or archaeal genomes and a slack handful of amplicon or shotgun datasets to cover ca. 198,000 glaciers and three ice sheets. It’s embarrassing to chat to folk in the medical microbiology community: last night I passed a poster reporting >10,000 Salmonella genomes.

As we are embarking on an unprecedented experiment in destroying glaciers, and because microbes are confounding factors in that experiment, it seems prudent to start discovering this genomic diversity. I have tried to engage the research community to gauge interest in a genuinely communal effort to sequence as many genomes as we can afford to, but the response has largely been muted or inclined to consolidate the project at one institution. If anyone who reads this is interested in a network-based collaboration – I am all ears.

But, when starting from a low point, even incremental advances can be transformative. So for now we’ll go with me and my MinION.

The scope of my talk covered the behaviour of just a few cyanobacteria associated with cryoconite formation. My plan to show the Chris Hadfield-approved zoom shot into a Greenland cryoconite hole was scotched by IT issues – the lack of a computer mouse!


Nevertheless, in ten minutes I needed to cover cyanobacterial sequence diversity in cryoconite on a timescale from ~12,500 years before present to Tuesday before last – literally. I need to write a separate update about the Walters Kundert “Bleakest Midwinter” project, but for the moment I’ll just say we hit our bag limit on Svalbard cryoconite samples in the “light winter” phase this March and preliminary MinION metagenomes and qPCR on our portable Mic cycler are prompting interesting hypotheses about who lives and who dies between dark and light winter. I’ll look forward to the final phase of sampling so I can then batch the samples for other analyses.

The core part of my methodology has been to resolve these genomes from shotgun metagenomes sequenced on the MinION. I’ve been multiplexing 3-6 samples per flow cell using Josh Quick’s one-pot barcoding ligation protocol. In contrast to the local norms of the parish I find myself in, I am not able to bask in the glory of closed, single-contig genomes formed from ultra-long whale reads – but then again I am working with degraded, old (ancient?) bead-beaten DNA, so my expectations were adjusted downwards from the start.

Nevertheless, I have been able to learn some interesting things. As this needs to be worked up into a couple of papers, I won’t delve too deeply before peer review.

In short, following error correction and assembly with Canu, and then a binning strategy based on a taxonomic classifier to select contigs with good protein-level matches to taxa of interest, I do have bins corresponding to discrete bacterial genomes. Quick annotation with Prokka throws up both interesting metabolic traits and the prospect of strain resolution. Where we have non-targeted metabolomics data from the same samples, the presence/absence of pathways matching the salient metabolite fluxes is quite gratifying. For one of my genome bins, which matches a non-cyanobacterial taxon where the evolution of a particular autotrophic pathway is unfinished business, I can tick off genes as present or absent in full concordance with the sequenced isolate, a bacterium with a large and complex genome which was reportedly very difficult to assemble from short reads only.
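The binning step reduces to grouping contigs by their best protein-level classification. A toy Python sketch of that logic (the hit structure, taxon names and score threshold here are invented for illustration, not the classifier's real output format):

```python
def bin_contigs(hits, taxa_of_interest, min_score=0.8):
    """Group contigs into per-taxon bins.

    hits: {contig_id: (taxon, score)}, the best protein-level match per
    contig from a taxonomic classifier. Contigs lacking a confident match
    to a taxon of interest are dropped rather than forced into a bin.
    """
    bins = {}
    for contig, (taxon, score) in hits.items():
        if taxon in taxa_of_interest and score >= min_score:
            bins.setdefault(taxon, []).append(contig)
    return bins
```

The real pipeline works from classifier report files rather than in-memory dicts, but the selection principle is the same: a contig only enters a bin on a confident, taxon-specific match.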

But it is with the rRNA operons that I’ve been having the most fun. Agreement between the binning and rRNA operon taxonomy is excellent. Last year I co-authored a study led by Takahiro Segawa which used a retro, Sanger-based long-read strategy to resolve contiguous 16S-ITS environmental sequences from cyanobacteria on a global range of glaciers. I think the ITS haplotype data in that study offers the highest resolution and spatial coverage of diversity across the terrestrial cryosphere, so I have simply been looking to see where my MinION metagenome-assembled genomes match the Segawa haplotypes. The good news is that the 16S genes match the expected 16S OTUs and the ITS haplotypes extracted from my genomes lie within the geographic clades for the population structures of those OTUs. So – seems legit.

Cyanobacterial haplotypes from pre-modern cryoconite cyanobacteria either published by Takahiro Segawa or in our possession also sit within the extant strains from those regions, hinting at the stable colonization of the cryosphere over extended timescales.

Personally I would hope this observation of congruence between pre-industrial and contemporary cryoconite ecosystem engineers helps make abundantly clear that the old chestnut “dark stuff on ice is simply pollutants” is utter dog toffee.

As always when presenting nanopore data, someone is duty-bound to ask “but isn’t the error rate terrible?” This is based on the observation that the accuracy per base of uncorrected, raw reads is in the range 85-92%. Fair one. But we are not playing with these reads; we are playing with error-corrected, assembled data.
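The distinction matters quantitatively: at modest coverage, even naive per-position voting lifts ~90% raw accuracy to a near-perfect consensus. A toy simulation makes the point (it assumes independent substitution errors only, which real nanopore reads certainly do not obey, so treat it as illustrative rather than a model of the error process):

```python
import random

random.seed(1)
BASES = "ACGT"

def mutate(seq, error_rate):
    """Copy seq, substituting each base with probability error_rate."""
    out = []
    for base in seq:
        if random.random() < error_rate:
            out.append(random.choice([b for b in BASES if b != base]))
        else:
            out.append(base)
    return "".join(out)

def majority_consensus(reads):
    """Per-position majority vote across equal-length reads."""
    return "".join(max(BASES, key=column.count) for column in zip(*reads))

def identity(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

truth = "".join(random.choice(BASES) for _ in range(2000))
reads = [mutate(truth, 0.10) for _ in range(30)]  # ~90% raw accuracy, 30x depth
raw_accuracy = sum(identity(r, truth) for r in reads) / len(reads)
consensus_accuracy = identity(majority_consensus(reads), truth)
```

Real correctors like Canu do vastly more work than a column-wise vote (they must align reads and handle indels first), but the statistical intuition is the same: independent errors cancel, systematic ones do not.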

While I can point to the efforts of folk who do multiple rounds of polishing with Racon or Nanopolish, and their dissatisfaction with 99.9x% accuracy, for this initial effort I have only used error correction with Canu. For one thing, I am wary of citing CheckM statistics, as remaining indels will likely undersell the estimation of completeness. For now, as the only hook I have to hang these genomes on is the ITS strain data, tracking the rRNA operons has been my goal. One of the next stages might be to look across the genome, so I will be polishing and/or going hybrid as funds permit.

So what’s the error rate like in this context? Here’s a quick test: Dr Nathan Chrismas has kindly provided his BC1401 isolate of Phormidesmis priestleyi, which represents the first cyanobacterial isolate genome from glacial ecosystems, sequenced with Illumina reads. Cyanobacterial genomes can usually be read as “metagenomes”, as a number of cohabiting bacteria are difficult to get rid of, even in a unialgal culture. In his paper, Nathan devised a clever bioinformatics strategy for doing so…rather than hammering the culture with, say, bleach (sorry Nathan). But, when resequencing the isolate from culture, the presence of contaminants obviously recurs. So, turning the negative to a positive, I can consider the ONT sequence data from BC1401 a reduced-complexity, P. priestleyi-dominated metagenome. This starts to sound like the real thing, covered by a similar sequencing effort, but for which there is a 213-contig genome sequenced with Illumina data to act as a benchmark. Long story short, across the 4.8 kbp rRNA operon I obtain 98.92% identity between the Canu-corrected, assembled ONT data and Nathan’s Illumina assembly. Not 99.9999%, but fairly respectable for a first pass.

In summary, given the potential strain-level resolution and functional insights from genomes recovered from metagenomic data generated on a USB-powered device, I get the feeling our MinION will be just as handy in our home lab as in our field lab.

My thanks to collaborators Dr Joseph Cook, Dr Sara Rassner, Professor Andy Hodson and Professor Alun Hubbard for their contributions to this work in the varied form of samples, fieldwork and scripts for pulling out interesting contigs, and to the session organizers for the opportunity to speak.

Ice Alive

On Friday night I had the pleasure of attending and presenting for the launch of Rolex Young Laureate Joseph Cook’s documentary Ice Alive at the Royal Geographical Society.

Narrated by The Life Scientific’s Professor Jim Al-Khalili and featuring former commander of the International Space Station Colonel Chris Hadfield, UK Polar Network President Archana Dayal, Dr Jenine McCutcheon and Dr Andrew Tedstone from NERC Black & Bloom, the film explores the connections between life and ice.

Set against a backdrop of stunning imagery from Greenland and Svalbard the documentary makes an arcane research field come alive to a much broader audience than I would have ever thought possible. In Hadfield’s words: in an increasingly complex world, knowledge may be your only means of survival.  Communicating our science clearly to the world in this way helps pack our survival kit for the coming decades.


The evening opened with an “audiovisual exploration” of the Greenland Ice Sheet, linking Joe’s UAV imagery with Hannah Peel’s original composition. It covers ground I know well, such as AWS S6 and the Russell Glacier outlet, and manages to conjure the sublime and convey both the power and vulnerability of the ice. Watching it full screen with your speakers at 11 is to be recommended.

On Svalbard in August I had the pleasure of meeting Leverhulme Trust Artist in Residence Naomi Hart while I was kicking off my own Leverhulme research fellowship. We formed an unlikely partnership – she needed someone with a gun to cover her while she explored the ice, I needed a field buddy while I did what she describes as “cutting edge science with teabags and Danish coins”.

Naomi on the ice

Naomi’s project is to document the links between ice, climate, life and coal on Svalbard, from the days of Rudmose Brown to Andy Hodson as heads of geography at Sheffield University. Our view of life on ice has changed from Rudmose Brown (“of course the ice is devoid of life”) to Andy (e.g. the seminal review: Hodson et al [2008] Glacial Ecosystems, Ecol Monographs). Naomi presented these links in a compelling talk and exhibition. From Naomi I’ve learned the ways of the artist and the scientist are often closer than we think: both build upon experimentation to find and communicate truths.

And of course, I couldn’t let the evening pass without sticking my oar in. As the current RGS Walters Kundert Arctic Fellow, and speaking at one of the great homes of exploration, my theme focused on the exploration of Earth’s microbial frontiers. My contention is that exploration is far from a done deal, and we face not the final frontier, but rather fractal frontiers. Here’s a snapshot.

I’ve likened this to the coastline paradox: the more we discover, the more there is to explore.  For my part, these frontiers are microbial, and our current cutting edge is to access the genomes of the microbes while on the frontier. I spoke about how the public health crisis represented by the Ebola outbreak in West Africa stimulated me to think about how we are tackling the environmental health crisis of Arctic warming, leading to in field DNA sequencing using Nanopore devices. I have little doubt that as our lenses sharpen yet again there will be more to discover.

My heartfelt thanks to Joseph Cook for the invitation to attend and present, to Proudfoot Films for an awesome documentary, and to the Rolex Awards for Enterprise for making the event possible.

No bucks, no Buck Rogers

A few years ago, a friend gave me a copy of The Right Stuff, following a conversation over a brew in Greenland about the philosophy of working on challenging frontiers, and the kind of people it attracts. Joseph Cook has also written about the parallels between astronauts and Arctic scientists so I’ll not explore that here. Moreover, I’m several inches too tall for a ride on Soyuz, and my personality has far too many rough edges to take this kind of thing on breakfast TV with such flawless grace. Let’s not even talk motion sickness. It’s fair to say I therefore have no aspirations to be an astronaut!

Instead, I’m going to write about something even harder than becoming an astronaut. Funding a career in science.

A more recent read in the genre of astrobiography recalled lines from two Mercury 7 astronauts in the film of The Right Stuff:

Gordon Cooper: You know what makes this bird go up? FUNDING makes this bird go up.

Gus Grissom: He’s right. No bucks, no Buck Rogers.

The relevance to Arctic science could not be clearer.  Any future in research has to be fundable. No bucks, no Buck Rogers.

In December I spent some time helping a talented younger scientist debug a fellowship application. I could see great potential in the science and the applicant which wasn’t coming through clearly on paper in the early drafts I looked at. You may have The Right Stuff, but no bucks…no Buck Rogers.

It occurred to me to check whether, in this era of enlightenment, such topics as the basics of grant-craft make it onto graduate school curricula. It seems not. Things may not have changed since my own graduate training.

I have a chequered history with grant capture. Sure, there are many projects where I have been proud to secure funding from challenging sources (e.g. NERC, Royal Society and Leverhulme) as a PI in my late twenties/early thirties, but this has often felt like more luck than judgement. My boss made clear that while there was room for academic freedom, there was no room for failure: I should expect to bring in grants and 4* REF papers, building a group from the bench and pipettors I used for my PhD. The sooner the better. A kind colleague gave me photocopies of the case for support for two highly rated NERC grants, describing them as following the formula for success. The rest is the product of tolerant coinvestigators (ca. 2010: “Arwyn, have you fecked [Full Economic Costed] this grant up yet?” “Mind your language!”), tenaciously supportive research administrators and (usually) constructive comments from peer reviewers. It still feels like I’m breaking in to academia, with the sirens wailing in the distance.

So today I was pleased to abuse my position as the Director of an Interdisciplinary Research Centre to convene a workshop titled Your Fundable Future with its affiliated graduate research students. I reached out to the students past the middle of their PhDs. Their futures may well depend on funding sooner, rather than later. Hopefully I could pass on just some of the stuff I wished I knew at their career stage but have learned from hard knocks since.

As we handled real grant proposals (which I’ve hitherto failed to gain funding for!) as part of some of the exercises, I won’t detail the process further, but here are some of the resources I’ve found helpful in trying to figure out the funding game.

It helps to start from a strong position with your science.

Mick Watson’s tips for early career researchers (less about grants, more about being able to submit them from a position of strength)

You will need to write clearly – your audience comprises clever people with no time or special interest in your pet project

Tim Clutton-Brock – Perspective on grant writing (some technicalities dated in detail, but principles highly relevant)

Grantsmanship – Mark Pallen (a good use of 1h30+)

Writing Science (Ronseal!)

You will need to learn the rules of the game.  Ensuring perfect adherence to eligibility is important (including every detail of submission format).

Ten simple rules

Humans decide who gets funded on the basis of subjective interpretations of seemingly objective reviews, panels, criteria.

Unconscious bias – your reviewers will likely have it

Athene Donald and the ABCs of Panels

There’s other advice too. For example, it’s important to sell your project to the reader within the first few lines of a proposal. If you’ve read this far, that matters less though.

No Dramas

My recent grump-piece about field safety has returned to the foreground of my thinking as National Geographic chose to further publicize an incident from 2014 in which a scientist self-rescued from a crevasse in a snow-covered glacier in the Himalaya. By the scientist’s own admission he was alone, therefore unroped and visibly without the typical personal protective equipment (e.g. helmet).

To his credit, the scientist was able to climb out of the crevasse. Others in his situation are not so lucky, and the consequences don’t end in the crevasse: criminal charges are being brought against the employer of an Antarctic helicopter pilot who died in a crevasse fall. Some may have the savvy to extricate themselves from a crevasse, but all is for nothing if you’re jammed in hard, too deep, or too severely injured.

I may be criticizing from an armchair, but I am not entirely an armchair critic. Crevasses are a fact of life and death for people working on glaciers. Glaciers are slot machines, and like slot machines, the odds are in the house’s favour and not yours. I’ve spent entire field seasons end-running or stepping over slots while commuting to field sites, and many trips roped up above the snowline to collect essential samples or measurements.

Here are some snaps from work which is unlikely to be tagged “heroic scientists risk life & limb” on social media any time soon.


“Chris Flynn” preparing a corer on crevassed terrain in the Swedish Arctic


Sampling on bare ice, South Georgia. At this point in the day, the risk from crevasses was nil. But when working on unfamiliar glaciers it pays to wear a helmet and a harness racked up, for two reasons: firstly, it speeds up the transition to roped work; secondly, should a fall occur, it offers a much better point of attachment for rescuers than shell clothing. (Photo courtesy Dr Anne Jungblut).




A team of three moving up to the snowline on a Svalbard glacier. The site has been well known to us for >10 years, but the exact position of deeply incised melt channels varies, meaning a cautious approach is merited at the start of a new season while the glacier is wet. Recreational mountaineers are unlikely to work in threes, carry the full range of kit or (potentially) rope up on terrain such as this. Different ball game.


Yours truly. Given my dainty proportions I am best placed at the tail end of a team of three. In this position, if I do not hold the fall, build decent anchors and transfer safely to them, the consequences are likely to be serious for all.

For each of those trips there is a period of refresher, rehearsal and training. When working with scientists new to glaciers this is essential, but even after a decade I still practise the basic skills of tying in with chest coils, moving while roped, self-rescue and building pulley systems before the start of any period of fieldwork which could require travel on snow-covered glaciers.

None of these photos illustrates a scenario comparable to 127 Hours or Touching The Void – for a good reason. As such they are unlikely to have a fraction of the “reach” of a drama in the media. Yes, we all have bad days at the office from time to time, and sometimes photos or video from such days play a useful role as “teachable moments” – be it for training or inquests. However, their promotion on social media to the general public distorts the reality that these are issues of workplace safety, not fallout from adventure for its own sake.

To sum up, I’ll borrow some boldface:

The underlying principle behind all Antarctic travel is the need to apply conservative and reasoned judgements to all decisions. Accidents often result from a chain of events caused by a number of small errors and bad decisions that eventually snowball into a serious situation.

From the BAS Field Operations Manual. One can substitute Antarctic travel for fieldwork in any number of snow and ice environments without losing the message. In crevasse country, a gram of preparation will beat a kilogram of badass. The Field Ops Manual is publicly available at the user’s risk (and due to be updated) – but is a good source of reference material on field safety in polar environments, along with the Royal Geographical Society Polar Expeditions Manual. Hardly clickbait, but a starting point for being safer.


A nocturnal upon St Lucy’s Day

At the moment I’m escaping the cold in the UK by working up on Svalbard in the High Arctic. Halfway between the top of Norway and the North Pole, the current temperatures are hovering around -3ºC, in spite of the depths of polar night. The next sunrise here in Longyearbyen will be on the 15th of February, 2018.

My work here is part of a project funded by the Royal Geographical Society’s Walters Kundert Arctic Fellowship which is looking at microbial life on glaciers during polar night, and how changes in the Arctic’s winter climate might affect microbial activities. This is the second trip as part of the project. We currently assume microbial activities on glaciers are limited to active melting conditions in summer, but there is little in the literature to validate this assumption.

A bleak future is also being painted in the media for Longyearbyen’s human inhabitants. I consider the notion of a warm, wet winter as a “bleakest midwinter” for the region’s microbes, disturbed from their likely hibernation. Today marks the publication of NOAA’s 2017 Arctic Report Card (Headline: Arctic shows no sign of returning to reliably frozen region of recent past decades) and it is also St Lucy’s day – the year’s midnight, according to John Donne. So, in this post, metagenomics meets metaphysics.


August: As shadow, a light, and body must be here. Svalbard glacier surface showing some algal biomass, dispersed cryoconite and cryoconite holes. All active microbial habitats.


December: Whither, as to the bed’s-feet, life is shrunk/ Dead and interr’d. Or not: Same glacier, excavating microbial habitats from under the snowpack.


The general balm th’ hydroptic earth hath drunk: Soaking excavated cryoconite in RNA later for return to the UK.


New alchemy. Meanwhile, using some spare sample material, I prepared a bucketload of DNA for shotgun metagenomics on Oxford Nanopore Technologies’ new field sequencing kit: a freeze-dried, use-anywhere library preparation kit.


A quintessence even from nothingness. Using a flow cell stored for 3+ weeks and flown to Svalbard at ambient temperatures (with >1200 active pores), I had a go at sequencing the DNA. Compared to our UK trials of the same kit, the results were poor, providing only hints at the microbial community structure.

My thanks are owed to the Royal Geographical Society for funding, and Professor Andy Hodson of UNIS for hosting.

Dead scientists don’t write papers

This is an important article on fieldwork safety written by Elizabeth Orr. It opens with a vignette from her PhD fieldwork in the Himalaya, describing trouble on slippery ice and then a river crossing gone wrong. I’m always very leery of river crossings: from another lifetime I recall a statistic that UK Special Forces have lost more people in water than in anything else they do – so not a trifling risk, but often part of the commute to the office for researchers working in remote environments.

The author notes a disconnect between geoscience fieldwork in principle and in practice. This troubles me.

So is it acceptable for graduate students to be sent into the cold as academic cannon fodder?

Thoughts? I’ve had a few.

JAFA syndrome

Rightly, much of the responsibility to train safe researchers falls on the shoulders of PIs to lead by example and make effective arrangements, but again, in my experience, the skills, qualifications and experience of PIs can vary. Can we expect (busy) academics without any formal training themselves, and with varying levels of skill and experience, to set out an adequate scheme of training and operation? In my field I’ve encountered several principal investigators with respectable publication records indicating experience in the field who are (as an example) unable to use crampons to access a dry glacier. When working in a glacierized environment, is such a person competent to evaluate the risks, own them and mitigate them for their trainees? No. Equally, palming the responsibility for their students off to other researchers or field stations presents complications. All fun while the Nature Geoscience papers are writing themselves, but it makes for a messy inquest when it all goes wrong in the worst way.

Hardly a season in a remote field station is complete without some entirely predictable disaster befalling one research team or other: bad weather, bad logistics, bad company. Under the pressure of watching a costly field season circle the drain, bad decisions are made, pushing the window on weather, sleep/food or logistics. This is when Dr Murphy pays a visit. Better to have Plans B-Z and stick to them – factoring in weather from the start. This advice comes from someone who cut short his honeymoon to deliver a field project, only to spend three weeks waiting for conditions that never improved. It’s part of the package of fieldwork: best learn early.

So, to the pushy PI – I’d argue unsafe data is unethical data and thus unusable data. For the hard-headed, here’s a very pragmatic point of view: if you’re sending students who are borderline hypothermic and nurturing frostnip to collect the next n=2000 sample set, is the execution of the protocol you drafted in an air-conditioned office going to be of the same rigour and attention to detail as you would demand? I doubt it. PIs need to factor in safety at the heart of a fieldwork programme for their students.

Learn and live beats live and learn

Almost any institution will offer courses to grad students on everything from R to referencing (useful, no doubt), but the topic of stayin’ alive and working efficiently in the field is rarely on the environmental research curriculum of any institution I’ve heard of*. Certainly in my own graduate education there were no opportunities for such training; it fell to my own initiative to upskill. In the UK, NERC has sponsored an advanced training short course for polar science students, but its availability is limited and the content highly condensed.

It’s a good start: but it only targets early career researchers, perhaps on the imperfect assumption their seniors have evaded Darwinism long enough to learn.

When my opinion has been sought, I have made the case that specific training for Arctic scientists should be mandatory in the way it is for Antarctic researchers upon deployment. Considering the many pathways and venues for Arctic science make it a relative free-for-all, this is more sustainable than insisting upon field guide “minders”.  Submitting a grant with >xx% polar north fieldwork? Enter your training certificate number in the box on Je-S (other grant submission interfaces are available) and prove your competence against a set standard.

Funders should care about this issue at this level: well trained people reduce the risk of fieldwork failure. In accepting a Royal Geographical Society Arctic & Mountain Research Fellowship recently I was pleased to note their insistence on detailed vetting of fieldwork plans and risk assessments.

It’s the day job

Predictably, people who build careers on fieldwork often love the outdoors and may be very accomplished in outdoor sports. This can be extremely positive, bringing a lifetime of skills and experience to the table. But for new fieldworkers who live for the outdoors, it can also present problems in adapting from playing hard to working hard.

Acceptable risks and concessions to safety in your own time may take on a very different legal and practical complexion in the workplace. Now, the goal is not adventure, but to deliver on (often publicly funded) science safely and effectively. If having an adventure is your primary goal in seeking a career in field science, get out and then get out there. You will not be satisfied by using science as a vehicle to quench your thirst for adventure, nor will you necessarily approach decisions in the field from a professional perspective.

Human factors

Notable by its omission is any discussion of fieldwork harassment. Sure, this is a crucial topic to address in its own right, but a working environment where harassment occurs is by definition not a safe environment. I do not seek to diminish the importance of preventing and confronting sexual harassment by stating that unacceptable behaviour in field settings is a problem which occurs between many demographics. Even from my level of privilege as a white European male, a harmful experience as a student still deeply affects how I interact with anyone I meet in the field a decade later. I can only imagine the impact of behaviour as extreme as recent allegations of misconduct in Antarctica on an early career researcher.

Perhaps we could learn from history. The history of Antarctic exploration is not a shining example of promoting equality and diversity (e.g. the ice ceiling) but I recall a “Golden Era” Swedish expedition forced to overwinter as its ship sank. The standing order from the expedition’s commander was that everyone’s first duty was to be kind to each other. Simple, but powerful in achieving harmony within a cramped, dangerous environment.

Be the change

While the article’s fifteen-point set of recommendations contains very good advice, I think we need a smarter approach at all levels.

We all had a good laugh at #fieldworkfail (well I did, right up until my employer’s PR officer got in touch to ask why my stumbling into a cryoconite hole on Greenland was trending in Germany) but for every minor embarrassment and tale of derring-do there is a deeply unfunny tale. Light-hearted books have been written about #fieldworkfail, but Elizabeth Orr’s article is one of the few contemporary attempts to address the topic to reach my radar.

A few years ago in pre-deployment training for an Antarctic project, an instructor asked what the participants considered to be an acceptable fatality rate for such work. Estimates from the class reached as high as 3%. The same organisation was achieving nearly that rate annually until the mid-1980s. Since then it has avoided all but one fatal accident. A remarkable change. How? By making everyone responsible for their actions, and in particular the leadership. Simply, leaders lead. Everyone else follows.

We may never achieve 0%, but the onus is on all to put safe work at the heart of their field agenda.



(*Correction: Archana Dayal kindly reminded me UNIS puts staff and students through fieldwork safety training. As a general point “Stayin’ Alive 101” isn’t on the graduate curriculum of many larger institutions as far as I am aware)


It’s hard to believe a year has passed since I last wrote up some personal highlights from the Nanopore Community Meeting in New York. Nevertheless, it’s been a year which has marked considerable progress within the world of nanopore sequencing – and in the ways we use nanopore sequencing in our lab.

Summaries of day 1 and day 2 are available here. This year’s meeting was packed with many, many talks from the user community and relatively few announcements of dramatic new tech directions. It seems right now is the time to polish many of the emerging strengths of nanopore sequencing.

To be honest, what is already being achieved is extremely impressive. Consider antimicrobial resistance as a case in point – a major societal challenge. Existing approaches take time, and time can cost lives, as bluntly demonstrated by Charles Chiu’s slide on time to effective antibiosis in sepsis. Several speakers showed how nanopore sequencing changes the dynamic: Patricia Simner reported detecting carbapenem antibiotic resistance genotypes within minutes of sequencing. I expect the next frontier is cost: where the tests are more expensive than many commonly prescribed antibiotics, the impact will be diluted. Low-cost, application-specific nanopore tools may well change this economic balance.

I think the highlight of the conference for me was that, during an interactive plenary, discussion moved from the technology itself to the societal implications of everyday genomics. If technologies such as nanopore sequencing are to be transformative for society, in what ways do we all wish society to be transformed? Would we opt for our children to be sequenced at birth, and what would that mean for insurance? Or for a future where your genome could be sequenced from a handshake and resynthesized in a genome foundry? Hypothetical for now, but the convergence of simplified sequencing, powerful computing and synthetic biology in a biohacker’s lab is one to watch.

This year I had the privilege of presenting in the microbial breakout session once more, reporting on our underground sequencing work. In two experiments in a coal mine, we demonstrated microbial identification using our lightweight metagenomad kit without the benefit of mains power or internet. This experience proved useful in mounting our Arctic campaigns this summer: we used rapid library preparations to characterize the microbial communities of the Greenland Ice Sheet while camped at the ice margin, achieving species-level ID of cyanobacterial ecosystem engineers, and 16S rRNA gene sequencing on Svalbard to examine community responses to habitat changes on the glacier surface. Having shown in-field metagenomics from sample to preprint publication within the likely doubling time of a glacier microbial community in 2016, it’s high time for me to consolidate a body of work in this area.

Cramming a little bit too much into the talk, I also presented our metagenomic sequencing results from the lyophilized field kit. Until now the cold chain has been a significant constraint on deep-field sequencing: essentially, in-field sequencing has simply moved the cold chain from sample return to the lab to the deployment phase. It’s not going to be a problem for much longer. Using flow cells stored at +20°C for six days and freeze-dried library reagents, I was able to sequence a metagenome within two hours of Oxford Nanopore Technologies releasing the protocol. Although the “field site” was my kitchen table rather than the glacier, I’m very happy with the concordance between the field kit, nanopore rapid libraries and Illumina data from the same microbial community. I’ll post more about the lyophilized kit when I return to the UK, and release some data from a second experiment.
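For readers curious what "concordance" means in practice here: one common way to compare taxonomic profiles from two platforms is to convert read counts to relative abundances and correlate them. The sketch below is purely illustrative – the taxon names and counts are invented for the example, not data from our experiments – but it shows the shape of the comparison.

```python
# Illustrative sketch: comparing taxonomic profiles from two sequencing runs.
# All names and counts below are hypothetical, not expedition data.
import math

def relative_abundance(counts):
    """Convert raw read counts per taxon to relative abundances (sum to 1)."""
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

def pearson(profile_a, profile_b):
    """Pearson correlation over the union of taxa; absent taxa count as 0."""
    taxa = sorted(set(profile_a) | set(profile_b))
    xs = [profile_a.get(t, 0.0) for t in taxa]
    ys = [profile_b.get(t, 0.0) for t in taxa]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical genus-level read counts from two runs of the same sample.
nanopore = {"Phormidesmis": 620, "Polaromonas": 210, "Sphingomonas": 95}
illumina = {"Phormidesmis": 5800, "Polaromonas": 2300, "Sphingomonas": 700}

r = pearson(relative_abundance(nanopore), relative_abundance(illumina))
print(f"Pearson r = {r:.3f}")
```

In real analyses you would typically work on classifier output tables and might prefer a rank-based or compositional measure, but the principle is the same: similar community profiles from both platforms yield a correlation near 1.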