There is no scientific misconduct …

The youngest patient was two years old when she died, three months after Professor Paolo Macchiarini’s experimental surgery to replace her trachea. Since then, Macchiarini has been accused of scientific malpractice. An external investigation found that he reported in a paper that ethics approval had been obtained when it had not, that he misrepresented patients’ outcomes in several papers, and identified other issues that it rated as malpractice.

The vice-chancellor at the Karolinska Institute, where Macchiarini worked, decided that Macchiarini acted “without due care”, but that his behaviour “does not qualify as scientific misconduct”.

That might have been the end of the matter but for the work of journalists. Vanity Fair reported Macchiarini’s plans to marry the producer of a TV documentary about his work in a ceremony officiated by Pope Francis at the Pope’s summer residence, with guests including Vladimir Putin and Barack Obama. In reality, the Pope’s plans took him to South America rather than to the wedding of the already-married surgeon. Macchiarini’s CV was only slightly less fanciful. Sveriges Television alleged that some Russian patients Macchiarini operated on were not ill enough to warrant such a risky procedure.

The misconduct investigation into Macchiarini’s work was subsequently reopened: he was fired (and is under investigation for manslaughter by Swedish prosecutors), and the vice-chancellor resigned, as did several eminent scientists, including Nobel-prize judges.

Macchiarini’s work hastened the deaths of several patients, yet until pressurised by the media, the Karolinska Institute was prepared to overlook misconduct. There is no scientific misconduct so severe that distinguished scientists might not seek to ignore it. How can we ensure that university investigations into research misconduct (or indeed other types of misconduct) are thorough and fair, and as importantly, seen to be thorough and fair? Quis custodiet ipsos custodes?

Posted in Misconduct, Uncategorized | Tagged | 1 Comment

Bob Irvine’s zombie paper (hide the tin foil)

A couple of years ago, I criticised a paper by Bob Irvine published by WIT (a publisher on Beall’s List of possibly predatory publishers). Shortly afterwards, the paper was retracted, with the editor writing “I have now received the result of a peer evaluation carried out urgently yesterday on the paper you brought into question, and have decided to withdraw it from our eLibrary.”

Recently, a commenter told me that a revised version of the paper was now published.

I don’t know what has changed in the paper, as I didn’t keep a copy and the journal does not disclose what changed. But I can compare the abstract. It was one paragraph; it is now four paragraphs. The text is otherwise identical (except that “greenhouse” has been mis-corrected to “Green House”). The declaration that Irvine does not understand climate models is still there:

Most Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models assume that the efficacy of a solar forcing is close to the efficacy of a similar sized Green House Gas (GHG) forcing.

As are the tin foil experiments.

If the paper was bad enough to merit retraction two years ago, it really isn’t clear why it merits publication now.




Posted in Fake climate sceptics | Tagged | Leave a comment

Data archiving at The Holocene: policy and practice

When I read a recent paper in The Holocene, I wondered, the way one does, whether the data were available, and turned to The Holocene’s submission guidelines.

SAGE [the publisher] acknowledges the importance of research data availability as an integral part of the research and verification process for academic journal articles.

The Holocene requests all authors submitting any primary data used in their research articles if the articles are accepted to be published in the online version of the journal, or provide detailed information in their articles on how the data can be obtained. This information should include links to third-party data repositories or detailed contact information for third-party data sources. Data available only on an author-maintained website will need to be loaded onto either the journal’s platform or a third-party platform to ensure continuing accessibility. Examples of data types include but are not limited to statistical data files, replication code, text files, audio files, images, videos, appendices, and additional charts and graphs necessary to understand the original research. The editor can also grant exceptions for data that cannot legally or ethically be released. All data submitted should comply with Institutional or Ethical Review Board requirements and applicable government regulations. For further information, please contact the editorial office.

The policy is not as strong as I would like – it requests rather than requires – and the first sentence of the main paragraph is difficult to parse. What is missing is a requirement for a data availability statement. Nature announced yesterday that this will be required in their journals.

But how well is the data archiving mandate followed? I’m going to look at the latest issue of The Holocene and see whether data have been archived for the papers. Note, I don’t know when this policy came into force, so this test of compliance might be unfair if papers were submitted before the policy was announced.

The issue contains twelve research papers: one paper has most of the data in tables within the paper; four have supplementary online material; none link to third-party (e.g., figshare or datadryad), institutional or personal data repositories.

Unfortunately, none of the supplementary online material is currently online (8th September). This is a tiny bit hopeless, as the papers have been online since April. It would be so cunning to publish the supplementary material at the same time as the paper, ideally with a link from the PDF.

The absence of the supplementary material means that I cannot tell whether these papers archive data. Even being optimistic about their contents, compliance with the data archiving mandate is below 50%.

It doesn’t have to be this way. The Journal of Ecology had a 93% data archiving compliance rate in 2015.





Posted in Peer reviewed literature | Tagged , | Leave a comment

Czarny Staw Gąsienicowy

Whenever I see a lake, especially one as beautiful as Czarny Staw Gąsienicowy in the High Tatras, I wonder about what hypotheses could be tested with a sediment core.


The lake, in a glacial cirque at 1624 m a.s.l., is 51 m deep. At that depth, a Kajak corer or a micro-Kullenberg corer would be the obvious corers to use. Both are line-operated; the former is good for sediment cores up to 50 cm long, while the latter has a piston that improves sediment recovery when collecting longer cores (perhaps 2 m). Fortunately, that is probably about as much lacustrine sediment as there is in the lake. The lake could be cored from the ice in spring (I’ve never done this – I was supposed to core a Finnish bay, but the ice was too thin), or from a small boat.

There are, not surprisingly, several palaeoecological studies on Czarny Staw Gąsienicowy and neighbouring lakes in the High Tatras (this list does not pretend to be complete – please add any I have missed in the comments).

Sienkiewicz and Gąsiorowski (2014) take short cores from Czarny Staw Gąsienicowy and two other lakes, investigate the diatom stratigraphies over the last millennium, and use EDDI to reconstruct nutrient status. The two other lakes have tourist cabins in their catchments and show eutrophication (the Secchi-disc depth is 12 m – these are not your pea-soup eutrophic ponds).

Gąsiorowski and Sienkiewicz (2010) investigate diatom and cladoceran stratigraphies from short cores from two lakes south of Czarny Staw Gąsienicowy (not the lakes in Sienkiewicz and Gąsiorowski (2014)). They infer recent acidification-driven changes in the stratigraphies following earlier climate-driven changes.

Kubovčík and Bitušík (2006) examine the chironomid response to pH changes in three lakes with different susceptibility to acidification on the Slovakian side of the Tatras. The best-buffered lake shows no change in chironomid assemblages, whereas the least-buffered lake has a large change in assemblage composition and a large drop in chironomid abundance.

Šporka et al (2002) investigate several proxies from Nižné Terianske pleso in Slovakia. Spherical carbonaceous particles give a good indication of the timing of atmospheric pollution. The pigment record is ambiguous (as it often is): downcore changes may be driven entirely by diagenesis, but there may also be a signal from changes in trophic state revealed by the other proxies. There was “no clear relationship between chironomid assemblage and temperature change”. Diatom and chrysophyte assemblages appear to have been influenced by acidification and perhaps also by warming.

On a longer time-scale, Marciniak (1986) presents a diatom stratigraphy from a 3 m-long core from Przedni Staw Lake that starts in the Older Dryas. It is a very descriptive paper, the likes of which would be difficult to publish now, and mentions some previous work on the same lake, including Cladoceran analyses.

There is also pollen-analytical work, including Rybníčková and Rybníček (2006) and Kłapyta et al (2015).

An overview of limnological studies in the Carpathian region, of which the Tatras are a part, is given by Buczkó et al (2009).

Posted in Uncategorized | Leave a comment

A comment on Lyons et al is published

Alas it is not the comment that I helped write. But hopefully that will be published soon.

To recap, Lyons et al (2016) report that the proportion of species pairs that are aggregated (i.e. co-occur) rather than segregated began to decline in the mid-Holocene, coinciding with the spread of agriculture across North America. They conclude that the organization of modern and late Holocene species assemblages

differs fundamentally from that of assemblages over the past 300 million years that pre-date the large-scale impacts of humans.


Bertelsmeier and Ollier dispute these findings. Their first objection is that Lyons et al treat their proportion data as if they come from a Gaussian distribution. This is an entirely reasonable point to make.

A proportion of 50% aggregated species pairs could have been calculated either from one aggregated and one segregated species pair, or from 100 aggregated and 100 segregated pairs. The reliability of the estimate is clearly not the same. In total, 44% of the proportions are based on 5 or fewer significantly non-random species pairs, from assemblages with several thousand random species pairs.
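The reliability point can be made concrete with the binomial standard error of a proportion, which shrinks with the square root of the number of pairs (a minimal sketch with hypothetical numbers, not B&O’s actual calculation):

```python
from math import sqrt

def proportion_se(p, n):
    """Binomial standard error of a proportion p estimated from n pairs."""
    return sqrt(p * (1 - p) / n)

# A 50% aggregated proportion estimated from 2 pairs vs 200 pairs
se_small = proportion_se(0.5, 2)    # ~0.354
se_large = proportion_se(0.5, 200)  # ~0.035
```

With two pairs the standard error is ten times that with 200 pairs, so the two 50% estimates are far from equally trustworthy.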

By using a Gaussian rather than a binomial distribution, Lyons et al give more weight than can be justified to data sets with few significantly aggregated or segregated taxon-pairs, where the proportion of significant pairs is inherently uncertain. They also risk predictions from their model escaping the zero-to-one range of proportion data (this is guaranteed to happen if the model is extrapolated far enough).

Bertelsmeier and Ollier find that if the breakpoint analysis is re-run using a binomial error distribution, there is no breakpoint at 6000 years BP. Instead, a breakpoint occurs in the very recent data points.

In their reply, Lyons et al naturally object to this. They point out, correctly, that the taxon-pairs are not independent. If there are n taxa, there will be 0.5n(n-1) taxon-pairs and each taxon occurs in n-1 pairs.
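The non-independence is easy to see by enumeration: with n taxa there are n(n−1)/2 taxon-pairs and each taxon appears in n−1 of them (a quick check, using 20 hypothetical taxa):

```python
from itertools import combinations

taxa = list(range(20))                 # 20 hypothetical taxa
pairs = list(combinations(taxa, 2))    # all unordered taxon-pairs

n = len(taxa)
n_pairs = len(pairs)                   # n(n-1)/2 = 190 pairs
# each taxon occurs in n - 1 = 19 pairs, so pairs sharing a taxon
# cannot be independent observations
occurrences = [sum(t in p for p in pairs) for t in taxa]
```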

Lyons et al find that the Akaike information criterion (AIC) provides much stronger support for a breakpoint analysis if the data are assumed to come from a Gaussian distribution (74.6) than from a binomial distribution (637.0); lower AIC values suggest better models.

Lyons et al conclude that this means their preferred model is better, but it is at best a poor approximation of the error structure in the data. A better strategy would be to deal with the over-dispersion in the data caused by the non-independence of the taxon-pairs. This could easily be done by using a quasibinomial error distribution, which relaxes the assumption about the relationship between the mean and the variance of the residuals.

Unfortunately, a breakpoint analysis on a GLM fitted with a quasibinomial error distribution does not converge. Even if it had, it would not have an AIC and so could not be compared directly with the Gaussian model, but it would in principle be a much better model. Alternatives for dealing with over-dispersion in proportion data, such as a beta-binomial model, would probably require some effort before the breakpoint analysis in the segmented package would work with them.
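The idea behind the quasibinomial approach can be sketched without any model fitting: estimate a dispersion factor as the Pearson statistic divided by the residual degrees of freedom, and inflate the binomial variance by it. A minimal sketch with made-up counts (not Lyons et al’s data):

```python
# k = hypothetical counts of significantly aggregated pairs per data set,
# n = significant pairs tested per data set
k = [1, 9, 5, 0, 10]
n = [10, 10, 10, 10, 10]

p_hat = sum(k) / sum(n)                 # pooled proportion (0.5 here)
# Pearson X^2: squared residuals scaled by the binomial variance n*p*(1-p)
x2 = sum((ki - ni * p_hat) ** 2 / (ni * p_hat * (1 - p_hat))
         for ki, ni in zip(k, n))
dispersion = x2 / (len(k) - 1)          # X^2 / residual df for a constant mean
```

A dispersion well above 1, as here, signals over-dispersion: the proportions vary far more between data sets than a plain binomial would allow, which is exactly what non-independent taxon-pairs produce.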

We don’t discuss the problem with the choice of a Gaussian model in our comment. With the word limit, we were restricted to what we saw as the most critical problems (inappropriate dataset selection, including duplicate datasets; pathological behaviour of the breakpoint analysis; and biases in the proportion of aggregated taxon-pairs with dataset size), and wanted to keep the rest of our analysis as close to Lyons et al’s methods as possible.

In their reply, Lyons et al write that

Bertelsmeier and Ollier argue that datasets with only a few significant pairs should be excluded because those estimates are unreliable.

Except that I don’t think that B&O argue this. They do argue, as shown above, that data sets with few significantly non-random species pairs give less reliable estimates, but not that they should be excluded.

B&O’s second argument is that the temporal extent of each data set in Lyons et al’s analysis is, perhaps not surprisingly, correlated with its age. B&O suggest that this could bias the proportion of aggregated taxon-pairs. I don’t find this argument any more compelling than Lyons et al’s argument that disturbance causes an increase in aggregated taxon-pairs. To me, aggregation and segregation look like two sides of the same coin: if you increase one, you increase the other, and the proportion of each stays the same (of course, biases in the numerical methods may create patterns).

Lyons et al

stand by [their] original analyses and conclusions.

Interestingly, one of the original authors did not join the reply. I wonder whether publishing a paper that concludes

Because aggregated and segregated species pairs may be shaped by similar processes both can be used to infer processes of community assembly.

which would appear to contradict Lyons et al, had anything to do with this.

Posted in Uncategorized | Leave a comment

The Humpty Dumpty theory of palaeoecology

“When I use a proxy,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean – neither more nor less.”

“The question is,” said Alice, “whether you can make a proxy mean so many different things.”

With apologies to Lewis Carroll

Just as I was beginning to run out of things to write about in the chironomid reconstruction from Lake Żabińskie (I still want to write about figure 2 from Larocque-Tobler et al and ask why it is missing 18 lakes), along comes another chironomid reconstruction from Lake Żabińskie, this one spanning the last 1000 years (Hernández-Almeida et al 2016).

You might expect that a 1000-year August air temperature reconstruction would be published in its own paper, as it is sure to be included in compilations of palaeo-climate reconstructions, but no, it is tagged onto the end of a multiproxy study where most of the time the chironomid stratigraphy is used as an anoxia indicator.

Chironomids are of course sensitive to anoxia (most oxygen breathers are), but it is not clear why the authors believe that the first principal component of the chironomid stratigraphy is an anoxia indicator. They write

In Lake Żabińskie, between AD 1896 and 2011, the chironomids PC1 had a relationship with changes in anoxic taxa

but don’t inform the reader that the relationship is very weak (Pearson’s correlation coefficient = 0.23) and non-significant. Not much of a relationship. The second PC axis, in so far as you would trust any ordination axis of these data, has a somewhat higher (0.37) and significant correlation.

Since the chironomid assemblage is a mixture of littoral chironomids (73%) that live in the oxygenated water above the thermocline and profundal chironomids that live in deeper, perhaps oxygen-depleted water, the relationship between anoxia and chironomid assemblages over time is likely to be complex and noisy.


Chrysophyte-inferred winter temperatures and chironomid-inferred summer temperatures

Despite the importance of the chironomid record, the stratigraphy is not shown and no reconstruction diagnostics are given. There is no way for the reader to evaluate whether the reconstruction is any good (it is possible that some of this essential information is included in the supplementary online material which does not seem to be available at the moment).

I do hope that the authors have used the correct version of the chironomid stratigraphy, and that the data will be archived promptly. I wonder what the count sums were?



First principal component of chironomids, diatoms and chrysophytes. Note the poor correspondence with the reconstructions.

Hernández-Almeida et al also include a chrysophyte-based reconstruction of winter temperature (first presented in Hernández-Almeida et al 2015a), but spend most of the paper using the first principal component of the chrysophyte data as a nutrient indicator. In contrast, Hernández-Almeida et al 2015b, interpret the same chrysophyte stratigraphy as an indicator of calcium concentrations and relate this to May-October zonal wind speed. These two previous papers by Hernández-Almeida et al make no attempt to acknowledge, still less address, these radically different interpretations of the same data. The winter temperature reconstruction follows previous literature and seems to have been planned in advance, whereas the zonal wind reconstruction reads like a fishing expedition.

Of course I understand the motivation for producing more than one reconstruction from a proxy record that took months of microscope time to generate. And of course I fully accept that biotic proxies are influenced by multiple environmental variables, and that different variables might be reconstructable at different sites. The problem is in knowing which. This is why well-designed palaeoecological experiments are so powerful.

Reconstructing multiple variables from the same data inevitably means that the assumption of transfer functions that

Environmental variables other than the one of interest have negligible influence, or their joint distribution with the environmental variable does not change with time.

must be violated. Juggins (2013) showed that ecologically important secondary environmental variables can severely bias reconstructions. The chironomid anoxia response will bias the temperature reconstruction and vice versa; the chrysophyte winter temperature response will bias the May–October zonal wind speed record and vice versa.

Considerable work is needed to demonstrate that a proxy can be used to make meaningful reconstructions of a single variable. Far more is needed to make a convincing case for multiple reconstructions: Hernández-Almeida et al don’t even try.

“The question is,” said Humpty Dumpty, “is it publishable – that’s all.”

Posted in Peer reviewed literature, transfer function | Tagged , , , | 2 Comments

Nei, de er ikke geiter (no, they are not goats)

I knew they were not ibex or ibis, so I assumed they were goats, high (at least it felt high) in the Tatra Mountains. Later I was asked if I had seen any chamois. No. But I had seen… wait, what does a chamois look like?

The Tatra chamois is critically endangered, but the population has recovered substantially over the last couple of decades.

These are what they probably eat.


Posted in Uncategorized | Tagged | Leave a comment