H.-J. Lüdecke, a member of a German climate-skeptic group, who had an interesting paper published in Climate of the Past last year, has a new paper in Climate of the Past Discussions awaiting peer review. As with the previous paper, it uses spectral analysis to make inferences about the Earth’s climate.

Lüdecke et al (2015) seek to detect the ~200-year de Vries solar cycle in three annually resolved palaeoclimate time series, each more than 2000 years long, and in the reconstruction of solar activity from Steinhilber et al (2012). Of course they find de Vries cycles, so what matters is whether their methods are reasonable. I may have misinterpreted what Lüdecke et al have done, so please correct me if I have got it wrong.

Rather than simply running a Fourier transform on the raw reconstructions, they first process the data. They fit a linear regression to a window containing the first 100 years of data to find the slope and multiply this by 100. They move their window along by one year and repeat the analysis. This gives them the moving rate-of-change of the proxy in units per 100 years.

This moving rate-of-change is a filter that alters the spectra of the data: high-frequency variability is suppressed. It can be reasonable to use a filter before spectral analysis, but there are good filters and bad filters which generate lots of spurious spectral signals. Which is Lüdecke et al’s?

I’m going to generate some white noise and apply Lüdecke’s method to it.

library(gtools)
z <- 1:100
x <- rnorm(2000)
#x <- arima.sim(list(ar = 0.7), n = 2000)
x2 <- running(x, fun = function(a){coef(lm(a ~ z))[2] * 100}, width = 100)
spectrum(x) # flat, as expected for white noise
sx2 <- spectrum(x2)
x11(width = 4, height = 6)
par(mfrow = c(2, 1), mar = c(3, 3, 1, 1), mgp = c(1.5, 0.5, 0))
plot(x, type = "l", xlab = "Time", ylab = "Value")
lines(x2, col = 2)
plot(sx2$freq, sx2$spec, xlim = c(0, 1/10), type = "l", log = "y", xlab = "Frequency", ylab = "Spectrum")
abline(v = 1/200, col = 4)

In the time domain, Lüdecke-filtered white noise shows considerable centennial variability. In the spectral domain, Lüdecke-filtered white noise shows a spectral peak near 200 yr. A de Vries cycle in white noise? With red noise (ar(1) = 0.7) the result is similar.
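The running-slope operation is linear, so it can be rewritten as an ordinary FIR convolution. A quick sketch (my own check, not from the paper) showing that `running()` + `lm()` and a fixed-weight convolution give identical values:

```r
# The 100-point OLS slope is a weighted sum of the window values,
# so running() + lm() is equivalent to convolution with fixed weights.
m <- 100
z <- 1:m
w <- 100 * (z - mean(z)) / sum((z - mean(z))^2)  # OLS slope weights, scaled per 100 yr

set.seed(1)
x <- rnorm(500)
# trailing-window convolution (stats::filter with sides = 1)
x_fir <- stats::filter(x, rev(w), sides = 1)
# explicit regression on the first window, as in the chunk above
slope1 <- unname(coef(lm(x[1:m] ~ z))[2]) * 100
stopifnot(isTRUE(all.equal(slope1, as.numeric(x_fir[m]))))
```

Viewing the procedure as a FIR filter makes its frequency response easy to reason about.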

Lüdecke et al construct confidence intervals for their spectra from an analysis of 10 000 random time series with the same lengths and Hurst exponents as the proxy series. In principle this ought to correct for any artefacts from the filter. However, it is not clear that Lüdecke et al apply their filter to the simulated data. I say that first because they do not state that they do, and second because their confidence intervals lack the oscillating structure that is apparent in the periodogram of Lüdecke-filtered white noise (this structure becomes much clearer when several periodograms of Lüdecke-filtered white noise are averaged together).

# assumes the previous chunk has been run (gtools loaded, sx2 defined)
z <- 1:100
n100 <- replicate(100, {
  x <- rnorm(2000)
  #x <- arima.sim(list(ar = 0.7), n = 2000)
  x2 <- running(x, fun = function(a){coef(lm(a ~ z))[2] * 100}, width = 100)
  spectrum(x2, plot = FALSE)$spec
})
x11(width = 4, height = 4)
par(mar = c(3, 3, 1, 1), mgp = c(1.5, 0.5, 0))
plot(sx2$freq, rowMeans(n100), type = "l", log = "y", xlim = c(0, 0.1), ylim = c(1e-3, 22), xlab = "Frequency", ylab = "Spectrum")
lines(sx2$freq, apply(n100, 1, quantile, prob = 0.95), col = 2)
abline(v = 1/200, col = 4)
legend("topright", legend = c("Mean spectrum", "95% of distribution"), col = 1:2, lty = 1, bty = "n")

If the spectra of the surrogate time series have not been treated in the same way as the proxies, the confidence intervals will be invalid.

Another curiosity in Lüdecke et al’s analysis is the padding of the data with 25000 zeros. This allows a much higher spectral resolution in the figures, but adds no information.
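A quick illustration of the point (my own sketch, assuming nothing about their implementation): padding refines the frequency grid at which the periodogram is evaluated, but the underlying data are unchanged.

```r
# Zero padding increases the number of frequencies in the periodogram,
# but cannot add information to the series itself.
set.seed(1)
x <- rnorm(256)
s0 <- spec.pgram(x, taper = 0, plot = FALSE)            # no padding
s1 <- spec.pgram(x, taper = 0, pad = 10, plot = FALSE)  # pad with 10x the length in zeros
c(unpadded = length(s0$freq), padded = length(s1$freq)) # finer grid, same data
```

The padded periodogram is just an interpolation of the unpadded one onto a denser grid of frequencies.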

The paper goes on to fit sine curves to the filtered data, and make projections for the next century (cooling of course), but if the first part of the paper falls, the whole paper falls.

Lüdecke, H.-J., Weiss, C. O. & Hempelmann, A. (2015) Paleoclimate forcing by the solar De Vries/Suess cycle. Clim. Past Discuss., 11, 279–305.

Neat. I wondered exactly the same thing. In the “confidence interval” section they do something interesting too.

It looks, to me, like they generate 10 000 synthetic series, filter them, then calculate the correlation with the sine curves derived from the actual proxy series. This is rather a steep hurdle for the synthetics to jump, because the sine curves were chosen to maximise the correlation with the actual proxy series. The synthetics could be out of phase, or have a slightly different period, and hence a much lower correlation.

Imagine, as an example, what would happen if you calculated the correlation of one of the actual proxy series with the sine curves derived from another – the correlations would have to be lower, and this is in a series in which the authors identify a “significant” periodicity.

A more sensible way to calculate the confidence interval would be to process the synthetics exactly as if they were the original series all the way through to the correlation calculation. I imagine the correlations would then be a good deal higher for the reasons you note above – the running filter has a weird spectral response.
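As a sketch of what that pipeline might look like (entirely my own construction: `ols_slope_filter` reimplements the running slope as a convolution, `best_sine_fit` is a grid-search stand-in for whatever fitting method the authors actually used, and the window and series lengths are scaled down for speed):

```r
# Running OLS-slope filter, written as a FIR convolution (base R only)
ols_slope_filter <- function(x, width = 50) {
  z <- 1:width
  w <- width * (z - mean(z)) / sum((z - mean(z))^2)
  as.numeric(na.omit(stats::filter(x, rev(w), sides = 1)))
}

# Fit each series its OWN best sine: grid search over period, with the
# phase optimised by regressing on sin + cos terms
best_sine_fit <- function(y, periods = seq(100, 300, by = 20)) {
  t <- seq_along(y)
  max(sapply(periods, function(p)
    summary(lm(y ~ sin(2 * pi * t / p) + cos(2 * pi * t / p)))$r.squared))
}

set.seed(42)
# null distribution: every surrogate goes through the FULL pipeline
null_r2 <- replicate(100, best_sine_fit(ols_slope_filter(rnorm(500))))
quantile(null_r2, 0.95)  # the honest hurdle a proxy series must clear
```

The key point is that each surrogate gets its own optimised sine, just as the proxies did, so the null distribution reflects the selection involved in the fitting.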

But as you say, if they fluffed their first move…

Cheers,

John

“This moving rate-of-change is a filter that alters the spectra of the data: high frequency variability is suppressed.”

It doesn’t just suppress the HF. The spectrum rises linearly at low frequencies, then tapers for periods longer than the regression window. So it is a band-pass filter, selecting frequencies that peak a little below the frequency corresponding to the regression period. So yes, it’s a generator. Recent post here.

Its spectrum is the red curve in the plot below. Period 10 years, but it scales. The other curves are regressions with tapering.
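The band-pass character is easy to verify numerically. A sketch of my own (using m = 100 to match the paper’s window, since, as noted, it scales): evaluate the transfer function of the OLS-slope weights around the unit circle.

```r
# Gain of the m-point OLS-slope filter at each frequency
m <- 100
z <- 1:m
w <- (z - mean(z)) / sum((z - mean(z))^2)
f <- seq(0.0005, 0.5, by = 0.0005)
gain <- sapply(f, function(fr) Mod(sum(w * exp(-2i * pi * fr * z))))
f_peak <- f[which.max(gain)]
1 / f_peak  # peak period: roughly 1.5x the window length, not the window itself
```

For a 100-year window the gain peaks at a period of about 150 years, below the window frequency, with the linear rise at low frequencies and tapering beyond, consistent with the band-pass description.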

This is likely a symptom of deep boredom, but I pushed a time series of volcanic forcing for the past thousand years through the filter. Gives a cycle of damn near 200 years (four and a bit full “cycles” in 900 years).

http://www.cru.uea.ac.uk/cru/projects/soap/pw/data/model/echo-g/echo-g_forcing.htm

From which we can conclude… that the De Vries/Suess cycles cause volcanoes. Obviously.

This correlation of solar and volcanic forcing over the last millennium is a massive pain when trying to understand climate change over this time. So many papers conclude that solar forcing drove climate variability, neglecting the volcanic forcing which has the same general pattern.

Well, I’d like to see Figure 3 as a log-log plot, as that’s standard EE practice. Any time I see a linear-linear plot I’m immediately suspicious (particularly if the spectra are smoothed yet two successive single-point peaks are separated by a single-point trough – if you can’t plot the raw spectra, that’s a non sequitur, in my book at least).

As to the 100-year (i.e. 100-point for annual data) OLS FIR filter, it does give one a slope (1st derivative) which you can numerically integrate to get a somewhat ‘lumpy’ low-frequency time series. AFAIK (at least in my experience), all FIR filters display nodal behavior, so for a time series where n (2000) is an even multiple of the filter length m (100), you will get m/2 − 1 = 49 nodes, at ~1.5/100, ~2.5/100, ~3.5/100, …, ~49.5/100. The OLS FIR filter also has about the worst normalized filter characteristic of any filter I’ve ever seen for estimation of the 1st derivative (there really is no pass band or stop band to speak of; the entire curve is one big transition band). If I were to use a FIR filter, it would be something like LOESS, and then FD stencils to compute the 1st and 2nd derivatives.

But given a long uniform-in-time series, my 1st choice will always be a 2-pass Butterworth filter. At least that filter has a monotonic frequency characteristic. Another bonus is that you can apply FD stencils BEFORE the filter stage (in DP or QP); these will look as noisy as all get out, but after filtering you get the exact same answer as if you had filtered 1st and then done the FD stencil thing. The downside is the autocorrelation introduced; this usually means down-sampling (decimation) below the half point of the filter.
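For the record, a minimal two-pass Butterworth sketch in R, using the signal package (my assumption; the commenter doesn’t name an implementation, and the order and cutoff below are arbitrary choices for illustration):

```r
library(signal)  # CRAN package providing butter() and filtfilt()

# 4th-order low-pass, 50-year cutoff for annual data
# (W is the cutoff frequency as a fraction of the Nyquist frequency, 0.5 cyc/yr)
bf <- butter(4, W = (1/50) / 0.5, type = "low")

set.seed(1)
x <- rnorm(1000)
x_smooth <- filtfilt(bf, x)  # forward + backward pass: zero phase shift
```

`filtfilt` runs the filter forwards and then backwards, which cancels the phase distortion a single pass would introduce – hence “2-pass”.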

Doing stuff like PCA is WAY above my pay grade.

My basic complaint about filtering data though, is the potential for selection bias introduced via the filtering process itself. And in that regard, this ‘paper’ looks like a sure wiener (you just gotta love the bounded periodicity approaches, ice age for sure, my fitted sine curves say so).

From Tables 2 & 3 we have periods in years of: 186.0, 189.4, 200.8, 202.2, 184.0, 189.9, 199.7, 204.8, 203.0, 228.1, 230.9, 215.5, 232.1.

Are they extrapolating a cycle over the Little Ice Age now? I can’t tell what they’ve done, so I went off to find some music to go with it: https://www.youtube.com/watch?v=roB4hRdLlas&list=RDp65WOKJUfu4&index=15

Richard, are you going to expand this and send it in as a review?

I’m waiting to see what the reviews say. If none of the reviewers (or other commenters) make the points I’ve made here, I will send something to COPD.

You may be on deck. Ref 2 gets part of it but not all.

Yes, ref 2 is not impressed, but doesn’t show teeth. I’ll wait for the authors to respond – they replied to ref 1 fairly fast.

Time to show your teeth. Past time actually

Sharpening knives (and trying to finish the analysis for my EGU poster next week)

OK, now I am confused. They write that they use the OLS filter to analyse the data, but then they don’t use it. The spectral analyses (Figs 3–5) are on the raw data.

The deed is done.

Seems the paper didn’t survive!

“This discussion paper has been under review for the journal Climate of the Past (CP). A final paper in CP is not foreseen.”

Indeed – and I’m not surprised. The only question is, will the authors squeal that the scientific establishment conspired to suppress their revolutionary ideas or admit their paper was junk?