All age-depth models are wrong, but getting better

Today at EGU, Mathias Trachsel presented an update to my 2004 paper “All age-depth models are wrong: but how badly?”. He examined the performance of the Bayesian age-depth models developed over the last decade. Generally, they perform better than the classical age-depth models, but there are some problems with setting their parameters.

His presentation can be downloaded here.

A manuscript based on the same analyses is almost ready for submission.


3 Responses to All age-depth models are wrong, but getting better

  1. Kaustubh says:

    Excellent! Thank you for posting – this was quite appropriate considering what I was working on today. Looking forward to seeing the manuscript…

  2. ucfagls says:

    Thanks for posting this & the slides, Richard, Mathias, Joe.

    A couple of comments related to whether you are comparing apples with oranges regarding CLAM and the Bayesian methods.

    1) Regarding the point “Not very realistic uncertainty estimates”: the confidence interval for the CLAM models is for the mean of the response. Are the confidence intervals for the Bayesian methods from the posterior on the predicted value? If so, you aren’t really comparing like for like, are you? Wouldn’t you need some form of Monte Carlo simulation to get comparable prediction intervals from CLAM: simulate from the posterior of the spline coefficients, add Gaussian noise (assuming that is the model CLAM fits), and repeat many times (see the sketch after this list)?
    2) Regarding “Very smooth accumulation rates”: isn’t that a given because you used a spline, which essentially assumes that the data-generating process is, or can be approximated by, a smooth function? That said, I doubt you can do much with so few observations in the context of a spline model.
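
    A minimal sketch of the Monte Carlo idea in point 1, using mgcv rather than CLAM itself; the data frame dates (columns depth and age), the model settings, and the Gaussian error model are all assumptions for illustration:

    library(mgcv)
    library(MASS)

    ## hypothetical calibrated dates: depth (cm) and age (cal yr BP)
    dates <- data.frame(depth = c(5, 30, 55, 80, 110),
                        age   = c(150, 1200, 2100, 3400, 4800))

    fit  <- gam(age ~ s(depth, k = 4), data = dates)  # spline age-depth model
    newd <- data.frame(depth = 0:110)
    Xp   <- predict(fit, newd, type = "lpmatrix")     # maps coefficients to fitted ages

    set.seed(1)
    beta <- mvrnorm(1000, coef(fit), vcov(fit))       # draws from the coefficient posterior
    mu   <- Xp %*% t(beta)                            # simulated mean age-depth curves
    sims <- mu + rnorm(length(mu), sd = sqrt(fit$sig2))    # add Gaussian observation noise
    pred.int <- apply(sims, 1, quantile, c(0.025, 0.975)) # 95% prediction band
    conf.int <- apply(mu,   1, quantile, c(0.025, 0.975)) # 95% band on the mean

    Comparing pred.int with conf.int shows how much wider a like-for-like prediction interval is than the confidence band on the mean response.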

    If you haven’t already, you should look at the monotonic constraints on splines in the mgcv package. I have a function that I’ve been using recently for 210Pb models for this purpose, which you’d be welcome to have or look at. You can basically force the fitted spline to be monotonic by fitting with gam(), then setting up the constraints with the mono.con() function in mgcv, and refitting the model with pcls(), which takes the extra constraint into account (see the sketch below).
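
    A minimal sketch of that gam()/mono.con()/pcls() workflow, adapted from the example on mgcv’s ?pcls help page; the simulated depths and ages are hypothetical stand-ins for real dates:

    library(mgcv)

    ## hypothetical dated depths (cm) and ages (cal yr BP)
    set.seed(2)
    depth <- sort(runif(20, 0, 100))
    age   <- 50 * depth + rnorm(20, sd = 300)
    dat   <- data.frame(depth = depth, age = age)

    ## unconstrained fit, kept for its smoothing parameter
    f.ug <- gam(age ~ s(depth, k = 10, bs = "cr"), data = dat)

    ## set up the smooth and the monotonicity constraints
    sm <- smoothCon(s(depth, k = 10, bs = "cr"), dat, knots = NULL)[[1]]
    F  <- mono.con(sm$xp)           # constraints forcing an increasing spline

    G <- list(X = sm$X, C = matrix(0, 0, 0), sp = f.ug$sp,
              p = sm$xp, y = dat$age, w = dat$age * 0 + 1,
              Ain = F$A, bin = F$b, S = sm$S, off = 0)

    p  <- pcls(G)                   # penalised constrained least-squares refit
    fv <- Predict.matrix(sm, data.frame(depth = 0:100)) %*% p  # monotone fitted ages

    pcls() minimises the same penalised least-squares objective as gam(), but subject to the linear inequality constraints in Ain/bin, so the fitted ages can never decrease with depth.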

    • 1) Yes, agree that the correct interpretations of the CLAM and Bayesian uncertainties are different. But in practice, people treat them as if they were the same. I’m not sure that the correct interpretation of the CLAM uncertainties is particularly useful. The MS will discuss this.

      It might be possible to post-process the results of a spline model to obtain Bayesian-equivalent errors, but no one does, and it would not be possible for linear interpolation models.

      2) Agree that the spline model is guaranteed to give smoothly changing accumulation rates. That doesn’t worry me as much as the very narrow uncertainty on the accumulation rates.

      3) Agree that it would be easy to add a monotonicity constraint to CLAM for splines; not sure about the polynomial models. It would improve CLAM, but not enough to make it my method of choice.
