Thoughts on Forecasting Timber Prices

17 February 2013

Forecasting timber (stumpage) prices and wood demand relies primarily on the manipulation of historical data and relationships to assess future scenarios and prices. While the statistical techniques are somewhat agnostic, the development and application of relevant forecasts require judgment. How? Context and experience help determine which data may or may not be relevant, which scenarios may or may not reflect reality on the ground, and which critical assumptions we depend on to realize or negate our view of the future. In sum, stumpage forecasts combine statistics with the smell of sawdust.

Generally, a purely data-driven, “built-in-a-vacuum” forecast that relies solely on historical information assumes a future that reflects the past and differs only in the specific variables of interest. It sets aside data inconsistencies, technology shifts, demographic preferences and emerging markets. However, the world of timber prices evolves, adapts and swerves.

For example, in our forecast of pine pulpwood stumpage prices in the U.S. South, we rely heavily on historical macro and micro factors to establish the baselines by state and region. However, wood bioenergy markets lack the history to influence the price forecast in this way. So we develop a separate view, based on how we understand demand to affect prices locally, and layer this into the forecast using a combination of quantitative analysis and experiential judgment. We supplement the effort with multiple tests and scenarios. Every six months, we rebuild our forecast models and review, test and reconsider scenarios and assumptions. (After which we toast the gods and have a drink. Wouldn’t you?) Generally, the amount of testing and supplemental judgment is inversely proportional to the amount and depth of historical data available.
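
To make the baseline-plus-adjustment structure concrete, here is a minimal sketch in Python. It fits a simple trend to historical prices as the data-driven baseline, then applies the bioenergy view as a set of scenario-based uplifts rather than an estimated coefficient, since the market lacks the history to estimate one. Every price, year and uplift percentage below is invented for illustration; actual models would use far richer drivers and state-level data.

```python
import numpy as np

# Hypothetical annual pine pulpwood stumpage prices ($/ton) for one state.
# All numbers here are illustrative, not actual market data.
years = np.arange(2003, 2013)
prices = np.array([8.1, 8.4, 8.9, 9.2, 8.7, 7.9, 7.5, 7.8, 8.0, 8.3])

# Step 1: the data-driven baseline, a linear trend fitted to history.
# (A production model would use macro and micro drivers, not time alone.)
slope, intercept = np.polyfit(years, prices, deg=1)
forecast_years = np.arange(2013, 2018)
baseline = intercept + slope * forecast_years

# Step 2: layer in the bioenergy demand view as explicit scenarios.
# Because these markets lack price history, each uplift is a stated
# assumption to be tested, not a coefficient estimated from the data.
scenarios = {
    "no_bioenergy": 0.00,  # local demand never materializes
    "base_case": 0.04,     # modest uplift from local bioenergy demand
    "high_demand": 0.10,   # aggressive pellet/bioenergy build-out
}

for name, uplift in scenarios.items():
    adjusted = baseline * (1.0 + uplift)
    print(f"{name:>12}: " + ", ".join(f"{p:.2f}" for p in adjusted))
```

Keeping the bioenergy view as named scenario parameters, rather than burying it inside a fitted coefficient, makes the judgment call easy to dissect and revisit at each six-month rebuild.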

In our business, we analyze markets and advise management on capital allocation for timber-related and wood-using assets. Our views on timber markets and modeling prices must be transparent, coherent and integrated. Models need flexibility and transparency to deal with questions, potential errors and new information. To quote Mellody Hobson, the president of Ariel Investments in Chicago, “If you have a problem, the only way to fix it is if you have a process you can dissect, so that when something is missed you can go back to your source document.”
