INFORMS Roundtable Meeting at Marina del Rey, CA
12-13 February 2006

The Winter 2006 Roundtable retreat was held at the Marriott Marina del Rey following a wonderful event at the J. Paul Getty Center in Los Angeles.  Approximately 40 Roundtable meeting attendees traveled 30 minutes from the hotel to the impressive Getty Center.  The retreat included a luncheon hosted by UCLA in the Getty private dining room to recognize the Roundtable for its contributions to the OR/MS profession over the past 24 years.  The group toured the campus with Michael Palladino (lead architect) and Tom Farrell (designer) of the Getty Center.  Brief biographies for each of these architects can be found in Attachment 1.0.

The theme of the winter Roundtable meeting was “Forecasting.”  This meeting was coordinated by Milton Pope for the Roundtable Meetings Committee.

Marina del Rey Meeting Welcome

Dr. Milton Pope welcomed Roundtable members and guests to the Los Angeles meeting.  He previewed the meeting agenda.

Milton introduced George Freestone, the General Secretary for the Roundtable.

Administrative Items

Evaluation forms for the Marina del Rey Meeting were provided to each attendee and George encouraged each member to return the completed form to him during the meeting.  Summary results from the returned meeting forms for Marina del Rey are enclosed as Attachment 2.1.

The Roundtable’s unaudited December 2005 financial statement is shown in Attachment 2.2.

Meeting attendees are listed in Attachment 2.3.

George also asked that the edited minutes from the November 2005 Roundtable meeting in San Francisco, California, be reviewed and approved prior to their public posting on the Internet.

Milton encouraged all attendees to briefly introduce themselves and their affiliation.  He then introduced Dr. Michael Phillips, the evening’s speaker.

Eta™ Analysis: Forecasting and Quantitative Analysis for Investment Banking and Hedge Funds; Led by Dr. Michael Phillips, Professor of Finance, California State University, Northridge, CA.

The slides from Dr. Phillips’s presentation can be found in Attachment 3.1.  Mike indicated that the co-authors for his presentation are William P. Jennings and M. Chapman Findlay.  Dr. Phillips also prepared an Eta™ Analysis for each publicly traded Roundtable firm, which is included in Attachment 3.2.

Mike introduced the outline for his presentation:

  • A brief tour of the past century of investment forecasting.
  • A discussion of the current schizophrenia in most academic finance departments—and, perhaps, larger corporations.
  • How Eta® Analysis bridges the gap between corporate finance and investments.
  • Various applications to corporate finance, investment banking, hedge funds, and INFORMS member firms.

A Short History of Investment Forecasting:

Trying to predict good and bad stocks has captivated forecasters for decades.  Whenever new technology and data become available and new forecasting methods are created, they are soon applied to investment analysis.  But investment professionals often have very little use for forecasts.  A popular interpretation of the Efficient Market Hypothesis is that forecasting doesn’t help, since all available information is already discounted into asset prices.  Indeed, many academic studies have “conclusively” shown, time after time, that excess returns are not possible using forecasting methods. Or have they?

Investors have not always been skeptical about forecasters.  By the late 1920s, there were numerous professional forecasters producing charts, cycles, and specific predictions about the marketplace. The construction of “Business Barometers” was widespread.  But did these things work?  Could forecasting increase profits?  In Volume 1 of Econometrica (July 1933), Alfred Cowles wrote the path-breaking article “Can Stock Market Forecasters Forecast?”  (This article is also of interest to statisticians because it included one of the first published Monte Carlo studies.)   His research examined the forecasting success of 20 insurance companies, 16 financial services, and 25 financial publications providing forecasts. His database included approximately 7,500 separate recommendations over roughly a five-year period.  Cowles performed a Monte Carlo–style analysis, randomly selecting stocks and then actions on those stocks to emulate the “professional” investment forecasters. There was virtually no difference between the “professional” forecasts and the randomly created “forecasts.”  This can be observed on page 11 of Attachment 3.1.    So, in 1933 the Cowles Commission for Research in Economics demonstrated that the leading investment forecasters of the time did not do significantly better than chance. This triggered a sea change in how investment professionals analyzed securities and made investment decisions, one which largely remains in place today.
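
A minimal sketch of the kind of Monte Carlo comparison Cowles ran, using made-up data rather than the 1928–1932 records he studied: random “forecasters” pick a stock and a long/short action at random each period, and the resulting distribution is what an actual forecaster’s record would be compared against.  All names and numbers below are illustrative assumptions, not the original study.

    import numpy as np

    rng = np.random.default_rng(0)
    n_stocks, n_periods, n_random_forecasters = 25, 60, 1000

    # Hypothetical monthly stock returns (the real study used 1928-1932 data).
    returns = rng.normal(loc=0.005, scale=0.08, size=(n_periods, n_stocks))

    def random_forecaster_performance():
        """One random 'forecaster': each period, pick a stock at random and
        go long or short at random, emulating Cowles's random recommendations."""
        picks = rng.integers(0, n_stocks, size=n_periods)
        actions = rng.choice([-1, 1], size=n_periods)  # -1 = short, +1 = long
        return np.mean(actions * returns[np.arange(n_periods), picks])

    random_perf = np.array([random_forecaster_performance()
                            for _ in range(n_random_forecasters)])

    # A professional record scored the same way would be compared with this
    # distribution; Cowles found the professionals did no better than chance.
    print("random forecasters: mean %.4f, std %.4f"
          % (random_perf.mean(), random_perf.std()))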

But what were these “forecasts”?  Some of the methods used in the 1920s will seem familiar, and we have all seen some of their current realizations, but these are not methods that most of today’s professional forecasters and statisticians would choose.  The primary forecasting methods were graphical and generally tried to capture the time-series properties of a series using cyclical decomposition.  Initially, the analyses measured the recurring time between peaks and troughs or the apparent correlation between an indicator series and a target price series.  As peak-trough analysis was found lacking, more complicated cyclical structures were envisioned.  Later, the apparent coherence between multiple series was used to create “confirmations” and “indications.”  W. S. Jevons (1862 and forward) created the first modern cyclical analysis of economic variables by applying graphical techniques to price series. Originally a meteorologist, he recognized that weather-charting techniques might describe economic data.  His initial charts, which showed an apparent seasonal effect over time, led him to look for climate-related explanations.  By 1875, Jevons was relating the sunspot cycle to apparent peaks and troughs in economic fluctuations, identifying a 10–12 year “Trade Cycle” presumably caused by fluctuations in solar activity. (This spawned a major cottage industry that to some extent continues today.)  In 1876, Ohio farmer Samuel Benner wrote “Benner’s Prophecies of Future Ups and Downs in Prices,” which set out a “Cast Iron Rule” of prices, that “one extreme invariably follows another in all the operations of nature.” (This was the precursor of “Fibonacci Sequence” forecasting, which is still popular with some “technical analysts” of the market.)

In 1894, Charles Dow created an 11-stock index to track the market.  In 1896, the Dow Jones Industrial Average was launched, soon followed by the Dow Jones Transportation and Utility indices.  “Dow Theory” has its roots in The Wall Street Journal columns written by Charles Dow from 1899–1902 and The Wall Street Journal articles by William Hamilton from 1908–1929.  In Dow Theory, the transportation index “confirms” the industrials and defines trends.  In 1904, Roger Ward Babson (presidential candidate in 1940 and founder of Babson College) started “Babson’s Statistical Organization” based on the premise that “action and reaction are equal,” or that “areas of prosperity and depression must be equal.”  In 1912, Babson published “Ascertaining and Forecasting Business Conditions by the Study of Statistics” in the Publications of the American Statistical Association, Vol. 13.  In this paper, Babson argued for the role of the business statistician as forecaster, a business navigator rather than simply an accountant.  A side note: Babson concluded his 1912 paper with the following forecast: “In other words, the future of this country depends on the preacher and the statistician.”

In 1913, James Brookmire published his article “Methods of Business Forecasting Based on Fundamental Statistics” in the American Economic Review (Vol. 3 #1).  Where Jevons, Benner, and Babson were essentially using primitive AR and smoothing models, Brookmire introduced a graphical decomposition using economic variables to explain stock prices.  In the December 1916 issue of the American Economic Review (Vol. 6 #4, pages 739–769), Harvard professor Warren M. Persons published “Construction of a Business Barometer Based Upon Annual Data,” which applied Karl Pearson’s variate difference correlation method and “Student’s” 1914 Biometrika article “The Elimination of Spurious Correlation due to Position in Time and Space.”  Persons’s paper argued for the use of regression models to weight differences of various economic time series to create “Business Barometers,” in contrast to Babson and Brookmire, who averaged various time series for their respective forecasts.

In 1933, the Cowles report showed that stock forecasts, then largely based on cycles, graphs of leading and lagging indicators, and various barometers, didn’t work.  Shortly after, in 1937, John Burr Williams completed his doctoral dissertation at Harvard, published as “The Theory of Investment Value,” which outlined the framework for many parts of modern financial theory.  Using what he called “algebraic budgeting”, Williams defined investment value as “the present worth of future dividends, or of future coupons and principal.”  He introduced the financial applications of numerous present value formulas and is the father of “discounted cash flow” analysis.  Quoting from JASA (Vol. 34 #205, page 195), John Burr Williams indicated “the Theory of Investment Value shows how traders would act in the stock market if they were perfectly rational and foreseeing.  Gradually as men do become more intelligent and better informed, market prices should draw closer to the values given by our theory….”  Burr Williams changed the forecasting target for investment.  Cowles (1933) evaluated stock market forecasters attempting to time the market, to identify peaks and troughs.  Burr Williams changed the focus from the time series of the market to the underlying components of asset value. Rather than forecasting stock prices directly, Burr Williams emphasized future corporate earnings and dividends.  His impact was not universally recognized. 
In a review of Williams (1938) the next year in JASA, the reviewer concludes:  “Quantitative and qualitative tests and the establishment [of] investment standards are earnestly to be desired, but a theory which depends in large measure upon long-range forecasts and which ignores ‘popular opinion’ in the market does not appeal to either the intelligent investor or to the professional analyst.”  And, in the Journal of Business (1939):  “Obviously, calculations of present worth of future payments are no better than the dependability of those payments and, therefore, unless dividend payments can be estimated with reasonable accuracy, it is futile to devise formulas, no matter how good mathematically, for arriving at present worth.”  Within the context of discounted cash flow analysis, John Burr Williams introduced the algebraic pro forma modeling of financial statements.  Virtually every modern corporate finance book presents some version of “financial forecasting” for purposes of projecting earnings, cash needs, and corporate planning, and most companies have some sort of financial forecasting system taking sales and cost forecasts over time and generating analysis of future cash flows, financial needs, and corporate value.  Similarly, it seems that “Chapter 4” of virtually every current corporate finance textbook teaches the same present value, perpetuity, and annuity formulas derived by Burr Williams in “The Theory of Investment Value,” and these same formulae are widely used for capital budgeting, NPV, project evaluation, and other corporate valuation applications. 
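
The present value machinery Burr Williams introduced reduces to a handful of formulas that are easy to state in code.  A minimal sketch, with purely illustrative dividend forecasts and discount rates (the function names are mine, not Burr Williams’s notation):

    def present_value(cash_flows, r):
        """Discounted value of a finite stream of forecast cash flows
        (dividends, or coupons plus principal), discounted at rate r."""
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))

    def growing_perpetuity(d, r, g=0.0):
        """Value of a payment d one period from now, growing at rate g forever,
        discounted at r > g (the constant-growth dividend case)."""
        return d / (r - g)

    # Illustrative only: a five-year dividend forecast plus a terminal perpetuity.
    dividends = [1.00, 1.05, 1.10, 1.16, 1.22]
    r, g = 0.09, 0.03
    terminal = growing_perpetuity(dividends[-1] * (1 + g), r, g) / (1 + r) ** len(dividends)
    value_per_share = present_value(dividends, r) + terminal
    print(round(value_per_share, 2))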

However, although the formulas derived by Burr Williams and the financial forecasting methods he proposed to make his equations useful have generated subfields of finance (cash flow valuation and corporate finance), current finance textbooks have at best a minimal discussion relating the two areas for purposes of valuing stock. Similarly, it has been my observation that a cultural wall in larger corporations frequently divides the treasurer’s office or the investment management group from those analysts working on specific forecasts and present value calculations for project evaluation and management.  Such a division is a consequence of years of financial and investment research based on the work of Nobel Prize winner Harry Markowitz.

In Markowitz’s original 1952 Journal of Finance paper, he specifically cites Burr Williams as his first reference and argues that “The hypothesis (or maxim) that the investor does (or should) maximize discounted returns must be rejected.” His reasoning, in short, is that focusing on a single asset never makes diversification the preferred choice: if return maximization is your objective function, there is no reason to invest in a lower expected return asset.  Markowitz changed the focus from corporate earnings to stock returns.  In this paper, Markowitz focuses on the statistical properties of a linear combination of random variables. He defines each stock’s return as a random variable and then looks at conditions defining mean-variance efficient portfolios (portfolios with maximum mean return per unit of variance, or minimum variance per unit of return). (The resulting portfolio problem is a large product area for major linear programming vendors and operations research professionals.)  Markowitz (1952) begins his modeling with “let r_it be the anticipated return (however decided upon) at time t per dollar invested in security i… .” He focuses on the returns distribution rather than the forecasting of the prices underlying the returns.  By using returns, rather than portfolio values, his analysis allowed for direct comparison of different investments or combinations of investments.  Since this 1952 publication, the majority of investment research has focused on returns rather than prices, and on variations of a random walk (or similar random process) to describe the transition path of R_t and the associated covariance structures.
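
A minimal sketch of the mean-variance idea with made-up returns: treat each security’s return as a random variable, estimate the mean vector and covariance matrix, and compute the fully invested minimum-variance weights in closed form (weights proportional to the inverse covariance matrix times a vector of ones).  Real portfolio systems add constraints and use quadratic programming; everything below is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical monthly returns for four securities.
    cov_true = np.diag([0.06, 0.05, 0.08, 0.03]) ** 2 + 0.0004
    returns = rng.multivariate_normal(
        mean=[0.010, 0.008, 0.012, 0.006], cov=cov_true, size=120)

    mu = returns.mean(axis=0)                # estimated mean return per security
    sigma = np.cov(returns, rowvar=False)    # estimated covariance matrix

    ones = np.ones(len(mu))
    w = np.linalg.solve(sigma, ones)
    w /= w.sum()                             # fully invested minimum-variance weights

    print(w.round(3), round(w @ mu, 4), round(w @ sigma @ w, 6))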

Financial Econometrics was launched in 1964 with the publication of The Random Character of Stock Market Prices, edited by Paul H. Cootner.  Containing articles by a “Who’s Who” of modern finance, this collection of papers tries to answer the question: “Is there any evidence that historical data about the price of a stock will enable us to improve our forecasts of the future profit from holding this stock?”  The conclusion supported the 1933 Cowles study:  Univariate forecasting methods don’t improve investment performance.

In 1964, William Sharpe (who later shared the Nobel Prize with Harry Markowitz and Merton Miller) introduced the Capital Asset Pricing Model (CAPM).  The emphasis in Financial Econometrics has since become estimating the statistical distribution of µ and σ and the covariance structure across securities, using ever more complicated models (GARCH, neural nets, chaos theory, wavelets) on ever more frequently observed data.  A wide range of models is now being applied to estimate µ.  Analysts might estimate a single-index β, multiple-index βs, or extract factors of the returns covariance matrix that might be related to economy-wide factors (e.g., the Arbitrage Pricing Theory, APT).  With the focus on returns and the mean-variance trade-off, and with the apparent lack of information in time series of returns, the Efficient Market Hypothesis (EMH) became the prevailing assumption of “Modern Portfolio Theory.”
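
A minimal sketch of the simplest of these µ models, a single-index β estimated by regressing a stock’s excess returns on the market’s excess returns.  The data and the true β are fabricated for illustration; this is not any particular firm’s estimate.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical monthly excess returns (return minus the risk-free rate).
    market_excess = rng.normal(0.006, 0.045, size=120)
    stock_excess = 0.001 + 1.2 * market_excess + rng.normal(0, 0.03, size=120)

    # OLS fit of the single-index (CAPM-style) regression: r_i = a + b * r_m + e
    X = np.column_stack([np.ones_like(market_excess), market_excess])
    alpha_hat, beta_hat = np.linalg.lstsq(X, stock_excess, rcond=None)[0]

    # Under the CAPM, expected excess return is beta times the market risk premium.
    print(round(beta_hat, 3), round(beta_hat * market_excess.mean(), 5))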

Here’s what one textbook says about the EMH’s implications for fundamental stock analysis and valuation:
“Because stocks are fully and fairly priced, investors need not waste their time trying to find and capitalize on mis-priced (undervalued or overvalued) securities.”
(Gitman, Principles of Managerial Finance, 11th edition, p. 341, discussing implications of the EMH)
Even so, financial statements are analyzed and projections are made.  Some investors do so following the 1934 “Graham and Dodd” approach. Most corporate planning groups and financial strategy professors do so with regularity for budgeting and planning purposes.  But to the extent that financial-statement-based stock analysis is performed, it is usually on a “one-up” basis with limited application to the distributions of µ and σ, more often used to identify a stock for a “buy list” rather than for formulating a portfolio.

Eta® Analysis

In recent years, the issue of whether or not financial returns are truly an I(1) series has been studied, and generally they have been found to be close, but not quite; some sort of fractional differencing might be superior. There are new theories of “noise trading.” As market data are observed ever more frequently, questions also arise as to whether there is actually new information in each price revision.  Consider a generic function for all pro forma profits and their discounted values:

  • Profit_t = f(economic variables, firm-specific variables)
  • PV(Profit) = g(economic variables, firm-specific variables, financial variables) = Forecasted Value for Firm
  • Forecasted Value / # shares = Forecasted Price

 

  • So: Price = h(economic, financial, firm specific factors)
  • Consider a low-order Taylor series of the h(.) function.
  • This suggests that Price, at a given time, can be at least partially expressed as a linear function in economic, financial, and firm-specific variables.

Such a linear function would not be a time-series model but would use contemporaneous data. It would not use lagged stock prices; indeed, it would not depend on any time-series properties of security prices, so it would not be a cyclical model.  To the extent that “forecasts” are present, they are in the values of the economic variables used to evaluate the function at a point.  Such a linear equation can be viewed as a linearized reduced-form equation of the present value of pro forma earnings, excluding the value of orthogonal firm-specific information.  We call the valuation result the Eta® Model Price.
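
A minimal sketch of fitting such a linearized, contemporaneous reduced-form equation by ordinary least squares.  This is a generic illustration of the idea, not Dr. Phillips’s proprietary Eta® model: the three factors, their values, and the “model price” calculation are all hypothetical stand-ins for the 18 Eta factors.

    import numpy as np

    rng = np.random.default_rng(3)
    n_weeks = 156  # e.g., three years of weekly observations

    # Hypothetical contemporaneous factors: an interest rate, a production index,
    # and an oil price (not the actual Eta factors).
    factors = np.column_stack([
        rng.normal(0.05, 0.01, n_weeks),
        rng.normal(100.0, 3.0, n_weeks),
        rng.normal(60.0, 8.0, n_weeks),
    ])

    # Hypothetical observed prices for one stock, linear in the factors plus noise.
    price = (20 - 80 * factors[:, 0] + 0.3 * factors[:, 1]
             + 0.1 * factors[:, 2] + rng.normal(0, 1.0, n_weeks))

    X = np.column_stack([np.ones(n_weeks), factors])
    coefs, *_ = np.linalg.lstsq(X, price, rcond=None)

    fitted = X @ coefs
    r2 = 1 - ((price - fitted) ** 2).sum() / ((price - price.mean()) ** 2).sum()

    # The "model price" plugs current (or forecast) factor values into the fit.
    today = np.array([1.0, 0.048, 102.0, 63.0])
    print(round(today @ coefs, 2), round(r2, 3))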

The Eta® “Stars” assess how favorable the current economy is to a given stock’s prospective return:

  • Calculated using the 18 Eta measures and current (could be forecasted) values of the corresponding Eta factors.
  • Calculated for stocks with R² ≥ 80%.
  • Calculated for stocks with Price ≥ $5.
  • Stocks must have ≥ 3-year trading history.

Dr. Phillips suggested that Eta® Analysis explains over 80% of stock price variation and, in selected evaluations, 98% of mutual fund variation.  This was determined by using part of the price history to fit the model and additional, held-out history to evaluate it.  Mike also commented that βs are uncorrelated with the Eta® “Stars.”
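
A minimal sketch of the kind of split evaluation described: fit the regression on one stretch of history and measure explained variation (R²) on held-out observations.  The data are fabricated; nothing here reproduces the actual Eta® results.

    import numpy as np

    def out_of_sample_r2(X, y, n_fit):
        """Fit OLS on the first n_fit observations; report R-squared on the rest."""
        coefs, *_ = np.linalg.lstsq(X[:n_fit], y[:n_fit], rcond=None)
        held_out = y[n_fit:]
        resid = held_out - X[n_fit:] @ coefs
        return 1 - (resid ** 2).sum() / ((held_out - held_out.mean()) ** 2).sum()

    # Illustrative data: two hypothetical factors plus an intercept.
    rng = np.random.default_rng(4)
    factors = rng.normal(size=(200, 2))
    y = 5 + factors @ np.array([2.0, -1.0]) + rng.normal(0, 0.5, 200)
    X = np.column_stack([np.ones(200), factors])
    print(round(out_of_sample_r2(X, y, n_fit=150), 3))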

Mike provided a handout, Attachment 3.2, showing Eta® Analysis for publicly traded Roundtable firms. 

Irv Lustig asked whether the approach uses a rolling three-year window, recalculated every weekend, with the same 18 independent variables for each stock, which must themselves be forecast.  Dr. Phillips confirmed this.  Marc Cohen asked how Eta® Analysis handles a fundamental change in the relationship of the company to the 18 indices used.  Mike responded that the breadth of the 18 different indices minimizes the impact that any one may have.
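
A minimal sketch of the rolling re-estimation schedule described in the exchange above: refit the linear model on a trailing three-year window at each step.  The model, window length, and data are illustrative assumptions, not the Eta® production process.

    import numpy as np

    def rolling_refits(X, y, window, step=1):
        """Re-fit an OLS model on a trailing window of observations,
        advancing the window by `step` observations each time."""
        fits = []
        for end in range(window, len(y) + 1, step):
            sl = slice(end - window, end)
            coefs, *_ = np.linalg.lstsq(X[sl], y[sl], rcond=None)
            fits.append((end, coefs))
        return fits

    # Illustrative: five years of weekly data, three-year window, refit weekly.
    rng = np.random.default_rng(5)
    X = np.column_stack([np.ones(260), rng.normal(size=(260, 3))])
    y = X @ np.array([10.0, 1.0, -2.0, 0.5]) + rng.normal(0, 0.3, 260)
    print(len(rolling_refits(X, y, window=156)))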

Following the kickoff speaker’s presentation, the Roundtable enjoyed dinner and then a session to recognize Dr. Arthur Geoffrion upon his retirement from UCLA and for his many contributions related to creating and nurturing the INFORMS Roundtable since its inception in 1982. 

Recognition for Dr. Arthur M. Geoffrion: Led by Dr. Charles Beall, Chair of the Roundtable Executive Committee.

Dr. Beall opened the session by describing the opportunity to celebrate Art’s many contributions to the Roundtable.  The recognition event began at 8:30 PM Pacific time.  A slideshow of many photographs of Art in Roundtable activities over the years provided a backdrop for the session.  Chuck began, however, by recognizing Art’s wife, Helen, with a beautiful bouquet of flowers.

A summary of comments offered by those present, those who dialed in via conference call, and those who provided comments but could not attend is compiled in Attachment 4.0.

Chuck recognized Art with honorary lifetime membership in the Roundtable.  Chuck also presented him with a memory stick of the photographs that provided the backdrop for the session, a framed article from the July–August 1982 OR/MS Today in which then-President of TIMS Art Geoffrion described his vision of the Roundtable, and an exquisite glass vase with the following engraving:

Art Geoffrion

Founding Father of the INFORMS Roundtable

For Your Guidance, Passion, and Wisdom,
We Are Eternally Grateful.

[globe engraving]

From Your Many Roundtable Friends

A cake with the same inscription was offered as dessert.  Art offered his appreciation and thanks and summarized his reflections on the Roundtable.

Milton Pope kicked off the Monday session.

Roundtable Recognition and Introductions

Chuck Beall, Chair of the Roundtable Executive Committee, recognized the following individuals for three consecutive years of service to Roundtable Committees.

  • Doug Hays – Meetings Committee
  • Ranga Nuggehalli – Meetings Committee
  • Mike Grant – Meetings Committee
  • Allen Butler – Membership Committee

Chuck also recognized the following individuals who are retiring from Roundtable Committee service.

  • Laurie Dutton – Meetings Committee
  • Hans-Peter Gantz – Meetings Committee
  • Doug Hay – Meetings Committee
  • Irv Lustig – Meetings Committee
  • Irv Salmeen – Membership Committee
  • Glenn Bailey – Membership Committee
  • Allen Butler – Membership Committee
  • Marc Cohen – Membership Committee
  • Laurie Dutton – Executive Committee
  • Irv Salmeen – Executive Committee

 

Michael Grant, chair of the Meetings Committee, congratulated and thanked Milton Pope for his coordination and leadership in arranging the Winter meeting.

Photographs of each of these individuals receiving an award are available on the meeting CD and from the Roundtable Webpage.

Dr. Pope introduced the first speaker for Monday, Brenda Wolfe.

Exploring the Intersection of Large-Scale Forecasting, Time-Series Mining, and Visualization, Led by Ms. Brenda Wolfe, Product Manager, SAS Institute, Cary, NC.

Brenda’s presentation can be found in Attachment 5.0.  Brenda’s role at SAS is to determine what features are incorporated into the forecasting software the company markets. 

She introduced the organization of her presentation:

  • The business forecasting problem.
  • Areas that still aren’t adequately addressed.
  • Where mining and visualization can help.

Throughout the presentation Brenda commented on and showed potential features being considered for SAS’s product.

The Business Problem

Brenda described the very large scale of forecasting problems that would be the focus of her talk:  situations where 50,000 or more items need to be forecasted, few resources are available to do the work, the forecast must be produced in a short elapsed time, and the forecast must be reconciled across hierarchies.  An example hierarchy is shown on slide 4 of Attachment 5.0.  Many forecasting environments are taking last year’s history and simply using it as the forecast.  Can we make the analyst’s life better and save money?

Brenda summarized attributes for a good forecasting system:

  • Scalable.
  • Automated.
  • Malleable.
    • Malleability allows for process improvements to be incorporated into the iterative process.

Slides 6 through 19 describe a practical forecasting process.  This includes the following steps:

  • Data preparation.
  • Model selection.
    • Model construction needs to be automated.
      • A forecasting engine is needed, including a list of possible models and intelligence for choosing.
    • Can have sophisticated time-series models.
  • Forecast generation.
    • Results with visualization are helpful.
    • Jeff Karrenbauer asked if forecast selection and generation is interactive or just “push the button for results?”  Brenda responded that SAS can be run in both modes, but the amount of elapsed time available and the number of time series can dictate an automated approach.
    • Mitch Burman inquired about database size limitations and Brenda indicated that there is virtually no limit. 
  • Reconciliation (a small bottom-up vs. top-down sketch follows this list).
    • Irv Lustig inquired why “bottom-up” reconciliation is not universally chosen.  Brenda responded that sparse data at the low levels of the hierarchy leads to poor aggregated results if you do this.
    • Irv followed up and asked who makes the call on the reconciliation method.  Brenda responded that forecast analysts do, and SAS will soon offer advice on which method may be most appropriate.
    • Glenn Wegryn pointed out that high-level corporate expectations could be the anchor that suggests “top-down” reconciliation might be preferred. 
    • Laurie Dutton asked if reconciliation should be forced and Brenda described the trade-offs to consider.
  • Forecast refinement.
  • Performance evaluation.
  • Publishing and reporting.
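
A minimal sketch of the two reconciliation approaches discussed under the reconciliation step above, using a toy two-level hierarchy (one total, three items).  The numbers are invented and this is not SAS code; it only illustrates how a bottom-up total differs from a top-down allocation based on historical proportions.

    # Toy two-level hierarchy: forecasts exist at both levels and must be
    # reconciled so that the item forecasts add up to the total.
    item_forecasts = {"item_A": 120.0, "item_B": 45.0, "item_C": 35.0}
    total_forecast = 230.0                 # forecast made directly at the top level
    item_history = {"item_A": 100.0, "item_B": 50.0, "item_C": 50.0}  # last year

    # Bottom-up: the reconciled total is simply the sum of the item forecasts.
    bottom_up_total = sum(item_forecasts.values())

    # Top-down: the top-level forecast is allocated by historical proportions.
    hist_total = sum(item_history.values())
    top_down_items = {k: total_forecast * v / hist_total
                      for k, v in item_history.items()}

    print("bottom-up total:", bottom_up_total)
    print("top-down items:", {k: round(v, 1) for k, v in top_down_items.items()})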

The steps are detailed and some examples are shown in the attachment.   Scott Ellis asked why people don’t invest more time in improving the quality of forecasts.  Brenda replied that she is seeing a turnaround, with greater interest being shown in exactly that.  She further described the approach frequently used when a potential customer is considering SAS: a proof of concept is done that compares forecast accuracy against the status quo method.

Areas for Improvement

Brenda described the following areas for improvement:

  • New product or life-cycle modeling.
    • Involved in the Forecast Generation and Reconciliation steps of the process described above.
    • Data mining or time-series mining.
    • Finding proxy series (examples shown on slides 22–29).
    • Attribute driven.
    • Data driven.
      • Clustering.
      • Segmentation.
      • Classification.
    • Time-series mining:
      • Dynamic time warping.  Irv Salmeen asked whether dynamic time warping is different from autocorrelation in ARIMA (autoregressive integrated moving average) modeling.  Brenda responded that the approach is similar and can be shown pictorially, but dynamic time warping is a brute-force method with applicability to image data, motion data, voice data, and handwriting; all that matters is the order in which things occur.  (A small sketch of the technique follows this list.)
      • Similarity measures – sliding sequence.
      • Dimension reduction – helps take seasonality out of the time series.
      • Discrete Fourier transformation – represent the time series as a linear combination of sines and cosines, but keep only the first n/2 coefficients.  Why n/2 coefficients? Because each sine wave requires two numbers, its phase (w) and amplitude (A, B).  (See also the sketch following this list.)
      • Wavelets –  represent the time series as a linear combination of Wavelet basis functions, but keep only the first N coefficients.  Although there are many different types of Wavelets, researchers in time-series mining/indexing generally use Haar Wavelets.  Haar Wavelets seem to be as powerful as the other Wavelets for most problems and are very easy to code.
      • Singular Value Decomposition (SVD) – represent the time series as a linear combination of eigenwaves but keep only the first N coefficients.  SVD is similar to Fourier and Wavelet approaches that represent the data in terms of a linear combination of shapes (in this case eigenwaves).  SVD differs in that the eigenwaves are data dependent. 
      • Piecewise linear approximation – represent the time series as a sequence of straight lines or represent the time series as a sequence of box basis functions.
      • SAX – Symbolic Aggregate approXimation, which represents data with letters, like music.  Eamonn Keogh (eamonn@cs.ucr.edu) is a contact for this approach.
  • Alternative hierarchy construction.
    • Data-driven hierarchies (e.g., all ATMs at businesses that are closed over the weekends) are an area of opportunity.
  • Forecast evaluation.
    • Tile charts can be very helpful for highlighting error (intensity of color shows where error is large; the size of the boxes on the chart shows the size of the time series).
    • 10,000 series can be shown graphically on one chart.
    • See slides 51 – 52 of the attachment.
  • Performance Evaluation.
    • Error histograms: Mean Absolute Percent Error before and after forecast improvement.
    • See slides 53 – 58 of the attachment.
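
Two minimal sketches of techniques named in the time-series-mining bullets above, with fabricated series: a textbook dynamic-time-warping distance and Fourier-based dimension reduction that keeps only the leading coefficients.  Neither reflects SAS’s actual implementation.

    import numpy as np

    def dtw_distance(a, b):
        """Classic dynamic-programming dynamic time warping distance between
        two sequences; only the order in which values occur matters."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    def dft_reduce(series, n_coeffs):
        """Keep only the first n_coeffs Fourier coefficients and reconstruct a
        smoothed approximation, discarding the higher-frequency terms."""
        coeffs = np.fft.rfft(series)
        coeffs[n_coeffs:] = 0
        return np.fft.irfft(coeffs, n=len(series))

    rng = np.random.default_rng(6)
    t = np.arange(104)                     # two years of weekly data
    x = 10 + 3 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 0.5, len(t))
    y = 10 + 3 * np.sin(2 * np.pi * (t - 4) / 52) + rng.normal(0, 0.5, len(t))

    print(round(dtw_distance(x, y), 1))           # small for similarly shaped series
    print(np.round(dft_reduce(x, n_coeffs=3)[:5], 2))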

Irv Lustig pointed out that forecasting generally yields a point forecast and asked about forecasting a shape.  Brenda replied that many forecasts consist of a point estimate, confidence parameters, and an assumed normal distribution.

Brenda concluded her presentation with these thoughts:

  • Business forecasting is much improved.
  • Some areas still aren’t adequately addressed.
  • Mining and visualization show promise.
