Are Neural Networks a Better Forecaster?

By Stephan Kudyba | Futures (Cedar Falls, IA), October 1998

In some cases, neural nets prove to be superior to their regression counterparts if they're applied correctly. Here's a representative comparison of a neural net for forecasting electricity demand with a well-designed regression model.

Despite the influx of technical analysis futures trading has experienced in the past decade, statistical models of markets' fundamental underpinnings continue to enjoy a strong following. Thus, deciding how to build a model remains a key decision for many traders and analysts. Complicating this is the continuous evolution of modeling techniques. Early this century, the question was whether a statistical approach to economic theory was an appropriate analytical method. Acceptance of that approach then led to today's sophisticated regression and econometric applications.

The increased availability of economic data sources in today's information age, as well as ongoing enhancements in computer technology, have enabled a constant "pushing of the envelope" of forecasting techniques. The stages of the process range from single-equation regression and univariate (single) time series applications to sophisticated systems of equations and finally to state-of-the-art "artificial intelligence," or neural network, technology.

Here, we analyze this quantitative modeling technology by comparing the forecast accuracy of a sophisticated Semtsa (structural econometric model time series analysis) model with that of a neural net-based computer algorithm. This experiment offers some insight into where model builders should focus their efforts.

A closer look

We often refer to neural nets as artificial intelligence because they are computer algorithms that attempt to reproduce human thought and reasoning. A simple neural net comprises processing elements, called neurons, that are connected to one another in layers.

The connectors, or "synapses," are assigned weights during the optimization process. This structure mirrors the neural network of the human brain, where the neurons, in conjunction with their synapses, process input impulses during learning. The optimization of a neural net resembles the minimization of the error term in standard least-squares regression. The difference is that neural nets consider linear, nonlinear and pattern-recognition relationships in the input data and carry out the optimization process automatically.
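The structure just described can be sketched in a few lines. This is an illustrative toy, not the article's electricity-demand model: the data, layer size and learning rate are invented for the example. A single hidden layer of neurons has its connection weights adjusted, step by step, to minimize a squared-error criterion, just as least squares minimizes its error term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a noisy linear relationship stands in for the
# demand series discussed in the article.
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=200)

# One hidden layer of "neurons"; the connections ("synapses") carry weights.
W1 = rng.normal(scale=0.5, size=(1, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=4)
b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)         # hidden-layer activations
    return h @ W2 + b2, h            # network output and activations

def mse(pred, y):
    return np.mean((pred - y) ** 2)  # squared-error criterion, as in least squares

lr = 0.05
initial = mse(forward(X)[0], y)
for _ in range(2000):                # the weight-adjustment ("learning") loop
    pred, h = forward(X)
    err = pred - y
    # Gradients of the mean squared error with respect to each weight.
    gW2 = h.T @ err * (2 / len(y))
    gb2 = 2.0 * err.mean()
    dz = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dz * (2 / len(y))
    gb1 = 2.0 * dz.mean(axis=0)
    W2 = W2 - lr * gW2
    b2 = b2 - lr * gb2
    W1 = W1 - lr * gW1
    b1 = b1 - lr * gb1

final = mse(forward(X)[0], y)
```

After training, the squared error is far below its starting value: the "learning" is nothing more than this repeated weight adjustment against the error criterion.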

Standard regression models require the analyst to apply different functional forms to the data, such as deciding on exponential powers of the explanatory variables. Because neural nets include the functional form in the optimization process itself, their proponents claim they better predict trending relationships among model variables and identify crucial turning points. Neural nets' algorithmic approach also is not hampered by collinearity, a strong relationship between two of the predictor variables, which can cause major errors in standard regression models.
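To see what "choosing the functional form" means in practice, the sketch below (hypothetical data, not from the article's study) fits the same series twice with NumPy's polyfit: once as a straight line and once after the analyst adds a quadratic term. The regression captures the curvature only when the analyst specifies the right form; a neural net, by contrast, folds that choice into its own optimization.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 100)
y = x ** 2 + 0.1 * rng.normal(size=x.size)  # the true relationship is quadratic

# Functional form chosen by the analyst: a straight line (a poor choice here).
linear_pred = np.polyval(np.polyfit(x, y, deg=1), x)

# The same regression after the analyst adds a quadratic term.
quad_pred = np.polyval(np.polyfit(x, y, deg=2), x)

linear_mse = np.mean((linear_pred - y) ** 2)
quad_mse = np.mean((quad_pred - y) ** 2)
```

The straight-line fit leaves most of the variation unexplained; the quadratic specification fits almost perfectly. Nothing in the data told the linear model it was mis-specified — the analyst had to know.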

But neural nets do have limitations. One potential shortcoming is overfitting, or overoptimization (see, for example, "The Amateur Scientist," Scientific American, September 1992). This common problem occurs during training. The neural net's performance in linking dependent to independent variables generally will improve the longer the weight-adjustment process continues. However, when out-of-sample data are introduced to the system, the net's performance often breaks down. The reason is that the neural net has learned too many anomalies of the given input data, which it then attempts to apply to the new data. Such overfitting results from improper design of the network structure.

The representative neural net in this article was built with a program that includes safeguards designed to limit overfitting.
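The article does not say which safeguards that program uses, but a standard one is to hold part of the sample out of training and stop increasing the model's fit once its out-of-sample error worsens. The sketch below illustrates the idea with polynomial flexibility standing in for training time; the data and all parameters are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, size=120)
y = np.sin(3 * x) + 0.3 * rng.normal(size=x.size)

# Hold part of the sample out of training to monitor out-of-sample error.
x_train, y_train = x[:80], y[:80]
x_val, y_val = x[80:], y[80:]

def val_error(degree):
    """Fit a polynomial of the given flexibility on the training sample
    and measure its squared error on the held-out data."""
    coeffs = np.polyfit(x_train, y_train, degree)
    return np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)

# Keep the flexibility whose held-out error is lowest; past that point,
# extra fit is just memorized anomalies of the training sample.
best_deg, best_err = 1, val_error(1)
for deg in range(2, 10):
    err = val_error(deg)
    if err < best_err:
        best_deg, best_err = deg, err
```

Training error alone would always favor the most flexible model; only the held-out data reveal where learning the sample ends and memorizing its noise begins.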
