Neural Networks: The Dream That Won't Die: Analysis Techniques Come and Go, but Some Methods Continue to Push the Frontier of What Is Possible. One of These Is Neural Network Technology. Has Its Time Finally Come?

By Ruggiero, Murray A., Jr. | Modern Trader, December 2012

Neural networks, if used properly, can provide the framework for a plethora of market analysis tools that can supplement an existing trading program or suggest new directions for future research. While the history of these tools dates back much further, their modern application took root in the late 1980s and came of age in 1993, when patent no. 5241620 was awarded to this author for the concept of embedding a neural network into a common spreadsheet. Suddenly, neural networks were no longer confined to the professional mainstream; the average trader could access them.

The analytical foundation for this leap is built on an algorithm called back propagation. In layman's terms, this is a method that allows a network to learn to discriminate between classes that can't be distinguished based on linear properties. Rumelhart, Hinton and Williams presented a well-received paper on what they called "backward propagation of errors" in 1985. Others who did research into this approach include David Parker and Paul Werbos. Werbos arguably invented these techniques and presented them in "Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences," his 1974 Ph.D. dissertation at Harvard.

The back propagation algorithm operates on a multi-layer perceptron that uses non-linear activation functions (see "Simple net," right). The most commonly used functions are the sigmoid, which ranges from 0 to 1, and the hyperbolic tangent, which ranges from -1 to 1. All inputs and target outputs must be mapped into these ranges when used in these types of networks.
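To make the mapping concrete, here is a minimal Python sketch of the two activation functions and the kind of input scaling described above; the function names and sample prices are illustrative assumptions, not part of the article.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes any real value into (-1, 1)
    return np.tanh(x)

def scale_to_range(values, lo, hi):
    # Linearly map raw market data (prices, indicator readings)
    # into the range the chosen activation function expects.
    vmin, vmax = values.min(), values.max()
    return lo + (values - vmin) * (hi - lo) / (vmax - vmin)

# Example: map a short series of closes into 0..1 for a sigmoid network
closes = np.array([101.2, 102.5, 100.8, 103.1, 104.0])
print(scale_to_range(closes, 0.0, 1.0))
```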

The "magic" of back propagation, or backprop, is that mathematical calculations (the type typically found in first-year calculus) adjust the weights of the connections to minimize the error across the training set. An important attribute of these methods is they generate a reasonably low error across the training set of inputs. However, they do not find the absolute minimum error, but the local minimum. This means that training a neural network is not exact and depends on the precise data set. Repeating the same experiment does not always give the same answer.

Backprop, in its original form, had a lot of issues, and many variations of the algorithm attempt to resolve those weaknesses. Early ideas used momentum and variable learning-rate adjustment techniques, such as simulated annealing. Combining newer tactics with the older ones can speed up learning considerably. For example, we perform batch learning in parallel so that we can run it on multiple cores, saving a tremendous amount of time. All of these variations are supervised learning algorithms: We give them input patterns and train them to output a certain target set of results. In doing so, we map the patterns, which in turn allows us to generalize to new patterns that were not used in training.
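As one concrete illustration of these refinements, here is a hedged sketch of a momentum-style weight update; the decay factor and learning rate are illustrative assumptions, not values from the article.

```python
import numpy as np

def momentum_step(weights, grad, velocity, lr=0.1, beta=0.9):
    # Momentum keeps a decaying running sum of past gradients so the
    # update keeps moving through shallow dips in the error surface
    # instead of stalling at the first local minimum it touches.
    velocity = beta * velocity - lr * grad
    return weights + velocity, velocity

# Usage inside a training loop (illustrative):
#   v = np.zeros_like(W1)
#   W1, v = momentum_step(W1, grad_W1, v)
```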

There are other algorithms as well, such as radial basis function networks and kernel regression (closely related to support vector machines). All of these algorithms can be used to approximate non-linear functions, which is how neural networks map a given input to an output. Put simply, we create a universal function "approximator" that, given a set of inputs, can provide a good idea of what the optimal solution to a problem would be.
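Below is a minimal sketch of kernel regression with a radial basis function kernel, one way to build the kind of non-linear function approximator described above; the kernel width, ridge term and sine-wave target are illustrative assumptions, not the author's method.

```python
import numpy as np

def rbf_kernel(a, b, width=1.0):
    # Gaussian (radial basis function) kernel: similarity decays
    # with the squared distance between input patterns.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def fit_kernel_regression(X, y, ridge=1e-3, width=1.0):
    # Solve for kernel weights, with a small ridge term for stability.
    K = rbf_kernel(X, X, width)
    return np.linalg.solve(K + ridge * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, width=1.0):
    return rbf_kernel(X_new, X_train, width) @ alpha

# Approximate a non-linear target function: y = sin(x)
X = np.linspace(0, 6, 30).reshape(-1, 1)
y = np.sin(X).ravel()
alpha = fit_kernel_regression(X, y)
print(predict(X, alpha, np.array([[1.5]])))  # close to sin(1.5)
```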

Client-driven

As with most things, interest in neural networks took off when the customer started demanding it. Traders, hungry for the next big thing, were clamoring for the technology in the early 1990s. However, the vast majority of these traders had no background in the underlying methods, and those who did have the background knew nothing about the markets or how they work.

But neural networks were not the perfect solution, and after many years of trial and error, it became clear why: Standard neural-network-based signal processing techniques simply do not work in the markets as signal generators. In other words, the process of implementing neural networks correctly must begin far earlier in trading system development. …
