Paper
Prestructuring neural networks via extended dependency analysis with application to pattern classification
22 March 1999
George G. Lendaris, Thaddeus T. Shannon, Martin Zwick
Abstract
We consider the problem of matching domain-specific statistical structure to neural-network (NN) architecture. In past work we have considered this problem in the function approximation context; here we consider the pattern classification context. General Systems Methodology tools for finding problem-domain structure suffer exponential scaling of computation with respect to the number of variables considered. Therefore we introduce the use of Extended Dependency Analysis (EDA), which scales only polynomially in the number of variables, for the desired analysis. Based on EDA, we demonstrate a number of NN pre-structuring techniques applicable for building neural classifiers. An example is provided in which EDA results in significant dimension reduction of the input space, as well as capability for direct design of an NN classifier.
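The abstract's key computational claim is that Extended Dependency Analysis scales polynomially in the number of variables, whereas exhaustive General Systems Methodology searches over variable subsets scale exponentially. As a rough illustration of that idea (this is a toy stand-in, not the authors' EDA algorithm), the sketch below scores each discrete input variable separately by its mutual information with the class label and keeps only the strongly dependent ones, giving the kind of input-dimension reduction the abstract describes at O(d·n) cost rather than a search over all 2^d variable subsets:

```python
# Hedged illustration only: rank input variables by mutual information with
# the class label, then keep the top k. This is NOT Extended Dependency
# Analysis itself, just a minimal example of dependency-based dimension
# reduction for a classifier, at cost polynomial in the variable count d.
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length sequences of discrete values."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def select_inputs(samples, labels, k):
    """Return indices of the k input variables most dependent on the label.

    `samples` is a list of equal-length tuples of discrete input values.
    Scoring each variable independently costs O(d * n), unlike an
    exhaustive subset search, which is exponential in d.
    """
    d = len(samples[0])
    scores = [(mutual_information([s[i] for s in samples], labels), i)
              for i in range(d)]
    return sorted(i for _, i in sorted(scores, reverse=True)[:k])

# Toy data: variable 0 determines the label, variable 1 is irrelevant noise.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [s[0] for s in samples]
print(select_inputs(samples, labels, 1))  # → [0]
```

A real dependency analysis would also consider joint (higher-order) dependencies among variables, which this per-variable screen deliberately ignores for brevity.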
© (1999) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
George G. Lendaris, Thaddeus T. Shannon, and Martin Zwick "Prestructuring neural networks via extended dependency analysis with application to pattern classification", Proc. SPIE 3722, Applications and Science of Computational Intelligence II, (22 March 1999); https://doi.org/10.1117/12.342895
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS
Electronic design automation, Image classification, Picosecond phenomena, Global system for mobile communications, Prototyping, Associative arrays, Data modeling
