DTIC ADA264665: Training Neural Networks with Weight Constraints
by Defense Technical Information Center
Hardware implementations of artificial neural networks impose a variety of constraints. Weight magnitudes are finite in both digital and analog devices, and further limitations arise from the imprecise nature of hardware components. These constraints can be overcome with a stochastic global optimization strategy that effectively searches the feasible weight space and is robust to quantization and modeling errors. Evolutionary programming is proposed as a way to train networks under these constraints. This work investigates the use of evolutionary programming to optimize a network with weight constraints, and comparisons are made to the backpropagation training algorithm for networks with both unconstrained and hard-limited weight magnitudes.

Keywords: Neural networks, Analog, Digital, Stochastic
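The abstract describes training under hard weight-magnitude limits with evolutionary programming rather than backpropagation. Below is a minimal sketch of that idea, not the report's implementation: the 2-2-1 network, the XOR task, the bound `W_MAX`, the population size, and the mutation scale are all illustrative assumptions.

```python
# Minimal sketch (not the report's implementation) of evolutionary programming
# for a small feedforward network whose weights are hard-limited in magnitude.
# Network size, the XOR task, W_MAX, population size, and mutation scale are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

W_MAX = 3.0                      # assumed hard limit on |weight|
POP, GENERATIONS = 50, 2000      # assumed EP parameters
N_IN, N_HID, N_OUT = 2, 2, 1
N_W = (N_IN + 1) * N_HID + (N_HID + 1) * N_OUT   # weights plus biases

# XOR patterns, used only as a stand-in training task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def forward(w, x):
    """Evaluate the 2-2-1 network encoded by the flat parameter vector w."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output unit

def mse(w):
    return float(np.mean((forward(w, X) - Y) ** 2))

# Start every parent inside the allowed weight range.
parents = rng.uniform(-W_MAX, W_MAX, size=(POP, N_W))

for gen in range(GENERATIONS):
    # Each parent spawns one offspring by Gaussian mutation; the offspring is
    # then clipped back inside the hard weight limits.
    offspring = np.clip(parents + rng.normal(0.0, 0.2, parents.shape),
                        -W_MAX, W_MAX)

    # (mu + mu) selection: keep the POP lowest-error individuals overall.
    pool = np.vstack([parents, offspring])
    errors = np.array([mse(w) for w in pool])
    parents = pool[np.argsort(errors)[:POP]]

best = parents[0]
print("best MSE:", mse(best))
print("network outputs:", forward(best, X).ravel())
```

Because the mutate-and-clip loop never differentiates the error, the same search applies unchanged when weights are further quantized or the forward pass is noisy, which is the robustness property the abstract attributes to a stochastic global optimization strategy.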
Date Published: 2018-03-10 08:25:18
Identifier: DTIC_ADA264665
Item Size: 9282739 bytes
Language: English
Media Type: texts
# Topics
DTIC Archive; McDonnell, John R; NAV...
# Collections
dticarchive
additional_collections
# Uploaded by
@chris85