eScholarship
Open Access Publications from the University of California

Converting Cascade-Correlation Neural Nets into Probabilistic Generative Models

Abstract

Humans are not only adept in recognizing what class an input instance belongs to (i.e., classification task), but perhaps more remarkably, they can imagine (i.e., generate) plausible instances of a desired class with ease, when prompted. Inspired by this, we propose a framework which allows transforming Cascade-Correlation Neural Networks (CCNNs) into probabilistic generative models, thereby enabling CCNNs to generate samples from a category of interest. CCNNs are a well-known class of deterministic, discriminative NNs, which autonomously construct their topology, and have been successful in accounting for a variety of psychological phenomena. Our proposed framework is based on a Markov Chain Monte Carlo (MCMC) method, called the Metropolis-adjusted Langevin algorithm, which capitalizes on the gradient information of the target distribution to direct its explorations towards regions of high probability, thereby achieving good mixing properties. Through extensive simulations, we demonstrate the efficacy of our proposed framework. Importantly, our framework bridges computational, algorithmic, and implementational levels of analysis.
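The abstract's key algorithmic ingredient, the Metropolis-adjusted Langevin algorithm (MALA), can be illustrated with a minimal, generic sketch. This is not the paper's CCNN-specific implementation; it is a standard MALA sampler applied to a toy 2-D Gaussian target, with the step size and chain length chosen arbitrarily for illustration. The proposal drifts along the gradient of the log target density and a Metropolis-Hastings correction keeps the chain exact:

```python
import numpy as np

def mala_sample(log_prob, grad_log_prob, x0, n_steps=5000, step=0.8, seed=0):
    """Metropolis-adjusted Langevin algorithm (MALA).

    Proposal: x' = x + (step**2 / 2) * grad log pi(x) + step * N(0, I),
    followed by a Metropolis-Hastings accept/reject step so that the
    chain targets pi exactly despite the discretized Langevin dynamics.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        # Forward proposal, drifting toward high-probability regions.
        mean_fwd = x + 0.5 * step**2 * grad_log_prob(x)
        prop = mean_fwd + step * rng.standard_normal(x.shape)
        # Mean of the reverse proposal, needed for the MH correction.
        mean_bwd = prop + 0.5 * step**2 * grad_log_prob(prop)
        # Log densities of the Gaussian transition kernels q(.|.).
        log_q_fwd = -np.sum((prop - mean_fwd) ** 2) / (2 * step**2)
        log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * step**2)
        log_alpha = log_prob(prop) - log_prob(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples.append(x.copy())
    return np.array(samples)

# Toy target: standard 2-D Gaussian, log pi(x) = -||x||^2 / 2.
log_prob = lambda x: -0.5 * np.sum(x**2)
grad_log_prob = lambda x: -x
chain = mala_sample(log_prob, grad_log_prob, x0=np.array([3.0, -3.0]))
```

After a short burn-in, the chain's samples should match the target's moments (mean near zero, unit variance); in the paper's setting, `log_prob` would instead be derived from a trained CCNN so that sampling yields plausible instances of a chosen class.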
