eScholarship
Open Access Publications from the University of California

Generalized Laminar Population Analysis (gLPA) for Interpretation of Multielectrode Data from Cortex.

  • Author(s): Głąbska, Helena T
  • Norheim, Eivind
  • Devor, Anna
  • Dale, Anders M
  • Einevoll, Gaute T
  • Wójcik, Daniel K
Abstract

Laminar population analysis (LPA) is a method for analysis of electrical data recorded by linear multielectrodes passing through all laminae of cortex. Like principal components analysis (PCA) and independent components analysis (ICA), LPA offers a way to decompose the data into contributions from separate cortical populations. However, instead of relying on purely mathematical assumptions in the decomposition, LPA is based on physiological constraints, i.e., that the observed LFP (low-frequency part of the signal) is driven by action-potential firing as observed in the MUA (multi-unit activity; high-frequency part of the signal). In the presently developed generalized laminar population analysis (gLPA), the set of basis functions accounting for the LFP data is extended compared to the original LPA, thus allowing a better fit of the model to experimental data. This increases the risk of overfitting, however, and we therefore tested various versions of gLPA on virtual LFP data in which we knew the ground truth. These synthetic data were generated by biophysical forward-modeling of electrical signals from network activity in the comprehensive, and well-known, thalamocortical network model developed by Traub and coworkers. The results for the Traub model imply that while the laminar components extracted by the original LPA method are overall in fair agreement with the ground-truth laminar components, the results may be improved by using the gLPA method with two (gLPA-2) or even three (gLPA-3) postsynaptic LFP kernels per laminar population.
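The core idea described above — modeling the LFP on each channel as a sum of population firing rates (from the MUA) convolved with one or more postsynaptic kernels, each weighted by a spatial profile across channels — can be illustrated with a minimal sketch. This is not the authors' implementation; the exponential kernel shapes, the fixed-kernel least-squares fit, and all function names here are illustrative assumptions.

```python
import numpy as np

def exp_kernel(tau, dt=1e-3, length=0.2):
    """Illustrative causal postsynaptic kernel: a normalized exponential
    with time constant tau (seconds), sampled at step dt."""
    t = np.arange(0.0, length, dt)
    h = np.exp(-t / tau)
    return h / h.sum()

def fit_spatial_profiles(lfp, rates, kernels):
    """Least-squares fit of spatial profiles, assuming the kernels are known.

    lfp     : (n_channels, n_t) recorded low-frequency signal
    rates   : (n_populations, n_t) population firing rates from the MUA
    kernels : list of 1-D postsynaptic kernels; one kernel per population
              mimics the original LPA, two or three mimic gLPA-2 / gLPA-3.

    Each (population, kernel) pair yields one temporal regressor
    (rate convolved with kernel); the fit returns the spatial profile
    of every regressor across channels.
    """
    n_t = lfp.shape[1]
    regressors = [np.convolve(r, h)[:n_t] for r in rates for h in kernels]
    X = np.asarray(regressors)                    # (n_pop * n_kernels, n_t)
    # Solve lfp ≈ P @ X for the profiles P, all channels at once.
    P, *_ = np.linalg.lstsq(X.T, lfp.T, rcond=None)
    return P.T                                    # (n_channels, n_pop * n_kernels)

# Toy ground-truth check: synthesize an LFP from known profiles and
# recover them, in the spirit of the paper's virtual-data validation.
rng = np.random.default_rng(0)
rates = rng.poisson(2.0, size=(2, 2000)).astype(float)
kernels = [exp_kernel(0.005), exp_kernel(0.030)]  # fast and slow, as in gLPA-2
true_P = rng.normal(size=(8, 4))                  # 8 channels, 2 pops x 2 kernels
X = np.asarray([np.convolve(r, h)[:2000] for r in rates for h in kernels])
lfp = true_P @ X
P_hat = fit_spatial_profiles(lfp, rates, kernels)
```

With noiseless synthetic data the least-squares fit recovers the generating profiles, which is the kind of ground-truth comparison the abstract describes for the Traub-model data.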

