Tensors, or multidimensional arrays, are higher-order generalizations of matrices. They are natural structures for expressing data with inherent higher-order structure. Tensor decompositions and tensor approximations play an important role in learning those hidden structures. They have many applications in machine learning, statistical learning, data science, signal processing, neuroscience, and more.
The Canonical Polyadic Decomposition (CPD) expresses a tensor as a sum of the minimum number of rank-1 tensors. Low-Rank Tensor Approximation (LRTA), in contrast, aims at finding a tensor whose rank is small and which is close to a given one.
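As a minimal illustration of the CPD idea (not the method developed in this work), a third-order tensor of rank at most r can be built as a sum of r outer products of vectors; a sketch in numpy, with arbitrary illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 2  # target CP rank (illustrative choice)

# Factor vectors a_i, b_i, c_i stacked as columns, one per rank-1 term
A = rng.standard_normal((4, r))
B = rng.standard_normal((3, r))
C = rng.standard_normal((5, r))

# T = sum_i a_i (outer) b_i (outer) c_i  -- a tensor of rank at most r
T = np.einsum('ir,jr,kr->ijk', A, B, C)
print(T.shape)  # (4, 3, 5)
```

Computing a CPD is the inverse problem: given T, recover the minimal number of such rank-1 terms.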
We study generating polynomials for computing tensor decompositions and low-rank approximations of given tensors, and propose methods that can compute tensor decompositions for generic tensors under certain rank conditions. For low-rank tensor approximation, the proposed method guarantees that the constructed tensor is a good enough low-rank approximation whenever the tensor to be approximated is close enough to a low-rank one. A proof built on perturbation analysis is presented.
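For intuition on this kind of guarantee, consider the simpler matrix case (an analogy only, not the tensor method of this work): by the Eckart-Young theorem, the truncated SVD gives the closest rank-k matrix in Frobenius norm, so a matrix near a rank-2 one admits a rank-2 approximation with error no larger than the perturbation:

```python
import numpy as np

rng = np.random.default_rng(1)
# A matrix that is a small perturbation of an exactly rank-2 matrix
low_rank = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))
M = low_rank + 1e-3 * rng.standard_normal((6, 5))

# Truncated SVD: the best rank-2 approximation in Frobenius norm
U, s, Vt = np.linalg.svd(M, full_matrices=False)
M2 = U[:, :2] * s[:2] @ Vt[:2, :]

# The error is at most the size of the perturbation
print(np.linalg.norm(M - M2))
```

For tensors no such closed-form best approximation exists in general, which is why guarantees like the one above require a separate perturbation analysis.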
When the rank is higher than the second dimension, we are not able to find the common zeros of the generating polynomials directly. In this case, we use the quadratic equations obtained from those generating polynomials. We show that, under certain conditions, the tensor decomposition can be found using standard linear algebra operations (i.e., solving linear systems, singular value decompositions, and QR decompositions). Numerical examples and comparisons are presented to show the performance of our algorithm.
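The flavor of such linear-algebra computations can be seen in the simplest possible case (an analogy, not the proposed algorithm): the factors of a rank-1 tensor are recoverable from SVDs of its unfoldings, using only standard numpy operations:

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, c = rng.standard_normal(4), rng.standard_normal(3), rng.standard_normal(5)
T = np.einsum('i,j,k->ijk', a, b, c)  # a rank-1 tensor

# Mode-1 unfolding: rows indexed by i, columns by the pair (j, k)
T1 = T.reshape(4, 15)
U, s, Vt = np.linalg.svd(T1, full_matrices=False)
a_hat = U[:, 0] * s[0]        # first factor, up to scaling shared with b, c
bc = Vt[0].reshape(3, 5)      # the outer product b c^T, up to that scaling

# A second SVD splits bc into its two factors
U2, s2, Vt2 = np.linalg.svd(bc, full_matrices=False)
b_hat = U2[:, 0] * s2[0]
c_hat = Vt2[0]

T_rec = np.einsum('i,j,k->ijk', a_hat, b_hat, c_hat)
print(np.allclose(T, T_rec))  # True: factors recovered up to scaling
```

Higher ranks are far harder; the contribution here is showing that, under the stated conditions, the computation still reduces to operations of this kind.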
Multi-view learning is frequently used in data science. The pairwise correla-
tion maximization is a classical approach for exploring the consensus of multiple
views. Since pairwise correlation is inherently defined for two views, its extensions to more views can be diversified, and the intrinsic interconnections among views are generally lost. To address this issue, we propose to maximize the high-order tensor correlation. This can be formulated as a low-rank approximation problem involving the high-order correlation tensor of the multi-view data. We propose to use the generating polynomial method to efficiently solve the high-order correlation maximization problem of tensor canonical correlation analysis for multi-view learning. Numerical results on simulated data and two real multi-view data sets demonstrate that our proposed method not only consistently outperforms existing methods but is also efficient for large-scale tensors.
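In the standard tensor canonical correlation analysis setup, the high-order correlation tensor referred to above is the sample average of outer products of the centered view features; a sketch on synthetic data (dimensions and data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100                             # number of samples (synthetic)
X1 = rng.standard_normal((n, 4))    # view 1 features
X2 = rng.standard_normal((n, 3))    # view 2 features
X3 = rng.standard_normal((n, 5))    # view 3 features

# Center each view
X1, X2, X3 = (X - X.mean(axis=0) for X in (X1, X2, X3))

# Third-order correlation tensor:
#   C[j, k, l] = (1/n) * sum_i X1[i, j] * X2[i, k] * X3[i, l]
C = np.einsum('ij,ik,il->jkl', X1, X2, X3) / n
print(C.shape)  # (4, 3, 5)
```

Maximizing the tensor correlation then amounts to a (rank-1 or low-rank) approximation problem for C, which is where the generating polynomial method enters.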