Open Access Publications from the University of California

UC Irvine

UC Irvine Electronic Theses and Dissertations

Parameter Analysis on Variants of Kernel Regression over Graphs


This dissertation focuses on understanding how parameters influence variants of kernel regression models over graphs. Graphs represent complex systems: components of the system are modeled as nodes, and relationships among the components are denoted as edges connecting nodes. Kernel regression models can be used to solve graph-related problems such as graph signal reconstruction and prediction. In the graph signal reconstruction problem, a common task is to predict an unknown attribute of a node from known values of the same attribute at other nodes together with the network structure. In the graph prediction problem, a common task is to predict a graph signal over the network from historical graph signals. The essence of both problems is to model an input-output relationship, and a kernel-based regression model with an iterative solution is a simple yet potentially powerful approach.

The dissertation first presents an application of the kernel regression model to the graph signal reconstruction problem over multi-layer graphs, where the goal is to estimate unknown nodal values from known nodal values and the multi-layer network structure. Viewing the mapping from the local network structure of a node to its nodal value as a function in a Reproducing Kernel Hilbert Space (RKHS), a regression model based on multiple kernels is built and a minimization problem is formulated, whose solution is found iteratively with the gradient descent algorithm. This application verifies the predictive ability of the model. It also shows that single-kernel models serve as building blocks of a multi-kernel model and that performance depends on the hyper-parameter settings of the single-kernel regression models.
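The multi-kernel regression setup described above can be sketched in a few lines. This is a minimal illustrative example, not the dissertation's exact formulation: the kernel matrices, the synthetic node features, and all hyper-parameter values (bandwidths, regularization weight, step size) are stand-in assumptions; in the actual problem the kernels would be derived from the multi-layer graph structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, sigma):
    """Gaussian kernel matrix over node feature vectors (one common RKHS choice)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

n = 30
X = rng.normal(size=(n, 4))                       # stand-in for local network structure features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)    # observed graph signal

# Single-kernel models as building blocks: one kernel per bandwidth (a hyper-parameter)
kernels = [gaussian_kernel(X, s) for s in (0.5, 1.0, 2.0)]
beta = np.ones(len(kernels)) / len(kernels)       # kernel combination weights
K = sum(b * Km for b, Km in zip(beta, kernels))   # multi-kernel matrix

# Minimize (1/2)||y - K a||^2 + (lam/2) a^T K a by gradient descent
lam = 0.1
L = np.linalg.eigvalsh(K)[-1]                     # largest eigenvalue of K
eta = 1.0 / (L ** 2 + lam * L)                    # step size within the stable range
alpha = np.zeros(n)
for _ in range(1000):
    grad = K @ (K @ alpha - y) + lam * K @ alpha
    alpha -= eta * grad

y_hat = K @ alpha                                 # reconstructed signal at all nodes
```

The step size is tied to the largest eigenvalue of `K` so the iteration provably decreases the objective; in practice this choice is itself a hyper-parameter of the kind the dissertation analyzes.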
To achieve better performance at lower computational cost by selecting suitable hyper-parameters, the dissertation then presents a framework for analyzing the influence of the hyper-parameters on the predictions of single-kernel regression models. Because of the iterative nature of the model solution, the influence of the hyper-parameters is hard to determine directly. The main idea of the framework is therefore to express the model prediction as a weighted sum of the training observations, and then to analyze the influence of the parameters on the observation weights. With the framework, it is found that the observation weights are scaled kernel values between the input for prediction and the inputs of the training observations. This verifies that the kernel acts as a similarity measure and shows that the scaling factor of the kernel values is related to the time difference between the two inputs of the kernel. The framework thus clarifies the impact of the hyper-parameters and suggests suitable selections for them. Finally, the framework is generalized to perform parameter analysis for an iterative solution of a kernel regression model on the graph signal prediction problem, where the input is agnostic. The generalized framework can analyze the solution acquired from the batch gradient descent algorithm, making the solution acquired from the gradient descent algorithm a special case.
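The weighted-sum view can be illustrated concretely. Since each gradient descent update is linear in the training targets, the coefficient vector after t steps can be written as a matrix applied to y, and the prediction at a new input becomes an explicit weighted sum of the training observations. The sketch below is an assumed setup (Gaussian kernel, synthetic data, illustrative hyper-parameters), not the dissertation's exact derivation:

```python
import numpy as np

rng = np.random.default_rng(1)

def k(a, b, sigma=1.0):
    """Gaussian kernel between two inputs (assumed kernel choice)."""
    return float(np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2)))

n = 20
X = rng.normal(size=(n, 3))                               # stand-in training inputs
y = X @ np.array([1.0, -0.5, 0.2]) + 0.05 * rng.normal(size=n)
K = np.array([[k(a, b) for b in X] for a in X])

lam = 0.1
L = np.linalg.eigvalsh(K)[-1]
eta = 1.0 / (L ** 2 + lam * L)

# The update alpha <- alpha - eta*((K^2 + lam*K) @ alpha - K @ y) is linear
# in y, so alpha_t = A_t @ y; track the matrix A_t instead of alpha itself.
A = np.zeros((n, n))
M = np.eye(n) - eta * (K @ K + lam * K)                   # per-step linear map
for _ in range(2000):
    A = M @ A + eta * K

x_new = rng.normal(size=3)
k_new = np.array([k(x_new, xi) for xi in X])              # kernel values to x_new

w = A.T @ k_new           # observation weights: kernel values rescaled by A
y_hat = w @ y             # prediction as a weighted sum of training observations
```

Here `w` makes the scaling explicit: each training observation contributes through its kernel value to the new input, transformed by the matrix accumulated over the iterations, which is the object the analysis framework studies.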
