Parameter Analysis on Variants of Kernel Regression over Graphs
 Zhao, Yue
 Advisor(s): Ayanoglu, Ender
Abstract
The dissertation focuses on understanding parameter influences on variants of kernel regression models over graphs. Graphs represent complex systems in which components are modeled as nodes and relationships among components as edges connecting nodes. Kernel regression models can be used to solve graph-related problems such as graph signal reconstruction and prediction. In the graph signal reconstruction problem, a common case is to predict an unknown attribute of a node using known values of the same attribute from other nodes together with the network structure. In the graph prediction problem, a common case is to predict a graph signal over the network based on historical graph signals. The essence of both problems is to model the input-output relationship, and a kernel-based regression model with an iterative solution is a simple yet potentially powerful approach. The dissertation first presents an application of the kernel regression model to the graph signal reconstruction problem over multilayer graphs. This problem aims to estimate unknown nodal values based on known nodal values and the multilayer network structure. Viewing the mapping from the local network structure of a node to the nodal value as a function in a Reproducing Kernel Hilbert Space (RKHS), a regression model based on multiple kernels is built and a minimization problem is formulated. With the gradient descent algorithm, the solution to the minimization problem can be found iteratively. This application verifies the predictive ability of the kernel-based regression model. It also shows that single-kernel models serve as building blocks of a multi-kernel model and that performance depends on the hyperparameter settings of the single-kernel regression models.
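As an illustration of the kind of iterative solution described above, the following sketch fits a multi-kernel regression model by gradient descent on a regularized least-squares objective. All data, kernel bandwidths, step size, and regularization weight here are hypothetical placeholders; this is a minimal stand-in for the dissertation's multilayer-graph model, not its actual implementation.

```python
import numpy as np

# Hypothetical setup: reconstruct nodal values from observed nodes.
# X holds per-node features (e.g., a summary of local network structure),
# y the known nodal signal values.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))   # 20 observed nodes, 5 features each
y = rng.standard_normal(20)        # known nodal values

def gaussian_kernel(A, B, sigma):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Multi-kernel model: a convex combination of single-kernel models, each
# with its own bandwidth hyperparameter -- the settings the abstract notes
# the overall performance depends on.
sigmas = [0.5, 1.0, 2.0]
weights = np.ones(len(sigmas)) / len(sigmas)
K = sum(w * gaussian_kernel(X, X, s) for w, s in zip(weights, sigmas))

# Minimize the regularized least-squares objective
#   J(a) = ||y - K a||^2 + lam * a^T K a
# iteratively by gradient descent.
lam, step, n_iters = 0.1, 1e-3, 2000
a = np.zeros(len(y))
for _ in range(n_iters):
    grad = -2 * K @ (y - K @ a) + 2 * lam * K @ a
    a -= step * grad

y_hat = K @ a   # reconstructed signal at the observed nodes
```

In this sketch the multi-kernel combination weights are fixed; in a learned multi-kernel model they would be additional variables in the minimization.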
To achieve better performance at lower computational cost by selecting suitable hyperparameters for the model, the dissertation then presents a framework for analyzing the influence of the hyperparameters on the predictions of single-kernel regression models. Because of the iterative nature of the model solution, it is hard to determine the influence of the hyperparameters directly. The main idea of the proposed framework is therefore to express the model prediction as a weighted sum of the training observations and then to analyze the influence of the parameters on the observation weights. With the framework, it is found that the observation weights are scaled kernel values between the input for prediction and the inputs of the training observations. This verifies that the kernel acts as a similarity measure and shows that the scaling factor for the kernel values is related to the time difference between the two inputs of the kernel. The framework helps in understanding the impact of the hyperparameters and hints at suitable choices of those parameters. The framework is then generalized to perform parameter analysis for an iterative solution of a kernel regression model addressing the graph signal prediction problem, where the input is agnostic. In the generalized framework, the solution obtained from the batch gradient descent algorithm can be analyzed, making the solution obtained from the gradient descent algorithm a special case.
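The weighted-sum view underlying the analysis framework can be sketched numerically. As a simplification, the sketch below evaluates the closed-form kernel ridge fixed point rather than the gradient descent iterates the dissertation analyzes (where the scaling factor involves the time difference between inputs); the data, bandwidth, and regularization weight are hypothetical. The point shown is only that the prediction at a new input equals a weighted sum of the training observations, with weights obtained by rescaling the kernel values between the new input and the training inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((15, 3))   # hypothetical training inputs
y = rng.standard_normal(15)        # hypothetical training observations

def rbf(A, B, sigma=1.0):
    """Gaussian kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

lam = 0.2
K = rbf(X, X)
# Fixed point of the regularized least-squares iteration (kernel ridge):
alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)

x_new = rng.standard_normal((1, 3))
k_new = rbf(x_new, X)[0]           # kernel values k(x_new, x_i)

# The prediction written two equivalent ways:
#   directly,  f(x_new) = sum_i alpha_i k(x_new, x_i), and
#   as a weighted sum of observations,  f(x_new) = sum_i w_i y_i,
# where the weights w are linearly rescaled kernel values.
pred_direct = k_new @ alpha
w = np.linalg.solve(K + lam * np.eye(len(y)), k_new)  # observation weights
pred_weighted = w @ y
```

Analyzing how a hyperparameter (here, the bandwidth or the regularization weight) moves the entries of `w` is then a proxy for analyzing its effect on the prediction itself.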