Hybrid Power Model for Data Center Energy Efficiency
- Bernard, Nigel
- Advisor(s): Jeon, Hyeran
Abstract
With the growing complexity of big data workloads that demand abundant data and computation, data centers consume a tremendous amount of power every day. To minimize data center power consumption, prior work has developed power models that inform load balancing decisions. Several difficulties make power modeling a nontrivial task. First, a model must capture not only each component's direct effect on energy consumption but also the inter-dependencies between components. In addition, a power model should remain valid across a wide variety of workloads and generalize to different platforms. Early analytical power models predicted power consumption mainly from single-variable CPU utilization functions. However, as emerging workloads such as machine learning increase system memory and disk/cache accesses, these models suffer a significant loss of accuracy because they do not account for power-hungry memory-related events. Machine-learning-based power models attempt to capture the parameters relevant to these memory events, but they incur high prediction latency and improve accuracy mostly on non-compute-intensive workloads. They also do not generalize easily to different platforms because they are trained on profiling data from specific server nodes, a growing limitation given the increasing hardware heterogeneity in data centers. Furthermore, as more data centers migrate to containerized applications to leverage the isolation, security, and scalability of container technology, container power modeling remains largely unstudied in the literature, so load migration algorithms cannot make accurate decisions about which containers to migrate.
This thesis tackles these issues with three proposals. First, a hybrid power model is proposed that selects between a lightweight analytical model and a more accurate DNN model by weighing prediction accuracy against performance and power overhead. A workload classifier incorporated in the hybrid power model evaluates the characteristics of the currently running workloads and determines which power model better captures them; the hybrid power model then outputs a power prediction from the selected model. Second, a ground truth standardization method is proposed that enables a single machine learning model to be used across heterogeneous server nodes without a significant accuracy discrepancy caused by server-specific features. Third, a novel container power prediction method is proposed to predict the power draw of individual container applications for accurate load migration decision making. We compare our hybrid power model against a state-of-the-art recursive autoencoder power model (RAE) and an analytical power model. Our experiments show that, when integrated as a load migration trigger, the hybrid power model provides energy savings of up to 5-10% compared to the RAE and analytical power models.
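As a rough illustration of the hybrid selection idea described in the abstract, the sketch below shows how a workload classifier might route a sample of utilization counters to either a lightweight analytical CPU-utilization model or a more detailed DNN-style model. All class names, features, thresholds, and coefficients here are hypothetical placeholders for illustration, not the thesis implementation.

```python
# Minimal sketch of a hybrid power model selector (hypothetical names and values,
# not the thesis implementation).

from dataclasses import dataclass


@dataclass
class WorkloadSample:
    cpu_util: float         # 0.0-1.0 average CPU utilization
    mem_bw_util: float      # 0.0-1.0 normalized memory bandwidth utilization
    cache_miss_rate: float  # normalized cache misses per instruction


class AnalyticalModel:
    """Lightweight single-variable model: power as a linear function of CPU utilization."""

    def __init__(self, idle_watts: float = 50.0, dynamic_watts: float = 150.0):
        self.idle_watts = idle_watts
        self.dynamic_watts = dynamic_watts

    def predict(self, s: WorkloadSample) -> float:
        return self.idle_watts + self.dynamic_watts * s.cpu_util


class DNNModel:
    """Stand-in for a trained DNN; a real model would run inference on profiled counters."""

    def predict(self, s: WorkloadSample) -> float:
        return 50.0 + 120.0 * s.cpu_util + 60.0 * s.mem_bw_util + 40.0 * s.cache_miss_rate


class HybridPowerModel:
    """Routes each sample to the model that better fits its characteristics."""

    def __init__(self, memory_intensity_threshold: float = 0.3):
        self.analytical = AnalyticalModel()
        self.dnn = DNNModel()
        self.threshold = memory_intensity_threshold

    def classify(self, s: WorkloadSample) -> str:
        # Compute-bound samples go to the cheap analytical model;
        # memory-intensive samples go to the more accurate (but costlier) DNN.
        memory_intensity = max(s.mem_bw_util, s.cache_miss_rate)
        return "dnn" if memory_intensity > self.threshold else "analytical"

    def predict(self, s: WorkloadSample) -> float:
        model = self.dnn if self.classify(s) == "dnn" else self.analytical
        return model.predict(s)


if __name__ == "__main__":
    hybrid = HybridPowerModel()
    compute_bound = WorkloadSample(cpu_util=0.9, mem_bw_util=0.1, cache_miss_rate=0.05)
    memory_bound = WorkloadSample(cpu_util=0.4, mem_bw_util=0.7, cache_miss_rate=0.40)
    print(hybrid.classify(compute_bound), hybrid.predict(compute_bound))
    print(hybrid.classify(memory_bound), hybrid.predict(memory_bound))
```

In this sketch the classification rule is a simple threshold on memory intensity; the thesis instead weighs prediction accuracy against performance and power overhead when choosing between the two models.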