Localization of nonlinear damage using state-space-based predictions under stochastic excitation
- Author(s): Liu, G; Mao, Z; Todd, M; Huang, Z; et al.
- Published Web Location: https://doi.org/10.1088/0964-1726/23/2/025036
This paper presents a study on localizing damage under stochastic excitation using state-space-based methods, where the damaged response contains some nonlinearity. Two state-space-based prediction algorithms, auto-prediction and cross-prediction, are employed; for localization, the largest prediction error is expected at the sensor pair closest to the actual damage. The Bhattacharyya distance is adopted as the metric to quantify how the prediction-error distributions obtained at different sensor locations differ. Two lab-scale test-beds serve as validation platforms: a two-story plane steel frame with bolt-loosening damage and a three-story benchmark aluminum frame with a simulated tunable crack. Band-limited Gaussian noise is applied to both systems through an electrodynamic shaker. Test results indicate that the damage detection capability of the state-space-based method depends on the nonlinearity-induced high-frequency responses. Because those high-frequency components attenuate quickly in time and space, the method shows strong localization capability: the largest deviation in Bhattacharyya distance coincides with the sensors closest to the physical damage location. This work extends the state-space-based damage detection method to localization under stochastic excitation, which offers the advantage of compatibility with ambient excitations. Moreover, results from both experiments indicate that the state-space-based method is sensitive only to nonlinearity-induced damage, so it can be used in parallel with linear classifiers or normalization strategies to insulate against operational and environmental variability, which often affects the system response in a linear fashion. © 2014 IOP Publishing Ltd.
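To illustrate the two ingredients named in the abstract, the following is a minimal sketch, not the authors' implementation: a delay-embedding reconstruction of the state space with a nearest-neighbor one-step predictor (one common realization of state-space auto/cross-prediction), and the closed-form Bhattacharyya distance between two prediction-error sets under an assumed univariate Gaussian model. The embedding dimension `m`, delay `tau`, and function names are illustrative choices, not taken from the paper.

```python
import numpy as np

def embed(x, m=3, tau=1):
    """Delay-embed a 1-D series into m-dimensional state vectors."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def cross_prediction_errors(ref, test, m=3, tau=1):
    """Predict `test` one step ahead using the nearest neighbor in the
    reconstructed state space of `ref`; return the prediction-error series.
    Auto-prediction is the special case ref == test (baseline record)."""
    R = embed(ref, m, tau)
    T = embed(test, m, tau)
    # Drop the last reference state so every neighbor has a successor.
    R_base = R[:-1]
    succ = ref[(m - 1) * tau + 1 :]          # successor of each reference state
    errs = []
    for t in range(len(T) - 1):
        d = np.linalg.norm(R_base - T[t], axis=1)
        pred = succ[np.argmin(d)]            # neighbor's successor as prediction
        actual = test[(m - 1) * tau + t + 1]
        errs.append(actual - pred)
    return np.asarray(errs)

def bhattacharyya_gaussian(e1, e2):
    """Bhattacharyya distance between two error sets, each modeled as a
    univariate Gaussian (closed form in means and variances)."""
    m1, m2 = e1.mean(), e2.mean()
    v1, v2 = e1.var(), e2.var()
    return (0.25 * np.log(0.25 * (v1 / v2 + v2 / v1 + 2))
            + 0.25 * (m1 - m2) ** 2 / (v1 + v2))
```

In a localization scheme of this kind, the errors from a baseline (healthy) record and from the current record would be compared sensor pair by sensor pair; the pair with the largest Bhattacharyya distance flags the damage location. A nearest-neighbor search via a k-d tree would replace the brute-force distance loop for longer records.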