Published March 15, 2020 | public
Journal Article

Blind, simultaneous identification of full-field vibration modes and large rigid-body motion of output-only structures from digital video measurements

Abstract

Many operating structures undergo vibration coupled with significant rigid-body motion: earthquake-excited structures vibrate while translating (and/or rocking) with the shaking ground; rotating structures such as helicopter, aircraft, and wind-turbine blades spin while deforming; and UAV-based measurements (images/videos) of infrastructure response are coupled with the motion of the UAV itself. In many such cases the rigid motion dominates, and measuring and identifying the relatively subtle deformation (flexible) motion of the structure in operation is very challenging, especially when high-resolution vibration measurement and high-fidelity dynamics modeling are required to capture and characterize spatially local and temporally transient dynamic behaviors and features of the structure. Traditional contact-type wired or wireless sensors, such as accelerometers, strain gauges, and LVDTs, are discrete point-wise sensors that allow instrumentation at only a limited number of locations, providing vibration measurements with very low spatial resolution. Non-contact, interferometry-based optical measurement techniques using laser beams, such as laser Doppler vibrometers, electronic speckle pattern interferometry (ESPI), and holographic interferometry, can provide high-spatial-resolution vibration measurement without the need to install sensors on the structure; however, they are very vulnerable to ambient vibration and rigid-body motion. As an alternative non-contact optical measurement method, photogrammetry uses white-light imaging with digital video cameras that are relatively low-cost and agile, providing simultaneous measurements with very high spatial resolution, where every pixel effectively becomes a sensing unit. Furthermore, it is robust to rigid-body motion.
Using image processing algorithms such as optical flow and image correlation, photogrammetry has been successfully used for vibration measurement and experimental modal analysis of structures (including rotating ones), where full-field modal parameters (mode shapes) can be obtained at as many points as there are pixels covering the structure in the image frame. However, such existing methods typically rely on installing and tracking optical markers or speckle paint on the structure's surface, which becomes less applicable for large measurement areas or in harsh environments (high temperature, corrosion). In addition, they are computationally expensive. To alleviate these challenges, this study adapts a recently proposed video-based method for output-only, simultaneous identification of both the subtle, full-field deformation modes and the dominant rigid-body motion, using only video of the structure vibrating while moving. The full-field dynamic parameters (deformation modes) of the structure, as well as its rigid-body motion, can thus be estimated efficiently. Laboratory experiments were conducted on a bench-scale building structure with "free-free" boundary conditions, and comparisons were made with an analytical finite element model. Results demonstrate that the method can efficiently and accurately extract the full-field (total) motion of the structure and further identify and separate the subtle deformation modes and the large rigid-body motion. Furthermore, videos reconstructed by the method (provided in the supplementary materials) show that it enables realistic visualization of the subtle deformation modes of the structure in the presence of large rigid-body motion. Several practical limitations of implementing the method are also discussed.
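The core idea of separating a dominant rigid-body motion from a much smaller deformation field can be illustrated with a toy sketch. This is not the paper's blind, video-based algorithm: the displacement field, the small-rotation planar rigid model, and the cosine "mode shape" below are all hypothetical stand-ins, chosen only to show that a least-squares fit of a rigid-motion model leaves the subtle deformation in the residual.

```python
import numpy as np

# Toy illustration (NOT the paper's method): separate a dominant planar
# rigid-body motion from a subtle deformation field by least-squares
# fitting of a small-rotation rigid model  u = tx - theta*y,  v = ty + theta*x.

# Regular grid of "pixel" coordinates on the structure's surface
X, Y = np.meshgrid(np.linspace(-1.0, 1.0, 21), np.linspace(-1.0, 1.0, 21))
x, y = X.ravel(), Y.ravel()
n = x.size

# Synthetic ground truth: large rigid translation/rotation plus a subtle,
# hypothetical cosine "mode shape" in the vertical displacement
tx_true, ty_true, theta_true = 5.0, -3.0, 0.01
mode = 0.05 * np.cos(np.pi * x)
u = tx_true - theta_true * y          # horizontal displacement per pixel
v = ty_true + theta_true * x + mode   # vertical displacement per pixel

# Stack both displacement components and solve for [tx, ty, theta]
A = np.zeros((2 * n, 3))
A[:n, 0] = 1.0
A[:n, 2] = -y
A[n:, 1] = 1.0
A[n:, 2] = x
params, *_ = np.linalg.lstsq(A, np.concatenate([u, v]), rcond=None)
tx, ty, theta = params

# Residual after removing the fitted rigid motion ~ the deformation field
deformation = v - (ty + theta * x)
```

In practice the paper's method operates blindly on motion fields extracted from the video itself rather than on known displacements; this sketch only shows why a deformation pattern two orders of magnitude smaller than the rigid motion can still be recovered once the rigid component is modeled and removed.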

Additional Information

© 2020 Elsevier. Received 2 July 2018, Revised 4 January 2020, Accepted 6 January 2020, Available online 31 January 2020.

Additional details

Created: August 22, 2023
Modified: October 19, 2023