Eng (Feb 2022)
Generalized Damping Model for MEMS Oscillators from Molecular to Viscous Flow Regime
Abstract
In this study, we investigate the damping phenomena acting on piezoelectrically driven MEMS oscillators. Three different geometries of MEMS oscillators are presented: cantilevers, bending oscillators, and paddle oscillators. An analytical model for their resonance frequencies is derived. The bending modes of these micro-oscillator structures are characterized in terms of their resonance frequency and quality factor as a function of the ambient pressure in a nitrogen atmosphere and of the distance to a neighboring plate representing a geometric boundary (e.g., the package or the mounting). The investigations cover a pressure range from 10⁻³ mbar up to 900 mbar and gap widths from 150 µm to 3500 µm. Consequently, a Knudsen number range spanning six orders of magnitude, from 100 down to 10⁻⁴, is covered. The measurement data are evaluated with a generalized damping model consisting of four parts representing the individual damping mechanisms (intrinsic, molecular, transitional, and viscous). The evaluated parameters are analyzed as a function of the resonance frequency and the gap width. The data reveal an exponentially growing saturation behavior, determined by two characteristic lengths that correlate with the viscous and the thermal boundary layer thickness, respectively. This allows the strength and the range of the damping effect to be estimated simply by calculating the boundary layer thicknesses from the resonance frequency and the gas properties. From these results, we gain fundamental insights into the viscous and transitional damping mechanisms as well as into the intrinsic losses. In conclusion, a basic concept is provided to reduce the damping of micro-oscillator bending modes and thus increase the quality factor. Additionally, the results are supported by finite element simulations revealing the temperature and pressure distribution within the gap.
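As a point of reference, the relations below are a minimal sketch of the quantities the abstract refers to, assuming the standard additive combination of loss contributions and the textbook definitions of the Knudsen number and the boundary layer thicknesses; the symbols Q_int, Q_mol, Q_trans, Q_visc, the gap width d, the mean free path λ, the kinematic viscosity ν, and the thermal diffusivity a are labels chosen here for illustration and are not taken from the paper, whose specific fit parameters are not reproduced.

\[
\frac{1}{Q} = \frac{1}{Q_\mathrm{int}} + \frac{1}{Q_\mathrm{mol}} + \frac{1}{Q_\mathrm{trans}} + \frac{1}{Q_\mathrm{visc}},
\qquad
\mathrm{Kn} = \frac{\lambda}{d},
\]
\[
\delta_\mathrm{visc} = \sqrt{\frac{2\nu}{\omega}},
\qquad
\delta_\mathrm{th} = \sqrt{\frac{2a}{\omega}},
\qquad
\omega = 2\pi f_\mathrm{res}.
\]

In this reading, the quality factor follows from summing the individual loss channels, the Knudsen number compares the mean free path to the gap width to identify the flow regime, and the two boundary layer thicknesses set the characteristic lengths over which the damping acts, given only the resonance frequency and the gas properties.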
Keywords