Abstract
This paper concerns the application of a long short-term memory (LSTM) model to the high-resolution reconstruction of turbulent pressure fluctuation signals from sparse (reduced) data. The model was trained on data from high-resolution computational fluid dynamics (CFD) simulations of high-speed turbulent boundary layers over a flat panel. During the preprocessing stage, we employed cubic spline functions to increase the fidelity of the sparse signals and then fed them to the LSTM model for precise reconstruction. We evaluated the reconstruction with the root mean squared error (RMSE) metric and by inspecting power spectrum plots. Our study reveals that the model achieved an accurate high-resolution reconstruction of the training signal and transferred successfully to new, unseen signals of a similar nature. The numerical simulations show promising results for complex turbulent signals, whether produced experimentally or computationally.
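The following is a minimal sketch of the pipeline the abstract describes (spline upsampling of a sparse signal, an LSTM reconstruction model, and an RMSE check). The framework (SciPy/PyTorch), the synthetic signal, the window length, the layer sizes, and the training settings are illustrative assumptions, not the authors' configuration.

```python
# Hedged sketch of the described workflow: cubic-spline upsampling of a sparse
# signal, an LSTM reconstructor, and RMSE evaluation. All numbers and the
# synthetic signal are illustrative placeholders for the paper's CFD data.
import numpy as np
import torch
import torch.nn as nn
from scipy.interpolate import CubicSpline

# --- Preprocessing: cubic-spline interpolation of a sparse (subsampled) signal ---
t_full = np.linspace(0.0, 1.0, 1024)              # target high-resolution time grid
t_sparse = t_full[::8]                            # sparse samples (every 8th point)
p_sparse = np.sin(40 * np.pi * t_sparse) + 0.1 * np.random.randn(t_sparse.size)
p_spline = CubicSpline(t_sparse, p_sparse)(t_full)  # spline-interpolated LSTM input

# Hypothetical high-resolution reference; in the paper this comes from CFD data.
p_target = np.sin(40 * np.pi * t_full)


class Reconstructor(nn.Module):
    """Maps the spline-interpolated signal to a refined high-resolution signal."""

    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, time, 1)
        out, _ = self.lstm(x)
        return self.head(out)                             # (batch, time, 1)


x = torch.tensor(p_spline, dtype=torch.float32).view(1, -1, 1)
y = torch.tensor(p_target, dtype=torch.float32).view(1, -1, 1)

model = Reconstructor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):                                  # short illustrative training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# --- Evaluation: RMSE between the reconstruction and the reference signal ---
with torch.no_grad():
    rmse = torch.sqrt(loss_fn(model(x), y)).item()
print(f"RMSE: {rmse:.4f}")
```

A power-spectrum comparison (for example with scipy.signal.welch applied to the reconstructed and reference signals) would complement the RMSE check, mirroring the paper's second evaluation criterion.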
| Field | Value |
| --- | --- |
| Original language | English |
| Article number | 4 |
| Journal | Computation |
| Volume | 12 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2024 |
Keywords
- deep learning
- interpolation
- LSTM
- signal reconstruction
- time series
- turbulence