Generalizability of transformer-based deep learning for multidimensional turbulent flow data

Dimitris Drikakis, Ioannis William Kokkinakis, Daryl Fung, S. Michael Spottswood

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Deep learning has advanced rapidly and is becoming increasingly useful in scientific computation, with opportunities in many fields, including but not limited to fluid flows and fluid-structure interactions. High-resolution numerical simulations are computationally expensive, while experiments are equally demanding and subject to instrumentation constraints for obtaining flow, acoustic and structural data, particularly at high flow speeds. This paper presents a Transformer-based deep learning method for turbulent flow time series data. Turbulent signals across spatiotemporal and geometrical variations are investigated. The pressure signals are coarse-grained, and the Transformer reconstructs a fine-grained pressure signal. The training includes data across spatial locations of compliant panels with static deformations arising from the aeroelastic effects of shock-boundary layer interaction. Different training approaches using the Transformer were investigated. Evaluations were carried out using the predicted pressure signals and their power spectra. The Transformer's predicted signals show promising performance. The proposed method is not limited to pressure fluctuations and can be extended to other turbulent or turbulent-like signals.
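    The abstract does not include implementation details, so the following is a minimal, hypothetical sketch in PyTorch of the workflow it describes: a Transformer encoder takes a coarse-grained pressure time series and produces a fine-grained reconstruction, which is then assessed through its power spectrum (here via scipy.signal.welch). The module names, layer sizes, and upsampling strategy are illustrative assumptions, not the authors' published architecture.

```python
# Hypothetical sketch of the coarse-to-fine pressure-signal workflow described
# in the abstract. Not the authors' implementation; sizes and names are assumed.
import torch
import torch.nn as nn
from scipy.signal import welch


class PressureUpsampler(nn.Module):
    """Map a coarse pressure signal (B, L) to a fine one (B, L * upsample_ratio)."""

    def __init__(self, d_model=64, nhead=4, num_layers=3, upsample_ratio=4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)               # scalar sample -> feature vector
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, upsample_ratio)   # each coarse step emits r fine samples
        # Positional encoding is omitted here for brevity.

    def forward(self, x_coarse):
        # x_coarse: (batch, L) coarse-grained pressure fluctuations
        h = self.embed(x_coarse.unsqueeze(-1))           # (B, L, d_model)
        h = self.encoder(h)                              # self-attention over the time axis
        fine = self.head(h)                              # (B, L, r)
        return fine.reshape(x_coarse.size(0), -1)        # (B, L * r)


# Usage: reconstruct a fine-grained signal and inspect its power spectrum,
# the kind of quantity the paper uses for evaluation.
model = PressureUpsampler()
coarse = torch.randn(8, 256)                             # stand-in for coarse-grained pressure data
fine_pred = model(coarse)                                # (8, 1024)
freqs, psd = welch(fine_pred[0].detach().numpy(), fs=1.0, nperseg=256)
```

    Self-attention over the time axis is the ingredient the paper attributes to the Transformer; the per-step linear head that emits several fine samples per coarse sample is only one plausible way to realize the coarse-to-fine mapping.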

    Original language: English
    Article number: 026102
    Journal: Physics of Fluids
    Volume: 36
    Issue number: 2
    DOIs
    Publication status: Published - 1 Feb 2024
