Machine-Learning Methods on Noisy and Sparse Data

Konstantinos Poulinakis, Dimitris Drikakis, Ioannis W. Kokkinakis, Stephen Michael Spottswood

Research output: Contribution to journal › Article › peer-review

Abstract

Experimental and computational data, as well as field data obtained from measurements, are often sparse and noisy. Consequently, interpolating unknown functions under these restrictions to provide accurate predictions is very challenging. This study compares machine-learning methods and cubic splines in terms of the training-data sparsity they can handle, especially when the training samples are noisy. We quantify deviation from a true function f using the mean square error, the signal-to-noise ratio and the Pearson correlation coefficient. We show that, given very sparse data, cubic splines constitute a more precise interpolation method than deep neural networks and multivariate adaptive regression splines. In contrast, machine-learning models are robust to noise and can outperform splines once a training-data threshold is met. Our study aims to provide a general framework for interpolating one-dimensional signals, often the result of complex scientific simulations or laboratory experiments.
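The spline baseline and the three evaluation metrics named in the abstract (MSE, SNR, Pearson coefficient) can be sketched as follows. This is an illustrative example, not the paper's code: the test function (a sine), the noise level, and the sample counts are all assumptions made for demonstration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def evaluate_interpolant(f, x_train, y_train, x_test):
    """Fit a cubic spline to (possibly noisy) samples of f and score it
    against the true function on a dense test grid."""
    spline = CubicSpline(x_train, y_train)
    y_pred = spline(x_test)
    y_true = f(x_test)
    # Mean square error of the reconstruction
    mse = np.mean((y_true - y_pred) ** 2)
    # Signal-to-noise ratio of the reconstruction, in dB
    snr = 10.0 * np.log10(np.sum(y_true ** 2) / np.sum((y_true - y_pred) ** 2))
    # Pearson correlation between prediction and ground truth
    r = np.corrcoef(y_true, y_pred)[0, 1]
    return mse, snr, r

rng = np.random.default_rng(0)
f = np.sin                                          # illustrative true function
x_train = np.linspace(0.0, 2.0 * np.pi, 8)          # sparse training samples
y_train = f(x_train) + rng.normal(0.0, 0.05, 8)     # additive measurement noise
x_test = np.linspace(0.0, 2.0 * np.pi, 200)         # dense evaluation grid

mse, snr, r = evaluate_interpolant(f, x_train, y_train, x_test)
print(f"MSE={mse:.4f}  SNR={snr:.1f} dB  r={r:.3f}")
```

A neural-network or MARS interpolant would be scored with the same three metrics, which is what makes the comparison across methods and sparsity levels direct.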

Original language: English
Article number: 236
Journal: Mathematics
Volume: 11
Issue number: 1
DOIs
Publication status: Published - Jan 2023
Externally published: Yes

Keywords

  • deep neural networks
  • feedforward neural networks
  • interpolation
  • machine learning
  • MARS
  • noisy data
  • sparse data
  • splines
