Interpolation is a method of constructing new data points within the range of a discrete set of known data points. Given a set of sampled values of a function, it is often required to estimate the value of that function at an intermediate value of the independent variable. This can be achieved by curve fitting or regression analysis. There are many different interpolation methods, including piecewise constant interpolation, linear interpolation, polynomial interpolation, spline interpolation, and Gaussian processes. Other forms of interpolation can be constructed by picking a different class of interpolants.
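As a minimal sketch of the simplest of these methods, linear interpolation can be performed with NumPy's built-in `np.interp`; the sample data points below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Known data points (here, samples of y = x**2).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 1.0, 4.0, 9.0])

# Estimate the function at an intermediate value of the
# independent variable: linear interpolation between the
# neighboring known points (1, 1) and (2, 4).
y_mid = np.interp(1.5, x, y)
print(y_mid)  # 2.5
```

Note that the interpolated value 2.5 differs from the true value 1.5² = 2.25; the error reflects the straight-line assumption between sample points, which a higher-order method such as spline interpolation would reduce.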

Extrapolation is the process of estimating, beyond the original observation range, the value of a variable on the basis of its relationship with another variable. Extrapolation is similar to interpolation, but it is subject to greater uncertainty and a higher risk of producing meaningless results. Like interpolation, extrapolation uses a variety of techniques that require prior knowledge of the process that created the existing data points; these include linear extrapolation, polynomial extrapolation, conic extrapolation, and French curve extrapolation. The quality of a particular method of extrapolation is typically limited by the assumptions the method makes about the underlying function.
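A minimal sketch of linear extrapolation, under the assumption that a straight-line trend fitted to hypothetical observed data continues beyond the observation range:

```python
import numpy as np

# Hypothetical observations, roughly following y = 2x.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.0, 6.1, 7.9])

# Fit a degree-1 polynomial (a line) to the observed points.
slope, intercept = np.polyfit(x, y, 1)

# Evaluate the fitted line beyond the observed range of x.
y_extrap = slope * 6.0 + intercept
```

If the true process is not linear outside the observed range, this estimate can be arbitrarily wrong, which is exactly the greater uncertainty extrapolation carries compared with interpolation.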

Regression analysis is a statistical process for estimating the relationships among variables. It includes many techniques for modeling and analyzing several variables when the focus is on the relationship between a dependent variable and one or more independent variables. Regression analysis helps one understand how the typical value of the dependent variable changes when any one of the independent variables is varied while the other independent variables are held fixed. Regression analysis is widely used for prediction and forecasting, where its use has substantial overlap with the field of machine learning.
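A minimal sketch of one common regression technique, ordinary least squares with two independent variables, using synthetic data generated for illustration (the coefficients 3.0, -2.0, and 5.0 are assumed, not from any real dataset):

```python
import numpy as np

# Synthetic data: y depends on two independent variables x1, x2
# plus a small amount of noise.
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 10.0, 50)
x2 = rng.uniform(0.0, 10.0, 50)
y = 3.0 * x1 - 2.0 * x2 + 5.0 + rng.normal(0.0, 0.1, 50)

# Design matrix with an intercept column; solve the least-squares
# problem X @ coef ≈ y.
X = np.column_stack([np.ones_like(x1), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# coef[1] estimates how the typical value of y changes per unit
# increase in x1 while x2 is held fixed (and coef[2] likewise for x2).
```

The recovered coefficients should be close to the generating values (intercept near 5, slopes near 3 and -2), which is the sense in which regression estimates the relationship between the dependent and independent variables.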