A graphical view is often easier to understand than a purely mathematical one, which is why curve fitting is useful in statistics: it represents a large amount of data in the form of a curve. More precisely, curve fitting is a way of finding the curve that best fits a large set of data. With the help of a fitted curve we can visualize the whole data set at a glance, summarize it, and judge where the data lie and how the variables are related. We will now discuss how curves are fitted in statistics.

A dotted graph of this kind, which plots the height of each person against their weight, is called a scatter plot. Scatter plots belong to statistics because they show the relationship between two groups of data, and statistics deals with groups of data.

A scatter plot is useful for finding the correlation between two sets of data. For example, we can make a scatter plot of an ice cream shop's sales against temperature, because ice cream sales depend on temperature. When the temperature is high, sales increase, and the correlation shows up as points clustering tightly along an upward trend; when the temperature is low, sales stay low, and the points show little linkage between temperature and sales in the scatter plot.

**Curve fitting can be thought of as the process of finding the mathematical function that best fits a sequence of data points.** Here we will discuss polynomial curve fitting, which is done by following certain rules of fitness. The main techniques used for curve fitting are smoothing and interpolation, and common examples include linear regression, non-linear regression, and polynomial fitting. Extrapolation means using the fitted curve outside the range of the data being fitted, while interpolation means using it within that range. Interpolation can be used for polynomial fitting: the fitted polynomial curve passes through each and every data point, and values for new points are estimated as lying on the fitted curve. An example illustrates one of the simpler approaches:

Suppose a polynomial is defined as P(X) = y0 (X − x1) / (x0 − x1) + y1 (X − x0) / (x1 − x0), where (x0, y0) and (x1, y1) are the two data points to be fitted. This polynomial has the property that P(x0) = y0 and P(x1) = y1, and it represents a straight line. If we fit more data points, the degree of the polynomial grows correspondingly. For three points we can use a polynomial of this type:

P(X) = Y0 (X − X1)(X − X2) / [(X0 − X1)(X0 − X2)] + Y1 (X − X0)(X − X2) / [(X1 − X0)(X1 − X2)] + Y2 (X − X0)(X − X1) / [(X2 − X0)(X2 − X1)]

Here the data points are (X0, Y0), (X1, Y1) and (X2, Y2). Other polynomial interpolation techniques include divided differences and iterated interpolation.
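The two- and three-point formulas above generalize to any number of points, and the construction can be sketched directly in code. A minimal sketch (function and variable names are illustrative, and the sample points are invented):

```python
# Lagrange-style interpolation: build P(X) that passes exactly through the given points.
def interpolate(points, X):
    """Evaluate the interpolating polynomial for `points` at X."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                # Each factor is zero at the other points and one at (xi, yi).
                term *= (X - xj) / (xi - xj)
        total += term
    return total

pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0)]
# The fitted curve passes through every data point:
print(interpolate(pts, 1.0))  # → 3.0
```

Note that the degree of the resulting polynomial is one less than the number of points, matching the text: two points give a line, three give a quadratic.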

**In curve fitting by the method of least squares, we choose the curve that fits best in the sense of having the minimum sum of squared deviations (least squared error) from the given sample of data.**

Suppose we have data points (x1, y1), (x2, y2), …, (xn, yn), where x is the independent variable and y the dependent variable. The fitted curve F(x) has an error E at every data point, calculated as:

E1 = y1 − F(x1),

E2 = y2 − F(x2),

E3 = y3 − F(x3),

and so on, up to

En = yn − F(xn)

The best-fitting curve has the characteristic that the sum of squared errors

P = E1^2 + E2^2 + E3^2 + … + En^2 = Σ (i = 1 to n) Ei^2

is as small as possible.
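The sum of squared errors above is straightforward to compute for any candidate curve. A minimal sketch (the data points and the candidate line F(x) = 2x + 1 are invented for illustration):

```python
# Sum of squared errors P = sum_i (y_i - F(x_i))^2 for a candidate curve F.
def sum_squared_errors(xs, ys, F):
    return sum((y - F(x)) ** 2 for x, y in zip(xs, ys))

xs = [0, 1, 2, 3]
ys = [1.1, 2.9, 5.2, 6.8]
F = lambda x: 2 * x + 1  # candidate straight line

# Errors are 0.1, -0.1, 0.2, -0.2, so P = 0.01 + 0.01 + 0.04 + 0.04 = 0.10.
print(sum_squared_errors(xs, ys, F))
```

The least-squares method searches over the curve's parameters (here, slope and intercept) for the values that minimize this quantity.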

Polynomial least-squares fitting is one such method for fitting polynomial curves; polynomials are among the most frequently used curves in mathematics. The least-squares line method fits a straight line of the standard form y = mx + c to data points (x1, y1), (x2, y2), …, (xn, yn), where n > 2. In the same way the method can fit a parabola of the form y = ax^2 + bx + c, also called a quadratic function; here n > 3, where n is the total number of data points in the sample space.

A least-squares polynomial of degree p is represented as y = a0 + a1 x + a2 x^2 + … + ap x^p. Here too the fitness of the curve is judged by how closely it passes among the data points, and we require n > p + 1.
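The line, the parabola, and the general degree-p polynomial can all be fitted with the same least-squares machinery; NumPy's `polyfit` is one common implementation. A sketch with invented sample data:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.0, 2.1, 2.9, 4.2, 4.9])

# Degree 1: least-squares straight line y = m*x + c (requires n > 2 points).
m, c = np.polyfit(xs, ys, 1)

# Degree 2: least-squares parabola y = a*x^2 + b*x + c (requires n > 3 points).
a, b, c2 = np.polyfit(xs, ys, 2)

print(round(m, 2), round(c, 2))  # → 0.99 1.04
```

Raising the degree past what the data supports (n not much larger than p + 1) fits the noise rather than the trend, which is why the n > p + 1 condition matters.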

**A curve fitting algorithm in mathematics estimates, for a given set of sample data points, the curve that fits them best.** There are several techniques used to implement this; let us discuss them briefly:

1. Least-squares curve fitting: this technique chooses the curve with the least deviations or errors when squared (least squared error) from a given set of data. Say the data points of a sample space are (N1, M1), (N2, M2), …, (Nn, Mn), where N is the independent variable and M the dependent variable. For a candidate curve F(N), the deviations (errors) at each data point are calculated as:

Error1 = M1 − F(N1)

Error2 = M2 − F(N2)

Error3 = M3 − F(N3)

and so on, up to

Errorn = Mn − F(Nn)

The best-fitting curve has the property that the quantity Pi, the sum of the squared errors over all data points, is minimized:

Pi = Error1^2 + Error2^2 + Error3^2 + … + Errorn^2 = Σ (i = 1 to n) Errori^2

2. Other curve fitting methods depend on the type of curve being dealt with:

a. Linear regression for straight lines, y = ax + b.

b. Exponential curve fitting for exponential functions such as y = a e^x.

c. Rational curve fitting for rational (fractional) functions such as y = 1 / (ax + c).

d. Logarithmic fitting for curves involving logs, such as y = log(ax + g).

e. Polynomial fitting for polynomial curves such as y = ax^2 + bx + d.
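For some of the non-polynomial forms above, the same least-squares idea applies after a transformation. For instance, with the rational form y = 1/(ax + c), note that 1/y = ax + c is linear in x, so an ordinary straight-line fit to 1/y recovers a and c. A sketch with synthetic data (the values a = 2 and c = 0.5 are invented):

```python
import numpy as np

xs = np.linspace(1.0, 5.0, 20)
ys = 1.0 / (2.0 * xs + 0.5)  # synthetic data generated from a = 2, c = 0.5

# 1/y = a*x + c is a straight line in x, so fit it with ordinary least squares.
a, c = np.polyfit(xs, 1.0 / ys, 1)
print(round(a, 3), round(c, 3))  # → 2.0 0.5
```

When the data is noisy, this transformed fit weights errors differently from a direct least-squares fit on y, so the two can give slightly different answers.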

**Exponential curve fitting is a type of curve fitting method for functions whose expressions contain exponential terms.** Different types of exponential functions can suit this fitting method:

1. Y = A^(BX) + C

2. Y = A e^(BX) + C

3. Y = e^(AX^2) + BX + C

4. Y = A B^X

5. Y = A e^X

Here e is the base of the natural logarithm, approximately equal to 2.718.
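Forms such as Y = A·B^X can be fitted with plain least squares after taking logarithms, since ln Y = ln A + X ln B is linear in X. A sketch with synthetic data (the values A = 3 and B = 1.5 are invented):

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = 3.0 * (1.5 ** xs)  # synthetic data generated from A = 3, B = 1.5

# ln(y) = ln(A) + x*ln(B) is a straight line in x; fit it, then transform back.
lnB, lnA = np.polyfit(xs, np.log(ys), 1)
A, B = np.exp(lnA), np.exp(lnB)
print(round(A, 3), round(B, 3))  # → 3.0 1.5
```

This trick only works for forms without an additive constant C; when C is present, an iterative non-linear fit is needed instead.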

Representing exponential functions in their simplest form, Y = A H^X: substituting X = 0 gives the corresponding value Y = A, which can be taken as the start value. H in this function is called the growth factor.

Exponential functions can be thought of as resulting from a constant relative growth. For every unit increase in X, Y is multiplied by the factor H. Total (absolute) growth results from addition, while relative growth results from multiplication. Consider, as examples, the two exponential functions Y = 10 (0.8)^X and Y = (1.1)^X.
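The multiply-by-H behaviour of the two example functions can be checked numerically (a small sketch):

```python
# Y = 10 * 0.8**X decays by the factor 0.8 per unit step in X;
# Y = 1.1**X grows by the factor 1.1 per unit step in X.
decay = lambda x: 10 * 0.8 ** x
growth = lambda x: 1.1 ** x

for x in range(4):
    # Each unit step multiplies Y by the growth factor; nothing is added.
    print(decay(x + 1) / decay(x), growth(x + 1) / growth(x))
```

Every printed ratio equals the growth factor (0.8 or 1.1, up to floating-point noise), regardless of where along the curve the step is taken.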

In differentiation, the growth of a function, also called the rate of increase of f(X) at a value of X, is given by f′(X) or Y′; we can also call it the slope of the tangent at that point. The relative rate of growth is f′(X) / f(X). For exponential functions of the form Y = A e^(BX), with e ≈ 2.718, the derivative is proportional to the function itself, so the relative growth is the constant B. In particular, the function Y = e^X is its own derivative, and its relative growth is the constant 1.
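The claim that Y = A e^(BX) has constant relative growth B can be checked with a finite-difference derivative (a sketch; the values A = 2 and B = 0.7 are invented):

```python
import math

A, B = 2.0, 0.7
f = lambda x: A * math.exp(B * x)

h = 1e-6
for x in [0.0, 1.0, 2.5]:
    # Central-difference approximation to f'(x).
    deriv = (f(x + h) - f(x - h)) / (2 * h)
    print(round(deriv / f(x), 4))  # relative growth ≈ B = 0.7 at every x
```

Setting A = 1 and B = 1 reproduces the special case in the text: Y = e^X is its own derivative, with relative growth 1.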