
Linear Independence


In linear algebra, we deal with vector spaces, matrices and linear equations. A vector space can be either finite-dimensional or infinite-dimensional, depending upon how many linearly independent basis vectors it has. The concept of linear dependence or linear independence is studied in the context of linear combinations, where a linear combination is defined as a sum of n distinct terms, each term being the product of a vector and a scalar. Assume that $u_{1}, u_{2}, ..., u_{n}$ are vectors in a vector space and $a_{1}, a_{2}, ..., a_{n}$ are scalars from the underlying field; then the linear combination is $a_{1}u_{1} + a_{2}u_{2} + ... + a_{n}u_{n}$.

In this article, we will throw light on the concept of linear independence. We will briefly learn about linear independence of matrices and functions, the determinant test, and see a few examples based on this notion.

Definition

A set of vectors is said to be linearly independent when no vector in the given set can be expressed as a linear combination of the other vectors in that set. On the contrary, when some vector can be written as a linear combination of the others, the set is said to be linearly dependent.

Let a set of vectors V = {$u_{1}, u_{2}, ..., u_{n}$} be defined over a field of scalars, and let $a_{1}, a_{2}, ..., a_{n}$ be scalars from that field. Then the set V is said to be linearly independent if the linear combination equation $a_{1}u_{1} + a_{2}u_{2} + ... + a_{n}u_{n} = 0$ is satisfied only by $a_{1} = a_{2} = ... = a_{n} = 0$.
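
As a quick computational illustration of this definition (not part of the definition itself), one can stack the vectors as the rows of a matrix and compare its rank with the number of vectors. The sketch below assumes NumPy is available; the sample vectors are chosen only for illustration.

```python
import numpy as np

# A minimal sketch: vectors u1, ..., un, stacked as the rows of a matrix, are
# linearly independent exactly when the rank of that matrix equals n.
def is_linearly_independent(vectors):
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == len(vectors)

print(is_linearly_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
print(is_linearly_independent([[1, 2], [2, 4]]))                   # False: (2, 4) = 2 * (1, 2)
```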

Linear Independence of Matrices

Let $A$ be an $m \times n$ matrix of scalars and

$X$ = $\begin{bmatrix} x_{1}\\ x_{2}\\ ...\\ x_{n}\end{bmatrix}$

be an $n \times 1$ column matrix of variables.

Also, let $a_{1}, a_{2}, ..., a_{n}$ denote the columns of $A$. Then the columns of $A$ are said to be linearly independent if the equation of linear combination

$a_{1}x_{1} + a_{2} x_{2} + ... + a_{n}x_{n} = 0$

(that is, $AX = 0$) is satisfied only if $x_{1} = x_{2} = ... = x_{n} = 0$.
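
As a hedged illustration of this condition, the sketch below uses SymPy (an assumption of this example, not a tool the article relies on) to check whether $AX = 0$ has only the trivial solution by computing the null space of $A$; the matrices are made up purely for demonstration.

```python
from sympy import Matrix

# The columns of A are linearly independent exactly when A X = 0 has only the
# trivial solution X = 0, i.e. when the null space of A is trivial.
A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])      # 3 x 2 example with independent columns
print(A.nullspace())      # [] -> only X = 0 works, so the columns are independent

B = Matrix([[1, 2],
            [2, 4],
            [3, 6]])      # second column = 2 * first column
print(B.nullspace())      # one basis vector -> a nontrivial solution exists
```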

Linear Independence Using Determinant Test

How do we determine whether the rows or columns of a square matrix are linearly independent? This can be done easily using the determinant test: the rows (or columns) of a square matrix are linearly independent if and only if its determinant is nonzero. In other words, if the determinant of the matrix is not equal to zero, the matrix is said to be linearly independent; if the determinant comes out to be zero, the matrix is linearly dependent. This is quite a simple test: we just compute the determinant and check that it is not zero.
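
Below is a minimal sketch of this determinant test, assuming NumPy and a small square matrix chosen only as an example. Since a numerical determinant carries floating-point error, the comparison uses a tolerance rather than an exact check against zero.

```python
import numpy as np

# Determinant test: the rows (or columns) of a square matrix are linearly
# independent exactly when its determinant is nonzero.
M = np.array([[2, 1],
              [1, 3]], dtype=float)
det = np.linalg.det(M)
print(det)               # 5.0 -> nonzero, so the rows are linearly independent
print(abs(det) > 1e-9)   # compare against a tolerance, not exactly zero
```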

Linear Independence of Functions

A set of functions $f_{1}(x), f_{2}(x), ..., f_{n}(x)$ is said to be linearly independent if no function in the set can be expressed as a linear combination of the other functions in the set.

For instance, the set of functions $f_{1}(x) = x$, $f_{2}(x) = 3\sin^{2}x$ and $f_{3}(x) = 4\cos^{2}x$ is linearly independent, since no function in the set can be written as a linear combination of the others.

On the other hand, the functions $t^{2},\ 3t\ +\ 1,\ 3t^{2}\ +\ 6t\ +\ 2$ and $t^{3}$ are linearly dependent as we may write

$3t^{2}\ +\ 6t\ +\ 2$ = $3(t^{2})\ +\ 2(3t\ +\ 1)$
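
This dependence relation can be double-checked symbolically; the sketch below assumes SymPy and simply confirms that the stated combination simplifies to the same polynomial.

```python
from sympy import symbols, simplify

t = symbols('t')

# Verify the dependence relation quoted above:
# 3*t**2 + 6*t + 2 = 3*(t**2) + 2*(3*t + 1)
difference = (3*t**2 + 6*t + 2) - (3*(t**2) + 2*(3*t + 1))
print(simplify(difference))   # 0 -> the relation holds identically in t
```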

Projective Space of Linear Dependences

A set of vectors $u_{1}, u_{2}, ..., u_{n}$ is linearly dependent when there exist scalars $a_{1}, a_{2}, ..., a_{n}$, at least one of which is nonzero, such that

$a_{1}u_{1} + a_{2}u_{2} + ... + a_{n}u_{n}$ = $0$

Such a choice of scalars is called a linear dependence among the vectors.

If we identify two linear dependences whenever one is a nonzero multiple of the other, the set of all linear dependences among $u_{1}, u_{2}, ..., u_{n}$ forms a projective space. Under this identification, two such linear dependences define the same linear relationship among the vectors.

Examples

Have a look at the following examples.
Example 1: 

Check whether the set of vectors $(3,\ 1,\ 6),\ (2,\ 1,\ 4)$ and $(2,\ 0,\ 4)$ is linearly independent.

Solution:
 

Let's form a matrix whose rows are $(3,\ 1,\ 6),\ (2,\ 1,\ 4)$ and $(2,\ 0,\ 4)$.

$\begin{bmatrix} 3 & 1 & 6\\ 2 & 1 & 4\\ 2 & 0 & 4\end{bmatrix}$

The rows of this matrix are linearly independent if and only if its determinant is nonzero.

$\begin{vmatrix} 3 & 1 & 6\\ 2 & 1 & 4\\ 2 & 0 & 4\end{vmatrix}$

= $3(1\ .\ 4\ -\ 0\ .\ 4)\ -\ 1\ (2\ .\ 4\ -\ 2\ .\ 4)\ +\ 6(2\ .\ 0\ -\ 1\ .\ 2)$

= $3(4)\ -\ 1\ (0)\ +\ 6(-2)$

= $12\ -\ 12$ = $0$

Since the determinant is zero, the given set of vectors is linearly dependent.
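
As an optional cross-check of this computation (assuming NumPy is available), the determinant can also be evaluated numerically:

```python
import numpy as np

# Cross-check of Example 1: the determinant of the matrix of row vectors.
M = np.array([[3, 1, 6],
              [2, 1, 4],
              [2, 0, 4]], dtype=float)
print(np.linalg.det(M))   # approximately 0 -> the vectors are linearly dependent
```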
Example 2: 

Prove that the functions $e^{x}$ and $e^{2x}$ are linearly independent for real variable $x$.

Solution: 

Let's form a linear combination of the given functions with real coefficients, say $a$ and $b$, and set it equal to zero for all real $x$:

$a\ e^{x}\ +\ be^{2x}$ = $0$

For linear independence, we are required to prove that $a$ = $b$ = $0$.

We have

$a\ e^{x}\ +\ be^{2x}$ = $0$

Dividing this by $e^{x}$ (since it can never be zero), we get

$a\ +\ be^{x}$ = $0$

$be^{x}$ = $-a$

As we can see, $be^{x}$ is equal to the constant term $-a$. It implies that $be^{x}$ must be independent of $x$. Since $e^{x}$ is not a constant function, this forces $b$ = $0$.

It then follows that $a$ must be equal to zero as well.

Hence, the given functions are linearly independent.
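
As an optional cross-check of this argument (not the proof itself), the sketch below assumes SymPy and evaluates the relation $ae^{x} + be^{2x} = 0$ at the two sample points $x = 0$ and $x = \ln 2$, which already forces $a = b = 0$.

```python
from sympy import symbols, exp, log, Eq, solve

a, b = symbols('a b')

# If a*e^x + b*e^(2x) = 0 for every real x, it must hold in particular at
# x = 0 and x = log(2).
eq_at_0    = Eq(a*exp(0) + b*exp(0), 0)              # a + b = 0
eq_at_log2 = Eq(a*exp(log(2)) + b*exp(2*log(2)), 0)  # 2a + 4b = 0
print(solve([eq_at_0, eq_at_log2], [a, b]))          # {a: 0, b: 0}
```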