A tool for determining whether a set of functions is linearly independent or not.
For Functions of One Variable:
For a given set of functions {g1(x), g2(x), g3(x) ... gn(x)} the Wronskian is defined by
	| g1       g2       g3       ...  gn       |
	| g1'      g2'      g3'      ...  gn'      |
W =	| g1''     g2''     g3''     ...  gn''     |
	| ........................................ |
	| g1^(n-1) g2^(n-1) g3^(n-1) ...  gn^(n-1) |

Which is interpreted as the determinant of the square matrix formed by n rows, the first row consisting of the functions in question, the second row consisting of their first derivatives, the third row consisting of their second derivatives, and so on, up to the nth row consisting of their (n-1)th derivatives.

If the Wronskian is nonzero for some value of x in the domain of {g1, g2, g3...gn}, then the functions are linearly independent. Be careful with the converse: linear dependence does force W = 0 for all x, but W = 0 everywhere does not, on its own, guarantee dependence. The standard counterexample is {x^2, x|x|}, which is linearly independent on the real line even though its Wronskian vanishes everywhere. The converse does hold, however, when the functions are all solutions of the same linear homogeneous differential equation.

It is also possible to test for linear independence on a given interval by considering only values of x in that interval: if W is nonzero at some point of an interval I, then the set of functions is linearly independent on I (and if the functions are linearly dependent on I, then W = 0 throughout I).

For information on how to evaluate this determinant, see determinant.
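If you'd like to see the test in action, here's a small sketch in Python using sympy. The helper name wronskian_det and the example function sets are my own, not part of the writeup:

```python
# Sketch: compute the Wronskian of a set of functions of one variable
# by building the matrix of successive derivatives and taking its det.
import sympy as sp

x = sp.symbols('x')

def wronskian_det(funcs, var):
    """Determinant of the n x n matrix whose i-th row holds the
    i-th derivatives of the functions (i = 0 .. n-1)."""
    n = len(funcs)
    M = sp.Matrix(n, n, lambda i, j: sp.diff(funcs[j], var, i))
    return sp.simplify(M.det())

# {sin x, cos x}: W = sin*(-sin) - cos*cos = -1, nonzero => independent
print(wronskian_det([sp.sin(x), sp.cos(x)], x))   # -1
# {1, x, x^2}: rows [1, x, x^2], [0, 1, 2x], [0, 0, 2]; W = 2
print(wronskian_det([1, x, x**2], x))             # 2
# {x, 2x}: scalar multiples, so W = 0 everywhere
print(wronskian_det([x, 2*x], x))                 # 0
```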

For Vector Functions:
For a set of n column vectors {x1(t), x2(t), x3(t) ... xn(t)}, each with n elements, the Wronskian is defined by:
	| x11  x21  x31 ... xn1 |
	| x12  x22  x32 ... xn2 |
W =	| x13  x23  x33 ... xn3 |
	| ..................... |
	| x1n  x2n  x3n ... xnn |

In this case, the Wronskian is simply the determinant of the matrix formed by combining the individual column vectors. (Note however, that there must be n column vectors each with n rows because the determinant is only defined for square matrices.)

The same rules for determining linear independence or dependence apply as for functions of one variable. If W is nonzero at some point t of an interval I, the set of vectors is linearly independent on I.
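A quick sketch of the vector case, again with sympy; the example vectors are my own choice:

```python
# Sketch: the Wronskian of n vector functions (each with n entries) is
# the determinant of the matrix whose columns are those vectors.
import sympy as sp

t = sp.symbols('t')
x1 = sp.Matrix([sp.exp(t), sp.exp(t)])      # column vector x1(t)
x2 = sp.Matrix([sp.exp(-t), -sp.exp(-t)])   # column vector x2(t)

W = sp.simplify(sp.Matrix.hstack(x1, x2).det())
print(W)   # -2 : nonzero for every t, so x1 and x2 are independent
```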

Wronskians are handy for dealing with some second order linear ordinary differential equations. For example, consider the equation y'' + p(t)y' + q(t)y = 0 (*)
where p and q are nonconstant functions that are everywhere nonzero.

Calculating the Wronskian:
The Wronskian of 2 linearly independent solutions y1 and y2 is W = y1y2' - y1'y2.

If we want to be able to do anything useful with the Wronskian, we're going to want to be able to calculate it without knowing 2 linearly independent solutions.

Consider W'(t) = y1y2'' + y1'y2' - y1'y2' - y1''y2 = y1y2'' - y1''y2,
then W'(t) + pW(t) = y1(y2'' + py2') - y2(y1'' + py1').
But since y1 and y2 are solutions of (*), we have y2'' + py2' = -qy2 and y1'' + py1' = -qy1,
thus W'(t) + pW(t) = -y1y2q + y2y1q = 0
We now have a first order linear equation for W, which causes us to rub our hands with glee and shout "Huzzah", as we know how to solve these:
W(t) = k exp(-∫p(s)ds), with k constant (for our purposes nonzero). This result is known as Abel's formula, and it also shows that the Wronskian of two independent solutions is everywhere nonzero.
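To see the formula in action, here's a sympy check on a concrete equation of my own choosing: t^2y'' + ty' - y = 0, which normalizes to p(t) = 1/t and has solutions y1 = t, y2 = 1/t:

```python
# Sketch check of W' + pW = 0 and W = k*exp(-∫p ds) (Abel's formula)
# on the equation y'' + (1/t)y' - (1/t^2)y = 0, solutions y1 = t, y2 = 1/t.
import sympy as sp

t = sp.symbols('t', positive=True)
p = 1/t
y1, y2 = t, 1/t

W = sp.simplify(y1*sp.diff(y2, t) - sp.diff(y1, t)*y2)
print(W)                                  # -2/t
# W satisfies the first order equation W' + pW = 0 ...
print(sp.simplify(sp.diff(W, t) + p*W))   # 0
# ... and matches k*exp(-∫p ds) with k = -2:
print(sp.simplify(-2*sp.exp(-sp.integrate(p, t))))   # -2/t
```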


We now have a nice shiny expression for the Wronskian, and I can tell some of you just can't wait to taste its awesome power. One use is if, by luck, the inspiration of the Holy Spirit or some other method, you have managed to guess what y1 is, and you would like to find a second linearly independent solution, as you will then have all the solutions of the equation (the set of solutions of an nth order linear homogeneous differential equation is a vector space of dimension n). To do this, you just need to write down what the Wronskian is again, divided through by y1^2:
y2'/y1 - y1'y2/y1^2 = W(t)/y1^2
Noticing that the left hand side is the derivative of y2/y1 yields y2(t) = y1(t) ∫ W(s)/y1(s)^2 ds, integrating from some fixed point a up to t.
Of course, sometimes this integral will be so horrible that you've more or less wasted your time, but you will probably be able to obtain a series expansion by expanding the integrand. You could also just define a new function to be an antiderivative of W(t)/y1(t)^2.
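Here's a sketch of this reduction-of-order step, reusing the same example equation as before (my own choice, not from the writeup): y'' + (1/t)y' - (1/t^2)y = 0, with known solution y1 = t:

```python
# Sketch: given one solution y1, recover a second one from
# y2 = y1 * ∫ W(s)/y1(s)^2 ds, with W from Abel's formula (taking k = 1).
import sympy as sp

t = sp.symbols('t', positive=True)
y1 = t
W = sp.simplify(sp.exp(-sp.integrate(1/t, t)))   # p = 1/t, so W = 1/t

y2 = sp.simplify(y1 * sp.integrate(W / y1**2, t))
print(y2)   # -1/(2*t): a constant multiple of the second solution 1/t

# Sanity check: y2 really solves y'' + (1/t)y' - (1/t^2)y = 0.
lhs = sp.diff(y2, t, 2) + sp.diff(y2, t)/t - y2/t**2
print(sp.simplify(lhs))   # 0
```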

Another application of Wronskians is guessing those pesky particular solutions. Our equation is now y'' + p(t)y' + q(t)y = h(t). As before, we'll have p and q everywhere nonzero and, to make things interesting, nonconstant. I'll also assume h is everywhere nonzero (if p and q are constant you should try functions "like" h). I'll assume that, by hook or by crook, you've solved the homogeneous equation, and so you have 2 linearly independent solutions y1 and y2. If h = 0, then the solutions are the functions y = λy1 + μy2, with λ and μ constants. The idea behind this method (variation of parameters) is to try and find a particular solution that is a combination of y1 and y2 with nonconstant coefficients, i.e. y = f y1 + g y2, for some functions f and g of t. To make this useful, we are going to add an extra condition on f and g: f'y1 + g'y2 = 0. We then have:
y=f y1+g y2
y'=f y1'+g y2' (using our condition on f and g)
y''=f' y1' +g' y2'+f y1''+g y2''

this gives y'' + py' + qy = f(y1'' + py1' + qy1) + g(y2'' + py2' + qy2) + f'y1' + g'y2' = h
y1 and y2 both solve the homogeneous equation, so the first 2 terms disappear. We are left with:
f'y1' + g'y2' = h
Multiplying through by W/h gives:
(f'W/h)y1' + (g'W/h)y2' = W
Identifying this with the expression of the Wronskian in terms of y1, y1', y2 and y2' (W = y1y2' - y1'y2, remembering our condition f'y1 + g'y2 = 0) gives:
f' = -y2h/W and g' = y1h/W, hence
y = -y1 ∫ y2(s)h(s)/W(s) ds + y2 ∫ y1(s)h(s)/W(s) ds
Which gives you a particular solution. Use it with care.
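Here's a hedged sketch of that recipe on a concrete example of my own choosing: y'' + (1/t)y' - (1/t^2)y = t, whose homogeneous solutions are y1 = t and y2 = 1/t:

```python
# Sketch of variation of parameters: y_p = f*y1 + g*y2 with
# f' = -y2*h/W and g' = y1*h/W, on an example with h(t) = t.
import sympy as sp

t = sp.symbols('t', positive=True)
y1, y2 = t, 1/t
h = t
W = sp.simplify(y1*sp.diff(y2, t) - sp.diff(y1, t)*y2)   # -2/t

f = sp.integrate(-y2*h/W, t)    # f' = -y2 h / W
g = sp.integrate(y1*h/W, t)     # g' =  y1 h / W
yp = sp.simplify(f*y1 + g*y2)
print(yp)   # t**3/8

# Check that yp really is a particular solution:
lhs = sp.diff(yp, t, 2) + sp.diff(yp, t)/t - yp/t**2
print(sp.simplify(lhs - h))   # 0
```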

In case you were wondering, the Wronskian is named after the Polish mathematician Józef Hoëne-Wroński.
