In the last post, we saw the motivation for the question of convergence to equilibrium. This post continues with a method which gives not only a qualitative result (does the trajectory converge to equilibrium?) but (usually) also a quantitative result (how fast is the convergence?).
Let us start with the classic Gronwall inequality: assume that $y\colon [0,+\infty) \to \mathbb{R}$ is a differentiable function. If
$$y'(t) \le a(t)\,y(t) \quad \text{for all } t \ge 0,$$
then
$$y(t) \le y(0)\exp\left(\int_0^t a(s)\,ds\right) \quad \text{for all } t \ge 0.$$
In particular, when $a(t) \le -\lambda$ for some constant $\lambda > 0$, then $y$ decays exponentially, $y(t) \le y(0)\,e^{-\lambda t}$.
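As a quick sanity check, the Gronwall bound in the constant-coefficient case $y' = -\lambda y$ can be verified numerically. Here is a minimal Python sketch; the values of $\lambda$, $y_0$ and the step size are arbitrary choices for illustration (forward Euler happens to respect the bound here because $1 - x \le e^{-x}$):

```python
import math

# Hypothetical parameters for illustration only.
lam, y0, h, steps = 0.5, 1.0, 1e-3, 10_000

# Integrate y' = -lam * y by forward Euler and check the
# Gronwall bound y(t) <= y(0) * exp(-lam * t) at every step.
y, t = y0, 0.0
for _ in range(steps):
    y += h * (-lam) * y   # one Euler step of y' = -lam * y
    t += h
    assert y <= y0 * math.exp(-lam * t)  # Gronwall bound holds

print(y, y0 * math.exp(-lam * t))  # numerical value vs. the bound
```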
A heat equation
We continue to see how to show the convergence to equilibrrium fails gracefully—rather, convergence to equilibrium—for a heat equation with homogeneous Neumann boundary condition: let $\Omega \subset \mathbb{R}^n$ be a bounded domain with smooth boundary $\partial\Omega$. Consider the heat equation for $u = u(x,t)$:
$$\partial_t u = \Delta u \ \text{ in } \Omega\times(0,+\infty), \qquad \partial_\nu u = 0 \ \text{ on } \partial\Omega\times(0,+\infty), \qquad u(x,0) = u_0(x) \ \text{ in } \Omega,$$
where $\partial_\nu u = \nabla u \cdot \nu$ denotes the outward normal derivative.
An equilibrium $u_\infty$ satisfies $\Delta u_\infty = 0$ in $\Omega$ and $\partial_\nu u_\infty = 0$ on $\partial\Omega$. Then we see that any constant function is an equilibrium.
However, note that from $\partial_t u = \Delta u$ we get, after integrating over $\Omega$,
$$\frac{d}{dt}\int_\Omega u\,dx = \int_\Omega \Delta u\,dx = \int_{\partial\Omega} \partial_\nu u\,dS.$$
Using the homogeneous Neumann boundary condition we have $\int_{\partial\Omega} \partial_\nu u\,dS = 0$. Then it follows that
$$\int_\Omega u(x,t)\,dx = \int_\Omega u_0(x)\,dx$$
for all $t \ge 0$.
This is called the mass conservation (or conservation of mass) of the heat equation. This gives us a hint that the equilibrium should also satisfy the mass conservation, i.e.
$$\int_\Omega u_\infty\,dx = \int_\Omega u_0\,dx,$$
or equivalently $u_\infty = \frac{1}{|\Omega|}\int_\Omega u_0(x)\,dx$.
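Mass conservation survives discretisation when the scheme is written in flux form, since the interior fluxes telescope and the boundary fluxes are set to zero by the Neumann condition. A minimal 1-D finite-difference sketch on $\Omega = (0,1)$ (grid size, time step and initial datum are hypothetical choices):

```python
import numpy as np

# Explicit finite-volume scheme for u_t = u_xx on (0,1) with
# homogeneous Neumann BCs. Hypothetical parameters for illustration.
N = 100
dx = 1.0 / N
dt = 0.4 * dx**2                        # stable explicit time step
x = (np.arange(N) + 0.5) * dx           # cell centres
u = 1.0 + np.cos(np.pi * x)             # initial datum u_0

mass0 = u.sum() * dx                    # discrete mass of u_0
for _ in range(5000):
    flux = np.zeros(N + 1)              # fluxes at cell interfaces
    flux[1:-1] = (u[1:] - u[:-1]) / dx  # interior fluxes
    # flux[0] = flux[-1] = 0 encodes the Neumann boundary condition
    u += dt * (flux[1:] - flux[:-1]) / dx

print(mass0, u.sum() * dx)  # the discrete mass is conserved
```

The update only moves mass between neighbouring cells, so the total $\sum_j u_j\,\Delta x$ is conserved up to floating-point rounding.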
We now aim to prove the convergence of $u(t)$ to $u_\infty$ as $t \to +\infty$.
Denote by $\|\cdot\|$ the usual norm of $L^2(\Omega)$. We define an entropy functional
$$E[u](t) = \|u(t) - u_\infty\|^2 = \int_\Omega |u(x,t) - u_\infty|^2\,dx.$$
Computing the time derivative of $E[u]$, using the equation, an integration by parts and the Neumann boundary condition, we have
$$\frac{d}{dt}E[u](t) = 2\int_\Omega (u - u_\infty)\,\partial_t u\,dx = 2\int_\Omega (u - u_\infty)\,\Delta u\,dx = -2\int_\Omega |\nabla u|^2\,dx.$$
We call the quantity $D[u] := -\frac{d}{dt}E[u] = 2\int_\Omega |\nabla u|^2\,dx$ the entropy dissipation (or entropy production). By the Poincaré(–Wirtinger) inequality $\|u - \bar{u}\|^2 \le C_P \|\nabla u\|^2$ with $\bar{u} = \frac{1}{|\Omega|}\int_\Omega u\,dx$, and since mass conservation gives $\bar{u} = u_\infty$, we obtain
$$D[u] = 2\int_\Omega |\nabla u|^2\,dx \ge \frac{2}{C_P}\int_\Omega |u - u_\infty|^2\,dx = \frac{2}{C_P}\,E[u].$$
This is called an entropy-entropy dissipation estimate. Therefore, we have
$$\frac{d}{dt}E[u](t) \le -\frac{2}{C_P}\,E[u](t),$$
thus, by the Gronwall lemma,
$$E[u](t) \le E[u](0)\,e^{-\frac{2}{C_P}t} \quad \text{for all } t \ge 0.$$
Theorem. The trajectory $u(t)$ of the heat equation with homogeneous Neumann boundary condition converges exponentially in $L^2(\Omega)$ to $u_\infty = \frac{1}{|\Omega|}\int_\Omega u_0\,dx$,
$$\|u(t) - u_\infty\| \le \|u_0 - u_\infty\|\,e^{-t/C_P},$$
where $C_P$ is the constant in the Poincaré inequality.
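The rate is sharp and can be seen numerically. On $\Omega = (0,1)$ the optimal Poincaré constant (in the convention $\|u - \bar{u}\|^2 \le C_P\|\nabla u\|^2$) is $C_P = 1/\pi^2$, so for the mean-zero initial datum $u_0(x) = \cos(\pi x)$ the entropy should decay like $e^{-2\pi^2 t}$; in fact exactly so, since $u_0$ is an eigenfunction of the Neumann Laplacian. A sketch with hypothetical discretisation parameters:

```python
import numpy as np

# Explicit scheme for u_t = u_xx on (0,1), homogeneous Neumann BCs,
# with u_0(x) = cos(pi x), so u_inf = 0 and E(t) = E(0) exp(-2 pi^2 t).
N = 200
dx = 1.0 / N
dt = 0.25 * dx**2
x = (np.arange(N) + 0.5) * dx
u = np.cos(np.pi * x)                    # mean-zero initial datum

E0 = np.sum(u**2) * dx                   # entropy at t = 0
t, T = 0.0, 0.1
while t < T:
    du = np.empty_like(u)
    du[1:-1] = (u[2:] - 2*u[1:-1] + u[:-2]) / dx**2
    du[0]  = (u[1] - u[0]) / dx**2       # Neumann BC via ghost cells
    du[-1] = (u[-2] - u[-1]) / dx**2
    u += dt * du
    t += dt

E = np.sum(u**2) * dx
print(E / E0, np.exp(-2 * np.pi**2 * t))  # observed vs. predicted decay
```

The observed ratio $E(t)/E(0)$ matches $e^{-2\pi^2 t}$ up to $O(\Delta x^2 + \Delta t)$ discretisation error.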
We now state the basic idea of the entropy method for an evolution equation of the form
$$\partial_t u = A(u), \qquad u(0) = u_0,$$
which has a unique equilibrium $u_\infty$ and a set of mass conservation laws.
We aim to find
(i) an entropy functional $E[u] \ge 0$ which has the property $E[u] = 0$ if and only if $u = u_\infty$, and which has
(ii) nonnegative entropy dissipation $D[u] := -\frac{d}{dt}E[u] \ge 0$ along trajectories; and
(iii) an entropy-entropy dissipation estimate of the form
$$D[u] \ge \lambda\,E[u] \quad \text{for some } \lambda > 0.$$
If we have (i)-(iii), then it follows first by Gronwall's lemma that $E[u](t) \le E[u](0)\,e^{-\lambda t} \to 0$ exponentially, and then by (i) that $u(t) \to u_\infty$ exponentially (provided that, as in the quadratic entropy for the heat equation, $E$ controls a distance from $u$ to $u_\infty$).
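The three ingredients can be monitored along a discrete trajectory. The following Python sketch tracks $E[u]$ and $D[u]$ along a discrete heat flow on $(0,1)$ with Neumann boundary conditions and checks the estimate $D[u] \ge \lambda E[u]$ with $\lambda = 2\pi^2 = 2/C_P$; the grid, time step and initial datum are hypothetical choices, and the small tolerance absorbs the $O(\Delta x^2)$ discretisation error:

```python
import numpy as np

# Discrete heat flow on (0,1), homogeneous Neumann BCs, and a running
# check of (i)-(iii): E[u] = ||u - u_inf||^2, D[u] = 2||grad u||^2,
# and the entropy-entropy dissipation estimate D >= lam * E.
N = 200
dx = 1.0 / N
dt = 0.25 * dx**2
x = (np.arange(N) + 0.5) * dx
u = 1.0 + 0.3 * np.cos(np.pi * x) + 0.1 * np.cos(3 * np.pi * x)
u_inf = np.sum(u) * dx                 # conserved mean = equilibrium

lam = 2 * np.pi**2                     # lam = 2 / C_P on (0,1)
for _ in range(2000):
    E = np.sum((u - u_inf)**2) * dx    # (i) entropy
    grad = (u[1:] - u[:-1]) / dx       # differences at cell interfaces
    D = 2 * np.sum(grad**2) * dx       # (ii) nonnegative dissipation
    assert D >= lam * E * 0.999        # (iii), up to O(dx^2) error
    du = np.empty_like(u)
    du[1:-1] = (u[2:] - 2*u[1:-1] + u[:-2]) / dx**2
    du[0]  = (u[1] - u[0]) / dx**2     # Neumann BC via ghost cells
    du[-1] = (u[-2] - u[-1]) / dx**2
    u += dt * du

print(E, D)  # entropy and dissipation near the end of the run
```

Since the estimate holds uniformly along the flow, Gronwall's lemma applies at every time, which is exactly the mechanism behind the exponential decay above.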
(Some) Good things about the entropy method
(i) The entropy method is based on a functional inequality (the entropy-entropy dissipation estimate) which is usually not directly linked to the evolution system. That makes the method quite robust to generalisation: once the functional inequality is established, it can be used in any other system which has a similar kind of entropy functional and entropy dissipation.
(ii) The entropy method usually gives an explicit rate of convergence. If we can prove the functional inequality with an explicit constant, then the convergence follows with an explicit rate. This is the quantitative result that was mentioned before.