#!/usr/bin/env python
# coding: utf-8
# # Appendix: a gentle introduction to differential equations
#
#
#
# ## What is a differential equation?
#
#
# A differential equation is a relation between an unknown function and its
# derivatives.
#
# In the general case, we speak of an equation of order $n$ if we have a
# relation:
#
# $$F(y^{(n)}(x),y^{(n-1)}(x),\dots,y(x),x) = 0,$$
#
# where:
#
# - $y^{(n)}(x) = \frac{d^n y(x)}{dx^n},$
# - $y(x)$ is the unknown function, or dependent variable,
# - $x$ is called the independent variable.
#
# We may also have a system of $m$ equations for $m$ unknown functions
# $y_i.$ A special case is a system of $m$ first-order equations for $m$
# functions. It turns out that a single equation of order $n$ can be
# converted into an equivalent system of $n$ first-order equations.
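#
# For example, assuming the equation can be solved for the highest derivative,
# a third-order equation $y'''(x) = f(y''(x), y'(x), y(x), x)$ can be rewritten
# as a first-order system by introducing the auxiliary functions $y_1 = y$,
# $y_2 = y'$, $y_3 = y''$:
#
# $$\begin{cases} y_1' = y_2 \\ y_2' = y_3 \\ y_3' = f(y_3, y_2, y_1, x) \end{cases}$$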
#
# ## Example: Newton equation for one particle in one dimension
#
#
# The particle's motion is described by:
#
# $$m a = F$$
#
# Acceleration is the second derivative of position with respect to time, and
# the force is in general some function of the position $x$, the velocity, and
# time. So we have:
#
# $$m \ddot x = F(\dot x, x, t)$$
#
# Let us now introduce a new function $v = \frac{dx(t)}{dt}$. Substituting into
# the previous equation, we can write:
#
# $$\begin{cases} \dot x = v \\ \dot v = \frac{F(v,x,t)}{m}. \end{cases}$$
#
# We see that from one second-order equation we have obtained a system of two
# first-order equations.
#
# First-order equations are often written in a form in which the derivative
# stands on the left-hand side of the equals sign and an expression depending
# on the function stands on the right:
#
# $$\underbrace{\frac{dx}{dt}}_{\text{derivative }} = \underbrace{f(x,t)}_{\text{Right Hand Side}}$$
#
# ## Geometric interpretation of differential equations
#
#
# Consider the system of two equations:
#
# $$\begin{cases} \dot x = f(x,y) \\ \dot y = g(x,y) \end{cases}.$$
#
# This is a so-called two-dimensional autonomous system of ordinary
# differential equations. Autonomy means that the right-hand sides do not
# depend on time (i.e. the independent variable). An example of such a system
# is the motion of a particle in one dimension under time-independent forces.
#
# The equations of the above system can be approximated by replacing the
# derivatives with difference quotients:
#
# $$\begin{cases} \frac{x(t+h)-x(t)}{h} = f(x,y) \\ \frac{y(t+h)-y(t)}{h} = g(x,y) \end{cases},$$
#
# Multiplying each equation by $h$ and moving the values of the dependent
# variables at time $t$ to the right-hand side, we get:
#
# $$\begin{cases} x(t+h) = x(t) + h \cdot f(x,y) \\ y(t+h) = y(t) +h \cdot g(x,y). \end{cases}$$
#
# According to the definition of the derivative, in the limit $h\to 0$ the
# expressions $f(x,y)$ and $g(x,y)$ can be evaluated at any time between $t$
# and $t+h$. For simplicity, let us assume that we take the time $t$.
#
#
#
# Comment: this choice leads to the so-called explicit algorithm; if we took
# e.g. the time $t+h$ instead, we would obtain an implicit algorithm, and
# performing a step would require solving algebraic equations.
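#
# The explicit scheme above can be sketched in a few lines of plain Python (a
# minimal illustration independent of Sage; the helper `euler_step` and the
# particular linear field used here are our own choices):

```python
def euler_step(x, y, f, g, h):
    """One explicit Euler step for the system x' = f(x, y), y' = g(x, y)."""
    return x + h * f(x, y), y + h * g(x, y)

# an example linear vector field (the same one is plotted later in this appendix)
f = lambda x, y: -0.4 * x - 0.9 * y
g = lambda x, y: 0.9 * x - 0.4 * y

# starting from (1, 0), take a few steps of size h = 0.2
x, y = 1.0, 0.0
trajectory = [(x, y)]
for _ in range(5):
    x, y = euler_step(x, y, f, g, 0.2)
    trajectory.append((x, y))
```

# Each element of `trajectory` is one vertex of the polygonal approximation of
# the solution curve.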
#
#
#
# This system has an interesting interpretation:
#
# - firstly, notice that the pair of functions $(f,g)$ determines a vector
# field on the plane,
# - secondly, these equations give us a recipe for obtaining, from the values
# of the functions at time $t$, their values at the "next" time $t+h$, which
# can be useful for reconstructing the curve $(x(t),y(t))$.
#
# ## Vector field
#
#
# A vector field is a function that assigns a vector to each point of space.
# If the space is e.g. $\mathbb{R}^2$, it is a function consisting of two
# scalar functions:
#
# $$\vec F(x,y) = \begin{bmatrix} f(x,y) \\ g(x,y) \end{bmatrix}$$
#
# Such a vector field can be visualized by drawing arrows at a certain number
# of points on the plane. A widely known example of a vector field is the wind
# velocity field.
#
# In Sage, we can draw a vector field with `plot_vector_field`:
# In[ ]:
var('x y')
f(x,y) = -0.4*x - 0.9*y
g(x,y) = 0.9*x - 0.4*y
plt_v = plot_vector_field((f(x,y),g(x,y)),(x,-1.2,1.2),(y,-1.2,1.2),figsize=6,aspect_ratio=1)
plt_v
# ## Graphical solution of the system of two differential equations
#
#
# Using the approximate formulas derived above, which allow us to compute the
# solution of the system of differential equations at time $t+h$ knowing it at
# time $t$, we can try to sketch a solution based on the vector field plot. It
# is enough to move in small steps following the local direction of the
# arrows.
#
# Let's try to do it with the help of the algorithm:
#
# 1. we take a starting point at time $t$
# 2. we compute the point at time $t+h$
# 3. we draw the new point on the graph
# 4. we take the new point as the starting point
# 5. we go back to step 2.
# In[ ]:
x0,y0 = (1,0)
h = 0.2
# By executing this cell many times we obtain successive steps of the algorithm:
# In[ ]:
x1,y1 = x0+h*f(x0,y0),y0+h*g(x0,y0)
plt_v = plt_v + point((x0,y0)) + arrow2d( (x0,y0), (x0+h*f(x0,y0),y0+h*g(x0,y0) ),width=1,arrowsize=2,arrowshorten=-10,aspect_ratio=1 )
x0,y0 = x1,y1
plt_v
# In[ ]:
h = 0.2
x0,y0 = (1,0)
plts = [plot_vector_field((f(x,y),g(x,y)),\
(x,-1.2,1.2),(y,-1.2,1.2),\
figsize=(3,3),aspect_ratio=1)]
for i in range(5):
    x1,y1 = x0+h*f(x0,y0),y0+h*g(x0,y0)
    plt_v = plt_v + point((x0,y0)) + arrow( (x0,y0), (x0+h*f(x0,y0),y0+h*g(x0,y0)) ,width=1,arrowsize=2,arrowshorten=-10 )
    x0,y0 = x1,y1
    plts.append(plt_v)
# In[ ]:
plts[-1].show()
# animate(plts).show()
# We have the following conclusions:
#
# 1. The solution of a system of 2 first-order equations is a curve in the
# space $\mathbb{R}^2.$
# 2. The curve depends on the choice of the starting point.
# 3. Two solutions starting from different starting points may converge to
# one point, but **cannot intersect!**
# 4. Because we have an unlimited choice of starting points, and because of
# (3), the solution of a system of two equations is a two-parameter family of
# plane curves.
#
# A differential equation (or system of equations) together with an initial
# condition is called in mathematics a Cauchy problem. Point (3) is known as
# Picard's theorem on the existence and uniqueness of solutions of the Cauchy
# problem, and it is worth noting that it imposes some restrictions on the
# right-hand sides of the system of equations.
#
# ## Analytical solutions of differential equations
#
# Differential equations can be analyzed using the graphical method, and
# numerical values can be obtained to any desired accuracy using approximate
# methods. These methods do not restrict the form of the right-hand sides of
# the system in any way.
#
# Is it possible to obtain an analytical formula for the family of functions being
# the solution of the differential equation?
#
# This is difficult in the general case; however, there are several forms of
# differential equations for which we can always find an analytical solution.
# One such case is the separable first-order equation. Separability means that
# the right-hand side is a product of a function of $x$ and a function of
# $t$:
#
# $$\frac{dx}{dt} = f(x,t) = a(x)\cdot b(t).$$
# In this case we can rewrite the equation, treating the derivative as a
# quotient of differentials:
#
# $$\frac{dx}{a(x)} = b(t)\, dt$$
# and integrate both sides. Because the left-hand side does not explicitly
# contain time, the integration over $x$ is carried out as if $x$ were an
# independent variable.
#
# ### Example:
#
# $$\frac{dx}{dt} = - k x$$
#
# $$\frac{dx}{x} =-k dt$$
#
# $$\log(x(t)) =-k t + C$$
# assuming that $x>0$.
# Solving for $x$, we have:
#
# $$x(t) = e^{-kt +C}$$
# Let's see how the integration constant depends on the initial condition. Let
# $x(0)=x_0$, we have:
#
# $$x(t=0) = e^{-k0+C} =e^{C}.$$
# So we can write the solution with the initial condition $x(0)=x_0$ as:
#
# $$x(t) =x_0 e^{-kt}.$$
# Let's check whether this solution agrees with the one obtained by the approximate method:
# In[ ]:
k = 1.0
dt = 0.01
x0 = 1.2
X = x0
czas = 0   # czas = elapsed time
xt = [X]
ts = [0]
for i in range(500):
    X = X + dt*(-k*X)
    czas = czas + dt
    if not i%10:
        xt.append(X)
        ts.append(czas)
var('t')
p1 = plot( x0*exp(-k*t) ,(t,0,5),color='red',figsize=(5,2) )
p2 = point(zip(ts,xt))
p1 + p2
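# The same comparison can be made quantitatively in plain Python (a sketch
# with our own helper `euler_decay`; since the explicit Euler method is first
# order, the error should drop roughly tenfold when the step is made ten times
# smaller):

```python
import math

def euler_decay(x0, k, dt, t_end):
    """Integrate x' = -k*x with the explicit Euler method from 0 to t_end."""
    x = x0
    for _ in range(round(t_end / dt)):
        x = x + dt * (-k * x)
    return x

# compare with the analytical solution x(t) = x0 * exp(-k*t) at t = t_end
x0, k, t_end = 1.2, 1.0, 5.0
exact = x0 * math.exp(-k * t_end)
err_coarse = abs(euler_decay(x0, k, 0.01, t_end) - exact)
err_fine = abs(euler_decay(x0, k, 0.001, t_end) - exact)
```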
# ## Solving ODEs using `desolve_odeint`
#
#
# There are several algorithms built into Sage that solve differential
# equations much more accurately and efficiently. Without going into the
# details of their implementation, it is worth learning how to use them.
#
# A good choice in the general case is the function `desolve_odeint`:
#
# `desolve_odeint(right-hand sides of the equations, initial conditions, times, dependent variables)`
#
#
# For our example, the use of this procedure looks as follows:
# In[ ]:
f = -k*x
ic = 1.2
t = srange(0,5.01,0.5)
sol = desolve_odeint(f,ic,t,x)
p = points(zip(t,sol[:,0]),size=40,color='green')
(p1 + p).show()
# The solution is returned as a matrix (in fact a numpy `np.array`) in which,
# for a system of $n$ equations, each row contains the values of the $n$
# variables at successive times.
# In our case, we have one equation:
# In[ ]:
sol.shape
# In[ ]:
type(sol)
# ### Example: harmonic oscillator
#
# A system of two differential equations corresponding to the motion of a
# particle in a one-dimensional potential
#
# $$U(x) = \frac{1}{2} k x^2$$
#
# Newton's equation:
#
# $$m \ddot x = m a = F = -U'(x) = -k x$$
#
# which can be written as:
#
# $$\begin{cases} \dot x = v \\ \dot v = - k x \end{cases}$$
# In[ ]:
var('t')
var('x, v')
k = 1.2
times = srange(0.0, 11.0, 0.025, include_endpoint=True)
sol = desolve_odeint([v, -k*x], [1,0], times, [x,v])
# The solution is a numpy array (see [Introduction to
# numpy](https://sage2.icse.us.edu.pl/home/pub/114/)), which can be
# conveniently and efficiently indexed using the "slicing" technique, e.g.
# In[ ]:
sol[::200, :]
# The parametric dependence $(x(t),v(t))$ can be presented on the $(x,v)$
# plane.
#
# Position and velocity as functions of time are periodic:
# In[ ]:
var('x v')
k = 1.2
sol = desolve_odeint([v, -k*x], [3.1,0], times, [x,v])
px = line(zip(times,sol[:,0]),figsize=(5,2))
pv = line(zip(times,sol[:,1]),figsize=(5,2),color='red')
px+pv
# This system is known as the harmonic oscillator, and we know that the
# solution for the initial condition $x(0)=1$, $v(0)=0$ has the form:
#
# $$x(t) = \cos(\sqrt{k}t), \quad v(t) = -\sqrt{k}\sin(\sqrt{k}t),$$
#
# so we can compare the result of the approximate method with the analytical
# solution.
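#
# As a quick plain-Python sanity check (our own sketch, independent of Sage),
# one can verify with central difference quotients that $x(t)=\cos(\sqrt{k}t)$
# and $v(t)=-\sqrt{k}\sin(\sqrt{k}t)$ indeed satisfy $\dot x = v$,
# $\dot v = -kx$:

```python
import math

k = 1.2
x_sol = lambda t: math.cos(math.sqrt(k) * t)
v_sol = lambda t: -math.sqrt(k) * math.sin(math.sqrt(k) * t)

h = 1e-6  # step of the central difference quotient
for t in [0.0, 0.5, 1.3, 2.7]:
    dx = (x_sol(t + h) - x_sol(t - h)) / (2 * h)  # numerical x'(t)
    dv = (v_sol(t + h) - v_sol(t - h)) / (2 * h)  # numerical v'(t)
    assert abs(dx - v_sol(t)) < 1e-6              # checks x' = v
    assert abs(dv + k * x_sol(t)) < 1e-6          # checks v' = -k x
```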
#
# An analytical solution can also be obtained using the Sage function
# desolve, which solves the differential equations symbolically:
# In[ ]:
var('t k')
assume(k>0)
x = function('x')(t)
de = diff(x,t,2) == -k*x
desolve(de, x, ivar=t)
# Even if we know the form of the solution of a differential equation, it can
# still be worth using desolve; one reason is the correct application of the
# initial condition. Take, for example, the harmonic oscillator with the
# initial conditions $x(0)=x_0$ and $v(0)=v_0$:
# In[ ]:
var('t k')
assume(k>0)
x = function('x')(t)
de = diff(x,t,2) == -k*x
var('v0,x0')
show( desolve(de, x,ics=[0,x0,v0],ivar=t))
# Let's compare the numerical and analytical solutions for the initial
# condition $x_0, v_0 = 1, 0$:
# In[ ]:
var('t x v')
k = 1.22
sol = desolve_odeint([v, -k*(x)], [1.,0], times, [x,v])
px = line(zip(times,sol[:,0]),figsize=(5,2))
px+plot(cos(sqrt(k)*t),(t,0,10),color='green')
# In[ ]:
var('t')
pv = line(zip(times,sol[:,1]),figsize=(5,2),color='red')
pv+plot(-sqrt(k)*sin(sqrt(k)*t),(t,0,10),color='green')
# ### Example 2: mathematical pendulum:
#
# Newton's equation:
#
# $$m \ddot x = m a = F = -U'(x) = -k \sin(x)$$
#
# can be written in the form of a system of two ODEs:
#
# $$\begin{cases} \dot x = v \\ \dot v = - k \sin(x) \end{cases}$$
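#
# The pendulum equation is no longer linear, but for small angles
# $\sin(x)\approx x$, so its motion should stay close to the harmonic
# oscillator above. A plain-Python sketch illustrates this (the semi-implicit
# Euler scheme and the names below are our own choices, not part of Sage):

```python
import math

k = 1.2     # plays the role of g/l
dt = 0.001

def pendulum_step(x, v):
    """One semi-implicit Euler step for x' = v, v' = -k*sin(x)."""
    v = v - dt * k * math.sin(x)
    x = x + dt * v
    return x, v

# small initial angle: the pendulum should follow x(t) ~ x0*cos(sqrt(k)*t)
x, v = 0.01, 0.0
n_steps = 2000
for _ in range(n_steps):
    x, v = pendulum_step(x, v)
t = n_steps * dt
harmonic = 0.01 * math.cos(math.sqrt(k) * t)
```

# For larger amplitudes the pendulum period grows and the two solutions
# visibly drift apart.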