Commit 035fe13

Reorganized interpolation and integration (#370)
* Reorganized interpolation and integration
* Added tauchen
1 parent f0eeee8 commit 035fe13

File tree

11 files changed: +470 −368 lines


lectures/_toc.yml

Lines changed: 2 additions & 1 deletion

@@ -13,7 +13,8 @@ parts:
   numbered: true
   chapters:
   - file: more_julia/generic_programming
-  - file: more_julia/general_packages
+  - file: more_julia/auto_differentiation
+  - file: more_julia/quadrature_interpolation
   - file: more_julia/data_statistical_packages
   - file: more_julia/optimization_solver_packages
 - caption: Software Engineering

lectures/about_lectures.md

Lines changed: 10 additions & 11 deletions
@@ -45,10 +45,18 @@ While Julia has many features of a general purpose language, its specialization
 using Matlab or Fortran than using a general purpose language - giving it an advantage in being closer
 to both mathematical notation and direct implementation of mathematical abstractions.
 
+Julia has both a large number of useful, well written libraries and many incomplete poorly maintained proofs of concept.
+
+A major advantage of Julia libraries is that, because Julia itself is sufficiently fast, there is less need to mix in low level languages like C and Fortran.
+
+As a result, most Julia libraries are written exclusively in Julia.
+
+Not only does this make the libraries more portable, it makes them much easier to dive into, read, learn from and modify.
+
 ### A Word of Caution
 
 The disadvantage of specialization is that Julia tends to be used by domain experts, and consequently
-the ecosystem and language for non-mathematical/non-scientfic computing tasks is inferior to Python.
+the ecosystem and language for non-mathematical/non-scientific computing tasks is inferior to Python.
 
 Another disadvantage is that, since it tends to be used by experts and is on the cutting edge, the tooling is
 much more fragile and rudimentary than Matlab.

@@ -59,12 +67,6 @@ not expect the development tools to quite as stable, or to be comparable to Matl
 
 Nevertheless, the end-result will always be elegant and grounded in mathematical notation and abstractions.
 
-For these reasons, Julia is most appropriate at this time for researchers who want to:
-
-1. invest in a language likely to mature in the 3-5 year timeline
-1. use one of the many amazing packages that Julia makes possible (and are frequently impossible in other languages)
-1. write sufficiently specialized algorithms that the quirks of the environment are much less important than the end-result
-
 ## Advantages of Julia
 
 Despite the short-term cautions, Julia has both immediate and long-run advantages.

@@ -76,16 +78,13 @@ The advantages of the language itself show clearly in the high quality packages,
 - Interval Constraint Programming and rigorous root finding: [IntervalRootFinding.jl](https://github.com/JuliaIntervals/IntervalRootFinding.jl)
 - GPUs: [CuArrays.jl](https://github.com/JuliaGPU/CuArrays.jl)
 - Linear algebra for large-systems (e.g. structured matrices, matrix-free methods, etc.): [IterativeSolvers.jl](https://juliamath.github.io/IterativeSolvers.jl/dev/), [BlockBandedMatrices.jl](https://github.com/JuliaMatrices/BlockBandedMatrices.jl), [InfiniteLinearAlgebra.jl](https://github.com/JuliaMatrices/InfiniteLinearAlgebra.jl), and many others
-- Automatic differentiation: [Zygote.jl](https://github.com/FluxML/Zygote.jl) and [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl)
+- Automatic differentiation: [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) and [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl)
 
 These are in addition to the many mundane but essential packages available. While there are examples of these packages in other languages, no
 other language can achieve the combination of performance, mathematical notation, and composition that Julia provides.
 
 The composition of packages is especially important, and is made possible through Julia's use of something called [multiple-dispatch](https://en.wikipedia.org/wiki/Multiple_dispatch).
 
-The promise of Julia is that you write clean mathematical code, and have the same code automatically work with automatic-differentiation, interval arithmetic, and GPU arrays--all of which may be used in
-cutting edge algorithms in packages and combined seamlessly.
-
 ## Open Source
 
 All the computing environments we work with are free and open source.
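
The "composition via multiple dispatch" claim in the diff above can be made concrete with a toy sketch. Since this page's lecture code is Julia (where dispatch is built in), the following Python emulation is purely illustrative and not from the lecture: it dispatches on the types of *all* arguments via a registry, which is the mechanism Julia provides natively. All names (`defmethod`, `dispatch`, `combine`) are invented for this sketch.

```python
# Toy emulation of multiple dispatch: method lookup keyed on the full
# tuple of argument types, not just the first argument as in Python classes.
_methods = {}

def defmethod(name, *types):
    """Register an implementation of `name` for the given argument types."""
    def register(fn):
        _methods[(name, types)] = fn
        return fn
    return register

def dispatch(name, *args):
    """Call the implementation registered for these exact argument types."""
    fn = _methods.get((name, tuple(type(a) for a in args)))
    if fn is None:
        raise TypeError(f"no method {name} for {args!r}")
    return fn(*args)

@defmethod("combine", int, int)
def _(a, b):
    return a + b

@defmethod("combine", str, str)
def _(a, b):
    return a + " & " + b
```

In Julia, adding a new type plus a few method definitions lets existing generic code work with it unchanged; this registry sketch shows the lookup rule, not the performance characteristics (Julia specializes and compiles each method combination).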

lectures/dynamic_programming/jv.md

Lines changed: 1 addition & 1 deletion
@@ -137,7 +137,7 @@ and use Monte Carlo integration, or discretize.
 
 Here we will use [Gauss-Jacobi Quadrature](https://en.wikipedia.org/wiki/Gauss–Jacobi_quadrature) which is ideal for expectations over beta.
 
-See {doc}`general packages <../more_julia/general_packages>` for details on the derivation in this particular case.
+See {doc}`quadrature and interpolation <../more_julia/quadrature_interpolation>` for details on the derivation in this particular case.
 
 ```{code-cell} julia
 function gauss_jacobi(F::Beta, N)
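
The Gauss-Jacobi idea referenced above can be sketched outside the lecture. The lecture's `gauss_jacobi` is Julia; the following Python version (an assumption-laden sketch, not the lecture's derivation) maps Beta(a, b) on [0, 1] to the Jacobi weight (1−t)^(b−1)(1+t)^(a−1) on [−1, 1] using `scipy.special.roots_jacobi`, which requires SciPy.

```python
# Hypothetical sketch: approximate E[f(X)] for X ~ Beta(a, b) with
# Gauss-Jacobi quadrature. `beta_expectation` is an illustrative name.
from math import gamma
from scipy.special import roots_jacobi

def beta_expectation(f, a, b, N=32):
    """E[f(X)], X ~ Beta(a, b), via N Gauss-Jacobi nodes."""
    # Nodes/weights for the weight (1 - t)^(b-1) * (1 + t)^(a-1) on [-1, 1]
    t, w = roots_jacobi(N, b - 1.0, a - 1.0)
    x = (t + 1.0) / 2.0                      # map [-1, 1] -> [0, 1]
    B = gamma(a) * gamma(b) / gamma(a + b)   # beta function B(a, b)
    scale = 2.0 ** (1.0 - a - b) / B         # change-of-variables Jacobian / normalization
    return scale * sum(wi * f(xi) for wi, xi in zip(w, x))
```

Because the Beta density is absorbed into the quadrature weight, even smooth integrands converge rapidly; for example `beta_expectation(lambda x: x, 2, 3)` should recover the mean a/(a+b) = 0.4.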

lectures/getting_started_julia/julia_by_example.md

Lines changed: 2 additions & 0 deletions
@@ -930,6 +930,8 @@ until $| x^{n+1} - x^n|$ is below a tolerance
 
 For those impatient to use more advanced features of Julia, implement a version of Exercise 8(a) where `f_prime` is calculated with auto-differentiation.
 
+See {doc}`auto-differentiation <../more_julia/auto_differentiation>` for more.
+
 ```{code-cell} julia
 using ForwardDiff
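
The exercise above (Newton iteration with an auto-differentiated `f_prime`) uses ForwardDiff.jl in Julia. As a language-agnostic illustration of the same forward-mode idea, here is a minimal stdlib-only Python sketch with dual numbers; all names (`Dual`, `derivative`, `newton`) are invented for this sketch and nothing here is the lecture's code.

```python
# Forward-mode AD via dual numbers: a + b*eps with eps^2 = 0,
# where the `der` component propagates the derivative.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __rsub__(self, o):
        return Dual(o) - self
    def __mul__(self, o):
        o = self._lift(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.val * o.der + self.der * o.val)
    __rmul__ = __mul__

def derivative(f, x):
    """f'(x) by evaluating f on the seed Dual(x, 1)."""
    return f(Dual(x, 1.0)).der

def newton(f, x0, tol=1e-12, maxiter=100):
    """Newton's method, with f' computed automatically."""
    x = x0
    for _ in range(maxiter):
        step = f(x) / derivative(f, x)
        x -= step
        if abs(step) < tol:
            break
    return x
```

For instance `newton(lambda x: x * x - 2.0, 1.0)` converges to the positive root of x² − 2. ForwardDiff.jl works on the same principle but handles arbitrary Julia functions and higher dimensions.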

lectures/introduction_dynamics/finite_markov.md

Lines changed: 0 additions & 70 deletions
@@ -1156,69 +1156,6 @@ and return the list of pages ordered by rank.
 
 When you solve for the ranking, you will find that the highest ranked node is in fact `g`, while the lowest is `a`.
 
-(mc_ex3)=
-### Exercise 3
-
-In numerical work it is sometimes convenient to replace a continuous model with a discrete one.
-
-In particular, Markov chains are routinely generated as discrete approximations to AR(1) processes of the form
-
-$$
-y_{t+1} = \rho y_t + u_{t+1}
-$$
-
-Here ${u_t}$ is assumed to be i.i.d. and $N(0, \sigma_u^2)$.
-
-The variance of the stationary probability distribution of $\{ y_t \}$ is
-
-$$
-\sigma_y^2 := \frac{\sigma_u^2}{1-\rho^2}
-$$
-
-Tauchen's method {cite}`Tauchen1986` is the most common method for approximating this continuous state process with a finite state Markov chain.
-
-A routine for this already exists in [QuantEcon.jl](http://quantecon.org/quantecon-jl) but let's write our own version as an exercise.
-
-As a first step we choose
-
-* $n$, the number of states for the discrete approximation
-* $m$, an integer that parameterizes the width of the state space
-
-Next we create a state space $\{x_0, \ldots, x_{n-1}\} \subset \mathbb R$
-and a stochastic $n \times n$ matrix $P$ such that
-
-* $x_0 = - m \, \sigma_y$
-* $x_{n-1} = m \, \sigma_y$
-* $x_{i+1} = x_i + s$ where $s = (x_{n-1} - x_0) / (n - 1)$
-
-Let $F$ be the cumulative distribution function of the normal distribution $N(0, \sigma_u^2)$.
-
-The values $P(x_i, x_j)$ are computed to approximate the AR(1) process --- omitting the derivation, the rules are as follows:
-
-1. If $j = 0$, then set
-
-   $$
-   P(x_i, x_j) = P(x_i, x_0) = F(x_0-\rho x_i + s/2)
-   $$
-
-1. If $j = n-1$, then set
-
-   $$
-   P(x_i, x_j) = P(x_i, x_{n-1}) = 1 - F(x_{n-1} - \rho x_i - s/2)
-   $$
-
-1. Otherwise, set
-
-   $$
-   P(x_i, x_j) = F(x_j - \rho x_i + s/2) - F(x_j - \rho x_i - s/2)
-   $$
-
-The exercise is to write a function `approx_markov(rho, sigma_u, m = 3, n = 7)` that returns
-$\{x_0, \ldots, x_{n-1}\} \subset \mathbb R$ and $n \times n$ matrix
-$P$ as described above.
-
-* Even better, write a function that returns an instance of [QuantEcon.jl's](http://quantecon.org/quantecon-jl) MarkovChain type.
-
 ## Solutions
 
 ### Exercise 1

@@ -1313,10 +1250,3 @@ tags: [remove-cell]
 @test ranked_pages['l'] ≈ 0.032017852378295776
 end
 ```
-
-### Exercise 3
-
-A solution from [QuantEcon.jl](https://github.com/QuantEcon/QuantEcon.jl) can be found [here](https://github.com/QuantEcon/QuantEcon.jl/blob/master/src/markov/markov_approx.jl).
-
-[^pm]: Hint: First show that if $P$ and $Q$ are stochastic matrices then so is their product --- to check the row sums, try postmultiplying by a column vector of ones. Finally, argue that $P^n$ is a stochastic matrix using induction.
-
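
The removed Exercise 3 above specifies Tauchen's rules completely, so they can be sanity-checked directly. Below is a stdlib-only Python transcription of exactly those rules (the lecture's own version is Julia, and QuantEcon.jl ships a production `tauchen`); the `F` closure uses `math.erf` to build the N(0, σ_u²) cdf.

```python
# Sketch of Tauchen's method following the removed exercise's rules verbatim.
from math import erf, sqrt

def approx_markov(rho, sigma_u, m=3, n=7):
    """Return (x, P): state grid and stochastic matrix approximating the AR(1)."""
    # cdf of N(0, sigma_u^2)
    F = lambda z: 0.5 * (1.0 + erf(z / (sigma_u * sqrt(2.0))))
    sigma_y = sigma_u / sqrt(1.0 - rho ** 2)          # stationary std dev
    # evenly spaced grid from -m*sigma_y to m*sigma_y
    x = [-m * sigma_y + i * (2.0 * m * sigma_y) / (n - 1) for i in range(n)]
    s = x[1] - x[0]                                   # grid step
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if j == 0:
                P[i][j] = F(x[0] - rho * x[i] + s / 2)
            elif j == n - 1:
                P[i][j] = 1.0 - F(x[n - 1] - rho * x[i] - s / 2)
            else:
                P[i][j] = F(x[j] - rho * x[i] + s / 2) - F(x[j] - rho * x[i] - s / 2)
    return x, P
```

Because the interior probabilities telescope and the two endpoint rules absorb the tails, each row of P sums to one by construction, which makes a convenient unit test for any implementation of the exercise.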

0 commit comments