Try Mermaid in a Jupyter notebook on GitHub
Pim-Mostert committed Oct 4, 2024
1 parent 023e431 commit 88729e7
Showing 2 changed files with 92 additions and 4 deletions.
56 changes: 56 additions & 0 deletions docs/SimpleBayesianNetwork.ipynb
@@ -0,0 +1,56 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Hoi"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"$P(Q) = \\hat{Y}$\n",
"\n",
"```\n",
"\n",
"hoi \n",
"mooi man\n",
"```\n",
"\n",
"```mermaid\n",
"\n",
"graph TD\n",
" Q((Q))\n",
" Y((Y))\n",
"\n",
" Q-->Y\n",
" \n",
"```"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.7"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
40 changes: 36 additions & 4 deletions docs/index.md
@@ -1,6 +1,8 @@
[TOC]

# Original sum-product algorithm

- ## Simple example Baysian network
+ ## Simple example Bayesian network

First consider a simple Bayesian network, in which there is one hidden variable $Q$ and one observed variable $Y$.

@@ -78,7 +80,7 @@ f_1(q) = P(Q) \\
f_2(q, y) = P(Y|Q)
```

- ### Message definitions
+ #### Message definitions

```math
@@ -88,8 +90,8 @@ d_3(y) & =
P(\hat{Y}|Y)
\qquad \qquad \qquad \qquad \qquad \qquad \qquad
& \begin{cases}
- 1 & \text{if } \hat{y} = y \\
- 0 & \text{if } \hat{y} \ne y \\
+ 1 & \text{if } y = \hat{y} \\
+ 0 & \text{if } y \ne \hat{y} \\
\end{cases} \\
a_3(y) & =
d_3(y)
@@ -115,4 +117,34 @@ b_1(q) & =
\end{align}
```

#### Inference

Since most messages depend on other messages, a number of iterations is needed to calculate the final values of all messages (in this case 4 iterations: $d_3$→$a_3$→$b_2$→$a_2$→$b_3$).

After that, the messages can be used to perform inference on the hidden variable $Q$:

```math
a_1(q)b_1(q) = P(\hat{Y}|Q)P(Q)
```

Normalizing yields the posterior distribution for $Q$:

```math
P(Q|\hat{Y}) = \frac{P(\hat{Y}|Q)P(Q)}{P(\hat{Y})}
= \frac{P(\hat{Y}|Q)P(Q)}{\sum\limits_{q}{P(\hat{Y}|Q)P(Q)}}
= \frac{a_1(q)b_1(q)}{\sum\limits_{q}{a_1(q)b_1(q)}}
```
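
To make the full pass concrete, here is a minimal NumPy sketch of the schedule $d_3$→$a_3$→$b_2$→$a_2$→$b_3$ followed by the normalization above. The CPD values, the observed value `y_hat`, and the identification of `a_1` and `b_1` with $P(\hat{Y}|Q)$ and $P(Q)$ are illustrative assumptions, since the full message definitions are collapsed in the diff above:

```python
import numpy as np

# Illustrative CPDs for the two-node network Q -> Y.
# All numbers here are assumptions, chosen only for this sketch.
P_Q = np.array([0.6, 0.4])             # f_1(q) = P(Q)
P_Y_given_Q = np.array([[0.9, 0.1],    # f_2(q, y) = P(Y|Q); rows index q,
                        [0.2, 0.8]])   # columns index y

y_hat = 1  # assumed observed value of Y

# Schedule from the text: d_3 -> a_3 -> b_2 -> a_2 -> b_3.
d_3 = np.eye(2)[y_hat]    # indicator message: 1 where y == y_hat, 0 otherwise
a_3 = d_3                 # evidence message passed on toward factor f_2

# Summing f_2(q, y) * a_3(y) over y gives P(Y_hat | Q) as a function of q.
msg_to_Q = P_Y_given_Q @ a_3

# Mapping onto the text's a_1(q) and b_1(q) is an assumption here.
a_1 = msg_to_Q            # P(Y_hat | Q)
b_1 = P_Q                 # P(Q), sent directly from factor f_1

unnormalized = a_1 * b_1  # a_1(q) b_1(q) = P(Y_hat | Q) P(Q)
posterior = unnormalized / unnormalized.sum()
print(posterior)          # P(Q | Y_hat), approx. [0.158 0.842] for these numbers
```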

Furthermore, the messages can be used to calculate the conditional probability distributions between hidden and/or observed variables:

```math
```
