Commit b32ba19 (parent 31beed1): update first week

8 files changed: +261 additions, -658 deletions

doc/pub/week1/html/week1-bs.html

Lines changed: 8 additions & 62 deletions
@@ -199,19 +199,14 @@
  None,
  'defining-different-types-of-rbms-energy-based-models'),
  ('Gaussian binary', 2, None, 'gaussian-binary'),
- ('Representing the wave function',
-  2,
-  None,
-  'representing-the-wave-function'),
- ('Define the cost function', 2, None, 'define-the-cost-function'),
  ('Extrapolations and model interpretability',
   2,
   None,
   'extrapolations-and-model-interpretability'),
- ('Physics based statistical learning and data analysis',
+ ('Discipline based statistical learning and data analysis',
  2,
  None,
- 'physics-based-statistical-learning-and-data-analysis'),
+ 'discipline-based-statistical-learning-and-data-analysis'),
  ("Bayes' Theorem", 2, None, 'bayes-theorem'),
  ('"Quantified limits of the nuclear '
  'landscape":"https://journals.aps.org/prc/abstract/10.1103/PhysRevC.101.044307"',
@@ -397,10 +392,8 @@
  <!-- navigation toc: --> <li><a href="#network-elements-the-energy-function" style="font-size: 80%;">Network Elements, the energy function</a></li>
  <!-- navigation toc: --> <li><a href="#defining-different-types-of-rbms-energy-based-models" style="font-size: 80%;">Defining different types of RBMs (Energy based models)</a></li>
  <!-- navigation toc: --> <li><a href="#gaussian-binary" style="font-size: 80%;">Gaussian binary</a></li>
- <!-- navigation toc: --> <li><a href="#representing-the-wave-function" style="font-size: 80%;">Representing the wave function</a></li>
- <!-- navigation toc: --> <li><a href="#define-the-cost-function" style="font-size: 80%;">Define the cost function</a></li>
  <!-- navigation toc: --> <li><a href="#extrapolations-and-model-interpretability" style="font-size: 80%;">Extrapolations and model interpretability</a></li>
- <!-- navigation toc: --> <li><a href="#physics-based-statistical-learning-and-data-analysis" style="font-size: 80%;">Physics based statistical learning and data analysis</a></li>
+ <!-- navigation toc: --> <li><a href="#discipline-based-statistical-learning-and-data-analysis" style="font-size: 80%;">Discipline based statistical learning and data analysis</a></li>
  <!-- navigation toc: --> <li><a href="#bayes-theorem" style="font-size: 80%;">Bayes' Theorem</a></li>
  <!-- navigation toc: --> <li><a href="#quantified-limits-of-the-nuclear-landscape-https-journals-aps-org-prc-abstract-10-1103-physrevc-101-044307" style="font-size: 80%;">"Quantified limits of the nuclear landscape":"https://journals.aps.org/prc/abstract/10.1103/PhysRevC.101.044307"</a></li>
  <!-- navigation toc: --> <li><a href="#mathematics-of-deep-learning-and-neural-networks" style="font-size: 80%;">Mathematics of deep learning and neural networks</a></li>
@@ -477,10 +470,9 @@ <h2 id="overview-of-first-week-january-20-24-2025" class="anchor">Overview of fi
  <!-- subsequent paragraphs come in larger fonts, so start with a paragraph -->
  <ol>
  <li> Presentation of course</li>
- <li> Discussion of possible projects and presentation of participants</li>
+ <li> Discussion of possible projects</li>
  <li> Deep learning methods, mathematics and review of neural networks</li>
  <li> <a href="https://youtu.be/" target="_self">Video of lecture to be posted after lecture</a></li>
- <li> Test your background knowledge (to be added)</li>
  </ol>
  </div>
  </div>
@@ -1162,6 +1154,8 @@ <h2 id="network-elements-the-energy-function" class="anchor">Network Elements, t
  adjusting the energy function to best fit our problem.
  </p>

+ <p>Recently these energy functions have been replaced by Neural Networks. This will be discussed later in the course.</p>
+
  <!-- !split -->
  <h2 id="defining-different-types-of-rbms-energy-based-models" class="anchor">Defining different types of RBMs (Energy based models) </h2>

@@ -1196,54 +1190,6 @@ <h2 id="gaussian-binary" class="anchor">Gaussian binary </h2>
  </div>


- <!-- !split -->
- <h2 id="representing-the-wave-function" class="anchor">Representing the wave function </h2>
-
- <p>The wavefunction should be a probability amplitude depending on
- \( \boldsymbol{x} \). The RBM model is given by the joint distribution of
- \( \boldsymbol{x} \) and \( \boldsymbol{h} \)
- </p>
-
- $$
- P_{\mathrm{rbm}}(\boldsymbol{x},\boldsymbol{h}) = \frac{1}{Z} \exp{-E(\boldsymbol{x},\boldsymbol{h})}.
- $$
-
- <p>To find the marginal distribution of \( \boldsymbol{x} \) we set:</p>
-
- $$
- P_{\mathrm{rbm}}(\boldsymbol{x}) =\frac{1}{Z}\sum_{\boldsymbol{h}} \exp{-E(\boldsymbol{x}, \boldsymbol{h})}.
- $$
-
- <p>Now this is what we use to represent the wave function, calling it a neural-network quantum state (NQS)</p>
- $$
- \vert\Psi (\boldsymbol{X})\vert^2 = P_{\mathrm{rbm}}(\boldsymbol{x}).
- $$
-
-
- <!-- !split -->
- <h2 id="define-the-cost-function" class="anchor">Define the cost function </h2>
-
- <p>Now we don't necessarily have training data (unless we generate it by
- using some other method). However, what we do have is the variational
- principle which allows us to obtain the ground state wave function by
- minimizing the expectation value of the energy of a trial wavefunction
- (corresponding to the untrained NQS). Similarly to the traditional
- variational Monte Carlo method then, it is the local energy we wish to
- minimize. The gradient to use for the stochastic gradient descent
- procedure is
- </p>
-
- $$
- C_i = \frac{\partial \langle E_L \rangle}{\partial \theta_i}
- = 2(\langle E_L \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \rangle - \langle E_L \rangle \langle \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \rangle ),
- $$
-
- <p>where the local energy is given by</p>
- $$
- E_L = \frac{1}{\Psi} \hat{\boldsymbol{H}} \Psi.
- $$
-
-
  <!-- !split -->
  <h2 id="extrapolations-and-model-interpretability" class="anchor">Extrapolations and model interpretability </h2>

@@ -1258,7 +1204,7 @@ <h2 id="extrapolations-and-model-interpretability" class="anchor">Extrapolations
  </p>

  <!-- !split -->
- <h2 id="physics-based-statistical-learning-and-data-analysis" class="anchor">Physics based statistical learning and data analysis </h2>
+ <h2 id="discipline-based-statistical-learning-and-data-analysis" class="anchor">Discipline based statistical learning and data analysis </h2>

  <p>The above concepts are in some sense the difference between <b>old-fashioned</b> machine
  learning and statistics and Bayesian learning. In machine learning and prediction based
@@ -1271,7 +1217,7 @@ <h2 id="physics-based-statistical-learning-and-data-analysis" class="anchor">Phy
  to make these predictions.
  </p>

- <p>Physics based statistical learning points however to approaches that give us both predictions and correlations as well as being able to produce error estimates and understand causations. This leads us to the very interesting field of Bayesian statistics and Bayesian machine learning.</p>
+ <p>A discipline (Bioscience, Chemistry, Geoscience, Math, Physics..) based statistical learning points however to approaches that give us both predictions and correlations as well as being able to produce error estimates and understand causations. This leads us to the very interesting field of Bayesian statistics and Bayesian machine learning.</p>

  <!-- !split -->
  <h2 id="bayes-theorem" class="anchor">Bayes' Theorem </h2>
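The section removed in this diff defines a neural-network quantum state through the RBM marginal, \( \vert\Psi(\boldsymbol{x})\vert^2 = P_{\mathrm{rbm}}(\boldsymbol{x}) = \frac{1}{Z}\sum_{\boldsymbol{h}} \exp{(-E(\boldsymbol{x},\boldsymbol{h}))} \). As a minimal sketch of that idea (all names, sizes, and parameter values here are invented for illustration, not taken from the course code), a tiny Gaussian-binary RBM is small enough that the sum over hidden configurations can be enumerated directly:

```python
import numpy as np

# Illustrative sketch: a tiny Gaussian-binary RBM whose marginal
# P(x) = (1/Z) sum_h exp(-E(x, h)) plays the role of |Psi(x)|^2,
# i.e. a neural-network quantum state (NQS). Sizes are kept tiny
# so that all hidden configurations can be enumerated exactly.
rng = np.random.default_rng(0)
n_visible, n_hidden = 2, 3
a = rng.normal(size=n_visible)            # visible biases
b = rng.normal(size=n_hidden)             # hidden biases
W = rng.normal(size=(n_visible, n_hidden))  # interaction weights
sigma = 1.0                               # width of the Gaussian visible units

def energy(x, h):
    """Gaussian-binary RBM energy E(x, h)."""
    return (np.sum((x - a) ** 2) / (2.0 * sigma**2)
            - b @ h
            - (x / sigma**2) @ W @ h)

def unnormalised_marginal(x):
    """sum_h exp(-E(x, h)) over all 2**n_hidden binary hidden states."""
    total = 0.0
    for bits in range(2 ** n_hidden):
        h = np.array([(bits >> k) & 1 for k in range(n_hidden)], dtype=float)
        total += np.exp(-energy(x, h))
    return total

x = np.array([0.3, -0.7])
psi_squared = unnormalised_marginal(x)  # proportional to |Psi(x)|^2 (Z omitted)
print(psi_squared)
```

Since every term in the sum is a positive Boltzmann factor, the marginal is automatically a valid (unnormalised) probability, which is what makes it usable as a squared wave-function amplitude; in practice the normalisation \( Z \) is never computed and cancels in Monte Carlo sampling.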

doc/pub/week1/html/week1-reveal.html

Lines changed: 5 additions & 62 deletions
@@ -199,10 +199,9 @@ <h2 id="overview-of-first-week-january-20-24-2025">Overview of first week, Janua
  <p>
  <ol>
  <p><li> Presentation of course</li>
- <p><li> Discussion of possible projects and presentation of participants</li>
+ <p><li> Discussion of possible projects</li>
  <p><li> Deep learning methods, mathematics and review of neural networks</li>
  <p><li> <a href="https://youtu.be/" target="_blank">Video of lecture to be posted after lecture</a></li>
- <p><li> Test your background knowledge (to be added)</li>
  </ol>
  </div>
  </section>
@@ -964,6 +963,8 @@ <h2 id="network-elements-the-energy-function">Network Elements, the energy funct
  \( W \). Thus, when we adjust them during the learning procedure, we are
  adjusting the energy function to best fit our problem.
  </p>
+
+ <p>Recently these energy functions have been replaced by Neural Networks. This will be discussed later in the course.</p>
  </section>

  <section>
@@ -1002,64 +1003,6 @@ <h2 id="gaussian-binary">Gaussian binary </h2>
  </div>
  </section>

- <section>
- <h2 id="representing-the-wave-function">Representing the wave function </h2>
-
- <p>The wavefunction should be a probability amplitude depending on
- \( \boldsymbol{x} \). The RBM model is given by the joint distribution of
- \( \boldsymbol{x} \) and \( \boldsymbol{h} \)
- </p>
-
- <p>&nbsp;<br>
- $$
- P_{\mathrm{rbm}}(\boldsymbol{x},\boldsymbol{h}) = \frac{1}{Z} \exp{-E(\boldsymbol{x},\boldsymbol{h})}.
- $$
- <p>&nbsp;<br>
-
- <p>To find the marginal distribution of \( \boldsymbol{x} \) we set:</p>
-
- <p>&nbsp;<br>
- $$
- P_{\mathrm{rbm}}(\boldsymbol{x}) =\frac{1}{Z}\sum_{\boldsymbol{h}} \exp{-E(\boldsymbol{x}, \boldsymbol{h})}.
- $$
- <p>&nbsp;<br>
-
- <p>Now this is what we use to represent the wave function, calling it a neural-network quantum state (NQS)</p>
- <p>&nbsp;<br>
- $$
- \vert\Psi (\boldsymbol{X})\vert^2 = P_{\mathrm{rbm}}(\boldsymbol{x}).
- $$
- <p>&nbsp;<br>
- </section>
-
- <section>
- <h2 id="define-the-cost-function">Define the cost function </h2>
-
- <p>Now we don't necessarily have training data (unless we generate it by
- using some other method). However, what we do have is the variational
- principle which allows us to obtain the ground state wave function by
- minimizing the expectation value of the energy of a trial wavefunction
- (corresponding to the untrained NQS). Similarly to the traditional
- variational Monte Carlo method then, it is the local energy we wish to
- minimize. The gradient to use for the stochastic gradient descent
- procedure is
- </p>
-
- <p>&nbsp;<br>
- $$
- C_i = \frac{\partial \langle E_L \rangle}{\partial \theta_i}
- = 2(\langle E_L \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \rangle - \langle E_L \rangle \langle \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \rangle ),
- $$
- <p>&nbsp;<br>
-
- <p>where the local energy is given by</p>
- <p>&nbsp;<br>
- $$
- E_L = \frac{1}{\Psi} \hat{\boldsymbol{H}} \Psi.
- $$
- <p>&nbsp;<br>
- </section>
-
  <section>
  <h2 id="extrapolations-and-model-interpretability">Extrapolations and model interpretability </h2>

@@ -1075,7 +1018,7 @@ <h2 id="extrapolations-and-model-interpretability">Extrapolations and model inte
  </section>

  <section>
- <h2 id="physics-based-statistical-learning-and-data-analysis">Physics based statistical learning and data analysis </h2>
+ <h2 id="discipline-based-statistical-learning-and-data-analysis">Discipline based statistical learning and data analysis </h2>

  <p>The above concepts are in some sense the difference between <b>old-fashioned</b> machine
  learning and statistics and Bayesian learning. In machine learning and prediction based
@@ -1088,7 +1031,7 @@ <h2 id="physics-based-statistical-learning-and-data-analysis">Physics based stat
  to make these predictions.
  </p>

- <p>Physics based statistical learning points however to approaches that give us both predictions and correlations as well as being able to produce error estimates and understand causations. This leads us to the very interesting field of Bayesian statistics and Bayesian machine learning.</p>
+ <p>A discipline (Bioscience, Chemistry, Geoscience, Math, Physics..) based statistical learning points however to approaches that give us both predictions and correlations as well as being able to produce error estimates and understand causations. This leads us to the very interesting field of Bayesian statistics and Bayesian machine learning.</p>
  </section>

  <section>

doc/pub/week1/html/week1-solarized.html

Lines changed: 7 additions & 59 deletions
@@ -226,19 +226,14 @@
  None,
  'defining-different-types-of-rbms-energy-based-models'),
  ('Gaussian binary', 2, None, 'gaussian-binary'),
- ('Representing the wave function',
-  2,
-  None,
-  'representing-the-wave-function'),
- ('Define the cost function', 2, None, 'define-the-cost-function'),
  ('Extrapolations and model interpretability',
   2,
   None,
   'extrapolations-and-model-interpretability'),
- ('Physics based statistical learning and data analysis',
+ ('Discipline based statistical learning and data analysis',
  2,
  None,
- 'physics-based-statistical-learning-and-data-analysis'),
+ 'discipline-based-statistical-learning-and-data-analysis'),
  ("Bayes' Theorem", 2, None, 'bayes-theorem'),
  ('"Quantified limits of the nuclear '
  'landscape":"https://journals.aps.org/prc/abstract/10.1103/PhysRevC.101.044307"',
@@ -388,10 +383,9 @@ <h2 id="overview-of-first-week-january-20-24-2025">Overview of first week, Janua
  <p>
  <ol>
  <li> Presentation of course</li>
- <li> Discussion of possible projects and presentation of participants</li>
+ <li> Discussion of possible projects</li>
  <li> Deep learning methods, mathematics and review of neural networks</li>
  <li> <a href="https://youtu.be/" target="_blank">Video of lecture to be posted after lecture</a></li>
- <li> Test your background knowledge (to be added)</li>
  </ol>
  </div>

@@ -1061,6 +1055,8 @@ <h2 id="network-elements-the-energy-function">Network Elements, the energy funct
  adjusting the energy function to best fit our problem.
  </p>

+ <p>Recently these energy functions have been replaced by Neural Networks. This will be discussed later in the course.</p>
+
  <!-- !split --><br><br><br><br><br><br><br><br><br><br>
  <h2 id="defining-different-types-of-rbms-energy-based-models">Defining different types of RBMs (Energy based models) </h2>

@@ -1093,54 +1089,6 @@ <h2 id="gaussian-binary">Gaussian binary </h2>
  </div>


- <!-- !split --><br><br><br><br><br><br><br><br><br><br>
- <h2 id="representing-the-wave-function">Representing the wave function </h2>
-
- <p>The wavefunction should be a probability amplitude depending on
- \( \boldsymbol{x} \). The RBM model is given by the joint distribution of
- \( \boldsymbol{x} \) and \( \boldsymbol{h} \)
- </p>
-
- $$
- P_{\mathrm{rbm}}(\boldsymbol{x},\boldsymbol{h}) = \frac{1}{Z} \exp{-E(\boldsymbol{x},\boldsymbol{h})}.
- $$
-
- <p>To find the marginal distribution of \( \boldsymbol{x} \) we set:</p>
-
- $$
- P_{\mathrm{rbm}}(\boldsymbol{x}) =\frac{1}{Z}\sum_{\boldsymbol{h}} \exp{-E(\boldsymbol{x}, \boldsymbol{h})}.
- $$
-
- <p>Now this is what we use to represent the wave function, calling it a neural-network quantum state (NQS)</p>
- $$
- \vert\Psi (\boldsymbol{X})\vert^2 = P_{\mathrm{rbm}}(\boldsymbol{x}).
- $$
-
-
- <!-- !split --><br><br><br><br><br><br><br><br><br><br>
- <h2 id="define-the-cost-function">Define the cost function </h2>
-
- <p>Now we don't necessarily have training data (unless we generate it by
- using some other method). However, what we do have is the variational
- principle which allows us to obtain the ground state wave function by
- minimizing the expectation value of the energy of a trial wavefunction
- (corresponding to the untrained NQS). Similarly to the traditional
- variational Monte Carlo method then, it is the local energy we wish to
- minimize. The gradient to use for the stochastic gradient descent
- procedure is
- </p>
-
- $$
- C_i = \frac{\partial \langle E_L \rangle}{\partial \theta_i}
- = 2(\langle E_L \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \rangle - \langle E_L \rangle \langle \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \rangle ),
- $$
-
- <p>where the local energy is given by</p>
- $$
- E_L = \frac{1}{\Psi} \hat{\boldsymbol{H}} \Psi.
- $$
-
-
  <!-- !split --><br><br><br><br><br><br><br><br><br><br>
  <h2 id="extrapolations-and-model-interpretability">Extrapolations and model interpretability </h2>

@@ -1155,7 +1103,7 @@ <h2 id="extrapolations-and-model-interpretability">Extrapolations and model inte
  </p>

  <!-- !split --><br><br><br><br><br><br><br><br><br><br>
- <h2 id="physics-based-statistical-learning-and-data-analysis">Physics based statistical learning and data analysis </h2>
+ <h2 id="discipline-based-statistical-learning-and-data-analysis">Discipline based statistical learning and data analysis </h2>

  <p>The above concepts are in some sense the difference between <b>old-fashioned</b> machine
  learning and statistics and Bayesian learning. In machine learning and prediction based
@@ -1168,7 +1116,7 @@ <h2 id="physics-based-statistical-learning-and-data-analysis">Physics based stat
  to make these predictions.
  </p>

- <p>Physics based statistical learning points however to approaches that give us both predictions and correlations as well as being able to produce error estimates and understand causations. This leads us to the very interesting field of Bayesian statistics and Bayesian machine learning.</p>
+ <p>A discipline (Bioscience, Chemistry, Geoscience, Math, Physics..) based statistical learning points however to approaches that give us both predictions and correlations as well as being able to produce error estimates and understand causations. This leads us to the very interesting field of Bayesian statistics and Bayesian machine learning.</p>

  <!-- !split --><br><br><br><br><br><br><br><br><br><br>
  <h2 id="bayes-theorem">Bayes' Theorem </h2>
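The "Define the cost function" section removed in this commit gives the variational Monte Carlo gradient \( C_i = 2(\langle E_L O_i \rangle - \langle E_L \rangle \langle O_i \rangle) \) with \( O_i = \frac{1}{\Psi}\frac{\partial \Psi}{\partial \theta_i} \). A minimal sketch of that covariance estimator, using synthetic stand-in samples (the arrays and sample counts below are invented for illustration, not drawn from a real wave function):

```python
import numpy as np

# Illustrative sketch of the VMC gradient C_i = 2(<E_L O_i> - <E_L><O_i>),
# where O_i = (1/Psi) dPsi/dtheta_i, estimated from Monte Carlo samples.
# E_L and O here are synthetic placeholders for per-sample local energies
# and log-derivatives of the trial wave function.
rng = np.random.default_rng(1)
n_samples, n_params = 5000, 4
E_L = rng.normal(loc=-0.5, scale=0.1, size=n_samples)   # local energy per sample
O = rng.normal(size=(n_samples, n_params))              # O_i per sample, per parameter

def sgd_gradient(E_L, O):
    """Covariance form of the energy gradient, one entry per parameter theta_i."""
    return 2.0 * (np.mean(E_L[:, None] * O, axis=0)
                  - np.mean(E_L) * np.mean(O, axis=0))

grad = sgd_gradient(E_L, O)   # feed this to stochastic gradient descent
print(grad.shape)
```

A useful sanity check on the covariance form: if the trial state is an exact eigenstate, the local energy is constant over samples and every component of the gradient vanishes, which is the zero-variance property that makes this estimator attractive.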
