1 parent a656290 commit e3d02f5
content/linear-algebra.rst
@@ -253,7 +253,7 @@ Now compute the covariance matrix together with its eigenvectors and eigenvalues
     M = transpose(X)*X
     P = eigvecs(M)
     E = eigvals(M)
-    # divide E by r=150 to get variance
+    # divide E by r-1=150-1=149 to get variance

 .. code-block:: text

@@ -296,7 +296,7 @@ The basis :math:`P` of eigenvectors we got above is orthogonal and normalized:
     4.7765e-16   -4.7269e-16   1.0          1.55799e-17
     2.98372e-16  -1.41867e-16  1.55799e-17  1.0

-We may perform dimensionality reduction by projecting the data to this subspace:
+We may perform dimensionality reduction by projecting the data to a smaller subspace:

 .. code-block:: julia
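The two changes above concern the same pipeline: the eigenvalues of the scatter matrix :math:`M = X^T X` must be divided by :math:`r-1` (Bessel's correction), not :math:`r`, to give sample variances, and the orthonormal eigenvector basis can then be truncated to project the data onto a smaller subspace. A minimal NumPy sketch of that pipeline, using randomly generated stand-in data (with the same 150-row shape the lesson mentions) rather than the lesson's actual dataset:

```python
import numpy as np

# Hypothetical stand-in for the lesson's centered data matrix (r = 150 rows).
rng = np.random.default_rng(0)
X = rng.normal(size=(150, 4))
X = X - X.mean(axis=0)        # center each column

r = X.shape[0]
M = X.T @ X                   # scatter matrix, as in the diff
E, P = np.linalg.eigh(M)      # eigenvalues/eigenvectors of symmetric M
variances = E / (r - 1)       # Bessel's correction: divide by r-1 = 149

# P is orthonormal, matching the near-identity matrix shown in the diff
assert np.allclose(P.T @ P, np.eye(4))

# dimensionality reduction: project onto the two largest-eigenvalue directions
top2 = P[:, np.argsort(E)[::-1][:2]]
Y = X @ top2
print(Y.shape)                # → (150, 2)
```

Because the columns of ``X`` are centered, the corrected eigenvalues sum to the total sample variance of the data, which is one quick way to check the :math:`r-1` normalization.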