
Commit a623b90

Merge pull request #117 from data-derp/fix/42-formatting
42 | ks | formatting lists
2 parents 22949ed + cf5bc56

File tree

1 file changed (+12 -11 lines):
  • versioned_docs/version-2.0/data-science-and-interpretability

versioned_docs/version-2.0/data-science-and-interpretability/overview.mdx

Lines changed: 12 additions & 11 deletions
@@ -66,7 +66,7 @@ Feeling Lost? [Machine Learning Estimators Map](https://scikit-learn.org/stable/
 * Dimensionality Reduction
 
 ### Optimization & Reinforcement Learning
-* Convex Optimization, Genetic Algorithms
+* Convex Optimization, Genetic Algorithms
 * Deep Reinforcement Learning
 
 ## Intuition
@@ -79,13 +79,14 @@ Feeling Lost? [Machine Learning Estimators Map](https://scikit-learn.org/stable/
 
 </div>
 What are some limitations of linear models (y = mx + b)? or even multiple linear regression (y = m1x1 + m2x2 + … + b)?
-* Assumes **monotonic gradient (slope)** between the target variable (y) and any feature (e.g. x1)
-* Assumes **constant slope steepness** between the target variable (y) and any feature (e.g. x1)
+
+- Assumes **monotonic gradient (slope)** between the target variable (y) and any feature (e.g. x1)
+- Assumes **constant slope steepness** between the target variable (y) and any feature (e.g. x1)
 for the entire domain of that feature
 
 Machine Learning models can often provide the **flexibility** to overcome this. Think of it as **automatic curve fitting** and **if-else**!
-* You don’t need to hard-code the predicates
-* The algorithm will determine and optimize your if-conditions
+- You don’t need to hard-code the predicates
+- The algorithm will determine and optimize your if-conditions
 
 However, note the issue happening with the green line in the picture 😵
 
@@ -101,7 +102,7 @@ However, note the issue happening with the green line in the picture 😵
 </div>
 
 * Flexibility vs Generalization
-* Overfitting vs Underfitting
+* Overfitting vs Underfitting
 * How can we avoid overfitting in particular?
 * Complexity vs Interpretability
 * Is that dichotomy still strictly true today?
@@ -164,16 +165,16 @@ However, note the issue happening with the green line in the picture 😵
 * Deep Learning
 * PyTorch
 * Super flexible, create any model architecture
-* TensorFlow
-* the Keras API is super easy to use
+* TensorFlow
+* the Keras API is super easy to use
 * if not using Keras, probably better to go with PyTorch
-* MLOps, CD4ML
+* MLOps, CD4ML
 * [MLflow](https://mlflow.org/) for experiment tracking, logging, model versioning
 * [DVC](https://dvc.org/) for data versioning
-* [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning/) and/or [AWS SageMaker](https://aws.amazon.com/sagemaker/)
+* [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning/) and/or [AWS SageMaker](https://aws.amazon.com/sagemaker/)
 * covers full lifecycle management
 
-
+
 ## Interpretability
 <div style={{textAlign: 'center'}}>
 
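The second hunk frames flexible models as "automatic curve fitting" and "if-else", with the algorithm determining the if-conditions for you. A minimal sketch of that idea, assuming numpy and scikit-learn (this code is illustrative and not part of the changed file):

```python
# A depth-capped decision tree vs. a straight line on a non-monotonic target.
# Illustrative sketch only; assumes numpy and scikit-learn are installed.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(scale=0.2, size=200)  # slope changes sign AND steepness

linear = LinearRegression().fit(X, y)
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)  # depth bounds the number of if-else splits

print("linear R^2:", round(linear.score(X, y), 3))  # near 0: one constant slope cannot follow a parabola
print("tree   R^2:", round(tree.score(X, y), 3))    # high: learned thresholds act as the if-conditions
```

The tree's fitted thresholds play the role of the predicates the changed bullet says you no longer need to hard-code, and the parabolic target violates both the monotonic-gradient and constant-steepness assumptions listed above.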
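The third hunk pairs Flexibility vs Generalization with Overfitting vs Underfitting and asks how overfitting can be avoided. One common check, sketched under the same scikit-learn assumption, is to compare training scores against a held-out split:

```python
# Detecting overfitting with a held-out split: an unconstrained tree vs. a
# depth-capped one. Illustrative sketch only; assumes numpy and scikit-learn.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(scale=2.0, size=200)  # deliberately noisy target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

deep = DecisionTreeRegressor().fit(X_train, y_train)               # unconstrained: can memorize noise
capped = DecisionTreeRegressor(max_depth=3).fit(X_train, y_train)  # restricted flexibility

# The unconstrained tree scores near 1.0 on training data but drops sharply on
# the held-out split; the capped tree trades training fit for generalization.
print("deep   train/test:", deep.score(X_train, y_train), deep.score(X_test, y_test))
print("capped train/test:", capped.score(X_train, y_train), capped.score(X_test, y_test))
```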
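The final hunk's MLOps bullets recommend MLflow for experiment tracking, logging, and model versioning. A minimal sketch of that workflow, assuming MLflow is installed; the experiment, parameter, and metric names below are invented for the example:

```python
# Logging a run to MLflow. Illustrative sketch; assumes `pip install mlflow`.
import mlflow

mlflow.set_experiment("tree-vs-linear")  # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("model", "DecisionTreeRegressor")  # record hyperparameters
    mlflow.log_param("max_depth", 3)
    mlflow.log_metric("test_r2", 0.57)  # placeholder result for later comparison
```

Logged runs can then be browsed and compared via the `mlflow ui` command.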