@@ -66,7 +66,7 @@ Feeling Lost? [Machine Learning Estimators Map](https://scikit-learn.org/stable/
* Dimensionality Reduction

### Optimization & Reinforcement Learning
- * Convex Optimization, Genetic Algorithms
+ * Convex Optimization, Genetic Algorithms
* Deep Reinforcement Learning

## Intuition
@@ -79,13 +79,14 @@ Feeling Lost? [Machine Learning Estimators Map](https://scikit-learn.org/stable/

</div>
What are some limitations of linear models (y = mx + b), or even of multiple linear regression (y = m1x1 + m2x2 + … + b)?
- * Assumes **monotonic gradient (slope)** between the target variable (y) and any feature (e.g. x1)
- * Assumes **constant slope steepness** between the target variable (y) and any feature (e.g. x1)
+
+ - Assumes **monotonic gradient (slope)** between the target variable (y) and any feature (e.g. x1)
+ - Assumes **constant slope steepness** between the target variable (y) and any feature (e.g. x1)
for the entire domain of that feature

Machine Learning models can often provide the **flexibility** to overcome this. Think of it as **automatic curve fitting** and **if-else**! (A short sketch of this idea follows below.)
- * You don’t need to hard-code the predicates
- * The algorithm will determine and optimize your if-conditions
+ - You don’t need to hard-code the predicates
+ - The algorithm will determine and optimize your if-conditions

However, note the issue happening with the green line in the picture 😵
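To make the automatic curve fitting / if-else point above concrete, here is a minimal sketch (not from the original post; the synthetic sine data, the scikit-learn estimators, and the depth limit are illustrative assumptions): a shallow decision tree learns its own split thresholds on a non-monotonic target, while a single global slope cannot bend to fit it.

```python
# Illustrative sketch (assumed data): a plain linear fit vs. a decision tree
# on a target whose slope changes sign and steepness across the domain.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # non-monotonic, non-constant slope

linear = LinearRegression().fit(X, y)                # one global slope for the whole domain
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)  # learned if-else splits on X

print("linear R^2:", round(linear.score(X, y), 3))
print("tree   R^2:", round(tree.score(X, y), 3))
```

On data like this the tree typically scores far better, because each leaf is reached through if-conditions (split thresholds) the algorithm chose itself rather than predicates you hard-coded.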
@@ -101,7 +102,7 @@ However, note the issue happening with the green line in the picture 😵
</div>

* Flexibility vs Generalization
- * Overfitting vs Underfitting
+ * Overfitting vs Underfitting
* How can we avoid overfitting in particular? (one common recipe is sketched after this list)
* Complexity vs Interpretability
* Is that dichotomy still strictly true today?
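One common recipe for the overfitting question above, sketched here with scikit-learn (the polynomial-plus-Ridge pipeline and the alpha grid are assumptions for illustration, not the post's prescription): hold out a test split, add regularization, and pick its strength by cross-validation.

```python
# Minimal sketch (assumed example): limit overfitting with a held-out split
# plus regularization, tuning the penalty strength by cross-validation.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(100, 1))
y = 2 * X[:, 0] ** 3 + 0.2 * rng.normal(size=100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Degree-10 features can easily overfit 100 points; Ridge's alpha shrinks them back.
model = make_pipeline(PolynomialFeatures(degree=10), Ridge())
search = GridSearchCV(model, {"ridge__alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X_train, y_train)

print("best alpha:", search.best_params_["ridge__alpha"])
print("held-out R^2:", round(search.score(X_test, y_test), 3))
```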
@@ -164,16 +165,16 @@ However, note the issue happening with the green line in the picture 😵
* Deep Learning
* PyTorch
* Super flexible, create any model architecture
- * TensorFlow
- * the Keras API is super easy to use
+ * TensorFlow
+ * the Keras API is super easy to use
* if not using Keras, probably better to go with PyTorch
- * MLOps, CD4ML
+ * MLOps, CD4ML
* [MLflow](https://mlflow.org/) for experiment tracking, logging, model versioning (a minimal usage sketch follows after this list)
* [DVC](https://dvc.org/) for data versioning
- * [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning/) and/or [AWS SageMaker](https://aws.amazon.com/sagemaker/)
+ * [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning/) and/or [AWS SageMaker](https://aws.amazon.com/sagemaker/)
* covers full lifecycle management
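As a concrete illustration of the MLflow point above (an assumed, minimal example; the dataset, run name, and metric are placeholders, not from the original list): a few lines are enough to log parameters, a metric, and the fitted model for one run.

```python
# Minimal MLflow sketch (assumed local tracking): log params, a metric,
# and the trained model so the run can be compared and versioned later.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge

X, y = load_diabetes(return_X_y=True)

with mlflow.start_run(run_name="ridge-baseline"):
    alpha = 1.0
    model = Ridge(alpha=alpha).fit(X, y)

    mlflow.log_param("alpha", alpha)                   # experiment tracking
    mlflow.log_metric("train_r2", model.score(X, y))   # logging
    mlflow.sklearn.log_model(model, "model")           # model versioning
```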

-
+

## Interpretability
<div style={{textAlign: 'center'}}>