diff --git a/PyTorch 101 Part 1 - Computational Graphs and Autograd in PyTorch.ipynb b/PyTorch 101 Part 1 - Computational Graphs and Autograd in PyTorch.ipynb
index 2c0a2b2..791a1a9 100644
--- a/PyTorch 101 Part 1 - Computational Graphs and Autograd in PyTorch.ipynb
+++ b/PyTorch 101 Part 1 - Computational Graphs and Autograd in PyTorch.ipynb
@@ -8,7 +8,7 @@
 "\n",
 "When we design software to implement neural networks, we want to come up with a way that can allow us to seamlessly compute the gradients, regardless of the architecture type so that the programmer doesn't have to manually compute gradients when changes are made to the network. \n",
 "\n",
-"We galvanise this idea in form of a data structure called a **Computation graph**. A computation graph looks very similar to the diagram of the graph that we made in the image above. However, the nodes in a computation graph are basically operators. These operators are basically the mathematical operators except for one case, where we need to represent creation of a user-defined variable.\n",
+"We represent the computation with a data structure called a **Computation graph**. A computation graph looks very similar to the diagram we made in the image above, except that its nodes are operators. These operators are the mathematical operations of the network, with one exception: we also need a node to represent the creation of a user-defined variable.\n",
 "\n",
 "![Computation Graph](images/computation_graph_forward.png)\n",
 "\n",
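
As a quick illustration of the graph the revised cell describes (a minimal sketch, not part of the patch; the tensor names are chosen for illustration): each operation on tensors adds an operator node to the graph, visible through `grad_fn`, while creating a tensor with `requires_grad=True` is the leaf node that stands for a user-defined variable.

```python
import torch

# Leaf tensors: their creation is the "user-defined variable" node
# mentioned in the cell above.
a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)

# Each operation adds an operator node to the computation graph.
c = a * b        # multiplication node
d = c + a        # addition node

# grad_fn points at the operator node that produced each tensor;
# user-defined leaf tensors have no grad_fn.
print(d.grad_fn)   # <AddBackward0 ...>
print(c.grad_fn)   # <MulBackward0 ...>
print(a.grad_fn)   # None (user-defined leaf)

# backward() traverses the graph and accumulates gradients in the leaves.
d.backward()
print(a.grad)      # dd/da = b + 1 = 4.0
print(b.grad)      # dd/db = a = 2.0
```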