
Commit 60a6f66

Chapter 06 Done
1 parent 8594814 commit 60a6f66

File tree

3 files changed (+334, −0)

Chapter 06/1.Random_Optimization.ipynb

+317 lines (large diff not rendered)

Chapter 06/README.md

+15 lines
@@ -0,0 +1,15 @@
# Introducing Optimization

Now that we have built a network that can perform one forward pass and calculate the loss, it is time to introduce a method for changing the network's weights and biases. This is the hardest part of a neural network.

First, we can try multiple random values and keep the one that gives us the least error. Let's actually try building something like that.

The rest of the chapter continues in the notebook.

Basically, we increment the weights and biases by a small random amount and keep the version with the best accuracy.
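The keep-the-best loop can be sketched in a few lines of NumPy. This is a minimal illustration on a toy linear model, not the chapter's actual network: the data, shapes, and step size (`0.05`) are all assumptions made to keep the example self-contained.

```python
# A minimal sketch of random-perturbation search: nudge the parameters
# by small random amounts and keep a nudge only if the loss improves.
# The toy data and single linear layer are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = x0 + x1 with one linear "layer".
X = rng.normal(size=(50, 2))
y = X.sum(axis=1, keepdims=True)

def loss(w, b):
    # Mean squared error of the forward pass.
    pred = X @ w + b
    return float(np.mean((pred - y) ** 2))

w = rng.normal(size=(2, 1))
b = np.zeros(1)
best_loss = loss(w, b)
init_loss = best_loss

for _ in range(2000):
    # Nudge weights and biases by a small random amount...
    dw = 0.05 * rng.normal(size=w.shape)
    db = 0.05 * rng.normal(size=b.shape)
    new_loss = loss(w + dw, b + db)
    # ...and keep the nudged version only if the loss went down.
    if new_loss < best_loss:
        w, b = w + dw, b + db
        best_loss = new_loss

print(f"loss: {init_loss:.4f} -> {best_loss:.4f}")
```

The notebook applies the same idea to the full network's weights and biases; a single linear layer just keeps the sketch short.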
[Optimization With Random Values](./1.Random_Optimization.ipynb)

---

End of Chapter 06

README.md

+2 lines
@@ -22,6 +22,8 @@ Huge kudos to the authors for the book, and the excellent animations they made.
 
 [Chapter 05 - Loss Functions](./Chapter%2005/)
 
+[Chapter 06 - Introduction to Optimization](./Chapter%2006/)
+
 ## Some other resources used in this repo
 
 [The neural network images i've put in here - NN-SVG](http://alexlenail.me/NN-SVG/index.html)
