writeup (commit bd08e83)
tommytracey committed Jul 10, 2018
Here are two sample outputs from the final algorithm:
Outlined below is a summary of the steps and challenges encountered during this project. However, a more detailed walk-through along with the source code is available via the iPython notebooks.

[//]: # (TODO - create html version for Keras)
- Data exploration — [.ipynb version](dog-breed-data-exploration.ipynb), [.html version](https://rawgit.com/tommytracey/aind-dog-project/master/dog-breed-data-exploration.ipynb)
- Keras+Tensorflow implementation — [.ipynb version](dog_app_v6_keras.ipynb), [.html version](https://rawgit.com/tommytracey/aind-dog-project/master/dog_app_v6_keras.html)
- PyTorch implementation — [.ipynb version](dog_app_v7_pytorch.ipynb), [.html version](https://rawgit.com/tommytracey/aind-dog-project/master/dog_app_v7_pytorch.html)

### Steps (high-level)
1. Data Preparation

#####  
## Implementation & Results
*Coming Soon* — in the meantime, check out the notebooks.


#####  
## Reflections
#### Improving Accuracy
- **Additional training on the more difficult breeds.**
  - Create a new training set whose distribution is based on per-breed prediction accuracy — i.e., poorly performing breeds get more training images, while higher performing breeds get fewer.
- **Deeper architecture + augmentation.**
  - My Keras+Tensorflow version used a deep architecture (Xception) but no augmentation; conversely, my PyTorch version used augmentation but a less elaborate architecture (ResNet). Combining the two and exploring other augmentation schemes *should* improve accuracy.
- **Train longer.**
  - In my PyTorch version, it seems I left some convergence on the table: training accuracy was still quite low, probably due to all the augmentation. Given more training time, this might have translated into marginally higher validation and test accuracies.
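The accuracy-weighted training set described above could be sketched like this (a minimal pure-Python sketch; the breed names and accuracy figures are made up for illustration, not results from this project):

```python
import random
from collections import Counter

# Hypothetical per-breed validation accuracies (illustrative only).
breed_accuracy = {"affenpinscher": 0.92, "beagle": 0.61, "borzoi": 0.85}

def sampling_weight(breed):
    """Weight a breed inversely to its accuracy, so harder breeds
    appear more often in the resampled training set."""
    return 1.0 - breed_accuracy[breed]

labels = ["affenpinscher", "beagle", "borzoi"]
weights = [sampling_weight(b) for b in labels]

# Draw a resampled training set of 1000 labels; the lowest-accuracy
# breed ("beagle") should dominate it.
random.seed(0)
resampled = random.choices(labels, weights=weights, k=1000)
counts = Counter(resampled)
```

In a real pipeline the same per-image weights could be handed to PyTorch's `torch.utils.data.WeightedRandomSampler` instead of `random.choices`.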
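As a minimal illustration of the augmentation idea mentioned above (not the actual pipeline from the PyTorch notebook), a random horizontal flip over a toy 2-D image looks like this:

```python
import random

def random_hflip(image, rng, p=0.5):
    """Flip a 2-D image (a list of rows) left-to-right with probability p."""
    if rng.random() < p:
        return [row[::-1] for row in image]
    return image

rng = random.Random(0)
img = [[1, 2, 3],
       [4, 5, 6]]
# Each epoch sees a randomly flipped or unflipped copy of the image,
# which is what makes the training accuracy harder to drive up.
batch = [random_hflip(img, rng) for _ in range(8)]
```

The real notebooks would use library transforms (e.g. torchvision's `RandomHorizontalFlip` composed with crops and rotations) applied to tensors rather than nested lists.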

#### PyTorch vs. Keras+Tensorflow

