From e5858baa9cd64c642e1f767868b7d18eec97970f Mon Sep 17 00:00:00 2001
From: Julien
Date: Wed, 16 Dec 2020 01:06:48 +0100
Subject: [PATCH] Readme done

---
 README.md | 75 +++++++++++++++++++++++++++++++++++++------------------
 1 file changed, 51 insertions(+), 24 deletions(-)

diff --git a/README.md b/README.md
index 9e55aad..be3d5f0 100644
--- a/README.md
+++ b/README.md
@@ -55,58 +55,85 @@ deactivate
 
 ## Folder architecture
 
-This folder contains several python modules, categorized as follows :
+Our project contains many folders organized as follows:
 
 ```
 project
 │   README.md
 │   requirements.txt
 |   run.py
+|   data_augmentation.ipynb
+|   pipeline.ipynb
 │
 └───data
-│   └───training
-│   └───test_set_images
-|   └───predictions
-│   └───generated
-|   └───flip
-│   └───rotation
-│   │   ...
+│   └───train
+|   |   └───original
+|   |   └───generated
+|   |
+│   └───test
+|       └───original
+|       └───predictions
 │
 └───helpers
 |   colab.py
 |   constants.py
-|   file_manipulation.py
+|   image_loading.py
 │   ...
 |
-└───model_save
-|   final_model
-|
-└───notebooks
-│   create_submission.ipynb
-│   data_augmentation.ipynb
-|   ...
+└───models
+│   unet.py
+│   cnn_patch_prediction.py
 |
-└───assets
-│   ...
+└───saved_models
+|   best_cnn
+|   final_model
+|
 ```
 
 Here are the main subfolders description:
- data
+ root
- ajsdkskadjakd
+ Here are the main files:
+
+
+ +
+ data +
+ The `data` folder contains the training and test images. The training set comprises the 100 original images plus the artificially generated ones, around 1000 images in total. The `test` folder contains two subfolders: `original`, holding the images used for the AICrowd submission, and `predictions`, containing the predicted outputs as well as the masks superimposed on the original images, to give a qualitative evaluation of our predictions.
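The superimposed mask images described above amount to an alpha blend of the predicted mask over the aerial photo. A minimal NumPy sketch of the idea (the function name, the red tint, and the blend weight are illustrative assumptions, not the project's actual helper):

```python
import numpy as np

def superpose_mask(image, mask, alpha=0.4):
    """Blend a binary road mask (H, W) over an RGB image (H, W, 3).

    Predicted road pixels are tinted red so the prediction can be
    compared visually against the original image.
    """
    blended = image.astype(float)  # astype returns a copy
    red = np.array([255.0, 0.0, 0.0])
    # Blend the tint only where the mask predicts road.
    blended[mask > 0] = (1 - alpha) * blended[mask > 0] + alpha * red
    return blended.astype(np.uint8)

# Tiny example: a 4x4 grey image with a 2x2 predicted road block.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
msk = np.zeros((4, 4), dtype=np.uint8)
msk[:2, :2] = 1
out = superpose_mask(img, msk)
```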
helpers
- asdasda
+ This folder contains the Python scripts used in the training pipeline or to run predictions.
+
+ +
+ models +
+ The `models` folder contains two files, `unet.py` and `cnn_patch_prediction.py`, implementing the architectures of our two models.
+
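As a rough illustration of what the patch-wise model involves, the image is first cut into small square patches that the CNN classifies individually. A sketch of that extraction step, assuming non-overlapping 16×16 patches (the patch size is an assumption, not stated in this README):

```python
import numpy as np

def image_to_patches(image, patch_size=16):
    """Cut an (H, W, C) image into non-overlapping square patches.

    Returns an array of shape (n_patches, patch_size, patch_size, C);
    each patch would then receive a single road / background label.
    """
    h, w, _ = image.shape
    patches = [
        image[i:i + patch_size, j:j + patch_size]
        for i in range(0, h - patch_size + 1, patch_size)
        for j in range(0, w - patch_size + 1, patch_size)
    ]
    return np.stack(patches)

# A 64x64 RGB image yields a 4x4 grid of 16x16 patches.
patches = image_to_patches(np.zeros((64, 64, 3), dtype=np.uint8))
```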
- model_save
+ saved_models
- asdas
+ The `saved_models` folder stores the trained models, saved with the Keras module from TensorFlow.
+
 
 ## Training pipeline
 
 Here is described the pipeline used to train the model and fine tune the hyperparameters of our final model.
@@ -122,7 +149,7 @@ According to the paper that first described the U-Net architecture [[1]](https:/
 * flip
 * combination of the previous transformations
 
-The script used to generate new data is illustrated in the `data_augmentation.ipynb` notebook, using the Tensorflow preprocessing class `ImageDataGenerator`. The generated images can be found in the `data/generated` folder.
+The script used to generate new data is illustrated in the `data_augmentation.ipynb` notebook, using the TensorFlow preprocessing class `ImageDataGenerator`. The generated images can be found in the `data/train/generated` folder.
 
 ### Training
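The listed transformations (rotations, flips, and their combination) can be mimicked with plain NumPy to see what kind of images the augmentation yields; this sketch only illustrates the transformations, it is not the notebook's actual `ImageDataGenerator` code:

```python
import numpy as np

def augment(image):
    """Yield augmented copies of an (H, W, C) image: the three
    90-degree rotations, a horizontal flip, and a combination
    of flip and rotation."""
    for k in (1, 2, 3):
        yield np.rot90(image, k)
    flipped = np.fliplr(image)
    yield flipped
    yield np.rot90(flipped, 1)  # combination: flip then rotate

original = np.arange(12).reshape(2, 2, 3)
augmented = list(augment(original))
```

Applied to each of the 100 original training images, a handful of such transformations per image is enough to reach the roughly tenfold enlargement of the training set mentioned above.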