This folder includes all the work I did for the course during the 2016-2017 spring semester. Everything is implemented in Python using only NumPy, without the help of any deep/machine learning libraries, so it is a good reference for the low-level implementation details of learning frameworks. It covers modular layers, how to construct architectures from those layers, and how to train them with defined loss functions and backpropagation.
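The modular-layer idea can be sketched as follows. This is a minimal illustration, not the exact API used in the tasks (the function names here are my own): each layer exposes a forward pass that returns its output plus a cache, and a backward pass that turns the upstream gradient into gradients for its inputs and parameters.

```python
import numpy as np

def affine_forward(x, w, b):
    """Compute out = x @ w + b and cache the inputs for the backward pass."""
    out = x @ w + b
    cache = (x, w)
    return out, cache

def affine_backward(dout, cache):
    """Backprop the upstream gradient dout through the affine layer."""
    x, w = cache
    dx = dout @ w.T          # gradient w.r.t. the input
    dw = x.T @ dout          # gradient w.r.t. the weights
    db = dout.sum(axis=0)    # gradient w.r.t. the bias
    return dx, dw, db
```

Chaining such forward calls builds an architecture; calling the backward functions in reverse order implements backprop.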
Contains an implementation of a kNN classifier for the CIFAR10 dataset. Implemented in task1.
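A kNN classifier along these lines can be vectorized with the identity ||a-b||² = ||a||² + ||b||² - 2a·b, so all pairwise distances come from one matrix product. A minimal sketch (function name and majority-vote details are illustrative, not the task's exact code):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Predict labels for X_test by majority vote among the k nearest train points."""
    # Pairwise squared L2 distances with no explicit loops:
    d2 = (
        (X_test ** 2).sum(axis=1, keepdims=True)
        + (X_train ** 2).sum(axis=1)
        - 2.0 * X_test @ X_train.T
    )
    nearest = np.argsort(d2, axis=1)[:, :k]   # indices of the k closest train points
    votes = y_train[nearest]                  # their labels
    # majority vote per test point
    return np.array([np.bincount(v).argmax() for v in votes])
```

The fully vectorized distance computation is the point of the exercise: it is orders of magnitude faster than the one- or two-loop versions.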
Contains an implementation of a fully vectorized SVM for the CIFAR10 dataset. Done in task2.
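The vectorized multiclass hinge loss can be sketched as below (a minimal illustration with L2 regularization; names and the exact regularization convention are assumptions, not the task's code):

```python
import numpy as np

def svm_loss(W, X, y, reg=0.0):
    """Multiclass SVM (hinge) loss and gradient, fully vectorized."""
    N = X.shape[0]
    scores = X @ W                                  # (N, C) class scores
    correct = scores[np.arange(N), y][:, None]      # score of the true class
    margins = np.maximum(0.0, scores - correct + 1.0)
    margins[np.arange(N), y] = 0.0                  # true class contributes no margin
    loss = margins.sum() / N + reg * np.sum(W * W)

    # Gradient: each violated margin pushes its class column up,
    # and the true class down by the number of violations.
    mask = (margins > 0).astype(X.dtype)
    mask[np.arange(N), y] = -mask.sum(axis=1)
    dW = X.T @ mask / N + 2.0 * reg * W
    return loss, dW
```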
Contains an implementation of a fully vectorized Softmax classifier for the CIFAR10 dataset. It is very similar to the SVM, but uses the softmax (cross-entropy) loss instead of the hinge loss. You can find the details in task3.
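The softmax counterpart differs only in the loss and its gradient; a minimal sketch (names are illustrative, and the max-subtraction trick for numerical stability is the standard one):

```python
import numpy as np

def softmax_loss(W, X, y, reg=0.0):
    """Softmax (cross-entropy) loss and gradient, fully vectorized."""
    N = X.shape[0]
    scores = X @ W
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(scores)
    probs /= probs.sum(axis=1, keepdims=True)     # per-row class probabilities
    loss = -np.log(probs[np.arange(N), y]).mean() + reg * np.sum(W * W)
    # Gradient of cross-entropy w.r.t. scores is (probs - one_hot(y)) / N
    dscores = probs.copy()
    dscores[np.arange(N), y] -= 1.0
    dW = X.T @ dscores / N + 2.0 * reg * W
    return loss, dW
```

A quick sanity check: with zero weights the probabilities are uniform, so the loss should be log(C) for C classes.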
Contains a fully vectorized, two-layer NN implementation for the CIFAR10 dataset. Details are in task4.
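The forward pass of such a two-layer network is an affine layer, a ReLU, and a second affine layer producing class scores. A minimal sketch (function name and parameter layout are assumptions, not the task's code):

```python
import numpy as np

def two_layer_forward(X, W1, b1, W2, b2):
    """affine -> ReLU -> affine; returns class scores and the hidden activation."""
    h = np.maximum(0.0, X @ W1 + b1)   # hidden layer with ReLU nonlinearity
    scores = h @ W2 + b2               # (N, C) class scores
    return scores, h
```

The cached hidden activation `h` is what the backward pass needs to compute gradients for `W1` and `W2`.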
Contains a fully vectorized, two-layer NN implementation for next-character prediction. The dataset consists of sentences from a novel; we feed 5 characters and try to predict the 6th. You can find the details in task5.
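Turning raw text into fixed-size (5 characters in, 6th character out) training pairs can be sketched like this, with each window encoded as concatenated one-hot vectors (the function name and encoding choice are illustrative assumptions, not the task's exact preprocessing):

```python
import numpy as np

def make_char_windows(text, window=5):
    """Build (window -> next char) pairs from text, one-hot encoded."""
    chars = sorted(set(text))
    idx = {c: i for i, c in enumerate(chars)}     # char -> integer id
    V = len(chars)                                # vocabulary size
    X, y = [], []
    for i in range(len(text) - window):
        ctx = text[i:i + window]
        # concatenate the one-hot vectors of the context characters
        X.append(np.eye(V)[[idx[c] for c in ctx]].ravel())
        y.append(idx[text[i + window]])           # id of the character to predict
    return np.array(X), np.array(y), chars
```

The resulting `X` has shape (num_windows, window * V) and feeds directly into a fully connected network.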
Contains various experiments on next-character prediction using fully connected layers. You can check the experiments in task7.
- Denoising Auto-Encoders
- CNNs
- Saliency Maps
- Fooling CNNs
- Deep Dreaming
- CIFAR-10 Dataset
- TinyImageNet-100-A
- Vanilla RNNs
- LSTMs
- Image Captioning using the Microsoft COCO Dataset
- Next Character Prediction using RNNs and LSTMs
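The core of a vanilla RNN is a single recurrence, h_t = tanh(x_t·Wx + h_{t-1}·Wh + b), unrolled over the sequence. A minimal NumPy sketch (names and shapes are illustrative assumptions, not the task's code):

```python
import numpy as np

def rnn_step(x, h_prev, Wx, Wh, b):
    """One vanilla RNN timestep: h_t = tanh(x_t @ Wx + h_{t-1} @ Wh + b)."""
    return np.tanh(x @ Wx + h_prev @ Wh + b)

def rnn_forward(X, h0, Wx, Wh, b):
    """Unroll the recurrence over a (T, N, D) input sequence."""
    h = h0
    hs = []
    for x in X:                # iterate over the T timesteps
        h = rnn_step(x, h, Wx, Wh, b)
        hs.append(h)
    return np.stack(hs)        # (T, N, H) hidden states
```

An LSTM replaces the single tanh update with gated input, forget, and output paths, which is what makes longer-range dependencies trainable.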
Basic question answering using a synthetic dataset generated in Turkish.
- Dataset generator
- 3 different approaches using RNNs and fully connected layers
- Comparison of those approaches