Switching to Chinese version: 中文版 🎈
📚 URL to gitbook (Only in Chinese version for now): https://lyhue1991.github.io/eat_tensorflow2_in_30_days
🚀 URL to github repo (Chinese): https://github.com/lyhue1991/eat_tensorflow2_in_30_days/tree/master
🚀 URL to github repo (English): https://github.com/lyhue1991/eat_tensorflow2_in_30_days/tree/english
Conclusion first:
For engineers, TensorFlow2 should be the priority.
For students and researchers, PyTorch should be the first choice.
If time permits, it is best to master both.
Reasons:
- Model deployment matters most in industry. At present, the majority of internet enterprises (in China) only provide deployment support for TensorFlow models, not PyTorch. Moreover, industry prefers models with high availability, so in most cases it relies on well-validated model architectures that require minimal adjustment.

- Fast iteration and publication matter most to researchers, who need to try out many new models. PyTorch is easier to get started with and to debug than TensorFlow2, and it has been the most frequently used framework in academia since 2019, with a large body of cutting-edge results released in it.

- Overall, TensorFlow2 and PyTorch have become quite similar in programming style (see the sketch after this list), so mastering one makes it easier to learn the other. Mastering both gives you access to far more open-source models and makes it easy to switch between them.
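As a rough illustration of this similarity (this snippet is not from the book), a TensorFlow2 training step can be written in the same imperative, eager style that PyTorch users are accustomed to:

import tensorflow as tf

# TensorFlow2 executes eagerly by default, so a training step reads like
# ordinary imperative Python, much as it does in PyTorch.
w = tf.Variable(2.0)

def train_step(x, y, lr=0.1):
    with tf.GradientTape() as tape:
        loss = tf.square(w * x - y)   # simple squared error
    grad = tape.gradient(loss, w)
    w.assign_sub(lr * grad)           # plain gradient-descent update
    return loss

for step in range(5):
    loss = train_step(tf.constant(3.0), tf.constant(12.0))
    tf.print("step", step, "loss", loss, "w", w)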
Conclusion first:
The standalone Keras library will no longer be developed after version 2.3.0, so use tf.keras.
Keras is a high-level API specification for deep learning frameworks. It helps users define and train deep learning networks in a more intuitive way.
The Keras library installed by pip implements this high-level API on top of several backends, including TensorFlow, Theano and CNTK.
tf.keras is the high-level API built into TensorFlow, implemented on top of TensorFlow's low-level APIs.
Most, but not all, of the functions in tf.keras are identical to those in the multi-backend Keras, and tf.keras is more tightly integrated with TensorFlow than Keras is.
Following its acquisition by Google, Keras will not be updated after version 2.3.0, so users should use tf.keras from now on instead of the Keras installed by pip.
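In most cases the switch only means changing the import. A minimal sketch (the toy model below is purely illustrative, not an example from the book):

import tensorflow as tf
from tensorflow import keras  # use tf.keras instead of the standalone `import keras`

# A tiny Sequential model; essentially the same code would also run under the
# multi-backend Keras, which is what makes switching to tf.keras painless.
model = keras.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()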
It is suggested that readers have fundamental knowledge of machine/deep learning and some modeling experience with Keras or TensorFlow 1.0.
For those with no experience in machine/deep learning, it is strongly suggested to read "Deep Learning with Python" alongside this book.
"Deep Learning with Python" was written by François Chollet, the creator of Keras. It is based on Keras and assumes no machine learning background from the reader.
"Deep Learning with Python" is easy to understand because it teaches through a wide variety of examples. It contains no mathematical equations and focuses instead on building an intuition for deep learning.
It is an exceptionally reader-friendly introductory reference: the author's minimum goal is that readers do not give up because of difficulty, and "don't make the reader think" is the ultimate goal.
This book was written mainly with reference to TensorFlow's official documentation and function documentation.
However, the authors thoroughly restructured the content and heavily optimized the examples.
Unlike the official documentation, which mixes tutorials and guides without a systematic structure, this book reorganizes the content according to difficulty, readers' searching habits, and TensorFlow's architecture, so that learning TensorFlow progresses along a clear path with easy access to the corresponding examples.
In contrast to the often verbose demonstration code elsewhere, the authors keep the examples in this book as short as possible so they are easy to read and to implement. Moreover, most of the code cells can be dropped into your own projects right away.
If learning TensorFlow from the official documentation has a difficulty of 9, learning it from this book is closer to a 3.
This difference is illustrated in the figure below:
(1) Study Plan
The authors wrote this book in their spare time, especially during the unexpected two-month "holiday" caused by COVID-19. Most readers should be able to fully master the content within 30 days.
The time required each day is between 30 minutes and 2 hours.
This book can also be used as a library of reference examples when implementing machine learning projects with TensorFlow2.
Click the blue captions to enter the corresponding chapter.
(2) Software environment for studying
All the source code has been tested in Jupyter. It is suggested to clone the repository to your local machine and run the code in Jupyter for an interactive learning experience.
The authors suggest installing jupytext, which converts markdown files into ipynb notebooks, so that the markdown chapters can be opened directly in Jupyter.
#For readers in mainland China, cloning from gitee is faster
#!git clone https://gitee.com/Python_Ai_Road/eat_tensorflow2_in_30_days
#It is suggested to install jupytext so that the markdown files can be opened and run as ipynb notebooks
#!pip install -i https://pypi.tuna.tsinghua.edu.cn/simple -U jupytext
#It is also suggested to install the latest version of TensorFlow to test the demonstration code in this book
#!pip install -i https://pypi.tuna.tsinghua.edu.cn/simple -U tensorflow
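#Once jupytext is installed, a markdown chapter can also be converted into an ipynb notebook from the command line (the file name below is only a placeholder)
#!jupytext --to notebook Chapter1.md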
import tensorflow as tf
#Note: all the code in this book has been tested under TensorFlow 2.1
tf.print("tensorflow version:",tf.__version__)
a = tf.constant("hello")
b = tf.constant("tensorflow2")
c = tf.strings.join([a,b]," ")
tf.print(c)
tensorflow version: 2.1.0
hello tensorflow2
If you find this book helpful and would like to support the author, please give this repository a star ⭐️ and don't forget to share it with your friends 😊
If you want to discuss the content with the author, please leave comments on the WeChat official account "Python与算法之美" (Elegance of Python and Algorithms). The author will do his best to reply, given the limited time available.