
A couple of years ago I completed the Deep Learning Specialization taught by AI pioneer Andrew Ng. I found this series of courses immensely helpful in my deep learning journey, and I have since decided to prepare and share some notes that highlight the key concepts I learned in the specialization.
The content of these documents is mainly adapted from this GitHub repository. I have added a multitude of explanations, illustrations, and visualizations to make some complex concepts easier for readers to grasp. These could be good references for Machine Learning Engineers, Deep Learning Engineers, and Data Scientists to refresh their knowledge of the fundamentals of Deep Learning. You can download the notes for each course from the following links.

Course 1: Neural Networks and Deep Learning

In this course, you will get a high-level introduction to the idea behind Deep Learning and how it has revolutionized the modern Big Data era. By the end of the course, you will know how to build your own neural network and train it on some sample data. This will enable you to understand what’s really going on under the hood instead of just treating neural networks as a black box.
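To give a flavor of what "under the hood" means here, below is a minimal NumPy sketch of a one-hidden-layer network trained with plain gradient descent on toy data. It is my own illustration, not code from the course assignments, and the layer sizes, learning rate, and toy labels are arbitrary choices.

```python
# Minimal sketch: a one-hidden-layer network trained with gradient descent.
# Purely illustrative; sizes and hyperparameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 200))                 # 2 features, 200 examples
Y = (X[0] * X[1] > 0).astype(float)[None, :]  # toy binary labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Parameters: hidden layer of 4 tanh units, one sigmoid output unit.
W1, b1 = rng.normal(size=(4, 2)) * 0.1, np.zeros((4, 1))
W2, b2 = rng.normal(size=(1, 4)) * 0.1, np.zeros((1, 1))

lr, m = 0.5, X.shape[1]
for _ in range(2000):
    # Forward pass
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)
    A2 = sigmoid(W2 @ A1 + b2)
    # Backward pass (binary cross-entropy loss)
    dZ2 = A2 - Y
    dW2, db2 = dZ2 @ A1.T / m, dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)
    dW1, db1 = dZ1 @ X.T / m, dZ1.mean(axis=1, keepdims=True)
    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```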

Course 2: Improving Deep Neural Networks

This course teaches how to optimize a model’s performance by applying a range of algorithms and techniques: for instance, tuning the learning rate, the number of layers, and the number of neurons in each layer. It then covers regularization techniques such as dropout and Batch Normalization, and ends with an optimization section that discusses stochastic gradient descent, momentum, RMSProp, and the Adam optimization algorithm.
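As a taste of that optimization section, here is a hedged sketch of the Adam update rule, which combines momentum with RMSProp-style scaling. The function and variable names are my own and follow the usual convention; this is not code taken from the course.

```python
# Illustrative Adam update: momentum + RMSProp scaling with bias correction.
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (RMSProp) estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Usage: keep m, v per parameter and increment t every mini-batch.
w, m, v = np.ones(3), np.zeros(3), np.zeros(3)
w, m, v = adam_step(w, grad=np.array([0.1, -0.2, 0.3]), m=m, v=v, t=1)
```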

Course 3: Structuring Machine Learning Projects

This course is all about how to build ML projects, get results quickly, and iterate to improve them. It gives delightful insights into diagnosing model outcomes so that we can see where a performance problem, if there is one, is coming from: a small training set, different train and test set distributions, overfitting, and other problems are covered, along with their solutions.
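A tiny illustrative helper (my own, not from the course) for the kind of error comparison this diagnosis relies on, assuming you already have train/dev error rates and an estimate of human-level error:

```python
# Sketch of bias/variance diagnosis from error gaps; thresholds are illustrative.
def diagnose(train_err, dev_err, human_err=0.0):
    avoidable_bias = train_err - human_err   # gap to human-level performance
    variance = dev_err - train_err           # gap between train and dev performance
    if avoidable_bias >= variance:
        return f"Focus on bias (~{avoidable_bias:.1%}): bigger model, train longer."
    return f"Focus on variance (~{variance:.1%}): more data, regularization."

print(diagnose(train_err=0.08, dev_err=0.10, human_err=0.01))  # bias-limited example
```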

Course 4: Convolutional Neural Networks (CNNs)

This course teaches how to build convolutional neural networks and apply them to image data. When it comes to computer vision, CNNs are the bees’ knees. CNNs have given rise to incredible improvements in facial recognition, classifying X-ray reports, and self-driving car systems. The notes are based on the lecture videos, the supplementary material provided, and my own understanding of the topics.
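The core operation behind all of this is the convolution a CNN layer applies to an image. Below is a minimal, framework-free NumPy sketch of a single-channel "valid" convolution (really cross-correlation, as the course notes); it is my own illustration and the edge filter is just an example.

```python
# Illustrative single-channel 2-D convolution ("valid" padding, no framework).
import numpy as np

def conv2d(image, kernel, stride=1):
    """Convolve an (H, W) image with an (f, f) filter."""
    H, W = image.shape
    f = kernel.shape[0]
    out_h, out_w = (H - f) // stride + 1, (W - f) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i * stride:i * stride + f, j * stride:j * stride + f]
            out[i, j] = np.sum(patch * kernel)  # element-wise product, then sum
    return out

edge_filter = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]])  # vertical-edge detector
print(conv2d(np.random.rand(6, 6), edge_filter).shape)         # -> (4, 4)
```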

Course 5: Sequence Models

In this course, you will become familiar with Sequence Models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more. You will be able to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs. You will apply RNNs to Character-level Language Modeling, gain experience with Natural Language Processing and Word Embeddings, and use HuggingFace tokenizers and Transformer Models to solve different NLP tasks such as Named Entity Recognition and Question Answering.
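For intuition, here is a hedged NumPy sketch of a single forward step of a vanilla RNN cell, the building block behind the character-level language model. The shapes and weight names are my own choices, not the course's exact assignment code.

```python
# Illustrative vanilla RNN cell: one forward step with a softmax output.
import numpy as np

def rnn_cell_forward(x_t, a_prev, Wax, Waa, Wya, ba, by):
    """x_t: (n_x, m) input at time t; a_prev: (n_a, m) previous hidden state."""
    a_t = np.tanh(Wax @ x_t + Waa @ a_prev + ba)   # new hidden state
    logits = Wya @ a_t + by                        # unnormalized output scores
    y_t = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)  # softmax
    return a_t, y_t

n_x, n_a, n_y, m = 27, 64, 27, 1                   # e.g. 27 characters, batch of 1
rng = np.random.default_rng(0)
a, y = rnn_cell_forward(rng.random((n_x, m)), np.zeros((n_a, m)),
                        rng.random((n_a, n_x)) * 0.01, rng.random((n_a, n_a)) * 0.01,
                        rng.random((n_y, n_a)) * 0.01,
                        np.zeros((n_a, 1)), np.zeros((n_y, 1)))
print(a.shape, y.shape)  # (64, 1) (27, 1)
```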

You can also download all notes as a single file from the following link:

Happy Learning!
