
Machine learning - Coursera

This repository summarizes my progress and contains my solutions to the programming assignments from Andrew Ng's machine learning course on Coursera.

PS: The course includes tutorials on Octave, so all the suggested solutions are implemented in Octave.

About the course

The course takes about 11 weeks to complete. It covers how to apply state-of-the-art machine learning algorithms to problems such as spam filtering, image recognition, clustering, and building recommender systems. It also covers how to select the right algorithm for the job, as well as 'debugging' a learning algorithm and figuring out how to improve its performance.

Link to the course: https://www.coursera.org/learn/machine-learning/

Week-by-week summary:

week 1:

During this week, I got to know what machine learning is and its two main learning types (supervised and unsupervised). I also saw how linear regression predicts a real-valued output from an input value. The week applies linear regression to housing price prediction, presents the notion of a cost function, and introduces the gradient descent method for learning. There is also an optional module this week that provides a refresher on the linear algebra concepts needed for the rest of the course.
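
For reference, here is a minimal Octave sketch of the squared-error cost function from this week; it assumes X already carries a leading column of ones for the intercept term and that theta is a column vector (the function name matches the assignment's computeCost.m, but the body here is my own sketch):

```octave
% Squared-error cost for linear regression (week 1).
% X: m x (n+1) design matrix with a leading column of ones,
% y: m x 1 targets, theta: (n+1) x 1 parameters.
function J = computeCost(X, y, theta)
  m = length(y);              % number of training examples
  errors = X * theta - y;     % prediction error for each example
  J = (1 / (2 * m)) * sum(errors .^ 2);
end
```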

week 2:

After completing week 1 and getting to know linear regression, a natural question arises: "What if the input has more than one feature?". This week contains a module that answers that question and shows how linear regression can be extended to accommodate multiple input features. It also discusses best practices for implementing linear regression. Another module covers the actual implementation in code: it introduces Octave/Matlab basics, shows how to manipulate data in Octave, and explains how to submit an assignment. Finally, the week ends with a programming assignment in Octave, which we'll talk about in detail in the next section of this file.
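
As a companion to the summary above, here is a hedged sketch of vectorized batch gradient descent for any number of features; alpha is the learning rate, and all parameters are updated simultaneously (the name follows the assignment's gradientDescent.m, but the body is illustrative):

```octave
% Batch gradient descent for linear regression (weeks 1-2).
function theta = gradientDescent(X, y, theta, alpha, num_iters)
  m = length(y);
  for iter = 1:num_iters
    gradient = (1 / m) * X' * (X * theta - y);  % vector of partial derivatives
    theta = theta - alpha * gradient;           % simultaneous parameter update
  end
end
```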

week 3:

This week introduces the notion of classification, the cost function for logistic regression, and the application of logistic regression to multi-class classification, as well as the overfitting problem and regularization, which helps prevent models from overfitting the training data.
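
For concreteness, here is a small Octave sketch of the sigmoid function and the (unregularized) logistic regression cost discussed this week; sigmoid matches the course's sigmoid.m, while logisticCost is an illustrative name of my own:

```octave
% Sigmoid: maps any real input to (0, 1); works element-wise.
function g = sigmoid(z)
  g = 1 ./ (1 + exp(-z));
end

% Cross-entropy cost for logistic regression (no regularization yet).
function J = logisticCost(theta, X, y)
  m = length(y);
  h = sigmoid(X * theta);   % predicted probabilities in (0, 1)
  J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h));
end
```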

week 4:

This week covers neural networks. I learned that a neural network is a model inspired by how the brain works. It is widely used today in many applications: when your phone interprets and understands your voice commands, a neural network is likely helping to understand your speech; when you cash a check, the machines that automatically read the digits also use neural networks.
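
To make this concrete, here is a hedged sketch of the forward pass through a 3-layer network of the kind used for digit recognition later in the course; Theta1 and Theta2 are assumed to be pre-trained weight matrices, and sigmoid is the function sketched under week 3:

```octave
% Forward propagation through a 3-layer network (input, hidden, output).
% Theta1: hidden x (n+1), Theta2: K x (hidden+1), X: m x n.
function p = predictNN(Theta1, Theta2, X)
  m = size(X, 1);
  a1 = [ones(m, 1) X];          % add bias unit to the input layer
  a2 = sigmoid(a1 * Theta1');   % hidden layer activations
  a2 = [ones(m, 1) a2];         % add bias unit to the hidden layer
  a3 = sigmoid(a2 * Theta2');   % output layer: one probability per class
  [~, p] = max(a3, [], 2);      % predicted class = most probable output
end
```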

week 5:

week 6:

week 7:

week 8:

week 9:

week 10:

week 11:

Programming assignments:

You can find the documents for all the programming assignments inside the "pdfs" folder, and all the resources and files needed for each assignment inside the "All exercises" folder in this repository. The solution for each assignment is in a separate folder at the root of this repository.

ex1_week2:

As mentioned in the summary of week 2, that week ends with an exercise. In this exercise, I implemented linear regression and got to see it work on data. I applied what I learned in the last module of the week: plotting and visualizing data, fitting the linear regression parameters θ to the dataset using gradient descent, computing the cost, and debugging.
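
A possible driver for this workflow, assuming the data file ex1data1.txt from the assignment handout (one feature column, one target column) and the gradientDescent sketch from the week 2 section above; the learning rate and iteration count here are illustrative:

```octave
data = load('ex1data1.txt');          % load the assignment's training data
X = data(:, 1);  y = data(:, 2);  m = length(y);
plot(X, y, 'rx');                     % visualize the raw data
X = [ones(m, 1) X];                   % add the intercept term
theta = zeros(2, 1);                  % initialize fitting parameters
theta = gradientDescent(X, y, theta, 0.01, 1500);
fprintf('Fitted theta: %f %f\n', theta(1), theta(2));
```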

ex2_week3:

This programming assignment is about implementing logistic regression and applying it to two different datasets. First of all, I wrote the sigmoid function, then both the cost and gradient functions for logistic regression. After that, I applied regularization to prevent my model from overfitting the training data.
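
Here is a hedged sketch of the regularized cost and gradient in the spirit of the assignment's costFunctionReg.m; lambda is the regularization strength, and theta(1), the intercept, is deliberately not penalized:

```octave
% Regularized logistic regression cost and gradient (ex2).
function [J, grad] = costFunctionReg(theta, X, y, lambda)
  m = length(y);
  h = sigmoid(X * theta);
  reg = (lambda / (2 * m)) * sum(theta(2:end) .^ 2);  % skip the bias term
  J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h)) + reg;
  grad = (1 / m) * X' * (h - y);
  grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```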

ex3_week4:

In this exercise, I implemented one-vs-all logistic regression and neural networks to recognize handwritten digits.
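
The one-vs-all prediction step can be sketched as below, following the assignment's predictOneVsAll.m; all_theta is assumed to stack one trained logistic regression classifier per row, one row per digit class:

```octave
% One-vs-all prediction: pick the class whose classifier is most confident.
function p = predictOneVsAll(all_theta, X)
  m = size(X, 1);
  X = [ones(m, 1) X];               % add the intercept term
  probs = sigmoid(X * all_theta');  % m x K matrix of class probabilities
  [~, p] = max(probs, [], 2);       % index of the most confident class
end
```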

ex4_week5:

ex5_week6:

ex6_week7:

ex7_week8:
