
Learn Probability in Computer Science with Stanford University for FREE


Image by Author

 

If you are diving into the world of computer science, or need a refresher on probability, you're in for a treat. Stanford University has recently updated the YouTube playlist for its CS109 course with new content!

The playlist comprises 29 lectures that take you from the basics of probability theory, through essential concepts and the mathematical tools for analysing probabilities, and end with data analysis and machine learning.

So let’s get straight into it…

 

 

Link: Counting

Learn about the history of probability and how it has helped us achieve modern AI, with real-life examples of AI systems such as artificial neural networks and how researchers use probability to build machines. Then understand the two core counting rules: counting with 'steps' and counting with 'or'.

 

 

Link: Combinatorics

The second lecture takes counting to the next level of seriousness: combinatorics, the mathematics of counting and arranging. Dive into counting tasks on n objects: sorting objects (permutations), choosing k objects (combinations), and putting objects into r buckets.
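As a quick sanity check on these counting tasks, Python's standard library handles permutations and combinations directly; the bucket case below uses the standard stars-and-bars identity for identical objects, which is one common reading of the bucket problem rather than necessarily the lecture's exact example.

```python
import math

# Sorting n distinct objects (permutations): n!
assert math.perm(5) == 120
# Arranging k of n objects where order matters: n! / (n - k)!
assert math.perm(5, 2) == 20
# Choosing k of n objects where order does not matter: n! / (k! (n - k)!)
assert math.comb(5, 2) == 10
# Putting n identical objects into r distinct buckets (stars and bars): C(n + r - 1, r - 1)
assert math.comb(5 + 3 - 1, 3 - 1) == 21
```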

 

 

Link: What is Probability?

This is where the course really starts to dive into probability. Learn the core rules of probability through a wide range of examples, with a first look at the Python programming language and how it can be used to work with probability.
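To preview the Python angle, here is a generic sketch (not the course's own code) of estimating an event's probability by relative frequency over many simulated trials:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

trials = 100_000
# Event: a fair coin lands heads; estimate P(heads) by relative frequency
heads = sum(random.random() < 0.5 for _ in range(trials))
estimate = heads / trials  # converges to 0.5 as trials grows
```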

 

 

Link: Probability and Bayes

In this lecture, you will learn how to use conditional probabilities, the chain rule, the law of total probability, and Bayes' theorem.
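These tools compose neatly. Here is a hypothetical diagnostic-test example (the numbers are made up for illustration) that uses the law of total probability and then Bayes' theorem:

```python
# Hypothetical numbers: disease prevalence and test accuracy
p_d = 0.01                 # P(disease)
p_pos_given_d = 0.99       # P(positive | disease)
p_pos_given_not_d = 0.05   # P(positive | no disease)

# Law of total probability: P(positive)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' theorem: P(disease | positive)
p_d_given_pos = p_pos_given_d * p_d / p_pos  # about 0.167
```

Note how a positive result from an accurate test still leaves only about a 1-in-6 chance of disease, because the disease is rare.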

 

 

Link: Independence

In this lecture, you will learn about mutually exclusive and independent events, and how the AND/OR rules differ in each case. The lecture works through a variety of examples so you can get a good grasp.
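The two cases can be summarised in a few lines; the probabilities below are arbitrary illustrative values:

```python
p_a, p_b = 0.5, 0.3

# Independent events: P(A and B) = P(A) * P(B)
p_and = p_a * p_b                  # 0.15
# General OR rule: P(A or B) = P(A) + P(B) - P(A and B)
p_or = p_a + p_b - p_and           # 0.65
# Mutually exclusive events: P(A and B) = 0, so the OR rule simplifies
p_or_exclusive = p_a + p_b         # 0.8
```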

 

 

Link: Random Variables and Expectations

Building on your knowledge of conditional probabilities and independence from the previous lectures, this lecture dives into random variables. You will use and derive the probability mass function of a random variable, and calculate expectations.
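A classic worked example of both ideas, assuming the familiar two-dice setting rather than the lecture's specific one, is to build the PMF of the sum of two fair dice and take its expectation:

```python
from collections import Counter
from fractions import Fraction

# PMF of X = sum of two fair six-sided dice
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

# Expectation: E[X] = sum over x of x * P(X = x)
expected = sum(x * p for x, p in pmf.items())  # exactly 7
```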

 

 

Link: Variance Bernoulli Binomial

You will now use your knowledge to solve progressively harder problems. Your goals for this lecture are to recognise and use Bernoulli and binomial random variables, and to calculate the variance of a random variable.
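The binomial mean and variance have closed forms, which a direct PMF computation confirms (a Bernoulli variable is just the n = 1 case); the parameters here are arbitrary:

```python
import math

# X ~ Binomial(n, p): closed forms are E[X] = n*p, Var(X) = n*p*(1-p)
n, p = 10, 0.3

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

mean = sum(k * binom_pmf(k) for k in range(n + 1))                   # 3.0
variance = sum((k - mean)**2 * binom_pmf(k) for k in range(n + 1))   # 2.1
```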

 

 

Link: Poisson

The Poisson distribution is great when you have a rate and care about the number of occurrences. You will learn how it is used in different settings, along with Python code examples.
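The Poisson PMF is short enough to write out directly; the rate below is an illustrative value, not one from the lecture:

```python
import math

# Poisson(lam): P(X = k) = lam**k * e**(-lam) / k!
lam = 4  # e.g. an average of 4 events per interval (illustrative)

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

p_zero = poisson_pmf(0, lam)                           # e**-4, about 0.018
total = sum(poisson_pmf(k, lam) for k in range(100))   # sums to ~1.0
```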

 

 

Link: Continuous Random Variables

The goals of this lecture are to become comfortable using continuous random variables, to integrate a density function to get a probability, and to use a cumulative distribution function to get a probability.
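Both routes to a probability can be checked against each other. This sketch assumes an exponential distribution (a standard continuous example, not necessarily the lecture's) and compares a numeric integral of the density with the closed-form CDF:

```python
import math

# Exponential(lam): density f(x) = lam * e**(-lam x), CDF F(x) = 1 - e**(-lam x)
lam = 2.0
pdf = lambda x: lam * math.exp(-lam * x)
cdf = lambda x: 1 - math.exp(-lam * x)

# P(0 <= X <= 1) two ways: integrate the density numerically, or use the CDF
n = 100_000
dx = 1.0 / n
area = sum(pdf((i + 0.5) * dx) * dx for i in range(n))  # midpoint rule
# area and cdf(1.0) agree to many decimal places
```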

 

 

Link: Normal Distribution

You may well have heard of the normal distribution before. In this lecture, you will go through a brief history of the normal distribution, what it is, why it is important, and practical examples.
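Python's standard library ships a normal distribution, so the familiar two-standard-deviations rule is easy to verify; the IQ parameterisation below is the conventional textbook one, used here only as an example:

```python
from statistics import NormalDist

# IQ scores are classically modelled as Normal(mu=100, sigma=15)
iq = NormalDist(mu=100, sigma=15)

# P(70 <= X <= 130), i.e. within two standard deviations of the mean
p = iq.cdf(130) - iq.cdf(70)  # about 0.9545
```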

 

 

Link: Joint Distributions

In the previous lectures you worked with at most two random variables; the next step is to handle any number of random variables at once.

 

 

Link: Inference

The learning goals of this lecture are to use multinomials, appreciate the utility of log probabilities, and apply Bayes' theorem to random variables.
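The utility of log probabilities is easy to demonstrate: multiplying many small probabilities underflows floating point to zero, while summing their logs stays stable.

```python
import math

# Multiplying many small probabilities underflows to 0.0 in floating point;
# summing log-probabilities keeps the computation numerically stable
probs = [1e-5] * 100
direct = math.prod(probs)                    # underflows to 0.0
log_total = sum(math.log(p) for p in probs)  # about -1151.3, no underflow
```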

 

 

Link: Inference II

This lecture continues the previous one's goal of combining Bayes' theorem with random variables.

 

 

Link: Modelling

In this lecture, you will take everything you have learned so far and apply it to real-life problems through probabilistic modelling: representing a whole collection of random variables that vary together.

 

 

Link: General Inference

You will dive into general inference, and in particular, learn about an algorithm called rejection sampling. 
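Rejection sampling can be sketched in a few lines. This toy example (my own, not the lecture's) samples from the triangular density f(x) = 2x on [0, 1] by proposing uniforms and accepting with probability f(x) / M, where M = 2 bounds the density:

```python
import random

random.seed(1)

# Target density: f(x) = 2x on [0, 1]. Propose x ~ Uniform(0, 1) and
# accept with probability f(x) / M, where M = 2 bounds the density.
def sample_target():
    while True:
        x = random.random()
        if random.random() < x:  # f(x) / M = 2x / 2 = x
            return x

samples = [sample_target() for _ in range(50_000)]
mean = sum(samples) / len(samples)  # the true mean of f is 2/3
```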

 

 

Link: Beta

This lecture covers random variables over probabilities, which are used to solve real-world problems. Beta is a distribution for probabilities: its values range between 0 and 1.
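You can see both properties — support on [0, 1] and mean a / (a + b) — by simulation; the shape parameters below are arbitrary:

```python
import random

random.seed(2)

# Beta(a, b) lives on [0, 1]; its mean is a / (a + b)
a, b = 3, 7
draws = [random.betavariate(a, b) for _ in range(100_000)]

assert all(0.0 <= d <= 1.0 for d in draws)  # every draw is a valid probability
mean = sum(draws) / len(draws)              # close to 3 / 10 = 0.3
```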

 

 

Link: Adding Random Variables I

At this point in the course you are learning deeper theory, and adding random variables is your introduction to deriving results in the theory of probability.

 

 

Link: Central Limit Theorem

In this lecture, you will dive into the central limit theorem, an important result in probability. You will go through practical examples so that you can grasp the concept.
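A minimal simulation of the theorem, assuming uniform draws as the underlying distribution: the means of n i.i.d. Uniform(0, 1) samples cluster around 0.5 with standard deviation close to sqrt(1 / (12n)).

```python
import random

random.seed(3)

# CLT: means of n i.i.d. Uniform(0, 1) draws are approximately
# Normal(0.5, 1 / (12 n)) for moderately large n
n, reps = 30, 20_000
means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]

grand_mean = sum(means) / reps                                    # near 0.5
spread = (sum((m - grand_mean)**2 for m in means) / reps) ** 0.5  # near 0.0527
```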

 

 

Link: Bootstrapping and P-Values I

You will now move into uncertainty theory, sampling, and bootstrapping, which is inspired by the central limit theorem. You will go through practical examples.
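The bootstrap idea in miniature, on a made-up sample: resample with replacement many times, recompute the statistic each time, and read a confidence interval off the percentiles.

```python
import random

random.seed(4)

# Hypothetical sample (made-up numbers, just for illustration)
data = [2.1, 2.5, 2.2, 3.0, 2.8, 2.4, 2.9, 2.6]

# Bootstrap: resample with replacement, recompute the statistic each time
boot_means = sorted(
    sum(random.choices(data, k=len(data))) / len(data)
    for _ in range(10_000)
)
# ~95% percentile interval for the mean
lo, hi = boot_means[250], boot_means[9750]
```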

 

 

Link: Algorithmic Analysis

In this lecture, you will dive a bit deeper into computer science with an in-depth look at the analysis of algorithms: the process of finding the computational complexity of an algorithm.
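Probability enters algorithmic analysis through average-case behaviour. As a small sketch (my own example, not the lecture's), the expected number of comparisons in a linear search for a uniformly random target is (n + 1) / 2, which simulation confirms:

```python
import random

random.seed(5)

# Comparisons made by linear search before finding the target index
def comparisons(n, target):
    for i in range(n):
        if i == target:
            return i + 1
    return n

# Average over uniformly random targets: expectation is (n + 1) / 2
n, reps = 100, 20_000
avg = sum(comparisons(n, random.randrange(n)) for _ in range(reps)) / reps
# avg is close to 50.5
```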

 

 

Link: M.L.E.

This lecture dives into parameter estimation, a key bridge to machine learning. This is where you take your knowledge of probability and apply it to machine learning and artificial intelligence.
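The simplest instance of maximum likelihood estimation: for i.i.d. Bernoulli data, maximising the likelihood gives the sample proportion (the observations below are invented):

```python
# MLE for a Bernoulli parameter p: maximizing the likelihood of
# i.i.d. 0/1 data gives p_hat = (# of ones) / n
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # made-up observations
p_hat = sum(data) / len(data)           # 0.7
```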

 

 

Link: M.A.P.

We're still taking core principles of probability and applying them to machine learning. In this lecture, you will focus on estimating parameters in machine learning using probability and random variables.
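MAP estimation adds a prior to the MLE picture. Using the standard Beta-Bernoulli formula with made-up numbers, the prior pulls the estimate toward 0.5 compared with the MLE of 0.7:

```python
# MAP estimate for a Bernoulli p with a Beta(a, b) prior:
# p_map = (heads + a - 1) / (n + a + b - 2)
heads, n = 7, 10
a, b = 2, 2                                 # mild prior pulling toward 0.5
p_map = (heads + a - 1) / (n + a + b - 2)   # 8 / 12, about 0.667
```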

 

 

Link: Naive Bayes

Naive Bayes is the first machine learning algorithm you will study in depth. Having learnt the theory of parameter estimation, you will now see how core algorithms such as Naive Bayes lead towards ideas such as neural networks.

 

 

Link: Logistic Regression

In this lecture, you will dive into a second algorithm, logistic regression, which is used for classification tasks; you will learn more about classification along the way.
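At its core, logistic regression squashes a linear score through the sigmoid function to get a class probability; the weights below are made up for illustration, not fitted to any data:

```python
import math

# Logistic regression models P(y = 1 | x) = sigmoid(w * x + b)
def sigmoid(z):
    return 1 / (1 + math.exp(-z))

w, b = 1.5, -2.0        # illustrative weights, not fitted
x = 2.0
p = sigmoid(w * x + b)  # sigmoid(1.0), about 0.731
label = 1 if p >= 0.5 else 0  # classify by thresholding the probability
```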

 

 

Link: Deep Learning

Now that you've started to dive into machine learning, this lecture goes into further detail about deep learning, building on what you have already learned.

 

 

Link: Fairness

We live in a world where machine learning is being implemented in our day-to-day lives. In this lecture, you will look into the fairness around machine learning, with a focus on ethics. 

 

 

Link: Advanced Probability

You have learnt a lot about the basics of probability, applied it in different scenarios, and seen how it relates to machine learning algorithms. The next step is to go a bit deeper into probability.

 

 

Link: Future of Probability

The learning goal for this lecture is to understand the future uses of probability and the variety of problems it can be applied to solve.

 

 

Link: Final Review

And last but not least, the final lecture. It reviews the other 28 lectures and clears up any remaining uncertainties.

 

 

Finding good material for your learning journey can be difficult. This probability for computer science course is excellent and can help you grasp the concepts of probability you were unsure of or needed to brush up on.
 
 

Nisha Arya is a Data Scientist and Freelance Technical Writer. She is particularly interested in providing Data Science career advice or tutorials and theory based knowledge around Data Science. She also wishes to explore the different ways Artificial Intelligence is/can benefit the longevity of human life. A keen learner, seeking to broaden her tech knowledge and writing skills, whilst helping guide others.

