For those diving into the world of computer science or needing a touch-up on their probability knowledge, you're in for a treat. Stanford University has recently updated the YouTube playlist for its CS109 course with new content!
The playlist contains 29 lectures that give you gold-standard knowledge of the fundamentals of probability theory, essential concepts in probability theory, and mathematical tools for analyzing probabilities, finishing with data analysis and Machine Learning.
So let’s get straight into it…
Link: Counting
Learn about the history of probability and how it has helped us achieve modern AI, with real-life examples of developing AI systems. Understand the core counting phases: counting with 'steps' and counting with 'or'. This includes areas such as artificial neural networks and how researchers use probability to build machines.
Link: Combinatorics
The second lecture moves to the next level of serious counting, known as combinatorics. Combinatorics is the mathematics of counting and arranging. Dive into counting tasks on n objects: sorting objects (permutations), choosing k objects (combinations), and putting objects in r buckets.
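As a quick illustration (not taken from the lecture itself), Python's standard library can compute these counts directly:

```python
import math

def permutations(n: int) -> int:
    # Number of ways to arrange n distinct objects: n!
    return math.factorial(n)

def combinations(n: int, k: int) -> int:
    # Number of ways to choose k objects from n, order ignored: n! / (k!(n-k)!)
    return math.comb(n, k)

print(permutations(4))     # 24 arrangements of 4 objects
print(combinations(5, 2))  # 10 ways to pick 2 out of 5
```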
Link: What is Probability?
This is where the course really begins to dive into probability. Learn about the core rules of probability with a range of examples and a touch on the Python programming language and its use with probability.
Link: Probability and Bayes
In this lecture, you'll dive into learning how to use conditional probabilities, the chain rule, the law of total probability, and Bayes' theorem.
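To make this concrete, here is a minimal sketch of Bayes' theorem in Python, using a hypothetical disease-testing example (the prevalence and test-accuracy numbers below are made up for illustration):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical numbers: a disease with 1% prevalence, a test with
# 95% sensitivity and a 5% false-positive rate.
p_disease = 0.01
p_pos_given_disease = 0.95
p_pos_given_healthy = 0.05

# Law of total probability: P(pos) = P(pos|D)P(D) + P(pos|not D)P(not D)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # about 0.161 despite the positive test
```

The punchline is a classic one: with a rare disease, even a fairly accurate test leaves the posterior probability surprisingly low.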
Link: Independence
In this lecture, you'll learn about probability with respect to events being mutually exclusive and independent, using AND/OR. The lecture goes through a variety of examples so you can get a good grasp.
Link: Random Variables and Expectations
Building on the earlier lectures and your knowledge of conditional probabilities and independence, this lecture dives into random variables: using and producing the probability mass function of a random variable, and being able to calculate expectations.
Link: Variance Bernoulli Binomial
You will now use your knowledge to solve harder and harder problems. Your goal for this lecture is to recognise and use Binomial and Bernoulli random variables, and to be able to calculate the variance of random variables.
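As a hedged sketch of these formulas (not taken from the lecture), here is how the mean, variance, and PMF of Bernoulli and Binomial random variables look in plain Python:

```python
import math

def bernoulli_stats(p):
    # Bernoulli(p): E[X] = p, Var(X) = p(1 - p)
    return p, p * (1 - p)

def binomial_stats(n, p):
    # Binomial(n, p) is a sum of n independent Bernoulli(p) trials,
    # so E[X] = np and Var(X) = np(1 - p)
    return n * p, n * p * (1 - p)

def binomial_pmf(n, p, k):
    # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_stats(10, 0.5))  # mean 5.0, variance 2.5
```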
Link: Poisson
Poisson is great when you have a rate and you care about the number of occurrences. You will learn about how it can be used in different applications, along with Python code examples.
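As a minimal example (the request rate below is a made-up number, not one from the lecture), the Poisson PMF is easy to compute with the standard library:

```python
import math

def poisson_pmf(lam: float, k: int) -> float:
    # P(X = k) = e^(-lam) * lam^k / k!, where lam is the rate per interval
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical scenario: a server averages 3 requests per second;
# what is the probability of seeing exactly 5 requests in one second?
print(round(poisson_pmf(3, 5), 4))
```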
Link: Continuous Random Variables
The goals of this lecture include being comfortable using new continuous random variables, integrating a density function to get a probability, and using a cumulative distribution function to get a probability.
Link: Normal Distribution
You may have heard about the normal distribution before. In this lecture, you'll go through a brief history of the normal distribution, what it is, why it is important, and practical examples.
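For a quick hands-on check (an illustrative sketch, not the lecture's code), Python's `math.erf` gives the normal CDF, which recovers the familiar 68% rule:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # P(X <= x) for X ~ Normal(mu, sigma^2), via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# About 68% of the probability mass lies within one standard deviation
print(round(normal_cdf(1) - normal_cdf(-1), 4))  # 0.6827
```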
Link: Joint Distributions
In the earlier lectures, you will have worked with at most 2 random variables; the next step is learning to work with any given number of random variables.
Link: Inference
The learning goals of this lecture are to use multinomials, appreciate the utility of log probabilities, and be able to use Bayes' theorem with random variables.
Link: Inference II
The learning goal continues from the last lecture: combining Bayes' theorem with random variables.
Link: Modelling
In this lecture, you'll take everything you have learned so far and put it into the perspective of real-life problems: probabilistic modelling. This means taking a whole collection of random variables and treating them as being random together.
Link: General Inference
You will dive into general inference and, in particular, learn about an algorithm called rejection sampling.
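As a rough sketch of the idea (the triangular target density below is an assumed toy example, not the lecture's), rejection sampling accepts a proposal draw x with probability f(x) / (m * g(x)):

```python
import random

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, m, n_samples):
    """Draw samples from target_pdf using a proposal distribution and an
    envelope constant m, where target_pdf(x) <= m * proposal_pdf(x) for all x."""
    samples = []
    while len(samples) < n_samples:
        x = proposal_sample()
        if random.random() <= target_pdf(x) / (m * proposal_pdf(x)):
            samples.append(x)  # accept; otherwise reject and retry
    return samples

# Toy target: triangular density f(x) = 2x on [0, 1],
# using a Uniform(0, 1) proposal with envelope m = 2.
random.seed(0)
xs = rejection_sample(lambda x: 2 * x, random.random, lambda x: 1.0, 2.0, 5000)
print(sum(xs) / len(xs))  # should be close to the true mean 2/3
```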
Link: Beta
This lecture goes into random variables over probabilities, which are used to solve real-world problems. Beta is a distribution for probabilities, whose values range between 0 and 1.
Link: Adding Random Variables I
At this point in the course, you'll be learning deeper theory, and adding random variables is an introduction to how core results of probability theory are reached.
Link: Central Limit Theorem
In this lecture, you'll dive into the central limit theorem, which is an important element of probability. You will go through practical examples so that you can grasp the concept.
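A small simulation (an illustrative sketch, not the lecture's code) shows the theorem in action: standardized means of Uniform(0, 1) draws behave like a standard normal, so roughly 95% of them fall within 1.96 of zero:

```python
import random

def standardized_uniform_sum(n):
    # Uniform(0, 1) has mean 1/2 and variance 1/12; by the CLT the
    # standardized sum of n draws is approximately standard normal.
    s = sum(random.random() for _ in range(n))
    mean, var = 0.5, 1.0 / 12.0
    return (s - n * mean) / (n * var) ** 0.5

random.seed(42)
zs = [standardized_uniform_sum(100) for _ in range(2000)]
inside = sum(1 for z in zs if abs(z) <= 1.96) / len(zs)
print(inside)  # should be close to 0.95
```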
Link: Bootstrapping and P-Values I
You will now move into uncertainty theory, sampling, and bootstrapping, which is inspired by the central limit theorem. You will go through practical examples.
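As an illustrative sketch (the data values below are made up), the percentile bootstrap resamples the data with replacement and reads a confidence interval off the empirical quantiles:

```python
import random

def bootstrap_confidence_interval(data, statistic, n_resamples=10_000, alpha=0.05):
    """Percentile bootstrap: resample the data with replacement, recompute
    the statistic each time, and take empirical quantiles of the results."""
    stats = []
    for _ in range(n_resamples):
        resample = [random.choice(data) for _ in data]
        stats.append(statistic(resample))
    stats.sort()
    lo = stats[int(alpha / 2 * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

random.seed(1)
sample = [4.2, 5.1, 6.3, 4.8, 5.5, 5.9, 4.4, 6.1, 5.0, 5.7]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_confidence_interval(sample, mean)
print(lo <= mean(sample) <= hi)  # the interval should contain the sample mean
```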
Link: Algorithmic Analysis
In this lecture, you'll dive a bit deeper into computer science with an in-depth look at the analysis of algorithms, the process of finding the computational complexity of algorithms.
Link: M.L.E.
This lecture dives into parameter estimation, which will give you more grounding for machine learning. This is where you take your knowledge of probability and apply it to machine learning and artificial intelligence.
Link: M.A.P.
We're still at the stage of taking core concepts of probability and seeing how they apply to machine learning. In this lecture, you'll focus on parameters in machine learning as they relate to probability and random variables.
Link: Naive Bayes
Naive Bayes is the first machine learning algorithm you'll learn about in depth. You have learnt about the theory of parameter estimation, and you will now move on to how core algorithms such as Naive Bayes lead to ideas such as neural networks.
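As a toy sketch of the algorithm (the spam/ham documents below are invented examples, not course material), Naive Bayes combines per-word likelihoods under each class and picks the larger posterior, using Laplace smoothing and log probabilities for numerical stability:

```python
import math
from collections import defaultdict

def train_naive_bayes(docs, labels):
    """Train a word-count Naive Bayes classifier with Laplace smoothing."""
    vocab = set(w for d in docs for w in d)
    classes = set(labels)
    priors = {c: labels.count(c) / len(labels) for c in classes}
    counts = {c: defaultdict(int) for c in classes}
    totals = {c: 0 for c in classes}
    for d, y in zip(docs, labels):
        for w in d:
            counts[y][w] += 1
            totals[y] += 1

    def predict(doc):
        best, best_lp = None, -math.inf
        for c in classes:
            # Log probabilities avoid underflow from multiplying small terms
            lp = math.log(priors[c])
            for w in doc:
                lp += math.log((counts[c][w] + 1) / (totals[c] + len(vocab)))
            if lp > best_lp:
                best, best_lp = c, lp
        return best
    return predict

docs = [["win", "cash", "now"], ["meeting", "at", "noon"],
        ["cash", "prize", "now"], ["lunch", "meeting", "today"]]
labels = ["spam", "ham", "spam", "ham"]
predict = train_naive_bayes(docs, labels)
print(predict(["cash", "now"]))  # spam
```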
Link: Logistic Regression
In this lecture, you'll dive into a second algorithm called logistic regression, which is used for classification tasks and which you will also learn more about.
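For a taste of how it works (a minimal one-feature sketch with made-up data, not the course's implementation), logistic regression fits its weights with gradient updates on the log loss:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit one-feature logistic regression P(y=1|x) = sigmoid(w*x + b)
    by stochastic gradient updates on the log loss."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            w += lr * (y - p) * x  # gradient of the log-likelihood
            b += lr * (y - p)
    return w, b

# Toy data: label 1 roughly when the feature exceeds 2
xs = [0.5, 1.0, 1.5, 2.5, 3.0, 3.5]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
print(sigmoid(w * 3.0 + b) > 0.5)  # predicted class 1
print(sigmoid(w * 1.0 + b) < 0.5)  # predicted class 0
```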
Link: Deep Learning
As you've started to dive into machine learning, this lecture goes into further detail about deep learning, based on what you have already learned.
Link: Fairness
We live in a world where machine learning is implemented in our day-to-day lives. In this lecture, you'll look into fairness around machine learning, with a focus on ethics.
Link: Advanced Probability
You have learnt a lot about the fundamentals of probability, applied it in different scenarios, and seen how it relates to machine learning algorithms. The next step is to get a bit more advanced with probability.
Link: Future of Probability
The learning goal for this lecture is to learn about the uses of probability and the variety of problems that probability can be applied to solve.
Link: Final Review
And last but not least, the final lecture. You will go back through all the other 28 lectures and touch on any uncertainties.
Being able to find good material for your learning journey can be difficult. This probability for computer science course material is superb and can help you grasp concepts of probability that you were unsure of or that needed a touch-up.
Nisha Arya is a Data Scientist and Freelance Technical Writer. She is particularly interested in providing Data Science career advice or tutorials and theory-based knowledge around Data Science. She also wishes to explore the different ways Artificial Intelligence is/can benefit the longevity of human life. A keen learner, seeking to broaden her tech knowledge and writing skills, whilst helping guide others.
https://www.kdnuggets.com/learn-probability-in-computer-science-with-stanford-university-for-free