Introduction to Probability in Artificial Intelligence, Study notes of Probability and Statistics

These notes cover the basics of probability and its applications in artificial intelligence: definitions, axioms, random variables, joint distributions, conditional probability, the chain rule, Bayes' Rule, and inference. They also touch on the history of probability, which began as a study of gambling techniques. The course is taught at the University of Wisconsin-Madison, and the notes are suitable as a summary for a student preparing for an exam in an artificial intelligence course.

CS 540 Introduction to Artificial Intelligence: Probability
Fred Sala, University of Wisconsin-Madison, Jan 28, 2021

Probability: What is it good for?
• A language to express uncertainty.
• Win at poker: Wisconsin Ph.D. student Ye Yuan finished 5th in the WSOP (photo: pokernews.com).
• Not unusual: probability began as the study of gambling techniques. Cardano's Liber de ludo aleae ("Book on Games of Chance"), 1564!

Outline
• Basics: definitions, axioms, random variables, joint distributions
• Independence, conditional probability, chain rule
• Bayes' Rule and inference

Basics: Outcomes & Events
• Outcomes: possible results of an experiment.
• Events: subsets of outcomes we're interested in.
• Ex: rolling a die has outcomes {1, 2, 3, 4, 5, 6}; "the roll is even" is the event {2, 4, 6}.

Basics: Probability Distribution
• We have outcomes and events.
• Now assign probabilities.
• Back to our example: for a fair die, each outcome has probability 1/6, so P({2, 4, 6}) = 1/2.

Basics: Axioms
• Rules for probability:
  – For all events E, P(E) ≥ 0.
  – Always, P(Ω) = 1, where Ω is the set of all outcomes.
  – For disjoint events E₁ and E₂, P(E₁ ∪ E₂) = P(E₁) + P(E₂).
• Easy to derive other laws. Ex: non-disjoint events.

Visualizing the Axioms
• Axiom 1: for any event E ⊆ Ω, P(E) ≥ 0.
• Also, other laws: P(E₁ ∪ E₂) = P(E₁) + P(E₂) − P(E₁ ∩ E₂).

Basics: Random Variables
• Really, functions: they map outcomes to real values.
• Why?
  – So far, everything is a set.
  – Hard to work with!
  – Real values are easy to work with.

Basics: CDF & PDF
• Can still work with probabilities:
• Cumulative Distribution Function (CDF): F_X(a) = P(X ≤ a).
• Density / mass function.
(Figure: CDF illustration, Wikipedia)

Basics: Marginal Probability
• Given a joint distribution:
  – Get the distribution in just one variable.
  – This is the "marginal" distribution.
• P(X = a) = Σ_b P(X = a, Y = b)

         | Sunny   | Cloudy | Rainy
    hot  | 150/365 | 40/365 | 5/365
    cold | 50/365  | 60/365 | 60/365

• [P(hot), P(cold)] = [195/365, 170/365]

Probability Tables
• Write our distributions as tables.
• Number of entries in the table above? 6.
  – If we have n variables, each taking k values, we get kⁿ entries.
  – Big! For a 1080p screen with 12-bit color, the table would have (2¹²)^(1920·1080) entries.
  – No way of writing down all the terms.

Chain Rule
• P(X₁, …, Xₙ) = P(X₁) P(X₂ | X₁) ⋯ P(Xₙ | X₁, …, Xₙ₋₁): apply P(A, B) = P(A | B) P(B) repeatedly.
• Note: still big!
  – If there is some conditional independence, we can factor.
  – Leads to probabilistic graphical models.

Reasoning With Conditional Distributions
• Evaluating probabilities:
  – Wake up with a sore throat.
  – Do I have the flu?
• One approach: a logical rule such as "sore throat ⇒ flu".
  – Too strong.
• Inference: compute the probability of a hypothesis given evidence.
  – Can be much more complex!

Using Bayes' Rule
• Want: P(flu | sore throat).
• Bayes' Rule: P(A | B) = P(B | A) P(A) / P(B).
• Parts:
  – P(sore throat): sore throat rate.
  – P(flu): flu rate.
  – P(sore throat | flu): sore throat rate among flu sufferers.
• So: P(flu | sore throat) = P(sore throat | flu) P(flu) / P(sore throat).

Bayesian Inference
• Terminology:
  – Prior: estimate of the probability without evidence.
  – Likelihood: probability of the evidence given a hypothesis.
  – Posterior: probability of the hypothesis given the evidence.
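Worked sketch (not from the original slides): the short Python program below stores the hot/cold × weather joint table above as a dictionary, computes the marginals by summing out one variable, evaluates a conditional probability, and then applies Bayes' Rule to the sore-throat/flu example. The helper names (marginal_temp, conditional_weather) and the flu/sore-throat rates are assumptions chosen only to illustrate the arithmetic; they are not values given in the lecture.

# Joint distribution P(Temp, Weather) from the lecture's table (counts out of 365 days).
joint = {
    ("hot",  "sunny"):  150 / 365,
    ("hot",  "cloudy"):  40 / 365,
    ("hot",  "rainy"):    5 / 365,
    ("cold", "sunny"):   50 / 365,
    ("cold", "cloudy"):  60 / 365,
    ("cold", "rainy"):   60 / 365,
}

def marginal_temp(joint, temp):
    # P(Temp = temp) = sum over weather of P(Temp = temp, Weather = weather)
    return sum(p for (t, _), p in joint.items() if t == temp)

def conditional_weather(joint, weather, temp):
    # P(Weather = weather | Temp = temp) = P(temp, weather) / P(temp)
    return joint[(temp, weather)] / marginal_temp(joint, temp)

print("P(hot)  =", marginal_temp(joint, "hot"))    # 195/365 ≈ 0.534
print("P(cold) =", marginal_temp(joint, "cold"))   # 170/365 ≈ 0.466
print("P(rainy | cold) =", conditional_weather(joint, "rainy", "cold"))

# Bayes' Rule on the flu example. These rates are made-up illustrative numbers.
p_flu = 0.05                  # prior: P(flu)
p_sore_given_flu = 0.90       # likelihood: P(sore throat | flu)
p_sore_given_no_flu = 0.20    # P(sore throat | no flu)

# Evidence term via the law of total probability: P(sore throat).
p_sore = p_sore_given_flu * p_flu + p_sore_given_no_flu * (1 - p_flu)

# Posterior: P(flu | sore throat) = P(sore throat | flu) * P(flu) / P(sore throat).
p_flu_given_sore = p_sore_given_flu * p_flu / p_sore
print("P(flu | sore throat) =", round(p_flu_given_sore, 3))   # ≈ 0.191

Even with only two variables the full table already needs 6 entries; the kⁿ growth noted above is why the chain rule plus conditional independence (and hence probabilistic graphical models) matters in practice.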