Bayesian Learning.ppt
Bayesian Learning

Provides practical learning algorithms:
- Naïve Bayes learning
- Bayesian belief network learning
- Combine prior knowledge (prior probabilities)

Provides foundations for machine learning:
- Evaluating learning algorithms
- Guiding the design of new algorithms
- Learning from models: meta learning

Bayesian Classification: Why?

- Probabilistic learning: calculates explicit probabilities for hypotheses; among the most practical approaches to certain types of learning problems
- Incremental: each training example can incrementally increase or decrease the probability that a hypothesis is correct; prior knowledge can be combined with observed data
- Probabilistic prediction: predicts multiple hypotheses, weighted by their probabilities
- Standard: even when Bayesian methods are computationally intractable, they provide a standard of optimal decision making against which other methods can be measured

Basic Formulas for Probabilities

- Product rule: probability P(A ∧ B) of a conjunction of two events A and B:
  P(A ∧ B) = P(A|B) P(B) = P(B|A) P(A)
- Sum rule: probability of a disjunction of two events A and B:
  P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
- Theorem of total probability: if events A1, ..., An are mutually exclusive with Σi P(Ai) = 1, then
  P(B) = Σi P(B|Ai) P(Ai)
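These three identities can be checked numerically. The sketch below is not from the slides; it verifies the rules in Python on a small made-up joint distribution over two binary events, with purely illustrative numbers.

```python
# Toy joint distribution over two binary events A and B (illustrative numbers only).
joint = {(True, True): 0.20, (True, False): 0.30,
         (False, True): 0.10, (False, False): 0.40}

def P(event):
    """Probability of the event described by event(a, b) under the toy joint."""
    return sum(p for (a, b), p in joint.items() if event(a, b))

P_A, P_B = P(lambda a, b: a), P(lambda a, b: b)
P_A_and_B = P(lambda a, b: a and b)
P_A_given_B = P_A_and_B / P_B
P_A_given_not_B = P(lambda a, b: a and not b) / (1 - P_B)

# Product rule: P(A ∧ B) = P(A|B) P(B)
assert abs(P_A_and_B - P_A_given_B * P_B) < 1e-12
# Sum rule: P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
assert abs(P(lambda a, b: a or b) - (P_A + P_B - P_A_and_B)) < 1e-12
# Total probability: P(A) = P(A|B) P(B) + P(A|¬B) P(¬B)
assert abs(P_A - (P_A_given_B * P_B + P_A_given_not_B * (1 - P_B))) < 1e-12
```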
Basic Approach

Bayes rule:
  P(h|D) = P(D|h) P(h) / P(D)
- P(h) = prior probability of hypothesis h
- P(D) = prior probability of training data D
- P(h|D) = probability of h given D (the posterior probability)
- P(D|h) = probability of D given h (the likelihood of D given h)

The goal of Bayesian learning: find the most probable hypothesis given the training data (the maximum a posteriori, or MAP, hypothesis):
  hMAP = argmax_{h∈H} P(h|D) = argmax_{h∈H} P(D|h) P(h)
An Example

Does patient have cancer or not?

A patient takes a lab test and the result comes back positive. The test returns a correct positive result in only 98% of the cases in which the disease is actually present, and a correct negative result in only 97% of the cases in which the disease is not present. Furthermore, 0.008 of the entire population have this cancer.
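Working these numbers through Bayes' rule gives the MAP comparison (a worked calculation added here; it follows from, but is not spelled out in, the extracted slide text):

```latex
% MAP comparison for the cancer example.
\begin{align*}
P(+\mid \text{cancer})\,P(\text{cancer})         &= 0.98 \times 0.008 = 0.0078 \\
P(+\mid \neg\text{cancer})\,P(\neg\text{cancer}) &= 0.03 \times 0.992 = 0.0298 \\
P(\text{cancer}\mid +) &= \frac{0.0078}{0.0078 + 0.0298} \approx 0.21
\end{align*}
```

So even after a positive test, hMAP = ¬cancer, because 0.0298 > 0.0078.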
MAP Learner

For each hypothesis h in H, calculate the posterior probability
  P(h|D) = P(D|h) P(h) / P(D)
Output the hypothesis hMAP with the highest posterior probability:
  hMAP = argmax_{h∈H} P(h|D)

Comments:
- Computationally intensive
- Provides a standard for judging the performance of learning algorithms
- Choosing P(h) and P(D|h) reflects our prior knowledge about the learning task
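A minimal brute-force sketch of this procedure in Python; the names map_hypothesis, prior, and likelihood are illustrative assumptions, not from the slides. Since P(D) is the same for every h, it is dropped from the argmax.

```python
def map_hypothesis(hypotheses, prior, likelihood, data):
    """Return the h maximizing P(h|D) ∝ P(D|h) P(h).

    hypotheses: iterable of candidate hypotheses H
    prior:      function h -> P(h)
    likelihood: function (data, h) -> P(D|h)
    In practice log-probabilities are usually compared instead of raw products.
    """
    return max(hypotheses, key=lambda h: likelihood(data, h) * prior(h))

# The two-hypothesis cancer example from the previous slide:
H = ["cancer", "no cancer"]
prior = {"cancer": 0.008, "no cancer": 0.992}.get
test_probs = {("cancer", "+"): 0.98, ("cancer", "-"): 0.02,
              ("no cancer", "+"): 0.03, ("no cancer", "-"): 0.97}
likelihood = lambda data, h: test_probs[(h, data)]
print(map_hypothesis(H, prior, likelihood, data="+"))  # -> "no cancer"
```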
Bayes Optimal Classifier

Question: given a new instance x, what is its most probable classification?
hMAP(x) is not necessarily the most probable classification!

Example: let P(h1|D) = .4, P(h2|D) = .3, P(h3|D) = .3. Given new data x, we have h1(x) = +, h2(x) = -, h3(x) = -. What is the most probable classification of x?

Bayes optimal classification:
  argmax_{v∈V} Σ_{h∈H} P(v|h) P(h|D)

Example:
  P(h1|D) = .4, P(-|h1) = 0, P(+|h1) = 1
  P(h2|D) = .3, P(-|h2) = 1, P(+|h2) = 0
  P(h3|D) = .3, P(-|h3) = 1, P(+|h3) = 0
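Carrying the example through (a worked sum added here, using only the probabilities listed above):

```latex
% Bayes optimal vote for the example above.
\begin{align*}
\sum_{h_i \in H} P(+\mid h_i)\,P(h_i\mid D) &= 1(.4) + 0(.3) + 0(.3) = .4 \\
\sum_{h_i \in H} P(-\mid h_i)\,P(h_i\mid D) &= 0(.4) + 1(.3) + 1(.3) = .6
\end{align*}
```

So the Bayes optimal classification of x is -, even though the MAP hypothesis h1 predicts +.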
Naïve Bayes Learner

Assume a target function f: X -> V, where each instance x is described by attributes <a1, a2, ..., an>. The most probable value of f(x) is:
  vMAP = argmax_{v∈V} P(v|a1, a2, ..., an) = argmax_{v∈V} P(a1, a2, ..., an|v) P(v)

Naïve Bayes assumption (attributes are conditionally independent given the class):
  P(a1, a2, ..., an|v) = Πi P(ai|v)
which gives the Naïve Bayes classifier:
  vNB = argmax_{v∈V} P(v) Πi P(ai|v)
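A small Python sketch of the vNB decision rule, assuming the tables P(v) and P(ai|v) have already been estimated; the dictionary layout and function name are illustrative assumptions, not from the slides.

```python
import math

def naive_bayes_classify(x, class_priors, cond_probs):
    """Return vNB = argmax_v P(v) * prod_i P(x_i | v).

    x:            tuple of attribute values (a1, ..., an)
    class_priors: dict v -> P(v)
    cond_probs:   dict v -> list of per-attribute dicts mapping value -> P(ai = value | v)
    Log-probabilities are summed to avoid underflow; probabilities are assumed
    nonzero (smoothing handles zeros in practice).
    """
    def score(v):
        return math.log(class_priors[v]) + sum(
            math.log(cond_probs[v][i][ai]) for i, ai in enumerate(x))
    return max(class_priors, key=score)
```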
Bayesian Classification

The classification problem may be formalized using a-posteriori probabilities:
  P(C|X) = probability that the sample tuple X = <x1, ..., xk> is of class C
E.g. P(class = N | outlook = sunny, windy = true, ...)
Idea: assign to sample X the class label C such that P(C|X) is maximal.

Estimating A-Posteriori Probabilities

- Bayes theorem: P(C|X) = P(X|C) P(C) / P(X)
- P(X) is constant for all classes
- P(C) = relative frequency of class C samples
- The C that maximizes P(C|X) is therefore the C that maximizes P(X|C) P(C)
- Problem: computing P(X|C) directly is infeasible!

Naïve Bayesian Classification

- Naïve assumption: attribute independence,
  P(x1, ..., xk|C) = P(x1|C) · ... · P(xk|C)
- If the i-th attribute is categorical: P(xi|C) is estimated as the relative frequency of samples having value xi as the i-th attribute in class C
- If the i-th attribute is continuous: P(xi|C) is estimated through a Gaussian density function
- Computationally easy in both cases
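The two estimates just described can be written down directly; the Python sketch below assumes a training set given as a list of (attribute_tuple, class) pairs, and the function names are illustrative, not from the slides.

```python
import math

def estimate_categorical(samples, attr_index, value, c):
    """P(xi = value | C = c) as a relative frequency among class-c samples."""
    in_class = [x for x, cls in samples if cls == c]
    return sum(1 for x in in_class if x[attr_index] == value) / len(in_class)

def estimate_gaussian(samples, attr_index, value, c):
    """P(xi = value | C = c) via a Gaussian density fitted to class-c samples."""
    vals = [x[attr_index] for x, cls in samples if cls == c]
    mu = sum(vals) / len(vals)
    var = sum((v - mu) ** 2 for v in vals) / len(vals)
    var = max(var, 1e-9)  # guard against zero variance in tiny samples
    return math.exp(-((value - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)
```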
Naïve Bayesian Classifier (II)

Given a training set, we can compute the probabilities.
Play-tennis example: estimating P(xi|C)

Example: Naïve Bayes

Predict playing tennis in the day with ...
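Because the play-tennis table itself is not included in the extracted text, the sketch below ties the pieces together on a small made-up categorical dataset instead (the data values and variable names are hypothetical, not the slide's table): estimate P(C) and P(xi|C) by relative frequencies with add-one smoothing, then apply the vNB rule.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy training set, (outlook, windy) -> class; NOT the slide's play-tennis data.
train = [(("sunny", "true"), "N"), (("sunny", "false"), "P"),
         (("rain", "true"), "N"), (("overcast", "false"), "P"),
         (("overcast", "true"), "P"), (("rain", "false"), "P")]

n_attrs = len(train[0][0])
class_counts = Counter(c for _, c in train)
priors = {c: n / len(train) for c, n in class_counts.items()}       # P(C)
value_counts = defaultdict(Counter)                                  # per-class (attr, value) counts
for x, c in train:
    for i, v in enumerate(x):
        value_counts[c][(i, v)] += 1
n_values = [len({x[i] for x, _ in train}) for i in range(n_attrs)]   # distinct values per attribute

def p_attr(i, v, c):
    """P(xi = v | C = c) with add-one (Laplace) smoothing."""
    return (value_counts[c][(i, v)] + 1) / (class_counts[c] + n_values[i])

def classify(x):
    """vNB = argmax_C P(C) * prod_i P(xi | C)."""
    return max(priors, key=lambda c: priors[c] * math.prod(p_attr(i, v, c) for i, v in enumerate(x)))

print(classify(("sunny", "true")))   # -> "N" on this toy data
```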
