A Review of Hidden Markov Models for Context-Based Classification
ICML'01 Workshop on Temporal and Spatial Learning, Williams College, June 28th, 2001
Padhraic Smyth, Information and Computer Science, University of California, Irvine
www.datalab.uci.edu

Outline
- Context in classification
- Brief review of hidden Markov models
- Hidden Markov models for classification
- Simulation results: how useful is context? (with Dasha Chudova, UCI)

Historical Note
- "Classification in context" was well studied in pattern recognition in the 60s and 70s
  - e.g., recursive Markov-based algorithms were proposed before hidden Markov algorithms and models were fully understood
- Applications in
  - OCR for word-level recognition
  - remote-sensing pixel classification

Papers of Note
- Raviv, J., "Decision making in Markov chains applied to the problem of pattern recognition," IEEE Trans. Information Theory, 3(4), 1967
- Hanson, Riseman, and Fisher, "Context in word recognition," Pattern Recognition, 1976
- Toussaint, G., "The use of context in pattern recognition," Pattern Recognition, 10, 1978
- Mohn, Hjort, and Storvik, "A simulation study of some contextual classification methods for remotely sensed data," IEEE Trans. Geoscience and Remote Sensing, 25(6), 1987
Context-Based Classification Problems
- Medical diagnosis: classification of a patient's state over time
- Fraud detection: detection of stolen credit cards
- Electronic nose: detection of landmines
- Remote sensing: classification of pixels into ground cover

Modeling Context
- Common theme = context: class labels (and features) are "persistent" in time/space
- [Diagram: a chain over time t = 1, ..., T, with hidden class variables C1, ..., CT and observed feature variables X1, ..., XT]

Feature Windows
- Predict Ct using a window of features, e.g., f(Xt, Xt-1, Xt-2)
- e.g., the NETtalk application
- [Same chain diagram: observed features, hidden classes, over time]

Alternative: Probabilistic Modeling
- e.g., assume p(Ct | history) = p(Ct | Ct-1)
- a first-order Markov assumption on the classes
- [Same chain diagram: observed features, hidden classes, over time]
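To make the window idea concrete, here is a minimal sketch (function name and array shapes are ours, not from the slides) that stacks each observation with its predecessors, so a memoryless classifier can be fed f(Xt, Xt-1, Xt-2):

```python
import numpy as np

def window_features(X, width=3):
    """Stack each observation with its (width - 1) predecessors.

    X: array of shape (T, d); returns shape (T - width + 1, width * d),
    one row per time step that has a full window of history.
    """
    T, d = X.shape
    return np.stack([X[t - width + 1 : t + 1].ravel()
                     for t in range(width - 1, T)])

X = np.arange(10, dtype=float).reshape(5, 2)  # T=5 observations, d=2 features
W = window_features(X, width=3)
print(W.shape)  # (3, 6): each row is [X_{t-2}, X_{t-1}, X_t] flattened
```

Any ordinary classifier trained on rows of W then exploits temporal context implicitly, without an explicit model of the class dynamics.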
Brief Review of Hidden Markov Models (HMMs)

Graphical Models
- Basic idea: p(U) is represented by an annotated graph
  - let U be a set of random variables of interest
  - there is a 1-1 mapping from U to nodes in a graph
  - the graph encodes the "independence structure" of the model
  - numerical specifications of p(U) are stored locally at the nodes

Acyclic Directed Graphical Models (aka belief/Bayesian networks)
- In general, p(X1, X2, ..., XN) = ∏i p(Xi | parents(Xi))
- e.g., p(A,B,C) = p(C|A,B) p(A) p(B)

Undirected Graphical Models (UGs)
- Undirected edges reflect correlational dependencies
  - e.g., particles in physical systems, pixels in an image
- Also known as Markov random fields, Boltzmann machines, etc.

Examples of 3-way Graphical Models
- Markov chain: p(A,B,C) = p(C|B) p(B|A) p(A)
- Independent causes: p(A,B,C) = p(C|A,B) p(A) p(B)
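As a small numeric illustration of how local tables determine the joint, here is a sketch of the independent-causes factorization p(A,B,C) = p(C|A,B) p(A) p(B) over binary variables (the probability values are made up):

```python
import numpy as np

pA = np.array([0.6, 0.4])           # p(A)
pB = np.array([0.7, 0.3])           # p(B)
pC_AB = np.array([[[0.9, 0.1],      # p(C | A=0, B=0)
                   [0.5, 0.5]],     # p(C | A=0, B=1)
                  [[0.2, 0.8],      # p(C | A=1, B=0)
                   [0.3, 0.7]]])    # p(C | A=1, B=1)

# Joint from the local tables via broadcasting: p(a,b,c) = p(a) p(b) p(c|a,b)
joint = pA[:, None, None] * pB[None, :, None] * pC_AB

print(joint.sum())              # 1.0: a valid joint distribution
print(joint.sum(axis=(0, 1)))   # the marginal p(C)
```

The numerical specification lives entirely in the per-node tables; the graph structure tells us which tables to multiply.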
Hidden Markov Graphical Model
- Assumption 1: p(Ct | history) = p(Ct | Ct-1), a first-order Markov assumption on the classes
- Assumption 2: p(Xt | history, Ct) = p(Xt | Ct), i.e., Xt depends only on the current class Ct
- [Chain diagram: observed features X1, ..., XT, hidden classes C1, ..., CT, over time]
- Notes:
  - all temporal dependence is modeled through the class variable C
  - this is the simplest possible such model
  - it avoids modeling p(X | the other X's)

Generalizations of HMMs
- [Diagram: a hidden "weather state" C1, ..., CT linking observed atmospheric measurements A1, ..., AT to observed spatial rainfall R1, ..., RT]
- Hidden-state model relating atmospheric measurements to local rainfall
- The "weather state" couples multiple variables in time and space (Hughes and Guttorp, 1996)
- Graphical models provide a language for spatio-temporal modeling
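A minimal generative sketch of the two HMM assumptions, with made-up parameters: the classes evolve as a first-order Markov chain, and each Xt is drawn from a class-conditional distribution (here a unit-variance Gaussian):

```python
import numpy as np

rng = np.random.default_rng(0)

pi = np.array([0.5, 0.5])        # initial class distribution p(C_1)
A = np.array([[0.9, 0.1],        # transition matrix p(C_t | C_{t-1}):
              [0.2, 0.8]])       # classes are "persistent" (sticky)
means = np.array([-1.0, 1.0])    # Gaussian emission mean per class

def sample_hmm(T):
    C = np.empty(T, dtype=int)
    X = np.empty(T)
    C[0] = rng.choice(2, p=pi)
    X[0] = rng.normal(means[C[0]], 1.0)
    for t in range(1, T):
        C[t] = rng.choice(2, p=A[C[t - 1]])   # Markov class dynamics
        X[t] = rng.normal(means[C[t]], 1.0)   # X_t depends only on C_t
    return C, X

C, X = sample_hmm(100)
print(C[:20])  # long runs of the same class reflect temporal context
```

The sticky transition matrix is what encodes "persistence": sampled class sequences show long runs of the same label, which is exactly the kind of context a per-time-step classifier ignores.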
Exact Probability Propagation (PP) Algorithms
- Basic PP algorithm: Pearl, 1988; Lauritzen and Spiegelhalter, 1988
- Assume the graph has no loops
- Declare one node (any node) to be the root
- Schedule two phases of message passing:
  - nodes pass messages up to the root
  - messages are then distributed back down to the leaves
- (if the graph has loops, convert it to an equivalent tree)

Properties of the PP Algorithm
- Exact: p(node | all data) is recoverable at each node, i.e., we get exact posteriors
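On a chain-structured model such as the HMM, probability propagation reduces to the familiar forward-backward recursions. A sketch with illustrative parameters (the likelihood table stands in for p(Xt | Ct) evaluated at some observed sequence; all values are ours):

```python
import numpy as np

def forward_backward(pi, A, lik):
    """pi: p(C_1); A[i, j] = p(C_t = j | C_{t-1} = i);
    lik[t, i] = p(X_t | C_t = i) for the observed X_t.
    Returns exact posteriors p(C_t | X_1..X_T), shape (T, K)."""
    T, K = lik.shape
    alpha = np.empty((T, K))   # forward messages (toward the "root")
    beta = np.ones((T, K))     # backward messages (toward the leaves)
    alpha[0] = pi * lik[0]
    alpha[0] /= alpha[0].sum()                    # rescale for stability
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * lik[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (beta[t + 1] * lik[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta                           # combine the two phases
    return post / post.sum(axis=1, keepdims=True)

pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
lik = np.array([[0.8, 0.2], [0.7, 0.3], [0.4, 0.6]])  # p(X_t | C_t)
post = forward_backward(pi, A, lik)
print(post)   # each row sums to 1: exact p(C_t | all data)
```

The two loops are the two scheduled phases of message passing; because the chain has no loops, the combined messages give the exact posterior at every node.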