BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//IDSS - ECPv5.14.2.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:IDSS
X-ORIGINAL-URL:https://idss.mit.edu
X-WR-CALDESC:Events for IDSS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20170101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20170908T110000
DTEND;TZID=UTC:20170908T120000
DTSTAMP:20220523T190911Z
CREATED:20170831T225546Z
LAST-MODIFIED:20170908T172603Z
UID:6074-1504868400-1504872000@idss.mit.edu
SUMMARY:New Provable Techniques for Learning and Inference in Probabilistic Graphical Models
DESCRIPTION:Speaker: Andrej Risteski (Princeton University)\nA common theme in machine learning is succinct modeling of distributions over large domains. Probabilistic graphical models are one of the most expressive frameworks for doing this. The two major tasks involving graphical models are learning and inference. Learning is the task of calculating the “best fit” model parameters from raw data\, while inference is the task of answering probabilistic queries for a model with known parameters (e.g. what is the marginal distribution of a subset of variables\, after conditioning on the values of some other variables). Learning can be thought of as finding a graphical model that “explains” the raw data\, while inference queries extract the “knowledge” the graphical model contains.\nI will focus on a few vignettes from my research which give new provable techniques for these tasks:\n– In the context of learning\, I will talk about method-of-moments techniques for learning noisy-or Bayesian networks\, which are used for modeling the causal structure of diseases and symptoms.\n– In the context of inference\, I will talk about a new understanding of a class of algorithms for calculating partition functions\, called variational methods\, through the lens of convex programming hierarchies. Time permitting\, I will also speak about MCMC methods for sampling from highly multimodal distributions using simulated tempering.\nThe talk will assume no background\, and is meant as a “meet and greet” talk surveying various questions I’ve worked on and am interested in.\nBiography\nI work at the intersection of machine learning and theoretical computer science\, with the primary goal of designing provable and practical algorithms for problems arising in machine learning. Broadly\, this includes tasks like clustering\, maximum likelihood estimation\, inference\, and learning generative models.\nAll of these tend to be non-convex in nature and intractable in general. However\, in practice\, a plethora of heuristics such as gradient descent\, alternating minimization\, convex relaxations\, and variational methods work reasonably well. In my research\, I endeavor to understand the realistic conditions under which we can guarantee the performance of these algorithms\, as well as devise new\, more efficient methods.\nI was a PhD student in the Computer Science Department at Princeton University\, working under the advisement of Sanjeev Arora. Starting September 2017\, I will hold a joint position in the Institute for Data\, Systems\, and Society (IDSS) and the Applied Mathematics department at MIT\, as a Norbert Wiener Fellow and applied mathematics instructor\, respectively.
URL:https://idss.mit.edu/calendar/new-provable-techniques-for-learning-and-inference-in-probabilistic-graphical-models/
LOCATION:MIT Building E18\, Room 304\, Ford Building (E18)\, 50 Ames Street\, Cambridge\, MA\, United States
CATEGORIES:Stochastics and Statistics Seminar Series
GEO:42.3620185;-71.0878444
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=MIT Building E18 Room 304 Ford Building (E18) 50 Ames Street Cambridge MA United States;X-APPLE-RADIUS=500;X-TITLE=Ford Building (E18):geo:42.3620185,-71.0878444
END:VEVENT
END:VCALENDAR