BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Events - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://events.ucsc.edu
X-WR-CALDESC:Events for Events
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20250309T100000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20251102T090000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20260308T100000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20261101T090000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20270314T100000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20271107T090000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20260202T120000
DTEND;TZID=America/Los_Angeles:20260202T130000
DTSTAMP:20260430T113418Z
CREATED:20260122T191932Z
LAST-MODIFIED:20260128T171007Z
UID:10009093-1770033600-1770037200@events.ucsc.edu
SUMMARY:Statistics Seminar: Mathematical Foundations for Machine Learning from a Nonlinear Time Series Perspective
DESCRIPTION:Presenter: Jiaqi Li\, William H. Kruskal Instructor\, University of Chicago \nDescription: Modern machine learning (ML) algorithms achieve remarkable empirical success\, yet providing rigorous statistical guarantees remains a major challenge\, particularly in distributional theory and online inference methods. In this talk\, we will introduce a novel framework that provides mathematical foundations for ML by bringing in powerful tools from nonlinear time series. First\, we focus on stochastic gradient descent (SGD) with constant learning rates. By interpreting the SGD sequence as a nonlinear AR(1) process\, we can establish the geometric moment contraction (GMC) for SGD regardless of initialization. Using this GMC property\, we can derive refined asymptotic theory for SGD and its averaging variant\, including general moment convergence\, quenched central limit theorems\, quenched invariance principles\, and sharp Berry-Esseen bounds. Then\, we extend this theoretical framework to SGD with dropout regularization\, a widely used but theoretically underexplored technique in deep learning. By establishing GMC under explicit learning-rate and dimensional scaling regimes\, we obtain asymptotic normality and invariance principles for dropout SGD and its averaged version. These results enable online inference\, for which we introduce a fully recursive estimator of the long-run covariance matrix appearing in the limiting distributions. The proposed online confidence intervals with asymptotically correct coverage can be generalized to many other ML algorithms. Overall\, viewing online learning algorithms as nonlinear time series provides a powerful toolkit for deriving statistical guarantees in modern ML\, with implications for high-dimensional stochastic optimization and real-time uncertainty quantification. \nBio: Jiaqi Li is a William H. Kruskal Instructor in the Department of Statistics at the University of Chicago. She obtained her PhD in Statistics from Washington University in St. Louis in 2024. Her research focuses on developing theoretical guarantees and statistical inference methods for machine learning algorithms. She also works on time series data\, especially in high-dimensional settings with complex temporal and cross-sectional dependency structures. She also collaborates with neuroscientists on applications in fMRI and EEG data. \nHosted by: Statistics Department \nZoom link: https://ucsc.zoom.us/j/96647674332?pwd=rCHfeGpKslaGS5iIPP5Jh29mQiMJID.1
URL:https://events.ucsc.edu/event/statistics-seminar-mathematical-foundations-for-machine-learning-from-a-nonlinear-time-series-perspective/
LOCATION:https://ucsc.zoom.us/j/96647674332?pwd=rCHfeGpKslaGS5iIPP5Jh29mQiMJID.1
CATEGORIES:Lectures & Presentations,Seminars
ATTACH;FMTTYPE=image/jpeg:https://events.ucsc.edu/wp-content/uploads/2026/01/ph.d.-presentation-graphic-option-1-1.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20260202T160000
DTEND;TZID=America/Los_Angeles:20260202T170000
DTSTAMP:20260430T113418Z
CREATED:20260128T184233Z
LAST-MODIFIED:20260128T184233Z
UID:10009126-1770048000-1770051600@events.ucsc.edu
SUMMARY:AM Seminar: Are Graph Learning Methods Actually Learning?
DESCRIPTION:Presenter: Seshadhri Comandur\, Professor of Computer Science\, UCSC \nDescription: There has been a lot of literature on graph machine learning over the past few years\, and a bewildering array of new methods. This talk is based on a series of results making a provocative argument: perhaps many graph machine learning methods are not really that effective\, and the progress we are seeing is an artifact of experimental design and measurement. I will talk about some results showing that low-dimensional embeddings with dot product similarity (arguably the most common graph ML technique) cannot capture salient aspects of real-world graphs. Follow-up work demonstrates that simple benchmarks seem to outperform fancier methods\, and that there are significant shortcomings in existing accuracy measurement. \nBio: C. Seshadhri (Sesh) is a professor of Computer Science at the University of California\, Santa Cruz and an Amazon Scholar. Prior to joining UCSC\, he was a researcher in the Information Security Sciences department at Sandia National Labs\, Livermore\, from 2010 to 2014. His primary interest is the theoretical study of algorithms\, especially those involving a mix of graphs and randomization. By and large\, Sesh works at the boundary of theoretical computer science (TCS) and data mining. His work spans many areas: sublinear algorithms\, graph algorithms\, graph modeling\, scalable computation\, and data mining. In the theory world\, his work has resolved numerous open problems in monotonicity testing and graph property testing. A number of his papers at the interface of TCS and applied algorithms have received paper awards at KDD\, WWW\, ICDM\, SDM\, and WSDM. He received the 2019 SDM/IBM Early Career Award for Excellence in Data Analytics. Sesh got his Ph.D. from Princeton University and spent two years as a postdoc at IBM Almaden Labs. \nHosted by: Ashesh Chattopadhyay\, Applied Mathematics Department
URL:https://events.ucsc.edu/event/am-seminar-are-graph-learning-methods-actually-learning/
LOCATION:CA
CATEGORIES:Lectures & Presentations,Seminars
ATTACH;FMTTYPE=image/jpeg:https://events.ucsc.edu/wp-content/uploads/2026/01/sesh.jpeg
END:VEVENT
END:VCALENDAR