BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Events - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://events.ucsc.edu
X-WR-CALDESC:Events for Events
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20250309T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20251102T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20260308T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20261101T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20270314T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20271107T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20260420T160000
DTEND;TZID=America/Los_Angeles:20260420T170000
DTSTAMP:20260417T100433Z
CREATED:20260331T180549Z
LAST-MODIFIED:20260331T180549Z
UID:10011821-1776700800-1776704400@events.ucsc.edu
SUMMARY:AM Seminar: Variational Inference and Density Estimation with Non-Negative Tensor Train
DESCRIPTION:Presenter: Dr. Xun Tang\, Stanford University \nDescription: This talk covers an efficient numerical approach for compressing a high-dimensional discrete distribution function into a non-negative tensor train (NTT) format. The two settings we consider are variational inference and density estimation\, whereby one has access to either the unnormalized analytic formula of the distribution or the samples generated from the distribution. In particular\, the compression is done through a two-stage approach. In the first stage\, we use existing subroutines to encode the distribution function in a tensor train format. In the second stage\, we use an NTT ansatz to fit the obtained tensor train. For the NTT fitting procedure\, we use a log barrier term to ensure the positivity of each tensor component\, and then utilize a second-order alternating minimization scheme to accelerate convergence. In practice\, we observe that the proposed NTT fitting procedure exhibits drastically faster convergence than an alternative multiplicative update method that has been previously proposed. Through challenging numerical experiments\, we show that our approach can accurately compress target distribution functions. \nBio: Xun Tang is a postdoc in the Department of Mathematics at Stanford University\, hosted by Prof. Lexing Ying. Xun works on tensor network methods for scientific computing and data science\, as well as on optimal transport algorithms. Xun will join the Department of Mathematics at HKUST in August 2026 as an assistant professor. \nHosted by: Applied Mathematics Department
URL:https://events.ucsc.edu/event/am-seminar-variational-inference-and-density-estimation-with-non-negative-tensor-train/
LOCATION:CA
CATEGORIES:Lectures & Presentations,Seminars
ATTACH;FMTTYPE=image/png:https://events.ucsc.edu/wp-content/uploads/2026/03/BElogoWHITE.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20260420T160000
DTEND;TZID=America/Los_Angeles:20260420T170000
DTSTAMP:20260417T100433Z
CREATED:20260331T181211Z
LAST-MODIFIED:20260331T181211Z
UID:10011822-1776700800-1776704400@events.ucsc.edu
SUMMARY:Statistics Seminar: Hierarchical Clustering with Confidence
DESCRIPTION:Presenter: Snigdha Panigrahi\, Associate Professor\, Department of Statistics\, University of Michigan \nDescription: Agglomerative hierarchical clustering is one of the most widely used approaches for exploring how observations in a dataset relate to each other. However\, its greedy nature makes it highly sensitive to small perturbations in the data\, often producing different clustering results and making it difficult to separate genuine structure from spurious patterns. In this talk\, I will show how randomizing hierarchical clustering can be useful not just for measuring stability but also for designing valid hypothesis testing procedures based on the clustering results. We propose a simple randomization scheme to construct valid p-values at each node of a hierarchical clustering dendrogram\, quantifying evidence against greedy merges while controlling the Type I error rate. Our method applies to any linkage without case-specific derivations\, is substantially more powerful than existing selective inference approaches\, and provides an estimate of the number of clusters with a probabilistic guarantee on overestimation. \nBio: Snigdha Panigrahi is an Associate Professor of Statistics at the University of Michigan\, where she also holds a courtesy appointment in the Department of Biostatistics. She received her PhD in Statistics from Stanford University in 2018 and has been a faculty member at Michigan since then. Her research focuses on converting purely predictive machine learning algorithms into principled inferential methods. She is an elected member of the International Statistical Institute\, and her work has been recognized with an NSF CAREER Award and the Bernoulli New Researcher’s Award. Her editorial service\, past and present\, includes Journal of Computational and Graphical Statistics\, Bernoulli\, and Journal of the Royal Statistical Society: Series B. \nHosted by: Statistics Department
URL:https://events.ucsc.edu/event/statistics-seminar-hierarchical-clustering-with-confidence/
LOCATION:CA
CATEGORIES:Lectures & Presentations,Seminars
ATTACH;FMTTYPE=image/jpeg:https://events.ucsc.edu/wp-content/uploads/2026/03/ph.d.-presentation-graphic-option-1.jpg
END:VEVENT
END:VCALENDAR