BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Events - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://events.ucsc.edu
X-WR-CALDESC:Events for Events
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20240310T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20241103T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20250309T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20251102T020000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
TZNAME:PDT
DTSTART:20260308T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
TZNAME:PST
DTSTART:20261101T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/Los_Angeles:20251209T160000
DTEND;TZID=America/Los_Angeles:20251209T170000
DTSTAMP:20260417T100810Z
CREATED:20251202T204536Z
LAST-MODIFIED:20251209T182652Z
UID:10005719-1765296000-1765299600@events.ucsc.edu
SUMMARY:Zhu\, R. (ECE) - From Neuromorphic Principles to Efficient Neural Language Architectures
DESCRIPTION:While Large Language Models exhibit remarkable capabilities\, their reliance on the standard Transformer architecture imposes prohibitive computational costs and quadratic memory complexity. To bridge the gap between biological efficiency and high-performance AI\, we have established foundational work in linearizing attention and maximizing hardware utilization through architectures such as RWKV and MatMul-Free networks. Addressing the remaining bottlenecks in long-term memory consolidation and optimization stability\, we propose a research roadmap focused on “In-Place Test-Time Training” (TTT) to enable compositional memory via dynamic weight updates\, and the Muon optimizer to stabilize deep reasoning through orthogonal gradient updates. Ultimately\, this work aims to unify neuromorphic principles with scalable deep learning to enable robust performance in resource-efficient environments. \nEvent Host: Ridger Zhu\, Ph.D. Student\, Electrical and Computer Engineering  \nAdvisor: Jason Eshraghian \nZoom- https://ucsc.zoom.us/j/95241268060?pwd=WDMgDWhhSyXNh8NZpBDvgpbcMVbvUz.1 \nPasscode- 256794
URL:https://events.ucsc.edu/event/ridger-z-ece-from-neuromorphic-principles-to-efficient-neural-language-architectures/
LOCATION:Engineering 2\, 1156 High Street\, Santa Cruz\, CA\, 95064
CATEGORIES:Ph.D. Presentations
ATTACH;FMTTYPE=image/png:https://events.ucsc.edu/wp-content/uploads/2025/10/option-3.png
GEO:37.0009723;-122.0632371
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=Engineering 2 1156 High Street Santa Cruz CA 95064;X-APPLE-RADIUS=500;X-TITLE=Engineering 2 1156 High Street:geo:37.0009723,-122.0632371
END:VEVENT
END:VCALENDAR