BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Biomedical Mathematics Group - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://www.ibs.re.kr/bimag
X-WR-CALDESC:Events for Biomedical Mathematics Group
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:20240101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Asia/Seoul:20250404T140000
DTEND;TZID=Asia/Seoul:20250404T160000
DTSTAMP:20260423T115642Z
CREATED:20250326T091007Z
LAST-MODIFIED:20250330T013324Z
UID:10919-1743775200-1743782400@www.ibs.re.kr
SUMMARY:Accurate predictions on small data with a tabular foundation model - Dongju Lim
DESCRIPTION:In this talk\, we discuss the paper “Accurate predictions on small data with a tabular foundation model” by Noah Hollmann et al.\, Nature (2025). \nAbstract \nTabular data\, spreadsheets organized in rows and columns\, are ubiquitous across scientific fields\, from biomedicine to particle physics to economics and climate science [1\,2]. The fundamental prediction task of filling in missing values of a label column based on the rest of the columns is essential for various applications as diverse as biomedical risk models\, drug discovery and materials science. Although deep learning has revolutionized learning from raw data and led to numerous high-profile success stories [3–5]\, gradient-boosted decision trees [6–9] have dominated tabular data for the past 20 years. Here we present the Tabular Prior-data Fitted Network (TabPFN)\, a tabular foundation model that outperforms all previous methods on datasets with up to 10\,000 samples by a wide margin\, using substantially less training time. In 2.8 s\, TabPFN outperforms an ensemble of the strongest baselines tuned for 4 h in a classification setting. As a generative transformer-based foundation model\, this model also allows fine-tuning\, data generation\, density estimation and learning reusable embeddings. TabPFN is a learning algorithm that is itself learned across millions of synthetic datasets\, demonstrating the power of this approach for algorithm development. By improving modelling abilities across diverse fields\, TabPFN has the potential to accelerate scientific discovery and enhance important decision-making in various domains.
URL:https://www.ibs.re.kr/bimag/event/a-differentiable-gillespie-algorithm-for-simulating-chemical-kinetics-parameter-estimation-and-designing-synthetic-biological-circuits-dongju-lim/
LOCATION:Daejeon
CATEGORIES:Journal Club
ORGANIZER;CN="Jae Kyoung Kim":MAILTO:jaekkim@kaist.ac.kr
END:VEVENT
END:VCALENDAR