BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Biomedical Mathematics Group - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Biomedical Mathematics Group
X-ORIGINAL-URL:https://www.ibs.re.kr/bimag
X-WR-CALDESC:Events for Biomedical Mathematics Group
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Asia/Seoul
BEGIN:STANDARD
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:KST
DTSTART:20230101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Asia/Seoul:20240105T140000
DTEND;TZID=Asia/Seoul:20240105T160000
DTSTAMP:20260425T042259Z
CREATED:20231130T084919Z
LAST-MODIFIED:20231215T004743Z
UID:8754-1704463200-1704470400@www.ibs.re.kr
SUMMARY:Hyeontae Jo\, Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery
DESCRIPTION:We will discuss “Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery”\, IEEE Transactions on Neural Networks and Learning Systems 32.9 (2020): 4166-4177. \nAbstract \n\nSymbolic regression is a powerful technique to discover analytic equations that describe data\, which can lead to explainable models and the ability to predict unseen data. In contrast\, neural networks have achieved amazing levels of accuracy on image recognition and natural language processing tasks\, but they are often seen as black-box models that are difficult to interpret and typically extrapolate poorly. In this article\, we use a neural network-based architecture for symbolic regression called the equation learner (EQL) network and integrate it with other deep learning architectures such that the whole system can be trained end-to-end through backpropagation. To demonstrate the power of such systems\, we study their performance on several substantially different tasks. First\, we show that the neural network can perform symbolic regression and learn the form of several functions. Next\, we present an MNIST arithmetic task where a convolutional network extracts the digits. Finally\, we demonstrate the prediction of dynamical systems where an unknown parameter is extracted through an encoder. We find that the EQL-based architecture can extrapolate quite well outside of the training data set compared with a standard neural network-based architecture\, paving the way for deep learning to be applied in scientific exploration and discovery.
URL:https://www.ibs.re.kr/bimag/event/2024-01-05-jc/
LOCATION:B378 Seminar room\, IBS\, 55 Expo-ro Yuseong-gu\, Daejeon\, 34126\, Korea\, Republic of
CATEGORIES:Journal Club
ORGANIZER;CN="Jae Kyoung Kim":MAILTO:jaekkim@kaist.ac.kr
END:VEVENT
END:VCALENDAR