{"id":12365,"date":"2026-04-06T18:25:50","date_gmt":"2026-04-06T09:25:50","guid":{"rendered":"https:\/\/www.ibs.re.kr\/bimag\/?post_type=tribe_events&#038;p=12365"},"modified":"2026-04-06T19:37:49","modified_gmt":"2026-04-06T10:37:49","slug":"foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan","status":"publish","type":"tribe_events","link":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/","title":{"rendered":"Foundation Models for Wearable Movement Data in Mental Health Research &#8211; Aqsa Awan"},"content":{"rendered":"<p>In this talk, we discuss the paper \u201cFoundation Models for Wearable Movement Data in Mental Health Research\u201d by Franklin Y. Ruan et al., arXiv, 2025.<\/p>\n<p><strong>Abstract<\/strong><\/p>\n<p>Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. Wearable movement data offers a valuable avenue for exploration, as it&#8217;s a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. 
PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In this talk, we discuss the paper \u201cFoundation Models for Wearable Movement Data in Mental Health Research\u201d by Franklin Y. Ruan et al., arXiv, 2025. Abstract Pretrained foundation models and &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Foundation Models for Wearable Movement Data in Mental Health Research &#8211; Aqsa Awan&#8221;<\/span><\/a><\/p>\n","protected":false},"author":13,"featured_media":0,"template":"","meta":{"_editorskit_title_hidden":false,"_editorskit_reading_time":0,"_editorskit_is_block_options_detached":false,"_editorskit_block_options_position":"{}","_uag_custom_page_level_css":"","_tribe_events_status":"","_tribe_events_status_reason":"","footnotes":""},"tags":[],"tribe_events_cat":[219],"class_list":["post-12365","tribe_events","type-tribe_events","status-publish","hentry","tribe_events_cat-journal-club","cat_journal-club"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Foundation Models for Wearable Movement Data in Mental Health Research - Aqsa Awan - Biomedical Mathematics Group<\/title>\n<meta name=\"description\" content=\"Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. 
Wearable movement data offers a valuable avenue for exploration, as it&#039;s a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Foundation Models for Wearable Movement Data in Mental Health Research - Aqsa Awan - Biomedical Mathematics Group\" \/>\n<meta property=\"og:description\" content=\"Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. Wearable movement data offers a valuable avenue for exploration, as it&#039;s a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. 
We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/\" \/>\n<meta property=\"og:site_name\" content=\"Biomedical Mathematics Group\" \/>\n<meta property=\"article:modified_time\" content=\"2026-04-06T10:37:49+00:00\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"1 minute\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/event\\\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\\\/\",\"url\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/event\\\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\\\/\",\"name\":\"Foundation Models for Wearable Movement Data in Mental Health Research - Aqsa Awan - Biomedical Mathematics Group\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/#website\"},\"datePublished\":\"2026-04-06T09:25:50+00:00\",\"dateModified\":\"2026-04-06T10:37:49+00:00\",\"description\":\"Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. 
However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. Wearable movement data offers a valuable avenue for exploration, as it's a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/event\\\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/event\\\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\\\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/event\\\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Events\",\"item\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/events\\\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Foundation Models for Wearable Movement Data in Mental Health Research &#8211; Aqsa 
Awan\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/#website\",\"url\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/\",\"name\":\"Biomedical Mathematics Group\",\"description\":\"\uae30\ucd08\uacfc\ud559\uc5f0\uad6c\uc6d0 \uc758\uc0dd\uba85\uc218\ud559\uadf8\ub8f9\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/#organization\",\"name\":\"IBS Biomedical Mathematics Group\",\"url\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/cms\\\/wp-content\\\/uploads\\\/2021\\\/02\\\/ibs-circle-1.png\",\"contentUrl\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/cms\\\/wp-content\\\/uploads\\\/2021\\\/02\\\/ibs-circle-1.png\",\"width\":250,\"height\":250,\"caption\":\"IBS Biomedical Mathematics Group\"},\"image\":{\"@id\":\"https:\\\/\\\/www.ibs.re.kr\\\/bimag\\\/#\\\/schema\\\/logo\\\/image\\\/\"}}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Foundation Models for Wearable Movement Data in Mental Health Research - Aqsa Awan - Biomedical Mathematics Group","description":"Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. 
Wearable movement data offers a valuable avenue for exploration, as it's a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/","og_locale":"en_US","og_type":"article","og_title":"Foundation Models for Wearable Movement Data in Mental Health Research - Aqsa Awan - Biomedical Mathematics Group","og_description":"Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. Wearable movement data offers a valuable avenue for exploration, as it's a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. 
Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.","og_url":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/","og_site_name":"Biomedical Mathematics Group","article_modified_time":"2026-04-06T10:37:49+00:00","twitter_card":"summary_large_image","twitter_misc":{"Est. reading time":"1 minute"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/","url":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/","name":"Foundation Models for Wearable Movement Data in Mental Health Research - Aqsa Awan - Biomedical Mathematics Group","isPartOf":{"@id":"https:\/\/www.ibs.re.kr\/bimag\/#website"},"datePublished":"2026-04-06T09:25:50+00:00","dateModified":"2026-04-06T10:37:49+00:00","description":"Pretrained foundation models and transformer architectures have driven the success of large language models (LLMs) and other modern AI breakthroughs. However, similar advancements in health data modeling remain limited due to the need for innovative adaptations. Wearable movement data offers a valuable avenue for exploration, as it's a core feature in nearly all commercial smartwatches, well established in clinical and mental health research, and the sequential nature of the data shares similarities to language. We introduce the Pretrained Actigraphy Transformer (PAT), the first open source foundation model designed for time-series wearable movement data. 
Leveraging transformer-based architectures and novel techniques, such as patch embeddings, and pretraining on data from 29,307 participants in a national U.S. sample, PAT achieves state-of-the-art performance in several mental health prediction tasks. PAT is also lightweight and easily interpretable, making it a robust tool for mental health research.","breadcrumb":{"@id":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.ibs.re.kr\/bimag\/event\/foundation-models-for-wearable-movement-data-in-mental-health-research-aqsa-awan\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.ibs.re.kr\/bimag\/"},{"@type":"ListItem","position":2,"name":"Events","item":"https:\/\/www.ibs.re.kr\/bimag\/events\/"},{"@type":"ListItem","position":3,"name":"Foundation Models for Wearable Movement Data in Mental Health Research &#8211; Aqsa Awan"}]},{"@type":"WebSite","@id":"https:\/\/www.ibs.re.kr\/bimag\/#website","url":"https:\/\/www.ibs.re.kr\/bimag\/","name":"Biomedical Mathematics Group","description":"\uae30\ucd08\uacfc\ud559\uc5f0\uad6c\uc6d0 \uc758\uc0dd\uba85\uc218\ud559\uadf8\ub8f9","publisher":{"@id":"https:\/\/www.ibs.re.kr\/bimag\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.ibs.re.kr\/bimag\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.ibs.re.kr\/bimag\/#organization","name":"IBS Biomedical Mathematics 
Group","url":"https:\/\/www.ibs.re.kr\/bimag\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.ibs.re.kr\/bimag\/#\/schema\/logo\/image\/","url":"https:\/\/www.ibs.re.kr\/bimag\/cms\/wp-content\/uploads\/2021\/02\/ibs-circle-1.png","contentUrl":"https:\/\/www.ibs.re.kr\/bimag\/cms\/wp-content\/uploads\/2021\/02\/ibs-circle-1.png","width":250,"height":250,"caption":"IBS Biomedical Mathematics Group"},"image":{"@id":"https:\/\/www.ibs.re.kr\/bimag\/#\/schema\/logo\/image\/"}}]}},"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false,"dimag-thumbnail":false,"twentyseventeen-featured-image":false,"twentyseventeen-thumbnail-avatar":false},"uagb_author_info":{"display_name":"Hyeong Jun Jang","author_link":"https:\/\/www.ibs.re.kr\/bimag\/author\/hyeong-jun-jang\/"},"uagb_comment_info":0,"uagb_excerpt":"In this talk, we discuss the paper \u201cFoundation Models for Wearable Movement Data in Mental Health Research\u201d by Franklin Y. Ruan et al., arXiv, 2025. 
Abstract Pretrained foundation models and &hellip; Continue reading \"Foundation Models for Wearable Movement Data in Mental Health Research &#8211; Aqsa Awan\"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/tribe_events\/12365","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/tribe_events"}],"about":[{"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/types\/tribe_events"}],"author":[{"embeddable":true,"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/users\/13"}],"version-history":[{"count":1,"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/tribe_events\/12365\/revisions"}],"predecessor-version":[{"id":12366,"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/tribe_events\/12365\/revisions\/12366"}],"wp:attachment":[{"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/media?parent=12365"}],"wp:term":[{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/tags?post=12365"},{"taxonomy":"tribe_events_cat","embeddable":true,"href":"https:\/\/www.ibs.re.kr\/bimag\/wp-json\/wp\/v2\/tribe_events_cat?post=12365"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}