{"id":2365,"date":"2025-06-02T14:00:38","date_gmt":"2025-06-02T18:00:38","guid":{"rendered":"https:\/\/research.gsd.harvard.edu\/real\/?p=2365"},"modified":"2025-06-02T14:01:41","modified_gmt":"2025-06-02T18:01:41","slug":"project-synthia-2","status":"publish","type":"post","link":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/","title":{"rendered":"PROJECT SYNTHIA"},"content":{"rendered":"\n<h1 class=\"wp-block-heading\" id=\"h-project-synthia-is-now-exhibiting-in-loeb-library\">PROJECT SYNTHIA<\/h1>\n\n\n\n<h2 class=\"wp-block-heading\">An Experiment in Emergent Design<\/h2>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"768\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-1024x768.jpg\" alt=\"Text on glass &quot;A project by REAL lab&quot;\" class=\"wp-image-2381\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-1024x768.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-300x225.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-768x576.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-1536x1152.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>Project Synthia, by the Responsive Environments and Artifacts Lab (REAL) at the Harvard GSD, is a case study in emergent design methodology\u2014an approach that treats collaboration as a dynamic system, where outcomes are not imposed but negotiated. Here, design is framed as an act of assembly, shaped by a multiplicity of voices\u2014human, artificial, historical, and speculative\u2014rather than a singular vision. This exhibition is not a fixed narrative but an evolving synthesis, reflecting the tensions and harmonies of interdisciplinary thought. 
Synthia emerges from the intersections of geology, futuring, cognitive science, physics, art, design, architecture, history, landscape architecture, and generative AI\u2014each contributing fragments to a larger, unresolved puzzle. <\/p>\n\n\n\n<p><\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Synthia Lumen Timelapse\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/Ou8xRqV6qi8?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>The exhibition draws inspiration from radical experiments in design methodology, from Bruno Munari\u2019s Da cosa nasce cosa (from one thing comes another) to the interdisciplinary ethos of Gyorgy Kepes, Chicago\u2019s New Bauhaus, and MIT\u2019s Center for Advanced Visual Studies (CAVS). Like these predecessors, it challenges the notion of authorship and control, embracing uncertainty as a generative force. 
<\/p>\n\n\n\n<p><\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1560\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-scaled.jpg\" alt=\"Visitors walking amongst the exhibition\" class=\"wp-image-2371\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-scaled.jpg 2560w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-300x183.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-1024x624.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-768x468.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-1536x936.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210885-e-2048x1248.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1920\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-scaled.jpg\" alt=\"Booklet with diagram of sketches\" class=\"wp-image-2374\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-scaled.jpg 2560w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-300x225.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-1024x768.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-768x576.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-1536x1152.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210786-e-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n\n\n\n<p>At its core, Project Synthia is a meditation on the assembly of ideas, artifacts, and intelligences. 
It traverses time and discipline, collapsing distinctions between past and future, human and machine, fiction and materiality. It is both an excavation and an invention, a recognition of the mess we have made, and a proposition for how we might make sense of it. Developed as an experimental assemblage of perspectives, expertise, and technologies, the project stays in the unsettled space of play, co-creation, and contradiction. Here we found surprising new directions and ways of working, now materializing in a place of both speculative imagination and functional resolution.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1995\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-scaled.jpg\" alt=\"Machine on display in the exhibition\" class=\"wp-image-2377\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-scaled.jpg 2560w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-300x234.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-1024x798.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-768x599.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-1536x1197.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1220011-e-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>This approach leaves space for a breadth of outcomes and accepts uncertainty as central to collaboration, play, and iterative practice. 
As technology enables more voices\u2014human, artificial, historical, and speculative\u2014to contribute to creative production, we need methods that can accommodate these perspectives and check against systemic biases of emerging technologies, averaging of ideas, and the illusion of finality.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Exhibition Photos<\/h3>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1920\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-scaled.jpg\" alt=\"\" class=\"wp-image-2375\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-scaled.jpg 2560w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-300x225.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-1024x768.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-768x576.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-1536x1152.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210823-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1863\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-scaled.jpg\" alt=\"Exhibition with text, video and machines\" class=\"wp-image-2373\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-scaled.jpg 2560w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-300x218.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-1024x745.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-768x559.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-1536x1118.jpg 1536w, 
https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210795-e-2048x1491.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"2560\" height=\"1920\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-scaled.jpg\" alt=\"Layered Illustration of a rock and bike frame\" class=\"wp-image-2370\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-scaled.jpg 2560w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-300x225.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-1024x768.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-768x576.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-1536x1152.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210782-e-2048x1536.jpg 2048w\" sizes=\"auto, (max-width: 2560px) 100vw, 2560px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"2560\" src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210797-e-scaled.jpg\" alt=\"Photo of text and illustration of senses on the exhibition wall.\" class=\"wp-image-2372\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210797-e-scaled.jpg 1920w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210797-e-225x300.jpg 225w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210797-e-768x1024.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210797-e-1152x1536.jpg 1152w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210797-e-1536x2048.jpg 1536w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"781\" 
src=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210813-e-1024x781.jpg\" alt=\"Close up of model\" class=\"wp-image-2376\" srcset=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210813-e-1024x781.jpg 1024w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210813-e-300x229.jpg 300w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210813-e-768x586.jpg 768w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210813-e-1536x1172.jpg 1536w, https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210813-e-2048x1562.jpg 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Our collaborators:<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Joelle Abi Rached MD PhD<\/li>\n\n\n\n<li>Amelia Gan MDes \u201923<\/li>\n\n\n\n<li>Sean Nakamura Dolan MArch \u201923<\/li>\n\n\n\n<li>Youtian Duan MDes \u201924<\/li>\n\n\n\n<li>Cynthia Deng MArch \u201920<\/li>\n\n\n\n<li>Elif Erez-Henderson MArch \u201920<\/li>\n\n\n\n<li>Christian Nakarado<\/li>\n\n\n\n<li>Melanie Louterbach MLA \u201924<\/li>\n\n\n\n<li>Matte Lim MDes \u201925<\/li>\n\n\n\n<li>Justin Booz MLA\/MDes \u201925<\/li>\n\n\n\n<li>Gem Barton<\/li>\n\n\n\n<li>Prof. Melissa Franklin<\/li>\n\n\n\n<li>Jake Walker MDes \u201924<\/li>\n\n\n\n<li>Brooke Chornyak MDes \u201924<\/li>\n\n\n\n<li>Adrian Massey MDes \u201904<\/li>\n\n\n\n<li>Marieke Van Damme<\/li>\n\n\n\n<li>Beichen Xie MDes \u201924<\/li>\n<\/ul>\n\n\n\n<div style=\"height:100px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n","protected":false},"excerpt":{"rendered":"<p>PROJECT SYNTHIA An Experiment in Emergent Design Project Synthia, by the Responsive Environments and Artifacts Lab (REAL) at the Harvard 
[&hellip;]<\/p>\n","protected":false},"author":171,"featured_media":2381,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[32,33,28,19,34,1],"tags":[],"class_list":["post-2365","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-human-machine-interaction","category-intelligent-environments","category-portfolio","category-projects","category-senses-perception","category-uncategorized"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.7 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>PROJECT SYNTHIA - REAL<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"PROJECT SYNTHIA - REAL\" \/>\n<meta property=\"og:description\" content=\"PROJECT SYNTHIA An Experiment in Emergent Design Project Synthia, by the Responsive Environments and Artifacts Lab (REAL) at the Harvard [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\" \/>\n<meta property=\"og:site_name\" content=\"REAL\" \/>\n<meta property=\"article:published_time\" content=\"2025-06-02T18:00:38+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-06-02T18:01:41+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2560\" \/>\n\t<meta property=\"og:image:height\" content=\"1920\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Isa 
He\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Isa He\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"2 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\"},\"author\":{\"name\":\"Isa He\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/#\/schema\/person\/26af605123cca3e604b547fb0a4cebdc\"},\"headline\":\"PROJECT SYNTHIA\",\"datePublished\":\"2025-06-02T18:00:38+00:00\",\"dateModified\":\"2025-06-02T18:01:41+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\"},\"wordCount\":407,\"image\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg\",\"articleSection\":[\"HUMAN-MACHINEINTERACTION\",\"INTELLIGENT ENVIRONMENTS\",\"Portfolio\",\"Projects\",\"SENSES &amp; PERCEPTION\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\",\"url\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\",\"name\":\"PROJECT SYNTHIA - 
REAL\",\"isPartOf\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg\",\"datePublished\":\"2025-06-02T18:00:38+00:00\",\"dateModified\":\"2025-06-02T18:01:41+00:00\",\"author\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/#\/schema\/person\/26af605123cca3e604b547fb0a4cebdc\"},\"breadcrumb\":{\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage\",\"url\":\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg\",\"contentUrl\":\"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg\",\"width\":2560,\"height\":1920,\"caption\":\"Text on glass \\\"A project by REAL lab\\\"\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/research.gsd.harvard.edu\/real\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"PROJECT SYNTHIA\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/#website\",\"url\":\"https:\/\/research.gsd.harvard.edu\/real\/\",\"name\":\"REAL\",\"description\":\"Responsive Environments and Artifacts 
Lab\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/research.gsd.harvard.edu\/real\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/research.gsd.harvard.edu\/real\/#\/schema\/person\/26af605123cca3e604b547fb0a4cebdc\",\"name\":\"Isa He\",\"sameAs\":[\"https:\/\/scholar.harvard.edu\/isahe\"],\"url\":\"https:\/\/research.gsd.harvard.edu\/real\/author\/isathe\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"PROJECT SYNTHIA - REAL","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/","og_locale":"en_US","og_type":"article","og_title":"PROJECT SYNTHIA - REAL","og_description":"PROJECT SYNTHIA An Experiment in Emergent Design Project Synthia, by the Responsive Environments and Artifacts Lab (REAL) at the Harvard [&hellip;]","og_url":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/","og_site_name":"REAL","article_published_time":"2025-06-02T18:00:38+00:00","article_modified_time":"2025-06-02T18:01:41+00:00","og_image":[{"width":2560,"height":1920,"url":"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg","type":"image\/jpeg"}],"author":"Isa He","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Isa He","Est. 
reading time":"2 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#article","isPartOf":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/"},"author":{"name":"Isa He","@id":"https:\/\/research.gsd.harvard.edu\/real\/#\/schema\/person\/26af605123cca3e604b547fb0a4cebdc"},"headline":"PROJECT SYNTHIA","datePublished":"2025-06-02T18:00:38+00:00","dateModified":"2025-06-02T18:01:41+00:00","mainEntityOfPage":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/"},"wordCount":407,"image":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage"},"thumbnailUrl":"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg","articleSection":["HUMAN-MACHINEINTERACTION","INTELLIGENT ENVIRONMENTS","Portfolio","Projects","SENSES &amp; PERCEPTION"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/","url":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/","name":"PROJECT SYNTHIA - 
REAL","isPartOf":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/#website"},"primaryImageOfPage":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage"},"image":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage"},"thumbnailUrl":"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg","datePublished":"2025-06-02T18:00:38+00:00","dateModified":"2025-06-02T18:01:41+00:00","author":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/#\/schema\/person\/26af605123cca3e604b547fb0a4cebdc"},"breadcrumb":{"@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#primaryimage","url":"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg","contentUrl":"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg","width":2560,"height":1920,"caption":"Text on glass \"A project by REAL lab\""},{"@type":"BreadcrumbList","@id":"https:\/\/research.gsd.harvard.edu\/real\/2025\/06\/02\/project-synthia-2\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/research.gsd.harvard.edu\/real\/"},{"@type":"ListItem","position":2,"name":"PROJECT SYNTHIA"}]},{"@type":"WebSite","@id":"https:\/\/research.gsd.harvard.edu\/real\/#website","url":"https:\/\/research.gsd.harvard.edu\/real\/","name":"REAL","description":"Responsive Environments and Artifacts 
Lab","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/research.gsd.harvard.edu\/real\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/research.gsd.harvard.edu\/real\/#\/schema\/person\/26af605123cca3e604b547fb0a4cebdc","name":"Isa He","sameAs":["https:\/\/scholar.harvard.edu\/isahe"],"url":"https:\/\/research.gsd.harvard.edu\/real\/author\/isathe\/"}]}},"jetpack_featured_media_url":"https:\/\/research.gsd.harvard.edu\/real\/files\/2025\/06\/P1210851-scaled.jpg","_links":{"self":[{"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/posts\/2365","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/users\/171"}],"replies":[{"embeddable":true,"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/comments?post=2365"}],"version-history":[{"count":0,"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/posts\/2365\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/media\/2381"}],"wp:attachment":[{"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/media?parent=2365"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/categories?post=2365"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/research.gsd.harvard.edu\/real\/wp-json\/wp\/v2\/tags?post=2365"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}