{"670995":{"#nid":"670995","#data":{"type":"event","title":"Ph.D. Dissertation Defense - Nathaniel Glaser","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle:\u003C\/strong\u003E \u003Cem\u003ECollaborative Perception and Planning for Multi-View and Multi-Robot Systems\u003C\/em\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003ECommittee:\u003C\/strong\u003E\u003Cbr \/\u003E\r\nDr. Zsolt Kira (Advisor) - School of Interactive Computing, Georgia Institute of Technology\u003Cbr \/\u003E\r\nDr. James Hays - School of Interactive Computing, Georgia Institute of Technology\u003Cbr \/\u003E\r\nDr. Patricio Vela - School of Electrical and Computer Engineering, Georgia Institute of Technology\u003Cbr \/\u003E\r\nDr. Pratap Tokekar - Department of Computer Science, University of Maryland\u003Cbr \/\u003E\r\nDr. Milutin Pajovic - Senior Research Scientist, Analog Devices\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EThe field of robotics has historically focused on egocentric, single-agent systems. However, robots in such systems are susceptible to single points of failure. For instance, a single sensor failure or adverse environmental condition can render an isolated robot \u0022blind\u0022. On the other hand, robots in multi-agent systems have the opportunity to overcome potentially dangerous blind spots via communication and collaboration with their peers. In this dissertation, we address these communication-critical settings with Collaborative Perception and Planning for Multi-View and Multi-Robot Systems. First, we develop several learned communication and spatial registration schemes for immediate collaboration. These schemes allow us to efficiently communicate and align visual observations between moving agents for a single instant in time. 
We demonstrate improved egocentric semantic segmentation accuracy for a swarm of obstruction-prone aerial quadrotors. Second, we extend our methods to short-term collaboration, where robots compress short sequences of observations into easily communicable representations, such as maps and costmaps. In addition to conserving bandwidth, our methods (1) reduce task duration for multi-robot mapping and (2) improve safety for planning, especially in self-driving settings where egocentric vision has potentially fatal blind spots. Third, we stress-test our methods on large-scale, long-term collaboration. In the setting of production-scale robotic farming, we show how collaborative perception can handle large numbers of heterogeneous robots by establishing correspondences in ambiguous data over long stretches of time. We hope that our work on collaborative perception will help transition single-agent systems to robust and efficient multi-agent capabilities.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"Collaborative Perception and Planning for Multi-View and Multi-Robot Systems"}],"uid":"28475","created_gmt":"2023-11-10 13:56:14","changed_gmt":"2023-11-10 13:56:53","author":"Daniela Staiculescu","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2023-11-17T13:00:00-05:00","event_time_end":"2023-11-17T15:00:00-05:00","event_time_end_last":"2023-11-17T15:00:00-05:00","gmt_time_start":"2023-11-17 18:00:00","gmt_time_end":"2023-11-17 20:00:00","gmt_time_end_last":"2023-11-17 20:00:00","rrule":null,"timezone":"America\/New_York"},"location":"Coda C1315 (\u0022Grant Park\u0022)","extras":[],"related_links":[{"url":"https:\/\/gatech.zoom.us\/j\/9381543689","title":"Zoom link"}],"groups":[{"id":"434381","name":"ECE Ph.D. 
Dissertation Defenses"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"},{"id":"1808","name":"graduate students"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}