<node id="670995">
  <nid>670995</nid>
  <type>event</type>
  <uid>
    <user id="28475"><![CDATA[28475]]></user>
  </uid>
  <created>1699624574</created>
  <changed>1699624613</changed>
  <title><![CDATA[Ph.D. Dissertation Defense - Nathaniel Glaser]]></title>
  <body><![CDATA[<p><strong>Title:</strong> <em>Collaborative Perception and Planning for Multi-View and Multi-Robot Systems</em></p>

<p><strong>Committee:</strong><br />
Dr. Zsolt Kira (Advisor) - School of Interactive Computing, Georgia Institute of Technology<br />
Dr. James Hays - School of Interactive Computing, Georgia Institute of Technology<br />
Dr. Patricio Vela - School of Electrical and Computer Engineering, Georgia Institute of Technology<br />
Dr. Pratap Tokekar - Department of Computer Science, University of Maryland<br />
Dr. Milutin Pajovic - Senior Research Scientist, Analog Devices</p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Collaborative Perception and Planning for Multi-View and Multi-Robot Systems]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>The field of robotics has historically focused on egocentric, single-agent systems. However, robots in such systems are susceptible to single points of failure. For instance, a single sensor failure or adverse environmental condition can render an isolated robot "blind". On the other hand, robots in multi-agent systems have the opportunity to overcome potentially dangerous blind spots via communication and collaboration with their peers. In this dissertation, we address these communication-critical settings with Collaborative Perception and Planning for Multi-View and Multi-Robot Systems. First, we develop several learned communication and spatial registration schemes for immediate collaboration. These schemes allow us to efficiently communicate and align visual observations between moving agents for a single instant in time. We demonstrate improved egocentric semantic segmentation accuracy for a swarm of obstruction-prone aerial quadrotors. Second, we extend our methods to short-term collaboration, where robots compress short sequences of observations into easily communicable representations, such as maps and costmaps. In addition to conserving bandwidth, our methods (1) reduce task duration for multi-robot mapping and (2) improve safety for planning, especially in self-driving settings where egocentric vision has potentially fatal blind spots. Third, we stress-test our methods on large-scale, long-term collaboration. In the setting of production-scale robotic farming, we show how collaborative perception is capable of handling large numbers of heterogeneous robots by corresponding ambiguous data over long stretches of time. We hope that our work on collaborative perception will help transition single-agent systems to robust and efficient multi-agent capabilities.</p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2023-11-17T13:00:00-05:00]]></value>
      <value2><![CDATA[2023-11-17T15:00:00-05:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[Coda C1315 ("Grant Park")]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
          <item>
        <url>https://gatech.zoom.us/j/9381543689</url>
        <link_title><![CDATA[Zoom link]]></link_title>
      </item>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>434381</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[ECE Ph.D. Dissertation Defenses]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[Phd Defense]]></value>
      </item>
          <item>
        <tid>1808</tid>
        <value><![CDATA[graduate students]]></value>
      </item>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
