<node id="670895">
  <nid>670895</nid>
  <type>event</type>
  <uid>
    <user id="27707"><![CDATA[27707]]></user>
  </uid>
  <created>1699283163</created>
  <changed>1699283163</changed>
  <title><![CDATA[PhD Defense by Nathaniel Moore Glaser]]></title>
  <body><![CDATA[<p><span><span>Dear faculty members and fellow students,</span></span></p>

<p><span>&nbsp;</span></p>

<p><span><span>You are cordially invited to attend my dissertation defense on Friday, November 17th.</span></span></p>

<p><span>&nbsp;</span></p>

<p><strong><span>Title: </span></strong><span>Collaborative Perception and Planning for Multi-View and Multi-Robot Systems</span></p>

<p><span>&nbsp;</span></p>

<p><strong><span>Date: </span></strong><span>Friday, November 17th, 2023</span></p>

<p><strong><span>Time: </span></strong><span>1:00 PM - 3:00 PM EST</span></p>

<p><strong><span>Location: </span></strong><span>Coda C1315 ("Grant Park") or <a href="https://gatech.zoom.us/j/9381543689" title="https://gatech.zoom.us/j/9381543689">this Zoom link</a></span></p>

<p><span>&nbsp;</span></p>

<p><strong><span>Nathaniel Moore Glaser</span></strong></p>

<p><span>Robotics PhD Candidate</span></p>

<p><span>School of Electrical and Computer Engineering</span></p>

<p><span>Georgia Institute of Technology</span></p>

<p><span>&nbsp;</span></p>

<p><strong><span>Committee:</span></strong></p>

<p><span>Dr. Zsolt Kira (Advisor) - School of Interactive Computing, Georgia Institute of Technology</span></p>

<p><span>Dr. James Hays - School of Interactive Computing, Georgia Institute of Technology</span></p>

<p><span>Dr. Patricio Vela - School of Electrical and Computer Engineering, Georgia Institute of Technology</span></p>

<p><span>Dr. Pratap Tokekar - Department of Computer Science, University of Maryland</span></p>

<p><span>Dr. Milutin Pajovic - Senior Research Scientist, Analog Devices</span></p>

<p><span>&nbsp;</span></p>

<p><strong><span>Abstract:</span></strong></p>

<p><span>The field of robotics has historically focused on <em>egocentric, single-agent</em> systems.&nbsp; However, robots in such systems are susceptible to single points of failure.&nbsp; For instance, a single sensor failure or adverse environmental condition can render an isolated robot "blind".&nbsp; On the other hand, robots in <em>multi-agent</em> systems have the opportunity to overcome potentially dangerous blind spots, via communication and collaboration with their peers.&nbsp; In this dissertation, we address these communication-critical settings with <strong>Collaborative Perception and Planning for Multi-View and Multi-Robot Systems</strong>.</span></p>

<p><span>&nbsp;</span></p>

<p><span>First, we develop several learned communication and spatial registration schemes for <strong><em>immediate collaboration</em></strong>.&nbsp; These schemes allow us to efficiently communicate and align visual observations between moving agents for <em>a single instant in time</em>.&nbsp; We demonstrate improved egocentric semantic segmentation accuracy for a swarm of obstruction-prone aerial quadrotors.</span></p>

<p><span>&nbsp;</span></p>

<p><span>Second, we extend our methods to <strong><em>short-term collaboration</em></strong>, where robots compress short sequences of observations into easily communicable representations, such as maps and costmaps.&nbsp; In addition to conserving bandwidth, our methods (1) reduce task duration for multi-robot mapping and (2) improve safety for planning, especially in self-driving settings where egocentric vision has potentially fatal blind spots.</span></p>

<p><span>&nbsp;</span></p>

<p><span>Third, we stress-test our methods on <strong><em>large-scale, long-term collaboration</em></strong>.&nbsp; In the setting of production-scale robotic farming, we show how collaborative perception can handle <em>large</em> numbers of <em>heterogeneous</em> robots by establishing correspondences between <em>ambiguous</em> data over <em>long</em> stretches of time.&nbsp; We hope that our work on collaborative perception will help transition single-agent systems toward robust and efficient multi-agent capabilities.</span></p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Collaborative Perception and Planning for Multi-View and Multi-Robot Systems]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p><span><span><span>Collaborative Perception and Planning for Multi-View and Multi-Robot Systems</span></span></span></p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2023-11-17T13:00:00-05:00]]></value>
      <value2><![CDATA[2023-11-17T15:00:00-05:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[Coda C1315 ("Grant Park") or via Zoom: https://gatech.zoom.us/j/9381543689]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>221981</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[Graduate Studies]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>100811</tid>
        <value><![CDATA[PhD Defense]]></value>
      </item>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
