<node id="668475">
  <nid>668475</nid>
  <type>event</type>
  <uid>
    <user id="28475"><![CDATA[28475]]></user>
  </uid>
  <created>1689283708</created>
  <changed>1689343516</changed>
  <title><![CDATA[Ph.D. Dissertation Defense - Ashwin Lele]]></title>
  <body><![CDATA[<p><strong>Title:</strong> <em>Spiking Neural Networks Enabled Circuits and Systems for Edge Robots</em></p>

<p><strong>Committee:</strong></p>

<p>Dr. Arijit Raychowdhury, ECE, Chair, Advisor</p>

<p>Dr. Suman Datta, ECE</p>

<p>Dr. Justin Romberg, ECE</p>

<p>Dr. Visvesh Sathe, ECE</p>

<p>Dr. Alexey Tumanov, CoC</p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Spiking Neural Networks Enabled Circuits and Systems for Edge Robots]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Robotic computing at the edge must meet multiple constraints on power and form factor while delivering the performance required by power-hungry neural network kernels. This work proposes spiking neural network (SNN) alternatives and augmentations to the algorithms and circuits used in edge robots. We show SNN-driven locomotion for power-constrained hexapod robots, SNN-augmented target tracking for high-speed aerial robots, and SNN-assisted visual navigation for size-critical micro-robots. The first part of the work extends the rhythmic leg movement of insects to an SNN-based gait generator and demonstrates an online training method. We then use an event-based vision sensor as the sensory front end for hexapod locomotion, showing the first spike-only closed-loop robotic platform. In the second part, we observe that an SNN paired with an event camera forms a sensor-processor pair well suited to high-speed processing, while a frame camera paired with a convolutional neural network (CNN) suits applications with high accuracy requirements. This trade-off between accuracy and latency in event- and frame-based visual processing arises from the fine temporal and spatial resolutions captured by event and frame cameras, respectively. We exploit these complementary strengths to build a high-speed target identification and tracking system in which the SNN provides fast but noisy target estimates and the CNN preserves accuracy by supplying reliable periodic anchors. We build a heterogeneous SoC in which low-power RRAM compute-in-memory maps the CNN and high-speed SRAM compute-near-memory accelerates the SNN. The final part of the work generalizes the multi-modal processing of the previous chapters, dividing robotic computing workloads between a trainable CNN for perception tasks and physics-based symbolic processing for motion-encoding tasks. Our SoC uses RRAM compute-near-memory kernels to accelerate CNN-based perception while SRAM compute-in-memory carries out SNN-based localization on micro-robots. To summarize, this work substitutes and augments compute-constrained robotic hardware with SNNs for energy savings and performance improvements.</p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2023-07-21T11:00:00-04:00]]></value>
      <value2><![CDATA[2023-07-21T13:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[Online]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
          <item>
        <url>https://teams.microsoft.com/l/meetup-join/19%3ameeting_NDE3ODE1ZjgtYjczMi00NDE5LThlYmUtOTE5OTYyZjAxODdh%40thread.v2/0?context=%7b%22Tid%22%3a%22482198bb-ae7b-4b25-8b7a-6d7f32faa083%22%2c%22Oid%22%3a%226f30fa3a-5c53-40ff-90eb-2dad1829ac34%22%7d</url>
        <link_title><![CDATA[Microsoft Teams Meeting link]]></link_title>
      </item>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>434381</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[ECE Ph.D. Dissertation Defenses]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>192484</tid>
        <value><![CDATA[PhD Defense, graduate students]]></value>
      </item>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
