<node id="670979">
  <nid>670979</nid>
  <type>event</type>
  <uid>
    <user id="28475"><![CDATA[28475]]></user>
  </uid>
  <created>1699556591</created>
  <changed>1699556656</changed>
  <title><![CDATA[Ph.D. Proposal Oral Exam - Biswadeep Chakraborty]]></title>
<body><![CDATA[<p><strong>Title:</strong> <em>Dynamics in Diversity: Exploiting Neuronal and Synaptic Heterogeneity in Recurrent Spiking Neural Networks</em></p>

<p><strong>Committee:</strong></p>

<p>Dr. Mukhopadhyay, Advisor</p>

<p>Dr. Datta, Chair</p>

<p>Dr. Romberg</p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Dynamics in Diversity: Exploiting Neuronal and Synaptic Heterogeneity in Recurrent Spiking Neural Networks]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>In this research, we propose a novel heterogeneous recurrent spiking neural network (HRSNN) that uses spike-timing-dependent plasticity (STDP) rules with varying neuronal and synaptic dynamics. We aim to model, characterize, and optimize this unsupervised learning method both analytically and empirically. The thesis illustrates how temporal event processing in an HRSNN offers significant advantages over traditional homogeneous RSNNs (MRSNNs), backpropagated RSNNs (BPRSNNs), and deep neural networks. By harnessing the diversity of neuronal and synaptic timescales, we engineer models that are more robust and generalizable than those attainable with supervised learning alone. Furthermore, STDP allows these models to train with considerably less data, learn the underlying dynamics in far fewer timesteps, and exhibit remarkable noise resilience. One key drawback of parameter heterogeneity, however, is complexity: the model becomes much more challenging to train, and finding optimal hyperparameters becomes exponentially more difficult as the number of neurons increases. Standard activity-based pruning methods make HRSNN models extremely unstable and thus hard to train; moreover, such methods do not fully exploit the diversity of neuronal timescales present in an HRSNN. We therefore introduce a novel Lyapunov Noise Pruning (LNP) algorithm that applies graph sparsification methods and Lyapunov exponents to design a stable sparse HRSNN from an untrained HRSNN model, leveraging the diversity of neuronal timescales. Going forward, we aim to exploit the rich dynamics of HRSNNs to tackle problems that have hitherto posed significant challenges for conventional DNN models, in particular NP-hard combinatorial optimization and variational inequality problems.
In harnessing the capabilities of HRSNNs, we have unveiled a pathway not only to more efficient neural processing but also to solving complex problems, redefining the boundaries of neural computation.</p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2023-11-10T14:30:00-05:00]]></value>
      <value2><![CDATA[2023-11-10T16:30:00-05:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[Room 2108, Klaus]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
          <item>
        <url>https://teams.microsoft.com/l/meetup-join/19%3ameeting_OTdkZjc5ZmYtODdjZi00M2ZlLTk2NmItMTE3MTcwNDcwOWU3%40thread.v2/0?context=%7b%22Tid%22%3a%22482198bb-ae7b-4b25-8b7a-6d7f32faa083%22%2c%22Oid%22%3a%22c08c2d11-f243-48d6-9918-de47fcbcbe1c%22%7d</url>
        <link_title><![CDATA[Microsoft Teams Meeting link]]></link_title>
      </item>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>434371</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[ECE Ph.D. Proposal Oral Exams]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1788</tid>
        <value><![CDATA[Other/Miscellaneous]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>102851</tid>
        <value><![CDATA[Ph.D. proposal]]></value>
      </item>
          <item>
        <tid>1808</tid>
        <value><![CDATA[graduate students]]></value>
      </item>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
