<node id="592693">
  <nid>592693</nid>
  <type>news</type>
  <uid>
    <user id="33939"><![CDATA[33939]]></user>
  </uid>
  <created>1497448656</created>
  <changed>1497448656</changed>
  <title><![CDATA[Assistant Professor Devi Parikh Earns IJCAI Computers and Thought Award]]></title>
  <body><![CDATA[<p>School of Interactive Computing Assistant Professor <strong>Devi Parikh</strong> was named recipient of the 2017 <a href="https://www.ijcai.org/awards">International Joint Conferences on Artificial Intelligence Computers and Thought Award</a>, considered the premier award for artificial intelligence researchers under the age of 35.</p>

<p>She was selected by the IJCAI-17 Awards Selection Committee for her contributions at the intersection of words, pictures, and common sense. This includes semantic image understanding, the use of visual attributes for human-machine collaboration and visual abstractions for learning common sense, and enabling humans to interact with visual content via natural language.</p>

<p>Parikh joins an exclusive list of 27 AI visionaries who have received the award since 1971, including Terry Winograd, David Marr, and Tom Mitchell in its early days and Stuart Russell, Daphne Koller, Carlos Guestrin, and Andrew Ng more recently.</p>

<p>Parikh said she is excited about the recognition her lab&rsquo;s work in visual question answering (VQA) is receiving.</p>

<p>&ldquo;Through making our large datasets and systems publicly available, we have enabled research groups around the world to make significant progress on building machines that can automatically answer questions about visual content,&rdquo; Parikh said. &ldquo;This has applications in any scenario where it is difficult, if not impossible, for someone to sift through visual data to elicit the information they need, be it aiding visually-impaired users, users on low-bandwidth networks that cannot support visual data, or assisting analysts in making decisions based on large quantities of visual feeds.</p>

<p>&ldquo;It has been rewarding to play a role in the creation of an entirely new sub-field of scientific endeavor in artificial intelligence and witness the research community rally around VQA.&rdquo;</p>

<p>This is one of a number of awards that Parikh has earned in recent months. She earned a <a href="http://www.cc.gatech.edu/news/588083/pair-ic-assistant-professors-earn-awards-research-explainable-intelligent-systems-and">Google Research Faculty Award</a>, an <a href="http://www.cc.gatech.edu/news/586463/amazon-research-awards-fund-computer-vision-and-machine-learning-projects">Amazon Academic Research Award</a>, and was featured last week in <em>Forbes</em> magazine as one of a handful of <a href="https://www.forbes.com/sites/mariyayao/2017/05/18/meet-20-incredible-women-advancing-a-i-research/2/#cee2a6e4edee">women advancing artificial intelligence research</a>.</p>
]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2017-06-14T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[The Computers and Thought Award is considered to be the premier award for AI researchers under the age of 35.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="586462">
            <nid>586462</nid>
            <type>image</type>
            <title><![CDATA[Devi Parikh]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>223510</fid>
                  <filename><![CDATA[Devi Parikh.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/images/Devi%20Parikh.jpg]]></filepath>
                  <file_full_path><![CDATA[http://tlwarc.hg.gatech.edu//sites/default/files/images/Devi%20Parikh.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[david.mitchell@cc.gatech.edu]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<p>David Mitchell</p>

<p>Communications Officer</p>
]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <og_groups>
          <item>47223</item>
          <item>1299</item>
          <item>50876</item>
      </og_groups>
  <og_groups_both>
          <item>
        <![CDATA[Computer Science/Information Technology and Security]]>
      </item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>153</tid>
        <value><![CDATA[Computer Science/Information Technology and Security]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="39501"><![CDATA[People and Technology]]></term>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <links_related>
          <link>
      <url>https://www.ijcai.org/awards</url>
      <title></title>
      </link>
      </links_related>
  <files>
      </files>
  <og_groups_both>
          <item><![CDATA[College of Computing]]></item>
          <item><![CDATA[GVU Center]]></item>
          <item><![CDATA[School of Interactive Computing]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>174685</tid>
        <value><![CDATA[computers and thought award]]></value>
      </item>
          <item>
        <tid>173616</tid>
        <value><![CDATA[devi parikh]]></value>
      </item>
          <item>
        <tid>2556</tid>
        <value><![CDATA[artificial intelligence]]></value>
      </item>
          <item>
        <tid>166848</tid>
        <value><![CDATA[School of Interactive Computing]]></value>
      </item>
      </field_keywords>
  <field_userdata>
      <![CDATA[]]>
  </field_userdata>
</node>
