{"588083":{"#nid":"588083","#data":{"type":"news","title":"Pair of IC Assistant Professors Earn Awards for Research in Visual Question Answering","body":[{"value":"\u003Cp\u003ETwo assistant professors in the School of Interactive Computing received awards for their respective research in the field of Visual Question Answering (VQA) last week.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAssistant Professor\u0026nbsp;\u003Cstrong\u003EDhruv Batra\u003C\/strong\u003E\u0026nbsp;earned a Young Investigator award from the Office of Naval Research and \u003Cstrong\u003EDevi Parikh\u003C\/strong\u003E earned a Google Research Faculty Award.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe grants will provide $510,000 over three years for Batra\u0026#39;s research,\u0026nbsp;\u003Cem\u003EExplainable and Trustworthy\u0026nbsp;Intelligent Systems\u003C\/em\u003E, and $85,681 for one year for Parikh\u0026rsquo;s, \u003Cem\u003EMaking the V in VQA Matter: Elevating the Role of Image Understanding in Visual Question Answering\u003C\/em\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn VQA, given an image and a free-form natural language question about the image, the machine\u0026#39;s task is to automatically produce a concise, accurate, free-form, natural language answer.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EBatra\u0026rsquo;s research aims to 1) develop theory, algorithms, and implementations for transparent deep neural networks that are able to provide explanations for their predictions, and 2) study the effect of these transparent neural networks and explanations on user trust and perceived trustworthiness, with VQA as the AI testbed.\u003C\/p\u003E\r\n\r\n\u003Cp\u003ESimilarly, Parikh\u0026rsquo;s research aims to build a more balanced VQA dataset that reduces language biases and enables evaluation protocols that more accurately reflect progress in image understanding. Another goal is to train a VQA model that leverages that balanced dataset to promote more detailed image understanding, and to develop a counter-example-based explanation modality, where the VQA model justifies its answer by providing examples of images it believes are similar to the image at hand.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe result will be that users can better trust the VQA model and anticipate its failures, according to the proposal.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026quot;I think this line of research addresses a fundamental problem for the future of AI -- how do we make AI trustworthy?\u0026quot; Batra said. \u0026quot;How do we build intelligent systems that explain why they are making the predictions they are making?\u0026quot;\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"Assistant Professors Dhruv Batra and Devi Parikh earned the Young Investigator award from the Office of Naval Research and a Google Research Faculty Award, respectively."}],"uid":"33939","created_gmt":"2017-02-27 22:03:49","changed_gmt":"2017-02-28 19:19:25","author":"David Mitchell","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2017-02-27T00:00:00-05:00","iso_date":"2017-02-27T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"586461":{"id":"586461","type":"image","title":"Dhruv Batra","body":null,"created":"1485377710","gmt_created":"2017-01-25 20:55:10","changed":"1485377710","gmt_changed":"2017-01-25 20:55:10","alt":"","file":{"fid":"223509","name":"DhruvBatra.jpg","image_path":"\/sites\/default\/files\/images\/DhruvBatra.jpg","image_full_path":"http:\/\/tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/DhruvBatra.jpg","mime":"image\/jpeg","size":82240,"path_740":"http:\/\/tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/DhruvBatra.jpg?itok=WX8pn09l"}},"586462":{"id":"586462","type":"image","title":"Devi Parikh","body":null,"created":"1485377735","gmt_created":"2017-01-25 20:55:35","changed":"1485377735","gmt_changed":"2017-01-25 20:55:35","alt":"","file":{"fid":"223510","name":"Devi Parikh.jpg","image_path":"\/sites\/default\/files\/images\/Devi%20Parikh.jpg","image_full_path":"http:\/\/tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/Devi%20Parikh.jpg","mime":"image\/jpeg","size":62731,"path_740":"http:\/\/tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/Devi%20Parikh.jpg?itok=TJlsp21d"}}},"media_ids":["586461","586462"],"groups":[{"id":"47223","name":"College of Computing"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[],"keywords":[{"id":"654","name":"College of Computing"},{"id":"166848","name":"School of Interactive Computing"},{"id":"173614","name":"visual question answering"},{"id":"173615","name":"dhruv batra"},{"id":"173616","name":"devi parikh"}],"core_research_areas":[{"id":"39521","name":"Robotics"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EDavid Mitchell\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECommunications Officer\u003C\/p\u003E\r\n\r\n\u003Cp\u003Edavid.mitchell@cc.gatech.edu\u003C\/p\u003E\r\n","format":"limited_html"}],"email":["david.mitchell@cc.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}