{"671058":{"#nid":"671058","#data":{"type":"event","title":"PhD Defense by Patrick Grady","body":[{"value":"\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003ETitle\u003C\/strong\u003E: Sensing Touch from Vision for Humans and Robots\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003EDate\u003C\/strong\u003E: Tuesday, November 28, 2023\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003ETime\u003C\/strong\u003E: 2:00pm-4:00pm EST\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003ELocation\u003C\/strong\u003E: Klaus 1116\u003Cbr \/\u003E\r\n\u003Cstrong\u003EZoom\u003C\/strong\u003E: \u003Ca href=\u0022https:\/\/gatech.zoom.us\/j\/96467976963?pwd=MkxSUDFnaFJ6eFp0dXBkMHNQU3BtZz09\u0022\u003Ehttps:\/\/gatech.zoom.us\/j\/96467976963?pwd=MkxSUDFnaFJ6eFp0dXBkMHNQU3BtZz09\u003C\/a\u003E\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003EPatrick Grady\u003C\/strong\u003E\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003ERobotics PhD Student\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003ESchool of Electrical and Computer Engineering\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EGeorgia Institute of Technology\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003ECommittee\u003C\/strong\u003E:\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EDr. 
James Hays (Advisor) \u2013 School of Interactive Computing, Georgia Tech\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EDr. Charlie Kemp (Advisor) \u2013 CTO, Hello Robot\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EDr. Seth Hutchinson \u2013 School of Interactive Computing, Georgia Tech\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EDr. Animesh Garg \u2013 School of Interactive Computing, Georgia Tech\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003EDr. Chengcheng Tang \u2013 Meta Reality Labs\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003E\u003Cstrong\u003EAbstract\u003C\/strong\u003E: \u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003ETo affect their environment, humans and robots use their hands and grippers to push, pick up, and manipulate the world around them. At the core of this interaction is physical contact, which determines the underlying mechanics of the grasp. While contact is useful in understanding manipulation, it is difficult to measure. In this thesis, we explore methods to estimate contact between humans, robots, and objects using easy-to-collect imagery. First, we demonstrate a method that leverages subtle visual changes to infer the pressure between a human hand and a surface using RGB images. We initially explore this work in a constrained laboratory setting, but also develop a weakly-supervised data collection technique to estimate hand pressure in less constrained settings. A parallel approach allows us to estimate the pressure and force that soft robotic grippers apply to their environments, enabling precise closed-loop control of a robot. 
Finally, we develop a joint pose and contact estimator that may generalize to internet-scale images. Our model leverages multiple heterogeneously labeled datasets as well as images with contact labeled by human annotators. Overall, this thesis makes progress towards understanding human and robot manipulation from visual sensing alone.\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003E\u003Cspan\u003E\u003Cspan\u003ESensing Touch from Vision for Humans and Robots\u003C\/span\u003E\u003C\/span\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"Sensing Touch from Vision for Humans and Robots"}],"uid":"27707","created_gmt":"2023-11-14 19:53:28","changed_gmt":"2023-11-14 19:53:28","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2023-11-28T14:00:00-05:00","event_time_end":"2023-11-28T16:00:00-05:00","event_time_end_last":"2023-11-28T16:00:00-05:00","gmt_time_start":"2023-11-28 19:00:00","gmt_time_end":"2023-11-28 21:00:00","gmt_time_end_last":"2023-11-28 21:00:00","rrule":null,"timezone":"America\/New_York"},"location":"Klaus 1116","extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}