Robotics and Artificial Intelligence Laboratory

Ethan Fahnestock

Undergraduate Research Assistant, Department of Electrical and Computer Engineering


Email: efahnest@u.rochester.edu

Websites: Google Scholar, Profile on Undergraduate Experience

Biosketch


Ethan Fahnestock is an undergraduate at the University of Rochester studying Physics and Astronomy and Interdepartmental Engineering. He is a research assistant in the lab, currently working on mobile robot navigation. Ethan is interested in the problems that limit the capabilities of intelligent agents operating in new environments, and is motivated by the use of robots for scientific data acquisition. Specifically, he is interested in exploring the limitations of planning, perception, control, and inference, and how the relationships between these components can be leveraged to address their limitations on fielded robotic systems.

Education


B.S. in Physics and Astronomy, B.S. in Interdepartmental Engineering, University of Rochester, Expected May 2021

Research


My undergraduate research has focused on the development of algorithms for mobile manipulation in a priori unknown environments and for human-robot teaming. My first efforts centered on a language-guided adaptive perception pipeline that enabled efficient language grounding; I helped deploy this framework and evaluate its performance on mobile manipulation tasks in unknown environments. Following this, I led an effort to extend the perception pipeline so that it considers the needs of planning and control, in addition to language grounding, when adapting. I have also worked on planning for assembly tasks under environmental uncertainty. Most recently, my research has focused on mobile robot navigation and path following.

Efficient World Representations for Language-based Human-robot Interaction

My first dive into robotics research focused on the problem of creating scalable world representations to support grounded language understanding. To act on natural language commands, robots must understand an instruction in the context of their surroundings. This representation of the surroundings, generated by perception, must be sufficiently rich to support every task a robot may be asked to perform. However, such rich perceptual representations are difficult to maintain and computationally limiting for real-time language-based human-robot interaction. To address this problem, graduate student Siddharth Patki proposed Adaptive Perception (AP). This approach infers a structure for the perception pipeline from a natural language command, producing world models that represent only the objects required to ground and complete the specified task. My first project was to evaluate the performance of AP on mobile manipulation tasks in a priori unknown environments. Following this, I led an effort to enable AP to consider the needs of planning and control, in addition to language grounding, while adapting to the instruction. To motivate this, consider the instructions "go to the door" and "open the door". For both tasks, a robot must have a representation of a door in its world model to ground the phrase. However, for the "open" task, planning also requires a representation of the door handle and an understanding of its affordances, which AP would previously choose not to model. As a first step toward enabling AP to consider the needs of planning and selectively represent these relationships, I introduced hierarchical symbols that allow AP to represent hierarchies between detectors.
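
To illustrate the idea, here is a minimal, hypothetical sketch of how a perception pipeline might be adapted from grounded symbols, with hierarchical symbols pulling in the additional detectors that planning needs. The detector names, the registry, and the "go to" versus "open" distinction are assumptions for illustration, not the actual AP implementation.

```python
# Hypothetical sketch: adapt a perception pipeline to a grounded instruction.
# Flat symbols map one-to-one to detectors; hierarchical symbols additionally
# expand to the child detectors (and affordances) that planning needs.

# Flat symbol-to-detector registry (illustrative names).
FLAT_DETECTORS = {
    "door": "door_detector",
    "cube": "cube_detector",
    "crate": "crate_detector",
}

# Hierarchical symbols: children a manipulation task would also require.
HIERARCHICAL_DETECTORS = {
    "door": ["door_handle_detector"],
}


def adapt_pipeline(grounded_symbols, needs_manipulation):
    """Select the minimal set of detectors to run for this instruction."""
    active = set()
    for symbol in grounded_symbols:
        if symbol in FLAT_DETECTORS:
            active.add(FLAT_DETECTORS[symbol])
        if needs_manipulation:
            # Hierarchical symbols expand to the detectors their children need.
            active.update(HIERARCHICAL_DETECTORS.get(symbol, []))
    return sorted(active)


# "go to the door" needs only the door; "open the door" also needs the handle
# so that planning can reason about its affordance.
print(adapt_pipeline(["door"], needs_manipulation=False))
print(adapt_pipeline(["door"], needs_manipulation=True))
```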

S. Patki, E. Fahnestock, T.M. Howard, and M. Walter, "Language-guided Semantic Mapping and Mobile Manipulation in Partially Observable Environments," In Conference on Robot Learning. PMLR, Oct. 2019, vol. 100, pp. 1201-1210

E. Fahnestock, S. Patki, and T.M. Howard, "Language-guided Adaptive Perception with Hierarchical Symbolic Representations for Mobile Manipulators," In 6th AAAI Fall Symposium Series on Artificial Intelligence for Human-Robot Interaction. Nov. 2019

Parts Assembly with Environmental Uncertainty

During CMU's Robotics Institute Summer Scholars (RISS) program, I worked under the mentorship of Professor Maxim Likhachev on methods to complete parts assembly tasks with tight tolerances under environmental uncertainty. Robots struggle with manipulation tasks that require high precision in unstructured environments, like inserting a key or plugging in a power cord, because their perception systems often cannot achieve the accuracy the task demands. Contact can be a powerful tool for locating features in both low- and high-precision tasks, and is often employed by humans (e.g., locating a light switch in the dark). Our goal was to create a planner that could leverage interaction with the environment to reduce uncertainty. I developed and explored the properties of a planner that used parallelized physics-based simulation to evaluate task-space impedance-control motion primitives, allowing contact to influence planning. This work resulted in a presentation, a poster, and a paper in the RISS Working Papers Journal. Future directions for this work include selective simulation and belief-space planning to explicitly reason over environmental uncertainty.
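
The sketch below illustrates, under strong simplifying assumptions, how impedance-control motion primitives can be evaluated in simulation so that contact influences which primitive a planner selects. The one-dimensional "peg above a surface" world, the stiffness value, and the primitive set are hypothetical; the actual planner used a full parallelized physics simulator rather than this toy model.

```python
# Toy sketch: evaluate task-space impedance-control motion primitives in a
# 1-D simulated world and pick the one whose outcome is closest to the goal.
import numpy as np

SURFACE_Z = 0.0   # rigid surface height (assumed known to the simulator)
STIFFNESS = 50.0  # task-space impedance stiffness [N/m] (illustrative)
GOAL_Z = 0.0      # goal: the peg resting on the surface


def simulate_primitive(start_z, target_dz, steps=20):
    """Roll out an impedance-controlled motion toward start_z + target_dz.

    Contact pushes back, so commanding a target below the surface still ends
    with the peg resting on it; this is how contact reduces uncertainty about
    where the surface actually is.
    """
    z = start_z
    target = start_z + target_dz
    for _ in range(steps):
        force = STIFFNESS * (target - z)  # spring toward the target
        z = z + 0.01 * force              # simplistic integration step
        z = max(z, SURFACE_Z)             # rigid surface: no penetration
    return z


def plan(start_z, primitives):
    """Pick the primitive whose simulated outcome lands closest to the goal."""
    outcomes = [simulate_primitive(start_z, dz) for dz in primitives]
    best = int(np.argmin([abs(z - GOAL_Z) for z in outcomes]))
    return primitives[best], outcomes[best]


# Candidate task-space displacements; a real planner would evaluate such
# rollouts in parallel physics simulations rather than this toy model.
print(plan(start_z=0.08, primitives=[-0.02, -0.05, -0.10, +0.02]))
```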

Mobile Robot Navigation

Most recently, my research has transitioned to algorithms for motion planning and path following for unmanned ground vehicles. My focus has been on model-predictive control approaches for path following.
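
As a rough illustration of the approach, here is a minimal sampling-based model-predictive path follower for a unicycle-model vehicle. The dynamics model, horizon, sampling scheme, and cost are illustrative assumptions rather than the controller developed in this work.

```python
# Minimal sketch: sampling-based MPC for path following with a unicycle model.
import numpy as np

DT, HORIZON, N_SAMPLES, SPEED = 0.1, 20, 256, 1.0


def rollout(state, omegas):
    """Forward-simulate a unicycle (x, y, heading) under a yaw-rate sequence."""
    x, y, th = state
    traj = []
    for w in omegas:
        x += SPEED * np.cos(th) * DT
        y += SPEED * np.sin(th) * DT
        th += w * DT
        traj.append((x, y))
    return np.array(traj)


def path_cost(traj, path):
    """Sum of squared distances from each predicted pose to the nearest path point."""
    d = np.linalg.norm(traj[:, None, :] - path[None, :, :], axis=-1)
    return np.sum(d.min(axis=1) ** 2)


def mpc_step(state, path, rng):
    """Sample yaw-rate sequences, keep the cheapest, apply only its first command."""
    candidates = rng.uniform(-1.0, 1.0, size=(N_SAMPLES, HORIZON))
    costs = [path_cost(rollout(state, w), path) for w in candidates]
    return candidates[int(np.argmin(costs))][0]


# Follow a straight reference path along the x-axis from a laterally offset start.
rng = np.random.default_rng(0)
path = np.stack([np.linspace(0, 5, 50), np.zeros(50)], axis=1)
state = np.array([0.0, 0.5, 0.0])
for _ in range(30):
    w = mpc_step(state, path, rng)
    state = state + np.array([SPEED * np.cos(state[2]) * DT,
                              SPEED * np.sin(state[2]) * DT,
                              w * DT])
print(state)  # the vehicle should have converged toward the reference path
```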

Publications


Journal Articles

[J1] M.R. Walter, S. Patki, A.F. Daniele, E. Fahnestock, F. Duvallet, S. Hemachandra, J. Oh, A. Stentz, N. Roy, and T.M. Howard, "Language Understanding for Field and Service Robots in a Priori Unknown Environments," Field Robotics. 2021, forthcoming

Refereed Conference Papers

[C2] B. Hedegaard, E. Fahnestock, J. Arkin, A. Menon, and T.M. Howard, "Discrete Optimization of Adaptive State Lattices for Iterative Motion Planning on Unmanned Ground Vehicles," In IEEE/RSJ International Conference on Intelligent Robots and Systems. Oct. 2021, forthcoming

[C1] S. Patki, E. Fahnestock, T.M. Howard, and M. Walter, "Language-guided Semantic Mapping and Mobile Manipulation in Partially Observable Environments," In Conference on Robot Learning. PMLR, Oct. 2019, vol. 100, pp. 1201-1210

Refereed Workshop and Symposium Papers

[WS1] E. Fahnestock, S. Patki, and T.M. Howard, "Language-guided Adaptive Perception with Hierarchical Symbolic Representations for Mobile Manipulators," In 6th AAAI Fall Symposium Series on Artificial Intelligence for Human-Robot Interaction. Nov. 2019

Conference Abstracts and Posters

[AP1] S. Patki, J. Arkin, E. Fahnestock, and T.M. Howard, "A Framework for Proactive and Adaptive Natural Language Interaction in Dynamic and Unstructured Environments," In 2nd Robot Teammates Operating in Dynamic, Unstructured Environments (RT-DUNE) at the 2019 International Conference on Robotics and Automation. May 2019, forthcoming

Awards


Goldwater Scholarship, 2020 (Profile Article on Award, Announcement Article)