Nitzan Orr
I research how to improve teleoperation and understanding of robots
My work is generally in the area of human-robot interaction, with projects and interests in virtual reality and haptics. My current projects involve defining and addressing challenges in people's ability to remotely manipulate objects (via robots and in VR), as well as increasing situational awareness for remote teleoperators.
During Summer 2022 I worked at the NASA Ames Research Center in Mountain View, CA. As part of the Astrobee team within the Intelligent Robotics Group (IRG), I worked on improving the map visualization pipeline. Astrobee, a family of cube-shaped robots deployed aboard the International Space Station (ISS), had mapped several ISS modules, and we wanted to make those maps accessible to certain stakeholders. However, the data was too high-resolution to be rendered in whole on the web. My job was to partition the 3D model into smaller chunks at varying resolution levels so that we could populate a level-of-detail octree (3D Tiles). The work and environment were rewarding, and I enjoyed learning about the inner workings of 3D models and texture maps.
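The core idea behind that partitioning can be sketched in a few lines: recursively split the geometry into eight octants until each leaf is small enough to render, which is how a level-of-detail octree like 3D Tiles groups content. This is a minimal illustration using bare points, not the actual Astrobee/ISAAC pipeline (which operated on textured meshes via Blender); all names and thresholds here are illustrative.

```python
def octants(points, center):
    """Group points into the 8 octants around a center point."""
    buckets = {}
    for p in points:
        key = (p[0] >= center[0], p[1] >= center[1], p[2] >= center[2])
        buckets.setdefault(key, []).append(p)
    return buckets

def build_octree(points, bounds, max_points=4):
    """Return a nested dict: leaves hold points, inner nodes hold children.

    bounds is a (lo, hi) pair of 3D corner tuples. Each recursion halves
    the cell, so deeper nodes cover less space at finer detail -- the
    same spatial hierarchy a 3D Tiles tileset encodes.
    """
    lo, hi = bounds
    if len(points) <= max_points:
        return {"points": points}  # leaf: small enough to serve as one tile
    center = tuple((lo[i] + hi[i]) / 2 for i in range(3))
    children = {}
    for key, pts in octants(points, center).items():
        clo = tuple(center[i] if key[i] else lo[i] for i in range(3))
        chi = tuple(hi[i] if key[i] else center[i] for i in range(3))
        children[key] = build_octree(pts, (clo, chi), max_points)
    return {"children": children}
```

A real pipeline would also store a decimated, lower-resolution copy of the geometry at each inner node, so the viewer can swap in finer tiles only as the camera approaches.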
You can access some of the code I wrote here, where I used the Blender Python API to automate cropping meshes and reducing texture sizes. The user interface is called ISAAC, not to be confused with my previous project, ISAACS.
I co-led a team of undergraduate researchers developing ISAACS, the Immersive Semi-Autonomous Aerial Command System. We remotely operated drones for mapping, radiation detection, and more, all through virtual reality. ISAACS integrates flight controls and real-time sensor data visualization into a single interface. Our paper was published at the IEEE Nuclear Science Symposium (see "Publications"). I was advised by Dr. Allen Yang & Prof. Kai Vetter. ISAACS GitHub
I worked during the school year at an autonomous indoor vehicles startup on vision applications for robot movement calibration. Not only did I gain knowledge about mapping, autonomous navigation, and sensors, but I also saw what a culture of inclusiveness looks like. Friendly Robots Company
At DeepDrive I designed and wrote the ROS software that connected all components of our autonomous RC car -- from the camera input, through neural network inference, all the way down to the PWM (Pulse Width Modulation) motor commands. The car is equipped with an NVIDIA TX2 and interfaces with an Arduino and a ZED 1 depth sensor. Once the car was up and running, I collected training data and trained our PyTorch implementation of SqueezeNet, an efficient CNN. I was advised by Dr. Karl Zipser & Dr. Sascha Hornauer. Autonomous RC Car GitHub
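The last step of that pipeline, turning a control command into a motor signal, is worth a small sketch: a normalized steering or throttle value gets mapped to a PWM pulse width the Arduino can drive a servo/ESC with. The 1000-2000 microsecond range below is the common hobby-RC convention, assumed for illustration; the actual values and function names in the project may differ.

```python
def command_to_pwm_us(cmd, min_us=1000.0, max_us=2000.0):
    """Convert a normalized command in [-1, 1] to a PWM pulse width in microseconds.

    -1 maps to min_us, 0 to the neutral midpoint, +1 to max_us.
    Out-of-range commands are clamped so the servo never sees an invalid pulse.
    """
    cmd = max(-1.0, min(1.0, cmd))  # clamp to the valid command range
    return min_us + (cmd + 1.0) / 2.0 * (max_us - min_us)
```

For example, `command_to_pwm_us(0.0)` gives 1500.0, the neutral pulse width. Keeping this conversion in one place means the neural network and the rest of the ROS stack can work entirely in normalized units.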
Konstant A, Orr N, Hagenow M, Gundrum I, Hu YH, Mutlu B, Zinn M, Gleicher M, Radwin R. (2023). Corrective Shared Autonomous Human-Robot Collaboration in a Sanding Task. To appear in The Journal of the Human Factors and Ergonomics Society
Hagenow M, Senft E, Orr N, Radwin R, Gleicher M, Mutlu B, Losey D, Zinn M. (2023). Coordinated Multi-Robot Shared Autonomy Based on Scheduling and Demonstrations. To appear in IEEE Robotics and Automation Letters
Orr N*, Dayani P*, Thomopoulos A*, Saran V, Krishnaswamy S, Zhang E, Hu N, McPherson D, Menke J, Yang A, Vetter K. (2020). Immersive Operation of a Semi-Autonomous Aerial Platform for Detecting and Mapping Radiation. IEEE Nuclear Science Symposium and Medical Imaging Conference
Mehr N, Sanselme M, Orr N, Horowitz R, Gomes G. (2018). Offset Selection for Bandwidth Maximization on Multiple Routes. 2018 Annual American Control Conference (ACC), pp. 6366-6371. https://doi.org/10.23919/ACC.2018.8431660