Nitzan Orr

CS Research M.S. @ University of Wisconsin–Madison · SF Bay Area · nitzan (at) cs (dot) wisc (dot) edu

I research how to improve the teleoperation and understanding of robots.


Experience

M.S. Research Student

University of Wisconsin–Madison

My work is generally in the area of human-robot interaction, with some projects and interests in virtual reality and haptics. My current projects involve defining and solving challenges in people's ability to remotely manipulate objects (via robots and in VR), as well as increasing situational awareness for remote teleoperators.

Fall 2021 - Present

NASA Software Engineering Intern

NASA Ames Research Center

During Summer 2022 I worked at the NASA Ames Research Center in Mountain View, CA. As part of the Astrobee team within the Intelligent Robotics Group (IRG), I worked on improving the map visualization pipeline. Astrobee, a family of cube-shaped robots deployed aboard the International Space Station (ISS), had mapped several ISS modules, and we wanted to make those maps accessible to stakeholders. However, the data was too high-resolution to render in full on the web. My job was to partition the 3D model into smaller chunks at varying resolution levels so that we could populate a level-of-detail octree (3D Tiles). The work and environment were rewarding, and I enjoyed learning about the inner workings of 3D models and texture maps.

You can access some of the code I wrote here, where I used the Blender Python API to automate cropping meshes and reducing texture sizes. The user interface is called ISAAC, not to be confused with my previous project ISAACS.
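To give a flavor of what that automation looks like, here is a minimal sketch of the two Blender operations involved: reducing a mesh's face count with a Decimate modifier and downscaling its image textures. The file names and decimation ratio are illustrative assumptions, not the actual ISAAC pipeline code, and the OBJ import/export operators assume a recent Blender release (3.2+).

    import bpy

    # Load one chunk of the mapped module (hypothetical file name).
    bpy.ops.wm.obj_import(filepath="iss_module_chunk.obj")
    obj = bpy.context.selected_objects[0]
    bpy.context.view_layer.objects.active = obj

    # Reduce the face count to ~25% for a coarser level-of-detail tile.
    mod = obj.modifiers.new(name="lod_decimate", type='DECIMATE')
    mod.ratio = 0.25
    bpy.ops.object.modifier_apply(modifier=mod.name)

    # Halve the resolution of every image texture the scene references.
    for img in bpy.data.images:
        if img.size[0] > 1 and img.size[1] > 1:
            img.scale(img.size[0] // 2, img.size[1] // 2)

    bpy.ops.wm.obj_export(filepath="iss_module_chunk_lod1.obj")

Running a script like this once per chunk and per resolution level yields the set of progressively simplified meshes that a 3D Tiles octree needs.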

Picture of me kneeling near one of the cube-shaped Astrobee robots in the testing lab. The robot is mounted on a flotation platform.

Me with two of our cube-shaped friends hovering millimeters above the zero-friction granite top (them, not me)

Summer 2022

Undergrad Researcher

Vive Center for Enhanced Reality, UC Berkeley

I co-led a team of undergraduate researchers developing ISAACS, the Immersive Semi-Autonomous Aerial Command System. We remotely operated drones for mapping, radiation detection, and more, all through virtual reality. ISAACS integrates flight controls and real-time sensor data visualization into a single interface. Our paper was published at the IEEE Nuclear Science Symposium (see "Publications"). I was advised by Dr. Allen Yang & Prof. Kai Vetter. ISAACS GitHub

ISAACS Website

2019 - 2021

Robot Software Intern

Friendly Robots Company

During the school year I worked at an autonomous indoor-vehicle startup on vision applications for robot movement calibration. Not only did I gain knowledge about mapping, autonomous navigation, and sensors, but I also saw what a culture of inclusiveness looks like. Friendly Robots Company

Spring 2020

Undergrad Researcher

Berkeley Deep Drive

At DeepDrive I designed and wrote the ROS software that connected all components of our autonomous RC car, from the camera input through neural network inference, all the way down to the PWM (pulse-width modulation) motor commands. The car was equipped with an Nvidia TX2 and interfaced with an Arduino and a ZED 1 depth sensor. Once the car was up and running, I collected training data and trained our PyTorch implementation of SqueezeNet, an efficient CNN. I was advised by Dr. Karl Zipser & Dr. Sascha Hornauer. Autonomous RC Car GitHub
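As an illustration of that glue layer, here is a minimal rospy sketch of the camera-to-PWM loop. The topic names, the saved-model file, and the 1000-2000 µs pulse-width mapping are assumptions made for the example, not the project's actual interfaces.

    import rospy
    import torch
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image
    from std_msgs.msg import Int16MultiArray

    bridge = CvBridge()
    model = torch.jit.load("squeezenet_rc.pt")  # hypothetical trained checkpoint
    model.eval()

    def on_frame(msg):
        # Camera input -> tensor -> network inference.
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="rgb8")
        x = torch.from_numpy(frame.copy()).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            steer, throttle = model(x)[0].tolist()
        # Map [-1, 1] network outputs to servo-style PWM pulse widths for the Arduino.
        pwm_pub.publish(Int16MultiArray(
            data=[int(1500 + 500 * steer), int(1500 + 500 * throttle)]))

    rospy.init_node("rc_pilot")
    pwm_pub = rospy.Publisher("car/pwm", Int16MultiArray, queue_size=1)
    rospy.Subscriber("zed/rgb/image_rect_color", Image, on_frame, queue_size=1)
    rospy.spin()

Keeping the queue size at 1 on both ends means stale frames are dropped rather than queued, so the motor commands always reflect the most recent image.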

2018 - 2019

Projects

Aerial Real-Time 3D Mapping

AR / VR Course Project (CS 294)

Publications & Presentations


Photography
