Meet Whitney Crooks of Tufts University School of Engineering

In Robots in Education, by Jeff Green

From time to time we spotlight some of the best and brightest researchers working with robots in academia. Today, we’re very pleased to publish a guest blog post from Whitney Crooks, Ph.D. Candidate in Mechanical Engineering at Tufts University. Take it away, Whitney.

When I first learned to program in college, I struggled for months. By the end of the semester, my professor and my TAs knew me very well, and the comp-sci major on my sophomore floor was probably actively avoiding me out of fear I would have more questions. And I wasn’t alone. Programming is hard. It’s essentially learning to speak and write in a different language, except we aren’t learning how to say, “Hello my name is…” and “I’m good. How are you?” We’re learning not only words like “loop” and “recursive” but also how to implement them as well as the theory behind them.

Whitney Crooks and the Baxter Research Robot at Tufts University
Many people taking their first computer science class, or teaching themselves, struggle to master the basics of programming. This raises the question: how can we make it easier to teach people to program without losing the learning that takes place or restricting students to overly simplistic problems?

At the Tufts University Center for Engineering Education and Outreach, we are very interested in answering questions like this. We have found that the graphical nature of LabVIEW helps students learn the basics of programming, which can then be applied to text-based languages and protocols, while also giving them the freedom to explore increasingly complex problems. We have seen that it is much easier to demonstrate the difference between a “while loop” and a “for loop,” or what exactly recursion is, when students can see the program move through the commands step by step on their computer screens.
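For readers who stick to text-based languages, the distinction students watch LabVIEW animate can be sketched in a few lines of Python. This is just an illustrative teaching example, not code from the CEEO curriculum:

```python
# The same countdown three ways: a "for" loop, a "while" loop, and recursion.
def countdown_for(n):
    out = []
    for i in range(n, 0, -1):  # the for loop manages the counter itself
        out.append(i)
    return out

def countdown_while(n):
    out = []
    while n > 0:               # with while, we must decrement n ourselves
        out.append(n)
        n -= 1
    return out

def countdown_recursive(n):
    # Recursion: each call handles one number, then delegates the rest.
    return [] if n <= 0 else [n] + countdown_recursive(n - 1)

print(countdown_for(3))        # [3, 2, 1]
print(countdown_while(3))      # [3, 2, 1]
print(countdown_recursive(3))  # [3, 2, 1]
```

All three produce the same result; the pedagogical point is that each arrives there by visibly different steps.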

We have had a lot of success teaching students how to program LEGO NXTs and EV3s in LabVIEW, so when we got our hands on a Baxter Research Robot, we had two thoughts: a) how cool would it be to program Baxter in LabVIEW? and b) what sort of outside-the-box tasks can we get Baxter to do?

From our first question, ROSforLabVIEW was born. ROSforLabVIEW contains a set of VIs (Virtual Instruments, or routines) for standard ROS functions, such as initializing a topic or publishing, as well as a set of VIs designed specifically for Baxter. Currently, most topics are represented, including enabling Baxter, reading joint states, moving joints, reading images from cameras, and reading sonar and IR values. With these VIs we’ve created programs that enable Baxter to shake hands and sort shapes (done by interfacing with the LabVIEW Vision Development module).
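For readers more familiar with text-based ROS code, the ROSforLabVIEW VIs map roughly onto the standard rospy publish calls. A minimal sketch of one "move joints" step follows; the live-robot calls are shown as comments, and the topic name and message layout are our reading of Baxter's `baxter_core_msgs/JointCommand` interface, not something the LabVIEW toolkit exposes directly:

```python
# Sketch of the text-based equivalent of a ROSforLabVIEW "publish joint
# command" VI. Building the message payload is plain Python and runs
# anywhere; the actual ROS calls appear only as comments below.

POSITION_MODE = 1  # joint position control mode in baxter_core_msgs/JointCommand

def build_joint_command(joint_angles, mode=POSITION_MODE):
    """Pack a {joint name: angle} dict into the JointCommand field layout:
    parallel 'names' and 'command' arrays plus a control 'mode'."""
    names = sorted(joint_angles)
    return {
        "mode": mode,
        "names": names,
        "command": [joint_angles[n] for n in names],
    }

# On a live system, the surrounding VI-like steps would look roughly like:
#   import rospy
#   from baxter_core_msgs.msg import JointCommand
#   rospy.init_node("labview_style_demo")
#   pub = rospy.Publisher("/robot/limb/left/joint_command", JointCommand,
#                         queue_size=10)
#   pub.publish(JointCommand(**build_joint_command({"left_s0": 0.5})))

cmd = build_joint_command({"left_e1": 1.0, "left_s0": 0.5})
print(cmd["names"])    # joint names, sorted
print(cmd["command"])  # angles in matching order
```

The VI-based and text-based versions carry the same information; the graphical wiring in LabVIEW simply replaces the explicit publisher setup.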

In answering our second question, we have thus far come up with two primary projects. The first involved getting Baxter to sort NXT kits. At the CEEO, we use NXT kits to teach students from kindergarten through university about robotics. Naturally, we end up with a lot of disorganized and incomplete kits after a trip to the classroom. We normally pay undergrads to sort the kits, but it's a fairly mindless and boring task. Our first outside-the-box task was to teach Baxter to sort LEGO kits using ROS and OpenCV, which it did with a high success rate.
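The vision side of a sorter like this can be sketched in text-based form. In OpenCV, a camera frame converted to HSV gives each pixel a hue in 0–179, so one simple approach is to bin a detected part by its dominant hue. The thresholds and bin names below are illustrative, not the calibrated values from our project:

```python
def classify_brick_color(hue):
    """Map an OpenCV hue value (0-179, the 8-bit HSV convention) to a
    sorting bin. On the robot, the hue would come from a frame converted
    with cv2.cvtColor(frame, cv2.COLOR_BGR2HSV); here it is just an int.
    Thresholds are illustrative, not calibrated values."""
    if hue < 10 or hue >= 170:   # red wraps around the hue circle
        return "red"
    if 20 <= hue < 35:
        return "yellow"
    if 35 <= hue < 85:
        return "green"
    if 85 <= hue < 130:
        return "blue"
    return "other"

for h in (5, 25, 60, 100, 175):
    print(h, "->", classify_brick_color(h))
```

A decision like this is all the arm needs: once the part is binned, the rest of the task is a pick-and-place motion to the matching container.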

Our current project with Baxter, and my Ph.D. project, involves using him as an in-home assistant for the mobility impaired. We are developing tools that will help Baxter assist the mobility impaired in their daily home life: picking up dropped objects, helping a fallen person get up, and preparing a simple meal (for example, removing the packaging from a store-bought microwaveable meal, heating it up, and serving it to the in-care person). We would like to make Baxter teleoperable so that the patient can easily control Baxter as it completes tasks. Eventually, we would like to expand teleoperation to a network of people, some mobility impaired themselves, who can control Baxter to assist the in-care people who need it, in a model similar to that of Mechanical Turk, an Amazon program that pays users nominal amounts to complete micro-tasks.

In the year and a half since Baxter arrived at our lab, our excitement for the research opportunities Baxter has provided us has not diminished. We are constantly evolving our projects and coming up with new ones. Plus, being able to control a human-sized robot makes all that struggling to learn how to program totally worth it.

Whitney Crooks is a mechanical engineering doctoral student at Tufts University School of Engineering. She is also a fellow in Tufts’ Soft Material Robotics | IGERT program.

Want in on the robotics spotlight? Tell us what you’re working on in an email to and we’ll consider featuring you on the Rethink Robotics blog.


About the Author

Jeff Green

I'm Jeff Green, senior content and social media strategist at Rethink Robotics. When I'm not socializing Sawyer and Baxter, our smart, collaborative robots, I'm usually caught up in the home tornado, also known as my three kids. Love them, my wife, old-school Chinese food, movies, and of course game-changing technology.
