Teaching human psychology using open-source robotics



My lab’s research focuses on building precise computational models of how people think and learn. Outside our scientific field, it can sometimes be hard to explain what a model is and what it adds to psychological theorizing. However, almost everybody can grasp what a cognitive model is via a simple but powerful analogy: cognitive models are like robots. The scientific goal is to build a model (i.e., a robot) that behaves as similarly to humans as possible.

This semester I converted this analogy into a semester-long undergraduate course. I purchased a small army of low-cost robots based on the open-source Arduino platform.

Students in the class worked in groups and were assigned their own robot for the duration of the semester (they even got to name their robots!). I then put together a series of immersive labs that explore key cognitive science ideas through the lens of robotics.

The class was pretty successful even on its first pass, and so I thought I’d share some of what I learned with other educators and students.


“What I cannot create, I do not understand.” – Richard Feynman

A basic philosophy of the course is to approach an abstract, philosophical topic like “cognition” or “the mind” in an interactive, hands-on way.

When a student builds a software or hardware artifact that actually does something, a bunch of mental gears are set in motion: they are forced to make decisions, evaluate outcomes, and reflect on how to do things differently in the future. The end result is that they not only understand the lesson better but also get to experience what it is like to be a scientist in a lab pursuing original research questions (and tackling challenging problems).

Not to mention, robots are cool.

Exactly how cool are robots? Well, in previous semesters I offered a very similar lecture-based course under the title “Computation and the Mind” and roughly 2-3(!!) undergraduate students voluntarily signed up. This semester I changed the title to “Robots, Minds, and the Human Brain” and the enrollment requests exceeded the number of robots I could afford to purchase. I actually had to turn students away the first time I taught it!

What on earth do robots have to do with human psychology?

This is a good question. Perhaps the most direct answer is to describe a few of the labs I developed.

The first lab (based on ideas I got from Tom Stafford) tried to emphasize how the goal of cognitive science is to essentially “reverse engineer” the inner workings of the human mind. To that end, I uploaded a program into the “minds” of each group’s robot but did not explain how it worked. Students were then tasked with performing “synthetic psychology” experiments to understand what the robot was doing (e.g., flashing lights or playing sounds to see how the robot reacted). Unlike the challenge we face in the natural science of psychology, in this exercise there is a “correct” answer.

Students were surprised when they learned that the robot worked according to simpler principles than they initially assumed, and I used the exercise to discuss the difficulties of inferring the operation of the mind from behavior alone. (The actual robot was one of the many explored by Valentino Braitenberg in his book Vehicles: Experiments in Synthetic Psychology.)
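To make the “simpler than they assumed” point concrete, here is a minimal sketch in the spirit of one of Braitenberg’s light-seeking vehicles (not the actual program I uploaded): each light sensor drives the opposite wheel, so the robot steers toward light. The pin numbers, the use of continuous-rotation servos, and the scaling are all assumptions for illustration.

```cpp
// Hypothetical Braitenberg-style vehicle with crossed excitatory wiring:
// each light sensor speeds up the OPPOSITE wheel, so the robot turns
// toward light. Pins and scaling are illustrative, not the class program.
#include <Servo.h>

Servo leftWheel;              // continuous-rotation servo, left side
Servo rightWheel;             // continuous-rotation servo, right side

const int LEFT_SENSOR  = A0;  // photoresistor divider facing left
const int RIGHT_SENSOR = A1;  // photoresistor divider facing right

void setup() {
  leftWheel.attach(11);
  rightWheel.attach(12);
}

void loop() {
  // Read ambient light on each side (0-1023).
  int leftLight  = analogRead(LEFT_SENSOR);
  int rightLight = analogRead(RIGHT_SENSOR);

  // For continuous-rotation servos, 90 means stop; the two wheels are
  // mounted mirrored, so "forward" is >90 on the left and <90 on the right.
  int leftSpeed  = map(rightLight, 0, 1023, 90, 180);  // crossed connection
  int rightSpeed = map(leftLight,  0, 1023, 90, 0);    // crossed connection

  leftWheel.write(leftSpeed);
  rightWheel.write(rightSpeed);

  delay(20);  // short pause keeps the control loop stable
}
```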

In another lab, students performed behavioral experiments on each other to understand how variables like the inter-aural timing difference (ITD) and inter-aural intensity difference (IID) between their two ears influence the spatial perception of sound. Then we discussed the theory of sound localization from an engineering perspective (signal processing, cross correlation) and a neuroscience perspective (coincidence detection neurons in the medial superior olive, etc.). Ultimately, after a few weeks of guided work, students successfully built their own robot circuit (by wiring up microphones), which could turn its head to follow the sound of their mobile phone as it moved around the room.
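Here is a stripped-down flavor of that head-turning idea, using only the intensity difference (IID) between two microphones; the full ITD/cross-correlation approach needs far faster sampling than a simple analogRead loop provides. The pins, sampling window, and dead zone below are illustrative assumptions, not the students’ actual code.

```cpp
// Hypothetical IID-based "head turning": compare loudness at two microphones
// and nudge a pan servo toward the louder side. Pin numbers, window length,
// and thresholds are assumptions for illustration.
#include <Servo.h>

Servo head;                  // pan servo carrying the two microphones
const int LEFT_MIC  = A0;    // amplified electret mic, left "ear"
const int RIGHT_MIC = A1;    // amplified electret mic, right "ear"
int headAngle = 90;          // start facing straight ahead

// Estimate loudness as peak-to-peak amplitude over a short window.
int loudness(int pin) {
  int lo = 1023, hi = 0;
  unsigned long start = millis();
  while (millis() - start < 50) {   // ~50 ms sampling window
    int v = analogRead(pin);
    if (v < lo) lo = v;
    if (v > hi) hi = v;
  }
  return hi - lo;
}

void setup() {
  head.attach(9);
  head.write(headAngle);
}

void loop() {
  int left  = loudness(LEFT_MIC);
  int right = loudness(RIGHT_MIC);

  // Turn a few degrees toward whichever ear hears the louder signal,
  // ignoring small differences (a crude dead zone against noise).
  if (left - right > 20)      headAngle = min(headAngle + 3, 180);
  else if (right - left > 20) headAngle = max(headAngle - 3, 0);

  head.write(headAngle);
}
```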

Weekly readings drew from classics in cognitive science (e.g., the introductory chapter from David Marr’s Vision book, Rodney Brooks’s articles, selections from Braitenberg’s book, and recent articles on robotics and cognitive science in popular magazines).

Open source robotics to rule the world

Even a few years ago, robots were fairly expensive devices only accessible to well-funded research labs (one exception is the ever-lovable B.E.A.M. robotics idea). However, today there are many low-cost robotics platforms geared primarily at novice programmers. Many people know about the Lego Mindstorms platform, but there are also more advanced systems like Pleo, which are incredibly lifelike and engaging.

I (along with Sachitch Cheruvatur) explored a number of these systems and ultimately settled on an open-source platform built around Arduino. Arduino is a microcontroller board created in Italy which is low cost (each board costs around $30) and fairly easy to use. Another company, Parallax, makes a simple robotics platform where the Arduino board acts as the “brain” (i.e., the robotics kit includes a chassis, wheels, an Arduino-compatible “shield”, and servos). The advantages of this Arduino approach are:

  1. It requires “real” programming (in C) rather than a stripped down graphical programming environment. Advanced undergraduates should be working at this level.
  2. Arduino makes it very easy to develop and test code. You write your code in a special text editor and press the “upload” button; it checks the syntax, compiles the code, and then transfers it over USB to the board (a minimal example sketch follows this list).
  3. The Arduino platform has a very large user community (there are many inspiring project ideas online at Instructables, Make Magazine, and Adafruit).
  4. The robotics platform is relatively simple/fixed. Since my class is about psychology, cognition, and behavior, I was less interested in teaching some of the “mechanical engineering” aspects of robotics (Lego Mindstorms is obviously really great for this).
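For readers who have never seen an Arduino program (a “sketch”), the classic blink example below is about as small as a complete one gets: a setup() function that runs once after reset and a loop() that repeats forever. On most Arduino boards pin 13 is wired to an on-board LED, so after pressing “upload” this is usually the first thing people see working.

```cpp
// The classic Arduino "blink" sketch: once uploaded over USB, the on-board
// LED (pin 13 on most boards) flashes once per second.
const int LED_PIN = 13;

void setup() {
  pinMode(LED_PIN, OUTPUT);     // runs once after reset or power-up
}

void loop() {
  digitalWrite(LED_PIN, HIGH);  // LED on
  delay(500);                   // wait half a second
  digitalWrite(LED_PIN, LOW);   // LED off
  delay(500);
}
```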

What I learned

I was fairly shocked by the level of engagement the students had with the material in this class. There are two things I think contributed to this. The first is that there is something special about seeing an actual physical object respond to your programming commands. Even though you could do almost every element of these labs virtually in software, students were really excited when they got the robot to move around autonomously.

Second, I kept the robots for each group in boxes in my office. To work on their robot outside of class, the students needed to drop by my office. This ended up being a great way to get to know the students in the class better. I was also surprised how often people came by to work on their robot even when there wasn’t an assignment due. For example, one student spent extra time setting up his Linux laptop to work with the Arduino board. Others just wanted to master the programming language so they could make a cool final project, and so wanted extra time to play with things.

The timing of the course was a little rushed (some labs took longer than I expected, something I hope to address in future versions). However, the actual final projects were pretty interesting. Students used ultrasonic sensors (like bats) to help their robots navigate around the room and follow one another (similar to imprinting behavior in animals), built a remote control that could deliver reward to the robot, and made a memory-based robot that could record a series of interactions and replay them. Since so few of the students in the class had any programming experience at all, I thought this was pretty impressive, and hopefully they will not be afraid to continue developing these skills.
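To give a flavor of the bat-like navigation projects, obstacle avoidance with an ultrasonic sensor can be sketched in a handful of lines. The code below assumes a common trigger/echo ultrasonic module and continuous-rotation drive servos; the pin choices, timing constants, and distance threshold are illustrative guesses rather than any particular student’s project.

```cpp
// Hypothetical ultrasonic obstacle avoidance: ping ahead, and if something
// is closer than ~30 cm, spin in place before continuing. Pins, speeds, and
// the threshold are assumptions for illustration.
#include <Servo.h>

Servo leftWheel, rightWheel;
const int TRIG_PIN = 7;   // ultrasonic trigger
const int ECHO_PIN = 8;   // ultrasonic echo

// Return estimated distance in centimeters from one ping (0 = timed out).
long distanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long echoMicros = pulseIn(ECHO_PIN, HIGH, 30000);  // give up after 30 ms
  return echoMicros / 58;       // ~58 microseconds per cm, out and back
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  leftWheel.attach(11);
  rightWheel.attach(12);
}

void loop() {
  long d = distanceCm();
  if (d > 0 && d < 30) {
    // Obstacle within ~30 cm: spin in place briefly, then re-check.
    leftWheel.write(0);         // mirrored servos: same value = turning
    rightWheel.write(0);
    delay(400);
  } else {
    // Path clear: drive forward (90 = stop for continuous-rotation servos).
    leftWheel.write(180);
    rightWheel.write(0);
  }
  delay(50);
}
```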

Anyway, I am looking forward to teaching this type of class again in the future and am working on putting my course materials together into a type of textbook that other instructors might be able to use. Stay tuned!







  1. Hi, it’s really an interesting point of view. Have you ever thought about offering a MOOC course or opening it to non-NYU students?

    I’m using Arduino (and other open-source HW and design platforms) to teach entrepreneurship and business modelling, which lets us do this kind of reverse engineering with a firm or product instead of a mind :)

    Regards!

    Pere Losantos