Robotic Dog SPOT Helps Computer Science Students Navigate Code

Students get up close and personal with an agile mobile robot at the Chevron booth at the Hi Tech job fair at Cal Poly Pomona.

The computer science department’s dog SPOT has been trained to do more tricks in the last two years than your average pooch. Like most dogs, SPOT can walk and play catch. However, the four-legged creature is unlike other dogs in one important way. SPOT is a robot platform that enables students to learn how to code intuitively.

SPOT was acquired in 2022 during a visit to Boston Dynamics, a global leader in developing highly mobile robots. Computer Science Professor Daisy Tang oversaw the purchase, which was funded by a U.S. Air Force grant.

“The project is a good exploration, an interactive platform, and a big learning curve for them,” said Tang. “It is tangible, and they can see the real applications and the movements of the robot right away from their control code. Different people learn differently, and it is a chance to engage the students in further research.”

The Cal Poly Pomona team used Boston Dynamics’ software development kit (SDK) and its sample code to understand SPOT’s movements and physics. Eventually, they brainstormed project proposals and gained approval from Tang, developing their own style of implementing code throughout the process.

“We computer science majors get a lot of homework like ‘build your own calculator’ and ‘build your own website,’” said Laurence Tremblay (’24, computer science), “but seeing your own code actually working on SPOT gives you the impression of hands-on learning.”

In the first year, students completed three projects, adding hardware so that SPOT could play catch, deliver groceries, and collect Pepsi cans for recycling around campus.

“There have been a lot of problems we had to face, and we needed to overcome them, obstacles like bugs, figuring out next steps, and organization,” said Youstina Gerges (’24, computer science). “This taught us how to work together, how to debug, how to manage our time, what research we needed to do, and more about machine learning and AI.”

Both Tremblay and Gerges were part of the team that coded SPOT to play Tic-Tac-Toe, using AI algorithms to determine the next move in every interaction. Another project, completed in the second year, was SPOTTY Guard, in which code enabled SPOT to recognize human faces and flag unfamiliar objects. The team also experimented with large language models (LLMs), the machine learning models behind tools such as ChatGPT.
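The article does not include the students’ code, but the “determine the next move” logic for a game like Tic-Tac-Toe is commonly implemented with the minimax algorithm. A minimal, hypothetical Python sketch (not the team’s actual code, and leaving out the robot control that would place the piece) might look like:

```python
# Hypothetical minimax move selection for Tic-Tac-Toe.
# The board is a list of 9 cells, each "X", "O", or " " (empty).

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return "X" or "O" if that player has three in a row, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) from `player`'s perspective:
    +1 for a forced win, -1 for a forced loss, 0 for a draw."""
    w = winner(board)
    if w == player:
        return 1, None
    if w is not None:
        return -1, None
    moves = [i for i, cell in enumerate(board) if cell == " "]
    if not moves:
        return 0, None  # board full: draw
    opponent = "O" if player == "X" else "X"
    best_score, best_move_found = -2, None
    for m in moves:
        board[m] = player                      # try the move
        score, _ = minimax(board, opponent)    # opponent replies optimally
        board[m] = " "                         # undo the move
        if -score > best_score:                # opponent's score, negated
            best_score, best_move_found = -score, m
    return best_score, best_move_found

def best_move(board, player):
    """Pick the optimal cell index for `player` on the current board."""
    return minimax(board, player)[1]
```

For example, with `["X", "X", " ", "O", "O", " ", " ", " ", " "]` and X to move, `best_move` selects cell 2 to complete the top row; with O threatening instead, it selects the blocking cell.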

In the future, SPOT and the code students have developed could help support people with medical or physical needs and push the boundaries of technology.

“The aim of SPOT understanding voice commands is because we eventually want to evolve the project into assisting people with disability and mobility issues,” said Gabriel Siguenza, a graduate student. “Because SPOT is a mobile robot with an articulate arm attachment that lets him manipulate real-world objects, we plan to have him, in one shape or form, help people to operate the robot without a controller (i.e. using voice commands), pick up objects they need, and be able to ask for help. This is the future goal for the project.”

Recruitment begins in the late summer or early fall of each year. Contact Chair and Professor Daisy Tang to apply and learn more about SPOT.