Student startup Marco Polo gives athletes a fresh perspective on training

July 12, 2018

Marco Polo team member Mike Devlin addresses the crowd at the Sports Tech Collider Finale.

Born out of SCET’s Sports Tech Collider Sprint, Marco Polo is a company focused on bringing artificial intelligence, or AI, to the sports world.

Three UC Berkeley students came together to give sports teams an edge in training. Second-year MBA student Mike Devlin, computer science Ph.D. student Panna Felsen, and recent alum Daniel Rozovsky collaborated on the project, focusing specifically on how they could help improve Cal sports.

“It was a lot of fun to go through and think back to what the pain points of sports are in general,” Devlin said. “A really cool part of it was moving into the building of it … and what really goes into teaching the computer to understand. It was a really exciting experience.”

Devlin said that because sports is an intuition-driven, "gut-focused enterprise," the team wanted to give athletes more information with which to make split-second decisions.

Marco Polo’s technology uses computer vision, a subset of artificial intelligence, to recognize what is happening during practice, individual workouts, and games, much the way a person watching the court does. The team trains neural networks to do the same, for example telling the home team from the away team, and uses that understanding to provide information on better technique and preparation.
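
Devlin did not describe the model in technical detail, but the sketch below illustrates the general idea in Python. It stands in for Marco Polo's trained neural networks with much simpler parts: OpenCV's classical HOG person detector to find players in a frame, and a color-clustering step to split them into two jersey groups. The clip name "practice_clip.mp4" is a placeholder, and a production system would be considerably more sophisticated.

```python
# Illustrative sketch only -- not Marco Polo's actual pipeline.
import cv2
import numpy as np
from sklearn.cluster import KMeans

# Grab one frame from a (hypothetical) phone-camera clip.
cap = cv2.VideoCapture("practice_clip.mp4")
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("could not read a frame from the clip")

# Find people in the frame with OpenCV's built-in HOG person detector
# (a classical detector standing in for trained neural networks).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))

# Describe each detection by the average jersey color of its torso region.
colors = []
for (x, y, w, h) in boxes:
    x, y = max(int(x), 0), max(int(y), 0)
    torso = frame[y + h // 5 : y + h // 2, x : x + w]
    hsv = cv2.cvtColor(torso, cv2.COLOR_BGR2HSV)
    colors.append(hsv.reshape(-1, 3).mean(axis=0))

# Cluster the detections into two color groups, a stand-in for "home" vs. "away".
if len(colors) >= 2:
    teams = KMeans(n_clusters=2, n_init=10).fit_predict(np.array(colors))
    for (x, y, w, h), team in zip(boxes, teams):
        print(f"player at ({x}, {y}) assigned to team {team}")
```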

According to Devlin, other companies already use similar technology to give athletes instant feedback, but their camera systems cost thousands of dollars and are out of reach for most athletes. Marco Polo needs only a phone camera and no sensors: the team ran its analysis on footage captured with an iPhone and reached the same level of accuracy.

Throughout the eight-week Collider Sprint, teams worked through several iterations of their products, incorporating feedback from user validation and testing as they built toward a final prototype.

“Our team did a phenomenal job,” Devlin said. “It gave us the confidence that this might be more than just a fun school project — this could be a company, this could be something that we could build for many years.”