Breeders Cup Computer Vision Zone

Charles Durham

Lancaster, Pennsylvania


Develop machine learning routines to identify racehorses in order to integrate augmented data overlays on a live-streamed 360 VR broadcast.

Project status: Published/In Market

HPC, Artificial Intelligence, Graphics and Media

Groups
DeepLearning, Artificial Intelligence East Coast

Intel Technologies
Intel Opt ML/DL Framework, Intel CPU


Overview / Usage

The project goal was to live stream 360 VR video during the Breeders' Cup World Championships with live data rendered as augmented graphics, overlaid in video production software. This gives horse racing fans a more advanced view than they have ever seen before. The paddock area is where many fans gather to examine the horses while consulting the racing booklet for key statistics. These booklets are printed from Equibase, the sport's official database, but they go to print about three days in advance, so the information may change. The Breeders' Cup organization was also looking to improve engagement with younger, more tech-savvy viewers.

Methodology / Approach

Our team combined a live video feed from an Insta360 Pro camera, accessed through video production software on a PC workstation, with a separate computer vision PC running the machine learning. The vision pipeline used OpenCV to ingest the RTMP stream from the camera and a Python program running TensorFlow with YOLOv2 on a custom-trained model to identify racehorse saddle towels, based on the foreground and background colors combined with the number. The YOLOv2 model was trained on over 12,000 annotated images across many training cycles on AWS GPU instances.

Another machine ran Node.js, which ingested data from the Equibase database and exposed a web service API: the computer vision machine posted each identified horse to it, and it served the matching Equibase data to front-end JavaScript running in the video production software, which rendered the augmented overlays. The video production software was configured to publish to the Wowza video platform, which syndicated the live feed to YouTube and Facebook. Our broadcast was delayed by 60 seconds to meet moderation requirements.
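The handoff from the detector to the overlay API can be sketched roughly as follows. This is a minimal illustration, not the production code: the class-to-program-number mapping, confidence threshold, and payload field names are all assumptions, and the detection tuples stand in for whatever the YOLOv2 model actually emitted after non-max suppression.

```python
import json

def best_detection(detections, min_conf=0.6):
    """Pick the highest-confidence saddle-towel detection above a
    threshold. `detections` is a list of (class_id, confidence, bbox)
    tuples, as a YOLO-style detector would emit per frame after
    non-max suppression. Returns None if nothing clears the bar."""
    above = [d for d in detections if d[1] >= min_conf]
    if not above:
        return None
    return max(above, key=lambda d: d[1])

def overlay_payload(detection):
    """Build the JSON body posted to the web service API so the
    front-end JavaScript can fetch the matching Equibase record.
    Field names here are hypothetical."""
    class_id, confidence, bbox = detection
    return json.dumps({
        "program_number": class_id + 1,  # assume class 0 -> horse #1
        "confidence": round(confidence, 3),
        "bbox": bbox,                    # [x, y, w, h] in pixels
    })

# Example: two candidate towel detections from one frame
dets = [(2, 0.91, [120, 80, 60, 40]), (5, 0.55, [300, 90, 58, 42])]
best = best_detection(dets)
if best is not None:
    print(overlay_payload(best))  # would be POSTed to the Node.js API
```

In the real pipeline the resulting payload would be sent with an HTTP POST each time a towel is confidently identified, and the Node.js service would respond to the front end with the horse's Equibase statistics for the overlay.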

Technologies Used

Intel i7 machine with an NVIDIA GTX 1070 running CUDA and TensorFlow in GPU mode; Python 3.6; Node.js / Express; OpenCV; Voysys VR Producer for 360 video production; Wowza streaming service; Jockey Club Equibase RESTful API.
