
High-speed Ball Tracking Application in Sports Broadcast Videos

Introduction

In ball sports, ball tracking data is considered one of the most fundamental and useful pieces of information for evaluating players’ performance and game strategies. Until recently, it was a challenge to come up with a reliable technique that could accurately recognize and position balls in sports that involve tiny balls moving at high speed, for instance tennis, badminton, baseball, or golf.

In this project, we are going to build an end-to-end machine learning workflow that prepares video files, performs model training, and stands up a real-time endpoint in SageMaker. A sample application is also included that takes a sports broadcast video and produces a separate video file with the ball trajectory overlaid on the original video.

The following diagram depicts a high-level solution architecture that supports the development work.

[Architecture diagram: solution architecture]

Solution Overview

Specifically, the end-to-end machine learning workflow encapsulates the following main steps:

  1. Upload the relevant sports broadcast video files to an S3 bucket. These video files are to be annotated with labels (see the sketch after this list).
  2. Label video frames using an Amazon SageMaker Ground Truth labeling job.
  3. Process the ground truth labels to create features for model training using a SageMaker Processing job.
  4. Train a deep learning model (TrackNet) using a SageMaker training job on a GPU instance.
  5. Deploy a real-time HTTPS endpoint that serves the model for predicting ball positions.
  6. Produce a ball tracking video for a given broadcast video file by invoking the deployed HTTPS endpoint.
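
For step 1, here is a minimal sketch of uploading the videos with boto3; the bucket name, prefix, and file names are placeholders, not values used by this repository:

import boto3

s3 = boto3.client("s3")
bucket = "my-ball-tracking-bucket"  # placeholder: your own S3 bucket

# Upload each local broadcast video under a common prefix for the labeling job.
for video in ["match_01.mp4", "match_02.mp4"]:  # placeholder file names
    s3.upload_file(video, bucket, f"videos/raw/{video}")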

Quick Start Guide

Prerequisites

The code and the notebooks in this repository were tested on Amazon SageMaker. For the best experience, we highly recommend using SageMaker Studio or a SageMaker Notebook instance.

If this is your first time using Amazon SageMaker, here's a good starting point on setting up the SageMaker environment.

This project integrates with other AWS services, including S3, SageMaker Ground Truth, and Amazon Cognito, to provide an end-to-end solution. When configuring access control through IAM policies, we recommend starting with the AWS managed policies and moving toward least-privilege permissions as a best practice.

Associate the relevant AWS managed IAM policies with the SageMaker Execution Role.

Please refer to the following IAM policy examples for additional customizations that provide finer-grained permissions to meet your security requirements.

Starting Point

To get started, you need to clone the project into your SageMaker Studio environment:

> git clone https://github.com/wei-m-teh/sagemaker-tracknet-v2
> cd sagemaker-tracknet-v2

Ground Truth Labeling Job

Assuming the original broadcast video files have been uploaded to an S3 bucket, we are going to apply appropriate labels to the video frames. SageMaker Ground Truth is a data labeling service that makes it easy to label data in various formats and gives you the option to use human annotators through Amazon Mechanical Turk, third-party vendors, or your own private workforce. In our example, we are going to create a Ground Truth labeling job with a private workforce to annotate the uploaded videos.

The sample video files we tested for this project can be found here.

Instructions on how to create a SageMaker Ground Truth labeling job can be found here.
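
For reference, creating such a labeling job programmatically looks roughly like the boto3 sketch below. This is a hedged illustration only: the S3 URIs, role, and workteam ARN are placeholders, the built-in task UI and Lambda ARNs vary by region, and the exact configuration for video frame object tracking should be taken from the linked instructions.

import boto3

sm = boto3.client("sagemaker")

sm.create_labeling_job(
    LabelingJobName="ball-tracking-labels",  # placeholder name
    LabelAttributeName="ball-position-ref",  # video frame tasks require a "-ref" suffix
    InputConfig={
        "DataSource": {
            "S3DataSource": {
                "ManifestS3Uri": "s3://my-bucket/videos/manifest.json"  # placeholder
            }
        }
    },
    OutputConfig={"S3OutputPath": "s3://my-bucket/labels/"},  # placeholder
    RoleArn="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # placeholder
    HumanTaskConfig={
        "WorkteamArn": "arn:aws:sagemaker:us-east-1:111122223333:workteam/private-crowd/my-team",  # placeholder
        # Built-in task UI and Lambdas for video object tracking; ARNs are region-specific.
        "UiConfig": {"HumanTaskUiArn": "arn:aws:sagemaker:us-east-1:394669845002:human-task-ui/VideoObjectTracking"},
        "PreHumanTaskLambdaArn": "arn:aws:lambda:us-east-1:432418664414:function:PRE-VideoObjectTracking",
        "AnnotationConsolidationConfig": {
            "AnnotationConsolidationLambdaArn": "arn:aws:lambda:us-east-1:432418664414:function:ACS-VideoObjectTracking"
        },
        "TaskTitle": "Track the ball in each video frame",
        "TaskDescription": "Mark the ball position in every frame",
        "NumberOfHumanWorkersPerDataObject": 1,
        "TaskTimeLimitInSeconds": 3600,
    },
)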

Feature Engineering

Once the video files are labeled, we will create the features to train a model. Given the large volume of labeled data (labels are applied to every video frame in the video files), we will leverage a SageMaker Processing job to featurize the dataset required for training a model. Instructions on how to create a SageMaker Processing job can be found here.
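
A minimal sketch of launching such a Processing job with the SageMaker Python SDK is shown below; the container image, script name, and S3 paths are placeholders rather than the exact values used by this repository's notebooks.

from sagemaker.processing import ScriptProcessor, ProcessingInput, ProcessingOutput

processor = ScriptProcessor(
    image_uri="<your-processing-image-uri>",  # placeholder container image
    command=["python3"],
    role="<your-sagemaker-execution-role>",   # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

# feature_engineering.py is a hypothetical name for the featurization script.
processor.run(
    code="feature_engineering.py",
    inputs=[ProcessingInput(source="s3://my-bucket/labels/",
                            destination="/opt/ml/processing/input")],
    outputs=[ProcessingOutput(source="/opt/ml/processing/output",
                              destination="s3://my-bucket/features/")],
)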

Model Training

Once the feature engineering step is complete, all the input data required for training a model should be available in the specified S3 bucket location. We will trigger a SageMaker training job to train a model, as described here.
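
As a hedged sketch, such a training job can be launched with the SageMaker Python SDK's TensorFlow estimator (TrackNet is a TensorFlow/Keras model); the entry point, framework version, and hyperparameters below are illustrative assumptions, not this repository's exact settings.

from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point="train.py",                  # hypothetical training script name
    role="<your-sagemaker-execution-role>",  # placeholder
    instance_count=1,
    instance_type="ml.p3.2xlarge",           # single-GPU training instance
    framework_version="2.4.1",               # assumed TensorFlow version
    py_version="py37",
    hyperparameters={"epochs": 30, "batch_size": 2},  # illustrative values
)

# Point the training channel at the features produced by the Processing job.
estimator.fit({"training": "s3://my-bucket/features/"})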

Deploy Model Endpoint and Generate Ball Tracking Trajectory Video Files

After the model is trained successfully, we can deploy a SageMaker endpoint to serve inference for video files. With a real-time endpoint deployed in SageMaker, you can then create ball tracking videos by integrating the deployed endpoint with your own videos. A sample application is included in this repository to demonstrate the capability in action here. To run the application, use any badminton video in which you would like to track the ball movement, or use one of the videos referenced here.
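
A minimal sketch of deploying and invoking the endpoint follows; the instance type, endpoint name, and payload format are placeholders, since the actual serialization depends on this repository's inference code.

import json
import boto3

# Deploy the trained model behind a real-time HTTPS endpoint.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.g4dn.xlarge",     # assumed GPU inference instance
    endpoint_name="tracknet-endpoint",  # placeholder name
)

# Invoke the endpoint from any client via the SageMaker runtime API.
runtime = boto3.client("sagemaker-runtime")
response = runtime.invoke_endpoint(
    EndpointName="tracknet-endpoint",
    ContentType="application/json",  # actual content type depends on the inference code
    Body=json.dumps({"frames": "<serialized video frames>"}),  # placeholder payload
)
print(response["Body"].read())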

Running this repository in SageMaker Local Mode

If you would like to run this notebook in local mode, please follow the instructions here.

NOTE:

If you would like to run these notebooks in a SageMaker Notebook instance environment, you will need to make sure the backend configured in keras.json points to TensorFlow. You can do so by running the following command in your terminal.

cp -f ~/.keras/keras_tensorflow.json ~/.keras/keras.json

Prediction Results

The inference application saves the prediction results, namely the visibility label and the location of the ball, in a CSV file. The sample output below represents an example of the prediction results. Take a close look at the ball and observe the red dot following its movement.

[Sample prediction video: 1_01_00_predict]
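
As an illustration of how such a CSV could be consumed, here is a hedged sketch that redraws the red dot on the original frames with OpenCV. The column names (Visibility, X, Y) and file names are assumptions about the output layout, not a documented schema.

import cv2
import pandas as pd

preds = pd.read_csv("1_01_00_predict.csv")  # assumed output file name
cap = cv2.VideoCapture("1_01_00.mp4")       # placeholder input video
fps = cap.get(cv2.CAP_PROP_FPS)
size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
out = cv2.VideoWriter("1_01_00_tracked.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

for _, row in preds.iterrows():
    ok, frame = cap.read()
    if not ok:
        break
    if row["Visibility"] == 1:  # assumed: 1 means the ball is visible in the frame
        # Draw a filled red dot (BGR color order) at the predicted ball position.
        cv2.circle(frame, (int(row["X"]), int(row["Y"])), 5, (0, 0, 255), -1)
    out.write(frame)

cap.release()
out.release()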

License

This library is licensed under the MIT-0 License. See the LICENSE file.
