Senior Design Projects

ECS193 A/B Winter & Spring 2019

RowBot - an integrated logbook and AI coach for rowers

Email **********
Paul Crawford
Hegemony Technologies

Project details

RowBot - an integrated logbook and AI coach for rowers
The technologies driving the Internet of Things (IoT) and the motivations behind the “quantified self” movement have found synergies in health, fitness, and sports applications, as demonstrated by the surging popularity of consumer devices such as Fitbit, Apple Watch, and Garmin, along with numerous mobile apps. The more niche sport of rowing is benefiting from these general advances in fitness tracking; however, most of the rowing data sources of interest are isolated and unconsolidated. To generate quantified insights into rowing, all of the following information is relevant to improving rowing performance: calendars, practice plans, weather, tides, water flow, video, heart rate, speed, stroke rate, boat information, crew information, physiological capacity, specific rowing data from boat instruments and ergometers, and a coach's interpretations of rowing performance. These data have accumulated across a broad collection of applications, files, databases, emails, texts, pictures, videos, and websites. All of this data is ripe for harvesting to build an intelligent agent that could supplement, or even replace, a coach.
The goal of this project is to build an extensible, highly scalable platform and methods for creating an artificially intelligent agent to coach rowers. This so-called “RowBot” needs to 1) identify, access, query, compile, interpret, and derive rowing-relevant data from a wide variety of sources; 2) create models to classify the data using supervised and unsupervised machine learning methodologies; 3) derive coaching recommendations from optimization and regression analyses of the aforementioned data; and 4) deliver the recommendations back to the coach and rower.
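Steps 2 and 3 can be sketched with scikit-learn on synthetic data. Everything here is illustrative: the features (stroke rate, heart rate), the number of intensity zones, and the linear speed model are assumptions, not project requirements.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic per-stroke features: stroke rate (strokes/min) drawn around
# three typical training intensities, plus a correlated heart rate (bpm).
stroke_rate = rng.normal(loc=[20, 28, 34], scale=1.0, size=(100, 3)).ravel()
heart_rate = 90 + 3.2 * stroke_rate + rng.normal(0, 5, stroke_rate.size)
X = np.column_stack([stroke_rate, heart_rate])

# Step 2: unsupervised classification of the strokes into intensity zones.
zones = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 3: regression of boat speed on stroke rate, the kind of fitted
# relationship a recommendation engine could turn into coaching advice.
speed = 2.5 + 0.08 * stroke_rate + rng.normal(0, 0.1, stroke_rate.size)
model = LinearRegression().fit(stroke_rate.reshape(-1, 1), speed)
print(f"zones found: {len(set(zones))}, speed gain per spm: {model.coef_[0]:.3f}")
```

In the real system the features would come from the ingestion pipeline rather than a random generator, and the models would be selected and validated against coach-labeled sessions.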
The deliverable is a cloud-based platform that includes an ingestion engine for importing the identified data sources, a robust database for storing and accessing the data, a data processing and analytics pipeline, a machine learning infrastructure for training and building the classification, regression, and optimization models, and a recommendation engine for delivering the coaching instructions. The code should be developed using industry-standard practices, including clean and well-documented code, code repositories, appropriate unit tests, and a highly automated deployment framework.
Requisite skills and technologies include SQL and NoSQL database design, ETL, and administration; website development and integration; Python and its associated libraries for dataframes, advanced mathematics, web scraping, and machine learning (pandas, scikit-learn, SciPy, BeautifulSoup, etc.); code repositories (GitLab); and Agile software development methodologies. This work is expected to be built in the AWS ecosystem for ease of administration and scaling, and will involve significant use of image and video analysis libraries (JPG, YouTube, and H.264/MP4 formats; metadata), CSV, XML, and JSON file formats, time synchronization techniques, and data alignment, filtering, and optimization methods.
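The time synchronization and data alignment work mentioned above is well supported by pandas. A minimal sketch, assuming two hypothetical sensor streams (a GPS speed log and a heart rate monitor) whose clocks do not tick in lockstep:

```python
import pandas as pd

# Hypothetical streams: GPS speed sampled at 1 Hz, heart rate sampled
# irregularly. Column names and timestamps are illustrative only.
gps = pd.DataFrame({
    "time": pd.to_datetime(["2019-01-15 08:00:00", "2019-01-15 08:00:01",
                            "2019-01-15 08:00:02"]),
    "speed_mps": [4.1, 4.3, 4.2],
})
hr = pd.DataFrame({
    "time": pd.to_datetime(["2019-01-15 08:00:00.4", "2019-01-15 08:00:01.6"]),
    "heart_rate": [152, 155],
})

# Align each heart rate reading with the most recent GPS sample,
# rejecting matches more than one second old. Both frames must be
# sorted on the key column for merge_asof.
aligned = pd.merge_asof(hr, gps, on="time", direction="backward",
                        tolerance=pd.Timedelta("1s"))
print(aligned)
```

`merge_asof` is one of several alignment strategies; resampling both streams onto a common clock with `DataFrame.resample` is a reasonable alternative when uniform time steps are needed downstream.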
30-60 min weekly or more
Client wishes to keep IP of the project
Attachment N/A
Team members N/A