Jenkins is the way to build and deploy ML models fast and free

Final University Project

Submitted By Jenkins User Manuel Jesús Núñez Ruiz
During an internship, this university student sought to develop ML models with DevOps techniques (MLOps) for gamma ray classification.
Organization: University of Granada
Team Members: Alberto Guillén Perales (Professor); Manuel Jesús Núñez Ruiz (Software Engineer Intern)
Industries: Education
Programming Languages: Python
Version Control Systems: GitHub
Community Support: Jenkins.io websites & blogs

Intern tackles MLOps with the support of Jenkins.

Background: My challenge during my internship and final university project was to test and deploy ML models automatically using DevOps practices.

Goal: Ensure code quality and unit testing of the ML models.
"Jenkins is great for fast builds and testing, and works nicely with Docker on a Raspberry Pi."
— Manuel Jesús Núñez Ruiz, Software Engineer Intern

Solution & Results: The solution I came up with was to deploy a Jenkins instance on my Raspberry Pi. It turns out it works fine, and it is a very cheap solution for students, since I don't have enough credits on Travis CI and Jenkins is FREE!

I learned how to use Jenkins a few days ago, and I have to say it is a very useful tool. I first installed Blue Ocean, a very cool and modern web interface for Jenkins. After the installation, I created a Jenkinsfile defining the pipeline I wanted to execute. It installs the project dependencies in a Python container, then runs a linter and a code formatter (Pylint and Black) to ensure the best code quality. Finally, it executes Pytest to verify that the implemented ML models work correctly.
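For reference, a minimal Jenkinsfile along these lines might look like the sketch below. This is not my exact pipeline: the Python image tag, the `src` and `tests` directories, and `requirements.txt` are placeholders, and the docker agent block assumes the Docker Pipeline plugin is installed and Docker is available on the Raspberry Pi.

```groovy
pipeline {
    // Run every stage inside a throwaway Python container
    agent {
        docker {
            image 'python:3.8-slim'   // image tag is an assumption
            args '-u root'            // allow pip to install system-wide inside the container
        }
    }
    stages {
        stage('Install dependencies') {
            steps {
                sh 'pip install -r requirements.txt'
            }
        }
        stage('Code quality') {
            steps {
                // Linter and formatter checks; 'src' is a placeholder package name
                sh 'pylint src'
                sh 'black --check src'
            }
        }
        stage('Unit tests') {
            steps {
                // Pytest exercises the ML model unit tests
                sh 'pytest tests'
            }
        }
    }
}
```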

The key capabilities I relied on for this project were the Blue Ocean UI, Docker-based builds, and the fact that Jenkins runs on a Raspberry Pi. Take a look at my project here.

I was pleased with Jenkins because:

  • Build times are very fast

  • It’s free

  • It can be executed on a Raspberry Pi 

  • Its pipeline definitions are very useful

  • It integrates well with GitHub