Mon Dec 13 2021

Xomnia’s machine learning engineers participate in FruitPunch’s AI for Wildlife Challenge

Anna Dekker, Samantha Biegel, and Vincent Roest, who are part of Xomnia’s Machine Learning Development Program, have volunteered in the second edition of the AI for Wildlife Challenge. The three machine learning engineers, alongside three data scientists based in South Africa and the UAE, formed a team supervised by our Lead Data Engineer Andrey Kudryavets to deliver a model that detects poachers in real-time in the Pilanesberg National Reserve in South Africa.

The 8-week challenge started in September 2021, with data professionals volunteering to tackle, using AI, different challenges related to detecting poachers at the reserve. The volunteers were divided into teams, and the team that included our machine learning engineers was tasked with developing an object detection model that spots poaching in real time from RGB (color) imagery collected by drones.

Other teams worked on creating a data preparation pipeline, improving the existing model for thermal data, preparing hardware to deploy the model on the drone, and developing autonomous landing for the drone.

The initiative was organized by FruitPunch, a community of data specialists who use AI to come up with solutions for challenges related to Sustainable Development Goal 15, "Life on Land", one of the 17 goals set in 2015 by the United Nations. To deliver this challenge, FruitPunch collaborated with the South African conservation company Strategic Protection of Threatened Species (SPOTS). SPOTS’ staff at the Pilanesberg National Reserve need support in quickly detecting poachers, who hunt the rhinos living within the reserve to sell their horns.

Delivering classification and object detection models

Because SPOTS had no drone imagery of poachers, our team started by collaborating with the park rangers, who walked around the park posing as poachers while being filmed by the drones.

Our MLEs and their team members deployed an AWS SageMaker Ground Truth pipeline to outsource the labor-intensive task of labeling the imagery for poachers, and then used data augmentation to enlarge the resulting dataset.
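To give a feel for the augmentation step, here is a minimal sketch of how a small drone-imagery dataset can be expanded with simple transforms. The specific transforms the team used are not described in the article; flips and brightness jitter below are illustrative assumptions.

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator):
    """Yield augmented variants of an (H, W, 3) RGB image.

    Illustrative transforms only; the actual augmentations used in the
    challenge are not specified in the article.
    """
    yield image[:, ::-1, :]  # horizontal flip
    yield image[::-1, :, :]  # vertical flip
    factor = rng.uniform(0.8, 1.2)  # random brightness jitter
    yield np.clip(image.astype(np.float32) * factor, 0, 255).astype(image.dtype)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in frame
variants = list(augment(img, rng))  # three extra samples from one image
```

Each original frame thus yields several training samples, which matters when labeled poacher imagery is scarce.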

After evaluating performance, the team settled on a two-stage strategy: a classification model first labels each image as containing poachers or not, and only for positive images is an object detection model applied to locate the poacher.

“Because continuously running the object detection models puts too much pressure on the battery of the drone, we decided to first apply a cheaper classification algorithm,” explains Xomnia’s machine learning engineer Vincent, who volunteered in the challenge.
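The cascade Vincent describes can be sketched as follows. The function and model names are hypothetical stand-ins, not the team's actual code: the point is that the expensive detector runs only on frames the cheap classifier flags.

```python
from typing import Callable, List, Optional, Tuple

Box = Tuple[int, int, int, int]  # x, y, width, height of a detection

def detect_poachers(
    frame,
    classify: Callable[[object], float],    # cheap model: P(poacher in frame)
    detect: Callable[[object], List[Box]],  # expensive model: bounding boxes
    threshold: float = 0.5,
) -> Optional[List[Box]]:
    """Run the costly detector only when the cheap classifier fires."""
    if classify(frame) < threshold:
        return None  # skip detection on this frame, saving drone battery
    return detect(frame)

# Toy stand-ins for the two models:
hit = detect_poachers("frame", classify=lambda f: 0.9,
                      detect=lambda f: [(10, 20, 30, 40)])
miss = detect_poachers("frame", classify=lambda f: 0.1,
                       detect=lambda f: [(10, 20, 30, 40)])
```

The design trades a small risk of the classifier missing a frame against a large saving in compute, which is what makes on-drone, battery-powered inference feasible.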

The model, currently at the proof-of-concept stage, is ready to be deployed on drones that send data to SPOTS’ control center. It can help save rhinos by monitoring a larger area more accurately and quickly.

“It’s nice that we could learn from each other within this initiative’s setup, as different volunteers brought different experiences. It was an excellent opportunity to contribute to this great cause while broadening our skills,” said Anna.

“At the Machine Learning Development Program, we trained to work with tools like AWS. The challenge was a nice opportunity to apply what I learned,” added Samantha.

“An added bonus was that we also had to manage the project. With the help of our lead data engineer Andrey, we decided where to go and how we wanted to solve problems, going from nothing to having a finalized POC in 8 weeks!” said Vincent.