Introducing the iNaturalist 2018 Challenge!

March 15, 2018, 4:51 p.m. By: Kirti Bakshi


Thanks to recent advances in deep learning, the visual recognition abilities of machines have improved dramatically, permitting the practical application of computer vision to tasks ranging from pedestrian detection for self-driving cars to expression recognition in virtual reality. One area that remains challenging for computers, however, is fine-grained and instance-level recognition.

Discriminating fine-grained categories is hard for computers because many categories have relatively few training examples (the long-tail problem), the examples that do exist often lack authoritative training labels, and there is considerable variability in illumination, viewing angle, and object occlusion.

To help confront these hurdles, the 2018 iNaturalist Challenge (iNat-2018) has been announced: a species classification competition offered in partnership with iNaturalist and Visipedia (short for Visual Encyclopedia), a project for which Caltech and Cornell Tech received a Google Focused Research Award. It is a flagship challenge of the 5th International Workshop on Fine-Grained Visual Categorization (FGVC5) at CVPR 2018.

Since its founding in 2008, iNaturalist has emerged as a world leader in helping citizen scientists share observations of species and connect with nature. It hosts research-grade photos and annotations submitted by an engaged community of users.

Building on the first iNaturalist challenge, iNat-2017, this year's competition spans more than 8,000 categories of plants, animals, and fungi, with more than 450,000 training images in total.

Participants are invited to enter the competition on Kaggle, with final submissions due in early June (details at the end of this article). Training data, annotations, and links to pre-trained models can be found on the GitHub repo.
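
To get a quick feel for the data before training anything, the annotations can be inspected directly. The sketch below assumes the COCO-style JSON layout ("images", "annotations", "categories") that the iNat datasets have used; the filename train2018.json is a placeholder, so check the repo for the actual file names:

    # Sketch: inspect the class distribution in the training annotations.
    # Assumes a COCO-style JSON layout; the filename is a placeholder.
    import json
    from collections import Counter

    with open("train2018.json") as f:
        data = json.load(f)

    # Count training images per category to make the long tail visible.
    counts = Counter(ann["category_id"] for ann in data["annotations"])
    print("categories:", len(data["categories"]))
    print("images:", len(data["images"]))
    print("rarest 5 categories:", counts.most_common()[-5:])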

How is it Different from iNat-2017:

This year's dataset contains more species than last year's competition. It also comes with a full taxonomy, a longer tail, and a new evaluation metric.
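
The exact metric is defined on the Kaggle competition page; as an illustration only, the sketch below assumes a top-k error of the kind these challenges use, with k = 3:

    # Sketch of a top-k error metric (k = 3 assumed for illustration;
    # the Kaggle page defines the authoritative metric).
    import numpy as np

    def top_k_error(scores, labels, k=3):
        """scores: (n, num_classes) model scores; labels: (n,) true ids."""
        top_k = np.argsort(scores, axis=1)[:, -k:]    # k best classes per image
        hits = (top_k == labels[:, None]).any(axis=1) # true label among them?
        return 1.0 - hits.mean()

    scores = np.random.rand(4, 8000)   # dummy scores over 8,000 categories
    labels = np.array([17, 42, 7999, 0])
    print(top_k_error(scores, labels)) # near 1.0 for random scores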

Main Goal of the Competition:

The goal of this competition is to push the state of the art in automatic image classification for real-world data featuring a large number of fine-grained categories with high class imbalance.

Along with iNat-2018, FGVC5 will also be hosting the iMaterialist challenge, which this year will include:

  • A furniture categorization challenge

  • A fashion attributes challenge

Both address product images. FGVC5 will additionally host a set of “FGVCx” challenges: smaller-scale but still significant challenges featuring content such as food and modern art.

In contrast to other image classification datasets such as ImageNet, the dataset in the iNaturalist challenge exhibits a long-tailed distribution, with many species having relatively few images. It is important to enable machine learning models to handle categories in the long tail, because the natural world is heavily imbalanced: some species are more abundant and easier to photograph than others. The iNaturalist challenge will encourage progress here, since the training distribution of iNat-2018 has an even longer tail than that of iNat-2017.
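
One common tactic for such a tail (standard practice, assumed here, not something the competition mandates) is to reweight rare classes so they contribute more during training, for example with inverse-frequency class weights:

    # Sketch: inverse-frequency class weights for a long-tailed label set.
    # An assumed, standard remedy; not an official iNat-2018 recipe.
    import numpy as np

    def inverse_frequency_weights(labels, num_classes):
        """Per-class weights inversely proportional to class frequency."""
        counts = np.bincount(labels, minlength=num_classes).astype(float)
        counts[counts == 0] = 1.0                     # guard empty classes
        weights = 1.0 / counts
        return weights / weights.sum() * num_classes  # normalize to mean 1

    labels = np.array([0, 0, 0, 0, 1, 1, 2])  # toy long-tailed label set
    print(inverse_frequency_weights(labels, num_classes=3))

Such weights could then feed a weighted loss or a weighted sampler; whether that actually helps at iNat-2018's scale is exactly the kind of question the challenge is meant to probe.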

FGVC5 (Fine-Grained Visual Categorization) will be showcased on the main stage at CVPR 2018, ensuring broad exposure for the top-performing teams. This project will advance the state of the art in automatic image classification for real-world, fine-grained categories with heavy class imbalance and large numbers of classes.

Competition Details:

Competition Begins: February 2018

Submission Deadline: June 4, 2018

You are therefore invited to participate in these competitions and help move the field forward!

For More Information: GitHub