Nutri-Sight: Part 2: Seamless Nutrition Data Entry – On the Open Food Facts App and Website, and maybe yours!

In the first part of this Nutri-Sight blog post series, we introduced Nutri-Sight, our AI-powered solution developed thanks to the DRG4Food programme and designed to tackle the monumental task of extracting nutrition facts from food packaging images at Open Food Facts scale. This technology is a game-changer for populating the Open Food Facts database, but its true power lies in making the process faster and easier for everyone involved. Now, let’s dive into how we’re bringing this capability directly to your fingertips through user-friendly interfaces.

A Smoother Experience on the Open Food Facts Mobile App

We know that many contributions come directly from users scanning products in supermarkets or their kitchen cupboards with the Open Food Facts mobile app (available on iOS and Android). Manually typing in an entire nutrition table is time-consuming and error-prone.

This is where Nutri-Sight steps in to radically simplify the contribution flow:

  1. “Scan” the Nutrition Table: When you add or update a product, simply take a clear picture of the nutrition facts panel.
  2. AI-Powered Suggestions: The app sends the image to Robotoff, where the Nutri-Sight model analyzes it within seconds and extracts the nutritional values (calories, fat, carbohydrates, protein, etc.) together with their corresponding units per 100 g or 100 ml (a minimal sketch of this request follows the list).
  3. Intuitive Validation UI: Instead of a blank form, the app presents suggestions below the nutrition fields. Each suggested value is clearly marked, and you can use it to fill an empty field or overwrite an existing value.
  4. Quick Confirmation: Your role shifts from tedious data entry to quick verification. Simply review the AI’s suggestions against the packaging image and tap to confirm or easily correct any minor inaccuracies.
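
To make step 2 concrete, here is a minimal sketch, in TypeScript, of what fetching Nutri-Sight suggestions from Robotoff could look like. The endpoint path, query parameters, insight type name and response shape are assumptions for illustration only; the authoritative contract is the Robotoff API documentation.

```ts
// Minimal sketch of the request the app makes, assuming a Robotoff-style
// HTTP API. Endpoint path, parameters and response shape are illustrative
// assumptions; check the Robotoff API documentation for the exact contract.

interface NutrientSuggestion {
  nutrient: string; // e.g. "energy-kcal", "fat", "proteins"
  value: number;
  unit: string;     // e.g. "kcal", "g"
}

const ROBOTOFF_BASE = "https://robotoff.openfoodfacts.org/api/v1";

async function fetchNutrientSuggestions(barcode: string): Promise<NutrientSuggestion[]> {
  // Hypothetical query: nutrient-extraction insights for this product.
  const url = `${ROBOTOFF_BASE}/insights?insight_types=nutrient_extraction&barcode=${barcode}`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Robotoff request failed: ${response.status}`);
  }
  const body = await response.json();

  // Assumed shape: { insights: [{ data: { nutrients: { fat_100g: { value, unit }, ... } } }] }
  const suggestions: NutrientSuggestion[] = [];
  for (const insight of body.insights ?? []) {
    const nutrients = insight.data?.nutrients ?? {};
    for (const [key, entry] of Object.entries<{ value: number; unit: string }>(nutrients)) {
      suggestions.push({
        nutrient: key.replace(/_100g$/, ""),
        value: entry.value,
        unit: entry.unit,
      });
    }
  }
  return suggestions;
}

// The app shows these suggestions under the nutrition fields; the contributor
// confirms or corrects each value before the product is saved.
fetchNutrientSuggestions("3017620422003").then((suggestions) =>
  suggestions.forEach((s) => console.log(`${s.nutrient}: ${s.value} ${s.unit}`))
);
```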

This streamlined interface significantly reduces the effort required to add vital nutritional information. Under the hood, the user interface for validating AI suggestions is built with our open-source Open Food Facts Dart package. This is crucial because it means the same ML capability isn’t limited to our app: any third-party developer building a Flutter application can integrate the package to give their users the same efficient nutrition data validation experience, powered by Open Food Facts data and AI.

Bringing AI Assistance to the Open Food Facts Website

The collaborative effort to grow the Open Food Facts database also happens extensively on our website. Contributors often work from photos uploaded earlier via the mobile app, or go through batches of products that still need data completion. To support these users, we’ve developed a similar AI-assisted workflow for the web platform.

When viewing a product page on the Open Food Facts website (while logged in), if a nutrition table image is available, contributors see an option to use Nutri-Sight. The results are presented via a dedicated web component.

This web component functions much like the interface in the mobile app:

  • It displays the nutrition table image alongside the data fields.
  • It highlights the values suggested by the Nutri-Sight AI.
  • It allows the contributor to easily confirm or correct each value before saving.

The beauty of encapsulating this functionality in a web component is its reusability: any third-party website or web application can embed the component with just a few lines of code, as sketched below. This lets other platforms focused on food, health, or sustainability integrate Open Food Facts’ nutrition data validation directly into their own user flows, leveraging our AI, giving their users immediate results, and contributing data back to Open Food Facts.
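
As a rough idea of what that embedding could look like, here is a minimal TypeScript sketch. The package name, custom element tag and attribute names are hypothetical placeholders; the published names live in the Open Food Facts web components project.

```ts
// Minimal sketch of embedding the validation web component in a third-party
// page. Package name, tag name and attributes are hypothetical placeholders;
// refer to the Open Food Facts web components project for the published API.

// Register the custom element (hypothetical package name).
import "@openfoodfacts/web-components";

// Create the element and point it at a product (hypothetical tag/attributes).
const editor = document.createElement("nutrient-extraction-editor");
editor.setAttribute("product-code", "3017620422003");
editor.setAttribute("language", "en");

// The component shows the nutrition photo next to the suggested values and
// lets the user confirm or correct each one before sending the result back
// to Open Food Facts.
document.querySelector("#nutrition-panel")?.appendChild(editor);
```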

Growing Ecosystem: Third-Party Adoption

The core philosophy of Open Food Facts is openness and collaboration. We don’t just want to build great tools for ourselves; we want to empower the entire ecosystem working towards food transparency.

We are therefore thrilled that applications such as El Coco, a partner in the Nutri-Sight project that already uses Open Food Facts data, are in the process of integrating these new AI-powered validation UIs into their own apps.

By adopting these tools, third-party apps can:

  • Improve the quality and completeness of nutrition data within their own apps.
  • Reduce the friction for users who want to contribute missing information and unlock product scores such as the Nutri-Score.
  • Benefit directly from the advancements driven by the Open Food Facts community and projects like DRG4Food.
  • Ultimately, contribute to making comprehensive nutritional information more accessible to everyone, everywhere.

Conclusion: Better Data, Easier Access

With Nutri-Sight, nutrition data collection becomes faster, more accurate, and more collaborative. By integrating these tools into our mobile app and website, and making them available as open-source packages and components for others, we are significantly accelerating our collective mission towards complete food transparency. Whether you’re a user, a contributor, or a developer, these advancements make it easier than ever to be part of the food revolution.

How to Get Involved

  • If you are a user of the Open Food Facts application, you can benefit right now from these prediction tools in the mobile application and on the website.
  • If you are a contributor to the database, you can use the Hunger Games platform to quickly validate the nutritional information on a whole host of food products.
  • If you’re a data scientist, you might want to help us improve the model, extend it to more data, or make it available on mobile. You’ll certainly be interested in the model, the data, or the training code.

Many thanks again to the DRG4Food project, funded by the European Commission, which has given us the opportunity to speed up the race towards greater transparency in food.