Ingredient-guided RGB-D fusion network for nutritional assessment

Zhihui Feng, Hao Xiong, Weiqing Min, Sujuan Hou*, Huichuan Duan, Zhonghua Liu, Shuqiang Jiang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The nutritional value of agricultural products is an important indicator of their quality, and it directly affects people's dietary choices and overall well-being. Nutritional assessment studies provide a scientific basis for the production, processing, and marketing of food by analyzing the nutrients it contains. Traditional assessment methods often suffer from suboptimal accuracy, can be time consuming, and are constrained by a shortage of trained professionals. Progress in artificial intelligence has revolutionized dietary health by offering more accessible, vision-based methods for food nutritional assessment. However, existing vision-based methods that rely on RGB images often face challenges under varying lighting conditions, which degrades the accuracy of nutritional assessment. An alternative is RGB-D fusion, which combines RGB images with depth maps; yet these methods typically rely on simple fusion techniques that do not ensure precise assessment. In addition, current vision-based methods struggle to detect small components such as oils and sugars on food surfaces, which are crucial for determining ingredient information and ensuring accurate nutritional assessment. To address these issues, we propose a novel ingredient-guided RGB-D fusion network that integrates RGB images with depth maps and enables more reliable nutritional assessment guided by ingredient information. Specifically, a multifrequency bimodality fusion module is designed to exploit the correlation between the RGB image and the depth map in the frequency domain. Furthermore, a progressive-fusion module and an ingredient-guided module leverage ingredient information to explore the potential correlation between ingredients and nutrients, thereby strengthening the guidance for nutritional assessment learning. We evaluate our approach under a variety of ablation settings on the Nutrition5k dataset, where it consistently outperforms state-of-the-art methods.
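
To illustrate the general idea of fusing RGB and depth features in the frequency domain (as the abstract describes for the multifrequency bimodality fusion module), the sketch below shows a minimal PyTorch-style layer that transforms both feature maps with a 2D FFT, mixes their spectra with a 1x1 convolution, and returns to the spatial domain. This is only an assumed, illustrative construction; the class name `FrequencyDomainFusion` and all hyperparameters are hypothetical and do not reproduce the paper's actual module.

```python
import torch
import torch.nn as nn


class FrequencyDomainFusion(nn.Module):
    """Illustrative frequency-domain fusion of RGB and depth feature maps.

    Hypothetical sketch only: it demonstrates correlating two modalities
    via FFT spectra, not the paper's multifrequency bimodality fusion module.
    """

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 convolution mixing the concatenated real/imaginary spectra
        self.mix = nn.Conv2d(4 * channels, 2 * channels, kernel_size=1)

    def forward(self, rgb_feat: torch.Tensor, depth_feat: torch.Tensor) -> torch.Tensor:
        # Transform both modalities to the frequency domain (2D real FFT)
        rgb_freq = torch.fft.rfft2(rgb_feat, norm="ortho")
        depth_freq = torch.fft.rfft2(depth_feat, norm="ortho")

        # Stack real and imaginary parts so the conv can model cross-modal correlation
        spectra = torch.cat(
            [rgb_freq.real, rgb_freq.imag, depth_freq.real, depth_freq.imag], dim=1
        )
        mixed = self.mix(spectra)

        # Rebuild a complex spectrum and map back to the spatial domain
        real, imag = torch.chunk(mixed, 2, dim=1)
        fused_freq = torch.complex(real, imag)
        return torch.fft.irfft2(fused_freq, s=rgb_feat.shape[-2:], norm="ortho")


# Example: fuse 64-channel feature maps from an RGB branch and a depth branch
if __name__ == "__main__":
    rgb = torch.randn(2, 64, 56, 56)
    depth = torch.randn(2, 64, 56, 56)
    fused = FrequencyDomainFusion(channels=64)(rgb, depth)
    print(fused.shape)  # torch.Size([2, 64, 56, 56])
```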
Original language: English
Pages (from-to): 156-166
Number of pages: 11
Journal: IEEE Transactions on AgriFood Electronics
Volume: 3
Issue number: 1
Early online date: 3 Dec 2024
DOIs
Publication status: Published - Apr 2025

Keywords

  • deep learning
  • food nutrients estimation
  • ingredient information
  • multimodality fusion
