Researchers at the University of Illinois Urbana-Champaign have developed a new method to convert standard RGB images into hyperspectral images using deep machine learning. This innovative approach could dramatically lower the cost and complexity of chemical analysis in agriculture, enabling more accessible product quality assessment using regular cameras or smartphones.
Md Toukir Ahmed, doctoral student at the University of Illinois Urbana-Champaign, takes a photo of a sweet potato with a hyperspectral camera.
(Source: College of ACES)
Hyperspectral imaging is a useful technique for analyzing the chemical composition of food and agricultural products. However, it is a costly and complicated procedure, which limits its practical application. A team of University of Illinois Urbana-Champaign researchers has developed a method to reconstruct hyperspectral images from standard RGB images using deep machine learning. This technique can greatly simplify the analytical process and potentially revolutionize product assessment in the agricultural industry.
“Hyperspectral imaging uses expensive equipment. If we can use RGB images captured with a regular camera or smartphone, we can use a low-cost, handheld device to predict product quality,” said lead author Md Toukir Ahmed, a doctoral student in the Department of Agricultural and Biological Engineering (ABE), part of the College of Agricultural, Consumer and Environmental Sciences and The Grainger College of Engineering at Illinois.
The researchers tested their method by analyzing the chemical composition of sweet potatoes. They focused on soluble solid content in one study and dry matter in a second study — important features that influence the taste, nutritional value, marketability, and processing suitability of sweet potatoes. Using deep learning models, they converted the information from RGB images into hyperspectral images.
“With RGB images, you can only detect visible attributes like color, shape, size, and external defects; you can’t detect any chemical parameters. In RGB images you have wavelengths from 400 to 700 nanometers, and three channels — red, green, and blue. But with hyperspectral images you have many channels and wavelengths from 700 to 1,000 nm. With deep learning methods, we can map and reconstruct that range so we now can detect the chemical attributes from RGB images,” said Mohammed Kamruzzaman, assistant professor in ABE and corresponding author on both papers.
Hyperspectral imaging captures a detailed spectral signature at spatial locations across hundreds of narrow bands, combining to form hypercubes. Applying cutting-edge deep learning-based algorithms, Kamruzzaman and Ahmed were able to create a model to reconstruct the hypercubes from RGB images to provide the relevant information for product analysis.
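To make the mapping concrete, here is a minimal sketch of an RGB-to-hypercube reconstruction network written in PyTorch. The band count, layer sizes, and architecture are illustrative assumptions for this article; they are not the models used in the published studies.

```python
# Minimal sketch: map a 3-channel RGB image to an n_bands-channel hypercube.
# Band count (150) and architecture are illustrative assumptions.
import torch
import torch.nn as nn

class RGBToHypercube(nn.Module):
    def __init__(self, n_bands: int = 150):
        super().__init__()
        # Expand 3 RGB channels to n_bands spectral channels, keeping spatial size.
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, n_bands, kernel_size=1),
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        # rgb: (batch, 3, H, W) -> hypercube: (batch, n_bands, H, W)
        return self.net(rgb)

model = RGBToHypercube(n_bands=150)
fake_rgb = torch.rand(1, 3, 128, 128)   # placeholder RGB batch
hypercube = model(fake_rgb)
print(hypercube.shape)                  # torch.Size([1, 150, 128, 128])
```

In practice such a network would be trained on paired RGB and measured hyperspectral images so that the predicted hypercube approximates the true spectral signature at each pixel.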
They calibrated the spectral model with reconstructed hyperspectral images of sweet potatoes, achieving over 70% accuracy in predicting soluble solid content and 88% accuracy in predicting dry matter content, a significant improvement over previous studies.
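As a rough illustration of that calibration step, the sketch below regresses a chemical attribute on spectra extracted from reconstructed hypercubes using partial least squares regression, a common chemometric model. The article does not specify the regression method or dataset the team used, so the model choice, array shapes, and values here are placeholders.

```python
# Hypothetical calibration: predict dry matter (%) from per-sample mean spectra
# extracted from reconstructed hypercubes. PLS regression is a common choice;
# the actual model used in the studies is not named in this article.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

n_samples, n_bands = 200, 150                   # illustrative dataset size
spectra = np.random.rand(n_samples, n_bands)    # placeholder mean spectra
dry_matter = np.random.rand(n_samples) * 30     # placeholder lab reference values

X_train, X_test, y_train, y_test = train_test_split(
    spectra, dry_matter, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=10)
pls.fit(X_train, y_train)
print("R^2:", r2_score(y_test, pls.predict(X_test).ravel()))
```

With real paired spectra and laboratory reference measurements, the same workflow yields the kind of prediction accuracy reported above.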
In a third paper, the research team applied deep learning methods to reconstruct hyperspectral images for predicting chick embryo mortality, which has applications for the egg and hatchery industry. They explored different techniques and made recommendations for the most accurate approach.
“Our results show great promise for revolutionizing agricultural product quality assessment. By reconstructing detailed chemical information from simple RGB images, we're opening new possibilities for affordable, accessible analysis. While challenges remain in scaling this technology for industrial use, the potential to transform quality control across the agricultural sector makes this a truly exciting endeavor,” Kamruzzaman concluded.
References:
The first paper, “Deep learning-based hyperspectral image reconstruction for quality assessment of agro-product,” is published in the Journal of Food Engineering [DOI: 10.1016/j.jfoodeng.2024.112223]. Authors are Md. Toukir Ahmed, Ocean Monjur, and Mohammed Kamruzzaman.
The second paper, “Comparative analysis of hyperspectral image reconstruction using deep learning for agricultural and biological applications,” is published in Results in Engineering [DOI: 10.1016/j.rineng.2024.102623]. Authors are Md. Toukir Ahmed, Arthur Villordon, and Mohammed Kamruzzaman.
Both studies were funded by the USDA Agricultural Marketing Service through the Specialty Crop Multistate Program grant AM21SCMPMS1010.
The third paper, “Hyperspectral image reconstruction for predicting chick embryo mortality towards advancing egg and hatchery industry,” is published in Smart Agricultural Technology [DOI: 10.1016/j.atech.2024.100533]. Authors are Md. Toukir Ahmed, Md. Wadud Ahmed, Ocean Monjur, Jason Emmert, Girish Chowdhary, and Mohammed Kamruzzaman. Funding was provided by the USDA National Institute of Food and Agriculture, Award #2023-67015-39154.
Date: 08.12.2025