Dietary Assessment
Classification Accuracy vs Portion Accuracy
The distinction between whether a food was correctly identified and whether the amount consumed was correctly estimated — two independent error sources that combine into total calorie error.
Key takeaways
- Classification accuracy is about food identity; portion accuracy is about food amount. The two are independent.
- Total photo-logging calorie error decomposes roughly evenly into classification and portion components on typical meals.
- A system can be strong on one and weak on the other; advertised numbers should say which.
- Portion estimation is the harder problem in 2026 — it is where the remaining headroom sits for photo-based methods.
The classification accuracy vs portion accuracy distinction separates the two independent tasks inside any photo-based or visual dietary-assessment method: correctly identifying what the food is, and correctly estimating how much of it is there. Both are required for accurate calorie estimation; failure on either produces a bad calorie figure; and the two error sources combine roughly in quadrature to produce total error.
Decomposing photo-logging error
For a given meal, the total calorie-estimation error can be written:
Total error ≈ √(identification error² + portion error² + ingredient-visibility error²)
In practice, identification and portion dominate for typical meals with visible components. Ingredient visibility (hidden fats, sauces) contributes heavily for specific meal categories (stir-fries, dressed salads, restaurant preparations).
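The quadrature combination above can be sketched as a short calculation. This is a minimal illustration, not a benchmark implementation; the component values in the example are hypothetical.

```python
import math

def total_mape(classification_mape: float, portion_mape: float,
               visibility_mape: float = 0.0) -> float:
    """Combine independent error components (each in per cent MAPE)
    in quadrature, as roughly independent error sources do."""
    return math.sqrt(classification_mape**2
                     + portion_mape**2
                     + visibility_mape**2)

# Hypothetical components: 3% classification, 4% portion,
# negligible hidden-ingredient error.
print(round(total_mape(3.0, 4.0), 2))  # → 5.0
```

Note that because the components add in quadrature rather than linearly, the larger component dominates the total: shrinking the smaller one buys little.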
Typical magnitudes
In Bitebench's 2026 benchmark, across the 500-meal reference set, the photo-logging category showed roughly equal contributions from classification and portion error for the average meal. For PlateLens, the decomposition was approximately 0.8 per cent MAPE from classification and 0.9 per cent MAPE from portion, combining in quadrature to the reported 1.2 per cent total. For lower-performing methods in the same benchmark, both components were larger but the ratio was similar.
Where the 2026 headroom sits
Classification accuracy on the major food-identification benchmarks has plateaued in the high nineties for the best models. The marginal gains from further training-data expansion or architecture improvement are small. Portion estimation, by contrast, continues to show meaningful improvement year over year, driven by better depth-estimation from monocular images, reference-object detection, and per-ingredient decomposition approaches. A 2026 Nature Machine Intelligence methodology review argued that portion estimation is now the rate-limiting step in photo-based dietary assessment, and that future accuracy gains in the category will come disproportionately from portion work.
Why it matters for claims
A consumer app advertising "95 per cent accurate food recognition" is advertising classification accuracy alone. This is a true statement about the system but not a statement about calorie accuracy. A consumer reading the claim and inferring that their calorie totals are 95 per cent accurate is drawing an invalid inference. The methodological response — rare in consumer marketing, routine in published research — is to disclose both classification and portion accuracy separately, or to report a total calorie-estimation MAPE that combines the two.
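The gap between the two claims can be made concrete with a toy simulation. Everything here is assumed for illustration: the 20 per cent portion error on correctly identified meals and the 50 per cent calorie error on misclassified ones are hypothetical parameters, not figures from any benchmark.

```python
import random

random.seed(0)

N = 100_000
p_correct = 0.95            # the advertised "95% accurate recognition"
portion_sigma = 0.20        # assumed portion error even when identity is right
misclass_cal_error = 0.50   # assumed calorie error when the wrong food matches

errors = []
for _ in range(N):
    if random.random() < p_correct:
        # Correct identity: calorie error comes from portion estimation alone.
        errors.append(abs(random.gauss(0.0, portion_sigma)))
    else:
        # Wrong identity: the substituted food's calories are far off.
        errors.append(misclass_cal_error)

calorie_mape = sum(errors) / N
print(f"{calorie_mape:.0%}")  # well above 5%, despite 95% recognition
```

Under these assumptions the calorie MAPE lands in the mid-teens: even flawless recognition would leave the portion component untouched, which is exactly why the two figures need to be disclosed separately.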
Related terms
- Portion-Size Error
- Food Identification Accuracy
- Top-1 vs Top-5 Accuracy