This is a Food Recognition App I built by embedding a TensorFlow model in a Swift iOS app. A basic tutorial is included, and I'm happy to add more if there are enough requests!
*Notice the recognition confidence at the bottom left.
- TensorFlow
- Alamofire
- Node.js Backend (Not Included)
Donations will be put back into tutorials (but please don't feel like it's necessary).
This app imports a TensorFlow image recognition model into a Swift app and runs the model every third of a second. The training dataset used to create the model can be found in the Food 101 Keras Dataset or the Food Cam Japanese Dataset.
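The inference loop described above can be sketched roughly as follows. This is a minimal illustration, not the app's actual code: `FoodModel` and `RecognitionLoop` are hypothetical names standing in for the imported TensorFlow graph and the view controller that drives it.

```swift
import Foundation
import CoreGraphics

// Hypothetical wrapper around the imported TensorFlow model,
// returning (label, confidence-percentage) pairs.
protocol FoodModel {
    func predict(_ image: CGImage) -> [(key: String, value: Double)]
}

final class RecognitionLoop {
    private var timer: Timer?
    private let model: FoodModel

    init(model: FoodModel) {
        self.model = model
    }

    // Run the model on the latest camera frame every third of a second.
    func start(frameProvider: @escaping () -> CGImage?) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 3.0, repeats: true) { [weak self] _ in
            guard let self = self, let frame = frameProvider() else { return }
            let guesses = self.model.predict(frame)
            // Hand the top guesses to the UI layer here.
            _ = guesses.first
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```

Running inference on a fixed `Timer` rather than on every camera frame keeps the UI responsive, since the model is far slower than the 30–60 fps video stream.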
Since robust training sets are essential to creating accurate models, I also built a script that pulls images from Flickr and adds them to the dataset (please feel free to reach out if you would like it). This project is built in conjunction with Morten-Just's Trainer Mac.
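The Flickr script itself is not included here, but a dataset puller along these lines can be sketched against Flickr's public `flickr.photos.search` REST endpoint. Everything below is illustrative, assuming you supply your own API key; the image URL format follows Flickr's documented static-image URL scheme.

```swift
import Foundation

// Decodes just the fields needed to build image URLs from a
// flickr.photos.search JSON response.
struct SearchResponse: Decodable {
    struct Photos: Decodable { let photo: [Photo] }
    struct Photo: Decodable {
        let id: String
        let secret: String
        let server: String
        // Medium-size image, per Flickr's static URL scheme.
        var url: URL {
            URL(string: "https://live.staticflickr.com/\(server)/\(id)_\(secret)_m.jpg")!
        }
    }
    let photos: Photos
}

// Builds a search request for a food keyword such as "ramen".
func searchURL(query: String, apiKey: String) -> URL {
    var components = URLComponents(string: "https://api.flickr.com/services/rest/")!
    components.queryItems = [
        URLQueryItem(name: "method", value: "flickr.photos.search"),
        URLQueryItem(name: "api_key", value: apiKey),
        URLQueryItem(name: "text", value: query),
        URLQueryItem(name: "format", value: "json"),
        URLQueryItem(name: "nojsoncallback", value: "1"),
    ]
    return components.url!
}
```

Usage: fetch `searchURL(query:apiKey:)` with `URLSession`, decode `SearchResponse`, then download each `photo.url` into the class folder for that food before training.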
This view contains the bulk of the code that links the video stream to the caloric information for the identified food.
if confidence > 0.10 {
    // Show the top two guesses with their confidence percentages.
    machineGuess2.text = "\(outPut[0].key): \(Int(outPut[0].value))%"
    machineGuess3.text = "\(outPut[1].key): \(Int(outPut[1].value))%"
    label = outPut[0].key
    secondLabel = outPut[1].key
}

// Change the trigger confidence in the Config file.
if confidence > Config.confidence {
    presentSeenObject(label: label)
}
}
Allows you to set the confidence variable, which determines when a food is confirmed (`static var confidence = 0.9`).
This is a collection view embedded in a table view, which creates a clean, attractive interface.
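The usual way to build this pattern is to give each table row its own horizontally scrolling collection view. The sketch below shows the skeleton; class and reuse-identifier names are illustrative, not taken from the project.

```swift
import UIKit

// A table view cell that hosts a horizontal collection view,
// so each table row scrolls its own items sideways.
final class RowCell: UITableViewCell, UICollectionViewDataSource {
    private let collectionView: UICollectionView = {
        let layout = UICollectionViewFlowLayout()
        layout.scrollDirection = .horizontal
        return UICollectionView(frame: .zero, collectionViewLayout: layout)
    }()

    // Items for this row; reload when the table configures the cell.
    var items: [String] = [] {
        didSet { collectionView.reloadData() }
    }

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        collectionView.dataSource = self
        collectionView.register(UICollectionViewCell.self, forCellWithReuseIdentifier: "item")
        collectionView.frame = contentView.bounds
        collectionView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        contentView.addSubview(collectionView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func collectionView(_ collectionView: UICollectionView,
                        numberOfItemsInSection section: Int) -> Int {
        items.count
    }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        collectionView.dequeueReusableCell(withReuseIdentifier: "item", for: indexPath)
    }
}
```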
### Last Points
This is a fairly large project with a backend written in Node.js (not included), so it may not work as expected without some dev time. Please let me know your thoughts!