A computer learns about ingredients and recipes: What else would you put on a peanut butter sandwich? by Jeremy Cherfas, from Eat This Podcast

Recommendation engines are everywhere. They let Netflix suggest shows you might want to watch. They let Spotify build you a personalised playlist of music you will probably like. They turn your smartphone into a source of endless hilarity and mirth. And, of course, there’s IBM’s Watson, recommending all sorts of “interesting” new recipes. As part of his PhD project on machine learning, Jaan Altosaar decided to use a new mathematical technique to build his own recipe recommendation engine.

The technique is similar to the kind of natural language processing that powers predictive text on a phone, and one of the attractions of using food instead of English is that there are only 2000–3000 ingredients to worry about, instead of more than 150,000 words.

The results so far are fun and intriguing, and can only get better.
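The core idea, as I understand it from the episode, is to treat each recipe's ingredient list the way a language model treats a sentence and to learn a vector for every ingredient from the company it keeps. As a rough illustration only (this is my own sketch, not Altosaar's code), here is roughly what that might look like with gensim's Word2Vec; the toy recipes and ingredient names are invented:

from gensim.models import Word2Vec  # assumes gensim 4.x

# Treat each recipe as a "sentence" whose "words" are its ingredients.
# These tiny example recipes are invented purely for illustration.
recipes = [
    ["peanut_butter", "bread", "honey"],
    ["peanut_butter", "bread", "jelly"],
    ["flour", "butter", "sugar", "egg"],
    ["flour", "butter", "sugar", "chocolate"],
    ["soy_sauce", "ginger", "garlic", "rice"],
    ["soy_sauce", "ginger", "garlic", "noodles"],
]

# Learn small skip-gram embeddings. With only ~2,000-3,000 real ingredients,
# the vocabulary is tiny compared with the 150,000+ words of English.
model = Word2Vec(sentences=recipes, vector_size=16, window=5,
                 min_count=1, sg=1, epochs=200)

# Ingredients that occur in similar contexts end up near one another,
# which is what lets an engine suggest substitutions or pairings.
print(model.wv.most_similar("peanut_butter", topn=3))

With real data one would feed in tens or hundreds of thousands of recipes, and nearest neighbours in the resulting vector space become candidate pairings for that peanut butter sandwich.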

Back in February I had retweeted something interesting from physicist and information theorist Michael Nielsen:

“Augmented cooking with machine intelligence”, with interesting remarks on generating food analogies… https://t.co/UluYk6p8TV
— michael_nielsen (@michael_nielsen) February 2, 2017

I found the article it linked so interesting that, after some brief conversation around it, I recommended it to my then-new friend Jeremy Cherfas, whose Eat This Podcast I had just recently started to enjoy. Mostly I thought he would find it as interesting as I did; I hardly expected he'd turn it into a podcast episode. I've been plowing through back episodes in his catalog, but this morning I ran out of downloaded episodes in the car, so I streamed the most recent one and found a lovely surprise: a podcast produced from a tip I made.
While he surely must have been producing the episode for some time before I started supporting the podcast on Patreon last week, I must say that having an episode made from one of my tips is the best backer thank-you I've ever received from a crowdfunded project.
Needless to say, I found the subject fascinating. In part it reminded me of a section of Hervé This's book The Science of the Oven (eventually I'll get around to posting a review with more thoughts) and some of his prior research, which I was apparently reading on Christmas Day this past year. On page 118 of the text, This discusses the classic French sauces of Escoffier's students Louis Saulnier and Théodore Gringoire [1] and notes that a physical-chemical analysis shows there to be only twenty-three kinds of them. He continues:

A system that I introduced during the European Conference on Colloids and Interfaces in 2002 [2] offers a new classification, based on the physical chemical structure of the sauce. In it, G indicates a gas, E an aqueous solution, H a fat in the liquid state, and S a solid. These “phases” can be dispersed (symbol /), mixed (symbol +), superimposed (symbol θ), included (symbol @). Thus, veal stock is a solution, which is designated E. Bound veal stock, composed of starch granules swelled by the water they have absorbed, dispersed in an aqueous solution, is thus described by the formula (E/S)/E.

This goes on to describe in a bit more detail how the scientist-cook could then create a vector space of all combinations of foods from a physical-state perspective. A classification system like this could be expanded, bolted on top of the database created by Jaan Altosaar, and used to provide even more realistic recipes of the type discussed in the podcast. The combinatorics of the problem are incredibly large, but my guess is that physical constraints shrink the space of workable solutions enormously in actual practice. It's somewhat like the huge number of combinations of A, C, T, and G one could imagine for our DNA, of which only a vastly smaller subset is ever found in a living human being.
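To make the combinatorial point concrete, here is a toy sketch of my own (not from This's paper) that enumerates formulas built from the four phase symbols and four operators quoted above, up to a small nesting depth. Even at depth two the count runs into the tens of thousands, which is why physical constraints must prune the real space so heavily:

from itertools import product

# Hervé This's four phase symbols and four combining operators, as quoted above.
PHASES = ["G", "E", "H", "S"]        # gas, aqueous solution, liquid fat, solid
OPERATORS = ["/", "+", "θ", "@"]     # dispersed, mixed, superimposed, included

def formulas(depth):
    """All formulas using at most `depth` nested operator applications."""
    if depth == 0:
        return set(PHASES)
    smaller = formulas(depth - 1)
    results = set(smaller)
    for left, op, right in product(smaller, OPERATORS, smaller):
        results.add(f"({left}){op}({right})")
    return results

# Bound veal stock, (E/S)/E in This's notation, lives at depth two;
# counting everything at each depth shows how fast the space grows.
for d in range(3):
    print(d, len(formulas(d)))      # 4, then 68, then 18,564

Only a sliver of those formulas corresponds to anything one could actually cook, which is the same point as the DNA analogy above.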
Small World
The additional byproduct of catching this episode was that it finally reminded me why the name Jaan Altosaar seemed so familiar when I read his article. It turns out I know Jaan and some of his previous work. Back in 2014 I had corresponded with him regarding his fantastic science news site Useful Science, which was just then starting. While I was digging up the connection, I realized that my old friend Sol Golomb had also referenced Jaan to me, via Mark Wilde, for some papers he suggested I read.

References

[1]
T. Gringoire and L. Saulnier, Le répertoire de la cuisine. Dupont et Malgat, 1914.

[2]
H. This, “La gastronomie moléculaire,” Sci Aliments, vol. 23, no. 2, pp. 187–198, 2003 [Online]. Available: http://sda.revuesonline.com/article.jsp?articleId=2577

Author: Chris Aldrich

I’m a biomedical and electrical engineer with interests in information theory, complexity, evolution, genetics, signal processing, theoretical mathematics, and big history.

I’m also a talent manager-producer-publisher in the entertainment industry with expertise in representation, distribution, finance, production, content delivery, and new media.