RecipeGM: A Hierarchical Recipe Generation Model

This paper applies hierarchical convolutional neural networks with self-attention mechanisms to the task of generating recipes from a given set of ingredients that the recipe should contain. We compare this model, RecipeGM, to an LSTM baseline and to RecipeGPT using several metrics and show that our model outperforms even RecipeGPT in some cases. Furthermore, this work discusses suitable evaluation techniques for recipe generation and highlights weaknesses of some metrics currently in use.
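To make the architectural idea concrete, the sketch below shows one decoder block that combines a causal gated convolution with masked self-attention, conditioned on ingredient tokens prepended to the recipe sequence. This is a minimal illustration only: the class name, layer sizes, number of heads, and the conditioning strategy are assumptions for exposition, not the paper's actual implementation.

```python
# Minimal sketch of a convolutional decoder block with self-attention.
# All names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvSelfAttentionBlock(nn.Module):
    """Causal gated 1-D convolution followed by masked self-attention."""

    def __init__(self, d_model: int = 256, kernel_size: int = 3, n_heads: int = 4):
        super().__init__()
        self.pad = kernel_size - 1  # left-pad so the convolution stays causal
        self.conv = nn.Conv1d(d_model, 2 * d_model, kernel_size)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) embeddings of the ingredient tokens
        # followed by the recipe tokens generated so far.
        residual = x
        h = F.pad(x.transpose(1, 2), (self.pad, 0))      # (batch, d_model, seq_len + pad)
        h = F.glu(self.conv(h), dim=1).transpose(1, 2)   # gated linear unit, back to (batch, seq_len, d_model)
        h = self.norm1(h + residual)
        # Causal mask: position i may only attend to positions <= i.
        seq_len = h.size(1)
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=h.device), diagonal=1
        )
        attended, _ = self.attn(h, h, h, attn_mask=mask)
        return self.norm2(h + attended)


# Usage: stack several such blocks to form a deeper decoder.
block = ConvSelfAttentionBlock()
tokens = torch.randn(2, 12, 256)   # e.g. 5 ingredient tokens + 7 recipe tokens per batch item
out = block(tokens)
print(out.shape)                   # torch.Size([2, 12, 256])
```

In a full model, several of these blocks would be stacked and followed by a projection onto the vocabulary; the ingredient set would be encoded (or prepended as above) so that generation is conditioned on it.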

[1] Alden J. Moe. Cohesion, Coherence, and the Comprehension of Text, 1979.

[2] Yoshua Bengio, et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.

[3] Antonio Torralba, et al. Recipe1M+: A Dataset for Learning Cross-Modal Embeddings for Cooking Recipes and Food Images, 2021, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[4] Kilian Q. Weinberger, et al. BERTScore: Evaluating Text Generation with BERT, 2019, ICLR.

[5] Yejin Choi, et al. Globally Coherent Text Generation with Neural Checklist Models, 2016, EMNLP.

[6] Verena Rieser, et al. Why We Need New Evaluation Metrics for NLG, 2017, EMNLP.

[7] Salim Roukos, et al. Bleu: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.

[8] Y. Nesterov. A method for solving the convex programming problem with convergence rate O(1/k^2), 1983.

[9] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.

[10] Yejin Choi, et al. The Curious Case of Neural Text Degeneration, 2019, ICLR.

[11] Myle Ott, et al. fairseq: A Fast, Extensible Toolkit for Sequence Modeling, 2019, NAACL.

[12] Wang Ling, et al. Reference-Aware Language Models, 2016, EMNLP.

[13] Thibault Sellam, et al. BLEURT: Learning Robust Metrics for Text Generation, 2020, ACL.

[14] Philips Kokoh Prasetyo, et al. RecipeGPT: Generative Pre-training Based Cooking Recipe Generation and Evaluation System, 2020, WWW.

[15] Ramesh C. Jain, et al. A Survey on Food Computing, 2018, ACM Comput. Surv.

[16] Deborah L. McGuinness, et al. FoodKG: A Semantics-Driven Knowledge Graph for Food Recommendation, 2019, SEMWEB.

[17] H. Chatley. Cohesion, 1921, Nature.

[18] Kush R. Varshney, et al. A big data approach to computational creativity: The curious case of Chef Watson, 2019, IBM J. Res. Dev.

[19] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.

[20] Vikash Singh, et al. Replace or Retrieve Keywords In Documents at Scale, 2017, arXiv.

[21] Yann Dauphin, et al. Hierarchical Neural Story Generation, 2018, ACL.

[22] Christopher D. Manning, et al. Do Massively Pretrained Language Models Make Better Storytellers?, 2019, CoNLL.

[23] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.

[24] Tajinder Singh, et al. Food Image to Cooking Instructions Conversion Through Compressed Embeddings Using Deep Learning, 2019, IEEE 35th International Conference on Data Engineering Workshops (ICDEW).

[25] Yanjun Li, et al. Food Recipe Alternation and Generation with Natural Language Processing Techniques, 2020, IEEE 36th International Conference on Data Engineering Workshops (ICDEW).

[26] Jianfeng Gao, et al. A Diversity-Promoting Objective Function for Neural Conversation Models, 2015, NAACL.

[27] Chin-Yew Lin, et al. ROUGE: A Package for Automatic Evaluation of Summaries, 2004, ACL.

[28] Shuyang Li, et al. Generating Personalized Recipes from Historical User Preferences, 2019, EMNLP.

[29] Percy Liang, et al. Unifying Human and Statistical Evaluation for Natural Language Generation, 2019, NAACL.