Named Tensor Notation

We propose a notation for tensors with named axes, which relieves the author, reader, and future implementers of machine learning models from the burden of keeping track of the order of axes and the purpose of each. The notation makes it easy to lift operations on low-order tensors to higher-order ones, for example, from images to minibatches of images, or from an attention mechanism to multiple attention heads. After a brief overview and formal definition of the notation, we illustrate it through several examples from modern machine learning, from building blocks like attention and convolution to full models like Transformers and LeNet. We then discuss differential calculus in our notation and compare it with some alternative notations. Our proposals build on ideas from many previous papers and software libraries. We hope that our notation will encourage more authors to use named tensors, resulting in clearer papers and more precise implementations.
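The key idea of lifting by axis name can be sketched in code. The following is a minimal illustration (a hypothetical helper, not the paper's formal notation or any particular library's API): a tensor is represented as a NumPy array paired with a tuple of axis names, and elementwise operations align axes by name rather than by position, so the same call works unchanged on one image or on a minibatch of images.

```python
import numpy as np

def align(arr, arr_names, target_names):
    """Transpose `arr` (axes labeled by `arr_names`) to match `target_names`,
    inserting length-1 axes for any names the tensor lacks."""
    have = list(arr_names)
    for name in target_names:
        if name not in have:
            arr = np.expand_dims(arr, axis=-1)  # add a broadcastable axis
            have.append(name)
    perm = [have.index(name) for name in target_names]
    return np.transpose(arr, perm)

def named_add(a, a_names, b, b_names):
    """Add two named tensors; the result's axes are the union of their axes."""
    names = list(dict.fromkeys(list(a_names) + list(b_names)))
    return align(a, a_names, names) + align(b, b_names, names), names

# One image with axes (height, width), plus a per-column bias:
img = np.zeros((2, 3))
bias = np.array([1.0, 2.0, 3.0])
out, names = named_add(img, ("height", "width"), bias, ("width",))

# Lifting: the identical call works on a minibatch (batch, height, width),
# because axes are matched by name, not by position.
batch = np.zeros((4, 2, 3))
out2, names2 = named_add(batch, ("batch", "height", "width"), bias, ("width",))
```

Here `out` has shape `(2, 3)` and `out2` has shape `(4, 2, 3)`; no code had to change to handle the extra `batch` axis, which is the convenience the notation formalizes on paper.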
