It's All Just Vectorization: einx, a Universal Notation for Tensor Operations
Florian Fervers, Sebastian Bullinger, Christoph Bodensteiner, Michael Arens
We introduce einx, a universal notation for tensor operations, and provide a Python implementation.
Abstract
Tensor operations represent a cornerstone of modern scientific computing. However, the Numpy-like notation adopted by the predominant tensor frameworks is often difficult to read and write, and is prone to so-called shape errors, in part due to inconsistent rules across a large, complex collection of operations. Alternatives like einsum and einops have gained popularity, but are inherently restricted to a few operations and lack the generality required for a universal model of tensor programming.
To derive a better paradigm, we revisit vectorization as a function for transforming tensor operations, and use it both to lift lower-order operations to higher-order operations and to conceptually decompose higher-order operations into lower-order operations and their vectorization.
Building on the universal nature of vectorization, we introduce einx, a universal notation for tensor operations. It uses declarative, pointful expressions that are defined by analogy with loop notation and represent the vectorization of tensor operations. The notation reduces the large APIs of existing frameworks to a small set of elementary operations, applies consistent rules across all operations, and enables a clean, readable and writable representation in code. We provide an implementation of einx that is embedded in Python and integrates seamlessly with existing tensor frameworks: https://github.com/fferflo/einx
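The vectorization transform described in the abstract can be sketched in plain NumPy: a lower-order operation (here, a dot product of two vectors) is lifted to a higher-order, batched operation by mapping it over leading batch axes. This is a minimal conceptual sketch, not einx's actual implementation:

```python
import numpy as np

def vectorize(op, n_batch_axes=1):
    """Lift an operation on lower-order tensors to one that maps it
    over the leading batch axes of its inputs (a conceptual sketch of
    the vectorization transform, not einx's implementation)."""
    def vectorized(*tensors):
        batch_shape = tensors[0].shape[:n_batch_axes]
        # Flatten all batch axes into one, apply op per batch element,
        # then restore the original batch shape.
        flat = [t.reshape(-1, *t.shape[n_batch_axes:]) for t in tensors]
        results = [op(*args) for args in zip(*flat)]
        return np.stack(results).reshape(*batch_shape, *np.shape(results[0]))
    return vectorized

# A lower-order operation: dot product of two vectors.
dot = lambda x, y: np.dot(x, y)

# Its vectorization computes one dot product per batch element.
batched_dot = vectorize(dot)
x = np.arange(6.0).reshape(2, 3)   # 2 vectors of length 3
y = np.ones((2, 3))
print(batched_dot(x, y))           # equivalent to (x * y).sum(axis=-1)
```

Frameworks expose this transform directly (e.g. `np.vectorize` or JAX's `vmap`); the paper's point is that many higher-order operations in existing APIs can conversely be read as the vectorization of a simpler elementary operation.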
einx is a universal notation for tensor operations based on vectorization, reducing the large APIs of existing frameworks to a small, consistent set of elementary operations.
- Universal notation for tensor operations, grounded in vectorization as a function for transforming tensor operations
- Consistent rules applied across all operations, replacing large Numpy-like APIs with a small set of elementary operations
- Clean, readable code representation, interpreted by analogy with loop notation
- Vectorization
- Declarative notation
- Graph representations
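The "interpretation by analogy with loop notation" can be illustrated concretely: a declarative expression such as `a [b] -> a` (following einx's bracket convention, where brackets mark the axis the elementary operation acts on) corresponds to an explicit loop over the unbracketed axes. A sketch of that correspondence in NumPy; the loop is the conceptual reading, not how the operation is executed:

```python
import numpy as np

x = np.arange(12.0).reshape(3, 4)

# Declarative reading of "a [b] -> a": reduce over the bracketed axis b.
declarative = x.sum(axis=1)

# Loop reading: iterate over every index of the unbracketed axis a,
# applying the elementary reduction to each slice along b.
loop = np.empty(x.shape[0])
for a in range(x.shape[0]):
    loop[a] = x[a, :].sum()

print(np.allclose(declarative, loop))  # the two readings agree
```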
Authors did not state explicit limitations.
Authors did not state explicit future directions.
Author keywords
- tensor notation
- tensor programming
- einx
- einsum
- einops