by N. Menet, M. Hersche, G. Karunaratne, L. Benini, A. Sebastian, A. Rahimi
Abstract:
With the advent of deep learning, progressively larger neural networks have been designed to solve complex tasks. We take advantage of these capacity-rich models to lower the cost of inference by exploiting computation in superposition. To reduce the computational burden per input, we propose Multiple-Input-Multiple-Output Neural Networks (MIMONets) capable of handling many inputs at once. MIMONets augment various deep neural network architectures with variable binding mechanisms to represent an arbitrary number of inputs in a compositional data structure via fixed-width distributed representations. Accordingly, MIMONets adapt nonlinear neural transformations to process the data structure holistically, leading to a speedup nearly proportional to the number of superposed input items in the data structure. After processing in superposition, an unbinding mechanism recovers each transformed input of interest. MIMONets also provide a dynamic trade-off between accuracy and throughput by instantaneous on-demand switching between a set of accuracy-throughput operating points, yet within a single set of fixed parameters. We apply the concept of MIMONets to both CNN and Transformer architectures, resulting in MIMOConv and MIMOFormer, respectively. Empirical evaluations show that MIMOConv achieves a ≈2–4× speedup at an accuracy delta within [+0.68, −3.18]% compared to WideResNet CNNs on CIFAR-10 and CIFAR-100. Similarly, MIMOFormer can handle 2–4 inputs at once while maintaining a high average accuracy within a [−1.07, −3.43]% delta on the Long Range Arena benchmark. Finally, we provide mathematical bounds on the interference between superposition channels in MIMOFormer. Our code is available at https://github.com/IBM/multiple-input-multiple-output-nets.
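The binding/superposition/unbinding idea underlying MIMONets can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; it assumes a simple element-wise binding with random bipolar keys (one common variable-binding scheme for fixed-width distributed representations) purely to show why unbinding recovers an individual item from a superposition up to crosstalk noise.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 4096, 3  # vector width, number of superposed inputs (illustrative values)

# Random bipolar keys serve as (quasi-orthogonal) binding vectors.
keys = rng.choice([-1.0, 1.0], size=(n, d))
inputs = rng.standard_normal((n, d))

# Bind each input to its key (element-wise) and superpose by summation:
# a single fixed-width vector now represents all n inputs.
s = np.sum(keys * inputs, axis=0)

# Unbind channel 0: multiplying by the same key recovers input 0 plus
# crosstalk from the other channels, which averages out for large d.
recovered = keys[0] * s

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

sim_target = cos(recovered, inputs[0])   # high: the intended item
sim_other = cos(recovered, inputs[1])    # near zero: a different item
```

For n superposed channels the expected cosine similarity to the target scales roughly as 1/√n, which is why accuracy degrades gracefully as more inputs share the computation.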
Reference:
MIMONets: Multiple-Input-Multiple-Output Neural Networks Exploiting Computation in Superposition. N. Menet, M. Hersche, G. Karunaratne, L. Benini, A. Sebastian, A. Rahimi. In Proc. Neural Information Processing Systems (NeurIPS), volume 36, 2023.
Bibtex Entry:
@inproceedings{menet2023mimo,
title = {MIMONets: Multiple-Input-Multiple-Output Neural Networks Exploiting Computation in Superposition},
author = {Menet, Nicolas and Hersche, Michael and Karunaratne, Geethan and Benini, Luca and Sebastian, Abu and Rahimi, Abbas},
booktitle = {Proc. Neural Information Processing Systems (NeurIPS)},
volume = {36},
year = {2023},
month = {December},
pdf = {https://arxiv.org/pdf/2312.02829}
}