Draft for a paper.
Previous research into the expressiveness of neural architectures, i.e., their capacity to perform arbitrary computation, predominantly assumes that these architectures will learn during training which computations to perform, out of all possible computations. We question the practicality of this assumption given recent results on the limitations of Deep Learning. Instead, we propose a simple neural architecture, the Synthesizer, that outfits other neural architectures with the ability to memorize and execute programs, thus extending any implicit neural computational abilities with explicit ones. Our theoretical and empirical analysis finds that the Synthesizer can learn and execute arbitrary programs in a practical manner while still maintaining standard Deep Learning abilities. Preliminary experiments show that the Synthesizer has the novel ability to learn and execute "soft programs" that merge computational and Deep Learning abilities, displaying a kind of "thinking fast and slow".
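To make the core idea concrete, the following is a minimal, purely illustrative sketch of the intended behavior: a wrapper that pairs a learned (implicit) model with an explicit program memory, dispatching between the two. The `Synthesizer` class, its `memorize` method, and the toy learned model are hypothetical names for exposition, not the architecture's actual interface.

```python
from typing import Any, Callable, Dict, Optional

class Synthesizer:
    """Hypothetical sketch: augments a learned (implicit) model with an
    explicit program memory that can be written to and executed."""

    def __init__(self, learned_model: Callable[[Any], Any]):
        self.learned_model = learned_model       # implicit, learned computation
        self.programs: Dict[str, Callable] = {}  # explicit, memorized programs

    def memorize(self, name: str, program: Callable) -> None:
        # Store an explicit program for later execution.
        self.programs[name] = program

    def __call__(self, x: Any, program: Optional[str] = None) -> Any:
        # "Thinking slow": execute an explicit program when one is named...
        if program is not None and program in self.programs:
            return self.programs[program](x)
        # ..."thinking fast": otherwise fall back on the learned model.
        return self.learned_model(x)

# Usage: a stand-in learned model (here a plain function) plus an
# explicitly memorized sorting program.
net = Synthesizer(learned_model=lambda xs: [v * 2 for v in xs])
net.memorize("sort", sorted)
fast = net([3, 1, 2])          # implicit path (learned model)
slow = net([3, 1, 2], "sort")  # explicit path (memorized program)
```

In a real instantiation the learned model would be a trained network and program dispatch would itself be differentiable; the hard dispatch above is only to illustrate the implicit/explicit split.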