[1912.01412v1] Deep Learning for Symbolic Mathematics
We show that a simple transformer model trained on these datasets can perform extremely well both at computing function integrals and at solving differential equations, outperforming state-of-the-art mathematical frameworks like Matlab or Mathematica that rely on a large number of algorithms and heuristics, and on a complex implementation (Risch, 1970).

Neural networks have a reputation for being better at solving statistical or approximate problems than at performing calculations or working with symbolic data. In this paper, we show that they can be surprisingly good at more elaborate tasks in mathematics, such as symbolic integration and solving differential equations. We propose a syntax for representing mathematical problems, and methods for generating large datasets that can be used to train sequence-to-sequence models. We achieve results that outperform commercial Computer Algebra Systems such as Matlab or Mathematica.
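The abstract's "syntax for representing mathematical problems" flattens expression trees into token sequences that a sequence-to-sequence model can consume. Below is a minimal sketch of such a flattening into prefix (Polish) notation; the nested-tuple tree format and the token names (`add`, `mul`, `pow`, `cos`) are illustrative assumptions, not the paper's exact vocabulary.

```python
def to_prefix(node):
    """Flatten a nested-tuple expression tree into a prefix token list."""
    if isinstance(node, tuple):          # internal node: (operator, child1[, child2])
        op, *children = node
        tokens = [op]
        for child in children:
            tokens.extend(to_prefix(child))
        return tokens
    return [str(node)]                   # leaf: a variable or a constant

# Example: 3 * x^2 + cos(x)  as the tree  (+ (* 3 (pow x 2)) (cos x))
expr = ("add", ("mul", 3, ("pow", "x", 2)), ("cos", "x"))
print(to_prefix(expr))   # ['add', 'mul', '3', 'pow', 'x', '2', 'cos', 'x']
```

Prefix notation needs no parentheses and is unambiguous given each operator's arity, which keeps the token vocabulary small for the model.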
Figure 1: Number of trees and expressions for different numbers of operators and leaves. p1 and p2 correspond to the number of unary and binary operators respectively, and L to the number of possible leaves. The bottom two curves correspond to the number of binary and unary-binary trees (enumerated by Catalan and Schroeder numbers respectively). The top two curves represent the associated number of expressions. We observe that adding leaves and binary operators significantly increases the size of the problem space.

Figure 2: Distribution of input and output lengths for different integration datasets. The FWD generator produces short problems with long solutions. Conversely, the BWD generator creates long problems with short solutions. The IBP approach stands in the middle, and generates short problems with short solutions.
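The tree counts in Figure 1 can be reproduced with a short dynamic program over the choice of root: with n internal nodes, a unary root leaves n-1 nodes in its subtree, and a binary root splits n-1 nodes between two subtrees. A sketch, assuming (as the caption states) that binary trees are counted by Catalan numbers and unary-binary trees by large Schroeder numbers:

```python
def count_trees(n_max, with_unary):
    """counts[n] = number of trees with n internal operator nodes."""
    counts = [0] * (n_max + 1)
    counts[0] = 1                                    # a single leaf
    for n in range(1, n_max + 1):
        total = counts[n - 1] if with_unary else 0   # unary root option
        # binary root: k internal nodes on the left, n-1-k on the right
        total += sum(counts[k] * counts[n - 1 - k] for k in range(n))
        counts[n] = total
    return counts

print(count_trees(4, with_unary=False))  # Catalan numbers:        [1, 1, 2, 5, 14]
print(count_trees(4, with_unary=True))   # large Schroeder numbers: [1, 2, 6, 22, 90]
```

The number of expressions then grows even faster, since each unary node can be labeled with any of p1 operators, each binary node with any of p2, and each leaf with any of L choices.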