Two Facebook researchers from Paris have built a neural network capable of solving complex mathematical equations, including problems from calculus. Their work is described in a paper dated December 2 and published on arXiv (a repository of scientific preprints administered by Cornell University). This is another big step forward for neural networks.
If today’s students could get their hands on a copy of Facebook’s latest neural network, they could cheat their way through the third part of the calculus course. They could even solve the differential equation shown above in less than 30 seconds.
Well, okay, maybe this network won’t be replacing Wolfram Alpha anytime soon, but Facebook really has created a model that can solve complex math problems, rather than the simple arithmetic that AI models used to deal with.
This work is a huge leap forward in the ability of computers to understand mathematical logic. The research is outlined in a new paper, Deep Learning for Symbolic Mathematics, published on arXiv. Two Parisian Facebook scientists, Guillaume Lample and François Charton, led the work.
In the introduction to their paper, the scientists point out that neural networks have a poor reputation when it comes to calculus computations or working with symbolic data — values that you cannot add, multiply, or otherwise operate on as numbers.
“All data used by computers are numbers,” Lample and Charton told Popular Mechanics. “In most cases, they represent quantities such as the intensity of a color in an image or the sales volume of a product. But sometimes numbers are used as symbols to denote objects or classes. For example, a particular age group can be represented by a number.”
“This is what makes it difficult for neural networks to solve problems involving symbolic data,” they say. “They need to learn both the data and the symbolic rules.”
With their unique approach to teaching computers mathematical logic, Lample and Charton sidestepped this problem, allowing their neural network to process and solve calculus problems in about one second. In their paper, they argue that their neural network can outperform commercial computer algebra packages such as Matlab and Wolfram Mathematica, which industry professionals commonly use for computing integrals.
Why do we need differential equations?
In case it has been a while since you studied mathematics, a reminder: a differential equation is an equation involving one or more derivatives of some function. Such equations can be used, for example, to calculate the rate of change of a rapidly growing rabbit population, or perhaps the rate of decline in demand for a particular brand of sneakers.
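To make the definition concrete, here is a minimal sketch (not from the paper) that numerically checks a classic example: y(t) = e^(kt) solves the growth equation y′(t) = k·y(t), the kind of model behind that rabbit-population example. The function name and parameters are illustrative.

```python
import math

def ode_residual(k: float, t: float, h: float = 1e-6) -> float:
    """Check that y(t) = exp(k*t) solves y'(t) = k*y(t).

    Returns the absolute difference between a central finite-difference
    estimate of y'(t) and the right-hand side k*y(t); it should be ~0.
    """
    y = lambda s: math.exp(k * s)
    dy_dt = (y(t + h) - y(t - h)) / (2 * h)  # numerical derivative
    return abs(dy_dt - k * y(t))

print(ode_residual(k=0.5, t=1.0))  # a value very close to zero
```

A residual near zero confirms the exponential really does satisfy the differential equation at that point.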
An integral equation deals with an unknown function under an integral sign. If you’ve seen the movie Mean Girls and remember Lindsay Lohan’s character shouting “The limit does not exist!” during the math competition, you’re on the right track.
Lample and Charton say they decided to focus on differential equations and integrals for three main reasons:
These are complex problems that are commonly taught at universities and that are difficult for people to solve (machines aside).
Problems of this kind involve manipulating symbols, much like the operations performed in language. “So we thought it was a good target for the natural language processing models that we use,” they say.
Finally, these problems made it easy for the two researchers to generate a large set of problems and solutions to train their model and validate its answers.
The problem with neural networks
Neural networks use a biologically inspired approach to computing: their ability to solve problems is modeled on principles of the human brain. When we learn new associations or patterns, our brain builds connections between neurons. For example, when you see a cat, you notice that it has fur, two eyes, and four legs. Upon further examination, you notice that it is a small animal. Your brain makes a connection: it’s a cat, not a dog. All this time, neurons are exchanging electrical signals, building connections with each other. This is how we learn to recognize patterns.
Likewise, neural networks rely on layers of artificial “neurons” that mirror the neurons in our brain, except that these artificial neurons perform basic computations. When enough of them work together, the network as a whole can solve complex problems, even though each individual layer is configured to handle only one kind of operation.
With this in mind, neural networks are great at recognizing images (for example, the squares Facebook draws around the faces of friends so you can tag them); they are well suited to beating people at strategy games such as chess or Go, and they can even help autonomous vehicles identify potential road hazards and predict the behavior of nearby obstacles.
However, neural networks are not known for their ability to solve complex mathematical equations, such as calculus problems. This is because the way we recognize and write mathematical expressions does not map neatly onto how machines represent them.
It is difficult for neural networks to solve calculus problems because mathematical notation relies on conventions that make sense to humans but become burdensome for computers. For example, we write the expression x³, but it really means “x times x times x.” Thus, despite their apparent simplicity, expressions with exponents unfold into longer chains of simpler mathematical operations. This logic is obvious to humans, but machines have to learn it.
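As a toy illustration of that unfolding (this is not the authors’ code; the tuple representation and function name are invented for the example), a power node in a small expression tree can be rewritten as nested multiplications:

```python
def expand_pow(expr):
    """Recursively rewrite ('pow', base, n) nodes as nested multiplications.

    Expressions are nested tuples like ('mul', a, b) or ('pow', base, n);
    plain strings and numbers are leaves.
    """
    if not isinstance(expr, tuple):
        return expr
    op, *args = expr
    args = [expand_pow(a) for a in args]
    if op == "pow" and isinstance(args[1], int) and args[1] >= 1:
        base, n = args
        result = base
        for _ in range(n - 1):       # x**3 -> (x * x) * x
            result = ("mul", result, base)
        return result
    return (op, *args)

print(expand_pow(("pow", "x", 3)))
# ('mul', ('mul', 'x', 'x'), 'x')
```

The single symbol x³ becomes a chain of two multiplications — exactly the kind of hidden structure a machine has to learn to see.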
The same is true of differential and integral calculus problems, where shorthand notation also stands in for simpler expressions nested within a larger one. These problems contain patterns that computer systems can detect, but until now there has been no reliable way to exploit them.
Lample and Charton’s new method involves breaking complex expressions into their critical pieces. The scientists then train the neural network to find patterns of mathematical logic that are equivalent to integration and differentiation. This allows the software to solve such problems in its own distinctly machine-like way. The scientists then had the neural network evaluate new expressions that were not used in its training, and compared its results with those of other software such as Wolfram Mathematica and Matlab.
To do this, the duo decomposed the equations into smaller parts, representing each equation as a tree. Each leaf is a number, constant, or variable, and each internal node is an operator symbol: addition, multiplication, differentiation with respect to a variable, and so on.
For example, the expression 2 + 3 × (5 + 2) becomes a tree with + at the root, whose children are the leaf 2 and a × node covering 3 and (5 + 2). The expression 3x² + cos(2x) − 1 breaks down the same way, with +, ×, cos, and − nodes above the leaves. (The original article illustrated both trees with figures not reproduced here.)
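This tree encoding can be sketched in a few lines. The paper serializes such trees into flat token sequences (prefix, i.e. Polish, notation) so that sequence-to-sequence models can consume them; the tuple format and function names below are illustrative, not the authors’ code.

```python
# Expression trees as nested tuples: (operator, left, right); leaves are numbers.

def evaluate(node):
    """Recursively evaluate an expression tree of '+' and '*' nodes."""
    if not isinstance(node, tuple):
        return node                       # leaf: a plain number
    op, left, right = node
    a, b = evaluate(left), evaluate(right)
    return a + b if op == "+" else a * b

def to_prefix(node):
    """Serialize a tree to a prefix (Polish notation) token list —
    the kind of flat sequence a seq2seq model can read."""
    if not isinstance(node, tuple):
        return [str(node)]
    op, left, right = node
    return [op] + to_prefix(left) + to_prefix(right)

tree = ("+", 2, ("*", 3, ("+", 5, 2)))    # 2 + 3 * (5 + 2)
print(evaluate(tree))                     # 23
print(to_prefix(tree))                    # ['+', '2', '*', '3', '+', '5', '2']
```

Because prefix notation needs no parentheses, the flattened sequence still encodes the tree unambiguously.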
Neural network training
Lample and Charton then needed a way to train their neural network, which had to consume a huge amount of data in order to establish sufficiently rich connections between its “neurons.” These connections, if built correctly, allow the neural network to analyze a differential equation.
The scientists generated a random dataset containing a range of differential and integral problems together with their solutions. They focused on first- and second-order equations and limited the size of the expressions. After processing this data, the neural network learned to compute derivatives and integrals for given mathematical expressions, like the one shown at the beginning of this article.
To complete the process, Lample and Charton put the neural network to the test, feeding it 5,000 new expressions that it had never seen before. The results were impressive.
“On all tasks, we observe that our model significantly outperforms Mathematica,” they wrote in their paper. “On function integration, our model obtains close to 100% accuracy, while Mathematica barely reaches 85%.”
Within the allotted 30 seconds, Matlab and Mathematica failed to find solutions to many of the problems, while Facebook’s neural network needed only about a second. The example at the top of this article is one such problem.
Ok, now what?
Unfortunately, Lample and Charton’s paper gives no hints about what Facebook plans to do with this neural network. However, given how often natural language processing came up in their interview with Popular Mechanics, it is possible that Facebook is working to improve its methods for processing linguistic information, which could be put to many uses. Or maybe the social-network giant just wants to help you with your homework. Who knows.
Lample and Charton told Popular Mechanics that while their model is only a proof of concept, it shows that neural networks can handle symbolic mathematics, and that practical applications will emerge in due course.
“Differential equations are very common in science, especially in physics, chemistry, biology and engineering, so many applications are possible,” they say.
And finally, if you wanted to test your math skills, here is the answer you should have gotten when solving for y in the very first example:
Don’t be discouraged if you made a mistake, or if the solution took you much longer than the 30 seconds the Facebook neural network needed. After all, you are only human.