How was 2019 for mathematics and computer science?

This translation of the article was prepared especially for students of the basic and advanced "Mathematics for Data Science" courses.


Over the past year, mathematicians and computer scientists have made great strides in number theory, graph theory, machine learning, and quantum computing, and even revised our fundamental concepts of mathematics and neural networks.

For mathematicians and computer scientists, 2019 was a year of revisiting and close scrutiny. Some reexamined the foundational principles of their fields, while others found strikingly simple proofs, new methods for solving problems, or unexpected solutions to long-standing puzzles. Some of these achievements have already found wide application in physics and other scientific disciplines. Others exist purely as theory (or just for fun) and offer no practical benefit today.

Quanta chose to highlight the decade-long effort to rid mathematics of the rigid equal sign and replace it with the more flexible concept of "equivalence." We also covered new ideas in the general theory of neural networks that could give computer scientists the coveted theoretical explanation for the success of deep learning algorithms.

Meanwhile, ordinary mathematical objects such as matrices and networks yielded unexpected new insights in short, elegant proofs, and decades-old problems in number theory finally received solutions. Mathematicians learned more about how regularity and order arise from chaotic systems, random numbers, and other seemingly unstructured domains. And all the while, machine learning grew more powerful, changing the methods and subjects of scientific research, while quantum computers may have finally reached a critical milestone.

Laying the foundation of understanding

What if the equal sign, the foundation of all mathematics, was just a mistake? A growing number of mathematicians, led in part by Jacob Lurie of the Institute for Advanced Study, want to rewrite the subject, replacing "equality" with the looser language of "equivalence." Today the foundations of mathematics are built on collections of objects called sets, but over the past decade several mathematicians have worked instead with more versatile groupings called categories, which carry more information than sets and capture more possible relationships than "equality." Since 2006, Lurie has published more than a thousand pages of mathematical theory on how to translate modern mathematics into the language of category theory.
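
To make the distinction concrete, here is the standard textbook notion that the "equivalence" camp builds on (a general illustration, not a formula from the article): two objects A and B of a category are never asked to be equal, only isomorphic, meaning there are maps between them that undo each other.

```latex
% Isomorphism: the categorical stand-in for equality.
% f and g are morphisms that compose to the identity in both orders.
f : A \to B, \qquad g : B \to A,
\qquad g \circ f = \mathrm{id}_A, \qquad f \circ g = \mathrm{id}_B .
```

An isomorphism records *how* A and B correspond, which is exactly the extra information that a bare statement of equality throws away.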

Not long ago, other mathematicians began laying down the fundamental principles of a field that has no prevailing dogma to discard: machine learning. The technology underlying today's most successful machine learning algorithms is becoming ever more essential to science and society, yet few people truly understand how it works. In January, we wrote about attempts to build a theory of neural networks that explains how a network's structure affects its capabilities.
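
At its simplest, the "structure" such a theory studies means knobs like width and depth. Here is a minimal numpy sketch with those two knobs made explicit; it illustrates the object of study, not the theory from the article.

```python
import numpy as np

# A minimal fully connected network whose "structure" is just two numbers:
# width (neurons per hidden layer) and depth (number of hidden layers).

def make_mlp(width: int, depth: int, d_in: int = 2, d_out: int = 1, seed: int = 0):
    rng = np.random.default_rng(seed)
    sizes = [d_in] + [width] * depth + [d_out]
    # One weight matrix per layer, scaled by fan-in for stable activations.
    return [rng.standard_normal((m, n)) / np.sqrt(m)
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(weights, x):
    for W in weights[:-1]:
        x = np.maximum(x @ W, 0.0)   # ReLU hidden layers
    return x @ weights[-1]           # linear output layer

net = make_mlp(width=16, depth=3)
print(forward(net, np.array([[0.5, -1.0]])))
```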

A new look at old problems

Just because a path seems familiar does not mean it holds no more secrets. For centuries, mathematicians, physicists, and engineers have worked with terms such as "eigenvalues" and "eigenvectors," using them to describe how matrices transform objects. In August, three physicists and a mathematician derived a simple new formula relating these two sets of quantities; it greatly simplified the work of physicists studying neutrinos and spawned new mathematical discoveries. After their study was published, the scientists learned that the relationship had been discovered long before, but had been overlooked again and again.
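
The identity in question relates the coordinates of the eigenvectors of an n x n Hermitian matrix A to the eigenvalues of A and of its minors: if v_i is the i-th unit eigenvector, lambda_k are the eigenvalues of A, and mu_k are the eigenvalues of the minor M_j obtained by deleting row and column j, then |v_{i,j}|^2 * prod_{k != i}(lambda_i - lambda_k) = prod_k(lambda_i - mu_k). A quick numerical sanity check in numpy (a sketch; the variable names are ours):

```python
import numpy as np

# Check the eigenvector-eigenvalue identity on a random symmetric matrix:
# |v_{i,j}|^2 * prod_{k != i}(lam_i - lam_k) = prod_k(lam_i - mu_k),
# where mu_k are the eigenvalues of the minor M_j (row and column j deleted).

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                       # symmetrize: a real Hermitian matrix

lam, V = np.linalg.eigh(A)              # eigenvalues lam, eigenvectors in columns of V

i, j = 2, 3                             # any eigenvalue index i and coordinate j
M_j = np.delete(np.delete(A, j, axis=0), j, axis=1)
mu = np.linalg.eigvalsh(M_j)

lhs = V[j, i] ** 2 * np.prod([lam[i] - lam[k] for k in range(n) if k != i])
rhs = np.prod(lam[i] - mu)
print(np.isclose(lhs, rhs))             # -> True, up to floating-point error
```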

The everyday routine of computer science was lit up one day by a mathematician who suddenly solved one of the field's biggest open problems, proving the "sensitivity" conjecture (https://www.quantamagazine.org/mathematician-solves-computer-science-conjecture-in-two-pages-20190725/), which describes how likely you are to change a chip's output by changing a single input. The proof turned out to be disarmingly simple, compact enough to be summarized in a single tweet. Meanwhile, in the world of graph theory, another spartan paper (this one just three pages long) refuted a decades-old conjecture about the best way to choose colors for the nodes of a network, a question that bears on maps, seating charts, and sudoku.
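
For a concrete sense of what sensitivity measures (this is the standard definition, illustrated with a toy function of our own choosing): the sensitivity of a Boolean function f at an input x is the number of single-bit flips that change f(x), and the sensitivity of f is the maximum over all inputs.

```python
from itertools import product

# Sensitivity of a Boolean function: for each input x, count how many
# single-bit flips change the output, then take the maximum over all x.

def sensitivity(f, n):
    best = 0
    for x in product((0, 1), repeat=n):
        flips = sum(
            f(x) != f(x[:i] + (1 - x[i],) + x[i + 1:])  # flip bit i
            for i in range(n)
        )
        best = max(best, flips)
    return best

maj3 = lambda x: int(sum(x) >= 2)   # majority of three bits
print(sensitivity(maj3, 3))         # -> 2: at (1, 1, 0), flipping either 1 flips the vote
```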

Signal in noise

Mathematics often involves finding order inside a mess, extracting hidden patterns from apparent randomness. In May, one team used so-called magic functions to show that the best ways of arranging points in eight-dimensional and 24-dimensional space are also universally optimal, meaning they solve an infinite family of problems beyond the densest packing of identical spheres. It is still unclear why the magic functions are so universal. "There are some problems in mathematics that are solved through perseverance and brute force," said the mathematician Henry Cohn. "And then there are times when it seems as if mathematics wants something to happen."
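
For context, the two record packings in question come from the E8 lattice in dimension 8 and the Leech lattice in dimension 24; their densities, established in the 2016 sphere-packing proofs rather than restated in the article, are:

```latex
% Fraction of space filled by the optimal sphere packings:
\Delta_{E_8} = \frac{\pi^4}{384} \approx 0.2537,
\qquad
\Delta_{\Lambda_{24}} = \frac{\pi^{12}}{12!} \approx 0.0019 .
```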

Other scientists, meanwhile, found patterns in the unpredictable. Sarah Peluse proved that numerical sequences called "polynomial progressions" are inevitable in sufficiently large sets of numbers, even when the numbers are chosen at random. Other mathematicians proved that, under certain conditions, patterns emerge in a doubly random process: randomly analyzing shapes that are themselves random. Strengthening the link between chaos and order further, in March Tim Austin proved that every mathematical description of change is ultimately a mixture of ordered and random systems, and that even the ordered ones need an element of randomness. Finally, back in the physical world, physicists worked on understanding when and how chaotic systems, from flickering fireflies to firing neurons, can synchronize and move as one.
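
The textbook toy model for this kind of synchronization is the Kuramoto model (our illustration; the article does not name it): coupled oscillators with different natural frequencies that, above a critical coupling strength, spontaneously lock their phases.

```python
import numpy as np

# Kuramoto model: N oscillators with random natural frequencies, each nudged
# toward the phases of the others. With strong enough coupling K they sync up.

rng = np.random.default_rng(1)
N, K, dt, steps = 100, 2.0, 0.01, 5000
omega = rng.standard_normal(N)          # natural frequencies
theta = rng.uniform(0, 2 * np.pi, N)    # initial phases

for _ in range(steps):
    pull = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta += (omega + pull) * dt        # Euler step of d(theta_i)/dt

# Order parameter r: ~0 for incoherent phases, ~1 when everyone is in sync.
r = abs(np.mean(np.exp(1j * theta)))
print(f"order parameter r = {r:.2f}")
```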

Game with numbers

In elementary school we all learned to multiply the good old-fashioned way, but in March 2019 two mathematicians described a faster multiplication method. Instead of multiplying each digit of the first number by each digit of the second, which becomes hopelessly slow for large numbers, a calculator can combine a series of techniques involving addition, multiplication, and rearrangement of digits to reach the answer in far fewer steps. It is, in fact, the most efficient method for multiplying numbers found to date.
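
The 2019 algorithm itself is intricate, but the core idea of trading multiplications for cheaper operations goes back to Karatsuba's 1960 method, which is easy to show (our illustrative implementation, not the new algorithm): splitting each number in half costs three recursive multiplications instead of four.

```python
# Karatsuba multiplication: an early ancestor of the 2019 record-setting
# method. Split x and y in half; three recursive products replace four.

def karatsuba(x: int, y: int) -> int:
    if x < 10 or y < 10:                # one-digit factor: multiply directly
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    base = 10 ** m
    x_hi, x_lo = divmod(x, base)
    y_hi, y_lo = divmod(y, base)
    a = karatsuba(x_hi, y_hi)                        # high * high
    b = karatsuba(x_lo, y_lo)                        # low * low
    c = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - b  # cross terms, one product
    return a * base**2 + c * base + b

print(karatsuba(1234, 5678) == 1234 * 5678)          # -> True
```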

Other notable discoveries in the world of numbers this year included how to express the number 33 as a sum of three cubes, a proof of a long-standing conjecture about how well irrational numbers such as pi can be approximated, and a deepening of the relationship between the sums and products of sets of numbers.
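
The solution for 33 is easy to check even though it was enormously hard to find. Using the digits as widely reported in 2019 (worth double-checking against Andrew Booker's published result), the verification is one line, since Python integers are exact:

```python
# Checking the reported sum-of-three-cubes representation of 33.
x, y, z = 8866128975287528, -8778405442862239, -2736111468807040
print(x**3 + y**3 + z**3)   # -> 33
```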

The growing challenges of machine learning

Scientists are increasingly turning to machines for help, not just in gathering data but in understanding it. In March we talked about how machine learning is changing the pace of science. A process called generative modeling, for example, may offer a "third way" of formulating and testing hypotheses, after the traditional routes of observation and simulation, although many still see it as simply a souped-up method of data processing. Either way, as Dan Falk wrote, machine learning is "changing the flavor of scientific discovery," and it is certainly speeding the path to it.

As for what machine learning accomplished over the past year: researchers found algorithms that could potentially predict earthquakes in the Pacific Northwest, while a multidisciplinary team figured out how vision works by building a mathematical model based on the anatomy of the brain. Still, there is a long way to go: a German team reported that machines often fail to recognize images because they focus on textures rather than shapes, and a neural network nicknamed BERT learned to beat humans at reading comprehension tests, only to leave researchers wondering whether the machine truly understands anything or has simply gotten better at taking tests.

Next steps in the development of quantum computers

After many years of uncertainty, researchers last year finally reached an important milestone in quantum computing, though, as with everything quantum, the achievement is shrouded in ambiguity. Conventional classical computers are built on the binary system and operate on bits, while quantum computers operate on qubits, which exploit quantum rules to boost computing power. In 2012 John Preskill coined the term "quantum supremacy" to describe the point at which a quantum computer outperforms a classical one. Reports of ever-faster quantum systems led many insiders to suspect we would reach that point in 2019, and in October Google announced that the moment had finally come. IBM, Google's rival in this race, disagreed, saying the claim deserved "a lot of skepticism." Even so, the evident progress toward viable quantum computers in recent years has prompted researchers such as Stephanie Wehner to begin building a next-generation quantum internet.
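
A minimal picture of why qubits help (standard textbook notation, not from the article): a single qubit lives in a superposition of the two classical bit values, and n qubits carry 2^n amplitudes at once.

```latex
% One qubit: a unit vector in a 2-dimensional complex space.
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1 .

% n qubits: 2^n amplitudes, which is what quantum algorithms exploit.
|\psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle,
\qquad \sum_x |c_x|^2 = 1 .
```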

This brings the translation of the article to an end. We invite you to the open days of the basic and advanced "Mathematics for Data Science" courses for a closer look at the program.
