This Guy Found a Faster Way to Multiply
Because the method you learned in middle school is ridiculously slow.
Popular Mechanics | David Grossman
From grade school onward, multiplying large numbers has been a headache. But an associate professor from the University of New South Wales Sydney in Australia has developed a new method for multiplying giant numbers together that's more efficient than the "long multiplication" so many are taught at an early age.
“More technically, we have proved a 1971 conjecture of Schönhage and Strassen about the complexity of integer multiplication,” associate professor David Harvey says in a video announcing the result.
The Schönhage–Strassen algorithm, developed by two German mathematicians, was the fastest known method for multiplying very large numbers from 1971 until 2007. An asymptotically faster method was published that year, but it is rarely used in practice.
Schönhage and Strassen predicted that an algorithm able to multiply two n-digit numbers in roughly n * log(n) basic operations should exist, Harvey says. His paper, written with Joris van der Hoeven, is the first known proof that it does.
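In standard notation (a summary on my part, not spelled out in the article), the milestones look like this, where M(n) stands for the number of basic operations needed to multiply two n-digit numbers:

```latex
% M(n): basic operations needed to multiply two n-digit integers (constants ignored)
% Long multiplication taught in school:
\[ M(n) = O(n^2) \]
% Schönhage–Strassen's own 1971 algorithm:
\[ M(n) = O(n \log n \log \log n) \]
% Their conjecture, now proved by Harvey and van der Hoeven:
\[ M(n) = O(n \log n) \]
```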
Harvey picks the example of 314 multiplied by 159. It's a small problem, but far larger ones are crunched every day in real-world computing. To solve it, most people are taught to multiply each digit of one number by each digit of the other, and then add up the results:
9 is multiplied by 4, 1, and 3; then 5 is multiplied by 4, 1, and 3, and so on. The process ends up requiring 9 digit-by-digit products in all.
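To make that bookkeeping concrete, here is a minimal Python sketch of the schoolbook procedure described above (the function name and printout are my own illustration, not anything from Harvey's paper):

```python
def schoolbook_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers the way it's taught in school.

    Every digit of one number is multiplied by every digit of the other,
    so two n-digit numbers need roughly n * n single-digit products.
    """
    digits_a = [int(d) for d in str(a)][::-1]  # least-significant digit first
    digits_b = [int(d) for d in str(b)][::-1]

    total = 0
    product_count = 0
    for i, da in enumerate(digits_a):
        for j, db in enumerate(digits_b):
            total += da * db * 10 ** (i + j)   # shift by place value, then add
            product_count += 1

    print(f"{a} x {b} = {total}, using {product_count} digit-by-digit products")
    return total


schoolbook_multiply(314, 159)  # 314 x 159 = 49926, using 9 digit-by-digit products
```

The two nested loops are the whole story: double the number of digits and the count of digit-by-digit products quadruples, which is why this method becomes unmanageable for very large numbers.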
For some, seeing a word in the middle of a math problem might be enough to make their eyes glaze over like they did in algebra class. As a refresher: log is short for logarithm, which answers the question of what exponent a base must be raised to in order to reach a given number. So 2⁵ is 32, and expressed logarithmically that becomes log₂(32) = 5. It may seem like a mouthful, but logarithms are crucial when numbers get much larger, because they grow very slowly.
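That slow growth is exactly why an n log n bound is such good news compared with n squared; a quick Python check (my own illustration, not from the article) makes the point:

```python
import math

# The example above: 2**5 = 32, so log base 2 of 32 is 5
print(math.log2(32))        # 5.0

# Even when a number has a billion digits, the log factor stays tiny
n = 10**9
print(math.log2(n))         # about 29.9
print(n * math.log2(n))     # about 3e10 operations (n log n)
print(n * n)                # 1e18 operations (n squared) -- vastly more
```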
The Schönhage-Strassen method is very fast, Harvey says. If a computer were to use the squared method taught in school on a problem where two numbers had a billion digits each, it would take months. A computer using the Schönhage-Strassen method could do so in 30 seconds.
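A rough back-of-envelope version of that comparison, assuming a made-up machine speed and ignoring the constant factors that real implementations care about (so the absolute times are only illustrative; the relative gap is the point):

```python
import math

n = 10**9                # two numbers with a billion digits each
ops_per_sec = 10**11     # assumed machine speed -- a stand-in, not a benchmark

methods = {
    "long multiplication (n^2)":                n ** 2,
    "Schonhage-Strassen (n log n log log n)":   n * math.log2(n) * math.log2(math.log2(n)),
    "Harvey-van der Hoeven (n log n)":          n * math.log2(n),
}

for name, ops in methods.items():
    seconds = ops / ops_per_sec
    print(f"{name:<42} ~{ops:.1e} ops  ~{seconds:,.1f} seconds")
```

With these crude assumptions, the schoolbook count works out to months of machine time while the other two methods finish in seconds, which matches the scale of the gap Harvey describes even though real-world constants differ.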
But as the number of digits keeps rising into the trillions and beyond, the algorithm developed by Harvey and collaborator Joris van der Hoeven at École Polytechnique in France can find answers faster than the 1971 Schönhage-Strassen algorithm.
“It means you can do all sorts of arithmetic more efficiently, for example division and square roots,” he says. “You could also calculate digits of pi more efficiently than before. It even has applications to problems involving huge prime numbers.”
“People have been hunting for such an algorithm for almost 50 years,” he continues. “It was not a foregone conclusion that someone would eventually be successful. It might have turned out that Schönhage and Strassen were wrong, and that no such algorithm is possible.”
“But now we know better,” he says.