Fujitsu has developed an optimized FPGA architecture that is 1,000 to 10,000 times faster than conventional computers on combinatorial optimization problems and expects a prototype system in fiscal 2018

Fujitsu has collaborated with the University of Toronto to develop a new computing architecture that tackles a range of real-world issues by solving combinatorial optimization problems, which involve finding the best combination of elements from an enormous set of possible combinations.

This architecture employs conventional semiconductor technology with flexible circuit configurations to allow it to handle a broader range of problems than current quantum computing can manage. In addition, multiple computation circuits can be run in parallel to perform the optimization computations, enabling scalability in terms of problem size and processing speed. Fujitsu Laboratories implemented a prototype of the architecture using FPGAs for the basic optimization circuit, which is the minimum constituent element of the architecture, and found the architecture capable of performing computations some 10,000 times faster than a conventional computer.
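The scalability claim rests on running many optimization circuits in parallel against the same problem and keeping the best answer found. A rough software analogue of that idea, not Fujitsu's hardware design, is sketched below: several independent bit-flip searches run in parallel over a toy cost function (the objective, problem size, and replica count are all illustrative assumptions).

```python
import random
from concurrent.futures import ProcessPoolExecutor

# Toy objective: a fixed weighted sum over 32 binary variables (illustrative only).
WEIGHTS = [((-1) ** i) * (i % 7 + 1) for i in range(32)]

def cost(state):
    return sum(w * b for w, b in zip(WEIGHTS, state))

def search_replica(seed, iters=5000):
    """One 'circuit': greedy single-bit-flip descent from a random starting point."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(len(WEIGHTS))]
    best = cost(state)
    for _ in range(iters):
        i = rng.randrange(len(state))
        state[i] ^= 1              # try flipping one bit
        c = cost(state)
        if c < best:
            best = c               # keep the improvement
        else:
            state[i] ^= 1          # otherwise undo the flip
    return best, state

if __name__ == "__main__":
    # Run several replicas at once, loosely mirroring parallel optimization circuits.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(search_replica, range(8)))
    print(min(results, key=lambda r: r[0]))
```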

Through this architecture, Fujitsu Laboratories is enabling faster solutions to computationally intensive combinatorial optimization problems, such as how to streamline distribution, improve post-disaster recovery plans, formulate economic policy, and optimize investment portfolios. It will also make possible the development of new ICT services that support swift and optimal decision-making in such areas as social policy and business, which involve complex intertwined elements.

Fujitsu says it has implemented basic optimization circuits using an FPGA that handle combinations expressible in 1,024 bits; running a ‘simulated annealing’ process, these circuits tackled the aforementioned combinatorial optimization problems some 10,000 times faster than conventional processors.
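Simulated annealing itself is a standard heuristic: flip bits at random, always accept moves that lower the cost, and accept cost-raising moves with a probability that shrinks as a ‘temperature’ parameter is cooled. A minimal, self-contained software sketch of the method follows; the cost function, bit count, and schedule parameters are illustrative assumptions, not Fujitsu's implementation.

```python
import math
import random

def simulated_annealing(cost, n_bits, steps=20000, t_start=10.0, t_end=0.01):
    """Anneal a vector of n_bits binary variables against an arbitrary cost function."""
    state = [random.randint(0, 1) for _ in range(n_bits)]
    energy = cost(state)
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n_bits)
        state[i] ^= 1                       # propose flipping one bit
        new_energy = cost(state)
        delta = new_energy - energy
        # Accept improvements outright; accept worse moves with Boltzmann probability.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            energy = new_energy
        else:
            state[i] ^= 1                   # reject the move: undo the flip
    return state, energy

# Illustrative use on a tiny 16-bit problem (a real instance would be 1,024 bits).
weights = [3, -2, 5, 1, -4, 2, -1, 6, -3, 4, -6, 2, 1, -5, 7, -2]
best_state, best_energy = simulated_annealing(
    lambda s: sum(w * b for w, b in zip(weights, s)), n_bits=16)
print(best_state, best_energy)
```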

The company says it will work on improving the architecture going forward, and by the fiscal year 2018, it expects “to have prototype computational systems able to handle real-world problems of 100,000 bits to one million bits that it will validate on the path toward practical implementation”.

Background

In society, people need to make difficult decisions under constraints such as limited time and manpower; examples include determining procedures for disaster recovery, optimizing an investment portfolio, and formulating economic policy. Decision-making problems of this kind, in which many elements must be considered and evaluated and the best combination of them chosen, are called combinatorial optimization problems. As the number of elements involved increases, the number of possible combinations grows exponentially, so solving these problems quickly enough to be of practical use requires a dramatic increase in computing performance. The miniaturization that has driven improvements in computing performance over the last 50 years is nearing its limits (Figure 1), and it is hoped that devices based on completely different physical principles, such as quantum computers, will emerge.
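To make that exponential growth concrete with hypothetical numbers: merely counting the candidate combinations in a portfolio-selection task shows how quickly brute-force enumeration becomes impossible.

```python
import math

# Ways to choose a 10-asset portfolio from a pool of n candidate assets.
for n in (20, 50, 100, 200):
    print(f"{n} assets -> {math.comb(n, 10):,} possible 10-asset portfolios")
```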

SOURCES – Fujitsu, Tech Radar, University of Toronto