# Solving Difficult Problems with Probabilistic Computing

In computational complexity theory, mathematical problems are classified by how hard they are to solve. A conventional computer can solve problems in the class P in polynomial time—that is, the time required grows as a polynomial function of the input size—but hard problems in NP have solution times that are believed to scale exponentially with problem size. As a result, large enough NP problems are effectively out of reach for conventional computers built on semiconductor devices.

Quantum computers, with their ability to perform a vast number of operations simultaneously, are seen as promising in this regard because they could dramatically accelerate the solution of NP problems. However, quantum hardware is extremely sensitive to temperature changes.

As a result, quantum computers often require harsh experimental conditions, such as extremely low temperatures, which makes them difficult and costly to manufacture.

Fortunately, a less well-known and still largely unexplored alternative to quantum computing exists: probabilistic computing. To solve NP problems efficiently, probabilistic computing uses so-called "stochastic nanodevices" whose operation relies on thermal fluctuations. Unlike in quantum computers, where thermal noise is an obstacle, in probabilistic computing thermal fluctuations actively drive the computation. This makes probabilistic computing far more practical under everyday operating conditions.
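The basic building block of such a machine is the probabilistic bit, or p-bit: a device whose output fluctuates randomly between two states, with a bias set by its input. The standard sigmoidal p-bit model can be sketched in a few lines; the function name and the input scale here are illustrative, not taken from the study.

```python
import math
import random

def p_bit(bias: float) -> int:
    """One update of a probabilistic bit (p-bit).

    Thermal noise (modeled as a uniform random draw) makes the output
    fluctuate between -1 and +1; the tanh of the input biases which
    state is more likely. This is the common sigmoidal p-bit model,
    shown as a hypothetical sketch.
    """
    noise = random.uniform(-1.0, 1.0)
    return 1 if math.tanh(bias) + noise > 0 else -1

# With zero input the p-bit is unbiased: about half the samples are +1.
samples = [p_bit(0.0) for _ in range(10_000)]
frac_up = samples.count(1) / len(samples)

# A strong positive input pins the output to +1 almost every time.
biased = [p_bit(5.0) for _ in range(1_000)]
```

A single p-bit is just a tunable random coin; the computational power comes from wiring many of them together so that each one's bias depends on the states of its neighbors.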

Researchers have now demonstrated the capability of probabilistic computing by simulating networks of stochastic nanodevices to solve specific NP problems, providing much-needed evidence for this viable alternative. The research, led by Professor Peter Bermel at Purdue University, has been published in the Journal of Photonics for Energy (JPE).

The researchers used the "Ising model," a standard model originally developed to describe the interactions of the magnetic dipole moments of atomic spins, which is now applied to a wide variety of physical and mathematical problems. Its energy operator, known as the "Hamiltonian," can also encode NP problems. In essence, solving an NP problem reduces to solving the associated Ising Hamiltonian.
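For reference, the Ising Hamiltonian assigns each configuration of spins s_i ∈ {−1, +1} the energy H = −Σ_{i<j} J_ij·s_i·s_j − Σ_i h_i·s_i, where J_ij are coupling strengths and h_i local fields; solving the problem means finding the lowest-energy configuration. A minimal brute-force sketch (the couplings below are made up for illustration):

```python
from itertools import product

def ising_energy(spins, J, h):
    """Energy of one spin configuration under the Ising Hamiltonian
    H = -sum_{i<j} J[i][j]*s_i*s_j - sum_i h[i]*s_i."""
    n = len(spins)
    coupling = -sum(J[i][j] * spins[i] * spins[j]
                    for i in range(n) for j in range(i + 1, n))
    field = -sum(h[i] * s for i, s in enumerate(spins))
    return coupling + field

# Brute-force ground state of a tiny 3-spin example (hypothetical
# couplings, upper-triangular J). Exhaustive search is only feasible
# for small n -- which is exactly why hardware solvers are interesting.
J = [[0, 1, -1],
     [0, 0, 1],
     [0, 0, 0]]
h = [0.5, 0.0, -0.5]
best = min(product([-1, 1], repeat=3), key=lambda s: ising_energy(s, J, h))
```

The exhaustive search above scales as 2^n; a probabilistic computer instead lets its p-bits fluctuate toward low-energy configurations physically.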

These problems can be solved with probabilistic computing devices built from optical parametric oscillators (OPOs) and from networks of stochastic circular nanomagnets with low thermal barriers.

The researchers modeled such a nanomagnet network, realizable with existing fabrication techniques, and applied it to solve the Ising Hamiltonians of four NP-complete problems from number theory and combinatorial optimization: integer linear programming, binary integer linear programming, covering, and number partitioning. NP-complete problems are those for which no efficient solution algorithm is known.
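To see how such a mapping works, consider number partitioning: split a set of numbers into two subsets with equal sums. Assigning each number a spin (+1 for one subset, −1 for the other) gives the energy (Σ_i a_i·s_i)², which is an Ising Hamiltonian whose ground-state energy is zero exactly when a perfect partition exists. A sketch with a made-up instance (not from the paper):

```python
from itertools import product

def partition_energy(spins, numbers):
    """Number partitioning as an Ising problem: spin s_i = +1 puts
    numbers[i] in one subset, -1 in the other. The energy
    (sum_i a_i * s_i)**2 is zero exactly when both subsets sum equally."""
    return sum(a * s for a, s in zip(numbers, spins)) ** 2

numbers = [4, 5, 6, 7, 8]  # hypothetical instance: 4 + 5 + 6 = 7 + 8
best = min(product([-1, 1], repeat=len(numbers)),
           key=lambda s: partition_energy(s, numbers))
```

Expanding the square shows the couplings are simply J_ij = −2·a_i·a_j, so any hardware that minimizes an Ising energy can attack this problem directly.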

The theoretical solution of the Ising model (the Boltzmann distribution) and the simulation results for the first three problems, using 3, 3, and 6 probabilistic bits (p-bits), were in close agreement. Simulations of five number partitioning problems with 3, 6, 9, 12, and 15 p-bits showed a similar match between modeling and theory. This indicated that frameworks for probabilistic computing can scale.
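The comparison works because a network of p-bits updated one at a time samples spin configurations with Boltzmann probabilities p(s) ∝ exp(−E(s)/kT), so empirical state frequencies can be checked against the exact distribution. A minimal Gibbs-sampling sketch of this check, with hypothetical couplings:

```python
import math
import random
from itertools import product

def energy(spins, J):
    """Ising energy with upper-triangular couplings J and no local fields."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def gibbs_sample(J, n, sweeps, beta=1.0):
    """Gibbs sampling of an Ising network: each sweep updates every spin
    from its conditional Boltzmann probability -- the same sigmoidal
    update a hardware p-bit performs. Returns visit counts per state."""
    s = [random.choice([-1, 1]) for _ in range(n)]
    counts = {}
    for _ in range(sweeps):
        for i in range(n):
            # Local field on spin i from its neighbours (J is upper-triangular).
            field = sum(J[min(i, j)][max(i, j)] * s[j]
                        for j in range(n) if j != i)
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
            s[i] = 1 if random.random() < p_up else -1
        key = tuple(s)
        counts[key] = counts.get(key, 0) + 1
    return counts

# 3-spin example (hypothetical couplings): compare sampled frequencies
# with the exact Boltzmann weights exp(-E) / Z.
J = [[0, 0.5, 0.5],
     [0, 0, 0.5],
     [0, 0, 0]]
counts = gibbs_sample(J, 3, 20_000)
states = list(product([-1, 1], repeat=3))
Z = sum(math.exp(-energy(s, J)) for s in states)
boltzmann = {s: math.exp(-energy(s, J)) / Z for s in states}
```

For this ferromagnetic example the two aligned states (all up, all down) dominate both the sampled counts and the exact distribution, which is the kind of agreement the study reports between simulation and theory.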

According to Bermel, “the key to making probabilistic computing a powerful and viable alternative to traditional computing techniques is effective scaling with task size. Both models and experiments will need to be used to determine which strategies are most effective.”

The researchers suggest that while the reported simulations show solid agreement across all p-bit counts (from 3 to 15), parallel algorithms could further increase simulation capacity. Where parallelization is not possible, moving from nanomagnet networks to OPO networks can still enable efficient problem solving. Such a system can be readily implemented and mapped onto an OPO network using existing manufacturing processes such as CMOS technology, and stochastic nanomagnets with low energy barriers for probabilistic computation could eventually be built the same way.

According to Sean Shaheen, University of Colorado Boulder professor and JPE Editor-in-Chief, “As artificial intelligence and scientific/enterprise computing increase in scale at a rate that raises significant, if not urgent, concerns about energy consumption and carbon footprint, non-traditional forms of computing hardware development becomes more and more important.”

This work by Zhu, Xi and Bermel offers a realistic path to technology that can handle a significant class of NP-complete problems. The work demonstrates a scalable, energy-efficient solution that has the potential to significantly outperform conventional hardware in handling computationally demanding tasks by ingeniously using nonlinear networks of optical devices to drive Ising computation.

Source: techxplore.com/news

Updated: 03/05/2023 14:19