Quantum algorithms

1, Deutsch-Jozsa algorithm

Among the many quantum algorithms, the Deutsch-Jozsa algorithm was the first to be proposed; it is the simplest and most effective probabilistic quantum algorithm, and it is also the quantum algorithm that has already been verified in NMR experiments. Suppose there is a function f: {0,1}^n -> {0,1} that is promised to be either constant or balanced. To decide which it is, a classical algorithm needs on the order of N = 2^n evaluations in the worst case, but the quantum algorithm given by Deutsch solves the problem correctly with a single evaluation. Thus, a quantum algorithm can determine this property of the function with one calculation; compared with the classical algorithm, the speed-up is exponential. Because of the different possible arrangements, the n-bit Deutsch problem admits many different constant and balanced functions, and each function requires a corresponding unitary transformation. In 2002, the Deutsch-Jozsa algorithm was demonstrated for the first time on an ion-trap quantum computer (a processor based on a single trapped 40Ca+ ion driven by laser pulses). Today, the Deutsch-Jozsa algorithm is used as an important decision procedure.
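
As a minimal illustration (not a real quantum simulation), the Python sketch below computes the quantity the Deutsch-Jozsa circuit effectively measures: after the Hadamard/oracle/Hadamard sequence, the amplitude of the all-zero input register is (1/2^n) * sum_x (-1)^f(x), which is +/-1 for a constant f and exactly 0 for a balanced f. The oracles f_constant and f_balanced are illustrative placeholders.

    # Classical sketch of the Deutsch-Jozsa decision rule: the amplitude of
    # |0...0> after the circuit distinguishes constant from balanced oracles.
    def dj_zero_amplitude(f, n):
        """Amplitude of |0...0> after the Deutsch-Jozsa circuit."""
        return sum((-1) ** f(x) for x in range(2 ** n)) / 2 ** n

    def classify(f, n):
        return "constant" if abs(dj_zero_amplitude(f, n)) == 1 else "balanced"

    n = 3
    f_constant = lambda x: 1                        # constant oracle
    f_balanced = lambda x: bin(x).count("1") % 2    # balanced oracle (parity of x)
    print(classify(f_constant, n))   # -> constant
    print(classify(f_balanced, n))   # -> balanced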

2, Shor's Algorithm

Shor's algorithm, named after the mathematician Peter Shor, is a quantum algorithm for integer factorization discovered in 1994. Shor's algorithm divides into a classical part and a quantum part: the former uses ordinary classical computation to reduce the factorization problem to a period-finding problem, and the latter solves that period-finding problem. Take the period-finding subroutine as an example. Given a fixed constant N and an arbitrary constant a, the algorithm uses a quantum circuit to find the period of f(x) = a^x mod N. Given N, choose Q = 2^q with N^2 ≤ Q < 2N^2 (which also implies Q/r > N). The input and output registers must hold superpositions of all values from 0 to Q-1, and so each needs q qubits. Using what looks like twice as many qubits as necessary guarantees that at least N different values of x produce the same f(x), even when the period r approaches N/2.
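
To make the division of labour concrete, here is a sketch of the classical wrapper around period finding; the quantum subroutine is replaced by a brute-force search for the period r of f(x) = a^x mod N, purely to show how knowing r yields the factors. Function names are illustrative, and the sketch assumes N is an odd composite that is not a prime power.

    from math import gcd
    from random import randrange

    def find_period(a, N):
        """Smallest r > 0 with a^r = 1 (mod N); this is the quantum step in Shor's algorithm."""
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def shor_factor(N):
        while True:
            a = randrange(2, N)
            d = gcd(a, N)
            if d > 1:                      # lucky guess: a already shares a factor with N
                return d
            r = find_period(a, N)
            if r % 2 == 0:
                d = gcd(pow(a, r // 2, N) - 1, N)
                if 1 < d < N:
                    return d               # non-trivial factor of N

    print(shor_factor(15))                 # -> 3 or 5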

Shor's algorithm can be used to break the public-key encryption method that is in widespread use, namely the RSA encryption algorithm. RSA rests on the assumption that we cannot factor a large known integer efficiently. As far as is currently known, this assumption holds for classical computers: no known classical algorithm can solve the problem in polynomial time. However, Shor's algorithm shows that factorization can be solved very efficiently on a quantum computer, so a large enough quantum computer could break RSA.

3, Simulated Annealing Algorithm (SAA)

The simulated annealing algorithm is derived from the principle of solid annealing and was first applied to combinatorial optimization by Kirkpatrick. SAA exploits the similarity between the physical annealing process of solid matter and a general combinatorial optimization problem: starting from a high initial temperature, and as the temperature parameter falls, it searches the solution space randomly, using a probabilistic jump feature, for the global optimum of the objective function. By giving the search a time-varying jump probability that ultimately tends to zero, it can effectively avoid getting trapped in local minima and finally reach the global optimum, overcoming the limitation of other algorithms that can only find local optima. Simulated annealing is a random optimization algorithm based on the Monte Carlo iterative solution strategy; in theory it converges to the global optimum with probability one under a suitable cooling schedule, and it has been widely applied in engineering, for example in VLSI design, production scheduling, control engineering, signal processing, machine learning, and neural networks.
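
A minimal sketch of the idea, assuming a simple one-dimensional objective and a geometric cooling schedule: worse moves are accepted with probability exp(-delta/T) (the Metropolis rule), which is the probabilistic jump that lets the search escape local minima. All constants are illustrative.

    import math
    import random

    def simulated_annealing(f, x0, t0=10.0, t_min=1e-4, alpha=0.95, steps=100):
        x, best = x0, x0
        t = t0
        while t > t_min:
            for _ in range(steps):
                candidate = x + random.gauss(0.0, 1.0)     # random neighbour
                delta = f(candidate) - f(x)
                # accept improvements always, worse moves with probability exp(-delta/t)
                if delta < 0 or random.random() < math.exp(-delta / t):
                    x = candidate
                    if f(x) < f(best):
                        best = x
            t *= alpha                                      # cool the temperature
        return best

    # Example: a multimodal objective with many local minima.
    objective = lambda x: x * x + 10 * math.sin(3 * x)
    print(simulated_annealing(objective, x0=8.0))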

4, Grover's Algorithm

The Grover quantum search algorithm ignores the properties of the elements being searched and focuses only on their indices, so it is highly general. At the same time, it can efficiently attack the DES cipher system and has potential use in accelerating searches against public-key cryptosystems. The initial state of the Grover algorithm is the uniform superposition over all N basis states,

    |s> = (1/sqrt(N)) * sum_{x=0}^{N-1} |x>

Here |r> is the marked state that we want to find. In each iteration, the Grover search algorithm first flips the phase of the marked state and applies the Hadamard-Walsh transform, then flips the phase of the |0> state, and finally applies the Hadamard-Walsh transform again.
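
The iteration described above can be sketched as a small state-vector simulation; the Hadamard-Walsh / phase-flip-|0> / Hadamard-Walsh sequence is written here in its equivalent form, inversion about the mean (2|s><s| - I). The marked index and problem size are illustrative.

    import numpy as np

    def grover(n, marked):
        N = 2 ** n
        state = np.full(N, 1 / np.sqrt(N))        # uniform superposition |s>
        iterations = int(np.pi / 4 * np.sqrt(N))  # about (pi/4) * sqrt(N) rounds
        for _ in range(iterations):
            state[marked] *= -1                   # oracle: flip the phase of |r>
            state = 2 * state.mean() - state      # diffusion: inversion about the mean
        return state

    amplitudes = grover(n=4, marked=11)
    print(np.argmax(amplitudes ** 2))             # -> 11 with high probability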

Evolutionary algorithms

In the field of artificial intelligence, a genetic algorithm (GA) is a search heuristic that mimics the process of natural selection. This heuristic (also sometimes called a metaheuristic) is routinely used to generate useful solutions to optimization and search problems. Genetic algorithms belong to the larger class of evolutionary algorithms (EA), which generate solutions to optimization problems using techniques inspired by natural evolution, such as inheritance, mutation, selection and crossover.
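
Below is a toy genetic algorithm using the operators just listed (selection, crossover, mutation) on the classic OneMax problem, where fitness is simply the number of 1-bits in the genome; all sizes and rates are illustrative.

    import random

    GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 32, 50, 100, 0.01
    fitness = sum                                            # OneMax: count the 1-bits

    def tournament(pop):
        return max(random.sample(pop, 3), key=fitness)       # selection

    def crossover(a, b):
        cut = random.randrange(1, GENOME_LEN)                # one-point crossover
        return a[:cut] + b[cut:]

    def mutate(g):
        return [bit ^ (random.random() < MUT_RATE) for bit in g]

    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population = [mutate(crossover(tournament(population), tournament(population)))
                      for _ in range(POP_SIZE)]
    print(max(map(fitness, population)))                     # typically close to GENOME_LEN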

Evolutionary algorithms are a sub-field of evolutionary computing.

Evolution strategies (ES, see Rechenberg, 1994) evolve individuals by means of mutation and intermediate or discrete recombination. ES algorithms are designed particularly to solve problems in the real-value domain. They use self-adaptation to adjust control parameters of the search. De-randomization of self-adaptation has led to the contemporary Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Evolutionary programming (EP) involves populations of solutions with primarily mutation and selection and arbitrary representations. They use self-adaptation to adjust parameters, and can include other variation operations such as combining information from multiple parents.
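
Here is a sketch of a simple (1, lambda) evolution strategy with log-normal self-adaptation of the mutation step size sigma, minimizing the sphere function in the real-value domain; the learning rate and other constants are illustrative rather than taken from any specific reference.

    import math
    import random

    def es_minimize(f, dim=5, lam=10, generations=200):
        x = [random.uniform(-5, 5) for _ in range(dim)]
        sigma = 1.0
        tau = 1 / math.sqrt(dim)                   # learning rate for sigma (illustrative)
        for _ in range(generations):
            offspring = []
            for _ in range(lam):
                s = sigma * math.exp(tau * random.gauss(0, 1))      # mutate the step size
                child = [xi + s * random.gauss(0, 1) for xi in x]   # mutate the solution
                offspring.append((f(child), child, s))
            _, x, sigma = min(offspring)           # comma selection: keep the best offspring
        return x

    sphere = lambda v: sum(vi * vi for vi in v)
    print(sphere(es_minimize(sphere)))             # small value close to 0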

Gene expression programming (GEP) also uses populations of computer programs. These complex computer programs are encoded in simpler linear chromosomes of fixed length, which are afterwards expressed as expression trees. Expression trees or computer programs evolve because the chromosomes undergo mutation and recombination in a manner similar to the canonical GA. But thanks to the special organization of GEP chromosomes, these genetic modifications always result in valid computer programs.[40]
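
The genotype-to-phenotype mapping that makes this possible can be sketched as follows: a fixed-length chromosome in Karva notation is decoded breadth-first into an expression tree and then evaluated, and any unused tail symbols simply keep the chromosome length fixed. The function set and the example gene below are illustrative.

    import operator

    FUNCS = {'+': (2, operator.add), '-': (2, operator.sub), '*': (2, operator.mul)}

    def decode(gene):
        """Decode a Karva-notation gene into an expression tree (nested lists)."""
        nodes = [[sym] for sym in gene]
        next_child, i = 1, 0
        while i < next_child:                      # walk only the nodes that are part of the tree
            arity = FUNCS.get(nodes[i][0], (0, None))[0]
            for _ in range(arity):
                nodes[i].append(nodes[next_child])  # attach children breadth-first
                next_child += 1
            i += 1
        return nodes[0]

    def evaluate(node, x):
        sym = node[0]
        if sym in FUNCS:
            fn = FUNCS[sym][1]
            return fn(evaluate(node[1], x), evaluate(node[2], x))
        return x if sym == 'x' else float(sym)

    gene = list('+*xxx1111')             # decodes to (x * x) + x; the tail '1111' is unused
    print(evaluate(decode(gene), 3.0))   # -> 12.0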

Genetic programming (GP) is a related technique popularized by John Koza in which computer programs, rather than function parameters, are optimized. Genetic programming often uses tree-based internal data structures to represent the computer programs for adaptation instead of the list structures typical of genetic algorithms. The grouping genetic algorithm (GGA) is an evolution of the GA where the focus is shifted from individual items, as in classical GAs, to groups or subsets of items. The idea behind this GA evolution, proposed by Emanuel Falkenauer, is that solving some complex problems, a.k.a. clustering or partitioning problems where a set of items must be split into disjoint groups of items in an optimal way, is better achieved by making characteristics of the groups of items equivalent to genes. These kinds of problems include bin packing, line balancing, clustering with respect to a distance measure, equal piles, etc., on which classic GAs proved to perform poorly. Making genes equivalent to groups implies chromosomes that are in general of variable length, and special genetic operators that manipulate whole groups of items. For bin packing in particular, a GGA hybridized with the Dominance Criterion of Martello and Toth is arguably the best technique to date.

Interactive evolutionary algorithms are evolutionary algorithms that use human evaluation. They are usually applied to domains where it is hard to design a computational fitness function, for example, evolving images, music, artistic designs and forms to fit users' aesthetic preference.