
    Variational quantum semi-supervised classifier based on label propagation

Chinese Physics B, 2023, Issue 7

Yan-Yan Hou(侯艷艷), Jian Li(李劍), Xiu-Bo Chen(陳秀波), and Chong-Qiang Ye(葉崇強)

1 College of Information Science and Engineering, Zaozhuang University, Zaozhuang 277160, China

2 School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing 100876, China

3 School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing 100876, China

4 Information Security Center, State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China

Keywords: semi-supervised learning, variational quantum algorithm, parameterized quantum circuit

1. Introduction

Classification, one of the most common problems in machine learning, is devoted to predicting the labels of new input data based on labeled data. Classification algorithms have been widely applied in image processing, speech recognition, and other fields. With the development of artificial intelligence and cloud computing, the data involved in classification tasks are expanding rapidly, yet most of them are unlabeled and must be labeled by experienced annotators in advance. As labeling data is expensive and time-consuming,[1] researchers began to study how to add unlabeled data to the training set and utilize both labeled and unlabeled data to build semi-supervised classifiers. Label propagation, a crucial semi-supervised learning method, predicts the labels of unlabeled data based on graphs. Semi-supervised classifiers based on label propagation pay more attention to the internal relevance of the data and perform well on the classification of multiple correlated data. However, as the scale of data grows, creating a graph requires a higher computational cost, and implementing a semi-supervised classifier based on label propagation becomes a challenging task for classical computers.

Quantum machine learning, the intersection of quantum physics and machine learning, offers potential speed-ups over classical machine learning algorithms. Variational quantum algorithms (VQAs) are the dominant strategy in the noisy intermediate-scale quantum (NISQ) era. They build hybrid quantum-classical models, where parameterized quantum circuits construct the cost function of a problem and classical computers train the circuit parameters by minimizing the cost function. In VQAs, quantum devices focus on the classically intractable part of a problem, while the parts that are difficult to implement on quantum devices are transferred to classical computers. Therefore, VQAs have lower requirements for quantum resources and have become important methods for implementing quantum machine learning tasks. At present, VQAs have been applied in classification,[2–5] clustering,[6,7] generative models,[8–10] dimensionality reduction,[11–13] etc.

Quantum systems represent data in a Hilbert space of exponential dimension. Inspired by the advantages of quantum systems in processing high-dimensional data, researchers proposed a series of quantum classification algorithms.[14–17] Considering the high computational complexity of kernel computation, Rebentrost[18] offered a quantum support vector machine (QSVM) algorithm. This algorithm adopted quantum matrix inversion[19] and a density matrix exponentiation method[20] to implement binary classification tasks, and achieved exponential speed-ups over the corresponding classical classifiers under certain conditions. Schuld[21] proposed a quantum distance-based classifier, which only used Hadamard gates and two single-qubit measurements to implement binary classification tasks. Blank[22] designed a quantum kernel classifier based on the quantum swap test operation, called a swap test classifier. This classifier achieves good classification accuracy in non-linear classification tasks.

The label propagation method predicts the labels of unlabeled data by minimizing an energy function, which is similar to the cost function optimization in VQAs. Inspired by this similarity, we adopt VQAs to design a quantum label propagation method and further implement a quantum semi-supervised classifier based on the predicted labels. Our work has two main contributions. (i) A variational label propagation method based on a locally parameterized quantum circuit is designed for the first time. The locally parameterized quantum circuit can be used to implement VQAs in which only some of the parameters are unknown. (ii) A classifier based on a hybrid Bell and Z bases measurement is designed; this measurement method reduces the circuit depth and is more suitable for implementation on NISQ devices. We organize the paper as follows. Section 2 gives a review of classical label propagation. Section 3 outlines the variational quantum label propagation method. Section 4 designs a quantum semi-supervised classifier based on the hybrid Bell and Z bases measurement. Section 5 verifies the accuracies of label propagation and the semi-supervised classifier. Finally, we draw conclusions and discuss future research directions.

2. Review of label propagation

Label propagation begins with mapping a data set into an undirected weighted graph, where nodes represent data and edges reflect the similarities between data. If two data points have great similarity, the edge between them has a higher weight; otherwise, the edge has a lower weight. Labeled data propagate their labels to the neighboring unlabeled data based on the undirected weighted graph. Since a graph corresponds to a matrix, matrix operations can be used to implement semi-supervised learning. Let D = {D_l, D_u} denote a training data set, including labeled data D_l = {(x_1, y_1), (x_2, y_2), ..., (x_l, y_l)} and unlabeled data D_u = {x_{l+1}, x_{l+2}, ..., x_{l+u}}, where x_i ∈ R^m is the i-th data point described by m real-valued attributes and y_i ∈ {+1, −1} is the corresponding label. The k-nearest-neighbors method is a common way of constructing the undirected weighted graph G = (V, E), where V = {x_i}_{i∈n} represents the nodes, E = {e_ij}_{i,j∈n} denotes the edges between nodes x_i and x_j, and n = l + u. If x_j is one of the k nearest neighbors of x_i, there is an edge between the nodes x_i and x_j; otherwise, there is no edge.
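
As a point of reference, the following is a minimal classical sketch (in Python/NumPy, with names of our own choosing) of the k-nearest-neighbors graph construction described above; the binary 0/1 weights follow the neighbor relationship used later in Section 3.3.

```python
import numpy as np

def knn_weight_matrix(X, k):
    """Build the binary k-nearest-neighbors weight matrix W for data X (n x m).

    w_ij = 1 if x_j is among the k nearest neighbors of x_i (or vice versa),
    and 0 otherwise; the matrix is symmetrized so W describes undirected edges.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances d^2(x_i, x_j).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)           # exclude self-loops
    W = np.zeros((n, n))
    for i in range(n):
        neighbors = np.argsort(d2[i])[:k]  # indices of the k nearest neighbors of x_i
        W[i, neighbors] = 1.0
    return np.maximum(W, W.T)              # symmetrize: keep an edge if either direction holds
```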

Let W = {w_ij}_{i,j∈n} denote the weight matrix of the edges, where the weight of edge e_ij is

w_ij = 1 if x_j ∈ N_k(x_i), and w_ij = 0 otherwise, (1)

with N_k(x_i) denoting the set of k nearest neighbors of x_i. Let f(x_i) represent the predicted label of x_i. Similar samples should have similar labels: the more similar the samples, the smaller the difference between their labels. The energy function of all training data[1] is then

E(f) = (1/2) Σ_{i,j} w_ij (f(x_i) − f(x_j))² = Σ_i d_i f(x_i)² − Σ_{i,j} w_ij f(x_i) f(x_j), (2)

where d_i = Σ_j w_ij represents the sum of the i-th row of W and D = diag(d_1, d_2, ..., d_n) denotes the degree matrix. Equation (2) can be rewritten in matrix form as

E(f) = f^T (D − W) f, (3)

where f = (f_l, f_u) represents the predicted label vector of all training data; f_l = (f(x_1), ..., f(x_l)) and f_u = (f(x_{l+1}), ..., f(x_{l+u})) correspond to the label vectors of the labeled and unlabeled data, respectively. Once E(f) attains its minimum value, f(x_i) for labeled data will be equal to the correct label y_i, and f(x_i) for unlabeled data will be closest to the correct label. Thus, predicting the labels of unlabeled data can be implemented by solving the optimization problem

min_f f^T L f, subject to f(x_i) = y_i for i = 1, ..., l, (4)

where L = D − W is the Laplacian matrix. Figure 1 shows a simple example of label propagation.
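
For comparison with the quantum method developed below, the constrained minimization of f^T L f with the labeled entries fixed has the well-known closed-form solution f_u = −L_uu^{−1} L_ul f_l; the sketch below computes it classically with NumPy (the labeled-first sample ordering and the function name are our assumptions).

```python
import numpy as np

def propagate_labels(W, y_l):
    """Classical reference solution of min_f f^T L f with f_l fixed to y_l.

    W   : (n, n) symmetric weight matrix, labeled samples listed first
    y_l : (l,)   labels of the labeled samples, entries in {+1, -1}
    Returns the predicted labels f_u of the u = n - l unlabeled samples.
    Assumes every unlabeled node is connected to the labeled part so that
    the block L_uu is invertible.
    """
    n, l = W.shape[0], y_l.shape[0]
    D = np.diag(W.sum(axis=1))             # degree matrix, d_i = sum_j w_ij
    L = D - W                              # graph Laplacian
    L_uu = L[l:, l:]                       # block acting on the unlabeled samples
    L_ul = L[l:, :l]                       # coupling between unlabeled and labeled samples
    # Setting the gradient of f^T L f to zero for the free entries f_u gives
    # L_uu f_u + L_ul f_l = 0  =>  f_u = -L_uu^{-1} L_ul f_l.
    return -np.linalg.solve(L_uu, L_ul @ y_l)
```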

Fig. 1. Label propagation. Red nodes represent labeled data belonging to the +1 class. Blue nodes represent labeled data belonging to the −1 class. White nodes indicate unlabeled data. Panel (a) shows the graph before label propagation. Panel (b) gives the graph after label propagation. After label propagation, labeled data propagate their labels to their neighboring unlabeled data according to the k-nearest-neighbors principle.

3. Variational quantum label propagation method

In this section, we reformulate the original label propagation and design a variational quantum label propagation (VQLP) method according to the similarity between the energy function E(f) and the cost functions of VQAs.

3.1. Reformulation of label propagation

3.2. The overall structure of the VQLP algorithm

The VQLP algorithm adopts an iterative optimization method to predict the optimal label vector. In each iteration, the work includes evaluating the cost function, learning the parameters, and predicting the label vector. Figure 2 shows the overall structure of the VQLP algorithm. In the first stage, the state |ψ⟩ of the incidence matrix B is prepared by conditionally accessing the state of the weight matrix W. The label vector state |f̃(θ)⟩ represents the labels of the labeled and unlabeled data. As only the labels of the unlabeled data are unknown, we design a locally parameterized quantum circuit V(θ) to build |f̃(θ)⟩. After preparing |ψ⟩ and |f̃(θ)⟩, a hybrid Bell and Z bases measurement, called U2, is applied on |ψ⟩ and |f̃(θ)⟩ to construct the cost function C(θ). The label vector |f̃(θ)⟩ produced by the initial parameters may not correspond to the correct labels of the training data. Thus, the second stage is to search for the optimal label vector. The cost function value C(θ) is transmitted to a classical optimizer and minimized by tuning the parameters θ. Once the number of iterations reaches the maximum τ or C(θ) is less than the specified error threshold ε, the optimal parameters θ* are obtained. V(θ) consists of a parameterized quantum circuit for building the labels of the unlabeled data and an unparameterized quantum circuit for building the labels of the labeled data. In the third stage, the parameterized quantum circuit V′(θ*) (a partial circuit of V(θ)) acts on the initial state |0⟩ to construct the label vector state |f̃_u(θ*)⟩ for the unlabeled data. Algorithm 1 shows the outline of the VQLP algorithm.
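
The outer optimization loop can be summarized schematically as follows; this is only a sketch in which evaluate_cost stands in for the quantum estimation of C(θ) described above, and the COBYLA optimizer is chosen because it is the one used in the simulations of Section 5.

```python
import numpy as np
from scipy.optimize import minimize

def vqlp_optimize(evaluate_cost, n_params, max_iter=200, tol=1e-4, seed=0):
    """Schematic outer loop of the VQLP algorithm.

    evaluate_cost(theta) is assumed to run the quantum circuits of Fig. 2
    (prepare |psi>, prepare the label state with V(theta), perform the hybrid
    Bell/Z-basis measurement) and return the estimated cost C(theta).
    """
    rng = np.random.default_rng(seed)
    theta0 = rng.uniform(0, 2 * np.pi, size=n_params)   # random initial parameters
    result = minimize(evaluate_cost, theta0, method="COBYLA",
                      tol=tol, options={"maxiter": max_iter})
    return result.x                                      # optimal parameters theta*
```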

Fig. 2. The overall structure of the VQLP algorithm. U1 acts on the initial state |ψ0⟩ to construct the state |ψ⟩ of the incidence matrix B. The parameterized quantum circuit V(θ) acts on the initial state |f̃_0⟩ to produce the state |f̃(θ)⟩ of the label vector f. The cost function C(θ) is obtained by performing the unitary operation U2 followed by classical post-processing, where U2 is responsible for computing the Hilbert–Schmidt inner product between |ψ⟩ and |f̃(θ)⟩. In each iteration, the classical computer minimizes C(θ) to get the optimized parameters θ. Once the optimal parameters θ* are obtained, the ansatz V′(θ*) acts on the state |0···0⟩ to build the label vector |f̃_u(θ*)⟩ for the unlabeled data, where |f̃_u^i⟩ represents the i-th qubit of |f̃_u(θ*)⟩.

Algorithm 1. Variational quantum label propagation (VQLP) algorithm.

3.3. Construct normalized incidence matrix

In this subsection, our primary work is to construct the state |ψ⟩ for the normalized incidence matrix B. To compensate for scaling effects, we first standardize all training data to zero mean and unit variance, and then normalize them into unit vectors. Let |φ_xp⟩ = (1/|x_p|) Σ_j x_pj |j⟩ represent the amplitude encoding of the training data x_p, where |·| represents the l2-norm and x_pj means the j-th element of x_p. Reference [28] proposed a quantum algorithm for estimating Euclidean distances between quantum states; this algorithm corresponds to the mapping |p⟩|q⟩|0⟩ → |p⟩|q⟩|d²(φ_xp, φ_xq)⟩, where d²(φ_xp, φ_xq) = |φ_xp − φ_xq|² represents the square of the Euclidean distance between |φ_xp⟩ and |φ_xq⟩. The first step in constructing the incidence matrix is to prepare the weight matrix W by the k-nearest-neighbors method. To implement this step, we prepare the superposition state of Euclidean distances between |φ_xp⟩ and |φ_xq⟩, and search for the k minimum values of d²(x_p, x_q) by the minimum search method. After searching the k minimum values for all training data, the state |ψ_W⟩ for the weight matrix W is constructed, where w_pq describes the neighbor relationship between x_p and x_q. If x_q is one of the k nearest neighbors of x_p or x_p is one of the k nearest neighbors of x_q, then w_pq = 1; otherwise, w_pq = 0.

According to Eq. (15), the elements of the incidence matrix B come from W. The second step is to build the state |ψ⟩ for the incidence matrix B based on |ψ_W⟩. We prepare the state

as input, where register 1 stores the indexes of the nodes, and registers 2 and 3 store the indexes of w_pq. Registers 4, 5, and 6 with initial value |0⟩|0⟩|0⟩ will store comparison results. The specific steps of building the incidence matrix B are as follows.

(1) Perform the comparison operation (UC)[28] on registers 2 and 3, where the comparison result is stored in registers 4 and 6, and yield the state

Measure register 6 in the Z basis; if the measurement result is 0, then we get the state

(2) Extract the elements of the incidence matrix B. An equality comparison is first applied on registers 1 and 2. If registers 1 and 2 have the same value, register 5 is set to |1⟩. If register 5 is still |0⟩ after performing the equality comparison, another equality comparison is performed on registers 1 and 3. If registers 1 and 3 have the same value, register 4 is set to |1⟩. Through the two equality comparisons, we get the state

(3) Perform a CNOT operation on registers 5 and 4, where register 4 serves as the control register, then measure register 5 in the Z basis. If the measurement result is 1, we obtain the state

(4) Apply a Hadamard operation on register 4, then measure it; if the result is |1⟩, the system yields the state

corresponding to the normalized incidence matrix B. Figure 3 shows the circuit implementation, and this circuit corresponds to the module U1 in Fig. 2.

Fig. 3. Circuit for constructing the incidence matrix. Reg. 1–Reg. 6 represent registers 1–6. The circuit in the dotted box (a) represents the comparison operation. The circuit in the dotted box (b) represents the equality comparison between Reg. 1 and Reg. 2, and the circuit in the dotted box (c) denotes the equality comparison between Reg. 1 and Reg. 3. The output state |ψ⟩ corresponds to the incidence matrix B.

3.4. Build label vector

In this subsection, our primary work is to build the state |f̃(θ)⟩ of the predicted label vector f = (f_l, f_u). As the correct label vector (y_1, ..., y_l) of the labeled data is known, the predicted label vector f_l does not need to be updated in the optimization process. Let |f̃_l⟩ be the amplitude encoding of f_l, where y_i ∈ {+1, −1}. As the predicted label vector f_u is unknown, we adopt a parameterized quantum circuit (ansatz) to prepare the state |f̃_u(θ)⟩.

In general, the number of unlabeled data is not less than the number of labeled data in semi-supervised learning. To make the two terms of |f̃(θ)⟩ satisfy a particular proportion, the controlled rotation Ry(2α) is applied on an added register 9 conditional on register 8 being |1⟩, where the angle α fixes the desired proportion. This operation yields the state

Subsequently, measure register 9 in the Z basis; if the measurement result is 0, we get the state

where y_i ∈ {+1, −1}. As register 8 is redundant, apply a Hadamard operation on register 8 followed by measuring it in the Z basis; if the measurement result is 0, we finally get the state

where the amplitudes of |f̃(θ)⟩ are proportional to the predicted label vector f. Figure 4 shows the circuit for constructing |f̃(θ)⟩, where Ul and V′(θ) are used to build the labels for the labeled and unlabeled data, respectively. As only part of the circuit is parameterized, the circuit is called a locally parameterized quantum circuit, corresponding to the module V(θ) of Fig. 2.

Fig. 4. Circuit for constructing the label vector |f̃(θ)⟩. Reg. 7, Reg. 8, and Reg. 9 represent registers 7, 8, and 9, respectively. Ul denotes quantum access to the labeled vector |f̃_l⟩, and V′(θ) is the parameterized quantum circuit for building |f̃_u(θ)⟩.
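
As a purely classical illustration of the amplitude encoding used for the labeled part of the label vector, the known ±1 labels are simply normalized into a unit vector; the helper below is a sketch and is not the circuit Ul itself.

```python
import numpy as np

def amplitude_encode_labels(y_l):
    """Return the normalized amplitude vector encoding the labels y_l in {+1, -1}.

    The resulting unit vector has amplitudes proportional to the label entries,
    mirroring how the labeled part of the label-vector state encodes f_l.
    """
    y_l = np.asarray(y_l, dtype=float)
    return y_l / np.linalg.norm(y_l)   # e.g. (+1, -1, +1, +1) -> amplitudes of +-1/2
```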

Many ansatzes can be used to implement V′(θ). The hardware-efficient ansatz uses a lower circuit depth and fewer parameters to represent the solution space of a problem,[29,30] and we adopt this ansatz to build V′(θ). The hardware-efficient ansatz adopts a layered layout. Each layer consists of multiple 2-qubit unitary modules, in which e^{−iθH_μ} represents a single-qubit parameterized gate, H_μ is a Hermitian operator, and W_μ represents an unparameterized gate. Usually, the unitary module V(θ_ik) includes multiple single-qubit parameterized gates, so θ_ik is a set of rotation angles. In the parameter optimization process, the error of |f̃_u(θ*)⟩ decreases exponentially as the number of layers of the ansatz V′(θ*) increases. To determine the number of layers, we first prepare the ansatz with fewer layers and gradually increase the layers until the ansatz satisfies the specified error tolerance.[31] With increasing data scale, the hardware-efficient ansatz shows an exponentially vanishing gradient (barren plateau). The VQLP algorithm adopts the alternating layered layout to alleviate the barren plateau problem. In this layout, the entangling gates in each layer only act on local qubits,[32] and the cost function is a combination of local functions, so the ansatz uses a shallower circuit depth to mitigate the vanishing gradient problem. Figure 5 shows the circuit implementation, where e^{−iθH_μ} and W_μ are implemented by the rotation Ry and CNOT, respectively.

Fig. 5. Parameterized quantum circuit V′(θ) (6-qubit input). This circuit includes l layers, where {q1, ..., q6} represents the qubit sequence of |f̃_u(θ)⟩. The i-th dashed box indicates the unitary operation in the i-th layer. Each layer is composed of multiple unitary modules V(θ_i^j), consisting of single-qubit rotations Ry(θ_i^j) and CNOT gates acting on neighboring qubit pairs. The circuit uses an alternating layered layout, and the unitary modules of adjacent layers act on alternating qubit pairs.
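
The alternating layered layout of Fig. 5 can be emulated on a classical statevector as follows; this NumPy sketch (all function names are ours) applies a layer of Ry rotations followed by CNOTs on alternating neighboring pairs, which is the structure assumed for the ansatz V′(θ).

```python
import numpy as np

def ry(theta):
    """Single-qubit Ry rotation."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, qubit, n):
    """Apply a single-qubit gate to the given qubit of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(np.tensordot(gate, psi, axes=([1], [qubit])), 0, qubit)
    return psi.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply a CNOT (control -> target) to an n-qubit statevector."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1                                   # subspace with control qubit = 1
    flip_axis = target if target < control else target - 1
    psi[tuple(idx)] = np.flip(psi[tuple(idx)].copy(), axis=flip_axis)
    return psi.reshape(-1)

def alternating_layered_ansatz(thetas, n_qubits):
    """State produced by layers of Ry rotations followed by CNOTs on
    alternating neighboring qubit pairs, starting from |0...0> (cf. Fig. 5)."""
    state = np.zeros(2 ** n_qubits)
    state[0] = 1.0
    for layer, layer_params in enumerate(thetas):      # thetas has shape (layers, n_qubits)
        for q, angle in enumerate(layer_params):
            state = apply_1q(state, ry(angle), q, n_qubits)
        for q in range(layer % 2, n_qubits - 1, 2):    # even pairs, then odd pairs, ...
            state = apply_cnot(state, q, q + 1, n_qubits)
    return state

# Example: a 2-layer, 6-qubit ansatz with random parameters.
rng = np.random.default_rng(0)
psi = alternating_layered_ansatz(rng.uniform(0, 2 * np.pi, size=(2, 6)), 6)
print(np.linalg.norm(psi))  # the state stays normalized (prints ~1.0)
```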

3.5. Compute cost function

After obtaining the incidence matrix state |ψ⟩ and the label vector state |f̃(θ)⟩, the subsequent work is to compute the cost function C(θ). According to ρ1 = tr_{2,3}(ρ), the cost function in Eq. (7) can be rewritten as

which is the Hilbert–Schmidt inner product between |f̃(θ)⟩ and |ψ⟩. We adopt the Bell basis measurement method[33] to compute the cost function C(θ). As |f̃(θ)⟩ is stored in register 7 and |ψ⟩ is stored in registers 1, 2, and 3, Eq. (21) can be evaluated from the Bell-basis measurement outcomes. Let c⃗ = (1, 1, 1, −1) denote the post-processing vector of the Bell basis measurement; the cost function value C(θ) is then computed classically as the inner product between c⃗ and the vector of measurement outcome probabilities.

Fig. 6. The circuit for computing the cost function C(θ). Reg. 1, Reg. 2, and Reg. 3 store the incidence matrix state |ψ⟩, and Reg. 7 stores the label vector |f̃(θ)⟩. A CNOT operation is performed on registers 1 and 7, followed by a Hadamard on register 7. After measuring the 2-qubit operator CZ on registers 1 and 7, the expectation value ⟨CZ⟩_{1,7}, corresponding to the cost function value C(θ), can be obtained by further classical computation.
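
To make the classical post-processing concrete, the sketch below estimates ⟨CZ⟩ on two qubits from measurement counts using the post-processing vector (1, 1, 1, −1); the counts dictionary format is an assumption of ours, not part of the paper.

```python
def cz_expectation(counts):
    """Estimate <CZ> on two qubits from measurement counts.

    counts: dict mapping two-bit outcome strings ('00', '01', '10', '11') to
    the number of shots that produced them. The post-processing vector
    (+1, +1, +1, -1) assigns -1 only to the '11' outcome.
    """
    shots = sum(counts.values())
    signs = {"00": 1, "01": 1, "10": 1, "11": -1}
    return sum(signs[b] * c for b, c in counts.items()) / shots

# Example with hypothetical counts from a run of the circuit in Fig. 6.
print(cz_expectation({"00": 480, "01": 260, "10": 200, "11": 60}))  # -> 0.88
```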

4. Semi-supervised binary classifier

Quantum label propagation obtains the labels of the unlabeled data in the training data set. In this section, we design a quantum semi-supervised binary classifier based on all training data. Let |φ_xi⟩ represent training data, where i ∈ {1, ..., n} and n = l + u. If i ∈ {1, ..., l}, |φ_xi⟩ represents labeled data; otherwise, |φ_xi⟩ denotes unlabeled data. f′ = {f′_1, f′_2, ..., f′_n} denotes the predicted label vector obtained by label propagation, where f′_i = 0 means x_i belongs to the +1 class and f′_i = 1 means x_i belongs to the −1 class. Let |φ_x*⟩ be the test data, and let k*_i = |⟨φ_x*|φ_xi⟩|² represent the overlap between |φ_x*⟩ and |φ_xi⟩. We adopt a weighted sum of the overlaps between the test and training data to predict the label f*

for the test data |φ_x*⟩, where c1 and c2 are weight coefficients determined by the importance of labeled and unlabeled data. In semi-supervised learning, labeled data are more important than unlabeled data, so c1 > c2.
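
Since the explicit formula for f* is not reproduced here, the following sketch only illustrates the weighted-overlap decision rule in classical form, with labels mapped to ±1 and c1 > c2 weighting the labeled samples more heavily; the function name and the sign convention are ours.

```python
import numpy as np

def predict_label(phi_test, phi_train, labels, l, c1=2.0, c2=1.0):
    """Weighted-overlap prediction for a test state.

    phi_test  : normalized test vector |phi_x*>
    phi_train : (n, d) array of normalized training vectors |phi_xi>
    labels    : (n,) array with entries +1 / -1 (true labels for the first l
                samples, propagated labels for the remaining unlabeled ones)
    l         : number of labeled samples; c1 > c2 weights them more heavily
    """
    overlaps = np.abs(phi_train.conj() @ phi_test) ** 2   # k*_i = |<phi_x*|phi_xi>|^2
    weights = np.where(np.arange(len(labels)) < l, c1, c2)
    score = np.sum(weights * labels * overlaps)
    return 1 if score > 0 else -1                          # sign of the score gives the class
```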

The quantum swap test operation can be used to compute |⟨φ_x*|φ_xi⟩|², so it can also be applied to predict the label f*. However, this method needs multiple Toffoli gates, and the circuit depth grows as the number of input qubits increases. Thus, implementing the classifier based on the quantum swap test operation is not easy for current quantum devices. The Bell basis measurement method[33] is a novel method for computing the overlap of quantum states, which has a shallower circuit depth and is easy to implement on NISQ devices. Inspired by this method, we design a hybrid Bell and Z bases measurement method to build f*.

Given the superposition state of the test data, the training data, their indexes, and the corresponding labels stored in registers A, B, C, and D, the classification proceeds as follows.

(1) Perform a CNOT operation on registers A and B, where register A serves as the control register, and get the state

(2) Apply a Hadamard operation on register A and get

Through the first two steps, the quantum state overlaps have been stored in the amplitudes of |ω2⟩⟨ω2|.

(3) Measure the expectation value of the controlled-Z operator on registers A and B and the σ_Z operator on register D; this operation can be written as ⟨ω2|CZ_AB σ_Z^D|ω2⟩, where CZ_AB denotes the controlled-Z operator on registers A and B, and σ_Z^D denotes the σ_Z operator on register D. As CZ_AB = (|00⟩⟨00| + |01⟩⟨01| + |10⟩⟨10| − |11⟩⟨11|)_AB and σ_Z^D = (|0⟩⟨0| − |1⟩⟨1|)_D, the expectation value is

where the superscripts A, B, D of the operators are omitted for simplicity. When f′_i = 0, register D is |0⟩; when f′_i = 1, register D is |1⟩. According to the two values of register D, Eq. (27) can be rewritten as

The quantum swap test can be used to compute the overlap between two quantum states of n qubits, where the qubits that form the states may be entangled. According to the circuit equivalence of the Bell basis measurement and the quantum swap test,[22] the Bell basis measurement in the quantum semi-supervised classifier can be generalized to compute the overlap between |φ_x*⟩_A and |φ_xi⟩_B with n qubits. Figure 7 shows the circuit implementation.
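
As a quick numerical check of this equivalence in the single-qubit case, the NumPy snippet below applies the CNOT + Hadamard (Bell basis) circuit and shows that the expectation value of CZ reproduces the overlap |⟨a|b⟩|²; the matrix conventions are our own.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])  # qubit A controls B
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

def bell_measurement_overlap(a, b):
    """Estimate |<a|b>|^2 via the CNOT + Hadamard (Bell basis) circuit:
    the expectation value of CZ on the transformed state equals the overlap."""
    psi = np.kron(a, b)                        # joint state |a>_A |b>_B
    psi = CNOT @ psi                           # CNOT with A as control
    psi = np.kron(H, np.eye(2)) @ psi          # Hadamard on A
    return np.real(np.conj(psi) @ (CZ @ psi))  # <CZ>

a = np.array([np.cos(0.3), np.sin(0.3)])
b = np.array([np.cos(1.1), np.sin(1.1)])
print(bell_measurement_overlap(a, b), abs(np.vdot(a, b)) ** 2)  # the two values agree
```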

Fig. 7. The circuit of the semi-supervised binary classifier. Reg. A stores |φ_x*⟩. Reg. B, Reg. C, and Reg. D store |φ_xi⟩, the index |i⟩, and the corresponding label |f_i⟩, respectively. A CNOT is performed on registers A and B, followed by a Hadamard operation on register A. The label f* is obtained by measuring the expectation value of the controlled-Z operator on registers A and B and the expectation value of the σ_Z operator on register D.

The quantum semi-supervised classifier based on the hybrid Bell and Z bases measurement, shown in Fig. 7, contains two layers. CNOT gates implement the first layer, and Hadamard gates implement the second layer. As the CNOT gates act on different qubits, all CNOT gates can be executed in parallel. Similarly, the Hadamard gates act on different qubits and can also be performed in parallel. No matter how many qubits the training or test data contain, the quantum semi-supervised classifier based on the hybrid Bell and Z bases measurement needs only two layers, independent of the size of the classification problem. According to the method proposed in Ref. [24], the quantum semi-supervised classifier can also be implemented by the swap test operation, shown in Fig. 8. That circuit requires multiple CNOT and Toffoli gates, which cannot be implemented in parallel, and one Toffoli gate requires multiple one-qubit and two-qubit gates to implement. Figure 9 shows the Toffoli gate decomposition, where the Toffoli gate is implemented with CNOT, Hadamard, T, and T† gates. The quantum semi-supervised classifier based on the swap test operation needs 14 layers when the training or test data contain one qubit and 14m layers when they contain m qubits; the circuit depth thus scales linearly with the size of the classification problem. Compared with the quantum semi-supervised classifier based on the swap test operation, our proposed quantum semi-supervised classifier is more suitable for implementation on near-term quantum devices.

The quantum semi-supervised classifier based on the hybrid Bell and Z bases measurement requires more complex classical post-processing operations, which scale linearly with the system size l. In terms of implementation difficulty, classical post-processing of higher complexity is more easily realized than a quantum circuit whose depth scales linearly with l. The speed-up of the quantum semi-supervised classifier does not come from transferring exponentially complex work to the classical computer but from parallel quantum operations. To reduce the complexity of the classical post-processing, we can convert the complex classical post-processing into quantum operations. Figure 10 shows the circuit implementation. This circuit adds an ancilla register and Toffoli gates, and the expectation value of the σ_Z operator on the ancilla register replaces the expectation value of the controlled-Z operator on registers A and B. Compared with the circuit in Fig. 7, this circuit has a higher depth but simpler classical post-processing.

Fig. 10. The quantum semi-supervised binary classifier with simplified classical post-processing. Compared with the circuit in Fig. 7, this circuit converts the complex classical post-processing of the semi-supervised binary classifier into quantum operations, where f* is obtained by measuring the expectation value of the σ_Z operator on the ancilla register (|0⟩) and register D. The circuit in the dashed box has the same function as that in Fig. 7.

5. Numerical simulations and performance analysis

In this section, we adopt the Iris dataset to test the performance of the quantum semi-supervised binary classifier based on the hybrid Bell and Z bases measurement. The Iris dataset contains 150 samples, where samples 0–49 belong to class 1, samples 50–99 belong to class 2, and samples 100–149 belong to class 3. Classifying samples of classes 2 and 3 is the most difficult task for the Iris dataset, and we mainly analyze this task. We first choose 8 samples {x0, x1, x2, x3, x4, x5, x6, x7} to demonstrate label propagation, where samples {x0, x2, x4, x6} belong to class 2 and samples {x1, x3, x5, x7} belong to class 3. Let samples {x0, x1, x2, x3} be the labeled data and assume samples {x4, x5, x6, x7} are unlabeled; label propagation then predicts the labels of samples {x4, x5, x6, x7}. If the sample x_i belongs to class 2, the label y_i is 1, and if the sample x_i belongs to class 3, the label y_i is −1. Figure 11 shows the predicted labels of samples {x4, x5, x6, x7}. The simulation results show that the predicted label f(x_i) is close to the correct label y_i. Let L = [f(x4), f(x5), f(x6), f(x7)] represent the normalized predicted label vector, and let S, the overlap between L and the correct normalized label vector of samples {x4, x5, x6, x7}, denote the accuracy of the predicted labels. Figure 12 exhibits the accuracy S under different initial parameters. We find that the accuracy S reaches 99.5% after about 90 iterations regardless of the initial parameters.
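
The accuracy measure S can be illustrated classically as the overlap between the normalized predicted and correct label vectors; in the sketch below the correct labels of x4–x7 are (+1, −1, +1, −1) as stated above, while the predicted values are hypothetical.

```python
import numpy as np

def label_accuracy(predicted, correct):
    """Overlap S between the normalized predicted and correct label vectors,
    used here as a measure of the quality of the propagated labels."""
    p = np.asarray(predicted, dtype=float); p /= np.linalg.norm(p)
    c = np.asarray(correct, dtype=float);   c /= np.linalg.norm(c)
    return float(np.dot(p, c))

# Hypothetical predicted labels close to the correct ones give S close to 1.
print(label_accuracy([0.93, -1.02, 0.97, -0.95], [1, -1, 1, -1]))
```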

Fig. 11. Predicted labels versus the number of iterations. The classical optimizer is the COBYLA method. Curves represent the estimated labels of the unlabeled data.

Fig. 12. Accuracy versus the number of iterations. Curves represent the accuracies under four different randomly initialized parameters.

Our second task is to demonstrate the accuracy of the semi-supervised binary classifier. Figure 13 presents the result of classifying samples of classes 1 and 2, where a predicted label greater than 0 means the sample belongs to class 1; otherwise, the sample belongs to class 2. The result shows that all samples of classes 1 and 2 can be correctly classified. Figure 14 shows the result of classifying samples of classes 1 and 3, where all samples can be correctly classified. Figure 15 shows the result of classifying samples of classes 2 and 3. As samples of classes 2 and 3 are not linearly separable, a few classification errors occur in this task.

Fig. 13. Classification results of the semi-supervised binary classifier (classes 1 and 2). The horizontal axis represents the index i of the samples, and the vertical axis shows the predicted label f(x_i). Stars represent samples from class 1, and dots represent samples from class 2.

Fig. 14. Classification results of the semi-supervised binary classifier (classes 1 and 3). Stars represent samples from class 1, and crosses represent samples from class 3.

Fig. 15. Classification results of the semi-supervised binary classifier (classes 2 and 3). Dots represent samples from class 2, and crosses represent samples from class 3.

Table 1 shows the mean accuracies and standard deviations of the quantum semi-supervised binary classifier based on the hybrid Bell and Z bases measurement (semi-supervised classifier) and the quantum classifier based on the swap test operation (swap test classifier)[22] under 10 random initial parameters, where each sample only contains two features. Cells are given in the format mean accuracy ± standard deviation. For the semi-supervised classifier, the mean accuracy of classifying classes 1 and 2 is 100%, and the mean accuracy of classifying classes 1 and 3 is also 100%. The mean accuracy of classifying classes 2 and 3 is 90.90%; still, this accuracy of the semi-supervised classifier is higher than that of the swap test classifier.
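
The "mean accuracy ± standard deviation" cell format can be reproduced with a one-line summary over the 10 runs; the accuracies in the example below are hypothetical placeholders, not the values from Table 1.

```python
import numpy as np

def summarize_accuracies(accuracies):
    """Mean accuracy and standard deviation over repeated runs, matching the
    'mean +/- standard deviation' cell format of Tables 1 and 2."""
    acc = np.asarray(accuracies, dtype=float)
    return acc.mean(), acc.std()

# Example with hypothetical accuracies from 10 random initializations.
mean, std = summarize_accuracies([0.92, 0.90, 0.91, 0.89, 0.93, 0.90, 0.92, 0.91, 0.90, 0.91])
print(f"{mean:.4f} +/- {std:.4f}")
```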

Table 1. Classification accuracy and standard deviation (two features).

To improve the separability of classes 2 and 3, we extract four features from the Iris dataset. Table 2 shows the mean accuracies and standard deviations for samples containing four features. Comparing Tables 1 and 2, we find that classifying the Iris dataset with four features gives higher accuracy than classifying it with two features.

6. Conclusions and future work

In this paper, we adopt a quantum method to implement a quantum semi-supervised binary classifier. By converting the incidence matrix and the label vector into quantum states, we design a variational quantum label propagation (VQLP) method. This method utilizes locally parameterized quantum circuits to reduce the number of parameters required in the optimization and is more suitable for implementation on quantum devices. Based on the predicted labels, we further design a quantum semi-supervised classifier based on the hybrid Bell and Z bases measurement, which has a shallower circuit depth compared with the swap test classifier. Simulation results show that the VQLP method can predict the labels of unlabeled data with 99.5% accuracy, and that the quantum semi-supervised classifier has higher classification accuracy than the swap test classifier. This algorithm assumes quantum operations in a noiseless environment; however, hardware noise exists when semi-supervised learning algorithms are implemented on near-term quantum devices. The quantum semi-supervised classifier in noisy environments needs to be researched in future work. In addition, we can further investigate how to adopt multiple data copies to build quantum semi-supervised classifiers based on kernel functions. The design of the k-nearest-neighbor graph in VQLP provides a novel idea for creating quantum machine learning models based on graphs. Simultaneously, this research promotes the development of VQAs in the field of quantum semi-supervised learning.

    Acknowledgements

Project supported by the Open Fund of Advanced Cryptography and System Security Key Laboratory of Sichuan Province (Grant No. SKLACSS-202108), the National Natural Science Foundation of China (Grant No. U162271070), and the Scientific Research Fund of Zaozhuang University (Grant No. 102061901).
