
    Hyperparameter on-line learning of stochastic resonance based threshold networks

    Chinese Physics B, 2022, Issue 8

    Weijin Li (李偉進), Yuhao Ren (任昱昊), and Fabing Duan (段法兵)

    College of Automation, Qingdao University, Qingdao 266071, China

    Keywords: noise injection, adaptive stochastic resonance, threshold neural network, hyperparameter learning

    1. Introduction

    Gradient-based optimizers are commonly used for training neural networks, but a necessary condition for successfully optimizing network performance is that the activation functions are continuously differentiable. Nevertheless, piecewise-linear (e.g., ReLU)[1,2] and hard-limited (e.g., binarized) activations[3,4] have attracted increasing attention in recent years, because they allow much deeper networks to be trained or save considerable memory and computation.[1–8] Training a neural network with nondifferentiable or zero-gradient activation functions therefore becomes a tricky problem for the available gradient-based optimizers.

    A natural way to train threshold neural networks with the gradient-descent based back-propagation algorithm is to approximate the nondifferentiable threshold neuron by a smoothed activation function during training,[6,9–13] while testing the network with the threshold activation functions and the trained weights. For instance, a conventional method substitutes the sigmoid function 1/(1 + e^{−λu}) with a large parameter λ > 0 (e.g., λ = 10) for the threshold neuron.[9–11] Unfortunately, the generalization performance of the threshold neural network in the testing phase is unsatisfactory, because the network weights are inadequately trained in the saturated regimes of the sigmoid function.[6,9–13]

    Recently, the approach of noise injection has become a useful alternative for optimizing artificial neural networks.[4,6,8,12–22] It is interesting to note that the benefits of injecting noise into threshold neural networks can be viewed as a type of stochastic resonance effect,[22] because there is likewise a nonzero noise level that improves the performance of nonlinear systems.[6,12–14,23–33] By injecting artificial noise into the saturated regime of the activation function[4] or smoothing the input-output characteristic of hard-limiting neurons with an ensemble of mutually independent noise components,[6,12,13] the nondifferentiable threshold network admits a proper definition of the gradient. The gradient-based back-propagation algorithm can then be successfully applied to the transformed threshold neural network. Meanwhile, a stochastic resonance based threshold neural network, built on the distribution of the noise injected into the hidden layer, has been proposed to effectively recycle the back-propagation training algorithm.[6,12,13] Furthermore, using a stochastic gradient descent (SGD) optimizer, we previously realized a noise-boosted back-propagation training algorithm in this kind of stochastic resonance based threshold network by adaptively learning both the weights and the noise levels.[13,29]

    The introduction of injected noise into the threshold neural network enlarges the dimension of the parameter space, and also poses a challenge to the SGD optimizer in finding a good optimum in the non-convex landscape of the loss function.[34–38] This is because the SGD optimizer can become trapped in a flat region of the loss landscape, where the gradients with respect to the noise level and the network weights approach zero.[36–38] In this paper, the noise-boosted Adam optimizer is demonstrated to train the stochastic resonance based threshold neural network more effectively than the SGD optimizer. It is shown that, with the powerful hyperparameter on-line learning capacity of the Adam optimizer,[34–38] the designed threshold network attains a much lower mean square error (MSE) for function approximation and a higher accuracy for image classification. Moreover, in the testing phase of the stochastic resonance based threshold neural network, the practical realization of the threshold network trained by the Adam optimizer requires less computational cost than that of the network optimized by the SGD optimizer. The benefits of injected noise, manifested by the Adam optimizer through a different noise level in each hidden neuron, also contribute to the research on exploiting adaptive stochastic resonance effects in machine learning.

    2. Main results

    2.1. Stochastic resonance based threshold network

    Consider an N×K×M feed-forward threshold neural network with three layers.[6,12,13,29,39,40] The input layer receives the data x ∈ R^{N×1}, and the weight matrix W ∈ R^{K×N} connects the input layer to the hidden layer. The hidden layer consists of K threshold neurons[41] described as

    with an adjustable threshold parameter θ_k for k = 1, 2, ..., K. The weight matrix U ∈ R^{M×K} connects the K threshold neurons in the hidden layer with the M linear activation functions in the output layer, and y ∈ R^{M×1} denotes the network output vector.
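    For concreteness, the sketch below gives a minimal Python implementation of this idealized forward pass. It assumes the unit-step convention ψ(u) = 1 for u ≥ θ_k and 0 otherwise, which is consistent with the gradient f_η(θ_k − x) quoted below; the variable names are ours, not the paper's.

```python
import numpy as np

def threshold_forward(x, W, U, theta):
    """Idealized forward pass of the N x K x M threshold network.
    x: input of shape (N,); W: (K, N); U: (M, K); theta: (K,) thresholds."""
    u = W @ x                          # pre-activations of the K hidden neurons
    a = (u >= theta).astype(float)     # hard-threshold activations psi(u) in {0, 1}
    return U @ a                       # M linear output neurons
```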

    It is seen from Fig. 1(a) that the threshold activation function ψ(u) of Eq. (1) is nondifferentiable at u = θ_k and has zero gradient for u ≠ θ_k. Based on the noise injection method,[6,12,13,29,39,40] we can replace the activation function ψ(u) in Eq. (1) with a differentiable one. This is done by injecting a sufficiently large number T of mutually independent noise variables η_t with a common probability density function (PDF) f_η(η), as shown in Fig. 1(b). Then, a continuously differentiable activation function is obtained as the substitution, i.e., the expectation[6,12,13,29]

    Fig. 1. (a) Threshold activation function ψ(u) in Eq. (1) with the threshold parameter θ_k, (b) an ensemble of T noise samples with a common standard deviation σ, and (c) the noise-smoothed activation function h(x) with learnable parameters θ_k and σ.

    It is interesting to note that, for the designed threshold neural network with the hidden neuron given by Eq. (2), the aforementioned difficulty of training threshold neural networks can be overcome, because the activation function h(x) in Eq. (2) has the well-defined gradient ∂h(x)/∂x = f_η(θ_k − x). Furthermore, it is emphasized that the effective learning ability of the designed threshold neural network is also extended by the introduction of the learnable noise levels σ_k and thresholds θ_k in Eq. (2). During the training process, the parameters σ_k and θ_k, as well as the network weight matrices W and U, are all optimized by the gradient-based learning rule. Specifically, the corresponding gradients with respect to σ_k and θ_k are ∂h(x)/∂σ_k = ∫_{θ_k−x}^{∞} ∂f_η(η)/∂σ_k dη and ∂h(x)/∂θ_k = −f_η(θ_k − x), respectively. Thus, the back-propagation learning algorithm can be successfully implemented for training the designed threshold neural network.
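    As an illustration of Eq. (2) and the gradients above, the following sketch assumes zero-mean Gaussian injected noise with standard deviation σ_k (the case depicted in Fig. 1(b)), for which the smoothed activation and its derivatives have closed forms; it is an illustrative sketch, not the authors' code.

```python
import numpy as np
from scipy.stats import norm

def smoothed_threshold(x, theta, sigma):
    """Noise-smoothed hidden neuron h(x) = P(x + eta >= theta) for zero-mean
    Gaussian injected noise eta with standard deviation sigma (an assumed,
    illustrative choice of the noise PDF f_eta)."""
    z = (x - theta) / sigma
    h = norm.cdf(z)                                     # h(x) = Phi((x - theta)/sigma)
    dh_dx = norm.pdf(z) / sigma                         # = f_eta(theta - x)
    dh_dtheta = -norm.pdf(z) / sigma                    # = -f_eta(theta - x)
    dh_dsigma = -(x - theta) / sigma**2 * norm.pdf(z)   # derivative of the tail integral w.r.t. sigma
    return h, dh_dx, dh_dtheta, dh_dsigma
```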

    2.2. Motivating example

    Let {x(ℓ), s(ℓ)}_{ℓ=1}^{L} denote the L examples of the training set used to train the designed threshold network in a supervised learning manner, where s ∈ R^{M×1} represents the desired response vector. The loss function of the empirical mean square error (MSE) is calculated as

    where ‖·‖ denotes the Euclidean norm. Let Θ ∈ {W, U, θ_k, σ_k} denote a learnable parameter of the designed threshold network; the plain stochastic gradient descent (SGD) optimizer updates the parameter Θ by the learning rule

    Then, the 1×K×1 (K = 10) stochastic resonance based threshold neural network is trained to fit the training set {x(ℓ), s(ℓ)}_{ℓ=1}^{L} sampled from the target function of Eq. (5). Here, the learning rate is α = 0.01, the initial noise levels are σ_k(0) = 1, and the initial threshold parameters θ_k(0) and the initial weight vectors W(0) and U(0) are uniformly distributed in the interval [−1, 1]. The learning process of J_mse of the designed threshold network via the SGD optimizer is shown in Fig. 2 for different initial values of the learnable parameters.
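    As a reference for Eqs. (3) and (4), the following minimal sketch gives the empirical MSE and the plain SGD update in a common form; the exact normalization of the loss assumed here may differ slightly from the paper's.

```python
import numpy as np

def mse_loss(y_pred, s):
    """Empirical MSE over L examples (cf. Eq. (3)): J = (1/L) * sum_l ||s(l) - y(l)||^2."""
    return np.mean(np.sum((s - y_pred) ** 2, axis=-1))

def sgd_step(param, grad, lr=0.01):
    """Plain SGD update (cf. Eq. (4)) for any learnable parameter
    Theta in {W, U, theta_k, sigma_k}: Theta <- Theta - alpha * dJ/dTheta."""
    return param - lr * grad
```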

    It is important to note in Fig. 2 that the SGD optimizer is sensitive to the initial values of the parameters. For example, setting the initial weight W_{1,1} = −5 and the noise level σ_8 = 1, it is seen in Fig. 2 that the MSE J_mse of the designed threshold network converges to a local minimum of J_mse = 1.149×10^{−3} after 5000 epochs of training. However, when the initial values W_{1,1} = −2.3 and σ_8 = 0.2 are not properly set, it is shown in Fig. 2 that the learning trajectory of the MSE J_mse (▽) is stuck in a flat region of the network performance surface, because the gradient there is almost zero in every direction. Here, apart from W_{1,1} and the noise level σ_8, the other parameters Θ ∈ {W, U, θ_k, σ_k} are fixed.

    Next, we employ the Adam optimizer to update the parameter Θ of the stochastic resonance based threshold neural network by the learning rule[34]

    where β_1 = 0.9 and β_2 = 0.999 are attenuation coefficients, 0 < ε ≪ 1 is the regularization parameter, m denotes the first-order moment estimate with its bias-corrected value m̂, and v is the second-order moment estimate with its bias-corrected value v̂. The operator ⊙ denotes the Hadamard product. Likewise, starting from the initial values W_{1,1} = −2.3 and σ_8 = 0.2, it is seen in Fig. 2 that the MSE J_mse (+) of the designed network trained by the Adam optimizer can escape from the flat region of the network performance surface and finally converge to the minimum of J_mse = 1.109×10^{−3}. Here, the other parameters Θ ∈ {W, U, θ_k, σ_k} (not including W_{1,1} and σ_8) are fixed at the converged values found by the SGD optimizer. This result clearly shows that the Adam optimizer is more efficient than the SGD optimizer for optimizing the designed stochastic resonance based threshold network.
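    The quantities named above correspond to the standard Adam update; a minimal per-parameter sketch is given below, where element-wise multiplication plays the role of the Hadamard product ⊙ (the notation is ours, not the paper's).

```python
import numpy as np

def adam_step(param, grad, m, v, n, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update of a learnable parameter (cf. Eq. (6)); n is the 1-based step index."""
    m = beta1 * m + (1 - beta1) * grad            # first-order moment estimate m
    v = beta2 * v + (1 - beta2) * grad * grad     # second-order moment estimate v (Hadamard product)
    m_hat = m / (1 - beta1 ** n)                  # bias-corrected first moment
    v_hat = v / (1 - beta2 ** n)                  # bias-corrected second moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```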

    Fig. 2. (a) Performance surface and (b) the corresponding contour of the MSE J_mse versus the weight W_{1,1} and the noise level σ_8 of the eighth hidden neuron. The learning curves of J_mse of the designed threshold network are also illustrated for different optimizers.

    Furthermore, it is shown in Fig. 3 that the Adam optimizer can optimize the designed threshold network to a smaller MSE of J_mse = 1.137×10^{−5} after 1600 epochs of training. Here, the performance surface is still plotted versus the weight W_{1,1} and the injected noise level σ_8, but the optimum value J_mse = 1.137×10^{−5} is searched over the whole space of learnable network parameters by the Adam optimizer. The parameters Θ ∈ {W, U, θ_k, σ_k}, including W_{1,1} and the noise level σ_8, are simultaneously updated by the learning rule of Eq. (6).

    It is emphasized that each hidden neuron in Eq. (2) is associated with one noise level σ_k for k = 1, 2, ..., K. Thus, in the training process of the designed threshold network, the injected noise manifests its beneficial role as the noise level σ_k converges to a non-zero (local) optimum. It is seen in Figs. 2 and 3 that, for a given weight W_{1,1} near 1.2, the conventional stochastic resonance phenomenon can be observed as the noise level σ_8 increases: when σ_8 reaches the corresponding non-zero (local) optimum, the MSE J_mse attains its minimum. Note that the resonant behavior of the designed threshold network, characterized by the MSE J_mse, is induced by the K noise levels σ_k in the hidden neurons; the stochastic resonance phenomenon shown in Fig. 2 or Fig. 3 only depicts a slice of the MSE J_mse versus the noise level σ_8 for visualization. In view of the learning curves of the K noise levels σ_k, Fig. 4 also presents the distinct characteristic of the adaptive stochastic resonance effect in the training phase. The learning curves of the injected noise levels σ_k in Fig. 4 show that the noise levels σ_k start from the same initial value of unity and converge to different but non-zero optima. This fact also validates the practicability of the proposed noise-boosted Adam optimizer of Eq. (6) in adaptively optimizing the noise levels during the training phase.

    Fig. 3. (a) Performance surface and (b) the corresponding contour of the MSE J_mse versus the weight coefficient W_{1,1} and the noise level σ_8 of the eighth hidden neuron. The learning curve of J_mse is also illustrated for the Adam optimizer.

    However, the hidden neuron h(x) in Eq. (2) is a limit expression, which ensures the success of training but cannot be realized in practical testing of the threshold neural network. In testing experiments, the hidden neuron h(x) needs to be realized as

    where a finite number T of threshold activation functions in Eq. (1) are activated by T mutually independent and identically distributed noise components η_t, respectively. For 10^3 testing data points x(ℓ) equally spaced in the interval [−2, 2], the trained threshold neural network with the hidden neurons h(x) given by Eq. (7) is simulated 10^2 times. In each trial, for the k-th hidden neuron h(x) (k = 1, 2, ..., K), T independent noise components η_t are randomly generated at the noise level σ_k converged to after the training phase. Then, the outputs of the designed threshold network are averaged as an approximation to the target function of Eq. (5).
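    A minimal sketch of this test-time realization of one hidden neuron (cf. Eq. (7)) is given below, again assuming Gaussian injected noise at the learned level σ_k; the function name is ours.

```python
import numpy as np

def realized_hidden_neuron(x, theta, sigma, T, rng=None):
    """Average T hard-threshold responses driven by i.i.d. injected noise samples,
    approximating the expectation h(x) of Eq. (2) by the finite sum of Eq. (7)."""
    rng = np.random.default_rng() if rng is None else rng
    eta = rng.normal(0.0, sigma, size=T)      # T i.i.d. noise components eta_t
    return np.mean((x + eta) >= theta)        # (1/T) * sum_t psi(x + eta_t)
```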

    Fig. 4. Learning curves of the K = 10 noise levels σ_k in the hidden layer of the designed stochastic resonance based threshold network. Other parameters are the same as those in Fig. 3.

    For different numbers T, the experimental results of the MSE J_mse are shown in Fig. 5(a) for testing the threshold networks trained by both optimizers. It is seen from Fig. 5(a) that, in order to achieve the same MSE J_mse, the designed threshold network trained by the Adam optimizer needs a smaller number T of threshold activation functions in Eq. (1), and hence of noise components η_t. For instance, at the statistical mean value of the MSE J_mse = 4×10^{−4}, T = 3×10^3 threshold functions in Eq. (1) are required for the threshold network trained by the Adam optimizer, whereas T = 10^4 threshold functions are required for the SGD optimizer. In Fig. 5(b), using T = 10^4 threshold functions assisted by the same number of injected noise samples, the output (blue dashed line) of the threshold network trained by the Adam optimizer is plotted. For comparison, the target function (red solid line) of Eq. (5) is also illustrated. It is seen in Fig. 5(b) that the trained threshold neural network approximates the target unidimensional function well in the testing phase. It is emphasized that, in the practical realization of the designed threshold network, the beneficial role of injected noise is accomplished by averaging an ensemble of threshold functions driven by the same number of injected noise samples in each hidden neuron, whereby the application of adaptive stochastic resonance in threshold neural networks is evidently confirmed.

    Fig. 5. (a) Experimental results of the MSE J_mse for 10^3 testing points of the target unidimensional function in Eq. (5) versus the number T. Here, 10^2 trials are realized for each experimental point, and the ensemble of outputs of T threshold functions activated by T noise samples is regarded as the realization of the hidden neuron of Eq. (7). (b) The output of the threshold network trained by the Adam optimizer as the approximation (blue dashed line) of the target function of Eq. (5). For comparison, the target function (red solid line) of Eq. (5) is also plotted.

    2.3. Two-dimensional function approximation

    Furthermore, we consider a two-dimensional benchmark test function

    The stochastic resonance based threshold neural network is designed with the size 2×50×1. The 16×16 training set {x(ℓ), s(ℓ)}_{ℓ=1}^{L}, sampled from the target function of Eq. (8), is equally spaced over the range [−3, 3]×[−3, 3]. After 2×10^4 training epochs, the MSEs J_mse are 0.1003 and 7×10^{−4} for the SGD optimizer and the Adam optimizer, respectively. The Adam optimizer is thus still superior to the SGD optimizer in training the designed threshold network to approximate the two-dimensional function of Eq. (8). For a 32×32 testing set sampled from the target function of Eq. (8) over the range [−3, 3]×[−3, 3], Fig. 6(a) illustrates the outputs of the trained threshold network as the approximation (patched surface) of the two-dimensional function f(x_1, x_2) in Eq. (8). The experimental results of the MSE J_mse are 0.1012 and 7.5×10^{−4} for the SGD optimizer and the Adam optimizer, respectively. In particular, Fig. 6(b) illustrates the relative errors |y − f(x_1, x_2)|/Δ between the output y of the threshold network trained by the Adam optimizer and the testing data. Here, the maximum relative error is max |y − f(x_1, x_2)|/Δ = 5.188×10^{−3}, and the maximum difference Δ = max f(x_1, x_2) − min f(x_1, x_2) = 13.4628 for the target function of Eq. (8) over the range [−3, 3]×[−3, 3].

    Fig. 6. (a) Network outputs y of the threshold neural network trained by the Adam optimizer as the approximation (patched surface) to the 32×32 testing data of the two-dimensional function f(x_1, x_2) in Eq. (8). (b) The corresponding relative error |y − f(x_1, x_2)|/Δ between the threshold network output and the testing data. For reference, the maximum difference Δ = max f(x_1, x_2) − min f(x_1, x_2) = 13.4628 is taken for the target function in Eq. (8) over the range [−3, 3]×[−3, 3].

    2.4. Real world data set of the function regression

    We also validate the trained N×20×1 threshold network on nine real-world data sets.[42–50] The dimension N and the length L of the data sets are listed in Table 1, and the computer is equipped with an Intel Core i7-6700 CPU @ 3.40 GHz and 16 GB DDR4 RAM @ 2133 MHz. Following an 80/20 split, 80% of the data are used for training, while the remaining 20% are employed to test the trained threshold neural network. Table 2 reports the training and testing results of the MSE J_mse of the stochastic resonance based threshold neural network for the two considered optimizers. It is seen in Table 2 that the Adam optimizer optimizes the designed threshold network to a lower MSE J_mse than the SGD optimizer does on both the training and testing data sets. The hyperparameter on-line learning of stochastic resonance based threshold networks via the Adam optimizer enables more precise control of the injected noise levels in the hidden layer, leading to significantly improved performance of the designed threshold network.

    Table 1. Feature dimension and length of data sets.

    Table 2. Experimental results of the MSE J_mse of the designed threshold networks.

    2.5. Recognition of handwritten digits

    We further incorporate the expectation expression of Eq. (2) into a deep convolutional neural network for image classification. The architecture contains 20 convolutional filters of size 9×9 with same padding, a hidden layer of 20 neurons given by Eq. (2), a factor-2 pooling layer, and a fully connected layer with 10 neurons. The MNIST benchmark data set consists of a training set of 6×10^4 and a testing set of 10^4 gray-scale images (28×28) representing the handwritten digits from 0 to 9. Here, we employ 10^4 images, split into a training set and a test set in a ratio of 4:1. Using the SGD and Adam optimizers, the training accuracies of the designed convolutional neural network versus the training epoch number are shown in Fig. 7. It is seen in Fig. 7 that the network optimized by the Adam optimizer generally achieves a higher accuracy for recognizing handwritten digits. For the trained convolutional neural network after 50 training epochs, using T = 10 threshold functions assisted by the same number of injected noise samples, the experimental accuracy on the test set is 98.41% for the Adam optimizer and 97.63% for the SGD optimizer. For comparison, the fully connected back-propagation network[29] and the support vector machine[51] achieve accuracy rates of 97% and 94.1%, respectively. The accuracy of 98.41% obtained by the convolutional threshold neural network trained by the Adam optimizer indicates a very satisfactory efficiency of the deep learning method. These results demonstrate that the injection of noise into threshold neurons facilitates the optimization of the designed deep convolutional neural network, and that the hyperparameter on-line learning of the Adam optimizer can also train deeper stochastic resonance based threshold networks for image classification with competitive performance.
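    The following PyTorch sketch illustrates one plausible realization of the described classifier; the layer sizes are taken from the text, while the exact ordering of activation and pooling, the Gaussian form of the injected noise, and all names are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class SRThresholdConvNet(nn.Module):
    """Convolutional classifier with a noise-smoothed threshold hidden layer (cf. Eq. (2))."""
    def __init__(self, channels=20, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(1, channels, kernel_size=9, padding='same')  # 20 filters, 9x9, same padding
        self.theta = nn.Parameter(torch.zeros(channels, 1, 1))  # learnable per-map thresholds theta_k
        self.sigma = nn.Parameter(torch.ones(channels, 1, 1))   # learnable per-map noise levels sigma_k
        self.pool = nn.MaxPool2d(2)                             # factor-2 pooling
        self.fc = nn.LazyLinear(num_classes)                    # fully connected layer with 10 outputs

    def forward(self, x):
        u = self.conv(x)
        sig = self.sigma.abs() + 1e-6                            # keep the noise level positive
        h = 0.5 * (1.0 + torch.erf((u - self.theta) / (sig * 2.0 ** 0.5)))  # Gaussian-smoothed threshold
        return self.fc(torch.flatten(self.pool(h), 1))
```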

    Fig. 7. Learning curves of the accuracies of the designed convolutional neural network on the MNIST data set.

    3. Conclusion

    In summary, in order to optimize the threshold neural network in the training phase, we replace the zero-gradient activation function with a continuously differentiable function based on the PDF of the injected noise. However, this substitution strategy introduces a group of noise parameters related to the noise PDF, which poses challenges for the training algorithm. For function approximation and image classification, it is shown that, owing to its hyperparameter on-line learning capacity, the Adam optimizer can speed up the training of the designed threshold neural network and overcome the local convergence problem of the SGD optimizer. More interestingly, the injected noise not only extends the dimension of the parameter space in which the designed threshold network is optimized, but its level also converges to a nonzero optimum in each hidden neuron. This distinguishing feature is closely related to the adaptive stochastic resonance effect, and also indicates a meaningful application of the stochastic resonance phenomenon in artificial neural networks.

    However, it is noted that the Adam optimizer trains the artificial neural network to fit the target function closely or to recognize classification labels more accurately. Consequently, for noisy observations of a target function or for incorrect labels, the overfitting problem may occur when training the designed threshold network with the Adam optimizer. Noting the regularization contributed to the loss function by the injected noise,[16,17] it is worthwhile to further investigate the generalization performance of the stochastic resonance based threshold neural network, especially for signals acquired by sensors at low signal-to-noise ratios. In Eq. (2), only Gaussian noise injected into the designed feed-forward threshold neural network is considered. It will be interesting to find the optimal type of injected noise, with respect to the noise PDF, that achieves the best performance of the designed threshold network. In addition, we test the convolutional threshold neural network with injected noise on image classification of the MNIST data set only, and the potential applications of this kind of deep convolutional threshold network to more challenging data sets, e.g., CIFAR-10 and ImageNet, also deserve to be explored.

    Acknowledgement

    Project supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021MF051).
