
    Neural network hyperparameter optimization based on improved particle swarm optimization①

    High Technology Letters, 2023, No.4 (published 2023-12-15)

    XIE Xiaoyan(謝曉燕), HE Wanqi②, ZHU Yun, YU Jinhao

    (*School of Computer, Xi’an University of Posts and Telecommunications, Xi’an 710121, P.R.China)

    (**School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P.R.China)

    Abstract

    Key words: hyperparameter optimization, particle swarm optimization (PSO) algorithm, neural network

    0 Introduction

    With the rapid growth of artificial intelligence (AI) technologies, deep learning has achieved significant results in complex regression and classification problems, including computer vision (CV), natural language processing (NLP), object detection[1], and so on. A neural network model needs to be trained to reach a satisfactory accuracy, but a large number of hyperparameters must be configured during training, such as the learning rate, batch size, and optimizer. How to select appropriate hyperparameters to guide training and find the best neural network model has therefore become both a focus and a difficulty.

    In early studies, grid search was used to exhaustively search the parameter space. Later, an improved algorithm called random search was built on this idea. Experiments have shown that, for the same number of search iterations, random search tries more parameter values than grid search and reduces search time while preserving model accuracy[2]. However, these search methods cannot solve the parameter optimization problem of neural networks very well. Ref.[3] noted that hyperparameter optimization is an NP-hard optimization problem, and most current approaches solve it by adopting metaheuristic algorithms.

    Ref.[4] successfully used the genetic algorithm (GA) for hyperparameter tuning. Increasingly, metaheuristic and modified algorithms are being used to optimize neural networks, and some researchers have extended the exploration in different directions[3]. The development of hybrid models can improve performance and the ability to handle complex optimization. Ref.[5] proposed an improved hybrid algorithm based on bat echolocation behavior, combining local search to optimize weights, architectures, and active neurons. Ref.[6] introduced a combination of genetic algorithm and gradient descent to train networks; the proposed HGADD-NN method achieved good results on several benchmark problems.

    Among the most common algorithms, particle swarm optimization (PSO)[7-8] is a popular choice compared with GA[9], the grey wolf optimizer (GWO)[10], and ant colony optimization (ACO), because of its few control parameters, efficient global search, and ease of parallelization. It has therefore been introduced into parameter selection for convolutional neural networks (CNNs). In Ref.[7], PSO was used to decide the parameters of a CNN and was verified successfully on CIFAR classification, indicating that PSO is a feasible scheme for CNN parameter optimization. To handle the differing value scopes and data types of hyperparameters, Ref.[7] defined independent candidate particles for each parameter. But randomly selected initial positions can trap the search in local optima, and weak local search ability entails a long convergence stage. Such weaknesses may lead to a biased model even at the cost of additional computation. Therefore, improving PSO or integrating other heuristic algorithms has become a striking field. Ref.[11] proposed a distributed PSO (DPSO) mechanism: particles are updated and allocated by a master, and the fitness of different particles is calculated on multiple slaves. This strategy automatically and globally searches for optimal hyperparameters and dramatically reduces training time compared with traditional PSO, thanks to the distributed training method. Unfortunately, this idea only uses distribution to tackle the long training time of the parameter configuration process; the cost of calculation is ignored.

    The key to this issue lies in balancing the diversity of the particle swarm against the cost of convergence. In this work, an improved PSO is proposed to help locate the best hyperparameter combination of a specific network architecture more easily. The following specific contributions are made.

    (1) In response to the troubles caused by inconsistent data types and widely different value scopes in neural networks, an interval mapping method is adopted for data coding. The motivation of this design is to ensure the effectiveness of particles through a normalization strategy and to avoid local evolution of the swarm stemming from the random position selection of the original particles.

    (2) By introducing mutation and crossover operations to increase the diversity of particles, the proposed algorithm solves the problems of raw PSO, such as being easily trapped in local optima and low convergence accuracy during hyperparameter search.

    The rest of this paper is organized as follows. Section 1 discusses related work. Section 2 analyzes the motivation and describes the implementation of the proposed algorithms in detail. Section 3 discusses the experimental environment and the implementation scheme on different network structures and datasets. Section 4 summarizes the paper.

    1 Related work

    In PSO[12], optimum seeking is converted into the traversal of an n-dimensional function by the particles of a swarm. Each potential solution to a given problem is viewed as a particle, and PSO obtains the best solution from the interaction of particles. By mapping the different value scopes of the hyperparameters into an n-dimensional function, selection of the best parameter combination can be worked out by PSO.

    If there are N hyperparameters in a specific CNN, the value of each parameter ranges from low to up. A particle can be denoted as {x1, x2, …, xN-1, xN}, and its performance is evaluated by a fitness function (often the loss function of the CNN). The swarm is constructed initially with particles generated randomly within the predefined value ranges. The fitness value of each particle is calculated iteratively until it reaches its best position or meets the pre-set termination condition. The personal best position (pbest) and the global best position (gbest) are kept as intermediate results. The updating of particles is given by Eq.(1) and Eq.(2):

    vi(t+1) = w·vi(t) + c1·r1·(pbesti - xi(t)) + c2·r2·(gbest - xi(t))    (1)

    xi(t+1) = xi(t) + vi(t+1)    (2)

    where v represents the velocity vector; w is the inertia weight used to balance local exploitation and global exploration; r1 and r2 are random vectors uniformly distributed within the range [low, up]; and c1 and c2 are acceleration coefficients, which are positive constants.
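The update of Eq.(1) and Eq.(2) can be sketched in a few lines of NumPy. This is a minimal sketch, not the paper's implementation: the function name, the coefficient defaults, and drawing r1 and r2 from [0, 1) are illustrative assumptions.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One PSO iteration: velocity update (Eq.(1)), then position update (Eq.(2)).

    x, v: (num_particles, dim) position and velocity arrays.
    pbest: per-particle best positions; gbest: global best position (dim,).
    """
    if rng is None:
        rng = np.random.default_rng()
    r1 = rng.random(x.shape)  # random factors, drawn here from [0, 1)
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x_new = x + v_new
    return x_new, v_new
```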

    The above scheme, described in Ref.[7], has been demonstrated to be helpful for parameter tuning. However, a CNN contains many kinds of hyperparameters with inconsistent data types (the number of convolutional kernels is an integer, the learning rate is a float, etc.). Moreover, the value scopes of parameters in different layers show enormous differences. A uniform coding format would merge the effective attributes of particles and result in calculation errors, while a differentiated coding format would increase the complexity of the swarm. Additionally, when solving problems with many dimensions or complex, large datasets, PSO shows poor-quality results, usually falling into the local optima trap.

    To cope with this tricky situation, a series of solutions is proposed in this paper. The composite variable-type particles are mapped into a single interval to ensure the effectiveness of particle updates and to prevent the poor results caused by significant differences in data sampling intervals. Then, the selection, mutation, and crossover operations of GA are introduced to add diversity to the particles and avoid getting stuck in local optima during the parameter search. Finally, an improved particle swarm optimization (IPSO) algorithm is proposed based on these two improvements.

    2 Methods

    2.1 Hybrid particle coding

    The CNN parameters selected for optimization in this paper include the number of convolution kernels, the learning rate, and the batch size, etc., as shown in Table 1.

    Table 1 Parameters to be optimized

    It can be seen from Table 1 that the scope of the int-type parameters ranges from 6 (Parm1) to 120 (Parm4). In addition, a float-type parameter (Parm6) is also involved. Such a variety of values may bias the update direction and, in severe cases, prevent convergence, so an effective way to remove this difference is needed. The PSO of Refs[7-8] can only characterize particles independently for each parameter, so the initial value of each particle occurs randomly only within its self-defined scope. Although the difference of parameter scopes then need not be considered, such a treatment may generate unreachable positions during the search. For example, in a two-dimensional space composed of Parm2 and Parm3, the value of Parm2 cannot reach [1,3] or [8,11], marked as ‘?’ in Fig.1, due to the limits of the random function. Consequently, the generated particles will not fully cover the initial swarm, and the iteration may be trapped in local optima.

    To avoid such incidents, a random sampling function conforming to a uniform distribution is introduced in our scheme, which unifies the coding of all parameters. Meanwhile, Eq.(3) is used to map the scopes of the parameters onto a normalized interval:

    x' = (x - lowArea) / (upArea - lowArea)    (3)

    where the lower bound and upper bound of a parameter are denoted as lowArea and upArea respectively. The distribution of the initial particles is presented as ‘?’ in Fig.1. It is obvious that the particles are scattered evenly over the whole interval, so the global coverage of the search space and the completeness of the swarm are guaranteed.
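The interval mapping and its inverse can be sketched as below, assuming a min-max normalization onto [0, 1]; the helper names and the integer rounding on decode are illustrative assumptions, not the paper's code.

```python
def to_unit(value, low_area, up_area):
    """Map a raw parameter value from [lowArea, upArea] onto [0, 1]."""
    return (value - low_area) / (up_area - low_area)

def from_unit(u, low_area, up_area, is_int=False):
    """Decode a normalized particle coordinate back to its parameter scope,
    rounding to an integer for int-type hyperparameters."""
    raw = low_area + u * (up_area - low_area)
    return int(round(raw)) if is_int else raw
```

With this coding, an int-type kernel count and a float-type learning rate live on the same [0, 1] axis, so one uniform random sampler can initialize every coordinate of a particle.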

    Fig.1 Schematic diagram of particle distribution before and after optimization

    2.2 Genetic manipulation

    In this paper, in order to increase the diversity of particles during updating, the selection, crossover, and mutation operations of GA are added to the IPSO. The crossover process of particles is shown in Fig.2. Suppose there are N particles in the swarm, denoted as {X1, X2, X3, …, XN-1, XN}. After evaluation, the fitness values of the particles are marked as {F(X1), F(X2), F(X3), …, F(XN-1), F(XN)}. Firstly, the two particles with the highest fitness values are selected and recorded as Xi and Xj. The intersection point P is calculated using Eq.(4). Taking N = 6 as an example, the selection result is shown in Fig.2(a). Then, using the intersection point P as the boundary, the particles Xi and Xj are divided into four parts: Xi-front, Xi-after, Xj-front, and Xj-after, as shown in Fig.2(b). Finally, Xi-front is spliced with Xj-after, and Xj-front is spliced with Xi-after, forming two new particles Xi-new and Xj-new, as shown in Fig.2(c).
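The splicing step can be sketched as a single-point crossover. This is a hedged sketch: the exact form of the intersection point P in Eq.(4) is not reproduced here, so a caller-supplied or random cut point stands in for it.

```python
import random

def crossover(xi, xj, p=None):
    """Single-point crossover of two particles (lists of hyperparameter values).

    p: the intersection point; if None, a random cut is used as a stand-in
    for Eq.(4), whose exact form is not reproduced here.
    """
    if p is None:
        p = random.randint(1, len(xi) - 1)
    xi_new = xi[:p] + xj[p:]  # Xi-front spliced with Xj-after
    xj_new = xj[:p] + xi[p:]  # Xj-front spliced with Xi-after
    return xi_new, xj_new
```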

    Fig.2 An example illustration of the crossover process

    The schematic diagram of particle mutation is presented in Fig.3, taking N = 6 as an example. Firstly, the complete information of the initial particle is obtained. Then, an integer in [1, N] is randomly generated, where N denotes the total number of optimized parameters. Finally, a randomly generated number within the corresponding parameter scope of that position replaces the original value, completing the mutation operation.
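The mutation step might look like the following sketch, where each position carries its own scope and data type so the replacement value stays valid; the (low, up, is_int) scope encoding is an assumption for illustration.

```python
import random

def mutate(particle, scopes, rng=random):
    """Replace one randomly chosen position of a particle with a fresh value.

    scopes: one (low, up, is_int) tuple per parameter position, giving the
    sampling interval and data type of that hyperparameter.
    """
    new = list(particle)
    pos = rng.randrange(len(new))  # pick a position in [0, N)
    low, up, is_int = scopes[pos]
    new[pos] = rng.randint(low, up) if is_int else rng.uniform(low, up)
    return new
```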

    Fig.3 Schematic diagram of mutation process

    For the raw PSO method, the particle positions participating in the next iteration are determined by the optimal solution obtained in the current searching round and the historical optimal solution. The update domain of raw PSO is therefore limited to the zone shown with dashed lines in Fig.4, and the search can only stop at the optimum of this local domain. If the global optimum lies outside this interval, the final result will deviate from the correct one.

    The GA operations help to jump out of this limitation by introducing new particles. They enhance the diversity of particles across swarm iterations and make the global optimum easier to locate.

    Fig.4 Comparison of particle positions before and after optimization

    2.3 Overall scheme

    The flow chart of hyperparameter configuration based on the proposed IPSO is shown in Fig.5; there are three steps, as follows.

    Step 1 Initialize the swarm and its particles. The learning factors c1 and c2, the weight coefficient w, the number of particles, and the maximum particle speed need to be initialized at the beginning. At the same time, space is opened up to store the local and global optimal values, and a specified number N of particles is randomly generated to form a swarm according to the number and scopes of the parameters to be optimized.

    Step 2 Calculate the fitness value of the particles. The values of each particle are sequentially sent to the designated neural network for training and testing, and the accuracy obtained is recorded as the fitness value of the particle.

    Step 3 Update the particles. According to Algorithm 1, the particles in the current swarm are updated to complete the subsequent iteration.

    Algorithm 1 Particle updating
    Input: primitive swarm D(X), fitness function F(X)
    Output: the updated swarm D(X)
    1: function update(D(X))
    2:   initialize max1 and max2
    3:   for i in length(D(X))
    4:     evaluate F(Xi)
    5:     set max1 and max2
    6:   end for
    7:   update gbest and pbest
    8:   new1, new2 = crossover(max1, max2)
    9:   D(X) = mutate(new1, new2)
    10:  return D(X)
    11: end function
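Assuming single-point crossover and one-gene mutation as the GA operators, one generation of Algorithm 1 might be sketched as follows. Replacing the two weakest particles with the offspring is an assumption on our part, since the listing leaves the insertion policy implicit.

```python
import random

def update_swarm(swarm, fitness, scopes, rng=None):
    """One generation of Algorithm 1 (sketch): evaluate F(Xi), keep the two
    fittest particles (max1, max2), cross them, then mutate each offspring.

    swarm: list of particles (lists of floats); scopes: (low, up) per gene.
    The offspring replace the two weakest particles (an assumed policy).
    """
    if rng is None:
        rng = random.Random()
    # steps 3-6: evaluate the fitness of every particle, keep the two best
    ranked = sorted(swarm, key=fitness, reverse=True)
    max1, max2 = ranked[0], ranked[1]
    # step 8: single-point crossover of the two best particles
    p = rng.randint(1, len(max1) - 1)
    new1 = max1[:p] + max2[p:]
    new2 = max2[:p] + max1[p:]
    # step 9: mutate one randomly chosen gene of each offspring
    for child in (new1, new2):
        pos = rng.randrange(len(child))
        low, up = scopes[pos]
        child[pos] = rng.uniform(low, up)
    return ranked[:-2] + [new1, new2]
```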

    Fig.5 Flow chart of the IPSO

    3 Experiments and results analysis

    3.1 Implementation details

    The initialization of the algorithm is completed according to the parameters shown in Table 2. The proposed algorithm is verified under different network structures and datasets. The experimental environment of this paper is shown in Table 3.

    Table 2 IPSO algorithm related parameters

    Table 3 Experimental environment setting

    3.2 Experimental results and analysis

    3.2.1 Under different network structures

    In this work, LeNet-5, ResNet-18, VGG-16, MobileViT, and long short-term memory (LSTM) are selected as the test objects. MNIST is used for LeNet-5 and LSTM, Flowers-Recognition for MobileViT, and the remaining networks are tested with CIFAR10. This research uses the parameters searched by IPSO and by PSO to complete the training task. The change of accuracy with epoch during training is shown in Fig.6. It can be seen that for the same training epoch, no matter which neural network is used, the parameter configuration searched by IPSO makes the neural network converge faster than that searched by PSO. In addition, Fig.6(c) and Fig.6(d) show a more apparent difference in accuracy.

    Fig.6 Accuracy rate change with Epoch

    Using the parameters searched by IPSO and by PSO to complete training, the final accuracies are compared in Table 4. It is clear that after tuning by IPSO, the CNN models obtain higher final accuracy than after tuning by PSO; the accuracy rates increase by about 0.4%, 1.9%, and 4.5%, respectively. On the ViT model, the network built by IPSO completes the image classification task with an accuracy of 66.21%, about 13% higher than that of raw PSO. IPSO and PSO achieve similar accuracy on the recurrent neural network. In the ViT experiment, IPSO and PSO differ by only 1 in parameter 1, but the final network structures differ significantly due to layer-by-layer accumulation. Meanwhile, although the Flowers-Recognition dataset used in this paper is small, different parameter configurations still bring obvious differences in performance. LSTM is more inclined toward NLP applications, and this paper considers only two network-related parameters for it, the hidden state size and the number of recurrent layers. Although changing them still changes the number of parameters, the effect on data feature extraction is not obvious and can be considered negligible.

    Table 4 Comparison of the accuracy of different algorithms on different network structures

    3.2.2 Under different datasets

    To verify the effectiveness of the proposed IPSO for hyperparameter seeking, three datasets, MNIST, Fashion-MNIST, and CIFAR10, are selected.

    (1) The MNIST dataset includes 70 000 handwritten digit images. The training set contains 60 000 samples, while the test set contains 10 000 samples. The dataset is categorized into 10 classes corresponding to the 10 numerals.

    (2) The Fashion-MNIST dataset contains 10 categories of images: T-shirt, trouser, pullover, dress, coat, sandal, shirt, sneaker, bag, and ankle boot. The training set has a total of 60 000 samples, and the test set has a total of 10 000 samples.

    (3) The CIFAR-10 dataset includes 60 000 color images from 10 categories, with 6000 images per category. The 10 categories represent aircraft, cars, birds, cats, deer, dogs, frogs, horses, boats, and trucks. The training set contains 50 000 samples (83% of the dataset), while the test set contains 10 000 samples (17% of the dataset).

    The parameters searched by IPSO on the above three datasets achieve 99.58%, 93.39%, and 78.96% accuracy when training the neural network. Compared with PSO, the accuracy increases by about 0.1%, 0.5%, and 2%, respectively. Compared with SSO[13], the same accuracy is achieved on MNIST, while on Fashion-MNIST and CIFAR10 the accuracy increases by 0.36% and 5.83% respectively. Compared with DPSO[11], the model accuracy obtained using IPSO on MNIST and Fashion-MNIST is slightly higher; the final results are shown in Table 5. In addition, in order to compare with GWO[10], the network's optimization parameters are adjusted to be consistent with those in GWO, including batch size, epochs, and optimizer. The results are shown in Table 6. From the table, it can be seen that the parameter combination found by IPSO, after completing training on the Fashion-MNIST dataset, yields higher accuracy on the test set than that of the GWO algorithm.

    Table 5 Comparison of the accuracy of different algorithms on different datasets

    Table 6 Accuracy comparison on the Fashion-MNIST dataset

    4 Conclusion

    This paper proposes an IPSO algorithm that integrates GA to address the issues of traditional PSO, such as easily falling into local optima and low convergence accuracy during the search. The performance is verified using different types of neural network models and datasets. Experimental results show that the proposed IPSO achieves higher accuracy than traditional PSO on CNN and ViT models tested with the Fashion-MNIST and CIFAR10 datasets. Moreover, with optimized parameter configurations, the model is more stable and converges faster during training. However, this paper considers only a limited number of parameters to be optimized, while other parameters affecting the neural network structure, such as the depth and number of convolutional layers, could also be searched for the optimal solution. Therefore, in the future it is worth considering the fusion of parameters at different levels to find the optimal network structure and model parameters for better performance.
