
    Neural network hyperparameter optimization based on improved particle swarm optimization①

    2023-12-15 10:43:52
    High Technology Letters, 2023, Issue 4

    XIE Xiaoyan(謝曉燕), HE Wanqi②, ZHU Yun, YU Jinhao

    (*School of Computer, Xi’an University of Posts and Telecommunications, Xi’an 710121, P.R.China)

    (**School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P.R.China)

    Abstract

    Key words: hyperparameter optimization, particle swarm optimization (PSO) algorithm, neural network

    0 Introduction

    With the rapid growth of artificial intelligence (AI) technologies, deep learning has achieved significant results in complex regression and classification problems, including computer vision (CV), natural language processing (NLP), object detection[1], and so on. A neural network model must be trained to achieve satisfactory accuracy, but a large number of hyperparameters need to be configured during training, such as the learning rate, batch size, and optimizer. How to select appropriate hyperparameters to guide training and find the best neural network model has therefore become both a focus and a difficulty.

    In early studies, grid search was used to exhaustively search the parameter space. Later, an improved algorithm called random search was implemented on this basis. Experiments have shown that, with the same number of search iterations, random search tries more parameter values than grid search and reduces search time while preserving model accuracy[2]. However, these search methods cannot solve the parameter optimization problem of neural networks well. Ref.[3] noted that hyperparameter optimization is an NP-hard optimization problem, and most current approaches solve it by adopting metaheuristic algorithms.

    Ref.[4] successfully used the genetic algorithm (GA) for hyperparameter tuning. Since then, more and more metaheuristic and modified algorithms have been used to optimize neural networks, and some researchers have extended the exploration in different directions[3]. The development of hybrid models can improve performance and the ability to handle complex optimization. Ref.[5] proposed an improved hybrid algorithm based on bat echolocation behavior, combining local search to optimize weights, architectures, and active neurons. Ref.[6] introduced a combination of genetic algorithms and gradient descent to train networks; the proposed HGADD-NN method achieved good results on several benchmark problems.

    Among the most common algorithms, particle swarm optimization (PSO)[7-8] is a popular choice compared with GA[9], the grey wolf optimization algorithm (GWO)[10], and ant colony optimization (ACO), because of its fewer parameters, efficient global search, and ease of concurrent processing. It has therefore been introduced into parameter selection for convolutional neural networks (CNNs). In Ref.[7], PSO was used for parameter decisions of a CNN and verified successfully on CIFAR classification, indicating that PSO is a feasible scheme for CNN parameter optimization. To handle the different value scopes and datatypes of hyperparameters, Ref.[7] defined independent candidate particles for each parameter. But randomly selected initial positions can trap the search in local optima, and weak local search ability entails a long convergence stage. Such weaknesses may lead to a biased model even at the cost of additional computation. Therefore, improving PSO or integrating other heuristic algorithms has become a striking field. Ref.[11] proposed a distributed PSO (DPSO) mechanism in which particles are updated and allocated by a master node and the fitness of different particles is calculated by multiple slave nodes. This strategy automatically and globally searches for optimal hyperparameters and dramatically reduces training time compared with traditional PSO, thanks to the distributed training method. Unfortunately, this idea only uses distribution to tackle the long training time of the parameter configuration process; the cost of calculation is ignored.

    The key to this issue lies in balancing the diversity of the particle swarm against the cost of convergence. In this work, an improved PSO is proposed to help locate the best hyperparameter combination of a specific network architecture more easily. The following specific contributions are made.

    (1) In response to the troubles caused by inconsistent data types and widely different value scopes of neural network hyperparameters, an interval mapping method is adopted for data coding. The motivation of this design is to ensure the effectiveness of particles through a normalization strategy and to avoid local evolution of the swarm stemming from randomly selected initial particle positions.

    (2) Mutation and crossover operations are introduced to increase the diversity of particles, so the proposed algorithm solves the problems of raw PSO during hyperparameter searching, such as being easily trapped in local optima and low convergence accuracy.

    The rest of this paper is organized as follows. Section 1 discusses related work. Section 2 analyzes the motivation and describes the implementation of the proposed algorithm in detail. Section 3 discusses the experimental environment and the implementation scheme on different network structures and datasets. Section 4 summarizes the paper.

    1 Related work

    In PSO[12], optima seeking is converted into the traversal of an n-dimensional function by the particles of a swarm. Each potential solution to a given problem is viewed as a particle, and PSO obtains the best solution from the interaction of particles. By mapping the different value scopes of the hyperparameters into an n-dimensional function, selection of the best parameter combination can be carried out by PSO.

    If there are N hyperparameters in a specific CNN, the value of each parameter ranges from low to up. A particle can be denoted as {x1, x2, …, xN-1, xN}, and its performance is evaluated by a fitness function (often the loss function of the CNN). The swarm is constructed initially with particles generated randomly within the predefined ranges. The fitness value of each particle is calculated iteratively until it reaches its best position or meets the pre-set termination condition. The personal best position (pbest) and the global best position (gbest) are kept as intermediate results. The updating of particles is given by Eq.(1) and Eq.(2).

    v_i(t+1) = w·v_i(t) + c1·r1·(pbest_i − x_i(t)) + c2·r2·(gbest − x_i(t))    (1)
    x_i(t+1) = x_i(t) + v_i(t+1)    (2)

    where v represents the velocity vector; w is the inertia weight utilized to balance local exploitation and global exploration; r1 and r2 are random vectors uniformly distributed within the range [low, up]; c1 and c2 are acceleration coefficients, which are positive constants.
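    The update rule above can be sketched in Python. This is a minimal illustration of the standard PSO step; the function and variable names are ours, not from the paper, and scalar random coefficients drawn from [0, 1] are assumed, as in standard PSO.

```python
import random

def pso_update(x, v, pbest, gbest, w=0.5, c1=1.5, c2=1.5):
    """One standard PSO step: Eq.(1) updates the velocity, Eq.(2) the position."""
    new_v = [w * vi
             + c1 * random.random() * (pb - xi)   # pull toward personal best
             + c2 * random.random() * (gb - xi)   # pull toward global best
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    new_x = [xi + vi for xi, vi in zip(x, new_v)]
    return new_x, new_v

# Example: a 3-dimensional particle (illustrative values)
x, v = [0.2, 0.5, 0.8], [0.0, 0.0, 0.0]
new_x, new_v = pso_update(x, v, pbest=[0.3, 0.4, 0.7], gbest=[0.1, 0.6, 0.9])
```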

    The above scheme, mentioned in Ref.[7], has been demonstrated to be helpful for parameter tuning. However, a CNN involves many kinds of hyperparameters with inconsistent datatypes (the number of convolutional kernels is an integer, the learning rate is a float, etc.). Moreover, the value scopes of parameters in different layers show enormous differences. A uniform coding format would merge the effective attributes of particles and cause calculation errors, while a differentiated coding format would increase the complexity of the swarm. In addition, on problems with a large number of dimensions or on complex and large datasets, PSO yields poor-quality results and usually falls into the local optima trap.

    To cope with this tricky situation, a series of solutions is proposed in this paper. The composite variable-type particles are mapped into one interval to ensure the effectiveness of particle updates and to prevent the poor results caused by significant differences in data sampling intervals. Then, the selection, mutation, and crossover operations of GA are introduced to add diversity to the particles and avoid getting stuck in local optima during the parameter search. Finally, an improved particle swarm optimization (IPSO) algorithm is proposed based on these two improvements.

    2 Methods

    2.1 Hybrid particle coding

    The CNN parameters selected for optimization in this paper include the number of convolution kernels, the learning rate, the batch size, etc., as shown in Table 1.

    Table 1 Parameters to be optimized

    It can be seen from Table 1 that the scope of the int-type parameters ranges from 6 (Parm1) to 120 (Parm4). In addition, a float-type parameter (Parm6) is also involved. Such a variety of values may bias the update direction and, in severe cases, prevent convergence. As a result, an effective manner of removing this difference is needed. To this end, the PSO advocated in Refs[7-8] can only characterize particles independently for each parameter, so the initial value of each particle occurs randomly only within a self-defined scope. Although the differences in parameter scopes then need not be considered, such a treatment may generate unreachable positions during searching. For example, in a two-dimensional space composed of Parm2 and Parm3, the value of Parm2 cannot reach [1,3] or [8,11] (marked in Fig.1) due to the limits of the random function. Consequently, the generated particles do not fully cover the initial swarm, and the search is trapped in local optima during iteration.

    To avoid such incidents, a random sampling function conforming to a uniform distribution is introduced in our scheme, which unifies the coding of all parameters. Meanwhile, Eq.(3) is used to map the scopes of the parameters onto a normalized interval.

    x' = (x − lowArea) / (upArea − lowArea)    (3)

    where the lower bound and upper bound of a parameter are denoted as lowArea and upArea, respectively. The distribution of the initial particles is presented in Fig.1. It is obvious that the particles are scattered steadily over the whole interval, so global coverage of the search space and completeness of the swarm are guaranteed.
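    The interval mapping can be illustrated with a minimal min-max sketch, assuming the normalization maps each parameter scope onto [0, 1]; the helper names `map_to_unit`/`map_from_unit` and the example values are ours, not from the paper.

```python
def map_to_unit(value, low_area, up_area):
    """Map a raw hyperparameter value from [lowArea, upArea] to [0, 1]
    (assumed min-max form of the normalization in Section 2.1)."""
    return (value - low_area) / (up_area - low_area)

def map_from_unit(u, low_area, up_area, as_int=False):
    """Decode a normalized coordinate back onto the parameter's own scale."""
    raw = low_area + u * (up_area - low_area)
    return round(raw) if as_int else raw

# Mixed-type particle: kernel count (int) and learning rate (float)
kernels = map_to_unit(64, 6, 120)            # int-type parameter
lr = map_to_unit(0.004, 0.0001, 0.01)        # float-type parameter
decoded = map_from_unit(kernels, 6, 120, as_int=True)  # back to 64
```

Both coordinates now live in the same unit interval, so a single update rule can move the whole particle without one parameter's scale dominating the others.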

    Fig.1 Schematic diagram of particle distribution before and after optimization

    2.2 Genetic manipulation

    In this paper, to increase the diversity of particles during updating, the selection, crossover, and mutation operations of GA are added to the IPSO. The crossover process of particles is shown in Fig.2. Suppose there are N particles in the swarm, denoted as {X1, X2, X3, …, XN-1, XN}. After evaluation, the fitness values of the particles are marked as {F(X1), F(X2), F(X3), …, F(XN-1), F(XN)}. Firstly, the two particles with the highest fitness values are selected and recorded as Xi and Xj. The intersection point P is calculated using Eq.(4). Taking N = 6 as an example, the selection result is shown in Fig.2(a). Then, using the intersection point P as the boundary, the particles Xi and Xj are divided into four parts: Xi-front, Xi-after, Xj-front, and Xj-after, as shown in Fig.2(b). Finally, Xi-front is spliced with Xj-after, and Xj-front is spliced with Xi-after, forming two new particles Xi-new and Xj-new, as shown in Fig.2(c).
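    The crossover described above amounts to a single-point swap of the two particles' tails. A minimal sketch follows; the intersection point `p` is passed in directly, since Eq.(4) is computed separately and is not reproduced here.

```python
def crossover(xi, xj, p):
    """Single-point crossover at intersection point p: splice the front of
    one particle with the tail of the other (Fig.2(b)-(c))."""
    xi_new = xi[:p] + xj[p:]   # Xi-front + Xj-after
    xj_new = xj[:p] + xi[p:]   # Xj-front + Xi-after
    return xi_new, xj_new

# N = 6 parameters per particle, as in the paper's example (values illustrative)
xi = [1, 2, 3, 4, 5, 6]
xj = [10, 20, 30, 40, 50, 60]
a, b = crossover(xi, xj, p=3)
# a == [1, 2, 3, 40, 50, 60], b == [10, 20, 30, 4, 5, 6]
```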

    Fig.2 An example illustration of the crossover process

    The schematic diagram of particle mutation is presented in Fig.3, again taking N = 6 as an example. Firstly, the complete information of the initial particle is obtained. Then, an integer in [1, N] is randomly generated, where N denotes the total number of optimized parameters. Finally, a randomly generated number within the corresponding parameter scope of that position replaces the original value, completing the mutation operation.
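    The mutation step can be sketched similarly; the `scopes` list of per-parameter bounds and the example values are illustrative, not from the paper.

```python
import random

def mutate(particle, scopes):
    """Mutation as described in Fig.3: pick one random position and replace
    it with a fresh value drawn from that parameter's own scope."""
    pos = random.randrange(len(particle))      # random position in [1, N]
    low, up = scopes[pos]
    mutated = list(particle)
    mutated[pos] = random.uniform(low, up)     # fresh value within the scope
    return mutated

scopes = [(6, 120)] * 6                        # N = 6, illustrative bounds
child = mutate([10, 20, 30, 40, 50, 60], scopes)
```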

    Fig.3 Schematic diagram of mutation process

    In the raw PSO method, the particle positions participating in the next iteration are determined by the optimal solution obtained in the current searching round and the historical optimal solution. The update domain of raw PSO is therefore limited to the zone shown with dashed lines in Fig.4, and the search can only stop at the optimum of this local domain. If the global optimum lies outside this interval, the final result will deviate from the correct one.

    The GA operations help to jump out of this limitation by introducing new particles. They enhance the diversity of particles during the iteration of the swarm and make the global optimum easier to locate.

    Fig.4 Comparison of particle positions before and after optimization

    2.3 Overall scheme

    The flow chart of hyperparameter configuration based on the proposed IPSO is shown in Fig.5; there are three steps as follows.

    Step 1 Initialize the swarm and its particles. The learning factors c1 and c2, the weight coefficient w, the number of particles, and the maximum particle speed need to be initialized at the beginning. At the same time, space is opened up to store the local and global optimal values, and a specified number N of particles is randomly generated to form a swarm according to the number and scope of the parameters to be optimized.

    Step 2 Calculate the fitness value of the particles. The values of each particle are sequentially sent to the designated neural network for training and testing, and the accuracy obtained is recorded as the fitness value of the particle.

    Step 3 Update the particles. According to Algorithm 1, the particles in the current swarm are updated to complete the subsequent iteration.

    Algorithm 1 Particle updating
    Input: Primitive swarm D(X), Fitness function F(X)
    Output: The updated swarm D(X)
    1:  function update(D(X))
    2:      Initialize max1 and max2
    3:      for i in length(D(X))
    4:          Evaluate F(Xi)
    5:          set max1 and max2
    6:      end for
    7:      update gbest and pbest
    8:      new1, new2 = crossover(max1, max2)
    9:      D(X) = mutate(new1, new2)
    10:     return D(X)
    11: end function
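    Putting the pieces together, Algorithm 1 can be sketched as a single Python function. This is a hypothetical helper assuming the normalized [0, 1] coding of Section 2.1 and assuming the two new particles replace the two worst ones; the toy `sum` fitness stands in for the network accuracy of Step 2, and the crossover point is drawn at random since Eq.(4) is not reproduced here.

```python
import random

def update_swarm(swarm, fitness):
    """Sketch of Algorithm 1: evaluate fitness, keep the two best particles
    (max1, max2), then apply the GA crossover and mutation of Section 2.2."""
    scored = sorted(swarm, key=fitness, reverse=True)
    max1, max2 = scored[0], scored[1]          # two highest-fitness particles
    p = random.randrange(1, len(max1))         # crossover point (Eq.(4) not shown)
    new1 = max1[:p] + max2[p:]
    new2 = max2[:p] + max1[p:]
    for child in (new1, new2):                 # mutation: refresh one coordinate
        child[random.randrange(len(child))] = random.random()
    return scored[:-2] + [new1, new2]          # new particles replace the two worst

swarm = [[random.random() for _ in range(6)] for _ in range(8)]
updated = update_swarm(swarm, fitness=sum)     # toy fitness: sum of coordinates
```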

    Fig.5 Flow chart of the IPSO

    3 Experiments and results analysis

    3.1 Implementation details

    The initialization of the algorithm is completed according to the parameters shown in Table 2.The proposed algorithm is verified under different network structures and data sets.The experimental environment of this paper is shown in Table 3.

    Table 2 IPSO algorithm related parameters

    Table 3 Experimental environment setting

    3.2 Experimental results and analysis

    3.2.1 Under different network structures

    In this work, LeNet-5, ResNet-18, VGG-16, MobileViT, and long short-term memory (LSTM) networks are selected as test objects. MNIST was used for LeNet-5 and LSTM, Flowers-Recognition was used for MobileViT, and the remaining networks were tested with CIFAR10. This research uses the parameters searched by IPSO and by PSO to complete the training task. The changes of accuracy with epoch during training are shown in Fig.6. It can be seen that, for the same training epoch, no matter which neural network is used, the parameter configuration searched by IPSO makes the neural network converge faster than that searched by PSO. In addition, Fig.6(c) and Fig.6(d) show a more apparent difference in accuracy.

    Fig.6 Accuracy rate change with Epoch

    Using the parameters searched by IPSO and by PSO to complete training, the final accuracies are compared in Table 4. It is clear that after tuning by IPSO, the CNN models finally reach higher accuracy than after tuning by PSO; the accuracies increase by about 0.4%, 1.9%, and 4.5%, respectively. On the ViT model, the network built by IPSO completes the image classification task with an accuracy of 66.21%, about 13% higher than that of raw PSO. IPSO and PSO achieve similar accuracy on the recurrent neural network. In the ViT experiment, IPSO and PSO differ by only 1 in parameter 1, but the final network structures differ significantly due to layer-by-layer accumulation. Meanwhile, although the Flowers-Recognition dataset used in this paper is small, different parameter configurations still bring obvious differences in performance. LSTM is more inclined toward NLP applications, and this paper considers only two network-related parameters for it, the hidden state size and the number of recurrent layers. Although changing them still changes the number of parameters, the effect on feature extraction is not obvious and can be considered negligible.

    Table 4 Comparison of the accuracy of different algorithms on different network structures

    3.2.2 Under different datasets

    To verify the effectiveness of the proposed IPSO for hyperparameter seeking, three datasets, MNIST, Fashion-MNIST, and CIFAR10, are selected.

    (1) The MNIST dataset includes 70 000 handwritten digital images.The training set contains 60 000 samples, while the test set contains 10 000 samples.The dataset is categorized into 10 classes related to 10 numerals.

    (2) The Fashion-MNIST dataset contains 10 categories of images: T-shirt, trouser, pullover, dress, coat, sandal, shirt, sneaker, bag, and ankle boot. The training set has a total of 60 000 samples, and the test set has a total of 10 000 samples.

    (3) The CIFAR-10 dataset includes 60 000 color images from 10 categories, with 6000 images per category.These 10 different categories represent aircraft, cars, birds, cats, deer, dogs, frogs, horses, boats, and trucks.The training set contains 50 000 samples (83% of the original dataset),while the test set contains 10 000 samples (17% of the original dataset).

    The parameters searched by IPSO on the above three datasets achieve 99.58%, 93.39%, and 78.96% accuracy when training the neural network. Compared with PSO, accuracy increases by about 0.1%, 0.5%, and 2%, respectively. Compared with SSO[13], the same accuracy is achieved on MNIST, while on Fashion-MNIST and CIFAR10 accuracy increases by 0.36% and 5.83%, respectively. Compared with DPSO[11], the model accuracy obtained with IPSO on MNIST and Fashion-MNIST is slightly higher; the final results are shown in Table 5. In addition, to compare with GWO[10], the network's optimization parameters are adjusted to be consistent with those in GWO, including batch size, epochs, and optimizer. The results are shown in Table 6. From the table, it can be seen that the parameter combination searched by the IPSO algorithm completes training on the Fashion-MNIST dataset with higher test accuracy than the GWO algorithm.

    Table 5 Comparison of the accuracy of different algorithms on different datasets

    Table 6 Accuracy comparison on the Fashion-MNIST dataset

    4 Conclusion

    This paper proposes an IPSO algorithm that integrates GA to address the issues of traditional PSO, such as easily falling into local optima and low convergence accuracy during seeking. The performance is verified using different types of neural network models and datasets. Experimental results show that the proposed IPSO achieves higher accuracy than traditional PSO on CNN and ViT models tested with the Fashion-MNIST and CIFAR10 datasets. Moreover, with optimized parameter configurations, the model is more stable and converges faster during training. However, this paper considers only a limited number of parameters to be optimized, while other parameters affecting the neural network structure, such as the depth and number of convolutional layers, could also be searched for the optimal solution. In the future, it is therefore worth considering the fusion of parameters at different levels to find the optimal network structure and model parameters for better performance.
