
Physics informed memory networks for solving PDEs: implementation and applications

    Communications in Theoretical Physics, 2024, Issue 2

    Jiuyun Sun, Huanhe Dong and Yong Fang

    College of Mathematics and Systems Science, Shandong University of Science and Technology, Qingdao 266590, China

    Abstract With the advent of physics informed neural networks (PINNs), deep learning has gained interest for solving nonlinear partial differential equations (PDEs) in recent years. In this paper, physics informed memory networks (PIMNs) are proposed as a new approach to solving PDEs by using the physical laws and the dynamic behavior of PDEs. Unlike the fully connected structure of the PINNs, the PIMNs construct the long-term dependence of the dynamic behavior with the help of the long short-term memory network. Meanwhile, the PDE residuals are approximated using difference schemes in the form of convolution filters, which avoids information loss in the neighborhood of the sampling points. Finally, the performance of the PIMNs is assessed by solving the KdV equation and the nonlinear Schrödinger equation, and the effects of difference schemes, boundary conditions, network structure and mesh size on the solutions are discussed. Experiments show that the PIMNs are insensitive to boundary conditions and have excellent solution accuracy even with only the initial conditions.

    Keywords: nonlinear partial differential equations, physics informed memory networks, physics informed neural networks, numerical solution

    1. Introduction

    Partial differential equations (PDEs) are widely used to describe nonlinear phenomena in nature [1–3], and solving PDEs helps in understanding the physical laws behind these phenomena [4–7]. However, analytical solutions of PDEs are often very difficult to obtain [8]. Accordingly, numerical methods have been proposed and have promoted the study of PDEs [9, 10]. Due to the large computational demands of these methods, it is difficult to achieve accuracy and efficiency simultaneously when solving PDEs.

    Figure 1. The general structure of LSTM.

    In recent years, deep learning methods have been extended from natural language recognition and machine translation to scientific computing, and have provided new ideas for solving PDEs [11–13]. According to the universal approximation theorem, a multilayered feedforward network containing a sufficient number of hidden neurons can approximate any continuous function with arbitrary accuracy [14, 15], which provides the theoretical support for solving PDEs with deep learning. At this stage, there are two types of deep learning methods for solving PDEs [16]. The first type keeps the same learning approach as the original deep learning methods. The basic idea is to construct neural operators by learning mappings from function parameters to solutions, as in DeepONet and the Fourier neural operator [17, 18]. This type needs to be trained only once to handle different initial value problems, but requires a large amount of high-fidelity data. The second type combines deep learning with physical laws: the physical laws and a small amount of initial and boundary data of the PDEs are used to constrain the network training instead of large amounts of labeled data. The representative method is physics informed neural networks (PINNs), which can solve both nonlinear PDEs and the corresponding inverse problems [19]. Based on the original PINNs, many improved versions have been proposed [20–26]. Ameya et al set a scalable hyper-parameter in the activation function and proposed an adaptive activation function with better learning capability and convergence speed [22]. Lin and Chen devised a two-stage PINNs method based on conserved quantities, which better exploits the properties of PDEs [23]. In addition to the physical laws, the variational residual of PDEs has also been considered as a loss term, as in the deep Ritz method and the deep Galerkin method [27–29]. With the intensive study of the PINNs and their variants, these algorithms have been applied to many fields, such as biomedical problems [30], continuum micromechanics [31] and stiff chemical kinetics [32]. Inspired by the PINNs, some deep learning solvers with non-fully connected structures have been proposed. Zhu et al constructed a physics-constrained convolutional encoding-decoding structure for stochastic PDEs [33]. Based on the work of Zhu et al, physics-informed convolutional-recurrent networks combined with the long short-term memory network (LSTM) were proposed, with the initial and boundary conditions hard-encoded into the network [34]. Mohan et al used an extended convolutional LSTM to model turbulence [35]. Based on this, Stevens et al exploited the temporal structure [36]. In general, existing deep learning solvers with LSTM structures can approximate the dynamic behavior of the solution without any labeled data, but the implementations rely on feature extraction by convolutional structures and are complex.

    The aim of this paper is to build a flexible deep learning solver based on physical laws and temporal structures. Therefore, physics informed memory networks (PIMNs) based on the LSTM framework are proposed as a new method for solving PDEs. In the PIMNs, the differential operators are approximated using difference schemes rather than automatic differentiation (AD). AD is flexible and ingenious, but loses information about the neighbors of the sampling points [37]. The difference schemes are implemented as convolution filters, and the convolution filters are only used to calculate the physical residuals and do not change with network training. This is different from existing solvers with convolutional structures. Numerical experiments on the KdV equation and the nonlinear Schrödinger equation show that the PIMNs can achieve excellent accuracy and are insensitive to the boundary conditions.

    The rest of the paper is organized as follows. In section 2, the general principle and network architecture of the PIMNs are elaborated. In section 3, two sets of numerical experiments are given and the effects of various factors on the learned solution are discussed. The conclusion is given in the last section.

    2. Physics informed memory networks

    2.1. Problem setup

    In general, the form of PDEs that can be solved by physics informed deep learning is as follows:

    where u(x, t) is the solution of the PDE and N[u] is a nonlinear differential operator. u0(x) is the initial function, and u1(t), u2(t) are boundary functions. The basic idea of physics informed deep learning is to approximate the solution u(x, t) through the constraints of the physical laws [15]. Therefore, the PDE residual f(x, t) is defined as

    The keys of the PIMNs are the calculation of the physical residual f(x, t) using difference schemes and the establishment of the corresponding long-term dependence.

    2.2. Physics informed memory networks

    In this part, the framework and principles of the PIMNs are given. As shown in figure 2, the basic unit of the PIMNs is the LSTM. The LSTM inherits the capability of the recurrent neural network for long sequence data and can avoid the problem of vanishing gradients [38].

    The structure of the LSTM unit is shown in figure 1. Xt, ht, ct, c̃t, ft, it and ot are the input, the hidden state, the cell state, the internal cell state, the forget gate, the input gate and the output gate, respectively. ft and it control the information forgotten from and added to ct, and c̃t is the information added to ct. The output is determined jointly by ot and ct. The mathematical expression of the LSTM is as follows:

    Here, W and b are the network parameters, and ⊙ represents the Hadamard product.
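As an illustration of these gate equations, here is a minimal single-step LSTM cell in plain NumPy. This is a sketch, not the paper's code; the stacked layout of W and b, mapping the concatenation [h_prev; x] to the four gates, is an assumption (frameworks order the gates differently).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W stacks the four gate weights acting on [h_prev; x],
    b stacks the four gate biases; n is the hidden size."""
    n = h_prev.size
    z = W @ np.concatenate([h_prev, x]) + b  # stacked pre-activations
    f = sigmoid(z[0 * n:1 * n])              # forget gate f_t
    i = sigmoid(z[1 * n:2 * n])              # input gate i_t
    c_tilde = np.tanh(z[2 * n:3 * n])        # internal (candidate) cell state
    o = sigmoid(z[3 * n:4 * n])              # output gate o_t
    c = f * c_prev + i * c_tilde             # Hadamard products, as in the text
    h = o * np.tanh(c)                       # hidden state h_t
    return h, c
```

Unrolling `lstm_step` over the time coordinates of one spatial point reproduces the recurrent structure of figure 2.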

    Figure 2. The general structure of the PIMNs.

    In the PIMNs, the subscripts and superscripts of h and c refer to the index of the time step and the layer, respectively. The LSTM unit passes the output ht of the current moment t to the next moment t + 1, which links all moments of the same spatial point. In addition, ht is also used as the input at moment t for the next LSTM layer, which strengthens the connection between different moments. It should be noted that the number of hidden nodes of the last LSTM layer is fixed; in fact, the last LSTM layer is used to control the dimensionality of the output.

    As shown in figure 2, the inputs to the PIMNs are the coordinates of the grid points in the region [x0, x1] × [0, T]. The region [x0, x1] × [0, T] is meshed into (m + 1) × (n + 1) grid points. Each set of inputs to the PIMNs is the coordinate values (xi, t0), (xi, t1), …, (xi, tn) at the same spatial location. The outputs are the corresponding predicted values (corresponding to each row of the left panel of figure 3). Based on the outputs of the PIMNs, the loss function can be constructed. The loss function includes three components:

    Here, δx and δt represent the spatial and temporal intervals, respectively. To accelerate training, the difference schemes are implemented by convolution operations. Taking the second-order central difference schemes as an example, the convolution filters are:

    Figure 3 illustrates the computation of ux and ut. Higher order derivatives can be obtained by performing a difference operation on ux and ut, such as

    Figure 3. The convolution process of difference schemes.
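The unpadded convolution-style differencing described above can be sketched as follows. `central_diff` is an illustrative helper, not code from the paper; it applies the second-order central filter [−1, 0, 1]/(2δ) along one axis of the solution grid.

```python
import numpy as np

def central_diff(u, delta, axis):
    """Second-order central difference of a gridded field u along `axis`,
    with no padding, so the result lives on interior points only."""
    u = np.moveaxis(u, axis, -1)
    d = (u[..., 2:] - u[..., :-2]) / (2.0 * delta)   # filter [-1, 0, 1]/(2*delta)
    return np.moveaxis(d, -1, axis)

# Example: u(x) = sin(x) replicated over 5 time rows, so u_x should be cos(x).
dx = 0.01
x = np.arange(0.0, 1.0, dx)
u = np.tile(np.sin(x), (5, 1))                 # grid u[j, i]: rows = time
ux = central_diff(u, dx, axis=1)               # spatial derivative
# Higher derivatives by repeated differencing, e.g. u_xxx:
uxxx = central_diff(central_diff(ux, dx, axis=1), dx, axis=1)
```

Each application trims one point from both ends of the differenced axis, which is exactly the shrinking-domain behavior discussed next.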

    As shown in figure 3, no padding is applied, so as to avoid introducing unnecessary errors. This means that ut at t = t0, t = tn and ux at x = x0, x = xm cannot be computed (for the second-order central difference). In other words, the residual f cannot be computed at t = t0 (i.e. the initial condition) due to the lack of values before t0. A similar problem arises at the boundary conditions. Therefore, the residuals at t = t0, x = x0 and x = xm are not used to compute the loss. When higher order derivative terms are included in the residual, the region in which losses are not computed is correspondingly enlarged. In fact, since the spatial and temporal intervals are usually small and the edges of the domain are constrained by the initial and boundary conditions, the numerical solutions of the PDEs can still be approximated well without computing the residual near the boundaries of the domain. Moreover, the boundary conditions are not necessary, which is tested in the numerical experiments; correspondingly, the total loss excludes the boundary loss when the boundary conditions are missing. In addition, the spatial derivatives at t0 are constructed from the known initial conditions. Therefore, besides being the target of the initial loss, the initial conditions are also used indirectly in the PDE residuals.
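A minimal sketch of how the total loss might be assembled under these conventions. The function and argument names are hypothetical, not from the paper's implementation; the boundary term is optional, matching the discussion above.

```python
import numpy as np

def total_loss(u_pred, u0_true, f_interior, u_bc_pred=None, u_bc_true=None):
    """Mean squared loss = initial loss + residual loss (+ optional boundary loss).

    u_pred[:, 0] : predicted solution at t = t0 (columns = time steps)
    f_interior   : PDE residuals on interior points only, since the unpadded
                   convolution cannot produce residuals on the grid edges
    """
    mse_init = np.mean((u_pred[:, 0] - u0_true) ** 2)
    mse_res = np.mean(f_interior ** 2)
    loss = mse_init + mse_res
    if u_bc_pred is not None:                 # boundary term is optional
        loss += np.mean((u_bc_pred - u_bc_true) ** 2)
    return loss
```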

    3. Numerical experiments

    In this section, the PIMNs are applied to solve the KdV equation and the nonlinear Schrödinger equation. Specifically, the effects of different difference schemes on the solving ability of the PIMNs are discussed, and the network structure and mesh size that minimize the error are investigated. Moreover, the performance of the PIMNs and the PINNs is compared.

    In the numerical experiments, the implementation of the PIMNs is based on Python 3.7 and TensorFlow 1.15. The loss function is chosen as the mean squared loss, and L-BFGS is used to optimize it. All numerical examples reported here are run on a Lenovo Y7000P 2020H computer with a 2.60 GHz 6-core Intel(R) Core(TM) i7-10750H CPU and 16 GB memory. In addition, the relative L2 error is used to measure the difference between the predicted and true values and is calculated as follows:
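As a small sketch, the standard relative L2 error used throughout the experiments is:

```python
import numpy as np

def relative_l2_error(u_pred, u_true):
    """Relative L2 error: ||u_pred - u_true||_2 / ||u_true||_2."""
    return np.linalg.norm(u_pred - u_true) / np.linalg.norm(u_true)
```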

    3.1. Case 1: The KdV equation

    The KdV equation is a classical governing model for the propagation of shallow water waves and has important applications in many areas of physics, such as fluid dynamics and plasmas [39–41]. In general, the KdV equation is given by

    where q0(x) is an initial function, and q1(x) and q2(x) are boundary functions. x0 and x1 are arbitrary real constants. The PDE residual f(x, t) corresponding to the KdV equation is:

    The existence theory of equation (14) can be found in [42]. Here, the traveling wave solution is simulated:

    Taking [x0, x1] and [t0, t1] as [−10, 10] and [0, 1], the corresponding initial and boundary functions are obtained.
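Equation (14) is not reproduced in this text version, so the following check assumes the standard KdV form q_t + 6qq_x + q_xxx = 0, whose one-soliton traveling wave is q = (c/2) sech²((√c/2)(x − ct)). The sketch confirms that second-order central differences, as used by the PIMNs, drive the residual close to zero on this exact solution:

```python
import numpy as np

c = 1.0

def soliton(x, t):
    """One-soliton traveling wave of q_t + 6 q q_x + q_xxx = 0 (assumed form)."""
    return 0.5 * c / np.cosh(0.5 * np.sqrt(c) * (x - c * t)) ** 2

dx, dt = 1e-2, 1e-3
x = np.arange(-5.0, 5.0, dx)
t0 = 0.3
q = soliton(x, t0)
q_t = (soliton(x, t0 + dt) - soliton(x, t0 - dt)) / (2 * dt)        # central in time
q_x = (q[2:] - q[:-2]) / (2 * dx)                                   # central in space
q_xxx = (q[4:] - 2 * q[3:-1] + 2 * q[1:-3] - q[:-4]) / (2 * dx**3)  # third derivative
f = q_t[2:-2] + 6 * q[2:-2] * q_x[1:-1] + q_xxx                     # KdV residual
# f is small everywhere: only the truncation error of the O(dx^2) stencils remains
```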

    3.1.1. Comparison of different difference schemes for solving the KdV equation. In section 2.2, we constructed the PDE residuals by the second-order central difference. However, it is important to discuss which difference scheme is suitable for constructing the time derivatives that establish the long-term dependence of the PDEs. Here, the forward difference, backward difference and central difference are used to compute the temporal derivatives, respectively. The space [−10, 10] is divided into 1000 points and the time [0, 1] is divided into 100 points. Two LSTM layers are used, with 30 and 1 nodes, respectively. Since the initialization of the network parameters is based on random number seeds, 4 sets of experiments with different seeds (i.e. 1, 2, 3, 4 in table 1) were set up to avoid the influence of chance. The relative errors for the four sets of numerical experiments are given in table 1.

    From the data in table 1, the PIMNs can solve the KdV equation with very high accuracy for all three difference methods. The bolded data are the lowest relative L2 errors produced by the same random number seed. It can be clearly seen that the relative L2 errors produced by the central difference are significantly lower than those of the forward and backward differences when using the same network architecture and training data. This indicates that the temporal structure constructed by the central difference is most suitable for solving the KdV equation.

    3.1.2. The effect of boundary conditions on solving the KdV equation. In the PIMNs, the boundary conditions are not necessary due to the establishment of long-term dependencies. Next, we analyze the influence of boundary conditions on the training and results. The experimental setup is consistent with the previous subsection, and four sets of experiments with different random number seeds were again set up.

    Figure 4 shows the loss curves; subplots a–d and e–h correspond to the cases without and with boundary conditions, respectively. From figure 4, it can be seen that the total loss with and without boundary conditions converges to the same level. Although the boundary loss without boundary conditions is more oscillatory than the boundary loss with boundary conditions, the difference is not significant. Meanwhile, the red line shows that the influence of the boundary loss is limited. In terms of the number of iterations, the boundary conditions do not accelerate the convergence of the network, but rather lead to a certain increase in iterations.

    Figure 4. The loss curves: a–d are the loss curves without boundary conditions and e–h are the loss curves with boundary conditions. The blue solid line and the blue dashed line are MSE and MSEB (the left y-axis), respectively, and the red line is the ratio of MSEB to MSE (the right y-axis).

    Table 2 gives the relative errors for the four sets of numerical experiments with and without boundary condition losses, corresponding to figure 4. The two cases have very similar errors; the accuracy of the solution is not affected by the boundary conditions. In general, since the influence of the boundary conditions on both the training process and the relative L2 errors is limited, the PIMNs are insensitive to the boundary conditions. This does not mean that boundary conditions are unimportant for solving PDEs; it only shows that the PIMNs can solve initial value problems for PDEs.

    3.1.3. The effect of network structure on solving the KdV equation. In this part, based on the same training data, we investigate the effect of the network structure on solving the KdV equation by setting different numbers of LSTM layers and hidden nodes. Complex networks tend to be more expressive, but are also more difficult to train. The relative L2 errors for different network structures are given in table 3. The listed numbers of hidden nodes do not include the last LSTM layer (whose hidden node count is 1).

    Table 3 shows the experimental results for different network structures. When the number of LSTM layers and hidden nodes increases, the error shows a tendency to decrease. Although not all data fit this trend, it can still be argued that a complex network structure is beneficial to accuracy.

    3.1.4. The effect of mesh size on solving the KdV equation. In this part, the impact of mesh size on the errors is studied when the region is fixed as [−10, 10] × [0, 1]. More temporal and spatial points mean smaller temporal and spatial steps and finer grids. In general, a finer grid produces a smaller truncation error in the difference schemes, which suggests numerical solutions with smaller errors. However, a fine grid also means a large amount of training data, which makes the model more demanding to train. The number of spatial points is set to 500, 1000, 1500 and 2000. The number of time points is set to 50, 100, 150 and 200. The network structure is chosen with 3 LSTM layers, where the first 2 layers have 50 hidden nodes.

    The errors for different mesh sizes are given in table 4. It can be observed that, for every number of time points, the error is minimal when the number of spatial points is 500. However, changing the number of time points does not have a regular effect on the error. To sum up, excessively increasing the number of grid points and decreasing the grid size does not improve the accuracy of the solution for a fixed region.

    Figure 5 shows the dynamic behavior of the learned solution and the error density diagram when the number of time points is 150 and the number of spatial points is 500. The number of iterations is 622 and the training time is 91 s. The error density diagram shows that the error remains very low and does not change significantly over time. Therefore, the PIMNs can still solve the KdV equation with high accuracy when only the initial conditions are available.

    Figure 5. The traveling wave solution of the KdV equation: the dynamic behavior of the learned solution and the error density diagram.

    3.1.5. Comparison of the PINNs and the PIMNs for the KdV equation. In this part, the KdV equation is solved by the PINNs and by the PIMNs with and without boundary conditions, respectively. To compare the two methods effectively, three sets of comparison experiments were set up with close numbers of parameters. The numbers of hidden layers of the PINNs are 5, 7 and 9, with 50 neurons per layer, giving 10 401, 15 501 and 20 601 parameters. The numbers of initial and boundary points and of collocation points are 100 and 10 000, respectively. The PIMNs have two LSTM layers, with 50, 60 and 70 nodes in the first layer, respectively, giving 10 808, 15 368 and 20 728 parameters. Table 5 shows the relative errors and numbers of parameters for the PINNs with boundary condition losses and for the PIMNs with and without boundary condition losses.

    In terms of the relative errors, all three cases are able to solve the KdV equation with high accuracy. The errors of the PIMNs both with and without boundary conditions are lower than those of the PINNs when the numbers of parameters are close. This indicates that the structure of the PIMNs is more advantageous for reconstructing the solutions of the KdV equation. Also, consistent with subsection 3.1.2, the PIMNs with boundary conditions do not show a significant advantage over the PIMNs without boundary conditions. In summary, the PIMNs can simulate the solution of the KdV equation with only initial conditions, and even achieve higher accuracy than the PINNs.

    3.2. Case 2: Nonlinear Schrödinger equation

    To test the ability of the PIMNs to handle complex PDEs, the nonlinear Schrödinger equation is solved. The nonlinear Schrödinger equation is often used to describe quantum behavior in quantum mechanics and plays an important role in physical fields such as plasmas, fluid mechanics and Bose–Einstein condensates [43, 44]. The nonlinear Schrödinger equation is given by

    where q is a complex-valued solution, q0(x) is an initial function, and q1(x) and q2(x) are boundary functions. The existence theory of equation (18) can be found in [45]. The complex-valued solution q is written as q = u + iv, where u(x, t) and v(x, t) are real-valued functions of x and t. Equation (18) can then be converted into

    The residuals of equation (18) can be defined as

    Here, the traveling wave solution is simulated by the PIMNs and takes the form

    where c = −ω − v²/4 > 0. Taking c = 0.8 and v = 1.5, the traveling wave solution equation (21) reduces to

    Taking [x0, x1] and [t0, t1] as [−10, 10] and [0, 1], the corresponding initial and boundary functions are obtained.
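Equations (18)–(20) are not reproduced in this text version, so the sketch below assumes the standard focusing form i q_t + q_xx + 2|q|²q = 0; writing q = u + iv and collecting real and imaginary parts gives the two coupled residuals, evaluated here with second-order central differences (the function name is illustrative):

```python
import numpy as np

def nls_residuals(u, v, dx, dt):
    """Residuals of i q_t + q_xx + 2|q|^2 q = 0 (assumed form) with q = u + iv,
    on a grid u[j, i], v[j, i] (rows = time, columns = space).
    Unpadded central differences, so only interior points are returned."""
    u_t = (u[2:, 1:-1] - u[:-2, 1:-1]) / (2 * dt)
    v_t = (v[2:, 1:-1] - v[:-2, 1:-1]) / (2 * dt)
    u_xx = (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2]) / dx**2
    v_xx = (v[1:-1, 2:] - 2 * v[1:-1, 1:-1] + v[1:-1, :-2]) / dx**2
    mod2 = u[1:-1, 1:-1]**2 + v[1:-1, 1:-1]**2         # |q|^2
    f_re = -v_t + u_xx + 2 * mod2 * u[1:-1, 1:-1]      # real part of the equation
    f_im = u_t + v_xx + 2 * mod2 * v[1:-1, 1:-1]       # imaginary part
    return f_re, f_im
```

On the exact solution q = sech(x)e^{it} of the assumed equation, both residuals vanish up to the O(δx², δt²) truncation error.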

    3.2.1. Comparison of different difference schemes for solving the nonlinear Schrödinger equation. Similarly to the KdV equation, we first discuss which difference scheme should be used to calculate the temporal derivatives of the solution q. The space [−10, 10] is divided into 1000 points and the time [0, 1] is divided into 100 points. Two LSTM layers are used, with 30 and 2 nodes, respectively. Table 6 shows the results generated by the four sets of random number seeds.

    In table 6, the bolded data are the lowest relative L2 errors produced by the same random number seed. It can be seen that the temporal derivatives constructed by all three difference schemes can successfully solve the nonlinear Schrödinger equation with very small relative L2 errors, and the result generated by the central difference performs better than the other two. Therefore, the central difference is used to calculate the time derivative in the subsequent subsections.

    3.2.2. The effect of boundary conditions on solving the nonlinear Schrödinger equation. To investigate the effect of boundary conditions on the solution of the nonlinear Schrödinger equation, the training processes and experimental results with and without boundary conditions are compared. The network structure and training data continue the previous setup, and four sets of experiments were again set up.

    Figure 6 shows the loss curves of the training process. Subplots a–d show the loss curves without boundary conditions, and subplots e–h show the loss curves with boundary conditions. It is clear that the total loss converges to close levels in the two cases. Although e–h require more iterations than a–d under the influence of the boundary conditions, all ratios are very low, less than 0.01. Therefore, the boundary conditions do not positively influence the training process.

    Figure 6. The loss curves: a–d are the loss curves without boundary conditions and e–h are the loss curves with boundary conditions. The blue solid line and the blue dashed line are MSE and MSEB (the left y-axis), respectively, and the red line is the ratio of MSEB to MSE (the right y-axis).

    Table 7 shows the relative L2 errors with and without boundary conditions, corresponding to figure 6. After adding the boundary loss to the total loss, the error stays at the original level; the influence of the boundary conditions on the errors remains limited. Since the boundary conditions do not positively affect either the training process or the results, they are not necessary for the PIMNs.

    3.2.3. The effect of network structure on solving the nonlinear Schrödinger equation. In this part, we set different numbers of network layers and neurons to study how the network structure affects the errors. The data are the same as those used before. Table 8 shows the results of numerical experiments with different network structures. Note that the structures in table 8 do not include the final LSTM layer.

    From table 8, the error usually decreases as the number of neuron nodes and LSTM layers increases. Due to some chance factors, not all errors satisfy this rule. In summary, a complex structure of the PIMNs is more advantageous for solving differential equations.

    3.2.4. The effect of mesh size on solving the nonlinear Schrödinger equation. In this part, we set different numbers of spatial and temporal points to study how the mesh size affects the errors. The region remains [−10, 10] × [0, 1]. The number of spatial points is set to 500, 1000, 1500 and 2000, and the number of time points is set to 50, 100, 150 and 200. The network structure is 3 LSTM layers, where the first two layers have 50 nodes. The relative L2 errors for different mesh sizes are given in table 9.

    As can be seen from table 9, the relative error decreases as the grid size decreases. However, the error is not minimal at a grid of 2000 × 200. This indicates that the grid determines the relative L2 error to some extent, and moderate adjustment of the mesh size can improve the accuracy of the solution.

    Figure 7 shows the dynamic behavior of the learned solution and the error density diagram when the number of time points is 100 and the number of spatial points is 2000. The number of iterations is 2013 and the training time is 225 s. From the error density diagram, although the overall level of the error is low, it shows an increasing trend over time. In general, the PIMNs can solve the nonlinear Schrödinger equation with high speed and quality.

    Figure 7. The traveling wave solution of the nonlinear Schrödinger equation: the dynamic behavior of the learned solution and the error density diagram.

    3.2.5. Comparison of the PINNs and the PIMNs for the nonlinear Schrödinger equation. In this part, the nonlinear Schrödinger equation is solved using the PINNs and using the PIMNs with and without boundary conditions, respectively, and the differences between the two models are discussed by comparing the relative errors. Similarly to subsection 3.1.5, the parameters of the two models are matched. The PINNs have 5, 7 and 9 layers with 50 nodes in each layer. The numbers of initial and boundary points and of collocation points are 100 and 10 000, respectively. The PIMNs have two LSTM layers, and the first layer has 50, 60 or 70 nodes. Table 10 gives all the relative errors and the numbers of parameters.

    Table 1. The KdV equation: relative L2 errors for different difference schemes.

    Table 2. The KdV equation: relative L2 errors with and without boundary conditions.

    Table 4. The KdV equation: relative L2 errors for different mesh sizes.

    Table 5. The KdV equation: relative L2 errors for the PINNs and the PIMNs.

    Table 6. The nonlinear Schrödinger equation: relative L2 errors for different difference schemes.

    Table 7. The nonlinear Schrödinger equation: relative L2 errors with and without boundary conditions.

    Table 8. The nonlinear Schrödinger equation: relative L2 errors for different network structures.

    Table 9. The nonlinear Schrödinger equation: relative L2 errors for different mesh sizes.

    Table 10. The nonlinear Schrödinger equation: relative L2 errors for the PINNs and the PIMNs.

    From the data in table 10, both the PINNs and the PIMNs can solve the nonlinear Schrödinger equation with very low errors, and all errors are very close except in individual experiments. The PINNs are more advantageous at around 10 000 and 20 000 parameters, and the PIMNs without boundary conditions have lower errors at around 15 000 parameters. That is, the PIMNs without boundary conditions obtain results similar to those of the PIMNs with boundary conditions. This demonstrates the strong generalization ability of the PIMNs when there are no boundary conditions.

    4. Conclusion

    In this paper, the PIMNs are proposed to solve PDEs using physical laws and temporal structures. Unlike the PINNs, the framework of the PIMNs is based on the LSTM, which can establish the long-term dependence of the PDEs' dynamic behavior. Moreover, the physical residuals are constructed using difference schemes, which are similar to the finite difference method and bring better physical interpretability. To accelerate the network training, the difference schemes are implemented using convolution filters. The convolution filters are not involved in the model training and are only used to calculate the physical residuals. The performance and effectiveness of the PIMNs are demonstrated by two sets of numerical experiments, which show that the PIMNs have excellent prediction accuracy even when only the initial conditions are available.

    However, the PIMNs use only second-order central differences and do not use higher-order difference schemes, and solving higher-order PDEs is worth investigating. In addition, most physics informed deep learning methods construct numerical solutions of PDEs. In [46], a neural network model based on generalized bilinear differential operators was proposed to solve PDEs. That method obtains new exact network-model solutions of PDEs by setting the network neurons to different functions, which proves that it is feasible to construct new exact analytical solutions of PDEs using neural networks. How to construct new analytical solutions based on the PIMNs is a very worthwhile research problem; compared with a fully connected structure, it is difficult to set the LSTM units in the same layer to different functions. These are the main research directions for the future.

香蕉久久夜色| 亚洲一卡2卡3卡4卡5卡精品中文| 国产欧美日韩一区二区三| 国产精品乱码一区二三区的特点 | 在线观看免费日韩欧美大片| 一a级毛片在线观看| 欧美午夜高清在线| 黄片大片在线免费观看| 精品日产1卡2卡| 亚洲国产精品合色在线|