
Application of support vector machine in drag reduction effect prediction of nanoparticles adsorption method on oil reservoir's micro-channels*

DI Qin-feng (狄勤豐)1,2, HUA Shuai (華帥)1,2, DING Wei-peng (丁偉朋)1, GONG Wei (龔偉)2, CHENG Yi-chong (程毅翀)1,2, YE Feng (葉峰)1,2

1. Shanghai Institute of Applied Mathematics and Mechanics, Shanghai University, Shanghai 200072, China

2. Shanghai Key Laboratory of Mechanics in Energy and Environment Engineering, Shanghai 200072, China, E-mail: qinfengd@sina.com

(Received August 29, 2013, Revised November 27, 2013)

Due to the complexity of the influence factors in the nanoparticles adsorption method and the limitation of data samples, the support vector machine (SVM) was used to predict the drag reduction effect. The basic concept of the SVM is introduced first, and an ε-SVR program with the radial basis kernel function was established with the help of the MATLAB software. Then, the influence factors of the drag reduction effect of the nanoparticles adsorption method were analyzed. Finally, a prediction model for the drag reduction effect of nanoparticles was established, and the accuracy for the training and prediction samples was analyzed. The results show that the SVM performs well and can be used as a rapid evaluation method for predicting the drag reduction effect of the nanoparticles adsorption method.

Key words: nanoparticles adsorption method, support vector machine (SVM), prediction model, rapid evaluation, enhanced oil recovery

Introduction

Drag reduction by hydrophobic nanoparticles adsorbed in reservoir micro-channels is a new technology to decrease the water injection pressure and enhance the water injection rate, and it has achieved a significant drag reduction effect in oilfield tests. It is of great significance for solving the problem of high injection pressure with insufficient injection rate and for improving the oil recovery during the water flooding development of low permeability oilfields[1-4]. However, the high cost and the long period of assessing the drag reduction effect during the development of matched nanoparticles for a specific oil block have severely constrained the application of this technology.

The support vector machine (SVM) is a novel data mining method based on the statistical learning theory (SLT)[5]. In machine learning, SVMs, also called support vector networks, are supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis[6].

With the development of the SLT, the SVM develops rapidly in theory and application. It has advantages such as high classification accuracy, few parameters, good generalization ability, and a global optimal solution. It demonstrates a number of unique advantages in solving small-sample problems, nonlinear problems, and high-dimensional data problems. The SVM has been successfully used in prediction, data fitting, comprehensive evaluation, pattern recognition, and many other applications[7,8].

Due to the complexity of the influence factors of the nanoparticles adsorption method on the drag reduction effect and the limitation of data samples, the SVM method is chosen to study the drag reduction effect. After a brief introduction of the basic ideas and theory of the SVM, an ε-SVR program with the radial basis function (RBF) kernel was established with the help of the MATLAB software. After the correctness of the program was verified with a one-dimensional function, the program was used to predict the drag reduction effect of the nanoparticles adsorption method[9].

1. SVM basic models

1.1 SVM basic idea

The SVM is developed from the theory of the optimal separating hyperplane in the linearly separable case. The basic idea of the SVM can be explained with the two-dimensional case in Fig.1[10-12].

Fig.1 Optimal separating line in the linearly separable case

In Fig.1, the solid and hollow points represent two types of data samples, respectively. The so-called optimal separating line is the line that not only separates the two types of samples properly, but also maximizes the margin between them. The former guarantees the empirical risk minimization, while the latter minimizes the actual risk. As can be seen from Fig.1, the optimal separating line only depends on a few samples (the support vectors, marked by circles). This property makes the SVM very suitable for solving small-sample problems.

The SVM also has a strong ability to deal with nonlinear problems. For a nonlinear problem, the SVM first maps the original training data x into a higher dimensional feature space, in which they can be separated linearly, via a nonlinear function φ(x), and then obtains the optimal separating hyperplane in this higher dimensional feature space. In the higher dimensional feature space, the inner product φ(x_i)·φ(x_j) can be replaced by a kernel function evaluated in the original space, i.e., K(x_i, x_j) = φ(x_i)·φ(x_j). According to the functional theory, a kernel function corresponds to an inner product in some mapping space when it satisfies Mercer's condition. For the SVM, the kernel function plays an important role in solving nonlinear problems, and the radial basis kernel function is commonly used, which can be expressed as Eq.(1)[13,14]

$$K(x_i, x_j) = \exp\left(-\frac{\|x_i - x_j\|^2}{2\sigma^2}\right) \tag{1}$$

where σ is the kernel parameter.
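As a concrete illustration, the kernel matrix of Eq.(1) can be computed with a few lines of MATLAB. The following is a minimal sketch, not the authors' original code: the function name rbf_kernel and the plain double loop are choices made here, and the normalization 2σ² in Eq.(1) is the common convention assumed in this paper's reconstruction.

```matlab
% rbf_kernel.m -- kernel matrix of Eq.(1) between two sample sets.
% X1: n1-by-d, X2: n2-by-d, sigma: kernel parameter of Eq.(1).
function K = rbf_kernel(X1, X2, sigma)
    n1 = size(X1, 1);
    n2 = size(X2, 1);
    K = zeros(n1, n2);
    for i = 1:n1
        for j = 1:n2
            % squared Euclidean distance between the two samples
            d2 = sum((X1(i, :) - X2(j, :)).^2);
            K(i, j) = exp(-d2 / (2 * sigma^2));
        end
    end
end
```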

1.2 Support vector regression

The support vector regression (SVR) is one type of the SVMs. The optimal hyperplane of the SVR minimizes the error between the estimated value and the true value. A loss function needs to be introduced for the regression prediction problem, and the ε-insensitive loss function is commonly used[10]. For a training set T = {(x_i, y_i), i = 1, ..., n}, x_i ∈ R^d, y_i ∈ R, the SVR with the ε-insensitive loss function can be transformed into the dual problem of a convex quadratic programming optimization, which can be expressed as Eq.(2)

$$\min_{\alpha,\alpha^*}\ \frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i^*-\alpha_i)(\alpha_j^*-\alpha_j)K(x_i,x_j)+\varepsilon\sum_{i=1}^{n}(\alpha_i^*+\alpha_i)-\sum_{i=1}^{n}y_i(\alpha_i^*-\alpha_i)$$

$$\text{s.t.}\quad \sum_{i=1}^{n}(\alpha_i-\alpha_i^*)=0,\quad 0\le\alpha_i,\alpha_i^*\le C,\quad i=1,...,n \tag{2}$$

where C is a constant with C > 0, indicating the penalty factor, and α_i and α_i^* are the Lagrange multipliers. Let α_i^o and α_i^{o*} denote the solution of Eq.(2), then the corresponding regression estimation can be described as

$$f(x)=\sum_{i=1}^{n}(\alpha_i^{o*}-\alpha_i^{o})K(x_i,x)+b^{o} \tag{3}$$

where the bias b^o is obtained from any support vector whose multiplier lies strictly inside (0, C), e.g., for a sample x_j with 0 < α_j^o < C

$$b^{o}=y_j-\sum_{i=1}^{n}(\alpha_i^{o*}-\alpha_i^{o})K(x_i,x_j)+\varepsilon \tag{4}$$

(with +ε replaced by -ε when a component with 0 < α_j^{o*} < C is used instead).

2. ε-SVR program implementation

2.1 ε-SVR programming with MATLAB

The SVM problem can be solved as a quadratic programming problem, which makes it convenient to use the MATLAB software, as it provides a specialized function for solving quadratic programming problems[15]. By using the function "quadprog" in the MATLAB software, the α_i^o and α_i^{o*} in Eqs.(1)-(4) can be obtained, and then the value of b^o can be computed from Eq.(3) and Eq.(4). As a result, the model based on the radial basis kernel function can be determined.
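One possible way of assembling the dual problem of Eq.(2) for "quadprog" is sketched below. This is a minimal sketch under conventions chosen here, not the authors' program: the stacking of the dual variables into z = [α; α*], the tolerance tol, and the averaging of b^o over all free support vectors are implementation choices made for robustness, and the code assumes at least one multiplier lies strictly between 0 and C.

```matlab
% svr_train.m -- hedged sketch: solve the dual problem of Eq.(2) with
% quadprog and assemble the regression model of Eq.(3)/Eq.(4).
% Requires rbf_kernel.m from Section 1.1.
function model = svr_train(X, y, sigma, epsilon, C)
    n = size(X, 1);
    K = rbf_kernel(X, X, sigma);
    % Stack the dual variables as z = [alpha; alpha_star] (2n-by-1), so that
    % Eq.(2) becomes: min 0.5*z'*H*z + f'*z under box/equality constraints.
    H = [K, -K; -K, K];
    H = (H + H') / 2;                       % guard against numerical asymmetry
    f = [epsilon + y; epsilon - y];
    Aeq = [ones(1, n), -ones(1, n)];        % sum(alpha - alpha_star) = 0
    beq = 0;
    lb = zeros(2 * n, 1);
    ub = C * ones(2 * n, 1);
    opts = optimoptions('quadprog', 'Display', 'off');
    z = quadprog(H, f, [], [], Aeq, beq, lb, ub, [], opts);
    alpha  = z(1:n);
    alphas = z(n+1:end);
    beta = alphas - alpha;                  % expansion coefficients of Eq.(3)
    % b^o of Eq.(4), averaged over all free support vectors for stability
    % (assumes at least one multiplier is strictly inside (0, C)).
    tol = 1e-6;
    ia = alpha  > tol & alpha  < C - tol;   % free alpha_i:   +epsilon branch
    is = alphas > tol & alphas < C - tol;   % free alpha_i^*: -epsilon branch
    b = mean([y(ia) - K(ia, :) * beta + epsilon;
              y(is) - K(is, :) * beta - epsilon]);
    model = struct('X', X, 'beta', beta, 'b', b, 'sigma', sigma);
end

% svr_predict.m -- evaluate the regression estimate of Eq.(3) at new inputs.
function yp = svr_predict(model, Xnew)
    Kx = rbf_kernel(model.X, Xnew, model.sigma);    % n-by-m kernel matrix
    yp = Kx' * model.beta + model.b;
end
```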

For the model based on the radial basis kernel function, the regression result is affected by the kernel parameter σ, the insensitive factor ε and the penalty factor C, so an optimization method should be used to determine them[16,17]. The grid search method for parameter optimization was used in this paper, in which the range of each parameter and its step length must be determined in advance, and then a step-by-step search is carried out[18]. The selection of the bounds for σ is crucially important: too small a σ may lead to an over-fitted SVR, while an excessive σ may make the kernel function tend to a constant value. The range of σ was selected based on experience and the fuzzy K-means clustering method. Finally, an optimization criterion, the minimum mean square error (MSE) of the training samples, was used to select the optimal parameters

$$\text{MSE}=\frac{1}{n}\sum_{i=1}^{n}[f(x_i)-y_i]^2 \tag{5}$$

where f(x_i) is the predicted value and y_i is the original value.
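The step-by-step search can be written as three nested loops over the parameter grid, as in the following sketch; it reuses the hypothetical svr_train and svr_predict helpers sketched above, assumes the training samples Xtr, ytr are already in the workspace, and takes the ranges and step lengths from Section 2.2.

```matlab
% Grid search over (sigma, epsilon, C) with the training-sample MSE of
% Eq.(5) as the selection criterion (exhaustive, hence slow but simple).
best_mse = inf;
best = [NaN, NaN, NaN];
for sigma = 0.2:0.2:6                    % ranges and steps from Section 2.2
    for epsilon = 0.1:0.1:3
        for C = 10:10:300
            model = svr_train(Xtr, ytr, sigma, epsilon, C);
            mse = mean((svr_predict(model, Xtr) - ytr).^2);   % Eq.(5)
            if mse < best_mse
                best_mse = mse;
                best = [sigma, epsilon, C];
            end
        end
    end
end
fprintf('optimal sigma = %.2f, epsilon = %.2f, C = %g, MSE = %.4f\n', ...
        best(1), best(2), best(3), best_mse);
```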

Fig.2 ε-SVR model of the MATLAB programming

Through the above analysis, a MATLAB program for the ε-SVR model based on the radial basis kernel function can be designed, and its block diagram is shown in Fig.2.

2.2 Verification of one-dimensional function

In order to verify the correctness of the ε-SVR program, a one-dimensional test function y = (x-1)^2 e^{-x^2/10} + ζ was constructed, in which (x-1)^2 e^{-x^2/10} is the internal relation between the data samples. A normally distributed noise term ζ ~ N(0, 0.01), with a mean value of 0 and a variance of 0.01, was introduced to build this data sample function. In x ∈ [-10, 10], 60 points were randomly selected as the data samples, of which 54 points were randomly picked as the training samples, and the remaining ones were used as the test samples, as shown in Fig.3.
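For concreteness, this data construction can be sketched in MATLAB as follows; the random seed and the use of randperm for the 54/6 split are assumptions made here, since the paper does not specify them.

```matlab
% One-dimensional verification data of Section 2.2.
rng(0);                                    % assumed seed, for reproducibility
x = -10 + 20 * rand(60, 1);                % 60 random points in [-10, 10]
y = (x - 1).^2 .* exp(-x.^2 / 10) ...      % internal relation of the samples
    + 0.1 * randn(60, 1);                  % zeta ~ N(0, 0.01): std = sqrt(0.01)
idx = randperm(60);                        % random 54/6 split
Xtr = x(idx(1:54));    ytr = y(idx(1:54)); % training samples
Xte = x(idx(55:60));   yte = y(idx(55:60));% test samples
```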

Fig.3 One-dimensional function and division of sample collection

In order to optimize σ, ε and C, the initial parameter ranges were selected as follows: σ between 0.2 and 6 with a step length of 0.2, ε between 0.1 and 3 with a step of 0.1, and C between 10 and 300 with a step of 10. The training samples were imported into the program. After the parameter optimization, the optimal σ, ε and C are 0.8, 0.1 and 20, respectively. At this point, the training sample mean square error is 0.0854.

Fig.4 Comparison of predicted and actual values of the one-dimensional function

The model was trained with the optimal parameters and then used to predict the training samples and the testing samples, respectively. The support vector (SV) ratio (the number of support vectors divided by the number of training samples) is 0.87. The result is shown in Fig.4.

Table 1 One-dimensional function test sample training results

As illustrated in Fig.4, there is only a marginal difference between the training sample predicted values (TRSPV) and the training sample true values (TRSTV), and between the testing sample predicted values (TESPV) and the testing sample true values (TESTV). The detailed values are given in Table 1. The maximum and minimum absolute errors are about 0.433 and 0.03, respectively. The test sample mean square error is 0.26332. The method achieves an average accuracy of 0.95 in the leave-one-out cross-validation (LOOCV).
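The LOOCV quoted above can be reproduced schematically with the loop below; since the paper does not state its accuracy definition for regression, the sketch only reports the leave-one-out squared errors, and it reuses the hypothetical svr_train/svr_predict helpers with the optimal parameters (0.8, 0.1, 20) found above.

```matlab
% Leave-one-out cross-validation over the training samples.
n = numel(ytr);
err = zeros(n, 1);
for i = 1:n
    keep = [1:i-1, i+1:n];                    % leave sample i out
    m = svr_train(Xtr(keep, :), ytr(keep), 0.8, 0.1, 20);
    err(i) = svr_predict(m, Xtr(i, :)) - ytr(i);
end
fprintf('LOOCV mean square error = %.4f\n', mean(err.^2));
```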

The above results show that the MATLAB program based on the radial basis kernel function can get good prediction result and can be used as a prediction tool.

3. Influence factors of the drag reduction effect of the nanoparticles adsorption method

The drag reduction effect of the nanoparticles adsorption method is mainly determined by the properties of the cores, the characteristics of the nanomaterials, the matching between the two, and the experimental environment factors. The main parameters reflecting the core physical properties are the mineral composition, the permeability, the porosity and the oil saturation of the cores. The main parameters reflecting the properties of the nanomaterials are the particle size, the modifier and the activity of the nanomaterials. The matching of the core and the nanomaterials can be characterized by the contact angle of the core with the adsorbed nanoparticles. The adsorption concentration, the adsorption temperature and the adsorption time can be used to describe the environmental characteristics.

The effective aperture of the core micro-channels will change after the adsorption of nanoparticles. The average pore diameter of the core micro-channels is related to the permeability and the porosity of the core. Therefore, the core permeability and the core porosity are two effective reference factors in selecting suitable nanoparticles. The cores used in the experiments were all washed of oil, so the impact of the oil saturation can be disregarded, and they all come from the same block with roughly the same mineral composition, so its influence on the results can be disregarded as well. Nanoparticles of three different sizes were used to study the drag reduction performance of the nanomaterial, so the size of the nanoparticles is regarded as an influence factor. The modifier of the nanoparticles used in the experiments is the same in all cases, and the activity is over 99%, so these two factors are not considered as influence factors. The drag reduction effect is related to the adsorption of the nanoparticles on the core and to the water contact angle of the nanoparticle adsorption layer, which characterizes the surface wettability. The greater the contact angle, the better the drag reduction effect, so the contact angle is taken as an influence factor. As for the experimental environment parameters, the concentration of the nanoparticles in the solution, the temperature and the adsorption time are all fixed, so the experimental environment parameters are not considered.

In the drag reduction effect evaluation, the core flow experiment is a direct and final method, and the ratio of the core water phase permeability after and before the nanoparticles adsorption, k_wa/k_wb, is used to describe the drag reduction effect, so k_wa/k_wb is selected as the characterization parameter of the drag reduction effect, namely the output of the model.

In conclusion, the core porosity, the core water phase permeability before the nanoparticles adsorption, the average particle size of the nanoparticles and the contact angle of the nanoparticle adsorption layer on the core are selected as the influence factors, namely the inputs of the model. The ratio of the core water phase permeability after and before the nanoparticles adsorption in the core flow experiment, k_wa/k_wb, is taken as the characterization parameter of the drag reduction effect, namely the output of the model.
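Once trained, the model is queried exactly like the one-dimensional ε-SVR above. The sketch below only illustrates the four-input/one-output layout of Fig.5; the feature ordering is an assumption, the numeric values are placeholders rather than data from the paper, and model is assumed to come from svr_train applied to the core flow samples.

```matlab
% Rapid evaluation of a new core/nanoparticle combination.
% Assumed input order: [porosity, k_wb (water phase permeability before
% adsorption), nanoparticle size, contact angle]. Placeholder values only.
x_new = [0.15, 5.0, 20, 140];
ratio = svr_predict(model, x_new);      % predicted k_wa/k_wb
fprintf('predicted k_wa/k_wb = %.3f\n', ratio);
```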

Fig.5 The ε-SVR prediction model for the drag reduction effect of the nanoparticles adsorption method

Table 2 Comparison of SVM true and predicted values

4. Prediction model and result analysis

Based on the analysis of the input and output parameters, the drag reduction effect prediction model of the ε-SVR programming for the nanoparticles adsorption method was established, as shown in Fig.5. There are four input nodes and one output node.

Fig.6 Comparison of ε-SVR sample true and predicted values

Due to the small number of training data samples, a properly refined search grid was adopted: the kernel parameter σ is selected between 0.1 and 6 with a step of 0.1, the insensitive factor ε between 0.01 and 3 with a step of 0.01, and the penalty factor C between 10 and 300 with a step of 10. Through the parameter optimization based on the minimum mean square error method, the ε-SVR prediction model of the drag reduction effect of the nanoparticles adsorption was established. The optimal σ, ε and C are 1.10, 0.01 and 20, respectively. In the calculation results, the SV ratio is 0.88. The predicted values and the true values are shown in Table 2, and the contrast diagram is shown in Fig.6.

As shown in Table 2, the ε-SVR programming with the radial basis kernel function has reached a very good prediction accuracy for the training samples, with a maximum relative error of only 0.97%. For the two verification samples, the absolute value of the relative error is between 3.22% and 3.79%, with an average of only 3.5%. The method achieves an average accuracy of 0.942 in the LOOCV.

As shown in Fig.6, with the parameter optimization process, the ε-SVR programming can reach a fairly good prediction precision for the training samples. The results show that the SVM is well suited for the drag reduction effect prediction of the nanoparticles adsorption method.

5. Conclusions

Based on the SVM and the analysis of the drag reduction effect of the nanoparticles adsorption method, the following conclusions can be reached:

(1) The SVM has unique advantages in solving problems involving small samples, nonlinearity, convergence rate, noise resistance, and high-dimensional pattern recognition. An ε-SVR program with the radial basis kernel function was established with the help of the MATLAB software. The results show that this program achieves a fairly good regression model and prediction accuracy for one-dimensional functions.

(2) The core porosity, the water phase permeability, the particle size of the nanoparticles and the contact angle were selected as the input parameters of the drag reduction effect prediction model of the ε-SVR programming for the nanoparticles adsorption method, and the ratio k_wa/k_wb as the output of the model. The results show that, based on the training samples, this model has quite good prediction accuracy for the verification samples, and it can be used as a rapid evaluation tool for the drag reduction effect prediction.

[1] DI Qin-feng, GU Chun-yuan and SHI Li-yi. Pressure drop mechanism of enhancing water injection technology with hydrophobicity nanometer SiO2[J]. Drilling and Production Technology, 2007, 30(4): 91-94(in Chinese).

[2] GU Chun-yuan, DI Qin-feng and FANG Hai-ping. Slip velocity model of porous walls absorbed by hydrophobic nanoparticles SiO2[J]. Journal of Hydrodynamics, Ser. B, 2007, 19(3): 365-371.

[3] ZHANG Ren-liang, DI Qin-feng and WANG Xin-liang et al. Numerical study of wall wettabilities and topography on drag reduction effect in micro-channel flow by lattice Boltzmann method[J]. Journal of Hydrodynamics, 2010, 22(3): 366-372.

[4] ZHANG Ren-liang, DI Qin-feng and WANG Xin-liang et al. Numerical study of the relationship between apparent slip length and contact angle by lattice Boltzmann method[J]. Journal of Hydrodynamics, 2012, 24(4):535-540.

[5] BAI Peng. Support vector machine and its application in mixed gas infrared spectrum analysis[M]. Xi’an, China: Xidian University Press, 2008(in Chinese).

[6] LEE Y.-J., MANGASARIAN O. L. SSVM: A smooth support vector machine for classification[J]. Computational Optimization and Applications, 2001, 22(1): 5-21.

[7] CHANG Tian-tian, LIU Hong-wei. Support vector machine ensemble learning algorithm research based on heterogeneous data[J]. Journal of Xidian University, 2010, 37(1): 136-141(in Chinese).

[8] GUO Hui, WANG Ling and LIU He-ping. Integrating kernel principal component analysis with least squares support vector machines for time series forecasting problems[J]. Journal of University of Science and Technology Beijing, 2006, 28(3): 303-307(in Chinese).

[9] SUYKENS J. A. K. Least squares support vector machine classifiers[J]. Neural Processing Letters, 1999,9(3): 293-300.

[10] BAI Peng, ZHANG Xi-bin and ZHANG Bin. Support vector machine theory and engineering application examples[M]. Xi'an, China: Xidian University Press, 2008(in Chinese).

[11] DENG Nai-yang, TIAN Ying-jie. Support vector machine (SVM): Theory, algorithms, and development[M]. Beijing, China: Science Press, 2009.

[12] HASTIE T. The entire regularization path for the support vector machine[J]. Journal of Machine Learning Research, 2004, 5: 1391-1415.

[13] ZHANG Qian, YANG Yai-quan. Research on the kernel function of support vector machine[J]. Electric Power Science and Engineering, 2012, 28(5): 42-46(in Chinese).

[14] ZHU Shu-xian, ZHANG Yen-jie. Research for selection of kernel functions used in support vector machine[J]. Science Technology and Engineering, 2008, 8(16):4513-4518(in Chinese).

[15] ZHU Guo-qiang, LIU Shi-rong. Support vector machine and its applications to function approximation[J]. Journal of East China University of Science and Technology, 2002, 28(5): 555-559(in Chinese).

[16] XIAO Jian, YU Long and BAI Yi-feng. Survey of the selection of kernels and hyper-parameters in support vector regression[J]. Journal of Southwest Jiaotong University, 2008, 43(3): 297-303(in Chinese).

[17] CHENG Peng, WANG Xi-li. Influence of SVR parameter on non-linear function approximation[J]. Computer Engineering, 2011, 37(3): 190-191(in Chinese).

[18] WANG Xing-ling, LI Zhang-bin. Identifying the parameters of the kernel function in support vector machines based on the grid-search method[J]. Journal of Ocean University of Qingdao, 2005, 35(5): 859-862(in Chinese).

10.1016/S1001-6058(15)60461-9

* Project supported by the National Natural Science Foundation of China (Grant No. 50874071), the Chinese National Programs for High Technology Research and Development (Grant No. SS2013AA061104), the Shanghai Program for Innovative Research Team in Universities, the Shanghai Leading Academic Discipline Project (Grant No. S30106), the Shanghai Leading Talents Project and the Key Program of Science and Technology Commission of Shanghai Municipality (Grant No. 12160500200).

Biography: DI Qin-feng (1963-), Male, Ph. D., Professor
