Some variational data assimilation (VDA) problems of time- and space-discrete models with on/off parameterizations can be regarded as non-smooth optimization problems. Like sub-gradient-type methods, intelligent optimization algorithms, which are widely used in engineering optimization, can also be adopted in VDA because they require neither the gradient nor the sub-gradient of the cost function and are capable of global convergence. Two typical intelligent optimization algorithms, the genetic algorithm (GA) and particle swarm optimization (PSO), are introduced into the VDA of modified Lorenz equations with on/off parameterizations, and two VDA schemes are proposed accordingly: GA-based VDA (GA-VDA) and PSO-based VDA (PSO-VDA). After demonstrating the advantage of GA and PSO over conventional adjoint methods in global searching when the cost function is discontinuous owing to on/off switches, the sensitivities of GA-VDA and PSO-VDA to population size, observational noise, model error and observational density are analyzed in detail. It is shown that, in the context of the modified Lorenz equations, with a proper population size GA-VDA and PSO-VDA can effectively estimate the global optimal solution; PSO-VDA consumes much less computational time than GA-VDA for the same population size, and achieves nearly the same results with a much smaller population. Both methods are not very sensitive to observational noise or model error, while PSO-VDA performs better than GA-VDA under observational noise. It is encouraging that both methods are insensitive to observational density, especially PSO-VDA, with which nearly the same assimilation results can be obtained from comparatively sparse observations.
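To make the approach concrete, the following is a minimal Python sketch of the general idea described in the abstract: a gradient-free PSO minimizing a VDA cost function over the initial state of a Lorenz-type model whose right-hand side contains an illustrative on/off switch. All parameter values, the form of the switch, the observation layout and the function names are assumptions for illustration only, not the paper's actual configuration.

```python
import numpy as np

# Illustrative constants (assumed, not from the paper).
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT, N_STEPS = 0.01, 200
SWITCH_THRESHOLD = 0.0  # on/off parameterization: extra forcing only when z exceeds this


def modified_lorenz(x0, n_steps=N_STEPS):
    """Integrate a Lorenz-type system with a discontinuous (on/off) forcing term."""
    traj = np.empty((n_steps, 3))
    x = np.array(x0, dtype=float)
    for k in range(n_steps):
        dx = SIGMA * (x[1] - x[0])
        dy = x[0] * (RHO - x[2]) - x[1]
        dz = x[0] * x[1] - BETA * x[2]
        if x[2] > SWITCH_THRESHOLD:   # on/off switch -> non-smooth cost function
            dz += 1.0                 # illustrative extra forcing when "on"
        x = x + DT * np.array([dx, dy, dz])
        traj[k] = x
    return traj


def cost(x0, obs, obs_idx):
    """VDA cost: squared misfit between the model trajectory and observations."""
    traj = modified_lorenz(x0)
    return np.sum((traj[obs_idx] - obs) ** 2)


def pso_minimize(cost_fn, bounds, n_particles=20, n_iters=100, w=0.7, c1=1.5, c2=1.5):
    """Plain PSO: no gradients are needed, so the discontinuous cost is handled directly."""
    rng = np.random.default_rng(0)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([cost_fn(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([cost_fn(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()


# Usage: a twin experiment with a known "true" initial state and sparse observations.
true_x0 = [1.0, 1.0, 1.0]
obs_idx = np.arange(0, N_STEPS, 10)
obs = modified_lorenz(true_x0)[obs_idx]
est, val = pso_minimize(lambda x0: cost(x0, obs, obs_idx),
                        bounds=[(-5, 5), (-5, 5), (-5, 5)])
print("estimated initial state:", est, "cost:", val)
```

A GA-based variant would replace `pso_minimize` with selection, crossover and mutation operators acting on the same population and the same cost function; the key point in both cases is that only cost-function evaluations are required, so the discontinuity introduced by the on/off switch does not break the optimization.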