QRPEM – A New Standard of Accuracy, Precision, and Efficiency in NLME Population PK/PD Methods

Summary: A new accurate likelihood EM estimation method, QRPEM (Quasi-Random Parametric Expectation Maximization), has been introduced in the current release of Phoenix NLME. The method belongs to the same general class of parametric EM methods as IMPEM in NONMEM 7, MCPEM in S-ADAPT, and SAEM in MONOLIX, S-ADAPT, and NONMEM 7. QRPEM is distinguished by its use of low-discrepancy (also called ‘quasi-random’) Sobol sequences as the core sampling technique in the expectation step, as opposed to the stochastic Monte Carlo sampling used in the other EM methods. The theoretical best-case accuracy for QR sampling is an error that decays as 1/N, where N is the number of samples, a significant advantage over the slower 1/√N error decay rate characteristic of stochastic sampling. Problems typically encountered in the population PK/PD NLME domain are characterized by relatively low dimensionality and a high degree of smoothness of the function being sampled. This is known to be the ideal case for application of QR techniques and suggests that the best-case 1/N behavior may in fact be achievable.
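To make the contrast concrete, the short Python sketch below (not Phoenix NLME code; the test integrand is a hypothetical stand-in for a posterior moment) estimates a smooth, low-dimensional Gaussian expectation two ways: with plain pseudo-random Monte Carlo draws and with scrambled Sobol points mapped through the inverse normal CDF. It assumes SciPy 1.7 or later for the scipy.stats.qmc module.

```python
# Illustrative comparison of Monte Carlo vs. quasi-random (Sobol) error decay
# on a smooth, low-dimensional expectation of the kind that appears in an
# EM expectation step. Not Phoenix NLME code.
import numpy as np
from scipy.stats import qmc, norm

rng = np.random.default_rng(0)

def smooth_integrand(x):
    # Smooth test function standing in for a posterior moment (hypothetical).
    return np.exp(-0.5 * np.sum(x**2, axis=1)) * np.cos(x[:, 0])

dim = 3
# Reference value from a very large pseudo-random sample.
x_ref = rng.standard_normal((2_000_000, dim))
reference = smooth_integrand(x_ref).mean()

for n in (256, 1024, 4096, 16384):  # powers of 2, as Sobol sampling prefers
    # Plain Monte Carlo: pseudo-random normal draws.
    mc_est = smooth_integrand(rng.standard_normal((n, dim))).mean()

    # Quasi-random: Sobol points in [0,1)^dim mapped to normals via the inverse CDF.
    sobol = qmc.Sobol(d=dim, scramble=True, seed=0)
    qr_est = smooth_integrand(norm.ppf(sobol.random(n))).mean()

    print(f"N={n:6d}  MC error={abs(mc_est - reference):.2e}  "
          f"QR error={abs(qr_est - reference):.2e}")
```

For a smooth integrand like this, the quasi-random error typically shrinks much faster with N than the Monte Carlo error, which is the behavior the paragraph above describes.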

A second distinguishing feature of QRPEM is its use of the SIR (sampling-importance-resampling) algorithm to greatly improve computational efficiency for models where fixed effects cannot be driven by a simple EM update based on the estimated mean and covariance matrix of each subject's posterior distribution. These include commonly occurring cases such as non-linear covariate models, compound additive-proportional residual error models, standalone fixed effects, and, in general, cases where so-called ‘mu-modeling’ is not or cannot be used to specify all structural parameters. In such cases a computationally expensive auxiliary log-likelihood optimization is introduced to drive the iterative updating of these fixed effects. The SIR algorithm allows this optimization, which tends to be the largest computational expense in the overall estimation procedure, to be greatly reduced in size and complexity, with consequent large reductions in overall run time.
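The following minimal Python sketch illustrates the general SIR idea referred to above; it is an assumption-laden toy example, not the Phoenix NLME implementation. A large importance sample from a cheap Gaussian proposal is reduced, by weighting and resampling, to a much smaller set that is approximately distributed according to a (hypothetical) posterior, so that any subsequent likelihood optimization can operate on far fewer points.

```python
# Minimal sampling-importance-resampling (SIR) sketch. The log_posterior below
# is a made-up one-dimensional stand-in for a subject's posterior; in practice
# the resampled points would feed the auxiliary log-likelihood optimization.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def log_posterior(theta):
    # Hypothetical unnormalized log posterior for a single random effect.
    return -0.5 * (theta - 1.3)**2 / 0.4 - 0.1 * theta**4

# Step 1: draw a large sample from a cheap proposal (here a wide normal).
n_big = 20_000
proposal_mean, proposal_sd = 0.0, 2.0
draws = rng.normal(proposal_mean, proposal_sd, size=n_big)

# Step 2: importance weights = posterior / proposal, computed on the log scale.
log_w = log_posterior(draws) - norm.logpdf(draws, proposal_mean, proposal_sd)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Step 3: resample a much smaller set with probability proportional to the weights.
n_small = 200
resampled = rng.choice(draws, size=n_small, replace=True, p=w)

# The small resample approximates posterior draws and can be used to evaluate
# the auxiliary objective far more cheaply than the full importance sample.
print("posterior mean (weighted):  ", np.sum(w * draws))
print("posterior mean (resampled): ", resampled.mean())
```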
