User-friendly and powerful nonlinear mixed effect modeling software

Phoenix NLME™ lets you focus on your models, not the tools required to implement them. Phoenix NLME is population modeling and simulation software for scientists with varying levels of experience—from novice PK/PD scientists to the most advanced modelers. This comprehensive package includes integrated data preparation, modeling, and graphics tools with the same user interface that is used in Phoenix WinNonlin™.

18 of the top 20 pharmaceutical companies in the world use Phoenix NLME.

PHOENIX VERSION 8.4

Accomplish more with the latest release of the Phoenix platform:

  • Usability Enhancements: A collection of improvements makes the user experience more intuitive and streamlined, so your daily tasks run more smoothly and efficiently. For example, duplicating objects within a workflow saves time and lets you work more effectively.
  • Auditability Improvements: We understand the importance of maintaining a robust audit trail. Among the auditability features in Phoenix 8.4 is the ability to easily set a user preference for UTC or the system time zone for date/time formatting.
  • Compatibility Upgrades: This release makes Phoenix more compatible than ever. For example, Phoenix 8.4 now supports more data file types, including sas7bdat files and SAS transport (XPT) files through v9.
  • Watch the What’s New video for a tour of the enhancements.

DOWNLOAD NOW

Widely used for regulatory submissions

Phoenix NLME meets all scientific and technical requirements for PK/PD analyses. Because of its flexibility and power, Phoenix NLME is used across drug development phases and for global regulatory submissions, including:

  • Translational modeling
  • Extrapolation of PK data from animals to humans
  • Prediction of the pharmacodynamics in humans based on in vitro models
  • Combination of PK and PD data from multiple animal studies
  • Enhancement of study designs to minimize animal use
Get A Quote
Grid computing and algorithms for faster results

  • Includes QRPEM, the fastest expectation maximization algorithm available
  • Comes grid-enabled with an algorithm that uses the maximum number of cores for each run, cutting run times from days to minutes. Send jobs to a remote computing platform with one click.
  • Offers separate analyses for different types of data, such as continuous data with BQLs, categorical data, and time-to-event data, with Visual Predictive Check
  • Simplifies coding delays in PK/PD models with the distributed delay function, which can be used in place of transit compartments, dual absorption models, effect compartment models, and indirect response models.
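As a rough illustration of the idea (this is plain Python, not PML, and not Phoenix's implementation), a distributed delay convolves an input signal with a delay-time density; the Erlang special case reproduces a chain of transit compartments. The function names `erlang_pdf` and `delayed_signal` are invented for this sketch:

```python
import math

def erlang_pdf(t, n, rate):
    # Erlang (integer-shape gamma) delay-time density; with shape n and
    # rate n/mean it matches a chain of n transit compartments.
    if t <= 0:
        return 0.0
    return rate**n * t**(n - 1) * math.exp(-rate * t) / math.factorial(n - 1)

def delayed_signal(u, n, mean_delay, t, dt=0.01):
    # y(t) = integral_0^t g(tau) * u(t - tau) dtau, via the trapezoid rule.
    rate = n / mean_delay
    steps = round(t / dt)
    vals = [erlang_pdf(k * dt, n, rate) * u(t - k * dt) for k in range(steps + 1)]
    return dt * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# Hypothetical example: a first-order absorption input (ka = 1/h) passed
# through a distributed delay with a 2 h mean spread over 3 stages.
ka = 1.0
u = lambda t: math.exp(-ka * t)
y = delayed_signal(u, n=3, mean_delay=2.0, t=4.0)
```

The appeal of the approach is that swapping the delay-time density changes the delay mechanism without re-deriving compartment equations; in PML this is what the built-in distributed delay function abstracts away.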
Download brochure

Intuitive graphical user interface

  • Develop models without coding with our graphical model editor and model library. Advanced users can code using Phoenix Modeling Language (PML)
  • Avoid poor parameterization with the graphical tool for initial model parameter estimates
  • Quickly troubleshoot issues with diagnostic dialog messages that pinpoint errors
  • Compare models side-by-side with the comparer tool
  • Automatically generate diagnostic outputs for each model to assess model robustness
Contact us
Access Phoenix Anywhere!

Phoenix Hosted empowers research teams with enhanced performance and increased flexibility in their daily NLME work. Phoenix Hosted supports various operating systems and hardware and scales easily to meet organizational needs. The solution facilitates seamless collaboration on PK projects and, as an added productivity booster, allows extensive processing to occur remotely while users focus on scientific tasks within Phoenix. IT departments also benefit from Phoenix Hosted, as it reduces their workload by eliminating the need for desktop software installation and management.

Learn more