Automatic differentiation vs. phylogenetic gradient computation

Yuwei Bao · June 9, 2023

The paper Automatic differentiation is no panacea for phylogenetic gradient computation [1] compares the performance of automatic differentiation (AD) as implemented in machine-learning libraries against six other gradient implementations of the phylogenetic likelihood functions. The tools and analyses can be found on the project's GitHub page [2].

What is automatic differentiation?

Automatic differentiation leverages the chain rule of calculus to compute derivatives of complex functions by decomposing them into elementary operations, which yields results that are exact up to floating-point round-off. It is used in machine learning to efficiently compute the gradients needed to train neural networks.
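For example, to differentiate f(x) = sin(x²), AD decomposes f into the intermediate steps w₁ = x² and w₂ = sin(w₁), evaluates the local derivatives dw₁/dx = 2x and dw₂/dw₁ = cos(w₁), and chains them together to obtain f′(x) = 2x · cos(x²) exactly, rather than approximately.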

Analytic derivatives: the exact derivative, derived symbolically by hand or with a computer algebra system.

Numeric derivatives: an approximation obtained by evaluating the function at nearby points and taking finite differences.
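To make the distinction concrete, here is a minimal sketch (not taken from the paper or its benchmark) that computes the gradient of a toy two-taxon Jukes-Cantor log-likelihood with respect to a branch length in all three ways, using JAX for the AD version; the site counts n_same and n_diff, the step size h, and the evaluation point are made-up values for illustration.

import jax
import jax.numpy as jnp

# Made-up counts of matching / mismatching alignment columns for two taxa.
n_same, n_diff = 90.0, 10.0

# Log-likelihood of branch length t under the Jukes-Cantor model,
# treating each site as simply "same" or "different" at the two tips.
def log_likelihood(t):
    e = jnp.exp(-4.0 * t / 3.0)
    p_same = 0.25 + 0.75 * e   # P(tips match at a site | t)
    p_diff = 0.75 * (1.0 - e)  # P(tips differ at a site | t)
    return n_same * jnp.log(p_same) + n_diff * jnp.log(p_diff)

# Analytic derivative, worked out by hand: d/dt p_same = -e, d/dt p_diff = e.
def grad_analytic(t):
    e = jnp.exp(-4.0 * t / 3.0)
    return -n_same * e / (0.25 + 0.75 * e) + n_diff * e / (0.75 * (1.0 - e))

# Numeric derivative: central finite difference with step size h.
def grad_numeric(t, h=1e-3):
    return (log_likelihood(t + h) - log_likelihood(t - h)) / (2.0 * h)

# Automatic differentiation: JAX traces log_likelihood and applies the chain rule.
grad_ad = jax.grad(log_likelihood)

t = 0.1
print(grad_analytic(t), grad_numeric(t), grad_ad(t))

All three agree to several digits, but only the analytic and AD results are exact up to floating-point precision; the accuracy of the finite-difference result depends on the choice of h.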

Running the pipeline with Docker

Set up Nextflow [3]

java -version # make sure Java 11 or later is installed
curl -s https://get.nextflow.io | bash
./nextflow run hello

# Add the nextflow executable's directory to your PATH (e.g., in ~/.bashrc)
export PATH=$PATH:PATH_TO_nextflow

Installation

git clone https://github.com/4ment/gradient-benchmark.git

Usage

nextflow run 4ment/autodiff-experiments -profile docker -with-trace

The -profile docker option runs each task inside a Docker container, and -with-trace writes a trace report summarizing the resources used by each task.

Note: the pipeline will take weeks to run to completion


  1. https://arxiv.org/abs/2211.02168

  2. https://github.com/4ment/gradient-benchmark

  3. https://www.nextflow.io/