Automatic differentiation vs. phylogenetic gradient computation
The paper Automatic differentiation is no panacea for phylogenetic gradient computation [1] compares the performance of automatic differentiation (AD) as provided by machine-learning libraries against six other gradient implementations of the phylogenetic likelihood function. The tools and analyses are available in the GitHub repository [2].
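To make the benchmarked quantity concrete, here is a minimal sketch (my own, not code from the paper or its repository) of a phylogenetic gradient: the derivative of a two-taxon Jukes-Cantor (JC69) log-likelihood with respect to the branch length separating the sequences, computed with JAX. The function name and the site counts are illustrative only.

import jax
import jax.numpy as jnp

def jc69_log_likelihood(t, matches=90.0, mismatches=10.0):
    # Under JC69 a site is identical after branch length t with
    # probability 1/4 + 3/4*exp(-4t/3), and equals a specific other
    # state with probability 1/4 - 1/4*exp(-4t/3).
    p_same = 0.25 + 0.75 * jnp.exp(-4.0 * t / 3.0)
    p_diff = 0.25 - 0.25 * jnp.exp(-4.0 * t / 3.0)
    # The extra factor 1/4 is the stationary probability of the root state.
    return matches * jnp.log(0.25 * p_same) + mismatches * jnp.log(0.25 * p_diff)

grad_fn = jax.grad(jc69_log_likelihood)  # d logL / d branch length
print(jc69_log_likelihood(0.1), grad_fn(0.1))

Real phylogenetic likelihoods involve many taxa and a recursion over the tree (Felsenstein's pruning algorithm), which is where the implementations benchmarked in the paper differ.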
What is automatic differentiation?
Automatic differentiation leverages the chain rule of calculus to compute derivatives of complex functions by decomposing them into elementary operations, yielding results accurate to machine precision. It is widely used in machine learning to compute the gradients needed to train neural networks efficiently. It is commonly contrasted with two alternatives (both compared in the sketch below):
Analytic derivatives: the exact symbolic expression of the derivative, worked out by hand and coded directly.
Numeric derivatives: approximations obtained by evaluating the function at several nearby points and taking finite differences.
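A small, self-contained illustration (my own, not code from the paper) of the three approaches, using JAX and the toy function f(x) = x^2 exp(sin x):

import jax
import jax.numpy as jnp

def f(x):
    return x**2 * jnp.exp(jnp.sin(x))

# Analytic: the derivative worked out by hand and coded directly.
def df_analytic(x):
    return (2.0 * x + x**2 * jnp.cos(x)) * jnp.exp(jnp.sin(x))

# Numeric: central finite difference; accuracy is limited by the step h.
def df_numeric(x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Automatic: JAX applies the chain rule to f's elementary operations.
df_auto = jax.grad(f)

x = 1.5
print(df_analytic(x), df_numeric(x), df_auto(x))

All three values agree closely; the finite difference carries truncation and round-off error, whereas AD matches the analytic derivative to floating-point precision without the derivative having to be worked out by hand.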
Running the pipeline with docker [3]
Setup nextflow
java -version # Make sure Java 11 or later is installed
curl -s https://get.nextflow.io | bash
./nextflow run hello
# Add nextflow to your PATH (e.g. in ~/.bashrc)
export PATH=$PATH:PATH_TO_nextflow
Installation
git clone https://github.com/4ment/gradient-benchmark.git
Usage
nextflow run 4ment/autodiff-experiments -profile docker -with-trace
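The -with-trace option tells Nextflow to write an execution trace file (trace.txt by default) recording each task's status and resource usage, which is handy for monitoring such a long run.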
Note: the full pipeline takes weeks to run to completion.