JAX is Autograd and XLA, brought together for high-performance machine learning
research.

With its updated version of Autograd, JAX can automatically differentiate native
Python and NumPy functions. It can differentiate through loops, branches,
recursion, and closures, and it can take derivatives of derivatives of
derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
via grad as well as forward-mode differentiation, and the two can be composed
arbitrarily to any order.
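
A minimal sketch of the idea (illustrative only; the function f below is
invented for this example):

    import jax
    import jax.numpy as jnp

    def f(x):
        return jnp.sin(x) * x ** 2

    df     = jax.grad(f)             # reverse-mode derivative of f
    d2f    = jax.grad(jax.grad(f))   # grads compose to any order
    df_fwd = jax.jacfwd(f)           # forward-mode counterpart
    print(df(2.0), d2f(2.0), df_fwd(2.0))

Because grad returns an ordinary Python function, it can be applied to its own
output, which is what "derivatives of derivatives" means above.
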
What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs
and TPUs. Compilation happens under the hood by default, with library calls
getting just-in-time compiled and executed. But JAX also lets you just-in-time
compile your own Python functions into XLA-optimized kernels using a
one-function API, jit. Compilation and automatic differentiation can be composed
arbitrarily, so you can express sophisticated algorithms and get maximal
performance without leaving Python. You can even program multiple GPUs or TPU
cores at once using pmap, and differentiate through the whole thing.
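
A sketch of the jit side (again illustrative; predict and its shapes are
invented for the example):

    import jax
    import jax.numpy as jnp

    @jax.jit                       # compiled by XLA on first call
    def predict(w, x):
        return jnp.tanh(x @ w)

    w = jnp.ones((3, 3))
    x = jnp.ones((8, 3))
    y = predict(w, x)              # traced, JIT-compiled, then executed

pmap is used the same way, but it also needs more than one device, so it is
not shown here.
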
Dig a little deeper, and you'll see that JAX is really an extensible system for
composable function transformations. Both grad and jit are instances of such
transformations. Others are vmap for automatic vectorization and pmap for
single-program multiple-data (SPMD) parallel programming of multiple
accelerators, with more to come.
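
A sketch of how the transformations compose (vmap, grad, and jit together;
loss and the shapes are invented for the example):

    import jax
    import jax.numpy as jnp

    def loss(w, x):
        return jnp.sum(jnp.tanh(x @ w))

    # Per-example gradients: differentiate, vectorize over the batch, compile.
    per_example_grads = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))

    w  = jnp.ones((3, 2))
    xs = jnp.ones((10, 3))           # a batch of 10 inputs
    g  = per_example_grads(w, xs)    # (10, 3, 2): one gradient per input

Each transformation takes a plain function and returns another plain function,
which is why they nest in any order.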