JIT/GPU accelerated deep learning for Elixir with Axon v0.1

I am excited to announce the official v0.1.0 release of Axon and AxonOnnx. A lot has changed (and improved) since the initial public announcement of Axon. In this post I will explore Axon and its internals, and give reasoning for some of the design decisions made along the way. You can view the official documentation here: Axon and AxonOnnx. What is … Continue reading JIT/GPU accelerated deep learning for Elixir with Axon v0.1

Nx Tip of the Week #14 – Slicing and Indexing

Oftentimes you want to slice and index into specific parts of a tensor. Nx offers a few different slicing and indexing routines which allow you to accomplish most of what you would want to do. Slicing can be a bit tricky given Nx's static shape requirements, but you can usually work around the limitations. First, you … Continue reading Nx Tip of the Week #14 – Slicing and Indexing
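As a sketch of the idea above (assuming Nx is installed as a dependency, e.g. `{:nx, "~> 0.1"}`), `Nx.slice/3` takes explicit start indices and lengths, and the Access syntax accepts ranges along each axis:

```elixir
t = Nx.iota({4, 4})

# Take a 2x2 block starting at row 1, column 1.
Nx.slice(t, [1, 1], [2, 2])
#=> rows [[5, 6], [9, 10]]

# Access syntax with a range per axis: rows 0..1, columns 2..3.
t[[0..1, 2..3]]
#=> rows [[2, 3], [6, 7]]
```

Note that both forms require the slice lengths to be known up front, which is what keeps the output shape static.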

Nx Tip of the Week #13 – Hooks

Part of the restrictiveness of defn is the inability to debug in the same way you would debug a normal Elixir function. I'm personally a big fan of plain old IO.inspect debugging. Because of how defn works, it's not possible to inspect intermediate tensor values in the same way you would inspect intermediate values in … Continue reading Nx Tip of the Week #13 – Hooks
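A minimal sketch of the hooks mechanism described above (assuming Nx is installed): `hook/2` inside a `defn` attaches a side-effecting callback that receives the intermediate tensor value when the function runs.

```elixir
defmodule Debug do
  import Nx.Defn

  defn scale(x) do
    x
    |> Nx.multiply(2)
    # The callback fires with the intermediate value; the tensor
    # flows through unchanged.
    |> hook(fn t -> IO.inspect(t, label: "after multiply") end)
    |> Nx.add(1)
  end
end

Debug.scale(Nx.tensor([1, 2, 3]))
```

Because the callback runs as a hook rather than inline Elixir code, it works even when the definition is JIT compiled by a backend such as EXLA.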

Nx Tip of the Week #12 – Nx.to_heatmap

Sometimes you want to quickly visualize the contents of a tensor. For example, when working with the MNIST dataset, you might want to make sure you've sliced it up correctly. A quick way to visualize images across a single color channel is with Nx.to_heatmap: Nx.to_heatmap(img) When inspecting the result of Nx.to_heatmap, you'll get a nice … Continue reading Nx Tip of the Week #12 – Nx.to_heatmap
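A minimal self-contained version of the call above (assuming Nx is installed), using a small gradient image in place of an MNIST digit:

```elixir
# Build an 8x8 "image" with increasing intensity and wrap it in a
# heatmap; inspecting the result renders the values as shaded cells.
img = Nx.iota({8, 8})
Nx.to_heatmap(img)
```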

Nx Tip of the Week #11 – While Loops

Some numeric algorithms require sequential operations. In TOTW #9, we talked about one operation you can use to avoid while-loops in specific situations. Unfortunately, you won't always be able to avoid a while-loop. Nx has a while construct which is implemented as an Elixir macro. The while construct takes an initial state, a condition, and … Continue reading Nx Tip of the Week #11 – While Loops
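The shape of the construct described above can be sketched as follows (assuming Nx is installed): `while` takes an initial state and a condition, and the body returns the next state with the same shapes.

```elixir
defmodule Loop do
  import Nx.Defn

  # Sum the integers 1..n sequentially with a while loop.
  defn sum_to(n) do
    {_i, total} =
      while {i = 1, total = 0}, i <= n do
        # The body must return a new state matching the initial
        # state's structure and shapes.
        {i + 1, total + i}
      end

    total
  end
end
```

The static-shape requirement on the loop state is what lets the whole loop compile to a single accelerated program.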

Nx Tip of the Week #10 – Using Nx.select

Nx's API can seem a little more restrictive due to some of its static shape requirements. For example, boolean indexing is not currently supported because the result's shape cannot be known ahead of time. For those who don't know, boolean indexing selects values of an array based on some boolean mask. For example, let's … Continue reading Nx Tip of the Week #10 – Using Nx.select
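A sketch of the workaround (assuming Nx is installed): instead of filtering with a boolean mask, which would change the shape, keep the shape static and replace the unwanted entries with a fill value.

```elixir
t = Nx.tensor([-2, -1, 0, 1, 2])
mask = Nx.greater(t, 0)

# Where the mask is 1, keep the value from t; elsewhere use 0.
# The scalar 0 broadcasts to t's shape, so the output shape is static.
Nx.select(mask, t, 0)
#=> [0, 0, 0, 1, 2]
```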

Nx Tip of the Week #8 – Using Nx.Defn.aot/3

Last week, we discussed the usage of Nx.Defn.jit/3 to JIT compile and run numerical definitions. Nx also supports ahead-of-time compilation using Nx.Defn.aot/3. In this post, we'll briefly look at how to use ahead-of-time compilation, and why you'd want to do it in the first place. Ahead-of-time compilation allows you to compile your numerical definitions into … Continue reading Nx Tip of the Week #8 – Using Nx.Defn.aot/3

Nx Tip of the Week #7 – Using Nx.Defn.jit

There are actually two ways in Nx to accelerate your numerical definitions: invoking calls to defn with a @defn_compiler attribute set, or calling Nx.Defn.jit/3. Let's take a look at these two methods in practice:

defmodule JIT do
  import Nx.Defn

  @default_defn_compiler EXLA

  defn softmax(x) do
    max_val = Nx.reduce_max(x)
    Nx.exp(x - max_val) / Nx.sum(Nx.exp(x - max_val))
  end
end

… Continue reading Nx Tip of the Week #7 – Using Nx.Defn.jit
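A self-contained sketch of the second method (assuming Nx and EXLA are installed); the module mirrors the softmax snippet above:

```elixir
defmodule JITExample do
  import Nx.Defn

  # Numerically stable softmax: subtract the max before exponentiating.
  defn softmax(x) do
    max_val = Nx.reduce_max(x)
    Nx.exp(x - max_val) / Nx.sum(Nx.exp(x - max_val))
  end
end

x = Nx.tensor([1.0, 2.0, 3.0])

# JIT compile the function with EXLA and run it on the given args.
Nx.Defn.jit(&JITExample.softmax/1, [x], compiler: EXLA)
```

Passing `compiler: EXLA` here accelerates a single call without setting a module-wide @defn_compiler attribute.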

Axon: Deep Learning in Elixir

ax·on /ˈakˌsän/ noun: the long threadlike part of a nerve cell along which impulses are conducted from the cell body to other cells.

Today I am excited to publicly announce Axon, a library for creating neural networks in Elixir. Axon is still pre-release; however, I believe it's reached a point where it's ready for experimentation and input from … Continue reading Axon: Deep Learning in Elixir