Documentation

Join the Nengo community and learn the ropes

The Nengo Ecosystem

The Nengo ecosystem is made up of several interacting projects. The following image groups these projects into rough categories.

[Image: Nengo Ecosystem Chart]
New to Nengo?

Download Nengo and start running models today!


Core Framework

The core of the Nengo ecosystem is the Python library nengo, which includes the five Nengo objects (Ensemble, Node, Connection, Probe, Network) and a NumPy-based simulator.

Nengo GUI is a web browser-based interactive model construction and visualization tool.

Nengo DL simulates Nengo models using the TensorFlow library, making it easy to interact with deep learning networks and to use deep learning training procedures to optimize Nengo model parameters.

Add-ons & Models

The Semantic Pointer Architecture (SPA) uses Nengo to build large cognitive models.

Extra utilities and add-ons for Nengo. These utilities are helpful when you need them, but they are not required to use Nengo, so we keep them separate to keep the Nengo core as small as possible.

Additional extensions for large-scale brain modelling with Nengo, including advanced dynamics networks, additional synapse models, and more.

Nengo models that are too big or complex to include in the documentation, but that are still useful examples for people building similar models.

Simulation Backends

NengoFPGA is an extension of Nengo that allows portions of a network to be run on an FPGA to improve performance and efficiency.

Nengo Loihi runs Nengo models on Intel's Loihi neuromorphic hardware. Nengo Loihi also includes a software simulation of Loihi’s spiking neuron cores so that models can be prototyped before running on real hardware.

Nengo OpenCL uses the OpenCL framework to run Nengo models on GPUs and other platforms. Most models run significantly faster with Nengo OpenCL than with the reference simulator.

Nengo SpiNNaker simulates Nengo models using the SpiNNaker architecture and associated hardware. Models running on SpiNNaker always execute in real time.

Nengo MPI simulates Nengo models using a C++ backend that uses MPI to parallelize model execution across large numbers of heterogeneous processing units.