This demo implements a one-dimensional neural integrator.

This example uses a recurrent network to show how neurons can implement stable dynamics. Such dynamics are important for memory, noise cleanup, statistical inference, and many other dynamic transformations.

import matplotlib.pyplot as plt
%matplotlib inline

import nengo
from nengo.processes import Piecewise
import nengo_loihi

Creating the network in Nengo

Our model consists of one recurrently connected ensemble, and an input node. The input node will provide a piecewise step function as input so that we can see the effects of recurrence.

with nengo.Network(label='Integrator') as model:
    ens = nengo.Ensemble(n_neurons=120, dimensions=1)
    stim = nengo.Node(
        Piecewise({
            0: 0,
            0.2: 1,
            1: 0,
            2: -2,
            3: 0,
            4: 1,
            5: 0
        }))

    # Connect the population to itself
    tau = 0.1
    nengo.Connection(ens, ens,
                     transform=[[1]], synapse=tau)

    # Connect the input
    nengo.Connection(
        stim, ens, transform=[[tau]], synapse=tau)

    # Collect data for plotting
    stim_probe = nengo.Probe(stim)
    ens_probe = nengo.Probe(ens, synapse=0.01)
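The connections follow the standard NEF recipe for an integrator: the ensemble feeds back to itself with transform 1, and the input enters with transform tau, both through a synapse with time constant tau, so the population approximates dx/dt = u(t). As a sanity check on what output to expect, here is a NumPy-only sketch (no Nengo; the helper name is our own) that Euler-integrates those ideal dynamics for the same piecewise input:

```python
import numpy as np

dt = 0.001
ts = np.arange(0, 6, dt)

def piecewise_input(t):
    # Same step schedule as the Piecewise stimulus above
    for start, value in [(5, 0), (4, 1), (3, 0), (2, -2), (1, 0), (0.2, 1)]:
        if t >= start:
            return value
    return 0

# Euler integration of the ideal dynamics dx/dt = u(t)
x = np.zeros_like(ts)
for i in range(1, len(ts)):
    x[i] = x[i - 1] + dt * piecewise_input(ts[i - 1])

# By t = 1 the input has been 1 for 0.8 s, so x(1) is about 0.8;
# the net area of all the steps leaves x near -0.2 at t = 6.
```

The neural output should trace roughly this shape, with the usual noise from decoding a spiking population.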

Running the network in Nengo

We can use Nengo to see the desired model output.

with nengo.Simulator(model) as sim:
    sim.run(6)
t = sim.trange()
def plot_decoded(t, data):
    plt.figure()
    plt.plot(t, data[stim_probe], label="Input")
    plt.plot(t, data[ens_probe], 'k', label="Integrator output")
    plt.legend(loc='best')


plot_decoded(t, sim.data)


Running the network with Nengo Loihi

with nengo_loihi.Simulator(model) as sim:
    sim.run(6)
t = sim.trange()
plot_decoded(t, sim.data)
/home/travis/build/nengo/nengo-loihi/nengo_loihi/ UserWarning: Lost 2 extra bits in weight rounding
  warnings.warn("Lost %d extra bits in weight rounding" % (-s2,))

The network integrates its input, but when the input returns to zero its output decays more quickly than in the Nengo model. The workarounds discussed in the communication channel example are likely to be useful here as well.
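The faster decay behaves like a leaky integrator. As an illustration only (this is not a model of nengo_loihi's behavior, and the leak time constant below is an arbitrary assumption), compare the ideal dynamics dx/dt = u with a leaky version dx/dt = u - x / tau_leak for a single input pulse:

```python
import numpy as np

dt = 0.001
tau_leak = 2.0  # assumed leak time constant, for illustration only

ts = np.arange(0, 6, dt)
u = np.where((ts >= 0.2) & (ts < 1.0), 1.0, 0.0)  # one input pulse

ideal = np.zeros_like(ts)
leaky = np.zeros_like(ts)
for i in range(1, len(ts)):
    # Euler steps of dx/dt = u and dx/dt = u - x / tau_leak
    ideal[i] = ideal[i - 1] + dt * u[i - 1]
    leaky[i] = leaky[i - 1] + dt * (u[i - 1] - leaky[i - 1] / tau_leak)

# The ideal trace holds its value after the pulse ends;
# the leaky trace drifts back toward zero.
```

A decaying trace like `leaky` is the qualitative signature to look for when comparing the Loihi output against the Nengo reference above.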