lava.magma.core.model.py
lava.magma.core.model.py.connection
- class lava.magma.core.model.py.connection.AbstractLearningConnection
Bases:
object
Base class for learning connection ProcessModels.
- dd = None
- dt = None
- dw = None
- s_in_bap = None
- s_in_y1 = None
- s_in_y2 = None
- s_in_y3 = None
- tag_1 = None
- tag_2 = None
- tx = None
- ty = None
- x0 = None
- x1 = None
- x1_impulse = None
- x1_tau = None
- x2 = None
- x2_impulse = None
- x2_tau = None
- y0 = None
- y1 = None
- y1_impulse = None
- y1_tau = None
- y2 = None
- y2_impulse = None
- y2_tau = None
- y3 = None
- y3_impulse = None
- y3_tau = None
- class lava.magma.core.model.py.connection.LearningConnectionModelBitApproximate(proc_params)
Bases:
PyLearningConnection
Fixed-point, bit-approximate implementation of the Connection base class.
This class implements the learning simulation with integer and fixed-point arithmetic but does not implement the exact behavior of Loihi. Nevertheless, the results are comparable to those obtained on Loihi.
To summarize the behavior:
Spiking phase: run_spk:
(1) (Dense) Send activations from past time step to post-synaptic neuron Process.
(2) (Dense) Compute activations to be sent on next time step.
(3) (Dense) Receive spikes from pre-synaptic neuron Process.
(4) (Dense) Record within-epoch pre-synaptic spiking time. Update pre-synaptic traces if more than one spike during the epoch.
(5) Receive spikes from post-synaptic neuron Process.
(6) Record within-epoch post-synaptic spiking time. Update post-synaptic traces if more than one spike during the epoch.
(7) Advance trace random generators.
Learning phase: run_lrn:
(1) Advance synaptic variable random generators.
(2) Compute updates for each active synaptic variable, according to associated learning rule, based on the state of Vars representing dependencies and factors.
(3) Update traces based on within-epoch spiking times and trace configuration parameters (impulse, decay).
(4) Reset within-epoch spiking times and dependency Vars.
Note: The synaptic variable tag_2 currently DOES NOT induce synaptic delay in this Connection Process. It can be adapted according to its learning rule (learned), but it will not affect synaptic activity.
- Parameters:
proc_params (dict) – Parameters from the ProcessModel
- dd: str = LavaPyType(cls=<class 'str'>, d_type=<class 'str'>, precision=None)
- dt: str = LavaPyType(cls=<class 'str'>, d_type=<class 'str'>, precision=None)
- dw: str = LavaPyType(cls=<class 'str'>, d_type=<class 'str'>, precision=None)
- s_in_bap: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'bool'>, precision=1)
- s_in_y1: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- s_in_y2: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- s_in_y3: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- tag_1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- tag_2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=6)
- tx: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=6)
- ty: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=6)
- x0: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'bool'>, precision=None)
- x1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=7)
- x1_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- x1_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- x2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=7)
- x2_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- x2_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- y0: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'bool'>, precision=None)
- y1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=7)
- y1_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- y1_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- y2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=7)
- y2_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- y2_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- y3: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=7)
- y3_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- y3_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=8)
- class lava.magma.core.model.py.connection.LearningConnectionModelFloat(proc_params)
Bases:
PyLearningConnection
Floating-point implementation of the Connection Process.
This ProcessModel constitutes a behavioral implementation of Loihi synapses written in Python, executing on CPU, and operating in floating-point arithmetic.
To summarize the behavior:
Spiking phase: run_spk:
(1) (Dense) Send activations from past time step to post-synaptic neuron Process.
(2) (Dense) Compute activations to be sent on next time step.
(3) (Dense) Receive spikes from pre-synaptic neuron Process.
(4) (Dense) Record within-epoch pre-synaptic spiking time. Update pre-synaptic traces if more than one spike during the epoch.
(5) Receive spikes from post-synaptic neuron Process.
(6) Record within-epoch post-synaptic spiking time. Update post-synaptic traces if more than one spike during the epoch.
Learning phase: run_lrn:
(1) Compute updates for each active synaptic variable, according to associated learning rule, based on the state of Vars representing dependencies and factors.
(2) Update traces based on within-epoch spiking times and trace configuration parameters (impulse, decay).
(3) Reset within-epoch spiking times and dependency Vars.
Note: The synaptic variable tag_2 currently DOES NOT induce synaptic delay in this Connection Process. It can be adapted according to its learning rule (learned), but it will not affect synaptic activity.
- Parameters:
proc_params (dict) – Parameters from the ProcessModel
- dd: str = LavaPyType(cls=<class 'str'>, d_type=<class 'str'>, precision=None)
- dt: str = LavaPyType(cls=<class 'str'>, d_type=<class 'str'>, precision=None)
- dw: str = LavaPyType(cls=<class 'str'>, d_type=<class 'str'>, precision=None)
- s_in_bap: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'bool'>, precision=None)
- s_in_y1: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'float'>, precision=None)
- s_in_y2: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'float'>, precision=None)
- s_in_y3: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'float'>, precision=None)
- tag_1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- tag_2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- tx: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=6)
- ty: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'int'>, precision=6)
- x0: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'bool'>, precision=None)
- x1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- x1_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- x1_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- x2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- x2_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- x2_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y0: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'bool'>, precision=None)
- y1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y1_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y1_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y2_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y2_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y3: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y3_impulse: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y3_tau: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- class lava.magma.core.model.py.connection.PyLearningConnection(proc_params)
Bases:
AbstractLearningConnection
Base class for learning connection ProcessModels in Python / CPU.
This class provides commonly used functions for simulating the Loihi learning engine. It is subclassed for floating-point and fixed-point simulations.
To summarize the behavior:
Spiking phase: run_spk:
(1) (Dense) Send activations from past time step to post-synaptic neuron Process.
(2) (Dense) Compute activations to be sent on next time step.
(3) (Dense) Receive spikes from pre-synaptic neuron Process.
(4) (Dense) Record within-epoch pre-synaptic spiking time. Update pre-synaptic traces if more than one spike during the epoch.
(5) Receive spikes from post-synaptic neuron Process.
(6) Record within-epoch post-synaptic spiking time. Update post-synaptic traces if more than one spike during the epoch.
(7) Advance trace random generators.
Learning phase: run_lrn:
(1) Advance synaptic variable random generators.
(2) Compute updates for each active synaptic variable, according to associated learning rule, based on the state of Vars representing dependencies and factors.
(3) Update traces based on within-epoch spiking times and trace configuration parameters (impulse, decay).
(4) Reset within-epoch spiking times and dependency Vars.
Note: The synaptic variable tag_2 currently DOES NOT induce synaptic delay in this Connection Process. It can be adapted according to its learning rule (learned), but it will not affect synaptic activity.
- Parameters:
proc_params (dict) – Parameters from the ProcessModel
- lrn_guard()
- Return type:
bool
- on_var_update()
Update the learning rule parameters when a single Var is updated.
- recv_traces(s_in)
Function to receive and update y1, y2 and y3 traces from the post-synaptic neuron.
- Parameters:
s_in (np.ndarray) – Synaptic spike input
- Return type:
None
- run_lrn()
- Return type:
None
lava.magma.core.model.py.model
- class lava.magma.core.model.py.model.AbstractPyProcessModel(proc_params, loglevel=30)
Bases:
AbstractProcessModel, ABC
Abstract interface for Python ProcessModels.
- Example for how variables and ports might be initialized:
a_in: PyInPort = LavaPyType(PyInPort.VEC_DENSE, float)
s_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, bool, precision=1)
u: np.ndarray = LavaPyType(np.ndarray, np.int32, precision=24)
v: np.ndarray = LavaPyType(np.ndarray, np.int32, precision=24)
bias: np.ndarray = LavaPyType(np.ndarray, np.int16, precision=12)
du: int = LavaPyType(int, np.uint16, precision=12)
- __setattr__(key, value)
Sets an attribute on the object. This function is used by the builder to add ports to the py_ports and var_ports lists.
- Parameters:
key – The attribute being set.
value – The value of the attribute.
- abstract add_ports_for_polling()
Add the various ports that need to be polled for communication.
- join()
Wait for all the ports to shutdown.
- on_var_update()
This method is called if a Var is updated. It can be used as a callback function to calculate dependent changes.
- run()
Retrieves commands from the runtime service and calls the corresponding methods of the ProcessModel. After each method call, the runtime service is informed about completion. The loop ends when the STOP command is received.
- start()
Starts the process model by spinning up all the ports (mgmt and py_ports) and calling the run function.
- class lava.magma.core.model.py.model.PyAsyncProcessModel(proc_params=None)
Bases:
AbstractPyProcessModel
Process Model for Asynchronous Processes executed on CPU.
This ProcessModel is used in combination with the AsyncProtocol to implement asynchronous execution. This means that processes can run at varying speeds and message passing is possible at any time.
In order to use the PyAsyncProcessModel, the run_async() function must be implemented which defines the behavior of the underlying Process.
Example
>>> @implements(proc=SimpleProcess, protocol=AsyncProtocol)
>>> @requires(CPU)
>>> class SimpleProcessModel(PyAsyncProcessModel):
>>>     u = LavaPyType(int, int)
>>>     v = LavaPyType(int, int)
>>>
>>>     def run_async(self):
>>>         while True:
>>>             self.u = self.u + 10
>>>             self.v = self.v + 1000
>>>             if self.check_for_stop_cmd():
>>>                 return
- class Response
Bases:
object
Different types of response for a RuntimeService Request
- REQ_PAUSE = array([-4.])
Signifies Request of PAUSE
- REQ_STOP = array([-5.])
Signifies Request of STOP
- STATUS_DONE = array([0.])
Signifies Ack or Finished with the Command
- STATUS_ERROR = array([-2.])
Signifies Error raised
- STATUS_PAUSED = array([-3.])
Signifies Execution State to be Paused
- STATUS_TERMINATED = array([-1.])
Signifies Termination
- add_ports_for_polling()
Add the various ports that need to be polled for communication.
- check_for_pause_cmd()
Checks if the RS has sent a PAUSE command.
- Return type:
bool
- check_for_stop_cmd()
Checks if the RS has sent a STOP command.
- Return type:
bool
- run_async()
The user needs to define this function, which will run asynchronously when the RUN command is received.
- lava.magma.core.model.py.model.PyLoihiModelToPyAsyncModel(py_loihi_model)
Factory function that converts a Py-Loihi process model into an equivalent Py-Async definition (see the usage sketch below).
- Parameters:
py_loihi_model (ty.Type[PyLoihiProcessModel]) – Py-Loihi model that describes the functional behavior of a process using Loihi Protocol.
- Returns:
Equivalent Py-Async protocol model. The async process model class name is the original Loihi process model class name with an 'Async' postfix.
- Return type:
ty.Type[PyAsyncProcessModel]
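For illustration, a minimal usage sketch; it assumes that PySendModel, the example Loihi process model shown further below, has already been defined:
>>> from lava.magma.core.model.py.model import PyLoihiModelToPyAsyncModel
>>> # PySendModel is assumed to be an existing PyLoihiProcessModel subclass
>>> PySendModelAsync = PyLoihiModelToPyAsyncModel(PySendModel)
>>> issubclass(PySendModelAsync, PyAsyncProcessModel)
True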
- class lava.magma.core.model.py.model.PyLoihiProcessModel(proc_params=None)
Bases:
AbstractPyProcessModel
ProcessModel to simulate a Process on Loihi using CPU.
The PyLoihiProcessModel implements the same phases of execution as the Loihi 1/2 processor but is executed on CPU. See the LoihiProtocol for a description of the different phases.
Example
>>> @implements(proc=RingBuffer, protocol=LoihiProtocol)
>>> @requires(CPU)
>>> @tag('floating_pt')
>>> class PySendModel(AbstractPyRingBuffer):
>>>     # Ring buffer send process model.
>>>     s_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, float)
>>>     data: np.ndarray = LavaPyType(np.ndarray, float)
>>>
>>>     def run_spk(self) -> None:
>>>         buffer = self.data.shape[-1]
>>>         self.s_out.send(self.data[..., (self.time_step - 1) % buffer])
- class Phase
Bases:
object
Different States of the State Machine of a Loihi Process
- HOST = array([5.])
- LRN = array([3.])
- POST_MGMT = array([4.])
- PRE_MGMT = array([2.])
- SPK = array([1.])
- class Response
Bases:
object
Different types of response for a RuntimeService Request
- REQ_LEARNING = array([-5.])
Signifies Request of LEARNING
- REQ_PAUSE = array([-7.])
Signifies Request of PAUSE
- REQ_POST_LRN_MGMT = array([-6.])
Signifies Request of PREEMPTION after Learning
- REQ_PRE_LRN_MGMT = array([-4.])
Signifies Request of PREEMPTION before Learning
- REQ_STOP = array([-8.])
Signifies Request of STOP
- STATUS_DONE = array([0.])
Signifies Ack or Finished with the Command
- STATUS_ERROR = array([-2.])
Signifies Error raised
- STATUS_PAUSED = array([-3.])
Signifies Execution State to be Paused
- STATUS_TERMINATED = array([-1.])
Signifies Termination
- add_ports_for_polling()
Add the various ports that need to be polled for communication.
- advance_to_time_step(ts)
Required for output ports which should send a signal to advance time
- lrn_guard()
Guard function that determines whether the lrn phase will be executed for the current time step.
- post_guard()
Guard function that determines whether the post lrn mgmt phase will be executed for the current time step (see the sketch after this method list).
- pre_guard()
Guard function that determines whether the pre lrn mgmt phase will be executed for the current time step.
- run_lrn()
Function that runs in Learning Phase
- run_post_mgmt()
Function that runs in Post Lrn Mgmt Phase
- run_pre_mgmt()
Function that runs in Pre Lrn Mgmt Phase
- run_spk()
Function that runs in Spiking Phase
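As a rough sketch only (MyProcess and the state variable v are assumptions, not part of the library), a model can override a guard to gate the corresponding management phase, here running post-management every tenth time step:
>>> @implements(proc=MyProcess, protocol=LoihiProtocol)
>>> @requires(CPU)
>>> @tag('floating_pt')
>>> class MyProcessModel(PyLoihiProcessModel):
>>>     v: np.ndarray = LavaPyType(np.ndarray, float)
>>>
>>>     def run_spk(self) -> None:
>>>         self.v = self.v + 1.0          # spiking-phase dynamics
>>>
>>>     def post_guard(self) -> bool:
>>>         # Execute the post lrn mgmt phase only every 10th time step.
>>>         return self.time_step % 10 == 0
>>>
>>>     def run_post_mgmt(self) -> None:
>>>         self.v[:] = 0.0                # e.g. reset state during post mgmt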
lava.magma.core.model.py.neuron
- class lava.magma.core.model.py.neuron.LearningNeuronModel(proc_params)
Bases:
PyLoihiProcessModel
Base class for learning-enabled neuron models.
Implements ports and vars used by learning-enabled neurons. Must be inherited by floating-point and fixed-point implementations.
- Parameters:
proc_params (dict) – Process parameters from the neuron model.
- a_third_factor_in = None
- s_out_bap = None
- s_out_y1 = None
- s_out_y2 = None
- s_out_y3 = None
- y1 = None
- y2 = None
- y3 = None
- class lava.magma.core.model.py.neuron.LearningNeuronModelFixed(proc_params)
Bases:
LearningNeuronModel
Base class for learning-enabled neuron models.
Implements ports and vars used by learning-enabled neurons for fixed-point implementations.
- Parameters:
proc_params (dict) – Process parameters from the neuron model.
- a_third_factor_in: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- s_out_bap: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'bool'>, precision=1)
- s_out_y1: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- s_out_y2: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- s_out_y3: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'numpy.int32'>, precision=7)
- y1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'numpy.int32'>, precision=7)
- y2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'numpy.int32'>, precision=7)
- y3: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'numpy.int32'>, precision=7)
- class lava.magma.core.model.py.neuron.LearningNeuronModelFloat(proc_params)
Bases:
LearningNeuronModel
Base class for learning-enabled neuron models.
Implements ports and vars used by learning-enabled neurons for floating-point implementations.
- Parameters:
proc_params (dict) – Process parameters from the neuron model.
- a_third_factor_in: PyInPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyInPortVectorDense'>, d_type=<class 'float'>, precision=None)
- s_out_bap: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'bool'>, precision=None)
- s_out_y1: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'float'>, precision=None)
- s_out_y2: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'float'>, precision=None)
- s_out_y3: PyOutPort = LavaPyType(cls=<class 'lava.magma.core.model.py.ports.PyOutPortVectorDense'>, d_type=<class 'float'>, precision=None)
- y1: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y2: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
- y3: ndarray = LavaPyType(cls=<class 'numpy.ndarray'>, d_type=<class 'float'>, precision=None)
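As an illustrative sketch only (MyLearningNeuron, a_in, s_out, v and the dynamics are assumptions, not part of the library), a concrete floating-point neuron model would inherit LearningNeuronModelFloat and update and send the post-synaptic traces it declares:
>>> @implements(proc=MyLearningNeuron, protocol=LoihiProtocol)
>>> @requires(CPU)
>>> @tag('floating_pt')
>>> class MyLearningNeuronModel(LearningNeuronModelFloat):
>>>     a_in: PyInPort = LavaPyType(PyInPort.VEC_DENSE, float)
>>>     s_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, bool)
>>>     v: np.ndarray = LavaPyType(np.ndarray, float)
>>>
>>>     def run_spk(self) -> None:
>>>         a = self.a_in.recv()
>>>         self.v = self.v + a
>>>         s = self.v > 1.0                 # illustrative spike condition
>>>         self.v[s] = 0.0
>>>         self.y1 = self.y1 * 0.9 + s      # decay and bump post-synaptic trace y1
>>>         self.s_out.send(s)
>>>         self.s_out_bap.send(s)           # back-propagating spikes to the synapses
>>>         self.s_out_y1.send(self.y1)      # post-synaptic trace to the learning connection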
lava.magma.core.model.py.ports
- class lava.magma.core.model.py.ports.AbstractPyIOPort(csp_ports, process_model, shape, d_type)
Bases:
AbstractPyPort
Abstract class of an input/output Port implemented in python.
A PyIOPort can either be an input or an output Port and is the common abstraction of PyInPort/PyOutPort. _csp_ports is a list of CSP Ports which are used to send/receive data by connected PyIOPorts.
- Parameters:
csp_ports (list) – A list of CSP Ports used by this IO Port.
process_model (AbstractProcessModel) – The process model used by the process of the Port.
shape (tuple) – The shape of the Port.
d_type (type) – The data type of the Port.
- _csp_ports
A list of CSP Ports used by this IO Port.
- Type:
list
- property csp_ports: List[AbstractCspPort]
Property to get the corresponding CSP Ports of all connected PyPorts (csp_ports). The CSP Port is the low level interface of the backend messaging infrastructure which is used to send and receive data.
- Return type:
A list of all CSP Ports connected to the PyPort.
- class lava.magma.core.model.py.ports.AbstractPyPort(process_model, shape=(), d_type=<class 'int'>)
Bases:
AbstractPortImplementation
Abstract class for Ports implemented in Python.
Ports at the Process level provide an interface to connect Processes with each other. Once two Processes have been connected by Ports, they can exchange data. Lava provides four types of Ports: InPorts, OutPorts, RefPorts and VarPorts. An OutPort of a Process can be connected to one or multiple InPorts of other Processes to transfer data from the OutPort to the InPorts. A RefPort of a Process can be connected to a VarPort of another Process. The difference to In-/OutPorts is that a VarPort is directly linked to a Var, and via a RefPort the Var can be directly modified from a different Process.
To exchange data, PyPorts provide an interface to send and receive messages via channels implemented by a backend messaging infrastructure, which has been inspired by the Communicating Sequential Processes (CSP) paradigm. Thus, a channel denotes a CSP channel of the messaging infrastructure and CSP Ports denote the low level ports also used in the messaging infrastructure. PyPorts are the implementation for message exchange in Python, using the low level CSP Ports of the backend messaging infrastructure. A PyPort may have one or multiple connections to other PyPorts. These connections are represented by csp_ports, which is a list of CSP ports corresponding to the connected PyPorts.
- abstract property csp_ports: List[AbstractCspPort]
Abstract property to get a list of the corresponding CSP Ports of all connected PyPorts. The CSP Port is the low level interface of the backend messaging infrastructure which is used to send and receive data.
- Return type:
A list of all CSP Ports connected to the PyPort.
- class lava.magma.core.model.py.ports.AbstractTransformer
Bases:
ABC
Interface for Transformers that are used in receiving PyPorts to transform data.
- abstract transform(data, csp_port)
Transforms incoming data in a way that is determined by which CSP port the data was received on.
- Parameters:
data (numpy.ndarray) – data that will be transformed
csp_port (AbstractCspPort) – CSP port that the data was received on
- Returns:
transformed_data – the transformed data
- Return type:
numpy.ndarray
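A minimal sketch of a custom transformer implementing this interface (ScaleTransformer and its scaling factor are assumptions, not part of the library):
>>> class ScaleTransformer(AbstractTransformer):
>>>     """Hypothetical transformer that scales received data by a constant factor."""
>>>     def __init__(self, factor):
>>>         self._factor = factor
>>>
>>>     def transform(self, data, csp_port):
>>>         # The same scaling is applied regardless of which CSP port delivered the data.
>>>         return data * self._factor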
- class lava.magma.core.model.py.ports.IdentityTransformer
Bases:
AbstractTransformer
Transformer that does not transform the data but returns it unchanged.
- transform(data, _)
Transforms incoming data in a way that is determined by which CSP port the data was received on.
- Parameters:
data (numpy.ndarray) – data that will be transformed
csp_port (AbstractCspPort) – CSP port that the data was received on
- Returns:
transformed_data – the transformed data
- Return type:
numpy.ndarray
- class lava.magma.core.model.py.ports.PyInPort(csp_ports, process_model, shape, d_type, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
AbstractPyIOPort
Python implementation of InPort used within AbstractPyProcessModel.
PyInPort is an input Port that can be used in a Process to receive data sent from a connected PyOutPort of another Process over a channel. A PyInPort can receive (recv()) the data, which removes it from the channel, look (peek()) at the data, which keeps it on the channel, or check (probe()) whether there is data on the channel. The different class attributes are used to select the type of InPort via LavaPyType declarations in PyProcModels, e.g., LavaPyType(PyInPort.VEC_DENSE, np.int32, precision=24) creates a PyInPort. A PyOutPort (source) can be connected to one or multiple PyInPorts (target).
- Parameters:
csp_ports (ty.List[AbstractCspPort]) – Used to receive data from the referenced PyOutPort.
process_model (AbstractProcessModel) – The process model used by the process of the Port.
shape (tuple, default=tuple()) – The shape of the Port.
d_type (type, default=int) – The data type of the Port.
transformer (AbstractTransformer, default: identity function) – Enables transforming the received data in accordance with the virtual ports on the path to the PyInPort.
- _transformer
Enables transforming the received data in accordance with the virtual ports on the path to the PyInPort.
- Type:
- VEC_DENSE
Type of PyInPort. CSP Port sends data as dense vector.
- Type:
PyInPortVectorDense, default=None
- VEC_SPARSE
Type of PyInPort. CSP Port sends data as sparse vector (data + indices), so only entries which have changed in a vector need to be communicated.
- Type:
PyInPortVectorSparse, default=None
- SCALAR_DENSE
Type of PyInPort. CSP Port sends data element by element for the whole data structure. So the CSP channel needs less memory to transfer data.
- Type:
PyInPortScalarDense, default=None
- SCALAR_SPARSE
Type of PyInPort. CSP Port sends data element by element, but after each element the index of the data entry is also given. So only entries which need to be changed need to be communicated.
- Type:
PyInPortScalarSparse, default=None
- SCALAR_DENSE
alias of PyInPortScalarDense
- SCALAR_SPARSE
alias of PyInPortScalarSparse
- VEC_DENSE
alias of PyInPortVectorDense
- VEC_SPARSE
alias of PyInPortVectorSparse
- abstract peek()
Abstract method to receive data (vectors/scalars) sent from connected OutPorts (source Ports). Keeps the data on the channel.
- Returns:
The scalar or vector received from a connected OutPort. If the InPort is connected to several OutPorts, their input is added in a point-wise fashion.
- probe()
Method to check (probe) if there is data (vectors/scalars) to receive from connected OutPorts (source Ports).
- Returns:
result – Returns True only when there is data to receive from all connected OutPort channels.
- Return type:
bool
- abstract recv()
Abstract method to receive data (vectors/scalars) sent from connected OutPorts (source Ports). Removes the retrieved data from the channel. Expects data on the channel and will block execution if there is no data to retrieve on the channel.
- Returns:
The scalar or vector received from a connected OutPort. If the InPort is connected to several OutPorts, their input is added in a point-wise fashion.
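A brief sketch of how a PyInPort is typically declared and read inside a run_spk method (MyModel and a_in are assumptions, not part of the library):
>>> class MyModel(PyLoihiProcessModel):     # hypothetical process model
>>>     a_in: PyInPort = LavaPyType(PyInPort.VEC_DENSE, np.int32, precision=24)
>>>
>>>     def run_spk(self) -> None:
>>>         a = self.a_in.recv()            # receive data and remove it from the channel
>>>         # self.a_in.peek() would return the data while keeping it on the channel;
>>>         # self.a_in.probe() would return True if data is waiting to be received.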
- class lava.magma.core.model.py.ports.PyInPortScalarDense(csp_ports, process_model, shape, d_type, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyInPort
Python implementation of PyInPort for dense scalar data.
- peek()
TBD
- Return type:
int
- recv()
TBD
- Return type:
int
- class lava.magma.core.model.py.ports.PyInPortScalarSparse(csp_ports, process_model, shape, d_type, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyInPort
Python implementation of PyInPort for sparse scalar data.
- peek()
TBD
- Return type:
Tuple[int, int]
- recv()
TBD
- Return type:
Tuple[int, int]
- class lava.magma.core.model.py.ports.PyInPortVectorDense(csp_ports, process_model, shape, d_type, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyInPort
Python implementation of PyInPort for dense vector data.
- peek()
Method to receive data (vectors) sent from connected OutPorts (source Ports). Keeps the data on the channel.
- Returns:
result – The vector received from a connected OutPort. If the InPort is connected to several OutPorts, their input is added in a point-wise fashion.
- Return type:
ndarray of shape _shape
- recv()
Method to receive data (vectors/scalars) sent from connected OutPorts (source Ports). Removes the retrieved data from the channel. Expects data on the channel and will block execution if there is no data to retrieve on the channel.
- Returns:
result – The vector received from a connected OutPort. If the InPort is connected to several OutPorts, their input is added in a point-wise fashion.
- Return type:
ndarray of shape _shape
- class lava.magma.core.model.py.ports.PyInPortVectorSparse(csp_ports, process_model, shape, d_type, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyInPort
Python implementation of PyInPort for sparse vector data.
- peek()
TBD
- Return type:
Tuple[ndarray, ndarray]
- recv()
TBD
- Return type:
Tuple[ndarray, ndarray]
- class lava.magma.core.model.py.ports.PyOutPort(csp_ports, process_model, shape, d_type)
Bases:
AbstractPyIOPort
Python implementation of OutPort used within AbstractPyProcessModels.
PyOutPort is an output Port sending data to a connected input Port (PyInPort) over a channel. PyOutPort can send (send()) the data by adding it to the channel, or it can clear (flush()) the channel to remove any data from it. The different class attributes are used to select the type of OutPorts via LavaPyType declarations in PyProcModels, e.g., LavaPyType( PyOutPort.VEC_DENSE, np.int32, precision=24) creates a PyOutPort. A PyOutPort (source) can be connected to one or multiple PyInPorts (target).
- Parameters:
csp_ports (list) – A list of CSP Ports used by this IO Port.
process_model (AbstractProcessModel) – The process model used by the process of the Port.
shape (tuple) – The shape of the Port.
d_type (type) – The data type of the Port.
- VEC_DENSE
Type of PyOutPort. CSP Port sends data as dense vector.
- Type:
PyOutPortVectorDense, default=None
- VEC_SPARSE
Type of PyOutPort. CSP Port sends data as sparse vector (data + indices), so only entries which have changed in a vector need to be communicated.
- Type:
PyOutPortVectorSparse, default=None
- SCALAR_DENSE
Type of PyOutPort. CSP Port sends data element by element for the whole data structure. So the CSP channel needs less memory to transfer data.
- Type:
PyOutPortScalarDense, default=None
- SCALAR_SPARSE
Type of PyOutPort. CSP Port sends data element by element, but after each element the index of the data entry is also given. So only entries which need to be changed need to be communicated.
- Type:
PyOutPortScalarSparse, default=None
- SCALAR_DENSE
alias of PyOutPortScalarDense
- SCALAR_SPARSE
alias of PyOutPortScalarSparse
- VEC_DENSE
alias of PyOutPortVectorDense
- VEC_SPARSE
alias of PyOutPortVectorSparse
- advance_to_time_step(ts)
- flush()
TBD
- abstract send(data)
Abstract method to send data to the connected Port PyInPort (target).
- Parameters:
data (ndarray or int) – The data (vector or scalar) to be sent to the InPort (target).
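A corresponding sketch for the sending side (MySenderModel and the spike vector are assumptions; the vector's shape must match the Port's shape):
>>> class MySenderModel(PyLoihiProcessModel):  # hypothetical process model
>>>     s_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, bool, precision=1)
>>>
>>>     def run_spk(self) -> None:
>>>         spikes = np.random.rand(10) > 0.5  # illustrative spike vector of shape (10,)
>>>         self.s_out.send(spikes)            # put the data on the channel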
- class lava.magma.core.model.py.ports.PyOutPortScalarDense(csp_ports, process_model, shape, d_type)
Bases:
PyOutPort
Python implementation of PyOutPort for dense scalar data.
- send(data)
TBD
- class lava.magma.core.model.py.ports.PyOutPortScalarSparse(csp_ports, process_model, shape, d_type)
Bases:
PyOutPort
Python implementation of PyOutPort for sparse scalar data.
- send(data, idx)
TBD
- class lava.magma.core.model.py.ports.PyOutPortVectorDense(csp_ports, process_model, shape, d_type)
Bases:
PyOutPort
Python implementation of PyOutPort for dense vector data.
- send(data)
Method to send data to the connected InPort (target).
Sends data only if the OutPort is connected to at least one InPort.
- Parameters:
data (ndarray) – The data vector to be sent to the InPort (target).
- class lava.magma.core.model.py.ports.PyOutPortVectorSparse(csp_ports, process_model, shape, d_type)
Bases:
PyOutPort
Python implementation of PyOutPort for sparse vector data.
- send(data, indices)
TBD
- class lava.magma.core.model.py.ports.PyRefPort(csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
AbstractPyPort
Python implementation of RefPort used within AbstractPyProcessModels.
A PyRefPort is a Port connected to a VarPort of a variable Var of another Process. It is used to get or set the value of the referenced Var across Processes. A PyRefPort is connected via two CSP channels and corresponding CSP ports to a PyVarPort. One channel is used to send data from the PyRefPort to the PyVarPort and the other channel is used to receive data from the PyVarPort. PyRefPorts can get the value of a referenced Var (read()), set the value of a referenced Var (write()) and block execution until prior ‘write’ commands (sent from the PyRefPort to the PyVarPort) have been acknowledged (wait()).
- Parameters:
csp_send_port (CspSendPort or None) – Used to send data to the referenced Port PyVarPort (target).
csp_recv_port (CspRecvPort or None) – Used to receive data from the referenced Port PyVarPort (source).
process_model (AbstractProcessModel) – The process model used by the process of the Port.
shape (tuple, default=tuple()) – The shape of the Port.
d_type (type, default=int) – The data type of the Port.
transformer (AbstractTransformer, default: identity function) – Enables transforming the received data in accordance with the virtual ports on the path to the PyRefPort.
- _csp_send_port
Used to send data to the referenced Port PyVarPort (target).
- Type:
- _csp_recv_port
Used to receive data from the referenced Port PyVarPort (source).
- Type:
- _transformer
Enables transforming the received data in accordance with the virtual ports on the path to the PyRefPort.
- Type:
- VEC_DENSE
Type of PyRefPort. CSP Port sends data as dense vector.
- Type:
PyRefPortVectorDense, default=None
- VEC_SPARSE
Type of PyRefPort. CSP Port sends data as sparse vector (data + indices), so only entries which have changed in a vector need to be communicated.
- Type:
PyRefPortVectorSparse, default=None
- SCALAR_DENSE
Type of PyRefPort. CSP Port sends data element by element for the whole data structure. So the CSP channel needs less memory to transfer data.
- Type:
PyRefPortScalarDense, default=None
- SCALAR_SPARSE
Type of PyRefPort. CSP Port sends data element by element, but after each element the index of the data entry is also given. So only entries which need to be changed need to be communicated.
- Type:
PyRefPortScalarSparse, default=None
- SCALAR_DENSE
alias of PyRefPortScalarDense
- SCALAR_SPARSE
alias of PyRefPortScalarSparse
- VEC_DENSE
alias of PyRefPortVectorDense
- VEC_SPARSE
alias of PyRefPortVectorSparse
- property csp_ports: List[AbstractCspPort]
Property to get the corresponding CSP Ports of all connected PyPorts (csp_ports). The CSP Port is the low level interface of the backend messaging infrastructure which is used to send and receive data.
- Return type:
A list of all CSP Ports connected to the PyPort.
- abstract read()
Abstract method to request and return data from a VarPort.
- Return type:
The value of the referenced Var.
- wait()
Blocks execution until receipt of prior ‘write’ commands (sent from RefPort to VarPort) have been acknowledged. Calling wait() ensures that the value written by the RefPort can be received (and set) by the VarPort at the same time step. If wait() is not called, it is possible that the value is received only at the next time step (non-deterministic).
>>> port = PyRefPort()
>>> port.write(5)
>>> # potentially do other stuff
>>> port.wait()  # waits until (all) previous writes have finished
Preliminary implementation. Currently, a simple read() ensures the writes have been acknowledged. This is inefficient and will be optimized later at the CspChannel level
- abstract write(data)
Abstract method to write data to a VarPort to set its Var.
- Parameters:
data (ndarray, tuple of ndarray, int, tuple of int) – The new value of the referenced Var.
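A short sketch of reading and writing a remote Var through a RefPort from a management phase (MyMonitorModel and the ref port declaration are assumptions, not part of the library):
>>> class MyMonitorModel(PyLoihiProcessModel):  # hypothetical process model
>>>     ref: PyRefPort = LavaPyType(PyRefPort.VEC_DENSE, float)
>>>
>>>     def run_post_mgmt(self) -> None:
>>>         v = self.ref.read()       # get the current value of the remote Var
>>>         self.ref.write(v * 0.0)   # reset the remote Var
>>>         self.ref.wait()           # block until the write has been acknowledged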
- class lava.magma.core.model.py.ports.PyRefPortScalarDense(csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyRefPort
Python implementation of RefPort for dense scalar data.
- read()
TBD
- Return type:
int
- write(data)
TBD
- class lava.magma.core.model.py.ports.PyRefPortScalarSparse(csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyRefPort
Python implementation of RefPort for sparse scalar data.
- read()
TBD
- Return type:
Tuple[int, int]
- write(data, idx)
TBD
- class lava.magma.core.model.py.ports.PyRefPortVectorDense(csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyRefPort
Python implementation of RefPort for dense vector data.
- read()
Method to request and return data from a referenced Var using a PyVarPort.
- Returns:
result – The value of the referenced Var.
- Return type:
ndarray of shape _shape
- write(data)
Method to write data to a VarPort to set the value of the referenced Var.
- Parameters:
data (ndarray) – The data to send via _csp_send_port.
- class lava.magma.core.model.py.ports.PyRefPortVectorSparse(csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyRefPort
Python implementation of RefPort for sparse vector data.
- read()
TBD
- Return type:
Tuple[ndarray, ndarray]
- write(data, idx)
TBD
- class lava.magma.core.model.py.ports.PyVarPort(var_name, csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
AbstractPyPort
Python implementation of VarPort used within AbstractPyProcessModel.
A PyVarPort is a Port linked to a variable Var of a Process and might be connected to a RefPort of another Process. It is used to get or set the value of the referenced Var across Processes. A PyVarPort is connected to a PyRefPort via two channels: one channel is used to receive data from the PyRefPort, and the other is used to send data back to it. A PyVarPort sets or sends the value of the linked Var (via service()), depending on the command VarPortCmd received from the connected PyRefPort.
- Parameters:
var_name (str) – The name of the Var linked to this VarPort.
csp_send_port (CspSendPort or None) – CSP Port used to send data to the referenced Port (target).
csp_recv_port (CspRecvPort or None) – CSP Port used to receive data from the referenced Port (source).
process_model (AbstractProcessModel) – The process model used by the process of the Port.
shape (tuple, default=tuple()) – The shape of the Port.
d_type (type, default=int) – The data type of the Port.
transformer (AbstractTransformer, default: identity function) – Enables transforming the received data in accordance with the virtual ports on the path to the PyVarPort.
- var_name
The name of the Var linked to this VarPort.
- Type:
str
- _csp_send_port
Used to send data to the referenced PyRefPort (target).
- Type:
CspSendPort or None
- _csp_recv_port
Used to receive data from the referenced PyRefPort (source).
- Type:
CspRecvPort or None
- _transformer
Enables transforming the received data in accordance with the virtual ports on the path to the PyVarPort.
- Type:
AbstractTransformer
- VEC_DENSE
Type of PyVarPort. The CSP Port sends data as a dense vector.
- Type:
PyVarPortVectorDense, default=None
- VEC_SPARSE
Type of PyVarPort. The CSP Port sends data as a sparse vector (data + indices), so only entries of the vector that have changed need to be communicated.
- Type:
PyVarPortVectorSparse, default=None
- SCALAR_DENSE
Type of PyVarPort. The CSP Port sends data element by element for the whole data structure, so the CSP channel needs less memory to transfer the data.
- Type:
PyVarPortScalarDense, default=None
- SCALAR_SPARSE
Type of PyVarPort. The CSP Port sends data element by element, followed by the index of each entry, so only entries that have changed need to be communicated.
- Type:
PyVarPortScalarSparse, default=None
- SCALAR_DENSE
alias of
PyVarPortScalarDense
- SCALAR_SPARSE
alias of
PyVarPortScalarSparse
- VEC_DENSE
alias of
PyVarPortVectorDense
- VEC_SPARSE
alias of
PyVarPortVectorSparse
- property csp_ports: List[AbstractCspPort]
Property to get the CSP Ports of all connected PyPorts (csp_ports). A CSP Port is the low-level interface of the backend messaging infrastructure used to send and receive data.
- Returns:
A list of all CSP Ports connected to the PyPort.
- Return type:
List[AbstractCspPort]
- abstract service()
Abstract method to set the value of the linked Var of the VarPort, received from the connected RefPort, or to send the value of the linked Var of the VarPort to the connected RefPort. The connected RefPort determines whether it will perform a read() or write() operation by sending a command VarPortCmd.
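To make the dispatch on VarPortCmd concrete, the following is a minimal, hedged sketch of what a service() implementation could look like. It is not the actual Lava implementation: the probe()/recv()/send() calls on the CSP ports and the _process_model attribute are assumptions for illustration; only VarPortCmd.GET, VarPortCmd.SET, var_name, _csp_send_port, and _csp_recv_port are taken from the documentation above.
>>> import numpy as np
>>> def service(self):
...     # Check whether the connected PyRefPort has sent a command.
...     if self._csp_recv_port is not None and self._csp_recv_port.probe():
...         cmd = self._csp_recv_port.recv()
...         if np.array_equal(cmd, VarPortCmd.SET):
...             # A SET command is followed by the new value of the Var.
...             data = self._csp_recv_port.recv()
...             setattr(self._process_model, self.var_name, data)
...         elif np.array_equal(cmd, VarPortCmd.GET):
...             # A GET command asks for the current value of the Var,
...             # which is sent back to the PyRefPort.
...             data = getattr(self._process_model, self.var_name)
...             self._csp_send_port.send(data)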
- class lava.magma.core.model.py.ports.PyVarPortScalarDense(var_name, csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyVarPort
Python implementation of VarPort for dense scalar data.
- peek()
TBD
- Return type:
int
- recv()
TBD
- Return type:
int
- service()
TBD
- class lava.magma.core.model.py.ports.PyVarPortScalarSparse(var_name, csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyVarPort
Python implementation of VarPort for sparse scalar data.
- peek()
TBD
- Return type:
Tuple[int, int]
- recv()
TBD
- Return type:
Tuple[int, int]
- service()
TBD
- class lava.magma.core.model.py.ports.PyVarPortVectorDense(var_name, csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyVarPort
Python implementation of VarPort for dense vector data.
- service()
Method to set the value of the linked Var of the VarPort, received from the connected RefPort, or to send the value of the linked Var of the VarPort to the connected RefPort. The connected RefPort determines whether it will perform a read() or write() operation by sending a command VarPortCmd.
- class lava.magma.core.model.py.ports.PyVarPortVectorSparse(var_name, csp_send_port, csp_recv_port, process_model, shape=(), d_type=<class 'int'>, transformer=<lava.magma.core.model.py.ports.IdentityTransformer object>)
Bases:
PyVarPort
Python implementation of VarPort for sparse vector data.
- peek()
TBD
- Return type:
Tuple[ndarray, ndarray]
- recv()
TBD
- Return type:
Tuple[ndarray, ndarray]
- service()
TBD
- class lava.magma.core.model.py.ports.RefVarTypeMapping
Bases:
object
Class to get the mapping of PyRefPort types to PyVarPort types.
PyRefPorts and PyVarPorts can be implemented as different subtypes, which define the format of the data they process. To connect a PyRefPort to a PyVarPort, the two must use a compatible data format. This class maps each PyRefPort type to the PyVarPort type with the compatible data format.
- mapping
Dictionary containing the mapping of compatible PyRefPort types to PyVarPort types.
- Type:
dict
- classmethod get(ref_port)
Class method to return the compatible PyVarPort type given the PyRefPort type.
- mapping: Dict[Type[PyRefPort], Type[PyVarPort]] = {PyRefPortVectorDense: PyVarPortVectorDense, PyRefPortVectorSparse: PyVarPortVectorSparse, PyRefPortScalarDense: PyVarPortScalarDense, PyRefPortScalarSparse: PyVarPortScalarSparse}
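Assuming get() simply looks up the mapping above, its use reduces to a dictionary lookup from a PyRefPort type to the compatible PyVarPort type:
>>> from lava.magma.core.model.py.ports import (
...     RefVarTypeMapping, PyRefPortVectorDense, PyVarPortVectorDense)
>>> RefVarTypeMapping.get(PyRefPortVectorDense) is PyVarPortVectorDense
True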
- class lava.magma.core.model.py.ports.VarPortCmd
Bases:
object
Helper class to specify constants. Used for communication between PyRefPorts and PyVarPorts.
- GET = array([0.])
- SET = array([1.])
- class lava.magma.core.model.py.ports.VirtualPortTransformer(csp_ports, transform_funcs)
Bases:
AbstractTransformer
- transform(data, csp_port)
Transforms incoming data in a way that is determined by the CSP port on which the data was received.
- Parameters:
data (numpy.ndarray) – The data to be transformed.
csp_port (AbstractCspPort) – The CSP port on which the data was received.
- Returns:
transformed_data – The transformed data.
- Return type:
numpy.ndarray
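As a hedged usage sketch (assuming a VirtualPortTransformer instance named transformer has already been constructed for the CSP ports of a PyPort, and that raw is a numpy array received on csp_port; both names are hypothetical):
>>> transformed = transformer.transform(raw, csp_port)
The identity transformer used as the default in the port constructors above corresponds to the case where transform() presumably returns its input unchanged.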