<!DOCTYPE html>
<html class="writer-html5" lang="en" >
<head>
<meta charset="utf-8" /><meta name="generator" content="Docutils 0.18.1: http://docutils.sourceforge.net/" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Lava Architecture — Lava documentation</title>
<link rel="stylesheet" href="_static/pygments.css" type="text/css" />
<link rel="stylesheet" href="_static/css/theme.css" type="text/css" />
<link rel="stylesheet" href="_static/graphviz.css" type="text/css" />
<!--[if lt IE 9]>
<script src="_static/js/html5shiv.min.js"></script>
<![endif]-->
<script src="_static/jquery.js"></script>
<script src="_static/_sphinx_javascript_frameworks_compat.js"></script>
<script data-url_root="./" id="documentation_options" src="_static/documentation_options.js"></script>
<script src="_static/doctools.js"></script>
<script src="_static/sphinx_highlight.js"></script>
<script crossorigin="anonymous" integrity="sha256-Ae2Vz/4ePdIu6ZyI/5ZGsYnb+m0JlOmKPjt6XZ9JJkA=" src="https://cdnjs.cloudflare.com/ajax/libs/require.js/2.3.4/require.min.js"></script>
<script src="_static/js/theme.js"></script>
<link rel="index" title="Index" href="genindex.html" />
<link rel="search" title="Search" href="search.html" />
<link rel="next" title="Getting Started with Lava" href="getting_started_with_lava.html" />
<link rel="prev" title="Lava Software Framework" href="index.html" />
</head>
<body class="wy-body-for-nav">
<div class="wy-grid-for-nav">
<nav data-toggle="wy-nav-shift" class="wy-nav-side">
<div class="wy-side-scroll">
<div class="wy-side-nav-search" >
<a href="index.html" class="icon icon-home">
Lava
</a>
<div role="search">
<form id="rtd-search-form" class="wy-form" action="search.html" method="get">
<input type="text" name="q" placeholder="Search docs" aria-label="Search docs" />
<input type="hidden" name="check_keywords" value="yes" />
<input type="hidden" name="area" value="default" />
</form>
</div>
</div><div class="wy-menu wy-menu-vertical" data-spy="affix" role="navigation" aria-label="Navigation menu">
<ul class="current">
<li class="toctree-l1 current"><a class="current reference internal" href="#">Lava Architecture</a><ul>
<li class="toctree-l2"><a class="reference internal" href="#key-attributes">Key attributes</a></li>
<li class="toctree-l2"><a class="reference internal" href="#why-do-we-need-lava">Why do we need Lava?</a></li>
<li class="toctree-l2"><a class="reference internal" href="#lava-s-foundational-concepts">Lava’s foundational concepts</a><ul>
<li class="toctree-l3"><a class="reference internal" href="#processes">1. Processes</a></li>
<li class="toctree-l3"><a class="reference internal" href="#behavioral-implementations-via-processmodels">2. Behavioral implementations via ProcessModels</a></li>
<li class="toctree-l3"><a class="reference internal" href="#composability-and-connectivity">3. Composability and connectivity</a></li>
<li class="toctree-l3"><a class="reference internal" href="#cross-platform-execution">4. Cross-platform execution</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="#lava-software-stack">Lava software stack</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="getting_started_with_lava.html">Getting Started with Lava</a><ul>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html">Installing Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#1.-System-Requirements">1. System Requirements</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#2.-Getting-Started">2. Getting Started</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#2.1-Cloning-Lava-and-Running-from-Source">2.1 Cloning Lava and Running from Source</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#2.2-[Alternative]-Installing-Lava-from-Binaries">2.2 [Alternative] Installing Lava from Binaries</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#3.-Running-Lava-on-Intel-Loihi">3. Running Lava on Intel Loihi</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#4.-Lava-Developer-Guide">4. Lava Developer Guide</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#5.-Tutorials">5. Tutorials</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial01_installing_lava.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html">Walk through Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#1.-Usage-of-the-Process-Library">1. Usage of the Process Library</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Processes">Processes</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Ports-and-connections">Ports and connections</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Variables">Variables</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Record-internal-Vars-over-time">Record internal Vars over time</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Execution">Execution</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Retrieve-recorded-data">Retrieve recorded data</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Summary">Summary</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Learn-more-about">Learn more about</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#2.-Create-a-custom-Process">2. Create a custom Process</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Create-a-new-ProcessModel">Create a new ProcessModel</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Use-the-custom-SpikeGenerator">Use the custom SpikeGenerator</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#Execute-and-plot">Execute and plot</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#id1">Summary</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#id2">Learn more about</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial00_tour_through_lava.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html">Processes</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Recommended-tutorials-before-starting:">Recommended tutorials before starting:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#What-is-a-Process?">What is a <em>Process</em>?</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#How-to-build-a-Process?">How to build a <em>Process</em>?</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Overall-architecture">Overall architecture</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#AbstractProcess:-Defining-Vars,-Ports,-and-the-API"><em>AbstractProcess</em>: Defining <em>Vars</em>, <em>Ports</em>, and the API</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#ProcessModel:-Defining-the-behavior-of-a-Process"><em>ProcessModel</em>: Defining the behavior of a <em>Process</em></a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Instantiating-the-Process">Instantiating the <em>Process</em></a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Interacting-with-Processes">Interacting with <em>Processes</em></a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Accessing-Vars">Accessing <em>Vars</em></a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Using-custom-APIs">Using custom APIs</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Executing-a-Process">Executing a <em>Process</em></a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#Update-Vars">Update <em>Vars</em></a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial02_processes.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html"><em>ProcessModels</em></a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Recommended-tutorials-before-starting:">Recommended tutorials before starting:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Create-a-LIF-Process">Create a LIF <em>Process</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Create-a-Python-LeafProcessModel-that-implements-the-LIF-Process">Create a Python <em>LeafProcessModel</em> that implements the LIF <em>Process</em></a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Setup">Setup</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Defining-a-PyLifModel-for-LIF">Defining a <em>PyLifModel</em> for LIF</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Compile-and-run-PyLifModel">Compile and run <em>PyLifModel</em></a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#Selecting-1-ProcessModel:-More-on-LeafProcessModel-attributes-and-relations">Selecting 1 <em>ProcessModel</em>: More on <em>LeafProcessModel</em> attributes and relations</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial03_process_models.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html">Execution</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Recommended-tutorials-before-starting:">Recommended tutorials before starting:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Configuring-and-starting-execution">Configuring and starting execution</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Run-conditions">Run conditions</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Run-configurations">Run configurations</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Running-multiple-Processes">Running multiple <em>Processes</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Pausing,-resuming,-and-stopping-execution">Pausing, resuming, and stopping execution</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#Manual-compilation-and-execution">Manual compilation and execution</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial04_execution.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html">Connect processes</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#Recommended-tutorials-before-starting:">Recommended tutorials before starting:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#Building-a-network-of-Processes">Building a network of <em>Processes</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#Create-a-connection">Create a connection</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#Possible-connections">Possible connections</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#There-are-some-things-to-consider-though:">There are some things to consider though:</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#Connect-multiple-InPorts-from-a-single-OutPort">Connect multiple <em>InPorts</em> from a single <em>OutPort</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#Connecting-multiple-InPorts-to-a-single-OutPort">Connecting multiple <em>InPorts</em> to a single <em>OutPort</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial05_connect_processes.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html">Hierarchical <em>Processes</em> and <em>SubProcessModels</em></a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Recommended-tutorials-before-starting:">Recommended tutorials before starting:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-LIF-and-Dense-Processes-and-ProcessModels">Create LIF and Dense <em>Processes</em> and <em>ProcessModels</em></a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-a-Dense-connection-Process">Create a Dense connection <em>Process</em></a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-a-Python-Dense-connection-ProcessModel-implementing-the-Loihi-Sync-Protocol-and-requiring-a-CPU-compute-resource">Create a Python Dense connection <em>ProcessModel</em> implementing the Loihi Sync Protocol and requiring a CPU compute resource</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-a-LIF-neuron-Process">Create a LIF neuron <em>Process</em></a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-a-Python-LIF-neuron-ProcessModel-implementing-the-Loihi-Sync-Protocol-and-requiring-a-CPU-compute-resource">Create a Python LIF neuron <em>ProcessModel</em> implementing the Loihi Sync Protocol and requiring a CPU compute resource</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-a-DenseLayer-Hierarchical-Process-that-encompasses-Dense-and-LIF-Process-behavior">Create a DenseLayer Hierarchical <em>Process</em> that encompasses Dense and LIF <em>Process</em> behavior</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Create-a-SubProcessModel-that-implements-the-DenseLayer-Process-using-Dense-and-LIF-child-Processes">Create a <em>SubProcessModel</em> that implements the DenseLayer <em>Process</em> using Dense and LIF child <em>Processes</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Run-the-DenseLayer-Process">Run the DenseLayer <em>Process</em></a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#Run-Connected-DenseLayer-Processes">Run Connected DenseLayer <em>Processes</em></a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial06_hierarchical_processes.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html">Remote Memory Access</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#Recommended-tutorials-before-starting:">Recommended tutorials before starting:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#Create-a-minimal-Process-and-ProcessModel-with-a-RefPort">Create a minimal <em>Process</em> and <em>ProcessModel</em> with a <em>RefPort</em></a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#Create-a-Python-Process-Model-implementing-the-Loihi-Sync-Protocol-and-requiring-a-CPU-compute-resource">Create a Python Process Model implementing the Loihi Sync Protocol and requiring a CPU compute resource</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#Run-the-Processes">Run the <em>Processes</em></a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#Implicit-and-explicit-VarPorts">Implicit and explicit VarPorts</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#Options-to-connect-RefPorts-and-VarPorts">Options to connect RefPorts and VarPorts</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial07_remote_memory_access.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html">MNIST Digit Classification with Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#This-tutorial-assumes-that-you:">This tutorial assumes that you:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#This-tutorial-gives-a-bird’s-eye-view-of">This tutorial gives a bird’s-eye view of</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#Our-MNIST-Classifier">Our MNIST Classifier</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#General-Imports">General Imports</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#Lava-Processes">Lava Processes</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#ProcessModels-for-Python-execution">ProcessModels for Python execution</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#Connecting-Processes">Connecting Processes</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#Execution-and-results">Execution and results</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial01_mnist_digit_classification.html#Follow-the-links-below-for-deep-dive-tutorials-on-the-concepts-in-this-tutorial:">Follow the links below for deep-dive tutorials on the concepts in this tutorial:</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html">Excitatory-Inhibitory Neural Network with Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#This-tutorial-assumes-that-you:">This tutorial assumes that you:</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#This-tutorial-gives-a-high-level-view-of">This tutorial gives a high level view of</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#E/I-Network">E/I Network</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#General-imports">General imports</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#E/I-Network-Lava-Process">E/I Network Lava Process</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#ProcessModels-for-Python-execution">ProcessModels for Python execution</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Rate-neurons">Rate neurons</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Defining-the-parameters-for-the-network">Defining the parameters for the network</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Execution-and-Results">Execution and Results</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Visualizing-the-activity">Visualizing the activity</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Further-analysis">Further analysis</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Controlling-the-network">Controlling the network</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#LIF-Neurons">LIF Neurons</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#id7">Execution and Results</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#id8">Visualizing the activity</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#id9">Controlling the network</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#DIfferent-recurrent-activation-regimes">DIfferent recurrent activation regimes</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Running-a-ProcessModel-bit-accurate-with-Loihi">Running a ProcessModel bit-accurate with Loihi</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Execution-of-bit-accurate-model">Execution of bit accurate model</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/end_to_end/tutorial02_excitatory_inhibitory_network.html#Follow-the-links-below-for-deep-dive-tutorials-on-the-concepts-in-this-tutorial:">Follow the links below for deep-dive tutorials on the concepts in this tutorial:</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html">Spike-timing Dependent Plasticity (STDP)</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html#This-tutorial-assumes-that-you:">This tutorial assumes that you:</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html#STDP-from-Lavas-Process-Library">STDP from Lavas Process Library</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html#The-plastic-connection-Process">The plastic connection Process</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html#Plot-spike-trains">Plot spike trains</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html#Plot-traces">Plot traces</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial08_stdp.html#Plot-STDP-learning-window-and-weight-changes">Plot STDP learning window and weight changes</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html">Custom Learning Rules</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#This-tutorial-assumes-that-you:">This tutorial assumes that you:</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#2.-Loihi’s-learning-engine">2. Loihi’s learning engine</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Epoch-based-updates">Epoch-based updates</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Synaptic-variables">Synaptic variables</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Learning-rules">Learning rules</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Dependencies">Dependencies</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Scaling-factors">Scaling factors</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Factors">Factors</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Traces">Traces</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Example:-Basic-pair-based-STDP">Example: Basic pair-based STDP</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Instantiating-LearningRule">Instantiating LearningRule</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#The-plastic-connection-Process">The plastic connection Process</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Plot-spike-trains">Plot spike trains</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Plot-traces">Plot traces</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Plot-STDP-learning-window-and-weight-changes">Plot STDP learning window and weight changes</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/tutorial09_custom_learning_rules.html#Follow-the-links-below-for-deep-dive-tutorials-on-the-concepts-in-this-tutorial:">Follow the links below for deep-dive tutorials on the concepts in this tutorial:</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html">Three Factor Learning with Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#This-tutorial-assumes-that-you:">This tutorial assumes that you:</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Defining-three-factor-learning-rule-interfaces-in-Lava">Defining three-factor learning rule interfaces in Lava</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Reward-modulated-Spike-Timing-Dependent-Plasticity-(R-STDP)-learning-rule">Reward-modulated Spike-Timing Dependent Plasticity (R-STDP) learning rule</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Defining-a-simple-learning-network-with-localized-reward-signals">Defining a simple learning network with localized reward signals</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Initialize-network-parameters-and-weights">Initialize network parameters and weights</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Generate-binary-input-and-graded-reward-spikes">Generate binary input and graded reward spikes</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Initialize-Network-Processes">Initialize Network Processes</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Connect-Network-Processes">Connect Network Processes</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Create-monitors-to-observe-the-weight-and-trace-dynamics-during-learning">Create monitors to observe the weight and trace dynamics during learning</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Run-the-network">Run the network</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Visualize-the-learning-results">Visualize the learning results</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Plot-eligibility-trace-dynamics">Plot eligibility trace dynamics</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Plot-reward-trace-dynamics">Plot reward trace dynamics</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Advanced-Topic:-Implementing-custom-learning-rule-interfaces">Advanced Topic: Implementing custom learning rule interfaces</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#How-to-learn-more?">How to learn more?</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/notebooks/in_depth/three_factor_learning/tutorial01_Reward_Modulated_STDP.html#Follow-the-links-below-for-deep-dive-tutorials-on-the-concepts-in-this-tutorial:">Follow the links below for deep-dive tutorials on the concepts in this tutorial:</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="algorithms.html">Algorithms and Application Libraries</a><ul>
<li class="toctree-l2"><a class="reference internal" href="dl.html">Deep Learning</a><ul>
<li class="toctree-l3"><a class="reference internal" href="dl.html#introduction">Introduction</a></li>
<li class="toctree-l3"><a class="reference internal" href="dl.html#lava-dl-workflow">Lava-DL Workflow</a></li>
<li class="toctree-l3"><a class="reference internal" href="dl.html#getting-started">Getting Started</a></li>
<li class="toctree-l3"><a class="reference internal" href="dl.html#slayer-2-0">SLAYER 2.0</a><ul>
<li class="toctree-l4"><a class="reference internal" href="dl.html#example-code">Example Code</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="dl.html#bootstrap">Bootstrap</a><ul>
<li class="toctree-l4"><a class="reference internal" href="dl.html#example-code-1">Example Code</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="dl.html#network-exchange-netx-library">Network Exchange (NetX) Library</a><ul>
<li class="toctree-l4"><a class="reference internal" href="dl.html#example-code-2">Example Code</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="dl.html#detailed-description">Detailed Description</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/slayer.html">Lava-DL SLAYER</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/bootstrap/bootstrap.html">Lava-DL Bootstrap</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/netx/netx.html">Lava-DL NetX</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="dnf.html">Dynamic Neural Fields</a><ul>
<li class="toctree-l3"><a class="reference internal" href="dnf.html#introduction">Introduction</a></li>
<li class="toctree-l3"><a class="reference internal" href="dnf.html#what-is-lava-dnf">What is lava-dnf?</a></li>
<li class="toctree-l3"><a class="reference internal" href="dnf.html#key-features">Key features</a></li>
<li class="toctree-l3"><a class="reference internal" href="dnf.html#example">Example</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="optimization.html">Neuromorphic Constrained Optimization Library</a><ul>
<li class="toctree-l3"><a class="reference internal" href="optimization.html#about-the-project">About the Project</a><ul>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#taxonomy-of-optimization-problems">Taxonomy of Optimization Problems</a></li>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#optimizationsolver-and-optimizationproblem-classes">OptimizationSolver and OptimizationProblem Classes</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="optimization.html#tutorials">Tutorials</a><ul>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#quadratic-programming">Quadratic Programming</a></li>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#quadratic-uncosntrained-binary-optimization">Quadratic Uncosntrained Binary Optimization</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="optimization.html#examples">Examples</a><ul>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#solving-qp-problems">Solving QP problems</a></li>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#solving-qubo">Solving QUBO</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="optimization.html#getting-started">Getting Started</a><ul>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#requirements">Requirements</a></li>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#installation">Installation</a></li>
<li class="toctree-l4"><a class="reference internal" href="optimization.html#alternative-installing-lava-via-conda">[Alternative] Installing Lava via Conda</a></li>
</ul>
</li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="developer_guide.html">Developer Guide</a><ul>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#lava-s-origins">Lava’s Origins</a></li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#contact-information">Contact Information</a></li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#development-roadmap">Development Roadmap</a><ul>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#initial-release">Initial Release</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#how-to-contribute-to-lava">How to contribute to Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#open-an-issue">Open an Issue</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#pull-request-checklist">Pull Request Checklist</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#open-a-pull-request">Open a Pull Request</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#coding-conventions">Coding Conventions</a><ul>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#code-requirements">Code Requirements</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#guidelines">Guidelines</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#docstring-format">Docstring Format</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#contributors">Contributors</a><ul>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#contributor">Contributor</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#committer">Committer</a><ul>
<li class="toctree-l4"><a class="reference internal" href="developer_guide.html#list-of-lava-nc-lava-project-committers">List of lava-nc/lava Project Committers</a></li>
<li class="toctree-l4"><a class="reference internal" href="developer_guide.html#list-of-lava-nc-lava-dnf-project-committers">List of lava-nc/lava-dnf Project Committers</a></li>
<li class="toctree-l4"><a class="reference internal" href="developer_guide.html#list-of-lava-nc-lava-optimization-project-committers">List of lava-nc/lava-optimization Project Committers</a></li>
<li class="toctree-l4"><a class="reference internal" href="developer_guide.html#list-of-lava-nc-lava-dl-project-committers">List of lava-nc/lava-dl Project Committers</a></li>
<li class="toctree-l4"><a class="reference internal" href="developer_guide.html#committer-promotion">Committer Promotion</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#repository-structure">Repository Structure</a><ul>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#id17">lava-nc/lava</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#lava-nc-lava-dnf">lava-nc/lava-dnf</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#lava-nc-lava-dl">lava-nc/lava-dl</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#lava-nc-lava-optimization">lava-nc/lava-optimization</a></li>
<li class="toctree-l3"><a class="reference internal" href="developer_guide.html#lava-nc-lava-docs">lava-nc/lava-docs</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#code-of-conduct">Code of Conduct</a></li>
<li class="toctree-l2"><a class="reference internal" href="developer_guide.html#licenses">Licenses</a></li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="lava_api_documentation.html">Lava API Documentation</a><ul>
<li class="toctree-l2"><a class="reference internal" href="lava/lava.html">Lava</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava/lava.magma.html">Magma</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.magma.compiler.html">lava.magma.compiler</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.magma.core.html">lava.magma.core</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.magma.runtime.html">lava.magma.runtime</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/lava.proc.html">Lava process library</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.conv.html">lava.proc.conv</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.dense.html">lava.proc.dense</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.io.html">lava.proc.io</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.learning_rules.html">lava.proc.learning_rules</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.lif.html">lava.proc.lif</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.monitor.html">lava.proc.monitor</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.receiver.html">lava.proc.receiver</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.sdn.html">lava.proc.sdn</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.proc.spiker.html">lava.proc.spiker</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava/lava.utils.html">Lava Utils</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.dataloader.html">lava.utils.dataloader</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.html#lava-utils-float2fixed">lava.utils.float2fixed</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.html#lava-utils-profiler">lava.utils.profiler</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.html#lava-utils-system">lava.utils.system</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.html#lava-utils-validator">lava.utils.validator</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.html#lava-utils-visualizer">lava.utils.visualizer</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava/lava.utils.html#lava-utils-weightutils">lava.utils.weightutils</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava-lib-dl/index.html">Lava - Deep Learning</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dl/slayer/index.html">SLAYER</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/neuron/modules.html">Neuron</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/synapse/modules.html">Synapse</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/spike/modules.html">Spike</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/axon/modules.html">Axon</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/dendrite/modules.html">Dendrite</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/block/modules.html">Blocks</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/loss.html">Loss</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/classifier.html">Classifier</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/io.html">Input/Output</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/auto.html">Auto</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/slayer/utils/modules.html">Utilities</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dl/slayer/index.html#indices-and-tables">Indices and tables</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dl/bootstrap/index.html">Bootstrap (ANN-SNN training)</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/bootstrap/block/modules.html">Blocks</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/bootstrap/ann_sampler.html">ANN Statistics Sampler</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/bootstrap/routine.html">Routine</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dl/bootstrap/index.html#indices-and-tables">Indices and tables</a></li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dl/netx/index.html">Lava-DL NetX</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/netx/blocks/modules.html">Blocks</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/netx/hdf5.html">HDF5</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dl/netx/utils.html">Utils</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dl/netx/index.html#indices-and-tables">Indices and tables</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.html">Lava - Dynamic Neural Fields</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.connect.html">lava.lib.dnf.connect</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.connect.html#lava-lib-dnf-connect-connect">lava.lib.dnf.connect.connect</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.connect.html#lava-lib-dnf-connect-exceptions">lava.lib.dnf.connect.exceptions</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.kernels.html">lava.lib.dnf.kernels</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.kernels.html#lava-lib-dnf-kernels-kernels">lava.lib.dnf.kernels.kernels</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.operations.html">lava.lib.dnf.operations</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.operations.html#lava-lib-dnf-operations-enums">lava.lib.dnf.operations.enums</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.operations.html#lava-lib-dnf-operations-exceptions">lava.lib.dnf.operations.exceptions</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.operations.html#lava-lib-dnf-operations-operations">lava.lib.dnf.operations.operations</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.operations.html#lava-lib-dnf-operations-shape-handlers">lava.lib.dnf.operations.shape_handlers</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.inputs.html">lava.lib.dnf.inputs</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.inputs.gauss_pattern.html">lava.lib.dnf.inputs.gauss_pattern</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.inputs.rate_code_spike_gen.html">lava.lib.dnf.inputs.rate_code_spike_gen</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.utils.html">lava.lib.dnf.utils</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.utils.html#lava-lib-dnf-utils-convenience">lava.lib.dnf.utils.convenience</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.utils.html#lava-lib-dnf-utils-math">lava.lib.dnf.utils.math</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.utils.html#lava-lib-dnf-utils-plotting">lava.lib.dnf.utils.plotting</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-dnf/lava.lib.dnf.utils.html#lava-lib-dnf-utils-validation">lava.lib.dnf.utils.validation</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.html">Lava - Optimization</a><ul>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.html">lava.lib.optimization.problems</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.bayesian.html">lava.lib.optimization.problems.bayesian</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.html#lava-lib-optimization-problems-coefficients">lava.lib.optimization.problems.coefficients</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.html#lava-lib-optimization-problems-constraints">lava.lib.optimization.problems.constraints</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.html#lava-lib-optimization-problems-cost">lava.lib.optimization.problems.cost</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.html#lava-lib-optimization-problems-problems">lava.lib.optimization.problems.problems</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.problems.html#lava-lib-optimization-problems-variables">lava.lib.optimization.problems.variables</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.solvers.html">lava.lib.optimization.solvers</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.solvers.bayesian.html">lava.lib.optimization.solvers.bayesian</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.solvers.generic.html">lava.lib.optimization.solvers.generic</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.solvers.qp.html">lava.lib.optimization.solvers.qp</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.utils.html">lava.lib.optimization.utils</a><ul>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.utils.generators.html">lava.lib.optimization.utils.generators</a></li>
<li class="toctree-l4"><a class="reference internal" href="lava-lib-optimization/lava.lib.optimization.utils.html#lava-lib-optimization-utils-solver-tuner">lava.lib.optimization.utils.solver_tuner</a></li>
</ul>
</li>
</ul>
</li>
</ul>
</li>
</ul>
</div>
</div>
</nav>
<section data-toggle="wy-nav-shift" class="wy-nav-content-wrap"><nav class="wy-nav-top" aria-label="Mobile navigation menu" >
<i data-toggle="wy-nav-top" class="fa fa-bars"></i>
<a href="index.html">Lava</a>
</nav>
<div class="wy-nav-content">
<div class="rst-content">
<div role="navigation" aria-label="Page navigation">
<ul class="wy-breadcrumbs">
<li><a href="index.html" class="icon icon-home" aria-label="Home"></a></li>
<li class="breadcrumb-item active">Lava Architecture</li>
<li class="wy-breadcrumbs-aside">
<a href="_sources/lava_architecture_overview.rst.txt" rel="nofollow"> View page source</a>
</li>
</ul>
<hr/>
</div>
<div role="main" class="document" itemscope="itemscope" itemtype="http://schema.org/Article">
<div itemprop="articleBody">
<style>
/* CSS overrides for sphinx_rtd_theme */
/* 24px margin */
.nbinput.nblast.container,
.nboutput.nblast.container {
margin-bottom: 19px; /* padding has already 5px */
}
/* ... except between code cells! */
.nblast.container + .nbinput.container {
margin-top: -19px;
}
.admonition > p:before {
margin-right: 4px; /* make room for the exclamation icon */
}
/* Fix math alignment, see https://github.com/rtfd/sphinx_rtd_theme/pull/686 */
.math {
text-align: unset;
}
</style>
<section id="lava-architecture">
<span id="id1"></span><h1>Lava Architecture<a class="headerlink" href="#lava-architecture" title="Permalink to this heading"></a></h1>
<p>Lava is an emerging open-source software framework for neuromorphic computing. Many parts of Lava are still under development. Therefore, the following sections describe Lava’s key attributes and fundamental architectural concepts, which are partly implemented and partly envisioned today. A deeper dive into these architectural concepts can be found in the ‘Getting started with Lava’ section. We hope to inspire and invite developers to contribute to this open-source effort to solve some of the key challenges of neuromorphic hardware, software and application development.</p>
<section id="key-attributes">
<h2>Key attributes<a class="headerlink" href="#key-attributes" title="Permalink to this heading"></a></h2>
<ul class="simple">
<li><p><strong>Asynchronous parallelism:</strong> Lava is a software framework for asynchronous, event-based processing on distributed, neuromorphic, heterogeneous, and massively parallel hardware platforms, supporting systems that span from the edge to the cloud.</p></li>
<li><p><strong>Refinement:</strong> Lava facilitates iterative software development through refinement of abstract computational processes into architecture-specific implementations. This allows application development to start from high-level behavioral models that can be broken down successively into lower-level models optimized for different platforms.</p></li>
<li><p><strong>Cross-platform:</strong> Lava supports flexible cross-platform execution of computational processes on novel neuromorphic architectures such as Intel Loihi as well as conventional CPU/GPU architectures. It is flexible in the dual sense of allowing the same computational processes to execute on different platforms while also allowing different processes to execute and interact across architectures through message passing.</p></li>
<li><p><strong>Modular and composable:</strong> Lava’s computational processes follow a consistent architecture that makes them interoperable, which allows modular systems to be composed from other computational processes.</p></li>
<li><p><strong>Extensible:</strong> Lava is open and extensible to support use cases of increasing breadth over time and to interact with other third-party frameworks such as TensorFlow, ROS, Brian and more.</p></li>
<li><p><strong>Trainable:</strong> Lava comes with powerful training algorithms to train models offline and, in the future, continually online in real time.</p></li>
<li><p><strong>Accessible:</strong> Lava provides an intuitive Python API to quickly build and execute models on distributed parallel systems.</p></li>
</ul>
</section>
<section id="why-do-we-need-lava">
<h2>Why do we need Lava?<a class="headerlink" href="#why-do-we-need-lava" title="Permalink to this heading"></a></h2>
<p>At a micro-scale, neuromorphic systems are brain-inspired. Brains consist of a large number of neurons and synapses that operate in parallel and communicate with each other through sparse, asynchronous messages (spikes), which leads to large gains in computational efficiency.</p>
<p>At a macro-scale, neuromorphic hardware systems often involve multiple physical computing elements, ranging from special purpose neural accelerators to conventional CPU/GPUs, sensors, or actuator devices.
Neuromorphic hardware systems and Lava mirror this general, massively parallel, heterogeneous architecture from the ground up. All algorithms in Lava are built from independent, modular computational processes that may execute on different hardware platforms and communicate through generalized message types.</p>
<p>To date, no single open software framework combines all of these architectural aspects in a coherent, easy-to-use, and performant fashion to pave the way for broader adoption of neuromorphic technologies.</p>
</section>
<section id="lava-s-foundational-concepts">
<h2>Lava’s foundational concepts<a class="headerlink" href="#lava-s-foundational-concepts" title="Permalink to this heading"></a></h2>
<section id="processes">
<h3>1. Processes<a class="headerlink" href="#processes" title="Permalink to this heading"></a></h3>
<p><em>Processes</em> are the fundamental building blocks in the Lava architecture from which all algorithms and applications are built. <em>Processes</em> are stateful objects with internal variables and with input and output ports for message-based communication via channels.
<em>Processes</em> come in different forms. A <em>Process</em> can be as simple as a single neuron or as complex as an entire neural network, such as a ResNet architecture or an excitatory/inhibitory network.
A <em>Process</em> could also represent regular program code such as a search algorithm or a system management process, or be used for data pre- and post-processing.
In addition, peripheral devices such as sensors or actuators can also be wrapped into the <em>Process</em> abstraction to integrate them seamlessly into a Lava application alongside other computational processes.
In short, everything in Lava is a <em>Process</em>, which has its own private memory and communicates with its environment solely via messages. This makes Lava <em>Processes</em> a recursive programming abstraction from which modular, large-scale parallel applications can be built.</p>
<figure class="align-center">
<a class="reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig1_Processes.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig1_Processes.png" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig1_Processes.png" style="width: 800px;" /></a>
</figure>
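<p>As a brief illustration, the sketch below declares a hypothetical <em>Process</em> in Python. The class name <em>LeakyNeuron</em> and its parameters are made up for this example; the imports follow Lava’s <em>lava.magma</em> API as introduced in the ‘Getting started with Lava’ section and may change as Lava evolves.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre>
from lava.magma.core.process.process import AbstractProcess
from lava.magma.core.process.variable import Var
from lava.magma.core.process.ports.ports import InPort, OutPort


class LeakyNeuron(AbstractProcess):
    """Hypothetical Process: defines only the interface, not the behavior."""
    def __init__(self, shape=(1,), du=0.1, dv=0.1, vth=1.0):
        super().__init__(shape=shape, du=du, dv=dv, vth=vth)
        # Ports define how the Process communicates with its environment.
        self.a_in = InPort(shape=shape)    # receives input activations
        self.s_out = OutPort(shape=shape)  # sends output spikes
        # Vars hold the private internal state of the Process.
        self.u = Var(shape=shape, init=0)
        self.v = Var(shape=shape, init=0)
        self.du = Var(shape=(1,), init=du)
        self.dv = Var(shape=(1,), init=dv)
        self.vth = Var(shape=(1,), init=vth)
</pre></div>
</div>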
</section>
<section id="behavioral-implementations-via-processmodels">
<h3>2. Behavioral implementations via ProcessModels<a class="headerlink" href="#behavioral-implementations-via-processmodels" title="Permalink to this heading"></a></h3>
<p><em>Process</em> classes themselves only define the interface of a <em>Process</em> in terms of state variables, ports, and other class methods through which other <em>Processes</em> can use or interact with it. However, a <em>Process</em> does not itself provide the behavioral implementation that makes it executable on a particular hardware architecture.</p>
<p>The behavioral implementation - i.e. a concrete implementation of variables, ports, channels and internal business logic - is provided via separate <em>ProcessModel</em> classes. <em>Processes</em> can have one or more <em>ProcessModels</em> of the same or different types. We distinguish between two categories of <em>ProcessModel</em> types:</p>
<figure class="align-center">
<a class="reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig2_ProcessModels.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig2_ProcessModels.png" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig2_ProcessModels.png" style="width: 800px;" /></a>
</figure>
<ul class="simple">
<li><p><strong>SubProcessModels:</strong> The first way <em>ProcessModels</em> can be implemented is in terms of other <em>Processes</em>. With a <em>SubProcessModel</em>, other sub-<em>Processes</em> can be instantiated, configured, and connected to define the behavior of the parent-<em>Process</em>. The ports of a <em>SubProcessModel</em> can be connected to the ports of its sub-<em>Processes</em>, and variables of sub-<em>Processes</em> can be exposed as variables of the parent-<em>Process</em> as needed to make them accessible in its environment. This type of refinement of a parent-<em>Process</em> by means of sub-<em>Processes</em> can be applied recursively and results in a tree-structured <em>Process</em>/<em>ProcessModel</em> hierarchy, which allows sophisticated and reusable application modules to be built.</p></li>
<li><p><strong>LeafProcessModels:</strong> <em>ProcessModels</em> can also be implemented directly rather than through the composition of other sub-<em>Processes</em>. In this case, we refer to them as <em>LeafProcessModels</em> because they form the leaves of the tree-structured <em>Process</em> hierarchy. Such <em>LeafProcessModels</em> can be implemented in a programming language such as <em>Python</em> or <em>C</em> (a minimal Python sketch follows this list). What the <em>LeafProcessModel</em> code looks like depends on the hardware architecture: for a von-Neumann processor, it describes computations directly in terms of executable instructions, possibly using external libraries such as NumPy, TensorFlow or PyTorch. In contrast, compute resources like neuromorphic cores do not execute arbitrary sequences of instructions but gain computational advantages from collective dynamics specified by structural descriptions. In this case, the code of the corresponding <em>NcProcessModels</em> for such cores is responsible for the structural allocation and configuration of neural network resources such as axons, synapses, and neurons in a neuro core. In the future, Lava <em>Processes</em> may also model and specify the operation of analog neuromorphic chips with behavioral models that only approximately match their real-time execution.</p></li>
</ul>
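<p>To make this concrete, the following is a minimal sketch of a Python <em>LeafProcessModel</em> for the hypothetical <em>LeakyNeuron</em> <em>Process</em> sketched earlier. It assumes the decorator-based API (<em>implements</em>, <em>requires</em>, <em>tag</em>) and the <em>PyLoihiProcessModel</em> base class of current Lava releases; exact module paths and the neuron dynamics shown are illustrative and may differ between versions.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre>
import numpy as np

from lava.magma.core.decorator import implements, requires, tag
from lava.magma.core.model.py.model import PyLoihiProcessModel
from lava.magma.core.model.py.type import LavaPyType
from lava.magma.core.model.py.ports import PyInPort, PyOutPort
from lava.magma.core.resources import CPU
from lava.magma.core.sync.protocols.loihi_protocol import LoihiProtocol


@implements(proc=LeakyNeuron, protocol=LoihiProtocol)  # LeakyNeuron: see sketch above
@requires(CPU)
@tag('floating_pt')
class PyLeakyNeuronModel(PyLoihiProcessModel):
    """Floating-point behavioral model, executed on a CPU."""
    a_in: PyInPort = LavaPyType(PyInPort.VEC_DENSE, float)
    s_out: PyOutPort = LavaPyType(PyOutPort.VEC_DENSE, bool)
    u: np.ndarray = LavaPyType(np.ndarray, float)
    v: np.ndarray = LavaPyType(np.ndarray, float)
    du: float = LavaPyType(float, float)
    dv: float = LavaPyType(float, float)
    vth: float = LavaPyType(float, float)

    def run_spk(self):
        # One algorithmic time step of the LoihiProtocol.
        a_in_data = self.a_in.recv()
        self.u = self.u * (1 - self.du) + a_in_data
        self.v = self.v * (1 - self.dv) + self.u
        s_out = self.v &gt;= self.vth
        self.v[s_out] = 0  # reset membrane potential after a spike
        self.s_out.send(s_out)
</pre></div>
</div>
<p>A <em>SubProcessModel</em>, by contrast, would define the same behavior by instantiating and connecting other <em>Processes</em> inside the model rather than writing the dynamics out directly.</p>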
<p>In general, in neuromorphic architectures computation emerges from collective dynamical processes that are often approximate, nondeterministic, and may vary in detailed execution from platform to platform. Therefore Lava views behavior as <em>modeled</em> rather than <em>programmed</em> or <em>specified</em>, and this perspective motivates the name <em>ProcessModel</em>.</p>
<p>Fundamentally, all <em>Processes</em> within a system or network operate in parallel and communicate asynchronously with each other through the exchange of message tokens. But many use cases require synchronization among <em>Processes</em>, for instance to implement a discrete-time dynamical system representing a particular neuron model progressing from one algorithmic time step to the next.
Lava allows developers to define synchronization protocols that describe how <em>Processes</em> in the same synchronization domain synchronize with each other. Such a <em>SyncProtocol</em> is orchestrated by a <em>Synchronizer</em> within a <em>SyncDomain</em>, which exchanges synchronization message tokens with all <em>Processes</em> in that <em>SyncDomain</em>. The compiler assigns <em>Processes</em> automatically to a <em>SyncDomain</em> based on the <em>SyncProtocol</em> their <em>ProcessModels</em> implement, but it also allows users to assign <em>Processes</em> manually to <em>SyncDomains</em> to customize synchronization in more detail (a code sketch follows the figure below).</p>
<figure class="align-center">
<a class="reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig3_ProcessModel_Sync.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig3_ProcessModel_Sync.png" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig3_ProcessModel_Sync.png" style="width: 800px;" /></a>
</figure>
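<p>As a rough sketch of manual assignment: the code below assumes the <em>SyncDomain</em> helper class and the <em>custom_sync_domains</em> argument of Lava’s <em>RunConfigs</em>; both are assumptions about the current API and may differ between releases. Here, <em>proc_a</em> and <em>proc_b</em> are placeholders for two already instantiated <em>Processes</em> whose <em>ProcessModels</em> implement the <em>LoihiProtocol</em>.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre>
from lava.magma.core.run_configs import Loihi1SimCfg
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.sync.domain import SyncDomain
from lava.magma.core.sync.protocols.loihi_protocol import LoihiProtocol

# proc_a and proc_b stand for two already instantiated Processes whose
# ProcessModels implement the LoihiProtocol.
domain = SyncDomain(name="my_domain",
                    protocol=LoihiProtocol(),
                    processes=[proc_a, proc_b])

# Hand the custom SyncDomain to the RunConfig; Processes not listed here
# are assigned to SyncDomains automatically by the compiler.
run_cfg = Loihi1SimCfg(custom_sync_domains=[domain])
proc_a.run(condition=RunSteps(num_steps=10), run_cfg=run_cfg)
proc_a.stop()
</pre></div>
</div>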
<p>Besides implementing a specific <em>Process</em> and <em>SyncProtocol</em>, a <em>ProcessModel</em> has a specific type (such as <em>PyProcessModel</em>, <em>CProcessModel</em>, <em>SubProcessModel</em>, etc.), has one or more resource requirements, and has one or more tags. Resource requirements specify what compute or peripheral resources a <em>ProcessModel</em> may require in order to execute, such as CPU, neuromorphic core, hard disk, or access to a camera. Furthermore, tags specify additional behavioral attributes of a <em>ProcessModel</em>.</p>
<p>To execute a specific instance of a <em>Process</em>, the compiler selects one of the <em>ProcessModels</em> that implement a given <em>Process</em> class. To allow for different selection strategies, the compiler delegates this <em>ProcessModel</em> selection to instances of a separate <em>RunConfig</em> class. Such <em>RunConfigs</em> correspond to a set of rules that determine which <em>ProcessModel</em> to select for a <em>Process</em>, given user preferences and the resource requirements and tags of a <em>ProcessModel</em>. For instance, a particular <em>RunConfig</em> may always select the highest-level Python-based implementation of a <em>Process</em> for quick application prototyping on a conventional CPU or GPU architecture without physical access to neuromorphic systems like Intel’s Kapoho Bay or Pohoiki Springs. Another <em>RunConfig</em> might prefer to map <em>Processes</em> to neuro cores or embedded CPUs whenever such neuromorphic systems are available. Lava will provide several pre-configured <em>RunConfigs</em> but allows users to customize them or create their own.</p>
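<p>In practice, selecting a behavioral implementation and executing a <em>Process</em> amounts to passing a <em>RunConfig</em> to its <em>run()</em> method. The sketch below uses the <em>LIF</em> <em>Process</em> from the Lava process library together with the CPU simulation <em>RunConfig</em>; the tag name matches the one used by Lava’s floating-point Python models, but treat the exact names as assumptions that may vary across releases.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre>
from lava.proc.lif.process import LIF
from lava.magma.core.run_configs import Loihi1SimCfg
from lava.magma.core.run_conditions import RunSteps

lif = LIF(shape=(64,))

# Select the floating-point Python ProcessModel and execute on a CPU.
run_cfg = Loihi1SimCfg(select_tag="floating_pt")
lif.run(condition=RunSteps(num_steps=100), run_cfg=run_cfg)
lif.stop()
</pre></div>
</div>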
<p>In summary, while <em>Processes</em> only provide a universal interface to interact with their environment via message passing, one or more <em>ProcessModels</em> implement the behavioral model of a <em>Process</em> in different languages and in ways tailored to different hardware architectures. This implementation can be provided either directly or by iteratively refining a given <em>Process</em>, implementing its behavior via sub-<em>Processes</em>, which allows for code reuse and greater modularity.</p>
<p>Overall, this programming model enables quick application prototyping at a high level, in the user’s language of choice, agnostic of the intricate constraints and complexities often associated with neuromorphic architectures, while deferring hardware-architecture specifics until later. Once ready, high-level behavioral models can be replaced or refined by more efficient lower-level implementations (often provided by the Lava process library). At the same time, the availability of different behavioral implementations allows users to run the same application on different hardware platforms, such as a CPU or a neuromorphic system, when only one of them is available.</p>
</section>
<section id="composability-and-connectivity">
<h3>3. Composability and connectivity<a class="headerlink" href="#composability-and-connectivity" title="Permalink to this heading"></a></h3>
<p><em>Processes</em> are connected with each other via ports for message-based communication over channels. For hierarchical <em>Processes</em>, ports of a parent <em>Process</em> can also be internally connected to ports of a child <em>Process</em>, and vice versa, within a <em>SubProcessModel</em>.
In general, connections between <em>Processes</em> can be feed-forward or recurrent and support branching and joining of connections.
While ports at the <em>Process</em>-level are only responsible for establishing connectivity between <em>Processes</em> before compilation, the port implementation at <em>ProcessModel</em>-level is responsible for actual message-passing at runtime.</p>
<figure class="align-center">
<a class="reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig4_Connectivity.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig4_Connectivity.png" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig4_Connectivity.png" style="width: 800px;" /></a>
</figure>
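<p>For illustration, the sketch below wires two <em>LIF</em> populations through a <em>Dense</em> connection <em>Process</em> from the Lava process library; ports are connected with their <em>connect()</em> method, and a second <em>Dense</em> instance closes a recurrent loop. The shapes and weights are arbitrary example values.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre>
import numpy as np

from lava.proc.lif.process import LIF
from lava.proc.dense.process import Dense

pre = LIF(shape=(3,))
post = LIF(shape=(2,))

# Feed-forward connectivity: OutPort -&gt; InPort (weight matrix shaped (post, pre)).
ff = Dense(weights=np.ones((2, 3)))
pre.s_out.connect(ff.s_in)
ff.a_out.connect(post.a_in)

# Recurrent connectivity: the output of 'post' feeds back onto itself.
rec = Dense(weights=0.5 * np.eye(2))
post.s_out.connect(rec.s_in)
rec.a_out.connect(post.a_in)
</pre></div>
</div>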
<p>The different kinds of ports and variables are part of the same formal class hierarchy. Both ports and variables are members of a <em>Process</em>. As such, both ports and variables communicate or represent numeric data, characterized by a data type that must be specified when allocating them in a <em>Process</em>. Additionally, port and variable implementations have an associated type and support a configurable precision of the numeric data they communicate or represent.
There are four main types of ports that can be grouped in two different ways:</p>
<ul class="simple">
<li><p><em>OutPorts</em> connect to and send messages to <em>InPorts</em> by value; meaning that the <em>InPort</em> receives a copy of the data sent via the <em>OutPort</em>. Thus changes to that data on the sender or receiver side will not affect the other side, enabling safe parallel processing without side-effects.</p></li>
<li><p>On the other hand, <em>RefPorts</em> connect to <em>VarPorts</em>, which act as a proxy to internal state variables of a <em>Process</em>. <em>RefPorts</em> enable one <em>Process</em> to directly access a variable or internal memory of another <em>Process</em> by reference, as if it were its own. Such direct memory access is very powerful, but also more dangerous, as it can lead to unforeseen side effects in parallel programming and should therefore be used with caution. Yet, sometimes it is necessary to achieve certain behaviors (a sketch after the next paragraph illustrates this).</p></li>
</ul>
<p>Aside from these main port types, there are additional virtual ports that effectively act as compiler directives for transforming the shape of a port or for combining multiple ports. Currently, Lava supports <em>ReshapePorts</em> and <em>ConcatPorts</em> to change the shape of a port or to concatenate multiple ports into one.
Finally, system-level communication between <em>Processes</em> such as for synchronization is also implemented via ports and channels, but those are not managed directly by – and therefore are hidden from – the user.</p>
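<p>The sketch below illustrates both ideas: a <em>RefPort</em> on a hypothetical monitoring <em>Process</em> is connected to a <em>Var</em> of another <em>Process</em> (which creates an implicit <em>VarPort</em> on the target), and a <em>reshape()</em> call inserts a virtual port at connection time. The <em>VoltageReader</em> class is made up for this example, and the method names follow current Lava releases but should be treated as assumptions.</p>
<div class="highlight-python notranslate"><div class="highlight"><pre>
from lava.magma.core.process.process import AbstractProcess
from lava.magma.core.process.ports.ports import RefPort
from lava.proc.lif.process import LIF


class VoltageReader(AbstractProcess):
    """Hypothetical Process that inspects another Process's membrane voltage."""
    def __init__(self, shape=(2,)):
        super().__init__(shape=shape)
        self.v_ref = RefPort(shape=shape)


post = LIF(shape=(2,))
reader = VoltageReader(shape=(2,))

# By-reference access: the RefPort connects to the Var 'v' of 'post' via an
# implicitly created VarPort -- powerful, but use with caution.
reader.v_ref.connect_var(post.v)

# Virtual ports: reshape an OutPort before connecting it to a port whose
# shape differs but holds the same number of elements.
flat = LIF(shape=(6,))
grid = LIF(shape=(2, 3))
flat.s_out.reshape(new_shape=(2, 3)).connect(grid.a_in)
</pre></div>
</div>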
<figure class="align-center">
<a class="reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig5_ProcessMembers.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig5_ProcessMembers.png" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig5_ProcessMembers.png" style="width: 800px;" /></a>
</figure>
</section>
<section id="cross-platform-execution">
<h3>4. Cross-platform execution<a class="headerlink" href="#cross-platform-execution" title="Permalink to this heading"></a></h3>
<p>Lava supports cross-platform execution of processes on a distributed set of compute nodes. Nodes in Lava have compute resources associated with them, such as CPUs or neuro cores, and peripheral resources such as sensors, actuators or hard disks. Given a graph of <em>Processes</em>, their <em>ProcessModels</em> and a <em>RunConfig</em>, the compiler maps the <em>Processes</em> defined in the user system process to one or more <em>NodeConfigurations</em>. Depending on which type of node a <em>Process</em> is mapped to, a different <em>ProcessModel</em> with a node-specific implementation of its variables, ports, channels and behavior is chosen.
In the end, each node in a <em>NodeConfiguration</em> can host one or more <em>SyncDomains</em>, each with one or more <em>ProcessModels</em> in it. Each such <em>SyncDomain</em> also contains a local <em>RuntimeService</em> process. The <em>RuntimeService</em> is responsible for system management and includes the <em>Synchronizer</em> that orchestrates the <em>SyncProtocol</em> of the <em>SyncDomain</em>. Irrespective of the presence of multiple <em>SyncDomains</em> on multiple nodes, all user-defined and system processes communicate seamlessly with each other and with the global Runtime in the user system process via one asynchronous message-passing backend.</p>
<figure class="align-center">
<a class="reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig6_NodeConfigs.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig6_NodeConfigs.png" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig6_NodeConfigs.png" style="width: 800px;" /></a>
</figure>
</section>
</section>
<section id="lava-software-stack">
<h2>Lava software stack<a class="headerlink" href="#lava-software-stack" title="Permalink to this heading"></a></h2>
<figure class="align-center">
<a class="with-border reference internal image-reference" href="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig7_SwStack.png"><img alt="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig7_SwStack.png" class="with-border" src="https://raw.githubusercontent.com/lava-nc/lava-nc.github.io/main/_static/images/arch/Fig7_SwStack.png" style="width: 800px;" /></a>
</figure>
<p>The core of the Lava software stack comprises the <em>Communicating Sequential Process</em> API, a powerful compiler, and a runtime. In combination, these components form the <em>Magma</em> layer of Lava, which is the foundation on which new <em>Processes</em> and <em>ProcessModels</em> are built.
The Lava process library provides a growing collection of generic, low-level, reusable, and widely applicable <em>Processes</em> from which higher-level algorithm and application libraries are built.</p>
<p>The first libraries to be released as part of the Lava software framework are libraries for deep learning (lava-dl), optimization (lava-optim), and dynamic neural fields (lava-dnf). Future libraries will add support for vector symbolic architectures (lava-vsa) and evolutionary optimization (lava-evo).</p>
<p>Besides these components, future releases of Lava will offer several utilities for application profiling, automatic float to fixed-point model conversion, network visualization, and more.
Finally, Lava is open for extension to other third-party frameworks such as TensorFlow, ROS or Nengo.
We welcome open-source contributions to any of these future libraries and utilities.</p>
</section>
</section>
</div>
</div>
<footer><div class="rst-footer-buttons" role="navigation" aria-label="Footer">
<a href="index.html" class="btn btn-neutral float-left" title="Lava Software Framework" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left" aria-hidden="true"></span> Previous</a>
<a href="getting_started_with_lava.html" class="btn btn-neutral float-right" title="Getting Started with Lava" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right" aria-hidden="true"></span></a>
</div>
<hr/>
<div role="contentinfo">
<p>© Copyright 2021, Intel Corporation.</p>
</div>
Built with <a href="https://www.sphinx-doc.org/">Sphinx</a> using a
<a href="https://github.com/readthedocs/sphinx_rtd_theme">theme</a>
provided by <a href="https://readthedocs.org">Read the Docs</a>.
</footer>
</div>
</div>
</section>
</div>
<script>
jQuery(function () {
SphinxRtdTheme.Navigation.enable(false);
});
</script>
</body>
</html>