
dc.contributor.author: Dobric, D
dc.contributor.author: Pech, A
dc.contributor.author: Ghita, B
dc.contributor.author: Wennekers, Thomas
dc.date.accessioned: 2022-05-03T13:27:58Z
dc.date.available: 2022-05-03T13:27:58Z
dc.date.issued: 2020-07-31
dc.identifier.issn: 0976-2191
dc.identifier.uri: http://hdl.handle.net/10026.1/19151
dc.description.abstract:

The Hierarchical Temporal Memory Cortical Learning Algorithm (HTM CLA) is a theory and machine learning technology that aims to capture the cortical algorithm of the neocortex. Inspired by the biological functioning of the neocortex, it provides a theoretical framework that helps to better understand how the cortical algorithm inside the brain might work. It organizes populations of neurons in column-like units, crossing several layers, such that the units are connected into structures called regions (areas). Areas and columns are hierarchically organized and can further be connected into more complex networks, which implement higher cognitive capabilities such as invariant representations. Columns inside layers are specialized in learning spatial patterns and sequences. This work specifically targets the spatial pattern learning algorithm called the Spatial Pooler. The complex topology and high number of neurons used in this algorithm require more computing power than even a single machine with multiple cores or GPUs can provide. This work aims to improve the HTM CLA Spatial Pooler by enabling it to run in a distributed environment on multiple physical machines using the Actor Programming Model. The proposed model is based on a mathematical theory and computation model that targets massive concurrency. Using this model drives different reasoning about concurrent execution and enables flexible distribution of parallel cortical computation logic across multiple physical nodes. This work is the first on a parallel HTM Spatial Pooler running on multiple physical nodes with the named computational model. With the increasing popularity of cloud computing and serverless architectures, it is the first step towards proposing interconnected independent HTM CLA units in an elastic cognitive network. It can thereby provide an alternative to deep neural networks, with theoretically unlimited scale in a distributed cloud environment.
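The core idea in the abstract — sharding the Spatial Pooler's columns across independent, message-passing units so overlap computation parallelizes — can be illustrated with a minimal toy sketch. This is not the paper's implementation: actors are simulated here with threads and queues, and all names (`ColumnShardActor`, `distributed_spatial_pooling`) and parameter values are hypothetical.

```python
# Toy sketch of actor-style Spatial Pooler sharding (illustrative only):
# each "actor" owns a shard of columns, receives the input SDR as a
# message, computes local overlaps, and replies; a coordinator then
# applies global inhibition (k-winners-take-all).
import queue
import random
import threading

class ColumnShardActor:
    """Owns a contiguous shard of columns and their proximal synapses."""
    def __init__(self, col_start, col_count, input_size, rng):
        self.col_start = col_start
        # Each column connects to a random subset of input bits.
        self.synapses = [
            rng.sample(range(input_size), k=max(1, input_size // 4))
            for _ in range(col_count)
        ]
        self.inbox = queue.Queue()

    def run(self, reply_to):
        # Process one "compute overlaps" message, then stop.
        input_sdr = self.inbox.get()
        overlaps = [
            (self.col_start + i, sum(1 for s in syn if s in input_sdr))
            for i, syn in enumerate(self.synapses)
        ]
        reply_to.put(overlaps)

def distributed_spatial_pooling(input_sdr, num_columns=64, num_shards=4,
                                input_size=32, k=8, seed=42):
    """Fan the input out to shard actors, gather overlaps, pick k winners."""
    rng = random.Random(seed)
    per_shard = num_columns // num_shards
    replies = queue.Queue()
    actors = [ColumnShardActor(i * per_shard, per_shard, input_size, rng)
              for i in range(num_shards)]
    threads = [threading.Thread(target=a.run, args=(replies,))
               for a in actors]
    for t in threads:
        t.start()
    for a in actors:
        a.inbox.put(frozenset(input_sdr))  # message passing, no shared state
    for t in threads:
        t.join()
    all_overlaps = []
    for _ in actors:
        all_overlaps.extend(replies.get())
    # Global inhibition: the k columns with highest overlap become active.
    all_overlaps.sort(key=lambda p: (-p[1], p[0]))
    return sorted(col for col, _ in all_overlaps[:k])

active_columns = distributed_spatial_pooling({0, 3, 7, 12, 19, 25})
```

In the paper's setting the shard actors would live on different physical nodes behind an actor framework, but the coordination pattern — partition columns, broadcast the input, gather overlaps, inhibit globally — is the same.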

dc.format.extent: 83-100
dc.language.iso: en
dc.publisher: Academy & Industry Research Collaboration Center (AIRCC)
dc.title: Scaling the HTM Spatial Pooler
dc.type: journal-article
plymouth.issue: 4
plymouth.volume: 11
plymouth.publication-status: Published
plymouth.journal: International Journal of Artificial Intelligence and Applications
dc.identifier.doi: 10.5121/ijaia.2020.11407
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Admin Group - REF
plymouth.organisational-group: /Plymouth/Admin Group - REF/REF Admin Group - FoSE
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering/School of Engineering, Computing and Mathematics
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA11 Computer Science and Informatics
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dcterms.dateAccepted: 2020-01-01
dc.rights.embargodate: 2022-05-18
dc.rights.embargoperiod: Not known
rioxxterms.versionofrecord: 10.5121/ijaia.2020.11407
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Journal Article/Review



