Abstract

How exactly our brain works is still an open question, but one thing seems clear: biological neural systems are computationally powerful, robust and noisy. Using the Reservoir Computing paradigm based on Spiking Neural Networks, also known as Liquid State Machines, we present results from a novel approach in which diverse and noisy parallel reservoirs, totalling 3,000 modelled neurons, work together while receiving the same averaged feedback. Inspired by the ideas of action learning and embodiment, we use the safe and flexible industrial robot BAXTER in our experiments. The robot was taught to draw three different 2D shapes on top of a desk using a total of four joints. For comparison, the same basic system was also implemented in a serial way. The results show that our parallel approach enables BAXTER to produce the trajectories of the learned shapes more accurately than the traditional serial one.
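The abstract does not give the training rule, but the parallel-reservoir idea it describes can be illustrated roughly as follows. This is a minimal, hypothetical sketch only: it uses rate-based tanh units in place of the paper's spiking liquid state machines, and the number of reservoirs, weight scales, and noise level are assumptions, not values from the paper. What it shows is the structural point made in the abstract: several noisy reservoirs run in parallel, their readouts are averaged, and every reservoir receives that same averaged signal as feedback.

```python
# Hypothetical sketch (not the authors' code): K parallel reservoirs whose
# readouts are averaged, with the averaged output fed back to all of them.
import numpy as np

rng = np.random.default_rng(0)

K = 3        # number of parallel reservoirs (assumption)
N = 1000     # neurons per reservoir (3 x 1000 = 3000 total, as in the abstract)
n_out = 4    # one readout per controlled robot joint (assumption)

# Random recurrent, feedback, and readout weights for each reservoir.
W    = [rng.normal(0, 1.0 / np.sqrt(N), (N, N)) for _ in range(K)]
Wfb  = [rng.normal(0, 0.5, (N, n_out))          for _ in range(K)]
Wout = [rng.normal(0, 0.1, (n_out, N))          for _ in range(K)]

x = [np.zeros(N) for _ in range(K)]   # reservoir states
y_avg = np.zeros(n_out)               # shared averaged output / feedback

def step(noise_std=0.05):
    """Advance every reservoir one step; all receive the same averaged feedback."""
    global y_avg
    outputs = []
    for k in range(K):
        # Independent noise keeps the reservoirs diverse, as the abstract stresses.
        noise = rng.normal(0, noise_std, N)
        x[k] = np.tanh(W[k] @ x[k] + Wfb[k] @ y_avg + noise)
        outputs.append(Wout[k] @ x[k])
    y_avg = np.mean(outputs, axis=0)  # averaged readout, e.g. joint commands
    return y_avg

trajectory = np.array([step() for _ in range(200)])  # 200 steps of 4-joint commands
print(trajectory.shape)  # (200, 4)
```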

DOI

10.1109/IJCNN.2016.7727325

Publication Date

2016-11-03

Event

2016 International Joint Conference on Neural Networks (IJCNN)

Publication Title

2016 International Joint Conference on Neural Networks (IJCNN)

Publisher

IEEE

Embargo Period

2024-11-22

Keywords

Biological neural networks, Computational modeling, Liquids, Neurons, Noise measurement, Robots, Shape, BAXTER, V-REP, humanoid robots, liquid state machines, parallel processing, reservoir computing, spiking neural networks

First Page

1134

Last Page

1142
