Synaptic Neural Network (SynaNN)

Chang Li

Toronto, Ontario

Research on the Synaptic Neural Network, synapse learning, and dataflow.

Project status: Under Development

Artificial Intelligence

Intel Technologies
AI DevCloud / Xeon, Intel Opt ML/DL Framework

Overview / Usage

A Synaptic Neural Network (SynaNN) consists of a set of synapses and neurons. A synapse is a nonlinear function of an excitatory probability and an inhibitory probability. After introducing surprisal space (negative logarithmic space), the surprisal synapse is the sum of the surprisal of the excitatory probability and the topologically conjugate surprisal of the inhibitory probability. We found that the derivative of the surprisal synapse with respect to its parameter is equal to the negative Bose-Einstein distribution. In addition, we can construct a synapse graph, such as a fully connected synapse graph (synapse tensor), and embed it into other neural networks to form a hybrid neural network. Furthermore, we proved the gradient expression of the cross-entropy loss function with respect to the parameters, so synapse learning is compatible with the gradient descent and backpropagation of deep learning. Finally, we performed an MNIST experiment to prove the concept of the synaptic neural network, achieving high accuracy in both model training and inference. We have tested TensorFlow and Keras code on Intel's AI DevCloud platform.
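The overview above can be illustrated with a minimal NumPy sketch of a single synapse and its surprisal form. It assumes the synapse function S(x, y; α, β) = αx(1 − βy) from the paper, with x the excitatory and y the inhibitory probability; the function names here are illustrative, not the authors' reference code. The numerical check confirms the claim that, with β = 1 and v = −log(y), the derivative of the inhibitory surprisal term with respect to v is the negative Bose-Einstein distribution −1/(e^v − 1).

```python
import numpy as np

def synapse(x, y, alpha=1.0, beta=1.0):
    """Synapse as a nonlinear function of an excitatory probability x
    and an inhibitory probability y: S(x, y; alpha, beta) = alpha*x*(1 - beta*y)."""
    return alpha * x * (1.0 - beta * y)

def surprisal_synapse(x, y, alpha=1.0, beta=1.0):
    """Surprisal (negative log) of the synapse: the surprisal of the excitatory
    part plus the topologically conjugate surprisal of the inhibitory part."""
    return -np.log(alpha * x) - np.log(1.0 - beta * y)

# Numerical check: with beta = 1 and v = -log(y), the derivative of the
# inhibitory term -log(1 - e^{-v}) with respect to v is
# -e^{-v}/(1 - e^{-v}) = -1/(e^v - 1), the negative Bose-Einstein distribution.
v = 1.5
eps = 1e-6
term = lambda v: -np.log(1.0 - np.exp(-v))
numeric = (term(v + eps) - term(v - eps)) / (2 * eps)  # central difference
bose_einstein = 1.0 / (np.exp(v) - 1.0)
assert abs(numeric - (-bose_einstein)) < 1e-6
```

Because the surprisal synapse is a sum of simple terms with this closed-form derivative, its gradients compose directly with backpropagation, which is what makes synapse learning compatible with standard deep-learning training.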

A Synaptic Neural Network and Synapse Learning | Request PDF. Available from: https://www.researchgate.net/publication/327433405_A_Synaptic_Neural_Network_and_Synapse_Learning [accessed Nov 01 2018].
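The fully connected synapse graph (synapse tensor) mentioned in the overview can be sketched as follows. This is a minimal NumPy illustration assuming the form y_i = α_i · x_i · ∏_j (1 − β_ij · x_j), in which every input inhibits every output through the coupling matrix β; the function name and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def synapse_tensor(x, alpha, beta):
    """Fully connected synapse graph: output i combines the excitatory input
    x[i] with inhibition from all inputs j through beta[i, j]:
        y[i] = alpha[i] * x[i] * prod_j (1 - beta[i, j] * x[j])
    x: probabilities in (0, 1), shape (n,); alpha: shape (n,); beta: shape (n, n)."""
    inhibition = np.prod(1.0 - beta * x[np.newaxis, :], axis=1)  # shape (n,)
    return alpha * x * inhibition

rng = np.random.default_rng(0)
n = 4
x = rng.uniform(0.1, 0.9, size=n)          # excitatory input probabilities
alpha = np.ones(n)                          # excitatory weights
beta = rng.uniform(0.0, 0.5, size=(n, n))   # inhibitory coupling matrix
y = synapse_tensor(x, alpha, beta)
print(y.shape)  # (4,)
```

A layer of this shape can then be embedded into a larger TensorFlow or Keras model as a hybrid neural network, with α and β trained by gradient descent as described in the overview.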

Methodology / Approach

The Synaptic Neural Network theory has been published and studied, its framework has been built, and various potential applications have been explored.

Our methodology is to apply SynaNN theory to real applications and investigate its features.

Technologies Used

TensorFlow and Keras ML frameworks
Python language
Xeon acceleration in the cloud
