Neural Networks and Elixir

Neural Networks & Elixir By Quentin Thomas @thequengineer

Transcript of Neural Networks and Elixir


Two Views on A.I. Programming

CLASSICAL

- Involves imperative, step-by-step instructions

- Exact rules must be handwritten

- Very predictable

- Human solves the problem

- Maintenance is tedious

- Linear solutions

MATHEMATICAL/PROBABILISTIC

- Declarative and functional

- Deals with context and probability

- Continuous learning

- Machine solves the problem

- Data driven

- Unpredictable

- Ubiquitous solutions

In order to understand this new way of development, we have to think differently about programming by merging the two views.

We need the ideas of Erlang/Elixir and neural networks now more than ever:

- 80% of the web's data is unstructured and not understood

- Problems are getting harder and harder to solve via the classical approach

- FPGAs (Field-Programmable Gate Arrays) are parallel by design, and a different style of programming will be needed in order to extract real value from them

- IoT demands

- Need Systems that can evolve and keep up with the rate of change happening all around us

- Companies will not be able to compete technologically without their use; those utilizing them will disrupt their competition quite easily. (Knowledge == Power) ANNs bring knowledge through data

NEURONS

- The brain operates as a giant, interconnected, distributed system

- 86 billion neurons in the brain

- Responsible for all brain processing activity and learning

- Parallelized structure by nature

- Their operation is still a mystery in most cases

Artificial Neural Network Architecture Example

NEURONS → ELIXIR

Soma: the body of the cell; the nucleus, responsible for storing cell memory over time and processing electrical signals. → Agents, GenServers, Supervisors

Synapse: the electrical contact between two or more neurons. → Imagine our message passing via tuples, etc.

Dendrites: collect inputs and serve as channels. → Elixir channels can broadcast messages to multiple processes simultaneously, and can model dendrite behavior this way.

Axon: turns the processed inputs into output. → Can be any IoT device in the real world: phones, cars, tablets, watches, bots, servers, etc.
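The mapping above can be sketched in code. A minimal, hypothetical example (the `Neuron` module and its function names are our own, not from the talk): a neuron modeled as a GenServer whose state plays the role of the soma, and whose incoming tuple messages play the role of synapses.

```elixir
defmodule Neuron do
  use GenServer

  # Start a neuron with weights and a bias held in its state (the "soma").
  def start_link(weights, bias) do
    GenServer.start_link(__MODULE__, {weights, bias})
  end

  def init(state), do: {:ok, state}

  # A "synapse": another process sends inputs to this neuron as a tuple message.
  def stimulate(pid, inputs), do: GenServer.call(pid, {:stimulate, inputs})

  def handle_call({:stimulate, inputs}, _from, {weights, bias} = state) do
    # Weighted sum of the incoming signals, plus the bias.
    sum =
      inputs
      |> Enum.zip(weights)
      |> Enum.reduce(bias, fn {i, w}, acc -> acc + i * w end)

    {:reply, sum, state}
  end
end
```

Because each neuron is an ordinary process, a whole network is just many of these running concurrently under a Supervisor, exchanging messages the way biological neurons exchange signals.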

How do they relate to Elixir?

Let’s Examine a Neuron.

We will show a single-neuron perceptron, which is great for illustrating how an artificial neuron works.

Elixir makes designing Neural Nets Simple due to its built in concurrency and parallelism.

Understanding the structure of one Artificial Neuron and how it performs calculations prepares you for creating more complicated networks later.

Data Signals Go Through 3 Phases within an Artificial Neuron

[Diagram: inputs {input 1, weight 1} and {input 2, weight 2} feed the neuron, together with a bias B]

1. Summation Phase: each input is multiplied by its weight, and the products are summed into one value.

2. Transfer/Activation Phase: the summed value is passed through the particular transfer function chosen by the creator of the network. The type of function is chosen based on the type of problem being solved.

3. Bias Phase: the bias is added to produce the final output.
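The three phases can be sketched as plain Elixir functions. This is an illustrative example (the `Perceptron` module is our own); note that it folds the bias in before the activation, which is the common formulation, even though the slides list the bias last.

```elixir
defmodule Perceptron do
  # 1. Summation phase: multiply each input by its weight and sum the results.
  def summation(inputs, weights) do
    inputs
    |> Enum.zip(weights)
    |> Enum.reduce(0.0, fn {i, w}, acc -> acc + i * w end)
  end

  # 2. Transfer/activation phase: apply the chosen transfer function.
  #    Here, a step function, as in the classic perceptron.
  def activate(sum) when sum >= 0.0, do: 1
  def activate(_sum), do: 0

  # 3. Bias: added to the summation before activating.
  def output(inputs, weights, bias) do
    (summation(inputs, weights) + bias) |> activate()
  end
end
```

For example, `Perceptron.output([1.0, 0.0], [0.6, -0.4], -0.5)` computes 1.0 × 0.6 + 0.0 × (-0.4) - 0.5 ≈ 0.1, which the step function maps to 1.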

Activations

How is Learning Achieved?

There are three methods:

Supervised Learning: the process of training a machine toward a target specified by the human. The problem is well defined and understood, and is taught to the network via examples.

Graded Learning: the network is given a grade and works until it achieves a grade that is satisfactory to the creator. Sometimes called goal-oriented learning.

Unsupervised Learning: the network learns on its own through experiments, documents, and interactions with human beings. The most fascinating area of cognitive computing right now, and likely the next big thing in the tech industry.
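As a concrete illustration of supervised learning, here is a sketch of the classic perceptron learning rule (the talk names the method, not this exact algorithm; the `PerceptronTrainer` module, its learning rate, and function names are our own). Each labeled example nudges the weights toward the human-specified target.

```elixir
defmodule PerceptronTrainer do
  @rate 0.1

  # Step-function perceptron prediction: 1 or 0.
  def predict(inputs, weights, bias) do
    sum =
      inputs
      |> Enum.zip(weights)
      |> Enum.reduce(bias, fn {i, w}, acc -> acc + i * w end)

    if sum >= 0.0, do: 1, else: 0
  end

  # One pass over the labeled examples: adjust each weight by
  # rate * error * input, and the bias by rate * error.
  def train_epoch(examples, weights, bias) do
    Enum.reduce(examples, {weights, bias}, fn {inputs, target}, {ws, b} ->
      error = target - predict(inputs, ws, b)

      new_ws =
        ws
        |> Enum.zip(inputs)
        |> Enum.map(fn {w, i} -> w + @rate * error * i end)

      {new_ws, b + @rate * error}
    end)
  end

  # Repeat epochs until the creator's "grade" is satisfactory.
  def train(_examples, weights, bias, 0), do: {weights, bias}

  def train(examples, weights, bias, epochs) do
    {ws, b} = train_epoch(examples, weights, bias)
    train(examples, ws, b, epochs - 1)
  end
end
```

When the prediction matches the target, the error is zero and nothing changes; only mistakes move the weights, which is what "training toward a target" means in practice.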

Elixir Processes Can Learn: 1 = Hot, 0 = Cold

We need to separate grocery items in a list by whether they are hot or cold. How do we architect this problem in Elixir using the single-neuron perceptron architecture?
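One possible architecture, as a hedged sketch: encode each item as numeric features and feed them through a single perceptron. The `Grocery` module, the feature encoding, the item names, and the hand-picked weights below are all invented for illustration; in practice the weights would come from training.

```elixir
defmodule Grocery do
  # Each item encoded as [serving_temperature_score, needs_refrigeration].
  @features %{
    "coffee"    => [0.9, 0.0],
    "soup"      => [0.8, 0.0],
    "ice cream" => [0.1, 1.0],
    "milk"      => [0.2, 1.0]
  }

  # Hand-picked weights and bias standing in for a trained perceptron:
  # a high serving temperature pushes toward :hot, refrigeration toward :cold.
  @weights [1.0, -1.0]
  @bias -0.5

  def classify(item) do
    inputs = Map.fetch!(@features, item)

    sum =
      inputs
      |> Enum.zip(@weights)
      |> Enum.reduce(@bias, fn {i, w}, acc -> acc + i * w end)

    # Step activation: output 1 means hot, 0 means cold.
    if sum >= 0.0, do: :hot, else: :cold
  end

  def separate(items), do: Enum.group_by(items, &classify/1)
end
```

With these numbers, `Grocery.separate(["coffee", "milk"])` groups coffee under `:hot` (0.9 - 0.0 - 0.5 ≥ 0) and milk under `:cold` (0.2 - 1.0 - 0.5 < 0). Each classification could also run in its own process, keeping the perceptron structure parallel by nature.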

Good Reads to help with understanding Neural Networks

Neural Network Design, by Martin T. Hagan

Make Your Own Neural Network, by Tariq Rashid

The Math of Neural Networks, by Jeff Heaton

Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms, by Dr. Frank Rosenblatt, 1962

DARPA Neural Network Study, October 1987 – February 1988, by MIT Lincoln Laboratory, Lexington

A Logical Calculus of the Ideas Immanent in Nervous Activity, by Warren McCulloch and Walter Pitts, 1943

Calculus, Better Explained, by Kalid Azad

Questions?