Build a Neural Net in 4 Minutes

Only a few days left to sign up for my new course! Learn more and sign up here: https://www.theschool.ai/courses/decentralized-applications
How does a neural network work? It's the basis of deep learning and the reason why image recognition, chatbots, self-driving cars, and language translation work! In this video, I'll use Python to code up a neural network in just 4 minutes using only the numpy library, which handles the matrix math.
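
For anyone who wants to see the idea in code first, here is a minimal sketch of the same kind of network: a single layer of weights trained with numpy on a toy dataset. It is a simplified illustration of the approach, not the exact code from the video or the repo below.

import numpy as np

def sigmoid(x, deriv=False):
    # when deriv=True, x is assumed to already be a sigmoid output s, so s*(1-s) is the slope
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

# toy dataset: 4 training examples with 3 inputs each; the output simply copies the first input
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
y = np.array([[0, 0, 1, 1]]).T

np.random.seed(1)
weights = 2 * np.random.random((3, 1)) - 1   # 3 inputs -> 1 output neuron

for _ in range(10000):
    output = sigmoid(np.dot(X, weights))                           # forward pass
    error = y - output                                             # how far off are we?
    weights += np.dot(X.T, error * sigmoid(output, deriv=True))    # nudge weights to reduce the error

print(output)   # should be close to [0, 0, 1, 1]

A deeper 3-layer version with a hidden layer is posted in the comments below.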

Code for this video:
https://github.com/llSourcell/Make_a_neural_network

I created a Slack channel for us, sign up here:
https://wizards.herokuapp.com/

Please Subscribe! That is the thing you could do that would make me happiest.

I recently created a Patreon page. If you like my videos, feel free to help support my effort here:
https://www.patreon.com/user?ty=h&u=3191693

2 Great Neural Net Tutorials:

(please subscribe for more videos like these!)

1. https://medium.com/technology-invention-and-more/how-to-build-a-simple-neural-network-in-9-lines-of-python-code-cc8f23647ca1#.l51z38s7f

2. https://iamtrask.github.io/2015/07/12/basic-python-network/

Awesome Tutorial Series on Neural Networks:

http://lumiverse.io/series/neural-networks-demystified

The Canonical Machine Learning Course:

https://www.coursera.org/learn/machine-learning

Curious just how much neural networks are inspired by brain architecture? Take some time to learn about the human brain! This is my favorite intro to neuroscience course:

https://www.mcb80x.org/

Follow me:
Twitter: https://twitter.com/sirajraval
Facebook: https://www.facebook.com/sirajology
Instagram: https://www.instagram.com/sirajraval/

Comments

Yung Trash says:

Ok I built it, it works, but like what do I do with it now lol. Can I somehow use this to make a chatbot I previously coded better?

Rishik Mourya says:

Did u start like this? And understand forward propagation, back propagation, gradient descent… in just 4 mins?

Percentexz says:

See, the funny thing is there is no reason to make this 4 min long.

Flávio Mendes says:

Why is l2_error (line 34) different from l1_error (line 41)?

Thank you all. :)

Harsh Deep says:

Won't ReLU give better performance than sigmoid in this case?

Peter's Journey says:

Can someone help explain why we need this code?

if (deriv == True):
return (x * (1 - x))

SA Kiwi says:

what version of python are u using?

Kevin McCarthy says:

Recently ran this in Python3 after fixing many of the bugs. I keep outputting 0.47, 0.48, 0.54 and 0.54. Does anyone know why this is? My predicted output is nowhere close to the actual output, and my error rate doesn’t decrease each iteration — it stays at 0.4964

Lindsay Fowler says:

I think I have it working, at least it compiles now… :

# Siraj's 4 minute Neural Network!
# 3 layer neural net:
   
#     Input            Hidden           Output
#     Layer            Layer            Layer
#
#                        O
#       O      x                  x
#              x         O        x
#       O   synapses0           synapses1    O
#              x         O        x
#       O      x                  x
#                        O
#    
#
#

#I had to run this in Anaconda since pip said numpy was already installed but stupid IDLE was complaining it wasn’t.
#also I guessed a lot of the error fixes… so there could still be problems.

import numpy as np

#sigmoid function: any value gets squashed to between 0 and 1
#Sigmoid function is used at every neuron to generate probabilities out of numbers
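#Note: when deriv=True the function expects x to already be a sigmoid output s, so x*(1-x) equals s*(1-s), the sigmoid's derivative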
def nonlin(x,deriv): 
    if(deriv==True):
        return x*(1-x)
    return 1/(1+np.exp(-x))

#input data initialised as a matrix
# each row is a different training example with three input neurons each
# each column is a different neuron
X= np.array([[0,0,1],
            [0,1,1],
            [1,0,1],
            [1,1,1]])
#output data set: four examples, so one output neuron each
y = np.array([[0],
              [1],
              [1],
              [0]])

np.random.seed(1) # always use the same random number sequence (deterministic)
#for the following code in order to help with debugging

#2 synapse matrices for a three layer network
#Synapses are the connections from each neuron in one layer to each neuron in the next layer
#each synapse has a random weight assigned to it.
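#(2*np.random.random(shape)-1 draws each weight uniformly from [-1, 1), so the weights start out centered on zero)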
syn0 = 2*np.random.random((3,4))-1 #3 inputs to 4 intermediate neurons
syn1 = 2*np.random.random((4,1))-1 #4 intermediate neurons to 1 output neuron

#training
for j in range (60000): #for loop that iterates over the training code to optimise the network for the given data set
    
    layer0 = X # create our first layer which is our input data
    layer1 = nonlin(np.dot(layer0, syn0),deriv=False) #prediction layer. perform matrix multiplication
    #between each layer and its synapse then run our sigmoid function on each value in the matrix to
    #create our next layer which is a prediction of the output data.
    layer2 = nonlin(np.dot(layer1,syn1),deriv=False) # then we do the same thing on this layer to get a more refined prediction

    #now that we have a prediction, let’s compare it to the expected data:
    l2_error = y - layer2 # get the error rate

    if (j % 1000) == 0:
        print ("Error:" + str(np.mean(np.abs(l2_error))) ) # print error rate at a set interval for monitoring purposes.

    #next multiply our error rate by our sigmoid function of the prediction to get the derivative
    #of our output prediction from layer 2, giving us a delta which we will use to reduce our error rate
    #of our predictions when we update our synapses every iteration.
    l2_delta = l2_error*nonlin(layer2, deriv=True)
    #next we see how much layer 1 contributed to the errors in layer 2, this is called back propagation
    # we get this error by multiplying layer2’s delta by syn1’s transpose
    l1_error = l2_delta.dot(syn1.T)
    #then we'll get layer 1's delta by multiplying its error by the result of our sigmoid function
    #the function is used to get the derivative of layer 1
    l1_delta = l1_error*nonlin(layer1,deriv=True)
    #now that we have deltas for each of our layers, we can use them to update our synapse weights
    #to reduce the error rate more and more.

    #update weights – algorithm: Gradient Descent
    syn1 += layer1.T.dot(l2_delta)
    syn0 += layer0.T.dot(l1_delta)
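    #note: this is plain full-batch gradient descent with an implicit learning rate of 1;
    #scaling these updates (e.g. by 0.1) would make the learning rate explicit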
print ("Output after training")
print (layer2)

J.R. Millstone says:

God, I wish I understood how to code…. O.o

The_ProGamer says:

idk why but that feels like 4 years

Blake Neely says:

sensei neromation review

Mark Flanagan says:

Great Work .. respect

Micheal Bee says:

Very cool. I like how you home brewed this.

Tom L says:

I have an interesting idea for a program, but I have no idea how to actually go about programming something like this. I'm wanting to make it for Facebook users.

Harminder Singh Bhamra says:

Where are the biases????

Tom M says:

eh, maybe you should have done this in 40 minutes, not 4

starjuancho says:

How to read what he wrote at 0:55 -> 1/(1+np,exp(-x))? I guess np is Euler's number, but I don't understand the expression… I don't code in Python…

pkworlz says:

your videos look cool but they are useless if someone wanna learn from them seriously…

Waqar Ahmed says:

Your code on "https://github.com/llSourcell/Make_a_neural_network" is for only one neuron. Do you have other code for multiple neurons?

to stupid for a Name says:

I HAVE NEVER SEEN A THING WITH THAT MUCH INFORMATION PER TIME HOLY MOLY THIS IS SO COOL.

Travis Ashworth says:

Can we get an example to show how bias fits in? …please?

The Daily Compilation says:

instructions unclear, i now have *FoOT fUnGeS*

Nigel Van Der Laan says:

Thanks, very helpful.

Carlos Manuel says:

Hey Siraj, I would like to be able to ask you some questions cause I'm trying to implement an AI for a game but I'm a little stuck… where can I write you other than YT comments…??

Kacper Tomasik says:

Wait
How did you run it? You imported numpy as numpy
None of the np.s should have worked
You used a comma instead of a dot
It might work, but you cheated kinda

ab y says:

I have programmed before, but only learned Python 2 days ago.

I don't understand a thing he said…
