
    Lesson 7 • Intermediate

    Neural Networks Introduction

    Understand neurons, layers, weights, and activation functions — build a neural network from scratch.

    ✅ What You'll Learn

    • The perceptron: a single artificial neuron
    • Multi-layer networks and backpropagation
    • Activation functions: sigmoid, tanh, ReLU
    • Solving non-linear problems (XOR) with hidden layers

    🧠 How Neural Networks Think

    🎯 Real-World Analogy: Imagine a factory assembly line. Raw materials (inputs) enter, pass through workers at different stations (neurons in hidden layers) who each transform the material slightly, and a finished product (prediction) comes out. Training is like optimising each worker's technique until the factory produces exactly what's needed.

    A neural network is layers of simple mathematical functions chained together. Each neuron takes inputs, multiplies them by weights, adds a bias, and passes the result through an activation function. Individually they're simple — together they can approximate any function.

    Input → [w1×x1 + w2×x2 + b] → activation → Output

    neuron = activation(weights · inputs + bias)
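That one-line formula can be checked directly in NumPy. A minimal sketch, where the input, weight, and bias values are illustrative and not taken from the lesson:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

inputs = np.array([0.5, 0.8])     # x1, x2
weights = np.array([0.4, -0.2])   # w1, w2 (illustrative values)
bias = 0.1

z = np.dot(weights, inputs) + bias  # w1*x1 + w2*x2 + b = 0.14
output = sigmoid(z)                 # squashed into (0, 1)
print(f"weighted sum = {z:.2f}, neuron output = {output:.3f}")
```

Swapping the activation (tanh, ReLU) changes only the last step; the weighted sum stays the same.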

    Try It: The Perceptron

    Train a single neuron to learn the AND gate

    Python
    import numpy as np
    
    # The Perceptron: Simplest neural network (1 neuron)
    # Inspired by how brain neurons work
    
    def sigmoid(x):
        return 1 / (1 + np.exp(-np.clip(x, -500, 500)))
    
    # AND gate: both inputs must be 1
    print("=== Training a Perceptron (AND gate) ===")
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 0, 0, 1])  # AND: only 1 when both inputs are 1
    
    # Random initial weights
    np.random.seed(42)
    weights = np.random.randn(2) * 0.5
    bias = 0.0
    learning_rate = 0.5
    
    # Train for 100 epochs with a simple error-correction rule
    for epoch in range(100):
        for xi, target in zip(X, y):
            prediction = sigmoid(np.dot(weights, xi) + bias)
            error = target - prediction
            weights += learning_rate * error * xi
            bias += learning_rate * error
    
    # Check the learned gate
    for xi, target in zip(X, y):
        prediction = sigmoid(np.dot(weights, xi) + bias)
        print(f"{xi} -> {prediction:.3f}  (expected {target})")

    Try It: Multi-Layer Network (XOR)

    Build a 2-layer network that solves the XOR problem from scratch

    Python
    import numpy as np
    
    # Multi-Layer Neural Network: Solving XOR!
    # XOR can't be solved by a single neuron — you need layers
    
    def sigmoid(x):
        return 1 / (1 + np.exp(-np.clip(x, -500, 500)))
    
    def sigmoid_derivative(x):
        return x * (1 - x)
    
    # XOR problem (not linearly separable!)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([[0], [1], [1], [0]])
    
    # Network: 2 inputs → 4 hidden neurons → 1 output
    np.random.seed(42)
    w_hidden = np.random.randn(2, 4) * 0.5
    b_hidden = np.zeros((1, 4))
    w_output = np.random.randn(4, 1) * 0.5
    b_output = np.zeros((1, 1))
    learning_rate = 1.0
    
    # Train with gradient descent
    for epoch in range(5000):
        hidden = sigmoid(X @ w_hidden + b_hidden)        # forward pass
        output = sigmoid(hidden @ w_output + b_output)
    
        # Backward pass: propagate the error through each layer
        output_delta = (output - y) * sigmoid_derivative(output)
        hidden_delta = output_delta @ w_output.T * sigmoid_derivative(hidden)
    
        w_output -= learning_rate * hidden.T @ output_delta
        b_output -= learning_rate * output_delta.sum(axis=0, keepdims=True)
        w_hidden -= learning_rate * X.T @ hidden_delta
        b_hidden -= learning_rate * hidden_delta.sum(axis=0, keepdims=True)
    
    print("=== XOR Predictions ===")
    for xi, pred in zip(X, output):
        print(f"{xi} -> {pred[0]:.3f}  (expected {int(xi[0] != xi[1])})")

    Try It: Activation Functions

    Compare sigmoid, tanh, ReLU and learn when to use each

    Python
    import numpy as np
    
    # Activation Functions: The "decision makers" of neural networks
    # Each activation gives neurons different behaviour
    
    def sigmoid(x):
        return 1 / (1 + np.exp(-np.clip(x, -500, 500)))
    
    def tanh(x):
        return np.tanh(x)
    
    def relu(x):
        return np.maximum(0, x)
    
    def leaky_relu(x, alpha=0.01):
        return np.where(x > 0, x, alpha * x)
    
    # Compare activations on the same inputs
    inputs = np.array([-3, -2, -1, -0.5, 0, 0.5, 1, 2, 3])
    
    print("=== Activation Functions Compared ==="
    ...

    📋 Quick Reference

    Component         Role                        Analogy
    Weights           Importance of each input    Volume knobs
    Bias              Shifts the activation       Threshold adjuster
    Activation        Introduces non-linearity    On/off switch
    Forward pass      Input → prediction          Assembly line
    Backpropagation   Error → weight updates      Quality feedback
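The forward pass and backpropagation rows above can be sketched for a single neuron: gradient descent on the squared error nudges the weight and bias so the prediction moves toward the target. The starting values here are illustrative, not from the lesson:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x, target = 1.0, 1.0
w, b, lr = 0.2, 0.0, 0.5

errors = []
for step in range(20):
    pred = sigmoid(w * x + b)                    # forward pass
    errors.append(0.5 * (pred - target) ** 2)    # squared error
    grad = (pred - target) * pred * (1 - pred)   # chain rule through sigmoid
    w -= lr * grad * x                           # backprop: weight update
    b -= lr * grad                               # backprop: bias update

print(f"error fell from {errors[0]:.4f} to {errors[-1]:.4f}")
```

The multi-layer case repeats the same chain-rule step once per layer, working backwards from the output.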

    🎉 Lesson Complete!

    You've built neural networks from scratch! Next, dive deeper into training with backpropagation and deep learning.
