Layers: Python Implementation

Affine (fully-connected layer):

from builtins import range
import numpy as np


def affine_forward(x, w, b):
    """
    Computes the forward pass for an affine (fully-connected) layer.

    The input x has shape (N, d_1, ..., d_k) and contains a minibatch of N
    examples, where each example x[i] has shape (d_1, ..., d_k). We will
    reshape each input into a vector of dimension D = d_1 * ... * d_k, and
    then transform it to an output vector of dimension M.

    Inputs:
    - x: A numpy array containing input data, of shape (N, d_1, ..., d_k)
    - w: A numpy array of weights, of shape (D, M)
    - b: A numpy array of biases, of shape (M,)

    Returns a tuple of:
    - out: output, of shape (N, M)
    - cache: (x, w, b)
    """
    # Number of examples in the batch.
    N = x.shape[0]

    # Reshape each input in the batch to a row vector of dimension D.
    reshaped_input = x.reshape(N, -1)

    # Fully-connected forward pass: out = xW + b.
    out = np.dot(reshaped_input, w) + b
    
    cache = (x, w, b)
    return out, cache
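
As a quick sanity check, here is a hypothetical call with small illustrative shapes (the dimensions are not from the original):

x = np.random.randn(2, 4, 5, 6)   # N = 2 examples, each of shape (4, 5, 6), so D = 120
w = np.random.randn(120, 3)       # D = 120, M = 3
b = np.random.randn(3)

out, cache = affine_forward(x, w, b)
print(out.shape)  # (2, 3)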

ReLU
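
A minimal sketch of the ReLU forward and backward passes, written in the same (out, cache) style as the affine layer above (the function names and docstrings are assumptions following that convention):

def relu_forward(x):
    """
    Computes the forward pass for a layer of rectified linear units (ReLUs).

    Input:
    - x: Inputs, of any shape

    Returns a tuple of:
    - out: Output, of the same shape as x
    - cache: x
    """
    # Elementwise max(0, x); keep the input around for the backward pass.
    out = np.maximum(0, x)
    cache = x
    return out, cache


def relu_backward(dout, cache):
    """
    Computes the backward pass for a layer of ReLUs.

    Input:
    - dout: Upstream derivatives, of any shape
    - cache: Input x, of the same shape as dout

    Returns:
    - dx: Gradient with respect to x
    """
    x = cache
    # The gradient passes through only where the input was positive.
    dx = dout * (x > 0)
    return dx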

Batch Norm
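
A minimal sketch of the training-time batch normalization forward pass, assuming learnable per-feature scale and shift parameters gamma and beta and a small eps for numeric stability (the function name and signature are assumptions; running-average bookkeeping for test time is omitted):

def batchnorm_forward_train(x, gamma, beta, eps=1e-5):
    """
    Training-time forward pass for batch normalization.

    Inputs:
    - x: Data of shape (N, D)
    - gamma: Scale parameter of shape (D,)
    - beta: Shift parameter of shape (D,)
    - eps: Small constant for numeric stability

    Returns a tuple of:
    - out: Output data of shape (N, D)
    - cache: Values needed for the backward pass
    """
    # Per-feature mean and variance over the minibatch.
    mu = x.mean(axis=0)
    var = x.var(axis=0)

    # Normalize to zero mean and unit variance, then scale and shift.
    x_hat = (x - mu) / np.sqrt(var + eps)
    out = gamma * x_hat + beta

    cache = (x, x_hat, mu, var, gamma, eps)
    return out, cache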

BN Circuit

The batch normalization circuit computes the per-feature mean and variance of the minibatch, so it is worth recalling the definition. The variance of a random variable $X$ is the expected value of the squared deviation from its mean $\mu = \operatorname{E}[X]$:

$$\operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2\right].$$

This definition encompasses random variables generated by processes that are discrete, continuous, neither, or mixed. The variance can also be thought of as the covariance of a random variable with itself:

$$\operatorname{Var}(X) = \operatorname{Cov}(X, X).$$

The variance is also equivalent to the second cumulant of the probability distribution that generates $X$. It is typically denoted $\operatorname{Var}(X)$, $\sigma_X^2$, or simply $\sigma^2$ (pronounced "sigma squared"). The expression for the variance can be expanded:

$$\operatorname{Var}(X) = \operatorname{E}\left[X^2\right] - \operatorname{E}[X]^2.$$
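
A quick numeric check of this expansion (a sketch; the sample distribution, size, and seed are arbitrary):

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)

# Variance computed directly from the definition...
direct = np.mean((x - x.mean()) ** 2)
# ...matches the expanded form E[X^2] - E[X]^2.
expanded = np.mean(x ** 2) - x.mean() ** 2

print(np.isclose(direct, expanded))  # True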
