Neural Net Functions and Operations Modularized in Python
These are common functions (and, in the case of the dot product, a common operation) applied in training neural networks: sigmoid, softmax, and dot product.
import math

import numpy as np


def sigmoid(value, round_to):
    # Squashes a real number into (0, 1); plots an S-shaped curve.
    # Note: could add a derivative option (value, deriv=True/False).
    # Clip to prevent overflow in np.exp for inputs of large magnitude.
    value = np.clip(value, -500, 500)
    # Sigmoid formula
    return round(1.0 / (1 + np.exp(-value)), round_to)


def soft_max(some_array, round_to):
    # Softmax highlights the largest values in an array (vector)
    # and suppresses values significantly below the maximum.
    # The output array sums to 1.
    array_exp = [math.exp(i) for i in some_array]
    sum_array_exp = sum(array_exp)
    # Squash values proportionately into (0, 1)
    return [round(i / sum_array_exp, round_to) for i in array_exp]


def dot_product(vector_a, vector_b):
    # For two vectors of equal length (any dimension): sums the products
    # of their corresponding entries. Geometrically, a . b = |a||b|cos(theta),
    # where theta is the angle between the two vectors.
    dot_prod = 0
    for i in range(len(vector_a)):
        dot_prod += vector_a[i] * vector_b[i]
    return dot_prod
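A softmax built from raw exponentials can overflow for large inputs (math.exp(1000) raises OverflowError). A common fix, sketched here as a separate hypothetical function rather than a change to the code above, subtracts the maximum before exponentiating; this leaves the result unchanged because the constant cancels in the ratio.

```python
import numpy as np

def stable_softmax(some_array):
    # exp(x - c) / sum(exp(x - c)) == exp(x) / sum(exp(x)) for any constant c,
    # so shifting by the max changes nothing but keeps np.exp in range.
    shifted = np.asarray(some_array, dtype=float) - np.max(some_array)
    exps = np.exp(shifted)
    return exps / exps.sum()
```

With this version, even an input like [1000.0, 1000.0] produces [0.5, 0.5] instead of overflowing.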
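The sigmoid comment above notes a possible derivative option. One way that could look, sketched under a hypothetical name and signature (sigmoid_with_deriv is not part of the original code), uses the identity that the sigmoid's derivative can be written in terms of its own output:

```python
import numpy as np

def sigmoid_with_deriv(value, deriv=False):
    # Clip to prevent overflow in np.exp, as in the main sigmoid.
    value = np.clip(value, -500, 500)
    s = 1.0 / (1 + np.exp(-value))
    if deriv:
        # sigma'(x) = sigma(x) * (1 - sigma(x)); handy in backpropagation
        # because the forward-pass output can be reused directly.
        return s * (1 - s)
    return s
```

At value = 0 the function returns 0.5 and its derivative 0.25, the curve's steepest point.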
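As a quick sanity check on the dot product idea (using generic example vectors, not values from the original), the entry-by-entry loop agrees with NumPy's built-in np.dot:

```python
import numpy as np

a = [1, 2, 3]
b = [4, 5, 6]

# Sum of products of corresponding entries: 1*4 + 2*5 + 3*6
manual = sum(x * y for x, y in zip(a, b))

assert manual == np.dot(a, b)
```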