Implementing a Neural Network from Scratch in Python
Creating a neural network from scratch involves several steps: defining the network architecture, initializing the weights, implementing the forward pass, calculating the loss, implementing the backpropagation algorithm, and updating the weights. Below is a structured guide to implementing a simple feedforward neural network.
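Note that the class below never computes an explicit loss value; the error term `y - output` in the backward pass plays that role (it is proportional to the negative gradient of a mean-squared-error loss). If you want to monitor training progress, a separate readout such as the following sketch is a common choice (the `mse_loss` helper is illustrative and not part of the class below):

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Mean squared error: a simple readout for monitoring training progress."""
    return np.mean((y_true - y_pred) ** 2)
```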
1. Necessary Imports
We will use NumPy, a powerful library for numerical computations in Python.
```python
import numpy as np
```
2. Documentation Blocks and Code Implementation
Class Definition
The following class, `NeuralNetwork`, implements a simple feedforward neural network:
```python
class NeuralNetwork:
    """
    Simple feedforward neural network class.

    Attributes:
        input_size (int): Number of input features.
        hidden_size (int): Number of neurons in the hidden layer.
        output_size (int): Number of output classes.
        learning_rate (float): Learning rate for weight updates.
        weights_input_hidden (ndarray): Weights from the input layer to the hidden layer.
        weights_hidden_output (ndarray): Weights from the hidden layer to the output layer.
    """

    def __init__(self, input_size, hidden_size, output_size, learning_rate=0.01):
        """
        Initialize the neural network with the specified sizes and learning rate.

        Parameters:
            input_size (int): Number of input features.
            hidden_size (int): Number of neurons in the hidden layer.
            output_size (int): Number of output classes.
            learning_rate (float): Learning rate for weight updates.
        """
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size
        self.learning_rate = learning_rate
        # Initialize weights with uniform random values in [0, 1)
        self.weights_input_hidden = np.random.rand(self.input_size, self.hidden_size)
        self.weights_hidden_output = np.random.rand(self.hidden_size, self.output_size)
    def sigmoid(self, x):
        """Compute the sigmoid activation function."""
        return 1 / (1 + np.exp(-x))

    def sigmoid_derivative(self, x):
        """Compute the sigmoid derivative, where x is an already-activated sigmoid output."""
        return x * (1 - x)
    def forward(self, X):
        """
        Perform the forward pass through the network.

        Parameters:
            X (ndarray): Input data of shape (n_samples, input_size).

        Returns:
            ndarray: Output of the network of shape (n_samples, output_size).
        """
        self.hidden_layer_input = np.dot(X, self.weights_input_hidden)
        self.hidden_layer_output = self.sigmoid(self.hidden_layer_input)
        self.final_input = np.dot(self.hidden_layer_output, self.weights_hidden_output)
        self.output = self.sigmoid(self.final_input)
        return self.output
    def backward(self, X, y, output):
        """
        Perform the backward pass and update weights based on the error.

        Parameters:
            X (ndarray): Input data.
            y (ndarray): Target output.
            output (ndarray): Output from the forward pass.
        """
        output_error = y - output  # Error at the output layer
        output_delta = output_error * self.sigmoid_derivative(output)  # Output-layer gradient

        # Backpropagate the error to the hidden layer *before* updating the
        # hidden-to-output weights, so the gradient uses the current weights
        hidden_layer_error = np.dot(output_delta, self.weights_hidden_output.T)
        hidden_layer_delta = hidden_layer_error * self.sigmoid_derivative(self.hidden_layer_output)

        # Update weights from hidden to output layer
        self.weights_hidden_output += np.dot(self.hidden_layer_output.T, output_delta) * self.learning_rate
        # Update weights from input to hidden layer
        self.weights_input_hidden += np.dot(X.T, hidden_layer_delta) * self.learning_rate
    def train(self, X, y, epochs=1000):
        """
        Train the neural network.

        Parameters:
            X (ndarray): Input data.
            y (ndarray): Target output.
            epochs (int): Number of training iterations.
        """
        for _ in range(epochs):
            output = self.forward(X)     # Forward pass
            self.backward(X, y, output)  # Backward pass and weight update

    def predict(self, X):
        """
        Make predictions based on input data.

        Parameters:
            X (ndarray): Input data.

        Returns:
            ndarray: Predicted output.
        """
        return self.forward(X)
```
3. Input Validation
While this basic implementation does not include explicit input validation, ensure that inputs are NumPy arrays of the correct shape when calling the `train` and `predict` methods for robustness.
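A minimal sketch of such a check might look like this (the `validate_input` helper and its error messages are illustrative, not part of the class above):

```python
import numpy as np

def validate_input(X, expected_features):
    """Illustrative check: ensure X is a 2-D NumPy array with the expected feature count."""
    X = np.asarray(X, dtype=float)  # Accept nested lists by converting to a float array
    if X.ndim != 2:
        raise ValueError(f"Expected a 2-D array, got {X.ndim} dimension(s)")
    if X.shape[1] != expected_features:
        raise ValueError(f"Expected {expected_features} features, got {X.shape[1]}")
    return X
```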
4. Commentary
- Initialization: Weights are drawn uniformly from [0, 1) via `np.random.rand`; for simplicity, this network has no bias terms.
- Activation Function: The sigmoid function activates the neurons; its derivative drives backpropagation (derived below).
- Forward Pass: Inputs are transformed into outputs through weighted connections and activations.
- Backpropagation: Weights are adjusted in proportion to the error gradient, which is how the network learns.
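The derivative used in `sigmoid_derivative` follows directly from the definition of the sigmoid. With $\sigma(x) = \frac{1}{1 + e^{-x}}$:

$$\sigma'(x) = \frac{e^{-x}}{(1 + e^{-x})^2} = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)$$

This is why the method computes `x * (1 - x)`: it expects the already-activated value $\sigma(x)$, not the raw pre-activation input.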
5. Code Usage Example
Here's a practical example of how to use the `NeuralNetwork` class:
```python
# Example usage
if __name__ == "__main__":
    # Input data: 4 samples with 3 features
    X = np.array([[0, 0, 1],
                  [1, 0, 1],
                  [0, 1, 1],
                  [1, 1, 1]])

    # Target output: 4 samples with 1 output each
    y = np.array([[0], [1], [1], [0]])

    # Create an instance of the NeuralNetwork
    nn = NeuralNetwork(input_size=3, hidden_size=4, output_size=1)

    # Train the network
    nn.train(X, y, epochs=10000)

    # Make predictions
    predictions = nn.predict(X)
    print("Predictions:\n", predictions)
```
Conclusion
This implementation covers the fundamentals of building a neural network from scratch in Python. For advanced topics and optimizations, consider the resources available on the Enterprise DNA Platform.
Description
This guide provides step-by-step instructions for implementing a simple feedforward neural network in Python using NumPy, covering architecture, forward and backward passes, and training methods.