By EvolveDev on DEV Community
Building a Neural Network from Scratch in Rust

In this blog, we will build a simple neural network from scratch in Rust. We'll start by setting up our project, then implement the core components of a neural network, and finally train it on a basic dataset.

Project Setup

First, let's set up a new Rust project. Open your terminal and run:

cargo new neural_network
cd neural_network

Next, we'll add the ndarray crate for numerical operations and the rand crate for random number generation. Update your Cargo.toml file to include these dependencies:

[dependencies]
ndarray = "0.15"
rand = "0.8"

Implementing the Neural Network

We'll start by creating a network.rs file in the src directory to hold our neural network implementation.

Defining the Network Structure
Create a Network struct that will hold our weights and biases:

// src/network.rs
use ndarray::{Array1, Array2};
use rand::thread_rng;
use rand::Rng;

pub struct Network {
    weights1: Array2<f64>,
    biases1: Array1<f64>,
    weights2: Array2<f64>,
    biases2: Array1<f64>,
}

impl Network {
    pub fn new(input_size: usize, hidden_size: usize, output_size: usize) -> Self {
        let mut rng = thread_rng();

        let weights1 = Array2::from_shape_fn((hidden_size, input_size), |_| rng.gen_range(-1.0..1.0));
        let biases1 = Array1::from_shape_fn(hidden_size, |_| rng.gen_range(-1.0..1.0));
        let weights2 = Array2::from_shape_fn((output_size, hidden_size), |_| rng.gen_range(-1.0..1.0));
        let biases2 = Array1::from_shape_fn(output_size, |_| rng.gen_range(-1.0..1.0));

        Network {
            weights1,
            biases1,
            weights2,
            biases2,
        }
    }
}

Forward Pass
Implement the forward pass of the network, which calculates the activations for each layer, along with the sigmoid activation function and its derivative (the derivative will be needed later for backpropagation):

impl Network {
    fn sigmoid(x: &Array1<f64>) -> Array1<f64> {
        x.mapv(|x| 1.0 / (1.0 + (-x).exp()))
    }

    // Note: this expects the sigmoid *output*, not the pre-activation,
    // since sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)).
    fn sigmoid_derivative(x: &Array1<f64>) -> Array1<f64> {
        x.mapv(|x| x * (1.0 - x))
    }

    pub fn forward(&self, input: &Array1<f64>) -> (Array1<f64>, Array1<f64>, Array1<f64>) {
        let hidden_input = self.weights1.dot(input) + &self.biases1;
        let hidden_output = Self::sigmoid(&hidden_input);
        let final_input = self.weights2.dot(&hidden_output) + &self.biases2;
        let final_output = Self::sigmoid(&final_input);

        // Return the intermediates needed later by backpropagation.
        (hidden_output, final_input, final_output)
    }
}
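To see the forward-pass arithmetic concretely, here is a dependency-free sketch of the same computation for a tiny 2-input, 1-hidden-neuron, 1-output network, using plain `f64` arrays instead of ndarray. The weights and inputs are arbitrary illustrative values, not the article's random initialization:

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Forward pass for a 2-input -> 1 hidden neuron -> 1 output network.
/// Returns (hidden_output, final_output), mirroring the Network::forward tuple.
fn forward(input: [f64; 2], w1: [f64; 2], b1: f64, w2: f64, b2: f64) -> (f64, f64) {
    // hidden_input = w1 · input + b1
    let hidden_input: f64 =
        w1.iter().zip(input.iter()).map(|(w, x)| w * x).sum::<f64>() + b1;
    let hidden_output = sigmoid(hidden_input);
    // final_input = w2 * hidden_output + b2
    let final_output = sigmoid(w2 * hidden_output + b2);
    (hidden_output, final_output)
}

fn main() {
    let (h, y) = forward([0.5, -0.5], [0.4, 0.6], 0.0, 0.8, 0.0);
    println!("hidden = {h:.4}, output = {y:.4}");
}
```

Each layer is just a dot product plus a bias, squashed through the sigmoid; `Network::forward` does the same thing for whole layers at once via `ndarray`'s matrix-vector `dot`.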


Conclusion

In this blog, we built the foundations of a simple neural network from scratch in Rust: project setup, random weight and bias initialization, the sigmoid activation and its derivative, and the forward pass. The same structure extends naturally to backpropagation and training, and to more complex networks and datasets, providing a solid foundation for neural network implementation in Rust.
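As a hint of where backpropagation fits in, here is a dependency-free sketch of one gradient-descent step for the smallest possible version of this architecture (a 1-1-1 sigmoid network with squared-error loss). The learning rate, starting weights, and training sample are all illustrative values, not part of the article's code:

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// One gradient-descent step for a 1-1-1 sigmoid network with
/// loss = (y - target)^2 / 2. Returns the updated (w1, b1, w2, b2).
fn train_step(
    x: f64, target: f64,
    w1: f64, b1: f64, w2: f64, b2: f64,
    lr: f64,
) -> (f64, f64, f64, f64) {
    // Forward pass.
    let h = sigmoid(w1 * x + b1);
    let y = sigmoid(w2 * h + b2);

    // Backward pass: chain rule, using sigmoid'(z) = s * (1 - s)
    // where s is the sigmoid output (same trick as sigmoid_derivative).
    let delta_out = (y - target) * y * (1.0 - y); // dL/d(final_input)
    let delta_hid = delta_out * w2 * h * (1.0 - h); // dL/d(hidden_input)

    (
        w1 - lr * delta_hid * x,
        b1 - lr * delta_hid,
        w2 - lr * delta_out * h,
        b2 - lr * delta_out,
    )
}

/// Squared-error loss for the same tiny network.
fn loss(x: f64, target: f64, w1: f64, b1: f64, w2: f64, b2: f64) -> f64 {
    let y = sigmoid(w2 * sigmoid(w1 * x + b1) + b2);
    0.5 * (y - target) * (y - target)
}

fn main() {
    let (mut w1, mut b1, mut w2, mut b2) = (0.5, 0.0, -0.5, 0.0);
    let (x, target) = (1.0, 1.0);
    let before = loss(x, target, w1, b1, w2, b2);
    for _ in 0..100 {
        let next = train_step(x, target, w1, b1, w2, b2, 0.5);
        w1 = next.0; b1 = next.1; w2 = next.2; b2 = next.3;
    }
    let after = loss(x, target, w1, b1, w2, b2);
    println!("loss: {before:.4} -> {after:.4}");
}
```

The full ndarray version follows the same pattern, with the scalar products replaced by matrix-vector operations over whole layers.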

Feel free to experiment with different architectures, activation functions, and learning rates to see how they affect the network's performance.

