For years, I’ve bounced between Go and Node.js for my backend services. Go is great for concurrency, and Node is unbeatable for prototyping speed. But when I needed absolute memory safety and the kind of performance that makes C++ sweat, I turned to Rust. If you’re looking for a tutorial on building an API with Rust and Axum, you’ve come to the right place.

I chose Axum because it’s developed by the Tokio team. It leverages the tower ecosystem, meaning it’s modular, highly compatible, and incredibly fast. In my experience, Axum strikes the perfect balance between the strictness of Rust and the ergonomics of a modern web framework.

Prerequisites

Before we dive into the code, ensure you have the Rust toolchain (rustc and Cargo, installed via rustup) on your machine.

If you’re still debating whether Rust is the right choice compared to other languages, I’ve written a detailed breakdown on Rust vs Go for APIs that might help you decide.

Step 1: Project Initialization

Start by creating a new binary project using Cargo:

cargo new rust-axum-api
cd rust-axum-api

Now, let’s add the necessary dependencies to your Cargo.toml. We need axum for the framework, tokio for the async runtime, and serde for JSON serialization.

[dependencies]
axum = "0.7"
tokio = { version = "1.0", features = ["full"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
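If you prefer the command line, the same dependencies can be added with cargo add (available since Cargo 1.62), which writes the entries into Cargo.toml for you:

```shell
cargo add axum@0.7
cargo add tokio@1.0 --features full
cargo add serde@1.0 --features derive
cargo add serde_json@1.0
```

Either approach produces an equivalent [dependencies] section.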

Step 2: Creating Your First Route

Let’s build a simple “Hello World” endpoint. In src/main.rs, replace the boilerplate with the following code:

use axum::{routing::get, Router};

#[tokio::main]
async fn main() {
    // build our application with a single route
    let app = Router::new().route("/", get(handler));

    // run it with hyper on localhost:3000
    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await.unwrap();
    println!("🚀 Server running on http://localhost:3000");
    axum::serve(listener, app).await.unwrap();
}

async fn handler() -> &'static str {
    "Hello from Axum!"
}

Run the server using cargo run and visit http://localhost:3000. You should see the greeting immediately.

Step 3: Handling JSON and State

A real API needs to handle data. Let’s implement a simple User resource. To keep this tutorial focused, I’ll use an Arc<Mutex<T>> for in-memory storage, but in a production environment, you’ll want to look into ORM alternatives for Rust like Diesel or SeaORM.

use axum::{routing::get, extract::State, Json, Router};
use serde::{Deserialize, Serialize};
use std::sync::{Arc, Mutex};

#[derive(Serialize, Deserialize, Clone)]
struct User {
    id: u64,
    username: String,
}

type SharedState = Arc<Mutex<Vec<User>>>;

#[tokio::main]
async fn main() {
    let shared_state: SharedState = Arc::new(Mutex::new(Vec::new()));

    let app = Router::new()
        .route("/users", get(get_users).post(create_user))
        .with_state(shared_state);

    let listener = tokio::net::TcpListener::bind("0.0.0.0:3000").await.unwrap();
    println!("🚀 Server running on http://localhost:3000");
    axum::serve(listener, app).await.unwrap();
}

async fn get_users(State(state): State<SharedState>) -> Json<Vec<User>> {
    let users = state.lock().unwrap().clone();
    Json(users)
}

async fn create_user(State(state): State<SharedState>, Json(payload): Json<User>) -> Json<User> {
    let mut users = state.lock().unwrap();
    users.push(payload.clone());
    Json(payload)
}

As shown in the image below, your terminal output will confirm the server is listening. You can now use Postman or curl to POST a JSON object like {"id": 1, "username": "ajmani"} to /users and then GET it back.

Terminal output showing a successful Axum server start and curl requests

Troubleshooting Common Issues

“Cannot find type in this scope”

This usually happens when you’ve forgotten to import a trait. Axum relies heavily on traits for its extractors. Double-check your use statements.

“Async function cannot be called in sync context”

Ensure your main function is annotated with #[tokio::main] and that you are using .await on all asynchronous calls.

What’s Next?

Now that you have a basic API running, it’s time to think about deployment. While a VPS is a classic choice, I’ve found that for event-driven workloads, deploying Rust to AWS Lambda provides an incredible price-to-performance ratio.

Ready to scale? Start by adding a database like PostgreSQL and implementing JWT authentication for your routes.