In this tutorial, we will build a WebAssembly-powered serverless platform using Rust 🦀. This platform will enable you to register and invoke WebAssembly functions dynamically. We’ll also explore how to harness Rust’s performance, type safety, and concurrency features to build a scalable, robust system.
WebAssembly is an excellent choice for serverless platforms because of its portability, security model, and near-native performance. By compiling functions to WebAssembly, developers can write in any language that supports WASM compilation while the platform remains agnostic to the source language.
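For instance, a function a developer might register could start life as ordinary Rust. A minimal sketch (compile with `--target wasm32-unknown-unknown` to produce a `.wasm` module; the function name `add` is just an example):

```rust
// With the wasm32-unknown-unknown target, `#[no_mangle]` plus
// `extern "C"` expose `add` as a named WebAssembly export that a
// host runtime can look up and call.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // On a native target this simply calls the function directly.
    println!("{}", add(2, 3)); // prints 5
}
```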
You can find the complete source code for this tutorial on GitHub:
https://github.com/luishsr/rust-serverless
Step 1: Setting Up the Project
First, create a new Rust project. We’ll structure it as a single binary crate and split the logic into modules for clarity.
cargo new rust-serverless
cd rust-serverless
Update the Cargo.toml file to include the necessary dependencies:
[dependencies]
warp = "0.3"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
sled = "0.34"
wasmtime = "27.0"
tokio = { version = "1.0", features = ["full"] }
These dependencies enable us to build a web server (warp), serialize and deserialize JSON (serde and serde_json), persist registered functions (sled), execute WebAssembly modules (wasmtime), and run the asynchronous runtime that warp needs (tokio).
Step 2: Designing the Platform Architecture
The platform will have the following main components:
- HTTP Server: Built using warp, this will handle API requests for registering and invoking functions.
- Storage Layer: Backed by sled, it will persist WebAssembly modules for registered functions.
- Execution Engine: Using wasmtime, this will dynamically execute WebAssembly functions with supplied inputs.
We will organize these components into modules: api.rs for the HTTP server, storage.rs for the storage layer, and executor.rs for the WebAssembly execution engine.
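Tying these together, a minimal `main.rs` might look like the following (a sketch under the module layout above; the bind address and port are arbitrary choices):

```rust
mod api;
mod executor;
mod storage;

use std::sync::Arc;

#[tokio::main]
async fn main() {
    // Open (or create) the sled database and share it across handlers.
    let storage = Arc::new(storage::Storage::init().expect("failed to open storage"));

    // Build the warp filter tree and serve it.
    let routes = api::server(storage);
    warp::serve(routes).run(([127, 0, 0, 1], 3030)).await;
}
```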
Step 3: Implementing the Storage Layer
The storage layer is responsible for persisting WebAssembly modules under unique names. We’ll use sled, a fast and simple key-value store.
Create a file called src/storage.rs:
use sled::Db;

pub struct Storage {
    db: Db,
}

impl Storage {
    pub fn init_with_path(path: &str) -> Result<Self, Box<dyn std::error::Error>> {
        let db = sled::open(path)?;
        Ok(Self { db })
    }

    pub fn init() -> Result<Self, Box<dyn std::error::Error>> {
        Self::init_with_path("functions_db")
    }

    /// Persist a module's source under the given function name.
    pub fn save_function(&self, name: String, code: String) -> Result<(), sled::Error> {
        self.db.insert(name, code.as_bytes())?;
        Ok(())
    }

    /// Look up a module's source by function name.
    pub fn load_function(&self, name: &str) -> Result<String, sled::Error> {
        if let Some(code) = self.db.get(name)? {
            // Surface invalid UTF-8 as an error rather than panicking.
            String::from_utf8(code.to_vec()).map_err(|e| {
                sled::Error::Io(std::io::Error::new(std::io::ErrorKind::InvalidData, e))
            })
        } else {
            Err(sled::Error::Io(std::io::Error::new(
                std::io::ErrorKind::NotFound,
                "Function not found",
            )))
        }
    }
}
This module provides methods to save and retrieve WebAssembly modules. It uses sled to persist mappings from function names to their module source, stored as UTF-8 bytes.
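Conceptually, the storage layer is just a persistent map from function names to module sources. A stdlib-only sketch of the same save/load contract (using `HashMap` in place of sled, purely for illustration):

```rust
use std::collections::HashMap;
use std::io::{Error, ErrorKind};

// In-memory stand-in for the sled-backed Storage: same contract,
// no persistence.
struct MemStorage {
    db: HashMap<String, Vec<u8>>,
}

impl MemStorage {
    fn new() -> Self {
        Self { db: HashMap::new() }
    }

    fn save_function(&mut self, name: String, code: String) {
        self.db.insert(name, code.into_bytes());
    }

    fn load_function(&self, name: &str) -> Result<String, Error> {
        match self.db.get(name) {
            Some(bytes) => String::from_utf8(bytes.clone())
                .map_err(|e| Error::new(ErrorKind::InvalidData, e)),
            None => Err(Error::new(ErrorKind::NotFound, "Function not found")),
        }
    }
}

fn main() {
    let mut storage = MemStorage::new();
    storage.save_function("add".to_string(), "(module)".to_string());
    assert_eq!(storage.load_function("add").unwrap(), "(module)");
    assert!(storage.load_function("missing").is_err());
    println!("ok");
}
```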
Step 4: Implementing the Execution Engine
The execution engine will use wasmtime to dynamically load and execute WebAssembly modules. Create a file called src/executor.rs:
use wasmtime::*;
use serde_json::Value;

pub fn execute(code: &str, function_name: &str, inputs: &[Value]) -> Result<Value, Box<dyn std::error::Error>> {
    let engine = Engine::default();
    // Module::new accepts either WAT text or a binary .wasm payload.
    let module = Module::new(&engine, code)?;
    let mut store = Store::new(&engine, ());
    let instance = Instance::new(&mut store, &module, &[])?;

    let func = instance
        .get_func(&mut store, function_name)
        .ok_or_else(|| format!("Function '{}' not found in module", function_name))?;

    let func_ty = func.ty(&store);
    let params: Vec<_> = func_ty.params().collect();
    let results: Vec<_> = func_ty.results().collect();

    if params.len() != inputs.len() {
        return Err(format!(
            "Function '{}' expected {} arguments, but got {}",
            function_name, params.len(), inputs.len()
        ).into());
    }

    // Marshal JSON inputs into wasm values; only i32 parameters are
    // supported for now, with an explicit range check.
    let mut wasm_inputs = Vec::new();
    for (param, input) in params.iter().zip(inputs.iter()) {
        let value = match (param, input) {
            (ValType::I32, Value::Number(n)) => {
                let n = n.as_i64().ok_or("Expected an integer argument")?;
                Val::I32(i32::try_from(n).map_err(|_| "Argument out of range for i32")?)
            }
            _ => return Err(format!("Unsupported parameter type: {:?}", param).into()),
        };
        wasm_inputs.push(value);
    }

    let mut wasm_results = vec![Val::I32(0); results.len()];
    func.call(&mut store, &wasm_inputs, &mut wasm_results)?;

    if wasm_results.len() > 1 {
        return Err("Multiple return values are not supported yet".into());
    }

    // A function with no results maps to JSON null.
    let result = match wasm_results.first() {
        Some(Val::I32(v)) => Value::Number((*v).into()),
        None => Value::Null,
        _ => return Err("Unsupported return type".into()),
    };
    Ok(result)
}
This module dynamically loads a WebAssembly module, executes the specified function, and returns the result.
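To see the executor in action, you can feed it a hand-written WAT module. This snippet assumes the `wasmtime` and `serde_json` dependencies from Cargo.toml and the `executor` module defined above; wasmtime parses WAT text because its `wat` feature is enabled by default:

```rust
use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // A tiny module exporting `add(i32, i32) -> i32`, written in WAT.
    let wat = r#"
        (module
          (func (export "add") (param i32 i32) (result i32)
            local.get 0
            local.get 1
            i32.add))
    "#;

    // `execute` is the function defined in src/executor.rs above.
    let result = executor::execute(wat, "add", &[json!(2), json!(3)])?;
    assert_eq!(result, Value::Number(5.into()));
    println!("add(2, 3) = {}", result);
    Ok(())
}
```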
Step 5: Implementing the API Layer
The API layer will provide HTTP endpoints for registering and invoking functions. Create a file called src/api.rs:
use warp::Filter;
use std::sync::Arc;
use crate::{executor, storage};
use serde_json::Value;

pub fn server(
    storage: Arc<storage::Storage>,
) -> impl Filter<Extract = impl warp::Reply, Error = warp::Rejection> + Clone {
    // POST /register — store a new function's module source.
    let register = warp::post()
        .and(warp::path("register"))
        .and(warp::body::json())
        .and(with_storage(storage.clone()))
        .and_then(register_function);

    // POST /invoke — run a registered function with JSON inputs.
    let invoke = warp::post()
        .and(warp::path("invoke"))
        .and(warp::body::json())
        .and(with_storage(storage.clone()))
        .and_then(invoke_function);

    register.or(invoke)
}

// Inject the shared storage handle into each request handler.
fn with_storage(
    storage: Arc<storage::Storage>,
) -> impl Filter<Extract = (Arc<storage::Storage>,), Error = std::convert::Infallible> + Clone {
    warp::any().map(move || storage.clone())
}
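The filters above reference two handlers, `register_function` and `invoke_function`. A sketch of what they might look like in `src/api.rs` (the request shapes `{name, code}` and `{name, inputs}`, the JSON response bodies, and the assumption that the exported function shares the registered name are all choices of this sketch, not fixed by the platform):

```rust
use serde::Deserialize;
use serde_json::json;
use warp::Reply;

#[derive(Deserialize)]
struct RegisterRequest {
    name: String,
    code: String, // WAT text for the module
}

#[derive(Deserialize)]
struct InvokeRequest {
    name: String,
    inputs: Vec<Value>,
}

async fn register_function(
    req: RegisterRequest,
    storage: Arc<storage::Storage>,
) -> Result<impl Reply, warp::Rejection> {
    // Persist the module source under its name.
    match storage.save_function(req.name, req.code) {
        Ok(()) => Ok(warp::reply::json(&json!({ "status": "registered" }))),
        Err(e) => Ok(warp::reply::json(&json!({ "error": e.to_string() }))),
    }
}

async fn invoke_function(
    req: InvokeRequest,
    storage: Arc<storage::Storage>,
) -> Result<impl Reply, warp::Rejection> {
    // Load the stored module, then execute the export that shares its name.
    let result = storage
        .load_function(&req.name)
        .map_err(|e| e.to_string())
        .and_then(|code| {
            executor::execute(&code, &req.name, &req.inputs).map_err(|e| e.to_string())
        });
    match result {
        Ok(value) => Ok(warp::reply::json(&json!({ "result": value }))),
        Err(e) => Ok(warp::reply::json(&json!({ "error": e }))),
    }
}
```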


