flag compilations for ONNX runtime (#5)
* flag compilations for ONNX runtime

* removing main.rs

* removing the verbose argument

* inspecting github actions

* inspecting the build

* removing the delete in the environment

* adding writes depending on OS

* removing the ORT_DYLIB_PATH env var after the load

* adding writes depending on OS

* altering binary builds

* reinserting test pipeline

* documentation added and ready for deployment
maxwellflitton authored Dec 20, 2023
1 parent f0b0db3 commit efab958
Showing 24 changed files with 377 additions and 50 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/surrealml_core_test.yml
@@ -21,4 +21,4 @@ jobs:
override: true

- name: Run Unit Tests
run: cd modules/utils && cargo test --verbose
run: cd modules/utils && cargo test
2 changes: 1 addition & 1 deletion .github/workflows/surrealml_test.yml
@@ -21,4 +21,4 @@ jobs:
override: true

- name: Run Unit Tests
run: cargo test --verbose
run: cargo test
3 changes: 3 additions & 0 deletions README.md
@@ -1,6 +1,9 @@
# SurrealMl
This package stores machine learning models with metadata in Rust so they can be used on the SurrealDB server.

## Compilation config
By default, the crate compiles the ONNX runtime into the binary. If you would rather use an ONNX runtime that is already installed on your system, set the environment variable `ONNXRUNTIME_LIB_PATH` before compiling the crate, and the crate will use the system runtime instead.
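For example, on a Unix-like system the build could be pointed at a system runtime as sketched below. The library path is an assumption for illustration only; use wherever your ONNX runtime shared library actually lives.

```shell
# Point the build at a system-installed ONNX runtime library.
# The path below is illustrative, not prescriptive.
export ONNXRUNTIME_LIB_PATH=/usr/local/lib/libonnxruntime.so

# Build the crate; build.rs detects the variable and skips embedding the runtime.
cargo build --release
```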

## Quick start with Sk-learn

Sk-learn models can also be converted and stored in the `.surml` format enabling developers to load them in any
9 changes: 0 additions & 9 deletions modules/onnx_driver/Cargo.toml

This file was deleted.

3 changes: 0 additions & 3 deletions modules/onnx_driver/src/main.rs

This file was deleted.

9 changes: 9 additions & 0 deletions modules/utils/.dockerignore
@@ -0,0 +1,9 @@
.idea/
builds/
onnx_driver/
target/
tests/
output/
LICENSE
README.md
Cargo.lock
3 changes: 3 additions & 0 deletions modules/utils/.gitignore
@@ -0,0 +1,3 @@
onnx_driver/
target/
output/
2 changes: 1 addition & 1 deletion modules/utils/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "surrealml-core"
version = "0.0.2"
version = "0.0.3"
edition = "2021"
build = "./build.rs"
description = "The core machine learning library for SurrealML that enables SurrealDB to store and load ML models"
101 changes: 96 additions & 5 deletions modules/utils/README.md
@@ -1,8 +1,99 @@
# Surml Core

The ONNX runtime is embedded directly in the Rust binary at compile time, so there is no need to install the ONNX runtime separately or to worry about version clashes with other runtimes.

This package provides shared structs and functions for both the inference and ML clients. It contains the `onnxruntime`, the computation adapters, and the storage mechanisms for loading and saving models with a header.
This crate is the Rust implementation of the Surml API; if you are running a Rust server, it is advised to use this crate directly. Note that the ONNX version must match the client's when using this crate. For the current version of Surml, the ONNX version is `1.16.0`.

This package can be used by the Python `surrealml` package, or directly in a Rust server.
## Compilation config
By default, the crate compiles the ONNX runtime into the binary. If you would rather use an ONNX runtime that is already installed on your system, set the environment variable `ONNXRUNTIME_LIB_PATH` before compiling the crate, and the crate will use the system runtime instead.

## Nix Support
At this point in time, Nix is not directly supported: `ONNXRUNTIME_LIB_PATH` needs to be defined, as explained in the `Compilation config` section.
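A sketch of how this might look on a Nix system follows. The `onnxruntime` attribute name and the library filename are assumptions about your nixpkgs pin, not guarantees; verify both against your channel before relying on them.

```shell
# Build the nixpkgs onnxruntime package and export its shared library
# for the crate's build script. Attribute and filename are assumptions.
export ONNXRUNTIME_LIB_PATH="$(nix-build '<nixpkgs>' -A onnxruntime --no-out-link)/lib/libonnxruntime.so"

cargo build --release
```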

## Usage
Surml can be used to store, load, and execute ONNX models.

### Storing and accessing models
We can store models, along with metadata about them, with the following code:
```rust
use std::fs::File;
use std::io::{self, Read, Write};

use surrealml_core::storage::surml_file::SurMlFile;
use surrealml_core::storage::header::Header;
use surrealml_core::storage::header::normalisers::{
    wrapper::NormaliserType,
    linear_scaling::LinearScaling
};


// load your own model here (surrealml python package can be used to convert PyTorch,
// and Sklearn models to ONNX or package them as surml files)
let mut file = File::open("./stash/linear_test.onnx").unwrap();
let mut model_bytes = Vec::new();
file.read_to_end(&mut model_bytes).unwrap();

// create a header for the model
let mut header = Header::fresh();
header.add_column(String::from("squarefoot"));
header.add_column(String::from("num_floors"));
header.add_output(String::from("house_price"), None);

// add normalisers if needed
header.add_normaliser(
    "squarefoot".to_string(),
    NormaliserType::LinearScaling(LinearScaling { min: 0.0, max: 1.0 })
);
header.add_normaliser(
    "num_floors".to_string(),
    NormaliserType::LinearScaling(LinearScaling { min: 0.0, max: 1.0 })
);

// create a surml file
let surml_file = SurMlFile::new(header, model_bytes);

// read and write surml files
surml_file.write("./stash/test.surml").unwrap();
let new_file = SurMlFile::from_file("./stash/test.surml").unwrap();
let file_from_bytes = SurMlFile::from_bytes(surml_file.to_bytes()).unwrap();
```

## Executing models

When you load a `surml` file, you can execute the model with the following code:

```rust
use surrealml_core::storage::surml_file::SurMlFile;
use surrealml_core::execution::compute::ModelComputation;
use ndarray::ArrayD;
use std::collections::HashMap;


let mut file = SurMlFile::from_file("./stash/test.surml").unwrap();

let compute_unit = ModelComputation {
    surml_file: &mut file,
};

// inputs are automatically mapped and normalisers applied if this data was put in the header
let mut input_values = HashMap::new();
input_values.insert(String::from("squarefoot"), 1000.0);
input_values.insert(String::from("num_floors"), 2.0);

let output = compute_unit.buffered_compute(&mut input_values).unwrap();

// feed a raw ndarray into the model if no header was provided or if you want to bypass the header
let x = vec![1000.0, 2.0];
let data: ArrayD<f32> = ndarray::arr1(&x).into_dyn();

// the `None` argument can instead be a tuple giving the dimensions of the input data
let output = compute_unit.raw_compute(data, None).unwrap();
```

## ONNX runtime assets

The ONNX runtime assets can be found at the following link:

```
https://github.com/microsoft/onnxruntime/releases/tag/v1.16.2
```
64 changes: 48 additions & 16 deletions modules/utils/build.rs
@@ -2,23 +2,55 @@ use std::process::Command;

fn main() {

let _ = Command::new("sh")
.arg("-c")
.arg("cargo new onnx_driver && cd onnx_driver && echo 'ort = \"1.16.2\"' >> Cargo.toml
")
.status()
.expect("failed to execute process");
match std::env::var("ONNXRUNTIME_LIB_PATH") {
Ok(_) => {
println!("cargo:rustc-cfg=onnx_runtime_env_var_set");
},
Err(_) => {
#[cfg(not(windows))]
{
let _ = Command::new("sh")
.arg("-c")
.arg("cargo new onnx_driver && cd onnx_driver && echo 'ort = \"1.16.2\"' >> Cargo.toml
")
.status()
.expect("failed to execute process");
}

let _ = Command::new("sh")
.arg("-c")
.arg("cd onnx_driver && cargo build")
.status()
.expect("failed to execute process");
#[cfg(windows)]
{
// let _ = Command::new("cmd")
// .args(&["/C", "cargo new onnx_driver && cd onnx_driver && echo ort = \"1.16.2\" >> Cargo.toml"])
// .status()
// .expect("failed to execute process");
let _ = Command::new("powershell")
.arg("-Command")
.arg("cargo new onnx_driver; Set-Location onnx_driver; Add-Content -Path .\\Cargo.toml -Value 'ort = \"1.16.2\"'")
.status()
.expect("failed to execute process");
}

// let _ = Command::new("sh")
// .arg("-c")
// .arg("cd ../onnx_driver && cargo build")
// .status()
// .expect("failed to execute process");
#[cfg(not(windows))]
{
let _ = Command::new("sh")
.arg("-c")
.arg("cd onnx_driver && cargo build")
.status()
.expect("failed to execute process");
}

#[cfg(windows)]
{
let _ = Command::new("cmd")
.args(&["/C", "cd onnx_driver && cargo build"])
.status()
.expect("failed to execute process");
}
}
}
}

// fn main() {
// // println!("cargo:rustc-cfg=onnx_runtime_env_var_set");
// println!("test");
// }
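The `cargo:rustc-cfg=onnx_runtime_env_var_set` line emitted by this build script lets downstream code branch on the flag at compile time. A minimal sketch of that mechanism follows; `load_strategy` is a hypothetical helper written only to illustrate how the cfg gate behaves, not a function in this crate.

```rust
// Compile-time branch on the cfg flag emitted by build.rs.
// `load_strategy` is illustrative only; the crate does not define it.
#[cfg(onnx_runtime_env_var_set)]
fn load_strategy() -> &'static str {
    "system ONNX runtime via ONNXRUNTIME_LIB_PATH"
}

#[cfg(not(onnx_runtime_env_var_set))]
fn load_strategy() -> &'static str {
    "ONNX runtime embedded at build time"
}

fn main() {
    // When ONNXRUNTIME_LIB_PATH is unset, build.rs does not emit the cfg,
    // so the embedded-runtime branch is the one that gets compiled in.
    println!("{}", load_strategy());
}
```

Because the choice is made at compile time, the unused branch never reaches the final binary.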
17 changes: 17 additions & 0 deletions modules/utils/builds/Dockerfile.linux
@@ -0,0 +1,17 @@
# Start from a base image, e.g., Ubuntu
FROM ubuntu:latest

RUN apt-get update && apt-get install -y curl build-essential
RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y

ENV PATH="/root/.cargo/bin:${PATH}"

WORKDIR /app

COPY . .

# RUN cargo build --release

# CMD ["cargo", "test"]
# run in infinite loop
CMD ["tail", "-f", "/dev/null"]
Empty file.
21 changes: 21 additions & 0 deletions modules/utils/builds/Dockerfile.nix
@@ -0,0 +1,21 @@
# Start from a base image, e.g., Ubuntu
FROM nixos/nix:latest

# Update Nix channel
RUN nix-channel --update

# Install Rust and build tools using Nix
RUN nix-env -iA nixpkgs.rustup nixpkgs.gcc nixpkgs.pkg-config nixpkgs.cmake nixpkgs.coreutils

# Initialize Rust environment
RUN rustup default stable

ENV PATH="/root/.cargo/bin:${PATH}"

WORKDIR /app

COPY . .

# RUN cargo build --release

CMD ["cargo", "run"]
37 changes: 37 additions & 0 deletions modules/utils/builds/Dockerfile.windows
@@ -0,0 +1,37 @@
# # Use a Windows base image
# FROM mcr.microsoft.com/dotnet/core/sdk:2.1

# # Install Rust
# RUN powershell -Command \
# $ErrorActionPreference = 'Stop'; \
# [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; \
# Invoke-WebRequest https://win.rustup.rs -OutFile rustup-init.exe; \
# Start-Process ./rustup-init.exe -ArgumentList '-y' -Wait; \
# Remove-Item rustup-init.exe

# # Add Cargo to PATH
# ENV PATH="C:\\Users\\ContainerAdministrator\\.cargo\\bin;${PATH}"

# WORKDIR /app
# COPY . .

# # Command to run on container start
# CMD ["cargo", "run"]

# Use the latest Windows Server Core image
FROM mcr.microsoft.com/windows:ltsc2019

# Set the working directory to C:\app
WORKDIR C:\app

# Install Rust
RUN powershell.exe -Command "Invoke-WebRequest https://win.rustup.rs -OutFile rustup-init.exe; .\rustup-init.exe -y"

# Add Rust to the PATH environment variable
RUN setx /M PATH $('C:\Users\ContainerAdministrator\.cargo\bin;' + $Env:PATH)

# Copy the source code into the container
COPY . .

# Run the application
CMD ["cargo", "run"]
15 changes: 15 additions & 0 deletions modules/utils/builds/docker_configs/linux.yml
@@ -0,0 +1,15 @@
version: "3.8"

services:
linux_surrealml_core:
build:
context: .
dockerfile: builds/Dockerfile.linux
restart: unless-stopped
# command: tail -f /dev/null
environment:
TEST: test_env
volumes:
- ./output/linux:/app/output
ports:
- "8001:8001"
12 changes: 12 additions & 0 deletions modules/utils/builds/docker_configs/macos.yml
@@ -0,0 +1,12 @@
version: "3.8"

services:
surrealml_core:
build:
context: .
dockerfile: builds/Dockerfile.macos
restart: unless-stopped
environment:
TEST: test_env
ports:
- "8001:8001"
13 changes: 13 additions & 0 deletions modules/utils/builds/docker_configs/nix.yml
@@ -0,0 +1,13 @@
version: "3.8"

services:
nix_surrealml_core:
build:
context: .
dockerfile: builds/Dockerfile.nix
restart: unless-stopped
command: tail -f /dev/null
environment:
TEST: test_env
ports:
- "8001:8001"
13 changes: 13 additions & 0 deletions modules/utils/builds/docker_configs/windows.yml
@@ -0,0 +1,13 @@
version: "3.8"

services:
windows_surrealml_core:
build:
context: .
dockerfile: builds/Dockerfile.windows
restart: unless-stopped
command: tail -f /dev/null
environment:
TEST: test_env
ports:
- "8001:8001"
8 changes: 8 additions & 0 deletions modules/utils/docker-compose.yml
@@ -0,0 +1,8 @@
version: '3.8'

services:

busybox:
image: busybox
# command: tail -f /dev/null
command: echo "Hello World"
12 changes: 12 additions & 0 deletions modules/utils/scripts/linux_compose.sh
@@ -0,0 +1,12 @@
#!/usr/bin/env bash

# navigate to directory
SCRIPTPATH="$( cd "$(dirname "$0")" ; pwd -P )"
cd $SCRIPTPATH

cd ..

# compose_command=$1

# docker-compose -f docker-compose.yml -f aarch.yml $1
docker-compose -f docker-compose.yml -f builds/docker_configs/linux.yml $1