- Terraform
- AWS Lambda
- Docker
- Bash Scripting
This project demonstrates a simple way to run Ollama on AWS Lambda, leveraging serverless computing while staying within the AWS Free Tier.
Please do not use this method for production use cases, as inferencing on AWS Lambda is still slow. It is, however, a great way to run Ollama on AWS Lambda for testing or learning purposes.
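Since the stack combines Docker, Terraform, and Lambda, the function is presumably deployed as a container image. The sketch below is only illustrative of that pattern; the function name, image URI, IAM role, memory, and timeout are assumptions, not values taken from this repository.

```hcl
# Illustrative only: the rough shape of a container-image Lambda function.
# None of these names or values come from this repository.
resource "aws_lambda_function" "ollama" {
  function_name = "ollama-on-lambda"                                           # hypothetical name
  package_type  = "Image"
  image_uri     = "123456789012.dkr.ecr.us-east-1.amazonaws.com/ollama:latest" # placeholder ECR image
  role          = aws_iam_role.ollama_exec.arn

  # Ollama needs plenty of memory and time; Lambda's ceilings are 10,240 MB and 900 s.
  memory_size = 10240
  timeout     = 300

  ephemeral_storage {
    size = 10240 # MB of /tmp, e.g. for model files
  }
}

# Minimal execution role the function assumes (also illustrative).
resource "aws_iam_role" "ollama_exec" {
  name = "ollama-lambda-exec"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}
```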
```bash
git clone https://github.com/InspectorGadget/ollama-on-lambda.git
```
- Update `backend.tf` with your S3 bucket name (see the sketch after this list).
- Run `terraform init`.
- Verify the changes that would be performed on your environment (e.g. by running `terraform plan`).
- Run `terraform apply`.
- Wait for the changes to be applied.
- Perform inferencing with your newly deployed Ollama instance.
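For reference, a `backend.tf` that stores Terraform state in S3 generally looks like the sketch below. The bucket name, key, and region shown are placeholders to replace with your own values, not values from this repository.

```hcl
terraform {
  backend "s3" {
    bucket = "your-terraform-state-bucket"        # replace with your S3 bucket name
    key    = "ollama-on-lambda/terraform.tfstate" # placeholder state object key
    region = "us-east-1"                          # region where the bucket lives
  }
}
```

After editing the file, `terraform init` configures the backend and downloads the AWS provider before you plan and apply.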