
Running Locally? #45

Open
JimLynchCodes opened this issue Feb 22, 2022 · 1 comment

@JimLynchCodes

Hey! 👋

Nice project! The Artillery load tests are a cool addition, too. 👍

I am wondering if it's possible to run this project locally.

I think something like serverless-offline would be a nice way to "host the lambdas locally"...

However, when I try to use it, I get an error saying the Rust runtime is not supported...

Any ideas for this? How can people productively develop something using this project?

Thanks!

@nmoutschen
Contributor

Hey @JimLynchCodes!

If I remember correctly, you're using the Rust plugin for the Serverless Framework, right? If that's the case, there might be interference between that plugin and serverless-offline.

I also see that serverless-offline only supports the Node, Python, Ruby, and Go runtimes at the moment. That said, once you've compiled your Lambda functions, any tool that runs Lambda functions locally should work as long as it supports provided.al2. For example, sam local start-api works fine when I try it with a SAM project.
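
For reference, here's a minimal sketch of what that could look like with SAM. This is a hypothetical template, not this repo's actual configuration; the function name, CodeUri path, and API route are placeholders:

```yaml
# template.yaml (hypothetical) - Rust function on the provided.al2 custom runtime.
# With a custom runtime the compiled binary must be named `bootstrap` and sit at
# the root of CodeUri; the Handler value is effectively ignored.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  HelloRustFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: provided.al2
      Handler: bootstrap
      CodeUri: ./build/hello-rust/   # directory containing the compiled `bootstrap` binary
      Events:
        HelloApi:
          Type: Api
          Properties:
            Path: /hello
            Method: get
```

After cross-compiling the function for Linux and placing the binary in that directory as `bootstrap`, `sam local start-api` should serve the route in a local Lambda-like container, and `sam local invoke HelloRustFunction` should work for direct invocations.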
