diff --git a/README.md b/README.md
index 58b8623..24c7637 100644
--- a/README.md
+++ b/README.md
@@ -1,21 +1,25 @@
-# OmniLog
+# OmniLog
## Understand your LLM prompts, Empowered by Generative AI
-
-
+
+
# Overview
-Omnilog allows you to easily monitor your LLM project!
+[![GitHub last commit](https://img.shields.io/github/last-commit/Theodo-UK/OmniLog)](https://github.com/Theodo-UK/OmniLog/commits)
+[![Downloads](https://static.pepy.tech/badge/omnilogger)](https://www.pepy.tech/projects/omnilogger)
+
+OmniLog allows you to easily monitor your LLM project!
+
1. Simply run our initialisation script on your AWS account
-2. Start our listener in our python project,
-3. View your logs and analytics in your privately deployed web app!
+2. Start our listener in your Python project
+3. View your logs and analytics in your privately deployed web app!
![image](https://github.com/Theodo-UK/OmniLog/assets/57725347/a494d81d-dab1-4836-8922-efec380c5812)
-(For developer setup information, refer to the Developer Info section at the bottom of the page)
+(For developer setup information, refer to the [Help us by contributing](#help-us-by-contributing) section at the bottom of the page)
# Quickstart
@@ -24,10 +28,11 @@ Get started with OmniLog in five easy steps:
1. Ensure you have an AWS account with a user that has the necessary permissions to bootstrap our project into your AWS. [You can check & follow the steps here to do that](/docs/aws_setup.md).
2. Configure a Postgres database:
+
- We recommend [Neon](https://neon.tech/) for a simple setup with a very generous free tier. You will need the connection string later (it can be found on the homepage after creating your Neon project), e.g.:
-
+
3. Clone our repository:
@@ -37,6 +42,7 @@ git clone https://github.com/Theodo-UK/OmniLog.git
```
4. Bootstrap the project onto your AWS by running the following script:
+
```bash
bash ./init.sh
```
@@ -45,7 +51,8 @@ bash ./init.sh
6. Run the `init.sh` script again, this time skipping the `.env` configuration, to deploy your web app properly.
-7. Use the Python SDK inside your LLM project:
+7. Use the [Python SDK](https://pypi.org/project/omnilogger/) inside your LLM project:
+
```python
from omnilogger import start_openai_listener
@@ -55,20 +62,31 @@ start_openai_listener()
...
```
-That's it! You should now be able to see your logs at the AWS URI given from `init.sh`. See the [sdk-python docs](/sdk-python/README.md) for more details.
+
+That's it! You should now be able to see your logs at the AWS URI provided by `init.sh`. See the [sdk-python docs](/sdk-python/README.md) for more details. To share your project with others, see the [Adding new users](/docs/create_user.md) guide.
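+
+As an illustration, here is a slightly fuller sketch of step 7. The OpenAI call is a hypothetical example using the pre-1.0 `openai` client; substitute your project's own model calls:
+
+```python
+from omnilogger import start_openai_listener
+import openai  # hypothetical example call; adapt to your own OpenAI usage
+
+# Start the listener once, before making any OpenAI calls
+start_openai_listener()
+
+# With the listener running, OpenAI calls made by your project should be captured by OmniLog
+response = openai.ChatCompletion.create(
+    model="gpt-3.5-turbo",
+    messages=[{"role": "user", "content": "Hello, OmniLog!"}],
+)
+```
+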
# Removing the AWS resources
If you want to remove the AWS resources, simply run this script:
+
```bash
./teardown.sh
```
->⚠️ REMOVAL POLICY:
-By default, AWS does not remove resources like S3 buckets or DynamoDB tables. You will need to modify these manually via the AWS console.
+> ⚠️ REMOVAL POLICY:
+> By default, AWS does not remove resources like S3 buckets or DynamoDB tables. You will need to remove these manually via the AWS console.
+
+# Help us by contributing
+
+[![Build Status](https://github.com/Theodo-UK/OmniLog/workflows/app-cd/badge.svg)](https://github.com/Theodo-UK/OmniLog/actions)
+[![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/Theodo-UK/OmniLog/pulls)
+
+We welcome contributions from the community to help improve OmniLog. You can contribute in the following ways:
+
+- Contribute to the Web App by setting up a development environment as outlined in the [Dev-setup](/front/README.md) guide.
+- Contribute to the `omnilogger` pip package by following the guidelines provided in the [Python SDK](/sdk-python/README.md#contributing) documentation.
+
+Your contributions are valuable and greatly appreciated! If you have ideas for improvements, feature requests, or bug reports, please don't hesitate to open an issue or create a pull request.
-# Developer Info
-### Table of Contents
-1. [Dev setup - front](/front/README.md)
-2. [Python SDK](/sdk-python/README.md)
-3. [Adding new users](/docs/create_user.md)
+Thank you for your support in making OmniLog even better!
diff --git a/front/README.md b/front/README.md
index 46f7497..fe6366d 100644
--- a/front/README.md
+++ b/front/README.md
@@ -1,12 +1,12 @@
This is a [Next.js](https://nextjs.org/) project bootstrapped with [`create-next-app`](https://github.com/vercel/next.js/tree/canary/packages/create-next-app).
-## Getting Started
+# Getting Started
This project uses [sst](https://docs.sst.dev/what-is-sst), so we're using the SST local development flow [described here](https://docs.sst.dev/live-lambda-development).
These steps are for devs who want to contribute to OmniLog; if you just want to use OmniLog, follow the instructions [here](https://github.com/Theodo-UK/OmniLog#quickstart).
-### 1. Get Next.js app running locally
+## 1. Get Next.js app running locally
If you are developing the website without the external resources, e.g. using dummy data, and don't want to set up the AWS stack, use this option.
@@ -15,31 +15,34 @@ If you are developing the website without the external resources, e.g using dumm
If you click the localhost URL in the console, you should see the Next.js app running. If you are on a route which relies on external resources, you will encounter errors because it is not hooked up to your AWS lambdas; follow the next steps to get that set up!
-### 2. Set up local AWS credentials
+## 2. Set up local AWS credentials
Having AWS credentials on your local machine is required to use SST.
You can see the [steps required to add AWS credentials here.](/docs/aws_setup.md)
-### 3. Add database URI to .env
+## 3. Add database URI to .env
Create the following env files:
+
```
// .env (used for yarn seed)
DATABASE_URL=?pgbouncer=true
NEXTAUTH_SECRET=secret
NEXTAUTH_URL=http://localhost:3000
```
+
```
-// .env.development (used for yarn sst dev)
+// .env.development (used for yarn dev_sst)
AWS_PROFILE_NAME=
SST_STAGE_NAME=-dev
DATABASE_URL=?pgbouncer=true
NEXTAUTH_SECRET=secret
NEXTAUTH_URL=http://localhost:3000
```
+
```
-// .env.production (used for yarn sst deploy)
+// .env.production (used for yarn deploy)
AWS_PROFILE_NAME=
SST_STAGE_NAME=staging
DATABASE_URL=?pgbouncer=true
@@ -49,7 +52,7 @@ NEXTAUTH_URL=<>
Note that `?pgbouncer=true` is required at the end of DATABASE_URL ([see issue](https://github.com/prisma/prisma/issues/11643#issuecomment-1034078942)):
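+
+For illustration, a hypothetical connection string with the flag appended would look like this (host, credentials and database name are placeholders):
+
+```
+DATABASE_URL=postgresql://user:password@your-db-host/your-db?pgbouncer=true
+```
+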
-### 4. Build Next.js app into Lambdas and deploy them locally
+## 4. Build Next.js app into Lambdas and deploy them locally
- `yarn dev_sst` to start the Live Lambda Development environment.
- This command does the following:
@@ -57,13 +60,15 @@ Note that `?pgbouncer=true` is required at the end of DATABASE_URL ([see issue](
- Builds the Next.js app into lambda functions,
- and deploys them to the local Lambda environment
-### 6. Bind Next.js app to local Lambda environment so that it can invoke AWS resources
+## 6. Bind Next.js app to local Lambda environment so that it can invoke AWS resources
- `yarn dev` to bind the Next.js app to sst, which allows it to invoke AWS resources.
- This command does the following:
- Starts the Next.js app at localhost
- Binds the Next.js app to the local Lambda environment (therefore allowing it to use AWS resources)
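+
+Putting the two commands together, a typical local development session looks something like this (run in separate terminals; installing dependencies first with `yarn` is an assumed prerequisite):
+
+```bash
+# Terminal 1: start the Live Lambda Development environment
+yarn dev_sst
+
+# Terminal 2: start the Next.js app, bound to the local Lambda environment
+yarn dev
+```
+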
-## Deploying to staging
+# Deploying to staging
+
+This command deploys the local code straight to staging (without going through a PR). We discourage its use, but it can be useful for debugging.
- `yarn deploy`