
feat: Prompt ChatGPT function #8

Merged · 16 commits · Jul 29, 2023
130 changes: 130 additions & 0 deletions node/prompt-chatgpt/.gitignore
@@ -0,0 +1,130 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*

# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage
*.lcov

# nyc test coverage
.nyc_output

# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# Bower dependency directory (https://bower.io/)
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release

# Dependency directories
node_modules/
jspm_packages/

# Snowpack dependency directory (https://snowpack.dev/)
web_modules/

# TypeScript cache
*.tsbuildinfo

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Optional stylelint cache
.stylelintcache

# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local

# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache

# Next.js build output
.next
out

# Nuxt.js build / generate output
.nuxt
dist

# Gatsby files
.cache/
# Comment in the public line if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public

# vuepress build output
.vuepress/dist

# vuepress v2.x temp and cache directory
.temp
.cache

# Docusaurus cache and generated files
.docusaurus

# Serverless directories
.serverless/

# FuseBox cache
.fusebox/

# DynamoDB Local files
.dynamodb/

# TernJS port file
.tern-port

# Stores VSCode versions used for testing VSCode extensions
.vscode-test

# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*
6 changes: 6 additions & 0 deletions node/prompt-chatgpt/.prettierrc.json
@@ -0,0 +1,6 @@
{
"trailingComma": "es5",
"tabWidth": 2,
"semi": true,
"singleQuote": true
}
69 changes: 69 additions & 0 deletions node/prompt-chatgpt/README.md
@@ -0,0 +1,69 @@
# ⚡ OpenAI ChatGPT Function

Query the OpenAI GPT-3.5-turbo model for chat completions.

## 🧰 Usage

### `GET`

HTML form for interacting with the model.

### `POST`

Query the model for a completion.

**Parameters**

| Name | Description | Location | Type | Sample Value |
| ---- | ------------------------ | -------- | ------ | ----------------------------- |
| N/A | Text to prompt the model | Body | String | `Write a haiku about Mondays` |

Sample `200` Response:

Response from the model.

```text
Monday's heavy weight,
Dawning with a sigh of grey,
Hopeful hearts await.
```

Sample `400` Response:

Response when the request body is missing.

```text
Missing body with a prompt.
```
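The `POST` usage above can be sketched as a plain-text request. This is illustrative only: the endpoint URL is a placeholder (it depends on where the function is deployed), and the body is the raw prompt string, per the parameters table.

```javascript
// Hypothetical deployment URL: replace with your function's actual domain.
const endpoint = 'https://<your-function-url>/';

// Per the parameters table, the body is the raw prompt string.
const request = {
  method: 'POST',
  headers: { 'Content-Type': 'text/plain' },
  body: 'Write a haiku about Mondays',
};

// Uncomment to query a deployed function:
// const response = await fetch(endpoint, request);
// console.log(await response.text());
```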

## ⚙️ Configuration

| Setting | Value |
| ----------------- | ------------- |
| Runtime | Node (18.0) |
| Entrypoint | `src/main.js` |
| Build Commands | `npm install` |
| Permissions | `any` |
| Timeout (Seconds) | 15 |

## 🔒 Environment Variables

### OPENAI_API_KEY

A unique key used to authenticate with the OpenAI API. Please note that this is a paid service and you will be charged for each request made to the API. For more information, see the [OpenAI pricing page](https://openai.com/pricing/).

| Question | Answer |
| ------------- | --------------------------------------------------------------------------- |
| Required | Yes |
| Sample Value | `d1efb...aec35` |
| Documentation | [OpenAI Docs](https://platform.openai.com/docs/quickstart/add-your-api-key) |

### OPENAI_MAX_TOKENS

The maximum number of tokens the OpenAI response may contain. Each OpenAI model has a combined limit on prompt and completion tokens per API call, which varies by model; for GPT-3.5-turbo the limit is 4,096 tokens.

| Question | Answer |
| ------------- | ------------------------------------------------------------------------------------------------------------- |
| Required | No |
| Sample Value | `512` |
| Documentation | [OpenAI: What are tokens?](https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them) |
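For orientation, here is a minimal sketch of how these two variables might feed into the `openai` v3 SDK this PR depends on (`openai@^3.3.0`). The actual `src/main.js` is not shown in this excerpt, so treat the wiring as an assumption, not the PR's implementation:

```javascript
// Assumed wiring (sketch): OPENAI_MAX_TOKENS caps the completion length,
// falling back to 512 when the variable is unset.
const maxTokens = Number.parseInt(process.env.OPENAI_MAX_TOKENS ?? '512', 10);

// Request payload for a single-turn chat completion with gpt-3.5-turbo.
const payload = {
  model: 'gpt-3.5-turbo',
  max_tokens: maxTokens,
  messages: [{ role: 'user', content: 'Write a haiku about Mondays' }],
};

// With openai@^3.3.0 this would be sent as follows (requires a valid
// OPENAI_API_KEY, and each call is billed):
// import { Configuration, OpenAIApi } from 'openai';
// const openai = new OpenAIApi(
//   new Configuration({ apiKey: process.env.OPENAI_API_KEY })
// );
// const { data } = await openai.createChatCompletion(payload);
// console.log(data.choices[0].message.content);
```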
126 changes: 126 additions & 0 deletions node/prompt-chatgpt/package-lock.json


17 changes: 17 additions & 0 deletions node/prompt-chatgpt/package.json
@@ -0,0 +1,17 @@
{
"name": "prompt-chatgpt",
"version": "1.0.0",
"description": "",
"main": "src/main.js",
"type": "module",
"scripts": {
"format": "prettier --write ."
},
"keywords": [],
"dependencies": {
"openai": "^3.3.0"
},
"devDependencies": {
"prettier": "^3.0.0"
}
}
35 changes: 35 additions & 0 deletions node/prompt-chatgpt/src/environment.js
@@ -0,0 +1,35 @@
/**
* @param {string} key
* @return {string}
*/
function getRequiredEnv(key) {
const value = process.env[key];
if (value === undefined) {
throw new Error(`Environment variable ${key} is not set`);
}
return value;
}

/**
* @param {string} key
* @return {number | undefined}
*/
function getNumberEnv(key) {
  const value = process.env[key];
  if (value === undefined) {
    return undefined;
  }

  // parseInt never throws on bad input; it returns NaN,
  // so detect invalid values with Number.isNaN instead of try/catch.
  const parsed = Number.parseInt(value, 10);
  if (Number.isNaN(parsed)) {
    throw new Error(`Environment variable ${key} is not a number`);
  }
  return parsed;
}

class EnvironmentService {
OPENAI_API_KEY = getRequiredEnv('OPENAI_API_KEY');
OPENAI_MAX_TOKENS = getNumberEnv('OPENAI_MAX_TOKENS') ?? 512;
}

export default EnvironmentService;
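Worth noting for `getNumberEnv`: `Number.parseInt` returns `NaN` rather than throwing on non-numeric input, so invalid values must be detected explicitly with `Number.isNaN`. A self-contained sketch of that pattern (a generic helper, not the PR's code):

```javascript
// parseInt returns NaN on non-numeric input (it does not throw),
// so a try/catch around it alone would never fire.
function parseNumber(value) {
  if (value === undefined) return undefined;
  const parsed = Number.parseInt(value, 10);
  if (Number.isNaN(parsed)) {
    throw new Error(`Not a number: ${value}`);
  }
  return parsed;
}

console.log(parseNumber('512')); // 512
console.log(parseNumber(undefined)); // undefined
```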