Starter kit application for AI-driven projects, written in TypeScript and using Fastify with MongoDB, Apache Kafka, Jest testing and much more configured out of the box.
Node TypeScript AI Starter is a starter kit for AI-driven applications, written in TypeScript and using Fastify as its web framework. It comes with the following features configured out of the box:
- Domain Driven Design (DDD) structure
- Server configuration with enabled CORS
- dotenv to handle environment variables
- MongoDB integration
- Domain Events via Apache Kafka
- Testing via Jest
- Modern `.eslintrc` configuration
- Chat endpoint powered by OpenAI and MemoryBuffer
- Endpoints to ingest documents and query them, powered by OpenAI + hnswlib/Redis
- Installation
- Development
- Running the Application
- Configuration
- Features
- Testing
- Contribution Guidelines
- License
Installation

Be sure to use Node.js v18 or greater.

1. Clone the repo;
2. Either
   - copy the `env.example` file into a new `.env` file and fill in the variables, or
   - directly set up the needed environment variables.
3. Run `npm install` to install the needed dependencies.
4. (Optional) If you want to remove the AI endpoints and logic for the AI-powered chat, follow these steps:
   - Run `npm run clear-ai`
   - Delete the unused lines in `src/api/index.ts`
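The variables you fill in might look like this in a `.env` file (all values below are placeholders for illustration; the full list of supported variables is documented in the Configuration section):

```
# Example .env — placeholder values only
NODE_ENV=development
PORT=3010
MONGO_URI=mongodb://localhost:27017/node-ts-ai-starter_development
OPENAI_API_KEY=your-openai-api-key
OPEN_AI_MODEL=gpt-3.5-turbo
VECTOR_STORE=hnswlib
ENABLE_MESSAGE_BROKER=false
```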
Development

For local development Node TypeScript AI Starter is shipped with a docker-compose file. Simply run `docker-compose up` (or `TMPDIR=/private$TMPDIR docker-compose up` on macOS) to spin up the containers.
Then run `npm run debug` to debug the application, or `npm run watch` to run the debugger with Nodemon.
Running the Application

Node TypeScript AI Starter comes with a Dockerfile out of the box. To run the containers locally, make sure Docker is installed and run:

- (Optional) `docker-compose up` (or `TMPDIR=/private$TMPDIR docker-compose up` on macOS) to run the docker-compose with the needed dependencies;
- `docker build -t node-ts-ai-starter .` to build the container;
- `docker run -dp 3010:3010 node-ts-ai-starter` to run it. The application will be served on port 3010.

Alternatively, you can build the application directly via `npm run build` and then run it with `npm start`.
Configuration

Main Application

- `NODE_ENV`: set the environment name. Default is `development`;
- `PORT`: port the server will be available on;
- `MONGO_URI`: set the complete MongoDB connection string. Default `mongodb://localhost:27017/node-ts-ai-starter_<NODE_ENV>`, where `<NODE_ENV>` is the `NODE_ENV` env variable value;
- `DEBUG_MODE`: set the value to `'1'` to run the application in Debug Mode, i.e. maximum logging.
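To illustrate how the `MONGO_URI` default is derived from `NODE_ENV`, here is a minimal sketch (the helper name `defaultMongoUri` is hypothetical, not part of the starter kit):

```typescript
// Hypothetical helper: derive the default Mongo connection string from NODE_ENV,
// mirroring the default described above.
function defaultMongoUri(nodeEnv: string): string {
  return `mongodb://localhost:27017/node-ts-ai-starter_${nodeEnv}`;
}

console.log(defaultMongoUri("development"));
// → mongodb://localhost:27017/node-ts-ai-starter_development
```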
OpenAI and AI-powered Features

- `CHAT_MEMORY_PERSISTENCE`: how the chat discussions are persisted. Accepted values: `memory`;
- `OPENAI_ORGANIZATION_ID`: the organization id used by OpenAI;
- `OPENAI_API_KEY`: your API key;
- `OPEN_AI_MODEL`: the OpenAI model you want to use. Default: `gpt-3.5-turbo`;
- `REDIS_URL`: if using Redis as vector store, the Redis URL is required;
- `REDIS_PASSWORD`: if using Redis as vector store and the instance is password protected;
- `VECTOR_STORE`: vector store to be used. Accepted values: `hnswlib` (default), `redis`.
Apache Kafka

- `ENABLE_MESSAGE_BROKER`: `'true'` or `'false'`, whether domain events are published. Default `'false'`;
- `KAFKA_URI`: URI of the Kafka message broker. Default `localhost:9092`;
- `KAFKA_CLIENT_ID`: client id for the Kafka connection. If not set, a randomly generated one will be used;
- `KAFKA_LOG_LEVEL`: set the logging level of the Kafka client: `0` = nothing, `1` = error, `2` = warning, `4` = info (default), `5` = debug;
- `SSL_CERT`: SSL certificate (string);
- `SSL_KEY`: SSL key (string);
- `SSL_CA`: SSL certificate authority (string).
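A sketch of how these Kafka-related variables might be parsed into a typed config object, applying the defaults listed above (`loadKafkaConfig` is an illustrative helper, not the starter kit's actual code):

```typescript
type KafkaConfig = {
  enabled: boolean;
  uri: string;
  clientId: string;
  logLevel: number; // 0 | 1 | 2 | 4 | 5
};

// Illustrative parsing of the Kafka environment variables with their documented defaults.
function loadKafkaConfig(env: Record<string, string | undefined>): KafkaConfig {
  return {
    enabled: env.ENABLE_MESSAGE_BROKER === "true",      // default 'false'
    uri: env.KAFKA_URI ?? "localhost:9092",             // default localhost:9092
    clientId:
      env.KAFKA_CLIENT_ID ??
      `client-${Math.random().toString(36).slice(2)}`,  // random id if not set
    logLevel: Number(env.KAFKA_LOG_LEVEL ?? "4"),       // 4 = info (default)
  };
}

const config = loadKafkaConfig({ ENABLE_MESSAGE_BROKER: "true" });
```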
Features

Node TypeScript AI Starter comes with some AI-powered features set up out of the box. In order to use them, please obtain an API key from https://platform.openai.com/account/api-keys.
Check the `/src/config/index.ts` file to customise the behaviour of the LLM.
The chat API endpoint allows the user to chat with ChatGPT.
It identifies incoming requests via the IP address, and context is kept in memory. Therefore, rebooting the application will remove all existing context.
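The in-memory, per-IP context described above can be pictured roughly like this (a simplified sketch for illustration, not the starter kit's actual `MemoryBuffer` implementation):

```typescript
// Simplified sketch: conversation history kept in memory, keyed by client IP.
// Rebooting discards the Map, and with it all existing context.
type ChatMessage = { role: "user" | "assistant"; content: string };

const histories = new Map<string, ChatMessage[]>();

function remember(ip: string, message: ChatMessage): ChatMessage[] {
  const history = histories.get(ip) ?? [];
  history.push(message);
  histories.set(ip, history);
  return history;
}

remember("203.0.113.7", { role: "user", content: "Hello!" });
remember("203.0.113.7", { role: "assistant", content: "Hi there!" });
// histories.get("203.0.113.7") now holds both messages for that client.
```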
Endpoints
- `POST /api/llm/chat/message`

```
// Body
{
  message: <string>; // The message sent to the chat
}
// Response
{
  data: <string>; // The message returned from the chat
}
```

This endpoint allows chatting with the LLM. It will keep track of your conversation history, using the IP address of the incoming request as user identifier.
The following endpoints allow ingesting documents, storing their embeddings into a Vector Database, and querying them using an LLM.
Modifying or updating a previously ingested document is not supported. You will need to clean the Vector Database and ingest all documents again.
Endpoints
- `POST /api/llm/search/documents`

```
// Body (form-data)
{
  file // File to be ingested
}
// Response
{
  data: <string>; // A confirmation message that the file has been successfully loaded
}
```

This endpoint allows ingesting an input file.
- `GET /api/llm/search/documents`

```
// Query Params
{
  query: <string> // Query to be sent to search in the documents
}
// Response
{
  data: <string>; // The response to the input query, with the relevant information, if found.
}
```

This endpoint allows directly querying the documents previously ingested.
- `DELETE /api/llm/search/documents`

```
// Response
{
  data: <string>; // A confirmation message that the Vector Store has been successfully cleaned
}
```

This endpoint removes all data previously added to the Vector Store.
Domain Events

Domain Events will be automatically published when a User is created or updated. The Node Messagebrokers package is used to publish to Apache Kafka.
The following events are published to the topic specified in the config file `src/config/index.ts` (default is `myCompany.events.node-ts-ai-starter.user`).
Topic names should follow the structure `<company>.events.<application_name>.<aggregate_name>`.
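That naming structure could be expressed as a small helper (illustrative only; `buildTopicName` is not part of the starter kit):

```typescript
// Illustrative helper for the <company>.events.<application_name>.<aggregate_name> convention.
function buildTopicName(company: string, applicationName: string, aggregateName: string): string {
  return `${company}.events.${applicationName}.${aggregateName}`;
}

console.log(buildTopicName("myCompany", "node-ts-ai-starter", "user"));
// → myCompany.events.node-ts-ai-starter.user
```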
The following events are emitted by the Application:
- `UserCreated`: A new User has been created

```
{
  id: <string | integer>,
  createdAt: <string>,
  email: <string>,
  username: <string | undefined>
}
```
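In TypeScript terms, the payload above might be typed like this (a hypothetical type for illustration; the field values below are made up):

```typescript
// Hypothetical typing of the UserCreated event payload shown above.
type UserCreatedEvent = {
  id: string | number;
  createdAt: string; // timestamp string
  email: string;
  username?: string; // may be undefined
};

const example: UserCreatedEvent = {
  id: "42",
  createdAt: new Date(0).toISOString(),
  email: "jane@example.com",
  username: "jane",
};
```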
Testing

Run `npm test` to run the tests, or `npm run watch-test` to run the tests with the watcher.
Contribution Guidelines

Pull requests are welcome.
License

Node TypeScript AI Starter is free software distributed under the terms of the MIT license.