Tune for VSCode

Tune is a handy extension for Visual Studio Code that lets you chat with large language models (LLMs) right in your code editor. Need quick answers, help brainstorming, or a way to run scripts? Tune's got you covered! You can even use different LLMs and create handy tools in Python, JavaScript, or PHP, all without leaving your editor. Get ready to boost your productivity!

🚀 Features

💬 .chat file

To start chatting, create a .chat file. Use <Shift>+<Enter> to trigger a chat response.
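
For example, a minimal .chat file can be just a single user message, where the u: prefix marks the user role (the same role prefixes appear in the examples below):

u: Write a one-line description of this extension

The model's reply then appears below in the same file.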

🔗 Variable expansion

You can use {filename} to inject a file's contents into the chat. The file can be any text file or an image.
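
For instance, to ask about a source file (main.py is just a placeholder filename here):

u: {main.py}
What does this script do?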

🛠️ Setup

  1. Install Node.js.
  2. Set OPENAI_KEY in settings. By default, Tune uses gpt-4o-mini, but you can change the model (see below).

⚙️ Support for Multiple LLMs

Configure different LLMs by creating a .config.js file that defines the HTTP request sent to the model's API.

Example: Setting up Claude with openrouter.ai (claude.config.js)

({
    url: "https://openrouter.ai/api/v1/chat/completions",
    method: "POST",
    headers: { 
      "content-type": "application/json",
      // OPEN_ROUTER_KEY is read from your .env file (see below)
      authorization: `Bearer ${OPEN_ROUTER_KEY}`,
    },
    // payload is the chat request Tune builds; override the model here
    body: JSON.stringify({ 
      ...payload,
      model: "anthropic/claude-3.5-sonnet"
    })
})
  • Add your OPEN_ROUTER_KEY to the .env file.

  • Use the config in your .chat file:

    u: {claude}
    What is the meaning of life?
    
  • Naming the config default.config.js makes it the default LLM used by Tune.

Note: Tune supports OpenAI and Ollama for streaming. For Anthropic models, use services like OpenRouter.ai.
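
For a local model served by Ollama, a config can follow the same pattern. The sketch below is only an assumption-based example: it targets Ollama's OpenAI-compatible endpoint on the default port, and llama3.2 is a placeholder for whatever model you have pulled locally.

ollama.config.js

({
    // Ollama's OpenAI-compatible chat endpoint on the default local port
    url: "http://localhost:11434/v1/chat/completions",
    method: "POST",
    headers: {
      "content-type": "application/json",
    },
    body: JSON.stringify({
      ...payload,
      // replace with any model you have pulled locally
      model: "llama3.2"
    })
})

Reference it from a .chat file the same way, e.g. u: {ollama}.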

🛠️ Tools

Tools are functions or scripts that LLMs can call to perform tasks beyond their native capabilities, such as querying a database or calling an external API.

Easily create tools in JavaScript, Python, or PHP. Name your tool files toolname.tool.mjs/js/cjs/py/php/chat.

Here is a shell tool example as an ESM module:

sh.tool.mjs

import { execSync } from 'node:child_process';

// Run the shell command passed by the model and return its output
export default async function sh({ text }) {
  return execSync(text, { encoding: "utf8" });
}

To use the tool, just expand the file:

u: {sh} 
what are the contents of the current directory?
tc: sh
ls -la
tr: total 944
-rw-r--r--   1 iovdin  staff    248  3 Oct 22:12 4o.config.js
-rw-r--r--   1 iovdin  staff    253 17 Sep 16:10 4om.config.js
-rw-r--r--   1 iovdin  staff   2128  8 Oct 17:18 README.md
c: 
 tc: stands for tool call
 tr: stands for tool result

Here is the shell tool in other languages:

sh.tool.cjs

const { execSync } = require('child_process');

exports.default = async function sh({ text }) {
  return execSync(text, { encoding: "utf8" });
};

sh.tool.py

import subprocess

def main(params):
    return subprocess.check_output(params['text'], shell=True, text=True)

sh.tool.php

<?php
function main($params) {
    return shell_exec($params['text']);
}
?>

It is possible to make a tool out of another chat. Let's create a tool that comes up with a filename for given text content.

filename.tool.chat

s: You're given text content; please come up with a filename for it.
It should use camel case.
u: {text}

How to use it?

s: {filename}  
tc: filename
console.log("hello world")
tr: HelloWorld.js

Check out more tools at Tune GitHub.

Help System Prompt

You can ask for help using:

s: {esc:tune_help}
u: how to make a tool? 

📫 Contact

For any inquiries or support, feel free to open an issue on GitHub or drop a message in the Tune Discord channel.
