llm-code

tl;dr: llm-code is an open-source, free CLI tool that helps you write code, powered by OpenAI's models. Check it out on GitHub or install it with pipx install llm-code.

I’m really excited about this project. Excuse the enthusiasm.

I’ve been messing around with LLMs since leaving my job in February. In part, I’ve tried to get back to my technical roots by coding up a storm, and reading paper after paper. However, the honest truth is that I’m just having a lot of fun.

As I explored what LLMs can do, I started writing little utilities to help me program. For example, one of the most boring things an engineer has to do is write a ton of unit tests. The net result: we get lazy, unit test some bits, and hope the rest works out. What if we all had a little assistant that did the boring bits for us - like writing unit tests?

I had been messing around with bits and pieces of what would become llm-code for a few weeks. Then I saw Simon Willison's post about his llm tool and was inspired to write and release llm-code.

What is llm-code?

llm-code is an open-source CLI tool that helps you write code. It's really straightforward to use, with a few quality-of-life features thrown in. I won't cover everything you can do here - there are many more examples on the GitHub page at github.com/radoshi/llm-code - but here's a sample.

llm-code 'write me a hello world program in rust'
fn main() {
    println!("Hello World!");
}

Simple enough.

Let’s get more complicated.

llm-code """Write a python program that uses SQLAlchemy to model the following database class:
id - int autoincrement primary key
name - string
phone - string

Add a function to create the database and a function to add a new entry to the database.
"""
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()
engine = create_engine('sqlite:///phonebook.db')
Session = sessionmaker(bind=engine)

class Contact(Base):
    __tablename__ = 'contacts'
    id = Column(Integer, primary_key=True, autoincrement=True)
    name = Column(String)
    phone = Column(String)

def create_database():
    Base.metadata.create_all(engine)

def add_contact(name, phone):
    session = Session()
    new_contact = Contact(name=name, phone=phone)
    session.add(new_contact)
    session.commit()

create_database()
add_contact('John Doe', '123-456-7890')

That’s pretty darn good!

Quality of life features

A common pattern I've fallen into is getting the LLM to output something, examining it, and then doing something with it. Copy-pasting console output becomes tedious, and in theory one could use tee to store it away - but what if the tool just cached the previous response and repeated it when you asked exactly the same question?

llm-code """Write a python ...""" > contact.py

No round-trips to OpenAI necessary.

Didn't like the output? Want to reroll, or use the more expensive GPT-4 model? No problem. Changing any of the parameters, or passing the -nc option, bypasses the cache.

llm-code --gpt-4 """Write a python ..."""

Remembering responses

llm-code by default stores all interactions with the LLM in ~/.llm_code/db.sqlite (just like llm). You can examine all your old outputs, and even look at the input and output tokens to estimate costs (if that's your thing). I recommend using Datasette to browse this database easily, but any SQLite tool will do.
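Since the database records input and output token counts per call, estimating spend is a one-line calculation. Here's a minimal sketch - the per-1K-token rates below are assumptions based on gpt-3.5-turbo pricing at the time of writing, so check OpenAI's current pricing page before trusting the numbers:

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 0.0015, output_rate: float = 0.002) -> float:
    """Estimate the USD cost of one API call from its token counts.

    input_rate/output_rate are dollars per 1K tokens (assumed
    gpt-3.5-turbo rates; substitute the current published prices).
    """
    return (input_tokens / 1000) * input_rate + (output_tokens / 1000) * output_rate

# A call with 1200 prompt tokens and 800 completion tokens:
print(round(estimate_cost(1200, 800), 4))  # 0.0034
```

Point this at the token columns in db.sqlite and you have a rough running total of what your coding assistant costs you.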

Installation

Install quickly using pipx:

pipx install llm-code

Configuration

You need an API key from OpenAI. Store it in an environment variable called OPENAI_API_KEY, or in an env file at ~/.llm_code/env.
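Either option is a one-liner. The key below is a placeholder - substitute the real key from your OpenAI account:

```shell
# Option 1: environment variable (add to your shell profile to persist)
export OPENAI_API_KEY=sk-your-key-here

# Option 2: env file that llm-code reads on startup
mkdir -p ~/.llm_code
echo 'OPENAI_API_KEY=sk-your-key-here' > ~/.llm_code/env
```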

Enjoy coding again

The whole goal of this tool is to take some of the pain out of doing great software engineering. After using Copilot, ChatGPT, and now llm-code, it's very evident that we're in a whole new age of programming productivity. These tools don't make software engineers obsolete - they give us superpowers and help us go 10x faster.

Happy coding!