Take the following code using RΞASON:

import { reason } from 'tryreason'

interface Joke {
  /** Use this property to indicate the age rating of the joke */
  rating: number;
  joke: string;

  /** Use this property to explain the joke to those who did not understand it */
  explanation: string;
}

const joke = await reason<Joke>('tell me a really spicy joke')

The joke object would be:

{
  "joke": "I'd tell you a chemistry joke but I know I wouldn't get a reaction.",
  "rating": 18,
  "explanation": "This joke is a play on words. The term 'reaction' refers to both a chemical process and a response from someone. The humor comes from the double meaning, implying that the joke might not be funny enough to elicit a response."
}

And if we just add a new property to the Joke interface:

interface Joke {
  // ...
  topics: string[]; // 👈 new property
  // ...
}

const joke = await reason<Joke>('tell me a really spicy joke')

The joke object would now contain a topics array:

{
  "joke": "I'd tell you a chemistry joke but I know I wouldn't get a reaction.",
  "rating": 18,
  "topics": [
    "chemistry",
    "puns",
    "science"
  ],
  "explanation": "This joke is a play on words. The term 'reaction' refers to both a chemical process and a response from someone. The humor comes from the double meaning, implying that the joke might not be funny enough to elicit a response."
}

Wait, what?!

You probably noticed by now that RΞASON is different from other frameworks because it actually uses your TypeScript type information to change the actual output of your program. This is a key distinction: RΞASON uses TypeScript (& JSDoc comments) not only at compile time but also at runtime.
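To see why this is surprising, recall that in plain TypeScript, interfaces are erased during compilation: nothing about `Joke` survives into the running program. Here is a minimal sketch of that default behavior (ordinary TypeScript, no RΞASON involved; the JSON string is just illustrative):

```typescript
interface Joke {
  rating: number;
  joke: string;
}

// In plain TypeScript the interface exists only at compile time,
// so `as Joke` performs no runtime validation whatsoever.
const raw = '{"rating": 18, "joke": "I would tell you a UDP joke, but you might not get it."}';
const parsed = JSON.parse(raw) as Joke;

console.log(parsed.rating); // 18 (but nothing at runtime enforced this shape)
```

RΞASON's distinguishing move is making that type information (and its JSDoc comments) available at runtime, so it can shape what the LLM returns.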

A quick shoutout to Instructor, the Python library that gave us the idea of using interfaces to get structured output from an LLM. It's an awesome library; if you are using Python, check it out.

Design

RΞASON is a backend TypeScript framework for building LLM apps. We also have frontend libraries that make it easy to deliver a great experience to the end user.

We call RΞASON “minimalistic” because it is laser-focused on three areas only:

  • String parsing:

    RΞASON returns JSON objects so you never have to parse a raw completion again. RΞASON also has a great syntax for declaring your own agents, so you never have to worry about wiring up function calls and passing parameters yourself.

  • Streaming:

    RΞASON handles both streaming-in and out of your app. Never worry about HTTP streams from OpenAI or to your frontend again.

  • Observability:

    RΞASON is OpenTelemetry compatible — allowing you to use any observability tool you wish.
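To make the streaming point concrete, here is the kind of plumbing an app otherwise writes by hand to consume an HTTP stream from an LLM provider. This is a generic sketch using the standard `ReadableStream` API, not RΞASON code:

```typescript
// Drain a text stream chunk by chunk: the boilerplate you would
// otherwise write yourself to read a streamed completion.
async function collectStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // { stream: true } handles multi-byte characters split across chunks.
    text += decoder.decode(value, { stream: true });
  }
  return text;
}
```

In a real app each decoded chunk would also be forwarded to the frontend as it arrives; RΞASON's claim is that you never write this loop (or its server-side twin) yourself.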

Whether you are creating a complex multi-agent environment or just adding a simple LLM call to an app, RΞASON can help you deliver a great experience to your users.

The philosophy behind RΞASON

We believe LLMs are a new primitive that programmers can use, not a new way to program; it's still up to the programmer to do the right thing & create the right abstractions.

At the core of RΞASON is the belief that it's the developer's job to learn the new concepts that come with this new primitive, such as prompting & retrieval. That's why RΞASON does not interfere in those areas; we actively stay away from them.

RΞASON is a functional, transparent box that interoperates with normal TS code; not a const agent = new ReasonChatAgent() that is terrible to extend and customize.

To be clear, prompting & retrieval are key to the success of your LLM app, and precisely because of that, you should be the one in charge of them, not the framework. The framework's job is to focus on the areas that do not differentiate your business, such as string parsing and handling HTTP streams.

Where to go next

If you are new to LLMs in general, we recommend going through the LLM 101 guide.

If you are somewhat familiar with LLMs but new to RΞASON, we recommend you proceed to the quickstart. Here are some other useful links: