Overview
`reason()` (and `reasonStream()`) is one of the main building blocks of RΞASON. It allows you to call an LLM and get its output:
src/entrypoints/test.ts
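The original snippet is not preserved here, but a minimal sketch of such an entrypoint could look like the following. The entrypoint signature is an assumption, and `reason` is stubbed locally so the sketch is self-contained; the real one would be imported from the RΞASON package and call your configured LLM:

```typescript
// Stub standing in for RΞASON's reason(); the real function calls your
// default LLM and resolves with its completion.
async function reason(prompt: string): Promise<string> {
  return `(completion for: ${prompt})`;
}

// Hypothetical entrypoint handling POST /test.
export async function POST() {
  const completion = await reason('What is the capital of France?');
  return { completion };
}
```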
In order to use `reason()` you need to first set your OpenAI API key. Calling the `test` entrypoint in the RΞASON Playground will result in:

Output from `POST /test`
`reason()` just calls the default LLM you defined in the `.reason.config.js` file and returns the completion for you.

There's also the streaming variant called `reasonStream()`:
src/entrypoints/test.ts
`reasonStream` response
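As a sketch of what streaming looks like (again with a local stub; the real `reasonStream` comes from the RΞASON package and yields chunks as the LLM produces them):

```typescript
// Stub: yields a fixed set of chunks. A real call would yield tokens from
// the LLM in real time.
async function* reasonStream(prompt: string): AsyncGenerator<string> {
  for (const chunk of ['The', ' capital', ' is', ' Paris.']) {
    yield chunk;
  }
}

export async function POST() {
  let full = '';
  for await (const chunk of reasonStream('What is the capital of France?')) {
    full += chunk; // each chunk could be forwarded to the client as it arrives
  }
  return { completion: full };
}
```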
reason vs reasonStream
It is important to note that `reason()` and `reasonStream()` are the same function, the distinction being that one streams data from the LLM in real time and the other only returns when the LLM has fully finished. For instance:
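The contrast can be sketched like this, with both functions stubbed locally (the real ones come from the RΞASON package):

```typescript
// `reason` resolves once, with the full completion, only when the LLM is done.
async function reason(prompt: string): Promise<string> {
  return 'Hello there!';
}

// `reasonStream` yields chunks as they arrive from the LLM.
async function* reasonStream(prompt: string): AsyncGenerator<string> {
  yield 'Hello';
  yield ' there!';
}

const full = await reason('greet me'); // one value, at the end

const chunks: string[] = [];
for await (const chunk of reasonStream('greet me')) chunks.push(chunk); // many values, over time
```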
By the way: At any point you can test your entrypoint in the RΞASON Playground.
From here on, we'll use `reason()` and `reasonStream()` interchangeably; everything discussed here applies to both functions.
Although they only differ in one aspect, there are some important concepts to learn when working with `reasonStream()`, which we'll go over on the next page.
The real magic
So far both the `reason()` and `reasonStream()` functions seem nothing out of the ordinary. The real magic kicks in when we pass an `interface` to them:
src/entrypoints/test.ts

`reason()` with an interface
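The original snippet is not preserved here; the sketch below shows the idea, with `reason` stubbed to return a hard-coded object. A real `reason<City>()` call would make the LLM produce an object in exactly this shape:

```typescript
interface City {
  name: string;
  state: string;
}

// Stub: the real reason<T>() is provided by the RΞASON package and forces
// the LLM's answer into the shape of T.
async function reason<T>(prompt: string): Promise<T> {
  return { name: 'San Francisco', state: 'California' } as unknown as T;
}

export async function POST() {
  const city = await reason<City>('Give me a cool city to visit in the US');
  return city; // a typed `City` object, not a raw string
}
```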
Wait, what?!
Yep: RΞASON uses your TypeScript interface to actually ensure the LLM returns an object in that format.

How does this work under-the-hood?
It is a two-step process:
- Somehow get the interface during runtime;
- Somehow ensure the LLM response conforms to the interface.
Getting the interface
When you run `npm run dev`, what you are actually running is `npx reason dev`, which runs the RΞASON transpiler. The RΞASON transpiler, as the name suggests, transpiles your code to a different representation. For instance, in the previous example the `City` interface gets converted to an object that is passed as a parameter to `reason()`.
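The exact transpiled output is not preserved here, but conceptually it could look something like this sketch, where the `City` interface becomes a plain runtime object (the property names and shape of this object are assumptions):

```typescript
// Before (what you write):
//   interface City { name: string; state: string }
//   reason<City>('Give me a cool city to visit in the US');
//
// After (conceptually what the transpiler emits): the interface is now a
// runtime description object that reason() can inspect.
const citySchema = {
  name: 'City',
  properties: {
    name: { type: 'string' },
    state: { type: 'string' },
  },
};

// Hypothetical transpiled call shape:
// reason('Give me a cool city to visit in the US', citySchema);
```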
The inspiration came from React: they created a new syntax (JSX) for writing "HTML" in `.js` files that gets transpiled to normal JavaScript. The same is true with RΞASON, except we didn't create a new syntax; we changed how `interfaces` are used.

Ensuring the LLM conforms to the interface
RΞASON uses function calling to ensure an LLM returns an object that conforms to an interface. For OpenAI's models that support JSON mode, we use that instead of function calling.
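To make that concrete, here is a sketch of the kind of JSON Schema an interface might be translated to for OpenAI function calling (the function name and description are made up, and the exact schema RΞASON builds may differ):

```typescript
// interface City { name: string; state: string } expressed as the
// `parameters` JSON Schema of an OpenAI tool definition. The model is forced
// to "call" this function, so its arguments must match the schema.
const tool = {
  type: 'function',
  function: {
    name: 'return_city',
    description: 'Return the answer in this shape',
    parameters: {
      type: 'object',
      properties: {
        name: { type: 'string' },
        state: { type: 'string' },
      },
      required: ['name', 'state'],
    },
  },
};
```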
Changing the `interface` you pass to `reason<interface>()` or `reasonStream<interface>()` will change its output as well. If we add a new property called `population`:
src/entrypoints/test.ts

`reason()` with an interface
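A sketch of the extended interface and the shape the response would now take (the sample object is illustrative, not real data):

```typescript
interface City {
  name: string;
  state: string;
  population: number; // the new property: the LLM's output now includes it
}

// Illustrative value in the shape a real reason<City>() call would return.
const example: City = { name: 'San Francisco', state: 'California', population: 800000 };
```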
This was just an example. In real-world usage, getting the population from an LLM is probably not the best idea ever.
Will the LLM always output a valid object?
This is a great question. LLMs are non-deterministic, making it impossible to guarantee they will always return an object that conforms to your `interface`.
How does RΞASON handle when the LLM outputs an invalid object?
By default, RΞASON will throw an error in your code and point to which properties are missing or wrong. This can be changed to an `ignore` behaviour, where RΞASON will not throw an error and will continue as if everything was normal. Read more here.
What can I do to make sure the LLM outputs a valid object?
There are five things you can do:
- Try a different prompt;
- Try different names for the interface properties, as they are passed to the LLM inside your prompt;
- Use JSDoc to create good descriptions for your properties, as they are also passed to the LLM;
- Make your `interface` less complex;
- Use a more capable model.
Passing interfaces to reasonStream()

Naturally, you can also pass an `interface` to `reasonStream()` to have the object streamed:
src/entrypoints/test.ts
`reasonStream` response
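One way a streamed interface can surface is as progressively more complete partial objects, sketched below with a local stub (RΞASON's actual chunk shape may differ):

```typescript
interface City {
  name: string;
  state: string;
}

// Stub: each yielded value is the object so far, filling in as the LLM
// produces more of it.
async function* reasonStream(prompt: string): AsyncGenerator<Partial<City>> {
  yield { name: 'San Francisco' };
  yield { name: 'San Francisco', state: 'California' };
}

let latest: Partial<City> = {};
for await (const partial of reasonStream('Give me a cool city to visit in the US')) {
  latest = partial; // e.g. re-render the UI with the partial object
}
```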
There are some rules about the `interface` you pass to `reason()`, and we'll go over them in a bit. First, we want to highlight that RΞASON has an ESLint plugin that really helps developers.
RΞASON ESLint plugin
Since RΞASON adds a little bit of new syntax, having an ESLint plugin becomes important because it warns developers while they're in their IDE (VS Code, nvim, etc.) that something is wrong. Let's take a look at some code that will error in RΞASON:
`reason()` with an interface
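The erroring snippet is not preserved here; as a sketch, code like the following would break RΞASON's rules (the interface extends another one) and is the kind of thing the ESLint plugin flags:

```typescript
interface Place {
  name: string;
}

// Invalid for RΞASON: an interface passed to reason() cannot extend another.
interface City extends Place {
  state: string;
}

// Still perfectly valid TypeScript, just not usable with reason<City>():
const c: City = { name: 'San Francisco', state: 'California' };
```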
If you created your project using `npx use-reason`, the only thing you should do is install the ESLint plugin for your IDE. Most developers already have it installed, but we recommend checking to make sure.
interface rules

The `interface` you pass to `reason()` (or `reasonStream()`) needs to follow some rules. Here they are:
- The `interface` needs to be fully defined within the same file that calls `reason()`;
- The `interface` cannot `extend` another `interface`/`type`;
- You can only pass an `interface` to `reason()`; for instance, `reason<boolean>()` is invalid;
- The `interface` you pass cannot be empty. It needs to have at least one property;
- The `interface` cannot use utility types (`Omit<>`, `Partial<>`, etc).
There are also some rules about the type that each property in your `interface` can have:
- The following are invalid types: `any`, `unknown`, `void`, `null`, `never`, `undefined` and `this`;
- Literal types are not allowed as a property's whole type. A literal type is `foo: 'bar'`;
- Unions, however, must be between literal types: `property1: "foo" | "bar" | 10` is valid but `property1: string | number` is not.
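Following the rules above, a valid interface and some invalid property types might look like this sketch (the invalid ones are still legal TypeScript; they are only rejected by RΞASON):

```typescript
// Valid for reason(): concrete types and unions of literals.
interface Review {
  title: string;
  stars: 1 | 2 | 3 | 4 | 5; // union of literal types: allowed
  tags: string[];
}

// Invalid for reason() (shown as comments so this file compiles):
//   summary: any;            -> `any` is not allowed
//   extra: unknown;          -> `unknown` is not allowed
//   score: string | number;  -> union of non-literal types is not allowed

const sample: Review = { title: 'Great trip', stars: 5, tags: ['food'] };
```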
JSDoc comments
In our previous example the LLM was returning the full state name. Let's say we want `state` to be `CA` instead of `California`, `TX` instead of `Texas`, etc. Well, that's easy: just ask the LLM!
src/entrypoints/test.ts

`reason()` with an interface
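The snippet is not preserved here, but the change amounts to adding a JSDoc comment above the property, roughly like this sketch (the sample object shows the shape a real `reason<City>()` call would now return):

```typescript
interface City {
  name: string;
  /** Return the acronym only */
  state: string;
}

// Illustrative result: the LLM now returns the acronym, because the JSDoc
// comment above `state` was sent to it as that property's description.
const example: City = { name: 'San Francisco', state: 'CA' };
```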
We added a `/** Return the acronym only */` comment above the `state` property. It worked because:
- JSDoc comments above a property in the interface are passed into your prompt to the LLM:
  - Precisely, each one is passed as the description for the parameter in the function calling body.
- Then the LLM just followed the instructions you passed;
- You should treat JSDoc comments the same way you treat prompts; after all, they go in the prompt for the LLM.
If you are familiar with JSDoc, feel free to skip the next section.
JSDoc
Have you ever noticed that some functions have a description when you hover over them?
Description when you hover over `fs.readFileSync()`
That description comes from a JSDoc comment in `fs.readFileSync()`'s definition:

The definition of `fs.readFileSync()`
A few things to know about JSDoc comments:
- They need to be defined as `/** comment */`;
- Your IDE uses them to show code hints, displaying them when you hover (and at some other times);
- And RΞASON uses them as part of your prompt to the LLM.
How to add a JSDoc comment
Adding a basic JSDoc comment is easy:
- Every line of the comment needs to start with a `*`;
- The comment has to be created with `/** {comment} */`: `//* I'm an invalid JSDoc comment :(` is an invalid JSDoc comment.
While JSDoc comments can be added in many places, for `reason()` and `reasonStream()` their only use is above an `interface`'s properties, as the property description for the LLM.
`reasonStream` response
Note that `Return the acronym only` now appears when `state` is hovered.
reason() options

There are some options you can pass to `reason()` and `reasonStream()` to change their behaviour:
- `model`: which LLM model to use;
- `max_tokens`: the maximum number of tokens the LLM can return;
- `temperature`: the temperature of the LLM;
- `validation_strategy`: what RΞASON should do if the LLM outputs an object that does not conform to the `interface` you passed. Possible values:
  - `error`: the default value; RΞASON will throw an error if the object is invalid;
  - `ignore`: RΞASON will ignore the invalid object.
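As a sketch of how these options might be passed (the options object follows the list above, but the exact call signature is an assumption; `reason` is stubbed locally):

```typescript
// Stub accepting the options described above; the real reason() takes them
// alongside your prompt.
interface ReasonOptions {
  model?: string;
  max_tokens?: number;
  temperature?: number;
  validation_strategy?: 'error' | 'ignore';
}

async function reason(prompt: string, options: ReasonOptions = {}): Promise<string> {
  return `(completion using ${options.model ?? 'default model'})`;
}

const completion = await reason('Give me a cool city to visit in the US', {
  model: 'gpt-4',
  temperature: 0,
  validation_strategy: 'ignore', // don't throw on an invalid object
});
```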
Example using nested objects
To solidify your knowledge, let's go through a more complex example. Let's say we are building a website where users can input a city they plan to visit and we show them some cool places to visit in that city:
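A sketch of what the nested interfaces for this could look like (the interface names and the stubbed `reason` are illustrative; a real call would have the LLM fill in this structure):

```typescript
// Nested objects: each attraction is itself an object inside the response.
interface Attraction {
  name: string;
  /** One short sentence on why it's worth visiting */
  description: string;
}

interface TravelGuide {
  city: string;
  attractions: Attraction[];
}

// Stub standing in for reason<TravelGuide>() from the RΞASON package.
async function reason<T>(prompt: string): Promise<T> {
  return {
    city: 'Paris',
    attractions: [
      { name: 'Louvre', description: 'Home of the Mona Lisa.' },
      { name: 'Eiffel Tower', description: 'The classic view over the city.' },
    ],
  } as unknown as T;
}

const guide = await reason<TravelGuide>('Cool places to visit in Paris');
```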
Conclusion
In this page we learned:
- How one of the building blocks of RΞASON, the `reason()` function, works;
- How to pass an `interface` to it;
- That there is a streaming variant, `reasonStream()`, that is able to stream from the LLM.
`reasonStream()` and `reason()` are almost identical, their difference being that `reasonStream()` streams data from the LLM. This makes `reasonStream()` a bit different in its usage, since it is not a normal `async function` that you can `const result = await reasonStream()`.
Since a great LLM app almost always needs streaming, being able to use `reasonStream()` is key. And that's what we will go over in the next page.