Just a quick one from us to let you know about a few changes we released last week that you probably want to hear about.
Underlying model prices are changing all the time (usually getting cheaper), and we're committed to passing these savings on to you so you get the best possible value out of Chat Thing.
We've made two changes that are going to positively affect your bots.
More message tokens for all plans
First, we've significantly boosted the number of message tokens available on each plan! This is thanks to OpenAI massively reducing the price of our default model (gpt-3.5-turbo-0125).
For most plans your message tokens will have almost doubled overnight!
Message tokens that go further
The second change we have made relates to how we calculate the tokens used while chatting to your bots.
Many model providers charge different amounts for input tokens and output tokens, so we now count them separately and charge them at different rates.
Input tokens are often cheaper and include the prompt, the message, and anything else that is sent to the model when you chat with a bot. Output tokens are often more expensive and consist of the responses generated by your bots.
Previously, we treated all the tokens used by your bot the same. Now we've introduced multipliers for input and output tokens, so the message tokens you use better reflect the underlying model pricing.
In general you'll use far more input tokens than output tokens, because every message you send includes your bot's prompt and the entire chat history as input. Since input tokens are usually cheaper, this change should mean your message tokens go further on the whole.
We've added a section to our docs explaining this change, including a table with each model's input and output modifier values:
https://chatthing.ai/docs/what-are-message-tokens
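To make this a bit more concrete, here's a rough sketch of what the calculation looks like. The multiplier values below are made up purely for illustration; the real per-model values are in the table linked above.

```typescript
// Rough sketch of message token accounting with separate
// input/output multipliers. The values below are hypothetical,
// not Chat Thing's actual per-model modifiers.

interface ModelMultipliers {
  input: number;  // applied to the prompt, chat history and user message
  output: number; // applied to the bot's generated response
}

// Hypothetical example values for illustration only.
const exampleModel: ModelMultipliers = { input: 0.5, output: 1.5 };

function messageTokensUsed(
  inputTokens: number,
  outputTokens: number,
  m: ModelMultipliers
): number {
  return Math.ceil(inputTokens * m.input + outputTokens * m.output);
}

// A typical chat turn is mostly input (prompt + history), so a lower
// input multiplier means fewer message tokens used overall.
console.log(messageTokensUsed(3000, 250, exampleModel)); // 1875
```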
AI bots are becoming more affordable every day
Our biggest cancellation reason is that it's "too expensive", and to be honest, until recently it was hard to disagree, especially if you need the more expensive models.
Luckily, over the past six months we've been able to reduce the cost of our models multiple times, and we'll continue to do so whenever the underlying costs change.
At the current rate of price reductions, AI bots will soon be extremely affordable for everyone, and viable for even more use cases!