Out-of-the-box LLM models
The llm.generate_text node defaults to our LLM proxy, which bills usage directly against the Credits included in your MainlyAI Platform subscription. This also means you don't have to set up API keys or other credentials manually. To use your own models, simply connect them to the llm receiver.
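In the visual editor this is just a matter of wiring nodes, but the behaviour can be summarised in code. The sketch below is purely illustrative: `GenerateTextNode`, `connect_llm`, and `call_builtin_proxy` are hypothetical stand-ins for the llm.generate_text node, its llm receiver, and the LLM proxy, not part of any MainlyAI SDK.

```python
# Illustrative sketch only: every name here is a hypothetical stand-in for the
# visual llm.generate_text node and its llm receiver, not a real MainlyAI API.
from typing import Optional, Protocol


class TextModel(Protocol):
    def generate(self, prompt: str) -> str: ...


def call_builtin_proxy(prompt: str) -> str:
    # Stand-in for the platform's LLM proxy, which bills Credits from your
    # subscription and needs no API keys.
    return f"[proxy response to: {prompt}]"


class GenerateTextNode:
    """Conceptual model of the llm.generate_text node."""

    def __init__(self) -> None:
        self.llm: Optional[TextModel] = None  # the "llm" receiver, empty by default

    def connect_llm(self, model: TextModel) -> None:
        """Attach your own model to the llm receiver."""
        self.llm = model

    def run(self, prompt: str) -> str:
        if self.llm is None:
            return call_builtin_proxy(prompt)  # default: the MainlyAI LLM proxy
        return self.llm.generate(prompt)       # custom model connected by you
```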
Supported Models
All models are billed in Credits (¤) per 1 million tokens. The prices in these tables may not exactly reflect current model pricing; they are only intended to give you a rough idea of cost. Always check the pricing directly in the platform for the exact current costs. A short cost-estimation sketch follows the tables below.
OpenAI
| Model | Context | Input Cost / 1M tokens | Output Cost / 1M tokens |
|---|---|---|---|
| o1 | 128k | ¤ 1410.26 | ¤ 5641.03 |
| o1-mini | 128k | ¤ 282.05 | ¤ 1128.21 |
| gpt-4o | 128k | ¤ 235.04 | ¤ 940.17 |
| gpt-4o-mini | 128k | ¤ 14.10 | ¤ 56.41 |
Mistral
| Model | Context | Input Cost / 1M tokens | Output Cost / 1M tokens |
|---|---|---|---|
| ministral-3b-latest | 128k | ¤ 3.76 | ¤ 3.76 |
| ministral-8b-latest | 128k | ¤ 9.40 | ¤ 9.40 |
| codestral-latest | 32k | ¤ 18.80 | ¤ 56.41 |
| mistral-small-latest | 32k | ¤ 18.80 | ¤ 56.41 |
| mistral-large-latest | 128k | ¤ 188.03 | ¤ 564.10 |
Groq
| Model | Context | Input Cost / 1M tokens | Output Cost / 1M tokens |
|---|---|---|---|
| llama-3.3-70b-versatile | 128k | ¤ 55.47 | ¤ 74.27 |
| llama-3.1-8b-instant | 128k | ¤ 4.70 | ¤ 7.52 |
| llama3-70b-8192 | 8k | ¤ 55.47 | ¤ 74.27 |
| llama3-8b-8192 | 8k | ¤ 4.70 | ¤ 7.52 |
| mixtral-8x7b-32768 | 32k | ¤ 22.56 | ¤ 22.56 |
| gemma2-9b-it | 8k | ¤ 18.80 | ¤ 18.80 |
xAI
| Model | Context | Input Cost / 1M tokens | Output Cost / 1M tokens |
|---|---|---|---|
| grok-beta | 131k | ¤ 470.09 | ¤ 1410.26 |
| grok-2-latest | 131k | ¤ 188.03 | ¤ 940.17 |
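To see how the per-token prices translate into Credits for a single request, here is a minimal Python sketch. The `estimate_cost` function is illustrative only (not part of the platform); the prices used in the example come from the gpt-4o-mini row in the OpenAI table above.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_1m: float, output_price_per_1m: float) -> float:
    """Estimate the Credit (¤) cost of one request.

    Prices are expressed in Credits per 1 million tokens, matching the tables above.
    """
    return (input_tokens / 1_000_000) * input_price_per_1m \
         + (output_tokens / 1_000_000) * output_price_per_1m


# Example: a gpt-4o-mini call with 2,000 input and 500 output tokens
# (¤ 14.10 input / ¤ 56.41 output per 1M tokens, from the OpenAI table).
cost = estimate_cost(2_000, 500, 14.10, 56.41)
print(f"Estimated cost: ¤ {cost:.4f}")  # ≈ ¤ 0.0564
```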
Looking for more models? Let us know by emailing us at contact@mainly.ai!