The llm.generate_text node defaults to our LLM proxy, which bills usage directly against the Credits included in your MainlyAI Platform subscription. This also means you don't have to manually set up API keys or other credentials. To use your own models, simply connect them to the llm receiver.

Supported Models

All models are billed in Credits (¤) per 1 million tokens. The prices listed in the tables below may not reflect the current pricing exactly and are only intended to give you a rough idea of the cost of each model. Always check the pricing directly in the platform if you want to know the exact current costs.
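Because every model is priced per 1 million tokens, estimating the Credit cost of a request is simple arithmetic. The sketch below illustrates the calculation; `estimate_cost` is a name chosen here for illustration, not a platform API, and the example prices are gpt-4o-mini's rates from the table below:

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price: float, output_price: float) -> float:
    """Estimate the Credit (¤) cost of a single request.

    Prices are given per 1 million tokens, as in the tables below.
    """
    return (input_tokens / 1_000_000) * input_price \
         + (output_tokens / 1_000_000) * output_price

# Example: gpt-4o-mini (¤ 14.10 input / ¤ 56.41 output per 1M tokens)
# with 10,000 input tokens and 2,000 output tokens:
cost = estimate_cost(10_000, 2_000, 14.10, 56.41)
print(f"¤ {cost:.5f}")  # → ¤ 0.25382
```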

OpenAI

| Model | Context | Input Cost / 1M | Output Cost / 1M |
| --- | --- | --- | --- |
| o1 | 128k | ¤ 1410.26 | ¤ 5641.03 |
| o1-mini | 128k | ¤ 282.05 | ¤ 1128.21 |
| gpt-4o | 128k | ¤ 235.04 | ¤ 940.17 |
| gpt-4o-mini | 128k | ¤ 14.10 | ¤ 56.41 |

Mistral

| Model | Context | Input Cost / 1M | Output Cost / 1M |
| --- | --- | --- | --- |
| ministral-3b-latest | 128k | ¤ 3.76 | ¤ 3.76 |
| ministral-8b-latest | 128k | ¤ 9.40 | ¤ 9.40 |
| codestral-latest | 32k | ¤ 18.80 | ¤ 56.41 |
| mistral-small-latest | 32k | ¤ 18.80 | ¤ 56.41 |
| mistral-large-latest | 128k | ¤ 188.03 | ¤ 564.10 |

Groq

| Model | Context | Input Cost / 1M | Output Cost / 1M |
| --- | --- | --- | --- |
| llama-3.3-70b-versatile | 128k | ¤ 55.47 | ¤ 74.27 |
| llama-3.1-8b-instant | 128k | ¤ 4.70 | ¤ 7.52 |
| llama3-70b-8192 | 8k | ¤ 55.47 | ¤ 74.27 |
| llama3-8b-8192 | 8k | ¤ 4.70 | ¤ 7.52 |
| mixtral-8x7b-32768 | 32k | ¤ 22.56 | ¤ 22.56 |
| gemma2-9b-it | 8k | ¤ 18.80 | ¤ 18.80 |

XAI

| Model | Context | Input Cost / 1M | Output Cost / 1M |
| --- | --- | --- | --- |
| grok-beta | 131k | ¤ 470.09 | ¤ 1410.26 |
| grok-2-latest | 131k | ¤ 188.03 | ¤ 940.17 |

Looking for more models? Let us know by emailing us at contact@mainly.ai!