aiExpert Settings
aiExpert talks to either the OpenAI ‘GPT’ models or the Google Gemini models.
To do this, it needs an account with whichever LLM provider you choose to use. We currently use GPT-5 for our testing, as we have a paid-for account on that platform.
As a user of aiExpert, you get 50 free queries against our paid-for account. After that, you need your own paid-for account: all API calls to the OpenAI GPT models are chargeable.
What will it cost?
As we said above, your first 20 queries are free, paid for by eaTeamWorks.
After that, once you have an OpenAI account, making requests via aiExpert is currently quite cheap: we have used only USD 0.63 since we started creating aiExpert. We think this is because the amount of data we send to the LLM is quite small (by LLM standards) and it is all text, which is by far the cheapest kind of input.
As of October 2025, the pricing for GPT-5 Mini, our default LLM, is USD 0.25 per 1M input tokens and USD 2.00 per 1M output tokens. As a guide, a request to the LLM for one of our test BPMN diagrams (which had 15 elements) used about 300 tokens, and the response (for a deliberately bad example with lots of errors) was about 15,000 tokens. So you can see that each call is very cheap, and loading your account with USD 20 will probably last you a long time.
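As a rough sanity check on those numbers, here is a minimal sketch in Python that estimates the cost of one such call and how many calls USD 20 would cover. It assumes the October 2025 GPT-5 Mini prices and the example token counts quoted above; actual prices and token usage will vary.

```python
# Illustrative cost estimate only. Prices and token counts are the
# October 2025 GPT-5 Mini figures quoted in this article and may change.

INPUT_PRICE_PER_M = 0.25   # USD per 1M input tokens (GPT-5 Mini, Oct 2025)
OUTPUT_PRICE_PER_M = 2.00  # USD per 1M output tokens (GPT-5 Mini, Oct 2025)

def call_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Rough cost of a single LLM call at the quoted per-million-token prices."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# The example from the text: about 300 input tokens for a 15-element BPMN
# diagram, and about 15,000 output tokens for a deliberately error-heavy response.
cost = call_cost_usd(300, 15_000)
print(f"Cost per call: USD {cost:.4f}")             # roughly USD 0.03
print(f"Calls per USD 20: about {20 / cost:,.0f}")  # roughly 665 calls
```

Even for an unusually long response like the one in the example, a single call works out at around three US cents, which is why a small top-up lasts a long time.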