Power BI Semantic Model Memory Errors, Part 2: Max Offline Semantic Model Size

In the Power BI Service, Import mode models are stored offline and paged into memory only when they are needed – for example because someone runs a report that uses the model as its source. As discussed in my last post, though, there’s a limit on the amount of memory that a model can use, which varies with the size and type of capacity you’re using. There’s an extra complication too: Import mode models are stored offline in a compressed format, which means the Power BI Service doesn’t know exactly how much memory will be needed if the entire model has to be held in memory. As a result, an additional limit is enforced on the size of the offline copy of an Import mode model, to ensure it doesn’t use too much memory when it is eventually paged in.

This limit can be configured in the Admin Portal by setting the Max Offline Semantic Model Size property (still shown as Max Offline Dataset Size in the UI at the time of writing).

The default value of this property is 0, which means the limit is set to the maximum allowed value for the capacity SKU you’re using. Different maximum limits are enforced depending on whether or not you have turned on the Large semantic model storage format option for your model. For the Large model format, the maximum limits are the same as the limits on the maximum amount of memory a model can use, as listed in the table here; for the Small model format they are published here. You can also set the property to a value lower than the allowed limit if you want to control the size of the models your developers can publish – smaller models generally use fewer CUs during refresh or querying, which could help reduce the load on your capacity.
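
If you want a rough idea of how close a model is to this limit, one option – a sketch, assuming you can connect to your workspace’s XMLA endpoint with a tool like DAX Studio or SQL Server Management Studio – is to query the storage DMVs and sum the segment sizes in the client:

    SELECT DIMENSION_NAME, TABLE_ID, COLUMN_ID, USED_SIZE
    FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS

Bear in mind that this DMV reports in-memory sizes rather than the size of the compressed offline copy, so the total won’t exactly match the figure the limit is enforced against, but it’s a useful proxy.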

To test what happens when you exceed this limit I created a new F16 capacity (which has a maximum offline model size of 5GB), then created a workspace on it containing an Import mode Power BI semantic model, using the Large model format, that was 3.3GB in size. I then scaled the capacity down to an F2, where the Max Offline Semantic Model Size limit for models with the Large model format enabled is only 3GB – less than the size of the model I had created – and ran a DAX query against the model.
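
Any query that forces the model to be paged into memory will do here; for example, a trivial query like this one (run from DAX Studio or any other client connected to the model) is enough:

    EVALUATE
    { 1 }

Because the model’s offline size exceeded the 3GB limit, the query returned the following error message: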

Database ‘xyz’ exceeds the maximum size limit on disk; the size of the database to be loaded or committed is 3500853108 bytes, and the valid size limit is 3221225472 bytes. If using Power BI Premium, the maximum DB size is defined by the customer SKU size (hard limit) and the max dataset size from the Capacity Settings page in the Power BI Portal.
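
The numbers in the message line up: 3221225472 bytes is exactly 3GB (3 × 1024³ bytes), while 3500853108 bytes is roughly 3.26GB – which matches the 3.3GB model.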

If you’re getting this error then you need to reduce the size of your model, for example by reducing the number of rows in your tables, removing unused columns, or reducing the number of distinct values in the columns you keep. You can find more suggestions on how to reduce model size here.
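
To work out where to focus, it helps to see which columns take up the most space. Here’s a minimal sketch using another storage DMV (the same data that tools like DAX Studio’s VertiPaq Analyzer view surface); note that DMV queries only support a restricted SQL dialect, so any sorting has to be done on the client side:

    SELECT DIMENSION_NAME, COLUMN_ID, DICTIONARY_SIZE
    FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
    WHERE COLUMN_TYPE = 'BASIC_DATA'

Columns with large dictionary sizes – typically high-cardinality text columns – are usually the best candidates for removal or for reducing the number of distinct values.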

[Thanks to Akshai Mirchandani for answering my questions on this topic]
