Dynamics 365 CRM/CDS Storage Pricing & Enforcement
Storage for Dynamics 365 environments has been a hot topic for us over the past few months. Microsoft announced the new storage pricing a good while ago, but has only recently started enforcing limitations on customers that are consuming more storage than their subscription includes, which is putting pressure on everyone to get current.
Here’s a quick refresher on what’s changing for Dynamics 365 Storage:
The previous storage model, aka the “legacy storage model,” had all storage covered under a single add-on SKU, usually priced around $5-$10 per GB per month.
In the new model, Microsoft distinguishes between three types of storage, each with its own SKU, pricing, and per-user entitlement levels. Here’s how it breaks down:
- Database storage includes all your CRM records (Accounts, Contacts, Activities, etc.) and the definition of how your system is set up (fields, entities, workflows, etc.). You get 10 GB included with your subscription, plus 250 MB per licensed user (for Enterprise licenses). This data is at the heart of your Dynamics 365 environment, and Microsoft is pricing it accordingly, charging $40 per GB per month if you need more than the base entitlement.
- File storage includes your file attachments, typically to your Appointments, Emails, and Notes. You get 20 GB of file storage to start with, plus 2 GB per licensed user (for Enterprise licenses). Additional file storage costs $2 per GB per month.
- Log storage includes your Audit History (the records of who changed which field, when, and what it was changed from/to). You get 2 GB of log storage, with no additional entitlements per user. Additional log storage costs $10 per GB per month.
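The entitlement math above can be sketched as a quick calculation. This is an illustrative helper, not an official tool; the figures are the ones stated above and apply to Enterprise licenses.

```python
def entitlements_gb(enterprise_users: int) -> dict:
    """Return the tenant's included storage, in GB, for each bucket."""
    return {
        "database": 10 + 0.25 * enterprise_users,  # 10 GB base + 250 MB/user
        "file": 20 + 2 * enterprise_users,         # 20 GB base + 2 GB/user
        "log": 2,                                  # flat 2 GB, no per-user add
    }

# e.g. a 40-user Enterprise tenant:
print(entitlements_gb(40))  # → {'database': 20.0, 'file': 100, 'log': 2}
```

Because the entitlements are tenant-wide, you only run this once against your total licensed user count, not per environment.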
The storage entitlements are at the tenant level, so all environments count against the same three buckets of storage. Along these lines, another change Microsoft made is the ability to create as many environments as you want, at no additional charge, as long as you have a minimum of 1 GB of database storage available. We recommend that all customers take advantage of this to create a “free” sandbox environment for training and testing.
Where to find your current storage consumption
You can find your current storage consumption report in the Power Platform Admin Center, under Resources > Capacity. From this report, you can also drill in to see how much of each storage type is being consumed by the environment.
Restrictions in place
Microsoft has started putting restrictions in place for tenants that are over their storage entitlement. At the moment, if you’re over, you can’t create a new environment or copy one environment over another. If you need to take either of these actions, you’ll need to purchase at least 1 GB more than you’re currently consuming.
Now that we’re in this new paradigm of storage, we’re recommending that all Dynamics customers review their storage amounts and take some specific actions for each type to make sure the benefits are being maximized at the new cost structure. Here are our high-level recommendations:
Database Storage
At $40/GB/mo, the key here is to make sure all data stored in the system has a purpose and is worth keeping. For this exercise, drill into your Production environment in the Capacity Reporting.
Then ‘Download all tables’ from the ‘Top database capacity use, by table’ chart. Open this in Excel and sort it with the largest table at the top, and focus your efforts on the largest tables.
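The sort-and-prioritize step can be done outside Excel too. Here’s a minimal sketch using Python’s standard csv module; the column headers (“Table”, “Size (MB)”) are assumptions — check the headers in your actual ‘Download all tables’ export before running anything like this.

```python
import csv

def largest_tables(rows, size_key="Size (MB)", n=10):
    """Sort table rows (dicts) by size, biggest first, and keep the top n."""
    return sorted(rows, key=lambda r: float(r[size_key]), reverse=True)[:n]

# Typical use against the downloaded export (filename is hypothetical):
# with open("capacity_by_table.csv", newline="") as f:
#     for row in largest_tables(list(csv.DictReader(f))):
#         print(row["Table"], row["Size (MB)"])
```

Focusing on the top ten rows first keeps the cleanup effort where it actually moves the bill.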
Some tables that are usual suspects in the top 10 are unfortunately not under our control (like WebResourceBase and RibbonClientMetadataBase). There’s an effort underway to get Microsoft’s attention on this on the Dynamics 365 Application Ideas forum.
For the tables that we do have control over, have a look at what they are, and spend some time looking at Advanced Finds to gauge their level of use. When was the first record created? When was it last modified? Are records that old still valuable? Would you be able to get by with keeping only the last 2, 3, or 4 years of data? This will take some time to put together and get a consensus from the users on, but once decided, you can use Bulk Delete jobs to clear out the unnecessary records (and you can schedule them to keep your record retention policy in place, hands-free).
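Bulk Delete jobs are driven by a query that selects the records to remove. As a rough sketch of the retention approach described above, the snippet below builds a FetchXML filter using the olderthan-x-years condition operator; the entity and date field shown are examples only, and you should verify the generated query in Advanced Find before scheduling a job against it.

```python
def retention_fetchxml(entity: str, years: int, date_field: str = "createdon") -> str:
    """Build a FetchXML query selecting records older than the retention window.

    'olderthan-x-years' is a standard FetchXML condition operator; the entity
    and date field passed in here are illustrative -- adjust for your tables.
    """
    return (
        f'<fetch><entity name="{entity}">'
        f'<filter><condition attribute="{date_field}" '
        f'operator="olderthan-x-years" value="{years}"/></filter>'
        f"</entity></fetch>"
    )

# e.g. select emails older than 3 years for a scheduled Bulk Delete job
print(retention_fetchxml("email", 3))
```

Pairing a query like this with a recurring Bulk Delete job is what makes the retention policy hands-free.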
File Storage
File storage has come down in price and gone up in entitlements compared to the legacy storage model; however, it’s still a good place to review, as there are a couple of great options. Instead of saving the attachments to Appointments, Notes, and Emails directly on the Dynamics records, you can use an integration to store them in either SharePoint or Azure Blob storage instead.
Microsoft provides an out-of-the-box SharePoint integration that handles this really well. The files themselves get stored in SharePoint (not counting towards your Dynamics File Storage), and are still linked to the records under the Files tab.
An alternative to SharePoint is Azure Blob storage, which is also very affordable. We recently built an integration that automatically moves files attached to Dynamics records to Azure Blob storage and links them back to the original record by overwriting the default link. This is a more seamless experience for the users since they don’t have to go to the Files tab as they would with the out-of-the-box SharePoint integration. Contact us if you’d like to see a demo.
Log Storage
Similar to the Database Storage analysis, for Log Storage you’ll want to make sure that auditing is only enabled for the entities and fields that are valuable. For the analysis, we recommend a three-pronged approach:
Looking Back: Audit Log Retention Review
Go to Settings > Auditing > Audit Log Management, and take a look at how far back your audit logs go.
Do you really need audit history back that far? Would you be able to get by with keeping the last 1, 2, or 3 years of audit data? This might take some discussion with the users, but once agreed upon, you can delete the audit logs that aren’t necessary. Note that the audit logs are stored in three-month blocks, and you can only delete them one block at a time, starting with the oldest. Once the initial round of deletions is done, schedule a quarterly recurring task to review and delete the oldest audit block, which will keep your rolling retention policy in place.
Looking Forward: Entity-Level
Auditing can be controlled at the entity and field level, so the next step is to review which entities have auditing enabled. Use the Audit Center tool in the XRMToolBox to quickly get a view of what’s being audited and speak with the users to determine what’s really needed, and disable entities accordingly.
Looking Forward: Field-Level
Once the entity pass is complete, for the entities with auditing still enabled, review all the fields that have auditing turned on using the Audit Center tool. Again, have a chat with the users about what’s important, and disable the fields that aren’t needed.
At this point, you should have your audit logs as clean as they’re going to get. Depending on how long you’re keeping your audit logs, it might take a few years for your “Looking forward” changes to take full effect (disabling an entity or field doesn’t delete the past audit data – it just won’t save it going forward).
In addition to the maintenance steps above, we’ve also built a tool called Dynamic Audit Locker to offload your audit data to Azure and keep it accessible on the records for the users. This tool is designed to be more affordable than the $10/GB/mo subscription from Microsoft while giving some extra features to the users (namely the ability to search and sort the audit history of a record), and Power BI access to the audit data for admins and power users (great for adoption monitoring). Click here to check out a video demo and let us know if you’d like to talk about setting this up in your environment.
Planning Your Storage Purchase
To avoid having to add new storage every few weeks, we recommend reviewing your rate of usage and purchasing enough storage to cover at least 6 months of projected growth. Microsoft provides some trending data on the individual tables in the capacity reporting (shown above), but we find it easier to manually record the Database, File, and Log storage levels once a month in Excel and use that data for trending.
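Once you have a few months of readings recorded, the 6-month projection is a simple extrapolation. A minimal sketch, assuming straight-line growth at the average month-over-month rate (real usage may grow unevenly):

```python
def monthly_growth_gb(readings):
    """Average month-over-month growth from a list of monthly GB readings."""
    deltas = [b - a for a, b in zip(readings, readings[1:])]
    return sum(deltas) / len(deltas)

def gb_needed(readings, months_ahead=6, capacity_gb=0):
    """Extra GB to buy so projected usage stays under current capacity."""
    projected = readings[-1] + months_ahead * monthly_growth_gb(readings)
    return max(0, projected - capacity_gb)

# e.g. database readings recorded once a month, 60 GB of capacity today:
usage = [40, 42, 45, 47, 50]  # growing ~2.5 GB/month
print(gb_needed(usage, months_ahead=6, capacity_gb=60))  # → 5.0
```

Run this per bucket (Database, File, Log), since each grows at its own rate and is priced differently.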
Another way to go about this is to target a specific percentage utilization at your current usage. Let’s say that your Database usage is at 50 GB, and you’ve got 15 GB of capacity. At this point, you’d be 35 GB short (233% over). If you wanted to be at 50% capacity today (we recommend keeping below 80% utilized to ensure there’s enough runway for growth), you’d need 100 GB of total capacity (50 GB / 50%), meaning an additional 85 GB purchase.
Repeat this process for calculating all three storage buckets, and make the purchase to get back under the limits.
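The target-utilization calculation above reduces to one line of arithmetic: total capacity needed is usage divided by the target, and the purchase is the difference from what you already have. A quick sketch:

```python
def purchase_for_target(usage_gb: float, capacity_gb: float, target_util: float) -> float:
    """GB to add so that usage_gb / new_capacity equals target_util."""
    needed_total = usage_gb / target_util   # e.g. 50 GB / 0.50 = 100 GB
    return max(0, needed_total - capacity_gb)

# The example above: 50 GB used against 15 GB of capacity, aiming for 50%
print(purchase_for_target(50, 15, 0.50))  # → 85.0
```

Swapping in 0.80 for the target gives the smallest purchase that still leaves the recommended runway.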
It’s important to have a look at these new storage prices and their impact on your overall solution costs well ahead of your license renewal. Depending on your mix of users and current storage amounts, the new paradigm might cost less or could cost a lot more. Let us know if you need any help with your analysis.