My SharePoint is getting throttled. What can I do?

Throttling is always in the back of your mind when migrating content to SharePoint using scripts. Microsoft limits the number of calls to various APIs to ensure users and M365 tenants aren’t impacted by the actions of others. Fair enough: imagine the complaints if SharePoint randomly slowed down whenever someone in a different organisation ran a bulk action.

Throttling limits differ based on the number of licensed users in your tenant. This makes sense because the volume of general activity will be higher in larger tenants. However, there is a problem: just because you are a smaller organisation doesn’t mean you have less content.

You can reduce the impact of throttling in a few different ways.

Take your time and plan your migration

Give yourself plenty of time to upload when planning your migration. Consider doing a staged migration where you do the initial content migration before go-live and migrate recently changed content only during your migration window.

The most common throttling limits are applied in 5-minute intervals; pausing for a few minutes resets the allocation. Per-app tenant limits are based on a 24-hour period, and these are a hard limit. Details of the throttling thresholds can be found here.
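When SharePoint does throttle you, it returns an HTTP 429 response with a Retry-After header telling you how long to wait. A minimal sketch of honouring that header in an upload loop is below; Add-PnPFile is a real PnP PowerShell cmdlet, but the file path, folder name, and the exact shape of the exception response are assumptions that may vary with your PnP PowerShell version, so treat this as a pattern rather than copy-paste code.

```powershell
# Sketch: retry an upload, honouring the Retry-After header on HTTP 429
$maxAttempts = 5

for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
    try {
        Add-PnPFile -Path "c:\test\report.docx" -Folder "Shared Documents"
        break   # success - stop retrying
    }
    catch {
        $response = $_.Exception.Response
        if ($response -and [int]$response.StatusCode -eq 429) {
            # SharePoint tells you how long to back off before trying again
            $wait = $response.Headers["Retry-After"]
            if (-not $wait) { $wait = 60 }   # sensible default if the header is missing
            Write-Host "Throttled - waiting $wait seconds (attempt $attempt of $maxAttempts)"
            Start-Sleep -Seconds $wait
        }
        else {
            throw   # not a throttling error - re-raise it
        }
    }
}
```

Recent versions of PnP PowerShell also handle some retries for you, but backing off when asked (rather than hammering the API) is what keeps a long-running migration from being blocked.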

You should also consider whether all content needs to migrate or whether you just need the more recent documents (archiving the old stuff), e.g., the last 2 years’ content. If you are migrating from a document management system, consider whether you need only the most recent published document or the version history as well. The answer may differ for different areas of content.

PnP PowerShell Batch Updates

Rather than uploading documents individually, PnP PowerShell supports batch mode (New-PnPBatch). This is a feature of the API used to upload documents, so if you are building a custom solution, you can use the same technique.

Example PowerShell Script using Batch to upload files

This code uses an input CSV file to bulk create List items.

# Connect to the target site first, e.g. Connect-PnPOnline -Url $SiteUrl -Interactive

# Read CSV file
$ListItems = Import-CSV c:\test\datafile.csv -Header Title,Type,Location

# Create a batch to collect the operations
$batch = New-PnPBatch

foreach ($item in $ListItems)
{
    Add-PnPListItem -List "MyList" -Values @{"Title" = $item.Title; "Type" = $item.Type; "Location" = $item.Location } -Batch $batch
}

# Execute the batch all at once
Invoke-PnPBatch -Batch $batch

Things to note:

  • New-PnPBatch: Creates a batch to group operations.
  • Add-PnPListItem -Batch: Creates List items with the values from the input variables.
  • Invoke-PnPBatch: Sends all operations in a single request.

Batch processing increases the upload rate significantly and will reduce the time it takes to bulk load, especially when you have very large numbers of items to migrate.

Unfortunately, batch uploading of documents is not supported. However, you can batch metadata updates using Set-PnPListItem (use the Document Library as the List name) and the item ID (add an ID column to the input CSV file), then use this code to update:

Set-PnPListItem -List $ListName -Identity $ID -Values $ItemValue -Batch $Batch 
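Putting that together, here is a sketch of a batched metadata update driven by the same kind of CSV file as the earlier example. The column names (ID, Type, Location), the file path, and the library name "MyDocs" are assumptions; substitute your own.

```powershell
# Read a CSV that includes the list item ID plus the metadata to apply
$ListItems = Import-CSV c:\test\metadata.csv -Header ID,Type,Location

# Create a batch to collect the updates
$Batch = New-PnPBatch

foreach ($item in $ListItems)
{
    Set-PnPListItem -List "MyDocs" -Identity $item.ID `
        -Values @{"Type" = $item.Type; "Location" = $item.Location } -Batch $Batch
}

# Send all updates in a single request
Invoke-PnPBatch -Batch $Batch
```

Upload the documents first (individually or with a migration tool), capture their item IDs into the CSV, and then apply the metadata in batches.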

Alternatives to scripting

In most cases migrating files to SharePoint with a migration tool is a practical option. I tend to use PowerShell when migrating from legacy document management systems where I want the metadata as well as the document.

The SharePoint Migration Tool and Microsoft 365 Migration Manager support some of the more common migration requirements.

You can still get throttling with migration tools, especially if you are running multiple migrations at once in your tenant.

Can we get a temporary exemption?

Microsoft does not allow exemptions to the throttling limits for SharePoint content, so this isn’t an option, unfortunately. Microsoft does offer a “Data Box” service for very large migrations to SharePoint.

Learn more about throttling

