Increase the 5 MB limit for checkpoints
Many cmdlets need to be run inside an InlineScript block, and because checkpoints are not available inside InlineScript, assigning the values returned from an InlineScript to a variable and checkpointing afterwards is a workaround... sometimes.
When dealing with larger data sets, e.g. Get-MsolUser with more than 1,000 users, this no longer works because of the 5 MB limit per checkpoint.
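For reference, a minimal sketch of the pattern described above (the workflow name, $cred, and the selected properties are illustrative, not the actual script):

workflow Get-UserReport
{
    param($cred)

    # Run the cmdlets that are incompatible with PowerShell Workflow inside
    # InlineScript and capture what they return.
    $users = InlineScript {
        Connect-MsolService -Credential $using:cred
        Get-MsolUser -All | Select-Object UserPrincipalName, DisplayName, Department
    }

    # Persist the returned data so a suspend/resume does not start from scratch.
    # With large result sets, this checkpoint is where the size limit is hit.
    Checkpoint-Workflow

    foreach ($user in $users)
    {
        # per-user processing goes here
    }
}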

Thanks for posting this! How large does the data set get when you are working with a function like Get-MsolUser with 1,000 users? Also, how frequently do you rely on PowerShell Workflow functionality in your scripts vs. using plain PowerShell and making the script idempotent?
3 comments
-
Ben Stegink commented
I’m dealing with this now as well. Trying to work with 27,000 users and running into all kinds of issues. Any resolution or update on working with large sets of Azure AD users?
-
The issue that you are seeing makes sense, and our team is looking at your request and the possibility of increasing the limits on checkpoints. This limit was put in place to make sure that users don't abuse the system, though, so it needs a bit of thought before we make changes.
There are a few options that might give you a good workaround in the meantime.
1. Use a Hybrid Runbook Worker: Hybrid Runbook Workers don't have fair share applied to them, so you will not hit the 3-hour time limit if you run there. You could use an Azure VM for this and spin the machine up whenever you need to run a job like this.
2. You could use some of the Get-MsolUser parameters to partition the user data into smaller sets that you can persist (a sketch of this approach follows below). SearchString, State, or Department might help, depending on how large the data you are looking at is. For example, you could use SearchString to return only users whose name starts with A, then B, etc., partitioning the returned set into 26 segments so that you retrieve and process/checkpoint just one segment at a time. If you need to break it down further, you could use State or Department.
Let me know if one of these suggestions helps fix the issue while we consider modifying the throttling limits. We can also discuss in more detail to try to find something that works for you.
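A minimal sketch of that partitioning approach, assuming the MSOnline module and a credential in $cred (the workflow name and selected properties are illustrative). Note that SearchString matches the start of the display name or email address, so users starting with digits or symbols would need a separate pass:

workflow Get-UserStatsByLetter
{
    param($cred)

    # Walk the alphabet (ASCII 65-90 = 'A'-'Z') and handle one initial at a time.
    foreach ($code in 65..90)
    {
        # Each InlineScript runs in its own session, so connect inside it.
        $chunk = InlineScript {
            Connect-MsolService -Credential $using:cred
            $prefix = [string][char]$using:code
            Get-MsolUser -SearchString $prefix |
                Select-Object UserPrincipalName, DisplayName, Department
        }

        # ...process the users in $chunk here...

        # Checkpoint after each segment so only this small slice has to be
        # persisted, instead of the full result set.
        Checkpoint-Workflow
    }
}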
-SearchString <string>
    The string to search for users. Only users with an email address or display name starting with this string will be returned.
-State <string>
    The filter for the user's state.
-Department <string>
    The department to filter results on.
-
Philippe Creytens commented
Beth, I haven't got a clue and, IMO, no means of figuring this out. Here is the thing: I need to generate some (custom) stats for 4,460 Office 365 users. If I run the script in an InlineScript (and I need to, because most of the Exchange Online cmdlets are incompatible with PowerShell Workflow), the script hits the 3-hour barrier and is suspended. I can only assume that it later resumes, but it will resume from the start, creating a vicious circle. Checkpoints could be a solution, but only if the output of the InlineScript can be stored. That's the scenario where the "stream failed..." error appears once the 50,000 kB limit is hit.
Microsoft Support tried to assist and suggested this approach because it works for them, and for me... but only with small result sets, e.g. Get-MsolUser -MaxResults 500.
As there is no way to "page" Get-MsolUser (first 500, second 500, etc.), I'm stuck.
I cannot imagine that we are the only company wanting to use Azure Automation to manipulate more than 500 users...