Azure Artifacts: Unpublished npm packages from public upstream (npmjs.org)
We are using Azure Artifacts as an npm feed to store an npm package that we produce, along with cached upstream packages from npmjs.org. The public upstream has been configured per your documentation, and we have enabled retention policies on the feed with the default values.
The problem we are facing is that public upstream packages are treated the same way as the packages we produce: if a package/version has not been consumed for a while, it will be removed in accordance with the retention policies and later also removed from the Recycle Bin. After it has been removed, either to the Recycle Bin or purged completely, any build requesting that package/version will fail.
If the package/version is still available in the Recycle Bin, we can manually restore it to get the build working again. That is not good enough (the build might be production critical). If the package/version has been purged completely, we should be able to get it restored by contacting Microsoft. That is definitely not good enough, regardless of criticality.
This is a flawed approach. The expectation must be that any package in a public upstream will always be available (as long as it is still available in the upstream and has not been actively removed from our feed, e.g. due to compatibility or security issues). If the cached version has been removed in accordance with the retention policies, it should be seamlessly put back.
With the current implementation we are therefore forced to either:
1. Use the public upstreams directly, defeating some of the purpose of the Azure Artifacts feeds.
2. Keep running builds to ensure that no needed packages are ever removed.
3. Disable, or use less restrictive, retention policies for feeds with public upstreams; not OK when you are soon about to charge for feed storage...
How packages/versions from public upstreams should work:
1. If a package has been removed from the feed by the retention policies, regardless of whether it was moved to the Recycle Bin or purged completely, it should seamlessly be put back when requested; the build should not fail.
2. If a package has been removed manually (possibly for good reasons), it should not be put back when requested; the build should fail.
3. There should be a way to get manually removed packages back in case a package is wrongly removed.
Ben Sgroi commented
@Dustin -- we have managed to work around the issue by renaming/scoping all of our 1st-party packages. That way we can use "@<scopename>:registry=https://pkgs.dev.azure.com/..." in our .npmrc to load all the 1st-party packages from the ADO feed, while all of the upstream packages come directly from the main npm registry. This solution also fixed another issue we were having: frequent failures during npm installs.
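For anyone trying the same workaround, a minimal sketch of the kind of .npmrc this implies is below. The scope, organization, project, and feed names are placeholders for illustration, not the commenter's actual values:

```ini
; First-party packages under the @myscope scope resolve from the Azure
; Artifacts feed. <org>, <project>, and <feed> are placeholder names.
@myscope:registry=https://pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/npm/registry/

; Everything else (all upstream/public packages) resolves directly from
; the public npm registry, bypassing the feed and its retention policies.
registry=https://registry.npmjs.org/

always-auth=true
```

The trade-off is that upstream packages are no longer cached in the feed at all, so you lose the availability guarantee that upstream caching was meant to provide in the first place.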
But it's certainly frustrating that this is handled so poorly in ADO.
Dustin Brooks commented
What are the best workarounds? We have our own packages in the feed and don't want to re-create the feed from scratch. We have many unrecoverable packages in the upstream that are transitive dependencies.
Assuming that scoping all of our own packages and explicitly adding them to the .npmrc is our best option.
Luse, Kaleb commented
Wholly unacceptable. As others have stated, this seems like such a common-sense thing, or at the very least something that should be fixed immediately, given that the only acceptable workaround seems to be to stop using Azure Artifacts, which my team is now in the process of doing.
I found this one the hard way as well, by investigating builds that started to fail randomly!
I agree with the previous comments: this should be common sense, and packages from upstream sources should be available again if deleted by retention settings.
Please fix it.
Kim Wallis commented
I agree with Charles. This feels like basic functionality that should just be there and should just work. Having a system that randomly breaks itself, and that is then impossible for even admin users to fix, is just not acceptable.
Charles Roberto Canato commented
I feel stupid spending votes on something that should be common sense. Why on earth would someone want to unpublish PUBLIC packages automatically, WITHOUT the possibility of recovering them when needed?
Ben Sgroi commented
This is a big problem for us too. We're now in a situation where certain transitive upstream packages won't install because they fell outside of our retention policies. This happened over 30 days ago, so now we have no way of installing some major packages (like Angular 11). We may have to reevaluate using Azure Artifacts as a result.
Brett Postin commented
We have just started encountering this after enabling retention policies. Suddenly random builds have started to break due to package versions being deleted from our feed.
These package versions in many cases are not even directly referenced by our projects, they are transitive dependencies. We do not want to be in a position of having to constantly maintain top-level package versions to deal with transitive dependencies that do not keep up with our retention policies. This is a constant game of whack-a-mole.
The only workaround for now is to manually restore the offending packages, and that only works if they are still available in the Recycle Bin. Increasing the retention policy version count also alleviates the issue somewhat; however, we are then forced to pay for additional storage.
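When many versions need restoring, clicking through the Recycle Bin UI gets tedious, so it may be worth scripting the restore against the Azure DevOps REST API. The sketch below only builds the request and prints the URL; the endpoint path, api-version, and the `{"deleted": false}` body reflect my understanding of the documented "Npm Recycle Bin - Restore Package Version" call and should be verified against the current REST reference before use:

```python
import base64
import json
import urllib.request

def build_restore_request(org, feed, package, version, pat):
    """Build a PATCH request that restores a deleted npm package version
    from an Azure Artifacts feed's Recycle Bin.

    Assumptions (verify against the Azure DevOps REST docs):
      - endpoint: .../npm/RecycleBin/packages/{package}/versions/{version}
      - a body of {"deleted": false} un-deletes the version
      - scoped package names (@scope/name) must be URL-encoded by the caller
    """
    url = (
        f"https://pkgs.dev.azure.com/{org}/_apis/packaging/"
        f"feeds/{feed}/npm/RecycleBin/packages/{package}/"
        f"versions/{version}?api-version=7.1-preview.1"
    )
    # Azure DevOps accepts a personal access token via Basic auth
    # with an empty username.
    token = base64.b64encode(f":{pat}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps({"deleted": False}).encode(),
        method="PATCH",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
    )

# Example with placeholder names; nothing is sent until urlopen() is called.
req = build_restore_request("myorg", "myfeed", "left-pad", "1.3.0", "<PAT>")
print(req.get_full_url())
```

Looping this over the versions reported missing by failing builds turns the whack-a-mole into a one-liner, though it still does not help once a version has been purged from the Recycle Bin entirely.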
As the OP suggests, packages from upstream sources should not be treated the same as "published" packages. Upstream source packages should always be available. We have moved from MyGet which handles this seamlessly. The approach taken with Azure Artifacts seems to be a fundamental design flaw in the concept of upstream sources.