Keep recently unsupported Databricks runtime versions in ADF linked services
When a Databricks runtime version becomes unsupported, it is removed from the list of available cluster versions in Data Factory's Databricks linked service.
Any linked service that was configured with that version before it became unsupported can no longer connect to Databricks from that point onward, and ADF pipelines that use the linked service fail to run.
The error message states "the specified spark version X.Xx-scalaX.XX is not available".
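For context, the runtime version is pinned in the linked service definition itself via the `newClusterVersion` property, so the failure occurs even though nothing in the factory was changed. A minimal sketch of such a definition (the domain, node type, and version values below are illustrative, not taken from a real workspace):

```json
{
    "name": "AzureDatabricksLinkedService",
    "properties": {
        "type": "AzureDatabricks",
        "typeProperties": {
            "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
            "newClusterVersion": "10.4.x-scala2.12",
            "newClusterNodeType": "Standard_DS3_v2",
            "newClusterNumOfWorker": "1"
        }
    }
}
```

Once "10.4.x-scala2.12" (for example) is dropped from the supported list, a linked service pinned to it starts failing with the error above until someone manually edits the definition to a supported version.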
Could you please avoid removing recently unsupported Databricks runtime versions from the ADF linked service, or at least prevent the removal from breaking previously configured linked services that use them?