Metric alert lifecycle and the 30-day expiration
Currently, Azure Monitor removes alerts after 30 days. For metric alerts, this creates a situation where an active (fired) alert can be removed by the system BEFORE it reaches the Resolved state. As a result, there is no visual indicator in the UI that the metric is in an unhealthy state. At the same time, Azure Monitor does not fire new notifications, because it considers the underlying metric condition unchanged.
This gap in the Azure monitoring experience has to be addressed. Ideally, I would love to see the following behavior: when the system removes a 30-day-old alert whose monitor condition is "Fired", it sets the alert's monitor condition to "Resolved". This would allow Azure Monitor to fire a new alert on the next metric evaluation, keeping the alert visible until the condition is actually resolved.
P.S. This suggestion is based on the outcome of a support case with Azure Support in which we tried to find out why metric alerts were not being fired. The suggested workaround of setting the alert with autoMitigate = false does not solve the issue, as it triggers a flood of new alerts and notifications.
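For reference, here is a minimal sketch of how that workaround is configured, using the azure-mgmt-monitor Python SDK (auto_mitigate is the SDK counterpart of the ARM autoMitigate property). The subscription, resource group, VM, rule name, and threshold are all hypothetical placeholders, not values from the actual support case:

```python
# Sketch of the suggested workaround: a metric alert rule with
# auto_mitigate disabled, so the alert is never auto-resolved.
# All resource identifiers and thresholds below are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    MetricAlertResource,
    MetricAlertSingleResourceMultipleMetricCriteria,
    MetricCriteria,
)

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical
RESOURCE_GROUP = "my-rg"                                  # hypothetical
VM_ID = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Compute/virtualMachines/my-vm"  # hypothetical
)

client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

client.metric_alerts.create_or_update(
    RESOURCE_GROUP,
    "high-cpu-alert",  # hypothetical rule name
    MetricAlertResource(
        location="global",
        description="Fires while average CPU stays above 90%.",
        severity=2,
        enabled=True,
        scopes=[VM_ID],
        evaluation_frequency="PT1M",
        window_size="PT5M",
        criteria=MetricAlertSingleResourceMultipleMetricCriteria(
            all_of=[
                MetricCriteria(
                    name="HighCpu",
                    metric_name="Percentage CPU",
                    time_aggregation="Average",
                    operator="GreaterThan",
                    threshold=90,
                )
            ]
        ),
        # The workaround discussed above: with auto_mitigate=False the
        # alert is not auto-resolved, but every evaluation that still
        # breaches the threshold fires a new alert, which is exactly the
        # flood of notifications described in this post.
        auto_mitigate=False,
    ),
)
```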