Identify and reject duplicate log transmissions
We experienced occasional duplicate log entries in Application Insights
that we were able to trace back to failed transmissions:
> An error occurred while sending the request. | Unable to read data from the transport connection: Connection reset by peer. | Connection reset by peer
The ServerTelemetryChannel, which we are using, then stores the transmission in the file system and retries it after a certain interval. The issue is that the first transmission actually succeeded and the networking error only occurred afterwards, for some unknown reason, so the retry causes the data to be processed twice.
To prevent this, I would propose the following solution:
- Add an id/digest property to the API that receives log transmissions
- Implement logic that rejects a transmission if one with the same id has already been processed for the same instrumentation key
This would allow the ServerTelemetryChannel SDK to keep the id constant across a transmission and its retries, retaining the ability to recover a failed transmission without generating duplicate log entries.
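
A minimal sketch of the idea, written in Python rather than the actual ingestion service or SDK code. The names `TransmissionDeduplicator`, `transmission_id_for`, the digest-based id, and the in-memory store are assumptions for illustration only, not part of any existing API:

```python
import hashlib


class TransmissionDeduplicator:
    """Server-side sketch: reject a transmission whose id was already
    processed for the same instrumentation key."""

    def __init__(self):
        # In a real service this would be a shared, expiring store
        # (e.g. a distributed cache), not an in-memory set.
        self._seen: set[tuple[str, str]] = set()

    def accept(self, instrumentation_key: str, transmission_id: str) -> bool:
        key = (instrumentation_key, transmission_id)
        if key in self._seen:
            return False  # duplicate retry of an already processed transmission
        self._seen.add(key)
        return True


def transmission_id_for(payload: bytes) -> str:
    """Client-side sketch: derive a stable id (digest) from the serialized
    payload so the id stays constant across retries of the same transmission."""
    return hashlib.sha256(payload).hexdigest()


# Usage sketch: the first delivery is processed, the retry of the
# same payload after a connection reset is rejected as a duplicate.
dedup = TransmissionDeduplicator()
payload = b'{"items": "..."}'
tid = transmission_id_for(payload)
assert dedup.accept("ikey-123", tid) is True   # initial transmission
assert dedup.accept("ikey-123", tid) is False  # retry of the same transmission
```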
