Azure Stream Analytics to handle case-sensitive columns (currently getting InputDeserializerError)
Azure Stream Analytics currently does not accept input events that contain two columns whose names differ only by case.
//input data (fragment of the JSON event)
"shopperId":"1234567",
"shopperid":"",
//ASA returns an InputDeserializerError for this input
{"Source":"eventhubinput","Type":"DataError","DataErrorType":"InputDeserializerError.InvalidData","BriefMessage":"Could not deserialize the input event(s) from resource 'Partition: [0], Offset: [12894685888], SequenceNumber: [104569]' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format","Message":"Input Message Id: Partition: [0], Offset: [12894685888], SequenceNumber: [104569] Error: Index was out of range. Must be non-negative and less than the size of the collection.\r\nParameter name: index","ExampleEvents
Ask to the Azure Stream Analytics team
Please consider changing Azure Stream Analytics so that it can handle case-sensitive columns, just as Newtonsoft.Json can. This is important for users who work with the Elastic Common Schema (ECS) and Azure Stream Analytics.
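For comparison, here is a minimal sketch (illustrative, not part of the original report) showing that Newtonsoft.Json parses the same payload without error; the JObject string indexer is case-sensitive by default, so both keys are preserved and addressable independently:

//minimal Newtonsoft.Json sketch (illustrative)
using System;
using Newtonsoft.Json.Linq;

class CaseSensitiveKeysDemo
{
    static void Main()
    {
        // The same two keys that ASA rejects, differing only by case.
        string json = @"{ ""shopperId"": ""1234567"", ""shopperid"": """" }";

        // JObject keeps both properties; property lookup uses
        // case-sensitive (ordinal) comparison by default.
        JObject evt = JObject.Parse(json);

        Console.WriteLine(evt["shopperId"]); // prints: 1234567
        Console.WriteLine(evt["shopperid"]); // prints an empty line
    }
}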
Thank you,

1 comment
-
Anonymous commented
Please refer to the Elastic Common Schema:
https://www.elastic.co/guide/en/ecs/current/ecs-custom-fields-in-ecs.html
It explicitly recommends using field names that differ only by case to distinguish custom fields from ECS fields.
What happens to your customers who want to leverage ECS migration paths? They will have to look for an alternative streaming analytics solution to keep data consistent across all storage paths.
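For illustration (hypothetical values; "host.name" is an ECS field, while the capitalized "Name" stands in for a custom field distinguished only by case), an event following this convention is exactly what ASA currently rejects:

//hypothetical ECS-style event with a case-distinguished custom field
{
  "host": {
    "name": "web-01",
    "Name": "custom display name"
  }
}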