Support more complex types in Avro format, like Dictionaries and Arrays
When trying to integrate a more complex scenario using the Event Hub Capture feature, I wasn't able to process these messages because the Data Factory copy activity didn't support Dictionaries. Writing to Avro format from Stream Analytics failed as well, because of the Arrays. More complex end-to-end scenarios should be supported.
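Until the copy activity handles these complex types natively, one common workaround is to pre-process the captured events and flatten maps and arrays into primitive columns before handing the data to Data Factory. The sketch below is a minimal, hypothetical example of that idea in plain Python: nested dictionaries (Avro maps/records) become dotted column names, arrays are serialized to JSON strings, and DateTime values are rendered as ISO-8601 text (the field names and payload are invented for illustration, not taken from any real Capture file):

```python
import json
from datetime import datetime, timezone

def flatten(record, prefix=""):
    """Flatten one decoded Avro record into primitive columns.

    - nested dicts (Avro maps/records) -> dotted keys, e.g. "properties.region"
    - lists (Avro arrays)             -> JSON strings
    - datetimes                        -> ISO-8601 strings
    """
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        elif isinstance(value, (list, tuple)):
            flat[name] = json.dumps(list(value))
        elif isinstance(value, datetime):
            flat[name] = value.isoformat()
        else:
            flat[name] = value
    return flat

# Hypothetical event payload, as it might look after decoding a Capture file.
event = {
    "deviceId": "sensor-42",
    "properties": {"region": "west", "tier": 2},   # Avro map -> dict
    "readings": [21.5, 21.7, 21.6],                # Avro array -> list
    "enqueuedAt": datetime(2018, 3, 1, 12, 0, tzinfo=timezone.utc),
}
print(flatten(event))
```

The flattened output contains only strings and numbers, so a copy activity that chokes on Dictionaries and Arrays can still move it; consumers that need the array back can parse the JSON string on the other side.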
Saumyakumar Suhagiya commented
Also, it looks like the Avro output is not preserving the DateTime datatype either.
This problem should really be addressed. It makes no sense that the default format Azure Event Hubs uses for data capture cannot be read by the Azure Data Factory driver.
ADF should definitely be able to process captured data from Event Hub or IoT Hub, as well as ASA Avro outputs!
Kjetil Tonstad commented
Same issue with data types like ST_GEOMETRY. Even when the Oracle data contains only text, Data Factory fails on columns of this type.
Nuno Centeno commented
Is there any update regarding this? It doesn't seem plausible that we can't get two different Azure services working together.
Lars Kemmann commented
I agree 100%. Azure Data Factory would be the *ideal* solution for moving Avro data from Event Hubs into a user-specific "cold store" (Storage Tables, Data Lake, etc.). Any other solutions (Stream Analytics, Functions, etc.) require more work and/or much higher cost to implement and monitor.