According to this article https://azure.microsoft.com/en-us/support/legal/sla/log-analytics/v1_1/ the SLA allows indexing of log data to take up to 6 hours. OMS has built-in alerting that lets you trigger actions within 5 minutes of data arrival. But if indexing takes more than 5 minutes, what is the point of creating an alert that might fire on something that is no longer a problem, or not fire at all when there is a real problem? What is the average data indexing time? Log Analytics would be much more useful, and have many more real-world applications, if that indexing time were much lower. A 6-hour worst case seems like a joke for any system meant to respond to problems in real time.
(Usually I see a 5-second delay on indexing of custom logs, but a couple of days ago it spiked to 20 minutes for a couple of hours. As a result, Log Analytics, which is otherwise very nice, might not be used at all, especially after reading that indexing can actually take 6 hours.)
363 votes
We have recently published an article, https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-data-ingestion-time, that details various aspects of data ingestion time for Log Analytics and clarifies the distinction between the financially backed SLA and our Service-Level Objectives. In fact, the typical latency to ingest data into Log Analytics is between 3 and 10 minutes, with 95% of data ingested in less than 7 minutes.
We are also actively working to bring this latency down even further. Many customers already report a significant improvement, and more is coming.
Wouldn't it be cool if you could configure Windows Server WEF (Windows Event Forwarding, http://technet.microsoft.com/en-us/library/cc748890.aspx ) to send to Advisor for the Log Management scenario, without using the SCOM agent?
Alternatively, if one already has a forwarder/collector (WEF/WEC) architecture in place, it could be possible to use just one SCOM agent/gateway to pull the forwarded logs stored on that collector from that single box to the cloud.
331 votes
This is currently under development and scheduled to be in preview later in 2018.
Allow the collection and addition of custom fields using Advanced Logging or custom IIS modules. An example is adding X-Forwarded-For to IIS logs in W3C format.
232 votes
This feature request is still under review, and the team is actively prioritizing it against the existing backlog. We will keep the thread updated as we move forward.
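A field added this way ends up as just another column named in the W3C `#Fields:` directive, so any collector could pick it up generically. A minimal parsing sketch (the exact field name `X-Forwarded-For` depends on how the module was configured and is an assumption here):

```python
def parse_w3c(lines):
    """Parse IIS W3C-format log lines into dicts keyed by the
    field names declared in the #Fields: directive."""
    fields, rows = [], []
    for line in lines:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]      # field names follow the directive
        elif not line.startswith("#") and line.strip():
            rows.append(dict(zip(fields, line.split())))
    return rows
```

Because the directive is self-describing, the same code handles logs with or without the extra column.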
Azure Websites write to WAD in a different folder structure. The work of this other idea, http://feedback.azure.com/forums/267889-azure-operations-insights/suggestions/6519377-collect-iis-logs-from-windows-azure-diagnostics-st, enables reading those IIS logs for Azure Cloud Services (i.e. web role instances) but not for Azure Websites.
This new idea is for the latter scope.
169 votes
Cloud Services / Virtual Machines write with a different container/folder structure in Azure blob than Azure WebSites. Our current ingestion processes the former, not the latter.
Anyhow, also consider the 'generic' idea of a platform feature to ingest your own logs: http://feedback.azure.com/forums/267889-azure-operational-insights/suggestions/7928931-collect-data-from-custom-containers-in-storage-acc
OMS can collect IIS logs for web roles. Extend this capability to Azure Web Apps IIS logs as well as Azure Web Apps application logs.
129 votes
Would like to request a data retention interval by data type (similar to what is done in SCOM); specifically, the ability to set retention timeframes on "Performance Data", "Event Data", and "Analytic Data".
105 votes
Work on enabling different retention intervals by data type has started. We expect this to be available later in 2019.
ComputerIP is populated with the IP address from which Azure Log Analytics receives the data. For nodes behind a firewall/proxy or the OMS Gateway, this means it contains the external IP address of the proxy.
ComputerIP should contain the IP address(es) collected by the agent on the computer hosting it, to enable compliance and security scenarios in the console.
RemoteIPAddress could be added to hold the external IP address for proxy-based agents, and would contain the same address as ComputerIP for agents not behind a proxy/firewall/gateway.
This has a serious impact on compliance in the current implementation.
100 votes
This item is still under review in our backlog. Please keep upvoting; this allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature. Thank you.
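To make the request concrete: an agent reporting addresses from the host itself would typically drop loopback and link-local entries and could tag what remains, so the backend can tell a host-collected address apart from the proxy's external one. A minimal sketch of that classification (the function name and the tagging scheme are illustrative, not anything the agent actually does):

```python
import ipaddress

def classify_ips(candidates):
    """Drop loopback/link-local addresses and tag the rest as
    private (RFC 1918 etc.) or public, as a host-side agent
    populating ComputerIP might."""
    out = []
    for c in candidates:
        ip = ipaddress.ip_address(c)
        if ip.is_loopback or ip.is_link_local:
            continue  # not useful for identifying the node
        out.append((c, "private" if ip.is_private else "public"))
    return out
```

A node behind an OMS Gateway would then report its private address here, while RemoteIPAddress (as suggested above) would carry the gateway's external one.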
Is it possible to delete custom logs sent via the HTTP Data Collector API?
Thanks in advance :)
Hugo Faria
88 votes
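For context on why this comes up: records reach the workspace through the HTTP Data Collector API, which is write-only, so there is no matching delete call once test data has been ingested. A sketch of the documented signing scheme and send path (`workspace_id`, `shared_key`, and the record layout are placeholders you would fill in):

```python
import base64
import hashlib
import hmac
import json
import urllib.request
from datetime import datetime, timezone

def build_signature(workspace_id, shared_key, date, content_length):
    """Build the SharedKey authorization header the HTTP Data
    Collector API documents for signing requests."""
    string_to_hash = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{date}\n/api/logs")
    digest = hmac.new(base64.b64decode(shared_key),
                      string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

def post_custom_log(workspace_id, shared_key, log_type, records):
    """POST a list of record dicts as a custom log (appears as
    <log_type>_CL in the workspace)."""
    body = json.dumps(records).encode("utf-8")
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    req = urllib.request.Request(
        f"https://{workspace_id}.ods.opinsights.azure.com/api/logs"
        "?api-version=2016-04-01",
        data=body, method="POST",
        headers={"Content-Type": "application/json",
                 "Log-Type": log_type,
                 "x-ms-date": date,
                 "Authorization": build_signature(
                     workspace_id, shared_key, date, len(body))})
    return urllib.request.urlopen(req).status
```

Until a delete capability exists, the practical workaround is a throwaway `Log-Type` (and thus a separate `_CL` table) for test data, relying on retention to age it out.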
Need the ability to modify the extraction criteria for existing Custom Fields. I have added a handful based on SharePoint ULS, but they aren't always matching properly. The only way that I have found to "improve" them is to remove the column and re-add it.
87 votes
I want to be able to filter out stuff I don't want to collect in logs. For example, with ACS (in SCOM) I could apply filters that didn't collect system logins. I would like this functionality in all logs; for example, I would want to filter IIS logs to remove data from certain IP addresses.
I can see customers wanting to use this type of functionality when the costs of data start to pile up.
82 votes
This feature is already in progress; a limited preview is expected later in 2018.
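Until such filtering is built in, the same effect can be approximated on the collection side by dropping unwanted lines before the agent picks them up. A minimal sketch for the IIS example above (the excluded networks and the position of the client-IP column are assumptions; in real W3C logs the column position comes from the `#Fields:` directive):

```python
import ipaddress

# Networks whose traffic should not be collected (example values).
EXCLUDE = [ipaddress.ip_network(n) for n in ("10.0.0.0/8", "192.0.2.0/24")]

def filter_lines(lines, client_ip_index):
    """Yield only log lines whose client-IP column falls outside the
    EXCLUDE networks; directives and unparsable lines pass through."""
    for line in lines:
        if line.startswith("#"):
            yield line
            continue
        parts = line.split()
        try:
            ip = ipaddress.ip_address(parts[client_ip_index])
        except (IndexError, ValueError):
            yield line
            continue
        if not any(ip in net for net in EXCLUDE):
            yield line
```

This trades a pre-processing step for a smaller ingested volume, which is exactly the cost concern raised above.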
On Microsoft Azure you can enable Azure Storage logging. The logging information is saved in a $logs container in your storage account. It would be great if we could add this log information to OpInsights. More information about how to enable this type of logging: https://msdn.microsoft.com/en-us/library/azure/dn782840.aspx
72 votes
Thanks for the suggestion. We’re looking to add more support for Azure services and this will help us prioritize.
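An ingestion job for this would list the `$logs` container and process only new blobs, which the documented blob-name layout (`<service>/YYYY/MM/DD/hhmm/<counter>.log`) makes easy to do incrementally. A sketch of parsing those names (the function name and return shape are choices made here):

```python
import re
from datetime import datetime

# $logs blob names follow <service>/YYYY/MM/DD/hhmm/<counter>.log
# per the Storage Analytics documentation.
LOG_BLOB = re.compile(
    r"^(blob|table|queue)/(\d{4})/(\d{2})/(\d{2})/(\d{2})(\d{2})/(\d{6})\.log$")

def parse_log_blob_name(name):
    """Split a $logs blob name into (service, start_hour, counter),
    or return None for names that do not match the layout."""
    m = LOG_BLOB.match(name)
    if not m:
        return None
    svc, y, mo, d, h, mi, ctr = m.groups()
    return svc, datetime(int(y), int(mo), int(d), int(h), int(mi)), int(ctr)
```

Sorting blobs by the parsed timestamp gives a natural checkpoint for "everything ingested up to hour X".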
Windows events collected today come only from the 'classic' NT-style event logs (Application/System) and from the Crimson logs (Vista and above) that are saved in EVTX format.
It would be nice to enable collection of ETW trace logs too (.etl), like the /Analytics and /Debug logs.
60 votes
Since there are devices like the Raspberry Pi with ARM architecture, it would be great if you provided binaries for ARM-based Linux systems as well. Currently, I am unable to run the agent on a Raspberry Pi with Raspbian despite the tutorials available on various sites.
50 votes
We are looking at making the experience of building the OMS Agent for Linux from source easier for different architectures.
Here the requirement is clear/obvious; we just have not prioritized this work yet.
The overall 'performance' data collection needs to be refined, not just for Linux.
Right now we only collect/provide hourly aggregates of some specific performance counters related to HyperV for the ‘Capacity Intelligence Pack’ scenario.
A real-time monitoring scenario might need a different shape of performance data to start with, before we enable this for Linux or Windows alike; see http://feedback.azure.com/forums/267889-azure-operational-insights/suggestions/6519061-collect-custom-windows-performance-counters
Request to introduce a user-defined delimiter for Custom Logs.
We run into issues where we're unable to delimit the RabbitMQ log timestamp format.
Unfortunately, there is no configuration in RabbitMQ to change that timestamp format, so we have to implement a heavy workaround to convert it to a date-time format supported by Microsoft before forwarding it to OMS.
47 votes
Thanks for the feedback. We are considering adding more timestamp formats.
One amazing idea would be to create custom fields during the custom log sample process. Another good idea is to add more timestamp samples (like the ISO 8601 format YYYYMMDDThhmmss.fffK, where YYYY: year, MM: month, DD: day of month, T: delimiter, hh: hour, mm: minutes, ss: seconds, fff: milliseconds, K: time zone offset) or to add the possibility to create a custom timestamp.
Will it be possible to delete some imported custom logs in order to run tests?
45 votes
We're planning on allowing you to import/export Custom Logs & Fields via the UI and ARM templates. We're currently implementing ARM support for most of the settings in OMS.
Thanks for sharing some of the timestamps you need. Feel free to e-mail them to me here: evanhi(at)microsoft.com
We're actively planning a way for you to specify timestamps yourselves.
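The "heavy workaround" mentioned in the RabbitMQ request above boils down to rewriting each timestamp into a supported format before forwarding the line. A minimal sketch, assuming the older RabbitMQ report-header style shown (the exact input format varies by RabbitMQ version and logging configuration):

```python
from datetime import datetime

def to_iso8601(rabbit_ts):
    """Convert a RabbitMQ-style timestamp such as '8-Mar-2018::14:09:27'
    (format assumed here for illustration) into ISO 8601."""
    return datetime.strptime(rabbit_ts, "%d-%b-%Y::%H:%M:%S").isoformat()
```

A user-defined timestamp format in Custom Logs, as requested, would make this whole pre-processing step unnecessary.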
Please implement an NLog target for the OMS Data Collector API.
43 votes
Recursive log collection paths for Custom Logs.
This will help users like me who have folders containing logs plus subfolders with more logs.
39 votes
Currently O365 logs are only collected for the AzureActiveDirectory, Exchange, SharePoint, and OneDrive workloads. Please add support for the other audit log schemas as well, e.g. those exposed via the Office 365 Management Activity API: Teams, Power BI, Sway, Yammer, ...
38 votes
Thanks for the valuable feedback. Your feedback is open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.
Add a connection for SCCM 2012 R2 so that we can see all hardware inventory data on all managed servers.
32 votes