Azure Monitor-Log Analytics

Welcome to the Azure Log Analytics (https://azure.microsoft.com/en-us/services/log-analytics/) feedback page. We appreciate your feedback and look forward to hearing from you. Use this site for new ideas and bug reports or to request help.
NOTE – Log Analytics is now a part of Operations Management Suite. Learn more at http://microsoft.com/OMS

  1. Please DO NOT RETIRE the demo site: https://portal.loganalytics.io/demo#/query/main

    For Log Analytics there is a demo site, https://portal.loganalytics.io/demo#/query/main, which is very useful for testing purposes because it contains a lot of data.

    I now see it is scheduled to be retired on September 2nd, 2019. Please keep it alive.

    26 votes
  2. Send logs for all Office 365 audit log entries/schemas

    Currently, O365 logs are only collected for the AzureActiveDirectory, Exchange, SharePoint, and OneDrive workloads. Please add support for the other audit log schemas as well, e.g. the ones exposed via the Office 365 Management Activity API: Teams, PowerBI, Sway, Yammer, ...

    40 votes

  3. Collect Log Analytics logs via vNet, not through the internet

    We would like to collect logs without sending them over the internet.
    Today the agent must be able to reach certain URLs in the Azure datacenter via the internet.
    That communication is encrypted and the URLs are fixed, but important data such as security logs is being sent, so it would be more secure if collection worked simply by connecting to a vNet.
    If this feature were implemented, we could also collect more sensitive data, such as customer data and audit information.

    40 votes

  4. Custom logs timestamp format

    Please add support for the timestamp format YYYY-M-D HH:mm:ss.

    We currently have an application whose logs use this format and would like to collect them as custom logs.
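
    Until the agent supports this format natively, one workaround is to normalize the timestamps before the agent reads the file. A minimal Python sketch, assuming a hypothetical pre-processing step (the function name and sample value are illustrative):

    from datetime import datetime

    # Hypothetical pre-processor: rewrite a non-zero-padded timestamp such as
    # "2019-3-5 7:04:09" into a zero-padded one before the line lands in the
    # file that the agent collects.
    def normalize_timestamp(raw: str) -> str:
        # Python's strptime accepts non-zero-padded month/day/hour values.
        parsed = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")
        return parsed.strftime("%Y-%m-%d %H:%M:%S")  # zero-padded output

    print(normalize_timestamp("2019-3-5 7:04:09"))  # -> 2019-03-05 07:04:09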

    18 votes

  5. Color customization in log analytics chart

    A feature to customize chart colors should be available, as the default colors (green, blue, etc.) don't make sense in many scenarios. Furthermore, when a chart is published to a dashboard, the colors and chart shape change again (from pie to donut, and from green and blue to dark blue and light blue).
    The ability to choose colors should be added to the "render" operator, and consistency between the actual chart in Log Analytics and the published chart on the dashboard is equally important.

    23 votes

  6. DCR: PowerShell cmdlet to collect the activity log across subscriptions in a Log Analytics workspace

    We have the activity log solution described here: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/collect-activity-logs
    The customer wants to perform the following steps via PowerShell commands:
    - Configure activity logs to go to your Log Analytics workspace.
    - In the Azure portal, select your workspace and then click Azure Activity log.
    - For each subscription, click the subscription name.

    Currently there are no PowerShell commands to do this. As far as I know, there are PowerShell commands to enable custom logs in Log Analytics, so please also provide a similar command to connect activity logs to a workspace across subscriptions.
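
    In the absence of a dedicated cmdlet, the same configuration can be scripted against the subscription-level diagnostic settings REST API. A minimal Python sketch, assuming a pre-acquired AAD bearer token; the setting name, subscription IDs, workspace path, and log category list are placeholders:

    import requests

    TOKEN = "<bearer-token>"  # assumed: acquired via AAD beforehand
    WORKSPACE_ID = ("/subscriptions/<sub>/resourceGroups/<rg>/providers/"
                    "Microsoft.OperationalInsights/workspaces/<workspace>")
    SUBSCRIPTIONS = ["<subscription-id-1>", "<subscription-id-2>"]

    for sub in SUBSCRIPTIONS:
        # One subscription-level diagnostic setting per subscription routes
        # its Activity Log entries to the shared workspace.
        url = (f"https://management.azure.com/subscriptions/{sub}"
               "/providers/Microsoft.Insights/diagnosticSettings/toLogAnalytics"
               "?api-version=2017-05-01-preview")
        body = {"properties": {
            "workspaceId": WORKSPACE_ID,
            "logs": [{"category": "Administrative", "enabled": True}],
        }}
        resp = requests.put(url, json=body,
                            headers={"Authorization": f"Bearer {TOKEN}"})
        resp.raise_for_status()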

    35 votes
  7. App Insights should have the capability to display a logarithmic chart.

    When capturing stats against 95th/99th-percentile latencies, it would be useful to be able to apply a logarithmic scale to App Insights charts. Currently, this is possible in Splunk.

    1 vote

  8. Include "Valued caching policies" on the analytics result

    It would be very helpful to have a built-in mechanism for counting value cache misses/successes.

    9 votes

  9. Log Analytics to support multi-dimensional metrics input from Event Hubs

    Sending multi-dimensional metrics via diagnostic settings is not currently supported. Metrics with dimensions are exported as flattened single-dimensional metrics, aggregated across dimension values.
    For example: the 'Incoming Messages' metric on an Event Hub can be explored and charted at a per-queue level. However, when exported via diagnostic settings, the metric is represented as all incoming messages across all queues in the Event Hub.

    The Diagnostics and Metrics team did confirm that the Incoming Messages metric is multi-dimensional, and because of this, when the data is sent to Log Analytics it will only present all of the incoming messages…

    4 votes

  10. Support millisecond/microsecond precision for time-generated-field in HTTP Data Collector

    Support millisecond/microsecond precision for the time-generated-field in the HTTP Data Collector API, using the ISO 8601 format YYYY-MM-DDThh:mm:ss.msecZ.
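
    For reference, a minimal Python sketch of a Data Collector API call supplying a millisecond-precision timestamp in the time-generated-field; the workspace ID, shared key, and record layout are placeholders. Today the service truncates the sub-second portion, which is what this idea asks to change:

    import base64, datetime, hashlib, hmac, json, requests

    WORKSPACE_ID = "<workspace-id>"   # placeholder
    SHARED_KEY = "<shared-key>"       # placeholder (base64 workspace key)

    body = json.dumps([{
        # ISO 8601 with milliseconds, e.g. 2019-08-27T14:22:33.123Z
        "EventTime": datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
        "Message": "hello",
    }])

    # Sign the request as documented for the HTTP Data Collector API.
    rfc1123 = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    string_to_sign = f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123}\n/api/logs"
    signature = base64.b64encode(hmac.new(base64.b64decode(SHARED_KEY),
                                          string_to_sign.encode("utf-8"),
                                          hashlib.sha256).digest()).decode()

    resp = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
            "Log-Type": "MyCustomLog",            # destination table: MyCustomLog_CL
            "x-ms-date": rfc1123,
            "time-generated-field": "EventTime",  # field to map to TimeGenerated
        },
    )
    resp.raise_for_status()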

    2 votes

  11. Recursive Log Collection paths

    Recursive log collection paths for Custom Logs.

    This would help users like me who have folders containing logs plus subfolders with more logs.
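
    For illustration, the request corresponds to a recursive wildcard rather than today's single-directory pattern; a minimal Python sketch with a hypothetical path:

    import glob

    # "**" with recursive=True matches the folder itself and every subfolder,
    # which is the collection behavior being requested.
    for path in glob.glob("/var/log/myapp/**/*.log", recursive=True):
        print(path)  # each matched file would be tailed by the agent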

    42 votes
  12. Log Analytics - allow configuring different data collection settings per connected machine/agent in the same workspace

    For example: in a scenario where two Windows machines are connected to the same Log Analytics workspace, provide the option to ingest Windows performance counter data for only the first machine.

    This feature would be very useful when a customer needs a single workspace for different use cases, keeping all data in one location while preventing unnecessary data ingestion that results in additional costs.

    16 votes
  13. Allow user to define the sample data set for Custom Field mapping with FastExtract

    Log Analytics provides for modeling custom fields in custom logs using FastExtract.

    The mapping tool limits the user to the top 100 log entries when modeling custom fields. This can be seen when performing a field extraction by clicking the "hide tips" link on the right-hand side: the tips scroll away, revealing the Condition section, which shows a "take 100" limit.

    I've struggled to create fields that contain all possible values for a field or column in my custom logs due to this arbitrary limit.

    A much more useful implementation would be to…

    8 votes
  14. No AzureActiveDirectory audit data in workspace after recreating a workspace with the same name

    AAD diagnostic settings do not update the workspace ID when a workspace is deleted and a new one is created with the same name.

    3 votes

  15. Log rotation includes dynamic folder name changes

    I am importing backup logs into Azure Log Analytics via custom logs and want to create an alert if a backup fails. However, log rotation is enabled and it also replaces the folder. Below is an example:

    /mnt/backupfileshare/tool/flow/0/2018-11-15_0744/azure-backup_file_incr0_2018-11-15_0744_9535.log
    /mnt/backupfileshare/tool/flow/0/2018-11-16_0744/azure-backup_file_incr0_2018-11-16_0744_9535.log

    Here the folders 2018-11-15_0744 and 2018-11-16_0744 change dynamically along with the log file. How can this kind of log be imported as an Azure Log Analytics custom log?

    MS URL:
    https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs
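
    For illustration, a wildcard in the directory portion of the path would cover the rotating folder names; a minimal Python sketch of the matching the custom log data source would need to support:

    import glob

    # One pattern that matches both the 2018-11-15_0744 and 2018-11-16_0744
    # folders from the example above, plus any future rotation.
    pattern = "/mnt/backupfileshare/tool/flow/0/*/azure-backup_file_incr0_*.log"
    for path in sorted(glob.glob(pattern)):
        print(path)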

    1 vote

  16. Regex to support MM/DD/YYYY HH:MM:SS

    We need a regex option in Custom Logs that supports 24-hour time formatting without the AM/PM requirement.
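
    For illustration, a minimal Python sketch of the requested pattern; the sample strings are hypothetical:

    import re

    # MM/DD/YYYY followed by a 24-hour time, with no AM/PM marker.
    TIMESTAMP_RE = re.compile(
        r"(0[1-9]|1[0-2])/(0[1-9]|[12]\d|3[01])/\d{4} "  # MM/DD/YYYY
        r"([01]\d|2[0-3]):[0-5]\d:[0-5]\d"               # HH:MM:SS, 24-hour
    )

    assert TIMESTAMP_RE.match("11/28/2018 17:45:09")
    assert not TIMESTAMP_RE.match("11/28/2018 5:45:09 PM")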

    3 votes
  17. Custom Logs to support Unicode files

    SQL Server writes its log files in Unicode only, and Unicode is not a supported format for import into custom logs. https://blog.sqlauthority.com/2018/05/14/sql-server-fix-msg-22004-the-log-file-is-not-using-unicode-format/
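
    Until Unicode files are supported natively, one workaround is to re-encode the file before collection; a minimal Python sketch, assuming UTF-16 input (the file paths are hypothetical):

    # Re-encode a UTF-16 SQL Server log to UTF-8 so the agent can collect it.
    SRC = "ERRORLOG"            # UTF-16 file as written by SQL Server
    DST = "errorlog-utf8.log"   # UTF-8 copy for the custom log data source

    with open(SRC, "r", encoding="utf-16") as src, \
         open(DST, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(line)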

    9 votes
  18. Can I monitor processes on Linux?

    The Linux agent cannot monitor processes other than through custom logs.
    When will process monitoring be available as a standard feature?
    Our customer wants to use it, because collecting through custom logs adds cost.
    Today, the customer redirects the output of the ps command to a custom log file and wants to stop this workaround.
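
    For context, the workaround described above amounts to something like the following Python sketch; the ps columns and output path are illustrative:

    import datetime
    import subprocess

    # Snapshot process information and append it to a file that the agent
    # then collects as a custom log.
    snapshot = subprocess.run(["ps", "-eo", "pid,comm,%cpu,%mem"],
                              capture_output=True, text=True, check=True)
    stamp = datetime.datetime.utcnow().isoformat()
    with open("/var/log/ps-snapshot.log", "a") as log:
        for line in snapshot.stdout.splitlines()[1:]:  # skip the header row
            log.write(f"{stamp} {line}\n")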

    8 votes
  19. Populate ComputerIP field with agent manager Computer IP address

    ComputerIP is populated with the IP address from which Azure Log Analytics receives the data. For nodes behind a firewall/proxy or an OMS Gateway, this means it holds the external IP address of the proxy.
    ComputerIP must instead contain the IP address(es) collected by the agent on the computer hosting it, to enable compliance and security scenarios in the console.
    A RemoteIPAddress field could be added to carry the external IP address for proxy-based agents; it would contain the same address as ComputerIP for agents not behind a proxy/firewall/gateway.
    The current implementation has a serious impact on compliance.
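
    For illustration, the host-side addresses being asked for are readily available to an agent; a minimal Python sketch using only the standard library:

    import socket

    # Enumerate the IP addresses bound to this host: candidate values for a
    # host-side ComputerIP field, as opposed to the proxy address the service
    # currently sees.
    def local_ips():
        host = socket.gethostname()
        return {info[4][0] for info in socket.getaddrinfo(host, None)}

    print(local_ips())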

    106 votes
  20. Auto Extraction of JSON Data

    Log Analytics needs the ability to import JSON files (and other formats such as XML) and have the fields auto-extracted as custom fields. Without this, searching on new fields is cumbersome, and creating new custom fields for every new JSON field isn't feasible.

    This would more closely match the capabilities of Splunk and allow more people to make a more seamless transition.
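
    For illustration, a minimal Python sketch of the kind of flattening being requested, where every JSON leaf becomes its own candidate custom field; the field-naming scheme is an assumption:

    import json

    def flatten(obj, prefix=""):
        """Flatten nested JSON objects into single-level field names."""
        fields = {}
        for key, value in obj.items():
            name = f"{prefix}{key}"
            if isinstance(value, dict):
                fields.update(flatten(value, f"{name}_"))
            else:
                fields[name] = value
        return fields

    record = json.loads('{"user": {"id": 42, "name": "ada"}, "status": "ok"}')
    print(flatten(record))  # {'user_id': 42, 'user_name': 'ada', 'status': 'ok'}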

    8 votes