Azure Monitor – Log Analytics

Welcome to the Azure Log Analytics (https://azure.microsoft.com/en-us/services/log-analytics/) feedback page. We appreciate your feedback and look forward to hearing from you. Use this site for new ideas and bug reports or to request help.
NOTE – Log Analytics is now a part of Operations Management Suite. Learn more at http://microsoft.com/OMS

  1. Cannot display Threshold

    When we execute the following query, the Threshold line is not displayed.
    If we specify "Computer" in "summarize", the Y-axis plots the "avg_CounterValue" series for each computer.
    The Threshold value is not an avg_CounterValue series, so it is not plotted on the graph.

    Perf
    | where TimeGenerated > ago(30m)
    | where CounterName == "% Processor Time" and ObjectName == "Processor" and InstanceName == "_Total"
    | summarize avg(CounterValue) by bin(TimeGenerated, 30s), Computer
    | extend Threshold = 10
    | render timechart

    If we query for a single computer, the threshold is displayed. However, it…
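    One possible workaround (an untested sketch, not an official fix): emit the threshold as its own series, here under a made-up pseudo-computer name "Threshold", so it survives the per-computer series split:

    ```kusto
    Perf
    | where TimeGenerated > ago(30m)
    | where CounterName == "% Processor Time" and ObjectName == "Processor" and InstanceName == "_Total"
    | summarize avg_CounterValue = avg(CounterValue) by bin(TimeGenerated, 30s), Computer
    | union (
        Perf
        | where TimeGenerated > ago(30m)
        // one row per time bin, labeled as a fake "computer" carrying the threshold
        | summarize by bin(TimeGenerated, 30s)
        | extend Computer = "Threshold", avg_CounterValue = 10.0
      )
    | render timechart
    ```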

    32 votes  ·  0 comments  ·  Search UI and Language
  2. Enable configuration of multiple Log Analytics workspaces for the Linux agent

    Currently the Log Analytics agent for Linux supports the configuration of only one Log Analytics workspace. It would be advantageous to be able to send logs to multiple Log Analytics workspaces.

    We have a setup, probably quite common among mid-to-large enterprises, that consists of hub(s) with multiple subscriptions connected to them. For Update Management we use Azure Automation, which should be controlled through our hub(s). Azure Automation therefore needs to be linked to one Log Analytics workspace in the same subscription.

    At the same time the development / operations teams that make use of the…

    36 votes
  3. Display of supported service tags

    The document below describes the network requirements for the Log Analytics agent, but it does not mention which service tags the agent uses.

    https://docs.microsoft.com/en-us/azure/azure-monitor/platform/log-analytics-agent#firewall-requirements

    Service tags, on the other hand, are described below, but the description is inaccurate and difficult to understand.

    https://docs.microsoft.com/en-us/azure/virtual-network/service-tags-overview#available-service-tags

    Currently, I think the agent and the Log Analytics workspace can communicate using the following service tags.
    However, "GuestAndHybridManagement" in particular is listed in neither the service-tag nor the network-requirements documents.
    - AzureMonitor
    - Storage
    - GuestAndHybridManagement

    I would like to confirm the service tags as official information. Please describe the service tags for using the Log…
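    For reference, the downloadable Azure Service Tags file (and the `az network list-service-tags` output) is JSON with a `values[]` array of `{name, properties.addressPrefixes}` entries. A small sketch of filtering it for the tags listed above — the sample data here is made up for illustration:

    ```python
    import json

    # Miniature, made-up sample in the shape of the Azure Service Tags JSON
    # (the real file contains many more tags and prefixes).
    sample = json.loads("""
    {
      "values": [
        {"name": "AzureMonitor", "properties": {"addressPrefixes": ["20.38.80.0/24"]}},
        {"name": "Storage", "properties": {"addressPrefixes": ["52.239.0.0/17"]}},
        {"name": "GuestAndHybridManagement", "properties": {"addressPrefixes": ["40.66.60.0/24"]}},
        {"name": "Sql", "properties": {"addressPrefixes": ["13.65.0.0/16"]}}
      ]
    }
    """)

    # The tags the post believes the Log Analytics agent needs.
    wanted = {"AzureMonitor", "Storage", "GuestAndHybridManagement"}

    prefixes = {
        tag["name"]: tag["properties"]["addressPrefixes"]
        for tag in sample["values"]
        if tag["name"] in wanted
    }
    print(sorted(prefixes))  # ['AzureMonitor', 'GuestAndHybridManagement', 'Storage']
    ```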

    23 votes
  4. Change a parameter in the URL of a Workbook whose query contains parameters

    Currently, we can generate a URL for a workbook, but it contains only the workbook name. Some customers configure workbooks with Kusto queries that take parameters; in such cases it would be great if the parameters could be set in the workbook's URL.

    31 votes  ·  0 comments  ·  Workspace Settings / Administration
  5. Add the Continuous Export function to Log Analytics (as seen in Application Insights) or provide similar functionality

    There is still not quite feature parity between Application Insights and Log Analytics. Both can now set the retention period to 730 days, but if you need to store logs for longer, only Application Insights can do this via Continuous Export.

    Ideally, we want to be able to store logs for more than 730 days. We don't necessarily need the analytics aspect, so archive/cold storage in a storage account (or the like) would be perfectly fine, rather than increasing the retention period past 730 days.

    60 votes

    Thanks for your feedback; it's now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

  6. Collect Automation Account - State configuration (DSC) - Refresh mode

    In Azure Automation and in Log Analytics there is no direct information on the "refresh mode" status, i.e. whether the DSC configuration of a virtual machine is set to pull or push; only the compliance status of a DSC configuration is available.

    In the Automation DSC configuration API, the "Refresh mode" (Pull or Push) status column is available.

    In the Log Analytics entry I need an extra column indicating whether the DSC configuration is set to "Pull" or "Push" mode (Refresh Mode), so we can create alerts based on DSC compliance and on whether the DSC configuration mode…

    19 votes
  7. Visualization updates

    Log Analytics visualizations have been updated, and the new doughnut chart is less effective. We are unable to describe the metrics correctly and it has become cumbersome. Attached is a sample of the as-is vs. the latest doughnut chart for reference. In the previous chart we could indicate the performance metrics clearly, but the same doughnut chart has become unusable after the update. Could we at least have an option to keep both the old and the new way of displaying doughnut charts?

    11 votes  ·  0 comments  ·  My Dashboard
  8. New Cost Model

    For Azure Log Analytics, you pay a flat fee of $2.76 per GB of data ingested - slightly less if you do the per day "pre-paid" option.

    Storage is cheap! This is way too high. The current cost model may compete with Splunk, but not with AWS at all.

    After working with AWS for many years, I was shocked to see the price of ingesting data into Log Analytics. We have several dozen applications generating roughly hundreds of GB in total PER DAY. All of these logs are being ingested into AWS CloudWatch Logs. These logs serve…
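    To put rough numbers on that, a back-of-the-envelope calculation using the post's $2.76/GB figure and a hypothetical 300 GB/day (the post only says "hundreds of GB per day"):

    ```python
    # Back-of-the-envelope Log Analytics ingestion cost.
    # price_per_gb comes from the post; gb_per_day = 300 is a made-up
    # stand-in for "hundreds of GB per day".
    price_per_gb = 2.76   # USD per GB ingested
    gb_per_day = 300      # hypothetical daily volume
    days = 30

    monthly_cost = price_per_gb * gb_per_day * days
    print(f"${monthly_cost:,.2f} per month")  # $24,840.00 per month
    ```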

    27 votes  ·  0 comments
  9. Option to select Resource-Specific mode with the CLI, PowerShell and REST API when collecting resource logs

    Please add an option to select the Resource-Specific destination with the CLI, PowerShell and REST API when collecting resource logs.
    The documentation at https://docs.microsoft.com/en-us/azure/azure-monitor/platform/resource-logs-collect-workspace#select-the-collection-mode shows it is currently not possible to select this destination with a script. So all logs are stored by default in AzureDiagnostics, where we can hit the limit on the number of columns in the table.
    Microsoft also recommends selecting Resource-Specific target tables, but in our context we need to be able to do this in an automated fashion. Thanks

    46 votes

    Thanks for the valid suggestion. Your feedback is now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

  10. Add all the render operators from the Kusto query language

    As of today, the "with" properties of the render operator do not work in Log Analytics.
    Example: | render timechart with(ymin=0)

    This does not force the y-axis to start at 0 when the values in the graph are all higher than 0. It would be very good if all the render properties listed in the documentation could be used in Log Analytics, including when pinning graphs to the dashboard.
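    For context, a sketch of the kind of query the post is asking to work — ymin, ymax and title are all render properties documented for the Kusto language; the counter filter here is just an illustrative example:

    ```kusto
    Perf
    | where TimeGenerated > ago(1h)
    | where CounterName == "% Processor Time"
    | summarize avg(CounterValue) by bin(TimeGenerated, 5m), Computer
    // documented Kusto render properties that Log Analytics currently ignores
    | render timechart with (ymin=0, ymax=100, title="CPU %")
    ```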

    81 votes  ·  3 comments

    Thanks for the feedback. Your feedback is now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

  11. REST Error [500]: Internal Server Error -- Traceback (most recent call last)

    After installing the Log Analytics (LA) add-on I get the error below, and my other add-on, 'SplunkTAmicrosoft-cloudservices', stopped working with "connection refused". Once I disable the Azure Log Analytics add-on, the cloud-services add-on starts working again.

    Stack trace from python handler:
    Traceback (most recent call last):
        raise RestError(500, traceback.format_exc())
    RestError: REST Error [500]: Internal Server Error -- Traceback (most recent call last):
      File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/splunktaucclib/rest_handler/handler.py", line 113, in wrapper
        for name, data, acl in meth(self, *args, **kwargs):
      File "/opt/splunk/etc/apps/TA-ms-loganalytics/bin/ta_ms_loganalytics/splunktaucclib/rest_handler/handler.py", line 348, in format_all_response

    6 votes  ·  3 comments
  12. Time granularity in Workbooks

    While creating a few "Workbooks" with metric charts I noticed that it isn't possible to specify the time granularity used in the metric charts (i.e. the amount of time between two consecutive data points).

    See attached screenshot.

    Having charts with different time granularities (due to "auto" mode) makes it hard to compare numbers between them.

    Please add a way to set time granularity to a fixed number in workbooks.

    11 votes  ·  0 comments  ·  My Dashboard

    Thanks for your feedback; it's now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

  13. Color customization in log analytics chart

    A feature to customize the colors of charts should be available, as the default colors make no sense in many scenarios (green, blue, etc.). Furthermore, when publishing charts to a dashboard, the colors and chart shape change again (from pie to doughnut, and from green and blue to dark blue and light blue).
    The ability to choose colors should be added to the "render" operator, and consistency between the actual chart in Log Analytics and the published chart on the dashboard is an important feature that should be available.

    223 votes

    Thanks for the valid suggestion. Your feedback is now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

  14. Rename Log Analytics workspace option

    An option to rename a Log Analytics workspace should be available. Default names get kept and no longer make sense when the workspace is used later.

    27 votes  ·  0 comments  ·  Workspace Settings / Administration

    Thanks for the valid suggestion. Your feedback is now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

  15. Control the visibility of each parameter in a parameter collection

    When building workbooks for log analytics, it would be useful to have the ability to control the visibility of each parameter in a parameter collection and not only the entire parameter collection.

    If our workbook has tabs and we would like some parameters to be visible in all tabs and some only in certain tabs, today we need to create two parameter collections, creating two "lines" of parameters instead of a single one.

    1 vote  ·  0 comments  ·  My Dashboard
  16. 6 votes  ·  0 comments
  17. User-friendly interface for queries under Logs

    The query user interface in the portal makes it really tough for users to see results, as the space in the window is very limited.

    1 vote  ·  0 comments  ·  My Dashboard
  19. Allow a Kusto query to reveal update agent readiness and the compliance state for patch management / azure automation

    I would like to determine whether we have machines that aren't reporting a ready state due to issues with the monitoring agent or hybrid workers.
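    A rough sketch of such a query, assuming the Heartbeat table and the Update solution's UpdateSummary table are present in the workspace (the 1-hour readiness window and the summarized columns are illustrative choices, and column names may vary by solution version):

    ```kusto
    Heartbeat
    | summarize LastHeartbeat = max(TimeGenerated) by Computer
    // treat a machine as "ready" if the agent reported within the last hour
    | extend AgentReady = LastHeartbeat > ago(1h)
    | join kind=leftouter (
        UpdateSummary
        | summarize arg_max(TimeGenerated, TotalUpdatesMissing) by Computer
      ) on Computer
    | project Computer, AgentReady, LastHeartbeat, TotalUpdatesMissing
    ```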

    3 votes
  20. Azure workbook Schedule report

    Is there any way to schedule sending an Azure workbook report via email? Also, is there a Logic Apps / MS Flow connector for workbooks in…

    13 votes  ·  0 comments  ·  My Dashboard

    Thank you for your query; this is not currently available. Your feedback is now open for the user community to upvote & comment on. This allows us to effectively prioritize your request against our existing feature backlog and also gives us insight into the potential impact of implementing the suggested feature.

