Update: Microsoft is moving away from UserVoice sites on a product-by-product basis throughout the 2021 calendar year and will rely on first-party solutions for customer feedback. Learn more here.

Azure Search

Azure Search is a search-as-a-service solution that allows developers to incorporate a sophisticated search experience into web and mobile applications without having to worry about the complexities of full-text search and without having to deploy, maintain, or manage any infrastructure.

  1. Ability to index more than 1M documents in Basic

    I have a Basic account for $40/month and 1.1M documents, but I'm not able to add an additional partition. My only options are to a) limit my documents to 1M (what's the point?) or b) upgrade to Standard, which supports 15M documents (overkill), for $250/month?!? Of course, I could sign up for two Basic accounts for less, but then I'd have to write specific code to query two search accounts. Please add the ability to add more partitions to the Basic tier.

    9 votes
    0 comments  ·  Pricing and Quotas
  2. Most common searches by Analytics

    PowerBI can parse the Analytics data to obtain the "Most common search queries".
    It'd be awesome if the REST API could provide this information too.

    7 votes
    0 comments  ·  REST API
  3. Extend the Azure Search indexer for Azure tables to allow a search index to be sourced from multiple tables

    Would it be possible to extend the Azure Search indexer for Azure tables to allow a search index to be sourced from multiple tables? I have a search that, currently, I build from a ProductAnnotation and a ProductWholesale table (both share a common Id for lookup and this Id is stored in the search index).

    1 vote
    1 comment  ·  Query - Search

    Hi Keith,

    This is already possible in a slightly different form: you can create multiple datasource / indexer pairs all writing into the same index. As long as you stagger their schedules so that the indexers are unlikely to overlap on the same document at the same time, you’ll get the functionality this suggestion asks for.

    Hope this helps!
    Your Azure Search team
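
    As a rough sketch of that approach (the datasource, indexer, and index names below are illustrative, not taken from the original post, and [api-version] stands in for whichever API version you target), you would create one Azure Table datasource and indexer per table, with both indexers writing to the same index keyed on the shared Id:

    POST https://[service name].search.windows.net/datasources?api-version=[api-version]
    { "name" : "product-annotation-ds", "type" : "azuretable",
      "credentials" : { "connectionString" : "[storage connection string]" },
      "container" : { "name" : "ProductAnnotation" } }

    POST https://[service name].search.windows.net/indexers?api-version=[api-version]
    { "name" : "product-annotation-indexer",
      "dataSourceName" : "product-annotation-ds",
      "targetIndexName" : "products",
      "schedule" : { "interval" : "PT2H", "startTime" : "2017-01-01T00:00:00Z" } }

    A second datasource/indexer pair for ProductWholesale would look the same, with its schedule offset (for example, a startTime an hour later) so the two indexers are unlikely to touch the same document at the same time.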

  4. 18 votes
    5 comments  ·  Crawlers
  5. Support shared access signature for blob datasource credentials

    Currently, a blob datasource requires a full connection string. Sometimes, specifying a SAS container URL is preferable (see the sketch below).

    13 votes
    2 comments  ·  Crawlers
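
    To illustrate the request (the datasource and container names below are made up), a blob datasource today is defined with a full storage connection string:

    POST https://[service name].search.windows.net/datasources?api-version=[api-version]
    {
      "name" : "logs-blob-ds",
      "type" : "azureblob",
      "credentials" : { "connectionString" : "DefaultEndpointsProtocol=https;AccountName=[account];AccountKey=[account key]" },
      "container" : { "name" : "logs" }
    }

    The suggestion is for the credentials to also accept a container SAS URL, along the lines of (hypothetical shape):

    "credentials" : { "connectionString" : "https://[account].blob.core.windows.net/logs?sv=...&sig=..." }
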
  6. Support indexing of append blobs

    It seems Azure Search doesn't currently work with append blobs. We use append blobs for logging, and Azure Search would help a lot if it could handle them.

    1 vote
    0 comments  ·  Crawlers
  7. Move service between subscriptions

    In response to the Azure Portal team's comment in this thread (https://feedback.azure.com/forums/223579-azure-portal/suggestions/5608987-move-services-between-subscriptions), I'd like to propose enabling/implementing this feature for the Search service ASAP.

    3 votes
    0 comments  ·  Pricing and Quotas
  8. Provide .NET API for key operations

    Currently, there's no .NET API to perform key management operations (generate new admin keys, list keys, etc.)

    1 vote
    1 comment  ·  Security
  9. Allow SkipContent to be set directly on the blob indexer

    The Azure blob indexer currently checks for a metadata property, AzureSearch_SkipContent, to skip processing of the content. It would be easier to just tell the indexer directly to skip processing content for all blobs.

    15 votes
    0 comments  ·  Crawlers

    This functionality is now available in the REST API, using the 2015-02-28-Preview version.

    Excerpt from the docs:
    Using indexer parameters to control document extraction

    If you want to skip content extraction for all blobs, you can do this using the indexer configuration object instead of having to add custom metadata to each blob individually. To do this, set the SkipContent configuration property to true in the parameters object:

    PUT https://[service name].search.windows.net/indexers/[indexer name]?api-version=2015-02-28-Preview
    Content-Type: application/json
    api-key: [admin key]

    {
      … other parts of the indexer definition
      "parameters" : { "configuration" : { "SkipContent" : true } }
    }

    Thanks!
    Your Azure Search team

  10. Extract document structure from JSON blobs

    Currently, the blob indexer extracts the entire document into a single 'content' field. Instead, we want blobs containing JSON documents to be interpreted as documents with multiple fields.

    30 votes
    2 comments  ·  Crawlers
  11. 106 votes
    0 comments  ·  Crawlers
  12. blob index

    How does the Azure Search Blob Indexer work with billions of blobs? The only feasible way is if it can query the Blob Service to enumerate all the blobs with a LastModified timestamp greater than the last time that the indexer was run. But the Blob Service does not support any such filter (grr why not?!). So how does Azure Search Blob Indexer work? The transaction cost for enumerating billions of blobs would be large.

    Perhaps Azure Search Blob Indexer relies upon that "hack" of using Blob Service logging to detect when blobs are updated. But logging is described in…

    4 votes
    1 comment  ·  Crawlers

    Hi,

    Please post questions about Azure Search on StackOverflow or our MSDN forum (https://social.msdn.microsoft.com/Forums/en-US/home?forum=azuresearch) to ensure quick response.

    A single blob indexer (or even a single search service) is not going to cope with billions of documents due, among other reasons, to the scalability limits on enumerating blobs that you note.
    Our advice for scaling blob indexing is to provision multiple datasource/indexer pairs all writing to the same index (with datasources potentially pointing to different storage containers or storage accounts).
    Make sure your search service is scaled to run multiple indexers concurrently; keep in mind that one search unit can run one indexer at a time.

    Hope that helps!
    Your Azure Search team.
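
    As a minimal sketch of that layout (the datasource, container, and index names are made up), each datasource points at a different container or storage account, and every indexer targets the same index:

    POST https://[service name].search.windows.net/datasources?api-version=[api-version]
    { "name" : "blobs-ds-1", "type" : "azureblob",
      "credentials" : { "connectionString" : "[storage account 1 connection string]" },
      "container" : { "name" : "container-1" } }

    POST https://[service name].search.windows.net/indexers?api-version=[api-version]
    { "name" : "blobs-indexer-1",
      "dataSourceName" : "blobs-ds-1",
      "targetIndexName" : "blobs-index" }

    Repeat for blobs-ds-2 / blobs-indexer-2 and so on, and scale the service to enough search units that the indexers can run concurrently (one indexer per search unit at a time).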

  13. 2 votes
    0 comments  ·  Crawlers
  14. Add facial recognition to blob storage file indexer capabilities

    Based on the facial recognition technology used in Windows Photo Gallery, provide indexers with the ability to perform some form of facial recognition.

    3 votes
    0 comments  ·  Crawlers

    We recently released this in “public preview” as part of “Cognitive Search”. Essentially, you can select an image analysis skill to identify well-known faces (such as celebrities).

    You can see a demo of this at https://www.youtube.com/watch?v=Wh5Dt8wnEhg

    Also, if you want to create your own custom face recognition model, you could train a model using the Cognitive Services Face API and then call it from a “custom skill”; see an example of how to create a custom skill here:

    https://docs.microsoft.com/en-us/azure/search/cognitive-search-create-custom-skill-example

    Thanks!

    Luis Cabrera
    Azure Search Product Team
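
    As a hedged sketch of such a skillset (the skillset name, context, and output field are illustrative, and the exact skill parameters and outputs should be checked against the Cognitive Search documentation):

    PUT https://[service name].search.windows.net/skillsets/[skillset name]?api-version=[api-version]
    {
      "description" : "Identify well-known faces in images",
      "skills" : [
        {
          "@odata.type" : "#Microsoft.Skills.Vision.ImageAnalysisSkill",
          "context" : "/document/normalized_images/*",
          "visualFeatures" : [ "categories", "faces" ],
          "details" : [ "celebrities" ],
          "inputs" : [ { "name" : "image", "source" : "/document/normalized_images/*" } ],
          "outputs" : [ { "name" : "categories", "targetName" : "imageCategories" } ]
        }
      ]
    }

    For a custom model, the same pattern applies with a #Microsoft.Skills.Custom.WebApiSkill pointing at your own Face API-backed endpoint.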

  15. Allow indexer data source to use custom schema for SQL Azure

    Today we HAVE to use the dbo schema for source tables used as data sources by the search indexer. As you know, a lot of enterprises have rules against allowing objects to be created in dbo. This seems like an unnecessary requirement, and surely it could be a simple fix, right? Just let me specify whatever level of qualifier for the container name of my object in SQL Azure (see the sketch below)...

    6 votes
    1 comment  ·  Crawlers
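
    To illustrate the request (the datasource and table names below are made up), this amounts to letting the container name in an Azure SQL datasource be schema-qualified rather than dbo-only:

    POST https://[service name].search.windows.net/datasources?api-version=[api-version]
    {
      "name" : "products-sql-ds",
      "type" : "azuresql",
      "credentials" : { "connectionString" : "[SQL connection string]" },
      "container" : { "name" : "inventory.Products" }
    }
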
  16. Support user-provided query in SQL indexer

    Let's say I run an indexer to read a really large table with a billion records, out of which I'm interested in only a few thousand. This is a database I do not own, and I have only read permissions on it.

    In this case it would be better to have a simple filter on which data to keep and have the indexer throw away the irrelevant data.

    For example: "Index this table where TeamName equals MyTeam"

    Even though the indexer runs on a large set of data, my index would only contain a smaller subset of it (see the sketch below).

    Smaller the index,…

    66 votes
    5 comments  ·  Crawlers
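
    A sketch of what the request might look like (the query property on the container is the proposed feature, not something described as supported in this post; the datasource, table, and team names are made up):

    POST https://[service name].search.windows.net/datasources?api-version=[api-version]
    {
      "name" : "team-items-ds",
      "type" : "azuresql",
      "credentials" : { "connectionString" : "[SQL connection string]" },
      "container" : { "name" : "Items", "query" : "SELECT * FROM Items WHERE TeamName = 'MyTeam'" }
    }
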
  17. Create multiple data sources

    Having the option to run multiple data sources would give an extra level of performance and avoid reinventing the wheel.

    We are working with multiple databases in Azure, but we need to combine them and bring back a particular result. Multiple data sources would bring the feature that allows us to actually use Search as a really advanced search engine. We use DocumentDB and SQL Server in Azure, so it would be just brilliant to have that combination. The possibilities would be endless.

    3 votes
    1 comment  ·  Indexing
  18. Add ASCII-friendly string decoder to indexer field mapping functions

    Indexing of custom blob metadata to specified schema fields is great, but Azure blob metadata is restricted to ASCII characters only. Currently I can get around this limitation by encoding all blob metadata using URI encoding (Base64 encoding would also work). However, this means that those fields are not searchable/filterable on their original content.

    If you were to add a new field mapping function that decodes URI- (or Base64-) encoded values, this problem would go away (a sketch of what that could look like follows below).

    6 votes
    0 comments  ·  Crawlers
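
    A sketch of the proposed mapping, placed in the indexer definition (base64Decode names the requested decode function and the field name is illustrative; this is what the suggestion asks for, not an existing capability described in this post):

    "fieldMappings" : [
      {
        "sourceFieldName" : "title",
        "targetFieldName" : "title",
        "mappingFunction" : { "name" : "base64Decode" }
      }
    ]

    Here "title" is a blob metadata property that was stored Base64-encoded to stay within the ASCII restriction; the mapping function would decode it before it lands in the search index.
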
  19. Progressive cost for Azure search

    There are currently only two plans for Azure Search: Free (15,000 documents) and Standard (15M documents at $300/month). We need a more progressive pricing model, like all other products. This is hard to justify when your search engine costs more than the hosting and storage...

    21 votes
    0 comments  ·  Pricing and Quotas
  20. Azure Search scoring profile distance

    When we try to add a scoring profile for Azure Search based on distance, the drop-down for the field can't be selected. This makes it impossible to add a distance scoring profile.

    3 votes
    0 comments