Azure Search

Azure Search is a search-as-a-service solution that lets developers incorporate a sophisticated search experience into web and mobile applications without having to worry about the complexities of full-text search, and without having to deploy, maintain, or manage any infrastructure.

How can we improve Azure Search?

You've used all your votes and won't be able to post a new idea, but you can still search and comment on existing ideas.

There are two ways to get more votes:

  • When an admin closes an idea you've voted on, you'll get your votes back from that idea.
  • You can remove your votes from an open idea you support.

To see ideas you have already voted on, select the "My feedback" filter and then select "My open ideas".

Enter your idea and we'll search to see if someone has already suggested it.

If a similar idea already exists, you can support and comment on it.

If it doesn't exist, you can post your idea so others can support it.


  • Hot ideas
  • Top ideas
  • New ideas
  • My feedback
  • Expose Content as searchable field even in structured (JSON) data

    If I want to index JSON files, I can either do full-text search using parsingMode TEXT, or index fields using fieldMappings.

    Why not make both available? That way a user can get a fast search using an indexed field (like a product number), or a slower one using the file content (like a product description).

    2 votes · 0 comments · Indexing
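The trade-off in this suggestion can be sketched as a blob-indexer definition. All names here (data source, index, fields) are hypothetical; today the blob indexer accepts a single parsingMode, which is exactly what the idea asks to relax:

```python
# Hedged sketch of an indexer definition combining structured JSON parsing
# with field mappings. "products-ds", "products-ix", and the field names are
# made up for illustration; the suggestion is that raw file content should
# ALSO stay searchable alongside these parsed fields.
indexer = {
    "name": "products-blob-indexer",
    "dataSourceName": "products-ds",
    "targetIndexName": "products-ix",
    "parameters": {
        # today you pick ONE parsing mode: "json" for fields, "text" for content
        "configuration": {"parsingMode": "json"}
    },
    "fieldMappings": [
        # fast lookup field parsed from the JSON document
        {"sourceFieldName": "productNumber", "targetFieldName": "productNumber"},
        # slower full-text field, e.g. a product description
        {"sourceFieldName": "description", "targetFieldName": "description"},
    ],
}
```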
  • Enable indexing of gzip-compressed blobs

    https://stackoverflow.com/questions/46245505/indexing-gzipped-blobs-in-azure-search

    I am trying to set up Azure Search over Azure Blob storage. The JSON blobs are compressed using gzip.

    When I try to index the blobs as-is, I get the exception:

    "Error detecting index schema from data source: "Error processing blob https://mystorageaccount.blob.core.windows.net/mycontainer/urlencodedname with content type ''. Status:UnsupportedMediaType, error:""

    It looks like the Azure Search blob indexer does support indexing ZIP archives (application/zip MIME type), but not gzip-compressed files.

    1 vote · 0 comments · Crawlers
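Until the indexer understands gzip, a client-side workaround is to decompress the blobs yourself and push the documents through the index API. A minimal sketch, assuming each blob holds one JSON document with an "id" key (the document shape is hypothetical):

```python
import gzip
import json

def blobs_to_actions(compressed_blobs):
    """Decompress gzip'd JSON blobs client-side and build a push-API batch,
    since the blob indexer rejects gzip content. Each action in the returned
    payload targets the documents index endpoint."""
    actions = []
    for blob in compressed_blobs:
        doc = json.loads(gzip.decompress(blob).decode("utf-8"))
        actions.append({"@search.action": "mergeOrUpload", **doc})
    return {"value": actions}

# usage: simulate one gzip-compressed JSON blob
raw = gzip.compress(json.dumps({"id": "1", "body": "hello"}).encode("utf-8"))
batch = blobs_to_actions([raw])
```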
  • Import data form

    Resolve the bugginess of the Import Data grid.

    4 votes · 2 comments · Portal
  • Batch document search

    It would be great if you could send a batch of search requests to Azure Search, which could potentially lower the QPS count and mitigate network latency problems.

    1 vote · 0 comments · Query - Search
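Absent a server-side batch API, the closest client-side approximation is to issue the queries concurrently so their network latency overlaps; the QPS count itself is unchanged, which is why the suggestion asks for a real batch endpoint. A sketch with a stubbed-out query function (`run_query` is a placeholder, not a real SDK call):

```python
from concurrent.futures import ThreadPoolExecutor

def run_query(q):
    # placeholder for a single Azure Search round-trip; in real code this
    # would be an HTTP call to the index's /docs endpoint
    return {"query": q, "results": []}

def batch_search(queries, max_workers=8):
    """Fan several queries out concurrently; map() preserves input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_query, queries))

responses = batch_search(["laptop", "mouse", "keyboard"])
```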
  • Support multi-point Edm.GeographyPoint search

    It would be great if you could sort by nearest where documents could have an array of Edm.GeographyPoints. We are currently duplicating documents for every different Edm.GeographyPoint.

    1 vote · 0 comments · Query - Search
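The duplication workaround this idea wants to retire can be sketched as a fan-out step at indexing time: a document with several locations becomes one child document per point, so geo-distance sorting works against a single Edm.GeographyPoint field. The field names here are hypothetical:

```python
def expand_locations(doc):
    """Duplicate a document once per geography point so each copy carries
    a single 'location' field. The child ids ('store-1-0', 'store-1-1', ...)
    are derived from the parent id."""
    return [
        {**{k: v for k, v in doc.items() if k != "locations"},
         "id": f"{doc['id']}-{i}",
         "location": point}
        for i, point in enumerate(doc["locations"])
    ]

docs = expand_locations({
    "id": "store-1",
    "name": "Contoso",
    "locations": [{"type": "Point", "coordinates": [4.9, 52.4]},
                  {"type": "Point", "coordinates": [13.4, 52.5]}],
})
```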
  • Provide indexed timestamp on each document

    Like the '@search.score' property, it would be good to provide '@search.indextime' as an indexed timestamp. This is useful when comparing a document's lifecycle from the moment it was recorded to the moment it was used.

    It would also be helpful if Azure Search implemented time-to-live for data, as proposed in this suggestion: https://feedback.azure.com/forums/263029-azure-search/suggestions/6328648-time-to-live-for-data

    2 votes · 0 comments · Indexing
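While no '@search.indextime' exists, applications that push documents themselves can approximate it by stamping each document at upload time. A sketch; the field name 'indexedAt' is hypothetical and would need to be declared filterable/sortable in the index schema:

```python
from datetime import datetime, timezone

def stamp(doc):
    """Add an application-managed index-time timestamp before pushing.
    ISO 8601 with a UTC offset matches the Edm.DateTimeOffset format."""
    return {**doc, "indexedAt": datetime.now(timezone.utc).isoformat()}

doc = stamp({"id": "42", "title": "hello"})
```

Note this only covers the push model; documents ingested by an indexer would still lack the stamp, which is why a service-side property is being requested.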
  • Document-level security trimming

    An index contains data from a source (Blob storage, SQL DB, DocumentDB, maybe Dropbox one day?). Different sources may have different, fine-grained security on each document.

    We normally index these sources as a 'super-user' of some kind, so the index may contain information from any indexed document (like the boss's salary).

    At query time, of course, the original document source is not available. It should be possible, though, to allow one or more 'roles' to be added to each document as it is indexed, and to specify roles in the search query.

    In this way a middleware search…

    13 votes · 2 comments · Security
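The role-per-document pattern the idea describes can already be approximated with a filterable collection field and an OData filter built by the middleware. A sketch, assuming a hypothetical Collection(Edm.String) field named 'roles' on each document; the `search.in` function is standard Azure Search filter syntax:

```python
def roles_filter(user_roles):
    """Build a $filter expression restricting results to documents whose
    'roles' collection intersects the caller's roles. The caller-supplied
    roles should be trusted values from the auth layer, not user input."""
    joined = ", ".join(user_roles)
    return f"roles/any(r: search.in(r, '{joined}'))"

f = roles_filter(["hr", "managers"])
# f is then passed as the $filter parameter of every search request
```

The limitation, and the reason for the suggestion, is that this is opt-in per query: nothing stops a caller with the query key from omitting the filter.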
  • Find all documents and get single collection property

    I have an index with person data. One of its properties is a skills field that is a collection of strings. Searching through it works just fine: for example, with search=azure&$select=skills you get all people that have Azure in their document, and it shows the skills collection for each result. However, when you want to get all possible skills with the query search=*&$select=skills, you do get all people, but almost all the skills fields are empty (except one, for no apparent reason). What I want is to be able to only retrieve the skills field values of all…

    3 votes · 1 comment · Bugs
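For the underlying goal (collecting every distinct skill), faceting is a possible workaround: ask for zero documents and a large facet bucket on the field instead of paging $select=skills over everyone. A sketch of the request body, assuming the 'skills' field is marked facetable:

```python
def distinct_values_request(field, max_values=1000):
    """Build a search request body that returns only facet counts for the
    given field: 'top': 0 suppresses documents, and the facet's count
    option raises the number of distinct values returned."""
    return {
        "search": "*",
        "top": 0,                               # no documents, facets only
        "facets": [f"{field},count:{max_values}"],
    }

params = distinct_values_request("skills")
```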
  • Enable browsing the index with only read permissions

    With only read access I am not able to list indexes, or to view or search within an index.

    8 votes · 0 comments · Bugs
  • 3rd-party DataSource development

    Publish an SDK and/or documentation that will allow developers to create custom DataSource components for on-premises data indexing.

    3 votes · 0 comments
  • Detailed request monitoring

    We use Sitecore CMS, which now supports Azure Search indexes. During our implementation we discovered that the re-index process fails. Using Fiddler, we found the Azure Search REST requests that returned 207 (Multi-Status) responses and then reviewed the response bodies to determine the errors.

    Beyond running Fiddler as a proxy and capturing the HTTPS traffic from Sitecore, we found no other way to get insight into the requests/responses from the Azure Search service.

    We enabled Operation Log monitoring, but it does not provide details about request and response bodies.

    We looked into Search Traffic Analytics,…

    3 votes · 0 comments · Indexing
  • Retrieve index data source

    Once the data source is created, there is currently no way to identify where the index is indexing the data from. It would be nice to know where the data is coming from: is it QA, DEV, or PROD, and is it a Blob storage account, SQL Server, DocumentDB, etc.?

    I don't need the actual connection string, password, or other sensitive info; just the repository name would be great.

    2 votes · 0 comments
  • Make the blob indexer faster

    From the tests I made, a single blob indexer in S2 with 1 partition and 1 replica is currently only able to process between 50,000 and 75,000 small office documents (1 to 4 pages) in a 24-hour period.

    The current solution, which would be to restructure millions of blobs into "directories" with at most 75,000 blobs each and run 12 indexers, is completely out of the question due to the insane pricing model and the time it would take to both modify consumers with new paths and move blobs to a new structure. The latter being…

    9 votes · 1 comment · Indexing
  • base64decode method should handle standard strings

    The Azure Search indexer's base64Decode mapping function should handle standard URL-safe Base64-encoded strings, and not the out-of-standard strings returned by the classic System.Web.HttpServerUtility.UrlTokenEncode method, which is not even available in the latest versions of the framework. That method returns non-standard strings: it replaces the '=' padding characters with a digit indicating the number of '=' signs that were removed. Azure Search's base64Decode expects these non-standard strings rather than standard URL-safe Base64-encoded strings, and otherwise breaks with the error: "Error applying mapping function 'base64Decode' to field 'aaa': Array cannot be null.\r\nParameter name: bytes". Which means that using JavaScript or…

    6 votes · 1 comment · Indexing
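The difference between the two encodings can be made concrete. Below is a sketch that reproduces the UrlTokenEncode behavior the complaint describes (standard URL-safe Base64 with the '=' padding replaced by a digit counting it), next to the standard encoding that base64Decode rejects:

```python
import base64

def url_token_encode(data: bytes) -> str:
    """Reimplementation sketch of the non-standard encoding the indexer's
    base64Decode currently expects: URL-safe Base64 where trailing '='
    padding is stripped and replaced by a digit giving its count."""
    b64 = base64.urlsafe_b64encode(data).decode("ascii")
    stripped = b64.rstrip("=")
    return stripped + str(len(b64) - len(stripped))

standard = base64.urlsafe_b64encode(b"doc/1").decode("ascii")  # 'ZG9jLzE='
token = url_token_encode(b"doc/1")                             # 'ZG9jLzE1'
```

Producing the token form from JavaScript or any non-.NET stack means hand-rolling this padding trick, which is exactly the friction the idea objects to.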
  • Add more document allowance to the free tier

    Thank you for the free tier! We really love it. Just one thing to ask: is it possible to add more document capacity to the free tier? We know you have to cut it off somewhere, but 10,000 is just a bit small for evaluation purposes when working with big datasets. Maybe 100k or 150k?

    3 votes · 0 comments · Indexing
  • Clone an entire Azure Search instance, just like I can clone an App Service

    It would be extremely handy if I could clone an entire Azure Search instance to the same or a new resource group, in the same or a new location, using the same or another Azure subscription, just as I can clone an Azure App Service wherever I want.

    Use case: if there is an issue that is reproducible with a production instance of Azure Search, I can clone it to a separate sandbox and take it from there. As of today, I have to create a brand-new Azure Search instance, clone the SQL database, and reindex to get the same data.

    9 votes · 0 comments · Enterprise
  • Allow custom-defined field mapping functions

    Currently, when defining indexers, one can specify the field mappings from source to target fields, but only a limited set of functions (such as "extract token at position") is supported. It would be desirable to support a Func syntax such that each row is passed to the func, which then decides the target field value based on developer-defined logic.

    E.g.: we have a Cosmos DB source and an Azure Table storage data source. The partition key for the Cosmos DB collection is a combination of tenantId:studentId for efficient reads/writes across partitions based on our data. The Azure Table storage…

    6 votes · 0 comments · Crawlers
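The kind of developer-defined mapping the idea wants can be sketched as a client-side transform run before documents are pushed (the built-in mapping functions cannot split a composite key). Field names are taken from the example above; the helper itself is hypothetical:

```python
def split_partition_key(doc):
    """Split a 'tenantId:studentId' composite partition key into two
    separate target fields, the way a custom Func-style mapping would."""
    tenant_id, student_id = doc["partitionKey"].split(":", 1)
    return {**doc, "tenantId": tenant_id, "studentId": student_id}

mapped = split_partition_key({"partitionKey": "contoso:42", "name": "Ada"})
```

The drawback, and the point of the suggestion, is that this only works in the push model; indexer-driven ingestion offers no hook for such logic.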
  • Table storage indexing of multiple tables at scale

    I believe that a good pattern for Azure Table storage is to enable fast deletes by simply dropping tables. I've implemented a multi-tenant model for my data using a table per tenant, and would like to aggregate search data across tables.

    The current recommendation is to create an indexer per table. This means the number of tenants I support (which I'm hoping to be in the thousands) is limited to the number of indexers I can create, which is a very low number.

    The latest Table storage SDK has methods to asynchronously list tables in a segmented fashion using continuation…

    14 votes · 0 comments · Crawlers
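The scaling wall described above can be sketched in a few lines: one indexer and one data source definition per tenant table, which fails once the tenant count passes the service's indexer cap (the cap value below is a hypothetical placeholder, not a documented limit):

```python
def indexers_for_tables(tables, max_indexers=50):
    """Build one indexer/data-source definition pair per tenant table,
    refusing when the per-service indexer cap would be exceeded."""
    if len(tables) > max_indexers:
        raise ValueError(
            f"{len(tables)} tables exceed the {max_indexers}-indexer limit")
    return [{"name": f"ix-{t}", "dataSourceName": f"ds-{t}"} for t in tables]

defs = indexers_for_tables(["tenant-a", "tenant-b"])
```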
  • Provide a warm-up procedure

    When we scale out Search replicas, we see a latency peak right after the scale-out. It results in throttled queries, so I want to reduce it when we apply a scheduled replica change.

    It would be good to provide "predefined warm-up query sets" run on an after-scale-out event: for example, a textbox whose GET queries are performed line by line, multiple times.

    Note that I'm using Azure Automation with this script to change the replica count on a schedule: https://gallery.technet.microsoft.com/scriptcenter/azure-search-change-c0b49c4c

    2 votes · 0 comments · Multi-tenancy
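The "line by line, multiple times" behavior requested above can be sketched as a small replay loop that a runbook would call after scale-out; `send` is a placeholder for the actual HTTP GET against the search endpoint:

```python
def warm_up(queries, rounds=3, send=lambda q: None):
    """Replay a predefined list of GET query strings several times to warm
    new replicas. Returns the queries issued, in order, for logging."""
    issued = []
    for _ in range(rounds):
        for q in queries:
            send(q)          # stand-in for the real search request
            issued.append(q)
    return issued

log = warm_up(["search=*", "search=laptop"], rounds=2)
```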
  • Support for indexing multi-line JSON files

    Support indexing multi-line JSON files (https://en.wikipedia.org/wiki/JSON_Streaming). Currently the blob indexer extracts the entire JSON file as one document.

    It would also be nice to pair this with gzip support, so that it can index *.json.gz files.

    7 votes · 0 comments · Crawlers
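While the blob indexer treats a multi-line JSON blob as a single document, a client-side workaround is to split the blob on newlines and push each line through the index API. A sketch, assuming one JSON object per line with an "id" key:

```python
import json

def jsonl_to_actions(blob_text):
    """Turn newline-delimited JSON text into a push-API batch, one
    document per non-blank line."""
    return [
        {"@search.action": "mergeOrUpload", **json.loads(line)}
        for line in blob_text.splitlines() if line.strip()
    ]

actions = jsonl_to_actions('{"id": "1"}\n{"id": "2"}\n')
```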
                                          • Don't see your idea?
