If you are using Catalog Search APIs for any customer-facing features, you are doing it wrong!
I have seen this problem a couple of times: the search feature on the site is “dead” – it is very slow, and the log file is filled with deadlock or timeout errors. As it turns out, the search feature was built on top of the Catalog Search APIs, which is a big no-no.
To be clear, there are two built-in families of search APIs in Episerver Commerce. The “fast” one is the SearchProvider APIs, used via SearchManager, ISearchCriteria and ISearchResults. It is indexed search (strictly speaking, you can make it non-indexed, but that’s beside the point), and the actual searching is performed by a provider, like LuceneSearchProvider, Solr35SearchProvider, or FindSearchProvider.
Because it works against the indices, this API is very fast, and it does not touch the database at all (it still reads from the database when building the index, but not for the actual search). The “slow” one is any API that uses CatalogSearchParameters and CatalogSearchOptions, such as ICatalogSystem.FindItems, ICatalogSystem.FindItemsDto, etc. These methods touch the database directly, and not just for reading: a call to one of them results in inserts and deletes as well. It is inevitably slow(er) – and it’s the one we are talking about today.
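To make the contrast concrete, here is a minimal sketch of both flavors. I’m writing these signatures from memory, so treat every detail – CatalogEntrySearchCriteria, the FindItemsDto overload with a ref record count, the ServiceLocator lookup – as an approximation to verify against your Commerce version:

```csharp
// The "fast" path: indexed search through a SearchProvider.
var criteria = new CatalogEntrySearchCriteria();
criteria.SearchPhrase = "shoes";
criteria.RecordsToRetrieve = 20;

var searchManager = new SearchManager(AppContext.Current.ApplicationName);
var results = searchManager.Search(criteria); // served from the index, no database round-trip

// The "slow" path: Catalog Search APIs going straight to the database.
ICatalogSystem catalogSystem = ServiceLocator.Current.GetInstance<ICatalogSystem>();
int totalRecords = 0;
CatalogEntryDto dto = catalogSystem.FindItemsDto(
    new CatalogSearchParameters(),
    new CatalogSearchOptions { StartingRecord = 0, RecordsToRetrieve = 20 },
    ref totalRecords); // reads – and writes – catalog tables
```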
You might ask: why doesn’t Episerver remove the Catalog Search APIs altogether and keep only the SearchProvider APIs, given that the former is slower? Well, there are a couple of reasons for that:
- The search results from a SearchProvider are not real-time. Even with evented indexing, there will always be a delay between the data in the database and the data in the index, so you are never guaranteed the latest data.
- SearchProvider is, at its heart, read-only. The Catalog Search APIs can be used for editing as well.
The Catalog Search APIs fill a gap in the content APIs (and in the Catalog UI as well): they provide the ability to edit multiple entries/nodes at once. Let’s assume you want to update the assets of every entry in your catalog. With the content APIs, your best option is to load, edit and save one entry at a time (you can batch the loading with methods like GetChildren/GetDescendents, but in the end you still save one by one). With the Catalog Search APIs, you can load a CatalogEntryDto (which can contain multiple entries), update its CatalogItemAsset table, and save everything in one go. I have been talking about moving to the new content APIs, but in this example they simply don’t stand a chance against the Catalog Search APIs.
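A rough sketch of that flow, with names as I remember them (the CatalogItemAsset rows and SaveCatalogEntry may differ slightly in your version):

```csharp
ICatalogSystem catalogSystem = ServiceLocator.Current.GetInstance<ICatalogSystem>();

// One call loads a DTO that can hold many entries, plus related tables.
int totalRecords = 0;
CatalogEntryDto dto = catalogSystem.FindItemsDto(
    new CatalogSearchParameters(),
    new CatalogSearchOptions { RecordsToRetrieve = 200 },
    ref totalRecords);

// Edit the asset rows for every loaded entry in memory...
foreach (CatalogEntryDto.CatalogItemAssetRow assetRow in dto.CatalogItemAsset)
{
    assetRow.GroupName = "default"; // whatever change you need
}

// ...and persist all modified rows in a single save.
catalogSystem.SaveCatalogEntry(dto);
```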
(You might argue that a raw SQL statement could do the same thing, even faster. That’s a big no-no: SQL is a poor fit for complex logic processing, and we highly recommend you avoid running SQL queries directly against the database – there is the cache, and a lot of other things, to take care of.)
Back to the original statement of this blog: because the Catalog Search APIs touch the database (and, in most cases, do no caching at all), using them intensively is a bad idea – such as putting them behind the search engine of your site. It will, sooner or later, bring your database to its knees. Use a SearchProvider for that – it’s what it was meant for.
Even if you are using the Catalog Search APIs only for your editing features, make sure you set a sensible value for RecordsToRetrieve in CatalogSearchOptions. I’ve seen it set to 10000, which is, IMO, a “bad” value: loading 10,000 catalog entries at once is very database-intensive (and it involves inserts and deletes as well, remember 😉 ?). The optimal value varies from site to site, but I find 100–500 per batch a good starting point (adjust from there to find the best batch size for you).
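In practice that means paging through the catalog with StartingRecord/RecordsToRetrieve instead of grabbing everything at once. A hedged sketch – the ref parameter returning the total record count is how I recall the overload, so double-check it:

```csharp
const int batchSize = 200; // a reasonable starting point; tune per site
var options = new CatalogSearchOptions { RecordsToRetrieve = batchSize };
int totalRecords = 0;
int start = 0;

do
{
    options.StartingRecord = start;
    CatalogEntryDto dto = catalogSystem.FindItemsDto(
        new CatalogSearchParameters(), options, ref totalRecords);

    // ... apply your edits to the rows in dto ...
    catalogSystem.SaveCatalogEntry(dto);

    start += batchSize;
} while (start < totalRecords);
```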
Using the right APIs, with the right parameters, is one of the keys to making your database server happy 🙂