My best purchases from Amazon, 2024 edition

Verbatim 43888 External Slimline Blu-Ray Burner

One of my last purchases in 2024 turned out to be one of the best. I had wanted to rip my Blu-ray collection for a while, for convenience, but had always been put off by the task of selecting the right drive. I have no space for an internal drive, and not every drive can rip 4K Blu-rays – most need their firmware flashed first. It turned out there is one that just works out of the box. Pop it out of the box, plug it in, and start ripping. Simple as that.

It is fairly quiet most of the time (depending on the disc), and it just works. Speed is a bit slow given the size of 4K Blu-rays, but it is a slim, external drive, so I can’t really complain.

https://amzn.to/4iH4oAn at around 1,200kr (1,300kr minus a 100kr coupon at checkout, at the time of writing)

OXO Good Grips Kitchen & Herb Scissors

This is my favorite pair of kitchen scissors. The Victorinox one is good too, but this one feels sturdier in the hand. And boy, it cuts through everything – if you need to break down a chicken, for example, it does not shy away from cutting through bone. It’s stainless steel, so you can just throw it in the dishwasher, and it can be disassembled and reassembled in seconds. Nothing to complain about – just a terrific pair of kitchen scissors.

https://amzn.to/4fnpKjG at around 250kr

The runner-up is the Victorinox one

https://amzn.to/4fFiNL3 at around 220kr

Nordic Ware 44671AMZ Prism Large and Half Sheet Set, Aluminum

The absolute best baking sheets, by Nordic Ware: thick aluminum, made in the USA. If you are a fan of baking cookies, they are a must. Sadly, at the time of writing they are out of stock, but they were available for around 250kr for a long time. Make sure to snatch a set when it’s available again – a steal at that price, since a single sheet costs more than that (250kr) in other stores, and here you get two.

https://amzn.to/3ZMvcG

OXO Good Grips Y Peeler

There are many good peelers out there, and the best performance for the price, in my experience, comes from Victorinox – it can peel carrots like there is no tomorrow. It’s that good. Sadly, it is no longer available from Amazon themselves, and you have to buy from a third-party seller, which means additional shipping fees.

https://amzn.to/3ZLz4bg at around kr54.30

I also have an OXO peeler, which I bought a few years ago but did not use that much. A friend of mine visited recently and insisted that I buy one for her (she is paying, of course). It is a great peeler, and some might prefer it to the Victorinox one, as the handle is more comfortable. I bought one for my friend and she is more than happy with it. Maybe you will be, too.

https://amzn.to/3BNYI75 at around 125kr.

Neutrogena Concentrated unscented hand cream (50 ml)

Among the many positive things about Sweden, the cold, dry winter is not one: I suffer from it, and my hands suffer the most. My collection of hand moisturizers is somewhat comparable to my wife’s skincare collection. I have tried everything from O’Keeffe’s to CeraVe to Eucerin; they helped a little, but nothing really solved my problem. Enter Neutrogena hand cream. Truly a lifesaver. I wish I had known about it 9 years ago.

https://amzn.to/4glZh7e at around 38kr

Best deals on Amazon.se week 51/2024

Fjallraven 87177-244 Vidda PRO Pants M Sports Pants Men’s Suede Brown Size 48/L

Easily 2,000kr in other shops

https://amzn.to/3ZZemVB

Dreame L40 Ultra Robot Vacuum Cleaner with Removable and Liftable Mop, Extendable and Liftable Side Brush, 11000Pa Suction Power, 65°C Mop and Washboard Self-Cleaning, Automatic Draining, Automatic

Reduced to kr9,770.29

https://amzn.to/41w0sMW

Yes Max Power Dishwasher Liquid Pomegranate 650ml, easy and smooth cleaning, even for your dirtiest pots and pans

Reduced to kr35.49

https://amzn.to/3VKNzL5

Dreame Z10 Station with auto empty station

Reduced to kr2,999.00

https://amzn.to/3DfWhdU

Philips SONICARE W2 Optimal White Standard Sonic Toothbrush Heads in Original Design – Pack of 8 Units in White 

Reduced to kr349.00

https://amzn.to/3VEUPrW

ECCO Men’s Track 25 Mid GtxClassic Boots

Reduced to kr1,287.17

https://amzn.to/41F35w4

Waterpik Ultra Professional Water Flosser, Mouth Shower With 7 Nozzles And Advanced Control Of Water Pressure With 10 Settings, Tool For Removing Plaque, White (WP-660EU)

Reduced to kr599 (a further 100kr off at checkout)

https://amzn.to/3ZZ9wJ3

Apple iPad Pro 11-inch (M4): Ultra Retina XDR display, 512 GB, 12 MP landscape front camera / 12 MP rear camera, Wi-Fi 6E + 5G, all-day battery life, standard glass – Silver

Reduced to kr14,882.81

https://amzn.to/3BEDaKg

Best deals on Amazon.se week 50/2024

Philips Sonicare Original W2 Optimal White HX6066/10

Reduced to kr289.99

https://amzn.to/3BlL4Io

Several UGREEN USB-C hubs reduced to Black Friday prices

https://amzn.to/4g3ZUCo

Thermos Bottle + Tea/FRUIT Filter Infuser, Insulated Stainless Steel Water Bottle, BPA Free | Drink Bottle + Tea Infuser / Fruit Insert, 450ml To-Go Thermo Cup Thermo Flask: Office Sports School

Reduced to kr227.99

https://amzn.to/4iv5Oy7

Philips HX9611/19 Sonicare ExpertClean 7300 Electric Toothbrush, White, Pack of 2

Reduced to kr1,577.13

https://amzn.to/3D9AHI7

Noctua NF-A12x25 PWM chromax.black.swap, Silent Premium Fan, 4-Pin (120mm, Black)

Reduced to kr199

https://amzn.to/49M9HuF

Merrell Moab 3 GTX men trekking shoes

Reduced to kr850

https://amzn.to/4f7GdZb

LEGO Technic Koenigsegg Jesko Absolut Grey Hyper Car, Construction Set for Boys and Girls, Toy Car for Kids, Buildable Model Set, Gift Idea for Motor Enthusiasts 42173

Reduced to kr390

https://amzn.to/4g9UlCi

Apple MacBook Air 15.3″ (2024) – M3, 10-core GPU, 16GB RAM, 512GB SSD

Reduced to kr15,743.52

https://amzn.to/4iyzom7

Yes Platinum All In One Dishwasher Detergent, 94 Dishwasher Tablets

Reduced to kr138.89

https://amzn.to/4fiK7hM

TIMEMORE Coffee Scale Basic 2.0 Espresso Scale

Reduced to kr542.40

https://amzn.to/3PcEqaN

Technics EAH-AZ80E-K Wireless Headphones with Noise Cancellation, Multipoint Bluetooth 3 Devices, Comfortable In-Ear Headphones, Wireless Charging, Black

Reduced to kr2,413.60

https://amzn.to/3ZwlCI4

Eneloop Pro BK-3HCDE/4BE Batteries AA

Reduced to kr80

https://amzn.to/4iAKlUk

ESI U22 XT | Professional 24-bit USB Sound Card
Reduced to kr516.78. Can be combined with the “spend 600kr, get 100kr off” discount

https://amzn.to/4gsi0Ou

BGS 32100 | U-Ring Wrench Set | Inch Sizes | 3/8″ – 11/16″ | 6 Pieces

Reduced to kr89.34

https://amzn.to/4gnjZDK

Swiffer mop wipes (12 pads)
Reduced to kr35.79

https://amzn.to/4iL6kbf

LEGO Technic NEOM McLaren Formula E Racing Car 42169

Reduced to kr413.99

https://amzn.to/41Lr9Nx

Gillette Fusion 5 Razor Blade Refills for Men, 8pcs

Reduced to kr250.39

https://amzn.to/41Nk4MO

AsyncHelper can be considered harmful

.NET developers have been transitioning from synchronous APIs to asynchronous ones. The move was boosted a lot by the async/await keywords of C# 5.0, but we are now in a dangerous middle ground: there are as many synchronous APIs as there are asynchronous ones. The mix requires the ability to call async APIs from a synchronous context, and vice versa. Calling synchronous APIs from an async context is simple – you can fire up a task and let it do the work. Calling async APIs from a sync context is much more complicated. And that is where AsyncHelper comes into play.
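The easy direction – sync work called from async code – can be sketched like this (a minimal example; `LoadReportSync` is a hypothetical blocking method standing in for a legacy API):

```csharp
using System.Threading.Tasks;

public class ReportService
{
    // Hypothetical blocking API we cannot change.
    private string LoadReportSync() => "report";

    // Offload the blocking work to a thread pool thread,
    // so the async caller is not blocked while it runs.
    public Task<string> LoadReportAsync() =>
        Task.Run(() => LoadReportSync());
}
```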

AsyncHelper is commonly used to run async code in a synchronous context. It is a simple helper class with two methods to run async APIs:

    public static class AsyncHelper
    {
        private static readonly TaskFactory _myTaskFactory = new TaskFactory(
            CancellationToken.None, TaskCreationOptions.None,
            TaskContinuationOptions.None, TaskScheduler.Default);

        public static TResult RunSync<TResult>(Func<Task<TResult>> func)
        {
            var cultureUi = CultureInfo.CurrentUICulture;
            var culture = CultureInfo.CurrentCulture;
            return _myTaskFactory.StartNew(() =>
            {
                Thread.CurrentThread.CurrentCulture = culture;
                Thread.CurrentThread.CurrentUICulture = cultureUi;
                return func();
            }).Unwrap().GetAwaiter().GetResult();
        }

        public static void RunSync(Func<Task> func)
        {
            var cultureUi = CultureInfo.CurrentUICulture;
            var culture = CultureInfo.CurrentCulture;
            _myTaskFactory.StartNew(() =>
            {
                Thread.CurrentThread.CurrentCulture = culture;
                Thread.CurrentThread.CurrentUICulture = cultureUi;
                return func();
            }).Unwrap().GetAwaiter().GetResult();
        }
    }

There are slight variants of it, with and without setting CurrentCulture and CurrentUICulture, but the main part is the same: spawn a new task to run the async delegate, then block and get the result using Unwrap().GetAwaiter().GetResult().

One of the reasons it became so popular is that people think it was written by Microsoft, so it must be safe to use. That is not actually true: the class was introduced as an internal class in AspNetIdentity (AspNetIdentity/src/Microsoft.AspNet.Identity.Core/AsyncHelper.cs at main · aspnet/AspNetIdentity on github.com). That means Microsoft teams can use it when they think it’s the right choice; it is not the default recommendation for running async tasks in a synchronous context.

Unfortunately, I’ve seen my fair share of threads stuck in an AsyncHelper.RunSync stack trace, most likely victims of a deadlock:

    756A477F9790	    75ABD117CF16	[HelperMethodFrame_1OBJ] (System.Threading.Monitor.ObjWait)
    756A477F98C0	    75AB62F11BF9	System.Threading.ManualResetEventSlim.Wait(Int32, System.Threading.CancellationToken)
    756A477F9970	    75AB671E0529	System.Threading.Tasks.Task.SpinThenBlockingWait(Int32, System.Threading.CancellationToken)
    756A477F99D0	    75AB671E0060	System.Threading.Tasks.Task.InternalWaitCore(Int32, System.Threading.CancellationToken)
    756A477F9A40	    75AB676068B8	System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(System.Threading.Tasks.Task, System.Threading.Tasks.ConfigureAwaitOptions)
    756A477F9A60	    75AB661E4FE7	System.Runtime.CompilerServices.TaskAwaiter`1[[System.__Canon, System.Private.CoreLib]].GetResult()

A further explanation of why this is bad can be found here:

c# – Is Task.Result the same as .GetAwaiter.GetResult()? – Stack Overflow

Async/sync interop is a complex topic, and even experienced developers make mistakes. There is no simple way to just run async code in a sync context, and AsyncHelper is definitely not one. It is a simple, convenient way, but it is not guaranteed to be the correct one for your use case. I see it as a shortcut that solves some problems but creates bigger ones down the road.
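Where you control the call chain, the safer alternative is to go async all the way up instead of blocking – a minimal sketch (`GetDataAsync` is a hypothetical async API):

```csharp
using System.Threading.Tasks;

public class Handler
{
    // Hypothetical async API.
    private Task<string> GetDataAsync() => Task.FromResult("data");

    // Rather than AsyncHelper.RunSync(() => GetDataAsync()),
    // let async flow all the way up to the caller.
    public async Task<string> HandleAsync()
    {
        var result = await GetDataAsync();
        return result;
    }
}
```

This is not always possible (constructors, legacy interfaces), but wherever it is, no thread has to block on a task at all.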

Just because you can doesn’t mean you should. That applies to AsyncHelper perfectly.

The search for dictionary key

Recently I helped chase down a ghost (you might be surprised to learn that I spend most of my hours being a ghostbuster – it can be fun, sometimes). A customer reported a weird issue: a visitor goes to their website and has everything correct in the cart, including the discount, only to have the discount disappear at checkout. That would be a fairly easy bug to debug and fix, were the problem not random in nature. It happened once in a while – on average, daily – but could not be reproduced locally, or reproduced consistently on production, so every fix was based on guesswork.

After a lot of dry code reading and then log reading, it turned out that the missing discount was really a problem of a missing cache entry: once in a while, the cached promotion list came back empty, so no discount was applied to the order.

But why?

After a few guesses, it eventually came to me that the problem was with caching using a Dictionary – more specifically, campaigns are loaded and cached in a Dictionary with IMarket as the key. That would be fine, and highly efficient, if not for the fact that the default implementation of IMarket is not suitable as a dictionary key. It does not implement IEquatable<T> or override Equals and GetHashCode, which means the only case in which two IMarket instances are equal is when they are the same instance. Even if their properties are all equal in value, the instances will not be equal.

This short program demonstrates the problem. You can expect it to write “False” to the console.

using System;
using System.Collections.Generic;

public class Program
{
    private static Dictionary<AClass, int> dict = new Dictionary<AClass, int>();

    public static void Main()
    {
        dict.Add(new AClass("abc", 1), 1);
        dict.Add(new AClass("xyz", 2), 2);

        // A new instance with the same values is a different reference,
        // so the default reference-based Equals/GetHashCode will not find it.
        Console.WriteLine(dict.ContainsKey(new AClass("abc", 1)));
    }
}

public class AClass
{
    public AClass(string a, int b)
    {
        AString = a;
        AnInt = b;
    }

    public string AString { get; set; }
    public int AnInt { get; set; }
}

The question that arises is: if the key does not match and an empty campaign list is returned, why does this only happen sometimes? The answer is that the IMarket instances themselves are cached, by default for 5 minutes. So for the problem to occur, the campaign cache must be populated just before the cache of IMarket instances expires (and new instances are created). Once the new IMarket instances are loaded, the campaign cache must then be accessed again before it expires itself (default 30 seconds). The timing needs to be just “right”, which makes this problem elusive and hard to find in normal testing – both automated and manual.

Time for some blaming and finger pointing. When I fix something, I usually check the history of the code to understand the reasoning behind the original idea and intention. Was there a reason, or just an oversight? And most importantly:

Who wrote such code?

Me, about 7 months ago.

Uh oh.

The fix was simple enough: instead of IMarket, we changed the key to MarketId, which implements both IEquatable<T> and IComparable<T>. So it does not matter if you have two different MarketId instances – as long as they have the same value, they are equal.
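Applied to the demo program above, the same kind of fix looks like this (a sketch: implementing IEquatable<T> together with a matching GetHashCode makes value-equal keys match):

```csharp
using System;
using System.Collections.Generic;

public class AClass : IEquatable<AClass>
{
    public AClass(string a, int b)
    {
        AString = a;
        AnInt = b;
    }

    public string AString { get; }
    public int AnInt { get; }

    // Value equality: two instances with the same property values are equal.
    public bool Equals(AClass other) =>
        other != null && AString == other.AString && AnInt == other.AnInt;

    public override bool Equals(object obj) => Equals(obj as AClass);

    // Keys that compare equal must produce the same hash code.
    public override int GetHashCode() => HashCode.Combine(AString, AnInt);
}

public class Program
{
    public static void Main()
    {
        var dict = new Dictionary<AClass, int> { { new AClass("abc", 1), 1 } };
        Console.WriteLine(dict.ContainsKey(new AClass("abc", 1))); // True
    }
}
```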

A workaround was sent to the customer to test, and after a week or so they reported back that the problem was gone. The official fix is in Commerce 14.31, which was released yesterday (https://nuget.optimizely.com/package/?id=EPiServer.Commerce.Core&v=14.31.0), so you are, as always, highly recommended to upgrade.

Lessons learned:

  • Pick your dictionary key carefully. It should implement IEquatable<T> and override Equals and GetHashCode – properly, I might add. In general, a struct is a better choice than a class, if you can use one.
  • No matter how “experienced” you think you are, you are still a human being and can make mistakes. It’s important to have someone check your work from time to time, spotting the problems you couldn’t.

Best Black Friday deals on Amazon.se

Black Friday is in full swing, and here are the best deals you can buy now

Ninja air fryer MAX PRO, 6.2 litres, uses no oil, large square single box, family size, non-Stick, dishwasher safe basket and crisp plate, silicone tongs, black and copper, AF180EUCP

https://amzn.to/3D3ASEC

Ninja Foodi MAX Dual Zone Digital Air Fryer, 2 Drawers, 9.5L, 6-in-1, Uses No Oil, Air Fry, Max Crisp, Roast, Bake, Dehydrate, 8 Servings, Nonstick Dishwasher Safe Baskets, Black AF400EU

https://amzn.to/4g6wumN

TP-Link Tapo P115 Smart Plug, White

https://amzn.to/4eWTQdz

Corsair SP120 ELITE, 120mm PWM Hydraulic Bearing Case Fan With CORSAIR AirGuide Technology – Low Noise, 24.7dBA, Fan Speeds From 300 RPM – 1300 RPM, 45.4 CFM, Single Pack – Black

https://amzn.to/4gdMYJF

The downside of being too fast

Today, while tracking down some changes, I came across this commit comment:

The bugfix for COM-xxxx seems to make the importing
of metafields too fast and causing too many events raised,
potentially flood the event system. This avoids raising the events
unnecessarily.

Who wrote this? Me, almost 5 years ago. I did have a chuckle at my younger self being brutally honest (I was the one who fixed the original bug, so I was responsible for making it “too fast”). Because the import was sending too many events in a short time, the event system was overwhelmed and some other events (like cache invalidation of other parts) did not get through.

Well, this bug fix made the import even faster, by reducing the number of events raised. In the end, everyone was happy.

Moral of the story:

  • A system can only be as fast as its slowest component. Keep that in mind when deciding where to put your effort – in most cases, the slowest part is where you should spend your time.
  • It is important to validate any optimization end-to-end. Faster is not always better if some other part cannot keep up when this part becomes unexpectedly faster.
  • You can’t predict everything; things will happen in the most unexpected ways. Move fast, break things, learn from it, fix it. Rinse and repeat.

Solving the mystery of high memory usage

Sometimes my work is easy: the problem can be resolved with one look (when I’m lucky enough to look where it needs to be looked at, like this one: Varchar can be harmful to your performance – Quan Mai’s blog (vimvq1987.com)). Sometimes it is hard. I can’t count the number of times I stared blankly at the screen and decided I’d better take a nap, roast a batch of coffee, or take a walk (that is a lie – I don’t walk), because I was out of ideas and going nowhere. The life of a software diagnostics engineer is like that: sometimes you are solving the mystery of “what do I need to solve this mystery”. There are usually many dots scattered all over the place; your job is to figure out which dots make sense, which do not, and how to connect the relevant ones to solve the problem – and to tell a story.

Today’s story is about a customer complaining that their scheduled-job instance on DXP keeps its memory high after running the Find indexing job. They have a custom job built to optimize performance for their language settings, but the idea is the same: load content, serialize it, and send it to the server endpoint for indexing. It is indeed a memory-heavy job, especially when you have a lot of content to index (basically, number of content items × number of languages × the complexity of the content). It is normal to see increased memory usage during such a job – the application (or rather the runtime, depending on how you look at it) is doing its job. Content needs to be loaded into memory, and if memory is available, it would be a huge waste not to use it for something useful. The application will not immediately release that memory, because the content is cached. The memory will only be reclaimed when the cache expires, or when the application comes under memory pressure (i.e. it asks the operating system for more memory and the OS refuses: “there is nothing left”). Even when the cache expires, the application will not always compact and release the memory back to the OS (the large object heap, etc.).

What is problematic is that the customer’s application retained 25GB of memory indefinitely. They waited 24 hours, but the memory usage was still high. The application appeared to be fine – it did not crash with memory issues (like OutOfMemoryException) – but it caused confusion and worry for our customer. Game on.

One thing that did not make sense in this case: even though they have a custom indexing job, it is still a scheduled job, and for scheduled jobs the content is supposed to have a very short sliding expiration time (default 1 minute). However, the cache entries in the memory dumps told a different story: the majority of the cache entries had a 12-hour sliding expiration. Which does explain – in part at least – why the memory remained high. With a longer sliding window, the chance is higher that the cache is hit at least once before it expires, which resets the expiration. With sufficient hits, the cache will effectively remain in memory forever, until you actively evict it (by editing the content, for example).

0000753878028910                        0.77kb          0                           12:00:00                    2/16/2024 5:58:43 AM +00:00    EPPageData:601596:en__CatalogContent
0000753878029DC0                        0.78kb          0                           12:00:00                    2/16/2024 2:59:39 PM +00:00    EPPageData:1345603:es-pr__CatalogContent
00007538781C7F48                        0.78kb          0                           12:00:00                    2/16/2024 2:59:39 PM +00:00    EPPageData:1351986:es-pr__CatalogContent
00007538781C8058                        0.78kb          0                           12:00:00                    2/16/2024 2:59:39 PM +00:00    EPPageData:1346230:es-pr__CatalogContent
00007538781C8168                        0.78kb          0                           12:00:00                    2/16/2024 2:59:39 PM +00:00    EPPageData:1351988:es-pr__CatalogContent
00007538786FA8E8                        0.77kb          0                           12:00:00                    2/16/2024 8:14:53 AM +00:00    EPPageData:1049433:no__CatalogContent
00007538786FC598                        0.78kb          0                           12:00:00                    2/16/2024 9:32:28 AM +00:00    EPPageData:1088026:es-pr__CatalogContent
00007538786FD9E0                        0.77kb          0                           12:00:00                    2/16/2024 8:14:53 AM +00:00    EPPageData:1049435:no__CatalogContent
0000753878700770                        0.77kb          0                           12:00:00                    2/16/2024 7:52:53 AM +00:00    EPPageData:1029725:da__CatalogContent
0000753878706528                        0.78kb          0                           12:00:00                    2/16/2024 2:59:39 PM +00:00    EPPageData:1351990:es-pr__CatalogContent
0000753878706638                        0.78kb          0                           12:00:00                    2/16/2024 2:59:39 PM +00:00    EPPageData:1350104:es-pr__CatalogContent
00007538787A2F80                        0.77kb          0                           12:00:00                    2/16/2024 8:14:53 AM +00:00    EPPageData:1049439:no__CatalogContent
00007538787A3FD0                        0.77kb          0                           12:00:00                    2/16/2024 7:52:53 AM +00:00    EPPageData:1029729:da__CatalogContent
00007538787A6B48                        0.77kb          0                           12:00:00                    2/16/2024 7:52:53 AM +00:00    EPPageData:1029731:da__CatalogContent
00007538787A74C0                        0.77kb          0                           12:00:00                    2/16/2024 6:21:34 AM +00:00    EPPageData:690644:en__CatalogContent
00007538787A9CC8                        0.78kb          0                           12:00:00                    2/16/2024 5:43:57 AM +00:00    EPPageData:181410:cs-cz__CatalogContent
00007538787ACDD8                        0.82kb          0                           12:00:00                    2/16/2024 2:17:38 PM +00:00    EPPageData:1343746__CatalogContent
00007538787ACFF8                        0.83kb          0                           12:00:00                    2/16/2024 2:17:25 PM +00:00    EPPageData:1343746:en__CatalogContent
00007538787AE658                        0.77kb          0                           12:00:00                    2/16/2024 2:59:37 PM +00:00    EPPageData:1350160:da__CatalogContent
00007538787AE768                        0.77kb          0                           12:00:00                    2/16/2024 2:59:37 PM +00:00    EPPageData:1350162:da__CatalogContent
00007538787AEA98                        0.39kb          0                           00:00:00                    2/16/2024 2:17:38 PM +00:00    EPiAnc:ContentAssetAware1343745__CatalogContent
00007538787AF058                        0.77kb          0                           12:00:00                    2/16/2024 2:59:37 PM +00:00    EPPageData:1347560:da__CatalogContent
00007538787B29A0                        0.77kb          0                           12:00:00                    2/16/2024 2:17:07 PM +00:00    EPPageData:1329806:da__CatalogContent
00007538787B2E68                        0.77kb          0                           12:00:00                    2/16/2024 2:17:07 PM +00:00    EPPageData:1329808:da__CatalogContent
00007538787B31E8                        0.77kb          0                           12:00:00                    2/16/2024 2:17:07 PM +00:00    EPPageData:1329810:da__CatalogContent

This is not what it should be, however: the default sliding expiration timeout for content loaded by a scheduled job is 1 minute – i.e. it is treated as a load-once-and-be-done item. Was it set to 12 hours by mistake? Nope.

The timeout was set to 600,000,000 ticks, which is 60 seconds – the default value.

I had been pulling my hair out over this for quite a while when it hit me: what if the cache entries were not added by the scheduled job, but some other way, unaffected by the scheduled-job limitation? In short, we had been misled by the customer’s statement about the Find indexing job. The job was merely a victim of the same issue – it was resetting the last access time on the cache entries, but that was about it.

Time to dig a bit more. While WinDbg is extremely powerful, it does not tell you which code loaded a specific content item into cache (not unless you catch it red-handed). So the only way to know was to look around and check for any suspicious calls to IContentLoader.GetItems or IContentLoader.GetChildren. A colleague of mine worked with the customer to obtain their source code, and then came another deep dive.

Fortunately for us, the customer has a custom-built Find indexer that we had helped build for a previous problem, and it showed up in the search for GetItems. It struck me that it could be the culprit. The job itself is… fine; however, it was being given wrong data, so it kept loading content to index.

If my hypothesis is correct, then these things must be true:

  • The app’s memory usage will rise to 25GB regardless of whether the indexing job is running, and it will remain there without much fluctuation
  • There are a lot of rows in tblFindIndexQueue

It turned out both were correct: there were more than 4 million rows in tblFindIndexQueue, and this was the memory consumption of the app over 24 hours

Once we figured out the source of the content loading, the fix was pretty straightforward. One thing we could do on our side was to shorten the caching time of content loaded by the event-driven indexer. You should upgrade to Find 16.2.0, which contains the fix for FIND-12436 – a nice improvement for memory usage.

Moral of story:

  • I’m a workaholic. I definitely should not work on weekends, but sometimes I need to, because that’s when my mind is clearest
  • Keep looking. But as always, know when to give up and admit defeat
  • Take breaks, long and short. Refresh your mind and look from different angles.
  • Sliding cache expiration times can behave quite unexpectedly. If a content item is already in the cache with a long sliding expiration, then a cache hit (e.g. via ISynchronizedObjectInstanceCache.ReadThrough) requesting that content with a short sliding expiration will not change the existing value – it only refreshes the last access time – and vice versa
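The refresh-on-hit behavior behind that last point can be demonstrated with a small console sketch. This uses System.Runtime.Caching’s MemoryCache as a stand-in for the CMS cache (on modern .NET this class lives in the System.Runtime.Caching package); the key and timings are made up:

```csharp
using System;
using System.Runtime.Caching;
using System.Threading;

class SlidingExpirationDemo
{
    static void Main()
    {
        var cache = MemoryCache.Default;

        // Cache an entry with a 2-second sliding expiration.
        cache.Set("content:1", "cached content",
            new CacheItemPolicy { SlidingExpiration = TimeSpan.FromSeconds(2) });

        // Read the entry more often than the window: every hit
        // resets the sliding window, so the entry stays alive.
        for (var i = 0; i < 3; i++)
        {
            Thread.Sleep(1500);
            Console.WriteLine(cache.Contains("content:1")); // still cached
        }

        // Stop reading for longer than the window: the entry expires.
        Thread.Sleep(4000);
        Console.WriteLine(cache.Contains("content:1")); // gone
    }
}
```

With enough regular hits, the loop could run forever and the entry would never expire – the essence of how a “12-hour sliding expiration” entry can live in memory indefinitely.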

Fix your Search & Navigation (Find) indexing job, please

Once upon a time, a colleague asked me to look into a customer database with weird spikes in database log usage. (You might start to wonder why I am always the one who looks into weird things. Is there a pattern here?)

Upon reviewing the query store, I noticed very high logical reads related to tblScheduledItem. From past experience, that is usually because of index fragmentation in the table (which has only one clustered index). I took a quick look at the table and confirmed the index indeed had high fragmentation, so I suggested rebuilding it to see what happened. Well, it could have been one of those simple daily quick questions, and I almost forgot about it.

A few days passed, and the colleague pinged me again. Apparently they had rebuilt the index, but it did not really help. That raised my eyebrows a little, so I dug deeper.

To my surprise, the problem was not really fragmentation (though it definitely contributed). The problem is that the table has a column of type nvarchar(max), used for recording the last status text from the Execute method of the scheduled job. It was meant for something short like “The job failed successfully” or “Deleted 12345 versions from 1337 contents”. But because it’s nvarchar(max), it can be very, very long – in theory, you could store the entire contents of a few very big libraries in there.

Of course, just because you can does not mean you should. When the column value is long, each read from the table becomes a burden on SQL Server. And the offending job was none other than our S&N indexing job.

In theory, any job could cause this issue; however, it is much more likely to happen with the S&N indexing job, for a reason: it keeps track of every exception thrown during the indexing process, and because it indexes each and every content item on your website (except the ones you specifically, explicitly tell it not to), the chance of it running into a recurring issue that affects multiple (read: a lot of) content items is much higher than for any other built-in job.
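The remedy on the job side can be sketched roughly like this (hypothetical job code: in a real Optimizely project the class would derive from the scheduler’s job base class, and `LoadContentToIndex`/`IndexContent` are made-up placeholders):

```csharp
using System;
using System.Collections.Generic;

// Sketch of the idea: keep the status text returned by a scheduled job
// short and bounded, and send exception details to the log instead of
// concatenating them into the status column.
public class MyIndexingJob
{
    public string Execute()
    {
        var errorCount = 0;
        var indexed = 0;

        foreach (var content in LoadContentToIndex())
        {
            try
            {
                IndexContent(content);
                indexed++;
            }
            catch (Exception)
            {
                errorCount++;
                // Log the full exception here, with its stack trace,
                // instead of returning it in the status text.
            }
        }

        // Short, bounded status text - cheap to store and read back.
        return $"Indexed {indexed} content items with {errorCount} errors.";
    }

    private IEnumerable<object> LoadContentToIndex() => Array.Empty<object>(); // placeholder
    private void IndexContent(object content) { } // placeholder
}
```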

I asked, this time, for the column to be trimmed and, most importantly, for any exceptions thrown during indexing to be fixed. I was on my day off when my colleague notified me that the job had been running for 10 hours without errors after their fix. Curious, I checked some statistics. Well, let those screenshots speak for themselves:

The query itself went from 16,000ms to a mere 2.27ms. Even better, each call to get the list of scheduled jobs previously resulted in 3.5GB of logical reads. Now? 100KB. A lot of resources saved!

So, make sure your jobs are not throwing a lot of errors. And fix your S&N indexing job.

P/S: I do think the S&N indexing job should have a simpler return result – maybe “Indexed 100,000 content items with 1234 errors”, with the exceptions logged separately. But that’s debatable. For now, you can do your part!