Loading sos.dll into WinDbg

WinDbg is still the gold standard for troubleshooting memory dumps, and the SOS extension is still the essential tool for managed memory dump investigations. With .NET Framework (4.8 or earlier), it could be as easy as .loadby sos clr. But things got a little trickier with .NET Core, as you need to install dotnet-sos to get the extension, and that can cause some confusion.

So, the thing is that you can't just install dotnet-sos like this and stop there:

dotnet tool install -g dotnet-sos

That will install it in a weird path like C:\Users\quma\.dotnet\tools\.store\dotnet-sos\9.0.607501\dotnet-sos\9.0.607501\tools\net6.0\any

What you want is to then run

dotnet-sos install

so it will be installed in a path like C:\Users\quma\.dotnet\sos

And then you can run .load C:\Users\quma\.dotnet\sos\sos.dll in WinDbg, and everything should work as expected.
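To put the whole flow together, a typical session could look like this (the user name and version in the paths will differ on your machine, and the SOS commands at the end are just common examples):

dotnet tool install -g dotnet-sos
dotnet-sos install

Then, in WinDbg, after opening the dump:

.load C:\Users\quma\.dotnet\sos\sos.dll
!threads
!clrstack
!dumpheap -stat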

Best deal on Amazon.se week 10/2025

Assassin's Creed Mirage PS5

Reduced to kr199.00

https://amzn.to/4bo19e2

Hultafors 827023 Crowbar, Red

Reduced to kr67.00

https://amzn.to/3QHb30N

Belkin BoostCharge Wireless Magnetic Car Phone Holder Compatible with iPhone 15 14 13 12 Plus Pro Pro Max Mini and More with MagSafe (Cable and Power Supply Adapter Included)

Reduced to kr96.00

https://amzn.to/4btGTYG

Belkin charger for Apple Watch, MFi certified wireless fast charging travel pad with bedside mode

Reduced to kr126.00

https://amzn.to/41nl1cE

How to: set access rights to folders

Today I stumbled upon this question: Solution for Handling File Upload Permissions in Episerver CMS 12, and there is a simple solution for it.

Using EditSecurity.aspx comes in handy, but it is not very future-proof, as it has now been removed in CMS 12 along with its WebForms siblings. However, we can easily make up for that by using the APIs, which are meant to stay around for a long time. Even better, this can be set up to run automatically and repeatedly. The little secret is IContentSecurityRepository, which is what EditSecurity.aspx used under the hood.

To set the access rights of a content item, this is what you need:

// _contentRepository and _contentSecurityRepository are the injected
// IContentRepository and IContentSecurityRepository instances
var contentLink = new ContentReference(100);
var content = _contentRepository.Get<IContent>(contentLink);
if (content is IContentSecurable securable)
{
    var descriptor = securable.GetContentSecurityDescriptor();
    // You might need to add this line if your content is not catalog content
    descriptor = (ContentAccessControlList)descriptor.CreateWriteableClone();
    descriptor.Clear();
    descriptor.AddEntry(new AccessControlEntry("Everyone", AccessLevel.Read, SecurityEntityType.Role));
    descriptor.AddEntry(new AccessControlEntry("Author", AccessLevel.Read | AccessLevel.Create | AccessLevel.Edit | AccessLevel.Delete | AccessLevel.Publish, SecurityEntityType.Role));
    // ...any other access rights that you need to set
    _contentSecurityRepository.Save(contentLink, descriptor, SecuritySaveType.Replace);
}

The first two lines are for demonstration and not very “automated”. You might have some hard-coded values there, or you might have to work your magic to find the right content. (Just a kind note: if you want your code to work consistently between sites, make sure to use ContentGuid instead of ContentId, which changes depending on the database.)

The rest of the code is quite self-explanatory. You check whether your content implements IContentSecurable, which is required for it to have access rights. Then you clear any existing access rights and add the ones you want. Finally, you save it with SecuritySaveType.Replace to make sure that only the access rights you wanted exist.

This code can be run multiple times (it is idempotent). You can add it to a scheduled job or, even better, a startup routine, to make sure that you always have the right access rights on specific content.
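If you go the startup route, a minimal sketch could look like the module below. Note the assumptions: AccessRightsInitialization is a name I made up, the Guid is a placeholder, and the single “Everyone” entry is just an example.

using System;
using EPiServer;
using EPiServer.Core;
using EPiServer.Framework;
using EPiServer.Framework.Initialization;
using EPiServer.Security;

[InitializableModule]
[ModuleDependency(typeof(EPiServer.Web.InitializationModule))]
public class AccessRightsInitialization : IInitializableModule
{
    public void Initialize(InitializationEngine context)
    {
        var contentRepository = context.Locate.Advanced.GetInstance<IContentRepository>();
        var contentSecurityRepository = context.Locate.Advanced.GetInstance<IContentSecurityRepository>();

        // Look the content up by Guid so the code behaves consistently across databases.
        // The Guid below is a placeholder; use the ContentGuid of your own content.
        var content = contentRepository.Get<IContent>(new Guid("00000000-0000-0000-0000-000000000000"));
        if (content is IContentSecurable securable)
        {
            var descriptor = (ContentAccessControlList)securable.GetContentSecurityDescriptor().CreateWriteableClone();
            descriptor.Clear();
            descriptor.AddEntry(new AccessControlEntry("Everyone", AccessLevel.Read, SecurityEntityType.Role));
            contentSecurityRepository.Save(content.ContentLink, descriptor, SecuritySaveType.Replace);
        }
    }

    public void Uninitialize(InitializationEngine context) { }
}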

Be careful with your (order) notes

This happened quite a while ago, but only now have I had time to write about it.

Once upon a time, I was asked to look into a customer database (as part of a bigger engagement to help them improve performance overall).

One thing that stood out was this query:

WITH CTE AS
(SELECT * FROM dbo.OrderGroupNote 
WHERE OrderGroupId = @OrderGroupId)
MERGE CTE AS T
USING @OrderGroupNotes AS S
ON T.OrderNoteId = S.OrderNoteId
WHEN NOT MATCHED BY TARGET
	THEN INSERT (
		[OrderGroupId],
		[CustomerId],
		[Title],
		[Type],
		[Detail],
		[Created],
		[LineItemId],
		[Channel],
		[EventType])
	VALUES(S.OrderGroupId,
		S.CustomerId,
		S.Title,
		S.Type,
		S.Detail,
		S.Created,
		S.LineItemId,
		S.Channel,
		S.EventType)
WHEN NOT MATCHED BY SOURCE
	THEN DELETE
WHEN MATCHED AND (S.IsModified = 1) THEN 
UPDATE SET
	[OrderGroupId] = S.OrderGroupId,
	[CustomerId] = S.CustomerId,
	[Title] = S.Title,
	[Type] = S.Type,
	[Detail] = S.Detail,
	[Created] = S.Created,
	[LineItemId] = S.LineItemId,
	[Channel] = S.Channel,
	[EventType] = S.EventType;

As you can guess, that is the query to save the notes of an order. Normally it's … fine. But for this customer, it was not: each save could result in almost 10 GB, yes, you read that right, ten gigabytes of logical reads. Insane.

The reason was that this customer had some orders with an absurd number of notes attached to them. The worst one had 52k notes, and there were, in total, 94 orders with more than 1,000 notes.

Upon investigation, it turned out they had a job validating the payments of invalid orders, which ran every 10 minutes. If the validation failed, a note was added to the order. But because there was no limit or cut-off, this kept going forever, continually adding notes to orders. Each time, the operation became more expensive.

As a side note, this is not only expensive on the saving side (to the database). It's also expensive to load from the database, and expensive to create all the objects in memory.

The fix in this case is obviously to trim old notes, and to make sure that once the validation has failed X times, the order is not processed further.
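The trimming part could look something like this minimal sketch against the order abstraction APIs (OrderNoteTrimmer and the cap of 20 notes are my own inventions for illustration; the stop-after-X-failures part depends on how your validation job is written, so it is not shown):

using System.Linq;
using EPiServer.Commerce.Order;

public static class OrderNoteTrimmer
{
    // An arbitrary cap for illustration; pick a number that fits your business
    private const int MaxNotes = 20;

    public static void TrimNotes(IOrderGroup order, IOrderRepository orderRepository)
    {
        // Keep only the most recent MaxNotes notes and remove the rest
        var staleNotes = order.Notes
            .OrderByDescending(n => n.Created)
            .Skip(MaxNotes)
            .ToList();

        foreach (var note in staleNotes)
        {
            order.Notes.Remove(note);
        }

        if (staleNotes.Count > 0)
        {
            orderRepository.Save(order);
        }
    }
}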

But you might ask: could we do better? Could we avoid saving the entire order notes collection just because one note was added? That's a good question. A really good one. I took a shot at that, but it's … complicated. This is where we are held back by our promise of keeping things backward compatible. While we try to make it better, you can do better yourself as well: make sure you do not have orders with an unusually high number of notes.

SELECT
    OrderGroupId,
    COUNT(OrderNoteId) AS NoteCount
FROM
    OrderGroupNote
GROUP BY
    OrderGroupId
ORDER BY
    NoteCount DESC;

If the highest count you have is less than 10, all good; less than 20 is fine. More than that, and you might want to check why those orders have that many notes.

How to: create a Decimal metafield with custom precision

If you are using the catalog system, creating metafields is easy. In fact, you can forget about “metafields”; all you should be using is the typed content type. Adding this attribute to your property is enough to set the precision as you wish:

[DecimalSettings(10, 2)]
public virtual Decimal MyAttributesDecimal { get; set; }

Things are a little different if you are using the order system. You don't have strongly typed order types to work with, so to automate metafield creation you will have to use the underlying MetaField API yourself. You probably know how to create a metafield and add it to the desired metaclass:

var metaField = MetaField.Create(MetaDataContext.Instance, "Mediachase.Commerce.Orders.System.Shipment", "NewMetaField3", "new", "temp",
    MetaDataType.Decimal, 17, true, true, false, false);
var metaClass = MetaClass.Load(MetaDataContext.Instance, "ShipmentEx");
metaClass.AddField(metaField);

However, a metafield created this way will be added to the metaclass with the default (18, 0) precision, which is kind of pointless. So how do you control the precision of decimal metafields?

The little secret is MetaField.Attributes. There are two attributes that control the precision of the decimal type: MdpPrecision and MdpScale. You have to set them after the metafield is created, but before it's added to the metaclass (the reason is simple: the underlying query that adds the column to the table looks for those values to set the precision of the column). Your code should look like this:

var metaField = MetaField.Create(MetaDataContext.Instance, "Mediachase.Commerce.Orders.System.Shipment", "NewMetaField5", "new", "temp",
    MetaDataType.Decimal, 17, true, true, false, false);
// Set both precision and scale before AddField; the column will be created as decimal(38, 9)
metaField.Attributes["MdpPrecision"] = "38";
metaField.Attributes["MdpScale"] = "9";
var metaClass = MetaClass.Load(MetaDataContext.Instance, "ShipmentEx");
metaClass.AddField(metaField);

A small but important note to remember: both attributes must be set, even if you want to leave one at its default value.

And Tada!