Disabling the Catalog DTO cache: maybe don’t?

Recently (as recently as this morning) I was asked to look into a case where Find indexing performance was subpar. Upon investigation – looking at a properly captured trace from dotTrace – it was clear that at least 30% of the time was spent loading the CatalogDto.

This is something that should not happen, as the CatalogDto should have been cached. Moreover, a normal site should have very few catalogs, so the cache should be very effective. However, data does not lie – the site had been hitting the database a lot, and a quick check of the site settings revealed that the entire DTO cache had indeed been disabled:

 <Cache enabled="false" collectionTimeout="0:0:0" entryTimeout="0:0:0" nodeTimeout="0:0:0" schemaTimeout="0:0:0" /> 

By setting these timeout settings to 0, the cache is invalidated immediately, rendering it useless. The CatalogDto is therefore loaded from the database every time, causing the bottleneck.

The reason for setting those timeouts to 0 was probably – I guess – to reduce the memory footprint of the site. However, catalog DTOs are fairly small, and since Commerce 11 the system has been smart enough to skip caching the DTOs if there is a cache on a higher (content) level, thanks to my colleague Magnus Rahl. So the DTOs should not be a concern if you are not actively using them (and in most cases, you should not). By re-enabling the cache, the indexing time can be cut by at least 30%, according to the aforementioned trace.
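Re-enabling the cache is a matter of restoring the enabled flag with non-zero timeouts. A sketch (the timeout values below are illustrative placeholders, not the shipped defaults – check the documentation for your version):

 <Cache enabled="true" collectionTimeout="0:1:0" entryTimeout="0:1:0" nodeTimeout="0:1:0" schemaTimeout="0:2:0" /> 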

In case you are wondering why the DTOs are loaded at all: the catalog content provider still uses them internally, so it loads them for data.

Moral of the story:

  • The cache settings are there, but just because you can change them does not mean you should. I personally think cache settings should be hidden from accidental changes as much as possible. Disabling the cache, and to a lesser extent, changing the default cache timeouts, can have unforeseeable consequences. Only do so if you have strong reasons to. Or better, let us know why you need to do that, and we can figure out a compromise.

Looking for static class fields in Windbg

I am looking into the ever-growing problem with the LambdaExpression cache in Find, as reported here https://world.episerver.com/forum/developer-forum/Problems-and-bugs/Thread-Container/2019/9/episerver-find-lambdaexpressionextensions-_cache-keeps-growing-indefinately/ . An important part of analyzing a cache is to understand how many items are in it, and what the cache hit ratio is. I have received the memory dumps from our partners, so it is time to fire up some Windbg. Luckily for us, that information is stored in the class as fields. Unluckily for us, the class in question is a static one – which is when you find out that !dumpheap -type is not working for you.

The right way is to use !name2ee:

0:000> !name2ee episerver.find.dll EPiServer.Find.LambdaExpressionExtensions
Module:      00007ffd770bef80
Assembly:    EPiServer.Find.dll
Token:       000000000200000d
MethodTable: 00007ffd79c998e8
EEClass:     00007ffd79c8d268
Name:        EPiServer.Find.LambdaExpressionExtensions

!name2ee takes two parameters: the first is the module name (basically the name of the assembly), and the second is the name of the class. It is important to note that the class name is case sensitive, so you have to give it the name with the correct casing.

Now you have the EEClass, and you just need to dump it using !DumpClass:

0:000> !DumpClass /d 00007ffd79c8d268
Class Name:      EPiServer.Find.LambdaExpressionExtensions
mdToken:         000000000200000d
File:            C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Temporary ASP.NET Files\root\775a589a\ddb1376c\assembly\dl3\26c58139\00ecff94_a3aed501\EPiServer.Find.dll
Parent Class:    00007ffd76045498
Module:          00007ffd770bef80
Method Table:    00007ffd79c998e8
Vtable Slots:    4
Total Method Slots:  7
Class Attributes:    100181  Abstract, 
Transparency:        Critical
NumInstanceFields:   0
NumStaticFields:     3
              MT    Field   Offset                 Type VT     Attr            Value Name
00007ffd7d0c4e28  4000009        8 ...egate, mscorlib]]  0   static 000001e6008e6768 _cache
00007ffd760d0d90  400000a      398         System.Int64  1   static 87633 _compiles
00007ffd760d0d90  400000b      3a0         System.Int64  1   static 34738206 _calls

And voilà!
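Those static fields also give us an approximate hit ratio for free: assuming each compile corresponds to a cache miss, the ratio is (34,738,206 − 87,633) / 34,738,206 ≈ 99.7% – the cache is being hit effectively, it just never evicts anything. To look at the cache contents themselves, dump the _cache field by the address shown in the !DumpClass output:

0:000> !DumpObj /d 000001e6008e6768

which should print the underlying dictionary object and, among its fields, how many items it holds.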

Dynamic data store is slow, (but) you can do better.

If you have been developing with Episerver CMS for a while, you probably know about its embedded “ORM”, called Dynamic Data Store, or DDS for short. It allows you to define strongly typed classes which are mapped to the database directly for you. You don’t have to create the table(s), and you don’t have to write stored procedures to insert/query/delete data. Sounds very convenient, right? The fact is, DDS is used quite frequently, and more often than you might think, misused.

As Joel Spolsky once said, every abstraction is leaky. An ORM will likely make you forget about the nature of the RDBMS underneath, and that can cause performance problems, sometimes severe ones.

Let me make it clear to you:

DDS is slow, and it is not suitable for big sets of data.

If you want to store a few settings for your website, DDS should be fine. However, if you are thinking about hundreds of items, it is probably worth looking elsewhere. Thousands of items or more? Then it is a clear no.

I spent some time benchmarking the DDS to see how bad it is. A simple test: add 10,000 items to a store, then query each item, then delete each item, and see how long it takes.

The item is defined like this – just another boring POCO:

internal class ShippingArea : IDynamicData
{
    public Identity Id { get; set; }

    public string PostCode { get; set; }

    public string Area { get; set; }

    public DateTime Expires { get; set; }
}

The store is defined like this:

    public class ShippingAreaStore
    {
        private const string TokenStoreName = "ShippingArea";

        internal virtual ShippingArea CreateNew(string postCode, string area)
        {
            var token = new ShippingArea
            {
                Id = Identity.NewIdentity(),
                PostCode = postCode,
                Area = area,
                Expires = DateTime.UtcNow.AddDays(1)
            };
            GetStore().Save(token);
            return token;
        }

        internal virtual IEnumerable<ShippingArea> LoadAll()
        {
            return GetStore().LoadAll<ShippingArea>();
        }

        internal virtual IEnumerable<ShippingArea> Find(IDictionary<string, object> parameters)
        {
            return GetStore().Find<ShippingArea>(parameters);
        }

        internal virtual void Delete(ShippingArea shippingArea)
        {
            GetStore().Delete(shippingArea);
        }

        internal virtual ShippingArea Get(Identity tokenId)
        {
            return GetStore().Load<ShippingArea>(tokenId);
        }

        private static DynamicDataStore GetStore()
        {
            return DynamicDataStoreFactory.Instance.CreateStore(TokenStoreName, typeof(ShippingArea));
        }
    }

Then I have some quick and dirty code in QuickSilver’s ProductController.Index to measure the time. (You will have to forgive some bad coding practices here ;). As usual, Stopwatch should be used for demonstration only, not in production code. If you want a good breakdown of your code execution, use a tool like dotTrace; if you want to measure production performance, use a monitoring system like New Relic or Azure Application Insights.) RandomString is a small helper generating a random string of a given length – a sketch of it follows the snippet:

        var shippingAreaStore = ServiceLocator.Current.GetInstance<ShippingAreaStore>();
        var dictionary = new Dictionary<string, string>();
        for (int i = 0; i < 10000; i++)
        {
            dictionary[RandomString(6)] = RandomString(10);
        }
        var identities = new List<ShippingArea>();
        var sw = new Stopwatch();
        sw.Start();
        foreach (var pair in dictionary)
        {
            shippingAreaStore.CreateNew(pair.Key, pair.Value);
        }
        sw.Stop();
        _logger.Error($"Creating 10000 items took {sw.ElapsedMilliseconds}");
        sw.Restart();
        foreach (var pair in dictionary)
        {
            Dictionary<string, object> parameters = new Dictionary<string, object>();
            parameters.Add("PostCode", pair.Key);
            parameters.Add("Area", pair.Value);
            identities.AddRange(shippingAreaStore.Find(parameters));
        }

        sw.Stop();
        _logger.Error($"Querying 10000 items took {sw.ElapsedMilliseconds}");
        sw.Restart();

        foreach (var id in identities)
        {
            shippingAreaStore.Delete(id);
        }
        sw.Stop();
        _logger.Error($"Deleting 10000 items took {sw.ElapsedMilliseconds}");
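For completeness, here is a minimal sketch of what the RandomString helper (not shown in the original snippet) could look like – any implementation producing reasonably random strings will do:

        private static readonly Random _random = new Random();

        private static string RandomString(int length)
        {
            // Build a random alphanumeric string of the requested length
            const string chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
            var buffer = new char[length];
            for (int i = 0; i < length; i++)
            {
                buffer[i] = chars[_random.Next(chars.Length)];
            }
            return new string(buffer);
        }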

Everything is ready. A few runs gave us a fairly stable result:

2019-12-02 13:33:01,574 Creating 10000 items took 11938

2019-12-02 13:34:59,594 Querying 10000 items took 118009

2019-12-02 13:35:24,728 Deleting 10000 items took 25131

And this is strictly single-threaded. A real site with a lot of traffic – and thus multiple inserts, queries, and deletes at the same time – will certainly perform worse.

Can we do better?

There is a feature of DDS that many people don’t know about: you can mark a field as indexed by adding the [EPiServerDataIndex] attribute to the property. The new class would look like this:

    [EPiServerDataStore]
    internal class ShippingArea : IDynamicData
    {
        public Identity Id { get; set; }

        [EPiServerDataIndex]
        public string PostCode { get; set; }

        [EPiServerDataIndex]
        public string Area { get; set; }

        public DateTime Expires { get; set; }
    }

If you peek into the database during the test, you can see that the data is now being written to the Indexed_String01 and Indexed_String02 columns, instead of String01 and String02 as before. Such a change gives us quite a drastic improvement:

2019-12-02 15:38:16,376 Creating 10000 items took 7741

2019-12-02 15:38:19,245 Querying 10000 items took 2867

2019-12-02 15:38:44,266 Deleting 10000 items took 25019

The querying benefits greatly from the new indexes: instead of a Clustered Index Scan, it can now do a Non-Clustered Index Seek + Key Lookup. Deleting is still equally slow, because the delete is done via a clustered index delete on the Id column, which we already had, and an index on a uniqueidentifier column is not the most effective one.

Before you get too happy with such an improvement, keep in mind that two separate indexes are added, one for Indexed_String01 and one for Indexed_String02. Naturally, we would want a combined index – clustered, even – on those columns, but with DDS we simply can’t have one.

What if we want to go bare metal: create the table ourselves and write the queries ourselves? Our repository would look like this:

    public class ShippingAreaStore2
    {
        private readonly IDatabaseExecutor _databaseExecutor;

        public ShippingAreaStore2(IDatabaseExecutor databaseExecutor)
        {
            _databaseExecutor = databaseExecutor;
        }

        /// <summary>
        /// Creates and stores a new shipping area.
        /// </summary>
        /// <param name="postCode">The post code of the shipping area.</param>
        /// <param name="area">The area name.</param>
        /// <returns>The created shipping area.</returns>
        internal virtual ShippingArea CreateNew(string postCode, string area)
        {
            var token = new ShippingArea
            {
                Id = Identity.NewIdentity(),
                PostCode = postCode,
                Area = area,
                Expires = DateTime.UtcNow.AddDays(1)
            };
            _databaseExecutor.Execute(() =>
            {
                var cmd = _databaseExecutor.CreateCommand();
                cmd.CommandText = "ShippingArea_Add";
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("Id", token.Id.ExternalId));
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("PostCode", token.PostCode));
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("Area", token.Area));
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("Expires", token.Expires));
                cmd.ExecuteNonQuery();
            });

            return token;
        }

        internal virtual IEnumerable<ShippingArea> Find(IDictionary<string, object> parameters)
        {
            return _databaseExecutor.Execute<IEnumerable<ShippingArea>>(() =>
            {
                var areas = new List<ShippingArea>();
                var cmd = _databaseExecutor.CreateCommand();
                cmd.CommandText = "ShippingArea_Find";
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("PostCode", parameters.Values.First()));
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("Area", parameters.Values.Last()));
                var reader = cmd.ExecuteReader();
                while (reader.Read())
                {
                    areas.Add(new ShippingArea
                    {
                        Id = (Guid)reader["Id"],
                        PostCode = (string)reader["PostCode"],
                        Area = (string)reader["Area"],
                        Expires = (DateTime)reader["Expires"]
                    });
                }
                return areas;
            });
        }

        /// <summary>
        /// Deletes a shipping area from the store.
        /// </summary>
        /// <param name="area">The shipping area to be deleted.</param>
        internal virtual void Delete(ShippingArea area)
        {
            _databaseExecutor.Execute(() =>
            {
                var cmd = _databaseExecutor.CreateCommand();
                cmd.CommandText = "ShippingArea_Delete";
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("PostCode", area.PostCode));
                cmd.Parameters.Add(_databaseExecutor.CreateParameter("Area", area.Area));
                cmd.ExecuteNonQuery();
            });
        }
    }
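The table and stored procedures are not part of the original snippet; a rough sketch of what they could look like (the schema, column types, and index are assumptions – names match the code above):

    CREATE TABLE dbo.ShippingArea
    (
        Id UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
        PostCode NVARCHAR(50) NOT NULL,
        Area NVARCHAR(255) NOT NULL,
        Expires DATETIME2 NOT NULL
    );
    GO
    -- The combined index DDS could not give us
    CREATE INDEX IX_ShippingArea_PostCode_Area ON dbo.ShippingArea (PostCode, Area);
    GO
    CREATE PROCEDURE dbo.ShippingArea_Add
        @Id UNIQUEIDENTIFIER, @PostCode NVARCHAR(50), @Area NVARCHAR(255), @Expires DATETIME2
    AS
        INSERT INTO dbo.ShippingArea (Id, PostCode, Area, Expires)
        VALUES (@Id, @PostCode, @Area, @Expires);
    GO
    CREATE PROCEDURE dbo.ShippingArea_Find
        @PostCode NVARCHAR(50), @Area NVARCHAR(255)
    AS
        SELECT Id, PostCode, Area, Expires FROM dbo.ShippingArea
        WHERE PostCode = @PostCode AND Area = @Area;
    GO
    CREATE PROCEDURE dbo.ShippingArea_Delete
        @PostCode NVARCHAR(50), @Area NVARCHAR(255)
    AS
        DELETE FROM dbo.ShippingArea
        WHERE PostCode = @PostCode AND Area = @Area;
    GO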

And this gives us the following results:

2019-12-02 16:44:14,785 Creating 10000 items took 2977

2019-12-02 16:44:17,114 Querying 10000 items took 2315

2019-12-02 16:44:20,307 Deleting 10000 items took 3190

Moral of the story?

DDS is slow, and you should avoid using it if you are working with fairly big sets of data. If you have to use DDS for whatever reason, make sure to at least index the columns that you query the most.

And at the end of the day, a hand-crafted custom table + query beats everything. Remember that you can use a tool like Dapper to do most of the work for you.
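For example, with Dapper the whole Find method could shrink to a few lines (a sketch, assuming an open SqlConnection; note that mapping the Guid Id column back to the Identity property may need a custom type handler):

    using Dapper;

    var areas = connection.Query<ShippingArea>(
        "ShippingArea_Find",
        new { PostCode = postCode, Area = area },
        commandType: CommandType.StoredProcedure);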

Hide certain tabs in Catalog UI

It has been a while since I wrote something on my blog – I have been “fairly” busy making Commerce even faster. But I should take a break from time to time and share things that will benefit the community as a whole – and this is one of those breaks.

Today I came across this question on World https://world.episerver.com/forum/developer-forum/Episerver-Commerce/Thread-Container/2019/10/remove-item-from-tab-in-content-editor/ . Basically: how to hide a specific tab in the Catalog UI when you open the All Properties view of a catalog content item?

The original poster found a solution at https://world.episerver.com/forum/legacy-forums/Episerver-7-CMS/Thread-Container/2013/10/Is-there-any-way-to-hide-the-settings-tab/ . While it works, I don’t think it is the easiest or simplest way to do it. Is there a simpler way? Yes.

The Related Entries tab is generated for content which implements the IAssociating interface. The bad news is that EntryContentBase implements that interface, so each and every entry type you have has that tab. But the good news is that we can override the implementation – by simply overriding the property defined by IAssociating.

How?

Simple as this

        /// <inheritdoc />
        [IgnoreMetaDataPlusSynchronization]
        [Display(AutoGenerateField = false)]
        public override Associations Associations { get; set; }

We are overriding the Associations property and changing the Display attribute to have AutoGenerateField = false. Just build it and see:

No Related Views! But is it the end of the story? Not yet – Related Views can still be accessed from the menu.

A complete solution is to also disable that view. How? By using the same technique described here https://world.episerver.com/blogs/Quan-Mai/Dates/2019/8/enable-sticky-mode-for-catalog-content/ , i.e. using a `UIDescriptor`. You can disable certain views by adding this to your descriptor’s constructor:

AddDisabledView(CommerceViewConfiguration.RelatedEntriesEditViewName);
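Put together, a minimal sketch of such a descriptor (the class name is made up here, and ProductContent is just an example target type):

    [UIDescriptorRegistration]
    public class HideRelatedEntriesDescriptor : UIDescriptor<ProductContent>
    {
        public HideRelatedEntriesDescriptor()
        {
            // Removes the Related Entries view from the view selection menu
            AddDisabledView(CommerceViewConfiguration.RelatedEntriesEditViewName);
        }
    }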

A few notes:

  • This only affects the type on which you add the property, so you can, for example, hide the tab for Products but still show it for Variants.
  • Related Entries is not the only tab you can hide. By applying the same technique you have a lot of control over what you hide and what you show. I will leave that to you to explore!

Pension fund in Sweden, an overview

In Sweden, like in many other western countries, each working person has an individual pension fund. This is vastly different from some other countries, like Vietnam, where the pension fund is shared – not only between every working person, but also between other purposes (maternity leave, sick leave…). Like many other things in Sweden, the pension fund is transparent to you, and you can manage part of it, to some degree. If you do things “correctly”, it might make a sizable impact on your pension when you retire.

A three-part pension scheme

The pension fund in Sweden consists of three parts. If you have been in Sweden for more than a year, you can always check https://www.minpension.se/ to see how much money you have in your pension fund (your information is only “added” in your first November here).

General pension (Swedish: Allmän pension)

Every working person in Sweden receives this pension, as contributed by their employer. Every year, 18.5% of your pensionable income, up to a limit, is contributed to this part of your pension. The limit is set at 7.5 IBA – Income Base Amount (Swedish: inkomstbasbelopp). This is adjusted by the Swedish government every year; in 2019, 1 IBA = 64,400 SEK, meaning your upper limit this year is 483,000 SEK. If you are making more than that in pensionable income (good for you!), your public pension contribution is still capped at 89,355 SEK.

The public pension itself has two parts:

The income pension (Inkomstpension)

which is 16% of your pensionable income. You can see this part (i.e. how much money you have), but you can’t manage it. The state invests the money the way it sees fit; you can guess that the money is invested in some low-risk, low-return bonds.

The premium pension (premiepension)

which is 2.5% of your pensionable income. You can actually manage this at https://www.pensionsmyndigheten.se/ .

By default, your premium pension is put into AP7 Såfa, which is actually a very good fund. It has a very low fee (only 0.06–0.1%/year) and a good return rate. It is also an adaptive fund, which means it will invest more in bonds (which are “safer”, but return less than stocks) as you age. When you are less than 55 years old, 100% of your money is put into stocks; that share is reduced as you age: at 65 (your expected retirement age) it is 67% stocks and 33% bonds, and at 75 it is 33% stocks and 67% bonds.

Pensionsmyndigheten is great, because not only can you manage your fund there, it also has a wide range of funds to choose from, at superbly discounted fees. The discount is small on index funds (which already have cheap management fees by themselves), usually 0.16%/year compared to the original 0.2%/year, but it is very significant for actively managed funds.

Most of them are discounted by more than 1%. For example, Skandia Time Global has a management fee of 1.4%/year, and on top of that you pay a transaction fee (transaktionskostnad) of 0.25%, so in total your fee is 1.65%/year. At Pensionsmyndigheten you only pay 0.39%, which means you get 1.26% extra gain per year for free! As the biggest fund manager in Sweden, they have the leverage to negotiate with other fund managers to cut down the fees, and that is really good for you.

You can leave your money as-is – I think the default choice is very decent. But you are free to make your own bets to potentially make more money.

The only bad thing about Pensionsmyndigheten is that you can’t put more money into their funds, even if you want to. Those low fees are just so great.

Occupational pension (Tjänstepension)

Most, but not all, employers in Sweden give their employees an occupational pension. In case your employer doesn’t – like mine – you will likely have to pay it yourself.

Most companies follow the ITP scheme, which in short is:

  • ITP 1, for people born in 1979 or later:
    • 4.5% of your salary, up to 7.5 IBA/year,
    • 30% of the part of your salary above 7.5 IBA/year.
  • ITP 2, for people born in 1978 or earlier:
    • 10% of your pensionable income, up to 7.5 IBA,
    • 65% of your pensionable income, from 7.5 to 20 IBA,
    • 32.5% of your pensionable income, from 20 to 30 IBA.

The numbers are updated here https://www.collectum.se/sv/Privat/ITP/ITP-1-och-ITP-2/ .

You can see that once your salary is higher than 40,250 SEK/month in 2019 (7.5 IBA divided by 12 months), your occupational pension contribution increases very fast.

Just like the premium pension, you can of course manage this pension – to an even greater extent, as you can choose the provider yourself. In many cases, however, the choice of providers can be limited, depending on your employer’s contract. Notable providers in the market are:

  • Most banks, including the big four (SEB, Swedbank, Nordea & Handelsbanken) and smaller ones (Skandia, Länsförsäkringar)…
  • Avanza
  • Nordnet
  • SPP
  • Söderberg & Partners

As always, it’s best to talk with your company’s HR and/or your fund manager to see which options you have. I’d always recommend choosing the one with the lowest fees (less than 0.3%/year) and the biggest fund portfolio. Note that a provider might have a more limited fund portfolio for pensions, so make sure to check that.

With occupational pension you have two choices, or actually three:

  • Funds (Fondförsäkring)
  • Traditional insurance (Traditionell försäkring)
  • A mix between those two.

When I started paying my occupational pension, I went with traditional insurance because I was clueless about the other options. In the end, I moved my pension into funds. Why? Lower cost with higher return.

Skandia Traditionell försäkring: 0.55% management fee, with an average return of 8.4%/year in the last 4 years (2014–2018); down to 5% from May 1st, 2019.

A US index fund: 0.2% management fee, with an average return of 18%/year in the last 5 years.

It’s not even close! Assume I put 5,000 SEK/month into my occupational pension (which is not the actual number), and the traditional insurance returns 5.3%/year while the index fund returns 9.5%/year – both after fees. After 30 years:

  • Traditional insurance gives me 4,419,370 SEK
  • The US index fund gives me 10,245,650 SEK

The choice was easy!
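For the curious, these figures are consistent with a simple future-value calculation, assuming the money is deposited at the start of every month and the yearly return is applied as a nominal monthly rate (annual rate divided by 12). A minimal sketch to reproduce the numbers:

    // Future value of a monthly annuity due (deposits at the start of each period)
    static double FutureValue(double monthlyDeposit, double annualRate, int years)
    {
        double i = annualRate / 12;   // nominal monthly rate
        int n = years * 12;           // number of deposits
        return monthlyDeposit * (Math.Pow(1 + i, n) - 1) / i * (1 + i);
    }

    // FutureValue(5000, 0.053, 30) ≈ 4,419,000 SEK
    // FutureValue(5000, 0.095, 30) ≈ 10,246,000 SEK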

What if you don’t know which funds to invest in? Nordnet, for example, provides about 1,400 funds in their portfolio. Which to choose? A general recommendation is to invest your money in an index fund – a type of fund that passively follows an index, like the Standard & Poor’s 500. The common characteristics of index funds are that they are very low cost (commonly around a 0.2%/year fee) and that they yield the market return, which is, in most cases, very good. The average return of the S&P 500 index since its inception has been about 9.7%/year.

A well-known piece of advice is to have a three-part portfolio: a US index fund (following the S&P 500), a global market index fund, and a total market bond fund. It’s up to you – your age, your risk tolerance… – to decide what percentage you want to invest in each. If you ask me, I’d say invest in a US index fund (most banks have one) and forget about it. If you are older, you might want to put part of your money into bonds; if you are younger, you might want to bet by putting more money into actively managed funds – which might have higher returns than index funds, but come with higher fees. In the long term, index funds should still be the majority of your investment portfolio. Avoid funds with high fees. Avoid paying people to adjust funds for you – the less they take, the more you make.

Private pension

This is your own pension contribution. However, since January 1st, 2016 this is no longer a tax-deductible contribution, so very few people still contribute to this part. I personally don’t, so I will skip writing about it.

Listing permissions per user/group

This week I came across this question on the Episerver World forum https://world.episerver.com/forum/developer-forum/Episerver-Commerce/Thread-Container/2019/5/get-rolepermission-data/ , and while it is not Commerce-related, it is quite interesting to solve. Perhaps this short post will help the original poster, as well as future visitors.

As in the thread, I replied with the first piece to solve the puzzle:


You can use PermissionTypeRepository to get the registered PermissionTypes, then PermissionRepository to figure out which groups/users have a specific permission 

If you want to list the permissions granted to a specific role or user, it is just a simple inversion, using a dictionary:

            var rolePermissionMap = new Dictionary<string, HashSet<PermissionType>>(StringComparer.OrdinalIgnoreCase);
            var permissionTypes = _permissionTypeRepository.List();
            foreach (var permissionType in permissionTypes)
            {
                var securityEntities = _permissionRepository.GetPermissions(permissionType);
                foreach (var securityEntity in securityEntities)
                {
                    if (rolePermissionMap.ContainsKey(securityEntity.Name))
                    {
                        rolePermissionMap[securityEntity.Name].Add(permissionType);
                    }
                    else
                    {
                        rolePermissionMap[securityEntity.Name] = new HashSet<PermissionType>() { permissionType };
                    }
                }
            }

As suggested above, we use PermissionTypeRepository to list the registered PermissionType(s), and then for each PermissionType we get the list of SecurityEntity it is granted to. A SecurityEntity can be a user, a group, or a virtual role, and is identified by its name. For the purpose of demonstration, we only use the names: for each SecurityEntity granted a permission, we check if it is already in our dictionary; if yes, we add the permission to its set, otherwise we add a new entry.

Simple, eh?

Unless you are assigning/un-assigning permissions a lot, it is probably a good idea to keep this dictionary in cache for some time, because it is not exactly cheap to build.
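A minimal sketch of that caching, using System.Runtime.Caching (the cache key and the 10-minute lifetime are arbitrary choices here):

    private const string CacheKey = "RolePermissionMap";

    private Dictionary<string, HashSet<PermissionType>> GetRolePermissionMap()
    {
        // Return the cached map if it is still alive
        if (MemoryCache.Default.Get(CacheKey) is Dictionary<string, HashSet<PermissionType>> cached)
        {
            return cached;
        }

        var rolePermissionMap = BuildRolePermissionMap(); // the code above, extracted into a method
        MemoryCache.Default.Set(CacheKey, rolePermissionMap, DateTimeOffset.UtcNow.AddMinutes(10));
        return rolePermissionMap;
    }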

A super short review of Dragon Quest XI: Echoes of an Elusive Age

DQXI is the last game I completed, and I don’t complete a lot of games. I was interested in the game because it was hugely anticipated, but the reviews put me off a little bit. Don’t get me wrong, the reviews were very positive – or at least most of them were – but they were not at the level I wanted. The final push was a very good discount on the physical version, so I was like “What the hell”. Surprise: I was hooked.

The good

Very good “anime” style. The game looks almost as good as its pre-rendered cutscenes.

The game is very well rounded and very enjoyable. I ran into almost no bugs – at least none noticeable during my ~100 hours with the game.

Some plot twists that make the story interesting.

Good characters and side stories. Even though they are not up to the level of The Witcher 3, they are still good enough to enjoy.

The crafting system (Fun-sized forge) is actually quite fun to work with.

The bad

The combat lacks the depth usually seen in turn-based combat. That is not exactly a bad thing for me, but if you are looking for a challenge, the combat system in Persona 5 is far more engaging.

Traditional, quite predictable main story.

The ugly

The background music is repetitive and can be annoying at times.

The new improvements in the Switch version are not coming to PS4.

A super short review of XBox One X

I have been a PS4 fan (if I can call myself that) since the day I bought the original PS4 to play Dragon Age: Inquisition. At that point, the PS4 was clearly the better choice than the Xbox One: smaller without a separate power adapter, 50% more powerful (1.84 TFLOPS vs 1.23 TFLOPS), simpler policies for sharing/reselling games, etc.

I even liked the Xbox One X when it was announced – it checked almost all the boxes, except for, well, games and price, so I gave it a pass, especially when I got a PS4 Pro from a colleague at a very good price. This generation, PS4 has won, for many reasons – one of them being its excellent line of exclusives.

Why not all three of them? Why choose a side?

And I remained faithful until the day Elgiganten – a big electronics retailer in Sweden – sold the Xbox One X in a Fallout 76 bundle at an unbeatable price (cheaper than the Xbox One X alone, so the game actually reduced the price of the bundle!). I told myself: why not, if I don’t like it I can just return it. (It turned out the game was digital, so I couldn’t return the bundle after all. But I liked it in the end, so I decided to keep it. I find myself playing Apex Legends every night now, and I’m happy with it.)

I won’t play Fallout 76, but the bundle was cheaper than the Xbox One X alone thanks to it.

The good

A marvel of engineering. It’s significantly more powerful than my PS4 Pro, yet it is about the same size and is even quieter.

Incredible value when you factor in Game Pass: a bunch of good games at a very low price, especially when you use the promotions that happen all the time. I spent about 30 SEK (less than 4 USD) for 3 months of it.

The controller batteries last me weeks, and it takes only a minute or so to replace them.

A new generation of gamers!

Games that are optimized for X run exceptionally well. Forza Horizon 4, Red Dead Redemption 2, just to name a few.

Xbox and Xbox 360 games backward compatibility.

UHD Blu-ray player.

The bad

The UI is a mess. I complained about how the HBO UI is a mess, but I think the Xbox One UI is on par in terms of terribleness.

The ugly

The Blu-ray player refuses to play my UHD Blu-rays, 9 times out of 10.

I will have to re-buy many games to take advantage of native 4K.

IContentLoader.Get(contentLink) is considered harmful for catalog content.

A while ago I wrote about why you should be aware of IContentLoader.GetChildren<T>(contentLink) here. However, that is only half of the story.

IContentLoader.Get<T>(contentLink) is also considered harmful – not in the sense that it causes damage to your site (we would never, ever let that happen), nor that it is slow (not unless you abuse it), but because it can behave very unexpectedly.

As you might already know, catalog content fully supports language versions, which means a catalog might have multiple languages enabled, and each and every catalog item in that catalog (node/category and entry) will be available in those languages. However, those languages are not equal: (only) one is the master language. What’s the difference then?

One very important characteristic is how the master language affects the properties. Properties decorated with the [CultureSpecific] attribute differ between languages, and can therefore be edited in each language. Properties without the [CultureSpecific] attribute are the same across all languages, and can only be edited in the master language. In the Catalog UI, if you switch to a non-master language, those properties are grayed out, indicating they can’t be edited.

Now, why is IContentLoader.Get<T>(contentLink) considered harmful? Because you don’t supply a CultureInfo to let it know which version you want, so it relies on the current preferred language to load the content. If you have a catalog whose master language differs from the current preferred language, you are loading a non-master-language version. And then if you try to edit a non-[CultureSpecific] property and save it, the changes will not be saved, with no error or warning.

This becomes very confusing, because it sometimes works (when the current preferred language happens to match the catalog master language) and sometimes it doesn’t.

Which can cost you hours, if not days, to figure out what is wrong with your code.

The same applies to IContentLoader.TryGet<T>(contentLink).

The solution? Always use the overload that takes a CultureInfo or a LoaderOptions parameter, even if you just want to read the content. That builds a “good” habit, and lets you quickly spot code that might be problematic.

Use this to load the master language version, if you wish to update a non-CultureSpecific property:

 new LoaderOptions() { LanguageLoaderOption.MasterLanguage() }
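Putting it together, updating a non-[CultureSpecific] property safely could look like this (a sketch – _contentRepository is assumed to be an injected IContentRepository):

    var entry = _contentRepository.Get<EntryContentBase>(
        contentLink,
        new LoaderOptions() { LanguageLoaderOption.MasterLanguage() });

    var writableEntry = (EntryContentBase)entry.CreateWritableClone();
    // ... change the non-CultureSpecific property here ...
    _contentRepository.Save(writableEntry, SaveAction.Publish, AccessLevel.NoAccess);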

Later versions of Commerce will log a warning if you try to save a non-master-language version with one or more changed non-[CultureSpecific] properties.