Why I left Metacritic, for good.

I’m a mediocre gamer, by any measure, but that does not change the fact that I love playing games. Great games, only. With my very limited time and budget, I have to be very selective about the games I play. For a very long time, Metacritic was my go-to website to check whether I should play a game – and I set my bar rather high: anything less than 80 is a big no-no. Less than 85 is a no, unless it’s a sequel to a game I loved (so I can see how the story turns out). More than 85 is “I’ll think about it” (when it comes with a very tempting discount, I might make up my mind). A game with a Metacritic score of at least 90 makes my shortlist (but not necessarily my playing list). A score of at least 95 means “must try” for me. I might not love it, but a game with such reviews must have something to like about it.

However, over time, I came to realize what Metacritic really is, and why reviews can be full of sh*t.

Game reviews, like almost all other content on the Internet, including the post you are reading, are opinions. They are opinions, just like yours and mine; the only thing that makes critics stand out is professionalism.

But that’s not something I’ve seen in many of the reviews listed on Metacritic.

You can love a game, or hate it. You can talk about it openly – it’s your right! But if you want to help your readers decide whether they should buy or play the game based on your judgement, you have to be fair. Your opinions have to be as objective as possible. Of course, there is no such thing as “true objectivity” in reviews, but yours should contain a reasonable amount of it.

Unfortunately, in this corner of the Internet, clicks and views are everything. More views mean more ad impressions and ad clicks, and that means more money.

The best-known trick for clicks is, of course, a controversial opinion.

Metacritic has a fairly lax review policy. The only measurable criteria are that you have at least 30 reviews on your website and that you review fairly regularly. All the other criteria, about ethics or professionalism, are inherently subjective. And they are being heavily abused.

Lesser-known websites have been using that tactic to draw attention to themselves. By giving a highly anticipated game an unreasonably low score, they create controversy. People rush to visit their websites to comment on the absurd review, or even share the review so their friends can attack it too.

Who benefits? Do you even have to guess?

Let’s talk about the infamous Washington Post review of Uncharted 4: A Thief’s End. Against 113 reviews averaging 93/100, they gave it 2/5 (equivalent to 40/100).

Or the slightly less infamous USGamer review of Horizon: Zero Dawn: 5/10 for a game that is otherwise rated 89/100.

Can you imagine it?

And the Washington Post is not the only one. GameCritics, another lesser-known site, jumped in and gave the game a 50/100.

You can be harsh – that’s what famous magazines like Edge are known for – but you also need to be fair. You can criticize a game as much as you like, as long as the criticism is objective: why you don’t like it, and what it could have done better. When you bash a game simply because it’s not to your taste, you become biased. The said Washington Post review was even disputed by their own staff:

I’ve never disagreed more with a piece of content on our website as I have with this. haha  

The worst thing is that this kind of shady tactic is not a one-off. Some sites are known to use it systematically, hoping to attract more clicks to their website.

GameCritics is only slightly better.

You see the pattern here? You can expect those websites will, more often than not, give the lowest scores to a highly anticipated game, so their reviews become more visible and attract more clicks. People go there and bash the review, just like the reviewer bashed the game. In the end, who benefits?

Metacritic has changed the way we look at reviews. Before Metacritic, it was much more difficult to get a fairly balanced, objective view of a game. But Metacritic’s good deed is being abused, and if they don’t do anything to fight that, they will soon render themselves obsolete.

Useful T-SQL snippets for development and troubleshooting

This post is more of a note-to-self: a collection of T-SQL statements that can be incredibly useful in development and troubleshooting.

SET STATISTICS IO ON

Turns on IO statistics for every statement run afterwards, until it is explicitly set to OFF. We can then switch to the Messages tab to see how many IO operations were performed on each table.
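A minimal sketch of how to use it (dbo.OrderLine and the filter are hypothetical, just for illustration):

```sql
-- Enable IO statistics for the current session.
SET STATISTICS IO ON;

-- Run the statement you want to measure (hypothetical table).
SELECT *
FROM dbo.OrderLine
WHERE OrderGroupId = 42;

-- The Messages tab now shows, per table, the number of scans,
-- logical reads, physical reads, and read-ahead reads.

-- Turn it off again when you are done.
SET STATISTICS IO OFF;
```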

SET STATISTICS TIME ON

Find out which statements were executed: their text, how many (logical) reads they caused, how much time was spent on CPU and how much time was spent in total.
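Strictly speaking, SET STATISTICS TIME ON only prints CPU time and elapsed time to the Messages tab; the per-statement breakdown described above can also be pulled from the sys.dm_exec_query_stats DMV. A sketch of such a query (my assumption, not something shown in this excerpt):

```sql
-- Top statements by total elapsed time, with their text,
-- logical reads, CPU time and total time (times in microseconds).
SELECT TOP (20)
    st.text               AS statement_text,
    qs.execution_count,
    qs.total_logical_reads,
    qs.total_worker_time  AS total_cpu_time,
    qs.total_elapsed_time AS total_time
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.total_elapsed_time DESC;
```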

Continue reading “Useful T-SQL snippets for development and troubleshooting”

Choose your battles

This is the third part of the series: How to survive and thrive – a series for new developers to become better at their jobs. You can read the first two parts here and here.

In the military, there is a term: “uphill battle”. That’s when you have to fight your way up a hill while your enemy controls the top. It’s a very difficult fight, and your chances of success are low. Any experienced military leader knows an uphill battle is something you should avoid until there is no other option.

That also applies to any job, including programming.

The truth is, you shouldn’t fight the battles you can’t win.
Continue reading “Choose your battles”

Mass update catalog entries

This is something you don’t do daily, but you will probably need it one day, so it might come in handy.

Recently we got a question about how to update the code of every entry in the catalog. This is interesting: even though you don’t update codes that often (if at all, as the code is the identifier that ties entries to external systems, such as ERPs or PIMs), it raises the question of how to do a mass update on catalog entries.

    • Update the code directly via a database query. This is supposedly the fastest way to do it. If you have been following my posts closely, you must be familiar with my note that Episerver does not disclose the database schema. I list it here because it’s an option, but not a good one. It easily goes wrong (and causes catastrophes), and you have to deal with versions and cache, which can be hairy to get right. Direct data manipulation should only be used as a last resort when no other option is available.

Continue reading “Mass update catalog entries”

The art of paging

No, this is not really “art” – I’m just trying for a more clickbaity title. It’s about understanding what you have at your disposal and using it to your benefit – in this case, how a new SQL statement can drastically improve your performance.

In this blog post we will look into the paging feature of SQL Server. In Commerce we usually work with large sets of data – millions of rows are fairly common – and it’s natural to load data by page. There is no point loading thousands, or even millions, of rows in one go. First, it’s not practical to display all of them. Second, you’ll likely end up with a timeout exception and/or an out-of-memory exception. Even if you are lucky enough to get through, it can still bring your SQL Server instance to its knees, and transferring that much data over the network will be another bottleneck for your system. So, my friends, the best practice for loading data is to do it in batches, and not to load everything at once.
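The “new SQL statement” here is presumably the OFFSET … FETCH clause added in SQL Server 2012; a minimal sketch (table and column names are hypothetical):

```sql
-- Load page @PageIndex (zero-based) of @PageSize rows.
-- OFFSET/FETCH requires an ORDER BY clause.
DECLARE @PageIndex int = 0, @PageSize int = 50;

SELECT OrderGroupId, Name, Created
FROM dbo.OrderGroup
ORDER BY OrderGroupId
OFFSET @PageIndex * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY;
```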

Continue reading “The art of paging”

Fixing a stored procedure

On the Episerver development team, we understand the importance of good performance. Who wouldn’t like a lightning-fast website? We work hard to ensure the framework is fast, and we seize (almost) every opportunity to make it faster.

In Commerce 10.2 we introduced a new cart mode – the serializable cart – and it has proven to bring great performance compared to the “old/traditional” approach. Our own tests showed it to be 3-5x faster. But can it be even faster? Probably yes.

And we actually did make some improvements in later versions. Within the scope of this blog post, we will focus on one specific aspect – and learn a little more about SQL Server performance optimization along the way.

Continue reading “Fixing a stored procedure”

Maintaining your indexes

Indexes are crucial to SQL Server performance. Having the right indexes can make the difference between night and day in your application’s performance – as I once discussed here.

However, having the right indexes is not everything: you also have to keep them healthy. Indexes, like any other kind of storage, are subject to fragmentation. SQL Server works best when the index structure is compact and contiguous, but with all the inserts/updates/deletes, fragmentation is inevitable. As fragmentation grows, it starts affecting SQL Server’s performance: instead of reading just one page, it now has to read two, which increases both the time and the resources needed, and so on and so forth.
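A sketch of how to check and fix fragmentation (the 5%/30% thresholds are the commonly quoted rule of thumb, not numbers from this post; index and table names are placeholders):

```sql
-- List indexes in the current database with noticeable fragmentation.
SELECT OBJECT_NAME(ips.object_id)       AS table_name,
       i.name                           AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ips
JOIN sys.indexes i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Lightly fragmented (roughly 5-30%): reorganize in place.
ALTER INDEX IX_MyIndex ON dbo.MyTable REORGANIZE;

-- Heavily fragmented (above ~30%): rebuild from scratch.
ALTER INDEX IX_MyIndex ON dbo.MyTable REBUILD;
```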

Continue reading “Maintaining your indexes”

Episerver caching issue with .NET 4.7

Update 1: The bug is fixed in .NET 4.7.1 (thanks to Pascal van der Horst for the information)

Update 2: The related bug is fixed in CMS Core 10.10.2 and 9.12.5. If upgrading to that version is not an option, you can contact Episerver support service for further assistance.

Original post:

If you are using Episerver and update to .NET 4.7 (even involuntarily – for example, if you use DXC/Azure to host your websites; Microsoft updated Azure to .NET 4.7 on June 26th), you might notice some weird performance issues. If your servers are in Europe, Asia or Australia, you can see a spike in memory usage. If your servers are in North America, you can see an increased number of database calls. In both cases your website’s performance is affected: the former can cause your websites to restart constantly as memory usage hits a threshold, and the impact is even more obvious in the latter. Why?

It was a known issue in .NET 4.7, as mentioned here: https://support.microsoft.com/en-us/help/4035412/fix-expiration-time-issue-when-you-insert-items-by-using-the-cache-ins

Continue reading “Episerver caching issue with .NET 4.7”

Import a bacpac to SQL Server

This is more of a note-to-self.

I sometimes have to import a bacpac file from a customer’s database (usually an Azure SQL database) to my local machine. Most of the time it’s very easy when the databases come in .bak format, but a .bacpac file can be pretty complicated.

Sqlpackage.exe is the standard tool for importing a .bacpac file, and it ships with Visual Studio (for example `C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130`) or SQL Server (`C:\Program Files (x86)\Microsoft SQL Server\130\DAC\bin`). The latest version should be used, because newer versions can handle the older formats (.bacpac files exported from older SQL Server versions) but not the other way around (older versions might not support .bacpac files exported from newer SQL Server versions).
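A typical import command looks something like this (file, server and database names are placeholders):

```
SqlPackage.exe /Action:Import /SourceFile:"C:\temp\customer.bacpac" /TargetServerName:localhost /TargetDatabaseName:CustomerDb
```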

Continue reading “Import a bacpac to SQL Server”

Price optimizing: to be or not to be

It can be quite confusing when you first edit prices in Episerver Commerce. To your surprise, some of the prices you imported or edited might disappear, or change, without your consent! What happened?

To answer that question, it’s essential to know that there are two pricing systems in Commerce: IPriceService and IPriceDetailService.

They have some different characteristics, and one is very important: the default implementation of IPriceDetailService saves prices as-is, while the default implementation of IPriceService does not – it optimizes prices before saving. Prices that are best for customers will be favored over the “less good” ones.

Continue reading “Price optimizing: to be or not to be”