Where to store big collection data

No, I do not mean that big, big data (terabytes in size or more). I mean a big collection, like when you have a List<string> with more than a few hundred items. Where to store it?

Naturally, you would want to store that data as a property of a content item. It’s convenient and it just works, so you definitely can. But the actual question is: should you?

It’s as simple as this

public virtual IList<string> MyBigProperty { get; set; }

But under the hood, it’s more than just … that. Let’s ignore the UI for a moment (rendering such a long list is bad UX no matter how you look at it, but you can simply skip rendering that property with the appropriate attributes), and focus on the backend aspects.

List<T> properties are serialized as long strings and saved to the database in one go. If you have a big property in your content, this will happen every time you load your content:

  • The data must be read from the database, then transferred over the network
  • The data must be parsed to create an array (the underlying data structure of List<T>). The original string is tossed away.
  • Now you have a big array that you might not use every time. It’s just there, taking up your precious LOH (as did the original string).

The same thing happens when you actually save that property:

  • The data must be serialized into a string; the List<T> is then tossed away
  • The data must then be transferred over the network
  • The data is then saved to the database. Even though it is a very long string and you changed maybe 10 characters, it is completely rewritten. Due to its size, multiple page writes might be needed.

As you can see, this can create a lot of waste, especially if you rarely use that property. To make matters worse, due to the size of the property, these objects take up space in the LOH (Large Object Heap).

And imagine if you have such properties on each and every one of your content items. The waste is multiplied, and your site is now at risk of frequent Gen 2 garbage collections. Nobody likes visiting a website that freezes (if not crashes) once every 30 minutes.

So where should you store such big collection data?

The obvious answer is … somewhere else. Without other inputs it’s hard to give concrete suggestions, but how about a normalized custom table? Use the content reference as the key, with one row per value of the list. Just an idea. Then you only load the data when you absolutely need it. More work, yes, but it’s the better way to do it.
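As a rough sketch (the table name, schema and class here are made up for illustration), the idea could look like this:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical example: the values live in a normalized table
// (ContentId int, Value nvarchar) instead of a serialized property,
// and are streamed only when actually needed.
public class BigCollectionStore
{
    private readonly string _connectionString;

    public BigCollectionStore(string connectionString)
    {
        _connectionString = connectionString;
    }

    public IEnumerable<string> LoadValues(int contentId)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT [Value] FROM dbo.MyBigCollection WHERE ContentId = @contentId",
            connection))
        {
            command.Parameters.AddWithValue("@contentId", contentId);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // each value is materialized on demand – no giant string,
                    // no giant array sitting in the LOH
                    yield return reader.GetString(0);
                }
            }
        }
    }
}
```

Nothing is loaded when the content itself is loaded; the query only runs when (and if) the values are enumerated.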

Just a reminder that whatever you do, just stay away from DDS – Dynamic Data Store. It’s the worst option of all. Just, don’t 🙂

Tales of arise – super quick review

Tales of Arise Characters and Party Members

The good

The character designs are pretty good. While a lot of JRPG character designs are basically fan service, to the point that they are even ridiculous to look at (looking at you, Xenoblade Chronicles 2), the characters from Tales of Arise are pretty great (save for some small parts, like, um, uh, the back of Kisara).

The game looks good – it is not breaking any records, but it is gorgeous in its own right.

The bad

Tales has always been a budget series, and while Tales of Arise might have a bigger budget than previous entries, it is still not an AAA production. And that shows in places. You see pretty repetitive enemies from place to place – a different color and a new name, and you’re done. It is also clear that the game has more loading screens than it should, even on the current-gen (Xbox Series X and PS5) versions. This is somewhat understandable as they share the same design with the previous-gen consoles, but it was quite sad to see the limitations.

The voice acting is hit or miss, i.e. inconsistent. Some actors sound great and convincing, some, not so much.

Repetitive enemy designs. New area = almost the same enemies with different colors.

Performance in boss battles – especially after Dohalim joins – suffers badly. This was not covered by the Digital Foundry analysis, but trust me, I’ve seen it with my own eyes (and suffered from it).

The Ugly

It is one of the worst games, if not the worst, when it comes to stability. And I played The Witcher 3: Wild Hunt at release on PS4, and XCOM 2 on both PS4 and Xbox Series X, so that says something. I played (or rather, am playing) the game on Xbox Series X, and it crashes in almost every session. It does not work with Quick Resume (i.e. it should be disabled).

Mini annoyances

There are upgraded versions of equipment, but if a character has the base version equipped, it can’t be upgraded! You have to unequip it first, then craft the upgrade. Why?

Getting the codes of variants being deleted

I got this question on Content Events and Service API | Optimizely Developer Community, and CHKN contacted the Optimizely developer support service, which is the good/right thing to do. However, it could be beneficial for a wider audience, hence this blog post.

If you are developing a new Optimizely Commerce Cloud site now, you should be using the latest version (CMS 12/Commerce 14), or at least Commerce 13.31 if you have to stay with .NET 4.8. You can then use IContentEvents to listen to any events that might be fired from Service API.

However, if you are using older versions, you might be limited to the lower-level, non-content events exposed through EventContext. It works, but with a catch: there is no EntryDeleting event in EventContext. At this point I’m not entirely sure why, probably just an oversight. But it’s not impossible to work around that issue.

As I suggested in the post, EntryUpdating is like a blanket event – every change to the entries goes through it. The sender is a CatalogEntryDto, which should contain information about the entries being deleted.

private void Instance_EntryUpdating(object sender, Mediachase.Commerce.Catalog.Events.EntryEventArgs e)
{
    var dto = sender as CatalogEntryDto;
    if (dto != null)
    {
        // rows marked as Deleted still carry their original values
        var rows = dto.CatalogEntry.Where(x => x.RowState == DataRowState.Deleted);
        var deletingCodes = rows.Select(x => (string)x["Code", DataRowVersion.Original]);
        // do stuff with the codes being deleted
    }
}

And then listen to the event in one of your IInitializableModule implementations:

EventContext.Instance.EntryUpdating += Instance_EntryUpdating;

However, there is a caveat here: as EntryUpdating happens before the entry deletion, it is possible that the change does not go through (i.e. the changes are rolled back). It is unlikely, but it’s a possibility. You might accept that, or you can:

  • Store the id and code in a “deleting entries” dictionary
  • Listen to the EntryDeleted event and match it from the DeletedEntryEventArgs parameter (which contains EntryId) to get the deleted Code, and continue from there.
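A sketch of that two-step approach could look like the following (the class, the dictionary, the handler signatures and the "CatalogEntryId" column name are assumptions for illustration):

```csharp
using System.Collections.Concurrent;
using System.Data;
using System.Linq;
using Mediachase.Commerce.Catalog.Dto;
using Mediachase.Commerce.Catalog.Events;

public class EntryDeletionTracker
{
    // remembers the codes of entries that are about to be deleted
    private static readonly ConcurrentDictionary<int, string> _deletingEntries =
        new ConcurrentDictionary<int, string>();

    public void OnEntryUpdating(object sender, EntryEventArgs e)
    {
        var dto = sender as CatalogEntryDto;
        if (dto == null)
        {
            return;
        }

        foreach (var row in dto.CatalogEntry.Where(x => x.RowState == DataRowState.Deleted))
        {
            // assumption: the primary key column is named "CatalogEntryId"
            var entryId = (int)row["CatalogEntryId", DataRowVersion.Original];
            var code = (string)row["Code", DataRowVersion.Original];
            _deletingEntries[entryId] = code;
        }
    }

    public void OnEntryDeleted(object sender, DeletedEntryEventArgs e)
    {
        string code;
        if (_deletingEntries.TryRemove(e.EntryId, out code))
        {
            // the deletion actually went through – safe to act on the code here
        }
    }
}
```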

Pro Optimizely Commerce Cloud can now be preordered

If you want to start your week with a purchase, then I have some (hopefully) good news for you: my book Pro Optimizely Commerce Cloud can now be preordered (with immediate access to the content). https://leanpub.com/prooptimizelycommercecloud

It took me a week due to a technical issue with Leanpub, but in the end it can now be purchased.

I planned to finish this book in August. As always, I overestimated my ability to work during nights, and underestimated my procrastination power. I made some progress, but far from completing the book.

My solution? Well, I will let money speak. By letting you (my dear readers) preorder the book, I make a commitment to finish it. The book is now at 50% complete, and I aim to make it content complete before the end of this year.

If you own the previous version (even if you got it for free), there will be a coupon with 60% off coming your way, please check your mailbox!

The book is dedicated to my children, Emmy and Erik. While I can’t say they are very supportive of me writing the book (quite the contrary), they are my biggest motivation to make more money (cough, cough), and they put the biggest smiles on my face.

I would like to thank many of my colleagues, former and current, who helped and supported me writing the first book, which was the beginning of everything.

I would like to thank Optimizely community for inspiration to write this book.

And of course, I would like to thank you, for buying and reading it.

And I hope it’ll help you.

The hidden gotcha with IObjectInstanceCache

It’s no secret that caching is one of the most important, if not the most important, factors in website performance. Yes, caching is great, and if you are using Optimizely Content/Commerce Cloud, you should be using ISynchronizedObjectInstanceCache to cache your objects whenever possible.

But caching is not easy. Or rather, the cache invalidation is not easy.

To ensure that you have an effective caching strategy, it’s important that you set up cache dependencies properly. In principle, there are two types of dependencies:

  • Master keys. These control an entire cache “segment”. For example, you could have one master key for the prices. If you need to invalidate the entire price cache, just remove the master key and you’re done.
  • Dependency keys. These tell the cache system that your cache item depends on this or that object. If this or that object is invalidated, your cache item will be invalidated automatically. This is particularly useful if you do not control this or that object.

ISynchronizedObjectInstanceCache allows you to control the cache dependencies via CacheEvictionPolicy. There are a few ways to construct an instance of CacheEvictionPolicy: you can choose whether the cache expiration is absolute (i.e. the item is invalidated after a fixed amount of time) or sliding (i.e. its expiration is renewed every time it is accessed), and whether your cache item depends on one or more master keys, and/or one or more dependency keys, like this:

/// <summary>
/// Initializes a new instance of the <see cref="CacheEvictionPolicy"/> class.
/// </summary>
/// <param name="cacheKeys">The dependencies to other cached items, identified by their keys.</param>
public CacheEvictionPolicy(IEnumerable<string> cacheKeys)

/// <summary>
/// Initializes a new instance of the <see cref="CacheEvictionPolicy"/> class.
/// </summary>
/// <param name="cacheKeys">The dependencies to other cached items, identified by their keys.</param>
/// <param name="masterKeys">The master keys that we depend upon.</param>
public CacheEvictionPolicy(IEnumerable<string> cacheKeys, IEnumerable<string> masterKeys)

The constructors that take master keys and dependency keys look pretty much the same, but there is an important difference/caveat here: if a dependency key does not already exist in the cache, the cache item you are inserting will be invalidated (i.e. removed from the cache) immediately. (For master keys, the framework automatically adds an empty object (if none exists) for you.)
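To illustrate (the cache key names here are made up), a sketch of the two situations:

```csharp
var cache = ServiceLocator.Current.GetInstance<ISynchronizedObjectInstanceCache>();

// Risky: if "price-123" does not already exist as a cache key,
// "my-item" is evicted immediately, and every read silently falls
// through to the database without any error being raised.
cache.Insert("my-item", myObject, new CacheEvictionPolicy(new[] { "price-123" }));

// Safer: depend on a master key instead. The framework inserts an
// empty object for a missing master key, so the dependency always exists.
cache.Insert("my-item", myObject,
    new CacheEvictionPolicy(null, new[] { "price-master-key" }));
```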

That can be an unpleasant surprise – everything seems to be working fine, no errors whatsoever. But if you look closely, your code seems to be hitting the database more than it should, and your website performance is silently suffering (sometimes, not so “silently”).

This is an easy mistake to make – I made it once myself (albeit long ago) in an important catalog content cache path. And I have seen some very experienced developers make the same mistake as well (at this point you might wonder if the API itself is to blame).

Take away:

  • Make sure you are using the right constructor when you construct an instance of CacheEvictionPolicy. Are you sure that the cache keys you are going to depend on actually exist?

In newer versions of the CMS, there is a warning in the log if the cache is invalidated immediately; however, it can easily be missed unless you are actively looking for it.

Note that this behavior is the same with ISynchronizedObjectInstanceCache as it extends IObjectInstanceCache.

Lelit Elizabeth v3 – 2 months impression (super short review)

First of all, if you are looking to buy a Lelit Elizabeth, check out the detailed review from David Corbey. It is probably the best review of the machine. He also wrote very good information regarding setting up and maintaining the machine, so check it out.

Out of the box, the Elizabeth looks much better than in photos. I must admit, the photos do not do it justice – it looks rather dull in them. While it is certainly not the best looking espresso machine (to my taste at least – I think the ones with an E61 group look more compelling), it definitely looks good.

Next to the machine it is going to replace

Coming from the popular Sage (Breville) Barista Pro, there are a few things that impressed/surprised me (of course this is an unfair comparison – when you spend 3x more money on a new machine + a grinder, you would definitely not want to get something “marginally better”):

  • The actual stainless steel construction is very, very nice. It feels much more solid than the fake stainless steel look from the Barista Pro. The machine is well made, probably not the best built machine around, but you know it will last you a long time.
  • It takes a really long time to heat up, about 20 minutes. Compared to the Barista Pro, this was a bummer at first, because with the Pro you could pull your first shot almost instantaneously – the machine claims to be ready after 3s. But that is a lie. The Barista Pro lets you pull shots, but at the cost of temperature stability. I learned the hard way that most of the shots with the Barista Pro were severely under temperature, resulting in extremely sour taste, and it was very hard to adjust – you have about 5 levels for temperature, ranging from 90°C to 98°C, so about 2°C each, and whether the machine can actually reach the desired temperature is another question. I only realized that once I switched to the Elizabeth. The portafilter is actually hot (and the steel part is very uncomfortable to touch). On the Barista Pro, however, it is only lukewarm, even if I pulled a few empty shots beforehand. Furthermore, with the Elizabeth, I can set the brewing water to whatever temperature I like (down to half a degree if I switch to °F instead of °C), so I can comfortably brew light, medium or dark roasts the way they are meant to be brewed.

The workflow is now so, so much easier and smoother than I had with the Sage Barista Pro.

Buying accessories is also now easier and cheaper – I can easily find branded, quality accessories at reasonable prices. They might still be expensive, but I feel the price is justified for the quality.

But no machine is perfect, and neither is the Elizabeth – there are a few downsides.

The biggest one, to me at least, is that the water tank is pretty hard to refill. I put my machine under the kitchen cabinet, so I have very little space left. I need to either move it out, or use a gooseneck kettle to refill. I went with the latter approach and it works quite well.

Another bummer is the lack of a real tamper in the package. The lack of a milk jug is somewhat acceptable, but a tamper? I bought a nice one from Motta (though I wish I had chosen the 58.55mm version instead), but I wish they included one by default. Of course, this is an easy one to fix.

In the end, I’m happy with my Elizabeth, and I feel happy and excited to use it every day. My only regret is that I didn’t step up to the Bianca – I have heard great things about it. But well, it exceeded my budget at that point by a large margin, so I’ll have to wait.

Meanwhile, Lelit Elizabeth will serve me well for a long, long time.

Potential performance issue with Maxmind.db

From time to time, I have to dig into some customers’ profiler traces to figure out why their site is slow (yeah, if you follow me, you’d know that’s kind of my main job). There are multiple issues that can eat your website performance for breakfast, from loading too much content, to unmaintained database indexes. While my blog does not cover everything, I think you can get a good grasp of what mistakes to avoid.

But sometimes the problem might come from a 3rd party library/framework. It’s not new, as we have seen it with A curious case of memory dump diagnostic: How Stackify can cause troubles to your site – Quan Mai’s blog (vimvq1987.com). The problem with those types of issues is that they are usually overlooked.

The library we’ll be investigating today is Maxmind.db. To be honest, I’ve never used it on my own, but it seems to be a very popular choice for geo-locating visitors. It’s usually used by Optimizely sites for that purpose, via VisitorGroup (which is how it came under my radar).

For several sites that use it, it seems to be stuck in this stack more often than not:

It’s reasonable to think that CreateActivator is doing something heavy here (evidently the LambdaCompiler.Compile part). A peek at the decompiled code shows that yes, it’s heavy. I’m not quite sure I can post the decompiled code here without violating any agreement (I did, in fact, accept no agreement at this point), but it’s quite straightforward code: TypeActivatorCreator uses reflection to get the constructors of the Type passed to it, sees if there is any constructor decorated with the MaxMind.Db.Constructor attribute, then prepares some parameters and creates a LambdaExpression that creates an instance of that Type using the found constructor (which is a good thing, because a compiled expression performs much better than a plain reflection call).

(I’m using MaxMind.Db 2.0.0, for the record.)

The code is straightforward, but it is also slow – as any code that involves reflection and lambda compilation would be. The essential step is to cache the result. This is actually a very good place to cache: the number of types is fixed during runtime (except for very edge cases where you dynamically create new types), so you won’t have to worry about cache invalidation, and the cache would significantly improve the performance of the code above.

And in TypeActivatorCreator there is a cache for it. It is a simple ConcurrentDictionary<Type, TypeActivator>, which returns a TypeActivator if the Type was requested before, or creates one and caches it if it hasn’t been.

There is a cache for that, which is good. However, the very important tidbit here is that the dictionary is not static. That means the cache only works if the class is registered as a Singleton (by itself, or by another class down the dependency chain) – meaning only one instance is created and shared between threads (which is why the ConcurrentDictionary part is important).

Except it’s not.

When I looked at a memory dump collected for a customer that is using Maxmind.db, this is what I got:

0:000> !dumpheap -stat -type TypeAcivatorCreator
Statistics:
MT Count TotalSize Class Name
00007ffa920f67e0 1 24 MaxMind.Db.TypeAcivatorCreator+<>c
00007ffa920f6500 147 3528 MaxMind.Db.TypeAcivatorCreator
Total 148 objects

So there were 147 instances of TypeAcivatorCreator. Note that this is only the number of live instances; there might be other instances that were already garbage-collected by the CLR.

Now it’s clear why it has been performing badly. For supposedly every request, a new instance of TypeActivatorCreator is created, and therefore its internal cache is simply empty (it was just created, too). Each request therefore goes through the expensive path of CreateActivator, and performance suffers.

The obvious fix here is to make the dictionary static, or to make the TypeActivatorCreator class a Singleton. I don’t have the full source code of MaxMind.Db to determine which is better, but I’d lean toward the former.
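A sketch of the former option (class and member names are simplified, and the actual activator-building logic is elided, so this will not compile against the real library as-is):

```csharp
using System;
using System.Collections.Concurrent;

internal class TypeActivatorCreator
{
    // static: shared by every instance of this class, so the expensive
    // compilation happens once per Type for the whole process, regardless
    // of how the class is registered in the DI container
    private static readonly ConcurrentDictionary<Type, TypeActivator> _activators =
        new ConcurrentDictionary<Type, TypeActivator>();

    internal TypeActivator GetActivator(Type type)
    {
        // GetOrAdd only invokes the factory when the type is not cached yet
        return _activators.GetOrAdd(type, CreateActivator);
    }

    private static TypeActivator CreateActivator(Type type)
    {
        // reflection + lambda compilation happens here (elided)
        throw new NotImplementedException();
    }
}
```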

Moral of the story:

  • Caching is very, very important, especially when you are dealing with reflection and lambda compilation
  • You can get it right 99%, but the 1% left could still destroy performance.

Update:

I reached out to Maxmind.db regarding this issue on November 9th, 2021

About 6h later they replied with this

I was at first confused, then somewhat disappointed. It is a small thing to fix to improve overall performance, rather than relying on/expecting customers to do what you say in documentation. But well, let’s just say we have different opinions.

Debugging a memory dump for .net 5

You would need to install Windbg Preview from Windows Store

Get it from Get WinDbg Preview – Microsoft Store. If you use the ordinary WinDbg that comes with the Windows SDK, this is what you get when trying to open the dump:

WinDbg:10.0.19041.685 AMD64

Could not find the C:\Users\vimvq\Downloads\core_20211102_090430.dmp Dump File, Win32 error 0n87

The parameter is incorrect.

You also need to install .NET 5 version of sos.

dotnet tool install --global dotnet-sos

Once you have used WinDbg Preview to open the memory dump, run this command to load SOS:

.load C:\Users\<your user name>\.dotnet\sos\sos.dll

And now you can start debugging as usual
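For example, a few SOS commands that are usually a good starting point (addresses and type names will of course vary per dump):

```
!threads                  // list the managed threads
!clrstack                 // managed call stack of the current thread
!dumpheap -stat           // heap statistics, grouped by type
!dumpheap -type SomeType  // instances of a specific type
!gcroot <address>         // why a given object is still alive
```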

Coffee roasters in Sweden – a review

If you are serious about espresso quality, you know you must buy from a specialty roaster – not from the supermarket. You will pay a premium price, at least 300kr per kg, and easily up to 500kr per kg or more, compared to around 100–150kr per kg from the supermarket. In return, you get:

  • Obviously better coffee quality. Most if not all decent roasters only roast specialty coffee, meaning the beans not only taste good, they have minimal defects, especially small rocks. A bad bean will ruin one cup, at most. But a pebble can destroy your precious coffee grinder. Your grinder will thank you for the uniformity of the specialty coffee beans you buy from roasters.
  • Much better freshness. Most, if not all, coffee from the supermarket only has an “expiry date”, not a “roast date”. You can roughly guess the roast date by subtracting 24 months from the expiry date, and the freshest one I could find was two months old. At that point the coffee has already started degrading in quality. In contrast, when I buy from a roaster, it is always less than 1 week from the roast date, which is clearly printed on the bag. As a rule of thumb, you should finish your coffee within 8 weeks of the roast date.
  • Much better roasting. Most supermarket coffee is roasted with super hot air (800°C) in a very short amount of time. This allows the roaster to roast ton after ton, but at the cost of coffee flavor. Specialty coffees are often roasted in much smaller batches, over a longer time, so the flavor can develop properly.
  • Traceability. You only know supermarket coffee by its country of origin, and that’s it. With specialty coffee, you will know the region that produced the coffee, and in many cases even the farm.
  • Last but not least, support for local businesses. Specialty roasters are small businesses in your city, or even your area. Buying from them means you support your local economy. Many roasters also trade directly with the coffee farms, which means you more directly support those farmers. Most farms that grow specialty coffee also follow sustainability practices (and due to the high price of specialty coffee, they can sustain their business with considerably smaller farms). If you care about sustainability and people’s livelihoods, buying from roasters is a better way to support that.

What to look for from Coffee roasters

A coffee bean bag with a one-way valve is a must (in case you didn’t know, newly roasted coffee beans release CO2, and that valve is important to let the CO2 out – but not let air in). Best if it is resealable; otherwise you would have to move the beans to an airtight container to keep them fresh for longer. Also, buy whole beans if you can. Ground coffee starts losing its aroma and flavor just 30 minutes after grinding; an airtight container can only slow that down a little.

All of the roasters below have good coffees – the beans have consistent color, size and shape. My machines, techniques and taste are not at a level where I can distinguish each flavor, so I will focus on the services instead.

StockholmRoast (Stockholm Roast – The House of Roasting)

They offer good prices, but no subscription. They ship through DHL to a service point, with free shipping, which is nice, but not the best (compared to the other options below).

The bags are well packaged, so you get proper protection of your precious coffee beans. But they also glue a plastic bag on the package (for the shipping information and the receipt), which is quite tiresome to remove for proper recycling.

They also offer a better price for a 1kg bag compared to 4 bags of 250gr. While this is somewhat understandable from a commercial perspective, it means it’s harder to keep your coffee fresh if you want to save some money. They should probably offer a 500gr bag.

Another minus: I don’t recall their bags being resealable. Also, the ink on the bag can easily get onto your hands, especially if they are a little wet, and it is not very easy to wash off.

None of these things are critical, but it would be very nice to have them fixed!

LYKKE KAFFEGÅRDAR Nyrostat kaffe | Direkt från gården hem till dig | Lykke Kaffegårdar (lykkegardar.se)

Lykke offers subscriptions, with 10% discount, which is good. Importantly, they ship directly to your mailbox. Order, and in one day or two, you find your favorite coffee bags in your mailbox. Convenient, huh?

You can easily manage your subscription, including changing it, skipping one delivery, or cancelling it, which is a huge plus.

Their bag design is beautiful, and I absolutely like it. To make things better, they even included 2 bags of tea in my first shipment – a very good way to advertise.

Their espresso range, however, is quite limited. There is only one, Bam!, that is dedicated to espresso. I wish they offered more choices.

UPDATE: I bought 2 bags of BAM! from them during their Black Friday sales, and was sent ones which were roasted on October 28th, which means they were more than 1 month old when they arrived. I was disappointed and sent them a letter. They apologized and offered a 30% coupon for my next order. While receiving one-month-old coffee bags is no fun, I think the way they handled it was nice. I took the offer.

Kafferosteriet Koppar Upptäck vårt nyrostade kaffe | Kafferosteriet Koppar

They offer subscriptions as well, and with 20 SEK discount per bag, which is very nice. However, to change the subscription, you need to email them directly. It’s OK-ish, but I would definitely prefer the Lykke approach.

They also ship directly to mailbox, and their shipping was very fast. I ordered on Wednesday, and two of the bags appeared in my mail box on Thursday. I don’t know if they forgot, or intentionally did not send a notification email, but that was a nice surprise.

One incident with my first purchase: one of the two bags was almost empty (there were, like, 30 coffee beans inside it). I mailed them to let them know, and they were happy to ship a replacement to me. In the end, everything was resolved quickly and easily, but I’d hope they had a bit more quality control for their coffee bags.

Standout coffee https://www.standoutcoffee.com/

Their subscription is 25€ (yes, euro, equivalent to about 260kr) for 100gr of coffee, or 2600kr per kilo. The reason for such a high price is that it’s “Gesha Village”, the most expensive coffee in the world, and they offer free worldwide shipping.

2600kr per kilo is unfortunately way too high for what I can pay for coffee, and with 100gr you might get 1–2 cups of good espresso out of it (considering you have to dial in), so thanks, but no thanks.

Drop Coffee

Apparently they are the most popular roaster in Sweden, so I should try them out soon. They are transparent about their FOB prices, which is nice. I was hesitant about their Google reviews (“only” 4.3 out of 5.0, quite a bit lower than the other roasters in this list), but it turned out that has more to do with their coffee shops (which are affected by many other things) than their actual roasting business.

Pouring coffee roastery (hallakafferosteri.com)

This is my favorite now. They offer coffees at very good prices – especially if you buy in batches (5 or 10 bags of 500gr), and they usually have a 10 or 15% off coupon. I buy with a few friends and we split the bags – I end up at around 250kr/kg (2x500gr bags), and sometimes even only 220kr. They ship free to a service point for orders over 499kr.

I can’t notice a difference between their coffee and other roasters, so I’m happy with that setup, for now.

Sage/Breville Barista Pro review

This is a super short review of this fairly popular espresso machine. I bought this last year, despite a lot of arguments from my wife. She even threatened to throw it out if I bought. I did. And now she demands latte/cappuccino every day!

My budget was pretty limited at that point, so other decent options (HX or even dual boiler machines) were out of reach. The Barista Pro fit my budget (and kitchen), and when Amazon had a very good discount on it, I pulled the trigger.

I was happy.

When it was new

Don’t laugh.

Pros:

Sage/Breville is feature oriented, and when you open the box, you have everything you need to get going: a milk jug, a 54mm portafilter with 4 different baskets (2 double-shot – 1 pressurized, 1 non-pressurized – and 2 single-shot), and of course, a built-in grinder. If you are new, this is hugely important. Some sellers do not include the milk jug, or even a tamper (looking at you, Lelit!), and it’s a letdown when you are excited to open your new fancy espresso machine and realize you can’t make a decent cappuccino due to lacking equipment. The UI is intuitive and easy to work with. Once you understand the basics, using the machine, UX-wise, is simple and easy.

The flow is well defined, and smooth – you take the portafilter, put it in the holder and click – the grinder grinds coffee for you, in the fineness you chose and the time you pick. Then you take the tamper (attached to the machine using a magnet, a pretty smart design), tamp it, put it in the head, place your cup, and press a button. It is the convenience you are paying for.

Cons:

Once you open the box, you quickly realize this machine is not built to last. It’s a thin layer of stainless steel over plastic. Build quality is … fine, but don’t expect the same quality as an Italian-made machine. It’s been reported that while Sage/Breville service is very good during the warranty period, once you are out of warranty you have to pay a hefty fee for repairs, because the machines just break down. And repair usually means “replace”.

The machine is advertised with a “3s start up time”: you press a button, and the machine is ready. The truth is, however, that if you want better shots, you need to wait at least 10 minutes, and flush 1 or 2 cups first. With the empty portafilter inserted, press the double-shot button and let the hot water flow through it. It warms up the group head and the portafilter, and makes sure you get a stabilized temperature in the boiler. Otherwise, your cups will be incredibly sour. Or sometimes both sour and bitter!

The machine overall is quite noisy, both the grinder and the pump. It’s not a big deal until you have tried quieter machines. This is even more true when the grinder is empty – it sounds like it’s going to break (it is fine to run it for a short amount of time, mind you).

Another downside is that this uses a 54mm portafilter. The portafilter itself is fine, well made and solid, but after a while you will want to try new things, like a bottomless portafilter. This is when you realize you are left with these options: 1. buy cheap no-brand products from China, 2. pay absurd prices, or 3. both. Had it come with a 58mm portafilter, which is the “industry standard”, you would have more options from reputable brands at reasonable prices.

The built-in grinder is merely adequate. It’s a stepped conical burr grinder (some say it’s actually stepless, but you will need some “tricks” for that). You will be able to grind for espresso with it, but not with the fineness adjustment needed to extract the best out of your coffee. Whenever you can, upgrading to a good espresso grinder will make a huge difference to your espresso. (Note: a good espresso grinder can easily cost $400 or more!) Also, cleaning it is not the easiest task – it’s doable, but requires additional tools (like a vacuum cleaner) to do properly.

It’s messy to grind a double shot (18–20gr), because some coffee grounds will be left on the portafilter holder, or on the drip tray. Yes, you can use a dosing cup or a funnel, but you will, once again, agonize over the limited options for a 54mm portafilter.

So so

The included tamper is “serviceable”. It can be tucked into the machine, which is neat, and it does its job. But I’d suggest buying a nice, ergonomic tamper as soon as you can. It’ll make your experience much more enjoyable.

The steam wand is OK, but it is on the weak side, and it produces wetter steam than I would like. It is enough to froth milk in the included jug, but if you want to use a bigger jug (so you can make 2 cappuccinos or 1 big latte in one go), it won’t be powerful enough.

The included milk jug is OK. Good ergonomics, but the wall is a bit too thin, so it gets hot very quickly. I had a hard time holding it when the milk reaches 55°C. In comparison, my Motta one is only fairly warm even when the milk reaches 60°C (which is, however, not a perfect thing either).

Summary

In the end, the Barista Pro is a well rounded, full featured espresso machine. It’s a budget/entry-level one, capable of making good shots. You have everything you need to get going, but it does not really excel at either brewing or steaming. Once you have honed your skills, upgrading to a better grinder and a better machine with a 58mm portafilter will be a big step up.

If you want to learn and can spend, of course.