Two Hobbyists Made One of This Year’s Best Video Games (Chants of Sennaar)
bloomberg.com
Here is the link to the game https://store.steampowered.com/app/1931770/Chants_of_Sennaar/
I played it - and if it was truly made by only two people, that's quite impressive - but it's just alright. The world is very cool: it's structured around multiple levels of a tower, each with its own language that you need to learn to progress. My main issue with the game is that the differences between these languages, and the puzzles built around them, aren't particularly interesting, deep, or varied. There are a few gems, but overall it's much closer to a traditional adventure game than you might expect at first glance.
That said, the art and world design are very cool.
Edit: As an aside, it's worth noting that the Steam reviews metric is a tad misleading in a similar way to Rotten Tomatoes, in that it only gauges the ratio of positive reviews, not what those reviews are actually saying. A universal consensus that a game is a 7/10 (if we assume 7/10 counts as positive) will appear "better" than a game where 99% of people believe it's a 10/10 but 1% think it sucks. It's good at predicting whether you will like a game; it's bad at predicting how much.
I'm not sure how relevant this is, since the situation you describe pretty much doesn't happen. Like so many things in life, reviews are expected to follow a normal distribution. There are definitely counter-examples (e.g. shitstorms leading to massive downvote waves), but due to the large number of reviewers, things should average out in normal cases.
I suck at math, but if the mean is sufficiently over the "positive" threshold, and there's a low standard deviation across reviews, wouldn't this have the problem I describe? The more certain people are about the quality of good games, the less relevant the ratio becomes, which is perhaps the opposite of what you would want.
Since Steam reviews are only positive or negative, not on a point scale, I'm not sure how this problem would come to pass. The distribution of reviews around the mean is expected to be similar for your described 10/10 game and the 7/10 game, and since the review system itself is boolean in nature, there is no distorted result.
Why does the ratio become less relevant the more certain people are about the quality of good games? Again, the review is only positive or negative, no actual review number assigned. In which cases do you expect the ratio to drift away from the actual useful information?
It's the lack of a review number, in combination with varying certainty, that makes for bad information when making judgment calls about quality. If people are certain the game is a 7/10, that could produce a better score than being less certain about an 8/10, because the wider distribution (less certainty) could put more reviews below the positive/negative threshold.
The following reviews: 6/10, 6.5/10, 7/10, 7.5/10, 8/10 will produce a 100% rating. More certain, less useful.
The following reviews: 4/10, 6/10, 8/10, 10/10, 10/10 will produce an 80% rating. Less certain, more useful.
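The effect above is easy to check with a few lines of code. This is a sketch, not how Steam actually computes its score: the score lists and the assumed positive threshold (6/10 and up counts as a thumbs-up) are hypothetical, chosen to match the two example sets.

```python
# Sketch of the argument above: a Steam-style rating only counts the
# fraction of reviews above a positive/negative threshold, discarding
# how far above or below the threshold each review falls.
# Threshold is an assumption: >= 6/10 counts as "positive".

def steam_style_rating(scores, positive_threshold=6):
    """Percentage of reviews at or above the threshold."""
    positive = sum(1 for s in scores if s >= positive_threshold)
    return 100 * positive / len(scores)

def mean(scores):
    return sum(scores) / len(scores)

consensus_seven = [6, 6.5, 7, 7.5, 8]   # tight agreement around 7/10
polarized_eight = [4, 6, 8, 10, 10]     # wider spread, higher mean

print(steam_style_rating(consensus_seven), mean(consensus_seven))
# 100.0 7.0
print(steam_style_rating(polarized_eight), mean(polarized_eight))
# 80.0 7.6
```

The polarized game has the higher mean (7.6 vs. 7.0) but the lower Steam-style rating, because the one 4/10 falls below the threshold while everything in the tight cluster clears it.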
It's only consistent if you assume all games follow the same distribution, which is not how reviews work in my opinion. There are many websites which do surface score information, and they follow wildly different distribution patterns depending on many different factors.
Again, it is useful for predicting whether you'll like it, but bad for predicting how much you'll like it.
But what do you mean with "more certain" and "less certain"? Again, Steam doesn't have reviews beyond boolean values.
And since Steam doesn't have point-based reviews, the 100% rating is fully correct, as presumably each of these reviewers gave a positive review.
How is it "less certain"?
Do you have statistical analyses that show this assumption to be wrong?
My entire argument stems from the idea that you can ascertain quality from these ratings, which I am refuting. The rating is "correct" in that it is measuring something, and as long as people keep in mind what that something is, there is no problem. But this article, for example, uses the flood of positive reviews to make the case that it is one of the best of the year, which I believe is faulty reasoning.
What I meant by certainty is that the reviews are more clumped together (again, even though Steam doesn't surface a score, presumably you could attach one to each of these reviews), so there's more agreement among different people about the quality of the product. If you don't agree that games can be more or less polarizing, you won't agree with this point unless I back it up with data, which I'm not going to spend time doing. You could go through Rotten Tomatoes and compare the Critic Score with the RT Score, since they surface both of those values, and see how closely they track on different parts of the spectrum.
I wish people would speak less in absolutes, and just focus on sharing their opinions and thoughts as exactly that, their opinions and thoughts.
"One of the best" can easily be stated as "one of my favorites". Still a great title, still catches my eye, but less exclusionary and less judgy of anyone who doesn't think it's the best or doesn't have a real desire to play it.
This kind of goes for just about every media article out there. Life is more of a bell curve, but almost all articles hyper-focus on one extreme or the other...
Bold to call something that came out a month ago and I doubt many people have heard of "one of the best games of the year".
It could be very good, but it sounds like it's barely been played yet.
I feel like a month is plenty of time to play a game and form an opinion on its quality. Am I missing something?
I don't think that's enough time for enough people to play it to call it one of the "best games of the year".
It’s the end of the year so if it sticks out to them after playing it potentially for many hours I really think that’s not crazy at all
The implication here is that in order to be considered a potential game of the year it has to hit some arbitrary player count?
I've played plenty of games over the last decade or so that an awful lot of people would never have played, let alone heard of, but that shouldn't diminish someone's opinion of their quality.
No, the implication is that when a major website calls a game "best of the year", that often means it has had a huge effect on the gaming community. Instead this reads like a single person's opinion piece, which makes me feel like I'm being tricked into believing this is already an established, amazing game that I should have known about. I'm personally a little more skeptical of a single opinion that I can't verify than of something that a lot more people have played. That is my point. I have no problem believing that they believe this is a great game.