About

Reviews shape what people buy, watch, play, and read. The people writing them are rarely held to the same standard they apply to everyone else. We think that's worth fixing.

Creator who fights back
23 reviews published

I spent twelve years making things. Games, music, short films — whatever I could get funding for, and a lot I couldn't. During that time I read every review of everything I released. Most of them were fair. Some of them weren't. The unfair ones stay with you longer than they should.

I started writing here because I kept noticing the same pattern: reviewers who had never shipped anything in their lives confidently explaining why someone else's work was a failure. There's a specific kind of confidence that comes from never having to put your own name on something and watch strangers tear it apart.

I don't think critics are useless. I think the good ones make everything better. But the lazy ones — the ones who write a review in two hours about something that took two years — they could use some accountability. That's all this is.

Follows the money
13 reviews published

I used to work in digital advertising. Specifically, I worked on the analytics side of sponsored content campaigns for a mid-size agency. My job was to measure how effectively paid placements influenced purchasing decisions.

I'm telling you this because I want you to understand: I know exactly what a paid review looks like. Not because I'm paranoid, but because I helped write the briefs that produced them.

The relationship between publications and advertisers is more complicated than most people realize, and more influential than the industry admits. I track embargo dates, cross-reference advertising relationships, and compare professional scores to user ratings. Sometimes the numbers tell a boring story. Sometimes they don't.

I'm not saying every positive review is bought. I'm saying the financial incentives are worth examining, and almost nobody examines them.

Everything is garbage except art
12 reviews published

I studied comparative literature at the Sorbonne and spent a decade writing cultural criticism for European publications that most of you have never heard of, which is fine.

I came to English-language reviewing because a friend showed me a major American publication's review of a Terrence Malick film that did not mention a single other film, director, or artistic tradition. The review existed in a vacuum. It had no frame of reference beyond "I liked it" and "I didn't like it." I found this genuinely distressing.

Cultural criticism is supposed to place work in context. It's supposed to draw connections, challenge assumptions, and demand more from both the creator and the audience. What I see instead, mostly, is consumer advice dressed up as criticism. "Should you buy this?" is not the same question as "Does this matter?"

I know I come across as elitist. I prefer the word "serious."

Demands accountability
10 reviews published

I'm a project manager. I read reviews the way I read project documentation: I'm looking for the information I need to make a decision, and I'm evaluating whether the person who wrote it actually did their job.

Most reviews fail on the basics. They don't mention price until the last paragraph. They don't test edge cases. They describe how something made them feel instead of describing what it does. I have never made a purchasing decision based on someone's feelings.

I started contributing here after I bought a robot vacuum that a major tech site gave an 8.5/10. It couldn't navigate a doorway. The review never mentioned doorways. It mentioned "sleek design" four times.

If you write a review, I expect you to have actually used the product in the conditions a normal person would use it. If that seems like a high bar, it shouldn't.

The math ain't mathing
18 reviews published

I have a background in statistics and an unhealthy relationship with spreadsheets.

The thing that got me started was a simple observation: the same publication gave a 7/10 to a game they called "a masterpiece of storytelling" and an 8/10 to a game they described as "fun but forgettable." I assumed there was a system I was missing. There wasn't.

I've since built a dataset of over 4,000 reviews cross-referenced by publication, author, category, and score. The patterns are wild. Some publications haven't given below a 6 in three years. Some reviewers' average score is 8.2 across 200 reviews. The odds that everything you review is genuinely above average are essentially zero.
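The cross-referencing described above can be sketched in a few lines. The rows below are invented for illustration; the real dataset and its column names aren't public.

```python
# Minimal sketch: group review scores by publication and summarize them.
# All data here is hypothetical sample data, not the real dataset.
from collections import defaultdict
from statistics import mean

# (publication, author, category, score) rows -- invented examples
reviews = [
    ("SiteA", "alice", "games", 8.0),
    ("SiteA", "bob",   "games", 9.0),
    ("SiteA", "alice", "tech",  7.5),
    ("SiteB", "carol", "games", 6.0),
    ("SiteB", "carol", "tech",  8.5),
]

# Bucket scores per publication
by_publication = defaultdict(list)
for pub, author, category, score in reviews:
    by_publication[pub].append(score)

# Per-publication summary: a floor that never drops below 6-7
# across hundreds of reviews is the kind of pattern worth flagging.
for pub, scores in sorted(by_publication.items()):
    print(f"{pub}: mean={mean(scores):.2f}, min={min(scores)}, n={len(scores)}")
```

The same grouping keyed on author instead of publication surfaces the 8.2-average reviewers mentioned above.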

I don't have a problem with subjectivity. I have a problem with numbers that don't mean anything. If your scale effectively runs from 7 to 10, just say so. At least then we could recalibrate.

Contact

Got a bad review you want us to roast? Business inquiries? Angry emails from reviewers we've roasted?

about@reviewofareview.com