On paper, it's a winner: pretty design, Bluetooth and corded connectivity, compelling price. And, hey, look at that: a 4.1-star review average from nearly 600 buyers. That's good enough for you. Sold.
Except, hang on. Could a $65 set of headphones really compare favorably with $300 Beats? You know the old saying: If something sounds too good to be true, it probably is. And it's not uncommon for companies to stack the Amazon deck in their favor by posting (or soliciting) fake reviews.
I'm not saying that's the case here, not at all. (Indeed, I got some ears-on time with the Super 66 headphones and found them quite good overall.) However, I do know how easy it can be to just glance at a four- or five-star rating and think, "OK, must be good!" I also know I don't have the time to go digging into each and every reviewer's history to see if they're legit.
Amazon, for its part, has started cracking down on fake reviews, and company rep Angie Newman says that "inauthentic reviews make up a tiny percentage of all reviews on Amazon," and the company removes them "as soon as they're identified."
In my world, however, where I frequently encounter (and write about) lesser-known tech brands and products, fake -- or at least questionable -- reviews persist.
X marks the Fakespot
Thankfully, there's Fakespot, a free site that analyzes Amazon product reviews to help you separate the wheat from the, well, fake. All you do is copy and paste the link to the product page, then click Analyze.
The service also offers browser extensions for Chrome, Firefox and Safari, all of which make it even simpler: Just click the Fakespot icon in your toolbar for instant analysis. There's now an iOS option as well; it lets you use Fakespot on the go.
And, recently, the service introduced a premium option, Fakespot Plus, that automatically displays Fakespot grades inline on Amazon product pages. It costs $1.99 per month, and you can try it free for seven days via the browser extension. Personally, I'm not a fan, as it can really clutter up Amazon's pages.
Grading the grades
In the case of the aforementioned headphones, Fakespot determined that over 90 percent of the reviews were "high quality." That's an interesting change, because when I first wrote about these headphones about a year ago (when the product was quite new), nearly one-third of the reviews were "low quality" -- meaning the reviewers were determined to have written other reviews about the same company, written only overwhelmingly positive reviews, reviewed products without purchasing them, or the like.
Obviously, over time, the product sold well and earned a lot more valid reviews, hence the higher Fakespot grade.
So let's shift gears to a product like the Aroccom Portable Wireless Bluetooth Speaker. Again, it looks like an unbeatable deal: waterproof design, 12-hour battery, etc., all for just $40. And the reviews: 4.5 stars from 34 buyers.
Ah, but Fakespot gives those reviews a "D" grade, with more than half of them qualifying as low-quality. Does that mean the product itself is bad? No, but it means you should take the reviews with a big grain of salt, because some of them might have questionable origins or motivations.
Interpreting these grades can be challenging, because I've tested products that had very low Fakespot grades but turned out to be excellent. Indeed, one of my favorite mobile chargers had an absolutely dismal grade. And I'm currently testing a mini-drone that's really terrific, but its Fakespot grade is, again, a "D."
Bottom line: Fakespot is an invaluable tool, but as with the reviews themselves, you shouldn't believe everything you read.