Early last week we introduced a new system that rates businesses based on the products they create. One of the topics we touched on in that post was the weighting of reviews and how they factor into the overall rating.

Today we want to dive a bit deeper into the weighting system, explain how it works, and show you a preview of what it’s about to do to the juice ratings site-wide. If you’re interested, we’d appreciate your feedback or any thoughts you might have on the topic! Feel free to check out our preview of the test juice ratings and submit feedback on our thread on /r/ecr.


The problems we’ve encountered so far

Over the last two years, we’ve talked a lot internally about how to use JuiceDB’s rating system as effectively as possible. It’s hard to pull real, useful information out of a crowd-sourced system built purely on opinions. It’s one of the most important things we can do, though, so we need to do it right and in small increments.

One of the things we found was that we sometimes saw strange, disingenuous user behavior (surprise!). This kind of activity led to skewed ratings, drama, and misinformation. A few of the more “difficult” behaviors we identified were:

  1. Users who left a single review for a single juice.
  2. Users who only left reviews for a single brand of juices.
  3. Users who rated everything on the extreme ends (only 5s or 1s).
  4. Users we can’t tell are real people or fake/duplicate accounts.

That’s not to say everyone who falls into these categories is nefarious or did anything wrong. We just can’t trust them as much as the person who has 100 well-written reviews across 25 different brands. Instead of trying to punish the strange behavior, we’re propping up the users we can truly identify as real.
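To make the list above a little more concrete, here’s a minimal Python sketch of how behaviors like these could be flagged from a user’s review history. The field names and the shape of the data are our own assumptions for illustration, not JuiceDB’s actual detection logic:

```python
def flag_suspicious_behavior(reviews):
    """Flag the behaviors listed above from one user's review history.

    `reviews` is assumed to be a list of dicts shaped like
    {"juice_id": ..., "brand_id": ..., "rating": 1-5}; the field
    names are hypothetical, not JuiceDB's real schema.
    """
    flags = []
    ratings = [r["rating"] for r in reviews]

    # 1. A single review for a single juice.
    if len(reviews) == 1:
        flags.append("single_review")

    # 2. Multiple reviews, but all for one brand.
    if len(reviews) > 1 and len({r["brand_id"] for r in reviews}) == 1:
        flags.append("single_brand")

    # 3. Every rating sits on an extreme end (only 5s or 1s).
    if ratings and all(r in (1, 5) for r in ratings):
        flags.append("extremes_only")

    # 4. Fake/duplicate detection needs account-level data (emails,
    # IPs, etc.) that isn't visible in review history, so it's omitted.
    return flags
```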

Adding weight to the community

The best solution we’ve heard so far is weighting user reviews based on user activity. The more positive, trustworthy behavior a user demonstrates, the more their review should count toward a product’s overall rating.

Our new experimental section on the Juices page is our first attempt at addressing this. We’ve developed a weighting mechanism for each of the situations listed above:

  1. More weight is applied for the variety of juices reviewed.
  2. More weight is applied for the variety of brands reviewed.
  3. More weight is applied for variation in a user’s ratings.
  4. More weight is applied for each unique identity associated with an account.

Currently, the only thing this weight affects is the product’s overall rating. The more weight a user has behind their account, the more their individual rating influences the overall rating. This all happens in the background during the tally process and doesn’t affect the look and feel of an individual review at all.
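As a rough illustration of how a weighted tally like this could work, here’s a short Python sketch. The coefficients, function names, and input shapes below are all hypothetical placeholders; this is not JuiceDB’s actual formula:

```python
from statistics import pstdev

def user_weight(reviews, identities=1):
    """Derive a trust weight from the four signals listed above.

    `reviews` is a list of (juice_id, brand_id, rating) tuples and
    `identities` counts the unique identities verified on the account.
    Every coefficient here is a made-up placeholder.
    """
    juices = {juice for juice, _, _ in reviews}
    brands = {brand for _, brand, _ in reviews}
    ratings = [rating for _, _, rating in reviews]

    weight = 1.0
    weight += 0.05 * len(juices)          # 1. variety of juices reviewed
    weight += 0.10 * len(brands)          # 2. variety of brands reviewed
    if ratings:
        weight += 0.25 * pstdev(ratings)  # 3. variation in ratings
    weight += 0.50 * (identities - 1)     # 4. unique identities on the account
    return weight

def overall_rating(rated_reviews):
    """Weighted mean of (rating, weight) pairs, as run during the tally."""
    total_weight = sum(w for _, w in rated_reviews)
    if not total_weight:
        return 0.0
    return sum(r * w for r, w in rated_reviews) / total_weight
```

Under a scheme like this, a user with dozens of reviews spread across many brands and varied scores ends up counting several times as much as a throwaway account with a single 5-star review.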

We’re also toying with some other experimental systems right now that would tie into weighting, but we’re not ready to dive into those yet.

New juice ratings

Applying all of these new rules and weights to reviews has changed the landscape a lot. The “All Time” popular juices list you’re used to seeing feels very different in the new version. From what we’ve seen so far, the new lists are truer to the current trends observed by the community.

We’d like to invite you to check out the new experimental ratings and compare them to our current “Most Popular” juice ratings. Our goal is to hear from anyone with an opinion on this change, and we’re planning to do that via this reddit post on /r/ecr.

This change is already live on the business ratings and has refined those lists significantly. We expect that applying it to the juices themselves will have an even larger impact on what we think of as “the best juices on JuiceDB”.

Feedback

If you stumble across something that doesn’t work or doesn’t look right, we can’t make it better unless you tell us about it! Don’t hesitate to submit bugs and feedback to us, for example through our thread on /r/ecr.

It helps when reported issues are as specific as possible, so be sure to include the following information and anything else you think might be relevant:

  1. The URL of the page you were on when you experienced the issue.
  2. The browser/device you were using when you noticed it.