
Purging the Vendetta reviews

I agree.

The management of this site has guarded against the slippery slope by only removing the most egregious examples. That extreme caution and reluctance to remove reviews has kept us high on that slope.
That's BS David...so a 0 for Woodshed is not egregious enough? Really?
The management is more interested in numbers than accuracy. Wise up.
 
IMO the rating system here works fine; it's not perfect, but any tweak that's made to it won't make it perfect either.
If a golfer makes their decision to play or not play a course based on the rating alone, or on only one review, that's on them.
Why is it always about the golfer? What about how the ratings affect the private course owner who is trying to break even?
 
That's BS David...so a 0 for Woodshed is not egregious enough? Really?
The management is more interested in numbers than accuracy. Wise up.

It's not the rating, but the reasoning, that makes it a vendetta.

The 0 at Woodshed is the view of someone with a very different idea of what constitutes a good course. Or bad.

A bunch of low ratings of Beaver Ranch by an organized group, trying to damage the course because they're mad they can't run a tournament when they want, is a vendetta.

*

Meanwhile, at my age, I'm as wise as I'm ever likely to get, and have to live with that deficiency.
 
I think it's been brought up before, but what if TRs' ratings were more heavily weighted than non-TRs'? So if a TR comes through and gives course X a 3.5, it will take a lot more drive-bys giving it a 1 to drag the average down. The higher the TR rating, the higher the weighting? You've got to play a lot of courses to become a TR, and usually just by osmosis that tends to make better reviewers. It's not perfect, b/c you can become a TR easily enough if you whore around for up thumbs, but it seems like it might work. I'd rather have the solution be more reviewer-based rather than nebulous, moderator- or algorithm-based. Seems like that would encourage more reviews and, better yet, make reviewers better at reviewing.
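Just to sketch what that might look like, purely hypothetically (the tr_weight of 3 and the (rating, is_tr) review format are assumptions for illustration, not anything the site actually does):

```python
# Hypothetical TR-weighted average: a TR review counts as tr_weight ordinary reviews.
# This is an illustrative sketch only, not DGCR's actual rating formula.
def weighted_course_rating(reviews, tr_weight=3.0):
    """Weighted average where Trusted Reviewer ratings count extra."""
    total = 0.0
    weight_sum = 0.0
    for rating, is_tr in reviews:
        w = tr_weight if is_tr else 1.0
        total += rating * w
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0

# A TR's 3.5 against three drive-by 1.0s:
reviews = [(3.5, True), (1.0, False), (1.0, False), (1.0, False)]
print(weighted_course_rating(reviews))        # 2.25 with the TR counting triple
print(weighted_course_rating(reviews, 1.0))   # 1.625 with no extra weight
```

With the extra weight it takes roughly three times as many drive-by 1s to drag the TR's 3.5 down the same amount, which is the effect described above.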
 
IMO, the 1.5 rating of Black Falls was a blatant and malicious attack just like the recent vandalism that took place at the course. The person also went out of their way to slam it on the PDGA review page. I still believe that this person did not actually play the course, but instead played Cherry Hill up the road. Cherry Hill has been closed for over four years. This is where their car was parked. They never parked at Black Falls.
 
So I read the first page of this thread, then skipped to this page. Just to be upfront, and explain why I'm making a comment about rating disparity.

One of my local courses has an average rating of 3, but no 3-disc reviews. There are three reviews over three and three under three, and they seem to reflect a course redesign that didn't make it onto this site as a new course. So while the old layout would have averaged a 3.83, the new course averages a 2.17, but the course is listed as 3. Who's to say which are the outliers? (Just for further info, the three oldest reviews are the more highly rated; the three newest are rated 2-2.5.)

I do like the idea of there being a weighted system where Trusted Reviewers get more weight toward a course's rating. Another good example is B. Schmitz in Houston, TX, where someone who has only 1 course played and 1 reviewed rated it a 2-disc course. Full disclosure, I also played and reviewed this course, so you can see my opinion on the course site, but given that there are no signs, no tees, no anything but baskets, a 2-disc rating pulls the average up to .63, despite only one out of 4 reviews being over .5.
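Just to spell out that arithmetic (a rough sketch; only the 3.83 and 2.17 sub-averages come from the post above, nothing else is real data):

```python
# Two three-review halves of a split rating history average out to the middle.
old_layout_avg, new_layout_avg = 3.83, 2.17   # three reviews each
combined = (old_layout_avg * 3 + new_layout_avg * 3) / 6
print(round(combined, 2))   # 3.0 -- the listed rating, even though nobody rated it a 3
```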
 
Regarding weighting TRs, on any given course you can filter for them. Most of the time, it makes very little difference.

My only wish for a change in the way the site does it---without knowing how much work it would be, or offering to do that work myself---would be a few more filters in the course search. So that someone to whom it matters could create their own personal Top 10 by TRs, or Most Recent 10 Reviews, or whatever criteria they think should be used. More easily, that is, because even now someone can go through the top 10 or top 25, apply whatever adjustments seem fairest to him, and produce a list.

As it is, the site favors simplicity:

---Anyone can rate a course, with a few requests, including that the rating be honest

---A course's rating is the average of the individual ratings, without judging which are more or less important

---The Top 10 is for curiosity. It doesn't claim to be the Authoritative Top Ten of the world's courses. It simply lists the top 10 by course rating, with the only filter being a minimum number of reviews.
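For illustration only, a minimal sketch of that simple model: unweighted averages and a Top 10 filtered by nothing but a minimum review count. The course data and the minimum of 10 reviews are assumptions, not the site's actual code or threshold:

```python
# Sketch of the stated approach: plain averages, Top 10 filtered only by review count.
# All course data and the MIN_REVIEWS threshold are made up for the example.
courses = {
    "Course A": [4.5, 5.0, 4.0, 4.5] * 3,        # 12 reviews, avg 4.5
    "Course B": [5.0, 5.0, 4.5],                 # only 3 reviews, filtered out
    "Course C": [3.0, 2.5, 3.5, 3.0, 2.5] * 2,   # 10 reviews, avg 2.9
}
MIN_REVIEWS = 10

def average(ratings):
    return sum(ratings) / len(ratings)

top_10 = sorted(
    ((name, average(r)) for name, r in courses.items() if len(r) >= MIN_REVIEWS),
    key=lambda pair: pair[1],
    reverse=True,
)[:10]

for name, avg in top_10:
    print(f"{name}: {avg:.2f}")   # Course A: 4.50, Course C: 2.90
```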


There's a rationale behind weighting TR reviews, sure. As there is for dropping reviews over 2 years old (why do 10-year-old Flip City reviews still matter, when the disc golf world has changed so much since then?). Or dropping lowest & highest reviews. Or choosing a different minimum for the Top 10.

Though I'd value TRs over locals who join and write 1 review of their local course, I'd hesitate to put too much weight on the TRs. On the plus side, they've played lots of courses so have some perspective. But they get that TR status by the thumbs, too; writing reviews well, or perhaps just pleasing locals on courses with big followings, isn't the same as being authoritative on disc golf quality.
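Purely as a sketch of those alternatives mentioned above (dropping the lowest and highest ratings, or dropping reviews past a certain age), and not anything the site does; the two-year cutoff and the (rating, date) format are assumptions:

```python
from datetime import datetime, timedelta

def trimmed_average(ratings):
    """Average after dropping the single lowest and single highest rating."""
    if len(ratings) <= 2:
        return sum(ratings) / len(ratings)
    trimmed = sorted(ratings)[1:-1]
    return sum(trimmed) / len(trimmed)

def recent_average(dated_ratings, max_age_days=365 * 2):
    """Average of only the reviews newer than the cutoff (two years assumed)."""
    cutoff = datetime.now() - timedelta(days=max_age_days)
    recent = [rating for rating, date in dated_ratings if date >= cutoff]
    return sum(recent) / len(recent) if recent else None

# A vendetta 0 (and a homer 5) both get dropped by the trim:
print(trimmed_average([0.0, 4.0, 4.0, 4.5, 4.5, 5.0]))   # 4.25
```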
 
That's BS David...so a 0 for Woodshed is not egregious enough? Really?
The management is more interested in numbers than accuracy. Wise up.

I can easily see someone who is an inexperienced (bad) golfer having a dreadful time on the Woodshed, particularly if playing without a guide. Wooded holes are brutally long and silly tight. Open holes are several throws' worth of long. The pond is a disc eater, with the Japanese iris surrounding it and lots of nasties lurking within. Sometimes the grass isn't mowed, etc, etc. Does it all add up to zero? Probably not, even for the worst golfer, but if that is their experience, so be it. Your quest for accuracy in an opinion-driven list is tilting at windmills. The only way to achieve any form of accuracy in a list such as this is to try for as many numbers as possible.

Disclaimer- The Woodshed and Whippin' Post are owned by my friends and I have played them many times. Paw Paw is probably my favorite place to play disc golf. I helped out in running the West Virginia Team Invitational there just a few weeks ago.
 
For what it's worth, the 0 at Woodshed was not by a beginner, but by a guy with a seething hatred of wooded courses, and of those who design them. Even on the courses he gave high ratings, he'd spend much of the review trashing other courses for not being like them. You didn't have to read many of his reviews to know they were his honest opinions (and about as outlier as it gets). He also gave a 0.5 to the IDGC and seemed to think Charlotte was the pit of hell, disc-golf-wise.
 
What I'm getting from all this:

It's okay if you wanna rate up for personal agenda, but if you rate down, be prepared for the immediate backlash.

That may be, but a vendetta 0-disc rating will affect the overall rating much more than a homer 5 will.
 
That's BS David...so a 0 for Woodshed is not egregious enough? Really?
The management is more interested in numbers than accuracy. Wise up.
Drawing a line on that is a lot harder than it sounds. That review was his normal "I can't believe y'all think courses like this are good" review that he did for every wooded course, until you get to the car chase. The car chase is weird.

Is the car chase the reason he gave it a 0? It might have been a factor, but given his history I would have expected a 1 or a 1.5 from him for that course. If he docked it a point because he perceived that there was this car chase confrontation with the owner, is that really a big deal?

Take away the car chase and say he gave it a 1. It's still an outlier at a 1, but it's his opinion. We asked for his opinion and he gave it. We didn't say "we want your opinion so long as it's in line with everyone else's opinion."

I'm not unsympathetic to the course owner perspective. If I had a course that would be in the top 25 if you delete one review, I totally would want that review gone if I was trying to attract visits to my course. We can't really let private owners dictate which reviews stand and which don't, though. If we did that all the private courses would be rated 5's.

So it's damned if you do/damned if you don't. In some people's eyes we delegitimize the process by letting a 0 for The Woodshed stand because we all agree it's not a 0, but in some people's eyes we delegitimize the process by deleting a 0 for The Woodshed because we are enforcing groupthink and getting rid of reviews that stray too far from the conventional thinking. Either way, once that 0 for The Woodshed posted people were going to be unhappy.

I think we all get that if we built a disc golf course with our bare hands on our property and were kind enough to allow people to play it, it would be personally insulting for somebody to walk up to us after playing it and blurt out "That sucked!" in our face. When somebody rips a course in a review on this site and it lives there for years and years, it's like that guy getting to yell "That sucked!" in our face every day for the rest of time. That could be infuriating if you can't let it go, and some people just don't have the kind of personality to shrug that stuff off. We don't get to tell people whether a bad review of their course should bother them any more than we get to tell people they should enjoy wooded courses. I get why it's a hot-button issue with private owners and why it comes up all the time, and I wish there was a better answer. There just isn't a better answer. Just the slippery slope of a censored groupthink site.
 
I don't remember details and can't link examples, but there was a guy on here years ago that was in Texas who also hated trees as a design element and had a thing out for Houck. You could tell from his reviews that he had designed a poorly regarded open course and he was upset that people didn't understand the challenge of throwing in the wind. Houck's courses are generally highly regarded AND use trees, and that was driving him crazy. He blasted a bunch of Houck courses with bad reviews.

Was that a vendetta? Or is "anything John Houck designed is garbage" a legitimate opinion? Sometimes it's hard to tell. You had to read all of his reviews to pick up on the theme; one by one they were just weird reviews that ranted about trees and wind and Houck. It just seemed like his opinion was "trees are bad, don't use them for disc golf." The Houck thing just seemed like a "I don't like trees, this course uses trees, Houck designed it, don't know why everybody says his courses are good" thing. You had to combine them to perceive a vendetta against Houck.

All of that happened years ago before any reviews were deleted for any reason. It never crossed my mind to suggest that we get rid of them even though they struck me as crazy at the time.
 

If you have a course with an average rating of four (4.0) across 9 reviews:

A zero brings it to an average of 3.6 with ten reviews:

((4 x 9) + 0) / 10 = 36 / 10 = 3.6

A five brings the average to a 4.1 with ten reviews:

((4 x 9) + 5) / 10 = 41 / 10 = 4.1

This is because a 5 is closer to the course's current 4.0 average than a 0 is, so it shifts the overall average less.

--------------------------------

A course that averages less than 2.5 will show the opposite effect.

With a 1.0 average on 9 reviews:

A zero brings it to a 0.9

A five brings it to a 1.4
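For anyone who wants to plug in their own numbers, the arithmetic above boils down to this (plain unweighted averages assumed):

```python
# Course average after one more rating is added to a plain (unweighted) average.
def new_average(current_avg, n_reviews, new_rating):
    return (current_avg * n_reviews + new_rating) / (n_reviews + 1)

print(new_average(4.0, 9, 0))   # 3.6 -- a vendetta 0 on a 4.0 course
print(new_average(4.0, 9, 5))   # 4.1 -- a homer 5 on the same course
print(new_average(1.0, 9, 0))   # 0.9 -- a 0 on a 1.0 course
print(new_average(1.0, 9, 5))   # 1.4 -- a 5 on the same low-rated course
```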
 
More often than not, outlier review scores are the sign of shameless homerism, a sourpuss blaming the course for his crappy play, or someone with obtuse tastes.

But sometimes, the outlier is right, and once they put a chink in some overhyped course's armor, it seems others are more comfortable throwing more rocks.

Groupthink is far more dangerous to the integrity of the review console than a few who go against the grain, although in an ideal world I'd prefer to do without either.
 
I don't remember details and can't link examples, but there was a guy on here years ago that was in Texas who also hated trees as a design element and had a thing out for Houck. You could tell from his reviews that he had designed a poorly regarded open course and he was upset that people didn't understand the challenge of throwing in the wind. Houck's courses are generally highly regarded AND use trees, and that was driving him crazy. He blasted a bunch of Houck courses with bad reviews.

Was that a vendetta? Or is "anything John Houck designed is garbage" a legitimate opinion? Sometimes it's hard to tell. You had to read all of his reviews to pick up on the theme; one by one they were just weird reviews that ranted about trees and wind and Houck. It just seemed like his opinion was "trees are bad, don't use them for disc golf." The Houck thing just seemed like a "I don't like trees, this course uses trees, Houck designed it, don't know why everybody says his courses are good" thing. You had to combine them to perceive a vendetta against Houck.

All of that happened years ago before any reviews were deleted for any reason. It never crossed my mind to suggest that we get rid of them even though they struck me as crazy at the time.

Scott, I'm pretty sure that was a vendetta. My memory is that if two courses had similar tree density, he would rate my course much lower. The evidence was clear enough that the DGCR powers-that-be felt justified in pulling the reviews. If someone hates my courses and has a reason for it (I can't imagine what that would be, of course) then that's fine. But that wasn't the case with this guy. I still haven't figured it out, because I had known him for 20+ years at that point, and I thought everything was fine.

I'm really surprised anyone remembers all that -- as I've said before, you're an impressive fellow. Thanks.
 
If you have a course with an average rating of four (4.0) across 9 reviews:

A zero brings it to an average of 3.6 with ten reviews:

((4 x 9) + 0) / 10 = 36 / 10 = 3.6

A five brings the average to a 4.1 with ten reviews:

((4 x 9) + 5) / 10 = 41 / 10 = 4.1

This is because a 5 is closer to the course's current 4.0 average than a 0 is, so it shifts the overall average less.

--------------------------------

A course that averages less than 2.5 will show the opposite effect.

With a 1.0 average on 9 reviews:

A zero brings it to a 0.9

A five brings it to a 1.4

Which is precisely the point: the claim that a vendetta 0 rating will affect the overall rating more than a homer 5 is valid for a subset of courses with a higher than average rating, not across the board as Tripper's unqualified claim implies.
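Put another way (again assuming plain unweighted averages), one new rating moves a course's average by |new rating - current average| / (number of reviews + 1), so whichever extreme sits farther from the course's current average does the bigger damage:

```python
# How far a single new rating moves a plain average.
def shift(current_avg, n_reviews, new_rating):
    return abs(new_rating - current_avg) / (n_reviews + 1)

print(shift(4.0, 9, 0))   # 0.4 -- a vendetta 0 hits a 4.0 course hard
print(shift(4.0, 9, 5))   # 0.1 -- a homer 5 barely moves it
print(shift(1.0, 9, 5))   # 0.4 -- on a 1.0 course, the homer 5 is the big mover
```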
 
Which is precisely the point: the claim that a vendetta 0 rating will affect the overall rating more than a homer 5 is valid for a subset of courses with a higher than average rating, not across the board as Tripper's unqualified claim implies.

Maybe so, but most 5 reviews for a course rated 2.5 or lower aren't really alarming to most people, because everybody knows it isn't a true 5. Now a zero for a 4.5, that knocks you out of the top 10 and makes it look less "playable."
 
More often than not, outlier review scores are the sign of shameless homerism, a sourpuss blaming the course for his crappy play, or someone with obtuse tastes.

But sometimes, the outlier is right, and once they put a chink in some overhyped course's armor, it seems others are more comfortable throwing more rocks.

Groupthink is far more dangerous to the integrity of the review console than a few who go against the grain, although in an ideal world I'd prefer to do without either.

This is spot on; sometimes the outlier is the true voice in a veritable sea of homers, convinced to join up to give a course a 5.
 
True, though usually those Reality Check reviews come in at a 3 or 4, not a 0 or 1.

But scarpfish is right---once someone bursts the bubble, more are willing to follow. It's a familiar story with courses that make a dazzling debut; I know of 4 courses within 2 hours of me that have taken that path, to various degrees.
 
