
Overrated courses

I think your big mistake was driving 5 hours to play disc golf in Columbia.

Especially considering where driving 6 hours in that direction might have gotten you.

Actually, for my taste, Earlewood and Southeast are fine courses. Not Epic or Top 10 or even Top 50, but well worth playing. They don't merit a 5-hour drive, but should be a stop on any trip through the area, time permitting.
 
Works for me. Honestly, we're probably just nitpicking at this point, and I'd much rather not be at work and out playing any of the courses we've discussed.

I think we can all agree on that. I'd rather be out playing any course at all.
 
I thought Leviathan was over-rated. This is based on my preference for variety and uniqueness of terrain. Leviathan was pretty much all wooded and rolling. It was fun to play, but not a destination style course that I'd go out of my way for, nor a "must play" outside of also hitting up Flip and Mason in the area.

I just got back from an MS and LA trip, and Moccasin Bend just east of Jackson, MS looked promising. Man, was that wrong! While there were some incredible and awesomely unique holes, most of the course wasn't really a course: just a basket, a poorly marked and unimproved tee area, and LOTS of random trees in between. Some holes were absolutely ridiculous. Add to that the scores of poison plants (ivy and oak) EVERYWHERE, and aside from 3-5 holes, I was upset I went out of my way to play it.
 
I really can't say that in all my travels one course sticks out as being overrated. I can say, though, that usually every state has its share of "HOMEBOY" ratings, and those courses are almost always bumped up half a disc, if not a whole disc, just because they're in the reviewer's home state. But that's just my personal view on the golf I have played.
 
Lindsey Park - Gold Course in Texas was overrated to me. It was rated around 4.4 when I played it so I was expecting a pretty awesome destination. While I enjoyed some of the holes it felt very repetitive and just hard for the sake of being hard. It was set in mature woods and it didn't seem that much additional work was done to carve out of the woods.
 
I really can't say that in all my travels one course sticks out as being overrated. I can say, though, that usually every state has its share of "HOMEBOY" ratings, and those courses are almost always bumped up half a disc, if not a whole disc, just because they're in the reviewer's home state. But that's just my personal view on the golf I have played.

Yeah, but you've barely played any courses.
 
I really can't say that in all my travels one course sticks out as being overrated. I can say, though, that usually every state has its share of "HOMEBOY" ratings, and those courses are almost always bumped up half a disc, if not a whole disc, just because they're in the reviewer's home state. But that's just my personal view on the golf I have played.

It would be interesting, though impractical, if you could filter reviews by players living more than, say, 500 miles away.

There are biases by untraveled players that aren't local boosterism, as much as just regional bias. If you're in an area where courses tend to be weaker, but you've never traveled, then the best course in your area is bound to be a 5.0 in your eyes. If you live in an area where one style of course dominates, and that's what you learned on, you may have a bias in favor of it, as well.
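The 500-mile filter described above can be sketched in a few lines. This is a minimal illustration, not anything DGCR actually supports: the review records, field names, and coordinates are all hypothetical, and distance is computed with the standard haversine formula.

```python
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # mean Earth radius ~3959 miles

def traveler_reviews(reviews, course_lat, course_lon, min_miles=500):
    """Keep only reviews whose author lives at least min_miles from the course."""
    return [r for r in reviews
            if miles_between(r["home_lat"], r["home_lon"],
                             course_lat, course_lon) >= min_miles]

# Hypothetical reviews of a course in Columbia, SC (~34.0 N, 81.0 W)
reviews = [
    {"user": "local",    "home_lat": 34.0, "home_lon": -81.0,  "rating": 5.0},
    {"user": "traveler", "home_lat": 39.7, "home_lon": -105.0, "rating": 3.5},  # Denver
]
far = traveler_reviews(reviews, 34.0, -81.0)  # keeps only the Denver reviewer
```

In practice the hard part is the data, not the math: the site would need reliable home locations for every reviewer, which is exactly why the idea is impractical.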
 
It would be interesting, though impractical, if you could filter reviews by players living more than, say, 500 miles away.

There are biases by untraveled players that aren't local boosterism, as much as just regional bias. If you're in an area where courses tend to be weaker, but you've never traveled, then the best course in your area is bound to be a 5.0 in your eyes. If you live in an area where one style of course dominates, and that's what you learned on, you may have a bias in favor of it, as well.

I like the idea. Looking at my map, I'm fairly centralized to the east. Would there be any regions that could stand as benchmarks?
 
I like the idea. Looking at my map, I'm fairly centralized to the east. Would there be any regions that could stand as benchmarks?

Charlotte would be a good place to start, but from what I have heard, Cincinnati would be a benchmark thanks to its great courses. I generally prefer wooded and technical courses with elevation, as opposed to open field courses. IMO, most courses around schools seem to be open, with the same shot over and over. Lander and USCS are the exceptions I have found.
 
Here's the issue.

Anyone, regardless of their knowledge of the sport or experience reviewing courses, can review a course. I'm ok with that.

However, their rating counts the same as everyone else's.

Throw in local bias / newbie bias and you get a lot of horribly inaccurate ratings.

This site is far and away the best database of courses, but its rating system is a complete joke.
 
There are always going to be courses that are inflated and others that are under-rated. It's simply because we all have different tastes and different attributes which stand out to us. I prefer park style courses with plenty of mature trees. I don't mind densely wooded courses, but they're likely not going to get a high rating from me. Likewise, a course that is completely open and long is not going to get a high rating from me.

To me, it's all about enjoyment. Amenities like trash cans and bag holders all have an impact. But when I sit down to write a review, it's all about how much I enjoyed the course, whether I'd go back, and whether I think others would want to play it. As far as my opinion goes, BRP is the most over-rated course I've played because of the number of filler holes. It would be a great 18-hole course...but it's only a decent 27-hole course. That's solely my opinion. I've played some of the highly-rated Colorado courses, and Conifer was the most enjoyable and diverse. Others prefer the "private" vibe. No big deal.

Bottom line: my friends and I go on trips fairly regularly, and we try to play the best courses we can. Heading out to Selah Ranch in June, and I'm really pumped. But every year we play at least one course that disappoints based on its rating. Just because I think a course is over-rated doesn't mean that others won't enjoy it. For me, I've started looking at the pictures rather than individual reviews. Between those and the distance distribution, I'm able to identify the courses that I prefer.
 
Here's the issue.

Anyone, regardless of their knowledge of the sport or experience reviewing courses, can review a course. I'm ok with that.

However, their rating counts the same as everyone else's.

Throw in local bias / newbie bias and you get a lot of horribly inaccurate ratings.

This site is far and away the best database of courses, but its rating system is a complete joke.

Read up on the Delphi method. Although this site doesn't use all aspects of Delphi, you'll be much more impressed with how well this site works.

Interestingly, one precept is that "the status of an idea's proponent" is an invalid criterion by which to judge an idea.
 
Read up on the Delphi method. Although this site doesn't use all aspects of Delphi, you'll be much more impressed with how well this site works.

Interestingly, one precept is that "the status of an idea's proponent" is an invalid criterion by which to judge an idea.

There is no spoon.
 
I've played a handful of courses and taken a couple of short roadtrips to check out the highly rated courses near me. I've played Idlewild and Lincoln Ridge. I've played Coyote Trace. These are the highly rated courses somewhat near me. I think I've agreed with most of what I've read about these courses. I certainly think they merit their ratings.

For me though, I haven't reviewed any courses on DGCR. I'd like to play a lot more variety and get a better feel for the possibilities so I can dial in what I think a 5 star course might be.

The point of my post being: there are a lot of local people living near great courses who aren't heaping on the 5-star ratings, as well as some who will promote their local pitch n' putt "just cuz it's there." Don't be too cynical about people's motives when they post reviews...
 
Here's the issue.

Anyone, regardless of their knowledge of the sport or experience reviewing courses, can review a course. I'm ok with that.

However, their rating counts the same as everyone else's.

Throw in local bias / newbie bias and you get a lot of horribly inaccurate ratings.

This site is far and away the best database of courses, but its rating system is a complete joke.

I disagree with this, with a few caveats:
1 - Assuming a decent amount of reviews. If there are only 3 reviews - sure it is easy to inflate a course up. If there are 20....not so much.
2 - There are a very few true outliers (PKP anyone?), but they are few and far between.
3 - Just because you don't agree with a rating doesn't mean it is wrong. It may be that you value different factors than the masses. Personally, I don't mind a tight, technical course....but I have many friends who hate them. Realizing how your values differ from the masses is important to understanding the ratings.
4 - The value to me isn't so much the # for the rating - but the comments. It is very easy to read between the lines when looking at how the comments stack up against the ratings.
 
Read up on the Delphi method. Although this site doesn't use all aspects of Delphi, you'll be much more impressed with how well this site works.

Interestingly, one precept is that "the status of an idea's proponent" is an invalid criterion by which to judge an idea.

For Delphi to work well, all participants eventually need to be making their assessments (ideally right away) using the same parameters and weights. That's at least one issue with DGCR: different factors are considered and weighted differently by each reviewer.
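The point about mismatched parameters and weights can be made concrete with a toy example. Everything here is made up for illustration: the factor names, scores, and weights are hypothetical, not anything DGCR actually uses.

```python
# Two reviewers agree on per-factor scores (1-5) for the same course,
# but weight the factors differently, so their overall ratings diverge.
SCORES = {"design": 4, "scenery": 5, "amenities": 2, "navigation": 2}

def overall(scores, weights):
    """Weighted average of factor scores; weights are normalised to sum to 1."""
    total = sum(weights.values())
    return sum(scores[f] * w / total for f, w in weights.items())

design_first  = {"design": 0.6, "scenery": 0.2, "amenities": 0.1, "navigation": 0.1}
scenery_first = {"design": 0.2, "scenery": 0.6, "amenities": 0.1, "navigation": 0.1}

r1 = overall(SCORES, design_first)   # 4*0.6 + 5*0.2 + 2*0.1 + 2*0.1 = 3.8
r2 = overall(SCORES, scenery_first)  # 4*0.2 + 5*0.6 + 2*0.1 + 2*0.1 = 4.2
```

Identical factual assessments, different weights, a 0.4-disc gap: that drift is what Delphi-style processes try to squeeze out by forcing everyone onto the same rubric.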
 
4 - The value to me isn't so much the # for the rating - but the comments. It is very easy to read between the lines when looking at how the comments stack up against the ratings.

Exactly. If a well-written overall review gives an accurate representation of the course, it really won't matter whether the reviewer gave it 1/5 or 5/5. When I'm considering a course to play, I don't really pay much attention to the numbers; I look at a handful of good reviews, and odds are I know exactly what kind of course I'll be getting.
 
I was rather disappointed with Water Works. I freely admit that a good bit of that disappointment was bound to be subjective because I ended up frustrated.

Navigation was one reason I was disappointed. The first time, with a group, we never found Hole 2 (we only partly played the course and didn't get past Hole 9). The second time, by myself, I had to be shown where Hole 2 was by another player. The same thing happened on at least two other holes, and on another I only discovered two holes later that I had skipped one. This despite being pretty good with maps (working with maps has been part of my work for much of my life).

Another was how difficult it was playing without a spotter. I wasted a lot of time hunting discs on the other side of slopes and ridges.

I did play this when I had only been playing 4-5 months and had only played approximately 17 courses in 4 states (now 14 states and 2 provinces), so I had less to compare it to. Also, I played the second time (which was all the way through) in the fall, and there were lots of leaves on the ground to complicate disc hunting.

I wanted to play this again, some league night, before I moved from KC, but didn't get the chance. Maybe that would have given me a good experience so that I could better evaluate the course objectively. Still, 4.54! It is hard for me to see how Water Works rates above Blue Valley.
 
For Delphi to work well, all participants eventually need to be making their assessments (ideally right away) using the same parameters and weights. That's at least one issue with DGCR: different factors are considered and weighted differently by each reviewer.

We're into meta-Delphi now, where coming to conclusions about what the factors and weights "should" be is built in to the process.
 