
I don't understand DGCR reviews/ratings

Reviews and ratings on Disc Golf Scene are about as useful as reviews and ratings on U-Disc.

Just about any course that's halfway decent is rated at least "A-" on DG Scene. For the most part, if a course is rated below A-, it likely sux.

The problem with a bunch of courses being rated in the top few tiers is that it provides little separation between the truly wonderful, bucket-list courses and the merely good.

The way people choose to rate courses on those sites is essentially a binary function:
Either it's good, or it's not.
But you really can't tell to what degree they're good, or to what degree they're not. And there isn't enough substance in the reviews to make that determination.

That would certainly make it tougher to plan a road trip and know you were going to hit truly high-quality courses.

Probably why I switched over to this site pretty quickly after noticing that, at about 90% of the courses I was going to, I was the most recent reviewer in like 2-3 years.
 
Tim started a course review forum. He likes the course reviews.

He set up a message board to go along with it because it seemed like the thing to do. He doesn't really seem to like this message board, which is why I've been lurking around here for the last dozen years. I read it so he doesn't have to.

Short story long, Tim is never going to see that comment. He's probably not seen any other discussions of it. You can't really guess whether he does or doesn't want to do anything when he's not responding to the comments; it just means he's not reading the thread.

Tim generally watches the site suggestions, bugs and help forum. If you want to seriously discuss overhauling course reviews and engaging Tim in the conversation, you have to have a thread there. Otherwise you are depending on me to tell Tim he should read this thread and...I'm not going to. :|
So that's why I didn't get a new puppy for Christmas? Shucks.
 
Some of us pay to get a crown, some of us do not. I'll agree it's a worthless trinket; I'd do fine without it. Took me years before I decided to donate $20 to keep this running. I think someone guilted me :D, inadvertently.

Probably no chance that that person was me. I mean, if you calculated the odds, there's basically a ZERO percent chance it was me. ;)
 
Other sites may do other things better than DGCR, or do things that DGCR doesn't even do, but DGCR is and should always be the best place to find course info and get at least somewhat realistic reviews and ratings. I know that when I travel to play a new course and there is a wellsbranch250 etc. review for it, I'm getting good solid intel on the course. For me, supporting this site is a no-brainer. thrembo is DGCR for life.
 
I'll throw my two cents on the fire. I started reviewing because I played a terrible course and thought people should know not to play there. That course is gone but you can still read the review. After that I started reviewing courses I had played because I wanted to let people know what to expect of courses when they come to the KC area, and I had a friend who was doing the same. To be honest my first 15-20 reviews were... crap, and I plan on going back and re-writing them soon, but with time comes experience so I'll move to:

The use of the thumb buttons. Thumbing down should be saved for reviews that are completely useless... as you can see, I've read at least 194 useless reviews on here. Were a few of my early thumbs-downs deserved? Yeah, they were, but what they did was help steer me toward making my reviews more helpful and informative to people looking into the course. Thumbing down isn't for when someone gave a lower rating than you thought the course deserved, or because they brought a different view of the course. If you simply don't like the review, nobody's forcing you to hit thumbs up.

As for UDisc, I use it for score tracking and disc tracking, nothing more.

As for crowns, Tim deserves a little bit of my cash for all the work he's done.

As for me, I should be writing reviews (or re-writing reviews).
 
Your course is still on the site, it's just listed as extinct. Perhaps there is a way to fix that???

I rarely leave DGCR reviews because the page refreshes after 10 minutes, and I've had so many reviews get deleted just as I hit the post button.

I also only usually leave them for courses that don't have many other reviews.

https://www.dgcoursereview.com/course.php?id=3127
 
* I think a few like Olorin, and maybe Mike Harrington, and maybe 1 or 2 others had developed their own review/rating system prior to DGCR.
That is correct. I first started reviewing courses as part of the PDGA Course Evaluator program (circa 2002? Chuck, what were the dates on that?). I developed my review criteria using that framework of Basics-Design-Amenities. You can find my review criteria at Olorin's Review Criteria. So I had been looking for a site like DGCR and I was elated to find it in 2007. Thank you TimG!

I even have a website that I created before DGCR was born. It has a really catchy name "Disc Golf Course Reviews" (note the "s" on the end).
 
If I wrote more reviews, I'd have to write down notes as I played the course. My golfer's memory has gotten really good.

I personally love DGCR. The maps and info are the best for planning. Udisc has its place for me in keeping score and quick navigation at a new course. I just wish the younger generation loved DGCR as much as we do.

My phone gets way more use than my laptop these days. That definitely accounts for fewer reviews, as someone pointed out earlier.

Another thing no one has pointed out yet is today's culture of over-pumping things. Everything is hyped, marketed, and pumped up no matter how awful or worthless it is. So many bootlickers out there will prop up the person doing the over-pumping. It's almost cult-like. Udisc and Facebook are great avenues for over-pumping. With DGCR, reality will sink in faster with honest reviews.
 

I forgot to mention people are marketing themselves extremely hard on Facebook. That'd be difficult to do on DGCR. Not something I'm interested in anyway.
 
That is correct. I first started reviewing courses as part of the PDGA Course Evaluator program (circa 2002? Chuck, what were the dates on that?). I developed my review criteria using that framework of Basics-Design-Amenities.
Looks like our team started to develop criteria in 2003. I have completed course review forms on file from 2004 thru 2008 when activity ceased.
 
Your course is still on the site, it's just listed as extinct. Perhaps there is a way to fix that???

https://www.dgcoursereview.com/course.php?id=3127

The course is extinct. I'm the one that reported it, because I was so glad it was gone.
 
If I wrote more reviews, I'd have to write down notes as I played the course. My golfer's memory has gotten really good.
This is one of the reasons I don't write as many reviews as I used to. Years ago, I relied on printed maps and scorecards when I roadtripped. At the end of the day (sometimes during the round), I'd go through the day in my head, and write notes on them. As I transitioned from paper to smartphone, the notes stopped.

Good, bad, or ugly, I like to include enough detail in my reviews that I can't rely on memory alone, which doesn't always do the trick, especially when I hit a bunch of courses.
 
I don't see anything in your posts in this thread, about Udisc. Not in your original post, not in your subsequent complaints about Tim, not in the talk about crowns.

It seems to be about dissatisfaction with DGCR.

To be fair, when I saw this post I immediately thought this thread was likely going to be a critique of how much hate UDisc reviews have been getting (and an associated post with the same name, only replacing DGCR with UDisc). Rich has even posted somewhere here recently that people shouldn't be so quick to discredit a short review simply because it is short. He kind of has a point: it's not overly fair to completely discount someone's thoughts on a course just because they don't spend as much time articulating that view as TR does here on DGCR.

More onto the topic at hand:

Personally, overall I think DGCR is fine the way it is. Ratings in general are typically fairly subjective, and I don't think there is a perfect system for rating courses. Any potential fix would probably be too easy to abuse.

For instance, what if the 25% of reviews most recently written/revised (or reviews written/revised within the past year) were double-weighted, so that when a course slowly undergoes a fairly substantial overhaul or an over-time redesign, reviews of the original layout don't dominate the score? Obviously the way it should work is that the old course is considered extinct and a new page is made. However, sometimes this process is so slow that it is hard to determine the right time for that to happen. Courses like Angry Beaver, which has had its layout tweaked and redesigned since it opened, are now completely different from what they used to be. (If I am adding correctly, over half the holes are significantly changed or completely new.) The problem with this method is that a new course without a significant number of reviews, like The Lions, could absolutely have its score tanked by one abysmal review.

So maybe reviews that receive a certain number of Thumbs Up could be double-weighted, to reflect what is perceived as accurate, well-written information, and not just a person who was upset they didn't play well, lost a disc, etc. The issue here is that too many people would likely use the Thumbs Up/Down feature to affect the rating of courses they like/dislike instead of focusing on whether the review was actually helpful.
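To make the trade-off concrete, here's a minimal sketch of the double-weighting ideas floated above, next to the plain average the site uses today. Every field name ("rating", "year", "thumbs_up") and both cutoffs are hypothetical assumptions for illustration, not DGCR's actual data model.

```python
# Hypothetical review records; fields and values are made up for illustration.
reviews = [
    {"rating": 4.5, "year": 2012, "thumbs_up": 1},
    {"rating": 3.0, "year": 2015, "thumbs_up": 0},
    {"rating": 4.0, "year": 2023, "thumbs_up": 7},
    {"rating": 4.5, "year": 2024, "thumbs_up": 3},
]

def plain_average(reviews):
    # The current system: every review counts exactly once.
    return sum(r["rating"] for r in reviews) / len(reviews)

def weighted_average(reviews, recent_after=2020, min_thumbs=5):
    # Double-weight a review if it is recent or widely thumbed up.
    total = weight_sum = 0.0
    for r in reviews:
        w = 2.0 if (r["year"] >= recent_after or r["thumbs_up"] >= min_thumbs) else 1.0
        total += w * r["rating"]
        weight_sum += w
    return total / weight_sum
```

With this sample data, `plain_average` gives 4.00 while `weighted_average` drifts toward the two post-redesign reviews. The point of the sketch is that `recent_after` and `min_thumbs` are exactly the subjective knobs the next post complains about.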
 
There are a lot of things that could be done, to tweak the ratings averages here. But with each one, you introduce the subjective factor of that tweak -- what percentage to double-weight, how many years to go back, whose reviews are more reliable? And open the door to squabbling -- your changes to the formula aren't as "accurate" as my changes to the formula would be.

There's a nice simplicity to a plain average of all reviews, without tweaking.

Plus, it matters very little. Usually, the squawking is about the Top 10, and hundredths-of-a-point separations. For the other 99.999% of courses, the ratings wouldn't change enough to warrant the effort.

I've long wished there were ways we could do it ourselves -- produce a list of courses rated by people who've played over 100 courses, or who live more than 100 miles away, or whatever each of us might dream up. It would be interesting to be able to slice & dice data that way, and see what it says. But -- and it's a big but -- it's not my website, and I'm not tackling whatever work would be necessary to make that possible, so I'm not about to say what I think Tim should do.
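The "slice & dice" idea above is really just recomputing an average over a filtered subset of reviewers. As a rough sketch, assuming hypothetical fields like `courses_played` and `miles_from_course` (DGCR exposes no such data feed), it could look like:

```python
# Made-up review records; every field name is an assumption for illustration.
reviews = [
    {"rating": 4.5, "courses_played": 12,  "miles_from_course": 5},
    {"rating": 3.5, "courses_played": 250, "miles_from_course": 400},
    {"rating": 4.0, "courses_played": 180, "miles_from_course": 150},
]

def filtered_average(reviews, keep):
    # Average only the reviews the predicate keeps; None if none qualify.
    subset = [r["rating"] for r in reviews if keep(r)]
    return sum(subset) / len(subset) if subset else None

# Ratings from reviewers who've played over 100 courses:
veteran_avg = filtered_average(reviews, lambda r: r["courses_played"] > 100)
# Ratings from reviewers who live more than 100 miles away:
traveler_avg = filtered_average(reviews, lambda r: r["miles_from_course"] > 100)
```

In this sample, dropping the local low-experience reviewer pulls the average down from 4.00 to 3.75 -- which is the interesting part: each filter produces a different "truth" about the same course.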
 
There are a lot of things that could be done, to tweak the ratings averages here. But with each one, you introduce the subjective factor of that tweak -- what percentage to double-weight, how many years to go back, whose reviews are more reliable? And open the door to squabbling -- your changes to the formula aren't as "accurate" as my changes to the formula would be.

Exactly this! It's actually what I was trying to articulate in my post, but you've definitely said it in a more concise way. There's no reason to change the ratings because they are subjective, and any change to the system would likely be imperfect and/or abuse-able.

I've always viewed the ratings as a loose guideline and not as a definitive ranking of courses. What I love about the way DGCR works is that you can find a reviewer who has similar tastes in courses to yours, and use their reviews to find a course you might like.
 
I'll sidenote to say that for a given course, you can filter ratings by a number of factors (year reviewed, experience, etc.), and see what happens. For example, Stoney Hill is rated 4.40, but if I throw out the early reviews and only use reviews from 2016, it's only 4.38.

That's with my basic membership; I'm not sure what people more generous than I can do.

It doesn't create lists for "Top 10s", etc., but it's available for any given course.
 
I didn't know how else to explain it. :D
[attachment: review meme.JPG]
