
Rating/review system discussion

_MTL_

Flippy Flopper
Joined: Feb 19, 2008
Messages: 3,514
The inconsistencies in rating on this site are the issue.

Beyond the fact that every rating counts the same regardless of who does it, the site gives basically no guide as to how someone should rate.

For example, I rate courses by comparing what the course delivers with what the land offered. If I see wide open rolling hills and gorgeous views and the course is average, it's going to get a bad rating. Then the course down the street has very little land and very little scenery, yet is a solid, straightforward course. Even though course one is probably the better course, I rate course two higher.

I couldn't care less about trash cans or multiple pins or water on a course, yet other people won't rate a course at a certain level without these factors. I also don't think you should penalize an elevation-less course (see previous comment).

The way to solve this is to have multiple ratings.

A course should have 10 ratings from 0 - 5 in the same manner as it does now.

1. Safety
2. Ease of navigation
3. Equipment
4. Beauty
5. Design
6. Difficulty
7. Length
8. Elevation
9. Water
10. Additional items (trash cans, bathrooms, water fountains, etc.)

The overall rating could be an opinion the user selects, or an average of all these factors.

Then, if I really care about well-designed courses, I could sort based on that. If safety is my thing, I could sort by that. Beauty, etc etc etc.

Then I should be able to customize which items on that list matter most to me. The site should then suggest courses.

Example - I really care about ease of navigation because I like new courses; equipment is important and length is important. I don't care about elevation and really hate water on courses. I put those in my profile. The site spits out courses it recommends (for example, courses that have good navigation without water) and flags courses I would not like (a course with poor navigation and a lot of water).


This would solve a lot of the issues on this site. I'm sure it's not as easy as I hope.
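A minimal sketch of how the profile-driven suggestion idea above could work. Everything here is invented for illustration: the category names mirror the list in the post, the weights and course data are made up, and a negative weight is one possible way to model "really hate water on courses."

```python
# Ten 0-5 category ratings per course, plus a per-user weight profile.
# Categories absent from the profile get weight 0 and are ignored;
# a negative weight penalizes courses strong in that category.

CATEGORIES = [
    "safety", "navigation", "equipment", "beauty", "design",
    "difficulty", "length", "elevation", "water", "amenities",
]

def personalized_score(course_ratings, user_weights):
    """Weighted average of a course's 0-5 category ratings."""
    total = 0.0
    weight_sum = 0.0
    for cat in CATEGORIES:
        w = user_weights.get(cat, 0.0)
        total += w * course_ratings.get(cat, 0.0)
        weight_sum += abs(w)
    # No expressed preferences -> neutral score.
    return total / weight_sum if weight_sum else 0.0

# Profile from the post's example: navigation, equipment, and length
# matter; elevation is ignored; water counts against a course.
profile = {"navigation": 1.0, "equipment": 1.0, "length": 1.0, "water": -1.0}

easy_dry = {"navigation": 5, "equipment": 4, "length": 3, "water": 0}
confusing_wet = {"navigation": 1, "equipment": 2, "length": 3, "water": 5}

print(personalized_score(easy_dry, profile))       # high: recommend
print(personalized_score(confusing_wet, profile))  # low: flag as a poor fit
```

Sorting all courses by this score would give both the "recommended" and the "you would not like" lists from the example in one pass.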
 
So many absolutes...:wall:

Why should someone be weighted higher than someone else? I know Diamond reviewers that didn't even play courses they reviewed. Well, at least one.
 

If you want your buddy who has never played disc golf to play a course and then rate it to have the same impact as your rating, that's fine.

I just don't agree with it.

Golf Digest uses a panel to rate courses, not just anyone with internet access who has never played.
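The weighting dispute can be made concrete. A minimal sketch, with invented numbers and an assumed experience measure (courses played; years played or panel membership would slot in the same way), contrasting the site's flat average with an experience-weighted one:

```python
def flat_average(ratings):
    """Every rating counts the same, as the site does today."""
    return sum(r for r, _ in ratings) / len(ratings)

def experience_weighted(ratings):
    """Weight each rating by the reviewer's courses played.

    'Courses played' is an invented stand-in for experience; any
    monotone measure of reviewer experience would work the same way.
    """
    total = sum(r * played for r, played in ratings)
    weight = sum(played for _, played in ratings)
    return total / weight

# (rating, courses_played) pairs: one veteran, two first-timers.
ratings = [(2.0, 200), (5.0, 1), (5.0, 1)]

print(flat_average(ratings))         # 4.0: first-timers dominate
print(experience_weighted(ratings))  # ~2.03: the veteran dominates
```

The example also shows the objection raised in the replies: under the weighted scheme, a single prolific reviewer can all but erase the opinions of newer players.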
 

So new players don't play new courses? If they aren't at least as good as you, their opinions shouldn't count as much? That sounds like pre-flameout elitist MTL... You must've done this much for the sport to be allowed an opinion. :\ At least that rating is based on actually playing the course. I won't go into the Diamond Reviewer I know who drives by courses, marks them as played, and still reviews them by recycling others' reviews, even though I'd argue that taints ratings far worse than a new player giving their own opinion.
 

I appreciate this criticism even if I don't agree with it... countless times I've read posts that say something about how this or that reviewer sees things the way the poster commenting about it does and they like those reviews because of that... this is how folks use the DGCR reviewing system... well, at least one way. I read through a bunch of reviews and take away the common elements to get a picture of what the course in question is like... so, it's the aggregate for me. That's the point, the rating system is an aggregate and in that way is very fair. There are ratings that are extremely different from the majority, but this gets balanced out by the other reviews. This is fair. This is the premise of the rating system here.

New players with little experience have every right to review and rate just as the experienced folks do... this works.
 
Also, keep in mind that there are pictures, links, etc for each course. There is also the inclusion of meaningful icons next to the hole numbers (for example, water in play and elevation).
 

I understand and respect your opinion.

My opinion that newer players shouldn't have their reviews count as much is not an elitist opinion, however. I never said anything along the lines of my review counting more; I said anyone who has played a lot. I even mentioned you in my example to avoid this exact response.

I do appreciate you somewhat confirming that people's opinion of me is not 100% my fault.
 

I understand what you mean, but don't completely agree with you. DGCR's rating sword cuts both ways (hell, probably cuts more than two ways). The beauty is that everyone can rate a course based on their values... which is also the problem. Who's to say who is/isn't qualified to review courses, and what criteria should be used? For the most part, I think there's a great deal of consensus, but the devil's always in the details.

In your example above, you intentionally place emphasis on how well the designer utilizes the property's natural elements (something I do to a certain degree myself).

Where you and I differ, however, is that ultimately, my ratings are based on how much I enjoyed playing the course. While I might agree that Course A has a lot of untapped potential, if Course B ends up being a more enjoyable course, shouldn't the rating reflect that? Mine will. I will mention what I saw as untapped potential in Other thoughts (or possibly even cons), but try to arrive at ratings that indicate how much I actually enjoyed playing the course, all things considered.

The way I see it, a better course should have a higher rating, regardless of what it does or doesn't do well. That being said, each of us is free to apply whatever weighting we choose to whatever attributes we think are more or less important.

I can't stress this enough: ratings are really only useful for courses a decent distance from where you live/work. Most people here will pretty much play every course in their area, decide for themselves what they do and don't like, and how often they want to play it. Anyone who uses reviews/ratings to decide which local courses to play is a moron.

But when you have 2-3 days to drive 600 miles round trip, and want to make the most of your time, reviews and ratings are pretty much all you've got.

That's why I always tell people to click on courses they're familiar with and read a bunch of that course's reviews. Find a few reviewers who say the same things about the course that you would say. Click on their profiles and see what other courses they've reviewed... find some people who seem to share your DG values, and make them your personal TR's.
 
I'm just someone who calls it what it is... it is an elitist opinion to say that only a certain class of folks should count or at least count more than another class of folks... I mean... yeah

Not an attack, just calling it what it is... and not diminishing your point either; however, it is an elitist perspective.

Anyway, I want to consider this idea of a panel a bit before I have an opinion on it...
 
to add to BNM's post... I enjoy when a reviewer mentions their rating method... some clearly state that they rate the course for the course and some clearly mention how amenities affect their rating and etc... it's pretty clear for the most part

There is also a filter view that can whittle down things like TR's or how many times the reviewer played the course... and even how many years experience the reviewer has... so, basically, you have the ability to see what you want to see.
 

Thanks for this response - and you too, Noill.

But you are exactly right - the best thing and the worst is the fact that it's completely open to interpretation.

Which is exactly why I feel the system I am proposing would be far better. Then everyone's actual interpretation of what they want in a course is available.

At the minimum, having both a panel's rating and everyone's rating would be better.
 
I say "consider the source." Up until very recently, golf pretty much was a game for the semi-elite. It's essentially cost prohibitive for the majority of the population. Disc golf, not so much.
 

While it might be elitist, it would create a perspective that would enhance the site.

Unless you give your favorite designer's opinion of a course no more weight than that of your buddy who has never played, you are doing basically the same thing.
 

Thanks for explaining what is important to YOU. Reviews and ratings are as varied as the disc golfers that write them. You seem more concerned about ratings than review content.

My reviews are twofold. First, I want to give as much information as I can about the course, and second, to give the course my rating.

I usually try to give advice about bringing along kids, carts and strollers because that is important for some folks to know when making a decision between two courses. I want to let players know if the course is all 18 in a row, or if you can get back to the car after 8, 9 or 10 holes so players can be prepared for the round. I like to let players know how easy it is to navigate or if they can lose discs or if there is poison ivy.

I think the rating I give is the least important part of my reviews.
 

Agreed that reviews are way more important than ratings.

But this thread is specifically about ratings, which is why I brought this up.

I'm shocked that no one else thinks that multiple rating categories, which would let everyone judge a course against their personal list of what makes courses good, are a good idea.

I suspect it's the author and not the content.
 

Again, it wouldn't (I won't post the contradiction between your reply to me and your reply to Noill). A course might be too difficult for a new player. A new player reviewing it and rating it low might help keep another new player from having a bad day there. It sounds like the only thing it would "enhance" is that you wouldn't have to sift through reviews you deem unworthy. Sorry that you have to share air with us lesser folks.

Here's the genius behind the bad review and rating though. As that reviewer gets better and revisits the course, they can change their review and rating. While that wouldn't potentially stop more new players from visiting, it would reflect the golfer's changing thoughts on courses and the course design.
 

Yes, and going along with that is that the readers all have their own values---someone looking at reviews may be looking for something different than MTL, or me.

If the ratings are by an assortment of people---experience, skill, verbosity---and the consumers are an assortment, then I'm fine with it all averaging out.

The PDGA tried a formulaic approach, measuring all sorts of factors on a fixed scale and averaging them together. It didn't last long.
 

I literally used you in the example. There is absolutely no reason my review should count more than yours. I've never said I am the end all be all of what is a good course and what is not. I'm comparing extremes - John Houck and your buddy who has never played.


I wouldn't call it genius, rather a band-aid on a wound needing stitches. Sure it works and it's effective, but it causes more work down the line and addresses the result of the problem rather than the problem itself.
 