
Weighted Reviews

You would get weird things at first when there are missing connections, but I would think that would go away pretty quickly as connections are made.

….which is more or less the same thing that happens with the current system. As more reviews of a course are done, the biases tend to average themselves out.

Too much work for too little return.
 
There's simply no way for a single number or ranking system to provide a feel for what it might be like to actually play the course. If you're looking for that, you're just fooling yourself.

Not all of us enjoy the same types of courses and features to the same degree. That's why good reviewers describe a course's attributes, so the reader can decide what to put emphasis on for themselves.


It might even vary for a visiting player depending on how much time they've got in the area. If I know I'll be tight on time, I might choose to play a less interesting course that has great navigation and flow over a "better" course with navigational issues.

If I have the time, I'll probably choose the latter. Which one I find more enjoyable may depend on criteria that vary from one visit to another. The current system allows me, as a reviewer, and you, as a reader, to be more objective than an ordinal system would.
 
….which is more or less the same thing that happens with the current system. As more reviews of a course are done, the biases tend to average themselves out.

For places that get lots of reviews, sure. But there are lots of courses on here with three, four, five reviews that are overrated because one or two folks who had some hand in building it or are excited about their new local course gave it 4 stars when it's really a 1 or 2.

Also, if people tend to apply the full scale to the courses close to them (no idea to what degree this actually happens, but certainly to some degree) this would tend to flatten things out region to region. This could be a good thing I suppose since most people aren't lucky enough to wander the country playing disc golf and just want to know what the best courses are near them.

Just an idea and fun to think about. Not important issues by any measure.

First post and it's beating a dead horse. Wish someone had told you to save your energy before writing all this.
Did I touch a nerve or something?
 
There's simply no way for a single number or ranking system to provide a feel for what it might be like to actually play the course. If you're looking for that, you're just fooling yourself.

Not all of us enjoy the same types of courses and features to the same degree. That's why good reviewers describe a course's attributes, so the reader can decide what to put emphasis on for themselves.


It might even vary for a visiting player depending on how much time they've got in the area. If I know I'll be tight on time, I might choose to play a less interesting course that has great navigation and flow over a "better" course with navigational issues.

If I have the time, I'll probably choose the latter. Which one I find more enjoyable may depend on criteria that vary from one visit to another. The current system allows me, as a reviewer, and you, as a reader, to be more objective than an ordinal system would.

Oh, for sure. I read the reviews to make calls on where I want to go (and hell, sometimes when I get bored I read reviews of courses I'll probably never play), and I'm very aware people have different criteria. Mine sure seem to be different from most. But when I'm taking a road trip and hoping to hit a course on the way, I *do* sometimes have the map show me all the courses along the route above, say, 2.5 discs, and I've noticed you get a lot of clear anomalies and bunching in the 2.5 to 3.5 range. Just throwing out an idea that might help with that.
 
For places that get lots of reviews, sure. But there are lots of courses on here with three, four, five reviews that are overrated because one or two folks who had some hand in building it or are excited about their new local course gave it 4 stars when it's really a 1 or 2.

Also, if people tend to apply the full scale to the courses close to them (no idea to what degree this actually happens, but certainly to some degree) this would tend to flatten things out region to region. This could be a good thing I suppose since most people aren't lucky enough to wander the country playing disc golf and just want to know what the best courses are near them.

Just an idea and fun to think about. Not important issues by any measure.

Did I touch a nerve or something?
No, you just made the same mistake that everyone else who wants to improve the review feature has made. This has been argued for more than a decade now.

You come up with a big intellectual idea that is a bit beyond most people's capacity to appreciate. Most disc golfers like things simple, tactile, and to the point. When the math gets beyond the comprehension of a fourth grader, a great deal of them zone out.
 
How about just changing the top ten to a "hot ten," only counting reviews in the last 6 months or a year (with a minimum number of reviews) and have the overall top courses ranked on a different tab? It would be nice to see more highly rated courses appearing on the front page.
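For what it's worth, the "hot ten" idea above is simple enough to sketch. Here's a minimal, hypothetical illustration: the data shape, function name, and thresholds are all invented for this example and don't reflect how the site actually stores or processes reviews.

```python
# Hypothetical sketch of a "hot ten": rank courses only by reviews left
# inside a recent window, and require a minimum review count so one
# enthusiastic drive-by review can't carry a course onto the list.
from datetime import datetime, timedelta

def hot_ten(reviews, now=None, window_days=365, min_reviews=3):
    """reviews: iterable of (course, rating, date) tuples."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    recent = {}
    for course, rating, date in reviews:
        if date >= cutoff:
            recent.setdefault(course, []).append(rating)
    # Average only the courses that cleared the minimum recent-review count.
    ranked = sorted(((sum(r) / len(r), c) for c, r in recent.items()
                     if len(r) >= min_reviews), reverse=True)
    return [c for _, c in ranked[:10]]
```

The `min_reviews` floor matters: without it, a course with a single 5.0 review in the window would outrank a course averaging 4.5 over dozens of recent reviews.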
 
I kind of like the OP's original thought of a TR top 10 or even 25. Again, the issue would be how to differentiate TRs from drive-bys and whatnot, but just having a button to click to look at that criteria would be interesting. Like many others have said before, though: if it ain't broke... One other thought: if you write a review, you should have to fill out all 3 sections: pros, cons AND other thoughts. I know, it's 3 whole sections. But when your pros section says "awesome coarse, I got a 2 on hole 6" and your cons section says "none, well, some dude yelled at me for throwing my beer can on the ground", at least write SOMETHING in other thoughts. I mean, tell me you saw a bunch of toads out there. If you can't write 1 additional sentence in other thoughts, should it really factor into ratings? And also, please use the correct version of "course". :confused:
 
How about just changing the top ten to a "hot ten," only counting reviews in the last 6 months or a year (with a minimum number of reviews) and have the overall top courses ranked on a different tab? It would be nice to see more highly rated courses appearing on the front page.

The site does that once a year: a Top 25 for courses, based on reviews in that calendar year.

It's interesting, though you see mostly the same names, shuffled around a bit. You don't see 4.6-rated courses coming out with a 3.3, or vice versa.

I've advocated (dreamed of) a feature where users can parse the reviews with whatever options they want, and create customized Top-10 lists. I gather it's more work than I imagine.
 
For places that get lots of reviews, sure. But there are lots of courses on here with three, four, five reviews that are overrated because one or two folks who had some hand in building it or are excited about their new local course gave it 4 stars when it's really a 1 or 2.

Also, if people tend to apply the full scale to the courses close to them (no idea to what degree this actually happens, but certainly to some degree) this would tend to flatten things out region to region. This could be a good thing I suppose since most people aren't lucky enough to wander the country playing disc golf and just want to know what the best courses are near them.

Just an idea and fun to think about. Not important issues by any measure.

Did I touch a nerve or something?

Wouldn't the same thing happen with a course with only 4 local reviews, where the reviewers ranked it at the top of their small list of courses played?

If you've played hundreds of courses, it's hard to rank them. One course may move up or down by dozens of places on a given day, depending on how you feel about it. You may only vaguely remember some courses you've played, and so be unable to say how the new course compares to them. But it's easy to have built up a set of rating categories in your head and know which one the new course fits into. Yet the people who have played hundreds of courses are, if anything, the ones with the most perspective and the most valuable opinions.

The current system may be too simple, but simplicity is also its virtue. I have a notion of what a consensus 3.5 out of 5.0 means. I'm less certain of what being the 3,319th-ranked course means.

Sidebar: If you ask me what courses I'd prefer to play, you may get a different answer than what courses I recommend. Some courses I prefer to play for sentimental reasons---I started there, I live there---or may not prefer to play because, even though it's great, it's a physical beat-down so I play it rarely. But I rate courses as if I'm giving recommendations to others.
 
If you've played hundreds of courses, it's hard to rank them. One course may move up or down by dozens of places on a given day, depending on how you feel about it. You may only vaguely remember some courses you've played, and so be unable to say how the new course compares to them. But it's easy to have built up a set of rating categories in your head and know which one the new course fits into. Yet the people who have played hundreds of courses are, if anything, the ones with the most perspective and the most valuable opinions.
Fair points but with this system you don't have to come up with a ranking up front. Everything is "course A or course B?" I was taught this as an easier way to rank things you're struggling to rank. You'll end up with a ranking but you don't have to start out with one. I don't have a great answer to your very good point about possibly not remembering some courses well enough. Maybe a 'Skip' button could be added for the comparisons you don't feel you have a fair answer for.

Wouldn't the same thing happen with a course with only 4 local reviews, where the reviewers ranked it at the top of their small list of courses played?
Because the other courses on that small list would have been compared by other reviewers to yet other courses, it doesn't take a lot to get real relative rankings.

I have a notion of what a consensus 3.5 out of 5.0 means. I'm less certain of what being the 3,319th-ranked course means.
Agree, and that's why a rating would be assigned based on rank. ...And yes, this is totally a rank-based system, and I get taking issue with that. It assumes the consensus top course is a 5 and the last is a zero, whereas in the real world it's entirely possible that 80% of the courses in the world *are* between 2.5 and 3.5, as sometimes feels like the case with the ratings here (I made that stat up; don't chew me out if it's not true).

I'll add that I wasn't thinking about this being a fair way to shake out the top 10 or 25 but more of a way of getting better differentiation in the middle and reducing outlier high (or low) rankings.
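To make the mechanism being debated concrete, here's a rough sketch of one way sparse "Course A or Course B?" picks could become a ranking, and then ratings. It uses a simple Elo-style update, which is just one of several possible methods (Bradley-Terry fitting is another); the course names and all parameters are made up for illustration.

```python
# Hypothetical sketch: derive a ranking from sparse head-to-head picks
# via Elo-style updates, then map rank onto the site's 0-5 disc scale.
from collections import defaultdict

def elo_rank(picks, k=32, rounds=20):
    """picks: list of (winner, loser) pairs from 'A or B?' choices."""
    rating = defaultdict(lambda: 1500.0)
    for _ in range(rounds):  # replay the picks repeatedly to converge
        for winner, loser in picks:
            expected = 1.0 / (1.0 + 10 ** ((rating[loser] - rating[winner]) / 400))
            rating[winner] += k * (1 - expected)
            rating[loser] -= k * (1 - expected)
    return sorted(rating, key=rating.get, reverse=True)

def rank_to_discs(ranked):
    """Map ordinal rank onto a 0-5 scale: best course 5.0, worst 0.0."""
    n = len(ranked)
    if n < 2:
        return {c: 5.0 for c in ranked}
    return {c: round(5.0 * (n - 1 - i) / (n - 1), 2) for i, c in enumerate(ranked)}

# Nobody compared every pair here, yet the shared courses connect the
# comparison graph, so a full relative ordering still falls out.
picks = [("Maple", "Oak"), ("Oak", "Pine"), ("Maple", "Pine"),
         ("Pine", "Elm"), ("Oak", "Elm")]
ranked = elo_rank(picks)
print(rank_to_discs(ranked))  # Maple highest at 5.0, Elm lowest at 0.0
```

This also shows the rank-to-rating assumption the posts above are arguing about: the top course is forced to 5.0 and the bottom to 0.0, whether or not the real-world quality spread looks like that.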

I've advocated (dreamed of) a feature where users can parse the reviews with whatever options they want, and create customized Top-10 lists. I gather it's more work than I imagine.
That would be really nice. Personally, I would put natural beauty and peacefulness as high priorities and condition and amenities not so much.

Discuss what is par for your next post. Your topic has been discussed into the ground since essentially day one of the site, and nothing has changed. That's all.
I don't care about "what is par?", thank you very much. This, however, interests me. If it doesn't interest you then why are you responding to it?
 
I don't care about "what is par?", thank you very much. This, however, interests me. If it doesn't interest you then why are you responding to it?

You are missing the point. It has been discussed ad nauseam since the site was created. You can see the countless threads and posts on this subject. Clearly you didn't search the forums for these topics, or you wouldn't have been the latest person to propose a new method for weighted reviews.

You're trying to apply objective points to a subjective system, which is why it's all for naught. Even for the people who attempt to be objective with their rating criteria, there's a level of subjectivity involved. What about the person who had a bad day on the course and decides to give it a 0 because they're in a foul mood? What about the person who gives a pitch-n-putt course a 4 or 5 because it was easy for their 8-year-old? Well, that just happened in the past week. And under weighted reviews, a poor review like this would have more influence.

It's the same issue when people argue a trusted reviewer's rating should carry more weight. Spoiler: it shouldn't. Why should my review carry any more or less weight than yours because I've reviewed & played 300+ courses and you haven't reviewed any? Why should my review carry more or less weight than yours based upon when we've each played/reviewed the course?

You can try to keep proving a point, but Tim has made it clear there isn't going to be a change to the system. The closest he's come to listening to an idea is to have older reviews age off. Even that has been shot down simply because the site doesn't get enough reviews. Look at the list of top rated courses in 2018 versus the overall list of top rated courses. There's a lot of overlap showing that great courses generally remain great. It also shows that the number of reviews per year is relatively small.

Enjoy the site for what it offers. It's done pretty good for itself these last 11 years.
 
Fair points but with this system you don't have to come up with a ranking up front. Everything is "course A or course B?" I was taught this as an easier way to rank things you're struggling to rank. You'll end up with a ranking but you don't have to start out with one. I don't have a great answer to your very good point about possibly not remembering some courses well enough. Maybe a 'Skip' button could be added for the comparisons you don't feel you have a fair answer for.

I have my own ranked list, started when I began. I only have about 130 courses played, but it's very hard to figure where a new course falls on it---as soon as I try to insert a new course somewhere, I see that I no longer agree with the earlier rankings, or don't agree with them at the moment. I've kept it diligently and ought to just quit.

I just finished playing with a guy who's played 460+ courses. I can't imagine him doing it... nor a guy I know who just played his 1,700th. Try scrolling through that list to figure out where your most recent 3.0 might fit.

Too much trouble. But I can toss the 3.0 in with all the other 3.0s, pretty easily.
 
You are missing the point. It has been discussed ad nauseam since the site was created. You can see the countless threads and posts on this subject.
So what? It's pretty unreasonable to expect something so central to a site not to be continually discussed on its forums. If you don't want to read another post about it, don't read it. But if you do, don't be a jerk to the poster.

Clearly you didn't search the forums for these topics, or you wouldn't have been the latest person to propose a new method for weighted reviews.
And clearly you didn't read my idea but felt the need to jump on me for posting it anyway. There's nothing weighted about it. I did do a search. I didn't see anything like my suggestion. Apologies if it's out there and I missed it.

You're trying to apply objective points to a subjective system, which is why it's all for naught.
Again, clearly you didn't actually read what I said. This is not at all what I'm trying to do. It's based on "which course do you pick" which is about as subjective as it gets.

Even for the people who attempt to be objective with their rating criteria, there's a level of subjectivity involved. What about the person who had a bad day on the course and decides to give it a 0 because they're in a foul mood? What about the person who gives a pitch-n-putt course a 4 or 5 because it was easy for their 8-year-old? Well, that just happened in the past week. And under weighted reviews, a poor review like this would have more influence.

It's the same issue when people argue a trusted reviewer's rating should carry more weight. Spoiler: it shouldn't. Why should my review carry any more or less weight than yours because I've reviewed & played 300+ courses and you haven't reviewed any? Why should my review carry more or less weight than yours based upon when we've each played/reviewed the course?
I totally agree with all this...and I even threw out the idea of not requiring a written review to rate courses to encourage *more* people to rate them.

You can try to keep proving a point, but Tim has made it clear there isn't going to be a change to the system. The closest he's come to listening to an idea is to have older reviews age off. Even that has been shot down simply because the site doesn't get enough reviews.
Again, so what? I had maybe, oh, somewhere around zero expectation the idea would actually be adopted. It's just fun to think about and discuss....or at least I thought it would be, jeez.

Enjoy the site for what it offers. It's done pretty good for itself these last 11 years.
I do and it has.
 
I have my own ranked list, started when I began. I only have about 130 courses played, but it's very hard to figure where a new course falls on it---as soon as I try to insert a new course somewhere, I see that I no longer agree with the earlier rankings, or don't agree with them at the moment. I've kept it diligently and ought to just quit.
That's one of the beauties of this system: you don't have to think that hard about it. You don't need to consider where a course fits in at all. It would just ask you "Course A or Course B?", "Course A or Course C?", "Course A or Course D?" and so on.
Even if you "make a mistake" and pick a "lesser" course, it comes out in the wash somewhat because of the connections to other courses through other comparisons. And you could just skip comparisons you didn't want to, or didn't know how to, answer.

I just finished playing with a guy who's played 460+ courses. I can't imagine him doing it....nor a guy I know who just played his 1,700th. Try scrolling through that list to figure out where your most recent 3.0 might fit.
Now that for sure is one of the uglies about this system. If you're starting out fresh with 100 courses to compare, that's 4,950 individual A-to-B comparisons. (1,600 courses is over a million comparisons!) The only answer I've got for that is that I don't think raters would need to be completist with their comparisons for a solid network to be built up and reasonable results to come out.
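The arithmetic above checks out: an exhaustive head-to-head pass over n courses needs "n choose 2" comparisons, which grows quadratically with the number of courses played.

```python
# Verify the post's comparison counts: n choose 2 pairwise matchups.
from math import comb

for n in (100, 1600):
    print(n, "courses ->", comb(n, 2), "pairwise comparisons")
# 100 courses need 4950 comparisons; 1600 need 1,279,200 (over a million)
```

This quadratic growth is exactly why any real system would have to rely on sparse comparisons plus the connectivity of the comparison graph rather than an exhaustive pass.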
 
IMHO, the opinions of players who've played more courses, particularly in different parts of the country, are much more meaningful than the opinions of players who've played relatively few, or only in a handful of areas... simply because the more courses you play away from home, the better perspective you get on where courses might actually fall in terms of ranking. You gain a perspective you simply can't get by playing the 100 or so courses in your area, or even in your state.

If I correctly understand your proposal, someone like myself, with 300 courses under my belt, or my course-bagging buddy with 700+ courses, would have to perform a few hundred "A vs. B" head-to-head comparisons for every single new course we choose to play???

If so, you're sorely mistaken if you think that's gonna happen.
And that means you'd be leaving the most valued opinions (again, IMHO) … out of your data set.


I'd just prefer to write a review, because when I have the time, I actually enjoy writing them.
Count me out.
 
IMHO, the opinions of players who've played more courses, particularly in different parts of the country, are much more meaningful than the opinions of players who've played relatively few, or only in a handful of areas....

I can't speak for others, but for me this is 100% on point.

I'm SO glad I didn't jump into writing course reviews when I first signed up here. Now that I have a few more courses under my belt, I can see that reviews I would have written back then would now be way off base.

I never set a goal for courses played before starting to write reviews, but I think 100 would be a good number for me.
 
So what? It's pretty unreasonable to expect something so central to a site not to be continually discussed on its forums. If you don't want to read another post about it, don't read it. But if you do, don't be a jerk to the poster.

And clearly you didn't read my idea but felt the need to jump on me for posting it anyway. There's nothing weighted about it. I did do a search. I didn't see anything like my suggestion. Apologies if it's out there and I missed it.

Again, clearly you didn't actually read what I said. This is not at all what I'm trying to do. It's based on "which course do you pick" which is about as subjective as it gets.

I totally agree with all this...and I even threw out the idea of not requiring a written review to rate courses to encourage *more* people to rate them.

Again, so what? I had maybe, oh, somewhere around zero expectation the idea would actually be adopted. It's just fun to think about and discuss....or at least I thought it would be, jeez.

I do and it has.

Throwing out some new ideas? Not a dang thing wrong with it. Marking the courses you have played as having been played? Nothing wrong with that either.:popcorn:
 