
Up and Coming Good Reviewers

Funny enough, the reviews are one section of this place that I rarely pay attention to. Have never reviewed or reacted to a review. Have no clue what courses are in the top ten. Know nothing about diamond reviewers or any of that stuff. Not trying to be a hater or anything, it just has never been a part of my DGCR experience and I've been here for going on ten years.
 

This begs the question... what made you check out a site about disc golf course reviews then? :)
 

Been playing a long time and at the time I joined, this was one of the few sites I could safely look at while at work. I don't use reddit, barely use FB, had to pass the time somehow.
 
Some come here for the forums, and don't pay much attention to reviews. Some come for the reviews, and don't pay much attention to the forums. Some value both. Everyone gets to choose for themselves what they do (or don't) care about.
 
Flip side of the coin. At some point it was changed from 200 to 250 unique votes, right? Not sure when that was, but that's A LOT more uniques required for diamond. But I agree, it's a different era now. I just don't know that there are that many more people on here, if not fewer than years ago. You know, with UDisc being a thing.

i don't remember about the uniques changing. it's definitely a huge difference.
UDisc is a different kind of community. i feel like there are noticeably more people around here, in general and more specifically since the pandemic started.


I get more thumbs up for my reviews now since I've been doing it for so long. But you know as well as I do that it's a bunch of the same people giving those thumbs. Just my two cents, but I will say that I feel like there has been an uptick in the number of quality reviewers recently.

for sure, on both counts
 
I genuinely think that this thread and the Diamond thread have at least a little to do with more votes.

I, for example, typically just read reviews of courses that I'm considering playing. I generally don't just browse reviews. But I have read and thumbed several reviews as a direct result of these threads.

Agreed, these threads have made me consider others' reviews more. They've made me take the time to read through a reviewer's other reviews, or other reviews of a course, and give upvotes where it seems the effort has been put in to paint an accurate picture of the course.

My downvote level is low and will generally be reserved for an absolute homer or obvious hater review which unfairly skews a course's rating. Fortunately I've not come across too many of these.
 
it is very noticeable how much more quickly reviewers are accumulating thumbs now compared to the earlier years of the site.

anyway, point being that there were a lot of old timers here who slogged for years to get to silver or gold. it's a different era now.

Funny, I'd always thought it was the opposite. I thought there used to be a wider variety of users on the site, making it easier to get the uniques as soon as you accumulated the required number of thumbs. It seems as if a core group (as knobby mentioned in his post) gives most of the thumbs recently, leaving reviewers short on uniques.

I did run some data out of curiosity. Here is the average number of thumbs on the first 30 reviews from some TRs - the first 5 diamond reviewers and then a selection of newer reviewers (2019 or more recent).

TVK - 10.4 thumbs per review
mashnut - 12.3 tpr
harr0140 - 11.0 tpr
optidiscic - 13.3 tpr
ERicJ - 14.2 tpr

lee76007 - 8.9 tpr
knobby325 - 3.8 tpr
Shadrach3 - 7.7 tpr
kp_1024 - 7.9 tpr
DFrah - 7.0 tpr

Of course, this could be badly skewed if enough people voted for the old reviews in the years in between.
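For anyone curious, the tally itself was nothing fancy - just an average over each reviewer's first 30 reviews. A minimal sketch of the math is below; the reviewer names and thumb counts in it are made up for illustration, not real site data.

Code:
# Rough sketch of a thumbs-per-review tally.
# All names and counts below are hypothetical, for illustration only.

def avg_thumbs(thumb_counts, first_n=30):
    """Average thumbs over a reviewer's first `first_n` reviews."""
    sample = thumb_counts[:first_n]
    return sum(sample) / len(sample)

# hypothetical per-review thumb counts, oldest review first
reviewers = {
    "ExampleReviewerA": [12, 9, 15, 11, 8, 14],
    "ExampleReviewerB": [4, 6, 3, 7, 5, 2],
}

for name, counts in reviewers.items():
    print(f"{name} - {avg_thumbs(counts):.1f} tpr")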
 
Git off my lawn!!!

Keep in mind that it's unrealistic to expect to average as many thumbs per review as long-timers. While most thumbs are given soon after the review is written, many of our reviews have been around for years, slowly accumulating an extra thumb or two.

Plus, long-time reviewers are simply better established on DGCR, and generally have larger followings as a result.
 
Of course, this could be badly skewed if enough people voted for the old reviews in the years in between.

I wasn't a member in the early years of DGCR, but i believe the very early reviews got a lot of thumb action, plus those reviews have sat there the longest too. However, those who joined after 2012 or so have gotten significantly fewer. Reviewers like pmay5 and mrbro885 didn't hit 100 thumbs up until they had over 70 reviews, and their reviews were of high quality. It personally took me 59 reviews to hit 100 and 102 reviews to make it to 350 thumbs up.

i think being able to see who voted on reviews has made a substantial impact on the increase in voting in the last couple of years. This feature upgrade occurred in early 2019 for site supporters.
 
While most thumbs are given soon after the review is written, many of our reviews have been around for years, slowly accumulating an extra thumb or two.

Yeah, unfortunately that makes the data a little bit useless...but perhaps still fun!
 
I wasn't a member in the early years of DGCR, but i believe the very early reviews got a lot of thumb action, plus those reviews have sat there the longest too. However, those who joined after 2012 or so have gotten significantly fewer.

This makes a lot of sense to me. Thanks!

i think being able to see who voted on reviews has made a substantial impact on the increase in voting in the last couple of years.

Just because of a tit-for-tat rule? Like, "Oh, I see you voted for me so I'll vote for you more often?"
 
Funny, I'd always thought it was the opposite. I thought there used to be a wider variety of users on the site, making it easier to get the uniques as soon as you accumulated the required number of thumbs. It seems as if a core group (as knobby mentioned in his post) gives most of the thumbs recently, leaving reviewers short on uniques.

Reviews get more votes these days with the majority coming from the same 10-15 people.

Look at my reviews, all 400+. There are dozens and dozens of reviews written 5-10 years ago that have 6 or fewer positive votes. Of the last 20 reviews I've written (practice courses excluded), only 2 have fewer than 10 positive votes.

One counter point is of there are less reviews being written, reviews have a longer shelf life on the main page. That generates more chances for votes as well.
 
Just because of a tit-for-tat rule? Like, "Oh, I see you voted for me so I'll vote for you more often?"

maybe? sort-of? idk
i haven't studied psychology, but i think it's more complex than that. several factors are probably at work.
 

If I see an unfamiliar name vote on one of my (few) reviews, I will probably check out some of their reviews. If the reviews are good, I'll give them a thumb. I have learned about some good reviewers this way, which is helpful when planning road trips to unfamiliar areas.

So yeah, a bit more complex than tit-for-tat I think.
 

I took a tip from wellsbranch250 several years ago and started reading and voting on reviews (before that I only read reviews of courses I knew or was considering playing). I took a slightly different approach from most people, however. I focused on the non-trusted reviewers. My theory was that we needed to encourage the newer reviewers, so I gave them as many thumbs up as I reasonably could in hopes of motivating them to do more and perhaps join the TR ranks. Recently I realized that I was doing a disservice to TRs, so I have tried to make sure I read and give thumbs to those reviews as well.

I have also noticed an increase in my vote totals recently. My first 90 or so reviews average about 6 votes per review. My past 20 or so are averaging almost 12 votes.
 
I took a tip from wellsbranch250 several years ago and started reading and voting on reviews (before that I only read reviews of courses I knew or was considering playing).


I try to play the best courses in the areas I travel to and I have used this site to find them for many, many years. Thank you to all of you who have invested so much time writing reviews over the years!

Until I started reading more of the forums last year, I didn't know what a TR was and I didn't see any reason to write a review for a course that already had several reviews. You all have helped me to understand more ways the site can be useful and I have started trying to vote more and write more.
 
Reviews get more votes these days with the majority coming from the same 10-15 people.

Look at my reviews, all 400+. There are dozens and dozens of reviews written 5-10 years ago that have 6 or fewer positive votes. Of the last 20 reviews I've written (practice courses excluded), only 2 have fewer than 10 positive votes.

One counter point is of there are less reviews being written, reviews have a longer shelf life on the main page. That generates more chances for votes as well.

is that supposed to say "if"?

i was thinking, timg used to do an annual update thread in January where he included the list of the top 10 according to reviews from that year only. it would also make note of how many courses were listed on the site at that time. i can't remember if it ever made note of how many reviews were written that year. it would be an interesting stat and germane to this conversation.
 
I'd have a hard time giving a thumbs up to a reviewer playing way over par. Knobby played and reviewed Jones Park in CR, and his course scores are over average, over par, over mine, and I suck. I'm like plus 4-5 and that's an easy course. So take that into consideration when contemplating reviews.
 

Liking or disliking a review based on the reviewer's skill level, in comparison to yours, seems incongruent. If we have a couple of 1000-rated posters here, would you anticipate them giving thumbs down to nearly all reviews? :confused:

I think I seek out reviewers that value the same things in a course that I do.
 
I'd have a hard time giving a thumbs up to a reviewer playing way over par. Knobby played and reviewed Jones Park in CR, and his course scores are over average, over par, over mine, and I suck. I'm like plus 4-5 and that's an easy course. So take that into consideration when contemplating reviews.

Does that mean that if James Conrad writes a review that says 'I liked it' or 'I didn't like this course', it would hold more value than the average disc golfer (myself included) writing a long breakdown of said course? Advertisers must like you if you're easily swayed simply by big names.
 
