
Trusted Reviewer - Groupthink

Which TR do you trust the most? (more than one choice allowed)


• Total voters: 66
But really, these numbers don't define a good review for me. The best reviewers are honest, critical yet pleasant, informative, and sometimes humorous.

I also like when people rate well for overall course design and basic amenities, but dislike it when people mark down for less important things like "front 9 doesn't end near the parking lot." It's also good to hear whether someone simply had fun while playing, as that's always important to me.

So while I think numbers and tendencies are part of it, common sense and writing skills play a bigger role. I can often get a good idea of how I will enjoy a course from a well-written review even if I think the rating is way off.
 
atl scott - The # of 4+ ratings is irrelevant without an overall number of reviews. An example: one reviewer has 118 ratings of 4 and above, which sounds like a lot, but that's only 17% of that reviewer's total reviews, which doesn't sound like a lot.

There's no way I would rate 17% of the courses I've played 4+. Most of the hundred and fifty or so courses I've played have been relatively mediocre. My ratings (if I did more) would probably look more like a bell curve.
 
There's no way I would rate 17% of the courses I've played 4+. Most of the hundred and fifty or so courses I've played have been relatively mediocre. My ratings (if I did more) would probably look more like a bell curve.

I agree to some extent, which is why that reviewer (me) gave only 8 of those 118 a 5 rating, and only 28 a 4.5 with the other 82 coming in at 4.
 
I just went back to your OP again and I'm not sure where any of the over/under rating data comes from. Did you just look at the 4+ courses? (I'm assuming you didn't compare a few thousand individual ratings against course averages.) If so, it's even less useful than I thought. I would be really interested in that stat if it took into account all ratings by each reviewer.

I looked at every single review rated 4.0 and higher of the top 20 most published TRs. I compared each of those to the average rating of each course. So, I assume your assumption is wrong. Yes, it did take quite a bit of effort.
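For anyone curious what that legwork looks like mechanically, here's a rough sketch of the comparison (purely illustrative; the reviewer names, course names, and averages below are made up, and this is just my reading of the method, not Dave's actual spreadsheet):

Code:
# Illustrative sketch only: for each 4.0+ review by a TR, compare the TR's
# rating to the course's DGCR average. All data below is invented.

course_averages = {
    "Course A": 4.37,
    "Course B": 4.62,
    "Course C": 4.05,
}

tr_reviews = [
    # (reviewer, course, rating) -- only 4.0+ reviews were part of the analysis
    ("reviewer_1", "Course A", 4.5),
    ("reviewer_1", "Course B", 4.5),
    ("reviewer_2", "Course C", 4.0),
]

for reviewer, course, rating in tr_reviews:
    diff = rating - course_averages[course]
    print(f"{reviewer}: rated {course} {rating}, DGCR avg {course_averages[course]:.2f} ({diff:+.2f})")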
 
Personally, Mashnut is my most trusted reviewer. He and I seem to always jibe pretty closely in our reviews. Since he and I have played plenty of rounds together throughout the midwest, we've had many occasions to note the similarity in the substance of our reviews. The styles definitely differ, though. :D
First off: Congrats on making Diamond Juke! Well done. :clap:

Mashie's reviews are objective, pretty even keeled and kind of "just the facts, ma'am." Whereas Juke's have a certain color and flair I find enjoyable, while still being quite fact based. Like his review on Boyd Hill: Ghetto-y Goodness :thmbup:

I've found Juke's ratings are usually close to mine for courses we've played in common (even those I haven't actually written reviews for, like Lemon Lake, Boyd Hill, Oshtemo, West Pake, Highland Park). I know how I'd rate them, even if I haven't put my thoughts down on paper. As such, I tend to trust his reviews, because he seems to value certain things about the same as I do.
 
I looked at every single review rated 4.0 and higher of the top 20 most published TRs. I compared each of those to the average rating of each course. So, I assume your assumption is wrong. Yes, it did take quite a bit of effort.

Gotcha, I still think that only looking at the 4+ reviews (while making the task manageable) misses a lot of the point on whether reviewers over- or under-rate courses. I find myself thinking about inflated ratings a lot more with courses rated 2-3.5 than I do about the 4+ rated ones (excepting some with <5 reviews).
 
I agree to some extent, which is why that reviewer (me) gave only 8 of those 118 a 5 rating, and only 28 a 4.5 with the other 82 coming in at 4.

Good explanation. I like the point you made earlier in the thread about course selection. Whenever I travel I also like to pick the highest rated courses to play and this could easily skew the bell curve since you are playing the best every region has to offer.
 
I think that these statistics are useless when looking for the best reviewer. Whether someone goes against the grain or toes the line of conformity on ratings doesn't make them a good or bad reviewer. Content is key.

Statistics and numbers are not always the answer.
 
Good explanation. I like the point you made earlier in the thread about course selection. Whenever I travel I also like to pick the highest rated courses to play and this could easily skew the bell curve since you are playing the best every region has to offer.

Yup, again I'll use myself as an example because I haven't done the legwork Dave is working on; if you look at my distribution I have a ton of 3-4 rated courses, and that's definitely due to me seeking out the 3.5+ courses.
 
HA!!! @ Boyd Hill! :D

Of all my reviews.. :D
 
I can totally see how some reviewers end up with a disproportionate # of 4.5 and 5.0 reviews, if (like myself), you tend to only play the best courses when you travel to an area. I'm not about to use my vacation time to have a "typical discing experience." As long as I can keep hitting great courses, that's where you'll find me when I'm travelling. My average rating may therefore seem "unnaturally" high, and the number of 4.0+ courses reviewed also might seem high relative to the total # of courses reviewed. That's bound to skew things.


QFT:
I think that these statistics are useless when looking for the best reviewer. Whether someone goes against the grain or toes the line of conformity on ratings doesn't make them a good or bad reviewer. Content is key.

Statistics and numbers are not always the answer.
 
I have a ton of 3-4 rated courses, and that's definitely due to me seeking out the 3.5+ courses.

This brings up an interesting quandary. Is it more appropriate to use a general formula for rating a course (if x, y, z, then 3.5) or to determine ratings based only on the selection of courses you have played (best course you have played is a 5, worst is a 0 or .5, and everything else spread somewhere in between)?

I use the 2nd method. Of course if I only played 4+ rated courses on here then many of them would get picked apart and I would give them lower ratings and only the best of the best would get high ratings. But I feel this is the only honest way to rate a course. I can't compare courses I haven't played.

Using this method I have sometimes eventually gone back and lowered ratings for some courses as my library of courses grows/changes.
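If it helps to picture that second method, here's a minimal sketch of the idea (my own interpretation of "best course is a 5, worst is a 0, everything else spread in between"; the rescaling formula is an assumption, not a DGCR rule):

Code:
# Sketch: spread personal impressions of played courses across the 0-5 scale,
# anchoring the best at 5.0 and the worst at 0, rounded to the nearest half disc.
# Re-running this as the library of played courses grows can lower earlier
# ratings, matching the "go back and adjust" habit described above.

def relative_ratings(impressions):
    """impressions: dict of course -> personal score on any consistent scale."""
    lo, hi = min(impressions.values()), max(impressions.values())
    span = (hi - lo) or 1  # guard against dividing by zero
    return {
        course: round((score - lo) / span * 5 * 2) / 2
        for course, score in impressions.items()
    }

print(relative_ratings({"Course A": 9.1, "Course B": 6.4, "Course C": 3.0}))
# -> {'Course A': 5.0, 'Course B': 3.0, 'Course C': 0.0}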

Also Dave, you are a big nerd. Keep up the interesting thread topics.
 
I don't care what your numbers say, my reviews are the best! Mine! mine! MINE! Each one is a Homeric saga with Tolkien-esque attention to detail!* All other reviewers tremble before my mighty prose!

*throws furniture around the room*



*except for the many that I just phoned it in.
 
This data might actually be more instructive. It shows where each TR's ratings land relative to the DGCR average, with the differences grouped in quarter-disc steps:

---------------------------TR's rating Below DGCR's <---------------------------------> TR's rating Above DGCR's

Code:
Top 20 TR	Agree	-1	-0.75	-0.5	-0.25	0	0.25	0.5	0.75	1	1.25	1.5	1.75
AdamH.,.,.	90%	0%	0%	0%	15%	35%	40%	10%	0%	0%	0%	0%	
mashnut,,,	90%	1%	1%	4%	25%	42%	23%	3%	2%				
JR Stengel	90%	0%	0%	5%	30%	30%	30%	5%	0%	0%	0%	0%	
tallpaul..	84%			5%	26%	37%	21%	11%					
srm_520,,,	82%				7%	39%	36%	11%	4%	4%			
GoodDrive,	79%		1%	4%	14%	32%	33%	13%	4%				
Denny Rit.	78%			11%	26%	26%	26%		11%				
harr0140..	77%	0%	0%	0%	10%	33%	34%	19%	2%	2%	0%	0%	
bjreagh,,,	76%	0%	5%	10%	23%	30%	23%	3%	8%	0%	0%	0%	
#19325,.,.	75%				12%	29%	34%	15%	5%	5%			
Jukeshoe.	72%			3%	9%	39%	24%	18%	6%				
DSCJNKY.,.	71%				26%	26%	19%	19%	11%				
sillybizz.	68%			5%	18%	18%	32%	9%	14%	5%			
prerube,,	68%			5%	18%	18%	32%	9%	14%	5%			
ERicJ,.,.,	68%			3%	9%	24%	35%	18%	12%				
The Valkry	65%		2%		20%	25%	20%	31%	2%				
Innovadude	64%			2%	20%	22%	22%	27%	7%				
swatso,,,,	63%	0%	0%	26%	29%	20%	14%	11%	0%	0%	0%	0%	
mndiscg,,,	63%					14%	49%	23%	11%	3%			
optidiscic	62%		3%	8%	31%	18%	13%	21%	5%	3%			
Donovan,,,	35%			2%	2%	9%	24%	27%	24%	7%		2%	2%

"Agree" means that the TR's rating is either matching or as close as it can be to matching without actually matching......with 0.25 breaks, about half of the -0.25 and +0.25 will have equal number of discs and the other half will be only 1/2 disc off of the DGCR average.
 
Very cool, that's much more informative IMO. Is that all reviews by each reviewer this time?
 
Gotcha, I still think that only looking at the 4+ reviews (while making the task manageable) misses a lot of the point on whether reviewers over- or under-rate courses. I find myself thinking about inflated ratings a lot more with courses rated 2-3.5 than I do about the 4+ rated ones (excepting some with <5 reviews).

You make a valid point and I wrestle with that too after I rate and see where my rating ends up stacking up with the DGCR average.

But when you are talking about the 2.5 to 3.5 rated courses, you are in the fat part of the bell curve, and there are so many courses there that nobody pays attention or gets bent out of shape about ratings. On the bottom end, most people do not discuss them here much.

This analysis deals with the courses that most people care most about.
 
Very cool, that's much more informative IMO. Is that all reviews by each reviewer this time?

Woops - did not clarify/define that in my post. No, this is still the same set of data: only courses each TR rated at 4.0 or more, compared against the DGCR average rating (on an individual basis).
 
I don't care what your numbers say, my rounds of golf are the best! Mine! mine! MINE! Each one is like a reality-show saga with Freudian-esque attention to mothers!* All other golfers tremble before (and most often after) my terrible throws!

*throws putters around the room*


*except for the many that I was just drunk.

How's it going Dave? We should get a round in sometime soon.
 
It says that this percentage of reviews are within 1/2 disc of the DGCR average:

Code:
Top 20 TR	w/in 0.5 discs of DGCR Average
AdamH.,.,.	95%
JR Stengel	95%
mashnut,,,	94%
tallpaul..	92%
srm_520,,,	88%
GoodDrive,	88%
harr0140..	87%
Denny Rit.	84%
bjreagh,,,	83%
#19325,.,.	83%
Jukeshoe.	83%
swatso,,,,	82%
The Valkry	81%
DSCJNKY.,.	81%
Innovadude	79%
ERicJ,.,.,	79%
optidiscic	77%
sillybizz.	75%
prerube,,	75%
mndiscg,,,	75%
Donovan,,,	50%
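One way a stat like this could be computed from the same kind of bins (again just my interpretation, not necessarily how the numbers above were produced):

Code:
# Sketch: collapse a bin -> share distribution (like the one from the earlier
# delta_table() sketch) into a single "within 0.5 discs" percentage.
# The distribution below is invented, not taken from the table above.

def within_half_disc(distribution):
    return sum(share for diff, share in distribution.items() if abs(diff) <= 0.5)

example = {-0.5: 0.05, -0.25: 0.20, 0.0: 0.40, 0.25: 0.20, 0.5: 0.05, 0.75: 0.07, 1.0: 0.03}
print(f"{within_half_disc(example):.0%}")   # 90%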

My conclusions are, either:
1) TR's do not stray far from the published average
2) Non-TR's take their cues on rating from TR's (*unlikely)
or
3) TR's ratings are no more meaningful than the DGCR average

*If most courses had lots of TR reviews, this could be a valid conclusion. But I would guess TR's are in the solid minority of reviewers for most courses (especially early on, while a course gets its average built).
 
How's it going Dave? We should get a round in sometime soon.

How dare you change my post to something more accurate?

Uh, pretty good man. Still playing teh disk golf despite my pedestrian skill set. ;)
 