
Trusted Reviewer - Groupthink

Which TR do you trust the most? (more than one choice allowed)


Total voters: 66
TLDR: I think there are some major variables that your table of numbers misses. That said, it's definitely interesting stuff and a different look at the data than anyone's done before, thanks!

Same. I'm not sure I feel like I can empirically ascertain some "trustiness" from this data per se. I'm neither for nor against agreeing with the average rating; I don't think you can definitively say that one is good or bad. I do appreciate reviewers who are more open to using the whole scale, though.
 
Same. I'm not sure I feel like I can empirically ascertain some "trustiness" from this data per se. I'm neither for nor against agreeing with the average rating; I don't think you can definitively say that one is good or bad. I do appreciate reviewers who are more open to using the whole scale, though.

I agree. If I see that someone never gives out 0s or never gives out 5s, their ratings mean less to me.
 
It's a little difficult to actually draw conclusions about why a reviewer agrees with the consensus or not. ...

It can also make a difference what kind of courses you tend to play. ...

TLDR: I think there are some major variables that your table of numbers misses.

That said, it's definitely interesting stuff and a different look at the data than anyone's done before, thanks!

^2nd these emotions

Smyith said:
But I digress and let's focus on the intent of the thread.

Agreed. :)
 
I'm not sure I can look at a set of data like yours and determine "reviewer reliability." It's not about whether I think someone has "stuck their neck out" or rates higher or lower than everyone else. For me, the valuable knowledge is contained in the narrative, not the numbers. I don't care what rating people assign each course so much as what they have to say.

A reliable reviewer passes on information. Generally speaking, it's easy to tell if someone is full of **** or not. I might not agree with everyone's rating or review, but if they pass on knowledge of any sort in their narrative, then it is of use to me and therefore a worthy review. If I think someone's full of **** I regard their reviews as unreliable.

A reliable reviewer's statistics should not be compared with an unreliable general public anyway, right? :popcorn:

You sum up several thoughts repeated throughout this thread that I think miss the point a little bit (even though they are perfectly good/valid points).

This is entirely about Trusted Reviewers' ratings, not their reviews. Maybe the question the stats are addressing is "Are TRs' ratings to be trusted any more than the DGCR average ratings?"

Like you say, a big part of the review process is about passing on information... and the rating is the most succinct summary of information this site allows you to pass on.

Here is the composite of these top 20 most-published TRs' ratings of the best courses... a profile of the TRs' contributions compared to the DGCR average:

Code:
Within   0.25-   0.50-   0.75-   Over    Over    4.0+
0.24     0.49    0.74    0.99    0.99    Rate    Courses
51%      30%     14%     4%              0.15    44

What conclusions can we draw from this?
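
For anyone wanting to reproduce a table like this, here is a minimal sketch of how the deviation buckets and the "Over Rate" column could be computed. The (TR rating, DGCR average) pairs are invented placeholders, not real data, and this is one reading of the columns, not necessarily the OP's exact method:

Code:
# Bucket a TR's ratings by how far they sit from the DGCR average,
# and compute "Over Rate" as the mean signed gap (positive = rates high).
# The pairs below are made up for illustration.
pairs = [(4.5, 4.31), (4.0, 4.62), (5.0, 4.18)]

buckets = {"0.00-0.24": 0, "0.25-0.49": 0, "0.50-0.74": 0,
           "0.75-0.99": 0, "1.00+": 0}
for tr, avg in pairs:
    d = abs(tr - avg)
    if d < 0.25:   buckets["0.00-0.24"] += 1
    elif d < 0.50: buckets["0.25-0.49"] += 1
    elif d < 0.75: buckets["0.50-0.74"] += 1
    elif d < 1.00: buckets["0.75-0.99"] += 1
    else:          buckets["1.00+"] += 1

n = len(pairs)
for name, count in buckets.items():
    print(f"{name}: {100 * count / n:.0f}%")
print(f"Over Rate: {sum(tr - avg for tr, avg in pairs) / n:+.2f}")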
 
TLDR: I think there are some major variables that your table of numbers misses. That said, it's definitely interesting stuff and a different look at the data than anyone's done before, thanks!

Could you expand on this? I am curious about your thoughts.

2nd.

Other stats I can think of that might be useful:

Average Rating - ideally this should be around 2.5, the middle of the rating scale (see the sketch after this list). But there are problems with that. For instance, if travelers only hit highly rated courses, their ratings will be skewed high. Or if they only choose to review courses with low review counts, most of those will be lower-rated courses.

Player rating - perspectives differ across skill levels. If people were truly rating for the entire community, or for the intended player of a given course, elementary-school courses would all be rated much higher.

Years played -

Age -

BMI -
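
For the "Average Rating" idea above, a quick sketch of the midpoint check, assuming you can gather a reviewer's ratings into a list (the numbers here are invented):

Code:
# Compare a reviewer's mean rating to 2.5, the midpoint of the 0-5 scale.
# A big positive gap suggests either generosity or a sample of top courses.
ratings = [2.0, 3.5, 4.0, 1.5, 3.0, 2.5]  # hypothetical review history

mean = sum(ratings) / len(ratings)
print(f"mean = {mean:.2f}, skew vs. midpoint = {mean - 2.5:+.2f}")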
 
A couple of the things you list there are things I mentioned above. I'd also consider how widely traveled the reviewer is (are they comparing courses in their own state/region or across the country?). I think years played is another big one; it's been my experience that players who were playing in the '70s have a very different perspective on course design than players who started 10, 5, or 2 years ago (with, admittedly, many exceptions).
 
I picked R just because of column 7, most under the average. I like reviews to be critical. That's why Denny Ritner is one of my favorite reviewers.
 
It just occurred to me that the "Over Rate" stat is incomplete/inaccurate/meaningless. What I did was look at 4.0-rated courses for each top-20 TR. So if they rated any course below 4.0 that had a DGCR average over 4.0, that rating was not included.

So maybe they have slammed a ton of 4.0+ rated courses... but that does not show up here. I doubt it is the case, based on another analysis I am working on right now and will post. But take that number with a big ole grain of salt.
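
For what it's worth, here is a sketch of the unbiased version of that stat: the mean signed gap over every course the reviewer has rated, not just courses whose DGCR average is 4.0+. The rating pairs are placeholders, not real data:

Code:
# Unbiased "Over Rate": average signed gap across ALL of a reviewer's
# rating pairs, so low ratings of highly rated courses count too.
all_pairs = [(3.0, 4.1), (4.5, 4.3), (2.5, 2.2), (3.5, 3.6)]  # hypothetical

over_rate = sum(tr - avg for tr, avg in all_pairs) / len(all_pairs)
print(f"{over_rate:+.2f}")  # negative means the reviewer rates under DGCR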
 
I vote for K. :| :| :| :| :|
 
It just occurred to me that the "Over Rate" stat is incomplete/inaccurate/meaningless. What I did was look at 4.0-rated courses for each top-20 TR. So if they rated any course below 4.0 that had a DGCR average over 4.0, that rating was not included.

So maybe they have slammed a ton of 4.0+ rated courses... but that does not show up here. I doubt it is the case, based on another analysis I am working on right now and will post. But take that number with a big ole grain of salt.

Yup. Still, interesting thread.
 
It just occurred to me that the "Over Rate" stat is incomplete/inaccurate/meaningless. What I did was look at 4.0-rated courses for each top-20 TR. So if they rated any course below 4.0 that had a DGCR average over 4.0, that rating was not included.

So maybe they have slammed a ton of 4.0+ rated courses... but that does not show up here. I doubt it is the case, based on another analysis I am working on right now and will post. But take that number with a big ole grain of salt.

You might be surprised; I know I've given at least a handful of 3-3.5 ratings to 4.0-rated courses, and I bet that's true of several of the folks on that list.
 
I like reviews to be critical. That's why Denny Ritner is one of my favorite reviewers.

I would have to say he is my personal "Most Trusted Favoritest." I will include him in my next analysis with his name (and everyone else's), since the clever ones like Jukeshoe are figuring out who is who on the list. :thmbup:
 
Personally, Mashnut is my most trusted reviewer. He and I always seem to jibe pretty closely in our reviews. Since he and I have played plenty of rounds together throughout the Midwest, we've had many occasions to note the similarity in the substance of our reviews. The styles definitely differ, though. :D
 
I just went back to your OP again, and I'm not sure where any of the over/under rating data comes from. Did you just look at the 4+ courses? (I'm assuming you didn't compare a few thousand individual ratings against course averages.) If so, it's even less useful than I thought. I would be really interested in that stat if it took into account all ratings by each reviewer.
 
I'm firmly of the belief that there's no magical portal DGCR reviewers pass through once they have a medal by their username that makes them better than the rest. The groupthink problem here runs top to bottom. Everyone here is guilty of it to a degree, even me. It's the result of a flaw in our psychology, and one of the very things we come to this website for in a way contributes to it.

We want to play epic courses. Here is a place where we learn about such courses. We hear how great they are from others who preceded us. We plan a trip to these places and get excited about going there. We invest time and money to make it happen. When we get there, we're so jacked up about all of our plans coming to fruition that we perhaps turn our critical senses off and overlook flaws. You know, the same critical senses that were working just fine when we stopped to play that nameless 3.25-rated par 54 on the way to stretch our legs.

Then we go give the destination course another rubber-stamp 5. Was it really that good? Oh yeah, of course it was. Several other people who went before you said it was, didn't they?

This is why I like the rebel reviewer who goes to these places and is willing to dock a disc for things we would probably all dock for if the course didn't have so much preconceived hype and constant homepage attention on here. Once he's willing to go against convention and make that red mark, it makes it a little easier for others to come in and make more.
 
Trusted reviewers are like movie reviewers. You find one whose reviews jibe with yours and trust them the most. How they review relative to other people matters little.

I have always maintained this.

When introducing people to this site, I tell them to read reviews of courses they are familiar with and find a few reviewers who seem to share their point of view and DG sensibilities. Then read their reviews of courses they haven't played to get some idea of the course, if that's what they're looking for.
 
I'm firmly of the belief that there's no magical portal DGCR reviewers pass through once they have a medal by their username that makes them better than the rest. The groupthink problem here runs top to bottom. Everyone here is guilty of it to a degree, even me. It's the result of a flaw in our psychology, and one of the very things we come to this website for in a way contributes to it.

We want to play epic courses. Here is a place where we learn about such courses. We hear how great they are from others who preceded us. We plan a trip to these places and get excited about going there. We invest time and money to make it happen. When we get there, we're so jacked up about all of our plans coming to fruition that we perhaps turn our critical senses off and overlook flaws. You know, the same critical senses that were working just fine when we stopped to play that nameless 3.25-rated par 54 on the way to stretch our legs.

Then we go give the destination course another rubber-stamp 5. Was it really that good? Oh yeah, of course it was. Several other people who went before you said it was, didn't they?

This is why I like the rebel reviewer who goes to these places and is willing to dock a disc for things we would probably all dock for if the course didn't have so much preconceived hype and constant homepage attention on here. Once he's willing to go against convention and make that red mark, it makes it a little easier for others to come in and make more.

Are we really all that nutsack-less? Really? God, I know I'm not, and I hope the rest of the TRs aren't. I call 'em like I see 'em, and I don't give a damn for ratings or top ten or anything else. I just want to play quality golf, write a funny review, and maybe help someone out. If my review jibes with what 99 other people say, great. If not, well, screw 'em! My opinion is my opinion. If people are hung up about getting a thumbs-down because their opinions are different than others', then man, we're in trouble as a society.
 
I voted R.

I generally think most reviewers are too generous with their ratings, so being under average is a plus (although 35 4+ ratings makes me wonder...). It would have been interesting to see the reviewers' median review rating, as in the quick sketch below.
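
A median, as suggested above, is robust to a handful of extreme ratings, so it separates habitual generosity from a few gushing outliers. A tiny sketch with invented numbers:

Code:
# Median vs. mean of a reviewer's ratings; a mean well above the median
# hints that a few very high ratings are dragging the average up.
# The ratings are hypothetical.
import statistics

ratings = [2.0, 2.5, 2.5, 3.0, 3.0, 5.0]
print(statistics.mean(ratings), statistics.median(ratings))  # 3.0 vs 2.75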
 
I trust mashnut.

The biggest reason is that I have played with him and we have similar ratings/skills (although he's got more distance than I do).

We do have differences of opinion, but before I play a course I haven't played, I read his thoughts. I'm sure some of the other trusted reviewers are good at what they do, but I just like to stick with the opinions of people I can relate to... nothing personal to anyone else.

mashnut has good reviews, and rocthecourse usually has good course-condition updates for around here.
 
atl scott - The # of 4+ ratings is irrelevant without an overall number of reviews. An example: one reviewer has 118 ratings of 4 and above, which sounds like a lot, but that's only 17% of that reviewer's total reviews, which doesn't sound like a lot (see the quick check below).
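
Spelling out that arithmetic: if 118 ratings are 17% of the whole, the reviewer has roughly 118 / 0.17, about 694 total reviews (the 694 is back-calculated here, not a figure from the thread):

Code:
# Share of 4.0+ ratings in a reviewer's history, per the point above.
# 694 is inferred from 118 being 17% of the total, not a quoted number.
high, total = 118, 694
print(f"{100 * high / total:.0f}% of reviews are 4.0+")  # prints 17%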
 
