
Trusted Reviewer - Groupthink

Which TR do you trust the most? (more than one choice allowed)


Total voters: 66

Dave242 (Ace Member, Gold level trusted reviewer)
Joined Aug 6, 2007 · Messages: 4,525
Based on the stats, which reviewer below do you trust the most? Why?

Code:
[b]Trusted	Within	0.25 to	0.50 to	0.75 to	Over	Over	4.0+
Rvwer	0.24	0.49	0.74	0.99	0.99	Rate	Courses[/b]
A	49%	37%	7%	7%	0%	0.22	41
B	65%	35%	0%	0%	0%	0.11	20
C	60%	23%	10%	8%	0%	-0.02	40
D	22%	31%	20%	20%	7%	0.50	45
E	56%	30%	11%	4%	0%	0.15	27
F	53%	29%	15%	3%	0%	0.23	34
G	59%	28%	11%	1%	0%	0.10	79
H	56%	31%	10%	3%	0%	0.19	100
I	39%	41%	17%	2%	0%	0.18	41
J	50%	40%	10%	0%	0%	0.00	20
K	61%	24%	15%	0%	0%	0.14	33
L	67%	28%	4%	1%	0%	-0.01	118
M	43%	31%	23%	3%	0%	0.34	35
N	49%	26%	18%	8%	0%	0.09	39
O	43%	33%	21%	0%	2%	0.28	42
P	36%	32%	23%	9%	0%	0.19	22
Q	57%	29%	11%	4%	0%	0.20	28
R	51%	29%	20%	0%	0%	-0.09	35
S	63%	21%	16%	0%	0%	0.03	19
T	51%	31%	15%	3%	0%	0.16	59

This is a list of the top Trusted Reviewers and an analysis of their course ratings of 4.0 and greater. Each of their ratings is compared against the DGCR average rating (from all reviewers) for that course.

Since reviewers can only rate in 0.5-disc increments, the analysis is done in 0.25 increments, since that is the amount that will, on average, move a rating up or down one half disc.

The count of each TR's reviews that deviate from the DGCR average is listed as a percent of their total reviews (columns 2-6). The deviation is an absolute value, meaning it includes ratings both over and under the DGCR average by the amount for that column.

Over Rate (column 7) is the average of how their ratings fare against the DGCR average. It indicates whether they chronically rate above or below the DGCR average.

Finally, "4.0+ Courses" (column 8) is simply the count of courses that each reviewer has rated 4.0 or greater.
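If it helps, here is roughly how each row is computed, sketched in Python. This is not the actual script; the function name and input format (a list of (reviewer_rating, dgcr_average) pairs for each course a reviewer rated 4.0 or greater) are made up for illustration.

Code:
def table_row(ratings):
    """One reviewer's row: bucket percentages, Over Rate, and 4.0+ course count."""
    buckets = [0, 0, 0, 0, 0]  # within 0.24, 0.25-0.49, 0.50-0.74, 0.75-0.99, over 0.99
    diffs = []
    for reviewer_rating, dgcr_average in ratings:
        diff = reviewer_rating - dgcr_average
        diffs.append(diff)
        dev = abs(diff)  # deviation is absolute: over or under counts the same
        if dev <= 0.24:
            buckets[0] += 1
        elif dev <= 0.49:
            buckets[1] += 1
        elif dev <= 0.74:
            buckets[2] += 1
        elif dev <= 0.99:
            buckets[3] += 1
        else:
            buckets[4] += 1
    n = len(ratings)
    percents = [round(100 * b / n) for b in buckets]
    over_rate = round(sum(diffs) / n, 2)  # positive = chronically above the DGCR average
    return percents, over_rate, n

# e.g. a reviewer who rated three 4.0+ courses at 4.5, 4.0 and 5.0
print(table_row([(4.5, 4.21), (4.0, 4.03), (5.0, 4.37)]))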
 
So this is what you've been doing all day :cool:

But really, I like this thread. I'll get back to you; I need to look at the numbers a little closer.
 
Regarding column 8, would a column that displays their total reviews be of any consequence?
 
So this is what you've been doing all day :cool:

But really, I like this thread. I'll get back to you; I need to look at the numbers a little closer.

^ this... proper perusal and pondering takes time.
:popcorn:
 
Regarding column 8, would a column that displays their total reviews be of any consequence?

I am trying to keep this anonymous so responses are unbiased. You can figure out who each is if you try hard, but giving total # of reviews would give them away too easily.

These are the top 20 TRs: each has reviewed over 125 courses and all have played over 150. IMO, that is enough for a total-review count not to change any of their trustworthiness.

PS....it does make your head spin (at least it made mine spin) trying to figure out what you value just from these numbers. That's why multiple selections are allowed.
 
I trust mashnut

The biggest reason is that I have played with him and we have similar ratings/skills (although he's got more distance than I do).

We do have differences in opinion, but before I play a course I haven't played, I read his thoughts. I'm sure some of the other trusted reviewers are good at what they do, but I just like to stick with the opinions of people I can relate to......nothing personal to anyone else.
 
Trusted reviewers are like movie reviewers. You find one whose reviews jibe with yours and trust them the most. How they review relative to other people matters little.
 
Can you break this down for an idiot like me? What do the numbers for Reviewer A mean for example?
 
I picked L, for two major reasons. Reason #1: I believe reviews *should* fully utilize the 0-5 scale, i.e. I disagree with the reviewer logic that says no course can ever rate a 5. This is related to reason #2: I believe 'good' but not 'great' courses are, on average, over-rated, simply due to lack of exposure. So I like the fact that reviewer L on average actually under-rates courses compared to the overall DGCR population: they still rate courses highly, but they also rate courses more critically than DGCR does (on average).
 
Based on the stats, I picked E & T.

By process of elimination I eliminated
1) those who had a very high percentage of their ratings match the DGCR average. It is inconceivable to me that a person's preference in courses would almost always match the average. If they are too scared to stick their neck out and rate what they really believe, that is untrustworthy.
2) those who seemed to chronically overrate courses. It is hard to know the motivation for chronically doing this, but it seems like it is either sucking up to the locals or having a non-critical approach. Neither motivation builds trust for me.

Dang it, I wish I could change my vote as I would also add votes for N & R based on this (and looking a little closer).
 
I voted L because my own ratings are extremely close to the overall DGCR rating. Except for a handful of niners, I suppose.
 
Can you break this down for an idiot like me? What do the numbers for Reviewer A mean for example?

Not for an idiot like you (judging from all the witty posts I have read over the years, you're not one), but I'm glad to do it anyhow.

Reviewer A
49% of his ratings are within 0.24 pts of the DGCR average. That means his rating is effectively the same; on average it does not change the course rating by more than 1/2 disc.

37% of his ratings are 1/2 disc off of DGCR's average. That is as close to agreeing with the average as you can be without actually agreeing.

Only 7% are in disagreement with the average by 1 disc, and another 7% by more than 1 disc.

This person, given the option, always rounds their rating up (0.22 rating inflation). They basically never stick their neck out to say a course is overrated by other DGCR users compared to their own assessment.
 
My voting strategy.
1) Have to have played 40+ 4.0 courses. Generally I don't trust someone unless they have played a good sample pool of quality courses. People with fewer overall courses generally tend to rate higher (imo).
Leaves: A, C, D, G, H, I, L, O, & T
2) Someone who has more than 50% agreeing with everyone else is too scared to vote truthfully for fear of thumbs down.
Leaves: A, D, I, & O
3) Over-rate below 0.25. Kind of the same reason as #2; a high over-rate makes me feel like they are trying to appease the locals.
Leaves: A & I
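
For what it's worth, the elimination above boils down to a simple filter. A rough Python sketch (reviewer letters and numbers taken from the table; the field layout is just made up for illustration):

Code:
# Step 1 survivors (40+ courses rated 4.0+): within-0.24 %, over-rate, 4.0+ count
stats = {
    "A": (49, 0.22, 41), "C": (60, -0.02, 40), "D": (22, 0.50, 45),
    "G": (59, 0.10, 79), "H": (56, 0.19, 100), "I": (39, 0.18, 41),
    "L": (67, -0.01, 118), "O": (43, 0.28, 42), "T": (51, 0.16, 59),
}

# Step 2: drop anyone agreeing with the DGCR average more than half the time.
# Step 3: drop anyone whose over-rate is 0.25 or higher.
picks = [r for r, (within, over_rate, courses) in stats.items()
         if within <= 50 and over_rate < 0.25]
print(picks)  # ['A', 'I']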

In reality, I base who I trust on who has tastes similar to mine. Being a TR means absolutely nothing to me when looking at a course. Usually I see a trusted reviewer as someone who isn't honest. Disc Golfers are notorious "ratings whores", so it wouldn't surprise me if members will compromise themselves just to get that TR status.
 
@ Dave242: Very interesting post. :thmbup: :cool:

I'm not sure I can look at a set of data like yours and determine "reviewer reliability." For me, it's not about whether I think someone has "stuck their neck out" or rates higher or lower than everyone else. For me the valuable knowledge is contained in the narrative and not the numbers. I don't care what rating people assign each course so much as what they have to say.

A reliable reviewer passes on information. Generally speaking, it's easy to tell if someone is full of **** or not. I might not agree with everyone's rating or review, but if they pass on knowledge of any sort in their narrative, then it is of use to me and therefore a worthy review. If I think someone's full of **** I regard their reviews as unreliable.

A reliable reviewer's statistics should not be compared with an unreliable general public anyway, right? :popcorn:

Smyith said:
Disc Golfers are notorious "ratings whores", so it wouldn't surprise me if members will compromise themselves just to get that TR status.

LOL! Seriously?! I'd hope people write reviews to help other people, not for some pixels on some website. :rolleyes: :p
 
This person, given the option, always rounds their rating up (0.22 rating inflation). They basically never stick their neck out to say a course is overrated by other DGCR users compared to their own assessment.

Usually I see a trusted reviewer as someone who isn't honest. Disc Golfers are notorious "ratings whores", so it wouldn't surprise me if members will compromise themselves just to get that TR status.

I don't think anyone who reviews a lot of courses cares about this at all. People who only review a handful of courses are probably more prone to over-rating than those who take the time to review a lot. There may be some exceptions, but I do think they are vastly in the minority.

Basically, I think that people who take the time not only to travel and play a ton of different courses, but also to actually write a cogent review, like disc golf more than the average user. Or at least, they appreciate the courses more than the average user. Hence the trend of rating higher than the average.

In other words, people who rate lower will be less inclined to play new courses, since they have a more pessimistic (relatively, of course) view of courses in general. And when they do, they take a disc off the course for being too crowded. :|
 
I will get back to you on this . . . I like the thought process . . . but then again, why are we discussing reviewers instead of courses? We should have a Review process for the reviewers. I guess that is somewhat what the thumbs up and down are for . . . but sometimes those are not accurate representations of the reviewer.

I can see it now . . . Disc Golf Course Trusted Reviewer Review . . . is www.dgctrr.com taken?
 
I would have to see the reviews. Rating is not the same as reviewing. But I vote H; they have done 100 and have a good spread.
 
It's a little difficult to actually make conclusions about the causality of a reviewer agreeing with the consensus or not. I personally often come up with a rating as I'm leaving the course and put that in my notes. On many courses where there are already a bunch of reviews, it's rare that my personal rating is that different from the average. On courses that only have a few reviews, it has been much more frequent that my rating differs from the average. I'm never worried about my thumbs or about how locals like my rating, I'm perfectly happy to differ from the rating if my opinion really is that different, it just doesn't happen nearly as often with more popular courses.

It also can make a difference what kind of courses you tend to play. I do a lot of road trips, picking out all the courses rated 4+ wherever I go, and it's frankly pretty rare that those highly rated courses fall below 3.5 on my personal scale. When I stay in an area for a while and get to all the less frequented courses and especially 9 hole courses I find a lot more where the rating doesn't give as good an idea of how good the course is compared to my assessment.

TLDR: I think there are some major variables that your table of numbers misses. That said, it's definitely interesting stuff and a different look at the data than anyone's done before, thanks!
 
why are we discussing reviewers instead of courses?
Because it's winter and we are all heavily suffering from lack of flying discs. That and some of us aren't lucky enough to travel for a month.....jealous
We should have a Review process for the reviewers.
I like this. Have a special TR Badge after you go through the review.

LOL! Seriously?! I'd hope people write reviews to help other people, not for some pixels on some website. :rolleyes: :p
I'm just saying it wouldn't surprise me at all. You have to admit though, we are status whores (ratings). As much as you want to say no way...the real possibility is there.
But I digress and let's focus on the intent of the thread.
 
TLDR: I think there are some major variables that your table of numbers misses.

Could you expand on this? I am curious about your thoughts.
 