Trusted Reviewer - Groupthink

Which TR do you trust the most? (more than one choice allowed; 66 total voters)
This data might actually be more instructive. It shows how far each TR's ratings land from the DGCR averages, with the differences bucketed in quarter-disc steps:

---------------------------TR's rating Below DGCR's <---------------------------------> TR's rating Above DGCR's

Code:
[B]Top 20 TR	[COLOR="blue"]Agree[/COLOR]	-1	-0.75	-0.5	[COLOR="Blue"]-0.25	0	0.25[/COLOR]	0.5	0.75	1	1.25	1.5	1.75[/B]
AdamH.,.,.	90%	0%	0%	0%	15%	35%	40%	10%	0%	0%	0%	0%	
mashnut,,,	90%	1%	1%	4%	25%	42%	23%	3%	2%				
JR Stengel	90%	0%	0%	5%	30%	30%	30%	5%	0%	0%	0%	0%	
tallpaul..	84%			5%	26%	37%	21%	11%					
srm_520,,,	82%				7%	39%	36%	11%	4%	4%			
GoodDrive,	79%		1%	4%	14%	32%	33%	13%	4%				
Denny Rit.	78%			11%	26%	26%	26%		11%				
harr0140..	77%	0%	0%	0%	10%	33%	34%	19%	2%	2%	0%	0%	
bjreagh,,,	76%	0%	5%	10%	23%	30%	23%	3%	8%	0%	0%	0%	
#19325,.,.	75%				12%	29%	34%	15%	5%	5%			
Jukeshoe.	72%			3%	9%	39%	24%	18%	6%				
DSCJNKY.,.	71%				26%	26%	19%	19%	11%				
sillybizz.	68%			5%	18%	18%	32%	9%	14%	5%			
prerube,,	68%			5%	18%	18%	32%	9%	14%	5%			
ERicJ,.,.,	68%			3%	9%	24%	35%	18%	12%				
The Valkry	65%		2%		20%	25%	20%	31%	2%				
Innovadude	64%			2%	20%	22%	22%	27%	7%				
swatso,,,,	63%	0%	0%	26%	29%	20%	14%	11%	0%	0%	0%	0%	
mndiscg,,,	63%					14%	49%	23%	11%	3%			
optidiscic	62%		3%	8%	31%	18%	13%	21%	5%	3%			
Donovan,,,	35%			2%	2%	9%	24%	27%	24%	7%		2%	2%

"Agree" means that the TR's rating is either matching or as close as it can be to matching without actually matching......with 0.25 breaks, about half of the -0.25 and +0.25 will have equal number of discs and the other half will be only 1/2 disc off of the DGCR average.

Good chart and good thread. Some possibilities: 1) TRs tend to rate toward the majority so as not to offend, or 2) future reviewers see what the TRs said and go with that, or 3) reviewers become TRs based on votes, so people may vote a lot for a rating they agree with. For example, at 90%, Mashnut could be a follower, a trend-setter, or one who just happens to rate similarly to the majority.

(BTW- Mashnut is a great reviewer and one of my favorites, and I believe is a TR mainly because of content and quantity, rather than ratings.)
 
I remember talking to Justin Bunnell about hole #18 at Jefferson Barracks in the C placement. It's a big ol' 350'+ turnover around a big ol' clump of shule. ALL of the Open guys have a 350' turnover drive; for them the hole isn't much at all. It was almost an automatic 2. Once you go down to Advanced, you would see a ton of guys miss the line and get knocked down in the shule. You would see a bunch of guys bail on the birdie line and take the safe hyzer route for par. Once you moved down to Intermediate, almost nobody birdied that hole. For 95% of disc golfers, that is a very good golf hole. For the top 5%, it was a gimme birdie. Justin laughed at the notion that it was a tough finishing hole. To him it was an easy birdie hole at the end.

Exactly. If we tailored courses for the top 5%, I don't think anyone would start playing disc golf. It's a catch-22.
 
Just messing with you my Brotha

I will speak to this point, which I think you brought up earlier. I do think a lot of reviewers are afraid to rankle some people and get potential :thmbdown:s. I mean, you get free minis! Why would you risk that?! ;)

I've noticed that as I get more and more courses under my belt, I've become increasingly frank with new ratings/reviews. But what I think would help curb this groupthink mentality of people afraid of getting :thmbdown:s and dropping TR levels is some sort of tenure where, once you've reached Gold or maybe Silver, you can't really be demoted. Or something along those lines, just spitballing.
 
You don't have to send the minis back even if you do get demoted. :p I've actually never seen a gold or silver reviewer who got the status then lost it again except the couple times Tim changed the qualifications.
 
I'm joking about the minis but trust me, people care about those little pixels beside their name. ;)
 
Has it been brought up yet that this is a very small sampling of all TRs? I don't see how you can make group assumptions about all TRs based on just this small portion of them.
 
The term "trusted reviewer" implies that the majority of members will agree with and trust what the trusted reviewers write.
Is it groupthink? Yes.
I think that's the point: "trusted by most members" implies agreeing with most members.

That being said, I don't agree with some TRs and with what some consider to be a good course.
There is no formula. There are many types of good courses; I simply know it when I play it.
 
As for the clown who hints that the higher-rated, more experienced player will be a better reviewer:
Give me a break, dude.
You ever hear pros bitch after getting their ass handed to them on a tough course?
If pros and top-rated guys were doing the rating, we'd be forced to believe that 18 bomb hyzers is a 5-rated gem....pukes in toilet and hands in DGCR member rights
 
Hey crashzero......... See that? I am a friggin modern day prophet! :clap:

Just messing with you my Brotha

To me, the skill level of the reviewer should be disclosed. It matters because you need to know where their game is to understand where they are coming from. I am not saying change the rating system to give more weight to higher-rated players; their viewpoint is just as important, but it means different things to different levels of players. For instance, a sidearm 300' flick player may love a course that others don't care for.
 
As for the clown who hints that the higher-rated, more experienced player will be a better reviewer:
Give me a break, dude.
You ever hear pros bitch after getting their ass handed to them on a tough course?
If pros and top-rated guys were doing the rating, we'd be forced to believe that 18 bomb hyzers is a 5-rated gem....pukes in toilet and hands in DGCR member rights

I guess I'm that clown. I would like to see the wooded courses you designed to see if they are fair.
 
Exactly. If we tailored courses for the top 5%, I don't think anyone would start playing disc golf. It's a catch-22.

Do you really think people give up that easily and are so weak-minded? I don't really agree with this. Although I think we do need some shorter beginner courses, you've got to give newer players a little more credit. If every new golfer were this fickle, easily frustrated type of person, they'd never last anyway, and I say why waste time catering to them.
 
I was looking again at the data that Dave242 has compiled and started thinking that I look like an agreeable chump. Yes, there are many courses whose ratings I agree with and feel are appropriate. But there are a few courses that stand out in my mind as being completely overrated, and I recall giving them ratings 1.5 or so lower than their averages. So I started to wonder why that data isn't showing up.

I found out that these courses have received many more ratings since my reviews, and those ratings have lowered the averages.

So I theorize that TRs' ratings are (for the most part) a truer indication of a course's quality. After enough ratings have been given over time, the course average starts to move closer to the TRs' ratings if there is much of a difference. I bet some of the diamond and gold TRs have plenty more examples of this from when they reviewed courses years ago and can see that the course average is now closer to their rating. Here is a little data of mine to support this theory.

Course A
Before I reviewed it in 2009 it was rated 4.09 on 9 reviews
I rated it a 2.5
Currently it is 3.41 on 48 reviews
Current TR rating is 3.25 on 10 reviews

Course B
Before I reviewed it in 2011 it was rated 4.00 on 11 reviews
I rated it a 2.5
Currently it is 3.42 on 25 reviews
Current TR rating is 3.17 on 3 reviews
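
Just to put numbers on the drift, here is the same arithmetic as a quick Python snippet (nothing beyond the figures above):

Code:
# Gap between the course average and my 2.5 rating, before vs. now.
courses = {
    # name: (avg before my review, my rating, current avg)
    "Course A": (4.09, 2.5, 3.41),
    "Course B": (4.00, 2.5, 3.42),
}
for name, (before, mine, now) in courses.items():
    print(f"{name}: gap was {before - mine:.2f}, now {now - mine:.2f}")
# Course A: gap was 1.59, now 0.91
# Course B: gap was 1.50, now 0.92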
 
To me, the skill level of the reviewer should be disclosed. It matters because you need to know where their game is to understand where they are coming from. I am not saying change the rating system to give more weight to higher-rated players; their viewpoint is just as important, but it means different things to different levels of players. For instance, a sidearm 300' flick player may love a course that others don't care for.

Again, you are on my wavelength totally. What I learned here is that lots of reviewers do not rate from their own personal perspective and their own personal preference points....they rate (the number) from what they think the masses will think. I think the data I went to great pains to gather bears that out.

What lots of Reviewers write (the words) is intended to be helpful to all. I have no beef with that. Lots of people at lots of skill levels can benefit from knowing that a given course holds water in puddles/mud for a week after a rain, or has no bathrooms, or has parking issues, or gets really overcrowded on weekends, or is easy to get lost on (so bringing a map is advised), or has major mosquito problems, or is very thorny, or offers little shade and has no drinking fountains near hole 9, or is very hilly and exhausting but has no benches, etc.

But when it comes to design, these same reviewers try hard to appeal to the masses too. This simply does not work. Like you pointed out with your Justin Bunnell post, a hole that can be perfect for a mid-level player is boring for a higher rated player AND for a lower level player. Also, many reviewers simply do not understand design in the framework of testing player skills.....they write about things like flow/fairway routing, erosion control, safety. Not that those are bad in and of themselves, but those things belong on a designer group forum, not in a users/players forum.

So, for players to rate a course based on how well they can compete against it (the single most important aspect in the sport of disc golf), they need to let readers know how they rate the course in this area. And you have to rate this area for a specific skill level (and disclose that).

Well.....that was a whole bunch of rambling to let you know that you will not get the very logical thing you ask for from "Trusted Reviewers" ratings since they write (and rate) for the masses (the lump sum average.....the fat part of the bell curve).

And.....all that said, I still maintain that DGCR ratings do extremely well at finding courses for me that I really enjoy. I have found that anything rated >3.5 will only very rarely disappoint, and that anything <2.5 is not worth going out of my way and spending precious time on if I have other options.

So for, say, Flip City, I did not enjoy it to the 5.0 experience that DGCR rates it (it was an A-grade 4.0 experience), or Idlewild, which I enjoyed as a B+ experience....these are still wonderful courses, but their scoring experience for me is the primary thing that knocked them down. Conversely, Yadkin County Park is rated only 3.57, but it has all the elements (especially scoring euphoria) that make it an incredibly addictive experience for me......if I lived nearby I would go there over and over and over in hopes of conquering it. The better a course is at giving me that reaction, the better the course.....so I gave it an A+ (5.0).
 
Course A
Before I reviewed it in 2009 it was rated 4.09 on 9 reviews
I rated it a 2.5
Currently it is 3.41 on 48 reviews
Current TR rating is 3.25 on 10 reviews

Course B
Before I reviewed it in 2011 it was rated 4.00 on 11 reviews
I rated it a 2.5
Currently it is 3.42 on 25 reviews
Current TR rating is 3.17 on 3 reviews

According to the data, this is the exception rather than the rule (taking all the numbers in aggregate).

There is no doubt that TRs do this, and do it well, in a few outlier cases. And the data only speaks to the highest-rated (most attention paid to) courses.....and as I've stated several times in the thread, it does not give a perfect picture and falls short of capturing cases like this, where TRs rightly knock down 4.0+ rated courses to a more accurate number.

The best way to convince me/us that the data I provided is wrong is to provide the data for your entire set of courses (probably a couple hours of work).
 
I'll again point out you're taking a small percentage of TRs and making assumptions about the group as a whole based on a small sample.
 
According to the data, this is the exception rather than the rule (taking all the numbers in aggregate).

There is no doubt that TRs do this, and do it well, in a few outlier cases. And the data only speaks to the highest-rated (most attention paid to) courses.....and as I've stated several times in the thread, it does not give a perfect picture and falls short of capturing cases like this, where TRs rightly knock down 4.0+ rated courses to a more accurate number.

The best way to convince me/us that the data I provided is wrong is to provide the data for your entire set of courses (probably a couple hours of work).

I started working on this for you, AdamH, but gave up since, to your credit, you have reviewed a lot of courses that have not been reviewed much. This is good, but it does not yield meaningful results. Look at it this way: if HBB rates a mediocre course a 5.0 and you come along and give it a 3.0, your difference would be 1.0 (the course would be rated 4.0 with your 2 reviews). That says more about HBB than it says about you.
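
Here's that hypothetical spelled out as a tiny Python snippet (simple running-mean arithmetic, nothing DGCR-specific):

Code:
def new_average(old_avg, n_reviews, new_rating):
    # Running mean after one more review comes in.
    return (old_avg * n_reviews + new_rating) / (n_reviews + 1)

avg = new_average(5.0, 1, 3.0)  # HBB's lone 5.0 plus your 3.0
print(avg, avg - 3.0)           # prints: 4.0 1.0 (a full disc "below average")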

What I did do for you is what I did for the TRs who have lots of IL & WI courses played. Here are 2 courses DGCR rated 4.0+ that you rated <4.0:

Code:
[B]AdamH	diff	DGCR	Course[/B]
3.5	-0.72	4.22	Redstone Arsenal
3.5	-0.61	4.11	Freeman Lake Park

So now your data for rating top courses looks like this (first row is how it was, 2nd row includes above):

Code:
[B]-0.75	-0.5	[COLOR="Blue"]-0.25	0	0.25[/COLOR]	0.5	0.75[/B]
		15%	35%	40%	10%	
9%		14%	32%	36%	9%
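
(For the curious: the second row is just the first row's counts re-based with those two courses added. The first-row percentages work out to counts of 3, 7, 8, and 2 if that row is based on 20 courses, and both new diffs go in the -0.75 column as shown above. A quick Python check:)

Code:
# First row: -0.25, 0, +0.25, +0.5 buckets on an assumed 20 courses.
counts = {-0.25: 3, 0.0: 7, 0.25: 8, 0.5: 2}
counts[-0.75] = 2              # Redstone Arsenal + Freeman Lake Park
n = sum(counts.values())       # 22
for d in sorted(counts):
    print(f"{d:+.2f}: {counts[d] / n:.0%}")
# -0.75: 9%, -0.25: 14%, +0.00: 32%, +0.25: 36%, +0.50: 9%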
 
Again, you are on my wavelength totally. What I learned here is that lots of reviewers do not rate from their own personal perspective and their own personal preference points....they rate (the number) from what they think the masses will think. I think the data I went to great pains to gather bears that out.

What lots of Reviewers write (the words) is intended to be helpful to all. I have no beef with that. Lots of people at lots of skill levels can benefit from knowing that a given course holds water in puddles/mud for a week after a rain, or has no bathrooms, or has parking issues, or gets really overcrowded on weekends, or is easy to get lost on (so bringing a map is advised), or has major mosquito problems, or is very thorny, or offers little shade and has no drinking fountains near hole 9, or is very hilly and exhausting but has no benches, etc.

But when it comes to design, these same reviewers try hard to appeal to the masses too. This simply does not work. Like you pointed out with your Justin Bunnell post, a hole that can be perfect for a mid-level player is boring for a higher rated player AND for a lower level player. Also, many reviewers simply do not understand design in the framework of testing player skills.....they write about things like flow/fairway routing, erosion control, safety. Not that those are bad in and of themselves, but those things belong on a designer group forum, not in a users/players forum.

Well put.......

Although I believe an uproar would ensue if we started disclosing the skill level of everybody who reviews.

I will say that some lesser-skilled players do understand how to fairly challenge the higher skill levels and how this pertains to the design of a course. But IMO a lot on this site do not, and that is a problem.
 
How many TRs did you collect data on, and how many TRs are there in total?

The other problem I have with all of this is that you're looking at courses TRs rated 4+. It's a lot easier for anyone to see that a course is really great than it is for courses that aren't so great. More people will agree with TRs on those higher-rated courses because everyone is in agreement that they're good.

The real data you should be looking at is courses TRs didn't rate as high; my guess is you'll find data showing that non-TRs rate those courses higher than TRs at a greater rate than in the data you're looking at.
 