
Trusted Reviewer Stats

This is just my personal opinion, but some reviews I read lead me to believe some reviewers are "padding" reviews to gain votes in an effort to achieve gold or diamond reviewer status.

For the most part, "trusted reviewer status" doesn't mean anything to me.
 
I'll make some votes for some of these new guys . . . I don't monitor the forums much anymore, and I don't write reviews any more because I just don't have the time to dedicate to them; I play more than I can possibly write about. I just got back from a 101-course road trip that lasted 32 days. It was enough work preparing each night for the next day that I wouldn't have had any time to write accurate and helpful reviews. The courses start to blend together after the first few days! Thanks to all who continue to make DGCR awesome! Good luck to those trying for Diamond . . . may you be welcomed by women, booze, and riches!

Wow! Awesome trip. I take a notebook with me on trips and try to jot down notes on the way to the next course while somebody else drives, or at least that night I try to find time to write down a few things. But I know what you mean, between eating, driving, planning, sleeping, etc. it is sometimes hard to find time to keep track when playing so many new courses at once.
 
This is just my personal opinion, but some reviews I read lead me to believe some reviewers are "padding" reviews to gain votes in an effort to achieve gold or diamond reviewer status.

For the most part, "trusted reviewer status" doesn't mean anything to me.

I think that I know what you are referring to about padding. I've seen some TR reviews where the reviewers seem to be bored writing them. (Or maybe I misinterpreted what you were meaning.)

I personally think, though, that these kinds of reviews are generally pretty indicative of the course quality. Boring, bland courses may not justify elaborate reviews. So I still consider them useful a vast majority of the time.

For me, I always sort by TRs first and go to other reviews if there are no TR reviews or no recent TR reviews on courses that have been updated.
 
Wow! Awesome trip. I take a notebook with me on trips and try to jot down notes on the way to the next course while somebody else drives, or at least that night I try to find time to write down a few things. But I know what you mean, between eating, driving, planning, sleeping, etc. it is sometimes hard to find time to keep track when playing so many new courses at once.

Yup, this. I make my wife take my notes a lot of the time while I drive to the next course (and that gives me the opportunity to put her thoughts into my reviews as well). When I get home I'm more transferring my notes from when the course was fresh in my mind rather than writing my review from memory.
 
Awesome! :thmbup:

BogeyNoMore had his 200 uniques and now has a purty little DIAMOND icon!

Way to Go! :clap:
 
Congrats Bogey! Well deserved for a good guy and a good reviewer. I'm ridiculously busy so it may be a bit before I get you added to the diamond stat sheet, but I promise I'll get to it. :)
 
Thanks guys! Knew I was getting close, but no clue I'd amassed 200 unique voters and was simply waiting for my 1000th helpful vote... figured I was gonna be Cubic Zirconia until my next road trip.

To everyone who thumbed down one of my reviews: I thank you.
For all I know, if one of you hadn't, I might still be waiting for that 200th unique voter.

Planning to visit the Buffalo/Rochester area Memorial Day weekend. Maybe I can meet Tim for a round and a Diamond TR Mini.
 
Congrats, BNM! Long time coming, fo' sho'!
 
This is just my personal opinion, but some reviews I read lead me to believe some reviewers are "padding" reviews to gain votes in an effort to achieve gold or diamond reviewer status.

For the most part, "trusted reviewer status" doesn't mean anything to me.

"Padding"?

I'm curious as to whether you're referring to disc-ratings or written content with this comment.

"TRS" shouldn't mean a whole lot to anyone. The beauty of DGCR is that everyone's opinion counts equally. :)
 
For the most part, "trusted reviewer status" doesn't mean anything to me.
"TRS" shouldn't mean a whole lot to anyone. The beauty of DGCR is that everyone's opinion counts equally. :)
TR status isn't the be all and end all, and Juke alludes to the fact that DGCR is basically democratic in nature, but here's an example where it's very useful to me.

Private course, rating = 3.89, 9 reviews. Of those 9 reviewers:
5 only have 1 review
2 have written 2 reviews
1 has written 3 reviews
1 is a Gold TR, with 223 reviews

We've all seen instances where a few locals create a profile and write reviews to pump up a local course's rating, especially if they personally have blood, sweat, tears (or money) invested in it. If you're visiting that area with limited time for DG and have to decide which course(s) to play/skip, assuming you don't know squat about any of the reviewers, do you really take what each of them says with equal weight?
 
"Padding"?

I'm curious as to whether you're referring to disc-ratings or written content with this comment.

"TRS" shouldn't mean a whole lot to anyone. The beauty of DGCR is that everyone's opinion counts equally. :)

I'm referring to the written reviews. I get the impression some people write a review with the intent of gaining votes, as opposed to writing a review and letting the chips fall where they may.

I realize the silver, gold, and diamond statuses are incentives to encourage reviews. I just think that for some people, reaching a certain reviewer level becomes more important than the review itself.
 
TR status isn't the be all and end all, and Juke alludes to the fact that DGCR is basically democratic in nature, but here's an example where it's very useful to me.

Private course, rating = 3.89, 9 reviews. Of those 9 reviewers:
5 only have 1 review
2 have written 2 reviews
1 has written 3 reviews
1 is a Gold TR, with 223 reviews

We've all seen instances where a few locals create a profile and write reviews to pump up a local course's rating, especially if they personally have blood, sweat, tears (or money) invested in it. If you're visiting that area with limited time for DG and have to decide which course(s) to play/skip, assuming you don't know squat about any of the reviewers, do you really take what each of them says with equal weight?

If I'm visiting an area with multiple courses and limited DG time, course choice is determined by reading some reviews, the course designer, tee type, pictures, and course type (hilly, wooded, flat, open, park or country setting, etc.).

This may not be for everyone, but it works for me.
 
TR status isn't the be all and end all, and Juke alludes to the fact that DGCR is basically democratic in nature, but here's an example where it's very useful to me.

Private course, rating = 3.89, 9 reviews. Of those 9 reviewers:
5 only have 1 review
2 have written 2 reviews
1 has written 3 reviews
1 is a Gold TR, with 223 reviews

We've all seen instances where a few locals create a profile and write reviews to pump up a local course's rating, especially if they personally have blood, sweat, tears (or money) invested in it. If you're visiting that area with limited time for DG and have to decide which course(s) to play/skip, assuming you don't know squat about any of the reviewers, do you really take what each of them says with equal weight?

BNM hits it on the head here for me. I tend to sort by TRs also, mostly just for the fact that most TRs have a decent chunk of courses played - they've seen a wider range of courses and usually have a good gauge of where a course falls on the scale. It's not that more reviews make you a better reviewer, just that you have a good base scale. Obviously there will be some reviewers who fall through the cracks, having played plenty of courses but just not reviewing a lot, but those are fewer and farther between.
 
I realize the silver, gold, and diamond statuses are incentives to encourage reviews. I just think that for some people, reaching a certain reviewer level becomes more important than the review itself.

Sure, it's nice when people write reviews "from the heart," as it were, but like anything else, people do what they do for their own reasons. As long as it encourages members to write plenty of reviews others find helpful, who cares what their motivation is? Why people write helpful reviews is completely secondary to the fact that they actually write helpful reviews. Whatever it takes to generate truly helpful reviews, the site and its users benefit.

IMHO, the achievement-seeking reviewer is infinitely better than those who cobble together 18 words that basically say nothing, just so they can hang a number on a course without providing any substance to support their rating.
 
This is just my personal opinion, but some reviews I read lead me to believe some reviewers are "padding" reviews to gain votes, in an effort to achieve gold or diamond reviewer status.

For the most part, "trusted reviewer status" doesn't mean anything to me.

I really doubt this is true in many cases. If someone just wanted to get thumbs, they could simply review popular courses highly and put in a lot less work.

BNM hits it on the head here for me. I tend to sort by TRs also, mostly just for the fact that most TRs have a decent chunk of courses played - they've seen a wider range of courses and usually have a good gauge of where a course falls on the scale. It's not that more reviews make you a better reviewer, just that you have a good base scale. Obviously there will be some reviewers who fall through the cracks, having played plenty of courses but just not reviewing a lot, but those are fewer and farther between.

Well, it sorta does. (Not that your other points aren't true, but doing anything more tends to make people better at it.)
 
Quite right. I guess I was just saying that that portion is secondary to me. But of course they go hand in hand.

But to Shipley's point, I think I get a little where he's coming from. Obviously all reviews aren't going to be straight from the heart, but sometimes extra words and bullet points may just not be required if the course really isn't all that spectacular.

For instance, in my reviews I usually try to list some specific holes that I like for one reason or another, and then a couple that maybe felt lackluster. I like reading reviews before I go to a course and having an idea of when I might be coming up on some of the cool holes. But on some pretty bland courses, maybe listing the best holes (ones that aren't actually very good, just better than the rest) isn't needed info? Shipley, is that the kind of fluff or padding you're talking about?
 
I've read (almost) exemplary reviews followed by a rating of 1.5 to 2.5 discs. IMO, this disparity indicates some reviews are written for votes rather than content. I'd rather read a succinct review than one whose content is crafted to gain votes; that makes the review's content ancillary.

FWIW, reviews won't always match the rating, and that's to be expected. However, there shouldn't be a large discrepancy when an individual writes an eloquent, thorough review that praises the course for its aesthetics, good risk/reward factors, good fun factor, and informative tee signs, then rates the course two discs.
 
Somebody's been hard at work... looks like they're shooting for the "Lowest % Helpful" award.
 
I know who you're talking about BirdieYesForever.

That front page domination makes me want to write 50 reviews tonight :D
 
I'm pretty happy with my little green ribbon, and just wish that people in the south read reviews as much as the typical Midwest course review gets read.

The average course in the south just doesn't get as many views as a similar one in other parts of the country.
 