
Top Player Ratings Over Time

It appears that you don't understand how ratings work.

I can only try to understand the ratings themselves. And they clearly show this...

A higher-rated average player in a division will result in higher-rated rounds, which speaks for itself in regards to the bubbling up of ratings over the years.
 
So higher rated players get higher rated rounds.

[SARCASM]Ratings inflation is proven![/SARCASM]

Unless it's because of extra-terrestrials.
 
You mean

ancient-aliens-guy.jpg
 
Aliens? Nah... but with a rating algorithm that knows no limits, how can there not be ratings inflation over the years? I'm guessing the original thought process was that MPO would rarely be rated among themselves. It would be interesting to go back to Climo's stats and see how many of his tournaments (when playing in MPO) were rated separately from other divisions. I looked at a few in 2003, and the same score in a round received the same rating across all divisions.
 
It seems quite plausible that ratings would drift in relation to underlying skill (if there actually is such a thing in the face of changing disc technologies, player pool, training methods, and course design).

I just like to shoot down false evidence. Only by seeing real drift can we stop it or correct for any drift that has happened so far.
 
Agreed. I suspect the ratings have been inflated by about 10 points in the last 20 or so years, which would put Climo in the low 1050s for his heyday. That seems about right. When you look at the players from the decade of the 2000s and add about 10 or so points to their ratings, things start to add up more accurately in comparison to today.

I think you're spot on in that the recent separating of the top MPO field from the ams and FPO is partially/mostly responsible for this.
 
I'm guessing the original thought process was that MPO would rarely be rated among themselves.

I think this is ultimately the smoking gun of what's going on. Without the lower rated fields to keep ratings from drifting higher and higher each year, we've slowly seen the ratings inflate by about 10 points from 20 years ago, with most of that inflation going on over the last 7 or so years when the PRO/AM fields were more often separated, the FPO was given their own layout, and the top tier MPO players (>1030) started playing the exclusive events.

Back in the 2000s more often than not you'd have a Val Jenkins and/or Juliana Korver randomly shoot a hot round that's cashable in MPO, or some local 960 rated pro who played the layout a million times shoot lights out at the event and really throw a monkey wrench into the round ratings and drag them down.

I know the PDGA at this point are invested in the ratings system and are probably hesitant to start monkeying around with it. Sunk costs are real yo. Just spitballing ideas here, but you've got my spidey sense tingling.

I'm actually a fan of the ratings system BTW, and I think it accurately reflects the player abilities of today. It's pretty cool and makes our unique sport even more unique. I think the one flaw as you pointed out is using it to compare players over time. Ultimately every system will have weak points and isn't perfect.
 
A higher-rated average player in a division will result in higher-rated rounds, which speaks for itself in regards to the bubbling up of ratings over the years.

Go test it. There are lots of tournament results you can look at.
Average the player ratings of the players who competed in a round, and average their round ratings. Remove any DNFs and non-propagators. Compare the two numbers and show that the MPO field is drifting upward.
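The test described above can be sketched in a few lines of Python. The data structure and every number below are made up purely for illustration; a real analysis would pull actual results off the PDGA event pages.

```python
# Hypothetical sketch of the drift test: for one round of one event,
# compare the field's average player rating to its average round rating.

def avg(xs):
    return sum(xs) / len(xs)

def drift_for_round(results):
    """results: list of (player_rating, round_rating, finished) tuples.
    Returns avg round rating minus avg player rating, DNFs excluded."""
    propagators = [(p, r) for p, r, finished in results if finished]
    player_avg = avg([p for p, _ in propagators])
    round_avg = avg([r for _, r in propagators])
    return round_avg - player_avg

# Made-up example field; the DNF entry is dropped by the function.
field = [(1020, 1031, True), (1005, 998, True), (990, 1002, True),
         (1012, 1015, False)]
print(round(drift_for_round(field), 1))  # → 5.3
```

A positive number for round after round, year after year, would be the drift; a number bouncing around zero would not.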
 
I think this is ultimately the smoking gun of what's going on. Without the lower rated fields to keep ratings from drifting higher and higher each year, we've slowly seen the ratings inflate by about 10 points from 20 years ago, with most of that inflation going on over the last 7 or so years when the PRO/AM fields were more often separated, the FPO was given their own layout, and the top tier MPO players (>1030) started playing the exclusive events.

I'm not saying ratings don't drift, but I would like to see some sort of analysis that shows the magnitude of possible drift over the years. To really have a strong argument, you would need to show that, on a fixed course over many years, the same score results in a trend of increasing ratings. It is hard to find a fixed course because trees grow and fall, and at the touring pro level the courses are constantly being tweaked. Similarly, weather conditions might be radically different in some years, further obscuring any trends.
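As a sketch of what that fixed-course analysis could look like: track the rating awarded to one fixed score on one layout across years, fit a line, and read the slope as points of drift per year. The `history` data here is entirely invented; only the method is the point.

```python
# Fit an ordinary least-squares line to (year, rating) pairs.

def linear_slope(points):
    """OLS slope for a list of (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

# Invented example: rating awarded to the same score of 54
# on the same hypothetical layout in four different years.
history = [(2005, 1000), (2010, 1003), (2015, 1006), (2020, 1010)]
print(round(linear_slope(history), 2))  # points of drift per year
```

Of course, as noted above, course tweaks and weather would contaminate real data, so you'd want many courses and some averaging before trusting any slope.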

So I want to describe other mechanics that could also contribute to a rise in the rating of the top N pros.

The total number of disc golf players competing in the PDGA system has seen a huge rise in the past decade. Check out the demographics from PDGA's website: PDGA Demographics

From:
  • 2002 - 7638
  • 2003 - 8304
  • 2007 - 11943
  • 2011 - 16609
  • 2015 - 30454
  • 2019 - 53366
That is a huge increase in the size of the field, so it seems reasonable that the top N pros have higher ratings now.
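The field-size effect alone can be illustrated with a tiny simulation: draw every player's "skill" from the same distribution in every era, and the average of the top 50 still climbs as the pool grows. All the distribution parameters below are invented; the only point is the order statistics.

```python
# Same skill distribution, different pool sizes: the top 50 of a
# bigger pool averages higher purely by order statistics.
import random

random.seed(1)  # fixed seed so the comparison is repeatable

def top_n_avg(field_size, n=50, mean=950, sd=40):
    ratings = [random.gauss(mean, sd) for _ in range(field_size)]
    return sum(sorted(ratings, reverse=True)[:n]) / n

small = top_n_avg(8000)    # roughly the 2003 membership
large = top_n_avg(53000)   # roughly the 2019 membership
print(large > small)       # bigger pool, higher top-50 average
```

So even with zero drift in the algorithm, you'd expect today's top N to sit higher than the top N of 2003.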

In this awesome discussion of the PDGA rating system (Nick & Matt show with Chuck Kennedy), around the 25:30 mark, Chuck describes the origin of the ratings: the 1000-rated SSA was set from the average of the 100 best rounds at the 1998 Cincinnati Pro Worlds. Those course ratings were used to generate player ratings, and everything propagates from this event. If a similar event were held with the players and discs from today, would we expect the 100 best rounds to average better or worse than the top 100 rounds from 1998?

Later in the show, around 1:07:25, they discuss ratings creep and Chuck shares his thoughts. Chuck attributes the majority of the rise in ratings to the fact that modern courses are being made harder through a variety of methods like increased distance and OB zones.

The PDGA rating is a measure of a player's past performances on the courses he has played. There is an upper limit to your rating, and that limit is the difficulty of the courses you have played. You can always find ways to make a course harder for the general population of disc golfers. But if a player's total throws don't change as the course is made harder, that shows his rated rounds (at that course) were limited by the course's ability to properly rate/evaluate his skill.
 
The top 50 or so DGPT MPO field is going to bubble up to within 20 points of each other in a few years or so. If they were to raise the minimum rating to get in, it would happen even faster. Like Pablo says... ratings don't matter.
 
There's also a guy Andrew Simmons, PDGA #166681, who is 1044-rated based on a single round in a flex start. Wtf?

Savvy players who understand the ratings & sponsorship game early on are bypassing rated rounds until they get good enough.

At least half the field shot above their rating, and then Luke Humphries shot a 798-rated round?! Curiously, he is 1 of 4 scores that don't show hole-by-hole scores.

Looks like he just made sure his already bad round was low enough to not affect his rating. But by doing so it also made ratings skew higher for the field.
 
He did a video where he was totally dicking around trying to win an ace pot. He mentioned it was a sanctioned round. I wonder if that was the round.

edit: This might be that round since Emerson was in the vid, but it looks like he did the same thing...

https://www.pdga.com/tour/event/49336#FPO
 
At least half the field shot above their rating, and then Luke Humphries shot a 798-rated round?! Curiously, he is 1 of 4 scores that don't show hole-by-hole scores.

Don't know him, never met him, but...

Something about the way he comes across in coverage of Skins matches and other clips just rubs my fur the wrong way. :\ No ill will toward him at all, but I can't imagine ever pulling for the guy.
 
Looks like he just made sure his already bad round was low enough to not affect his rating. But by doing so it also made ratings skew higher for the field.
Players shooting that far below their rating get removed from the pool used to calculate round ratings.
 
I wouldn't say I root for him either; more so that he is a notable name, and it's interesting to see a 1000+ rated touring pro dead last at a small C-tier full of sub-1000 rated players.
 
