Final word - 14U rankings
The final word, or maybe not, is an explanation of how the rating system was developed.
Last fall, my grandson's select 8th grade team, D1 United, played in and won a freshman league. I started taking game statistics and developing a ratings power index like Sagarin or RPI. I used D1 United as 100 and gave their opponents ratings based on the point spread. A team that D1 beat by 20 received an 80.0 rating, a team that won by 5 received a 105.0 rating.
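That anchoring rule can be sketched in a few lines of Python (my own illustration of the idea, not the actual spreadsheet):

```python
# Anchoring sketch: the anchor team (D1 United) is fixed at 100, and each
# opponent's rating is the anchor rating adjusted by the point spread.
ANCHOR_RATING = 100.0

def opponent_rating(anchor_margin):
    """Rating implied by one game against the anchor team.
    anchor_margin is positive if the anchor won, negative if it lost."""
    return ANCHOR_RATING - anchor_margin

print(opponent_rating(20))   # team the anchor beat by 20 -> 80.0
print(opponent_rating(-5))   # team that won by 5 -> 105.0
```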
By late fall, D1 United had joined up with Larry Hughes Basketball Academy and was their top team. I added some KC teams after the Boys & Girls Club of KC tournament in November, and then found some Kansas and Omaha tournaments that KC teams played in. Those included some Iowa and Oklahoma teams, which then led to adding Minnesota, Wisconsin, Texas and Arkansas teams.
One thing led to another and I was picking up NYBL and IndiHoops tournaments with teams from all over the country. These ratings were all anchored to LHBA in St. Louis, so teams like Vegas Elite, New World, etc. were pumped up to 125+ ratings, or a 25 point advantage over LHBA, still anchored at 100.
One could argue, "hey, we beat New World by 2 at XYZ tournament", but as I added more tournaments, I had up to 50 comparative scores for some top teams and the rating was based on their season results, not one game. So your team might have upset New World once, but on average, your team was posting lower quality wins against other teams, including opponents of New World that were blasting those same teams.
I know this isn't perfect, but it beats the subjective polls on some websites that are based on a couple of tournament wins. I am sure I have included B-team results for some teams and misapplied some scores to the wrong team when a team was poorly identified in the tournament listing. There are a whole bunch of tournaments that NEVER posted results. Shame on them for signing up to a website and not using it to inform fans.
The best tournaments had the best refs, the best facilities and posted scores within 30 minutes of game end. Tournament directors should aspire to give the kids the best experience.
So how does the algorithm work?
Well, I used an Excel spreadsheet with comparative formulae for each game score. For instance, at Indi Worlds 2 in San Diego, Seattle Rotary had some tight games with Vegas Elite so their RPI for that tournament would look like this:
Seattle Rotary = ($D$211-3+$D$213+10+$D$211-1+$D$213+11+$D$211-12)/5 = 130.7
where location $D$211 is Vegas Elite and $D$213 is New York Dragons.
So this formula indicates that Seattle Rotary lost to Vegas Elite by 3, 1 and 12 points and beat New York Dragons by 10 and 11 points. Since I used the comparative formula for Rotary, I cannot use the same formula for Vegas Elite or New York Dragons, or Excel will flag a "circular" formula. For those teams I instead add the point spread to Rotary's rating:
Vegas Elite = (130.7+3 + 130.7+1 + 130.7+12)/3 = 136.0 rating
That's an over-simplification, because I can use a formula for other teams that Vegas Elite beat, so that all teams have some comparative formulae. That way, if one team drops in the ratings, it can drag their previous opponents down, or if they improve significantly, it gives their opponents a little boost. You can see how one bad tournament can drag your rating down even after you have beaten New World (btw, I don't know if anyone has beaten New World, LOL).
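For the curious, that web of mutually referencing formulas can be mimicked outside Excel by iterating until the numbers settle. Here is a small Python sketch with made-up team names and margins (everything here is hypothetical, not real results):

```python
# Hypothetical mini-season: the anchor team is held at 100, and every
# other team's rating is repeatedly re-averaged from its opponents'
# current ratings plus the point spreads. This mimics Excel's mutually
# referencing formulas without a circular-reference error.

games = [
    # (team, opponent, margin): margin = team's score minus opponent's
    ("LHBA", "TeamA", 20),   # LHBA beat TeamA by 20
    ("TeamB", "LHBA", 5),    # TeamB beat LHBA by 5
    ("TeamB", "TeamA", 22),  # TeamB beat TeamA by 22
]

def solve_ratings(games, anchor, anchor_rating=100.0, iterations=200):
    teams = {t for g in games for t in g[:2]}
    ratings = {t: anchor_rating for t in teams}
    for _ in range(iterations):
        for team in teams:
            if team == anchor:
                continue  # the anchor never moves
            # Each game implies a rating: opponent's rating +/- the spread.
            implied = [ratings[opp] + m for t, opp, m in games if t == team]
            implied += [ratings[t] - m for t, opp, m in games if opp == team]
            ratings[team] = sum(implied) / len(implied)
    return ratings

ratings = solve_ratings(games, "LHBA")
# TeamA settles near 81.0 and TeamB near 104.0
```

Each pass re-averages every non-anchor team from its opponents' current ratings plus the spreads, which is exactly why one team's slump drags its past opponents down a little, and a hot streak gives them a small boost.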
Hope that helps. I have enjoyed it. After August, I will pull the plug on 14U and focus on my grandson's high school stats and local rankings for his sixth grade brother.
It's been fun.