What can THOME tell us about 2018?

Just recently, I added a backward-looking element to the THOME Projections website: 2018 "projections."

Back in 2018, leading up to the season, I did a version of these projections that I kept to myself. They weren't quite as robust as the version of the algorithm I have since developed. Basically, they didn't have the fifth step listed on the FAQs page. After determining an Expected Win Percentage, I simply used that as each team's winning percentage for the purposes of the projected standings.
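For illustration, here's a minimal sketch of what that shortcut amounts to, assuming a hypothetical expected_win_pct mapping; the names and values are mine, not THOME's actual code or data:

```python
# Hypothetical sketch of the 2018 shortcut: scale each team's Expected Win
# Percentage straight to a 162-game record, with no simulation step.
expected_win_pct = {"BOS": 0.580, "BAL": 0.440}  # made-up example values

projected_wins = {team: round(ewp * 162) for team, ewp in expected_win_pct.items()}
print(projected_wins)  # {'BOS': 94, 'BAL': 71}
```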

So there was no Monte Carlo simulation of the season, and that meant two things: no normal distribution of outcomes from which I derive the Vegas Number Confidences, and a league-wide win total that did not come close to matching the league-wide loss total. I think there were something like 2,490 wins and 2,370 losses. There should be 2,430 of each: 30 teams playing 162 games apiece works out to 2,430 total games, and every game produces exactly one win and one loss. The current 2019 Projected Standings (as of this writing) show 2,436 wins and 2,424 losses, which is much closer to a possible reality.
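To show why the simulation step fixes both problems, here's a rough sketch of the general idea of a Monte Carlo season simulation. This is not THOME's actual implementation; the schedule, win-probability function, and parameter names are all placeholders I made up for illustration:

```python
import random

def simulate_season(schedule, win_prob, n_sims=10_000):
    """Play out every game of the schedule n_sims times and collect each
    team's win totals across simulations.

    schedule -- list of (home, away) matchups covering the full season
    win_prob -- function (home, away) -> probability the home team wins
    """
    teams = {t for matchup in schedule for t in matchup}
    totals = {t: [] for t in teams}  # team -> simulated win counts
    for _ in range(n_sims):
        wins = {t: 0 for t in teams}
        for home, away in schedule:
            # Every simulated game hands out exactly one win and one loss,
            # so league-wide wins and losses always balance: a real 30-team,
            # 162-game schedule yields 2,430 of each.
            winner = home if random.random() < win_prob(home, away) else away
            wins[winner] += 1
        for t in teams:
            totals[t].append(wins[t])
    return totals

def over_confidence(sim_wins, vegas_number):
    """Fraction of simulations in which a team clears its Vegas win total."""
    return sum(w > vegas_number for w in sim_wins) / len(sim_wins)
```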

But more importantly for picking which season-long wagers to place, I did not have confidences for each Over/Under number. I simply went with the teams whose projected win totals were furthest from the Vegas numbers. I picked eight teams this way, went 5-3 over the season, and came away with a return on investment of about 21%. Not bad at all for a six-month investment.
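As a rough illustration of that selection rule (the five-win cutoff here is my own placeholder; I'm not claiming it's the threshold I actually used in 2018):

```python
def pick_bets(projected_wins, vegas_totals, min_gap=5.0):
    """Return (team, side, gap) for every team whose projection differs
    from the Vegas win total by at least min_gap wins."""
    picks = []
    for team, proj in projected_wins.items():
        gap = proj - vegas_totals[team]
        if abs(gap) >= min_gap:
            picks.append((team, "Over" if gap > 0 else "Under", abs(gap)))
    # Biggest disagreements with Vegas first
    return sorted(picks, key=lambda p: p[2], reverse=True)
```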

But what if the THOME system that exists today existed 12 months ago? What would that have looked like? I refactored the algorithm over the last couple weeks to allow for this type of simulation. Let's start by looking at the Projected Standings:

A few things jump out at me here. First, THOME's projected record was within five wins of the actual record for 11 teams, including nailing the St. Louis Cardinals on the nose. THOME was off by 10 or more wins for 14 teams, including a delta of more than 20 for Boston (they were very good) and Baltimore (they were very bad).

So almost the same number of very close hits as very wild misses. Not bad. Now let's look at the Vegas Number Confidences, and what would have happened if you'd placed these bets:

Overall, THOME recommended the correct play for 20 out of 30 teams. I think it bears repeating that for a full two-thirds of the teams in Major League Baseball, THOME knew which way to play. If you had played all 30 recommendations, assuming average odds of -110 and an even risk across all wagers, you would have seen a return on investment of 27%.
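If you want to check that arithmetic, here's a small sketch of the ROI math for a flat-stake record at -110 pricing (lay 110 to win 100); the function name is my own:

```python
def roi(wins, losses, odds=110):
    """ROI for flat, even-sized stakes at a negative American price like -110:
    a winner returns 100/110 of a unit in profit, a loser costs the full unit."""
    profit = wins * (100 / odds) - losses
    return profit / (wins + losses)

print(f"{roi(20, 10):.1%}")  # ~27.3%, the all-30-recommendations figure
```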

If I told you that you could hand me any amount of money and in six months I'd hand it back to you with a 27% bonus on top, you'd be a fool not to do it.

Now, that having been said, most people aren't going to play all 30 of these. And admittedly, the teams where THOME felt most confident did not play out as well as the group as a whole. If you had played just the eight teams with a confidence above 90%, with the same assumptions as before, your ROI drops to 19.3%. That's still very good money at the end of the season.

Also, take a close look at the hits and misses in that group. The misses were small: Cleveland and Arizona by four wins each, and Miami by just two, for an average of 3.33. The hits came by generally bigger margins: SF by eight, Toronto by eight, the Yankees by five, the Angels by four, and the Mets by four, for an average of 5.8.

If the Marlins had won two more games, 5-3 becomes 6-2 and a 19.3% ROI becomes 43.2%. That's a huge swing on just two games in a 162-game season.
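Plugging both records into the same ROI sketch from above shows the swing:

```python
print(f"{roi(5, 3):.1%}")  # ~19.3% with the actual 5-3 record
print(f"{roi(6, 2):.1%}")  # ~43.2% if the Marlins flip two games
```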

At any rate, I think these "historical" pages will stand as a bit of proof that the system works. And hopefully you'll follow along with me for the 2019 season. Let's get that bread.