Data Platform Testing – what we learned from our test partners

Posted On May 01, 2024



James Bach once said that “pretty good testing is easy to do. Excellent testing is quite hard to do!”

From 2018 to 2020, our team of analysts collected data on 400+ matches across 3 different leagues – identifying new metrics, collecting the data and assessing its value & context.

When we decided to pivot towards the Player Development sector, we made a conscious decision to go back to the beginning & conduct extensive testing of our data platform with this sector too.

Since the beginning of this year, we have tested with clubs in the USA, Canada, Ireland and the UK – taking in clubs from the League of Ireland, Girls’ Academy, MLS Next, ECNL and other leagues. Across U13-U19 age levels, we have:

  • Tagged 97 games in their entirety – from indoor 20 min-a-side games to 120-min U19 Cup games
  • Collected just under 100k individual player data points
  • Spent close to 340 analyst hours collecting & reviewing the data
  • 18 different teams tagged across 8 test clubs
  • Over 1 hour of individual player video clips downloaded.

When we switched from a Talent ID focus (senior level) to a Player Development focus (U13-U19), we were happy with our existing data infrastructure. But there was a nagging question we wanted answered:

Does our performance data content relate to what clients want from a Player Development standpoint?

What did we learn?

After our very first test partner interaction, we realized that we were close – but not close enough. We learned:

  1. The young soccer player is pretty comfortable in possession across all 10 outfield positions – we needed to expand our possession data further than simply pass direction successes & failures.
  2. Certain players/positions are expected to be involved in more build-ups to efforts on goal than others. Can this data be collected & reported? After sizable testing, PlayerStat Data’s answer is YES!
  3. The ability to filter 900+ video clips per game by player, time and action proved to be one of the most important suggestions from our test partners. Adding tools that let users drill down through their searches on our video platform has been a massively positive outcome of this testing process.

Finally and perhaps most importantly:

Our test platform was initially targeted at the Academy Director/Technical Director/CEO figure at the club/academy. Allowing test partners to include team-specific coaches in the process (i.e. giving team Head Coaches their own login access) opened up a communication channel within the testers’ clubs – adding further value to their usage of the performance data throughout the test process.

We realized that open communication & enhanced network sharing will strengthen the power of our data platform for the user and, in turn, greatly magnify our clients’ ability to power their own Player Development pathways over each season.

Summary

Testing never stops.

Testing is an excellent way to engage with clients and work with them to design our data offerings to their specifications.

Testing enhances our own research efforts, revealing in granular detail what the market’s pain points are & what solutions it wants.

It helps us when we listen to those with the experience & expertise in the Player Development field, especially in the US & Canadian regions.

It helps our clients when we build a data platform that fits around their needs and solves their problems, not what we think will work for them.

It helps us both when the club/academy client enhances their own operations at the end of each season – building stronger for the future and powering their own Player Development pathways over time.
