Appropriate way to average a rate or ratio that can be infinite

Something I haven't been able to get my head around is the case where an average is needed, but one of the rate or ratio inputs may involve division by zero. Allow me to explain:

Suppose we have a pass-to-fail ratio stored in a floating-point type: passes / failures.

If we have 0 passes, the ratio contributes 0 to the sum being averaged. But if we have one or more passes and 0 failures, then depending on the language and datatype, the division will produce positive infinity, NaN (in the 0/0 case), or throw an exception.
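For concreteness, here is a minimal sketch of those outcomes in Java (an arbitrary choice; any IEEE 754 implementation behaves the same way for doubles):

```java
public class DivisionOutcomes {
    public static void main(String[] args) {
        // IEEE 754 doubles: division by zero yields Infinity or NaN, no exception.
        System.out.println(1.0 / 0.0); // prints "Infinity" (passes > 0, failures == 0)
        System.out.println(0.0 / 0.0); // prints "NaN"      (0 passes, 0 failures)

        // Integer division, by contrast, throws at runtime:
        int passes = 1, failures = 0;
        try {
            System.out.println(passes / failures);
        } catch (ArithmeticException e) {
            System.out.println(e.getMessage()); // prints "/ by zero"
        }
    }
}
```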

A real-world example of this possibility is the kill-death ratio counter found in most video games, where a player could go 14-0, for example.


Pass-to-fail ratio:

This dataset records the pass/fail counts for four test runs:

  • 3/5
  • 5/3
  • 0/7
  • 7/0

AVG: Infinity (the 7/0 entry evaluates to positive infinity, which dominates the sum)

Runnable source code for the example is available here.
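
For reference, a minimal sketch that reproduces the computation above, assuming the ratios are stored as doubles (the class name RatioAverage is illustrative, not taken from the linked code):

```java
public class RatioAverage {
    public static void main(String[] args) {
        // {passes, failures} for each test run listed above.
        double[][] runs = { {3, 5}, {5, 3}, {0, 7}, {7, 0} };
        double sum = 0.0;
        for (double[] run : runs) {
            sum += run[0] / run[1]; // 7.0 / 0.0 evaluates to Infinity
        }
        // Once the sum is Infinity, dividing by the count stays Infinity.
        System.out.println("AVG: " + sum / runs.length);
    }
}
```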

How would we resolve this situation to obtain a meaningful average?