
    Calculating the exact sequence values (e.g. using BigDecimal in Java) and only converting them to double at the end fails some of the random test cases.

    Example case:

    • signature = [4.0, 0.0, 8.0, 16.0, 9.0, 10.0, 16.0, 3.0, 6.0, 2.0, 19.0, 14.0, 5.0, 4.0, 4.0, 2.0, 7.0, 2.0, 7.0]
    • n = 69

    The exact sequence contains 18773643266614620, which can (just) be represented exactly as a double. The test, however, checks for 1.8773643266614612E16, which is 8 less than the exact value — presumably because the reference solution accumulates in double and loses precision along the way.
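    The "(just)" can be checked directly: both values lie between 2^54 and 2^55, where adjacent doubles are 4 apart, and both happen to be multiples of 4, so each round-trips through double exactly — they differ by two ulps. A minimal check:

    ```java
    public class UlpCheck {
        public static void main(String[] args) {
            long exact = 18773643266614620L;   // exact sequence value from the failing case
            long tested = 18773643266614612L;  // value the random test expects, 8 less

            // Between 2^54 and 2^55 the gap between adjacent doubles is 4.
            System.out.println(Math.ulp((double) exact));          // prints 4.0

            // Both values are multiples of 4, so both survive the conversion intact.
            System.out.println((long) (double) exact == exact);    // prints true
            System.out.println((long) (double) tested == tested);  // prints true
        }
    }
    ```

    So the mismatch is not a rounding artifact of the final conversion; the two doubles are genuinely distinct values two ulps apart.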

    Maybe the random tests should be limited to a maximum value below 2^52.
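    The reason a cap like that works: every integer up to 2^53 is exactly representable as a double, and 2^53 + 1 is the first integer that is not. A quick demonstration of the boundary:

    ```java
    public class SafeIntegerLimit {
        public static void main(String[] args) {
            long limit = 1L << 53; // 9007199254740992

            // 2^53 itself converts exactly...
            System.out.println((long) (double) limit == limit);       // prints true

            // ...but 2^53 + 1 rounds back down to 2^53 (round-half-to-even).
            System.out.println((long) (double) (limit + 1) == limit); // prints true
        }
    }
    ```

    Keeping the maximum below 2^53 (or, more conservatively, 2^52) would guarantee that the exact integer result and its double representation agree, so exact-arithmetic solutions and double-arithmetic solutions could no longer disagree on the expected value.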