  • Default User Avatar

    Well, actually the prompt made it crystal clear that they were ordered, and even said in which direction, with the extremes included: "Given a list of unique numbers sorted in ascending order, return a new list so that the values increment by 1 for each index from the min to the max value, both included"
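A minimal sketch of what the quoted prompt asks for (the function names are mine, not from the kata):

```python
def fill_range(arr):
    # The prompt guarantees a sorted, unique list, so the min is the
    # first element and the max is the last one.
    return list(range(arr[0], arr[-1] + 1))

def fill_range_any_order(arr):
    # Variant that does not rely on the ordering guarantee.
    return list(range(min(arr), max(arr) + 1))

assert fill_range([2, 5, 6]) == [2, 3, 4, 5, 6]
assert fill_range_any_order([6, 2, 5]) == [2, 3, 4, 5, 6]
```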

  • Default User Avatar

    This comment is hidden because it contains spoiler information about the solution

  • Custom User Avatar

    The task says "sorted in ascending order"!

  • Default User Avatar

    "Given a list of unique numbers sorted in ascending order" - unless the description was different at the time of your comment, they did specify it was ascending/unique. The solution works and I did the same. Yes, it doesn't work for disordered arrays, but that wasn't the task to complete.

  • Custom User Avatar

    While it works for this kata, it only does so due to limited test cases. The prompt gives no indication that the array can be assumed to be in order, and even asks for an ordered array from "minimum value up to the maximum value". Just something to keep in mind before assuming this solution is the best one.

  • Default User Avatar

    I think that doesn't work on a disordered array

  • Default User Avatar

    How does this work??
    It just returns an empty array if the input is a nested array, for example.

  • Custom User Avatar

    In this case however of comparing to another linear form, 2n is slower than n. In a real world example a performance minded programmer would never choose to code something that takes 10 seconds when they could do it in 5.

    As said by FArekkusu, that's where you're mistaken: you could compare O(2n) to O(n) only if all operations were of the same kind (if that were even possible...), which will never be the case because, unless you code it on purpose, going from one approach to the other you'll (almost) always end up using different tools (dict/set, for example). The specific performance of each of these tools then directly impacts the actual performance, and can even invert what your big-O is telling you. And even if there is no "inversion" of the performances, you'll never end up with a factor of 2 going from O(n) to O(2n).

    especially on a problem that says to optimize for speed.

    the actual meaning here is to avoid an O(n²) algo. ;)

    cheers

  • Default User Avatar

    I didn't claim mine was faster; however, to make my point I changed my code and now mine is faster. The point I was making was that the extra arr.count() call is unnecessary and adds time. You're right about complexity not being equivalent to processing time; this is due to the way the internal mechanisms in this problem perform their task. The set() call takes 80% of the run time, vs arr.count() which takes only 20%. If these both used vanilla loops they'd be closer, but the way they are implemented in Python makes one much different. Regardless, if we can solve the problem using only one costly iterative function instead of two, that's a win, especially on a problem that says to optimize for speed.
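Claims like "the set() call takes 80% of the time" are easy to check empirically. The sketch below times two plausible shapes of the competing approaches (the function names and the assumed task are mine; the hidden solutions may differ):

```python
import timeit
from collections import Counter

def via_set_and_count(arr):
    # Two-ish C-level passes: one to build the set, up to one count().
    a, b = set(arr)
    return a if arr.count(a) == 1 else b

def via_counter(arr):
    # A single pass, but with per-element dict bookkeeping overhead.
    for value, count in Counter(arr).items():
        if count == 1:
            return value

arr = [7] * 100_000 + [3]
assert via_set_and_count(arr) == via_counter(arr) == 3

# Numbers vary by machine and Python version; what matters is the
# ratio, which big-O analysis alone cannot predict.
for fn in (via_set_and_count, via_counter):
    t = timeit.timeit(lambda: fn(arr), number=50)
    print(f"{fn.__name__}: {t:.3f}s")
```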

  • Custom User Avatar

    It looks like you don't really understand what you're talking about. Time complexity is not an execution time metric.

    In a real world example a performance minded programmer would never choose to code something that takes 10 seconds when they could do it in 5.

    Your solution is actually 3.5-4 times slower than this one using Python 3, and 15-25 times slower using Python 2.

  • Default User Avatar

    Asymptotic complexity is only important when comparing to other canonical forms. In this case however of comparing to another linear form, 2n is slower than n. In a real world example a performance minded programmer would never choose to code something that takes 10 seconds when they could do it in 5.

  • Custom User Avatar

    Meaning you don't really understand what you're talking about when it comes to complexity, my dear... ;)

    To find the asymptotic complexity, you can go through coefficients, but since it is an asymptotic complexity you want, you have to simplify, and in the end O(2n) == O(n). So the asymptotic behavior of both approaches is the same, actually.

    Now if you're talking about raw performance, I bet the present code will always be faster than the other one because of the hashing overhead, while there are only ever 2 traversals here (note: yeah, there is hashing here too, but the fact is, sets are way faster than dicts).

    Conclusion: never take asymptotic behavior for an actual measurement of the code's performance "in the field". It only tells you how the performance behaves as n grows, that's all.

    EDIT: and look at the fork if you really want something fast. ;)
