  • Custom User Avatar

    @Jack-Hogerhuis: because it creates an array when that isn't necessary. The Reverse() method works on the string as well; there is no need to copy it to a char array first.
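
    For anyone who wants to see the difference side by side, here is a minimal C# sketch (the variable names are purely illustrative):

        using System;
        using System.Linq;

        class ReverseDemo
        {
            static void Main()
            {
                string word = "world";

                // Explicitly copying to a char array first, then reversing it in place:
                char[] chars = word.ToCharArray();
                Array.Reverse(chars);
                Console.WriteLine(new string(chars)); // dlrow

                // LINQ's Enumerable.Reverse accepts the string directly,
                // because string implements IEnumerable<char>:
                Console.WriteLine(string.Concat(word.Reverse())); // dlrow
            }
        }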

  • Custom User Avatar

    This comment is hidden because it contains spoiler information about the solution

  • Custom User Avatar

    it really should have a failing test on memory allocation, with some IEnumerable that generates billions of values.
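
    A rough sketch of the kind of test being suggested, assuming the solution accepts an IEnumerable<int> (the count, seed, and names are illustrative):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class LazySequenceDemo
        {
            // Lazily yields values one at a time; nothing is ever materialized in memory.
            static IEnumerable<int> ManyValues(long count)
            {
                var rng = new Random(42);
                for (long i = 0; i < count; i++)
                    yield return rng.Next();
            }

            static void Main()
            {
                var source = ManyValues(2_000_000_000L);

                // A streaming solution only ever holds one value at a time:
                Console.WriteLine(source.Min());

                // A solution that calls ToArray()/ToList() or sorts the sequence
                // would try to allocate billions of elements and fail on memory.
            }
        }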

  • Custom User Avatar

    regular C++ enjoyer

  • Custom User Avatar

    What I meant with my comment is just pure information: using O(N * log N) sorting is slower, and that's it. Maybe I shouldn't have written "horrible".
    In most business applications, where the code is not on a hot path (read: inside a tight loop), a performance difference like this does not matter, and I never wrote that it does.
    So my important point is that my comment never said a sorting solution is bad; I only said it has worse performance, which is fine in many cases, but I wanted people to be aware of it.

    To give another perspective: in katas on this site, the business context is usually not mentioned, so we don't know what constraints we should program to, and we don't know whether a solution would run on a performance-critical system or in a hello-world application. Both concise lazy solutions and high-performance solutions and ideas should therefore be supported for the sake of learning.

    People basically downvoted an informative comment that isn't lying and that gives further information to the learner...

    By the way, thanks for quasi-proving my point.

  • Custom User Avatar

    Reasons you may have been downvoted include:

    1. This is a 7 kyu kata marked Fundamentals and Arrays; it is not marked for Optimization.

    2. "Avoid premature optimization"; this code is very easy to understand and therefore should be very easy to maintain.

    3. A good sorting algorithm will be O(n ln(n)), versus the O(n) of going through the list once. Yes, sorting is expected to take longer, but is it significant?

    I wrote a program that
    (a) generates N random numbers,
    (b) solves by a minimum selection method, and
    (c) solves by sorting the array.

    For N = 2,000,000, sorting the array took about 1/4 second (versus perhaps 30 ms for the other method). Unless I'm looking to run this a lot of times in an outer loop, that time isn't worth fussing over. AND, if I am running it many times, I would first ask whether this is important for the overall program. If those 2,000,000 values are coming from a slow hard drive, the program will appear slow with either algorithm.

    If you're interested, at N = 20,000,000 the sorting took about 2 seconds (versus 80 ms). Here I would look at whether to optimize. Further, it took me a while to find my typo in the faster algorithm, which supports the maintainability aspect mentioned earlier.
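
    For reference, here is a rough C# sketch of the kind of comparison described above, assuming the task boils down to picking out the smallest value (timings will vary by machine and runtime; the single-pass minimum selection shown is just one way to do it):

        using System;
        using System.Diagnostics;
        using System.Linq;

        class MinBenchmark
        {
            static void Main()
            {
                const int n = 2_000_000;
                var rng = new Random(1);
                int[] data = Enumerable.Range(0, n).Select(_ => rng.Next()).ToArray();

                // (b) Single pass: keep the smallest value seen so far.
                var sw = Stopwatch.StartNew();
                int min = data[0];
                foreach (int x in data)
                    if (x < min) min = x;
                sw.Stop();
                Console.WriteLine($"single pass: {min} in {sw.ElapsedMilliseconds} ms");

                // (c) Sort a copy and take the first element.
                int[] copy = (int[])data.Clone();
                sw.Restart();
                Array.Sort(copy);
                sw.Stop();
                Console.WriteLine($"sort:        {copy[0]} in {sw.ElapsedMilliseconds} ms");
            }
        }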

  • Custom User Avatar

    Writing a reply since people heavily downvoted my comment.
    For those downvoting, I have a small note: compare the runtime performance of a modified minimum selection algorithm with that of the best sorting algorithm for this scenario...

  • Custom User Avatar

    This comment is hidden because it contains spoiler information about the solution

  • Custom User Avatar

    The second one was needed, as an error will be thrown without it

  • Custom User Avatar

    @jamescurran:
    I rephrased my comment to be simpler and more precise.

  • Custom User Avatar

    This comment is hidden because it contains spoiler information about the solution

  • Custom User Avatar

    duplicate issue (see @donaldsebleung's issue below)

  • Custom User Avatar

    You get used to them when you start using them.

  • Custom User Avatar

    This comment is hidden because it contains spoiler information about the solution
