It is indeed a use of recursion, though I, for one, don't think it's right to call it "Best Practises". To me this code is harder to understand than its iterative equivalent.
Interesting idea, setting the first value of `minimum_element` to `INT_MAX`, though I dislike declaring multiple variables in one "statement" (is that what it's called when you do `int a, b;`?).
What I don't get is why you don't declare `int i = 0` inside the for loop, instead of keeping `int i` alive for the whole function.
Indeed it looks like something that's "clever" (though at that level there isn't much room for cleverness, particularly in C). I, for example, dislike using while loops where a for loop works (the for loop also reads better to me), and dislike how it mutates the value.
Note: I'd be wary of using ASCII codes. I basically never see a reason to write into a `char` through anything other than `char ch = 'a';`. A numeric code is harder to read than the "normal" way of writing the character itself into the `char`.
100% agree, I never understood using while loops as if they were for loops.
That's irrelevant. Feet divided by feet per second = seconds.
I'm used to the C89 version, where variables (loop variables included) had to be declared at the beginning of the block, typically the top of the function.
British imperial system is really bad :\
Doesn't get more pythonic than that.