It's very pedantic, but he does have a point. It's similar to how you could view memory usage as O(1) regardless of the algorithm used, simply because a real computer doesn't have infinite memory, so there's always a fixed upper bound on it.
It's just that this isn't helpful at all when comparing algorithms, so we disregard that quirk and assume we're working with unbounded memory.
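To spell that finite-memory argument out (just a sketch; M here is an assumed constant bound on the machine's total memory, not something stated above): if an algorithm's memory usage mem(n) can never exceed M, then

\[
\mathrm{mem}(n) \le M \ \text{for all } n \quad\Longrightarrow\quad \mathrm{mem}(n) \in O(1) \quad (\text{take } c = M,\ n_0 = 1).
\]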
That's not how O notation works. I gave a longer answer in the other comment.
But you just completely ignored everything I said in that comment.
Mathematically, that is precisely how O notation works; it's just that (as I've mentioned) we don't use it like that, because it wouldn't give meaningful results. Plus, when looking at time, we can use O notation as usual, since a computer can, in principle, keep computing indefinitely, so running time, unlike memory, has no fixed upper bound.
Still, you're wrong to say that isn't how it works in general, which is easy to see if you look at the actual definition of O(g(n)).
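For reference, the standard definition being referred to (f is the cost function being bounded, g the comparison function):

\[
f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \in \mathbb{N} : \ f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0.
\]

Under that definition, any function bounded by a constant M is trivially in O(1), which is exactly the memory point above; for time there is no such machine-imposed constant, so the asymptotic classes stay distinct.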
Oh, and your computer crashing is something that could happen, sure, but it isn't taken into account in runtime analysis, because it only happens with a certain probability. If it happened after precisely three days every single time, you'd be correct and every algorithm would indeed have an upper bound on its running time too. But it doesn't, so we can't define such an upper bound, since there will always be computations that exceed it.