Patricia Shanahan <(E-Mail Removed)> writes:

> Lasse Reichstein Nielsen wrote:

>> It's O(n*m*k)
....

> I don't understand why you multiply together m and k.
Neither do I, now that you mention it. I should have said O(n*m+n*k),
but I was too busy multiplying.

> The cost of examining each element is O(n*k), and the cost of removing
> the elements that need removing is O(n*m*j), where j is the cost of
> shifting one element down the array. We can treat at least one of k and
> j as being one, because constant factors don't affect complexity, so
> that can reduce to O(n*k+n*m)=O(n*(k+m)). I was treating them both as
> being one unit, so I reduced it to O(n*(m+1)).
Indeed.
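To make the two costs concrete, here is a sketch (not from the original
post; `shouldRemove` is a hypothetical O(k) predicate) contrasting the
shift-per-removal approach, which pays the O(n*m) term, with a
single-pass compaction that avoids it:

```javascript
// Hypothetical example. shouldRemove is an O(k) predicate deciding
// which elements to drop; n is arr.length, m the number removed.

function removeNaive(arr, shouldRemove) {
  // splice shifts the whole tail down one slot on each removal:
  // O(n) per removal, so m removals cost O(n*m) on top of the
  // O(n*k) scan. Iterating backwards keeps the indices valid.
  for (var i = arr.length - 1; i >= 0; i--) {
    if (shouldRemove(arr[i])) {
      arr.splice(i, 1);
    }
  }
  return arr;
}

function removeCompact(arr, shouldRemove) {
  // Single pass: each surviving element is moved at most once,
  // so the whole operation is O(n*k) regardless of m.
  var write = 0;
  for (var read = 0; read < arr.length; read++) {
    if (!shouldRemove(arr[read])) {
      arr[write++] = arr[read];
    }
  }
  arr.length = write; // truncate the leftover tail
  return arr;
}
```

With few removals (small m) the two behave much the same, which is why
the worst case rarely bites in practice; as m approaches n, the naive
version degrades toward O(n^2).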

>> Worst case is O(n^2), so it's fair to use that as an upper bound on
>> the complexity of the algorithm.

> However, in many real applications of this sort of operation the number
> of removed elements is very small, so I think it is important to
> remember that the complexity depends on the removal rate.
Nah, being practical shouldn't get in the way of a good theory.

But of course, you are right. Knowing the actual problem being solved
is more important than general theory. Like Bubble sort being the best
sorting algorithm ... for almost-sorted lists.
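For the record, the variant that makes bubble sort attractive on
almost-sorted input is the early-exit one (a sketch, not from the
original post): a pass that performs no swaps proves the array is
sorted, so a nearly-sorted list needs only a few O(n) passes.

```javascript
function bubbleSort(arr) {
  // Early-exit bubble sort: stop as soon as a full pass makes no
  // swaps. On an already (or almost) sorted array this terminates
  // after one or a few O(n) passes; the worst case is still O(n^2).
  var swapped;
  do {
    swapped = false;
    for (var i = 1; i < arr.length; i++) {
      if (arr[i - 1] > arr[i]) {
        var tmp = arr[i - 1];
        arr[i - 1] = arr[i];
        arr[i] = tmp;
        swapped = true;
      }
    }
  } while (swapped);
  return arr;
}
```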

/L

--
Lasse Reichstein Nielsen - (E-Mail Removed)
DHTML Death Colors: <URL:http://www.infimum.dk/HTML/rasterTriangleDOM.html>

'Faith without judgement merely degrades the spirit divine.'