Velocity Reviews > Newsgroups > Programming > C++
Efficient data structures

 
 
tolgaceylanus@yahoo.com
08-15-2006

Also, I meant "in sorting", not "in searching". My posting is not very
related to the original question. I guess in this case the O(1) of
hashes can beat the O(log N) complexity of the sets for the 'lookup'
operation.

(assuming sets are typically implemented with red-black trees.)

(Shouldn't have posted at all.. :-}} )

Tolga Ceylan

Howard wrote:
>
> Linear (O(N)) is worse than logarithmic (O(log N)) on average. Perhaps
> you're thinking of O(N log N) when you say "logarithmic"? Or perhaps you
> mean constant time (O(1)) when you say "linear"?
>
> -Howard


 
Mark P
08-15-2006
Christian Christmann wrote:
> Hi,
>
> I'm looking for a data structure where I can store
> arbitrary elements and then efficiently check if an
> element (of the same type as the stored elements) is contained
> in the list. The latter operation is performed pretty
> frequently, so the data structure I require must handle
> it with low complexity.
>
> My idea was to use a STL set and then do something like
>
> myset.find( element2Search ) != myset.end()
>
> For sorted associative containers the "find" function
> has a logarithmic complexity. Are there any approaches
> with linear complexity?
>


Why would you want linear complexity when log is clearly faster? Or did
you mean constant time complexity? The canonical data structure for
this sort of problem is a hash table. Unfortunately this is not (yet)
part of standard C++ but it is a commonly provided extension.
 
Marcus Kwok
08-15-2006
(E-Mail Removed) wrote:
> Howard wrote:
>> Linear (O(N)) is worse than logarithmic (O(log N)) on average. Perhaps
>> you're thinking of O(N log N) when you say "logarithmic"? Or perhaps you
>> mean constant time (O(1)) when you say "linear"?

>
> Oops... "logarithmic" is the wrong word here. It should have been
> N log2 N


Just a small thing: O(N log N) == O(N log2 N). This is because Big-O
does not care about constant factors.

N log2 N == N (log N / log 2) == (1/log 2) * N log N

1/log 2 is a constant so it can be dropped.

--
Marcus Kwok
Replace 'invalid' with 'net' to reply
 
Jerry Coffin
08-16-2006
(E-Mail Removed) wrote:

[ ... ]

> Ok, I should have said: "You can't beat logarithmic complexity with the
> current standard library." (Hashes are part of TR1, however.) In any
> case, while hashes can certainly be helpful, their worst-case guarantee
> O(n) is obviously worse than logarithmic performance.


While the hash tables included in TR1 may have O(N) complexity in the
worst case, a hash table can be designed to provide logarithmic worst-
case complexity.

[ ... ]

> You can beat logarithmic complexity in average complexity but, as far
> as I know, not in worst-case complexity or with standard library
> functions.


That sounds reasonable to me.

--
Later,
Jerry.

The universe is a figment of its own imagination.
 
red floyd
08-16-2006
Marcus Kwok wrote:
>
> Just a small thing: O(N log N) == O(N log2 N). This is because Big-O
> does not care about constant factors.
>
> N log2 N == N (log N / log 2) == (1/log 2) * N log N
>
> 1/log 2 is a constant so it can be dropped.


I suspect that O(N log2 N) actually means O(N log^2 N)
 
mlimber
08-16-2006
Jerry Coffin wrote:
> While the hash tables included in TR1 may have O(N) complexity in the
> worst case, a hash table can be designed to provide logarithmic worst-
> case complexity.


Right you are, though of course it comes with its own trade-offs.

Cheers! --M

 
Marcus Kwok
08-16-2006
red floyd <(E-Mail Removed)> wrote:
> Marcus Kwok wrote:
>> Just a small thing: O(N log N) == O(N log2 N). This is because Big-O
>> does not care about constant factors.
>>
>> N log2 N == N (log N / log 2) == (1/log 2) * N log N
>>
>> 1/log 2 is a constant so it can be dropped.

>
> I suspect that O(N log2 N) actually means O(N log^2 N)


You could be right, I was just using the common convention that people
use to indicate the base. E.g., log10 is base-10 logarithm, so log2
would be base-2 logarithm.

--
Marcus Kwok
Replace 'invalid' with 'net' to reply
 