(E-Mail Removed) wrote in
news:(E-Mail Removed) oups.com:

> On 6 Feb., 14:31, Patricia Shanahan <(E-Mail Removed)> wrote:
>> (E-Mail Removed) wrote:
>> > Hi,
>> >
>> > I have a dataset that I multiply with another dataset. The number
>> > of multiplications is >5000 but constant. The computation time
>> > varies (~0.1-0.2 s) between datasets, although they are all the
>> > same size. What is the reason for this variation? Is it because of
>> > the zeros in the dataset, so that multiplying by zero is faster
>> > than any other multiplication, and the more zeros the faster? Or
>> > is it maybe a memory problem?
>>
>> There are all sorts of effects that could cause a 0.1 second
>> variation in time unless you have things really well locked down.
>>
>> Do repeated runs with the same dataset take the same amount of
>> time? In particular, try alternating runs with a "fast" and a
>> "slow" data set.
>>
>> Patricia
>
> Well, the problem is that I only have the timing results for the
> different datasets; I'm not able to rerun them anymore.
> BTW, the total processing time is approx. 3.5 sec.
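On the zeros question: on typical hardware, multiplying by an ordinary
zero takes the same time as any other multiply. What *can* be much
slower on some CPUs is arithmetic on subnormal (denormal) values, i.e.
nonzero doubles smaller than the smallest normal double, and that kind
of slow path would produce exactly this sort of data-dependent
variation. A quick aside (my own illustration, not from the thread)
showing what a subnormal double looks like in Java:

```java
public class SubnormalDemo {
    public static void main(String[] args) {
        // Double.MIN_VALUE is the smallest positive *subnormal* double;
        // Double.MIN_NORMAL is the smallest positive *normal* double.
        double subnormal = Double.MIN_VALUE;   // about 4.9e-324
        double normal = Double.MIN_NORMAL;     // about 2.2e-308
        System.out.println(subnormal > 0.0 && subnormal < normal); // true
        // Dividing the smallest normal value by 4 lands in the
        // subnormal range, where some FPUs take a slow microcode path.
        double tiny = normal / 4.0;
        System.out.println(tiny > 0.0 && tiny < normal); // true
    }
}
```

Whether your data actually contains subnormals is something only a look
at the values themselves can tell you.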
If you're trying to significantly speed up that 3.5 seconds, then I'd
suggest that attempting to optimize an operation that will, at best,
result in a 0.2 second improvement is a less-than-optimal use of your
time; there are undoubtedly other optimizations that could result in a
greater speed improvement.

Cheers
GRB