
Christopher 02-26-2013 09:43 PM

Arguing efficiency. Arg!
 
One of the tenured fellows I work with made a file with typedefs for time to long long types, defined his own NULL as the max value of long long, and wrote a plethora of functions to manually convert to and from XML strings and database strings, perform UTC and local conversions, etc.

I am arguing for the use of boost::optional<ptime>

When the time is invalid, it is readily apparent that it is invalid! Arithmetic is correct, and we don't have to worry about a false NULL value, exceeding the max value, or flipping over the minimum value. Initialization is more readable and maintainable, and there are a host of other arguments.
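
Roughly, the contrast I am arguing for looks like this (a sketch only; the type and constant names here are mine for illustration, not the actual code):

#include <boost/date_time/posix_time/posix_time.hpp>
#include <boost/optional.hpp>
#include <limits>

// The hand-rolled scheme: a plain integer with a magic "null" value.
typedef long long TimeValue;
const TimeValue TIME_NULL = std::numeric_limits<long long>::max();

// The alternative: the "no value" state is part of the type.
typedef boost::optional<boost::posix_time::ptime> OptionalTime;

void example()
{
    TimeValue raw = TIME_NULL;        // nothing stops arithmetic on this sentinel
    (void)raw;

    OptionalTime t;                   // clearly "not a time" yet
    if (t)
    {
        boost::posix_time::ptime when = *t;        // only reachable when valid
        when += boost::posix_time::seconds(30);    // ordinary ptime arithmetic
    }
}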

The counter argument is speed.

Well, he is correct in that boost::optional<ptime> is slower. In the same way, he is correct that using stringstreams for conversions between strings and integral types is slower than using functions from the C runtime.

I already knew this before I wrote a performance test.

My problem with the way my performance test is being interpreted is that they will say, "Look! It takes 5 times as long over a million iterations!" Well, if one method takes 1 millisecond and another takes 2 milliseconds, isn't the difference that shows up simply going to be one million * 1 millisecond?
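
For what it's worth, the kind of harness I mean is along these lines (a stripped-down sketch; the timed body is a stand-in, not the real conversion code). The useful number is the per-call cost, not just the raw total:

#include <chrono>
#include <iostream>
#include <sstream>

int main()
{
    const int iterations = 1000000;
    long long sum = 0;

    const std::chrono::steady_clock::time_point start =
        std::chrono::steady_clock::now();
    for (int i = 0; i < iterations; ++i) {
        std::istringstream in("1361915000");   // stand-in conversion
        long long value = 0;
        in >> value;
        sum += value;
    }
    const std::chrono::steady_clock::time_point stop =
        std::chrono::steady_clock::now();

    const double total_us =
        std::chrono::duration_cast<std::chrono::microseconds>(stop - start).count();
    std::cout << "total: " << total_us << " us, per call: "
              << total_us / iterations << " us  (checksum " << sum << ")\n";
}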

How do I argue that the difference in performance is negligible? Is it negligible? Or maybe I am just being hard headed in my desire to get away from C-style source code?

What say you?







Ian Collins 02-26-2013 09:53 PM

Re: Arguing efficiency. Arg!
 
Christopher wrote:

Please wrap your lines!

> One of the tenured fellows I work with made a file with typedefs for
> time to long long types, defined his own NULL as the max value of
> long long, and wrote a plethora of functions to manually convert to
> and from XML strings and database strings, perform UTC and local
> conversions, etc.


Sounds like an academic...

> I am arguing for the use of boost::optional<ptime>
>
> When the time is invalid, it is readily apparent that it is invalid!
> Arithmetic is correct, and we don't have to worry about a false NULL
> value, exceeding the max value, or flipping over the minimum value.
> Initialization is more readable and maintainable, and there are a
> host of other arguments.
>
> The counter argument is speed.


It often is...

> Well, he is correct in that boost::optional<ptime> is slower. In the
> same way, he is correct that using stringstreams for conversions
> between strings and integral types is slower than using functions
> from the C runtime.
>
> I already knew this before I wrote a performance test.


So does it matter in the context of the application? Does shaving a few
cycles from a string conversion matter when the string is read from and
written to a database, or serialised as XML?

> My problem with the way my performance test is being interpreted is
> that they will say, "Look! It takes 5 times as long over a million
> iterations!" Well, if one method takes 1 millisecond and another
> takes 2 milliseconds, isn't the difference that shows up simply
> going to be one million * 1 millisecond?


Ask them to add a million database or XML writes to the test!

> How do I argue that the difference in performance is negligible? Is
> it negligible? Or maybe I am just being hard headed in my desire to
> get away from C-style source code?


What constitutes negligible depends on the context. If the application
really is doing millions of conversions in a time critical loop, then
optimising those operations is worth the effort. If it isn't, the
effort is probably a waste of time and money, not to mention a
possible source of bugs and maintenance arse aches.

--
Ian Collins

Öö Tiib 02-26-2013 09:58 PM

Re: Arguing efficiency. Arg!
 
On Tuesday, 26 February 2013 23:43:12 UTC+2, Christopher wrote:
> One of the tenured fellows I work with made a file with typedefs for time
> to long long types, defined his own NULL as max value of long long, and
> wrote a plethora of functions to manually perform conversion to and from
> XML string, to database strings, perform UTC and local conversions etc.


Time (the Gregorian calendar and POSIX time) is a thing that every software developer should implement at least once in his life.
It is inevitable that he fails most miserably on the first attempt.

> I am arguing for the use of boost::optional<ptime>


Nooo. Take Boost.Posix Time and Boost.Gregorian if you have boost!!!

> What say you?


Prove that his crap is not working, unlike Boost. It takes only a few tests.
Trust me, it does not work.
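
For comparison, a minimal sketch of what Boost.Date_Time already handles out of the box (parsing, arithmetic, formatting; the input string here is made up):

#include <boost/date_time/posix_time/posix_time.hpp>
#include <iostream>

int main()
{
    using namespace boost::posix_time;

    ptime t = time_from_string("2013-02-26 21:43:12");   // parsing
    ptime later = t + hours(3) + minutes(30);            // calendar-aware arithmetic

    std::cout << to_iso_extended_string(later) << "\n";  // 2013-02-27T01:13:12
    std::cout << to_simple_string(later - t) << "\n";    // 03:30:00
}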

Christopher 02-26-2013 09:59 PM

Re: Arguing efficiency. Arg!
 
I'd say string to integral conversions, and back, probably occur around 50,000
times from the time data comes into the system to the time it goes out.

Time to string and string to time conversions occur much less often,
somewhere between 10 and 100 times.

Data is theoretically supposed to make its way through in less than a second.

Sorry about the line formatting. It's been a while since I was on a newsgroup.
I remember we had this problem before. I'll have to try and find a suitable
news client for work.




Ian Collins 02-26-2013 10:08 PM

Re: Arguing efficiency. Arg!
 
Christopher wrote:
> I'd say string to integral conversions, and back, probably occur around 50,000
> times from the time data comes into the system to the time it goes out.
>
> Time to string and string to time conversions occur much less often,
> somewhere between 10 and 100 times.
>
> Data is theoretically supposed to make its way through in less than a second.


So does it? That's the test that matters.

There's nothing wrong with using the C standard library string
conversion functions if you know in advance which types you are using
and their sizes. iostreams win when you don't know the types or sizes,
in templates for example.
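
Roughly the distinction I mean (a sketch only, error handling left out; strtoll assumes a C99/C++11 runtime):

#include <cstdlib>
#include <sstream>
#include <string>

// Type and base known up front: the C runtime call is direct and cheap.
long long parse_known(const std::string& s)
{
    return std::strtoll(s.c_str(), NULL, 10);
}

// Type not known until instantiation: a stringstream handles any
// streamable T without a per-type overload.
template <typename T>
T parse_any(const std::string& s)
{
    std::istringstream in(s);
    T value = T();
    in >> value;
    return value;
}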

> Sorry about the line formatting. It's been a while since I was on a newsgroup.
> I remember we had this problem before. I'll have to try and find a suitable
> news client for work.


Google do their utmost to make correct Usenet posting impossible!

--
Ian Collins

Christopher 03-01-2013 01:50 PM

Re: Arguing efficiency. Arg!
 
On Wednesday, February 27, 2013 3:33:11 AM UTC-6, Andy Champ wrote:
> On 26/02/2013 21:59, Christopher wrote:
> > I'd say string to integral conversions, and back, probably occur around 50,000
> > times from the time data comes into the system to the time it goes out.
> >
> > Time to string and string to time conversions occur much less often,
> > somewhere between 10 and 100 times.
> >
> > Data is theoretically supposed to make its way through in less than a second.
>
> When you profile the application does it really spend a significant
> amount of time converting to and from strings? If not, look where it
> really spends the time. If so I'd start by asking why you have to
> convert in the first place.
>
> You _have_ profiled it, haven't you?
>
> Andy



Absolutely.

Performance is a huge problem; however, so is stability. I will always take stability over performance. Running fast is worthless when it doesn't run.

There are much larger problems, but those huge problems require huge overhauls. For instance, quite a bit of data is represented as text rather than as more suitable primitive types and classes, which, in profiling, took 130x or worse to process. You know how these things go with supervisors: "It takes time?! Leave it alone!"

Arguments like streams vs. C-style are something managers want to address, because it is a going-forward decision rather than a let's-go-back-and-change-the-code decision.

