#define with semicolon

 
 
Keith Thompson
07-13-2011
Joe Pfeiffer <(E-Mail Removed)> writes:
> cc <(E-Mail Removed)> writes:
>>> On Jul 13, 3:38 pm, Keith Thompson <(E-Mail Removed)> wrote:
>>> cc <(E-Mail Removed)> writes:
>>> > Is it acceptable practice to have a #define with a semicolon in it,
>>> > such as:
>>>
>>> > #define SMALL 1;
>>>
>>> > I didn't think it was, but a very good friend of mine claims it's
>>> > perfectly acceptable if you want to prevent the #define from being
>>> > used in an expression like if(SMALL).
>>>
>>> Why would you want to prevent it from being used in an expression?
>>> I think "1;" is a poor example of what your friend is talking about.
>>> I'd be interested in seeing a better example.

>>
>> That was his example. That was also his explanation of why he did it
>> (so the compiler would complain if he used it as an expression).
>>
>> Another example was from the linux kernel.
>>
>> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
>> #define LDO_MAX_VOLT 3300;

>
> I was curious enough I went and looked that one up -- it's the only
> #define in the file that ends with a semicolon (even LDO_MIN_VOLT
> doesn't), and a recursive grep fails to turn the symbol up anywhere else
> in the kernel. I'm guessing the reason for this one was an
> overly-clever way of keeping anybody from using it (for anything!) in
> what seems to be a fairly new driver.


I'm guessing that it's just a mistake that nobody has fixed yet.

Adding the semicolon won't keep it from being used. In many cases, it
won't change anything:

    voltage = LDO_MAX_VOLT;

and in others it can silently change the meaning of the code:

    voltage = LDO_MAX_VOLT + 1;
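
Spelling out the expansion (using the kernel's "#define LDO_MAX_VOLT 3300;"):

    voltage = LDO_MAX_VOLT;      /* becomes: voltage = 3300;;
                                    the extra ";" is a harmless null statement */

    voltage = LDO_MAX_VOLT + 1;  /* becomes: voltage = 3300; + 1;
                                    assigns 3300, then "+ 1;" stands alone as a
                                    valid expression statement -- no diagnostic,
                                    and the intended +1 is silently lost */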

--
Keith Thompson (The_Other_Keith) (E-Mail Removed) <http://www.ghoti.net/~kst>
Nokia
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
 
Barry Schwarz
07-14-2011
On Wed, 13 Jul 2011 11:19:53 -0700 (PDT), cc <(E-Mail Removed)>
wrote:

>Is it acceptable practice to have a #define with a semicolon in it,
>such as:
>
>#define SMALL 1;
>
>I didn't think it was, but a very good friend of mine claims it's
>perfectly acceptable if you want to prevent the #define from being
>used in an expression like if(SMALL).


Acceptable is in the eye of the beholder. If you are at work, it is
whatever standards your company adopts. If you are at home, it is
whatever your preference is.

The only thing perfect about it is that it is perfectly legal syntax.

Personally, I don't find it acceptable at all, but there is no reason
why you or anyone else reading this should care what I think.

--
Remove del for email
 
Ben Bacarisse
07-14-2011
cc <(E-Mail Removed)> writes:
<snip>
> Another example was from the linux kernel.
>
> /usr/src/linux-3.0.0-rc7-mainline/include/linux/mfd/tps65910.h:
> #define LDO_MAX_VOLT 3300;


It's quite clear from the context that it's a typo. Fortunately the
macro is not used anywhere!

<snip>
--
Ben.
 
Hallvard B Furuseth
07-14-2011
Keith Thompson writes:
>> Right. So you see no logical reason to ever use something like #define
>> SMALL 1;? I don't either, but I was just making sure there wasn't
>> something I missed.

>
> I won't say there's *never* a reason to do something like that.
> There are cases where macros will expand to something other than
> an expression or a statement. It usually means you're messing with
> the language syntax, which is dangerous but *sometimes* useful.
>
> Many years ago, I wrote something like:
>
>     #define EVER ;;
>     ...
>     for (EVER) {
>         ...
>     }
>
> but I got better.


More generally, you can end up with such strange-looking macros if you
use the preprocessor to extend the language. E.g. macros for a poor
man's exception facility, typically with setjmp at the core. These
might also use ugliness like

#define FOO_BEGIN(x) { <something>
#define FOO_END(x) <something else> }

You should then try to keep the ugliness inside macro definitions,
so code using the macros will not look too bad. That can lead to
strange definitions like '#define SMALL 1;'.
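
For concreteness, here is a minimal sketch of such a setjmp-based
facility (hypothetical names, a single non-nested handler assumed):

    #include <setjmp.h>

    static jmp_buf foo_env;          /* one handler; real code would keep a stack */

    #define FOO_TRY     if (setjmp(foo_env) == 0) {
    #define FOO_CATCH   } else {
    #define FOO_END     }
    #define FOO_THROW() longjmp(foo_env, 1)

    /* usage:
         FOO_TRY
             risky_operation();      -- may call FOO_THROW()
         FOO_CATCH
             recover();
         FOO_END
    */

Note that FOO_TRY and FOO_CATCH each contain an unbalanced brace, just
like FOO_BEGIN/FOO_END above.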

You might even deliberately design the facility so the 'x' macro
parameter above takes an argument of the form '<number> <semicolon>',
if user code is always supposed to pass a macro like SMALL, never a
number. User code could still be naughty and pass FOO_BEGIN(1;)
directly instead of FOO_BEGIN(SMALL), but that will at least look
strange. Maybe that's what the OP's friend is talking about. However,
it makes no sense to speak of that in isolation, without reference to
the macro set which uses SMALL.

--
Hallvard
 
Phil Carmody
07-14-2011
Joe Pfeiffer <(E-Mail Removed)> writes:
<snip>
> I was curious enough I went and looked that one up -- it's the only
> #define in the file that ends with a semicolon (even LDO_MIN_VOLT
> doesn't), and a recursive grep fails to turn the symbol up anywhere else
> in the kernel. I'm guessing the reason for this one was an
> overly-clever way of keeping anybody from using it (for anything!) in
> what seems to be a fairly new driver.


There's worse.

$ git grep define\ DELAY_1

Ug.

Phil
--
"At least you know where you are with Microsoft."
"True. I just wish I'd brought a paddle." -- Matthew Vernon
 
Walter Banks
07-14-2011

cc wrote:

> Is it acceptable practice to have a #define with a semicolon in it,
> such as:
>
> #define SMALL 1;
>
> I didn't think it was, but a very good friend of mine claims it's
> perfectly acceptable if you want to prevent the #define from being
> used in an expression like if(SMALL).


I wish I had a nickel for every customer call I have taken over the last
30 years where a semicolon at the end of a #define changed an
expression. The worst ones are when an expression is split in two and
creates two valid statements. No compiler errors or warnings, just
application anguish.
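
A sketch of the split-statement case he describes (names are made up):

    #define SCALE 8;                    /* stray semicolon */

    total = count * SCALE + offset;
    /* expands to:  total = count * 8; + offset;
       two valid statements -- offset is silently dropped,
       and no diagnostic is required */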

Regards,

--
Walter Banks
Byte Craft Limited
http://www.bytecraft.com





 
Gene
07-14-2011
On Jul 13, 4:30 pm, Keith Thompson <(E-Mail Removed)> wrote:
> I won't say there's *never* a reason to do something like that.
> There are cases where macros will expand to something other than
> an expression or a statement. It usually means you're messing with
> the language syntax, which is dangerous but *sometimes* useful.
>
> Many years ago, I wrote something like:
>
>     #define EVER ;;
>     ...
>     for (EVER) {
>         ...
>     }
>
> but I got better.
>
> --
> Keith Thompson (The_Other_Keith) (E-Mail Removed) <http://www.ghoti.net/~kst>


A recovering macroholic?
 
cc
07-15-2011
On Jul 13, 2:19 pm, cc <(E-Mail Removed)> wrote:
> Is it acceptable practice to have a #define with a semicolon in it,
> such as:
>
> #define SMALL 1;
>
> I didn't think it was, but a very good friend of mine claims it's
> perfectly acceptable if you want to prevent the #define from being
> used in an expression like if(SMALL).


It seems as though some people have taken issue with my
characterization of the situation. First off, it wasn't a very good
friend, but actually someone I don't even know. It was a Usenet thing.
That was supposed to be a joke for others reading, but one person was
very upset and called me a liar. So no, it wasn't "a very good
friend." Also, they seem to have an issue with the way I presented the
situation. Here is the full post:

"'The semi-colon will be expanded as part of the macro, causing the
printf to fail to compile.'

Correct - but in reality what I actually do is exploit that to make it
intentionally fail!

e.g. I could easily write


if(SMALL) { do something } else { do something else }


That is bad programming - for the most part, I know I would never
write if(SMALL) ... because if I set SMALL to 2, 3, or 4, then
everything is OK when configuring the software, but if I accidentally
set SMALL to 0, the execution of the if() statement will change, and
that would have been an unintentional side effect.

If I accidentally wrote the code with if(SMALL), it would not fail,
and the mistake is especially hard to spot if it is buried in a
complex formula. And there is no warning of impending doom.

So by putting a semicolon in #define SMALL 1; I've made sure that it
is guaranteed to fail to compile when used out of context."

So that's the whole quote (which I see as no different from what I
said before), so if you feel differently about it being poor coding
practice I would like to hear why. Also, I'm sorry I jokingly called
someone I don't know "my very good friend." Thanks.
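
To make the mechanism he's relying on concrete, a quick sketch:

    #define SMALL 1;

    if (SMALL)             /* expands to: if (1;)  -- syntax error, caught */
        do_small();
    x = SMALL + y;         /* expands to: x = 1; + y;  -- compiles, wrong  */

So the semicolon does force a compile-time failure for if(SMALL), but
at the price of the silent-change hazard others have pointed out.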

 
Anders Wegge Keller
07-15-2011
cc <(E-Mail Removed)> writes:

    if (some_condition)
        some_var = SMALL + some_other_var++;


I wish you a merry time debugging code like this.
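
(With "#define SMALL 1;" that expands to

    if (some_condition)
        some_var = 1;      /* the if guards only this assignment */
    + some_other_var++;    /* runs unconditionally, yet compiles cleanly */

so the increment silently escapes the condition.)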

--
/Wegge

Looking for redundant peering of dk.*,linux.debian.*
 
Roberto Waltman
07-15-2011
Keith Thompson wrote:
>Many years ago, I wrote something like:
>
>     #define EVER ;;
>     ...
>     for (EVER) {
>         ...
>     }
>
>but I got better.


I am tempted to do that often, because with some compilers this,

    while (1) { ... }

generates a warning about the "expression being constant", while your
example is accepted silently.
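
The direct spelling, without the macro, avoids the warning as well:

    for (;;) {
        /* ... */
    }

which is, after all, exactly what for (EVER) expands to.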

--
Roberto Waltman

[ Please reply to the group.
Return address is invalid ]
 