Velocity Reviews - C Programming

Re: short int always 16 bits or not?
(http://www.velocityreviews.com/forums/t959949-re-short-int-always-16-bits-or-not.html)

Malcolm McLean 04-20-2013 08:49 AM

Re: short int always 16 bits or not?
 
On Saturday, April 20, 2013 1:14:26 AM UTC+1, Shriramana Sharma wrote:
> Hello. I am reading the C99 standard as available from: http://www.open-std.org/jtc1/sc22/WG...docs/n1256.pdf
>
> I note that it specifies (on p 34) macros defining the minimum and maximum
> values of a short int corresponding to a size of 16 bits. However it doesn't
> explicitly say that short int-s should be of 16 bits size. So can I trust
> short int-s to be 16 bits size or not?
>
> Also, doesn't prescribing #define-s for integer type min/max values conflict
> with the general (?) understanding that the size of these types are
> implementation defined? I mean, is the general understanding wrong? (For
> instance see: http://en.wikipedia.org/wiki/Short_i...e_b_grp_notesc)
>
>
>
> Finally, why would anyone want char to be other than 8 bits? *Is* char on any platform *not* 8 bits?
>
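
On the first question: the standard only pins down minimum magnitudes (SHRT_MAX
at least +32767, SHRT_MIN at most -32767), so a short int must cover at least a
16-bit range, but it is allowed to be wider. If you want to verify what a
particular implementation gives you, a minimal compile-time check (assuming a
C11 compiler for _Static_assert) might look like this:

    #include <limits.h>

    /* Guaranteed by the standard on any conforming implementation,
       whatever the actual width of short happens to be. */
    _Static_assert(SHRT_MAX >= 32767, "short covers at least +32767");
    _Static_assert(SHRT_MIN <= -32767, "short reaches at least -32767");

    /* A non-portable extra: fail the build if this platform's short
       has a wider range than a plain 16-bit type would give. */
    _Static_assert(SHRT_MAX == 32767, "this code assumes a 16-bit short");

    int main(void) { return 0; }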


Dennis Ritchie, who designed C, made a mistake by making "char" (a variable that
holds a character in a human-readable language) and "byte" (the smallest
addressable unit of memory) the same thing.
256 characters aren't enough for some purposes. And whilst most computers use
8-bit bytes internally, this isn't universal, particularly on big machines.

So C has a bit of a problem. The solution, which sort of works, is to allow char
to be more than 8 bits on some platforms, which addresses the byte issue, and to
introduce wchar_t, which addresses the big-alphabet issue.
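
To see what a given platform actually does, a tiny probe (just standard
<limits.h> and sizeof; the printed values are whatever the implementation
provides) is enough:

    #include <stdio.h>
    #include <limits.h>
    #include <wchar.h>

    int main(void)
    {
        /* CHAR_BIT is the number of bits in a byte: 8 on most desktop
           systems, but larger on some DSPs and word-addressed machines. */
        printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);

        /* wchar_t is the "big alphabet" type: wide enough to hold any
           character of the largest supported extended character set. */
        printf("sizeof(wchar_t): %zu bytes\n", sizeof(wchar_t));
        return 0;
    }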

As for redefining every basic type, this is often done by people with a limited
understanding of software engineering, who think that they are making the
program more robust by allowing the possibility of redefining the type. In
practice, actually redefining the type is almost certain to break things, and
the introduction of new types causes more problems than it solves; in
particular, it makes it hard to integrate code from two programs.

--
Malcolm's website
http://www.malcolmmclean.site11.com/www

James Kuyper 04-20-2013 10:59 AM

Re: short int always 16 bits or not?
 
On 04/20/2013 04:49 AM, Malcolm McLean wrote:
> On Saturday, April 20, 2013 1:14:26 AM UTC+1, Shriramana Sharma wrote:

....
>> Finally, why would anyone want char to be other than 8 bits? *Is* char on any platform *not* 8 bits?
>>

>
> Dennis Ritchie, who designed C, made a mistake by making "char" (a variable that
> holds a character in a human-readable language) and "byte" (the smallest
> addressable unit of memory) the same thing.


Not quite. 'char' is a data type, while 'byte' is a unit for measuring
the amount of memory required to store an object. As a data type, 'char'
has an integer conversion rank, and if signed, it might have either 1's
complement, 2's complement, or sign-magnitude representation. As a unit
for measuring storage, a byte has none of those things. He decided to
make sizeof(char) == 1 byte.
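
A short sketch of the distinction (everything here is standard C; the output
is of course implementation-specific):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* sizeof counts bytes, and char occupies exactly one byte by
           definition, however many bits (CHAR_BIT) that byte holds. */
        printf("sizeof(char) = %zu (always 1)\n", sizeof(char));
        printf("sizeof(int)  = %zu bytes\n", sizeof(int));

        /* As a data type, plain char also has a signedness, which a
           byte as a mere unit of storage does not. */
        printf("plain char is %s\n", CHAR_MIN < 0 ? "signed" : "unsigned");
        return 0;
    }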

C would arguably have been better if designed from the start with
something similar to the current wchar_t and size-named types,
preferably with different names, rather than with char, short, int, and
long. I'd recommend thinking along those lines when designing a new
language. However, it would break too much legacy code to ever move C in
that direction.
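
(For reference, the "size-named types" are the ones C99 added in <stdint.h>;
a quick illustration, with <inttypes.h> supplying the matching printf macros:)

    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        /* Exact-width types: provided only where the hardware supports
           a type of exactly that width with no padding. */
        int16_t a = -12345;
        uint8_t b = 200;

        /* Least-width types: always available, at least that wide. */
        int_least16_t c = 32000;

        printf("%" PRId16 " %" PRIu8 " %" PRIdLEAST16 "\n", a, b, c);
        return 0;
    }
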
--
James Kuyper

Eric Sosman 04-20-2013 12:58 PM

Re: short int always 16 bits or not?
 
On 4/20/2013 6:59 AM, James Kuyper wrote:
> On 04/20/2013 04:49 AM, Malcolm McLean wrote:
>> On Saturday, April 20, 2013 1:14:26 AM UTC+1, Shriramana Sharma wrote:

> ...
>>> Finally, why would anyone want char to be other than 8 bits? *Is* char on any platform *not* 8 bits?
>>>

>>
>> Dennis Ritchie, who designed C, made a mistake by making "char" (a variable that
>> holds a character in a human-readable language) and "byte" (the smallest
>> addressable unit of memory) the same thing.

>
> Not quite. 'char' is a data type, while 'byte' is a unit for measuring
> the amount of memory required to store an object. As a data type, 'char'
> has an integer conversion rank, and if signed, it might have either 1's
> complement, 2's complement, or sign-magnitude representation. As a unit
> for measuring storage, a byte has none of those things. He decided to
> make sizeof(char) == 1 byte.
>
> C would arguably have been better if designed from the start with
> something similar to the current wchar_t and size-named types,
> preferably with different names, rather than with char, short, int, and
> long. I'd recommend thinking along those lines when designing a new
> language. However, it would break too much legacy code to ever move C in
> that direction.


Also, keep in mind the amount of memory on the machines where
early C and Unix were born. Quoth one DMR:

"During [B's] development, [Thompson] continually struggled
against memory limitations: each language addition inflated
the compiler so it could barely fit, but each rewrite taking
advantage of the feature reduced its size."

In that sort of environment, one hasn't the luxury of adding every
desirable feature.

--
Eric Sosman
esosman@comcast-dot-net.invalid

