Velocity Reviews > Newsgroups > Programming > C++
Re: C++ way to convert ASCII digits to Integer?

 
 
andreas.koestler@googlemail.com
05-27-2009
On May 27, 9:18 am, "Peter Olcott" <(E-Mail Removed)> wrote:
> I remember that there is a clean C++ way to do this, but, I
> forgot what it was.


I don't know what you mean by 'clean C++ way' but one way to do it is:

int ascii_digit_to_int ( const char asciidigit ) {
    if ( asciidigit < '0' || asciidigit > '9' ) {
        throw NotADigitException();
    }
    return (int) asciidigit - 48; // 48 => '0'
}
Or you can use atoi or similar.
Or you use the std::stringstream:

std::stringstream sstr("3");
int value;
sstr >> value;

Hope that helps
Andreas
 
Bart van Ingen Schenau
05-27-2009
(E-Mail Removed) wrote:

> On May 27, 9:18 am, "Peter Olcott" <(E-Mail Removed)> wrote:
>> I remember that there is a clean C++ way to do this, but, I
>> forgot what it was.
>
> I don't know what you mean by 'clean C++ way' but one way to do it is:
>
> int ascii_digit_to_int ( const char asciidigit ) {
>     if ( asciidigit < '0' ||
>          asciidigit > '9' ) {
>         throw NotADigitException();
>     }
>     return (int) asciidigit - 48; // 48 => '0'


To make the function usable also on non-ASCII implementations, and to
remove the need for a comment, you should write that last line as:
return asciidigit - '0';

> }
> Or you can use atoi or similar.


Better use strtol rather than atoi. It provides much better behaviour in
error situations.
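For instance, a small wrapper along these lines (the helper name is mine, just an illustration) reports the failures that atoi silently swallows:

```cpp
#include <cerrno>
#include <climits>
#include <cstdlib>

// Hypothetical helper: convert a whole C string to int, returning
// false on empty input, trailing junk, or out-of-range values --
// all cases where atoi would just return 0 or misbehave silently.
bool string_to_int(const char* s, int& out)
{
    char* end = 0;
    errno = 0;
    long v = std::strtol(s, &end, 10);
    if (end == s || *end != '\0' || errno == ERANGE
        || v < INT_MIN || v > INT_MAX) {
        return false;
    }
    out = static_cast<int>(v);
    return true;
}
```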

> Or you use the std::stringstream:
>
> std::stringstream sstr("3");
> int value;
> sstr >> value;


Or you use boost::lexical_cast<> (which uses stringstreams internally,
but with proper error handling).
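The error handling it adds can be sketched in plain standard C++, roughly along these lines (a simplification, not the actual lexical_cast implementation):

```cpp
#include <istream>
#include <sstream>
#include <stdexcept>
#include <string>

// Sketch of the lexical_cast idea: extract an int from a stringstream
// and insist that nothing but whitespace remains, throwing on failure.
int cast_to_int(const std::string& text)
{
    std::istringstream in(text);
    int value;
    if (!(in >> value) || !(in >> std::ws).eof()) {
        throw std::runtime_error("bad conversion: " + text);
    }
    return value;
}
```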

>
> Hope that helps
> Andreas


Bart v Ingen Schenau
--
a.c.l.l.c-c++ FAQ: http://www.comeaucomputing.com/learn/faq
c.l.c FAQ: http://c-faq.com/
c.l.c++ FAQ: http://www.parashift.com/c++-faq-lite/

 
andreas.koestler@googlemail.com
05-27-2009
> Not if the machine doesn't use ASCII; only a function like yours above
> is fully portable. But perhaps the original poster wanted a function
> that would convert from the host's native textual representation to an
> integer, in which case the above function would not be a good idea, and
> atoi or stringstream would.

Blargg, please explain...

 
Pascal J. Bourguignon
05-27-2009
"(E-Mail Removed)" <(E-Mail Removed)> writes:

> From: blargg <(E-Mail Removed)>
> andreas.koestler wrote:
>> On May 27, 9:18 am, "Peter Olcott" <(E-Mail Removed)> wrote:
>>> I remember that there is a clean C++ way to do this [convert
>>> ASCII digits to Integer], but, I forgot what it was.
>>
>> I don't know what you mean by 'clean C++ way' but one way to do it is:
>>
>> int ascii_digit_to_int ( const char asciidigit ) {
>>     if ( asciidigit < '0' ||
>>          asciidigit > '9' ) {
>>         throw NotADigitException();
>>     }
>>     return (int) asciidigit - 48; // 48 => '0'
>> }
>>
>> Or you can use atoi or similar.
>> Or you use the std::stringstream:
> Not if the machine doesn't use ASCII; only a function like yours above
> is fully portable. But perhaps the original poster wanted a function
> that would convert from the host's native textual representation to an
> integer, in which case the above function would not be a good idea, and
> atoi or stringstream would.

> Blargg, please explain...


Actually the quoted code is wrong. On a machine using EBCDIC, '0' is 240, not 48.

#include <iso646.h>

struct ASCII {
    enum Code {
        NUL = 0, SOH, STX, ETX, EOT, ENQ, ACK, BELL, BACKSPACE, TAB,
        NEWLINE, VT, PAGE, RETURN, SO, SI, DLE, DC1, DC2, DC3, DC4, NAK,
        SYN, ETB, CAN, EM, SUB, ESCAPE, FS, GS, RS, US, SPACE,
        EXCLAMATION_MARK, QUOTATION_MARK, NUMBER_SIGN, DOLLAR_SIGN,
        PERCENT_SIGN, AMPERSAND, APOSTROPHE, LEFT_PARENTHESIS,
        RIGHT_PARENTHESIS, ASTERISK, PLUS_SIGN, COMMA, HYPHEN_MINUS,
        FULL_STOP, SOLIDUS, DIGIT_ZERO, DIGIT_ONE, DIGIT_TWO,
        DIGIT_THREE, DIGIT_FOUR, DIGIT_FIVE, DIGIT_SIX, DIGIT_SEVEN,
        DIGIT_EIGHT, DIGIT_NINE, COLON, SEMICOLON, LESS_THAN_SIGN,
        EQUALS_SIGN, GREATER_THAN_SIGN, QUESTION_MARK, COMMERCIAL_AT,
        LATIN_CAPITAL_LETTER_A, LATIN_CAPITAL_LETTER_B,
        LATIN_CAPITAL_LETTER_C, LATIN_CAPITAL_LETTER_D,
        LATIN_CAPITAL_LETTER_E, LATIN_CAPITAL_LETTER_F,
        LATIN_CAPITAL_LETTER_G, LATIN_CAPITAL_LETTER_H,
        LATIN_CAPITAL_LETTER_I, LATIN_CAPITAL_LETTER_J,
        LATIN_CAPITAL_LETTER_K, LATIN_CAPITAL_LETTER_L,
        LATIN_CAPITAL_LETTER_M, LATIN_CAPITAL_LETTER_N,
        LATIN_CAPITAL_LETTER_O, LATIN_CAPITAL_LETTER_P,
        LATIN_CAPITAL_LETTER_Q, LATIN_CAPITAL_LETTER_R,
        LATIN_CAPITAL_LETTER_S, LATIN_CAPITAL_LETTER_T,
        LATIN_CAPITAL_LETTER_U, LATIN_CAPITAL_LETTER_V,
        LATIN_CAPITAL_LETTER_W, LATIN_CAPITAL_LETTER_X,
        LATIN_CAPITAL_LETTER_Y, LATIN_CAPITAL_LETTER_Z,
        LEFT_SQUARE_BRACKET, REVERSE_SOLIDUS, RIGHT_SQUARE_BRACKET,
        CIRCUMFLEX_ACCENT, LOW_LINE, GRAVE_ACCENT, LATIN_SMALL_LETTER_A,
        LATIN_SMALL_LETTER_B, LATIN_SMALL_LETTER_C, LATIN_SMALL_LETTER_D,
        LATIN_SMALL_LETTER_E, LATIN_SMALL_LETTER_F, LATIN_SMALL_LETTER_G,
        LATIN_SMALL_LETTER_H, LATIN_SMALL_LETTER_I, LATIN_SMALL_LETTER_J,
        LATIN_SMALL_LETTER_K, LATIN_SMALL_LETTER_L, LATIN_SMALL_LETTER_M,
        LATIN_SMALL_LETTER_N, LATIN_SMALL_LETTER_O, LATIN_SMALL_LETTER_P,
        LATIN_SMALL_LETTER_Q, LATIN_SMALL_LETTER_R, LATIN_SMALL_LETTER_S,
        LATIN_SMALL_LETTER_T, LATIN_SMALL_LETTER_U, LATIN_SMALL_LETTER_V,
        LATIN_SMALL_LETTER_W, LATIN_SMALL_LETTER_X, LATIN_SMALL_LETTER_Y,
        LATIN_SMALL_LETTER_Z, LEFT_CURLY_BRACKET, VERTICAL_LINE,
        RIGHT_CURLY_BRACKET, TILDE, RUBOUT
    };
};

int ascii_digit_to_int ( const char asciidigit ) {
    if ((asciidigit < ASCII::DIGIT_ZERO) or (ASCII::DIGIT_NINE < asciidigit)) {
        throw NotADigitException();
    } else {
        return ((int)asciidigit - ASCII::DIGIT_ZERO);
    }
}

--
__Pascal Bourguignon__
 
James Kanze
05-27-2009
On May 27, 3:30 am, "(E-Mail Removed)" <(E-Mail Removed)> wrote:
> On May 27, 9:18 am, "Peter Olcott" <(E-Mail Removed)> wrote:
>> I remember that there is a clean C++ way to do this, but, I
>> forgot what it was.
>
> I don't know what you mean by 'clean C++ way' but one way to
> do it is:
>
> int ascii_digit_to_int ( const char asciidigit ) {
>     if ( asciidigit < '0' ||
>          asciidigit > '9' ) {
>         throw NotADigitException();
>     }
>     return (int) asciidigit - 48; // 48 => '0'


That's wrong. There's no guarantee that '0' is 48. I've worked
on machines where it is 240. (Of course, the term asciidigit is
very misleading on such machines, because you're really dealing
with an ebcdicdigit.)

You are guaranteed that the decimal digits are consecutive, so
digit - '0' works. Of course, as soon as you do that, someone
will ask for support for hexadecimal. The simplest solution is
just to create a table, correctly initialize it, and then:

    return table[ digit ] < 0
         ? throw NotADigitException()
         : table[ digit ] ;
> }


> Or you can use atoi or similar.
> Or you use the std::stringstream:


> std::stringstream sstr("3");
> int value;
> sstr >> value;


That's the normal way of converting a stream of digits into a
number in internal format.
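The table approach mentioned above might look like this in full (the function name and table layout are mine, only an illustration of the technique):

```cpp
#include <stdexcept>

// Illustration of the lookup-table technique: a 256-entry table,
// filled once from character literals so it works in the native
// character set, with -1 marking non-digits. Hex digits come free.
int digit_value(unsigned char c)
{
    static int table[256];
    static bool initialised = false;
    if (!initialised) {
        for (int i = 0; i < 256; ++i)
            table[i] = -1;
        const char* dec = "0123456789";
        const char* hexLower = "abcdef";
        const char* hexUpper = "ABCDEF";
        for (int i = 0; dec[i]; ++i)
            table[(unsigned char)dec[i]] = i;
        for (int i = 0; hexLower[i]; ++i) {
            table[(unsigned char)hexLower[i]] = 10 + i;
            table[(unsigned char)hexUpper[i]] = 10 + i;
        }
        initialised = true;
    }
    return table[c] < 0
         ? throw std::runtime_error("not a digit")
         : table[c];
}
```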

--
James Kanze (GABI Software) email:(E-Mail Removed)
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
 
Michael Doubez
05-27-2009
On May 27, 12:04, James Kanze <(E-Mail Removed)> wrote:
> On May 27, 3:30 am, "(E-Mail Removed)"
>
> <(E-Mail Removed)> wrote:
> > On May 27, 9:18 am, "Peter Olcott" <(E-Mail Removed)> wrote:
> > > I remember that there is a clean C++ way to do this, but, I
> > > forgot what it was.

[snip]
> > Or you can use atoi or similar.


atoi cannot report errors and should be avoided if the format has not
been previously validated.

> > Or you use the std::stringstream:
> > std::stringstream sstr("3");
> > int value;
> > sstr >> value;

>
> That's the normal way of converting a stream of digits into a
> number in internal format.


An equivalent, if indigestible, way is to use the num_get()
member of the facet directly:

using namespace std;
locale loc;
ios_base::iostate state = ios_base::goodbit;
int value = 0;
istringstream is("3");  // also supplies the formatting flags

use_facet<num_get<char> >(loc).get(
    istreambuf_iterator<char>(is), istreambuf_iterator<char>(),
    is, state,
    value);

You can modify the formatting of the number by setting fmtflags in
'is'.

You can wrap it in something like Boost's lexical_cast<>().

--
Michael
 
Default User
05-27-2009
blargg wrote:

> blargg wrote:
> > andreas.koestler wrote:
> > > On May 27, 9:18 am, "Peter Olcott" <(E-Mail Removed)> wrote:
> > > > I remember that there is a clean C++ way to do this [convert
> > > > ASCII digits to Integer], but, I forgot what it was.
> > >
> > > I don't know what you mean by 'clean C++ way' but one way to do
> > > it is:
> > >
> > > int ascii_digit_to_int ( const char asciidigit ) {
> > >     if ( asciidigit < '0' ||
> > >          asciidigit > '9' ) {
> > >         throw NotADigitException();
> > >     }
> > >     return (int) asciidigit - 48; // 48 => '0'
> > > }

> [...]
> > Not if the machine doesn't use ASCII; only a function like yours
> > above is fully portable.

>
> Whoops, that's wrong too, as the above function uses '0' and '9',
> which won't be ASCII on a non-ASCII machine. So the above should
> really use 48 and 57 in place of those character constants, to live
> up to its name. Otherwise, on a machine using ASCII, it'll work, but
> on another, it'll be broken and neither convert from ASCII nor the
> machine's native character set!


The requirements for numerals in the character set specify that the
values be consecutive and in increasing value.

So digit - '0' will always give you the numeric value of the numeral
the character represents, regardless of whether it is ASCII or not. The
same is not true for digit - 48.
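A minimal illustration of that guarantee (the helper name is mine):

```cpp
#include <cassert>

// Portable digit conversion: the standard guarantees '0'..'9' are
// contiguous in every character set, so no magic number is needed.
int digit_to_int(char c)
{
    assert(c >= '0' && c <= '9');
    return c - '0';
}
```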

The original problem specified conversion from ASCII, but that's likely
not what the OP really wanted. If it were, a preliminary step to
convert the input to ASCII could be performed first, but that's
probably not what was actually desired.



Brian



 
Default User
05-28-2009
Pete Becker wrote:

> Default User wrote:
> >
> > The original problem specified conversion from ASCII, but that's not
> > likely what the OP really wanted.

>
> If you write code that you think your boss wants instead of the code
> your boss said he wants you won't last long in your job. If you think
> the specification is wrong, ask the person who is responsible for it.


That really depends on circumstances. For the most part in my career as
a software engineer, supervisors have not created specifications to
that level.




Brian
 
Default User
05-28-2009
Pete Becker wrote:

> Default User wrote:
> > Pete Becker wrote:


> > That really depends on circumstances. For the most part in my
> > career as a software engineer, supervisors have not created
> > specifications to that level.

>
> If you ignore a specification because you think it's wrong you're
> simply wrong. If you're winging it without specifications you have a
> completely different set of issues.



I'm not following. My supervisors have not typically set requirements to
that level. That is to say, the broad strokes of the task are set, and
the engineers define and implement the lower-level requirements.




Brian
 
James Kanze
05-29-2009
On May 28, 1:59 pm, Pete Becker <(E-Mail Removed)> wrote:
> Default User wrote:


> > The original problem specified conversion from ASCII, but
> > that's not likely what the OP really wanted.


> If you write code that you think your boss wants instead of
> the code your boss said he wants you won't last long in your
> job.


In most places I've worked, writing what the boss wants, rather
than what he says, is good for your career. Of course...

> If you think the specification is wrong, ask the person
> who is responsible for it.


If you think that what he is actually asking for is not what he
wants, you are better off asking, just to be sure.

In the end, it depends on the boss.

And in this case, Default User is probably right: far too many
newbies use "ascii" to mean text. (Given that ASCII is for all
intents and purposes dead, it's highly unlikely that they really
want ASCII.)

--
James Kanze (GABI Software) email:(E-Mail Removed)
Conseils en informatique orientée objet/
Beratung in objektorientierter Datenverarbeitung
9 place Sémard, 78210 St.-Cyr-l'École, France, +33 (0)1 30 23 00 34
 