Velocity Reviews > Newsgroups > Programming > Java > How to check variables for uniqueness ?

How to check variables for uniqueness ?

 
 
Chris Uppal
 
      02-19-2007
Mike Schilling wrote:

> I believe that the existing mess could have been avoided with precisely
> one change to the existing definition of Java: rather than define a char
> as a 16-bit integral type, define it as a non-integral type of an
> unspecified size. (Exactly like boolean.)


Or even as an unsigned integral type with a size guaranteed to be <= 31 bits.

There would have to have been small changes to the JVM spec, and to the
serialisation spec too (and to JNI -- though that didn't come out until later).
But nothing of staggering difficulty. Even the JVM /implementation/ would be
largely unchanged since chars are represented in 32-bit slots on the stack
anyway...

.... sigh ...

-- chris


 
Mike Schilling
 
      02-19-2007
Chris Uppal wrote:
> Mike Schilling wrote:
>
>> I believe that the existing mess could have been avoided with
>> precisely one change to the existing definition of Java: rather
>> than define a char as a 16-bit integral type, define it as a
>> non-integral type of an unspecified size. (Exactly like boolean.)

>
> Or even as an unsigned integral type with a size guaranteed to be <=
> 31 bits.


I considered that, but didn't see a definite need for even that loose
guarantee. (Why 31 and not 32, by the way? What makes the guarantee
useful, I think, is that chars can be losslessly converted to ints.)
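
The lossless-conversion point is easy to demonstrate. Because char is unsigned, widening to int never sign-extends, so every char value round-trips through int; a signed 16-bit type with the same bit pattern would not. A minimal sketch (the class name is illustrative):

```java
public class CharWidening {
    public static void main(String[] args) {
        // char is an unsigned 16-bit type, so widening to int is lossless:
        char c = '\uFFFF';        // largest char value, 65535
        int i = c;                // implicit widening; no sign extension
        char back = (char) i;     // narrowing recovers the original value

        // Contrast: a signed 16-bit short with the same bit pattern
        // widens with sign extension and reads back as -1, not 65535.
        short s = (short) 0xFFFF;
        int widened = s;

        System.out.println("char widened to " + i + ", back == c: " + (back == c));
        System.out.println("short with same bits widened to " + widened);
    }
}
```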


 
Mike Schilling
 
      02-19-2007
Mike Schilling wrote:
> Chris Uppal wrote:
>> Mike Schilling wrote:
>>
>>> I believe that the existing mess could have been avoided with
>>> precisely one change to the existing definition of Java: rather
>>> than define a char as a 16-bit integral type, define it as a
>>> non-integral type of an unspecified size. (Exactly like boolean.)

>>
>> Or even as an unsigned integral type with a size guaranteed to be <=
>> 31 bits.

>
> I considered that, but didn't see a definite need for even that loose
> guarantee. (Why 31 and not 32, by the way? What makes the guarantee
> useful, I think, is that chars can be losslessly converted to ints.)


Oh, "unsigned". Never mind.


 
Arne Vajhøj
 
      02-20-2007
Chris Uppal wrote:
> Mike Schilling wrote:
>> I believe that the existing mess could have been avoided with precisely
>> one change to the existing definition of Java: rather than define a char
>> as a 16-bit integral type, define it as a non-integral type of an
>> unspecified size. (Exactly like boolean.)

>
> Or even as an unsigned integral type with a size guaranteed to be <= 31 bits.
>
> There would have to have been small changes to the JVM spec, and to the
> serialisation spec too (and to JNI -- though that didn't come out until later).
> But nothing of staggering difficulty. Even the JVM /implementation/ would be
> largely unchanged since chars are represented in 32-bit slots on the stack
> anyway...


For proper interoperability it has to be specified what it is.

Arne
 
Mike Schilling
 
      02-20-2007
Arne Vajhøj wrote:
> Chris Uppal wrote:
>> Mike Schilling wrote:
>>> I believe that the existing mess could have been avoided with
>>> precisely one change to the existing definition of Java: rather
>>> than define a char as a 16-bit integral type, define it as a
>>> non-integral type of an unspecified size. (Exactly like boolean.)

>>
>> Or even as an unsigned integral type with a size guaranteed to be <=
>> 31 bits. There would have to have been small changes to the JVM spec,
>> and to the serialisation spec too (and to JNI -- though that didn't
>> come out until later). But nothing of staggering difficulty. Even the
>> JVM /implementation/ would be largely unchanged since chars are
>> represented in 32-bit slots on the stack anyway...

>
> For proper interoperability it has to be specified what it is.


Or, to be precise, how it's represented externally. This could be, for
instance, as the character's UTF-8 representation.
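
The external-representation point can be made concrete: with UTF-8 as the wire form, the number of bytes per character depends on the code point, not on the 16-bit in-memory width. A small sketch (the class name is illustrative):

```java
import java.nio.charset.StandardCharsets;

public class CharExternalForm {
    public static void main(String[] args) {
        // UTF-8 encodes a char as 1-3 bytes depending on its code point,
        // decoupling the external form from the internal 16-bit width.
        char[] samples = { '\u0041', '\u00E9', '\u4E2D' }; // 'A', e-acute, a CJK char
        for (char c : samples) {
            byte[] utf8 = String.valueOf(c).getBytes(StandardCharsets.UTF_8);
            System.out.printf("U+%04X -> %d byte(s)%n", (int) c, utf8.length);
        }
    }
}
```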


 
Chris Uppal
 
      02-20-2007
Arne Vajhøj wrote:

[me:]
> > There would have to have been small changes to the JVM spec, and to the
> > serialisation spec too (and to JNI -- though that didn't come out until
> > later). But nothing of staggering difficulty. Even the JVM
> > /implementation/ would be largely unchanged since chars are represented
> > in 32-bit slots on the stack anyway...

>
> For proper interoperability it has to be specified what it is.


We were considering what changes would have been necessary back at the
beginning of Java's history for this UTF-16 mess to have been avoided, or
avoidable. I agree that we are in fact stuck with what we've got.

Way back then there was no interoperability, since there was nothing to
interoperate /with/.

-- chris


 
Arne Vajhøj
 
      02-21-2007
Chris Uppal wrote:
> Arne Vajhøj wrote:
> [me:]
>>> There would have to have been small changes to the JVM spec, and to the
>>> serialisation spec too (and to JNI -- though that didn't come out until
>>> later). But nothing of staggering difficulty. Even the JVM
>>> /implementation/ would be largely unchanged since chars are represented
>>> in 32-bit slots on the stack anyway...

>> For proper interoperability it has to be specified what it is.

>
> We were considering what changes would have been necessary back at the
> beginning of Java's history for this UTF-16 mess to have been avoided, or
> avoidable. I agree that we are in fact stuck with what we've got.
>
> Way back then there was no interoperability, since there was nothing to
> interoperate /with/.


It is not backwards compatibility I am talking about.

I am talking about interoperability between JVM's from
different vendors.

If you have a SUN and IBM Java exchanging binary data, then
it is very beneficial that the number of bits in a char is
well defined - not merely "at least X bits" as we know it from C/C++.

Arne
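
The interoperability point shows up directly in the platform's binary I/O: DataOutputStream specifies char as exactly two bytes, high byte first, on every conforming JVM, so two vendors' implementations produce identical bytes for the same value. A minimal sketch (the class name is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class CharWireFormat {
    public static void main(String[] args) throws IOException {
        // DataOutputStream.writeChar is specified as exactly two bytes,
        // big-endian, on every JVM -- so a Sun JVM and an IBM JVM agree
        // byte-for-byte when exchanging binary char data.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeChar('\u00E9');

        byte[] bytes = buf.toByteArray();
        System.out.printf("%d bytes: %02X %02X%n",
                bytes.length, bytes[0] & 0xFF, bytes[1] & 0xFF);
    }
}
```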
 
Chris Uppal
 
      02-21-2007
Arne Vajhøj wrote:

[me:]
> > We were considering what changes would have been necessary back at the
> > beginning of Java's history for this UTF-16 mess to have been avoided,
> > or avoidable. I agree that we are in fact stuck with what we've got.
> >
> > Way back then there was no interoperability, since there was nothing to
> > interoperate /with/.

>
> It is not backwards compatibility I am talking about.
>
> I am talking about interoperability between JVM's from
> different vendors.
>
> If you have a SUN and IBM Java exchanging binary data, then
> it is very beneficial that the number of bits in a char is
> well defined - not merely "at least X bits" as we know it from C/C++.


Ah, I had misunderstood you. Sorry.

But I don't think it would have been possible to allow for binary compatibility
(in that sense) /and/ have had the Java spec worded in such a way that it didn't
make it impossible to fix up future problems.

At least, not without building extra (explicit) flexibility into each binary
spec.

-- chris


 
Mike Schilling
 
      02-21-2007

Chris Uppal wrote:
> Arne Vajhøj wrote:
>
> [me:]
>> > We were considering what changes would have been necessary back at the
>> > beginning of Java's history for this UTF-16 mess to have been avoided,
>> > or avoidable. I agree that we are in fact stuck with what we've got.
>> >
>> > Way back then there was no interoperability, since there was nothing to
>> > interoperate /with/.

>>
>> It is not backwards compatibility I am talking about.
>>
>> I am talking about interoperability between JVM's from
>> different vendors.
>>
>> If you have a SUN and IBM Java exchanging binary data, then
>> it is very beneficial that the number of bits in a char is
>> well defined - not merely "at least X bits" as we know it from C/C++.

>
> Ah, I had misunderstood you. Sorry.
>
> But I don't think it would have been possible to allow for binary
> compatibility (in that sense) /and/ have had the Java spec worded in
> such a way that it didn't make it impossible to fix up future problems.
>
> At least, not without building extra (explicit) flexibility into each
> binary spec.


That is, Java wouldn't define a binary format for chars; chars would be
exchanged (as Strings are) via some encoding of chars into bytes.
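
That suggested scheme is exactly how Strings already cross process boundaries: as encoded bytes produced by a named charset, not as raw 16-bit units. A round-trip sketch (the class name is illustrative):

```java
import java.nio.charset.StandardCharsets;

public class StringExchange {
    public static void main(String[] args) {
        // Strings travel between systems as charset-encoded bytes; chars
        // could have been exchanged the same way, leaving their in-memory
        // width unspecified.
        String original = "Vajh\u00F8j";   // contains one non-ASCII char
        byte[] wire = original.getBytes(StandardCharsets.UTF_8);
        String decoded = new String(wire, StandardCharsets.UTF_8);

        System.out.println("bytes on the wire: " + wire.length);
        System.out.println("round-trip ok: " + decoded.equals(original));
    }
}
```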


 