Does Python really follow its philosophy of "Readability counts"?

 
 
Paul Rubin
01-14-2009

Carl Banks <(E-Mail Removed)> writes:
> Guess what systems I worked on that didn't even use scoping? I wrote
> code for the GP7000 (equipped on some Airbus 380s) and the F-136
> (which will be equipped on some F-35 fighters) engine controllers.
> Neither one used any data hiding. The language was C (not C++), but
> it was generated from schematic diagrams.


Generated from a schematic by a program you mean? In that case, the C
was used sort of like assembly code emitted by a compiler. Not really
the same situation.

> Would you like to adopt GE's practice of schematic-generated C with no
> namespaces or data hiding? No? Then don't be telling me I have to
> embrace Boeing's.


All you're telling us is that GE makes foolish choices.
 
Paul Rubin
01-14-2009

Bruno Desthuilliers <(E-Mail Removed)> writes:
> > I haven't anywhere in this thread as far as I know suggested
> > eliminating dynamism from Python,

>
> Nope, but your suggestion would have the same practical result as far
> as I'm concerned.


Sorry, I don't comprehend that. The rest of your post makes no sense
as a consequence.

Python already had such a change when it deprecated and later got rid
of string exceptions. It's still Python.
 
Russ P.
01-14-2009

On Jan 14, 1:54 am, Carl Banks <(E-Mail Removed)> wrote:

> I thought you were done wasting time with this nonsense.


So did I.

> > An engine *simulation* is one
> > thing; the actual engine control code is another.

>
> Guess what systems I worked on that didn't even use scoping? I wrote
> code for the GP7000 (equipped on some Airbus 380s) and the F-136
> (which will be equipped on some F-35 fighters) engine controllers.
> Neither one used any data hiding. The language was C (not C++), but
> it was generated from schematic diagrams.
>
> Would you like to adopt GE's practice of schematic-generated C with no
> namespaces or data hiding? No? Then don't be telling me I have to
> embrace Boeing's.


Well, that's interesting. But you say the code was "generated from
schematic diagrams." Does that mean it was automatically generated by
machine? If so, then the concerns about encapsulation may no longer
apply. In that case, the schematics were the implementation
"language," and the code that was generated was essentially a higher
level version of assembly or machine code (because humans don't work
with it directly).

I know some researchers in software engineering who believe that the
ultimate solution to software reliability is automatic code
generation. They don't really care much which language is used, because
it would only be an intermediate form that humans don't interact with
directly. In that scenario, humans would essentially use a "higher
level" language such as UML or some such thing.

I personally have a hard time seeing how that could work, but that may
just be due to my own lack of understanding or vision.

 
Paul Rubin
01-14-2009

"Russ P." <(E-Mail Removed)> writes:
> I know some researchers in software engineering who believe that the
> ultimate solution to software reliability is automatic code
> generation. They don't really care much which language is used, because
> it would only be an intermediate form that humans don't interact with
> directly. In that scenario, humans would essentially use a "higher
> level" language such as UML or some such thing.
>
> I personally have a hard time seeing how that could work, but that may
> just be due to my own lack of understanding or vision.


The usual idea is that you would write a specification, and a
constructive mathematical proof that a certain value meets that
specification. The compiler then verifies the proof and turns it into
code. Coq (http://coq.inria.fr) is an example of a language that
works like that. There is a family of jokes that go:

Q. How many $LANGUAGE programmers does it take to change a lightbulb?
A. [funny response that illustrates some point about $LANGUAGE].

The instantiation for Coq goes:

Q. How many Coq programmers does it take to change a lightbulb?
A. Are you kidding? It took two postdocs six months just to prove
that the bulb and socket are threaded in the same direction.

Despite this, a compiler for a fairly substantial C subset has been
written mostly in Coq (http://compcert.inria.fr/doc/index.html). But,
this stuff is far far away from Python.
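
To make the "specification plus checked proof" idea concrete, here is a
toy in Lean (a close relative of Coq). This is only an illustrative
sketch; the exact tactic names vary between versions:

-- A tiny function and a machine-checked specification for it.
def double (n : Nat) : Nat := n + n

-- Lean only accepts this theorem if the proof actually checks.
theorem double_spec (n : Nat) : double n = 2 * n := by
  unfold double   -- goal becomes: n + n = 2 * n
  omega           -- linear-arithmetic tactic closes it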

I have a situation which I face almost every day, where I have some
gigabytes of data that I want to slice and dice somehow and get some
numbers out of. I spend 15 minutes writing a one-off Python program
and then several hours waiting for it to run. If I used C instead,
I'd spend several hours writing the one-off program and then 15
minutes waiting for it to run, which is not exactly better. (Or, I
could spend several hours writing a parallel version of the Python
program and running it on multiple machines, also not an improvement).
Often, the Python program crashes halfway through, even though I
tested it on a few megabytes of data before starting the full
multi-gigabyte run, because it hit some unexpected condition in the
data. That could have been prevented by more compile-time checking,
making sure the structures understood by the one-off script matched
the ones in the program that generated the input data.

I would be ecstatic with a version of Python where I might have to
spend 20 minutes instead of 15 minutes writing the program, but then
it runs in half an hour instead of several hours and doesn't crash. I
think the Python community should be aiming towards this.
 
Bruno Desthuilliers
01-14-2009

Paul Rubin wrote:
> Bruno Desthuilliers <(E-Mail Removed)> writes:
>>> I haven't anywhere in this thread as far as I know suggested
>>> eliminating dynamism from Python,

>> Nope, but your suggestion would have the same practical result as far
>> as I'm concerned.

>
> Sorry, I don't comprehend that.


IIRC, your suggestion was that one should have to explicitly allow
"dynamic binding" (i.e., outside the initializer) of new attributes, and
that the default would be to disallow them. That's at least what I
understood from:

"""
There are cases where this is useful but they're not terribly common.
I think it would be an improvement if creating new object attributes
was by default not allowed outside the __init__ method. In the cases
where you really do want to create new attributes elsewhere, you'd
have to explicitly enable this at instance creation time, for example
by inheriting from a special superclass:

class Foo (DynamicAttributes, object): pass
"""

(snip)

> Python already had such a change when it deprecated and later got rid
> of string exceptions.


I really don't get how this is comparable with the above suggestion. I
can well understand your concerns wrt Python's performance (even if I
don't agree with your proposed solutions), but this one "argument"
really looks like a straw man.
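
Note that anyone wanting that behaviour can already have it per-class
today, with no language change, via a __setattr__ guard. A rough sketch
(hypothetical names, not Paul's actual code):

class LockedAttributes(object):
    _allow_dynamic = False

    def __init__(self):
        # call this *last* in a subclass __init__ to lock the instance
        object.__setattr__(self, '_locked', True)

    def __setattr__(self, name, value):
        # once locked, refuse *new* attributes; rebinding stays legal
        if (getattr(self, '_locked', False)
                and not self._allow_dynamic
                and not hasattr(self, name)):
            raise AttributeError("new attribute %r outside __init__" % name)
        object.__setattr__(self, name, value)

class DynamicAttributes(LockedAttributes):
    _allow_dynamic = True   # opt back in to fully dynamic behaviour

class Point(LockedAttributes):
    def __init__(self, x, y):
        self.x = x
        self.y = y
        super(Point, self).__init__()   # lock further creation

p = Point(1, 2)
p.x = 10        # fine: rebinding an existing attribute
try:
    p.z = 3     # refused: new attribute outside __init__
except AttributeError as e:
    print(e)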


 
Bruno Desthuilliers
01-14-2009

Paul Rubin wrote:
> Bruno Desthuilliers <(E-Mail Removed)> writes:
>> Given that the convention for "protected" attributes in Python is to
>> prefix them with an underscore, I fail to see how one could
>> "accidentally" mess with implementation details. Typing a leading
>> underscore is rarely a typo.

>
> We are talking about the accidental creation of new attributes in
> places outside the initializer.


Nope. This was about encapsulation and data-hiding, cf:

http://groups.google.com/group/comp....a09b54d386eb6c

and the convention James referred to was obviously the naming convention.


 
Steve Holden
01-14-2009

Paul Rubin wrote:
> (snip)
>
> I would be ecstatic with a version of Python where I might have to
> spend 20 minutes instead of 15 minutes writing the program, but then
> it runs in half an hour instead of several hours and doesn't crash. I
> think the Python community should be aiming towards this.


RPython might help, but of course it wouldn't give you the full language.

regards
Steve
--
Steve Holden +1 571 484 6266 +1 800 494 3119
Holden Web LLC http://www.holdenweb.com/

 
Brian Allen Vanderburg II
01-14-2009

(E-Mail Removed) wrote:
> Here is a piece of C code this same guy showed me saying Pythonic
> indention would make this hard to read -- Well lets see then!
>
> I swear, before god, this is the exact code he showed me. If you don't
> believe me i will post a link to the thread.
>
> // Warning ugly C code ahead!
> if( is_opt_data() < sizeof( long double ) ) { // test for insufficient data
>     return TRUE; // indicate buffer empty
> } // end test for insufficient data
> if( is_circ() ) { // test for circular buffer
>     if( i < o ) { // test for data area divided
>         if( ( l - o ) > sizeof( long double ) ) { // test for data contiguous
>             *t = ( ( long double * ) f )[ o ]; // return data
>             o += sizeof( long double ); // adjust out
>             if( o >= l ) { // test for out wrap around
>                 o = 0; // wrap out around limit
>             } // end test for out wrap around
>         } else { // data not contiguous in buffer
>             return load( ( char * ) t, sizeof( long double ) ); // return data
>         } // end test for data contiguous
>     } else { // data are not divided
>         *t = ( ( float * ) f )[ o ]; // return data
>         o += sizeof( long double ); // adjust out
>         if( o >= l ) { // test for out reached limit
>             o = 0; // wrap out around
>         } // end test for out reached limit
>     } // end test for data area divided
> } else { // block buffer
>     *t = ( ( long double * ) f )[ o ]; // return data
>     o += sizeof( long double ); // adjust data pointer
> } // end test for circular buffer
>
>

I do a bit of C and C++ programming and even I think that is ugly and
unreadable. First of all, there are way too many comments: why does he
comment every single line? Second, I've always found that brace/indent
style leads toward harder-to-read code, IMHO. I think the Allman style
is the most readable, followed perhaps by Whitesmiths style.

Brian Vanderburg II
 
Steven D'Aprano
01-14-2009

On Wed, 14 Jan 2009 20:33:01 +0100, Bruno Desthuilliers wrote:

> Paul Rubin wrote:
>> Bruno Desthuilliers <(E-Mail Removed)>
>> writes:
>>>> I haven't anywhere in this thread as far as I know suggested
>>>> eliminating dynamism from Python,
>>> Nope, but your suggestion would have the same practical result as far
>>> as I'm concerned.

>>
>> Sorry, I don't comprehend that.

>
> IIRC, your suggestion was that one should have to explicitly allow
> "dynamic binding" (i.e., outside the initializer) of new attributes, and
> that the default would be to disallow them.




Lots of heat and noise in this discussion, but I wonder, just how often
do Python programmers use this dynamism *in practice*? I hardly ever do.
I like that it is there, I like that Python is so easy to use without the
overhead of Java, but I rarely need *all* the dynamism available.

[sweeping generalization] Most objects people use are built-ins, and you
can't add attributes to them. I don't think I've ever subclassed a
built-in just to get dynamic attributes:

class DynamicInt(int):
    pass

x = DynamicInt(2)
x.attribute = "something"


As far as non built-in classes go:

>>> from decimal import Decimal
>>> d = Decimal('0.5')
>>> d.something = "something"

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'Decimal' object has no attribute 'something'

If I recall correctly, the first implementation of Decimal was written in
Python. Did anyone object to Decimal being re-written in C because they
missed the ability to add arbitrary attributes to Decimal instances?

And if they did, I'm pretty sure the answer given would have been: deal
with it. Subclass Decimal, or use delegation. Don't penalise 99% of uses
of Decimal for that 1% of uses where you need dynamism.
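
For what it's worth, the subclassing escape hatch is nearly a one-liner,
because a subclass that doesn't declare __slots__ of its own gets an
instance __dict__ back:

from decimal import Decimal

class MyDecimal(Decimal):
    pass   # no __slots__ here, so instances grow a __dict__

d = MyDecimal('0.5')
d.something = "something"   # allowed on the subclass
print(d.something)          # -> something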

I think that's a wise decision.



--
Steven
 
Steven D'Aprano
01-14-2009

On Wed, 14 Jan 2009 08:45:46 -0800, Paul Rubin wrote:

>> Btw, for performance, there is __slots__,

>
> That is a good point, we somehow lost sight of that in this thread.
>
>> with the side-effect that it forbids attribute creation 'on the fly'.

>
> I have had the impression that this is a somewhat accidental side effect
> and shouldn't be relied on.


Not accidental, but people complain if you use slots for the purpose of
prohibiting attribute creation. They say "That's not what __slots__ was
designed for!". That's okay though: computers were designed for breaking
German ciphers and calculating the trajectories of cannon shells, but
they're not the only things we use computers for these days.
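
For anyone who hasn't seen the side effect in action, it looks like
this (standard behaviour for a class that defines __slots__):

class Point(object):
    __slots__ = ('x', 'y')   # instances get these slots and no __dict__

p = Point()
p.x, p.y = 1.0, 2.0          # slot attributes behave normally
try:
    p.z = 3.0                # no 'z' slot, no __dict__ to fall back on
except AttributeError as e:
    print(e)                 # 'Point' object has no attribute 'z'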



--
Steven
 