the parens in Lisp dialects are there for a reason... macros.

 
 
Francis Cianfrocca
 
      08-09-2006
M. Edward (Ed) Borasky wrote:

> First of all, while a Turing machine is a great "experimental animal",
> our "practical computing machines" are Von Neumann machines rather than
> Turing machines. And Von Neumann machines derive from a model of a human
> working with a mechanical desk calculator. In fact, the people -- rooms
> full of people -- who operated them were called "computers". Properly
> speaking, a Von Neumann machine is an "electronic" computer or
> "automatic" computer -- a machine doing what people used to do.
>
> Second, I don't think the actual unification of Church and Turing models
> occurred *long* before digital computers existed. The basic Turing and
> Godel and Church papers were written in the early 1930s, and by the late
> 1930s there were working relay-based digital computers. Vacuum tube
> machines started showing up (in public, anyhow) in the early 1950s and
> in "war rooms" shortly after the end of WW II.



Ed, I'll certainly defer to the memory of a guy who was there. I
wasn't, but here's my recollection of the history: You're absolutely
right about the big stuff being done by Turing and Church in the early
30s (and don't forget Godel's papers published in the middle of the
decade, although he was addressing different questions). Turing came to
Princeton on sabbatical for the 1937-38 academic year, and there he and
Church worked together, and they successfully unified Church's lambda
calculus with Turing's cellular-automaton-based universal machine that
winter.

By 1948, von Neumann had suggested what became known as the "von Neumann
architecture" (the stored-program digital computer), which is the model for
all of the "computers" we recognize as such today. The ten-year interval
between 1938 and 1948 is what I had in mind when I said "long before."

You're also right about the early impetus for mechanized calculation. It
was ballistic equations for artillery, and also the hydrodynamics of
chemical explosives for plutonium-based atomic bombs. The various
machines that were invented to solve these problems basically had
programs hard-wired into them, with solder. von Neumann's suggestion was
to store what later became called "software" directly into the "memory
cells" of the machines, and in so doing he invented the model that made
computers valuable for general use.

But note how completely reminiscent both models (the "ENIAC" style and
the stored-program style) are of classic Turing machines! Computer memory is
just a tessellation structure, and a computer program (even a
hard-soldered one) is a linear sequence of values corresponding
precisely to Turing's "tape."
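
To make the correspondence concrete, here is a throwaway Ruby sketch
(the machine and its rule table are invented purely for illustration):
the "tape" is nothing more than an array of cells, i.e., linear memory.

# A tiny Turing machine whose tape is a plain array -- the "memory".
class TinyTuring
  def initialize(tape, rules)
    @tape  = tape     # linear sequence of cells, Turing's tape
    @head  = 0        # index of the current cell
    @state = :scan
    @rules = rules    # { [state, symbol] => [write, move, next_state] }
  end

  def run
    until @state == :halt
      write, move, nxt = @rules.fetch([@state, @tape[@head]])
      @tape[@head] = write
      @head += (move == :right ? 1 : -1)
      @state = nxt
    end
    @tape
  end
end

# Example rule table: walk right over 1s, append a 1 at the first blank.
rules = {
  [:scan, 1]   => [1, :right, :scan],
  [:scan, nil] => [1, :right, :halt]
}
p TinyTuring.new([1, 1, 1], rules).run   # => [1, 1, 1, 1]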

The fact is that a Turing machine is described with components that not
only have direct physical analogs (his 29-state tape can easily be
modeled with five of what we now call "bits", and you will please pardon
the pun on "analogs"), but these physical analogs (electronic circuits)
turned out to be both easy to make and blazingly fast, once the
transistor came into wide use in the mid-Fifties. None of this is true
of a logical computing machine based on lambda-calculus. And that,
logically enough, is the reason why for the very longest time, languages
like Lisp suffered from horrible performance and incredible memory
requirements. These problems have been solved only recently, and only
by extreme cleverness on the part of compiler writers.

Whereas almost 50 years ago, John Backus and his crew already had
Fortran compilers that could execute numerical methods at a pretty high
fraction of the available machine speed. (Interestingly, according to
Backus' notes, the first Fortran compiler took 18 man-years to build,
and *most* of that time went into hand-writing the front-end, a job that
became automated and near-trivial by the mid-Seventies.)

When I was a kid, I spent all my allowances buying TTL gates and
hand-wiring digital logic. Then when I first got my hands on a computer,
the process of learning machine language and then assembler was
near-instantaneous, because I knew exactly what those instructions were
doing physically in the hardware. Memory-addressing and pointers made
complete sense from the first moment. I was shocked when I first
encountered C, years later, that it was possible to program computers at
such a high level! My point is that putting values into memory cells,
moving them around, and adding and subtracting them is not only the
easiest known way to build a useful (Turing-complete) computer, but also
may be the easiest way for most people to model computations. As an
economic proposition, then, regardless of all this theory and history,
I'm no longer interested in FP for mainstream use. Ruby, on the other
hand, is a step in the right direction.

Aside: I used to think the fad that swept our university CS departments
10 years or so ago of teaching only Scheme was extraordinarily wasteful
and damaging. They thought that teaching people how to understand
computation using the "more-pure" functional style was preferable to
just teaching people how to program. Well, compared to what they
replaced it with (algorithm cookbooks in Java), it may not have been so
bad.

But now that I've declared Lisp to be a fascinating part of computing's
past (<dons asbestos underwear />), the question is: what lies ahead? It
would be charitable to compare our current computer technology, and the
applications we put it to, to the model T and the candlestick telephone.
The more interesting challenge is to go beyond the solved problem of
Turing-complete computations, and figure out how to linguistically and
metaphorically express the large-scale information-driven processes that
humans work with every day. Someone upthread started hinting at that.
There are already some beautiful but flawed examples of this new
thinking (Erlang for one, the original conception of Smalltalk for
another), but it's still very, very early.

--
Posted via http://www.ruby-forum.com/.

 
 
 
 
 
Simen Edvardsen
 
      08-09-2006
On 8/9/06, Francis Cianfrocca <(E-Mail Removed)> wrote:
> Whereas almost 50 years ago, John Backus and his crew already had
> Fortran compilers that could execute numerical methods at a pretty high
> fraction of the available machine speed. (Interestingly, according to
> Backus' notes, the first Fortran compiler took 18 man-years to build,
> and *most* of that time went into hand-writing the front-end, a job that
> become automated and near-trivial by the mid-Seventies.)
>
> When I was a kid, I spent all my allowances buying TTL gates and
> hand-wiring digital logic. Then when I first got my hands on a computer,
> the process of learning machine language and then assembler was
> near-instantaneous, because I knew exactly what those instructions were
> doing physically in the hardware. Memory-addressing and pointers made
> complete sense from the first moment. I was shocked when I first
> encountered C, years later, that it was possible to program computers at
> such a high level! My point is that putting values into memory cells,
> moving them around, and adding and subtracting them is not only the
> easiest known way to build a useful (Turing-complete) computer, but also
> may be the easiest way for most people to model computations. As an
> economic proposition, then, regardless of all this theory and history,
> I'm no longer interested in FP for mainstream use. Ruby, on the other
> hand, is a step in the right direction.
>


If the "natural" way of modeling computation is that of moving around
bits in memory and doing trivial operations on them, we wouldn't be
building all these kinds of fancy abstractions, now would we? Also,
Ruby is full of higher order functions, so saying FP is not for
mainstream use but Ruby is a step in the right direction seems a bit
odd.
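
A couple of everyday examples of what I mean (nothing exotic, just
ordinary Ruby):

# Methods whose input is a block of code:
squares = [1, 2, 3, 4].map { |n| n * n }              # => [1, 4, 9, 16]
total   = [1, 2, 3, 4].inject(0) { |sum, n| sum + n } # => 10

# Writing your own block-taking method is just as routine:
def twice
  yield
  yield
end
twice { puts "hello" }   # prints "hello" two times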

> Aside: I used to think the fad that swept our university CS departments
> 10 years or so ago of teaching only Scheme was extraordinarily wasteful
> and damaging. They thought that teaching people how to understand
> computation using the "more-pure" functional style was preferable to
> just teaching people how to program. Well, compared to what they
> replaced it with (algorithm cookbooks in Java), it may not have been so
> bad.
>


Only teaching one language is probably not the way to go, but I see
Scheme as an excellent beginner language. It contains a simple core
and yet a bunch of powerful abstractions that are useful to learn. It
doesn't lock someone into a single mindset. Java, on the other hand,
is probably making it more difficult for people to grasp the essence
of programming, by making them think that anything that doesn't have a
hundred classes is a failure.

> But now that I've declared Lisp to be a fascinating part of computing's
> past (<dons asbestos underwear />), the question is: what lies ahead? It
> would be charitable to compare our current computer technology, and the
> applications we put it to, to the model T and the candlestick telephone.
> The more interesting challenge is to go beyond the solved problem of
> Turing-complete computations, and figure out how to linguistically and
> metaphorically express the large-scale information-driven processes that
> humans work with every day. Someone upthread started hinting at that.
> There are already some beautiful but flawed examples of this new
> thinking (Erlang for one, the original conception of Smalltalk for
> another), but it's till very very early.
>
> --
> Posted via http://www.ruby-forum.com/.
>
>


There's always room for improvement. Should people not have invented
the clock because someone had solved the problem with an hourglass or
whatever?

--
- Simen

 
 
 
 
 
Francis Cianfrocca
 
      08-09-2006
On 8/9/06, Simen Edvardsen <(E-Mail Removed)> wrote:
> If the "natural" way of modeling computation is that of moving around
> bits in memory and doing trivial operations on them, we wouldn't be
> building all these kinds of fancy abstractions, now would we? Also,
> Ruby is full of higher order functions, so saying FP is not for
> mainstream use but Ruby is a step in the right direction seems a bit
> odd.

Give that a bit more thought. Most of the "fancy abstractions" are
attempts to map aspects of particular problem domains into language
constructs. And what are the typical things we do with our fancy
abstractions? They are mostly about *transport* and communications. We
create purchase orders and store them in databases, then we email them
to vendors. We make accounting calculations and then generate reports.
We send email and instant messages to each other. These operations are
undertaken primarily for the side effects they generate (or
alternatively, the persistent changes they make to the computational
environment). Constructing mathematical fields to solve these
problems rather than scripting them imperatively is arguably a less
natural approach. True functional languages have no side effects at
all; those that do (SML, for example) are considered hybrids.
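
To make that concrete, here is a throwaway Ruby sketch (the order
format and the mail stub are invented for illustration): the
calculation is the pure part, but the thing the business actually pays
for is the side effect.

# Stand-in for a real mail library; here it only prints, but its entire
# purpose is the side effect.
def deliver_mail(to, subject, body)
  puts "mail to #{to}: #{subject} -- #{body}"
end

# A pure calculation: same inputs, same output, nothing else happens.
def order_total(line_items)
  line_items.inject(0) { |sum, (price, qty)| sum + price * qty }
end

# The transport step: the order leaves the program and changes the world.
def send_purchase_order(vendor_email, line_items)
  deliver_mail(vendor_email, "Purchase order",
               "Total due: #{order_total(line_items)}")
end

send_purchase_order("vendor@example.com", [[10.0, 3], [4.5, 2]])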

Now you and many others may argue with much justification that true FP
is just as valid for constructing nontrivial programs and in many ways
better. Fair enough. But in the real world, I've become convinced that
more programmers will do "better" work (where "better" is defined in
economic terms as high quality at low cost) with imperative languages.
And the reasons why are certainly of interest, but only of theoretical
interest.

Ruby's "higher order functions": aren't you confusing the true
functional style with the simple ability of Ruby (and many other
imperative languages) to treat functions and closures as objects?
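
A quick sketch of the distinction I'm drawing (trivial examples,
invented on the spot):

square = lambda { |x| x * x }    # a Proc object wrapping a closure
puts square.class                # => Proc
puts square.call(4)              # => 16

# Even a named method can be reified into an object and passed around.
m = "hello".method(:upcase)
puts m.class                     # => Method
puts m.call                      # => HELLO

# But nothing stops the "function" from having side effects -- which is
# exactly what separates it from a mathematical function.
noisy = lambda { |x| puts "called!"; x + 1 }
noisy.call(1)                    # prints "called!", returns 2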

 
 
Simen Edvardsen
 
      08-09-2006
On 8/9/06, Francis Cianfrocca <(E-Mail Removed)> wrote:
> On 8/9/06, Simen Edvardsen <(E-Mail Removed)> wrote:
> > If the "natural" way of modeling computation is that of moving around
> > bits in memory and doing trivial operations on them, we wouldn't be
> > building all these kinds of fancy abstractions, now would we? Also,
> > Ruby is full of higher order functions, so saying FP is not for
> > mainstream use but Ruby is a step in the right direction seems a bit
> > odd.
>
> Give that a bit more thought. Most of the "fancy abstractions" are
> attempts to map aspects of particular problem domains into language
> constructs. And what are the typical things we do with our fancy
> abstractions? They are mostly about *transport* and communications. We
> create purchase orders and store them in databases, then we email them
> to vendors. We make accounting calculations and then generate reports.
> We send email and instant messages to each other. These operations are
> undertaken primarily for the side effects they generate (or
> alternatively, the persistent changes they make to the computational
> environment). Constructing mathematical fields to solve these
> problems rather than scripting them imperatively is arguably a less
> natural approach. True functional languages do not have side effects
> at all, or they are considered hybrid if they do (SML for example).
>


Understanding why something works and how it works and how we might
make it work more easily seems worth making a science about. For the
record, I find adding complexity to satisfy mathematics at best an
academic exercise.

> Now you and many others may argue with much justification that true FP
> is just as valid for constructing nontrivial programs and in many ways
> better. Fair enough. But in the real world, I've become convinced that
> more programmers will do "better" work (where "better" is defined in
> economic terms as high quality at low cost) with imperative languages.
> And the reasons why are certainly of interest, but only of theoretical
> interest.
>


Pure FP seems to complicate making complex systems. I've been trying
to learn Haskell, but lifting values in and out of layers of monads to
achieve simple state seems needlessly complicated. I believe Clean has
an alternative solution to state in a pure functional language -- I
don't know if it's any better. Anyway, my point is that non-pure FP
languages produce shorter and more readable -- more "pure" and
"elegant" if you like -- solutions to many problems. I'm no hardcore
FP advocate. There is no one solution that fits all problems.
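
For example, here's the kind of thing I mean in Ruby (a made-up
snippet): functional-style iteration with one honest mutable hash, and
no plumbing to thread the state through.

# Count word frequencies: Enumerable does the iteration, one mutable
# hash holds the state.
def word_counts(text)
  counts = Hash.new(0)
  text.downcase.scan(/[a-z]+/).each { |word| counts[word] += 1 }
  counts
end

p word_counts("the cat sat on the mat")
# => {"the"=>2, "cat"=>1, "sat"=>1, "on"=>1, "mat"=>1}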

> Ruby's "higher order functions": aren't you confusing the true
> functional style with the simple ability of Ruby (and many other
> imperative languages) to treat functions and closures as objects?
>
>


True, they're not really mathematical functions, but how are map, each,
inject, etc. not higher-order functions? Ruby favors functions (well,
methods) that take blocks (aka closures). The definition of a
higher-order function from Wikipedia:

In mathematics and computer science, higher-order functions or
functionals are functions which do at least one of the following:

* take one or more functions as an input
* output a function

According to this definition, Ruby's standard library is filled with
higher order functions, as are the libraries of Ocaml, Scheme etc.
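
For instance, a throwaway sketch that hits both bullets:

# compose takes two callables as input and outputs a new callable.
def compose(f, g)
  lambda { |x| f.call(g.call(x)) }
end

double    = lambda { |n| n * 2 }
increment = lambda { |n| n + 1 }
puts compose(increment, double).call(10)   # => 21

# And the standard library does the "function as input" half all day long:
odds = (1..10).select { |n| n.odd? }
p odds                                     # => [1, 3, 5, 7, 9]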

--
- Simen

 
 
Francis Cianfrocca
 
      08-09-2006
Simen Edvardsen wrote:
>
> In mathematics and computer science, higher-order functions or
> functionals are functions which do at least one of the following:
>
> * take one or more functions as an input
> * output a function
>
> According to this definition, Ruby's standard library is filled with
> higher order functions, as are the libraries of Ocaml, Scheme etc.



This comment is extremely interesting. It is actually *not* the case
that Ruby's "functions" are functions in the mathematical sense. They
are actually objects that point to blocks of executable code or to
closures. They have the ability to execute instructions that modify the
contents of memory or produce other side effects so as to make visible
the results of operations that simulate mathematical functions.

But true functional languages directly expose composable mathematical
functions (mappings from possibly infinite domains to ranges). Pure
functional languages don't expose any side effects at all. They are
extremely effective at modeling computations (I once wrote a complete,
correct qsort in 7 lines of ML) but not very good at executing
communications or data transport tasks, which are all about side
effects.
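
The same exercise rendered in Ruby rather than ML (not the original ML,
just the shape of the idea): nothing is mutated, and the sorted result
is simply described in terms of itself.

def qsort(list)
  return [] if list.empty?
  pivot, *rest = list
  smaller, larger = rest.partition { |x| x < pivot }
  qsort(smaller) + [pivot] + qsort(larger)
end

p qsort([3, 1, 4, 1, 5, 9, 2, 6])   # => [1, 1, 2, 3, 4, 5, 6, 9]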

I've often been mystified when people talk about Ruby's metaprogramming,
closures, lambdas, etc. as constituting a "functional" programming
style. Your comment is striking because it made me realize that perhaps
many people who learn languages like Scheme are actually trying to apply
the imperative, Turing-style computational model to languages that don't
support it. And that reinforces all the more my intuition that pure FP
is not well-suited for mainstream professional programming.

Years ago, I spent a lot of time trying to teach professional
programmers to grasp the functional style. And most people couldn't even
let go of the concept of variables as little boxes holding readable and
writable values. That's when I gave up on FP.

--
Posted via http://www.ruby-forum.com/.

 
 
ara.t.howard@noaa.gov
 
      08-09-2006
On Wed, 9 Aug 2006, Francis Cianfrocca wrote:

> Simen Edvardsen wrote:
> >
>> In mathematics and computer science, higher-order functions or
>> functionals are functions which do at least one of the following:
>>
>> * take one or more functions as an input
>> * output a function
>>
>> According to this definition, Ruby's standard library is filled with
>> higher order functions, as are the libraries of Ocaml, Scheme etc.

>
>
> This comment is extremely interesting. It is actually *not* the case
> that Ruby's "functions" are functions in the mathematical sense. They
> are actually objects that point to blocks of executable code or to
> closures. They have the ability to execute instructions that modify the
> contents of memory or produce other side effects so as to make visible
> the results of operations that simulate mathematical functions.
>
> But true functional languages directly expose composable mathematical
> functions (mappings from a possibly infinite domains to ranges). Pure
> functional languages don't expose any side effects at all. They are
> extremely effective at modeling computations (I once wrote a complete,
> correct qsort in 7 lines of ML) but not very good at executing
> communications or data transport tasks, which are all about side
> effects.
>
> I've often been mystified when people talk about Ruby's metaprogramming,
> closures, lambdas, etc. as constituting a "functional" programming
> style. Your comment is striking because it made me realize that perhaps
> many people who learn languages like Scheme are actually trying to apply
> the imperative, Turing-style computational model to languages that don't
> support them. And that reinforces all the more my intuition that pure FP
> is not well-suited for mainstream professional programming.
>
> Years ago, I spent a lot of time trying to teach professional
> programmers to grasp the functional style. And most people couldn't even
> let go of the concept of variables as little boxes holding readable and
> writable values. That's when I gave up on FP.


interesting. but then you have ocaml, which is a bastard of oop/fp and
imperative features - what's wrong there?

cheers.

-a
--
to foster inner awareness, introspection, and reasoning is more efficient than
meditation and prayer.
- h.h. the 14th dalai lama

 
 
Daniel Martin
 
      08-09-2006
Francis Cianfrocca <(E-Mail Removed)> writes:

> And my (unproved) intuition is that
> mathematically restricted DSLs (whether they are Lisp programs, Ruby
> programs, graphical widgets, physical constructions, or any other such
> equivalent thing) may be so much easier to tool that they can give a
> vast productivity improvement.


I believe that there are certainly programming constructs that can be
easier to reason about than the standard type of programming, but I'm
not sure how ease-of-reasoning relates to Turing completeness. For
instance, Haskell is certainly Turing-complete as a language, yet
chunks of Haskell code seem like they should be very easy to reason
about. On the other hand, to take a pathological example, Befunge-93
is not Turing-complete, but I doubt it's any easier to reason about
than, say, Brain****, which is Turing complete.

I have a theory that a big advantage to tool makers is the ability to
perform some sort of detailed type inference, where the types that can
be inferred are useful in the problem domain. (e.g. it does no good
to be able to infer a type of "int", when what you really need to know
is which of twenty different types all typedef'ed to int this is
supposed to be) So far as I can tell, the ease of useful type
inference is at least a different question from whether the language
being considered is Turing complete, and may in fact be completely
orthogonal to the question of Turing completeness.
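
Ruby obviously isn't the place to demonstrate type inference, but the
"which of twenty ints is this" point translates directly (a contrived
sketch with invented names): what a tool really wants to know is the
domain type, not the representation.

# Two quantities that are both "just integers" underneath but mean
# entirely different things in the problem domain.
UserId    = Struct.new(:value)
ProductId = Struct.new(:value)

def ship(user_id, product_id)
  raise ArgumentError, "expected a UserId"    unless user_id.is_a?(UserId)
  raise ArgumentError, "expected a ProductId" unless product_id.is_a?(ProductId)
  puts "shipping product #{product_id.value} to user #{user_id.value}"
end

ship(UserId.new(42), ProductId.new(7))     # fine
# ship(ProductId.new(7), UserId.new(42))   # caught immediately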

 
 
Simen Edvardsen
 
      08-09-2006
On 8/9/06, Francis Cianfrocca <(E-Mail Removed)> wrote:
> Simen Edvardsen wrote:
> >
> > In mathematics and computer science, higher-order functions or
> > functionals are functions which do at least one of the following:
> >
> > * take one or more functions as an input
> > * output a function
> >
> > According to this definition, Ruby's standard library is filled with
> > higher order functions, as are the libraries of Ocaml, Scheme etc.

>
>
> This comment is extremely interesting. It is actually *not* the case
> that Ruby's "functions" are functions in the mathematical sense. They
> are actually objects that point to blocks of executable code or to
> closures. They have the ability to execute instructions that modify the
> contents of memory or produce other side effects so as to make visible
> the results of operations that simulate mathematical functions.
>


Yes, in the post you quoted I said that Ruby functions are not true
mathematical functions. My point still holds: functions, or methods or
whatever you'd like to call them, that operate on closures (regardless
of whether the closures are themselves purely functional) are a very
functional idea. Whether you call it a "higher-order method" or
something else, it's still a functional idea.

> But true functional languages directly expose composable mathematical
> functions (mappings from a possibly infinite domains to ranges). Pure
> functional languages don't expose any side effects at all. They are
> extremely effective at modeling computations (I once wrote a complete,
> correct qsort in 7 lines of ML) but not very good at executing
> communications or data transport tasks, which are all about side
> effects.
>


What about Erlang? It's not purely functional, but it certainly favors
a functional style and is used extensively for tasks that seem exactly
like the ones you describe.

> I've often been mystified when people talk about Ruby's metaprogramming,
> closures, lambdas, etc. as constituting a "functional" programming
> style. Your comment is striking because it made me realize that perhaps
> many people who learn languages like Scheme are actually trying to apply
> the imperative, Turing-style computational model to languages that don't
> support them. And that reinforces all the more my intuition that pure FP
> is not well-suited for mainstream professional programming.
>


Perhaps not pure FP. I can agree with you on that. But a functional
style and ideas like closures, higher-order functions, pattern
matching etc. are all good tools which the mainstream could benefit
greatly from.

> Years ago, I spent a lot of time trying to teach professional
> programmers to grasp the functional style. And most people couldn't even
> let go of the concept of variables as little boxes holding readable and
> writable values. That's when I gave up on FP.
>
> --
> Posted via http://www.ruby-forum.com/.
>
>


But as long as *you* understand FP, there's no reason to give up on
it. Surely there are other intelligent programmers out there who
understand it too. I'm not saying we should abandon imperative
constructs altogether, as you seem to imply.

--
- Simen

 
 
Chad Perrin
 
      08-09-2006
On Wed, Aug 09, 2006 at 10:08:51PM +0900, Francis Cianfrocca wrote:
>
> Ruby's "higher order functions": aren't you confusing the true
> functional style with the simple ability of Ruby (and many other
> imperative languages) to treat functions and closures as objects?


Minor tangential quibble:
A closure, in essence, *is an object* by definition (definition of
"object", that is). Perhaps Jonathan Rees' menu of OOP features would
help make my point: http://mumble.net/~jar/articles/oo.html

I think what you mean is something more like "treat closures as the type
of object specified by a language's non-closure-centered object model",
which is not quite the same thing.
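
To illustrate (the account example is the old chestnut, written here in
Ruby form): a closure already has private state and behavior, which is
most of what anyone means by "object".

def make_account(balance)
  lambda do |message, amount = 0|
    case message
    when :deposit  then balance += amount
    when :withdraw then balance -= amount
    when :balance  then balance
    end
  end
end

acct = make_account(100)
acct.call(:deposit, 50)
acct.call(:withdraw, 30)
puts acct.call(:balance)   # => 120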

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Brian K. Reid: "In computer science, we stand on each other's feet."

 
 
Chad Perrin
 
      08-09-2006
On Wed, Aug 09, 2006 at 11:57:03PM +0900, Francis Cianfrocca wrote:
>
> I've often been mystified when people talk about Ruby's metaprogramming,
> closures, lambdas, etc. as constituting a "functional" programming
> style. Your comment is striking because it made me realize that perhaps
> many people who learn languages like Scheme are actually trying to apply
> the imperative, Turing-style computational model to languages that don't
> support them. And that reinforces all the more my intuition that pure FP
> is not well-suited for mainstream professional programming.


1. A functional "style" is not the same as a functional "language".
Adopting coding techniques that arose in functional languages into a
language whose syntax is more object-oriented and imperative in no way
changes the fact that those techniques come from a programming style
that arose, in large part, from a functional syntax.

2. I agree that *pure* functional programming is not well-suited to
mainstream professional computing, but then, neither is *pure*
imperative programming. That doesn't change the fact that a whole lot
of functional programming syntactic characteristics can be crammed into
a language before passing the point of diminishing returns. In fact, I
think that something as simple as teaching students to use prefix
arithmetic notation in grade school would turn the tables such that
functional programming syntax would probably be the "most natural" to
nascent programmers by quite a wide margin.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Ben Franklin: "As we enjoy great Advantages from the Inventions of
others we should be glad of an Opportunity to serve others by any
Invention of ours, and this we should do freely and generously."

 
 
 
 