Re: Number of languages known [was Re: Python is readable] - somewhatOT

 
 
Neil Cerutti
04-03-2012
On 2012-04-03, Dave Angel <(E-Mail Removed)> wrote:
> And I worked on a system where the microcode was in ROM, and
> there was a "patch board" consisting of lots of diodes and some
> EPROMs. The diodes were soldered into place to specify the
> instruction(s) to be patched, and the actual patches were in
> the EPROMs, which were reusable. The diodes were the only
> thing fast enough to "patch" the ROM, by responding more
> quickly than the ROM. This was back when issuing a new ROM was
> a very expensive proposition; there were masking charges, so
> you couldn't reasonably do low quantities.


I worked on a system where the main interface to the system was
poking and peeking numbers at memory addresses.
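
For anyone who has not met such a system: "peek" and "poke" just mean
reading and writing raw memory by address. A minimal Python sketch of
the idea, using ctypes against a buffer the process itself owns (the
offsets and values are illustrative; a real machine would use fixed
device addresses instead):

    import ctypes

    buf = ctypes.create_string_buffer(16)   # a small block of memory we own
    addr = ctypes.addressof(buf)            # its base address

    # "poke": write one byte at an address
    ctypes.memmove(addr + 3, bytes([0x2A]), 1)

    # "peek": read the bytes back from that address
    print(ctypes.string_at(addr, 16))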

--
Neil Cerutti
 
rusi
04-03-2012
All this futuristic grandiloquence:

On Apr 3, 10:17 pm, Nathan Rice <(E-Mail Removed)>
wrote:
> The crux of my view is that programming languages exist in part
> because computers in general are not smart enough to converse with
> humans on their own level, so we have to talk to them like autistic 5
> year-olds. That was fine when we didn't have any other options, but
> all the pieces exist now to let computers talk to us very close to our
> own level, and represent information in the same way we do. Projects
> like IBM's Watson, Siri, Wolfram Alpha and Cyc demonstrate pretty
> clearly to me that we are capable of taking the next step, and the
> resurgence of the technology sector along with the shortage of
> qualified developers indicates to me that we need to move now.


needs to be juxtaposed with this antiquated view

> I would argue that the computer is the tool, not the language.



... a view that could not be held by an educated person after the
1960s -- i.e. when it became amply clear to all that the essential and
hard issues in CS are about software, not hardware.
 
Nathan Rice
04-03-2012
>> > A carpenter uses his tools -- screwdriver, saw, planer --to do
>> > carpentry
>> > A programmer uses his tools to do programming -- one of which is
>> > called 'programming language'

>>
>> > Doing programming without programming languages is like using toenails
>> > to tighten screws

>>
>> I would argue that the computer is the tool, not the language.

>
> "Computer science is as much about computers as astronomy is about
> telescopes" -- E W Dijkstra
>
> Here are some other attempted corrections of the misnomer "computer
> science":
> http://en.wikipedia.org/wiki/Compute...e_of_the_field


I view "computer science" as applied mathematics, when it deserves
that moniker. When it doesn't, it is merely engineering.

Ironically, telescopes are a tool that astronomers use to view the stars.


On Tue, Apr 3, 2012 at 1:25 PM, rusi <(E-Mail Removed)> wrote:
> All this futuristic grandiloquence:
>
> On Apr 3, 10:17 pm, Nathan Rice <(E-Mail Removed)>
> wrote:
>> The crux of my view is that programming languages exist in part
>> because computers in general are not smart enough to converse with
>> humans on their own level, so we have to talk to them like autistic 5
>> year-olds. That was fine when we didn't have any other options, but
>> all the pieces exist now to let computers talk to us very close to our
>> own level, and represent information in the same way we do. Projects
>> like IBM's Watson, Siri, Wolfram Alpha and Cyc demonstrate pretty
>> clearly to me that we are capable of taking the next step, and the
>> resurgence of the technology sector along with the shortage of
>> qualified developers indicates to me that we need to move now.

>
> needs to be juxtaposed with this antiquated view
>
>> I would argue that the computer is the tool, not the language.

>
>
> ... a view that could not be held by an educated person after the
> 1960s -- i.e. when it became amply clear to all that the essential and
> hard issues in CS are about software, not hardware.


I'll go ahead and forgive the club-handed fallacies, so we can have a
nice discussion of your primary point. What a civil troll I am.

Let's start with some analogies. In cooking, chefs use recipes to
produce a meal; the recipe is not a tool. In architecture, a builder
uses a blueprint to produce a building; the blueprint is not a tool.
In manufacturing, expensive machines use plans to produce physical
goods; the plans are not the tools.

You could say the compiler is a tool, or a development environment is
a tool. The programming language is a mechanism for communication.
 
Terry Reedy
04-03-2012
On 4/3/2012 8:39 AM, Nathan Rice wrote:

> Ultimately, the answers to your questions exist in the world for you
> to see. How does a surgeon describe a surgical procedure? How does a
> chef describe a recipe? How does a carpenter describe the process of
> building cabinets? Aside from specific words, they all use natural
> language, and it works just fine.


Not really. Surgeons learn by *watching* a surgeon who knows the
operation and next (hopefully) doing a particular surgery under
supervision of such a surgeon, who watches and talks, and may even grab
the instruments and re-show. They then really learn by doing the
procedure on multiple people. They often kill a few on the way to mastery.

I first learned basic carpentry and other skills by watching my father.
I don't remember that he ever said anything about how to hold the tools.

I similarly learned basic cooking by watching my mom. My knowledge of
how to crack open an egg properly and separate the yolk from the rest is
a wordless memory movie.

--
Terry Jan Reedy

 
Phil Runciman
04-03-2012

> -----Original Message-----
> From: Mark Lawrence [(E-Mail Removed)]
> Sent: Wednesday, 4 April 2012 3:16 a.m.
> To: (E-Mail Removed)
> Subject: Re: Number of languages known [was Re: Python is readable] -
> somewhat OT
>
> On 03/04/2012 15:56, Chris Angelico wrote:
> > On Wed, Apr 4, 2012 at 12:46 AM, Grant Edwards <(E-Mail Removed)> wrote:
> >> Anybody remember DEC's VAX/VMS "patch" utility? Apparently, DEC
> >> thought it was a practical way to fix things. It had a built-in
> >> assembler and let you "insert" new code into a function by
> >> auto-allocating a location for the new code and hooking it into the
> >> indicated spot with jump instructions.
> >>
> >> The mind wobbled.

> >
> > Not specifically, but I _have_ heard of various systems whose source
> > code and binary were multiple years divergent. It's actually not a
> > difficult trap to fall into, especially once you start patching
> > running systems. I've had quite a few computers that have been unable
> > to reboot without assistance, because they go for months or years
> > without ever having to go through that initial program load. (I've

> had
> > _programs_ that were unable to load, for the same reason.) But
> > auto-allocating a new spot for your expanded function? That's just...
> > awesome. My mind is, indeed, wobbling.
> >
> > ChrisA

>
> Around 1990 I worked on Telematics kit. The patches on all their
> software were implemented via assembler once the original binary had
> been loaded into memory. They even came up with a system that let you
> select which patches you wanted and which you didn't, as e.g. some
> patches were customer specific.
>
> --
> Cheers.
>
> Mark Lawrence.
>


In the 70's I worked with Honeywell 16 Series computers controlling a variety of systems. The patches were loaded as a starting address followed by machine code, using a piece of software for this purpose. This all sounds rather similar to Mark's situation. The reason however is less obvious. On the H16 series we did not have a multi-access O/S, and the process of assembling and linking a large system involved many steps. Often the modifications required were trivial. It was generally easier to reload a memory dump from off paper tape and then apply the patches.
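
To make the patch format concrete, here is a rough Python sketch of
"a starting address followed by machine code" being applied to a core
image (the addresses and opcode words below are invented for
illustration, not real H16 code):

    # Each patch: (octal start address, list of machine-code words).
    patches = [
        (0o1000, [0o060000, 0o000042]),
        (0o1775, [0o140040]),
    ]

    memory = [0] * 32768            # core image: one 16-bit word per cell

    def apply_patches(memory, patches):
        """Copy each patch's words into memory at its start address."""
        for start, words in patches:
            for offset, word in enumerate(words):
                memory[start + offset] = word

    apply_patches(memory, patches)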


Phil Runciman
 
Phil Runciman
04-03-2012

> On Tue, Apr 3, 2012 at 4:20 PM, Terry Reedy <(E-Mail Removed)> wrote:


> > On 4/3/2012 8:39 AM, Nathan Rice wrote:
> >
> > > Ultimately, the answers to your questions exist in the world for you
> > > to see. How does a surgeon describe a surgical procedure? How does
> > > a chef describe a recipe? How does a carpenter describe the process
> > > of building cabinets? Aside from specific words, they all use
> > > natural language, and it works just fine.

> >
> >
> > Not really. Surgeons learn by *watching* a surgeon who knows the operation
> > and next (hopefully) doing a particular surgery under supervision of such a
> > surgeon, who watches and talks, and may even grab the instruments and
> > re-show. They then really learn by doing the procedure on multiple
> > people. They often kill a few on the way to mastery.

>
>
> Well, there is declarative knowledge and procedural knowledge. In all
> these cases, only the procedural knowledge is absolutely necessary,
> but the declarative knowledge is usually a prerequisite to learning
> the procedure in any sort of reasonable manner.


There is also tacit knowledge. Such knowledge is a precursor to declarative knowledge and therefore procedural knowledge. "Tacit knowledge is not easily shared. It involves learning and skill, but not in a way that can be written down. Tacit knowledge consists often of habits and culture that we do not recognize in ourselves." (Wikipedia)

The process of eliciting tacit knowledge may be time consuming and require patience and skill. The following book covers aspects of this: Nonaka, Ikujiro; Takeuchi, Hirotaka (1995), The knowledge creating company: how Japanese companies create the dynamics of innovation.

Phil Runciman
 
Mark Lawrence
04-03-2012
On 03/04/2012 19:42, Nathan Rice wrote:

> I view "computer science" as applied mathematics, when it deserves
> that moniker. When it doesn't, it is merely engineering.
>


Is it still April first in your time zone?

--
Cheers.

Mark Lawrence.

 
Steven D'Aprano
04-04-2012
On Tue, 03 Apr 2012 13:17:18 -0400, Nathan Rice wrote:

> I have never met a programmer that was not completely into computers.
> That leaves a lot unspecified though.


You haven't looked hard enough. There are *thousands* of VB, Java, etc.
code monkeys who got into programming for the money only and who have
zero inclination to expand their skills or knowledge beyond that
necessary to keep their job.

Go to programming blogs, and you will find many examples of some
allegedly professional programmer selecting an arbitrary blog post to ask
"Pls sombody write me this code", where "this code" is either an utterly
trivial question or a six month project.


> As part of my troll-outreach effort, I will indulge here. I was
> specifically thinking about some earlier claims that programming
> languages as they currently exist are somehow inherently superior to a
> formalized natural language in expressive power.


I would argue that they are, but only for the very limited purpose for
which they are written. With the possible exception of Inform 7, most
programming languages are useless at describing (say) human interactions.

Human languages are optimised for many things, but careful, step-by-step
algorithms are not one of them. This is why mathematicians use a
specialist language for their problem domain, as do programmers. Human
language is awfully imprecise and often ambiguous; it encourages implicit
reasoning, and requires a lot of domain knowledge:

Joe snatched the hammer from Fred. "Hey," he said, "what are
you doing? Don't you know that he'll hit the roof if he catches
you with that?"


> I think part of this comes from the misconception that terse is better


+1


> The crux of my view is that programming languages exist in part because
> computers in general are not smart enough to converse with humans on
> their own level, so we have to talk to them like autistic 5 year-olds.
> That was fine when we didn't have any other options, but all the pieces
> exist now to let computers talk to us very close to our own level, and
> represent information in the same way we do.


I think you're dreaming. We (that is to say, human beings in general, not
you and I specifically) cannot even talk to each other accurately,
precisely and unambiguously all the time. Natural language simply isn't
designed for that -- hence we have specialist languages like legal
jargon, mathematics, and programming languages, for specialist purposes.



--
Steven
 
rusi
04-04-2012
On Apr 3, 11:42 pm, Nathan Rice <(E-Mail Removed)>
wrote:
> Let's start with some analogies. In cooking, chefs use recipes to
> produce a meal; the recipe is not a tool. In architecture, a builder
> uses a blueprint to produce a building; the blueprint is not a tool.
> In manufacturing, expensive machines use plans to produce physical
> goods; the plans are not the tools.
>
> You could say the compiler is a tool, or a development environment is
> a tool. The programming language is a mechanism for communication.


Long personal note ahead.
tl;dr version: Computers are such a large shift for human civilization
that generally we don't get what that shift is about or towards.
------
Longer version
My mother often tells me (with some awe): You are so clever! You know
how to use computers! (!?!?)

I try to tell her that a computer is not a machine like a car is (she
is better with things like cars than most of her generation). Its
physical analogy to a typewriter is surprisingly accurate. In fact
it's more like a pen than other machines, and its civilizational
significance is larger than Gutenberg's press and is on par with the
'invention' (or should I say discovery?) of language as a fundamental
fact of what it means to be human.

[At this point or thereabouts my communication attempt breaks down
because I am trying to tell her of the huge significance of
programming...]

A pen can be used to write a love-letter or a death-sentence, a text-
book of anatomy or a symphony.
And yet it would be a bizarre superman who could do all these.
Likewise (I vainly try to communicate with my mother!) I can't
design machines (with autocad) or paint (with photoshop) or ...
probably 99% of the things that people use computers for.
And so saying that I 'know computers' is on par with saying that
because I know (how to use a pen to) fill up income tax forms, I
should also know how to (use a pen to) write Shakespearean sonnets.

There is a sense in which a pen is a 'universal device.' To some
extent the layman can get this.
There is a larger sense in which the computer is a universal device
(aka universal turing machine).
In my experience, not just 'my mother's' but even PhDs in computer
science don't get what this signifies.

This sense can (somewhat?) be appreciated if we see that the pen is
entirely a declarative tool.
The computer is declarative+imperative.
The person who writes the love-letter needs the postman to deliver it.
The judge may write the death-sentence. A hangman is needed to execute
it.
When it comes to computers, the same device can write the love-letter/
death-sentence as the one which mails/controls the electric chair.

Let me end with a quote from Dijkstra: http://www.smaldone.com.ar/documento...36_pretty.html

In the long run I expect computing science to transcend its parent
disciplines, mathematics and logic, by effectively realizing a
significant part of Leibniz's Dream of providing symbolic calculation
as an alternative to human reasoning. (Please note the difference
between "mimicking" and "providing an alternative to": alternatives
are allowed to be better.)

Needless to say, this vision of what computing science is about is not
universally applauded. On the contrary, it has met widespread --and
sometimes even violent-- opposition from all sorts of directions. I
mention as examples

(0) the mathematical guild, which would rather continue to believe
that the Dream of Leibniz is an unrealistic illusion

(1) the business community, which, having been sold to the idea that
computers would make life easier, is mentally unprepared to accept
that they only solve the easier problems at the price of creating much
harder ones

(2) the subculture of the compulsive programmer, whose ethics
prescribe that one silly idea and a month of frantic coding should
suffice to make him a life-long millionaire

(3) computer engineering, which would rather continue to act as if it
is all only a matter of higher bit rates and more flops per second

(4) the military, who are now totally absorbed in the business of
using computers to mutate billion-dollar budgets into the illusion of
automatic safety

(5) all soft sciences for which computing now acts as some sort of
interdisciplinary haven

(6) the educational business that feels that, if it has to teach
formal mathematics to CS students, it may as well close its schools.
 
Steven D'Aprano
04-04-2012
On Tue, 03 Apr 2012 08:39:14 -0400, Nathan Rice wrote:

> Much like
> with the terminal to GUI transition, you will have people attacking
> declarative natural language programming as a stupid practice for noobs,
> and the end of computing (even though it will allow people with much
> less experience to be more productive than them).


I cry every time I consider GUI programming these days.

In the late 1980s and early 1990s, Apple released a product, HyperCard,
that was a combination GUI framework and natural-ish programming
language. It was an astonishing hit with non-programmers, as it allowed
people to easily move up from "point and click" programming to "real"
programming as their skills improved.

Alas, it has been abandoned by Apple, and while a few of its intellectual
successors still exist, they are very niche.

I *really* miss HyperCard. Not so much for the natural language syntax
as for the astonishingly simple and obvious GUI framework.

To get a flavour of the syntax, see OpenXION:

http://www.openxion.org

and for a hint of the framework, see Pythoncard:

http://pythoncard.sourceforge.net
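
For anyone who never used it, here is a tiny HyperTalk-style handler,
reconstructed from memory (so take the details as approximate rather
than verified against OpenXION's dialect):

    on mouseUp
      ask "What is your name?"
      put "Hello," && it into card field "greeting"
    end mouseUp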


> Ultimately, the answers to your questions exist in the world for you to
> see. How does a surgeon describe a surgical procedure? How does a chef
> describe a recipe? How does a carpenter describe the process of
> building cabinets? Aside from specific words, they all use natural
> language, and it works just fine.


No they don't. In general they don't use written language at all, but
when they are forced to, they use a combination of drawings or
illustrations plus a subset of natural language plus specialist jargon.

Programming languages include both specialist grammar and specialist
semantics. That makes them a cant or an argot.



--
Steven
 