Recently I compared an old DOS command for deleting all the files in a directory with a "modernised" scripted equivalent, and noticed the scripted version required roughly 50 times as many keystrokes to achieve the same outcome.
Are these additional keystrokes enhancing productivity? Are they serving a purpose that has been quantified, for example reducing coding error rates?
The issue as I see it is that a computer language written primarily to accommodate the von Neumann architecture - rather than the way we think - forces us to juggle three problem domains in our heads: (a) the original problem, (b) the problem restructured to fit the von Neumann architecture, and (c) the mapping rules needed to translate back and forth between (a) and (b).
As a rule of thumb, the more efficient a computer language's notation - in the sense that it lets you work directly with the problem at hand - the lower the coding overhead. Lower coding overhead makes problem solving more tractable and thereby reduces both the amount of code and the room for error. It should certainly not increase the workload!
Which computer language in your opinion makes for the most efficient problem resolution platform - in that it enables you to think directly in terms of the original problem without having to do cross-domain problem juggling?
For interest I did a byte count of 37 different solutions to Conway's game of life and came up with the following stats:
J : 80,
APL : 145,
Mathematica : 182,
Ursala : 374,
JAMES II : 394,
SETL : 559,
ZPL : 652,
PicoLisp : 906,
F# : 1029,
Vedit macro language : 1239,
AutoHotkey : 1344,
E : 1365,
Perl 6 : 1372,
TI-89 BASIC : 1422,
Perl : 1475,
PureBasic : 1526,
OCaml : 1538,
Ruby : 1567,
Forth : 1607,
Python : 1638,
Haskell : 1771,
Clojure : 1837,
Tcl : 1888,
R : 2031,
Common Lisp : 2185,
Oz : 2320,
Scheme : 2414,
Fortran : 2485,
C : 2717,
Ada : 2734,
D : 3040,
C# : 3409,
6502 Assembly : 3496,
Delphi : 3742,
ALGOL 68 : 3830,
VB.NET : 4607,
Java : 5138,
Scala : 5427
(See e.g. http://rosettacode.org/wiki/Conway's_Game_of_Life)
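For a concrete reference point for these byte counts, here is a minimal Game of Life step in conventional imperative style (a sketch of my own, not one of the Rosetta Code entries) - even stripped to essentials, the explicit loops and coordinate bookkeeping take far more text than the array-language one-liners:

```python
# One generation of Conway's Game of Life on a bounded grid.
# Live cells are a set of (x, y) coordinates; cells off the grid are dead.
def step(cells, width, height):
    nxt = set()
    for x in range(width):
        for y in range(height):
            # Count the eight neighbours of (x, y).
            n = sum((x + dx, y + dy) in cells
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
            # A live cell survives with 2 or 3 neighbours;
            # a dead cell is born with exactly 3.
            if n == 3 or (n == 2 and (x, y) in cells):
                nxt.add((x, y))
    return nxt

# A blinker flips between a vertical and a horizontal bar each generation.
blinker = {(1, 0), (1, 1), (1, 2)}
print(sorted(step(blinker, 3, 3)))  # -> [(0, 1), (1, 1), (2, 1)]
```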
Comments?
Please be specific about the merits of the notational approach the language you critique takes and do so from a reasonably high level - preferably with direct project experience.
Later logicians wanted to develop notation that would represent not only formulas in mathematics, but also deductions and proofs. Boole had shown around 1850 that basic propositional logic could be represented in mathematical terms.
Mathematical notation is widely used in mathematics, science, and engineering for representing complex concepts and properties in a concise, unambiguous and accurate way. For example, E = mc² is the quantitative representation in mathematical notation of mass–energy equivalence.
François Viète (Latin: Vieta), a great French mathematician, is credited with the invention of this system, and is therefore known as the "father of modern algebraic notation" [3, p. 268].
Greek mathematician Diophantus was one of the pioneers of syncopated algebra. In this stage of notation, some shorthand was used along with prose. Indian mathematicians developed a syncopated algebraic notation independently of Diophantus. Around 1500 AD, symbolic algebra began to develop.
You used Conway's Game of Life as an example, and no language solves it more elegantly or efficiently than APL. The reason is its full array/matrix manipulation via very powerful single- or multi-character operators.
See: Whatever Happened to APL? and my story about my combinatorics assignment that compares APL with PL/I.
If you're talking about "efficient" in terms of keystrokes to solve a problem, APL will be tough to beat.
Your byte count of 145 for the APL solution to Conway's Game of Life is misleading: the solution you were looking at is a very inefficient one.
This is one solution:
[APL one-liner, posted as an image; source: catpad.net]
That's 68 bytes and beats the J solution. I think there are other APL solutions that are even better.
Also see this video about it.
Those seizing on keystroke counts as a measure of efficiency - and declaring that measure harmful - are missing the point indicated by the title of this discussion.
A well-designed, notationally-dense language like APL or J gives us high-level computational concepts embedded in a simple, consistent framework which allows us to think more easily about complex problems. The small number of keystrokes is a side-effect of this.
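To make that concrete, the classic APL/J approach computes every cell's neighbour count at once by summing eight rotated copies of the whole grid, then applies the birth/survival rule elementwise. Here is a sketch of that technique in plain Python (my own translation, not taken from any of the counted solutions) - note how the verbosity comes from spelling out, loop by loop, what APL expresses with a handful of rotation and reduction operators:

```python
# Array-style Game of Life step on a torus: neighbour counts are the
# elementwise sum of the eight wrapped rotations of the grid - the
# analogue of APL's +/ over ¯1 0 1 ∘.⊖ and ∘.⌽ shifts.
def life_step(grid):
    h, w = len(grid), len(grid[0])

    def shifted(dy, dx):
        # The whole grid rotated by (dy, dx), wrapping at the edges.
        return [[grid[(y + dy) % h][(x + dx) % w] for x in range(w)]
                for y in range(h)]

    shifts = [shifted(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
              if (dy, dx) != (0, 0)]
    counts = [[sum(s[y][x] for s in shifts) for x in range(w)]
              for y in range(h)]
    # Elementwise rule: a cell is alive next generation iff it has
    # 3 neighbours, or 2 neighbours and is already alive.
    return [[1 if counts[y][x] == 3 or (counts[y][x] == 2 and grid[y][x])
             else 0 for x in range(w)] for y in range(h)]
```

The point is not the Python itself but the shape of the solution: once "rotate", "sum", and "compare" are primitive whole-array operations, the program is a direct transcription of the rules of the game.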