Posted 2011-04-27 16:59:00 GMT
There was a time at Bell Labs when the so-called Area 10, from which so much of UN*X sprang, evangelised the dual view of programs as both tools and text (Derman: My Life as a Quant). Tools like the yacc parser generator and lex were invented there. Nowadays the fashion has swung towards writing by hand those things which were once generated. Programming systems have become more expressive, and integrated with debugger and editor environments, so the benefits of a specialised language are lower and its costs, in lost convenience, higher.
The increased expressiveness of modern mainstream programming languages over the FORTRAN of yore has paradoxically caused an explosion in boilerplate, as people painstakingly reimplement operator precedence by hand in their parsers.
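To make that boilerplate concrete, here is a minimal sketch (my illustration, not from the post) of the precedence-climbing code that a pair of yacc `%left` declarations once generated automatically; every new operator means editing both the precedence table and the dispatch switch by hand:

```cpp
#include <cctype>
#include <string>

// Hand-written precedence climbing over integer arithmetic.
// In yacc this whole structure came from two lines:
//   %left '+' '-'
//   %left '*' '/'
struct Parser {
    const char* p;
    explicit Parser(const char* s) : p(s) {}

    // Parse a non-negative integer literal.
    int primary() {
        while (std::isspace(*p)) ++p;
        int v = 0;
        while (std::isdigit(*p)) v = v * 10 + (*p++ - '0');
        return v;
    }

    // The precedence table, maintained by hand.
    static int precedence(char op) {
        switch (op) {
            case '+': case '-': return 1;
            case '*': case '/': return 2;
            default: return 0;  // not an operator
        }
    }

    // Precedence climbing: bind operators at or above min_prec.
    int expression(int min_prec = 1) {
        int lhs = primary();
        while (std::isspace(*p)) ++p;
        while (precedence(*p) >= min_prec) {
            char op = *p++;
            // Left associativity: the right operand only binds
            // strictly tighter operators.
            int rhs = expression(precedence(op) + 1);
            switch (op) {  // dispatch, also maintained by hand
                case '+': lhs += rhs; break;
                case '-': lhs -= rhs; break;
                case '*': lhs *= rhs; break;
                case '/': lhs /= rhs; break;
            }
            while (std::isspace(*p)) ++p;
        }
        return lhs;
    }
};

int evaluate(const std::string& s) { return Parser(s.c_str()).expression(); }
```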
Full meta-programming is a huge boon to the development of application-specific software, which is, in the end, any large project. There is an unfortunate hysteresis in that an application-specific language brings vast costs: it is expensive to start using a new language (which, as a project grows sufficiently large, becomes inevitable); expensive to maintain the separate compilation stage or auxiliary systems; and expensive to redesign or replace the language once its deficiencies become unbearable (which, as the project ages, is also inevitable).
Given, for example, that the increased expressiveness of C++0x allows much to be done without macros that used to need them, what advantage does first-class meta-programming have over template-based or type-system-structured meta-programming? Clearly, a huge reduction in complexity: what is the point of wedging a Turing-complete template language, executed at compile time, on top of a Turing-complete base language, when one language would suffice perfectly well?
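As a small illustration of the two-languages point, here is the same factorial written twice in C++: once in the template language (recursion by specialisation) and once in the base language using C++0x constexpr; the static_assert checks they agree at compile time:

```cpp
// The template version: a second, structurally different language
// (recursion by partial specialisation) wedged on top of C++ purely
// so the computation can run at compile time.
template <unsigned N>
struct Factorial {
    static const unsigned long value = N * Factorial<N - 1>::value;
};
template <>
struct Factorial<0> {
    static const unsigned long value = 1;
};

// The base language already expresses the same thing directly; with
// constexpr (or with first-class meta-programming) one definition
// serves both compile time and run time.
constexpr unsigned long factorial(unsigned n) {
    return n == 0 ? 1 : n * factorial(n - 1);
}

// One computation, two languages.
static_assert(Factorial<5>::value == factorial(5),
              "template and base-language results must agree");
```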
It is now quite possible to define and register functions for an RPC layer flexibly, purely in C++; it is even possible to do it relatively neatly, handling type conversions at compile time without overhead. It was not possible before; should all the systems that were built to do this be abandoned?
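A hedged sketch of what such in-language registration can look like: RpcRegistry, register_fn and the string-based wire format are invented for illustration (this is not tpd2's or any real library's API), and it uses C++14's std::index_sequence, which postdates the post. The point is that the argument-conversion stubs are derived from the registered function's own signature rather than generated by an external IDL compiler:

```cpp
#include <functional>
#include <map>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Hypothetical RPC layer: arguments arrive as strings and are converted
// to the registered function's parameter types, with the conversions
// chosen at compile time from the signature.
class RpcRegistry {
    std::map<std::string,
             std::function<std::string(std::vector<std::string>)>> handlers_;

    // Convert one wire-format string to a typed argument.
    template <typename T>
    static T convert(const std::string& s) {
        std::istringstream in(s);
        T v;
        in >> v;
        return v;
    }

    // Expand the argument pack: convert raw[I] to the I-th parameter type.
    template <typename R, typename... Args, std::size_t... I>
    static std::string invoke(R (*fn)(Args...),
                              const std::vector<std::string>& raw,
                              std::index_sequence<I...>) {
        std::ostringstream out;
        out << fn(convert<Args>(raw.at(I))...);
        return out.str();
    }

public:
    // Registration derives the conversion stub from fn's signature.
    template <typename R, typename... Args>
    void register_fn(const std::string& name, R (*fn)(Args...)) {
        handlers_[name] = [fn](std::vector<std::string> raw) {
            return invoke(fn, raw, std::index_sequence_for<Args...>{});
        };
    }

    std::string call(const std::string& name, std::vector<std::string> raw) {
        return handlers_.at(name)(std::move(raw));
    }
};

// An ordinary function; no stubs or annotations needed to expose it.
int add(int a, int b) { return a + b; }
```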
One problem still unsolvable in C++, or in weaker systems like Scala, is that one cannot readily define types that should be represented in memory in a special way (for example, position-independently, or with persistence to a backing store). For example, in tpd2 there is a set of objects (including the comments on this blog) that are automatically persisted to disk. These are defined with a custom "defrecord" instead of a standard "defstruct", and that is all; there is no need for runtime reflection to determine the static type of the object.
Fully flexible first-class meta-programming has the advantage that there is a single language to learn, not a language and then a language within it for specifying dependent types. It matters most as projects grow. Meta-programming is all about scaling, and scaling a codebase is hard enough without being ambushed by new techniques. Meta-programming should be a normal tool not an extra-ordinary one.
Post a comment
"Meta-programming should be a normal tool not an extra-ordinary one."
Short examples of meta-programming being used as a normal tool would be very helpful.
Posted 2011-04-27 21:03:18 GMT by Anonymous from 18.104.22.168
I've tried to give examples from tpd2 in this (very briefly) and in previous articles. I guess I should write up a few examples of macrolet. I'm really struggling to see how to explain all this and any more ideas would be great!
Posted 2011-04-28 14:50:23 GMT by John Fremlin
In my opinion Peter Seibel's book "Practical Common Lisp" gives many good examples of down-to-earth meta-programming.
http://www.gigamonkeys.com/book/ (e.g. chapter 9).
Posted 2011-05-03 21:38:13 GMT by Anon