Programs typically read input data, operate on that data, and produce output data. Metaprograms read another program, manipulate it, and return a modified program. Sometimes a metaprogram can even change its own behaviour by updating itself. The act of writing metaprograms is called metaprogramming. The language used to write metaprograms is called the metalanguage.

In general, where code itself needs to change with the data, metaprogramming can be an effective approach. Metaprograms treat code as data, and data as code.

While metaprogramming has been around for decades, it's only from the mid-1990s that it has received increasing attention. More and more languages now support metaprogramming. Principles of metaprogramming have led to higher levels of abstraction such as meta-modeling, metadesign, metaengineering and Model-Driven Engineering (MDE).


  • Could you explain metaprogramming with an example?

    Assume you're asked to write a program to print numbers 1 to 1000 but without using any looping constructs in the code. The obvious way to do this is to explicitly print each number. Writing such code is tedious for the programmer: in a shell script, you would have to type echo 1, echo 2, and so on until echo 1000. However, it's possible to write another shell script with a for loop that outputs the desired program containing these 1000 echo commands. In other words, we've written a program that produces another program.

    In the above example, a shell script produces another shell script. This need not be the case. The metaprogram could be in a different language than the target language. For example, a Ruby script could be used to output C source code.
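To illustrate, here's a minimal sketch (not from any particular project) of a Python metaprogram whose target language is C. It emits a C program that prints the numbers 1 to 1000, mirroring the shell example above:

```python
# Metaprogram in Python; target language is C.
lines = ["#include <stdio.h>", "int main(void) {"]
for i in range(1, 1001):
    lines.append(f'    printf("%d\\n", {i});')
lines += ["    return 0;", "}"]
c_source = "\n".join(lines)
print(c_source.splitlines()[2])  # the first generated printf line
```

Compiling and running the emitted c_source would print 1 to 1000, with no loop appearing anywhere in the generated code.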

  • What are some use cases of metaprogramming?

    Metaprogramming has many applications: compiler generation, application generation, code analysis, generic component design, program transformations, software maintenance/evolution/configuration, anticipatory optimization, design patterns, partial evaluation, and more.

    Compilers, transpilers, assemblers and interpreters are examples of metaprograms. They take programs in one form and transform them into machine code, bytecode or even source code in another language. Lex and Yacc are metaprograms. Programs that read database metadata and output EJB or XML code are further examples. Dynamic or interpreted languages usually have an eval() function that allows execution of code supplied as strings. This is also metaprogramming.

    ActiveRecord is a database ORM framework used in Ruby on Rails. When a method that's not defined gets called, ActiveRecord creates it on the fly based on predefined naming conventions. This means developers write less code manually.
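A rough Python analogue of this behaviour (a hypothetical Record class, not ActiveRecord itself) can be sketched with __getattr__, which intercepts calls to undefined methods:

```python
class Record:
    """Hypothetical record with ActiveRecord-style dynamic finders."""
    def __init__(self, **fields):
        self.fields = fields

    def __getattr__(self, name):
        # Synthesize methods like find_by_<field> on the fly,
        # following a naming convention rather than explicit definitions.
        if name.startswith("find_by_"):
            field = name[len("find_by_"):]
            return lambda value: self.fields.get(field) == value
        raise AttributeError(name)

user = Record(name="Ada", role="admin")
print(user.find_by_name("Ada"))  # find_by_name was never defined; prints True
```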

    Metaprogramming has been applied to numerical algorithms such as for solving Ordinary Differential Equations (ODEs).

    Another example is the Hone modular graphing library written in Julia. Data points, axes and legends are composed as strings and executed to create the plot.

  • What's the formal definition of metaprogramming?

    One definition sees metaprogramming as introducing a higher level of abstraction: "the technique of specifying generic software source templates from which classes of software components, or parts thereof, can be automatically instantiated to produce new software components."

    We could define metalanguage thus: "any language or symbolic system used to discuss, describe, or analyze another language or symbolic system is a metalanguage." This leads to the definition of metaprogramming: "creating application programs by writing programs that produce programs."

    Another definition emphasizes program generation: "metaprograms manipulate object-programs" where metaprograms "may construct object-programs, combine object-program fragments into larger object-programs, observe the structure and other properties of object-programs." A simpler definition is, "Metaprogramming is writing programs that themselves write code."

  • What are the benefits of metaprogramming?

    Metaprogramming has many benefits:

    • Extensible: Since code is treated as data, it's easy to extend programs. Code can be added by simply adding metadata.
    • Performance: Instead of keeping variables in memory and passing them around, code can be composed as a string and executed directly. Metaprogramming enables abstractions without runtime penalty. It can push computations from runtime to compile time. Metaprograms can help tune programs to specific architectures. Specialized efficient programs are better than generic inefficient ones.
    • Less Code: In the long term, metaprogramming automates code writing and reduces manual effort. The use of macros (such as in C or C++) saves development time and promotes reusable code. Recurring code patterns can be abstracted and reused even when functions, generics or classes are unable to do this.
    • Correctness: Compiling code from a high-level language to machine code or bytecode is not something we wish to do manually. If done manually, it's easy to introduce errors. Metaprograms help in such tedious but necessary tasks.
    • Reasoning: Metaprograms such as flow analyzers and type checkers help us discover properties about programs, validate behaviour or improve performance.
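As a small illustration of the performance and less-code benefits, the following sketch (a hypothetical specialize_power function, assuming Python) generates a specialized function from a string instead of calling a generic power routine:

```python
def specialize_power(n):
    """Generate source for a specialized n-th power function and compile it."""
    body = " * ".join(["x"] * n)                  # e.g. "x * x * x" for n=3
    src = f"def power_{n}(x):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)                          # code as data, then executed
    return namespace[f"power_{n}"]

cube = specialize_power(3)
print(cube(5))  # → 125
```

The generated function does straight multiplications with no loop or exponent test at call time.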

  • How do we classify the various types of metaprogramming?

    In Homogeneous MP, the object language and metalanguage are the same. Examples include Racket, Template Haskell, MetaOCaml, and Converge. In Heterogeneous MP, these are different: for example, a Java compiler written in C. In Generative MP, the metaprogram generates an output program. In Intensional MP, the metaprogram analyzes the input program, by way of reflection. Combining these two classifications, we note an important class called Homogeneous Generative Meta-Programming (HGMP).

    Another classification is based on when metaprograms execute: Compile-Time MP (CTMP) or Run-Time MP (RTMP). CTMP examples include the Lisp family, Template Haskell, Converge and C++. RTMP examples include the MetaML family, JavaScript and printf-based MP. Converge and Scala support both. There are also implicit and explicit flavours of compile-time evaluation.

    CTMP and RTMP are also called static and runtime metaprogramming respectively. With RTMP, the metaprogram produces new code, which is immediately executed. If the new code is itself a metaprogram, we call this multi-stage programming.
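A toy two-stage sketch in Python (illustrative only; staged languages like MetaML use dedicated annotations rather than strings):

```python
ns = {}
# Stage 1 is a metaprogram: the code it produces (stage2) is itself code.
stage1 = "stage2 = 'result = 6 * 7'\nexec(stage2)"
exec(stage1, ns)  # running stage 1 generates and immediately runs stage 2
print(ns["result"])  # → 42
```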

    Metaprogramming also requires us to distinguish static code fragments (the metaprogram) from dynamic ones generated by the metaprogram. This gives rise to two flavours: manually annotated MP and automatically annotated MP.

  • In metaprogramming, how are programs represented as data?
    High-level characterisation of various HGMP languages. Source: Berger et al. 2017, fig. 1.

    Common ways of representing programs include:

    • Strings: Simplest and terse. Typically runtime evaluation but some languages offer compile-time support.
    • Abstract Syntax Trees (ASTs): Code fragments represented as a tree. For example, \(2+3\) can be written as \(ast_{add}(ast_{int}(2), ast_{int}(3))\).
    • Up MetaLevels (UpMLs): Represent AST-like structures as quoted chunks of normal program syntax. \(2+3\) can be written as \(\uparrow\{2+3\}\).
    • Down MetaLevels (DownMLs): Used to express holes in UpMLs. Evaluated first to yield an AST. If function \(f\) returns the AST for \(2+3\), then \(\uparrow\{\downarrow\{f()\} *4\}\) is a valid expression.

    Choosing a suitable representation depends on syntactic overhead, expressivity, support for the target language and support for generating 'valid' programs. Hygiene, the correct handling of free and bound variables, is also important. Typically, we want the terseness of strings (ASTs are verbose) and the syntactic correctness of ASTs. UpMLs (aka quasi-quotes or backquotes) and DownMLs (aka splices or inserts) offer the best of both worlds.
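Python's standard ast module can illustrate the AST representation. The sketch below builds the tree for \(2+3\) by hand, mirroring \(ast_{add}(ast_{int}(2), ast_{int}(3))\), and evaluates it (assuming Python 3.8+ for ast.Constant):

```python
import ast

# Hand-built AST for 2 + 3.
tree = ast.Expression(
    body=ast.BinOp(left=ast.Constant(2), op=ast.Add(), right=ast.Constant(3))
)
ast.fix_missing_locations(tree)  # fill in line/column metadata
value = eval(compile(tree, "<ast>", "eval"))
print(value)  # → 5

# The same tree, obtained by parsing a string instead of building it by hand:
print(ast.dump(ast.parse("2+3", mode="eval").body))
```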

    Lilis and Savidis (2020) give a more complete discussion of metaprogramming models, including macro systems, reflection systems, Meta Object Protocols (MOPs), Aspect-Oriented Programming (AOP), generative programming and multi-stage programming.

  • Which languages offer support for metaprogramming?
    Classification of metaprogramming languages and systems. Source: Lilis and Savidis 2020, table 8.

    Many languages support metaprogramming including Python, Ruby, JavaScript, Java, Go, Clojure, and Julia. Groovy, Java, Racket, Common Lisp and Scheme are examples that support both CTMP and RTMP. CPP, M4, Racket, Reflective Java and AspectC++ are examples that support Preprocessing-Time MP (PPTMP). Interested readers can refer to A Survey of Metaprogramming Languages by Lilis and Savidis (2020).

    Many languages such as Scala, Rust or JavaScript support metaprogramming by design. Others such as C++ evolved to support metaprogramming.

    Interpreted languages usually have the eval() function. Examples include Lisp, Perl, Ruby, Python, PHP and JavaScript. In such languages, we could generate the code within the program and pass this code into eval() for execution. For example, we could do this in Ruby: x = 3; s = 'x + 1'; puts eval(s).
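The same idea in Python, a direct translation of the Ruby one-liner above:

```python
x = 3
s = "x + 1"       # code arrives as a string
result = eval(s)  # and is evaluated at runtime
print(result)  # → 4
```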

    UpMLs are supported in Racket, MetaOCaml and Converge. Popular languages including Python, Ruby, Groovy and Perl support MOP based on metaclasses.
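As a taste of metaclass-based MOPs, here's a minimal Python sketch (a hypothetical AutoRepr metaclass, not from the article) that intercepts class creation and injects a method:

```python
class AutoRepr(type):
    """Hypothetical metaclass: adds a default __repr__ to classes it creates."""
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if "__repr__" not in namespace:
            cls.__repr__ = lambda self: f"<{name} {vars(self)}>"
        return cls

class Point(metaclass=AutoRepr):
    def __init__(self, x, y):
        self.x, self.y = x, y

print(repr(Point(1, 2)))  # → <Point {'x': 1, 'y': 2}>
```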

  • What are some challenges or shortcomings of metaprogramming?

    Metaprogramming, both in theory and practice, has many open problems. C++ template metaprogramming is not pretty. Scala had to rewrite its metaprogramming implementation. MetaOCaml has typing problems. There's no consistent terminology. Tooling is not mature: for example, it's hard to debug metaprograms.

    Executing strings as code can lead to insecure code, particularly when the strings come from untrusted sources. However, languages may provide some support to make it safer. For instance, Python's decorators are safer than strings for metaprogramming.
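For instance, a decorator manipulates a function object directly, with no string evaluation involved. A minimal sketch with a hypothetical logged decorator:

```python
import functools

def logged(func):
    """Wrap func so each call is reported; no eval() of strings needed."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        print(f"{func.__name__}{args} -> {result}")
        return result
    return wrapper

@logged
def add(a, b):
    return a + b

add(2, 3)  # prints "add(2, 3) -> 5"
```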

    Writing code as strings reduces readability and is likely to introduce bugs. Strings might use lots of regular expressions. When strings can't be parsed due to some programming error, we might see unexpected output.

    Metaprogramming is hard, and harder still for beginners. Someone who doesn't know the relevant language constructs (such as macros in Lisp) will find it very hard to understand metaprograms written by others.



Willard Van Orman Quine invents quasi-quote and splicing. These concepts would later prove to be useful for metaprogramming. In fact, the term quine is coined decades later to refer to a computer program that takes no input and outputs its own source code. Quines can be considered as special metaprograms.


In a short MIT memo, Timothy P. Hart introduces macros into Lisp. In the 1970s, Lisp becomes possibly the first language to support HGMP.

A non-exhaustive timeline of metaprogramming languages and systems. Source: Lilis and Savidis 2020, fig. 1.

Two paradigms of metaprogramming emerge: Aspect-Oriented Programming (AOP) and Multi-Stage Programming. It's also during the years 1995-2000 that many metalanguages and metaprogramming systems emerge. Mainstream languages such as Java and C++ provide better support for metaprogramming. The arrival of MetaML proves that HGMP is possible beyond syntactically simple languages like Lisp.

Sample Code

  • # Source:
    # Accessed 2021-09-09
    # metaprogram
    echo '#!/bin/sh' > program
    for i in $(seq 1000)
    do
        echo "echo $i" >> program
    done
    chmod +x program


  1. Apache Groovy. 2021. "Runtime and compile-time metaprogramming." Accessed 2021-09-09.
  2. Berger, Martin. 2016. "Foundations of meta-programming." Slides, August 9. Accessed 2021-09-09.
  3. Berger, Martin, Laurence Tratt, and Christian Urban. 2017. "Modelling homogeneous generative meta-programming." In: 31st European Conference on Object-Oriented Programming (ECOOP 2017), Peter Müller (ed), Leibniz International Proceedings in Informatics, Article no. 93, pp. 93:1–93:23. Accessed 2021-09-09.
  4. Bicking, Ian. 2004. "The Challenge Of Metaprogramming." Updated 2005-01-09. Accessed 2021-09-09.
  5. Boudreau, Emmett. 2020. "The Pitfalls Of Meta-Programming." Towards Data Science, on Medium, August 31. Accessed 2021-09-09.
  6. C2 Wiki. 2012. "Meta Programming." February 7. Accessed 2021-09-09.
  7. Damaševičius, Robertas, and Vytautas Štuikys. 2008. "Taxonomy of The Fundamental Concepts of Meta-Programming." Information Technology and Control, vol. 37, no. 2, June 13. Accessed 2021-09-09.
  8. Israely, Shlomi. 2019. "Meta-Programming — basic concepts and use cases." On Medium, June 22. Accessed 2021-09-09.
  9. Lanning, Erik. 2016. "Metaprogramming pt.1." Blog, Erik's CS Musings, April 6. Accessed 2021-09-09.
  10. Lilis, Yannis, and Anthony Savidis. 2020. "A Survey of Metaprogramming Languages." ACM Computing Surveys, vol. 52, no. 6, pp. 1–39, January. Accessed 2021-09-09.
  11. Mulansky, Mario, and Karsten Ahnert. 2011. "Metaprogramming Applied to Numerical Problems." AIP Conference Proceedings 1389, pp. 1582-1585. doi: 10.1063/1.3637933. Accessed 2021-09-09.
  12. Ortiz, Ariel. 2007. "An Introduction to Metaprogramming." Linux Journal, June 1. Accessed 2021-09-09.
  13. Sheard, Tim. 2000. "Accomplishments and Research Challenges in Meta-Programming." In 2nd Int. Workshop on Semantics, Applications, and Implementation of Program Generation, LNCS 2196, Springer-Verlag, pp. 2-44. Accessed 2021-09-09.
  14. Steele, Guy L. and Richard P. Gabriel. 1993. "The Evolution of Lisp." ACM SIGPLAN Notices, vol. 28, no. 3, pp. 231–270. doi: 10.1145/155360.155373. Accessed 2021-09-09.
  15. Toal, Ray. 2021. "Metaprogramming." Notes, CS Dept, Loyola Marymount University. Accessed 2021-09-09.
  16. Wikipedia. 2021. "Metaprogramming." Wikipedia, July 6. Accessed 2021-09-09.

Further Reading

  1. Lilis, Yannis, and Anthony Savidis. 2020. "A Survey of Metaprogramming Languages." ACM Computing Surveys, vol. 52, no. 6, pp. 1–39, January. Accessed 2021-09-09.
  2. Sheard, Tim. 2000. "Accomplishments and Research Challenges in Meta-Programming." In 2nd Int. Workshop on Semantics, Applications, and Implementation of Program Generation, LNCS 2196, Springer-Verlag, pp. 2-44. Accessed 2021-09-09.
  3. VanderPlas, Jake. 2012. "A Primer on Python Metaclasses." Blog, Pythonic Perambulations, December 1. Accessed 2021-09-09.
  4. Todorovic, Nikola. 2015. "Ruby Metaprogramming Is Even Cooler Than It Sounds." Toptal, October 29. Accessed 2021-09-09.
  5. Krauss, Aaron. 2016. "Programming Concepts: Type Introspection and Reflection." Blog, February 12. Accessed 2021-09-09.
  6. van Binsbergen, L. Thomas. 2018. "The Fundamental Constructs of Homogeneous Generative Meta-Programming: or Funcons for HGMP." Slides, University of London, January 17. Accessed 2021-09-09.

Cite As

Devopedia. 2021. "Metaprogramming." Version 3, September 13. Accessed 2024-06-25.
See Also

  • Quine
  • Program Introspection
  • Reflective Programming
  • Aspect-Oriented Programming
  • Template Metaprogramming
  • Homogeneous Generative Meta-Programming