Anton Antonov — 45 minutes
In this presentation, we discuss different ways of using Large Language Models (LLMs) in Raku.
We consider the following ways of using LLMs (a short code sketch follows the list):
- Literate programming
- LLM functions
- LLM chat objects
- LLM prompts
- Chatbooks (based on Jupyter)
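
As a rough illustration of the "LLM functions", "chat objects", and "prompts" items, here is a minimal sketch assuming the Raku packages "LLM::Functions" and "LLM::Prompts" are installed and an LLM service is accessible (e.g. via an API key in the environment); the exact function names and signatures are assumptions based on those packages and may differ between versions.

```raku
# A minimal sketch, assuming "LLM::Functions" and "LLM::Prompts"
# are installed and an LLM access key (e.g. OPENAI_API_KEY) is set.
use LLM::Functions;
use LLM::Prompts;

# LLM function: a functor whose invocations delegate to an LLM
my &summarize = llm-function('Summarize the following text in one sentence:');
say summarize('Raku is a member of the Perl family of programming languages.');

# LLM chat object: keeps the conversation context between invocations
my $chat = llm-chat('You are a helpful Raku tutor.');
say $chat.eval('How do I declare a hash in Raku?');
say $chat.eval('And how do I iterate over its keys?');

# LLM prompt: a pre-made prompt retrieved by name from the prompt repository
say llm-synthesize([llm-prompt('Yoda'), 'How do I become a Raku developer?']);
```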
The presentation features multiple demos and examples of LLM utilization, including:
- Data retrieval, reshaping, and visualization
- Computational workflows in Physics and Chemistry
- Test generation and narration
- Iterative grammar development
- Staging of number guessing games (man vs. machine, and machine vs. machine)
- "In place" document generation
- Code writing assistance
- Comparison with LLM implementations in Python and Mathematica
- *Others*