So Raku has a module for “few-shot” LLM training for DSLs … it would be cool to see an example DSL interpreter in Selkie (e.g. a window for DSL code and a window for its output).
- https://raku.land/zef:antononcube/LLM::Resources : Uses agentic LLM-graphs with asynchronous execution
- https://raku.land/zef:antononcube/ML::FindTextualAnswer : Finds answers to questions over provided texts (e.g. natural language code generation commands)
- https://raku.land/zef:antononcube/ML::NLPTemplateEngine : Fills in predefined code templates based on natural-language code descriptions/commands
- https://raku.land/zef:antononcube/DSL::Examples : Example translations of natural language commands to executable code
- https://raku.land/zef:apogee/LLM::Character implements CCv3, a standard for managing characters (system prompts) and lorebooks (injected snippets)
- https://raku.land/zef:apogee/LLM::Chat handles context shifting for long contexts, sampler settings, and templating for text completion, plus inference with or without streaming (via supply/tap)
- https://raku.land/zef:apogee/LLM::Data::Inference adds retries, JSON parsing & multi-model route handling to LLM::Chat
- https://raku.land/zef:apogee/LLM::Data::Pipeline allows you to declaratively build multi-step pipelines (simple agentic LLM use)
- https://raku.land/zef:apogee/HuggingFace::API is a partial wrapper around the HF API for grabbing tokenizer.json & tokenizer_config.json
- https://raku.land/zef:apogee/Template::Jinja2 is a near-complete impl of Jinja2 for parsing LLM text completion templates (can be used for anything you'd use Jinja2 for)
- https://raku.land/zef:apogee/Tokenizers is a thin wrapper around HF tokenizers, mostly for token counting
The Selkie side wouldn't be too difficult:
- FileBrowser widget to get the serialised bitmap.
- MultiLineInput for the query text input.
- ListView to show tags and fields.
- RichText for the AST pretty printer.
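For the interpreter core itself, plain Raku grammars already give you both halves of the two-window idea: a parse tree for the RichText AST pretty-printer, and a computed value for the output window. Here's a minimal sketch using a toy arithmetic DSL (the names `Calc` and `CalcActions` are made up for illustration, and no Selkie API is assumed):

```raku
# Toy arithmetic DSL: the grammar yields a parse tree (raw material for
# the AST pretty-printer) and the actions compute a value (the output).
grammar Calc {
    rule  TOP    { <expr> }
    rule  expr   { <term>   [ $<op>=['+'|'-'] <term>   ]* }
    rule  term   { <factor> [ $<op>=['*'|'/'] <factor> ]* }
    token factor { \d+ }
}

class CalcActions {
    method TOP($/)    { make $<expr>.made }
    method factor($/) { make +$/ }
    method expr($/) {
        my $val = $<term>[0].made;
        for $<op>.kv -> $i, $op {
            $val = ~$op eq '+' ?? $val + $<term>[$i + 1].made
                                !! $val - $<term>[$i + 1].made;
        }
        make $val;
    }
    method term($/) {
        my $val = $<factor>[0].made;
        for $<op>.kv -> $i, $op {
            $val = ~$op eq '*' ?? $val * $<factor>[$i + 1].made
                                !! $val / $<factor>[$i + 1].made;
        }
        make $val;
    }
}

my $m = Calc.parse('2 + 3 * 4 - 5', actions => CalcActions.new);
say $m.made;    # value for the output window: 9
say $m.gist;    # Match tree a pretty-printer could walk
```

The MultiLineInput's text would go into `Calc.parse`, `.made` fills the output window, and the `Match` object is what the RichText pretty-printer renders.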