;; Step 1 – install SBCL and Quicklisp
;; Step 2 – in the REPL, load the library
(ql:quickload :cl-gpt2)
;; Step 3 – load the model (downloads weights automatically)
(defparameter ai (cl-gpt2:load-model :gpt2-medium))
For a classic, lightweight alternative, cl-markov implements Markov-chain text generation. Load it with (ql:quickload :cl-markov) and train it on any plain-text corpus (e.g., Shakespeare).
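If cl-markov is not in your Quicklisp dist, the same idea fits in a few lines of portable Common Lisp. This is a minimal word-level sketch using only the standard library; the names tokenize, build-chain, and generate-text are ad-hoc choices for illustration, not any library's API:

```lisp
(defun whitespacep (ch)
  (member ch '(#\Space #\Newline #\Tab #\Return)))

(defun tokenize (text)
  "Split TEXT into whitespace-separated words."
  (loop with len = (length text)
        for start = (position-if-not #'whitespacep text)
          then (position-if-not #'whitespacep text :start end)
        while start
        for end = (or (position-if #'whitespacep text :start start) len)
        collect (subseq text start end)))

(defun build-chain (words)
  "Map each word to the list of words observed to follow it."
  (let ((table (make-hash-table :test #'equal)))
    (loop for (a b) on words while b
          do (push b (gethash a table)))
    table))

(defun generate-text (chain start length)
  "Random-walk CHAIN from START for up to LENGTH words."
  (loop with word = start
        repeat length
        collect word into words
        do (let ((nexts (gethash word chain)))
             (if nexts
                 (setf word (elt nexts (random (length nexts))))
                 (loop-finish)))          ; dead end: stop early
        finally (return (format nil "~{~a~^ ~}" words))))
```

Train it on the Shakespeare corpus by reading the file into a string, then call, e.g., (generate-text (build-chain (tokenize text)) "the" 50). Output quality is far below a neural model, but it runs anywhere with zero dependencies.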
For modern LLMs, cl-llama wraps llama.cpp and runs local GGUF models:
(ql:quickload :cl-llama)
(cl-llama:load-model "path/to/llama-2-7b.Q4_K_M.gguf")
(cl-llama:generate "Once upon a time")
If a Lisp library expects local weights, place these in ~/lisp-models/ and point your Lisp code there.
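The GGUF path above is a placeholder for whatever quantized model you have on disk. As one way to obtain such a file, a community-converted 4-bit Llama 2 7B can be fetched from Hugging Face; the repo and filename below are an example (TheBloke's conversion), so verify them before relying on this:

```shell
# Fetch a 4-bit quantized Llama 2 7B (~4 GB download).
# Repo/filename are an example community GGUF conversion;
# substitute whichever GGUF model you prefer.
mkdir -p ~/lisp-models
wget https://huggingface.co/TheBloke/Llama-2-7B-GGUF/resolve/main/llama-2-7b.Q4_K_M.gguf \
     -O ~/lisp-models/llama-2-7b.Q4_K_M.gguf
```

Then pass "~/lisp-models/llama-2-7b.Q4_K_M.gguf" to cl-llama:load-model.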
(defparameter *gpt2* (cl-gpt2:load-model :gpt2-small))
(cl-gpt2:generate *gpt2* "The meaning of life is" :length 50)
Model weights are cached in ~/.cache/cl-gpt2/.
Option B: cl-transformer – a general transformer library. It supports BERT, GPT, etc., and requires a manual weight download.
;; Step 4 – generate text
(cl-gpt2:generate ai "The future of artificial intelligence" :max-tokens 100 :temperature 0.8)
(ql:quickload :cl-gpt2)
On first use, it automatically downloads GPT-2 small (124M parameters, ~500 MB) from Hugging Face.
Download a training corpus for the Markov-chain approach:
wget https://www.gutenberg.org/files/100/100-0.txt -O shakespeare.txt
For a modern LLM generator in Lisp, use cl-gpt2 (easy) or cl-llama + llama.cpp (more powerful). Avoid implementing transformers from scratch unless your goal is educational.