LLMs as Files
Recently I needed to interact with an LLM from a throwaway shell script. Initially I took the easy route of shelling out to claude -p. Problem solved, script done, and I moved on to my next task.
Something in my brain, nestled next to the part that loves the ideas behind the Plan 9 operating system, kept spinning on the idea. Having successfully nerd-sniped myself, I did what any self-respecting programmer would do: I built a FUSE filesystem that makes interacting with large language models as easy as interacting with files.
> llm-mount --model qwen3p6-plus $HOME/qwen
Mounted: 01KQTCKX976TDQNVFB7Q55JNT2
Path: /home/kevsmith/qwen
Provider: fireworks
Model: qwen3p6-plus
> llm-mount -l
NAME PATH PROVIDER MODEL CREATED
01KQTCKX976TDQNVFB7Q55JNT2 /home/kevsmith/qwen fireworks qwen3p6-plus 2026-05-04T16:02:09-04:00
> echo "What's the boiling point of tungsten?" > $HOME/qwen/input
> tail -f $HOME/qwen/output
The boiling point of tungsten is approximately **5,555 °C (10,031 °F or 5,828 K)**.
This is the highest boiling point of any metal and one of the highest of all elements. Keep in mind that slight variations in
reported values can occur because measuring temperatures this extreme with high precision is experimentally challenging.
> llm-umount $HOME/qwen
Unmounted: /home/kevsmith/qwen
A minimal LLM chat client fits in ~40 lines of shell script.
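As a sketch of what such a client could look like (this is my own illustration, not the author's actual script; it assumes only the interface shown above: a mount directory exposing an input file for prompts and an output file for responses):

```shell
#!/bin/sh
# chat.sh -- minimal chat loop against an llm-mount mountpoint (a sketch).
# Assumes $MOUNT exposes `input` (write prompts) and `output` (read replies).
MOUNT="${1:-}"

ask() {
    # Send the prompt by writing it to the input file...
    printf '%s\n' "$1" > "$MOUNT/input"
    # ...then read the response back. A fancier client might `tail -f`
    # the output and watch for an end-of-response marker; we read once.
    cat "$MOUNT/output"
}

# Interactive loop: prompt, read a line, ask, repeat until EOF.
if [ -n "$MOUNT" ]; then
    while printf '> ' && IFS= read -r prompt; do
        [ -n "$prompt" ] && ask "$prompt"
    done
fi
```

The appeal of the file interface is exactly this: no SDK, no HTTP client, no JSON parsing; redirection and cat are the whole API.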
Code available here under the Apache 2.0 license.