Just read that there is an Anthropic xAI deal. But you can also
do neuro-symbolic compute at home, without the need for a big
data center. This might be interesting for Prologers seeking
data privacy and hosting independence, or developing hybrid
solutions and exploring new frontiers of reasoning. At the risk
of comparing apples with oranges, there is a formidable
race going on, and it is interesting to see how the AI laptop
landscape is forming. I was just toying around in LM Studio with
a DeepSeek V4 derivative, a model that came out 9 days ago:
Etc., etc.: it generates more text, all on a laptop that cost
only $1000 at the end of 2025, when there were some discounts.
The laptop meets the Windows Copilot+ specs. The secret sauce?
Some general matrix multiplications (GEMM) tucked into your iGPU.
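To make the point concrete: an LLM forward pass is dominated by GEMM calls of the form C = A @ B, and that is the operation the iGPU accelerates. A minimal sketch, with NumPy standing in for whatever BLAS or iGPU backend LM Studio actually dispatches to:

```python
import numpy as np

def gemm(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Plain general matrix multiplication: C[i,j] = sum_k A[i,k] * B[k,j]."""
    return a @ b

# One token's hidden state (1 x 4) times a tiny made-up weight matrix (4 x 2),
# the same shape of work a transformer layer does at much larger scale.
hidden = np.array([[1.0, 2.0, 3.0, 4.0]])
weights = np.array([[0.5, 0.0],
                    [0.0, 0.5],
                    [0.5, 0.0],
                    [0.0, 0.5]])
print(gemm(hidden, weights))  # [[2. 3.]]
```

The shapes and weights here are toy values; the point is only that the whole workload reduces to this one kernel.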
Some clever guy has just published a tin opener for LLMs.
They are indeed graph models, only baked into artificial
neural networks, at least as far as the knowledge part of an
LLM is concerned. People familiar with chat80 might recognize
this type of knowledge:
larql> DESCRIBE "France";
France
Edges (L14-27):
capital → Paris 1436.9 L27 (probe)
language → French 35.2 L24 (probe)
continent → Europe 14.4 L25 (probe)
borders → Spain 13.3 L18 (probe)
The tin opener is called LARQL (with Lazarus Query Language); it
decompiles transformer models into a queryable format called a vindex.
The tin opener also allows modifying models.
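The idea can be sketched in a few lines: once the facts are extracted from the model, they form a plain edge-labelled graph that ordinary code can query. The class and method names below are hypothetical, and this is not LARQL's actual vindex format or API; the France edges are taken from the DESCRIBE output above.

```python
from collections import defaultdict

class VIndex:
    """Toy stand-in for a decompiled knowledge graph: subject -> edges."""

    def __init__(self):
        # subject -> list of (relation, object, probe score)
        self.edges = defaultdict(list)

    def add(self, subj, rel, obj, score):
        self.edges[subj].append((rel, obj, score))

    def describe(self, subj):
        # Strongest edges first, mirroring the DESCRIBE listing.
        return sorted(self.edges[subj], key=lambda e: -e[2])

idx = VIndex()
idx.add("France", "capital", "Paris", 1436.9)
idx.add("France", "language", "French", 35.2)
idx.add("France", "continent", "Europe", 14.4)
idx.add("France", "borders", "Spain", 13.3)
print(idx.describe("France")[0])  # ('capital', 'Paris', 1436.9)
```

Seen this way, the Prolog angle is obvious: each edge is just a ground fact like capital(france, paris).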
Due to space constraints, the extracted knowledge might not contain
provenance edges. Depending on the model, it can nevertheless be
ground truth with high confidence.
See also:
LLMs Are Databases - So Query Them
Chris Hay - 13.04.2026
https://www.youtube.com/watch?v=8Ppw8254nLI
LARQL - The model IS the database
chrishayuk - GitHub
https://github.com/chrishayuk/larql