Sourcify
Effortlessly find and manage open-source dependencies for your projects.
A programming language for LLMs that enables robust and modular prompting with types, constraints, and an optimizing runtime.

LMQL (Language Model Query Language) is a programming language designed for interacting with Large Language Models (LLMs). It lets developers define structured prompts with constraints, types, and modular components. An optimizing runtime translates high-level constraints into (sub)token masks that are enforced during text generation, yielding efficient and controllable LLM output. Nested queries make prompt components modular and reusable, and support for multiple backends, including llama.cpp, OpenAI, and 🤗 Transformers, keeps queries portable. Python control flow and string interpolation allow complex logic inside prompts, making LMQL suitable for building chatbots, data-processing pipelines, and autonomous agents with reproducible LLM outputs.
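As a sketch of the idea (modeled on LMQL's documented query syntax; exact details vary by version), a query interleaves prompt strings with hole variables like `[JOKE]`, and a `where` clause states constraints that the runtime compiles into token masks:

```lmql
# choose a decoding algorithm, then prompt with a hole variable [JOKE]
sample(temperature=0.7)
   "Tell me a one-line joke about compilers.\n"
   "JOKE: [JOKE]"
where
   STOPS_AT(JOKE, "\n") and len(TOKENS(JOKE)) < 50
```

During generation, the runtime masks out any token that would violate the `where` constraints, so the output is guaranteed to stop at a newline and stay under 50 tokens, with no post-hoc validation or retries.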
Enables modularized local instructions and re-use of prompt components within LMQL queries.
Allows specifying logical constraints on the output, which are enforced during text generation.
Supports multiple LLM backends, including llama.cpp, OpenAI, and 🤗 Transformers.
Enables dynamic prompt construction using Python control flow and branching behavior.
Guarantees output format by using typed variables within LMQL queries.
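The features above compose within a single query. The following sketch (hedged: it follows the style of the LMQL documentation, but constraint names and availability differ across versions) combines Python control flow, string interpolation, a set-membership constraint, and a typed `INT(...)` constraint that guarantees the variable parses as an integer:

```lmql
"Answer the following questions.\n"
questions = ["How many legs does a spider have?",
             "How many moons does Mars have?"]
for q in questions:
    "Q: {q}\n"
    "A: [ANSWER]\n" where INT(ANSWER)
"Was that easy? [YESNO]" where YESNO in ["yes", "no"]
```

Because `INT(ANSWER)` and the membership constraint are enforced at the token level during decoding, the variables are guaranteed to have the expected format without parsing or retry loops.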
Install LMQL locally or use the web-based Playground IDE.
Write a simple LMQL query with prompt statements and constraints.
Specify the decoding algorithm to use for text generation (e.g., sample, argmax).
Use control-flow and branching behavior to create dynamic prompts.
Define constraints using 'where' clauses to control output.
Employ typed variables for guaranteed output format.
Integrate LMQL queries into Python functions.
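Putting the steps together, an LMQL query can be exposed as an ordinary Python function via the `@lmql.query` decorator (a sketch in the style of the official docs; the prompt and variable names are illustrative assumptions):

```lmql
import lmql

@lmql.query
def capital_of(country: str):
    '''lmql
    "What is the capital of {country}?\n"
    "Answer in one word: [CAPITAL]" where STOPS_AT(CAPITAL, "\n")
    return CAPITAL.strip()
    '''

# callable like any Python function; the LMQL runtime executes the query
# capital_of("France")
```

Embedding queries as decorated functions is what makes them reusable as nested components and easy to drop into existing Python pipelines.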
Verified feedback from other users.
"LMQL is praised for its ability to create structured and reliable LLM interactions but needs more documentation."