Firefox AI Runtime
Live views of the Remote Settings collections powering the Firefox AI Runtime and Smart Window.
ml-inference-options: Which ML model is deployed for each Firefox feature, including task type, model ID, quantization, and revision.
ml-onnx-runtime: The WebAssembly runtime files (ONNX and wllama) hosted in Remote Settings for on-device inference.
ml-model-allow-deny-list: URL prefixes that are allowed or denied when Firefox loads models from external hosts.
ml-inference-words-block-list: Language-specific n-gram block lists used for content filtering during ML inference.
ai-window-prompts: System prompts and configuration for each Firefox Smart Window feature and AI model.
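These collections are served through the public Remote Settings HTTP API, so their records can be inspected directly. The sketch below builds the standard records endpoint and fetches a collection's records; it assumes the collections live in the default "main" bucket, and the record field names printed in the usage example (featureId, modelId) are illustrative rather than a confirmed schema.

```python
import json
import urllib.request

# Public Remote Settings API base (production Firefox settings server).
BASE = "https://firefox.settings.services.mozilla.com/v1"


def records_url(collection: str, bucket: str = "main") -> str:
    """Build the public records endpoint for a Remote Settings collection."""
    return f"{BASE}/buckets/{bucket}/collections/{collection}/records"


def fetch_records(collection: str) -> list:
    """Fetch all records in a collection; the response body is {"data": [...]}."""
    with urllib.request.urlopen(records_url(collection)) as resp:
        return json.load(resp)["data"]


if __name__ == "__main__":
    # Example: list model deployments per feature. The field names below
    # are assumptions about the ml-inference-options record schema.
    for rec in fetch_records("ml-inference-options"):
        print(rec.get("featureId"), rec.get("modelId"))
```

The same helper works for any of the collections above by swapping in its ID, e.g. `fetch_records("ai-window-prompts")`.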