infermux Tutorials
Step-by-step guides for routing inference requests with infermux. Set up providers, configure failover, and optimize costs.
Route your first inference request
10 min · Install infermux, write a minimal config with one provider, start the server, and send your first request through the router.
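As a preview of what this first tutorial covers, a single-provider config might look like the sketch below. This is a hedged illustration, not infermux's documented format: the YAML schema, the `listen` and `providers` keys, and the environment-variable substitution are all assumptions made for the example.

```yaml
# Hypothetical minimal infermux config (key names are assumptions).
listen: 0.0.0.0:8080            # address the router listens on
providers:
  - name: openai                # the single upstream provider
    type: openai
    api_key: ${OPENAI_API_KEY}  # assumed to be read from the environment
```

With the server running, the first request is then an ordinary inference call pointed at the router's address instead of the provider's.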
Set up automatic failover between OpenAI and Anthropic
20 min · Configure OpenAI as primary and Anthropic as fallback. Tune circuit breaker thresholds, test failure scenarios, and verify automatic recovery.
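The failover setup this tutorial walks through could be sketched as follows. Again a hedged example: the `routing`, `order`, and `circuit_breaker` keys and their parameter names are assumptions chosen to illustrate the primary/fallback and threshold-tuning ideas, not infermux's actual schema.

```yaml
# Hypothetical failover config (all key names are assumptions).
providers:
  - name: openai
    type: openai
    api_key: ${OPENAI_API_KEY}
  - name: anthropic
    type: anthropic
    api_key: ${ANTHROPIC_API_KEY}
routing:
  strategy: failover
  order: [openai, anthropic]    # OpenAI primary, Anthropic fallback
circuit_breaker:
  error_threshold: 0.5          # open after 50% of recent requests fail
  window_seconds: 30            # failure rate measured over this window
  cooldown_seconds: 60          # wait before probing the primary again
```

The circuit breaker is what makes recovery automatic: once the primary's error rate drops and the cooldown elapses, traffic shifts back without manual intervention.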
Optimize inference costs with routing strategies
25 min · Route requests to the cheapest available provider, set per-caller spend caps, and read cost reports from the management API.
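The cost-optimization pieces this tutorial covers, cheapest-provider routing plus per-caller budgets, might be expressed like this. A hedged sketch only: the `strategy: cheapest` value and the entire `spend_caps` section, including the caller name, are illustrative assumptions.

```yaml
# Hypothetical cost-routing config (key names are assumptions).
routing:
  strategy: cheapest            # pick the lowest-cost available provider
spend_caps:
  - caller: team-analytics      # hypothetical caller identifier
    limit_usd: 50               # reject or queue requests past this budget
    period: daily               # budget window resets each day
```

Spend caps and routing strategy are deliberately separate concerns here: routing decides where each request goes, while caps decide whether it may be sent at all.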