Reliability

What is Error Rate?

Percentage of requests that result in errors.

Definition

Error Rate measures the proportion of user requests or operations that fail, encompassing server errors (5xx) and significant client errors (4xx) in web applications. It is the most direct measure of reliability from the user's perspective. High error rates erode trust, increase support costs, and directly hurt conversion and retention. Error rate should be tracked per service, per endpoint, and per user segment so issues can be identified and isolated quickly.

Formula

(Error responses ÷ Total requests) × 100

How to measure

Calculate (5xx responses + significant 4xx responses) ÷ total requests × 100. Exclude expected 4xx responses (e.g., a 404 for a missing page that is handled gracefully). Track per service or endpoint. Set real-time alerts for error-rate spikes. Correlate with deployment events to catch regressions.
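The calculation above can be sketched as follows. This is a minimal illustration, not a production implementation: the set of "expected" 4xx codes is an example, and which client errors count as significant depends on your application.

```python
# Sketch of the error-rate calculation: count 5xx plus significant 4xx
# responses, divide by total requests, multiply by 100.
EXPECTED_4XX = {404}  # example: 4xx codes handled gracefully, excluded from the rate

def error_rate(status_codes):
    """Return the error rate as a percentage of total requests."""
    total = len(status_codes)
    if total == 0:
        return 0.0
    errors = sum(
        1 for code in status_codes
        if code >= 500 or (400 <= code < 500 and code not in EXPECTED_4XX)
    )
    return errors / total * 100

# 2 counted errors (one 500, one 403) out of 8 requests; the 404 is excluded.
codes = [200, 200, 500, 404, 200, 403, 200, 200]
print(error_rate(codes))  # 25.0
```

In practice this aggregation would run inside your metrics pipeline (e.g., over a rolling time window per endpoint) rather than over an in-memory list.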

Industry benchmarks

Production targets: below 0.1% for critical paths, below 1% for non-critical endpoints. During incidents, error rates above 5% typically warrant immediate response. Payment/checkout flows should maintain error rates below 0.05%.
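The benchmark thresholds above can be wired directly into alerting logic. A hedged sketch, with threshold values taken from the text and path-type names chosen for illustration:

```python
# Map observed error rates (in percent) to alert severities using the
# benchmark targets above. Path-type names are illustrative.
THRESHOLDS = {
    "payment": 0.05,      # payment/checkout flows
    "critical": 0.1,      # other critical paths
    "non_critical": 1.0,  # non-critical endpoints
}
INCIDENT_LEVEL = 5.0      # above this, immediate response is warranted

def alert_status(error_rate_pct, path_type="critical"):
    """Classify an observed error rate against its target."""
    if error_rate_pct > INCIDENT_LEVEL:
        return "incident"
    if error_rate_pct > THRESHOLDS[path_type]:
        return "above_target"
    return "ok"

print(alert_status(0.03, "payment"))   # ok
print(alert_status(0.4, "critical"))   # above_target
print(alert_status(6.2))               # incident
```

In a real system these thresholds would typically live in your alerting configuration (e.g., monitoring rules evaluated per service) rather than in application code.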

Used in feature types

Performance, Reliability

Need Error Rate in your metrics plan?

Use Metrik Compass to generate a complete metrics plan with baselines, targets, and guardrails — including Error Rate when it fits your feature type.