To understand this warning, one must first understand the concept of a coefficient matrix. In linear regression—the workhorse of everything from econometrics to engineering—we solve a system of equations to find the relationship between variables. The algorithm relies on inverting a matrix of correlations or covariances. A “coefficient” in this context refers to the elements of that matrix, not the final regression outputs. When the software reports that the ratio of the largest coefficient to the smallest coefficient in that matrix exceeds one hundred million (1.0e8), it is diagnosing a condition known in numerical linear algebra as ill-conditioning.
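The ill-conditioning being described can be made concrete with a few lines of numpy. The sketch below is illustrative only—the variable names, sample size, and noise scale are invented—but it shows how a nearly duplicated predictor blows up the condition number of the matrix the solver must invert:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-6, size=n)  # nearly a perfect copy of x1
X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept

# The normal-equations matrix X'X is what the solver must effectively invert.
XtX = X.T @ X
print(np.linalg.cond(XtX))  # enormous condition number: ill-conditioned
```

As a rule of thumb, roughly log10 of the condition number digits of precision are lost in the inversion, so with 64-bit floats (about 16 significant digits) a condition number anywhere near 1e16 leaves essentially no trustworthy digits.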
What causes this catastrophic imbalance? The most common culprit is collinearity—specifically, extreme multicollinearity, where two or more predictor variables are almost perfectly correlated. For example, consider a regression analyzing house prices that includes both “price in US Dollars” and “price in Japanese Yen” at the same time. The coefficient for Dollars might be 1, while the coefficient for Yen would be approximately 0.01. The ratio between them is only 100. But if you include “age of house in years” and “age of house in seconds,” the latter coefficient becomes astronomically tiny (1 year ≈ 31.5 million seconds). The ratio between the coefficient for seconds (tiny) and the coefficient for a normalized variable (e.g., number of bathrooms, around 1) will easily exceed 1.0e8. Other causes include scaling errors (mixing millimeters and kilometers) or redundant dummy variables (the classic “dummy variable trap,” where you include a dummy for every category plus an intercept).
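The years-versus-seconds scenario can be reproduced numerically. The sketch below uses made-up house data (the price model, coefficient values, and noise level are all invented for illustration) and plain least squares via numpy; it is not the behavior of any particular statistics package:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
age_years = rng.uniform(1, 80, size=n)
age_seconds = age_years * 31_536_000  # 1 year ≈ 31.5 million seconds
bathrooms = rng.integers(1, 4, size=n).astype(float)

# Hypothetical price process: older houses cheaper, bathrooms add value.
price = (300_000 - 1_500 * age_years + 20_000 * bathrooms
         + rng.normal(scale=5_000, size=n))

# Fit with age measured in SECONDS: its coefficient shrinks by a factor of 31.5e6.
X = np.column_stack([np.ones(n), age_seconds, bathrooms])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)

ratio = np.abs(beta).max() / np.abs(beta).min()
print(beta)            # the age-in-seconds coefficient is astronomically tiny
print(f"{ratio:.2e}")  # largest-to-smallest coefficient ratio exceeds 1e8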
The warning’s final, chilling instruction—“check results”—is the most important part. What does a “bad” result look like? Ironically, it looks perfectly normal. The software will still produce numbers: standard errors, p-values, and R-squared values. But these numbers are numerical lies. Standard errors may be wildly inflated or implausibly small. Coefficients may have the wrong sign (positive instead of negative). P-values that appear “significant” are essentially random noise filtered through a broken lens. A classic symptom is that dropping a single observation or rounding a variable slightly changes the coefficients by orders of magnitude. The model becomes non-reproducible.
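That non-reproducibility symptom is easy to demonstrate. In the synthetic sketch below (all numbers invented), two near-duplicate predictors share the real effect between them: deleting a single observation moves the individual coefficients substantially, while their sum—the only well-determined quantity—barely budges:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-4, size=n)  # near-duplicate predictor
y = 2.0 * x1 + rng.normal(scale=0.1, size=n)
X = np.column_stack([x1, x2])

beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_drop, *_ = np.linalg.lstsq(X[1:], y[1:], rcond=None)  # drop one row

print(beta_full, beta_drop)              # individual coefficients are unstable
print(beta_full.sum(), beta_drop.sum())  # their sum stays near the true 2.0
```

Only the combined effect of the two collinear predictors is identified by the data; how it gets split between them is decided by noise, which is why a one-row change can reshuffle the split.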