
Outlier Detection (Part 1): Univariate

Last Updated on July 24, 2023 by Editorial Team

Author(s): Mishtert T

Originally published on Towards AI.

Analyze even better, for better-informed decisions

Robust Statistics: Example in R

Making a Point

A popular tool for outlier detection is the z-score rule:

  1. Calculate the z-score for each observation.
  2. Flag an observation as an outlier if its z-score has an absolute value greater than 3.

z-score: z_i = (x_i - mean(x)) / sd(x)

For example, let’s look at a log-transformed dataset named log_inc containing the monthly income of 10 people. If you look closely, the last observation is clearly an outlier.

7.876638 7.681560 7.628518 … 7.764296 9.912943

Computing the z-score for each observation:


Mean <- mean(log_inc)
Sd <- sd(log_inc)
z_score <- (log_inc - Mean) / Sd   # z-score for each observation

Checking if z-scores are larger than 3 in absolute value:

abs(z_score) > 3
[1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE

No outliers are identified, even though the last observation clearly is one.

Using the same dataset, we’re going to show that the last observation is detected as an outlier once we use robust statistics.

Robust statistics

Classical statistical methods rely on assumptions (such as normality), but even a single outlier can significantly influence the conclusions and lead to misleading results.

Robust statistics still produce reliable results when the data contain outliers, and they provide automatic outlier-detection tools.

“It is perfectly proper to use both classical and robust methods routinely, and only worry when they differ enough to matter… But when they differ, you should think hard.” — J.W. Tukey (1979)

Estimators of location of a sample X_n = (x_1, …, x_n):

  1. Sample mean: mean(X_n) = (1/n) * sum(x_i)
  2. Median: med(X_n), the middle value of the ordered sample

Estimators of scale:

  1. Sample standard deviation: sd(X_n) = sqrt((1/(n-1)) * sum((x_i - mean(X_n))^2))
  2. Median absolute deviation: MAD(X_n) = 1.4826 * med(|x_i - med(X_n)|)
  3. Interquartile range (normalized): IQRN(X_n) = (Q3 - Q1) / 1.349, where Q1 and Q3 are the first and third quartiles of the data
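
All of these estimators are available in base R. A minimal sketch, computing them on the log_inc values from the example above:

# Location and scale estimators on the log_inc example data
log_inc <- c(7.876638, 7.681560, 7.628518, 7.234543, 7.465769,
             7.135876, 7.895643, 7.793432, 7.764296, 9.912943)
mean(log_inc)          # classical location, pulled upward by the outlier
median(log_inc)        # robust location
sd(log_inc)            # classical scale, inflated by the outlier
mad(log_inc)           # robust scale (MAD, normalized by 1.4826 by default)
IQR(log_inc) / 1.349   # normalized interquartile range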

Robust z-scores for outlier detection

We plug in the robust estimators to compute robust z-scores:

Robust z-score: r_i = (x_i - med(X_n)) / MAD(X_n), and an observation is flagged as an outlier if |r_i| > 3.

Demonstration

Let’s use the same dataset that we saw as an example in the beginning.

data <- c(7.876638, 7.681560, 7.628518, 7.234543, 7.465769,
          7.135876, 7.895643, 7.793432, 7.764296, 9.912943)
Mean <- mean(data); Sd <- sd(data); Med <- median(data); Mad <- mad(data)
z_score <- (data - Mean) / Sd       # classical z-score
rob_z_score <- (data - Med) / Mad   # robust z-score

Z-Score Output

> abs(z_score) > 3
[1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE

Robust Z-Score Output

> abs(rob_z_score) > 3
[1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE TRUE
> which(abs(rob_z_score) > 3)
[1] 10

Boxplot

One of the most popular tools for identifying outliers is Tukey’s boxplot. Observations are flagged as outliers if they fall outside the boxplot fence.

Tukey’s boxplot fence: [Q1 - 1.5 * IQR, Q3 + 1.5 * IQR], where IQR = Q3 - Q1

Let’s use a 0.02% sample of a “Length of Stay in Hospital” (LOS) dataset and plot a boxplot to see this.

library(ggplot2)

ggplot(data.frame(los), aes(x = "", y = los)) +
  geom_boxplot(outlier.colour = "red", outlier.shape = 16, outlier.size = 3,
               fill = "lightblue", width = 0.5) +
  xlab("") + ylab("Length Of Stay (LOS)") +
  theme(text = element_text(size = 25))
Boxplot with Outliers
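
The same fence can also be checked without plotting. A minimal sketch, assuming los is the numeric LOS vector used above:

q <- quantile(los, c(0.25, 0.75))                  # first and third quartiles
iqr <- q[2] - q[1]
fence <- c(lower = q[1] - 1.5 * iqr, upper = q[2] + 1.5 * iqr)
los[los < fence["lower"] | los > fence["upper"]]   # observations outside the fence
# boxplot.stats(los)$out flags points in much the same way (it uses hinges rather than quantiles)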

Adjusted boxplot (Hubert and Vandervieren, 2008)

  1. For asymmetric (skewed) distributions, the classical boxplot may flag many regular points as outliers.
  2. The skewness-adjusted boxplot corrects for this by using the medcouple, a robust measure of skewness, when determining the fence.
library(robustbase)

adjbox_stats <- adjboxStats(los)$stats

ggplot(data.frame(los), aes(x = "", y = los)) +
  stat_boxplot(geom = "errorbar", width = 0.2, coef = 1.5 * exp(3 * mc(los))) +
  geom_boxplot(ymin = adjbox_stats[1], ymax = adjbox_stats[5],
               middle = adjbox_stats[3], upper = adjbox_stats[4],
               lower = adjbox_stats[2], outlier.shape = NA,
               fill = "lightblue", width = 0.5) +
  geom_point(data = subset(data.frame(los),
                           los < adjbox_stats[1] | los > adjbox_stats[5]),
             col = "red", size = 3, shape = 16) +
  xlab("") + ylab("Length Of Stay (LOS)") +
  theme(text = element_text(size = 25))
Comparison of the fence line in a skewness-adjusted box plot
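
To list exactly which observations each rule flags, the two fences can also be compared directly. A short sketch, again assuming los holds the LOS values:

library(robustbase)
boxplot.stats(los)$out   # outliers according to Tukey's classical fence
adjboxStats(los)$out     # outliers according to the skewness-adjusted fence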

Summary:

Using robust statistics improves our ability to detect outliers compared with relying on classical methods alone.
