FINDING · DETECTION

Pretraining TrafficMoE on 30 GB of unlabeled mixed traffic (ISCX-VPN2016 NonVPN, CICIDS2017, WIDE backbone) with a masked-language-modeling objective, followed by fine-tuning, yields 88.72% F1 on VPN application classification and 92.61% F1 on VPN service classification, exceeding all fully supervised and prior pretraining baselines without requiring labeled pretraining data from those domains.
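The masked-language-modeling pretraining step can be sketched as below. This is a minimal illustration of BERT-style mask-and-predict data preparation over byte-token sequences; the byte-level tokenization, the 15% mask rate, the `MASK_ID` value, and the `-100` ignore-index convention are assumptions borrowed from common MLM practice, not details confirmed by the source.

```python
import random

MASK_ID = 256  # assumed: a special mask token just outside the byte range 0-255


def mlm_mask(tokens, mask_rate=0.15, rng=None):
    """BERT-style masking for a byte-token sequence.

    Replaces ~mask_rate of positions with MASK_ID and records the original
    values as prediction targets. Unmasked positions get label -100, the
    conventional "ignore in the loss" index.
    """
    rng = rng or random.Random(0)
    masked = list(tokens)
    labels = [-100] * len(tokens)
    n_mask = max(1, int(len(tokens) * mask_rate))
    for i in rng.sample(range(len(tokens)), n_mask):
        labels[i] = masked[i]   # model must reconstruct this byte
        masked[i] = MASK_ID
    return masked, labels


# Example: the first bytes of a (hypothetical) TLS ClientHello as tokens
pkt = [0x16, 0x03, 0x01, 0x02, 0x00, 0x01, 0x00, 0x01, 0xFC, 0x03]
masked, labels = mlm_mask(pkt)
```

A transformer pretrained with this objective on unlabeled captures is then fine-tuned with ordinary cross-entropy on the downstream VPN application/service labels.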

From 2026-he-trafficmoe-heterogeneity-aware-mixture (TrafficMoE: Heterogeneity-aware Mixture of Experts for Encrypted Traffic Classification) · §IV-A, §IV-B, Table III · 2026 · arXiv preprint

Implications

Tags

censors
generic
techniques
ml-classifier
traffic-shape
defenses
tunneling

Extracted by claude-sonnet-4-6 — review before relying.