Time-series anomaly detection plays a crucial role in industrial fault detection. Most existing studies follow either an unsupervised setting, which is prone to false alarms, or a supervised setting, which is time-consuming and labor-intensive to annotate. To address these limitations, we adopt a weakly supervised paradigm for industrial fault detection, where segment-level labels are provided during training while point-level predictions are made during inference. Within this paradigm, we propose a novel C-ary tree-based multi-instance learning (MIL) framework. First, the entire time series is represented as a C-ary tree, whose nodes correspond to subsequences of different lengths and are treated as instances in the MIL framework; this design enables the detection of both point and collective anomalies. Second, to detect out-of-distribution (OOD) anomalies that are unseen during training, we develop a vector quantization module that memorizes regular historical patterns; an OOD anomaly is flagged when it deviates significantly from all memorized patterns. Finally, we enhance the MIL framework with an attention-based pooling mechanism that assigns greater weight to anomalous instances, further improving detection performance. To validate the effectiveness of our method, we conduct experiments on four real-world industrial time-series datasets. The results show that our method outperforms existing approaches by at least 6.01% in AUROC under weak supervision.
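To make the attention-based pooling step concrete, the following is a minimal NumPy sketch of a standard attention MIL pooling layer; the parameter names (`V`, `w`), shapes, and the tanh scoring function are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_mil_pool(instances, V, w):
    """Attention-based MIL pooling: each instance embedding receives a
    learned weight, and the bag representation is the weighted sum of
    instance embeddings, so anomalous instances can dominate the bag.

    instances: (n_instances, d) array of instance embeddings
               (e.g. embeddings of C-ary tree nodes).
    V: (h, d) and w: (h,) are learnable attention parameters (hypothetical).
    """
    scores = np.tanh(instances @ V.T) @ w   # one scalar score per instance
    weights = softmax(scores)               # attention distribution over instances
    bag = weights @ instances               # (d,) pooled bag embedding
    return bag, weights

# Usage with random stand-in embeddings:
rng = np.random.default_rng(0)
instances = rng.normal(size=(8, 16))
V = rng.normal(size=(32, 16))
w = rng.normal(size=32)
bag, weights = attention_mil_pool(instances, V, w)
```

Because the weights form a softmax distribution, a single high-scoring (anomalous) instance can dominate the pooled bag embedding, which is what lets a segment-level label supervise point-level behavior.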