# Cross-to-merge training with class balance strategy for learning with noisy labels

<h5 align="center">

*Qian Zhang, Yi Zhu, Ming Yang, Ge Jin, YingWen Zhu, Qiu Chen*

[Expert Systems with Applications paper](https://doi.org/10.1016/j.eswa.2024.123846)
[License: MIT](https://github.com/LanXiaoPang613/C2MT/blob/main/LICENSE)

</h5>

The official PyTorch implementation of the paper [Cross-to-merge training with class balance strategy for learning with noisy labels](https://doi.org/10.1016/j.eswa.2024.123846).
13
+
14
+ **Abstract**
15
+ The collection of large-scale datasets inevitably introduces noisy labels, leading to a substantial degradation in the performance of deep neural networks (DNNs). Although sample selection is a mainstream method in the field of learning with noisy labels, which aims to mitigate the impact of noisy labels during model training, the testing performance of these methods exhibits significant fluctuations across different noise rates and types. In this paper, we propose Cross-to-Merge Training (**C2MT** ), a novel framework that is insensitive to the prior information in sample selection progress, enhancing model robustness. In practical implementation, using cross-divided training data, two different networks are cross-trained with the co-teaching strategy for several local rounds, subsequently merged into a unified model by performing federated averages on the parameters of two models periodically. Additionally, we introduce a new class balance strategy, named Median Balance Strategy (MBS), during the cross-dividing process, which evenly divides the training data into a labeled subset and an unlabeled subset based on the estimated loss distribution characteristics. Extensive experimental results on both synthetic and real-world datasets demonstrate the effectiveness of C2MT. The Code will be available at: https://github.com/LanXiaoPang613/C2MT
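The periodic merge step described above amounts to element-wise parameter averaging (two-client FedAvg). The helper below is an illustrative sketch only, not the released implementation: the function name `merge_models` is our own, and plain Python dicts of lists stand in for PyTorch `state_dict`s.

```python
def merge_models(state_a, state_b):
    """Federated-average two parameter dicts element-wise (two-client FedAvg).

    Each value is a flat list of weights; in the PyTorch analogue these would
    be tensors from `net_a.state_dict()` and `net_b.state_dict()`.
    """
    assert state_a.keys() == state_b.keys()
    merged = {}
    for name in state_a:
        merged[name] = [(wa + wb) / 2.0
                        for wa, wb in zip(state_a[name], state_b[name])]
    return merged

# After merging, both networks would be re-initialized from the unified model,
# e.g. net_a.load_state_dict(merged); net_b.load_state_dict(merged) in PyTorch.
```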
16
+
17
+ ![PLReMix Framework](./img/framework.png)
18
+
19
+ [//]: # (<img src="./img/framework.tig" alt="PLReMix Framework" style="margin-left: 10px; margin-right: 50px;"/>)
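The Median Balance Strategy (MBS) described above splits each class's samples using that class's own loss statistics, so the labeled subset stays class-balanced. The following is our own minimal sketch under the assumption that the per-class threshold is the median loss; the paper's exact loss-distribution estimator may differ.

```python
from statistics import median

def median_balance_split(losses, labels):
    """Split sample indices into a labeled (low-loss) and unlabeled set.

    For each class, samples whose loss falls below that class's median loss
    are treated as clean, so every class contributes roughly half its samples
    to the labeled subset regardless of how noisy the class is overall.
    """
    by_class = {}
    for idx, label in enumerate(labels):
        by_class.setdefault(label, []).append(idx)

    labeled, unlabeled = [], []
    for idxs in by_class.values():
        threshold = median(losses[i] for i in idxs)
        for i in idxs:
            (labeled if losses[i] < threshold else unlabeled).append(i)
    return labeled, unlabeled
```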

## Installation

```shell
# Install PyTorch first, following the official instructions: https://pytorch.org/get-started/locally/
pip install -r requirements.txt
```

## Training

To train on the [CIFAR](https://www.cs.toronto.edu/~kriz/cifar.html) datasets, run the following commands:

```shell
python train_cifar_c2mt.py --r 0.2 --lambda_u 0
python train.py --r 0.4 --noise_mode 'asym' --lambda_u 10 --data_path './data/cifar-10-batches-py' --dataset 'cifar10' --num_class 10
python train.py --r 0.5 --noise_mode 'sym' --lambda_u 25 --data_path './data/cifar-10-batches-py' --dataset 'cifar10' --num_class 10
```

To train on the Animal-10N dataset, run the following command:

```shell
python train.py --num_epochs 60 --lambda_u 0 --data_path './data/Animal-10N' --dataset 'animal10N' --num_class 10
```

<details>
<summary>Animal-10N dataset</summary>

Download the [Animal-10N](https://dm.kaist.ac.kr/datasets/animal-10n/) dataset from its website and place it under `./data/Animal-10N`.

</details>

## Citation

If you have any questions, do not hesitate to contact [email protected].

If you find our work useful, please consider citing it:

```bibtex
@article{zhang2024c2mt,
  title   = {Cross-to-merge training with class balance strategy for learning with noisy labels},
  author  = {Zhang, Qian and Zhu, Yi and Yang, Ming and Jin, Ge and Zhu, YingWen and Chen, Qiu},
  journal = {Expert Systems with Applications},
  year    = {2024},
  pages   = {123846},
  issn    = {0957-4174},
  doi     = {10.1016/j.eswa.2024.123846}
}
```

## Acknowledgement

* [DivideMix](https://github.com/LiJunnan1992/DivideMix): the algorithm our framework is based on.
* [MOIT](https://github.com/DiegoOrtego/LabelNoiseMOIT): inspiration for the class balance strategy.
* [Federated-Learning](https://github.com/AshwinRJ/Federated-Learning-PyTorch): inspiration for the cross-to-merge training strategy.
* [Co-teaching](https://github.com/bhanML/Co-teaching): inspiration for the cross-to-merge training strategy.