arvindkaphley committed
Commit 2b36622 · verified · 1 Parent(s): 1205c25

Update README.md

Files changed (1)
  1. README.md +12 -12
README.md CHANGED
@@ -44,22 +44,22 @@ Ruby Code Generator is a versatile tool crafted to streamline the interaction be

**2. Data Preprocessing:**

- Tokenize the code text using the appropriate tokenizer for the chosen model.
- Apply necessary cleaning or normalization (e.g., removing comments, handling indentation).
- Create input examples suitable for the model's architecture (e.g., with masked language modeling objectives), as sketched below.

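A rough sketch of this preprocessing step, assuming a Hugging Face tokenizer and a masked-language-modeling objective; the `microsoft/codebert-base` checkpoint, the `code` column name, and the maximum length are illustrative assumptions, not values taken from this repository.

```python
# Preprocessing sketch; checkpoint, column name, and max length are assumptions.
from datasets import Dataset
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")  # assumed checkpoint

def preprocess(examples):
    # Light normalization; real cleaning (comment removal, indentation handling) goes here.
    code = [snippet.rstrip() for snippet in examples["code"]]
    return tokenizer(code, truncation=True, max_length=256)

# Tiny in-memory stand-in for the real Ruby corpus.
raw_dataset = Dataset.from_dict({"code": ['puts "Hello, world!"', "def add(a, b)\n  a + b\nend"]})
tokenized = raw_dataset.map(preprocess, batched=True, remove_columns=["code"])

# The collator randomly masks tokens, turning each example into an MLM training instance.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)
```
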
**3. Configure Training:**

- Initialize a Trainer object (likely from a library like Transformers).
- Set training arguments based on the provided args (see the sketch after this list):
  - Learning rate, optimizer, scheduler
  - Gradient accumulation steps
  - Weight decay
  - Loss function (likely cross-entropy)
  - Evaluation metrics (e.g., accuracy, perplexity)
  - Device placement (GPU/TPU)
  - Number of processes for potential distributed training

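The sketch below shows what such a configuration could look like with the Transformers `Trainer`; every hyperparameter value and the output path are illustrative assumptions rather than this repository's actual settings.

```python
# Illustrative Trainer setup; all values are assumptions, not this repo's settings.
from transformers import AutoModelForMaskedLM, Trainer, TrainingArguments

model = AutoModelForMaskedLM.from_pretrained("microsoft/codebert-base")  # assumed checkpoint

args = TrainingArguments(
    output_dir="ruby-code-generator",   # assumed output path
    learning_rate=5e-5,                 # AdamW optimizer with a linear schedule by default
    gradient_accumulation_steps=4,      # gradient accumulation
    weight_decay=0.01,                  # weight decay
    per_device_train_batch_size=8,
    num_train_epochs=3,
)

# `tokenized` and `collator` come from the preprocessing sketch above. The MLM head
# uses cross-entropy loss by default, the Trainer handles device placement (GPU/TPU)
# automatically, and launching with `torchrun`/`accelerate launch` controls the
# number of processes for distributed training.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
# trainer.train()  # evaluation metrics such as perplexity would be computed on a held-out split
```
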
**4. Train the Model:**