streaming multipack for pretraining dataset #959

Merged
fix the hardcoded data collator for multipack pretraining
winglian committed Jan 5, 2024
commit 2a4924882c21c29f4d8fed72c6c05e3386f57760
7 changes: 5 additions & 2 deletions src/axolotl/core/trainer_builder.py
@@ -815,7 +815,7 @@ def build(self, total_num_steps):
             train_dataset=self.train_dataset,
             eval_dataset=self.eval_dataset,
             args=training_args,
-            # data_collator=self.build_collator(**data_collator_kwargs),
+            data_collator=self.build_collator(training_args, **data_collator_kwargs),
             bench_data_collator=transformers.DataCollatorForSeq2Seq(
                 self.tokenizer,
                 return_tensors="pt",
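
The call site now threads training_args through so the collator builder can disable the custom collator during pretraining (see the second hunk below). Passing None onward is safe because transformers.Trainer substitutes its own default collator when data_collator is None. A simplified paraphrase of that fallback, for context only; resolve_collator is a hypothetical name for illustration, not part of this PR or of transformers:

from transformers import DataCollatorWithPadding, default_data_collator


def resolve_collator(data_collator, tokenizer):
    # Simplified paraphrase of what transformers.Trainer.__init__ does when it
    # receives data_collator=None: fall back to a default collator, padding
    # with the tokenizer when one is available.
    if data_collator is not None:
        return data_collator
    return default_data_collator if tokenizer is None else DataCollatorWithPadding(tokenizer)
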
@@ -836,7 +836,10 @@ def build(self, total_num_steps):
 
         return trainer
 
-    def build_collator(self, **kwargs):
+    def build_collator(self, training_args: AxolotlTrainingArguments, **kwargs):
+        if training_args.pretraining:
+            return None
+
         if self.cfg.model_config_type == "mamba":
             return MambaDataCollator(tokenizer=self.tokenizer)
 
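Read together, the patched method comes out roughly as below. This is an editorial reconstruction of the two hunks for readability, not additional code from the commit; the comments are assumptions about intent, and the tail of the method (collapsed in this diff view) is left as an ellipsis:

def build_collator(self, training_args: AxolotlTrainingArguments, **kwargs):
    # Assumption: with the streaming multipack pipeline of PR #959, pretraining
    # batches are already packed upstream in the dataset itself, so no custom
    # collator is needed; returning None lets the HF Trainer fall back to its
    # default collator.
    if training_args.pretraining:
        return None

    if self.cfg.model_config_type == "mamba":
        return MambaDataCollator(tokenizer=self.tokenizer)

    ...  # remainder of the method is collapsed in this diff view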