When I try to evaluate the base model Meta-Llama-3-8B-Instruct, the run fails with an assertion error.
The recipe YAML is shown below:
```yaml
# Config for EleutherEvalRecipe in eleuther_eval.py
#
# To launch, run the following command from root torchtune directory:
#    tune run eleuther_eval --config eleuther_evaluation tasks=["truthfulqa_mc2","hellaswag"]

# Model Arguments
model:
  _component_: torchtune.models.llama3.llama3_8b

checkpointer:
  _component_: torchtune.utils.FullModelHFCheckpointer
  checkpoint_dir: /data/feipan3/Meta-Llama-3-8B-Instruct/original
  checkpoint_files: [
    consolidated.00.pth,
  ]
  output_dir: /data/feipan3/torchtune_test/llm_eval_output
  model_type: LLAMA3

# Tokenizer
tokenizer:
  _component_: torchtune.models.llama3.llama3_tokenizer
  path: /data/feipan3/Meta-Llama-3-8B-Instruct/original/tokenizer.model

# Environment
device: cuda
dtype: bf16
seed: 666

# EleutherAI specific eval args
tasks: ["truthfulqa_mc2"]
limit: null
max_seq_length: 4096

# Quantization specific args
quantizer: null
```
The assertion error is shown below:
```
Downloading builder script: 3.69kB [00:00, 3.67MB/s]
2024-05-14:15:26:59,346 INFO [GPTQ.py:56] lm_eval is not installed, GPTQ may not be usable
Traceback (most recent call last):
  File "/root/anaconda3/envs/lora/bin/tune", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/torchtune/_cli/tune.py", line 49, in main
    parser.run(args)
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/torchtune/_cli/tune.py", line 43, in run
    args.func(args)
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/torchtune/_cli/run.py", line 179, in _run_cmd
    self._run_single_device(args)
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/torchtune/_cli/run.py", line 93, in _run_single_device
    runpy.run_path(str(args.recipe), run_name="__main__")
  File "<frozen runpy>", line 291, in run_path
  File "<frozen runpy>", line 98, in _run_module_code
  File "<frozen runpy>", line 88, in _run_code
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/recipes/eleuther_eval.py", line 26, in <module>
    import lm_eval
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/lm_eval/__init__.py", line 1, in <module>
    from .evaluator import evaluate, simple_evaluate
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/lm_eval/evaluator.py", line 10, in <module>
    import lm_eval.tasks
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/lm_eval/tasks/__init__.py", line 7, in <module>
    from lm_eval.api.task import TaskConfig, Task, ConfigurableTask
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/lm_eval/api/task.py", line 17, in <module>
    from lm_eval.api.metrics import (
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/lm_eval/api/metrics.py", line 18, in <module>
    @register_aggregation("mean")
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/lora/lib/python3.11/site-packages/lm_eval/api/registry.py", line 141, in decorate
    assert (
AssertionError: aggregation named 'mean' conflicts with existing registered aggregation!
```
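For context, the registry pattern that raises here can be sketched roughly as follows (a minimal sketch, not lm_eval's actual code): a decorator stores each aggregation function under its name and asserts the name is unused, so the assertion can only fire if the registration code runs a second time.

```python
# Minimal sketch of a name registry with a duplicate-name assertion,
# loosely modeled on lm_eval/api/registry.py (not its actual code).
AGGREGATION_REGISTRY = {}

def register_aggregation(name):
    def decorate(fn):
        # The same check that fails in the traceback above: each
        # aggregation name may be registered only once.
        assert name not in AGGREGATION_REGISTRY, (
            f"aggregation named '{name}' conflicts with existing registered aggregation!"
        )
        AGGREGATION_REGISTRY[name] = fn
        return fn
    return decorate

@register_aggregation("mean")
def mean(items):
    return sum(items) / len(items)

# Registering "mean" a second time reproduces the error message.
try:
    @register_aggregation("mean")
    def mean_again(items):
        return sum(items) / len(items)
except AssertionError as exc:
    duplicate_error = str(exc)

print(duplicate_error)
```

Since metrics.py registers each aggregation only once, hitting this assertion suggests the module-level registration code in lm_eval somehow executed twice.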
Has anyone else run into this? What should I do to resolve it?
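One possible cause (an assumption, not a confirmed diagnosis for this environment): if the same source file is importable under two different module names, for example via a stale second copy of lm_eval on sys.path, Python executes its top-level registration code once per name, and the second import trips the register-once assertion. A hypothetical, self-contained demonstration:

```python
import builtins
import importlib.util
import os
import sys
import tempfile

# Shared registry the demo module writes into at import time.
builtins.DEMO_REGISTRY = {}

# A stand-in for lm_eval/api/metrics.py: registers "mean" at import time.
MODULE_SRC = """
import builtins
name = "mean"
assert name not in builtins.DEMO_REGISTRY, (
    f"aggregation named '{name}' conflicts with existing registered aggregation!"
)
builtins.DEMO_REGISTRY[name] = lambda xs: sum(xs) / len(xs)
"""

path = os.path.join(tempfile.mkdtemp(), "metrics_demo.py")
with open(path, "w") as fh:
    fh.write(MODULE_SRC)

def import_as(module_name):
    # Import the same file under an arbitrary module name.
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = module
    spec.loader.exec_module(module)

import_as("copy_a.metrics_demo")      # first import: "mean" registered
try:
    import_as("copy_b.metrics_demo")  # second copy: top-level code reruns
    second_import_error = ""
except AssertionError as exc:
    second_import_error = str(exc)

print(second_import_error)
```

If that is what is happening here, checking where lm_eval is imported from (e.g. `python -c "import lm_eval; print(lm_eval.__file__)"`) and cleanly reinstalling a single copy would be the first thing to try.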