Issue: The internal parameter for deterministic mode is not set in the eval_model() function
According to the example code for eval_model(), the temperature parameter is set to 0:
```python
from llava.mm_utils import get_model_name_from_path
from llava.eval.run_llava import eval_model

model_path = "liuhaotian/llava-v1.5-7b"
prompt = "What are the things I should be cautious about when I visit here?"
image_file = "https://llava-vl.github.io/static/images/view.jpg"

args = type('Args', (), {
    "model_path": model_path,
    "model_base": None,
    "model_name": get_model_name_from_path(model_path),
    "query": prompt,
    "conv_mode": None,
    "image_file": image_file,
    "sep": ",",
    "temperature": 0,
    "top_p": None,
    "num_beams": 1,
    "max_new_tokens": 512
})()

eval_model(args)
```
I think that if we set this parameter to 0, we should also explicitly set the additional parameters that make inference deterministic. Hence, we should add conditional execution of that kind inside the eval_model() function.
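As an illustration only (this is a sketch of the idea, not the original proposal, and the exact placement inside eval_model() is an assumption), such a conditional could look like:

```python
import torch

# Sketch only: illustrates the proposed behavior when temperature == 0.
# `args` is the Args object from the example above; placement inside
# eval_model() is assumed, not taken from the repository.
if args.temperature == 0:
    do_sample = False                           # greedy decoding, no sampling
    torch.manual_seed(0)                        # fix the seed for any residual randomness
    torch.backends.cudnn.deterministic = True   # force deterministic cuDNN kernels
    torch.backends.cudnn.benchmark = False      # disable non-deterministic autotuning
else:
    do_sample = True
```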
Hi,
Thank you for bringing up this issue. I encountered a similar problem even after explicitly setting torch.backends.cudnn.deterministic and related flags. I've noticed that the discrepancies occur specifically in the CLIP ViT encoders, where the vision embeddings produce different values across separate runs.
When comparing two identical inference runs, I observed that image_forward_out varies despite using the same image input. This occurs in the following file:
The discrepancy appears from the second example onward, through the last one.
I'm curious to know if you're still experiencing this issue and if you've found a solution. Any insights would be greatly appreciated. Thank you for your time!
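For anyone trying to isolate this, the snippet below is a standalone sketch (not LLaVA's code) that checks whether the CLIP vision tower alone is deterministic. It assumes the openai/clip-vit-large-patch14-336 checkpoint, a local image file named view.jpg, and that hidden_states[-2] approximates the layer LLaVA selects its vision features from; all three are illustrative assumptions.

```python
# Standalone determinism check for the CLIP vision tower (sketch, not LLaVA's code).
import torch
from PIL import Image
from transformers import CLIPImageProcessor, CLIPVisionModel

torch.manual_seed(0)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False

device = "cuda" if torch.cuda.is_available() else "cpu"
name = "openai/clip-vit-large-patch14-336"   # assumed vision-tower checkpoint
processor = CLIPImageProcessor.from_pretrained(name)
model = CLIPVisionModel.from_pretrained(name).to(device).eval()

image = Image.open("view.jpg")  # hypothetical local copy of the test image
pixel_values = processor(images=image, return_tensors="pt")["pixel_values"].to(device)

with torch.no_grad():
    # Two forward passes over the same input; hidden_states[-2] stands in for
    # the second-to-last layer used for vision features.
    out1 = model(pixel_values, output_hidden_states=True).hidden_states[-2]
    out2 = model(pixel_values, output_hidden_states=True).hidden_states[-2]

print("max abs diff:", (out1 - out2).abs().max().item())  # 0.0 if deterministic
```

Running a check like this separates vision-tower non-determinism from anything introduced later in the pipeline, such as half-precision GPU kernels or sampling during generation.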