Is /transformer supposed to be in F16 (instead of BF16 like Dev and Fast)?

#4
by dmhf - opened


HiDream.ai org

BF16

It's saved as F16 in Full....

As a workaround, you can use pipe = pipe.to(torch.bfloat16) to avoid problems during inference.
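A minimal sketch of the cast, using a stand-in torch.nn.Linear in place of the actual HiDream transformer (which I don't load here); the same .to(torch.bfloat16) call works on a full diffusers pipeline:

```python
import torch

# Stand-in for the transformer whose checkpoint was saved in F16.
model = torch.nn.Linear(8, 8).to(torch.float16)
assert next(model.parameters()).dtype == torch.float16

# The workaround from this thread: cast everything to bfloat16
# before inference, matching the dtype of the Dev and Fast variants.
model = model.to(torch.bfloat16)
print(next(model.parameters()).dtype)  # torch.bfloat16
```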
