Pull requests: lucidrains/DALLE-pytorch
- #19: add ability to specify full attention for a prefix length of the sequ… (by lucidrains, merged Jan 18, 2021)
- #20: omit the prefix sections of the sequence undergoing full attention fr… (by lucidrains, merged Jan 18, 2021)
- #58: just set ignore index to 0, which is padding by convention (by lucidrains, merged Feb 26, 2021)
- #75: make vq-gan from the taming transformers paper accessible for training (by lucidrains, merged Mar 10, 2021)
- #198: add ability to use a huggingface tokenizer by specifying path to BPE … (by lucidrains, merged Apr 15, 2021)