Example code from mario_gpt should possibly be mario_gpt.lm #5
Comments
Hey! So I actually updated the code to make this import possible, but I realized that the PyPI package hasn't been updated yet. I reverted this in the README, so it should be good :)
Thanks for your assistance! I'm pretty close to getting things working but hit another error I've been unable to debug: `generated_level = mario_lm.sample(` This is using the default `prompts = ["many pipes", "many enemies", "some blocks", "high elevation"]`. Any help would be greatly appreciated!
Hey! So multiple prompts should now work. You should probably clone the repo or update from PyPI. On another note, when using the default prompts, make sure you're using `prompts = ["many pipes, many enemies, some blocks, high elevation"]` instead of `prompts = ["many pipes", "many enemies", "some blocks", "high elevation"]`. If you split the categories the model could still work, but I haven't tested that extensively.
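If you already have the categories as separate strings, you can join them into the single-prompt form the maintainer describes. A minimal sketch (plain Python, no mario_gpt dependency):

```python
categories = ["many pipes", "many enemies", "some blocks", "high elevation"]

# Tested form: one prompt string containing all categories.
prompts = [", ".join(categories)]
print(prompts)  # ['many pipes, many enemies, some blocks, high elevation']

# Untested form (per the maintainer): one list entry per category.
# prompts = categories
```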
Also, make sure you update your
Thanks for pointing this out though, lmk if there are any other errors!
Hi,
First of all thanks for your novel implementation! This is very cool to see.
When running the minimal code snippet provided in the README, I got the following error:
I resolved it by changing the line `from mario_gpt import MarioLM` to `from mario_gpt.lm import MarioLM`.
Perhaps this is just a simple syntactic mistake? Or it could be user error on my part...
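The difference between the two imports comes down to whether the package's `__init__.py` re-exports the class. A self-contained sketch (hypothetical `pkg`/`lm` names standing in for `mario_gpt`/`mario_gpt.lm`, assuming the maintainer's fix was a re-export):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Build a throwaway package mimicking the layout described in the issue:
# pkg/lm.py defines MarioLM; pkg/__init__.py does NOT re-export it.
root = Path(tempfile.mkdtemp())
pkg = root / "pkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("")  # empty: no re-export
(pkg / "lm.py").write_text("class MarioLM:\n    pass\n")
sys.path.insert(0, str(root))

# `from pkg import MarioLM` fails because __init__.py is empty...
try:
    from pkg import MarioLM  # noqa: F401
    top_level_ok = True
except ImportError:
    top_level_ok = False

# ...but importing from the submodule directly always works.
from pkg.lm import MarioLM  # noqa: F401

print(top_level_ok)  # False

# Adding a re-export line to __init__.py fixes the top-level import.
(pkg / "__init__.py").write_text("from pkg.lm import MarioLM\n")
importlib.invalidate_caches()
for mod in ("pkg", "pkg.lm"):
    sys.modules.pop(mod, None)
from pkg import MarioLM  # now succeeds
```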
Thanks for your attention!