Issues: Kaggle/docker-python
Loaded runtime CuDNN library: 9.0.0 but source was compiled with: 9.3.0.
#1435, opened Nov 6, 2024 by lukorito99

Efficiently Setting Up Dependencies for Karpathy's llm.c on Kaggle: Seeking Guidance
Labels: new-package (requests for installing new packages)
#1416, opened Aug 1, 2024 by dewijones92

NotFoundError: Graph execution error: TPU
Labels: bug (bug & failures with existing packages), help wanted
#1370, opened Mar 10, 2024 by innat

Can't install vllm, llama.cpp
Labels: bug, help wanted
#1369, opened Mar 9, 2024 by lullabies777

Keras Issue for TensorFlow Hub model
Labels: bug, help wanted
#1367, opened Mar 3, 2024 by zzj0402

CHAOS AT CURRENT CUDF WITH RAPIDS DRIVERS
Labels: bug, help wanted
#1361, opened Feb 3, 2024 by Hvnt3rK3ys

Interrupt code breaks sometimes
Labels: bug, help wanted
#1338, opened Dec 8, 2023 by chrstfer

Unable to fully install xFormers in auxilliary notebook
Labels: bug, help wanted
#1335, opened Dec 1, 2023 by probit2011

Check the version of python and cuda with each image of cpu and gpu.
Labels: enhancement
#1327, opened Nov 28, 2023 by Keiku

TF TensorRT misconfigured
Labels: bug, help wanted
#1323, opened Nov 18, 2023 by maciejskorski

Jupyter notebook is not working
Labels: bug, help wanted
#1313, opened Oct 22, 2023 by atsuchiy

Notebook Execution Environment Information in Metadata
Labels: enhancement
#1311, opened Oct 20, 2023 by reallyTG