
TensorFlow on Alpine #490

Answered by marcelklehr
EVOTk asked this question in Q&A
Nov 13, 2022 · 5 comments · 4 replies

Long-term, the ideal path forward is to create a separate container for Recognize to run its models in; with the external apps ecosystem this is getting closer and closer. Short-term, you can switch to a different container image, ideally a Debian-based one, and TensorFlow should run there. You can also switch to a GPU-enabled container if you want; see @bugsyb's comments.
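
For reference, here is a minimal docker-compose sketch of what "switching to a Debian-based image" can look like with the official `nextcloud` images from Docker Hub. The service names, ports, volumes and passwords are placeholders for illustration only, not taken from this thread; adapt them to your own stack.

```yaml
# Minimal sketch, assuming the stock nextcloud images from Docker Hub.
# All names and credentials below are placeholders.
services:
  db:
    image: mariadb:10.6
    environment:
      MYSQL_ROOT_PASSWORD: changeme
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: changeme
    volumes:
      - db:/var/lib/mysql

  app:
    # The *-alpine tags (e.g. nextcloud:fpm-alpine) are built on musl libc,
    # which the prebuilt TensorFlow bindings used by Recognize cannot load.
    # A Debian-based tag such as nextcloud:apache avoids that problem.
    image: nextcloud:apache
    ports:
      - "8080:80"
    environment:
      MYSQL_HOST: db
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: changeme
    volumes:
      - nextcloud:/var/www/html
    depends_on:
      - db

volumes:
  db:
  nextcloud:
```

If you were previously on an `fpm-alpine` tag, keeping the same named volume for `/var/www/html` lets the Debian-based image pick up the existing installation; only the base image changes.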
