Welcome to OStack Knowledge Sharing Community for programmers and developers - Open, Learn and Share


0 votes
680 views
in Technique by (71.8m points)

tensorflow - Can a model trained on GPU be used on CPU for inference, and vice versa?

I was wondering if a model trained on the GPU could be used to run inference on the CPU (and vice versa)? Thanks!



1 Answer

0 votes
by (71.8m points)

You can, as long as your model doesn't have explicit device assignments. I.e., if your model contains blocks like with tf.device('gpu:0'), it will complain when you run it on a machine without a GPU.

In such cases, make sure the imported model has no explicit device assignments, for instance by using the clear_devices argument of tf.train.import_meta_graph.
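A minimal sketch of the round trip described above: a tiny graph is saved with an explicit GPU pin, then restored with clear_devices=True so it runs on a CPU-only machine. It uses the tf.compat.v1 graph-mode API so it also runs under TensorFlow 2; the checkpoint path /tmp/model.ckpt and the variable/op names are illustrative, not from the original answer.

```python
import tensorflow.compat.v1 as tf  # TF1-style graph API, available in TF2

tf.disable_eager_execution()

# "Training" machine: build and save a graph with an explicit device pin.
with tf.Graph().as_default():
    with tf.device('/gpu:0'):  # explicit device assignment baked into the graph
        x = tf.Variable(3.0, name='x')
        y = tf.multiply(x, 2.0, name='y')
    saver = tf.train.Saver()
    # allow_soft_placement lets this also run here even without a real GPU
    with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, '/tmp/model.ckpt')

# "Inference" machine (CPU-only): clear_devices=True strips the '/gpu:0' pin
# from the imported meta graph, so the restored ops can be placed on CPU.
with tf.Graph().as_default():
    restorer = tf.train.import_meta_graph('/tmp/model.ckpt.meta',
                                          clear_devices=True)
    with tf.Session() as sess:
        restorer.restore(sess, '/tmp/model.ckpt')
        result = sess.run('y:0')

print(result)
```

Without clear_devices=True, the second session would raise an InvalidArgumentError on a machine with no GPU (unless allow_soft_placement is set there too), because the imported graph still demands /gpu:0.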


