Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

neural network - tensorflow store training data on GPU memory

I am pretty new to TensorFlow; I previously used Theano for deep learning development, and I've noticed a difference between the two in where input data can be stored.

Theano supports shared variables, which can store input data in GPU memory to reduce data transfer between CPU and GPU.

In TensorFlow, we need to feed data into a placeholder, and the data can come from CPU memory or from files.

My question is: is it possible to store input data in GPU memory with TensorFlow? Or does it already do this in some magic way?

Thanks.


1 Answer


If your data fits in GPU memory, you can load it into a constant on the GPU from, e.g., a NumPy array:

with tf.device('/gpu:0'):
  tensorflow_dataset = tf.constant(numpy_dataset)

One way to extract minibatches is to slice that tensor at each step, using tf.slice instead of feeding:

  batch = tf.slice(tensorflow_dataset, [index, 0], [batch_size, -1])

There are many possible variations around that theme, including using queues to prefetch the data to GPU dynamically.
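To make the slicing arithmetic concrete, here is a minimal NumPy sketch of the minibatch loop. It mirrors what tf.slice(tensorflow_dataset, [index, 0], [batch_size, -1]) computes — take batch_size rows starting at row index — without requiring a TensorFlow session; the dataset shape and batch size are made up for illustration.

```python
import numpy as np

def minibatches(dataset, batch_size):
    """Yield (index, batch) pairs, mirroring
    tf.slice(dataset, [index, 0], [batch_size, -1]):
    take `batch_size` rows starting at row `index`.
    The final partial batch is dropped, as a fixed-size
    tf.slice would require."""
    num_rows = dataset.shape[0]
    for index in range(0, num_rows - batch_size + 1, batch_size):
        yield index, dataset[index:index + batch_size, :]

# Hypothetical dataset: 10 examples with 3 features each.
data = np.arange(30, dtype=np.float32).reshape(10, 3)
for index, batch in minibatches(data, batch_size=4):
    print(index, batch.shape)
```

In the TensorFlow version, the same index would be fed (or computed from a step counter) each iteration, while the dataset itself stays resident on the GPU.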
