I am trying to build my training set by appending .h5 files saved in a directory using Python, but I run into a memory error because the dataset is very large. The following code is what I used for appending:
import os
import h5py

Data_dir = ".../training"

Data = []
for sample in os.listdir(Data_dir):           # iterate over the file names, not the characters of the path
    img_path = os.path.join(Data_dir, sample)
    with h5py.File(img_path, 'r') as file:    # open read-only and close the file automatically
        Image = file['Image'][()]             # load the whole 'Image' dataset into memory
    Data.append(Image)
I was wondering whether there is a way to use the directory as the training set directly, without building a list of arrays from the files.
I have found the flow_from_directory method from Keras (https://keras.io/preprocessing/image/), but as far as I know it does not work with .h5 data. My problem is how to flow the data from a directory of .h5 files for training a neural network; a rough sketch of what I am imagining is below.
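In case it makes the question clearer, this is roughly the kind of lazy loading I have in mind: a plain Python generator that opens one .h5 file at a time instead of appending everything to a list. This is only a sketch based on my own guesses (the batch_size, the .h5 filename filter, and stacking with np.stack all assume the files share the same image shape; labels are left out because my snippet above only reads the 'Image' dataset):

import os
import h5py
import numpy as np

def h5_batch_generator(data_dir, batch_size=32):
    # Yield batches of images loaded lazily from the .h5 files in data_dir.
    file_names = sorted(f for f in os.listdir(data_dir) if f.endswith('.h5'))
    while True:                                    # loop forever, as Keras-style generators do
        batch = []
        for name in file_names:
            with h5py.File(os.path.join(data_dir, name), 'r') as f:
                batch.append(f['Image'][()])       # only one file is held in memory at a time
            if len(batch) == batch_size:
                yield np.stack(batch)              # assumes all images have the same shape
                batch = []
        if batch:                                  # flush the final, smaller batch
            yield np.stack(batch)

I imagine passing this to the model with something like model.fit(h5_batch_generator(Data_dir), steps_per_epoch=...), but I am not sure this is the right approach for .h5 files, which is why I am asking.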
Thanks.