Need to load 3GB+ file in lab

I need to load a histopathological breast cancer biopsy dataset into my home directory to use the lab effectively in my research. But only 3 GB of space is allocated, and after unzipping the main zipped dataset locally it comes to about 5 GB. Is there a way to work with this dataset in CloudxLab?
Please respond.

Hi,

You can put this data on HDFS for your analysis. How are you planning to use it? Will it be with Hadoop, Spark, or Python?
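For example, you could copy the archive from your home directory to HDFS along these lines (just a sketch in Python calling the Hadoop CLI; the file name and HDFS path are placeholders, and it assumes the hdfs command is on your PATH in the lab):

```python
import subprocess

local_zip = "breast_biopsy_dataset.zip"   # placeholder name for your dataset archive
hdfs_dir = "/user/your_username/biopsy"   # placeholder HDFS directory

# Create the target directory on HDFS and copy the archive into it
subprocess.run(["hdfs", "dfs", "-mkdir", "-p", hdfs_dir], check=True)
subprocess.run(["hdfs", "dfs", "-put", local_zip, hdfs_dir], check=True)

# Verify the upload
subprocess.run(["hdfs", "dfs", "-ls", hdfs_dir], check=True)
```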

I ultimately need the data to apply transfer learning, and I only need it for 7 days. Once the analysis is done and my models are generated and saved, I will remove everything from the lab and back the models up locally. Can the lab access Kaggle directly? Then I would not need to store the data in CloudxLab, only the models. Please guide me on how to go about this; I cannot visit my university lab due to the pandemic.

Can I use Kaggle for this? It gives users 100 GB of storage, 30 hours/week of TPU time, and GPU support. Is it okay to proceed with Kaggle?
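To be concrete, this is roughly how I would pull the dataset straight from Kaggle inside the lab using the official kaggle Python package (a sketch only; the dataset slug and download path are placeholders, and it assumes my Kaggle API token is set up in ~/.kaggle/kaggle.json):

```python
from kaggle.api.kaggle_api_extended import KaggleApi

# Authenticate with the Kaggle API token stored on the lab machine
api = KaggleApi()
api.authenticate()

# Download into a scratch directory outside the 3 GB home quota (placeholder path)
api.dataset_download_files(
    "owner/breast-histopathology-dataset",  # placeholder slug for the biopsy dataset
    path="/tmp/biopsy_data",                # placeholder working directory
    unzip=True,                             # extract in place so the zip isn't kept around
)
```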