How to access a file in HDFS from the Spark shell?

Hi, I am new to CloudxLab.
I am trying to read a file stored in my HDFS directory on CloudxLab from spark-shell.
Normally, in a Cloudera CDH QuickStart VM, I would use:

val input = sc.textFile("hdfs://localhost:9000/user/myusername/myfile.txt");

Please let me know what path I should use on CloudxLab to read the file.

Hi @Prasuna_A,

The command below should work on CloudxLab:

val input = sc.textFile("hdfs:///user/YOUR_CLOUDXLAB_USERNAME/myfile.txt");

/user/YOUR_CLOUDXLAB_USERNAME is your home folder in HDFS. Replace YOUR_CLOUDXLAB_USERNAME with your CloudxLab username, which you can find under the “My Lab” section.

Also make sure myfile.txt exists in your home folder in HDFS.
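
If you want to double-check from spark-shell itself, here is a minimal sketch; it assumes the same YOUR_CLOUDXLAB_USERNAME placeholder and myfile.txt name used above, so substitute your own username and file.

// Minimal sketch, run inside spark-shell on CloudxLab.
// "YOUR_CLOUDXLAB_USERNAME" and "myfile.txt" are the placeholders from this thread.
import org.apache.hadoop.fs.{FileSystem, Path}

val path = new Path("hdfs:///user/YOUR_CLOUDXLAB_USERNAME/myfile.txt")

// Use the Hadoop FileSystem API (already on spark-shell's classpath) to confirm
// the file exists before trying to read it.
val fs = path.getFileSystem(sc.hadoopConfiguration)
println(s"Exists: ${fs.exists(path)}")

// Read the file as an RDD of lines and force evaluation with count().
val input = sc.textFile(path.toString)
println(s"Line count: ${input.count()}")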

Hope this helps.

Thanks


Thanks a lot. This helped.