Hi,
I have a couple of questions.
- Do we have Python 3.x installed in CloudxLab? I could see only Python 2.7.5.
- Do you have any video for using pyspark as part of the BigData Hadoop and Spark course? (Note: I subscribed to this course.)
Hi @senthiljdpm,
Please find my answers below
Do we have Python 3.x installed in CloudxLab? I could see only Python 2.7.5.
We do not have Python 3 on all web consoles. I think it is available on f.cloudxlab.com. Please log in to f.cloudxlab.com to access Python 3.
Do you have any video for using pyspark as part of the BigData Hadoop and Spark course? (Note: I subscribed to this course.)
This course covers Spark using Scala, but most of the Spark APIs are the same across Scala, Python, and Java.
Hope this helps.
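For reference, here is a minimal PySpark word-count sketch; it maps almost line-for-line onto the Scala version used in the course. The input path below is only a placeholder, so point it at any text file you have access to.

# Minimal PySpark word count; the Scala version uses the same RDD API
# (textFile, flatMap, map, reduceByKey).
from pyspark import SparkContext

sc = SparkContext(appName="WordCountSketch")

counts = (sc.textFile("/path/to/some/input.txt")      # placeholder path
            .flatMap(lambda line: line.split())       # split lines into words
            .map(lambda word: (word, 1))              # pair each word with 1
            .reduceByKey(lambda a, b: a + b))         # sum the counts per word

for word, count in counts.take(10):                   # print a small sample
    print(word, count)

sc.stop()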
Thanks, Abhinav. Is it possible to install Apache NiFi as well using a virtual environment? If not, could you help me install it?
Hi Abhinav,
I tried to install Python 3.x in a venv as shown in the screenshot but couldn't get it to work. Could you please tell me if this is due to access rights (sudoer)?
Hi @senthiljdpm,
I think Python 3 is available on f.cloudxlab.com. Please log in to f.cloudxlab.com to access Python 3.
It works. Thanks, Abhinav.
Update to the original question
Now we have Python 3 installed on CloudxLab.
Please run the commands below to access Python 3:
export PATH=/usr/local/anaconda/bin:$PATH
python3
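To double-check that the Anaconda interpreter is the one being picked up after updating PATH, you can run a quick sanity check from inside python3 (nothing CloudxLab-specific here):

# Quick sanity check of the active interpreter
import sys
print(sys.version)       # should report Python 3.x
print(sys.executable)    # should point under /usr/local/anaconda/bin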
Hope this helps.
Thanks
Collecting pyspark
Collecting py4j==0.10.9 (from pyspark)
Using cached https://files.pythonhosted.org/packages/9e/b6/6a4fb90cd235dc8e265a6a2067f2a2c99f0d91787f06aca4bcf7c23f3f80/py4j-0.10.9-py2.py3-none-any.whl
Installing collected packages: py4j, pyspark
ERROR: Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/anaconda/lib/python3.6/site-packages/py4j'
Consider using the --user option or check the permissions.
Note: you may need to restart the kernel to use updated packages.
Please help.
Which package are you trying to install? You may need to enable a virtual environment to install your own packages. Please follow the blog below.
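As a rough sketch of that workaround (assuming your lab user can write to its own home directory; the environment name pyspark-env below is just an example), you can create a per-user environment with the standard venv module and install packages into it instead of the system site-packages:

# Create a per-user virtual environment with the stdlib venv module
# (equivalent to running `python3 -m venv ~/pyspark-env` on the console).
import os
import venv

env_dir = os.path.expanduser("~/pyspark-env")   # example location
venv.create(env_dir, with_pip=True)
print("Created virtual environment at", env_dir)

# Then, on the web console, activate it and install packages locally:
#   source ~/pyspark-env/bin/activate
#   pip install pyspark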