Is it possible to connect to HBase from Spark/Scala in the lab?

Hi

Is there any way I can connect to HBase from Spark using Scala?

I tried to import the packages below in Jupyter but got a compile error.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client._
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.util.Bytes

Name: Compile Error
Message: :27: error: object hbase is not a member of package org.apache.hadoop
import org.apache.hadoop.hbase.client._
^
:28: error: object hbase is not a member of package org.apache.hadoop
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
^
:29: error: object hbase is not a member of package org.apache.hadoop
import org.apache.hadoop.hbase.util.Bytes
^

StackTrace:

It should be possible. It looks like the notebook does not have the HBase library on its classpath.

Could you try it from the console, providing the library via the command-line argument --jars or --packages?
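A sketch of what that could look like, assuming spark-shell is available on the console; the Maven coordinates, versions, and jar paths below are assumptions, so adjust them to whatever the lab actually provides:

```shell
# Option 1: fetch the HBase client libraries from Maven Central with --packages.
# The version shown is an assumption; pick the one matching the lab's HBase.
spark-shell --packages org.apache.hbase:hbase-client:1.2.6,org.apache.hbase:hbase-common:1.2.6

# Option 2: pass jars already present on the machine to --jars
# as a comma-separated list (the paths here are assumptions).
spark-shell --jars /usr/lib/hbase/lib/hbase-client.jar,/usr/lib/hbase/lib/hbase-common.jar
```

With either option, the `import org.apache.hadoop.hbase...` statements should then resolve inside the shell.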

I will have to debug it myself to understand it better.

Hi

I am not sure how to include these libraries on the HBase path using those commands…
I searched the net but could not find the proper syntax.

Could you please help?

Thanks
Arup Mukherjee

Hi Giri

Did you find out anything about including HBase from Spark/Scala?

It seems it is not that complex, and maybe somebody in the lab is already using it.

Thanks
Arup

Hi

I found a solution for including the HBase jars in Spark.

First, HMaster needs to be started with the command below:
start-hbase.sh

But how can I execute this command in CloudxLab?

Next, an HBASE_PATH environment variable needs to be created to hold the HBase jar paths.

Then the Spark shell needs to be started with the HBASE_PATH variable so that all the HBase jars are included.
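The two steps above can be sketched like this, assuming the HBase jars live under /usr/lib/hbase/lib (the actual location in the lab may differ):

```shell
# Collect all HBase jars into a comma-separated list, which is the
# format that spark-shell's --jars flag expects.
export HBASE_PATH=$(ls /usr/lib/hbase/lib/*.jar | paste -sd, -)

# Start the Spark shell with the HBase jars on its classpath.
spark-shell --jars "$HBASE_PATH"
```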

Could you please let me know how to execute the start-hbase.sh command in the lab so that I can try this process?

Thanks
Arup Mukherjee