HBase giving NoClassDefFoundError

Hi,
When I try to put a record into HBase, it gives me a NoClassDefFoundError. I looked around a bit and found that the HBase dependencies I am using in the program are not the same as the ones on the cluster. Can you please guide me on how to make them match?
build.sbt
name := "Kafka_WordCount_HBase"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.3.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.3.0",
  "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.0.2",
  "org.apache.spark" % "spark-streaming_2.11" % "2.1.0",
  "org.apache.kafka" % "kafka_2.11" % "0.10.2.0",
  "org.apache.hbase" % "hbase-client" % "1.1.2",
  "org.apache.hbase" % "hbase-common" % "1.1.2",
  "org.apache.hbase" % "hbase-protocol" % "1.1.2",
  "org.apache.hbase" % "hbase-hadoop2-compat" % "1.1.2",
  "org.apache.hbase" % "hbase-server" % "1.1.2",
  "org.apache.hbase" % "hbase-annotations" % "1.1.2"
)

(Note: the HBase artifacts are plain Java libraries and are not cross-built for Scala versions, so they must use a single % rather than %% — with %% sbt would look for a nonexistent hbase-client_2.11 artifact and fail to resolve.)

Hi, you can try creating a new table and putting the data as below. It should work.

  1. create 'yourlogin_name', 'content', 'anchors'
  2. put 'yourlogin_name', 'anycontent', 'anchors:which_anchors', 'content_of_anchors'
  3. You can see the data in the following location:
    /apps/hbase/data/default/yourlogin_name/
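On the original question of matching versions: a NoClassDefFoundError at put time usually means the hbase-client version bundled with your job differs from the one installed on the cluster. A minimal build.sbt sketch along these lines may help — the version numbers below are placeholders, so replace them with whatever `hbase version` and `spark-submit --version` report on a cluster node:

```scala
// Sketch only: align the build with the cluster's actual versions.
// sparkVersion / hbaseVersion here are assumed values -- take the real
// ones from `hbase version` and `spark-submit --version` on the cluster.
val sparkVersion = "2.3.0"
val hbaseVersion = "1.1.2"

libraryDependencies ++= Seq(
  // Spark jars are already on the cluster, so mark them "provided"
  // to avoid shipping a second (possibly mismatched) copy in your jar.
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"       % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion,
  // HBase artifacts are Java libraries: single %, no Scala suffix.
  "org.apache.hbase" % "hbase-client" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion
)
```

Using one version variable per project keeps all the related artifacts in lockstep, which is usually where these mismatches creep in.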