SparkR or sparklyr CRAN package


#1

I need to connect to my Spark cluster from R. Could you please install Spark and the sparklyr package? If not, how do I connect to the Spark cluster from the R Jupyter interface?
Femi


#2

Hi Femi,

We will have to evaluate it.


#3

Hi @femi.aiyeku,

This snippet will let you run R on Spark.
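The original snippet appears to have been lost from this post, so here is a minimal sketch of connecting to a Spark cluster from R with sparklyr. The `master` value and `spark_home` path are assumptions; adjust them to match your cluster's setup.

```r
# Minimal sketch (not the original snippet): connect to Spark via sparklyr.
library(sparklyr)
library(dplyr)

sc <- spark_connect(
  master     = "yarn-client",      # assumption: YARN cluster; use "local" to test locally
  spark_home = "/usr/spark2.0.1"   # hypothetical path; point this at your Spark install
)

# Copy a local data frame to Spark and run a simple dplyr query on the cluster
iris_tbl <- copy_to(sc, iris, overwrite = TRUE)
iris_tbl %>% count(Species)

spark_disconnect(sc)
```

You can also use the built-in SparkR package shipped with Spark instead of sparklyr; sparklyr is generally preferred if you want the dplyr-style interface shown above.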

Hope this helps.

Thanks


#4

Yes, it does help.
Thank you!


#5

@femi.aiyeku

Awesome! Thanks


#6

Hi Abhinav,
And lastly, is it possible to work with Elasticsearch on the Hadoop cluster (CloudxLab)?


#7

First, could you please update Microsoft R Open to the current version (or 3.5)? It is currently running 3.4.3, which is breaking the ability to load libraries/packages in the Jupyter notebook.

I was trying to load an R package in a Jupyter notebook and encountered this error message:

Error: package or namespace load failed for 'ggplot2':
package 'scales' was installed by an R version with different internals; it needs to be reinstalled for use with this R version
Traceback:

  1. library(ggplot2, lib.loc = "/home//R/x86_64-pc-linux-gnu-library/3.4")
  2. tryCatch({
    . attr(package, "LibPath") <- which.lib.loc
    . ns <- loadNamespace(package, lib.loc)
    . env <- attachNamespace(ns, pos = pos, deps)
    . }, error = function(e) {
    . P <- if (!is.null(cc <- conditionCall(e)))
    . paste(" in", deparse(cc)[1L])
    . else ""
    . msg <- gettextf("package or namespace load failed for %s%s:\n %s",
    . sQuote(package), P, conditionMessage(e))
    . if (logical.return)
    . message(paste("Error:", msg), domain = NA)
    . else stop(msg, call. = FALSE, domain = NA)
    . })
  3. tryCatchList(expr, classes, parentenv, handlers)
  4. tryCatchOne(expr, names, parentenv, handlers[[1L]])
  5. value[3L]
  6. stop(msg, call. = FALSE, domain = NA)
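As the error message itself suggests, a possible workaround while waiting for the upgrade (an assumption on my part, not an official fix) is to reinstall the affected packages into your user library so they are rebuilt against the running R version. The library path below is hypothetical; use the one from your own traceback.

```r
# Hypothetical user library path; substitute the lib.loc from your traceback
lib <- "~/R/x86_64-pc-linux-gnu-library/3.4"

# Reinstall the packages that were built under different R internals
install.packages(c("scales", "ggplot2"), lib = lib)

# The load should then succeed
library(ggplot2, lib.loc = lib)
```

Note this only helps if the mismatch is between package builds, not if the base R installation itself needs the upgrade.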