R interface to Apache Spark, a fast and general engine for big data processing (see <http://spark.apache.org>). This package supports connecting to local and remote Apache Spark clusters, provides a 'dplyr'-compatible back-end, and exposes an interface to Spark's built-in machine learning algorithms.
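A minimal sketch of typical usage against a local cluster, assuming Spark is already installed; the machine learning argument names shown (response/features) match the 0.6.x interface and may differ in other sparklyr releases:

library(sparklyr)
library(dplyr)

# Connect to a local Spark cluster
sc <- spark_connect(master = "local")

# Copy an R data frame into Spark and query it with dplyr verbs
mtcars_tbl <- copy_to(sc, mtcars, "mtcars", overwrite = TRUE)
mtcars_tbl %>%
  filter(cyl >= 6) %>%
  summarise(avg_mpg = mean(mpg))

# Fit a linear model with Spark's built-in ML (sparklyr 0.6.x-style arguments)
fit <- ml_linear_regression(mtcars_tbl, response = "mpg", features = c("wt", "cyl"))
summary(fit)

# Disconnect when done
spark_disconnect(sc)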

Documentation

Manual: sparklyr.pdf
Vignette: None available.

Maintainer: Javier Luraschi <javier at rstudio.com>

Author(s): Javier Luraschi, Kevin Ushey, JJ Allaire, RStudio, The Apache Software Foundation

Install the package and any missing dependencies by running this line in your R console:

install.packages("sparklyr")

Depends: R (>= 3.1.2)
Imports: assertthat, base64enc, broom, config (>= 0.2), DBI (>= 0.6-1), dplyr (>= 0.7.2), dbplyr (>= 1.1.0), digest, httr (>= 1.2.1), jsonlite (>= 1.4), lazyeval (>= 0.2.0), methods, openssl (>= 0.8), rappdirs, readr (>= 1.1.0), rlang (>= 0.1), rprojroot, rstudioapi, shiny (>= 1.0.1), withr, xml2
Suggests: ggplot2, janeaustenr, nycflights13, testthat, RCurl
Reverse imports: rsparkling, spark.sas7bdat, sparkwarc
Reverse suggests: replyr

Package: sparklyr
URL: http://spark.rstudio.com
Version: 0.6.3
Published: 2017-09-19
License: Apache License 2.0 | file LICENSE
BugReports: https://github.com/rstudio/sparklyr/issues
SystemRequirements: Spark: 1.6.x or 2.x
NeedsCompilation: no
CRAN checks: sparklyr check results
Package source: sparklyr_0.6.3.tar.gz