DataSource API — Managing Datasets in External Data Sources
Reading Datasets
Spark SQL can read data from external storage systems, such as files, Hive tables and JDBC databases, through the DataFrameReader interface.
You use a SparkSession to access a DataFrameReader through the read method.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.getOrCreate
val reader = spark.read
DataFrameReader is an interface to create DataFrames (aka Dataset[Row]) from files, Hive tables, or tables accessed over JDBC.
val people = reader.csv("people.csv")
val cities = reader.format("json").load("cities.json")
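DataFrameReader also accepts per-format options before the load. A minimal, self-contained sketch (the file name, columns, and sample rows below are made up for illustration; a tiny CSV is written first so the example runs as-is):

```scala
import java.nio.file.{Files, Paths}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[*]").getOrCreate()

// Write a tiny CSV file so the sketch is self-contained
// (file name and contents are invented for this example).
Files.write(Paths.get("people.csv"), "name,age\nAgata,41\nJacek,42\n".getBytes)

val people = spark.read
  .option("header", "true")       // treat the first line as column names
  .option("inferSchema", "true")  // sample rows to guess column types
  .csv("people.csv")
```

Without `header` and `inferSchema`, the CSV reader would produce default column names (`_c0`, `_c1`) and read every column as a string.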
As of Spark 2.0, DataFrameReader can read text files using the textFile methods, which return a Dataset[String] (not a DataFrame).
spark.read.textFile("README.md")
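Because textFile yields a Dataset[String] with one element per line, the usual functional operators apply directly, with no Row unwrapping needed. A small self-contained sketch (the file name and contents are invented for illustration):

```scala
import java.nio.file.{Files, Paths}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._  // encoders for Dataset[String] operators

// Create a small text file so the sketch runs as-is
// (file name and contents are made up).
Files.write(Paths.get("notes.txt"), "spark reads text\ntext becomes datasets\n".getBytes)

// One Dataset element per line of the file
val lines = spark.read.textFile("notes.txt")
val words = lines.flatMap(_.split("\\s+")).filter(_.nonEmpty)
```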
You can also define your own custom file formats.
val countries = reader.format("customFormat").load("countries.cf")
There are two operation modes in Spark SQL, i.e. batch and streaming (part of Spark Structured Streaming).
You can access a DataStreamReader for reading streaming datasets through the SparkSession.readStream method.
import org.apache.spark.sql.streaming.DataStreamReader

val stream: DataStreamReader = spark.readStream
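To try the streaming reader without any external system, the built-in "rate" source (available since Spark 2.2) generates rows continuously. A minimal sketch:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[*]").getOrCreate()

// The "rate" source emits rows with a timestamp and a monotonically
// increasing value, at the configured rate.
val ticks = spark.readStream
  .format("rate")
  .option("rowsPerSecond", "1")
  .load()
```

The resulting DataFrame reports isStreaming as true, which is how batch and streaming Datasets are told apart.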
The available methods in DataStreamReader are similar to DataFrameReader.
Saving Datasets
Spark SQL can save data to external storage systems, such as files, Hive tables and JDBC databases, through the DataFrameWriter interface.
You use the write method on a Dataset to access a DataFrameWriter.
import org.apache.spark.sql.{DataFrameWriter, Dataset}
import spark.implicits._  // for the toDS conversion

val ints: Dataset[Int] = (0 to 5).toDS
val writer: DataFrameWriter[Int] = ints.write
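A round trip through the writer and reader makes the batch semantics concrete. A minimal sketch (the output path is invented for this example):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._  // for the toDS conversion

val ints = (0 to 5).toDS

// mode controls what happens when the target already exists;
// "overwrite" replaces it. The directory name is made up.
ints.write
  .mode("overwrite")
  .parquet("ints.parquet")

// Read the persisted data back as a typed Dataset
val back = spark.read.parquet("ints.parquet").as[Int]
```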
DataFrameWriter is an interface to persist a Dataset to an external storage system in a batch fashion.
You can access DataStreamWriter for writing streaming datasets through Dataset.writeStream method.
import spark.implicits._  // encoder for the as[String] conversion
val papers = spark.readStream.text("papers").as[String]

import org.apache.spark.sql.streaming.DataStreamWriter

val writer: DataStreamWriter[String] = papers.writeStream
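One practical difference from DataFrameWriter: a streaming write does not finish on its own. Calling start() launches a continuously running query that you stop explicitly. A minimal sketch using the built-in rate source and console sink:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local[*]").getOrCreate()

// A self-generating streaming source, so no external system is needed
val ticks = spark.readStream.format("rate").load()

// start() returns a StreamingQuery handle for the running query
val query = ticks.writeStream
  .format("console")   // print each micro-batch to stdout
  .start()

// Let the query run briefly, then shut it down
query.awaitTermination(3000)
query.stop()
```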
The available methods in DataStreamWriter are similar to DataFrameWriter.