public class JDBCUtils
extends java.lang.Object
| Modifier and Type | Class and Description |
|---|---|
| static interface | JDBCUtils.ConnectionSupplier: Defines a template for creating a lambda function to supply a database connection. |
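As a rough illustration of the nested interface, the supplier is typically written as a lambda. This is a minimal sketch only: it assumes ConnectionSupplier declares a single method that returns a java.sql.Connection and may throw java.sql.SQLException, and the JDBC URL, user, and password are placeholders.

```java
// Hypothetical sketch: a ConnectionSupplier written as a lambda.
// Assumes the interface's single method returns java.sql.Connection and may
// throw java.sql.SQLException; URL, user, and password are placeholders.
JDBCUtils.ConnectionSupplier getConnection = () ->
    java.sql.DriverManager.getConnection(
        "jdbc:oracle:thin:@//dbhost:1521/orcl", "db_user", "db_password");
```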
| Constructor and Description |
|---|
| JDBCUtils() |
| Modifier and Type | Method and Description |
|---|---|
| static SpatialJavaRDD<SparkRecordInfo> | createSpatialRDDFromQuery(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String sql, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider): Creates a Spatial RDD given a SQL query to a database. |
| static SpatialJavaRDD<SparkRecordInfo> | createSpatialRDDFromQuery(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String sql, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider, long rowsPerPartition): Creates a Spatial RDD given a SQL query to a database. |
| static SpatialJavaRDD<SparkRecordInfo> | createSpatialRDDFromTable(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String tableName, java.util.List<java.lang.String> fields, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider): Creates a Spatial RDD given the name of a table and the fields to be selected. |
| static SpatialJavaRDD<SparkRecordInfo> | createSpatialRDDFromTable(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String tableName, java.util.List<java.lang.String> fields, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider, long rowsPerPartition): Creates a Spatial RDD given the name of a table and the fields to be selected. |
public static SpatialJavaRDD<SparkRecordInfo> createSpatialRDDFromQuery(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String sql, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider) throws java.sql.SQLException

Creates a Spatial RDD given a SQL query to a database.

Parameters:
sc - a Spark context
getConnection - a lambda function that supplies the connection to the database
sql - a SQL query which will return the content for the RDD. A spatial column is expected.
recordInfoProvider - an implementation of SparkRecordInfoProvider to customize how the row's fields are mapped to a SparkRecordInfo. If null, a default implementation is used.
Throws:
java.sql.SQLException
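For orientation, a minimal usage sketch of this overload. The Spark configuration, the connection details, the query, and the choice of passing null for recordInfoProvider (to get the documented default) are assumptions made for the example, not part of this reference.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical usage sketch; imports for JDBCUtils, SpatialJavaRDD and
// SparkRecordInfo are assumed to come from the library's own packages.
public class SpatialRDDFromQueryExample {
    public static void main(String[] args) throws java.sql.SQLException {
        JavaSparkContext sc = new JavaSparkContext(
            new SparkConf().setAppName("spatial-jdbc-query-example"));

        // Placeholder connection details.
        JDBCUtils.ConnectionSupplier getConnection = () ->
            java.sql.DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/orcl", "db_user", "db_password");

        // The query is expected to include a spatial column (GEOMETRY here).
        SpatialJavaRDD<SparkRecordInfo> rdd = JDBCUtils.createSpatialRDDFromQuery(
            sc,
            getConnection,
            "SELECT ID, NAME, GEOMETRY FROM CUSTOMERS",
            null); // null: the documented default SparkRecordInfoProvider is used

        sc.stop();
    }
}
```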
public static SpatialJavaRDD<SparkRecordInfo> createSpatialRDDFromQuery(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String sql, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider, long rowsPerPartition) throws java.sql.SQLException

Creates a Spatial RDD given a SQL query to a database.

Parameters:
sc - a Spark context
getConnection - a lambda function that supplies the connection to the database
sql - a SQL query which will return the content for the RDD. A spatial column is expected.
recordInfoProvider - an implementation of SparkRecordInfoProvider to customize how the row's fields are mapped to a SparkRecordInfo. If null, a default implementation is used.
rowsPerPartition - the number of desired rows from the database to be stored per Spark partition
Throws:
java.sql.SQLException
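This overload only adds the rowsPerPartition hint. Reusing the setup from the previous sketch, the call could look like the following; the partition size is an arbitrary illustrative value.

```java
// Same arguments as the previous sketch, plus an assumed target of
// roughly 10,000 database rows per Spark partition.
SpatialJavaRDD<SparkRecordInfo> rdd = JDBCUtils.createSpatialRDDFromQuery(
    sc, getConnection, "SELECT ID, NAME, GEOMETRY FROM CUSTOMERS", null, 10_000L);
```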
public static SpatialJavaRDD<SparkRecordInfo> createSpatialRDDFromTable(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String tableName, java.util.List<java.lang.String> fields, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider) throws java.sql.SQLException

Creates a Spatial RDD given the name of a table and the fields to be selected.

Parameters:
sc - a Spark context
getConnection - a lambda function that supplies the connection to the database
tableName - the name of the table to be queried
fields - a list of the field names to be queried. If null or empty, all the table fields are retrieved.
recordInfoProvider - an implementation of SparkRecordInfoProvider to customize how the row's fields are mapped to a SparkRecordInfo. If null, a default implementation is used.
Throws:
java.sql.SQLException
public static SpatialJavaRDD<SparkRecordInfo> createSpatialRDDFromTable(JavaSparkContext sc, JDBCUtils.ConnectionSupplier getConnection, java.lang.String tableName, java.util.List<java.lang.String> fields, SparkRecordInfoProvider<java.sql.ResultSet> recordInfoProvider, long rowsPerPartition) throws java.sql.SQLException

Creates a Spatial RDD given the name of a table and the fields to be selected.

Parameters:
sc - a Spark context
getConnection - a lambda function that supplies the connection to the database
tableName - the name of the table to be queried
fields - a list of the field names to be queried. If null or empty, all the table fields are retrieved.
recordInfoProvider - an implementation of SparkRecordInfoProvider to customize how the row's fields are mapped to a SparkRecordInfo. If null, a default implementation is used.
rowsPerPartition - the number of desired rows from the database to be stored per Spark partition
Throws:
java.sql.SQLException
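Similarly for the table-based variant, a sketch that reuses the sc and getConnection setup from the earlier query example; the table name, field list, and rows-per-partition value are illustrative assumptions.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical field selection; passing null would select all table fields instead.
List<String> fields = Arrays.asList("ID", "NAME", "GEOMETRY");

SpatialJavaRDD<SparkRecordInfo> tableRdd = JDBCUtils.createSpatialRDDFromTable(
    sc,
    getConnection,
    "CUSTOMERS",   // assumed table name
    fields,
    null,          // null: the documented default SparkRecordInfoProvider is used
    50_000L);      // assumed target of ~50,000 rows per Spark partition
```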