Type Parameters:
T - the generic type of the indexed spatial RDD
public abstract class DistributedSpatialIndex<T>
extends java.lang.Object
implements java.io.Serializable
| Modifier and Type | Field and Description |
|---|---|
| protected static java.lang.String | BOOTSTRAP_FILE |
| protected java.lang.Class&lt;T&gt; | type |
| Constructor and Description |
|---|
| DistributedSpatialIndex(java.lang.Class&lt;T&gt; type) Creates an empty instance which will accept spatial RDDs of the given generic type |
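Because the class is abstract, this constructor is meant to be invoked by concrete subclasses; client code normally obtains instances through the createIndex factory method instead. A minimal sketch of how a subclass would forward the type token — the class name MyIndex and everything in its body are hypothetical, only the super(type) call reflects this page:

```java
// Hypothetical subclass sketch; the abstract method bodies are omitted.
public class MyIndex<T> extends DistributedSpatialIndex<T> {
    public MyIndex(java.lang.Class<T> type) {
        super(type); // stored in the protected `type` field
    }
    // ... implementations of index(), filter(), flatMap(), join(),
    // persist(), unpersist(), cache(), read() and write() go here ...
}
```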
| Modifier and Type | Method and Description |
|---|---|
| abstract void | cache() Persists the index's internal RDD with the default storage level |
| static &lt;T&gt; DistributedSpatialIndex&lt;T&gt; | createIndex(org.apache.spark.api.java.JavaSparkContext sc, SpatialJavaRDD&lt;T&gt; rdd, SpatialPartitioningConfiguration partConf) Spatially indexes a spatial RDD using the specified DistributedSpatialIndex implementing class |
| abstract SpatialJavaRDD&lt;T&gt; | filter(org.apache.spark.api.java.function.Function&lt;T,java.lang.Boolean&gt; f, SpatialOperationConfig spatialOpConf) Returns a new spatial RDD containing only the elements that satisfy both the filtering function and the spatial operation |
| abstract &lt;R&gt; org.apache.spark.api.java.JavaRDD&lt;R&gt; | flatMap(org.apache.spark.api.java.function.FlatMapFunction&lt;T,R&gt; f, SpatialOperationConfig spatialOpConf) Returns a new RDD by first spatially filtering the RDD elements using the spatial operation given by spatialOpConf, then applying the function to each remaining element |
| protected abstract void | index(org.apache.spark.api.java.JavaSparkContext sc, SpatialJavaRDD&lt;T&gt; rdd, SpatialPartitioningConfiguration partConf) Spatially indexes an RDD |
| abstract &lt;T2,R&gt; org.apache.spark.api.java.JavaRDD&lt;R&gt; | join(SpatialJavaRDD&lt;T2&gt; rdd2, org.apache.spark.api.java.function.Function2&lt;T,T2,R&gt; f, SpatialOperationConfig spatialOpConf) Joins the records of the current index's data set with another spatial data set, based on a spatial relationship between the records of both data sets |
| static &lt;T&gt; DistributedSpatialIndex&lt;T&gt; | load(org.apache.spark.api.java.JavaSparkContext sc, java.lang.String pathStr) Reads an existing persisted index from the given path |
| abstract void | persist(org.apache.spark.storage.StorageLevel storageLevel) Sets the storage level of the index's internal RDD so that its values are persisted across operations after the first time it is computed |
| protected abstract void | read(org.apache.spark.api.java.JavaSparkContext sc, java.lang.String pathStr) Loads a persisted index from the given path |
| void | save(org.apache.spark.api.java.JavaSparkContext sc, java.lang.String pathStr) Stores the current index as a file at the given path |
| abstract void | unpersist() Marks the index's internal RDD as non-persistent |
| protected abstract void | write(org.apache.spark.api.java.JavaSparkContext sc, java.lang.String pathStr) Writes the current index to the given path |
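Taken together, the methods above suggest a create-then-query workflow. The following sketch is illustrative only: it assumes sc (a JavaSparkContext), rdd and otherRdd (SpatialJavaRDD&lt;String&gt; instances), and opConf (a SpatialOperationConfig) were built elsewhere, and the QuadTreeConfiguration class used as the partitioning configuration is an assumption, not taken from this page.

```java
// Assumed: QuadTreeConfiguration is a SpatialPartitioningConfiguration
// implementation; sc, rdd, otherRdd and opConf are created elsewhere.
SpatialPartitioningConfiguration partConf = new QuadTreeConfiguration();
DistributedSpatialIndex<String> index =
        DistributedSpatialIndex.createIndex(sc, rdd, partConf);

// Keep only records that satisfy BOTH the spatial operation in opConf
// (e.g. an "is inside the query window" test) and the user function.
SpatialJavaRDD<String> matches =
        index.filter(record -> record.contains("hotel"), opConf);

// Spatial join: f is invoked with one record from each side whenever the
// pair satisfies the spatial relationship defined in opConf.
JavaRDD<String> joined =
        index.join(otherRdd, (a, b) -> a + " <-> " + b, opConf);
```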
protected java.lang.Class<T> type
protected static final java.lang.String BOOTSTRAP_FILE
public DistributedSpatialIndex(java.lang.Class<T> type)
Parameters:
type - the generic type of the spatial RDD to be indexed
protected abstract void index(org.apache.spark.api.java.JavaSparkContext sc,
                              SpatialJavaRDD<T> rdd,
                              SpatialPartitioningConfiguration partConf)
Parameters:
sc - an existing Spark context
rdd - a spatial RDD
partConf - the indexing and partitioning configuration
protected abstract void write(org.apache.spark.api.java.JavaSparkContext sc,
                              java.lang.String pathStr)
                       throws java.lang.Exception
Parameters:
sc - an existing Spark context
pathStr - a path on either a local or a distributed file system
Throws:
java.lang.Exception
protected abstract void read(org.apache.spark.api.java.JavaSparkContext sc,
                             java.lang.String pathStr)
                      throws java.lang.Exception
Parameters:
sc - an existing Spark context
pathStr - a path on either a local or a distributed file system
Throws:
java.lang.Exception

public abstract SpatialJavaRDD<T> filter(org.apache.spark.api.java.function.Function<T,java.lang.Boolean> f,
                                         SpatialOperationConfig spatialOpConf)
                                  throws java.lang.Exception
Parameters:
f - a filtering function
spatialOpConf - the spatial operation configuration used to spatially filter the RDD
Returns:
a SpatialJavaRDD containing the matching elements
Throws:
java.lang.Exception

public abstract <R> org.apache.spark.api.java.JavaRDD<R> flatMap(org.apache.spark.api.java.function.FlatMapFunction<T,R> f,
                                                                 SpatialOperationConfig spatialOpConf)
                                                          throws java.lang.Exception
Parameters:
f - a function to apply to each element
spatialOpConf - the spatial operation configuration used to spatially filter the RDD
Throws:
java.lang.Exception

public abstract <T2,R> org.apache.spark.api.java.JavaRDD<R> join(SpatialJavaRDD<T2> rdd2,
                                                                 org.apache.spark.api.java.function.Function2<T,T2,R> f,
                                                                 SpatialOperationConfig spatialOpConf)
                                                          throws java.lang.Exception
Parameters:
rdd2 - a spatial RDD to join with
f - a tuple processor function which receives a tuple of records, one from each data set, that match the spatial join condition
spatialOpConf - a spatial operation configuration defining the spatial relationship used to join records
Throws:
java.lang.Exception

public abstract void persist(org.apache.spark.storage.StorageLevel storageLevel)
Parameters:
storageLevel - the storage level to apply to the index's internal RDD

public abstract void unpersist()
public abstract void cache()
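persist, cache, and unpersist manage the lifetime of the index's internal RDD, while save and load persist the index itself to a file system. A hedged sketch, assuming index and sc from an earlier createIndex call; the HDFS path is hypothetical:

```java
// Cache the internal RDD so repeated queries avoid recomputation.
index.persist(StorageLevel.MEMORY_AND_DISK()); // or index.cache() for the default level
// ... run several filter()/join() queries against the index ...
index.unpersist(); // release the cached partitions

// Write the index to storage and reload it in a later job.
index.save(sc, "hdfs:///indexes/myIndex");      // hypothetical path
DistributedSpatialIndex<String> reloaded =
        DistributedSpatialIndex.load(sc, "hdfs:///indexes/myIndex");
```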
public void save(org.apache.spark.api.java.JavaSparkContext sc,
                 java.lang.String pathStr)
          throws java.lang.Exception
Parameters:
sc - an existing Spark context
pathStr - a path on either a local or a distributed file system
Throws:
java.lang.Exception

public static <T> DistributedSpatialIndex<T> load(org.apache.spark.api.java.JavaSparkContext sc,
                                                  java.lang.String pathStr)
                                       throws java.lang.Exception
Parameters:
sc - an existing Spark context
pathStr - a path on either a local or a distributed file system
Throws:
java.lang.Exception

public static <T> DistributedSpatialIndex<T> createIndex(org.apache.spark.api.java.JavaSparkContext sc,
                                                         SpatialJavaRDD<T> rdd,
                                                         SpatialPartitioningConfiguration partConf)
                                              throws java.lang.Exception
Spatially indexes a spatial RDD using the specified DistributedSpatialIndex implementing class.
Parameters:
sc - an existing Spark context
rdd - a spatial RDD
partConf - the indexing and partitioning configuration; it determines the concrete index implementation to be created
Returns:
a DistributedSpatialIndex
Throws:
java.lang.Exception

Copyright © 2016 Oracle and/or its affiliates. All Rights Reserved.