4.1.1 About OREmodels Functions

The OREmodels package contains functions for building advanced analytical data models on data in ore.frame objects. Table 4-1 lists the OREmodels functions.

Table 4-1 Functions in the OREmodels Package

Function           Description
ore.glm            Fits and uses a generalized linear model on data in an ore.frame.
ore.lm             Fits a linear regression model on data in an ore.frame.
ore.neural         Fits a neural network model on data in an ore.frame.
ore.randomForest   Creates a random forest classification model in parallel on data in an ore.frame.
ore.stepwise       Fits a stepwise linear regression model on data in an ore.frame.

Note:

In R terminology, the phrase "fits a model" is often synonymous with "builds a model". In this document and in the online help for Oracle R Enterprise functions, the phrases are used interchangeably.
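
All of these functions follow the same basic calling pattern as their open source R counterparts: you supply an R formula and an ore.frame, and the model is built on data that resides in the database. The following sketch illustrates that pattern with ore.glm. It assumes an active Oracle R Enterprise connection established with ore.connect; argument names other than formula and data (such as family) reflect typical usage and should be verified in the online help for your release.

    # Assumes library(ORE) is loaded and ore.connect() has established
    # a connection to the database.

    # Push the in-memory infert data.frame to a temporary ore.frame.
    INFERT <- ore.push(infert)

    # Fit a logistic regression model on data in the ore.frame; the
    # family argument follows the stats::glm convention (an assumption).
    modGlm <- ore.glm(case ~ age + parity + spontaneous + induced,
                      data = INFERT, family = binomial())
    summary(modGlm)

    # Score the same ore.frame; predictions are computed in the database.
    pred <- predict(modGlm, newdata = INFERT)
    head(pred)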

The ore.glm, ore.lm, and ore.stepwise functions have the following advantages:

  • The algorithms provide accurate solutions using out-of-core QR factorization. QR factorization decomposes a matrix into an orthogonal matrix and a triangular matrix.

    QR is an algorithm of choice for difficult rank-deficient models.

  • You can process data that does not fit into memory, that is, out-of-core data. QR factors a matrix into two matrices, one of which fits into memory while the other is stored on disk.

    The ore.glm, ore.lm, and ore.stepwise functions can build models on data sets with more than one billion rows.

  • The ore.stepwise function provides fast implementations of forward, backward, and stepwise model selection techniques, as shown in the sketch following this list.
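
The following sketch shows the linear modeling functions on a pushed copy of the longley data set, fitting a full linear regression model and then a stepwise model. It assumes an active Oracle R Enterprise connection; the add.p and drop.p significance thresholds shown for ore.stepwise reflect typical usage and should be confirmed in the function's online help.

    # Assumes an active Oracle R Enterprise connection.
    LONGLEY <- ore.push(longley)

    # In-database linear regression using out-of-core QR factorization.
    modLm <- ore.lm(Employed ~ ., data = LONGLEY)
    summary(modLm)

    # Stepwise model selection; add.p and drop.p are assumed to be the
    # significance thresholds for adding and dropping terms.
    modStep <- ore.stepwise(Employed ~ ., data = LONGLEY,
                            add.p = 0.1, drop.p = 0.1)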

The ore.neural function has the following advantages:

  • It is a highly scalable implementation of neural networks, able to build a model on even billion-row data sets in a matter of minutes. The ore.neural function can be run in two modes: in-memory for small to medium-sized data sets, and distributed (out-of-core) for large inputs.

  • Users can specify activation functions for neurons on a per-layer basis; ore.neural supports many different activation functions.

  • Users can specify a neural network topology consisting of any number of hidden layers, including none, as illustrated in the sketch following this list.
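
The sketch below fits a neural network with two hidden layers and per-layer activation functions on a pushed copy of the iris data set, then scores it in the database. The hiddenSizes and activations arguments and the activation-function names reflect typical ore.neural usage; treat them as assumptions and confirm them, including the output-layer activation, in the online help for your release.

    # Assumes an active Oracle R Enterprise connection.
    IRIS <- ore.push(iris)

    # Two hidden layers (20 and 5 neurons) plus a linear output layer.
    # Argument names and activation identifiers are assumed; see
    # help(ore.neural) in your release.
    modNN <- ore.neural(Petal.Length ~ Petal.Width + Sepal.Length,
                        data = IRIS,
                        hiddenSizes = c(20, 5),
                        activations = c("tanh", "tanh", "linear"))

    # Score in the database with the standard predict generic.
    pred <- predict(modNN, newdata = IRIS)
    head(pred)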