MySQL AI User Guide
This topic describes how to import an external ONNX model.
The following sections describe the steps to import an ONNX model and provide examples for reference.
Before you begin, review ONNX Model Metadata.
You can import an external ONNX model in the following ways:
Import model as a string: For smaller models, you can copy the encoded string and paste it into a session variable or a temporary table column. You then run the ML_MODEL_IMPORT routine to import the model as a pre-processed model object.
Import model directly from a table: For larger models, you can load the entire file into a table with the appropriate parameters and then import the table directly into your model catalog. If needed, you can load the model in batches of smaller files. To do this, you run the ML_MODEL_IMPORT routine and import the model as a table.
The table that you load the model into must have the following columns:
chunk_id: The recommended parameters are INT AUTO_INCREMENT PRIMARY KEY. There must be only one row in the table with chunk_id = 1.
model_object: The recommended parameters are LONGTEXT NOT NULL.
model_metadata: The recommended parameters are JSON DEFAULT NULL.
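For example, a table that follows these recommendations can be created as shown below. The table name model_table is illustrative; any table name works.
mysql> CREATE TABLE model_table (chunk_id INT AUTO_INCREMENT PRIMARY KEY, model_object LONGTEXT NOT NULL, model_metadata JSON DEFAULT NULL);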
The workflow to import an ONNX model includes the following steps:
Convert the ONNX file to Base 64 encoding and carry out string serialization. See Encoding ONNX File.
Depending on the size of the model, select whether to import the model as a string in a pre-processed model object (smaller files) or as a table (larger files). Then, refer to the appropriate section to prepare the model file. See either Preparing to Import ONNX Model as a Pre-Processed Object or Preparing to Import ONNX Model as a Table.
Define the model metadata as needed, depending on the type of machine learning task for the model. See Defining Model Metadata.
Import the model by using the ML_MODEL_IMPORT routine. See either Importing ONNX Model as a Pre-Processed Object to import the model as a string, or Importing ONNX Model as a Table to import the model as a table.
Before importing an ONNX model, you must convert the ONNX file to Base 64 encoding and carry out string serialization. Do this with the Python base64 module. Ensure you have the appropriate version of Python installed.
To encode the ONNX file:
Open a terminal window (command prompt on Windows).
Install the ONNX library.
pip install onnx
Launch Python and run the following code.
# python3 encode_onnx_base64.py
import onnx
import base64

with open("output_file_name", "wb") as f:
    model = onnx.load("input_file_name")
    f.write(base64.b64encode(model.SerializeToString()))
Replace input_file_name with the full file path to the ONNX file and output_file_name with the desired file name for the encoded file. If needed, set a file path for the output file.
The following example converts the /Users/user1/iris.onnx file and creates the output file iris_base64.onnx.
# python3 encode_onnx_base64.py
import onnx
import base64

with open("iris_base64.onnx", "wb") as f:
    model = onnx.load("/Users/user1/iris.onnx")
    f.write(base64.b64encode(model.SerializeToString()))
After encoding the ONNX file, select the method to import the model and review the appropriate steps.
For smaller model files, you can import the ONNX model as a string into a pre-processed object.
To prepare to import the ONNX Model as a string:
Open the encoded file and copy the string.
Connect to the MySQL server.
Copy and paste the converted string for the file into a session variable. For example:
mysql> SET @onnx_string_model_object='ONNX_file_string';
Alternatively, you can load the encoded file directly into a table column. Make sure you do the following:
Set the appropriate local-infile setting for the client. The server setting local_infile=ON is enabled by default. Verify with your admin before using these settings. See Security Considerations for LOAD DATA LOCAL to learn more.
Upload the file to the appropriate folder in the MySQL server based on the secure_file_priv setting. To review this setting, connect to the MySQL server and run the following command:
mysql> SHOW VARIABLES LIKE 'secure_file_priv';
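You can check the server-side local_infile setting in the same way. The following is a sketch of the output you might see; the actual values depend on your deployment, and on many Linux installations secure_file_priv points to a directory such as /var/lib/mysql-files/.
mysql> SHOW VARIABLES LIKE 'local_infile';
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| local_infile  | ON    |
+---------------+-------+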
To load the encoded file directly into a table column:
From a terminal window, upload the ONNX file to the folder of your username in the compute instance.
$> scp -v -i ssh-key.key /Users/user1/iris_base64.onnx user1@ComputeInstancePublicIP:/home/user1/
Replace the following:
ssh-key.key: The full file path to the SSH key file (.key) for the compute instance.
/Users/user1/iris_base64.onnx: The full file path to the ONNX file on your device.
user1@ComputeInstancePublicIP: The appropriate username and public IP for the compute instance.
/home/user1/: The appropriate file path to your username in the compute instance.
Once the upload successfully completes, SSH into the compute instance.
$> ssh -i ssh-key.key user1@computeInstancePublicIP
Replace the following:
ssh-key.key: The full file path to the SSH key file (.key) for the compute instance.
user1@ComputeInstancePublicIP: The appropriate username and public IP for the compute instance.
Change the directory to the one for your username.
$> cd /home/user1
Replace user1 with your own username.
Create a copy of the ONNX file.
$> touch iris_base64.onnx
Replace iris_base64.onnx with the file name of the ONNX file.
Copy the ONNX file to the appropriate folder in the MySQL server based on the secure_file_priv setting.
$> sudo cp iris_base64.onnx /var/lib/mysql-files
Replace the following:
iris_base64.onnx: The file name of the ONNX file.
/var/lib/mysql-files: The file path based on the secure_file_priv setting.
Update the owner and group of the previously specified file path that contains the uploaded ONNX file.
$> sudo chown -R mysql:mysql /var/lib/mysql-files
Replace /var/lib/mysql-files with the file path previously specified.
Connect to the MySQL server with the local-infile setting set to 1.
> mysql -u user1 -p --local-infile=1
Replace user1 with your MySQL username.
Create and use the database to store the table. For example:
mysql>CREATE DATABASE onnx_model;
mysql>USE onnx_model;
Create a table with only one column to store the string.
The following example creates the onnx_temp table with the onnx_string column with the LONGTEXT data type.
mysql> CREATE TABLE onnx_temp (onnx_string LONGTEXT);
Use a LOAD DATA INFILE statement to load the pre-processed .onnx file into the temporary table.
The following example loads the iris_base64.onnx file with the string into the onnx_string column in the onnx_temp table.
mysql> LOAD DATA INFILE 'iris_base64.onnx'
INTO TABLE onnx_temp
CHARACTER SET binary
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\r' (onnx_string);
Insert the loaded string into a session variable.
The following example inserts the loaded string from the onnx_string column into the @onnx_table_model_object session variable.
mysql> SELECT onnx_string FROM onnx_temp INTO @onnx_table_model_object;
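Optionally, verify that the session variable now holds the encoded string, for example by checking its length (a quick sanity check; the exact value depends on your model):
mysql> SELECT LENGTH(@onnx_table_model_object);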
After preparing the model, you can define the model metadata. See Defining Model Metadata.
For larger model files, you must import the model as a table. Make sure you do the following:
Set the appropriate local-infile setting for the client. The server setting local_infile=ON is enabled by default. Verify with your admin before using these settings. See Security Considerations for LOAD DATA LOCAL to learn more.
Upload the file to the appropriate folder in the MySQL server based on the secure_file_priv setting. To review this setting, connect to the MySQL server and run the following command:
mysql> SHOW VARIABLES LIKE 'secure_file_priv';
To import the model as a table:
From a terminal window, upload the ONNX file to the folder of your username in the compute instance.
$> scp -v -i ssh-key.key /Users/user1/iris_base64.onnx user1@ComputeInstancePublicIP:/home/user1/
Replace the following:
ssh-key.key: The full file path to the SSH key file (.key) for the compute instance.
/Users/user1/iris_base64.onnx: The full file path to the ONNX file on your device.
user1@ComputeInstancePublicIP: The appropriate username and public IP for the compute instance.
/home/user1/: The appropriate file path to your username in the compute instance.
Once the upload successfully completes, SSH into the compute instance.
$> ssh -i ssh-key.key user1@computeInstancePublicIP
Replace the following:
ssh-key.key: The full file path to the SSH key file (.key) for the compute instance.
user1@ComputeInstancePublicIP: The appropriate username and public IP for the compute instance.
Change the directory to the one for your username.
$> cd /home/user1
Replace user1 with your own username.
Create a copy of the ONNX file.
$> touch iris_base64.onnx
Replace iris_base64.onnx with the file name of the ONNX file.
Copy the ONNX file to the appropriate folder in the MySQL server based on the secure_file_priv setting.
$> sudo cp iris_base64.onnx /var/lib/mysql-files
Replace the following:
iris_base64.onnx: The file name of the ONNX file.
/var/lib/mysql-files: The file path based on the secure_file_priv setting.
Update the owner and group of the previously specified file path that contains the uploaded ONNX file.
$> sudo chown -R mysql:mysql /var/lib/mysql-files
Replace /var/lib/mysql-files with the file path previously specified.
Connect to the MySQL server with the local-infile setting set to 1.
> mysql -u user1 -p --local-infile=1
Replace user1 with your MySQL username.
Create and use the database to store the table. For example:
mysql>CREATE DATABASE onnx_model;
mysql>USE onnx_model;
Create a table to store the model. The table must have the three required columns to store the details for the model (chunk_id, model_object, and model_metadata). See ML_MODEL_IMPORT Overview.
For example:
mysql> CREATE TABLE model_table (chunk_id INT AUTO_INCREMENT PRIMARY KEY, model_object LONGTEXT NOT NULL, model_metadata JSON DEFAULT NULL);
Use a LOAD DATA INFILE statement to load the model. If needed, load the model in batches of files depending on the size of the model. See LOAD DATA Statement to learn more. The following example loads the model in three separate files into the model_object column in the model_table table previously created:
mysql>LOAD DATA INFILE '/onnx_examples/x00' INTO TABLE model_table CHARACTER SET binary FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r' (model_object);
Query OK, 1 row affected (34.96 sec)
Records: 1  Deleted: 0  Skipped: 0  Warnings: 0

mysql>LOAD DATA INFILE '/onnx_examples/x01' INTO TABLE model_table CHARACTER SET binary FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r' (model_object);
Query OK, 1 row affected (32.74 sec)
Records: 1  Deleted: 0  Skipped: 0  Warnings: 0

mysql>LOAD DATA INFILE '/onnx_examples/x02' INTO TABLE model_table CHARACTER SET binary FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r' (model_object);
Query OK, 1 row affected (11.90 sec)
Records: 1  Deleted: 0  Skipped: 0  Warnings: 0
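Optionally, confirm that all chunks loaded and review their sizes (a quick check; the number of rows and the lengths depend on how you split the file):
mysql> SELECT chunk_id, LENGTH(model_object) FROM model_table ORDER BY chunk_id;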
After preparing the model, you can define the model metadata. See Defining Model Metadata.
After preparing the ONNX model (either as a string or table), define the metadata for the model as required. See Model Metadata and ONNX Model Metadata to learn more about requirements depending on the task type of the model.
To define the metadata for the ONNX model:
If you include the column names for the model in the metadata, you have the option to set them in a JSON object as key-value pairs.
mysql> SET @variable = JSON_OBJECT("key","value"[,"key","value"] ...);
For example:
mysql> SET @column_names = JSON_OBJECT("0","f1", "1","f2", "2","f3");
Set the required metadata for the model in a JSON object as key-value pairs. To learn more about metadata requirements, see ONNX Model Metadata. You can also include additional information that allows you to properly configure input tables and columns for generating predictions.
mysql> SET @variable = JSON_OBJECT("key","value"[,"key","value"] ...);
The following example shows how to define the metadata if you import the model as a string (pre-processed object). The predictions_name and prediction_probabilities_name variables are provided because it is a classification task. Including the column_names allows you to refer to the metadata to ensure that input tables for predictions have the same details. Otherwise, an error is generated.
mysql> SET @model_metadata = JSON_OBJECT('task','classification','onnx_outputs_info',
JSON_OBJECT('predictions_name','label','prediction_probabilities_name', 'probabilities'),'target_column_name','target',
'train_table_name','mlcorpus.`classification_3_table`','column_names',@column_names,'notes','user notes for the model',
'training_score',0.734,'training_time',100.34,'n_rows',1000,'n_columns',3,'algorithm_name','xgboost');
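Optionally, inspect the metadata before importing the model (a quick check):
mysql> SELECT JSON_PRETTY(@model_metadata);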
The following example shows how to define the metadata if you import the model from a table. The predictions_name and prediction_probabilities_name variables are provided because it is a classification task. After defining the metadata, update the metadata in the temporary table for the row where chunk_id=1.
mysql>SET @model_metadata = JSON_OBJECT('task','classification','onnx_outputs_info', JSON_OBJECT('predictions_name','label','prediction_probabilities_name', 'probabilities'),'target_column_name','target');
mysql>UPDATE mlcorpus.model_table SET model_metadata=@model_metadata WHERE chunk_id=1;
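Optionally, confirm that the metadata is stored with the first chunk (a quick check):
mysql> SELECT JSON_PRETTY(model_metadata) FROM mlcorpus.model_table WHERE chunk_id=1;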
Depending on how you prepared the model, follow the appropriate steps to import the model:
If you followed the steps in Preparing to Import ONNX Model as a Pre-Processed Object, review the following steps to import the model as a pre-processed object.
To import the model as a pre-processed object:
Optionally, define the model handle for the imported model instead of automatically generating one. See Work with Model Handles.
mysql> SET @variable = 'model_handle';
For example:
mysql> SET @model = 'onnx_model_string';
Run ML_MODEL_IMPORT to import the model.
mysql> CALL sys.ML_MODEL_IMPORT(model_object, model_metadata, model_handle);
Since you are importing a pre-processed object, the model_object is defined by the string you previously set in either the @onnx_string_model_object or @onnx_table_model_object session variable. The model_metadata is defined by the metadata previously set in the @model_metadata session variable. The model_handle is defined by the session variable created for the model handle.
See the following example:
mysql> CALL sys.ML_MODEL_IMPORT(@onnx_string_model_object, @model_metadata, @model);
Confirm the model was imported successfully by querying the model_id and model_handle from the model catalog.
Query the model by using the model handle previously created. Replace user1 with your own MySQL user name.
mysql> SELECT model_id, model_handle FROM ML_SCHEMA_user1.MODEL_CATALOG WHERE model_handle='onnx_model_string';
+----------+-------------------+
| model_id | model_handle      |
+----------+-------------------+
|        1 | onnx_model_string |
+----------+-------------------+
1 row in set (0.0485 sec)
To load the model into MySQL AI so you can start using it with MySQL AI routines, run ML_MODEL_LOAD.
mysql> CALL sys.ML_MODEL_LOAD(model_handle, NULL);
For example:
mysql> CALL sys.ML_MODEL_LOAD(@model, NULL);
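Once the model is loaded, you can pass the model handle to MySQL AI routines. For example, the following sketch generates predictions into a new table; the input table mlcorpus.classification_3_predict and output table mlcorpus.predictions are illustrative, so substitute your own tables:
mysql> CALL sys.ML_PREDICT_TABLE('mlcorpus.classification_3_predict', @model, 'mlcorpus.predictions', NULL);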
If you followed the steps in Preparing to Import ONNX Model as a Table, review the following steps to import the model as a table.
To import the model as a table:
Optionally, define the model handle for the imported model instead of automatically generating one. See Work with Model Handles.
mysql> SET @variable = 'model_handle';
For example:
mysql> SET @model = 'onnx_model_table';
Run ML_MODEL_IMPORT to import the model.
mysql> CALL sys.ML_MODEL_IMPORT(model_object, model_metadata, model_handle);
Since you are importing a table, the model_object is set to NULL. The model_metadata is defined by the schema name and table name storing the string for the ONNX model. The metadata for the model is stored in the table when you follow the steps in Defining Model Metadata. The model_handle is defined by the session variable created for the model handle.
See the following example:
mysql> CALL sys.ML_MODEL_IMPORT(NULL, JSON_OBJECT('schema', 'onnx_models', 'table', 'model_table'), @model);
Confirm the model was imported successfully by querying the model_id and model_handle from the model catalog.
Query the model by using the model handle previously created. Replace user1 with your own MySQL user name.
mysql> SELECT model_id, model_handle FROM ML_SCHEMA_user1.MODEL_CATALOG WHERE model_handle='onnx_model_table';
+----------+------------------+
| model_id | model_handle     |
+----------+------------------+
|        2 | onnx_model_table |
+----------+------------------+
1 row in set (0.0485 sec)
To load the model into MySQL AI so you can start using it with MySQL AI routines, run ML_MODEL_LOAD.
mysql> CALL sys.ML_MODEL_LOAD(model_handle, NULL);
For example:
mysql> CALL sys.ML_MODEL_LOAD(@model, NULL);
Review the following additional examples for importing ONNX models.
In the following example, an ONNX model for classification is imported. Then, the model is used to generate predictions, a score, and prediction explainers for a dataset in MySQL AI.
mysql>SET @model = 'sklearn_pipeline_classification_3_onnx';
Query OK, 0 rows affected (0.0003 sec)

mysql>SET @model_metadata = JSON_OBJECT('task','classification', 'onnx_outputs_info', JSON_OBJECT('predictions_name','label','prediction_probabilities_name', 'probabilities'));
Query OK, 0 rows affected (0.0003 sec)

mysql>CALL sys.ML_MODEL_IMPORT(@onnx_encode_sklearn_pipeline_classification_3, @model_metadata, @model);
Query OK, 0 rows affected (1.2438 sec)

mysql>CALL sys.ML_MODEL_LOAD(@model, NULL);
Query OK, 0 rows affected (0.5372 sec)

mysql>CALL sys.ML_PREDICT_TABLE('mlcorpus.classification_3_predict', @model, 'mlcorpus.predictions', NULL);
Query OK, 0 rows affected (0.8743 sec)

mysql>SELECT * FROM mlcorpus.predictions;
+-------------------+----+----+-----+------------+----------------------------------------------------------------------------------------------------------+
| _4aad19ca6e_pk_id | f1 | f2 | f3 | Prediction | ml_results |
+-------------------+----+----+-----+------------+----------------------------------------------------------------------------------------------------------+
| 1 | a | 20 | 1.2 | 0 | {"predictions": {"prediction": 0}, "probabilities": {"0": 0.5099999904632568, "1": 0.49000000953674316}} |
| 2 | b | 21 | 3.6 | 1 | {"predictions": {"prediction": 1}, "probabilities": {"0": 0.3199999928474426, "1": 0.6800000071525574}} |
| 3 | c | 19 | 7.8 | 1 | {"predictions": {"prediction": 1}, "probabilities": {"0": 0.3199999928474426, "1": 0.6800000071525574}} |
| 4 | d | 18 | 9 | 0 | {"predictions": {"prediction": 0}, "probabilities": {"0": 0.5199999809265137, "1": 0.47999998927116394}} |
| 5 | e | 17 | 3.6 | 1 | {"predictions": {"prediction": 1}, "probabilities": {"0": 0.3199999928474426, "1": 0.6800000071525574}} |
+-------------------+----+----+-----+------------+----------------------------------------------------------------------------------------------------------+
5 rows in set (0.0005 sec)

mysql>CALL sys.ML_SCORE('mlcorpus.classification_3_table','target', @model, 'accuracy', @score, NULL);
Query OK, 0 rows affected (0.9573 sec)

mysql>SELECT @score;
+--------+
| @score |
+--------+
| 1 |
+--------+
1 row in set (0.0003 sec)

mysql>CALL sys.ML_EXPLAIN('mlcorpus.classification_3_table', 'target', @model, JSON_OBJECT('model_explainer', 'shap', 'prediction_explainer', 'shap'));
Query OK, 0 rows affected (10.1771 sec)

mysql>SELECT model_explanation FROM ML_SCHEMA_root.MODEL_CATALOG WHERE model_handle=@model;
+------------------------------------------------------+
| model_explanation |
+------------------------------------------------------+
| {"shap": {"f1": 0.0928, "f2": 0.0007, "f3": 0.0039}} |
+------------------------------------------------------+
1 row in set (0.0005 sec)

mysql>CALL sys.ML_EXPLAIN_TABLE('mlcorpus.classification_3_predict', @model, 'mlcorpus.explanations_shap', JSON_OBJECT('prediction_explainer', 'shap'));
Query OK, 0 rows affected (7.6577 sec)

mysql>SELECT * FROM mlcorpus.explanations_shap;
+-------------------+----+----+-----+------------+----------------+-----------------+----------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| _4aad19ca6e_pk_id | f1 | f2 | f3 | Prediction | f1_attribution | f2_attribution | f3_attribution | ml_results |
+-------------------+----+----+-----+------------+----------------+-----------------+----------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| 1 | a | 20 | 1.2 | 0 | 0.116909 | 0.000591494 | -0.00524929 | {"predictions": {"prediction": 0}, "attributions": {"f1_attribution": 0.11690924863020577, "f2_attribution": 0.0005914936463038284, "f3_attribution": -0.005249293645222988}} |
| 2 | b | 21 | 3.6 | 1 | 0.0772133 | -0.00110559 | 0.00219658 | {"predictions": {"prediction": 1}, "attributions": {"f1_attribution": 0.07721325159072877, "f2_attribution": -0.0011055856943130368, "f3_attribution": 0.002196577191352772}} |
| 3 | c | 19 | 7.8 | 1 | 0.0781372 | 0.0000000913938 | -0.00324671 | {"predictions": {"prediction": 1}, "attributions": {"f1_attribution": 0.07813718219598137, "f2_attribution": 9.139378859268632e-08, "f3_attribution": -0.0032467077175776238}} |
| 4 | d | 18 | 9 | 0 | 0.115209 | -0.000592354 | 0.00639341 | {"predictions": {"prediction": 0}, "attributions": {"f1_attribution": 0.11520911753177646, "f2_attribution": -0.0005923539400101152, "f3_attribution": 0.006393408775329595}} |
| 5 | e | 17 | 3.6 | 1 | 0.0767679 | 0.00110463 | 0.00219425 | {"predictions": {"prediction": 1}, "attributions": {"f1_attribution": 0.0767679293950399, "f2_attribution": 0.0011046340068181504, "f3_attribution": 0.002194248636563534}} |
+-------------------+----+----+-----+------------+----------------+-----------------+----------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
5 rows in set (0.0005 sec)
In the following example, an ONNX model for regression is imported. Then, the model is used to generate predictions, a score, and prediction explainers for a dataset in MySQL AI.
mysql>SET @model = 'sklearn_pipeline_regression_2_onnx';
Query OK, 0 rows affected (0.0003 sec)

mysql>SET @model_metadata = JSON_OBJECT('task','regression', 'onnx_outputs_info',JSON_OBJECT('predictions_name','variable'));
Query OK, 0 rows affected (0.0003 sec)

mysql>CALL sys.ML_MODEL_IMPORT(@onnx_encode_sklearn_pipeline_regression_2, @model_metadata, @model);
Query OK, 0 rows affected (1.0652 sec)

mysql>CALL sys.ML_MODEL_LOAD(@model, NULL);
Query OK, 0 rows affected (0.5141 sec)

mysql>CALL sys.ML_PREDICT_TABLE('mlcorpus.regression_2_table', @model, 'mlcorpus.predictions', NULL);
Query OK, 0 rows affected (0.8902 sec)

mysql>SELECT * FROM mlcorpus.predictions;
+-------------------+----+----+-----+--------+------------+-----------------------------------------------------+
| _4aad19ca6e_pk_id | f1 | f2 | f3 | target | Prediction | ml_results |
+-------------------+----+----+-----+--------+------------+-----------------------------------------------------+
| 1 | a | 20 | 1.2 | 22.4 | 22.262 | {"predictions": {"prediction": 22.262039184570312}} |
| 2 | b | 21 | 3.6 | 32.9 | 32.4861 | {"predictions": {"prediction": 32.486114501953125}} |
| 3 | c | 19 | 7.8 | 56.8 | 56.2482 | {"predictions": {"prediction": 56.24815368652344}} |
| 4 | d | 18 | 9 | 31.8 | 31.8 | {"predictions": {"prediction": 31.80000114440918}} |
| 5 | e | 17 | 3.6 | 56.4 | 55.9861 | {"predictions": {"prediction": 55.986114501953125}} |
+-------------------+----+----+-----+--------+------------+-----------------------------------------------------+
5 rows in set (0.0005 sec)

mysql>CALL sys.ML_SCORE('mlcorpus.regression_2_table','target', @model, 'r2', @score, NULL);
Query OK, 0 rows affected (0.8688 sec)

mysql>SELECT @score;
+--------------------+
| @score |
+--------------------+
| 0.9993192553520203 |
+--------------------+
1 row in set (0.0003 sec)

mysql>CALL sys.ML_EXPLAIN('mlcorpus.regression_2_table', 'target', @model, JSON_OBJECT('model_explainer', 'partial_dependence', 'columns_to_explain', JSON_ARRAY('f1'), 'prediction_explainer', 'shap'));
Query OK, 0 rows affected (9.9860 sec)

mysql>CALL sys.ML_EXPLAIN_TABLE('mlcorpus.regression_2_predict', @model, 'mlcorpus.explanations', JSON_OBJECT('prediction_explainer', 'shap'));
Query OK, 0 rows affected (8.2625 sec)

mysql>SELECT * FROM mlcorpus.explanations;
+-------------------+----+----+-----+------------+----------------+----------------+----------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| _4aad19ca6e_pk_id | f1 | f2 | f3 | Prediction | f1_attribution | f2_attribution | f3_attribution | ml_results |
+-------------------+----+----+-----+------------+----------------+----------------+----------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| 1 | a | 20 | 1.2 | 22.262 | -10.7595 | -4.25162 | -2.48331 | {"predictions": {"prediction": 22.262039184570312}, "attributions": {"f1_attribution": -10.759506797790523, "f2_attribution": -4.251623916625977, "f3_attribution": -2.483314704895024}} |
| 2 | b | 21 | 3.6 | 32.4861 | 2.33657 | -8.50325 | -1.1037 | {"predictions": {"prediction": 32.486114501953125}, "attributions": {"f1_attribution": 2.336572837829592, "f2_attribution": -8.50324745178223, "f3_attribution": -1.1036954879760748}} |
| 3 | c | 19 | 7.8 | 56.2482 | 14.8361 | 0 | 1.65554 | {"predictions": {"prediction": 56.24815368652344}, "attributions": {"f1_attribution": 14.83612575531006, "f2_attribution": 0.0, "f3_attribution": 1.6555433273315412}} |
| 4 | d | 18 | 9 | 31.8 | -15.2433 | 4.25162 | 3.03516 | {"predictions": {"prediction": 31.80000114440918}, "attributions": {"f1_attribution": -15.243269538879392, "f2_attribution": 4.251623725891111, "f3_attribution": 3.0351623535156236}} |
| 5 | e | 17 | 3.6 | 55.9861 | 8.83008 | 8.50325 | -1.1037 | {"predictions": {"prediction": 55.986114501953125}, "attributions": {"f1_attribution": 8.830077743530275, "f2_attribution": 8.50324764251709, "f3_attribution": -1.1036954879760756}} |
+-------------------+----+----+-----+------------+----------------+----------------+----------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
5 rows in set (0.0006 sec)
The following example imports an ONNX model with the task set to NULL. Then, the model is used to generate predictions.
mysql>SET @model = 'tensorflow_recsys_onnx';
mysql>CALL sys.ML_MODEL_IMPORT(@onnx_encode_tensorflow_recsys, NULL, @model);
Query OK, 0 rows affected (1.0037 sec)

mysql>CALL sys.ML_MODEL_LOAD(@model, NULL);
Query OK, 0 rows affected (0.5116 sec)

mysql>CALL sys.ML_PREDICT_TABLE('mlcorpus.recsys_predict', @model, 'mlcorpus.predictions', NULL);
Query OK, 0 rows affected (0.8271 sec)

mysql>SELECT * FROM mlcorpus.predictions;
+-------------------+---------+-------------+--------------------------+-----------------------------------------------------------+
| _4aad19ca6e_pk_id | user_id | movie_title | Prediction | ml_results |
+-------------------+---------+-------------+--------------------------+-----------------------------------------------------------+
| 1 | a | A | {"output_1": ["0.7558"]} | {"predictions": {"prediction": {"output_1": ["0.7558"]}}} |
| 2 | b | B | {"output_1": ["1.0443"]} | {"predictions": {"prediction": {"output_1": ["1.0443"]}}} |
| 3 | c | A | {"output_1": ["0.8483"]} | {"predictions": {"prediction": {"output_1": ["0.8483"]}}} |
| 4 | d | B | {"output_1": ["1.2986"]} | {"predictions": {"prediction": {"output_1": ["1.2986"]}}} |
| 5 | e | C | {"output_1": ["1.1568"]} | {"predictions": {"prediction": {"output_1": ["1.1568"]}}} |
+-------------------+---------+-------------+--------------------------+-----------------------------------------------------------+
5 rows in set (0.0005 sec)
Review how to Create a Machine Learning Model.
Review Machine Learning Use Cases to create machine learning models with sample datasets.