Using the console to insert table data

Learn how to insert data into Oracle NoSQL Database Cloud Service tables from the NoSQL console.

The NoSQL console lets you insert new rows into Oracle NoSQL Database Cloud Service tables in the following ways:
  1. Simple Input Mode: You can use this mode to provide the values for the new rows declaratively.
  2. Advanced JSON Input Mode: You can use this mode to provide the values for the new rows in JSON format.
  3. SQL INSERT statement: Under NoSQL > Tables, choose the table where you want to add a row. Under Explore Data, enter the INSERT statement in the text box provided for the SQL statement and execute it.
  4. Bulk upload: You can upload data from a local file into the table through the browser.

Note: If the table is a Global Active table, then adding data in one region also adds the data to all the regional table replicas.

Inserting Data Into Tables: Simple Input Mode

Learn how to insert data into Oracle NoSQL Database Cloud Service tables by using the Simple Input insertion mode.

To insert data into tables:
  1. Access the NoSQL console from the Infrastructure Console. See Accessing the Service from the Infrastructure Console.
  2. The NoSQL console lists all the tables in the tenancy. To view table details, do either of the following:
    • Click the table name, or
    • Click the action menu corresponding to the table name and select View Details.
    The Table Details page opens up.
  3. Click Insert Row.
  4. In the Insert Row window, select Simple Input for Entry Mode.
  5. All the columns in the table are listed. Enter the data for each column of the table. For some column types, such as Binary, you upload the data instead.

    Note:

    Entering a value is mandatory for all non-nullable columns of the table.
  6. Click Insert Row.
    The record is inserted into the table.
To view help for the current page, click the help link at the top of the page.

Inserting Data Into Tables: Advanced JSON Input Mode

Learn how to insert data into Oracle NoSQL Database Cloud Service tables by using the Advanced JSON input mode.

To insert data into tables:
  1. Access the NoSQL console from the Infrastructure Console. See Accessing the Service from the Infrastructure Console.
  2. The NoSQL console lists all the tables in the tenancy. To view table details, do either of the following:
    • Click the table name, or
    • Click the action menu corresponding to the table name and select View Details.
    The Table Details page opens up.
  3. Click Insert Row.
  4. In the Insert Row window, select Advanced JSON Input for Entry Mode.
  5. Paste or upload the Record Definition in JSON format (a sample sketch follows these steps).
  6. Click Insert Row.
    The record is inserted into the table.
To view help for the current page, click the help link at the top of the page.
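
For illustration, the following is a minimal sketch of the JSON you might paste or upload, assuming a hypothetical table with an id (INTEGER, primary key) column, a name (STRING) column, and an info (JSON) column. The property names must match the column names of your own table's schema.

{
  "id": 100,
  "name": "Jane Doe",
  "info": { "city": "Madrid", "tier": "gold" }
}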

Using a SQL command to insert data

The INSERT statement is used to construct a new row and add it to a specified table.

An optional list of columns may be specified after the table name. This list contains the names of a subset of the table's columns, and the subset must include all the primary key columns. If no columns list is present, the default columns list contains all the columns of the table, in the order they are specified in the CREATE TABLE statement.

The columns in the columns list correspond one-to-one to the expressions (or DEFAULT keywords) listed after the VALUES clause (an error is raised if the number of expressions/DEFAULTs is not the same as the number of columns). These expressions/DEFAULTs compute the value for their associated column in the new row. An error is raised if an expression returns more than one item. If an expression returns no result, NULL is used as the result of that expression. If instead of an expression, the DEFAULT keyword appears in the VALUES list, the default value of the associated column is used as the value of that column in the new row. The default value is also used for any missing columns when the number of columns in the columns list is less than the total number of columns in the table.
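
As a minimal sketch of these rules, assume a hypothetical table named demo_users created with the following statement:

CREATE TABLE demo_users (id INTEGER, name STRING, status STRING DEFAULT "new", notes STRING, PRIMARY KEY (id))

The INSERT statement below supplies a columns list, uses the DEFAULT keyword for the status column, and omits the notes column entirely:

INSERT INTO demo_users (id, name, status) VALUES (1, "Alice", DEFAULT)

Here status takes its declared default value "new", and notes, which is absent from the columns list, is set to NULL because it is a nullable column with no explicit default.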

Example 1: Inserting a row into the BaggageInfo table, providing all column values:
INSERT INTO BaggageInfo VALUES(
1762392196147,
"Birgit Naquin",
"M",
"165-742-5715",
"QD1L0T",
[ {
    "id" : "7903989918469",
    "tagNum" : "17657806240229",
    "routing" : "JFK/MAD",
    "lastActionCode" : "OFFLOAD",
    "lastActionDesc" : "OFFLOAD",
    "lastSeenStation" : "MAD",
    "flightLegs" : [ {
      "flightNo" : "BM495",
      "flightDate" : "2019-03-07T07:00:00Z",
      "fltRouteSrc" : "JFK",
      "fltRouteDest" : "MAD",
      "estimatedArrival" : "2019-03-07T14:00:00Z",
      "actions" : [ {
        "actionAt" : "MAD",
        "actionCode" : "Offload to Carousel at MAD",
        "actionTime" : "2019-03-07T13:54:00Z"
      }, {
        "actionAt" : "JFK",
        "actionCode" : "ONLOAD to MAD",
        "actionTime" : "2019-03-07T07:00:00Z"
      }, {
        "actionAt" : "JFK",
        "actionCode" : "BagTag Scan at JFK",
        "actionTime" : "2019-03-07T06:53:00Z"
      }, {
        "actionAt" : "JFK",
        "actionCode" : "Checkin at JFK",
        "actionTime" : "2019-03-07T05:03:00Z"
      } ]
    } ],
    "lastSeenTimeGmt" : "2019-03-07T13:51:00Z",
    "bagArrivalDate" : "2019-03-07T13:51:00Z"
  } ]
)

Example 2: Skipping some column values in an INSERT statement by specifying the DEFAULT keyword.

You can skip the values of some columns by specifying DEFAULT.
INSERT INTO BaggageInfo VALUES(
1762397286805,
"Bonnie Williams",
DEFAULT,
DEFAULT,
"CZ1O5I",
[ {
    "id" : "79039899129693",
    "tagNum" : "17657806216554",
    "routing" : "SFO/ORD/FRA",
    "lastActionCode" : "OFFLOAD",
    "lastActionDesc" : "OFFLOAD",
    "lastSeenStation" : "FRA",
    "flightLegs" : [ {
      "flightNo" : "BM572",
      "flightDate" : "2019-03-02T05:00:00Z",
      "fltRouteSrc" : "SFO",
      "fltRouteDest" : "ORD",
      "estimatedArrival" : "2019-03-02T09:00:00Z",
      "actions" : [ {
        "actionAt" : "SFO",
        "actionCode" : "ONLOAD to ORD",
        "actionTime" : "2019-03-02T05:24:00Z"
      }, {
        "actionAt" : "SFO",
        "actionCode" : "BagTag Scan at SFO",
        "actionTime" : "2019-03-02T04:52:00Z"
      }, {
        "actionAt" : "SFO",
        "actionCode" : "Checkin at SFO",
        "actionTime" : "2019-03-02T03:28:00Z"
      } ]
    }, {
      "flightNo" : "BM582",
      "flightDate" : "2019-03-02T05:24:00Z",
      "fltRouteSrc" : "ORD",
      "fltRouteDest" : "FRA",
      "estimatedArrival" : "2019-03-02T13:24:00Z",
      "actions" : [ {
        "actionAt" : "FRA",
        "actionCode" : "Offload to Carousel at FRA",
        "actionTime" : "2019-03-02T13:20:00Z"
      }, {
        "actionAt" : "ORD",
        "actionCode" : "ONLOAD to FRA",
        "actionTime" : "2019-03-02T12:54:00Z"
      }, {
        "actionAt" : "ORD",
        "actionCode" : "OFFLOAD from ORD",
        "actionTime" : "2019-03-02T12:30:00Z"
      } ]
    } ],
    "lastSeenTimeGmt" : "2019-03-02T13:18:00Z",
    "bagArrivalDate" : "2019-03-02T13:18:00Z"
  } ]
)

Example 3: Specifying column names and skipping columns in the INSERT statement.

If you have data only for some columns of a table, you can specify the names of those columns in the INSERT clause and then specify the corresponding values in the VALUES clause.
INSERT INTO BaggageInfo(ticketNo, fullName, confNo, bagInfo) VALUES(
1762355349471,
"Bryant Weber",
"LI7N1W",
[ {
    "id" : "79039899149056",
    "tagNum" : "17657806234185",
    "routing" : "MEL/LAX/MIA",
    "lastActionCode" : "OFFLOAD",
    "lastActionDesc" : "OFFLOAD",
    "lastSeenStation" : "MIA",
    "flightLegs" : [ {
      "flightNo" : "BM114",
      "flightDate" : "2019-03-01T12:00:00Z",
      "fltRouteSrc" : "MEL",
      "fltRouteDest" : "LAX",
      "estimatedArrival" : "2019-03-02T02:00:00Z",
      "actions" : [ {
        "actionAt" : "MEL",
        "actionCode" : "ONLOAD to LAX",
        "actionTime" : "2019-03-01T12:20:00Z"
      }, {
        "actionAt" : "MEL",
        "actionCode" : "BagTag Scan at MEL",
        "actionTime" : "2019-03-01T11:52:00Z"
      }, {
        "actionAt" : "MEL",
        "actionCode" : "Checkin at MEL",
        "actionTime" : "2019-03-01T11:43:00Z"
      } ]
    }, {
      "flightNo" : "BM866",
      "flightDate" : "2019-03-01T12:20:00Z",
      "fltRouteSrc" : "LAX",
      "fltRouteDest" : "MIA",
      "estimatedArrival" : "2019-03-02T16:21:00Z",
      "actions" : [ {
        "actionAt" : "MIA",
        "actionCode" : "Offload to Carousel at MIA",
        "actionTime" : "2019-03-02T16:18:00Z"
      }, {
        "actionAt" : "LAX",
        "actionCode" : "ONLOAD to MIA",
        "actionTime" : "2019-03-02T16:12:00Z"
      }, {
        "actionAt" : "LAX",
        "actionCode" : "OFFLOAD from LAX",
        "actionTime" : "2019-03-02T16:02:00Z"
      } ]
    } ],
    "lastSeenTimeGmt" : "2019-03-02T16:09:00Z",
    "bagArrivalDate" : "2019-03-02T16:09:00Z"
  } ]
)
Example 4: Inserting a row into the stream_acct table, providing all column values:
INSERT INTO stream_acct VALUES(
1,
"AP",
"2023-10-18",
{
   "firstName": "Adam",
   "lastName": "Phillips",
   "country": "Germany",
   "contentStreamed": [{
      "showName": "At the Ranch",
      "showId": 26,
      "showtype": "tvseries",
      "genres": ["action", "crime", "spanish"],
      "numSeasons": 4,
      "seriesInfo": [{
         "seasonNum": 1,
         "numEpisodes": 2,
         "episodes": [{
            "episodeID": 20,
            "episodeName": "Season 1 episode 1",
            "lengthMin": 85,
            "minWatched": 85,
            "date": "2022-04-18"
         },
         {
            "episodeID": 30,
            "lengthMin": 60,
            "episodeName": "Season 1 episode 2",
            "minWatched": 60,
            "date": "2022 - 04 - 18 "
         }]
      },
      {
         "seasonNum": 2,
         "numEpisodes": 2,
         "episodes": [{
            "episodeID": 40,
            "episodeName": "Season 2 episode 1",
            "lengthMin": 50,
            "minWatched": 50,
            "date": "2022-04-25"
         },
         {
            "episodeID": 50,
            "episodeName": "Season 2 episode 2",
            "lengthMin": 45,
            "minWatched": 30,
            "date": "2022-04-27"
         }]
      },
      {
         "seasonNum": 3,
         "numEpisodes": 2,
         "episodes": [{
            "episodeID": 60,
            "episodeName": "Season 3 episode 1",
            "lengthMin": 50,
            "minWatched": 50,
            "date": "2022-04-25"
         },
         {
            "episodeID": 70,
            "episodeName": "Season 3 episode 2",
            "lengthMin": 45,
            "minWatched": 30,
            "date": "2022 - 04 - 27 "
         }]
      }]
   },
   {
      "showName": "Bienvenu",
      "showId": 15,
      "showtype": "tvseries",
      "genres": ["comedy", "french"],
      "numSeasons": 2,
      "seriesInfo": [{
         "seasonNum": 1,
         "numEpisodes": 2,
         "episodes": [{
            "episodeID": 20,
            "episodeName": "Bonjour",
            "lengthMin": 45,
            "minWatched": 45,
            "date": "2022-03-07"
         },
         {
            "episodeID": 30,
            "episodeName": "Merci",
            "lengthMin": 42,
            "minWatched": 42,
            "date": "2022-03-08"
         }]
      }]
   }]
});

Upload of table rows

The Upload Data button on the Table Details page allows bulk uploading of data from a local file into the table through the browser.

The upload feature is intended for loading fewer than a few thousand rows. It is well suited to performing a proof of concept (POC) or basic testing of the service, and it is a convenient way to populate a small table. If you want to write tens of thousands of rows, then for performance reasons you are better off using the Oracle NoSQL Database Migrator or writing your own program with one of the NoSQL SDKs. If, however, you want to quickly insert a few hundred or a few thousand rows, this upload method is an expeditious approach.

The file to be uploaded must contain a series of JSON objects. The objects can be expressed as comma-separated items of a single array or as a sequence of simple objects bounded by curly braces, with no syntactic delimiters between them. The contents of each object must be correctly formatted JSON and must conform to the schema of the table to which they will be uploaded.

Example: A table is created using the following DDL statement
CREATE TABLE Simple ( id integer, val string, PRIMARY KEY ( id ) )
The following example illustrates using the array format for the file content.

[
  {
    "id": 0,
    "val": "0"
  },
  {
    "id": 1,
    "val": "2"
  }, ...
]
The following example illustrates using simple objects for the file content.

{
  "id": 0,
  "val": "0"
}
{
  "id": 1,
  "val": "2"
}, ...
  • If a column value is not required by the table's schema, then the corresponding JSON property may be left out.
  • If a column value is GENERATED ALWAYS, then the corresponding JSON property must be left out.
  • If a JSON object contains properties with names that do not match any column names, those properties are ignored. These rules are illustrated in the sketch below.
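
As an illustration of these rules, consider a hypothetical table created with the following DDL statement, where id is a GENERATED ALWAYS identity column:

CREATE TABLE AutoSimple (id INTEGER GENERATED ALWAYS AS IDENTITY, val STRING, note STRING, PRIMARY KEY (id))

A sketch of a valid upload file for this table follows. The id property is left out of every object because that column is GENERATED ALWAYS, the nullable note property is omitted from the second object (so it is set to NULL), and the extra property in the second object does not match any column name and is therefore ignored.

{
  "val": "first row",
  "note": "optional value"
}
{
  "val": "second row",
  "extra": "ignored"
}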

To use the upload feature, click the Upload Data button and select the file to be uploaded. The upload begins immediately, and progress will be shown on the page. Upon successful completion, the total number of rows inserted will be shown. You can interrupt the upload by clicking the Stop Uploading button. The number of rows that were successfully committed to the database will be shown.

If an error in the input file is detected, then uploading will stop and an error message with an approximate line number will be shown. Input errors might be caused by incorrect JSON syntax or schema nonconformance. Errors can also occur during requests to the service; such errors also stop the uploading and display a message.

Consider a sample table with the following schema:
CREATE TABLE IF NOT EXISTS NDCStable (id INTEGER, name STRING, email STRING, age INTEGER, income DOUBLE, PRIMARY KEY(SHARD(id)))
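For reference, a well-formed row in the upload file for this table is a JSON object such as the following (taken from the first scenario below):

{"id":1,"name":"Jane Smith","email":"iamjane@somemail.co.us","age":30,"income":30000.0}
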
The following table lists a few common errors in the input file and the corresponding error handling during data upload.

Table - Error Handling During Upload

For each case below, the issue found in the input file is shown first, followed by how the upload handles it.

Non-JSON text in one of the rows
{"id":1,"name":"Jane Smith","email":"iamjane@somemail.co.us","age":30,"income":30000.0}
missed a line
{"id":2,"name":"Adam Smith","email":"adam.smith@mymail.com","age":25,"income":25000.0}

The system ignores the text and continues the upload. A success message is displayed after all the rows are successfully loaded into the table.

In this example, the text "missed a line" is skipped and the two rows with id 1 and 2 are successfully loaded.

Duplicate primary key entries
{"id":2,"name":"Adam Smith","email":"adam.smith@mymail.com","age":25,"income":25000.0}
{"id":2,"name":"Jennifer Smith","email":"jenny1_smith@mymail.com","age":35,"income":35000.0}
{"id":5,"name":"Noelle Smith","email":"noel21@somemail.co.us","age":40,"income":40000.0}

The system skips the duplicated row entry and continues the upload. A success message is displayed after successfully loading all the rows.

In this example, the first row with id 2 is loaded into the table. The second row containing the duplicated id value 2 is skipped. All other rows are successfully uploaded.

Incorrect JSON format
{"id":7,"name":"Tom Smith","email":"reachtom12@mymail.com","age":65,"income":65000.0}
{"id":8,"name":"Jack Smith","email":"iamjacksmi@somemail.co.us","age":60,"income":30000.0
{"id":9,"fame":"Jane Smith""email":"iamjanesmi@somemail.co.us","age":56,"income":35000.0}

The system stops the upload when it detects the erroneous row. An error message is displayed indicating an unexpected end of file and the approximate line number.

In this example, there are two JSON format errors: the row with id 8 is missing a closing brace, and the row with id 9 is missing a comma between two JSON fields. Neither entry is valid JSON, so the upload fails. Only the row with id 7 is successfully loaded into the table.

Note:

The upload operation stops upon detecting the first error or inconsistency in the input file.
Primary key missing in a row
{"id":2,"name":"Adam Smith","email":"adam.smith@mymail.com","age":25,"income":25000.0}
{"name":"Jennifer Smith","email":"jenny1_smith@mymail.com","age":35,"income":35000.0}
{"id":5,"name":"Noelle Smith","email":"noel21@somemail.co.us","age":40,"income":40000.0}

The system stops the upload when it detects the erroneous row. An error message is displayed indicating that the primary key must have a value.

In this example, the second row does not include the primary key field, resulting in an error. Only the row with id 2 is successfully loaded into the table.

Schema non-conformance - missing, incorrect, or additional fields
{"id":6,"name":"Yera Smith","email":"ys1982@mymail.com","age":55,"income":55000.0}
{"id":7,"fname":"Tom Smith","email":"reachtom12@mymail.com","age":65,"income":65000.0}
{"id":8,"email":"iamjacksmi@somemail.co.us","age":60,"income":30000.0}

If fields other than the primary key are missing in the input file, the data is successfully uploaded. A success message is displayed. The missing fields are populated with a null value.

In this example, the name field is misspelled (as fname) in the row with id 7, and the name field is missing entirely in the row with id 8. In both cases, all the rows are successfully loaded into the table, and the name column is set to a null value in those rows.

If the upload is stopped in the middle for any reason, you can do one of the following:
  • If there are no columns with generated key values (that is, if the keys are entirely dictated by the JSON file), then you can simply start over with the same file. The already-written rows will be written again.
  • If there are generated key values, then starting over would write new records instead of overwriting existing records. The easiest path would be to drop the table and create it again.
  • Alternatively, you could remove all records from the table by executing the statement DELETE FROM tablename in the Explore Data form, as shown in the example below.
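
For example, with the NDCStable sample table shown earlier, you could clear the table before restarting the upload by executing the following statement in the Explore Data form (substitute your own table name):

DELETE FROM NDCStable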

If the provisioned write limit is exceeded during the upload process, a transient message indicating this is displayed, and the upload is slowed down to avoid exceeding the limit again.