6 Rolling Back OSO

This chapter provides information about rolling back Oracle Communications Operations Services Overlay (OSO) deployment to previous releases.

Note:

OSO does not support rollback from 24.3.x to releases earlier than 24.1.x. See Table 6-1 for the supported rollback paths.

6.1 Supported Rollback Paths

The following table lists the supported rollback paths for OSO:

Table 6-1 Supported Rollback Paths

Source Release    Target Release
24.3.1            24.2.0
24.3.1            24.1.x

6.2 Prerequisites

The following are the prerequisites to roll back OSO:
  • Ensure that OSO 24.3.1 is installed on the system and is working properly.
  • Verify that all the pods and services are up and running, as shown in the commands below.
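
For example, you can check the pod and service status with the following commands, where <oso-namespace> is the namespace in which OSO is deployed:

    $ kubectl -n <oso-namespace> get pods
    $ kubectl -n <oso-namespace> get services

All pods must show the Running status with a full READY count (for example, 1/1) before you start the rollback.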

6.3 Rollback Tasks

To roll back from OSO 24.3.1 to a previous release (24.2.0 or 24.1.x):

  1. Check the revision you want to roll back your release to.
    $ helm -n <namespace> history <oso-release-name>

    For example:

    $ helm -n dbtier history oso

    Sample output

    
    REVISION        UPDATED                         STATUS          CHART                   APP VERSION             DESCRIPTION
    1               Mon Mar 25 17:46:51 2024        superseded      prometheus-15.16.1      24.3.1-62-g28668fe Install complete
    2               Mon Mar 25 17:48:01 2024        deployed        prometheus-15.16.1      24.3.1-62-g28668fe Upgrade complete
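    Optionally, before choosing a revision, you can review the configuration recorded for a revision with the helm get values command. For example, for revision 1 of the example release used above:

    $ helm -n dbtier get values oso --revision 1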
  2. Pick the revision number to which you want to roll back your release. In the above example, it is 1. Run the following command to roll back:
    $ helm -n <namespace> rollback <oso-release-name> <oso-revision-number>

    For example:

    $ helm -n dbtier rollback oso 1

    Sample output

    Rollback was a success! Happy Helming!
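    Optionally, if you want the rollback to wait until all rolled-back resources are ready before returning, you can add the --wait flag to the same command. For example:

    $ helm -n dbtier rollback oso 1 --wait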
  3. To verify that the rollback to the previous version was successful, run the following command:
    $ helm -n <namespace> history <oso-release-name>

    For example:

    $ helm -n dbtier history oso

    Sample output

    
    REVISION        UPDATED                         STATUS          CHART                   APP VERSION             DESCRIPTION
    1               Mon Mar 25 17:46:51 2024        superseded      prometheus-15.16.1      24.3.1-62-g28668fe Install complete
    2               Mon Mar 25 17:48:01 2024        superseded      prometheus-15.16.1      24.3.1-62-g28668fe Upgrade complete
    3               Wed Mar 27 18:37:02 2024        deployed        prometheus-15.16.1      24.3.1-62-g28668fe Rollback to 1
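    You can also confirm that the release is now in the deployed state with the helm status command. For example:

    $ helm -n dbtier status oso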
  4. After the Helm rollback, a new Prometheus pod is created and remains in the Pending state. To bring it to the Running state, detach the PVC from the old pod and attach it to the new pod by scaling the deployment down and back up, as shown below:
    $ kubectl get deployment -n <oso-namespace>
    # Find the Prometheus deployment name and substitute it in the commands below.
    # Run both of the following commands back to back, without delay between them.
    $ kubectl -n <oso-namespace> scale deploy <oso-deployment-name> --replicas=0
    $ kubectl -n <oso-namespace> scale deploy <oso-deployment-name> --replicas=1

    For example:

    $ kubectl get deployment -n oso

    $ kubectl -n oso scale deployment oso-prom-svr --replicas=0
    $ kubectl -n oso scale deployment oso-prom-svr --replicas=1

    Sample output

    $ kubectl -n testoso get deployment 
    NAME           READY   UP-TO-DATE   AVAILABLE   AGE
    oso-prom-svr   1/1     1            1           23h 
    $ kubectl -n testoso scale deploy oso-prom-svr --replicas=0
    deployment.apps/oso-prom-svr scaled
    $ kubectl -n testoso scale deploy oso-prom-svr --replicas=1 
    deployment.apps/oso-prom-svr scaled
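
    After the deployment is scaled back up, you can verify that the new Prometheus pod reaches the Running state and that the PVC remains bound. For example, using the namespace from the sample output above:

    $ kubectl -n testoso get pods
    $ kubectl -n testoso get pvc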