Note:
- This tutorial requires access to Oracle Cloud. To sign up for a free account, see Get started with Oracle Cloud Infrastructure Free Tier.
- It uses example values for Oracle Cloud Infrastructure credentials, tenancy, and compartments. When completing your lab, substitute these values with ones specific to your cloud environment.
Create OCI DevOps pipelines for Apache Airflow and deploy it using Helm
Introduction
This is part five of a six-part tutorial series that shows you how to deploy a temporary set of resources on an OKE cluster using Golang microservices that exercise the OCI SDK, OCI CLI, Resource Manager, OCI DevOps, and Helm to deploy and destroy Apache Airflow.
Objective
In this tutorial you will learn how to create OCI DevOps pipelines for Apache Airflow and deploy it using Helm.
Prerequisites
- Completion of the previous tutorial in this learning path, Part 4/6 - Create OCI DevOps pipelines to build and deploy the Golang microservices
Task 1: Create the DevOps artifacts for Apache Airflow
Before creating the DevOps build pipeline, we need to create the artifacts that will be linked to the build results (the Helm package and the container image).
1. Go to the OCI Registry you have created for this tutorial.

2. Go to your DevOps project page, click Artifacts, and then click Add Artifact. Fill in the information as below.

   - Context: This is a Kubernetes manifest to implement an ingress for Airflow.
   - Name: airflow-ingress
   - Type: Kubernetes manifest
   - Artifact source: Inline
   - Paste the content below in the value field.

     apiVersion: networking.k8s.io/v1
     kind: Ingress
     metadata:
       generation: 1
       name: airflow-ingress
       namespace: airflow
     spec:
       ingressClassName: nginx
       rules:
       - host: ${APP_HOST}
         http:
           paths:
           - backend:
               service:
                 name: airflow-web
                 port:
                   number: 8080
             path: /
             pathType: ImplementationSpecific
3. Repeat Step 2 and fill in the information as below.

   - Context: This is a Kubernetes manifest to create the namespace for Airflow.
   - Name: airflow-namespace
   - Type: Kubernetes manifest
   - Artifact source: Inline
   - Paste the content below in the value field:

     apiVersion: v1
     kind: Namespace
     metadata:
       name: airflow
4. Repeat Step 2 and fill in the information as below.

   - Context: This is the Helm Chart registry location.
   - Name: helm-airflow
   - Type: Helm Chart
   - Helm Chart URL: oci://gru.ocir.io/yournamespace/airflow-helm/airflow
   - Version: 8.6.1

   Note: Remember to replace yournamespace with your own OCI Registry namespace. This registry was created in earlier stages of this tutorial.
5. Repeat Step 2 and fill in the information as below.

   - Context: This is the Helm values file for the Airflow deployment.
   - Name: values-airflow-helm
   - Type: General artifact
   - Artifact source: Inline
   - Paste the content below in the value field. This will force the deployment onto the new node pool.

     nodeSelector:
       name: extra_workload
At this point, you should have the following artifacts.
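To see how these artifacts fit together, here is a minimal sketch of the manual Helm commands that are roughly equivalent to what the deployment pipeline in Task 4 will do for you. It assumes Helm 3.8 or later on the bastion host; the release name airflow is an assumption, and yournamespace must be replaced with your own OCI Registry namespace.

    # Hedged sketch only: Task 4's pipeline performs the equivalent of these steps for you.
    # Log in to the OCI Registry with your OCIR user and auth token (the same values stored as OCIRUser/OCIRToken).
    helm registry login gru.ocir.io

    # Recreate the values-airflow-helm artifact locally (forces scheduling onto the extra node pool).
    cat > values-airflow-helm.yaml <<'EOF'
    nodeSelector:
      name: extra_workload
    EOF

    # Install the chart from your OCI Registry into the airflow namespace.
    # The release name "airflow" is an assumption, not something this tutorial mandates.
    helm upgrade --install airflow \
      oci://gru.ocir.io/yournamespace/airflow-helm/airflow \
      --version 8.6.1 \
      --namespace airflow --create-namespace \
      -f values-airflow-helm.yaml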
Task 2: Create the DevOps build pipeline for Airflow
We will not have any code in the repository, just the build_spec.yaml file, because we will use the official Helm chart for Apache Airflow.
1. Go to your OCI DevOps project, click Code Repositories, and then create a repository named airflow-helm.

2. Click Clone and take note of your repository's SSH URL.

3. Go to your bastion host shell terminal, download the build_spec.yaml file, and push it to your own code repository on OCI.

     cd $HOME
     git clone <your ssh code repository>
     cd airflow-helm
     wget https://docs.us.oracle.com/en/learn/resource-manager-airflow-oke-part5/files/build_spec.yaml
     git add .
     git commit -m "new build spec added"
     git push
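For context, the build_spec.yaml you just pushed is what the managed build stage in Task 3 executes. Its exact contents may differ from this sketch, but the Helm work it performs is typically equivalent to the commands below. This assumes Helm 3.8 or later and assumes the chart comes from the Airflow Helm chart repository at https://airflow-helm.github.io/charts, which matches the airflow-helm/airflow path and version 8.6.1 used in this tutorial; the variable names are placeholders for the build pipeline parameters and vault secrets you define in Task 3.

    # Hedged sketch of the Helm steps such a build spec typically runs; your build_spec.yaml may differ.
    # HELM_REPO and HELM_REPO_URL match the pipeline parameters from Task 3; HELM_REPO_USER and
    # USER_AUTH_TOKEN stand in for the credentials resolved from the OCIRUser/OCIRToken vault secrets.
    helm repo add airflow-stable https://airflow-helm.github.io/charts
    helm repo update
    helm pull airflow-stable/airflow --version 8.6.1     # produces airflow-8.6.1.tgz

    echo "$USER_AUTH_TOKEN" | helm registry login "$HELM_REPO" --username "$HELM_REPO_USER" --password-stdin
    helm push airflow-8.6.1.tgz "$HELM_REPO_URL"         # helm appends the chart name under the airflow-helm path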
Task 3: Create build pipeline for airflow-helm
1. Go to your DevOps project, click Build Pipelines, and then click Create build pipeline.

2. Click Add New Stage and select the type Managed Build.

3. Select the Primary Code repository, then select airflow-helm.

4. Gather the vault secrets needed for this Airflow pipeline.

   - On your Oracle Cloud console, click Identity & Security, click Vault, and select your vault.
   - Get the OCIDs for the OCIRUser and OCIRToken secrets you created in the earlier steps of this tutorial.
5. Go to your DevOps project, click Build Pipelines, select your build pipeline for airflow-helm, then click the Parameters tab and add the following parameters:

   Parameter Name           Value
   VAULT_HELM_REPO_USER     the Vault secret OCID for your registry user (OCIRUser)
   VAULT_USER_AUTH_TOKEN    the Vault secret OCID for your registry token (OCIRToken)
   COMPARTMENT_ID           your compartment OCID
   HELM_REPO                gru.ocir.io
   HELM_REPO_URL            oci://gru.ocir.io/your_namespace/airflow-helm
   APP_HOST                 airflow.superocilab.com

   Note: Make sure you set your region correctly in the repository URL. In this tutorial, it is gru.ocir.io.
6. Click the Build pipeline tab, then click Start manual run to manually execute the build pipeline.
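After the build run completes, you can optionally confirm from the bastion host that the chart landed in your OCI Registry. This check is not part of the tutorial's required steps; it assumes Helm 3.8 or later and that you log in with the same OCIR user and auth token referenced by the pipeline parameters (replace yournamespace with your own registry namespace).

    # Optional check: pull the chart back from your OCI Registry.
    helm registry login gru.ocir.io
    helm pull oci://gru.ocir.io/yournamespace/airflow-helm/airflow --version 8.6.1
    ls -l airflow-8.6.1.tgz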
Task 4: Create deployment pipeline for airflow-helm
1. Go to your DevOps project, click Deployment Pipelines, and then create a new pipeline named airflow-helm-deploy.
2. Create a new stage to create the new namespace in OKE: select Apply Manifest to your Kubernetes cluster and use the airflow-namespace artifact.
3. Create a new stage to install Airflow in OKE: select Install Helm chart to your Kubernetes cluster and use the helm-airflow chart artifact together with the values-airflow-helm values artifact created in Task 1.
4. Create a new stage to create the ingress for Airflow in OKE: select Apply Manifest to your Kubernetes cluster and use the airflow-ingress artifact. Since we set up an ingress controller to access our microservices from outside OKE, we also need to set one up for Airflow.
5. Update your build pipeline for Airflow and add a trigger to the deployment. In order to start the deployment automatically after the build pipeline is completed, we need to add a trigger on the airflow-helm build pipeline.
Your build pipeline should now look like this.
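After the next build run completes and the trigger starts this deployment pipeline, you can optionally verify the result from the bastion host, assuming your kubeconfig points at the OKE cluster. The namespace, ingress name, and node label below come from the artifacts created in Task 1; the Airflow pod names themselves depend on the chart.

    # Namespace and ingress created by the airflow-namespace and airflow-ingress manifest artifacts.
    kubectl get namespace airflow
    kubectl -n airflow get ingress airflow-ingress

    # The values-airflow-helm artifact pins the workload to the extra node pool via this label.
    kubectl get nodes -l name=extra_workload
    kubectl -n airflow get pods -o wide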
Task 5: Configure go-microservice parameters
Now we need to configure the go-microservice so it can trigger the recently created build pipeline for Airflow.
1. Go to your DevOps project, click Build Pipelines, and take note of the OCID of the airflow-helm build pipeline.

2. On your Oracle Cloud console, click Developer Services, and under Resource Manager, click Stacks. Select the CLI Extra Nodes stack and take note of its OCID.

3. Set up the variables inside the go-microservice code.

   Note: The go-microservice is a piece of code that interacts with OCI resources. Since we just created the build pipeline for Airflow, we need to make the go-microservice aware of the build pipeline it needs to call. For that, we will make some changes to the configmap values that are injected into the container in OKE.
   - Get the SSH URL of your go-microservice OCI code repository.

   - Open your bastion jump-box shell console and edit the configmap/values.yaml file from the go-microservice code repository.

       cd $HOME
       rm -rf go-microservice
       git clone your_repo_ssh_url
       cd go-microservice
       vi chart-go-microservice/configmap/values.yaml

   - Add the following new variables at the end of the configmap/values.yaml file.

       ENV_RM_STACK_ID: "<paste your stack ocid>"
       ENV_DEVOPS_BUILD_ID: "<paste your build pipeline ocid>"

   - Check the last three lines of the file and make sure the two new variables were added.

       tail -3 chart-go-microservice/configmap/values.yaml

   - Push the changes to the code repository:

       git add .
       git commit -m "added new variables"
       git push
4. Go to your DevOps project, select Build Pipelines, and click go-microservice-pipeline.

5. Click Start manual run to re-execute the build pipeline for the go-microservice, so the changes to the configmap take effect and are injected into the running container.

6. Check the deployment pipeline triggered by this build pipeline.
Your go-microservice is now ready to work!
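Optionally, you can confirm that the new variables actually reached the running container. The namespace and deployment names below are assumptions based on the go-microservice chart from Part 4; adjust them to match your environment.

    # Assumed names: adjust the namespace and deployment to whatever your go-microservice chart created.
    kubectl -n default get configmaps | grep go-microservice
    kubectl -n default exec deploy/go-microservice -- env | grep -E 'ENV_RM_STACK_ID|ENV_DEVOPS_BUILD_ID'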
Next Step
To proceed to the next tutorial in this learning path, click here.
Acknowledgments
- Author - Joao Tarla (Oracle LAD A-Team Solution Engineer)
More Learning Resources
Explore other labs on docs.oracle.com/learn or access more free learning content on the Oracle Learning YouTube channel. Additionally, visit education.oracle.com/learning-explorer to become an Oracle Learning Explorer.
For product documentation, visit Oracle Help Center.
Create OCI DevOps pipelines for Apache Airflow and deploy it using Helm
F79811-01
April 2023
Copyright © 2023, Oracle and/or its affiliates.