This is the third article in the series on what makes FlexDeploy a perfect fit for container native technologies:
- FlexDeploy Loves Containers: Build and Deploy Microservices to Kubernetes clusters in the Cloud
- Enrich Oracle Container Native Application Development Platform with FlexDeploy Pipelines
- FlexDeploy Loves Functions: Deploy Docker Containers with Fn Functions
- FlexDeploy Loves Functions: Build Docker Containers with Fn Functions
- FlexDeploy is a Container: Run FlexDeploy as a Docker Container
- FlexDeploy Loves Containers: Build FlexDeploy Plugins with Docker
Want to follow along in FlexDeploy? Check out our JumpStart Demo Lab configured with Oracle E-Business Suite. This is a pre-configured, friction-free environment in the cloud for you to try FlexDeploy with no setup, installation, or configuration.
In this post I am going to show how FlexDeploy can use Fn Project functions to make this world a little better. FlexDeploy is going to deploy Docker containers to a K8s cluster with an Fn function by means of this simple workflow:
In one of the previous posts of this series I showed how we can implement the Deploy step with kubectl, the K8s command-line tool. That solution works fine if we have a preconfigured endpoint capable of executing the deployment step: the endpoint must have kubectl preinstalled and must be preconfigured with K8s cluster credentials (or K8s contexts) for every environment (Dev, Test and Prod) this workflow can deploy to.

The idea of this post is to create a Docker container serving as an endpoint like that. So, I am going to package kubectl and my K8s cluster credentials into a container and use it to deploy stuff to K8s clusters. Having done that, we don't depend on the endpoint anymore; we just need Docker. But in order to feel like a truly free man, I am going to create an Fn function on top of this "deployer" container, so I don't depend on Docker either. The details on how to create the deployer container and a function on top of it are available here.
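To make the idea concrete, here is a minimal sketch of what the logic inside the "deployer" container might look like. The payload format and the deployment/container name `jsfrontend` are assumptions for illustration (the kubectl command is only printed here; the real function, described in the linked post, would execute it with the kubectl binary and kubeconfig baked into the image):

```shell
# Hypothetical sketch of the "deployer" container's logic. The Fn function
# receives a payload of the form "<k8s-context> <image:tag>" and rolls the
# image out to the cluster identified by the context.
deploy() {
  context="$1"  # which K8s cluster (and credentials) to target
  image="$2"    # Docker image to deploy, including the version tag
  # The deployment/container name "jsfrontend" is an assumption for illustration.
  echo kubectl --context "$context" set image deployment/jsfrontend "jsfrontend=$image"
}

deploy context-gcloud1560-dev eugeneflexagon/jsfrontend:1.0.23
```

Because the context is part of the payload, the same container can deploy to any cluster whose credentials were packaged into it.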
So, let's configure FlexDeploy to work with the deployer Fn function. The Deploy step of our workflow is a Shell plugin operation with the following code snippet:
curl $FN_APP_URL/deploy -d "$K8S_CONTEXT $DOCKER_IMAGE:$FD_PROJECT_VERSION"
It invokes a function of an Fn application, passing two parameters: the K8s context name (where to deploy) and the Docker image name including the tag version (what to deploy). The Fn application can be running anywhere you want, on premises or in the cloud. If you prefer, you can run it on top of a K8s cluster in the cloud as described here.
The function call relies on the following properties:
- FN_APP_URL. An environment-instance workflow property referring to the URL of the Fn application, e.g. http://22.214.171.124:80/r/k8sdeployer
- K8S_CONTEXT. An environment-instance workflow property providing the name of the K8s context, which defines the target K8s cluster and the credentials for it, e.g. context-gcloud1560-dev
- DOCKER_IMAGE. A project-scoped workflow property containing the name of the Docker image being deployed. The full image name includes a tag referring to the project version, e.g. eugeneflexagon/jsfrontend:1.0.23
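Plugging the example values above into the Deploy step's snippet, the request sent to the Fn application would expand roughly as follows (the command is built and printed here for illustration rather than executed, since the Fn application URL is environment specific):

```shell
# Example values taken from the workflow properties listed above.
# FD_PROJECT_VERSION is supplied by FlexDeploy at runtime.
FN_APP_URL="http://22.214.171.124:80/r/k8sdeployer"
K8S_CONTEXT="context-gcloud1560-dev"
DOCKER_IMAGE="eugeneflexagon/jsfrontend"
FD_PROJECT_VERSION="1.0.23"

# The Deploy step's curl call expands to:
CMD="curl $FN_APP_URL/deploy -d \"$K8S_CONTEXT $DOCKER_IMAGE:$FD_PROJECT_VERSION\""
echo "$CMD"
```

Note that the whole payload is a single string; the deployer function splits it into the context and the image name on its side.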
Obviously, we don't need any special endpoint for this workflow step; it can easily be executed on localhost.
In this simple example we demonstrated how FlexDeploy can leverage the power of the serverless paradigm implemented by the Fn Project. The deployment workflow is abstracted away from the endpoint infrastructure. The beauty of this solution is that the consumer of the function, the workflow step, just uses a REST API over HTTP to get the application deployed; it does not care how or where that job is done. But the workflow knows for sure that the computing resources executing the Deploy step will be consumed no longer than it takes to get the job done.