As you learned in Chapter 2, you can build a Spring Cloud Function and deploy it to multiple environments using manual methods such as azure-functions:deploy, the gcloud CLI, the AWS CLI, kubectl, and the Knative CLI. These manual approaches do not scale in an enterprise with many teams, many programmers, and a lot of code; it becomes a management nightmare if every team member builds and deploys code their own way. The process is also highly repeatable, and repeatable processes are prime candidates for automation.
This chapter explores ways to automate the deployment process using popular tooling. You will use GitHub Actions to deploy to AWS Lambda, Google Cloud Functions, and Azure Functions, and you will integrate with ArgoCD to push to a Kubernetes/Knative environment. While you could use GitHub Actions alone for all environments, that would require custom scripting to push to Kubernetes; ArgoCD has built-in hooks to deploy to Kubernetes, which is the preferred way. More information on GitHub Actions can be found at https://github.com/features/actions, and information on ArgoCD can be found at https://argoproj.github.io/cd/.
Let’s dig a bit deeper into GitHub Actions and ArgoCD.
3.1 GitHub Actions
GitHub Actions is a CI/CD platform tightly integrated with GitHub; it lets you create and trigger workflows directly from your GitHub repository. If you are a fan of GitHub, you will really like this feature. GitHub Actions is automatically available in every GitHub repository, so you do not have to use a separate tool like Jenkins or CircleCI. Of course, this also means you are restricted to GitHub as your code repository. Creating a workflow is quite straightforward: navigate to your project, open the Actions tab, and click the New Workflow button, as shown in Figure 3-1.
Upon clicking New Workflow, as shown in Figure 3-2, you will be taken to the workflow “marketplace,” where you can choose from the suggested flows or set up a workflow yourself. Click the Set Up a Workflow Yourself link to start creating a custom workflow.
The Set Up a Workflow Yourself window, as shown in Figure 3-3, will take you to the page where you can write the script to create your workflow.
As you can see in Figure 3-3, the workflow points to a main.yaml file that is created in the .github/workflows directory under your root project folder. You can also create the same file in your IDE, and it will show up under the Actions tab once you commit the code. Listing 3-1 shows the sample code created for AWS Lambda.
Workflow for Payroll Function to be Deployed on AWS Lambda
Let’s dive deep into the components of the YAML file.
This sets up the GitHub Actions workflow and the triggers. Now, every time you commit or push code to this project repository in GitHub, this workflow will execute. Figure 3-5 shows a sample execution of this workflow.
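The overall shape of such a workflow can be sketched as follows. This is a minimal skeleton, not the book's full Listing 3-1; the workflow name, branch, and build command are assumptions you should adapt to your repository:

```yaml
# Minimal sketch of a GitHub Actions workflow for a Spring Cloud Function
# project. Any push to the "master" branch triggers the job.
name: payroll-lambda
on:
  push:
    branches: [ "master" ]
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3       # check out the repository
      - uses: actions/setup-java@v3     # install a JDK for the Maven build
        with:
          distribution: temurin
          java-version: '11'
      - name: Build with Maven
        run: mvn -B package             # deployment steps follow this
```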
3.2 ArgoCD
While GitHub Actions can push code to serverless environments such as Lambda, it lacks a good graphical representation of deployed code when it comes to the Kubernetes environment. Kubernetes is an orchestrator of containers and has a plethora of services that manage deployments. ArgoCD was created specifically for Kubernetes. It is a declarative CD (Continuous Delivery) tool, which means application definitions, configurations, and environments can be version controlled. Like GitHub Actions, ArgoCD uses Git repositories as the single source of truth. This approach is also known as GitOps.
Declarative means configuration is guaranteed by a set of facts instead of by a set of instructions.
Declarative GitOps lets the programmers who created the application control the configuration of the environment in which the application will run. This means the programmers do not have to rely on different teams, such as infrastructure or DevOps teams, to manage the pieces of the application. The programmers are in control, and this is a good thing.
ArgoCD setup is mostly programmatic and relies on the underlying Kubernetes ConfigMaps. This, as you can see, is different from other tools like Jenkins.
The output of this command is the initial password for the admin user. You can now open the ArgoCD UI in a web browser at the service's external IP (20.119.112.240 in this example) and change the password.
Log in to ArgoCD with the following command:
$argocd login 20.119.112.240
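The password-retrieval and login steps above can be sketched as follows. This assumes a standard ArgoCD installation in the argocd namespace; the IP address is the example value used in this chapter:

```shell
# Retrieve the initial admin password from the standard ArgoCD secret
kubectl -n argocd get secret argocd-initial-admin-secret \
  -o jsonpath="{.data.password}" | base64 -d

# Log in with the admin user, then change the password
argocd login 20.119.112.240
argocd account update-password
```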
You have successfully installed and connected to ArgoCD. Figure 3-11 shows a sample of the ArgoCD UI.
Now that you have learned about GitHub Actions and ArgoCD, you can move on to deploying your application and automating the CI/CD process.
3.3 Building a Simple Example with Spring Cloud Function
You will use the same example from Chapter 2, but instead of using the EmployeeConsumer interface, this example uses EmployeeSupplier. In order to do that, you need a prepopulated database. You’ll then query the database using a supplier function. You can find the code at https://github.com/banup-kubeforce/payroll-h2.
Here are the required changes.
Step 1: Create scripts to populate the H2 database when the function starts up. Create a schema that creates the employee table. Store the script in a Schema.sql file, as shown in Listing 3-2.
DROP TABLE IF EXISTS employee;
CREATE TABLE employee (
id INT AUTO_INCREMENT PRIMARY KEY,
name varchar(250),
employeeid varchar(250),
email varchar(250),
salary varchar(250)
);
Listing 3-2
Schema.sql
Populate the database with an INSERT statement. Create a file called Data.sql and store the INSERT statement in it, as shown in Listing 3-3.
INSERT INTO employee (name, employeeid, email, salary) values
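The INSERT statement above is shown only partially here; a complete Data.sql consistent with the schema in Listing 3-2 might look like the following. The sample rows are hypothetical placeholders:

```sql
-- Hypothetical seed rows matching the employee table from Schema.sql
INSERT INTO employee (name, employeeid, email, salary) VALUES
  ('Alice Smith', 'EMP001', 'alice@example.com', '90000'),
  ('Bob Jones',   'EMP002', 'bob@example.com',   '85000');
```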
Add these two files to the resources folder of the main project, as shown in Figure 3-12.
Modify the Application.properties as follows. Change spring.cloud.function.definition from employeeConsumer to employeeSupplier. This will route function calls to employeeSupplier. See Listing 3-4.
Also add spring.jpa.defer-datasource-initialization=true to ensure that the data gets populated on startup.
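Putting the two changes together, the relevant portion of the properties file would look like this. This is a sketch of only the lines that change; the rest of your Chapter 2 configuration stays as it is:

```properties
# Route function invocations to the supplier instead of the consumer
spring.cloud.function.definition=employeeSupplier
# Run the data script after JPA initializes the schema, so seed data loads on startup
spring.jpa.defer-datasource-initialization=true
```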
No other code changes are required. It is important to note that the changes you made only affect the configuration file.
3.4 Setting Up a CI/CD Pipeline to Deploy to a Target Platform
As discussed in the introduction of this chapter, you’ll use two tools for the CI/CD process. GitHub Actions can be used as a tool for both CI (Continuous Integration) and CD (Continuous Deployment), while ArgoCD is a CD tool.
ArgoCD was designed for Kubernetes, so you can leverage this tool exclusively for Knative/Kubernetes deployment. You’ll use GitHub Actions for serverless environments such as Lambda.
Figure 3-13 shows the flow when deploying to serverless environments like AWS Lambda, Google Cloud Functions, and Azure Functions.
The process steps are as follows:
1)
Create code and push/commit code to GitHub.
2)
GitHub Actions senses the event trigger of the commit and starts the build and deploy process to the serverless environments defined in the actions script.
Figure 3-14 shows the flow when deploying Spring Cloud Function to a Kubernetes environment with Knative configured.
The process steps are as follows:
1)
Create code and push/commit code to GitHub.
2)
GitHub Actions senses the event trigger of the commit and starts the build process and deploys the created container image into Docker Hub.
3)
ArgoCD polls for changes in GitHub and triggers a “sync.” It then retrieves the container image from Docker Hub and deploys it to Knative on Kubernetes. See Figure 3-14.
3.5 Deploying to the Target Platform
This section looks at the process of deploying the Spring Cloud Function to the target environments such as AWS Lambda, Google Cloud Functions, Azure Functions, and Knative on Kubernetes.
Here are the prerequisites for all the environments:
GitHub repository with code deployed to GitHub
Access and connection information to the environments
All Chapter 2 prerequisites for each of the environments. Refer to Chapter 2 for each environment
Use the successful deployments of Chapter 2 as a reference for each of the deploys
3.5.1 Deploying to AWS Lambda
Deploying to AWS Lambda requires using a SAM (Serverless Application Model) based GitHub Actions script. This section explains how to use SAM and GitHub Actions. There is no additional coding required.
Prerequisites:
AWS account
AWS Lambda Function subscription
S3 bucket to store the code build
AWS CLI (optional) to verify deployments through the CLI
Step 1: Push the code to GitHub. You can clone the code from https://github.com/banup-kubeforce/payroll-aws-h2, modify it to your specs, and deploy it to the repository of your choice.
Step 2: Implement GitHub Actions with AWS SAM. AWS SAM (Serverless Application Model) is a framework for building serverless applications. More information can be found at https://aws.amazon.com/serverless/sam/.
AWS has a sample SAM-based actions script that is available in the GitHub marketplace that you can leverage. This script will execute the SAM commands.
The action code in Listing 3-5 can be created on the GitHub Actions dashboard or in your IDE.
Figure 3-16 shows the place to store configuration secrets for GitHub Actions. It is under the Settings tab.
Step 3: Execute GitHub Actions. Once the workflow is configured, the actions can be triggered from the GitHub console or through a code commit, as defined by the following code in the sam-pipeline.yaml file.
on:
push:
branches: [ "master" ]
GitHub Actions execute the steps outlined in the YAML file and deploy the function to AWS Lambda. Figure 3-17 shows a successful run.
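For reference, a minimal sam-pipeline.yaml consistent with the steps described might look like the following. This is a sketch, not the book's exact Listing 3-5; the stack name, region, and secret names are assumptions:

```yaml
name: sam-pipeline
on:
  push:
    branches: [ "master" ]
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: aws-actions/setup-sam@v2            # install the SAM CLI
      - uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Build with SAM
        run: sam build
      - name: Deploy with SAM
        run: >
          sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
          --stack-name payroll --s3-bucket ${{ secrets.SAM_S3_BUCKET }}
          --capabilities CAPABILITY_IAM --region us-east-1
```

The S3 bucket receives the packaged build artifacts, which is why it appears in the prerequisites.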
Step 4: Verify that the function is up and running in AWS Lambda.
Since the payroll-aws-h2 application exposes EmployeeSupplier, you will do a simple GET against the function to see if you get the result of data that has been inserted into the database on Spring Cloud Function startup.
In Figure 3-19, you can see that the test was successful and see a JSON result of what is in the database.
3.6 Deploying to GCP Cloud Functions
Deploying to GCP Cloud Functions using GitHub Actions is a bit intrusive, as you have to add a MANIFEST.MF file to the resources folder. See the code in GitHub.
Gcloud CLI is optional if you are just using the GitHub Actions dashboard
Step 1: Spring Cloud Function code in GitHub. Push your code to GitHub. If you have cloned https://github.com/banup-kubeforce/payroll-gcp-h2.git, you have everything you need to push the code to your repository.
Step 2: Set up Cloud Functions actions.
Set up GitHub actions to run the Cloud Functions command. You have two choices:
Workflow for Payroll Function to be Deployed on GCP Cloud Functions
Note that you will have to store your GCP_CREDENTIALS in the GitHub Secrets dashboard.
As in the previous example with AWS Lambda, note that the steps to check out, set up, and build Maven are the same. For the authentication and deployment, you use the Google Cloud CLI. The Set up Cloud SDK task will download and set up the Google CLI. You can use the same command line script that you used when you deployed from a laptop in Chapter 2.
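The authentication and deployment steps just described can be sketched as follows. The function name, runtime, and source path are assumptions; adapt them to the values you used in Chapter 2:

```yaml
name: gcp-pipeline
on:
  push:
    branches: [ "master" ]
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-java@v3
        with:
          distribution: temurin
          java-version: '11'
      - name: Build with Maven
        run: mvn -B package
      - uses: google-github-actions/auth@v1        # authenticate to GCP
        with:
          credentials_json: ${{ secrets.GCP_CREDENTIALS }}
      - uses: google-github-actions/setup-gcloud@v1 # Set up Cloud SDK
      - name: Deploy to Cloud Functions
        run: >
          gcloud functions deploy payroll-gcp
          --entry-point org.springframework.cloud.function.adapter.gcp.GcfJarLauncher
          --runtime java11 --trigger-http --source target/deploy
```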
Step 3: Commit and push code to trigger the GitHub Actions. This trigger is defined in the actions code. In this example, any push or commit to the “master” branch will trigger the GitHub Actions.
on:
push:
branches: [ "master" ]
This can be done on the GitHub Actions website or in the IDE. You can go to GitHub and commit a change by doing a simple modification at the “master” branch. This will start the GitHub Action flow.
Once the actions successfully complete the job, you can go to the Google Cloud Functions dashboard and test the function. Again, you execute a simple GET against the EmployeeSupplier function.
Step 4: Test the function.
Before you test the function, ensure that you pick the function to be invoked from an unauthenticated device such as your laptop. Once you’re done testing, remove the privilege to avoid unnecessary invocations.
You can go to the console of your function in the Google Cloud Functions dashboard and execute the test. You do not have to provide any input; simply click the Test the Function button to execute the test. You will see the output of EmployeeSupplier in the Output section, as shown in Figure 3-23.
3.7 Deploying to Azure Functions
Spring Cloud Function on Azure Functions requires a bit of tweaking, as you learned in Chapter 2, because the configuration is not externalized as it is with AWS Lambda or GCP Cloud Functions. This does not mean that you cannot deploy easily, but you have to understand how Azure Functions interprets and executes Spring Cloud Function code. See Chapter 2 for discussion of this issue, and make sure that you execute and test locally before pushing to the Azure cloud.
Step 4: Testing. You will use an external testing tool to see if the deployed functions work. The tool you use here is Postman.
You can simply use a GET operation to test, as shown in Figure 3-29.
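If you prefer the command line to Postman, the same GET can be issued with curl. The function app name and route here are hypothetical; substitute the values from your deployment:

```shell
# Hypothetical Azure Functions endpoint; replace with your function app's URL
curl https://payroll-azure.azurewebsites.net/api/employeeSupplier
```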
This completes the deployment of Spring Cloud Function on Azure Functions using GitHub Actions.
3.8 Deploying to Knative on Kubernetes
The CI/CD process for deploying Spring Cloud Function on Knative is similar for every Kubernetes platform; the only change is the cluster name. This section uses ArgoCD (http://argoproj.github.io) for CD, even though you can achieve the same result with GitHub Actions. I found GitHub Actions a bit code-intensive for this purpose, and I wanted to separate the CD process and have a good visual tool that shows the deployment. ArgoCD provides a good visual interface.
To have a common repository for all the cloud environments, this example uses Docker Hub. Docker Hub provides a good interface for managing images, and it is popular with developers. If you use ECR, GCR, or ACR instead, you'll experience vendor lock-in.
The prerequisites for deploying to any Kubernetes platform are the same:
Access to a Kubernetes Cluster with Knative configured
ArgoCD up and running
An app in ArgoCD that is configured to poll the GitHub project
Once you have the prerequisites set up, you can begin configuring an automated CI/CD pipeline. For this example implementation, you’ll use the code from GitHub at https://github.com/banup-kubeforce/payroll-h2.git.
Step 1: Spring Cloud Function code in GitHub. Push your code to GitHub. You can use the code for payroll-h2 in GitHub.
Step 2: Create a GitHub Action. Listing 3-8 shows the code for the action.
Workflow for Payroll Function Image to be Pushed to Docker Hub
This code creates a Docker image and pushes it to Docker Hub. You can store the username and password as secrets on the GitHub site (see Figure 3-31).
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
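A minimal workflow around those secret references might look like the following. This is a sketch rather than the book's exact Listing 3-8; the image tag is a placeholder you must replace with your own Docker Hub account:

```yaml
name: docker-pipeline
on:
  push:
    branches:
      - 'main'
jobs:
  build-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: docker/login-action@v2          # authenticate to Docker Hub
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v4     # build the image and push it
        with:
          context: .
          push: true
          tags: <your-dockerhub-user>/payroll-h2:latest
```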
Step 3: Execute GitHub Actions to build and push the Docker image.
The execution of GitHub Actions can be triggered by a push/commit. The trigger is defined in the GitHub Actions YAML file:
on:
push:
branches:
- 'main'
Step 4: Configure ArgoCD.
Follow the steps outlined in the introduction of this chapter for ArgoCD. You need to connect to your cluster on your local machine before executing this command; see Listing 3-9.
ArgoCD Script to Create a Project and Point to the payroll-h2 Repo
This will create app payroll-h2 in ArgoCD, as shown in Figure 3-33.
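The app-creation command referenced in Listing 3-9 can be sketched as follows. The path and namespace are assumptions; the --path directory must contain your Knative Service manifests, and --sync-policy automated enables the polling behavior described in the next step:

```shell
# Sketch: create an ArgoCD app that watches the payroll-h2 repo
argocd app create payroll-h2 \
  --repo https://github.com/banup-kubeforce/payroll-h2.git \
  --path manifests \
  --dest-server https://kubernetes.default.svc \
  --dest-namespace default \
  --sync-policy automated
```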
Step 5: Sync the project in Argo CD.
Now that you have created the app and it is pointing to the GitHub repository, make sure you have a connection to the repo, as shown in Figure 3-34. I connected to the repo using HTTPS. This will allow the app to poll for changes and trigger the flow to push the Docker image to the specified Kubernetes environment.
You can also run a deployment manually by clicking SYNC, as shown in Figure 3-35.
Figure 3-36 shows a successful sync process in ArgoCD.
Step 6: Check if the function has been deployed. Navigate to the Kubernetes dashboard on the Azure Portal and verify that the service has been deployed. See Figure 3-37.
Step 7: Testing. The best way to get the URL to test is to connect to the cluster via the command line and get the URL, as explained in Chapter 2.
Run $kn service list to get the URL for testing, as shown in Figure 3-38.
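A quick smoke test from the command line might look like this. The service URL shown is hypothetical; use the URL printed in the kn output:

```shell
# List Knative services and note the URL column
kn service list

# Hypothetical URL from the listing; replace with your own
curl http://payroll-h2.default.example.com
```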
This completes the successful deployment using GitHub Actions and ArgoCD.
3.9 Summary
In this chapter, you learned how to set up some CI/CD tools to create an automated deployment for your Spring Cloud Function.
You learned how to trigger the deployment of functions on Lambda, Google Cloud Functions, and Azure Functions.
You also learned that you can combine GitHub Actions builds of Docker images stored in Docker Hub with ArgoCD to deploy the image to any Kubernetes cluster that is running Knative.
If you want to achieve “write-once deploy-anywhere,” you have to look at using Kubernetes and Knative. Spring Cloud Function is really a portable function.