The following article provides instructions on how to bootstrap and deploy your DevOps extension project. It showcases the required CloudBlue Connect Command Line Interface (CLI) commands and demonstrates how to configure your extension. In addition, this article outlines all tools and programs that are required to start working with your extension project.
In order to bootstrap a new extension project, make sure to install the following tools and programs first:
In addition, it is recommended to get familiar with GitHub Actions and streamline your test operations with pytest.
Before starting to work with your extension, it is required to prepare your account and define a new extension object via the DevOps module on Connect; this provides the environment identifier that your project requires. Furthermore, use the following instructions to learn how to add your Connect account via the CLI tool.
To start working with the Connect CLI tool, make sure that the latest version of this tool is installed. Thereafter, add your account by generating an API token via the Integrations module on Connect. Select the custom token type and provide the required module permissions for your token; note that at least the DevOps module permissions are required.
Once your API token is created, provide the token value to the CLI tool by using the following command:
ccli account add "<your copied token>"
For more information on working with your account, refer to the CLI core functionalities.
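For instance, assuming a standard CLI installation, you can list the accounts that were added to the tool and switch the active one; the account identifier below is a hypothetical placeholder:
ccli account list
ccli account activate VA-000-000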
Access the DevOps module to define your extension object on the CloudBlue Connect platform.
Click the Add Extension button to start creating a new extension object.
It is recommended to deploy your environment locally. To do so, copy the Environment ID within the Local Access widget. In case your extension will be deployed entirely in the cloud, this identifier is not required.
For more detailed instructions on how to add and manage an extension, refer to the DevOps User Interface page.
Once your account is activated and your extension object is created on the Connect platform, you can bootstrap your extension project. Use the following Connect CLI command to start your project configuration:
ccli project extension bootstrap
The CLI tool will then prompt you to configure your extension project. Get familiar with the provided introduction and click Next to continue.
Follow the guidelines below to successfully complete your bootstrap operation:
Specify your general extension project details as follows:
Proceed with your extension project configuration as described below.
Provide your API key, environment identifier, and hostname configuration via the following steps:
These settings allow you to select your extension type, specify the included application types, and configure other features and specifications:
Once all aforementioned steps are completed, the CLI tool provides a summary of your extension project:
Press Create to finalize your extension project configuration. The CLI tool confirms that your extension project has been created successfully and displays the location of your project files.
You can now prepare your code editor and start working with your extension. Note that it is highly recommended to use Docker to run your created project.
The following outlines the general project file structure and provides guidelines on how to work with your created extension project.
Your required local environments are specified within the .env files. The environment files incorporate the following structure:
# Required API Key is generated via the Integrations module of the Connect platform
export API_KEY="ApiKey XXXXXXX:xxxxxxxxxxxxxxxxxxxxxxxxxxx"
# Environment ID is displayed under Local Access within the Service Details screen on Connect
export ENVIRONMENT_ID=xxxxxxx-xxxx-xxxx-xxxx
# Connect API hostname can be accessed via the Integrations module of the Connect platform
export SERVER_ADDRESS=api.connect.cloudblue.com
Once all of your required environment files are in place, customize the docker compose file and specify your environments within this file as described below.
Access the docker-compose.yml file to define your docker containers. It is recommended to use individual containers for each of your deployed environments (i.e., one container for the development environment, another for the testing environment, and so on). The following showcases examples of defined docker containers:
version: '3'
services:
  project_dev:
    # Define your required environment
    container_name: project_dev
    # The runner image that executes your extension
    image: cloudblueconnect/connect-extension-runner:01.1
    command: cextrun -d  # Use this command to select the debug mode
    volumes:
      # Declare your volume for persisting data generated by and used by your docker container
      - .:/extension
    env_file:
      # Your specified environment; it is used only for the local mode
      - project_dev.env

  # Bash containers are especially helpful for local environments, since they
  # allow running your commands and display your command output
  project_bash:
    container_name: project_bash
    image: cloudblueconnect/connect-extension-runner:01.1
    command: /bin/bash
    volumes:
      - .:/extension
    env_file:
      - project_bash.env

  # This container is instantiated for your test operations;
  # note that it is especially helpful in case you use GitHub Actions
  project_test:
    container_name: project_test
    image: cloudblueconnect/connect-extension-runner:01.1
    command: /bin/bash -c "poetry install && pytest --verbose --cov=project --cov-report=html --cov-report=term-missing:skip-covered"
    volumes:
      - .:/extension
    env_file:
      - project_test.env
Note that the Poetry package manager is used to install the required dependencies. Access the pyproject.toml file to familiarize yourself with the provided Poetry dependencies. This file displays the version of each provided dependency, the required version notations, and more.
The pyproject file is especially helpful for troubleshooting any issues with your project. Furthermore, note that you can use this file to declare your own dependencies if necessary.
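For illustration, a dependencies section of this file could look as follows. This is a minimal, hypothetical sketch; the actual package names and version notations in your project are generated by the bootstrap command:
[tool.poetry.dependencies]
# The Python version constraint (example value)
python = "^3.8"
# A hypothetical dependency declared for your own workflow logic
requests = "^2.28"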
Once your work with all of the aforementioned files is complete, you can launch your containers and your required environments as described below.
Use Docker to build your extension project container and boot up the required environments. First, switch to your project folder via cd {your_project_path}. Thereafter, execute the following command:
docker compose run project_bash
Note that project in project_bash should be replaced with your specified container name. Once inside the container, connect your extension by using the following command:
cextrun
In case a container with the /bin/bash command is deployed, you can also:
- Install the required dependencies by using the poetry install command.
- Run your tests by using the pytest command.
- Launch your extension in the debug mode by using the cextrun -d command.
Your specified environment should then be successfully deployed on the CloudBlue Connect platform. Make sure that the Connect interface displays the Connected status next to your selected environment. In case the Cloud mode is selected, provide a link to your Git repository with your extension project and ensure that the environment is switched to the Running state.
Once your environment is initialized successfully, the system establishes the SSL connection. Note that the Connect platform acts both as an endpoint and as a router for your custom events, product actions, configurations, etc. Any supported event or action within the initialized environment will be processed by your extension.
The following explains how to change the workflow and capabilities of your extension project. Note that you can keep your environment up and running during the following extension project customization, since the Connect platform can apply your changes while your environments are in the Connected or Running states.
The extension.json file within your project folder includes specified capabilities of your created extension project. This file also incorporates additional properties that are described in the example below:
{
    "name": "Extension Project", // Your specified project name
    "description": "Mighty Mick's Extension Project.", // Your project description
    "version": "1.0.0", // Change your extension version if necessary
    "readme_url": "https://github.com/fstreeteaas/README.md", // Link to your readme file
    "changelog_url": "https://github.com/fstreeteaas/changelog.md", // Link to your changelog file
    "capabilities": { // This object lists your enabled capabilities
        "asset_purchase_request_processing": [ // Request states that are supported by your project
            "draft",
            "tiers_setup",
            "pending",
            "inquiring",
            "approved",
            "failed"
        ]
    }
}
Customize your extension project by excluding request states that should not be supported and by including statuses that your project should support. For example, you can exclude the draft status for your purchase fulfillment requests in case your project features real-time validation of draft purchase requests, as shown in the sketch below.
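The following hypothetical excerpt assumes that the draft state was removed from the capabilities example above, so that draft purchase requests are handled by real-time validation only:
"capabilities": {
    "asset_purchase_request_processing": [
        "tiers_setup",
        "pending",
        "inquiring",
        "approved",
        "failed"
    ]
}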
The extension.py file is used to define the workflow of your extension project. The structure of this file and the workflow logic are demonstrated in the following example:
# Imports the Extension as a Service classes that are required for the following operations
from connect.eaas.extension import (
    Extension,
    ProcessingResponse,
    ValidationResponse,
    ProductActionResponse,
    CustomEventResponse,
)


class MyProject(Extension):
    # Extension is a general EaaS class that provides an instantiated
    # Open API client, a logger, and a configuration with your environment variables.

    def approve_asset_request(self, request, template_id):
        # This function is used to approve purchase requests;
        # note that only the request parameter is necessary to obtain the required
        # request object
        self.logger.info(f'request_id={request["id"]} - template_id={template_id}')
        # This code instantiates the client, specifies the collection operation,
        # provides the request ID, and defines the required action
        self.client.requests[request['id']]('approve').post(
            {
                'template_id': template_id,
            }
        )
        self.logger.warning(f"Approved request {request['id']}")
        # Note that your project can use different logging levels.

    def process_tier_config_change_request(self, request):
        # This function is used to process tier configuration change requests
        self.logger.info(
            f"Received event for request {request['id']}, type {request['type']} "
            f"in status {request['status']}"
        )
        if request['status'] == 'pending':
            template_id = self.config['TR_APPROVE_TEMPL_ID']
            # Define values for your environment variables (e.g., TR_APPROVE_TEMPL_ID)
            # via the Service Details screen on Connect.
            # approve_tier_request is assumed to be defined analogously to approve_asset_request.
            self.approve_tier_request(request, template_id)
        return ProcessingResponse.done()
Note that environment variables that are created for the Extension's config parameter can include the Secured flag. In case the Secured checkbox is checked, the provided value will be encrypted; consequently, this value can be used only by your extension project.
The system allows you to instantly apply your variables to your connected environment via the Apply Changes button.
Furthermore, note that the Extension's logger generates the required output locally if the local environment is initialized. In case the Cloud environment is selected, the logger uses the logz.io service with a Kibana UI to generate logs, create alerts, etc. Click the Settings button under your Events within the Service Details screen to access different logging levels.
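The hypothetical handler below sketches how the different logging levels can be used within your extension code; the suspend-request context and all messages are illustrative only:
def process_asset_suspend_request(self, request):
    # Hypothetical handler that illustrates the available logging levels
    self.logger.debug(f'Raw event payload: {request}')  # verbose output for local troubleshooting
    self.logger.info(f"Processing suspend request {request['id']}")  # normal operational message
    self.logger.warning('Approval template is not configured; using the default one')
    self.logger.error('The request cannot be processed in its current status')
    return ProcessingResponse.done()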
Before changing the logic of the provided functions within the extension.py file, it is important to outline the key difference between the two categories of request tasks that your extension project can work with. In addition, it is also important to understand how your extension works with product actions and custom events.
The first category is called background tasks, since such tasks can be processed in the background automatically. Such tasks are used to assign required statuses to certain objects. Note, however, that tasks and business objects on the Connect platform (e.g., requests) are different entities: failing a background task does not mean that your request or any other object will be automatically assigned to the Failed state. Therefore, background tasks can be safely relaunched in case of an error. Each failed background task will be automatically restarted up to 10 times.
Access the extension.py file to customize the background task processing logic as described below:
def process_asset_resume_request(self, request):
    # This example function processes resume subscription requests
    self.logger.info(
        f"Received event for request {request['id']}, type {request['type']} "
        f"in status {request['status']}"
    )
    if request['status'] == 'pending':
        template_id = self.config['ASSET_REQUEST_APPROVE_TEMPLATE_ID']
        self.approve_asset_request(request, template_id)
    return ProcessingResponse.done()
    # You can add logic for failed tasks. For example, use the following code to provide a message for a failed task:
    # return ProcessingResponse.fail(output='Task is failed!')
    # You can skip a task by using this code: return ProcessingResponse.skip()
    # Note that you can also reschedule a task; there are two ways to perform this operation:
    # 1) return ProcessingResponse.reschedule(countdown=30)  # the minimum value for countdown is 30 seconds
    # 2) return ProcessingResponse.slow_process_reschedule(countdown=300)  # the minimum value for countdown is 300 seconds
The second category of tasks can be referred to as interactive tasks, because such tasks require a real-time response to process business objects on the CloudBlue Connect platform. Namely, such tasks represent dynamic (real-time) validation of your specified request types. The following example demonstrates a dynamic validation function:
def validate_asset_purchase_request(self, request):
    # Real-time validation of purchase requests
    self.logger.info(f"Asset validation with id {request['id']}")
    return ValidationResponse.done(request)  # Returns the request object once the task is complete
Note that there is no need to provide code for validation task failures. Failed validation tasks will not mark the provided parameters as invalid. Furthermore, if the validation process fails (e.g., in case of a 500 error), the system accepts your request either way.
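To make the validation example less trivial, the sketch below adds hypothetical validation logic: it assumes an ordering parameter with the id quantity (not part of the original project) and attaches a value_error message to it before returning the request:
def validate_asset_purchase_request(self, request):
    # Iterate over the ordering parameters of the draft request
    for param in request['asset']['params']:
        # 'quantity' is an assumed example parameter id
        if param['id'] == 'quantity' and not str(param.get('value', '')).isdigit():
            # Attaching a value_error marks this parameter as invalid in real time
            param['value_error'] = 'The quantity must be a positive integer.'
    return ValidationResponse.done(request)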
Product actions, as the name implies, are used to interact with your product. Refer to the Connect Open API specification in case more information on product actions is required. The example below showcases a product action function within the extension.py file:
def execute_product_action(self, request):
    # The function requires a request object; i.e., the payload of the JWT token
    self.logger.info(f'Product action: {request}')
    return ProductActionResponse.done(
        # The response for a product action can include an HTTP status; e.g., to perform redirections.
        # Note, however, that this shouldn't be a permanent redirect; otherwise, you will lose control of your product action.
        http_status=302,
        headers={'Location': 'https://google.com'},  # In addition, the response can include headers
        # The response can also include a body and return HTML code if necessary
    )
There is also no need to provide code for product action failures, since failing a product action means failing your business flow rather than failing your specified task.
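As an illustration, the following hypothetical variant responds with an HTML body instead of performing a redirect; the status, header, and markup values are illustrative only:
def execute_product_action(self, request):
    self.logger.info(f'Product action: {request}')
    return ProductActionResponse.done(
        http_status=200,  # respond directly instead of redirecting
        headers={'Content-Type': 'text/html'},
        body='<html><body><h1>Action completed</h1></body></html>',
    )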
Custom (external) events are similar to product actions. The core difference, however, is that custom events require you to specify your product, select the required environment, and define your operation.
Note that custom events have a special endpoint in the Connect API. Thus, for example, you can configure a webhook for your custom event by using this endpoint. The endpoint URL looks like this: https://{connect.domain}/public/v1/devops/hooks/{SRVC-XXXX-XXXX}/product/custom-event. Here, {connect.domain} represents the actual Connect domain (e.g., api.connect.cloudblue.com) and SRVC-XXXX-XXXX represents your added service identifier (e.g., SRVC-1234-4321).
An external system should provide an authentication header that includes a bearer token. Namely, use the following code to include the required header: headers['Authentication'] = 'Bearer {Your JSON Web Token}'. Obtain the token by creating the following JWT claim:
{
    "product_id": <your_product_id>,
    "environment": (development|test|production),
    "exp": <expiration (see jwt.io)>
}
Sign the claim with your JWT secret that is available within your environment on the Connect platform. Access the Service Details screen to locate your JWT secret:
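As an illustration, a minimal external-system call could look as follows, assuming the PyJWT and requests packages are installed; the product identifier, service identifier, secret, and payload below are hypothetical placeholders:
import time

import jwt  # PyJWT
import requests

token = jwt.encode(
    {
        'product_id': 'PRD-000-000-000',  # hypothetical product identifier
        'environment': 'development',
        'exp': int(time.time()) + 300,  # the token expires in 5 minutes
    },
    'your JWT secret from the Service Details screen',
    algorithm='HS256',
)
response = requests.post(
    'https://api.connect.cloudblue.com/public/v1/devops/hooks/SRVC-1234-4321/product/custom-event',
    headers={'Authentication': f'Bearer {token}'},
    json={'message': 'ping'},  # hypothetical JSON body available as request['body']
)
print(response.status_code)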
The following example showcases a custom event function within the extension.py file:
def process_product_custom_event(self, request):
    self.logger.info(f'Custom event: {request}')
    sample_return_body = {
        "response": "OK"
    }
    # There are two types of data for custom event functions:
    # 1) a JSON body (i.e., the POST method) and 2) a formData object.
    # The required code for both data types is presented below:
    # 1) request['body'] – contains the content of the POST method body
    # 2) request['form_data'] – includes the required formData object
    # Note that your function can access your specified external system headers via this code: request['headers']
    # You can also access the selected method by using this code: request['method']
    # Furthermore, access the query string via the following code: request['querystring']
    return CustomEventResponse.done(body=sample_return_body)