Extension Project

Updated: December 28, 2022

    The following article explains how to bootstrap and deploy your DevOps extension project. It showcases the required CloudBlue Connect Command Line Interface (CLI) commands and demonstrates how to successfully configure your extension. In addition, it outlines all tools and programs that are required to start working with your extension project.

    Core EaaS Solution

    Make sure to check out the Core EaaS repository, which provides essential building blocks for developing your extensions. Its documentation also showcases how to work with different extension types and provides examples of how to create various applications for Connect. Access this documentation via the following link: connect-eaas-core.readthedocs.io


    In order to bootstrap a new extension project, make sure to install the following tools and programs first:

    • Docker
      Install Docker to run Python (version 3.8 or later is recommended) and your extension projects.
    • Poetry
      Install the Poetry package manager and get familiar with its documentation.
    • Connect CLI
      Connect CLI is an essential tool that allows you to bootstrap, test, and validate your extension project.
    • Your IDE app
      Make sure to get your favorite Integrated Development Environment (IDE) up and running for the following operations.

    In addition, it is recommended to get familiar with GitHub Actions and to streamline your test operations with pytest.

    Getting Started

    Before starting to work with your extension, prepare your account and define a new extension object on Connect. Specifically, define a new extension object and obtain the required environment identifier via the DevOps module. In addition, use the following instructions to learn how to add your Connect account via the CLI tool.

    Add Your Account

    To start working with the Connect CLI tool, make sure that the latest version of this tool is installed. Thereafter, add your account by generating an API token via the Integrations module on Connect. Select the custom token type and provide required module permissions for your token. Note that at least DevOps module permissions are required.

    Once your API token is created, provide the token value to the CLI tool by using the following command:

    ccli account add "<your copied token>"

    In case more information for working with your account is required, refer to the CLI core functionalities.

    Define Your Extension

    Access the DevOps module to define your extension object on the CloudBlue Connect platform.

    Click the Add Extension button to start creating a new extension object.

    It is recommended to deploy your environment locally. Therefore, copy the Environment ID from the Local Access widget. In case your extension will be deployed in the cloud only, this identifier is not required.

    For more detailed instructions on how to add and manage an extension, refer to the DevOps User Interface page.

    Bootstrap Your Extension

    Once your required account is activated and your extension object is created on the Connect platform, you can bootstrap your extension project. Use the following Connect CLI command to start your project configuration:

    ccli project extension bootstrap

    The CLI tool will then prompt you to configure your extension project. Get familiar with the provided introduction and click Next to continue.

    Follow the guidelines below to successfully complete your bootstrap operation:

    Project Settings

    Specify your general extension project details as follows:

    • Project name: Enter your project name in this field.
    • Project root: Specify a folder name for the root folder of the Extension module (e.g., my_root_folder).
    • Project description: Provide a brief description for your project.
    • Project version: Specify your extension project version.
    • Project author: Enter an author name for your project.
    • Project license: Your extension should be compatible with one of the provided software licenses (Apache Software License 2.0, MIT, or BSD). Otherwise, the Connect platform will not work with your project.
    • Project package: Select a package name for your extension. It must be a valid Python identifier (e.g., connect_ext).
    • Project asyncio: Enable or disable the asynchronous library for your project. Note that you can use either the asynchronous library or the synchronous one, but not both. Your project will not be deployed in case it uses both libraries.
    • Project CI: Enable or disable GitHub actions for your extension project.

    Proceed with your extension project configuration as described below.

    API Key, Environment, and Hostname Configuration

    Provide your API key, environment identifier, and hostname configuration via the following steps:

    • Config API key: If a local environment is required, enter your EaaS API key that is generated via the Integrations module on the Connect platform. In case the cloud environment is selected, the system automatically generates and injects a token into your required environment; therefore, leave this field empty.
    • Config Environment ID: Fill out this field in case a local environment is required. Otherwise, leave this field empty. Access your Environment ID via the Local Access section of the Extension Details screen.
    • Config API hostname: Specify your API hostname in this field if a local environment is required. This hostname is displayed within the General section of the Integrations module.

    Extension Settings

    These settings allow you to select your extension type, specify included application types and configure other features and specifications:

    • Extension Type: Select your extension type in this step. Note that different Connect accounts feature different application types. Furthermore, note that some of the following options might not be available for your extension type.
    • Extension Audience: Allows selecting the Connect actors that will be able to use your extension. For example, if your extension works only for Vendor accounts, include Vendors and make sure to exclude Distributors and Resellers. This option is available only in case the Multi-Account extension type is selected.
    • Extension Features: Specify the application types that should be supported by your extension. Note that Web Applications are available only for Hub Integration and Multi-Account extensions.
    • Events Type: Allows specifying the event types that are supported by your extension. This configuration is available only in case the Events Processing application type is selected. Learn more about the differences between various events in the EaaS Core library documentation.
    • Web Application: This option is used to specify whether your extension will work with the Connect UI. The Connect team also provides a UI SDK for streamlining your web interface development. For more information, refer to the UI SDK announcement.
    • Background Events: Specify the background events that are supported by your extension. Such events are available for Events Processing applications only.
    • Interactive Events: Select the interactive events that are supported by your extension. These events are available only for Events Processing applications that are included in Fulfillment Automation extensions.
    • Example Variables: Make sure to generate environment variable examples if your extension will work with such variables.


    Once all aforementioned steps are completed, the CLI tool provides a summary of your extension project.

    Press Create to finalize your extension project configuration. The CLI tool informs you about the successful extension project creation and provides your project location.

    You can now prepare your code editor and start working with your extension. Note that it is highly recommended to use Docker to run your created project.

    Project File Structure

    The following outlines the general project file structure and provides guidelines on how to work with your created extension project.

    Environment Files

    Your required local environments are specified within the .env files. The environment files incorporate the following structure:

    # Required API Key is generated via the Integrations module of the Connect platform 
    export API_KEY="ApiKey XXXXXXX:xxxxxxxxxxxxxxxxxxxxxxxxxxx"
    # Environment ID is displayed under Local Access within the Service Details screen on Connect
    export ENVIRONMENT_ID=xxxxxxx-xxxx-xxxx-xxxx
    # Connect API hostname can be accessed via the Integrations module of the Connect platform
    export SERVER_ADDRESS=api.connect.cloudblue.com

    Once all of your required environment files are in place, customize the Docker Compose file and specify your environments within this file as described below.

    Docker-Compose File

    Access the docker-compose.yml file to define your Docker containers. It is recommended to use an individual container for each of your deployed environments (i.e., one container for the development environment, one for the testing environment, and so on). The following showcases examples of defined Docker containers:

    version: '3'

    services:
      # Define your required environment
      project_dev:
        container_name: project_dev
        # The extension runner image provided by CloudBlue
        image: cloudblueconnect/connect-extension-runner:01.1
        # Use the -d flag to select the debug mode
        command: cextrun -d
        volumes:
          # Declare your volume for persisting data generated by and used by your docker container
          - .:/extension
        env_file:
          # Your specified environment file; it is used only for the local mode
          - project_dev.env

      # Bash containers are especially helpful for local environments, since they
      # allow running your commands and display your command output
      project_bash:
        container_name: project_bash
        image: cloudblueconnect/connect-extension-runner:01.1
        command: /bin/bash
        volumes:
          - .:/extension
        env_file:
          - project_bash.env

      # This container is instantiated for your test operations;
      # note that it is especially helpful in case you use GitHub Actions
      project_test:
        container_name: project_test
        image: cloudblueconnect/connect-extension-runner:01.1
        command: /bin/bash -c "poetry install && pytest --verbose --cov=project --cov-report=html --cov-report=term-missing:skip-covered"
        volumes:
          - .:/extension
        env_file:
          - project_test.env

    PyProject File

    Note that the Poetry package manager is used to install the required dependencies. Therefore, access the pyproject.toml file to familiarize yourself with the provided Poetry dependencies. This file displays the version of each provided dependency, the required version notations, and more.

    The pyproject file is especially helpful for fixing any issues that appear in your project. Furthermore, note that you can use this file to declare your own dependencies if necessary.
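    As an illustration, a dependencies section of pyproject.toml might look like the following sketch. The version constraints shown here are placeholders rather than authoritative values; always check the file generated by the bootstrap for the actual runner version.

    ```toml
    [tool.poetry.dependencies]
    # Python version supported by the runner (placeholder constraint)
    python = ">=3.8,<4"
    # The extension runner dependency added by the bootstrap (version is illustrative)
    connect-extension-runner = "*"

    # A hypothetical dependency of your own, declared manually:
    # requests = "^2.28"
    ```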

    Once your work with all of the aforementioned files is complete, you can launch your containers and your required environments as described below.

    Boot Up Your Containers

    It is required to use Docker to build your extension project container and boot up the required environments. Navigate to your project via cd {your_project_path}. Thereafter, execute the following command:

    docker compose run project_bash

    Note that project in project_bash should be replaced with your specified container name. Once inside the container, connect your extension by using the following command:

    cextrun

    In case a container with the /bin/bash command is deployed, you can also:

    • Install required dependencies with the poetry install command.
    • Start your testing operations by using the pytest command.
    • Initialize selected environment in the debug mode with the cextrun -d command.

    Your specified environment should now be successfully deployed on the CloudBlue Connect platform. Make sure that the Connect interface displays the Connected status next to your selected environment. In case the cloud mode is selected, provide a link to the Git repository with your extension project and ensure that the environment is switched to the Running state.

    Once your environment is initialized successfully, the system establishes an SSL connection. Note that the Connect platform acts both as an endpoint and as a router for your custom events, product actions, configurations, etc. Any supported event or action within the initialized environment will be processed by your extension.

    Project Customization

    The following explains how to change the workflow and capabilities of your extension project. Note that you can keep your environment up and running during the following extension project customization. The Connect platform can apply your changes while your environments are in the Connected or Running states.


    The extension.json file within your project folder includes specified capabilities of your created extension project. This file also incorporates additional properties that are described in the example below:

    {
      "name": "Extension Project", // Your specified project name
      "description": "Mighty Mick's Extension Project.", // Your project description
      "version": "1.0.0", // Change your extension version if necessary
      "readme_url": "https://github.com/fstreeteaas/README.md", // Link to your readme file
      "changelog_url": "https://github.com/fstreeteaas/changelog.md", // Link to your changelog file
      "capabilities": { // This provides a list with your enabled capabilities
        "asset_purchase_request_processing": [
          "draft" // Request states that are supported by your project
        ]
      }
    }
    Customize your extension project by excluding request states that should not be supported and by including the statuses that your project should support. For example, you can exclude the draft status for your purchase fulfillment requests in case your project features real-time validation of draft purchase requests.
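    For instance, a capabilities object that delegates draft handling to real-time validation might look like the following sketch. The capability keys and state names here are illustrative; use the capabilities that your extension actually declares.

    ```json
    "capabilities": {
      "asset_purchase_request_processing": ["pending", "approved", "failed"],
      "asset_purchase_request_validation": ["draft"]
    }
    ```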


    The extension.py file is used to define the workflow of your extension project. The structure of this file and the workflow logic are demonstrated in the following example:

    # Imports the Extension as a Service classes that are required for the following operations
    from connect.eaas.extension import (
        Extension,
        ProcessingResponse,
    )


    class MyProject(Extension):
        # Extension is a general EaaS class that provides an instantiated
        # Open API client, a logger, and a configuration with your environment variables.

        def approve_asset_request(self, request, template_id):
            # This function is used to approve a purchase request;
            # note that only the request parameter is necessary to obtain the required
            # request object
            self.logger.info(f'request_id={request["id"]} - template_id={template_id}')
            # This code uses the instantiated client,
            # specifies the collection, provides the request ID,
            # and defines the required action
            self.client.requests[request['id']]('approve').post(
                {
                    'template_id': template_id,
                },
            )
            # Note that your project can use different logging levels.
            self.logger.warning(f"Approved request {request['id']}")

        def approve_tier_request(self, request, template_id):
            # An analogous helper for tier configuration requests
            self.client.ns('tier').collection('config-requests')[request['id']]('approve').post(
                {
                    'template': {'id': template_id},
                },
            )
            self.logger.warning(f"Approved request {request['id']}")

        def process_tier_config_change_request(self, request):
            # This function is used to process tier configuration change requests
            self.logger.info(
                f"Received event for request {request['id']}, type {request['type']} "
                f"in status {request['status']}"
            )
            if request['status'] == 'pending':
                # Define values for your environment variables (e.g., TR_APPROVE_TEMPL_ID)
                # via the Service Details screen on Connect
                template_id = self.config['TR_APPROVE_TEMPL_ID']
                self.approve_tier_request(request, template_id)
            return ProcessingResponse.done()

    Note that environment variables that are created for the Extension’s config parameter can include the Secured flag. In case the Secured checkbox is checked, the provided value will be encrypted. Therefore, this value can be used only by your extension project.

    The system allows you to instantly apply your variables to your connected environment via the Apply Changes button.

    Furthermore, note that the Extension's logger generates its output locally if the local environment is initialized. In case the cloud environment is selected, the logger uses the logz.io service with a Kibana UI to generate logs, create alerts, etc. Click the Settings button under your Events via the Service Details screen to access the different logging levels.

    Before changing the logic of the provided functions within the extension.py file, it is important to outline the key difference between the two categories of request tasks that your extension project can work with. In addition, it is also important to understand how your extension works with product actions and custom events.

    Background Tasks

    The first category is called background tasks, since such tasks can be processed in the background automatically. Such tasks are used to assign required statuses to certain objects. Note, however, that tasks and business objects on the Connect platform (e.g., requests) are different entities. Failing a background task does not mean that your request or any other object will be automatically assigned to the Failed state. Therefore, background tasks can be relaunched in case of an error. Each failed background task will be automatically restarted up to 10 times.

    Access the extension.py file to customize the background task processing logic as described below:

    def process_asset_resume_request(self, request):
        # This example function processes resume subscription requests
        self.logger.info(
            f"Received event for request {request['id']}, type {request['type']} "
            f"in status {request['status']}"
        )
        if request['status'] == 'pending':
            template_id = self.config['ASSET_REQUEST_APPROVE_TEMPLATE_ID']
            self.approve_asset_request(request, template_id)
        return ProcessingResponse.done()
    # You can add logic for failed tasks. For example, use the following code to provide a message for a failed task:
    # return ProcessingResponse.fail(output="Task is failed!")
    # You can skip a task by using this code: return ProcessingResponse.skip()
    # Note that you can also reschedule a task; there are two ways to perform this operation:
    # 1) return ProcessingResponse.reschedule(countdown=30) — the minimum value for countdown is 30 seconds
    # 2) return ProcessingResponse.slow_process_reschedule(countdown=300) — the minimum value for countdown is 300 seconds

    Interactive Tasks

    The second category of tasks can be referred to as interactive tasks, because such tasks require your attention to process business objects on the CloudBlue Connect platform. Namely, such tasks represent dynamic (real-time) validation of your specified request types. The following example demonstrates a dynamic validation function:

    def validate_asset_purchase_request(self, request): # Real-time validation of purchase requests
        self.logger.info(f"Asset validation with id {request['id']}")
        return ValidationResponse.done(request) # Returns the request object once the task is complete

    Note that there is no need to provide code for validation task failures. Failed validation tasks will not mark the provided parameters as invalid. Furthermore, if the validation process fails (e.g., with a 500 error), the system accepts your request anyway.
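    Instead of failing, a validation handler typically annotates the draft parameters themselves. The sketch below shows the general idea as a standalone helper; the parameter name (email) and the simple check are illustrative assumptions, so adapt them to your product's actual parameter schema.

    ```python
    def annotate_invalid_params(params):
        """Flag draft parameters that fail a simple check.

        Hypothetical rule: a parameter with id 'email' must contain '@'.
        """
        for param in params:
            if param['id'] == 'email' and '@' not in (param.get('value') or ''):
                # Setting value_error marks the parameter as invalid in the draft form
                param['value_error'] = 'Please provide a valid email address.'
        return params
    ```

    Inside validate_asset_purchase_request, you would call such a helper on request['asset']['params'] and then return ValidationResponse.done(request) with the annotated request.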

    Product Actions

    Product actions, as the name implies, are used to interact with your product. Refer to the Connect Open API specification in case more information on product actions is required. The example below showcases a product action function within the extension.py file:

    def execute_product_action(self, request): # The function requires a request object; i.e., the payload of the JWT token
        self.logger.info(f'Product action: {request}')
        return ProductActionResponse.done(
            # A product action response can include an HTTP status; e.g., to perform redirections.
            # Note, however, that this shouldn't be a permanent redirect; otherwise,
            # you will lose control of your product action.
            http_status=302,
            # In addition, the response can include headers
            headers={'Location': 'https://google.com'},
            # The response can also include a body and return HTML code if necessary
        )

    There is also no need to provide code for product action failures, since failing a product action actually means failing your business flow, not failing your specified task.

    Custom Events

    Custom (external) events are similar to product actions. The core difference, however, is that custom events require you to specify your product, select required environment, and define your operation.

    Note that custom events have a special endpoint in the Connect API. Thus, for example, you can configure a webhook for your custom event by using this endpoint. The endpoint URL looks like this: https://{connect.domain}/public/v1/devops/hooks/{SRVC-XXXX-XXXX}/product/custom-event. Here, {connect.domain} represents the actual Connect domain (e.g., api.connect.com) and SRVC-XXXX-XXXX represents your added service identifier (e.g., SRVC-1234-4321).

    An external system should send an authentication header that includes a bearer token. Namely, use the following code to include the required header: headers['Authentication'] = 'Bearer {Your JSON Web Token}'. Obtain the token by creating the following JWT claim:

    {
        "product_id": "<your_product_id>",
        "environment": "(development|test|production)",
        "exp": <expiration (see jwt.io)>
    }

    Sign the claim with your JWT secret that is available within your environment on the Connect platform. Access the Service Details screen to locate your JWT secret.
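    The claim above can be built and signed with the Python standard library alone. The following sketch produces an HS256 token for the custom-event endpoint; the function name and the ttl parameter are illustrative, and in practice you may prefer a dedicated library such as PyJWT.

    ```python
    import base64
    import hashlib
    import hmac
    import json
    import time


    def b64url(data: bytes) -> str:
        # Base64url without padding, as required by the JWT format
        return base64.urlsafe_b64encode(data).rstrip(b'=').decode()


    def make_custom_event_token(product_id: str, environment: str, secret: str, ttl: int = 300) -> str:
        # Assemble the JWT claim described above and sign it with your JWT secret
        header = {'alg': 'HS256', 'typ': 'JWT'}
        payload = {
            'product_id': product_id,
            'environment': environment,  # development, test, or production
            'exp': int(time.time()) + ttl,
        }
        signing_input = f"{b64url(json.dumps(header).encode())}.{b64url(json.dumps(payload).encode())}"
        signature = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
        return f"{signing_input}.{b64url(signature)}"


    # The external system then sends the header described above, for example:
    # headers = {'Authentication': f"Bearer {make_custom_event_token('PRD-000-000', 'test', jwt_secret)}"}
    ```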

    The following example showcases a custom event function within the extension.py file:

    def process_product_custom_event(self, request):
        self.logger.info(f'Custom event: {request}')
        # There are two types of data for custom event functions:
        # 1) a JSON body (i.e., the POST method): request['body'] contains the content of the POST body
        # 2) a formData object: request['form_data'] includes the required formData object
        # Note that your function can access the specified external system headers via request['headers'].
        # You can also access the selected method by using request['method'].
        # Furthermore, access the query string via request['querystring'].
        sample_return_body = {
            "response": "OK"
        }
        return CustomEventResponse.done(body=sample_return_body)