Provisioning and deployment data pipeline

Problem description

Together with the implementation of Nailgun Extensions [1], we want to remove all direct calls from Nailgun core to any kind of extension, e.g. to volume_manager [2] or to any other extension invoked via the node_extension_call function [3].

Extensions, however, must retain the ability to change the deployment and provisioning data. This is required, for example, by the new bareon-fuel-extension [4], which will be used to integrate Fuel with Bareon-API [5].

Proposed changes

Whenever deployment or provisioning data is serialized, the resulting data will be passed to all available extensions, each of which may then manipulate it.

The proposal is to create a new Extension attribute called data_pipelines.

data_pipelines is a list of Pipeline classes. Every Pipeline class should implement at least one of the following methods:

  • process_deployment(deployment_data, **kwargs) - executed once the serialization of deployment data occurs. It receives a reference to a dict, which can be modified in place.
  • process_provisioning(provisioning_data, **kwargs) - executed once the serialization of provisioning data occurs. It receives a reference to a dict, which can be modified in place.

Both methods return nothing, and both are executed after Nailgun's own data serialization. The data can therefore still be changed by the user via the Fuel CLI, as has been possible so far.
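The dispatch described above can be sketched as a simple loop in which Nailgun core hands the same dict reference to every pipeline of every extension. The function name run_deployment_pipelines and the registry shape below are illustrative assumptions, not the actual Nailgun API:

```python
class BasePipeline(object):
    """Base class for extension data pipelines (sketch)."""

    @classmethod
    def process_deployment(cls, deployment_data, **kwargs):
        pass  # subclasses mutate deployment_data in place

    @classmethod
    def process_provisioning(cls, provisioning_data, **kwargs):
        pass  # subclasses mutate provisioning_data in place


def run_deployment_pipelines(extensions, deployment_data, **kwargs):
    """Pass freshly serialized deployment data through every pipeline.

    Pipelines mutate the dict in place and return nothing, so the same
    reference is handed to each pipeline in turn.
    """
    for extension in extensions:
        for pipeline in getattr(extension, 'data_pipelines', ()):
            pipeline.process_deployment(deployment_data, **kwargs)
    return deployment_data
```

Because the dict is mutated in place, the order in which extensions are visited determines which pipeline sees (and may overwrite) another pipeline's changes.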

Example implementation:

class ExamplePipeline(BasePipeline):

    @classmethod
    def process_deployment(cls, deployment_data, **kwargs):
        # mutate the serialized deployment data in place
        deployment_data['new_field'] = external_source.get_new_data()

    @classmethod
    def process_provisioning(cls, provisioning_data, **kwargs):
        # mutate the serialized provisioning data in place
        provisioning_data['new_field'] = external_source.get_new_data()


class ExampleExtension(BaseExtension):
    data_pipelines = (ExamplePipeline,)

Web UI



Data model






RPC Protocol


Fuel Client




Fuel Library



Alternatives

Instead of introducing a new Extension attribute with a list of classes:

  • we could just add these two methods to the Extension class:
    • but that would clash with the Expert design pattern [6], which can lead to blurred responsibilities
  • we could implement Pipelines as mixins:
    • but this comes down to the same issue as in the previous option
    • we also want to implement custom ordering of Pipeline class execution in the future
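Keeping pipelines as standalone classes, rather than mixin methods, leaves room for the custom execution ordering mentioned above. One way this could look is a weight-based sort; the weight attribute and helper below are purely hypothetical, since the spec only states that ordering is planned:

```python
class BasePipeline(object):
    # Hypothetical ordering attribute: lower weight runs first.
    weight = 100


class EarlyPipeline(BasePipeline):
    weight = 10


class LatePipeline(BasePipeline):
    weight = 200


def ordered_pipelines(extensions):
    """Collect pipelines from all extensions and sort them by weight."""
    pipelines = [pipeline
                 for extension in extensions
                 for pipeline in getattr(extension, 'data_pipelines', ())]
    return sorted(pipelines, key=lambda pipeline: pipeline.weight)
```

With mixins there would be no per-pipeline object to attach such a weight to, which is why explicit Pipeline classes keep this option open.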

Upgrade impact


Security impact


Notifications impact


End user impact


Performance impact


Deployment impact


Developer impact

Developers are able to change the deployment/provisioning data directly from extensions.

Infrastructure impact


Documentation impact

Pipelines should be described in the Extensions docs. The description should include:

  • A definition of a pipeline
  • A minimal working pipeline (required methods, etc.)



Primary assignee: Sylwester Brzeczkowski <>

Mandatory design review:

Work Items

  • Implement the BasePipeline class, integrate it with the existing BaseExtension class, and add serialization event triggers at the places in Nailgun core where these events occur.
  • Remove all direct calls to extensions from Nailgun core.


Dependencies

  • Nailgun extensions discovery must be done first [1]

Testing, QA


  • Install an extension with a pipeline which changes node volumes during provisioning serialization. Run provisioning and check that the correct data was sent to Astute.
  • Install an extension with a pipeline which adds a new field to the provisioning/deployment data. Download this data using the Fuel CLI, remove that field, upload it back, and run deployment. Check whether the field is present in the message sent to Astute (it shouldn't be).
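The first scenario above can be sketched as an automated check. The helper names (serialize_provisioning, build_astute_message) and the volume layout are invented for illustration; a real test would drive Nailgun itself and inspect the message actually sent to Astute:

```python
class VolumesPipeline(object):
    """Hypothetical pipeline that injects a volume layout."""

    @classmethod
    def process_provisioning(cls, provisioning_data, **kwargs):
        # illustrative volume layout added by the extension
        provisioning_data['volumes'] = [{'name': 'os', 'size': 10000}]


def serialize_provisioning(node):
    # stand-in for Nailgun's real provisioning serializer
    return {'uid': node['uid']}


def build_astute_message(node, pipelines):
    """Serialize a node, then run every pipeline over the result."""
    data = serialize_provisioning(node)
    for pipeline in pipelines:
        pipeline.process_provisioning(data)
    return data


message = build_astute_message({'uid': '1'}, [VolumesPipeline])
```

The check is then that `message` contains the volumes injected by the pipeline in addition to the serializer's own fields.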

Acceptance criteria

  • It is possible to change or add new data to the serialized provisioning/deployment data.
  • The user can change deployment/provisioning data (as has been possible so far) and decide whether or not to keep the changes introduced by pipelines.