Python Functions
SINCE VERSION 9.0
- 1 What are Python Functions?
- 2 Deploy a function
- 3 Passing arguments
- 4 Passing secrets and ENV variables
- 5 Returning values
- 6 Use a custom function name
- 7 Calling Commands and Pipelines from inside a script
- 8 List deployed functions
- 9 Hooks
- 9.1 on_deploy
- 9.2 on_undeploy
- 10 Install Packages
- 10.1 on_requirements
- 11 Best Practices
- 11.1 Development
- 11.2 Testing
- 11.3 Source Code Control
What are Python Functions?
In PIPEFORCE pipelines you can execute Python functions as part of a pipeline execution. This way you can use the full power of this popular scripting language inside your pipelines without having to maintain the Python interpreter setup, container builds and container deployments yourself. After a function has been deployed, it can be executed from inside a pipeline by a single command call:
pipeline:
  - function.run:
      name: "myapp:myscript:myfunction"
      args: {"firstName": "Sabrina", "lastName": "Smith"}
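For illustration, such a deployed function is an ordinary Python function. The following is a minimal sketch, assuming that the keys of args are passed to the function as keyword arguments and that its return value becomes the result of the function.run command; the exact deployment steps and argument contract are described in the sections below:

# Minimal sketch of a function as it could be deployed under
# "myapp:myscript:myfunction". Assumption: the keys of "args" arrive as
# keyword arguments and the return value is handed back to the pipeline.
def myfunction(firstName, lastName):
    # Build a simple result from the passed arguments.
    return {"greeting": "Hello " + firstName + " " + lastName + "!"}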
The Python functions are executed by a backend service inside PIPEFORCE. This approach is also known as Function as a Service (FaaS) or Lambda: you simply send a Python function to the service and receive the calculated result. You do not have to care about the interpreter, image deployment, scalability or any other task on the execution side.
This approach opens up many new possibilities for pipelines and your applications, for example:
Create libraries of functions for your custom needs and re-use them from anywhere inside your app pipelines.
Write advanced tests using a Python testing framework.
Do advanced data transformations and mappings with Python (see the sketch after this list).
And many more...
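As a purely illustrative example of such a data transformation, the following plain Python function maps a list of raw records to a simplified contact structure. The record and field names are hypothetical and not part of any PIPEFORCE API:

# Illustrative only: a plain Python transformation function.
# All field names used here are made up for this example.
def to_contacts(records):
    # Map each raw record to a simplified contact structure.
    return [
        {
            "fullName": (r.get("firstName", "") + " " + r.get("lastName", "")).strip(),
            "email": r.get("email", "").lower(),
        }
        for r in records
    ]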
Since all FaaS services inside PIPEFORCE are stateless by default, the execution of Python scripts can be scaled automatically and nearly without limit by running multiple of these FaaS execution services in parallel. Only the resources available to your cluster set the limit.
Here are some documentation references to Python: