
By default, PIPEFORCE comes with its own built-in AI backend, which is hosted in Germany, complies with the GDPR rules and uses the default secret ai-default-secret.

PIPEFORCE can also combine multiple AI backends and models, so you can use exactly the combination and security level you need to deliver your business solution.

Change the AI backend secret

If you do not specify otherwise, the default PIPEFORCE AI backend is used in any AI conversation.

You can switch this default backend or use multiple AI backends in parallel. To do so, add a secret for each additional AI backend you would like to use.

This secret must be of type secret-text and must contain a JSON value which has a structure like this:

{
  "base_url": "string",
  "model": "string",
  "api_token": "string",
  "max_token": "integer",
  "custom_headers": { "name": "value" }
} 

Where:

  • base_url: The base URL of the API (required).

  • model: The AI model to be used (required).

  • api_token: The secure API token to be used.

  • max_token: The maximum number of prompt tokens allowed to be sent (defaults to 800 if missing).

  • custom_headers: Optional key-value pairs to be passed along as HTTP headers with any request. This is handy, for example, if basic authentication or any other additional header is required (see the example below).
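
As a minimal sketch of how these fields fit together, a secret value for a backend that requires basic authentication via custom_headers could look like this. The host, model name and header value used here are placeholders, not a real backend, and must be replaced with the settings of your own provider:

{
  "base_url": "https://ai.example.com/v1",
  "model": "example-model",
  "max_token": 800,
  "custom_headers": { "Authorization": "Basic dXNlcjpwYXNzd29yZA==" }
}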

Connect to OpenAI (ChatGPT)

If you would like to integrate OpenAI as an additional AI backend, for example, you could use these settings in your secret:

{
  "base_url": "https://api.openai.com/v1",
  "model": "gpt-3.5-turbo",
  "api_token": "your_token",
  "max_token": 800
} 

Replace your_token with the API token you can generate on the OpenAI website.

Use the secret in your AI commands

If no secret is specified in an AI command, the default AI backend is used. You can refer to a different AI backend by setting its secret in the AI command.

Let's assume the OpenAI secret from above was added under the name openai-secret. To prompt this OpenAI backend from inside your pipeline, you can then do something like this:

pipeline:
  - ai.prompt.send:
      secret: openai-secret
      prompt: "Tell me a joke"

This works the same way with any other AI command.
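
Because the secret parameter is optional, you can also mix the default backend and an additional backend in the same pipeline. The following is a minimal sketch that reuses the openai-secret from above; the prompts are just placeholders:

pipeline:
  - ai.prompt.send:
      prompt: "Summarize our meeting notes"   # no secret: uses the default PIPEFORCE AI backend
  - ai.prompt.send:
      secret: openai-secret
      prompt: "Tell me a joke"                # uses the OpenAI backend via its secret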
