How to use our API

Architectural Overview

The AX NLG Cloud needs your data, which it interprets according to a ruleset that you and your colleagues write. Based on that ruleset, content is generated and then transferred back to your CMS/PIM/database.

3 steps to integrate content automation into your system:

  1. Upload the data you need for your content through our REST API.
  2. Generate content on the AX NLG Cloud from your data and your pre-configured instructions. Each data object is converted into one piece of content.
  3. Receive the generated content from the API, or define a webhook to which we send your text when it is ready.


You authenticate with the API (documented in the API Reference).


Generate an id_token and reuse it in all your processes until it is about to expire, then generate a new one. Generating a new id_token automatically invalidates the old one.
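The token-reuse pattern above can be sketched as a small cache that refreshes shortly before expiry. The `fetch_token` callable is a placeholder for your actual authentication request; its return shape and the refresh margin are illustrative assumptions, not part of the documented API.

```python
import time

class TokenCache:
    """Caches an id_token and refreshes it shortly before it expires.

    `fetch_token` is a stand-in for your authentication call; it is
    assumed to return a (token, lifetime_in_seconds) pair.
    """

    def __init__(self, fetch_token, margin_seconds=60):
        self._fetch_token = fetch_token
        self._margin = margin_seconds
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Refresh only when the token is missing or about to expire;
        # otherwise reuse the cached one, as the docs recommend.
        if self._token is None or time.time() >= self._expires_at - self._margin:
            self._token, lifetime = self._fetch_token()
            self._expires_at = time.time() + lifetime
        return self._token
```

All processes can then call `cache.get()` before each request and always hold a valid token without re-authenticating every time.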

Uploading Documents

Upload your documents to the appropriate collection. For that you need the collection_id. If you don't have an account, ask your colleagues which collection should receive the data. The upload process is briefly described in the Basic Chapter of the API Reference and documented in detail in the Description of documents in the API Reference.
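As a rough sketch, an upload request can be assembled from the collection_id, your id_token, and the document payload. The URL path, header names, and body shape below are illustrative assumptions; consult the API Reference for the real ones.

```python
import json

def build_upload_request(base_url, collection_id, id_token, document):
    """Assemble URL, headers, and body for a document upload.

    The path and header names are assumptions for illustration only.
    """
    url = f"{base_url}/collections/{collection_id}/documents"
    headers = {
        "Authorization": f"Bearer {id_token}",
        "Content-Type": "application/json",
    }
    body = json.dumps(document)
    return url, headers, body
```

The returned triple can be passed to any HTTP client; separating request construction from sending also makes the integration easy to unit-test.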

Generating Texts

To generate text, choose one of three options:

  1. You press the Generate button on the website.
  2. You activate automatic generation for new data in your Collection.
  3. You use the API to start a generation for a specific document or Collection. This is useful, for example, to refresh your text(s) using the same data or to update them with modified data. This is documented in the API Reference.
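For the third option, the API lets you target either a single document or a whole collection. The endpoint paths below are illustrative assumptions to show the two scopes, not the documented routes:

```python
def generation_endpoint(base_url, collection_id, document_id=None):
    """Pick a generation-trigger endpoint for one document or a whole collection.

    Both paths are assumptions for illustration; see the API Reference.
    """
    if document_id is not None:
        # Regenerate one specific document, e.g. after its data changed.
        return f"{base_url}/collections/{collection_id}/documents/{document_id}/generate"
    # Regenerate every document in the collection.
    return f"{base_url}/collections/{collection_id}/generate"
```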

The Preferred Method: Automatic Generation

Each collection can be configured to automatically trigger content generation in the "edit" view in the frontend:

  • The checkbox Autogenerate New Documents sets the collection to automatically trigger a generation for each newly imported dataset.
  • The checkbox Autogenerate Existing Documents After Changes sets the collection to automatically trigger a new generation for datasets where an update in the imported dataset is detected.

Ask your colleagues to set this up in the frontend, if not already done.

Receiving the Generated Text

The generated text is returned to you in one of three ways:

  1. Setting up a webhook
  2. Using the Direct API
  3. Polling the API until the text is generated and then retrieving it (see the API Reference for requesting a single document)


We strongly recommend using a webhook unless you need real-time text production. In that case you should use our Direct API.
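If you do choose polling, a minimal sketch looks like the following. `fetch_document` stands in for a GET on the single-document endpoint and is assumed to return a dict with a "text" key once generation is done; that response shape is an assumption, not the documented format.

```python
import time

def poll_for_text(fetch_document, attempts=10, delay_seconds=2.0, sleep=time.sleep):
    """Poll until a document's generated text is available.

    `fetch_document` is a placeholder for the real API call; the
    "text" key in its response is an assumed shape.
    """
    for _ in range(attempts):
        document = fetch_document()
        text = document.get("text")
        if text:
            return text
        # Wait between requests to avoid hammering the API.
        sleep(delay_seconds)
    raise TimeoutError("text was not generated within the polling window")
```

A webhook avoids this waiting loop entirely, which is why it is the recommended approach.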

Using Webhooks for Receiving Content

You set up a webhook and store it in the collection's settings, so that finished text is sent there. You can find the documentation about this in the API Reference.

What you Need

  1. An integration on your side that gets the data in.
  2. A working ruleset.
  3. A collection in the cockpit.

Setting up the Webhooks on the AX Platform

  • Go to Data Sources and into the corresponding collection.
  • Click on edit.
  • Enter your URL in the Webhook field and save your changes.

Now you will receive any updated content at this URL. Manually generate a text to verify your setup. Then set the checkbox "Autogenerate new documents" in the collection's settings to enable automatic generation and save the changes.

Setting up the Webhook on Your End

Only SSL webhook targets are allowed. Because our platform is hosted on the Amazon Cloud, whitelisting IP addresses is not feasible. Instead, each request is cryptographically signed with a shared key; we recommend verifying that signature. More information about the setup can be found in the chapter Webhook for receiving text of the API Reference.
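A signature check on your end might look like this. The source only says requests are signed with a shared key; HMAC-SHA256 over the raw body and a hex-encoded signature are assumptions here, so check the "Webhook for receiving text" chapter for the actual scheme and header name.

```python
import hashlib
import hmac

def signature_is_valid(shared_key: bytes, body: bytes, received_signature: str) -> bool:
    """Verify a webhook request signature.

    HMAC-SHA256 and hex encoding are assumed; the real scheme is
    documented in the API Reference.
    """
    expected = hmac.new(shared_key, body, hashlib.sha256).hexdigest()
    # compare_digest runs in constant time, avoiding timing leaks.
    return hmac.compare_digest(expected, received_signature)
```

Reject any request whose signature does not verify before processing its body.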


For a more in-depth introduction to the questions around the API and the NLG Platform, see the webinar "API for data and texts".