Using Postman Tests to Poll and Trigger Multiple API Requests

07 February 2024 | Igor Rodionov
This article was originally published on the Postman blog - Using Postman Tests to Poll and Trigger Multiple API Requests.

In the dynamic realm of API development, there are instances where real-time data retrieval is crucial, necessitating the use of polling mechanisms. Despite the evolution of technology, there are scenarios where traditional request-response interactions fall short, and continuous monitoring of endpoints becomes imperative. This article delves into the intricacies of employing Postman tests as a powerful tool to efficiently poll and trigger multiple API requests. By exploring a unique and specialized case, we aim to demonstrate how Postman, a versatile API development and testing platform, can be harnessed to overcome the challenges posed by iterative data retrieval. Join us on a journey through the practical application of Postman in handling such intricate cases, unlocking the potential for automated API interactions.

Background

In our pursuit of efficient document generation, our development team encountered a unique challenge with the JSON-to-PDF API called DynamicDocs. This API, designed to dynamically generate PDF documents utilizing LaTeX templates and dynamic data supplied through a JSON payload, introduced a compelling need for a specialized approach—the implementation of polling.

In crafting this API, the team made a strategic decision: instead of a conventional synchronous response, the API request would return a link. This link, a gateway to the document's status, allowed users to determine whether the PDF creation process was successful, unsuccessful, or still in progress. By following this link, developers could actively monitor the document's lifecycle. Once the PDF was successfully generated, the status would be updated, and the final link to download the document would be provided. This distinctive scenario prompted us to explore how Postman tests could be harnessed to seamlessly navigate the intricacies of this asynchronous process, offering insights into an effective API polling strategy.

Requirements

To address the complexities inherent in the DynamicDocs API scenario, we sought to leverage Postman's capabilities as a comprehensive testing tool. The objective was to seamlessly navigate the asynchronous nature of document generation by setting up a series of orchestrated API calls.

  1. Initial API call:
    • Initiate the PDF creation process by sending the initial API request.
    • Extract the link provided in the response for subsequent status polling.
  2. PDF status requests and polling:
    • Implement a polling mechanism within Postman to actively check the status of the document using the extracted link.
    • Continue polling until the PDF creation process has finished, whether it ends in success or failure.
  3. PDF download request:
    • If the document creation is successful, trigger another API call to download the generated PDF.
    • Utilize Postman tests to verify the downloaded content, ensuring it is a valid PDF file.

By meticulously orchestrating these steps within Postman, we aimed to streamline the entire process, offering an insightful exploration into the platform's testing functionalities. This approach allowed us to effectively tackle the complexities of asynchronous document generation and verify the integrity of the generated PDFs with precision and reliability.
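Before walking through each script, it helps to see the chaining mechanism in isolation. The minimal sketch below uses the request names and the in-progress status code (102) that appear in the scripts later in this article; keep in mind that `postman.setNextRequest` only takes effect when the collection is run with the Collection Runner or Newman, not when a request is sent on its own.

```javascript
// Minimal chaining sketch: keep polling while the document is still being
// generated (status code 102), otherwise move on to the download request.
const status = pm.response.json().statusCode;

if (status === 102) {
    postman.setNextRequest("Contract v1 Progress JSON"); // poll again
} else {
    postman.setNextRequest("Contract v1 Document");      // fetch the PDF
}
```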

Solution

For this section, we are going to look at the publicly available DynamicDocs API - General JSON to PDF Template v1 collection via Postman and generate a contract PDF. Our other collections can be accessed via ADVICEment’s Postman profile.

Initial API call

In addressing the unique requirements of the DynamicDocs API, the first step involves triggering the document generation process. This is achieved through a targeted API request utilizing the endpoint:

https://api.advicement.io/v1/templates/pub-general-json-to-pdf-template-v1/compile

This request prompts DynamicDocs to dynamically create a PDF document based on LaTeX templates and dynamic data supplied through a JSON payload.

The following Postman test script is designed to verify the response of an initial API call made to the DynamicDocs API, which triggers the generation of a PDF document:

```javascript
// example using response assertions
pm.test("response should be okay to process", function () {
    pm.response.to.not.be.error;
    pm.response.to.have.jsonBody("documentStatusUrl");
    pm.response.to.not.have.jsonBody("error");
});

// example using pm.response.to.be*
pm.test("response must be valid and have a body", function () {
    // assert that the status code is 200
    pm.response.to.be.ok;
    // assert that the response has a valid JSON body
    pm.response.to.be.withBody;
    // this assertion also checks if a body exists
    pm.response.to.be.json;
});

// extract the status URL, store it, and queue the next request
const documentStatusUrl = pm.response.json().documentStatusUrl;
pm.environment.set("contract_url", documentStatusUrl);
postman.setNextRequest("Contract v1 Progress JSON");
```

The script verifies that the API has returned a valid response, and it also sets an environment variable and queues the next request in the workflow. This is done by:

  1. `const documentStatusUrl = pm.response.json().documentStatusUrl;`: extracts the `documentStatusUrl` from the JSON response.
  2. `pm.environment.set("contract_url", documentStatusUrl);`: sets the extracted URL as an environment variable named `contract_url` for future use.
  3. `postman.setNextRequest("Contract v1 Progress JSON");`: sets the next request in the collection to be "Contract v1 Progress JSON."

Overall, this script effectively validates the success of the initial API request, checks for the presence of expected properties in the response, and prepares the environment for subsequent requests in the workflow.
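The exact shape of the initial response body is not reproduced here, but given the field the script extracts, it is expected to contain a `documentStatusUrl` link. As an optional extra assertion (an addition for illustration, not part of the published collection), the script could also confirm that the extracted value really is a URL before storing it:

```javascript
// Optional extra assertion (not part of the original collection): confirm
// that documentStatusUrl is a non-empty URL before it is stored.
pm.test("documentStatusUrl looks like a URL", function () {
    const documentStatusUrl = pm.response.json().documentStatusUrl;
    pm.expect(documentStatusUrl).to.be.a("string");
    pm.expect(documentStatusUrl).to.match(/^https?:\/\//);
});
```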

PDF status requests and polling

With the initial API request to trigger PDF generation completed, the subsequent challenge lies in efficiently polling the status of the document until the PDF creation process is completed. The API call is made to the endpoint specified by the dynamically obtained `{{contract_url}}`. Employing a thoughtfully constructed Postman test script, we implement a robust polling mechanism that dynamically adjusts to the asynchronous nature of document creation:

```javascript
const maxNumberOfTries = 20;     // your max number of tries
const sleepBetweenTries = 1000;  // your interval between attempts

if (!pm.environment.get("tries")) {
    pm.environment.set("tries", 1);
}

const jsonData = pm.response.json();

if ((jsonData.statusCode == 102) && (pm.environment.get("tries") < maxNumberOfTries)) {
    const tries = parseInt(pm.environment.get("tries"), 10);
    pm.environment.set("tries", tries + 1);
    setTimeout(function() {}, sleepBetweenTries);
    postman.setNextRequest("Contract v1 Progress JSON");
} else {
    pm.environment.unset("tries");
    pm.environment.unset("contract_url");

    // your actual tests go here...
    pm.test('check response is json', () => {
        pm.response.to.have.status(200);
        pm.response.to.be.json;
    });

    pm.test('check documentStatusUrl is valid and completed', () => {
        pm.expect(jsonData).to.have.property('statusCode');
        pm.expect(jsonData.statusCode).to.eql(201);
    });

    pm.test('check all properties are present', () => {
        pm.expect(jsonData).to.have.property('id');
        pm.expect(jsonData).to.have.property('startedAt');
        pm.expect(jsonData).to.have.property('statusCode');
        pm.expect(jsonData).to.have.property('statusDescription');
        pm.expect(jsonData).to.have.property('calculationLogUrl');
        pm.expect(jsonData).to.have.property('latexLogUrl');
        pm.expect(jsonData).to.have.property('documentUrl');
    });

    const documentStatusUrl = jsonData.documentUrl;
    pm.environment.set("contract_doc_url", documentStatusUrl);
    postman.setNextRequest("Contract v1 Document");
}
```

The script begins by defining two parameters, `maxNumberOfTries` and `sleepBetweenTries`, which set the maximum number of polling attempts and the interval between them; with the values above, polling runs for at most 20 attempts spaced one second apart, or roughly 20 seconds in total. An environment variable named `tries` keeps track of how many attempts have been made. The script then inspects the response JSON, focusing on the `statusCode`. If the status indicates that document creation is still in progress (status code 102) and the maximum number of tries has not been reached, the script increments the try count, pauses by scheduling an empty `setTimeout` callback (Postman waits for pending timers to complete before moving on to the next request), and schedules the same request again to continue polling.

Once the document creation is complete or the maximum number of tries is reached, the script proceeds to perform essential tests on the response. These tests include verifying that the response is in JSON format, ensuring the statusCode indicates successful completion (status code 201), and checking for the presence of crucial properties in the response JSON.

Upon successful completion of polling, the script extracts the document URL from the response and sets it as an environment variable `contract_doc_url`. This URL is pivotal for the subsequent step of downloading the generated PDF document. The Postman test script concludes by setting the next request in the collection to "Contract v1 Document," facilitating the seamless transition to the final phase of the workflow.
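To make the property checks above more concrete, here is an illustrative sketch of what a completed status response might contain. The field names are taken from the assertions in the script; the values are placeholders rather than real DynamicDocs output.

```javascript
// Illustrative only: field names come from the assertions above, values are
// placeholders rather than actual API output.
const exampleCompletedStatus = {
    id: "<document id>",
    startedAt: "<timestamp>",
    statusCode: 201,                          // 201 = document generated successfully
    statusDescription: "<human-readable status>",
    calculationLogUrl: "<link to the calculation log>",
    latexLogUrl: "<link to the LaTeX compilation log>",
    documentUrl: "<link to the generated PDF>"
};
```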

PDF download request

With the document successfully generated and its URL stored in the `{{contract_doc_url}}` variable, the final phase of our workflow involves downloading the PDF. A dedicated API request is made to the specified document URL, orchestrating the seamless retrieval of the generated file. The following Postman test script has been crafted to ensure the integrity of the downloaded document:

```javascript
pm.test('check for response to be 200', () => {
    pm.response.to.have.status(200);
});

pm.test('check for Content-Type to be application/pdf', () => {
    pm.expect(pm.response.headers.get('Content-Type')).to.eql('application/pdf');
});

pm.test('check for PDF in the body response', () => {
    pm.expect(pm.response.text()).to.include('PDF');
});

pm.environment.unset("contract_doc_url");
```

The script begins by confirming that the response status is 200, validating the successful retrieval of the document. Subsequently, it checks for the expected Content-Type in the response headers, ensuring that it is specifically set to "application/pdf." This meticulous verification is crucial in confirming that the retrieved content is indeed a PDF document. Further validating the content, the script includes a test to confirm the presence of the string 'PDF' within the body response. This serves as an additional layer of assurance, substantiating that the downloaded document aligns with the anticipated PDF format.
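As an optional refinement (not part of the published collection), the body check can be made slightly stricter: valid PDF files begin with the `%PDF-` magic bytes, so the test can target the very start of the response rather than looking for the substring anywhere in the body.

```javascript
// Optional, stricter variant of the body check: a valid PDF starts with the
// "%PDF-" header, so assert on the beginning of the response text.
pm.test('body starts with the PDF magic header', () => {
    pm.expect(pm.response.text().startsWith('%PDF-')).to.be.true;
});
```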

This conclusive phase marks the successful execution of the end-to-end process, where Postman's testing capabilities prove instrumental in managing the complexities of asynchronous PDF generation and retrieval.

Final Remarks

In essence, this article has unravelled a comprehensive and effective solution for managing the end-to-end process of PDF generation, polling, and retrieval using Postman. Empowered by its versatile testing functionalities, developers can confidently navigate the complexities of asynchronous workflows, ensuring the reliability and precision of their API integrations. The described workflow serves as a testament to the adaptability and efficiency of Postman in addressing specialized use cases, providing developers with a powerful toolset to conquer the challenges of modern API development.