2025 June - File Import Route Deprecation

🧾 Summary

We are deprecating the existing POST /api/v2/client/{clientId}/report/{reportId}/importAsync/{source} endpoint (named Add Findings from File Imports V2 in our docs) in favor of a new, more reliable, multi-step asynchronous import flow. This update improves backend performance and keeps the platform more responsive during heavy import activity.

If you’ve built an integration or script that uses the POST /api/v2/client/{clientId}/report/{reportId}/importAsync/{source} endpoint, you will need to migrate to the new process before the removal date.


🗓 Deprecation Timeline

  • Deprecation Date: June 23, 2025 (the replacement flow will be available starting in v2.18)

  • Removal Date: October 8, 2025 (v2.22)


📋 Details

📌 Existing Workflow

Current imports are handled via:

POST /api/v2/client/{clientId}/report/{reportId}/importAsync/{source}

  • The upload is asynchronous, but the actual import runs on the main API thread pool.

  • This can degrade performance and responsiveness when multiple imports run at once or when the platform is under heavy user load.
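
For reference, an affected integration typically looks something like the sketch below. The multipart field name, host, and use of the form-data package are illustrative assumptions, not the documented request format; the route itself is what identifies affected code.

// Hedged sketch of a legacy call to the deprecated endpoint (for identification only).
// The multipart field name ("file") and the form-data package are assumptions.
const axios = require("axios");
const fs = require("fs");
const FormData = require("form-data");

const form = new FormData();
form.append("file", fs.createReadStream(filePath));

const legacyImportResponse = await axios.post(
    "https://plextrac.localhost/api/v2/client/{clientId}/report/{reportId}/importAsync/{source}",
    form,
    { headers: { "authorization": authToken, ...form.getHeaders() } }
);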

πŸ” Replacement Workflow

The new import process separates file upload and import execution:

  1. Request Presigned URL: POST /api/v2/presigned-url

  2. Upload File to MinIO: use the url returned in step 1 via a PUT request.

  3. Trigger Import: POST /api/v2/client/{clientId}/report/{reportId}/preuploaded-import/{source}. Pass the key returned in step 1 along with the filename, any tag metadata, and the parser action option.

  4. Check Import Status: GET /api/v2/my-imports

The actual import now runs on a dedicated worker queue, decoupled from the API thread.
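
Putting the four calls together, a condensed sketch of the new flow might look like the following (host, IDs, and variable names are placeholders; the detailed step-by-step example under Required Developer Changes below expands each step):

// Condensed, illustrative sketch of the replacement flow (placeholders throughout).
const axios = require("axios");
const fs = require("fs");

// 1. Request a presigned upload URL
const { data: presigned } = await axios.post(
    "https://plextrac.localhost/api/v2/presigned-url",
    { name: fileName },
    { headers: { "authorization": authToken, "content-type": "application/json" } }
);

// 2. Upload the file to object storage via the presigned URL
await axios.put(presigned.url, fs.createReadStream(filePath), {
    headers: { "Content-Type": "application/octet-stream" }
});

// 3. Trigger the import, referencing the uploaded object by its key
const { data: queued } = await axios.post(
    `https://plextrac.localhost/api/v2/client/${clientId}/report/${reportId}/preuploaded-import/${source}`,
    {
        key: presigned.key,
        fileName: fileName,
        tags: JSON.stringify({ assets: [], findings: [] }),
        useParserActions: false
    },
    { headers: { "authorization": authToken, "content-type": "application/json" } }
);

// 4. Check import status; match on the operationId returned in step 3
const { data: imports } = await axios.get(
    "https://plextrac.localhost/api/v2/my-imports",
    { headers: { "authorization": authToken } }
);
const importJob = imports.find(i => i.key === queued.operationId);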

💡 Why This Change?

This update brings several backend improvements:

  • True background imports: Jobs are no longer tied to the API thread.

  • Better system responsiveness: Main thread stays available for interactive user actions.

  • Improved visibility: Import status is queryable via a structured /api/v2/my-imports endpoint.

  • More scalable architecture: the MinIO upload is streamed, so large files are supported without being loaded into memory.


🛠 Required Developer Changes

✅ How to Know If You're Affected

You are affected if you call:

POST /api/v2/client/{clientId}/report/{reportId}/importAsync/{source}

…in any script, integration, or automation. This was previously the primary way to upload findings from scan tools such as Nessus, Burp, and Qualys using our file parsers.

🔄 What You Need to Do

Update your integration to follow the new multi-step flow.

Step-by-step example:

1. Request a Presigned Upload URL

// Request a presigned upload URL from the PlexTrac API
// (authToken and fileName are assumed to be defined elsewhere in your script)
const axios = require("axios");

const presignedUrlResponse = await axios.post(
    "https://plextrac.localhost/api/v2/presigned-url",
    { name: fileName },
    { headers: { "authorization": authToken, "content-type": "application/json" }}
);

Response:

{
    "status": "success",
    "key": "cmaod32be000i0hovg1oz04b4/burp_sample_small.xml",
    "url": "https://plextrac.localhost/cloud/uploads/cmaod32be000i0hovg1oz04b4/burp_sample_small.xml?..."
}

2. Upload File to MinIO

// Prepare the file data to be uploaded as a stream
const fs = require("fs");
const fileStream = fs.createReadStream(filePath);

// Upload the file to MinIO using the presigned URL
const minioUploadResponse = await axios.put(
    presignedUrlResponse.data.url,
    fileStream,
    {
        headers: { "Content-Type": "application/octet-stream" },
        onUploadProgress(progressEvent) {
            if (progressEvent?.total === undefined) return;
            const percentCompleted = Math.round((progressEvent.loaded * 100) / progressEvent.total);
            console.log(`Upload progress: ${percentCompleted}%`);
        }
    }
);

Note: onUploadProgress is supported by Axios and can be used to provide user feedback on large files. Other request libraries offer equivalent mechanisms if needed.
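
For instance, if you are not using Axios, a minimal sketch with Node's built-in fetch (Node 18+) might look like this; it buffers the whole file in memory, so it is only a reasonable choice for smaller files:

// Alternative upload using Node's built-in fetch (assumes Node 18+).
// readFileSync buffers the entire file, so prefer streaming for very large files.
const fs = require("fs");

const fileBuffer = fs.readFileSync(filePath);
const uploadResponse = await fetch(presignedUrlResponse.data.url, {
    method: "PUT",
    headers: { "Content-Type": "application/octet-stream" },
    body: fileBuffer
});
if (!uploadResponse.ok) {
    throw new Error(`Upload failed with status ${uploadResponse.status}`);
}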

3. Initiate Import via Background Worker

// Initiate asynchronous import process
const asyncImportResponse = await axios.post(
    "https://plextrac.localhost/api/v2/client/{clientId}/report/{reportId}/preuploaded-import/{source}",
    {
        tags: JSON.stringify({ assets: [], findings: [] }),
        useParserActions: false,
        key: presignedUrlResponse.data.key,
        fileName: fileName,
    },
    {
        headers: {
            "authorization": authToken,
            "content-type": "application/json"
        }
    }
);

This request returns a status that reflects whether the import job was queued successfully, not whether the import itself succeeded. Use the /api/v2/my-imports endpoint to determine the outcome of the import job; even a malformed or invalid file for the selected source will still be queued successfully.

  • source corresponds to the file parser (e.g., nessus, burp, qualys) and matches existing usage from the V2 import

  • key is the same object storage path returned in step 1

  • tags and useParserActions match existing usage from the V2 import

Response:

{
    "status": "success",
    "operationId": "1dbc57cb-dd90-403b-a3e1-f411a66bd922",
    "importJob": {
        "id": "1dbc57cb-dd90-403b-a3e1-f411a66bd922",
        "user_id": 933708862,
        "user_email": "test.email@plextrac.com",
        "tenant_id": 0,
        "client_id": 70891,
        "report_id": 543658528,
        "source": "burp",
        "file_location": "cmapk4wtz000j0hov3pqe5foy/burp_sample_small.xml",
        "original_filename": "burp_sample_small.xml",
        "file_type": "text/xml",
        "file_size": "123625",
        "parser_actions_enabled": false,
        "processed_bytes": "0",
        "processed_findings": 0,
        "processed_assets": 0,
        "processed_finding_assets": 0,
        "processed_evidence": 0,
        "asset_tags": [],
        "finding_tags": [],
        "created_at": "2025-05-15T16:02:00.232Z",
        "last_updated_at": "2025-05-15T16:02:00.232Z",
        "failure_notified_at": null,
        "status": "IN QUEUE",
        "error_status": null
    }
}

4. (Optional) Monitor Import Status

const importStatus = await axios.get(
    "https://plextrac.localhost/api/v2/my-imports",
    {
        headers: {
            "authorization": authToken,
            "content-type": "application/json"
        }
    }
);

This endpoint lists recent imports and their status. A specific import job can be found by matching on the key field, which is the operationId returned when the import was started.

Response:

[
    {
        "key": "1dbc57cb-dd90-403b-a3e1-f411a66bd922",
        "name": "burp_sample_small.xml",
        "reportName": "Test Report",
        "clientName": "Test Client",
        "fileSize": "123625",
        "status": "PROCESSING",
        "progress": 0,
        "errorStatus": null,
        "source": "burp"
    }
]

The status will be IMPORT COMPLETE or ERROR, depending on whether the import job completed successfully. If the job fails, errorStatus will contain text describing the error.
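
A simple polling loop along these lines can wait for the job to finish; the 5-second interval and 60-attempt cap below are arbitrary illustrative choices, not platform requirements:

// Poll /api/v2/my-imports until the job matching our operationId finishes.
const operationId = asyncImportResponse.data.operationId;

for (let attempt = 0; attempt < 60; attempt++) {
    const { data: imports } = await axios.get(
        "https://plextrac.localhost/api/v2/my-imports",
        { headers: { "authorization": authToken } }
    );

    const job = imports.find(i => i.key === operationId);
    if (job?.status === "IMPORT COMPLETE") {
        console.log("Import finished successfully");
        break;
    }
    if (job?.status === "ERROR") {
        throw new Error(`Import failed: ${job.errorStatus}`);
    }

    // Wait 5 seconds before polling again
    await new Promise(resolve => setTimeout(resolve, 5000));
}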
