CSV Import API

The CSV Import API allows you to bulk import findings from CSV files, making it easy to migrate data from other tools or import large numbers of findings at once. The API provides field mapping, validation, and detailed error reporting.

Upload and import findings from a CSV file into a project.

POST /api/v1/projects/{project_uuid}/findings/import-csv

Request:

  • Content-Type: multipart/form-data
  • Body: Form data with file field containing the CSV file

File Requirements:

  • Format: CSV (.csv) or text (.txt) files
  • Size Limit: 10MB maximum
  • Encoding: UTF-8 recommended

Response:

{
  "success": true,
  "message": "CSV import completed. 8 findings imported, 2 errors.",
  "data": {
    "success_count": 8,
    "error_count": 2,
    "errors": [
      {
        "row": 5,
        "error": "Invalid impact level: extreme. Must be one of: informational, low, medium, high, critical",
        "data": {
          "title": "SQL Injection",
          "impact": "extreme",
          "probability": "high"
        }
      },
      {
        "row": 7,
        "error": "The title field is required.",
        "data": {
          "title": "",
          "impact": "high",
          "probability": "medium"
        }
      }
    ],
    "imported_findings": [
      {
        "row": 2,
        "id": 458,
        "uuid": "finding-uuid-1",
        "title": "SQL Injection in Login Form"
      },
      {
        "row": 3,
        "id": 459,
        "uuid": "finding-uuid-2",
        "title": "Cross-Site Scripting (XSS)"
      }
    ]
  }
}
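The file requirements above (extension, size limit, encoding) can be checked locally before uploading. A minimal sketch; the helper name `check_csv_file` is illustrative, and the limits mirror the documented constraints:

```python
import os

MAX_SIZE = 10 * 1024 * 1024  # documented 10MB limit

def check_csv_file(path):
    """Validate a file against the documented import requirements."""
    if os.path.splitext(path)[1].lower() not in ('.csv', '.txt'):
        raise ValueError('File must be .csv or .txt')
    if os.path.getsize(path) > MAX_SIZE:
        raise ValueError('File exceeds the 10MB limit')
    # UTF-8 is recommended; strict decoding surfaces bad bytes before upload
    with open(path, encoding='utf-8') as f:
        f.read()
    return True
```

Catching these issues client-side avoids a rejected upload for the two most common file-level errors listed later (invalid format, file too large).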

Get information about required and optional fields for CSV import.

GET /api/v1/csv/field-mapping

Response:

{
  "success": true,
  "data": {
    "field_mapping": {
      "title": "Finding title/name",
      "impact": "Impact level (informational|low|medium|high|critical)",
      "probability": "Probability level (informational|low|medium|high|critical)",
      "description": "Detailed description of the finding",
      "poc": "Proof of concept details",
      "risks": "Risk assessment and business impact",
      "remediation": "Remediation steps and recommendations",
      "cvss": "CVSS vector string",
      "cvss_score": "CVSS score (0-10)",
      "status": "Finding status (draft|in-progress|done)",
      "http_excerpts": "HTTP request/response examples",
      "affected_hosts": "JSON array or comma-separated endpoints",
      "categories": "Comma-separated vulnerability category IDs",
      "template_id": "Template ID to base finding on",
      "extra_fields": "JSON object with custom fields"
    },
    "sample_csv": "title,impact,probability,description,poc,affected_hosts,categories\n\"SQL Injection in Login Form\",high,medium,\"Login form vulnerable to SQL injection\",\"admin' OR 1=1 --\",\"https://example.com/login,https://api.example.com/auth\",\"1,2\"\n\"Cross-Site Scripting (XSS)\",medium,high,\"Reflected XSS in search parameter\",\"<script>alert(1)</script>\",\"{\"\"endpoint\"\": \"\"https://example.com/search\"\", \"\"port\"\": 443}\",\"3\""
  }
}
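The field_mapping object doubles as a whitelist of column names, so it can be used to sanity-check a CSV's header row before importing. A sketch; `unknown_columns` is a hypothetical helper, and the URL and key in the commented usage are placeholders from the examples in this document:

```python
import csv

def unknown_columns(csv_path, field_mapping):
    """Return header columns the import endpoint would not recognize."""
    with open(csv_path, newline='', encoding='utf-8') as f:
        header = next(csv.reader(f))
    return [col for col in header if col not in field_mapping]

# Usage (assumed instance URL and API key):
# import requests
# r = requests.get('https://your-instance.pentestpad.com/api/v1/csv/field-mapping',
#                  headers={'Authorization': 'Bearer your_api_key'})
# print(unknown_columns('findings.csv', r.json()['data']['field_mapping']))
```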

Every CSV must include these columns:

| Field | Type | Description | Valid Values |
| --- | --- | --- | --- |
| title | string | Finding name/title | Any string (required) |
| impact | string | Impact severity | informational, low, medium, high, critical |
| probability | string | Likelihood of exploitation | informational, low, medium, high, critical |

Optional columns:

| Field | Type | Description | Format |
| --- | --- | --- | --- |
| description | string | Detailed description | Plain text |
| poc | string | Proof of concept | Plain text |
| risks | string | Business risks | Plain text |
| remediation | string | Fix recommendations | Plain text |
| cvss | string | CVSS vector | CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:N |
| cvss_score | number | CVSS score | 0 to 10 |
| status | string | Finding status | draft, in-progress, done |
| http_excerpts | string | HTTP examples | Plain text |
| affected_hosts | string | Affected endpoints | See Affected Hosts Format |
| categories | string | Vulnerability categories | Comma-separated IDs: 1,2,3 |
| template_id | number | Template reference | Template ID number |
| extra_fields | string | Custom fields | JSON string |

The affected_hosts field accepts either a plain comma-separated list of endpoints or JSON:

Comma-separated endpoints:

"https://example.com/login,https://api.example.com/auth,10.1.1.1:8080"

JSON array of host objects:

"[{\"endpoint\": \"https://example.com/login\", \"port\": 443, \"protocol\": \"HTTPS\", \"description\": \"Main login page\"}]"

A single JSON object is also accepted:

"{\"endpoint\": \"https://example.com/search\", \"port\": 443}"
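Rather than hand-escaping these values, both formats can be built in code. A short Python sketch (endpoint values taken from the examples above; `json.dumps` produces the JSON variant):

```python
import json

# Simple format: comma-separated endpoints
hosts_simple = ','.join([
    'https://example.com/login',
    'https://api.example.com/auth',
    '10.1.1.1:8080',
])

# Structured format: JSON array of host objects (fields per the docs)
hosts_json = json.dumps([
    {'endpoint': 'https://example.com/login', 'port': 443,
     'protocol': 'HTTPS', 'description': 'Main login page'},
])
```

When either value is written with a CSV library, the embedded quotes are escaped for you, producing the doubled-quote form shown in the sample_csv string.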

Custom fields should be provided as a JSON string:

"{\"cwe_id\": \"CWE-89\", \"owasp_category\": \"A03:2021\", \"custom_field\": \"value\"}"

Minimal CSV (required columns only):

title,impact,probability,description
"SQL Injection in Login",high,medium,"Authentication bypass via SQL injection"
"XSS in Search",medium,high,"Reflected XSS in search functionality"
"Weak Password Policy",low,high,"Passwords can be 4 characters minimum"

Full CSV (all supported columns):

title,impact,probability,description,poc,risks,remediation,cvss,cvss_score,status,http_excerpts,affected_hosts,categories,extra_fields
"SQL Injection in Login Form",high,medium,"Login form vulnerable to SQL injection","admin' OR 1=1 --","Data breach potential","Use parameterized queries","CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:N",8.1,draft,"POST /login HTTP/1.1...","https://example.com/login","1,2","{""cwe_id"": ""CWE-89""}"
"Cross-Site Scripting",medium,high,"Reflected XSS in search","<script>alert(1)</script>","Session hijacking","Implement output encoding","CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:L/I:L/A:N",6.1,draft,"GET /search?q=<script>alert(1)</script>","[{""endpoint"": ""https://example.com/search"", ""port"": 443}]","3","{""owasp"": ""A03:2021""}"
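The doubled quotes in the JSON cells need not be typed by hand: Python's csv module applies that quoting automatically. A sketch generating a row with JSON-valued columns (column subset chosen for illustration):

```python
import csv
import json

# csv.writer doubles embedded quotes, so JSON cells need no manual escaping
with open('findings.csv', 'w', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    writer.writerow(['title', 'impact', 'probability', 'affected_hosts', 'extra_fields'])
    writer.writerow([
        'SQL Injection in Login Form', 'high', 'medium',
        json.dumps([{'endpoint': 'https://example.com/login', 'port': 443}]),
        json.dumps({'cwe_id': 'CWE-89'}),
    ])
```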
Import a CSV file:

curl -X POST \
  -H "Authorization: Bearer your_api_key" \
  -F "file=@findings.csv" \
  https://your-instance.pentestpad.com/api/v1/projects/project-uuid/findings/import-csv

Get the field mapping:

curl -H "Authorization: Bearer your_api_key" \
  https://your-instance.pentestpad.com/api/v1/csv/field-mapping
// Import CSV file
const formData = new FormData();
formData.append('file', csvFile); // csvFile is a File object

const importResult = await fetch(`/api/v1/projects/${projectId}/findings/import-csv`, {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer your_api_key'
  },
  body: formData
}).then(r => r.json());

console.log(`Imported ${importResult.data.success_count} findings`);
if (importResult.data.error_count > 0) {
  console.log('Errors:', importResult.data.errors);
}
import requests
import csv

# Create CSV data
csv_data = [
    ['title', 'impact', 'probability', 'description'],
    ['SQL Injection', 'high', 'medium', 'Database injection vulnerability'],
    ['XSS Attack', 'medium', 'high', 'Cross-site scripting issue']
]

# Write to file
with open('findings.csv', 'w', newline='') as file:
    writer = csv.writer(file)
    writer.writerows(csv_data)

# Upload CSV
with open('findings.csv', 'rb') as file:
    response = requests.post(
        'https://your-instance.pentestpad.com/api/v1/projects/project-uuid/findings/import-csv',
        headers={'Authorization': 'Bearer your_api_key'},
        files={'file': file}
    )

result = response.json()
print(f"Success: {result['data']['success_count']} findings imported")
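The errors array in the import response pairs each failed row with its reason, so partial failures can be reported per row. A sketch assuming the documented response shape; `summarize_errors` is a hypothetical helper:

```python
def summarize_errors(result):
    """Turn the documented import response into human-readable lines."""
    data = result['data']
    lines = [f"{data['success_count']} imported, {data['error_count']} failed"]
    for err in data.get('errors', []):
        lines.append(f"row {err['row']}: {err['error']}")
    return lines
```

Because row numbers refer to the uploaded CSV, the failed rows can be fixed in place and re-imported without touching the rows that succeeded.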
Common import errors:

| Error Type | Description | Solution |
| --- | --- | --- |
| Invalid File Format | File is not CSV/TXT | Use .csv or .txt files |
| File Too Large | File exceeds 10MB limit | Split into smaller files |
| Missing Required Fields | CSV missing title, impact, or probability | Add required columns |
| Invalid Enum Values | Impact/probability not in valid list | Use: informational, low, medium, high, critical |
| Invalid JSON | Malformed JSON in affected_hosts or extra_fields | Validate JSON format |
| Invalid Template ID | Template doesn't exist | Use valid template ID or leave empty |
| Invalid Category IDs | Category doesn't exist | Use valid category IDs |
Validation rules:

  • Title: Required, maximum 255 characters
  • Impact/Probability: Must be one of: informational, low, medium, high, critical
  • CVSS Score: Must be between 0 and 10
  • Status: Must be one of: draft, in-progress, done
  • Categories: Must be valid category IDs (integers)
  • Template ID: Must be a valid template ID (integer)
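These rules can be enforced client-side to catch bad rows before spending an upload. A sketch; `validate_row` is a hypothetical helper mirroring the documented constraints:

```python
VALID_LEVELS = {'informational', 'low', 'medium', 'high', 'critical'}
VALID_STATUSES = {'draft', 'in-progress', 'done'}

def validate_row(row_num, row):
    """Check one CSV row (as a dict) against the documented rules."""
    errors = []
    if not row.get('title'):
        errors.append(f'row {row_num}: title is required')
    elif len(row['title']) > 255:
        errors.append(f'row {row_num}: title exceeds 255 characters')
    for field in ('impact', 'probability'):
        if row.get(field) not in VALID_LEVELS:
            errors.append(f'row {row_num}: invalid {field}: {row.get(field)}')
    if row.get('status') and row['status'] not in VALID_STATUSES:
        errors.append(f'row {row_num}: invalid status: {row["status"]}')
    if row.get('cvss_score') and not 0 <= float(row['cvss_score']) <= 10:
        errors.append(f'row {row_num}: cvss_score must be 0-10')
    return errors
```

Run over `csv.DictReader('findings.csv')` output, this reproduces locally the same class of errors the API reports per row.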
Migration workflow:

  1. Export from Source: Export data from your current tool
  2. Map Fields: Match columns to PentestPad field names
  3. Clean Data: Remove invalid characters and ensure proper formatting
  4. Test Import: Try a small sample first
  5. Full Import: Import the complete dataset
After import:

  • Review imported findings for accuracy
  • Update any findings that need manual correction
  • Assign findings to team members if needed
  • Update project status and generate reports