Overview
SiteAuditLab is a non-intrusive website auditing tool that scans publicly accessible URLs and generates structured health reports across five categories: Security, Performance, SEO, Uptime, and Analytics.
Scans are performed entirely server-side. No browser extensions, no agents, and no changes are made to your website. We analyze HTTP responses, headers, and page content only.
| Scan time | Scan method | Checks per scan |
|---|---|---|
| ~5–15 seconds | Server-side HTTP | 30+ |
How It Works
When you submit a URL, SiteAuditLab performs the following steps:
1. Normalises the URL (adds `https://` if no protocol is provided)
2. Makes an HTTP GET request to the target URL, following redirects
3. Analyses response headers, status codes, and response time
4. Parses the HTML body for meta tags, scripts, links, and content
5. Runs each selected scan category against the collected data
6. Calculates scores, assigns grades, and stores the report
Scans do not execute JavaScript, load external resources, or perform any form of intrusive testing. They are read-only and safe to run on production sites.
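The URL normalisation in the first step can be sketched in a few lines. `normalise_url` is a hypothetical helper name for illustration, not SiteAuditLab's actual code:

```python
def normalise_url(raw: str) -> str:
    """Prepend https:// when no protocol is given (step 1 above, as a sketch)."""
    raw = raw.strip()
    if not raw.startswith(("http://", "https://")):
        raw = "https://" + raw
    return raw
```

For example, `normalise_url("example.com")` returns `https://example.com`, while URLs that already carry a protocol pass through unchanged.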
Scan Categories
Security
Checks HTTP security headers, SSL/TLS configuration, and exposed sensitive information.
- Content-Security-Policy header presence and configuration
- X-Frame-Options, X-Content-Type-Options, Referrer-Policy headers
- Strict-Transport-Security (HSTS) configuration
- Server version disclosure in headers
- Mixed content detection
- Common sensitive file exposure (/.env, /wp-config.php, etc.)
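The header-presence portion of these checks reduces to comparing the response headers against an expected set. A minimal sketch; the severity assignments here are illustrative, not SiteAuditLab's actual mapping:

```python
# Illustrative severities -- the product's real mapping may differ.
SECURITY_HEADERS = {
    "content-security-policy": "critical",
    "strict-transport-security": "critical",
    "x-frame-options": "warning",
    "x-content-type-options": "warning",
    "referrer-policy": "info",
}

def missing_security_headers(response_headers: dict) -> list:
    """Return (header, severity) pairs for each expected header that is absent."""
    present = {name.lower() for name in response_headers}
    return [(h, sev) for h, sev in SECURITY_HEADERS.items() if h not in present]
```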
Performance
Analyses response speed, content size, and resource optimisation.
- Time to First Byte (TTFB)
- Response compression (gzip/brotli)
- Cache-Control header configuration
- Total page weight estimation
- Inline script and style detection
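The compression and caching checks are also pure header inspection. A sketch (the function name and finding wording are hypothetical):

```python
def performance_findings(response_headers: dict) -> list:
    """Flag missing compression and caching based on response headers alone."""
    headers = {k.lower(): v for k, v in response_headers.items()}
    findings = []
    if headers.get("content-encoding") not in ("gzip", "br"):
        findings.append("response is not compressed (gzip/brotli)")
    if "cache-control" not in headers:
        findings.append("no Cache-Control header")
    return findings
```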
SEO
Reviews on-page SEO signals from the HTML content.
- Title tag presence, length, and quality
- Meta description presence and length
- Canonical URL configuration
- OpenGraph and Twitter Card tags
- Heading structure (H1/H2 presence)
- robots.txt and XML sitemap references
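Because scans never execute JavaScript, these signals come straight from the static HTML. A sketch of title and meta-description extraction using only the standard library (class name is hypothetical):

```python
from html.parser import HTMLParser

class SEOSignals(HTMLParser):
    """Collect the <title> text and meta description from raw HTML (sketch)."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")

    def handle_data(self, data):
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
```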
Uptime
Verifies the site is reachable and responding correctly.
- HTTP status code check
- HTTPS redirect from HTTP
- Response time measurement
- Redirect chain analysis
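Given the final status code and the recorded redirect chain, the uptime findings follow directly; the thresholds and severities below are illustrative, not the product's exact rules:

```python
def uptime_findings(status_code: int, redirect_chain: list) -> list:
    """Classify the final status and flag long redirect chains (sketch)."""
    findings = []
    if status_code >= 500:
        findings.append(("critical", f"server error {status_code}"))
    elif status_code >= 400:
        findings.append(("warning", f"client error {status_code}"))
    if len(redirect_chain) > 2:
        findings.append(("info", f"long redirect chain ({len(redirect_chain)} hops)"))
    return findings
```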
Analytics (requires sign-in)
Detects analytics and tracking tools present in the page source.
- Google Tag Manager (GTM-XXXX)
- Google Analytics 4 (G-XXXXXXXXXX)
- Universal Analytics / legacy GA (UA-XXXXX-X)
- Google Ads conversion tracking
- Meta Pixel / Facebook Pixel
- Segment, Hotjar, Mixpanel, Microsoft Clarity
- dataLayer.push() event names in source
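Detecting the ID-based tags amounts to pattern matching over the raw page source. A sketch covering three of the formats above; the regexes are approximations of the real ID grammars:

```python
import re

# Approximate ID patterns -- real tag IDs may vary in length and charset.
TAG_PATTERNS = {
    "Google Tag Manager": re.compile(r"\bGTM-[A-Z0-9]+"),
    "Google Analytics 4": re.compile(r"\bG-[A-Z0-9]{10}\b"),
    "Universal Analytics": re.compile(r"\bUA-\d+-\d+\b"),
}

def detect_tags(html: str) -> dict:
    """Map tool name -> list of tag IDs found in the raw page source."""
    matches = {name: pattern.findall(html) for name, pattern in TAG_PATTERNS.items()}
    return {name: ids for name, ids in matches.items() if ids}
```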
Grading System
Each category starts at a score of 100. Points are deducted for each finding based on severity. The overall score is a weighted average of all active category scores.
| Severity | Score deduction |
|---|---|
| Critical | −25 pts |
| Warning | −10 pts |
| Info | −3 pts |
Category weights for the overall score:
- Security — 40%
- Performance — 25%
- SEO — 25%
- Uptime — 10%
- Analytics — informational only (does not affect overall score)
Letter grades are assigned from the numeric score: A+ (97–100), A (93–96), A− (90–92), B+ (87–89), B (83–86), B− (80–82), C+ (77–79), C (73–76), C− (70–72), D (60–69), F (0–59).
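The rules above can be transcribed almost directly. Function names are hypothetical, and flooring category scores at zero is an assumption not stated in the docs:

```python
SEVERITY_DEDUCTION = {"critical": 25, "warning": 10, "info": 3}
CATEGORY_WEIGHTS = {"security": 0.40, "performance": 0.25, "seo": 0.25, "uptime": 0.10}
GRADE_CUTOFFS = [(97, "A+"), (93, "A"), (90, "A-"), (87, "B+"), (83, "B"), (80, "B-"),
                 (77, "C+"), (73, "C"), (70, "C-"), (60, "D"), (0, "F")]

def category_score(severities: list) -> int:
    """100 minus the deduction for each finding (floored at 0 -- an assumption)."""
    return max(100 - sum(SEVERITY_DEDUCTION[s] for s in severities), 0)

def overall_score(scores: dict) -> float:
    """Weighted average over the scored categories that ran; analytics is excluded."""
    active = {c: w for c, w in CATEGORY_WEIGHTS.items() if c in scores}
    return sum(scores[c] * w for c, w in active.items()) / sum(active.values())

def letter_grade(score: float) -> str:
    return next(grade for cutoff, grade in GRADE_CUTOFFS if score >= cutoff)
```

As a sanity check, category scores of 72/91/88/100 for security/performance/SEO/uptime give a weighted overall of about 83.6, a B.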
Rate Limits
| Account type | Limit | Window |
|---|---|---|
| Anonymous (no sign-in) | 5 scans | Per hour, per IP |
| Authenticated user | 20 scans | Per hour, per account |
Rate limit windows reset every 60 minutes from the time of the first request.
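This reset behaviour corresponds to a fixed-window counter keyed by IP or account. A minimal in-memory sketch; the class name and storage are illustrative (a real deployment would typically back this with shared storage such as Redis):

```python
import time

class FixedWindowLimiter:
    """Allow `limit` requests per window; the window starts at the first request."""

    def __init__(self, limit: int, window_seconds: int = 3600):
        self.limit = limit
        self.window = window_seconds
        self._state = {}  # key -> (window_start, request_count)

    def allow(self, key: str, now: float = None) -> bool:
        now = time.time() if now is None else now
        start, count = self._state.get(key, (now, 0))
        if now - start >= self.window:
            start, count = now, 0  # window expired: restart from this request
        if count >= self.limit:
            return False
        self._state[key] = (start, count + 1)
        return True
```

An anonymous caller would use `limit=5` keyed by IP; an authenticated caller `limit=20` keyed by account ID.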
API Reference
The scan API is available for programmatic use. Requests carrying a valid session cookie count against the authenticated per-account limit; requests without one fall under the anonymous per-IP limits.
POST /api/scan
Initiates a website scan.
Request body:

```json
{
  "url": "https://example.com",
  "categories": ["security", "performance", "seo", "uptime", "analytics"]
}
```

Response:
```json
{
  "reportId": "clxxxxxxxxxxxxx",
  "results": {
    "url": "https://example.com",
    "overallScore": 84,
    "overallGrade": "B",
    "scannedAt": "2025-01-15T10:30:00.000Z",
    "security": { "score": 72, "grade": "C+", "findings": [...] },
    "performance": { "score": 91, "grade": "A-", "findings": [...] },
    "seo": { "score": 88, "grade": "B+", "findings": [...] },
    "uptime": { "score": 100, "grade": "A+", "findings": [...] },
    "analytics": { "score": 85, "grade": "B", "tags": [...], "dataLayerEvents": [...] }
  },
  "isAuthenticated": true
}
```

GET /api/scans
Returns paginated scan history for the authenticated user.
Authentication required.
Query parameters: `page` (default: 1), `limit` (default: 20, max: 50)
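The `pages` value in the response below is consistent with a ceiling division of `total` by `limit`; a hypothetical helper:

```python
import math

def pagination_meta(total: int, page: int = 1, limit: int = 20) -> dict:
    """Compute the pagination envelope fields; limit is capped at 50 (sketch)."""
    limit = min(limit, 50)
    return {"total": total, "page": page, "pages": math.ceil(total / limit)}
```

With the example's 47 total scans and the default limit of 20, this yields 3 pages.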
```
GET /api/scans?page=1
```

```json
{
  "scans": [
    {
      "id": "clxxxxxxxxxxxxx",
      "url": "https://example.com",
      "overallGrade": "B",
      "overallScore": 84,
      "createdAt": "2025-01-15T10:30:00.000Z"
    }
  ],
  "total": 47,
  "page": 1,
  "pages": 3
}
```

GET /report/:id
Every scan generates a shareable report URL at /report/[id]. These links are publicly accessible and can be shared with anyone.
Scheduled Scans
Authenticated users can configure automated scans to run on a daily, weekly, or monthly schedule. Each run generates a full report and sends an email summary to the configured address.
Setting up a schedule
- Sign in to your account
- Navigate to Monitoring
- Click New Schedule
- Enter the target URL, select frequency and scan categories, and provide your email
- The first scan will run at the next scheduled interval
Self-hosting the cron trigger
Scheduled scans require an external cron trigger to fire the GET /api/cron endpoint. Use any free cron service (e.g., cron-job.org) or a Vercel Cron Job if deploying to Vercel.
```
# Example: trigger every hour
GET https://yourdomain.com/api/cron
Authorization: Bearer YOUR_CRON_SECRET
```

Set `CRON_SECRET` in your `.env` file. Generate one with `openssl rand -base64 32`.
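On the receiving side, the cron handler would compare the `Authorization` header against `CRON_SECRET`. A sketch using a constant-time comparison; the handler shape and function name are assumptions, not taken from the codebase:

```python
import hmac
import os

def cron_request_authorised(authorization_header: str) -> bool:
    """Constant-time check of `Authorization: Bearer <CRON_SECRET>` (sketch)."""
    secret = os.environ.get("CRON_SECRET", "")
    expected = f"Bearer {secret}"
    return bool(secret) and hmac.compare_digest(authorization_header or "", expected)
```

`hmac.compare_digest` avoids leaking the secret's length or prefix through timing differences, which a plain `==` comparison could.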
FAQ
Will scanning my site affect its performance?
No. Scans make a single HTTP request to your site, identical to any regular visitor. There is no load testing or repeated polling.
Can I scan any website?
You should only scan websites you own or have permission to scan. Scanning third-party sites without authorisation may violate their terms of service.
Why does the analytics scanner only work for signed-in users?
Analytics detection requires parsing the full page HTML, which is a more resource-intensive operation. We restrict it to authenticated users to maintain service quality.
Are scan reports public?
Each report has a shareable link that anyone can view if they have the URL. Reports are not indexed or discoverable — only people you share the link with can see them.
How do I delete my data?
You can delete your account and all associated data by contacting privacy@siteauditlab.dev. Deletion is completed within 30 days.