
How to read your first Veracly compliance report

Your first Veracly report is dense by design. Here is the section-by-section tour, the numbers that actually matter, and a triage plan for the first week.

By Veracly Compliance Team · 7 min read

Your first Veracly report is dense. The PDF is twelve to twenty pages, the dashboard has fewer pixels but more drill-down, and the cover page shows six numbers before you even get to the body. This article is the guided tour — what each section is for, which numbers matter, and how to get from “received” to “the top five fixes are filed in your tracker” in under thirty minutes.

Section 1 — The cover

The headline is your overall score and a status pill (green/yellow/red). The mini-cards below show one number per jurisdiction that fired. Two things to read here:

  • The status pill is the at-a-glance verdict. Green (85+) means your site is in good shape; yellow (60–84) means you have actionable findings but no single catastrophic one; red (under 60) means at least one critical issue you should file today.
  • The jurisdiction strip tells you which laws actually apply to your real traffic mix. We do not score laws that do not apply — if you have zero EU visitors, GDPR does not appear here.
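The thresholds above are simple to express in code. A minimal sketch of the banding, using exactly the cutoffs the article states (the real banding lives inside Veracly's report generator; this is illustrative only):

```python
def status_pill(score: int) -> str:
    """Map an overall score to the cover-page status pill.

    Thresholds per the article: green 85+, yellow 60-84, red below 60.
    Illustrative sketch -- not Veracly's actual implementation.
    """
    if score >= 85:
        return "green"
    if score >= 60:
        return "yellow"
    return "red"
```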

Section 2 — Executive summary

One paragraph of natural-language summary, AI-drafted from the scan data and reviewed by the rules engine before it lands in the PDF. It tells you what your site does well and what to fix first, in language you can paste into a stakeholder update.

The stat cards underneath translate the headline into traceable numbers: pages scanned, unique issues, jurisdictions evaluated, distinct critical findings. If something in the executive summary surprises you, the stat cards tell you which count is driving it.

Section 3 — Per-jurisdiction scorecard

One card per applicable jurisdiction. Each shows a score, a violation count, the critical count, and a severity-distribution bar (critical / high / medium / low). Read the bar shape, not just the number — two sites with a 72 can have wildly different fix workloads if one has all-low and the other has two criticals.

For AODA you will see two numbers side-by-side: the strict 2.1-AA score (used for cross-jurisdiction parity with the other cards) and the legal-floor score against IASR §14’s 2.0-AA bar (which is Ontario’s actual statutory requirement). The 2.0 number is what an Ontario regulator would compare against in practice.

Section 4 — Top priorities

This is where you start. Ten issues ranked by severity multiplied by affected page count — a critical that appears on every page outranks a critical on one page, which outranks any high. Each row has a one-paragraph plain-English explanation (the AI translator pass, regenerated on every report so the language stays current).
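The ordering described above can be sketched as follows. Veracly's exact weighting is internal; this approximation reproduces the ordering the article describes (severity tier first, affected page count as the tiebreaker within a tier), so the example ranks a widespread critical above a single-page critical, which in turn ranks above any high:

```python
from dataclasses import dataclass

# Higher rank sorts first; tiers are ordered as in the severity bar.
SEVERITY_RANK = {"critical": 3, "high": 2, "medium": 1, "low": 0}

@dataclass
class Finding:
    rule: str
    severity: str
    affected_pages: int

def top_priorities(findings: list[Finding], n: int = 10) -> list[Finding]:
    """Illustrative approximation of the Top Priorities ordering:
    severity tier dominates, affected page count breaks ties."""
    return sorted(
        findings,
        key=lambda f: (SEVERITY_RANK[f.severity], f.affected_pages),
        reverse=True,
    )[:n]
```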

If your team only does one thing with the report this week, file these ten as tickets and assign them. The downstream remediation appendix has the copy-paste fix for each.

Sections 5–N — Per-jurisdiction detail

One detail page per jurisdiction that fired. The ring at the top mirrors the scorecard number. The table below it lists the top ten findings for that jurisdiction, which can overlap with the cross-jurisdiction top priorities but often introduces a few jurisdiction-specific ones (an EAA accessibility statement requirement, a UK Equality Act reasonable-adjustments concern, an AODA-only IASR clause).

The cited regulations panel at the right is the answer to “under what law?” Use it when you forward the report internally — pasting the regulation reference into a Jira ticket converts “we have a compliance issue” into “Article X.Y of regulation Z requires the following.”

Section 6 — Remediation appendix

The longest section by page count. One card per top-priority finding, expanded to include: a one-paragraph plain-English explanation, an evidence screenshot (when the scanner could capture a stable element selector), a free-form fix in prose, and language-tagged code blocks (HTML / CSS / JavaScript) for the AI-generated patch.

The HTML/CSS/JS snippets are starting points, not patches to commit unreviewed. The scanner cannot see your component library or class-name conventions; treat the snippets the way you would treat a Stack Overflow answer — read it, adapt it, then commit.

Section 7 — Remaining issues

A flat inventory of every issue past the top ten. No fix detail (the top ten and the remediation appendix already cover the high-leverage class) — this exists so the report is a complete record. A buyer who only ships the top ten on the first iteration can come back to this section for the second sprint.

Section 8 — Methodology and glossary

What rule set we used, what each acronym means, what we did not evaluate (manual screen-reader testing, cognitive-load testing). Skip on first read; come back when a stakeholder or auditor asks “how was this measured?”

Section 9 — Legal disclaimer and integrity block

The disclaimer is standard auditor language: the report is a snapshot, not a legal opinion. The integrity block is more interesting — every Veracly report is signed with an Ed25519 key, and the scan UUID printed here is what a holder of the PDF pastes at veracly.app/verify/<uuid> to confirm the bytes are the ones we issued. See How to verify a Veracly report is authentic for the full mechanic.

A 30-minute workflow for the first report

  1. Open the PDF, read the executive summary (1 min).
  2. Scan the per-jurisdiction scorecard. Note the worst jurisdiction and the severity-bar shape (2 min).
  3. Go to Top Priorities. File each row as a ticket in your tracker with the title, severity, and the regulation reference from the corresponding detail page (15 min).
  4. For each ticket, open the remediation appendix card. Paste the explanation into the ticket description and link the AI snippet as a starting point (10 min).
  5. Schedule a re-scan two weeks out. Veracly auto-runs weekly, but a calendar prompt helps the team check the score change when fixes land (2 min).

What the report deliberately does not tell you

Compliance is not a percentage. A 95 with one critical finding is more exposed than an 80 with twenty mediums. Read the severity distribution, not the headline, when you are deciding whether a result is “good enough.”

And the report is silent on the things automation cannot reach: manual screen-reader usability, cognitive-load testing of complex forms, plain-language review of legal pages by an actual lawyer. The methodology section names these omissions on purpose.

See also: The top 10 issues Veracly finds — and how to fix them · Free scan vs. paid report: when to upgrade

Common questions

What does the overall score mean?

The headline score is the average of every jurisdiction that fired for your site. A 70 means a typical visitor mix lands in the yellow band — you have meaningful violations but no single critical one. Green is 85+; red is below 60.
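The averaging described above can be sketched directly. Whether Veracly weights jurisdictions by traffic share is not stated here, so this sketch uses a plain unweighted mean over the jurisdictions that fired:

```python
def overall_score(jurisdiction_scores: dict[str, int]) -> int:
    """Headline score as the FAQ describes it: the average of every
    jurisdiction that fired. Unweighted mean, rounded -- an assumption;
    any traffic-share weighting Veracly applies is not documented here."""
    scores = list(jurisdiction_scores.values())
    return round(sum(scores) / len(scores))
```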

Why are some jurisdictions marked "not evaluated"?

A jurisdiction only fires when your real visitor traffic includes the region it covers. An Australian publisher site with zero EU visitors will see GDPR marked not-evaluated; we do not score what does not apply. That decision is shown on the report so it does not look like missing data.

What is the difference between an issue and a violation?

An issue is a unique rule that failed (for example "form-label-missing"). A violation is an instance of that issue (the report counts five form-label-missing findings as five violations of one issue). The executive summary uses unique issues; per-jurisdiction cards use violations.
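The distinction is easy to see in code. A minimal sketch, assuming findings arrive as (rule, page) pairs (a hypothetical shape chosen for illustration, not Veracly's actual data model):

```python
from collections import Counter

def summarize(findings: list[tuple[str, str]]) -> dict[str, int]:
    """Count unique issues (distinct failed rules) and violations
    (individual instances), matching the FAQ's distinction."""
    per_rule = Counter(rule for rule, _page in findings)
    return {
        "unique_issues": len(per_rule),          # distinct rules that failed
        "violations": sum(per_rule.values()),    # total instances
    }
```

Five form-label-missing findings count as five violations of one issue, exactly as the FAQ describes.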

Where do I start fixing?

Top Priorities (section 4). Those are ranked by severity × affected page count, so the ten items listed are the highest-leverage fixes regardless of which jurisdiction surfaced them. Anything beyond the top ten goes in the Remaining Issues inventory at the back.

See where your site stands.

Run a free Veracly scan and get a multi-jurisdiction report — EAA, GDPR, ADA, UK Equality Act, AODA — with copy-paste developer fixes.


