Files
parsedmarc/splunk/dmarc_failure_dashboard.xml
Sean Whalen d7366d088f Add DMARCbis report support; rename forensic→failure project-wide
Rebased on top of master @ 2cda5bf (9.9.0), which added the ASN
source attribution work (#712, #713, #714, #715). Individual Copilot
iteration commits squashed into this single commit — the per-commit
history on the feature branch was iterative (add tests, fix lint,
move field, revert, etc.) and not worth preserving; GitHub squash-
merges PRs anyway.

### DMARCbis fields (new)

New fields from the DMARCbis XSD, plumbed through types, parsing, CSV
output, and the Elasticsearch / OpenSearch mappings:

- ``np`` — non-existent subdomain policy (``none`` / ``quarantine`` /
  ``reject``)
- ``testing`` — testing mode flag (``n`` / ``y``), replaces RFC 7489
  ``pct``
- ``discovery_method`` — policy discovery method (``psl`` /
  ``treewalk``)
- ``generator`` — report generator software identifier (metadata)
- ``human_result`` — optional descriptive text on DKIM / SPF results

RFC 7489 reports parse with ``None`` for DMARCbis-only fields.
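The fallback behavior can be sketched in a few lines. This is an illustrative stand-alone example, not parsedmarc's actual implementation: it reads the DMARCbis ``policy_published`` fields from an aggregate report and relies on ``findtext`` returning ``None`` for absent elements, so RFC 7489 reports parse cleanly with ``None`` for the bis-only fields.

```python
# Illustrative sketch (not parsedmarc's real parser): DMARCbis policy
# fields default to None when absent, so RFC 7489 reports still parse.
import xml.etree.ElementTree as ET

DMARCBIS_POLICY_FIELDS = ("np", "testing", "discovery_method")


def parse_policy_published(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    policy = root.find("policy_published")
    result = {"domain": policy.findtext("domain"), "p": policy.findtext("p")}
    for field in DMARCBIS_POLICY_FIELDS:
        # findtext() returns None when the element is missing
        result[field] = policy.findtext(field)
    return result


rfc7489 = (
    "<feedback><policy_published><domain>example.com</domain>"
    "<p>reject</p><pct>100</pct></policy_published></feedback>"
)
dmarcbis = (
    "<feedback><policy_published><domain>example.com</domain><p>reject</p>"
    "<np>quarantine</np><testing>n</testing>"
    "<discovery_method>treewalk</discovery_method></policy_published></feedback>"
)

print(parse_policy_published(rfc7489))   # np / testing / discovery_method are None
print(parse_policy_published(dmarcbis))  # all DMARCbis fields populated
```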

### Forensic → failure rename

Forensic reports have been renamed to failure reports throughout the
project, matching the terminology used since RFC 7489.

- Core: ``types.py``, ``__init__.py`` — ``ForensicReport`` →
  ``FailureReport``, ``parse_forensic_report`` →
  ``parse_failure_report``, report type ``"failure"``.
- Output modules: ``elastic.py``, ``opensearch.py``, ``splunk.py``,
  ``kafkaclient.py``, ``syslog.py``, ``gelf.py``, ``webhook.py``,
  ``loganalytics.py``, ``s3.py``.
- CLI: ``cli.py`` — args, config keys, index names
  (``dmarc_failure``).
- Docs + dashboards: all markdown, Grafana JSON, Kibana NDJSON,
  Splunk XML.

Backward compatibility preserved: old function / type names remain as
aliases (``parse_forensic_report = parse_failure_report``,
``ForensicReport = FailureReport``, etc.), CLI accepts both the old
(``save_forensic``, ``forensic_topic``) and new (``save_failure``,
``failure_topic``) config keys, and updated dashboards query both
old and new index / sourcetype names so data from before and after
the rename appears together.
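The shim pattern above can be sketched as follows. The names mirror the commit message; the bodies are hypothetical placeholders, not parsedmarc's actual code:

```python
# Hypothetical sketch of the backward-compatibility shims described above.
class FailureReport:
    def __init__(self, sample=None):
        self.sample = sample


def parse_failure_report(raw):
    return FailureReport(sample=raw)


# Old names stay importable as plain aliases of the new ones.
ForensicReport = FailureReport
parse_forensic_report = parse_failure_report


def get_config(config: dict, new_key: str, old_key: str, default=None):
    """Accept both the new and the legacy config key, preferring the new."""
    if new_key in config:
        return config[new_key]
    return config.get(old_key, default)


# A config written before the rename still resolves:
legacy = {"save_forensic": True, "forensic_topic": "dmarc_forensic"}
assert get_config(legacy, "save_failure", "save_forensic") is True
```

Plain aliasing keeps old ``from parsedmarc import ForensicReport`` imports working with zero maintenance cost, while the key-resolution helper lets both generations of config files coexist.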

### Rebase notes

Merge conflicts resolved in ``parsedmarc/constants.py`` (took bis's
10.0.0 bump), ``parsedmarc/__init__.py`` (combined bis's "failure"
wording with master's IPinfo MMDB mention), ``parsedmarc/elastic.py``
and ``parsedmarc/opensearch.py`` (kept master's ``source_asn`` /
``source_asn_name`` / ``source_asn_domain`` on the failure doc path
while renaming ``forensic_report`` → ``failure_report``), and
``CHANGELOG.md`` (10.0.0 entry now sits above the 9.9.0 entry).

All 324 tests pass; ``ruff check`` / ``ruff format --check`` clean.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:26:30 -04:00


<form theme="dark" version="1.1">
<label>Failure DMARC Data</label>
<search id="base_search">
<query>
index="email" (sourcetype="dmarc:failure" OR sourcetype="dmarc:forensic") parsed_sample.headers.From="$header_from$" parsed_sample.headers.To="$header_to$" parsed_sample.headers.Subject="$header_subject$" source.ip_address="$source_ip_address$" source.reverse_dns="$source_reverse_dns$" source.country="$source_country$"
| table *
</query>
<earliest>$time_range.earliest$</earliest>
<latest>$time_range.latest$</latest>
</search>
<fieldset submitButton="false" autoRun="true">
<input type="text" token="header_from" searchWhenChanged="true">
<label>Message header from</label>
<default>*</default>
</input>
<input type="text" token="header_to" searchWhenChanged="true">
<label>Message header to</label>
<default>*</default>
</input>
<input type="text" token="header_subject" searchWhenChanged="true">
<label>Message header subject</label>
<default>*</default>
</input>
<input type="text" token="source_ip_address" searchWhenChanged="true">
<label>Source IP address</label>
<default>*</default>
</input>
<input type="text" token="source_reverse_dns" searchWhenChanged="true">
<label>Source reverse DNS</label>
<default>*</default>
</input>
<input type="text" token="source_country" searchWhenChanged="true">
<label>Source country ISO code</label>
<default>*</default>
</input>
<input type="time" token="time_range" searchWhenChanged="true">
<label>Time range</label>
<default>
<earliest>-90d@d</earliest>
<latest>now</latest>
</default>
</input>
</fieldset>
<row>
<panel>
<title>Failure samples</title>
<table>
<search base="base_search">
<query>| table arrival_date_utc, authentication_results, parsed_sample.headers.From, parsed_sample.headers.To, parsed_sample.headers.Subject | sort -arrival_date_utc</query>
</search>
<option name="drilldown">none</option>
<option name="refresh.display">progressbar</option>
<option name="totalsRow">false</option>
<format type="number" field="count">
<option name="precision">0</option>
</format>
</table>
</panel>
</row>
<row>
<panel>
<title>Failure samples by country</title>
<map>
<search base="base_search">
<query>| iplocation source.ip_address | stats count by Country | geom geo_countries featureIdField="Country"</query>
</search>
<option name="drilldown">none</option>
<option name="height">519</option>
<option name="mapping.type">choropleth</option>
</map>
</panel>
</row>
<row>
<panel>
<title>Failure samples by IP address</title>
<table>
<search base="base_search">
<query>| iplocation source.ip_address | stats count by source.ip_address,source.reverse_dns | sort -count</query>
</search>
<option name="drilldown">none</option>
<option name="refresh.display">progressbar</option>
<format type="number" field="count">
<option name="precision">0</option>
</format>
</table>
</panel>
<panel>
<title>Failure samples by country ISO code</title>
<table>
<search base="base_search">
<query>| stats count by source.country | sort -count</query>
</search>
<option name="drilldown">none</option>
<format type="number" field="count">
<option name="precision">0</option>
</format>
</table>
</panel>
</row>
</form>