mirror of
https://github.com/domainaware/parsedmarc.git
synced 2026-04-26 23:39:32 +00:00
Rebased on top of master @ 2cda5bf (9.9.0), which added the ASN
source attribution work (#712, #713, #714, #715). Individual Copilot
iteration commits squashed into this single commit — the per-commit
history on the feature branch was iterative (add tests, fix lint,
move field, revert, etc.) and not worth preserving; GitHub squash-
merges PRs anyway.
New fields from the DMARCbis XSD, plumbed through types, parsing, CSV
output, and the Elasticsearch / OpenSearch mappings:
- ``np`` — non-existent subdomain policy (``none`` / ``quarantine`` /
``reject``)
- ``testing`` — testing mode flag (``n`` / ``y``), replaces RFC 7489
``pct``
- ``discovery_method`` — policy discovery method (``psl`` /
``treewalk``)
- ``generator`` — report generator software identifier (metadata)
- ``human_result`` — optional descriptive text on DKIM / SPF results
RFC 7489 reports parse with ``None`` for DMARCbis-only fields.
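The field behavior above can be illustrated with a small stdlib sketch. The XML fragment follows the DMARCbis ``policy_published`` shape described by the new fields; this is illustrative only, not parsedmarc's actual parsing code:

.. code-block:: python

    import xml.etree.ElementTree as ET

    # Hypothetical DMARCbis policy_published fragment showing the new fields.
    xml = """<policy_published>
      <domain>example.com</domain>
      <p>reject</p>
      <sp>quarantine</sp>
      <np>reject</np>
      <testing>n</testing>
      <discovery_method>treewalk</discovery_method>
    </policy_published>"""

    root = ET.fromstring(xml)

    def field(name):
        # Return the element text, or None when the report omits the field
        # (as RFC 7489 reports do for DMARCbis-only fields).
        el = root.find(name)
        return el.text if el is not None else None

    policy = {
        name: field(name)
        for name in ("p", "sp", "np", "testing", "discovery_method", "pct")
    }
    print(policy["np"], policy["pct"])  # "reject" and None

An RFC 7489 report would yield ``None`` for ``np``, ``testing``, and ``discovery_method`` in the same way ``pct`` is ``None`` here.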
Forensic reports have been renamed to failure reports throughout the
project, matching the proper terminology used since RFC 7489.
- Core: ``types.py``, ``__init__.py`` — ``ForensicReport`` →
``FailureReport``, ``parse_forensic_report`` →
``parse_failure_report``, report type ``"failure"``.
- Output modules: ``elastic.py``, ``opensearch.py``, ``splunk.py``,
``kafkaclient.py``, ``syslog.py``, ``gelf.py``, ``webhook.py``,
``loganalytics.py``, ``s3.py``.
- CLI: ``cli.py`` — args, config keys, index names
(``dmarc_failure``).
- Docs + dashboards: all markdown, Grafana JSON, Kibana NDJSON,
Splunk XML.
Backward compatibility preserved: old function / type names remain as
aliases (``parse_forensic_report = parse_failure_report``,
``ForensicReport = FailureReport``, etc.), CLI accepts both the old
(``save_forensic``, ``forensic_topic``) and new (``save_failure``,
``failure_topic``) config keys, and updated dashboards query both
old and new index / sourcetype names so data from before and after
the rename appears together.
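The aliasing pattern can be sketched as follows (simplified stand-ins, not the real parsedmarc definitions):

.. code-block:: python

    # Minimal sketch of the backward-compatibility aliasing pattern.
    class FailureReport:
        """Current name for what RFC 7489-era code called a forensic report."""

        def __init__(self, sample: str) -> None:
            self.sample = sample


    def parse_failure_report(source: str) -> FailureReport:
        # Real parsing elided; only the aliasing pattern matters here.
        return FailureReport(source)


    # Old names kept as plain aliases, so existing imports and
    # isinstance() checks against the old type keep working.
    ForensicReport = FailureReport
    parse_forensic_report = parse_failure_report

    report = parse_forensic_report("raw message")
    print(isinstance(report, ForensicReport))  # True

Because the alias binds the same object rather than subclassing, ``ForensicReport is FailureReport`` holds and no behavior can diverge between the two names.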
Merge conflicts resolved in ``parsedmarc/constants.py`` (took bis's
10.0.0 bump), ``parsedmarc/__init__.py`` (combined bis's "failure"
wording with master's IPinfo MMDB mention), ``parsedmarc/elastic.py``
and ``parsedmarc/opensearch.py`` (kept master's ``source_asn`` /
``source_asn_name`` / ``source_asn_domain`` on the failure doc path
while renaming ``forensic_report`` → ``failure_report``), and
``CHANGELOG.md`` (10.0.0 entry now sits above the 9.9.0 entry).
All 324 tests pass; ``ruff check`` / ``ruff format --check`` clean.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
===================
Splunk Installation
===================

Install Splunk for use with Docker
----------------------------------

Download the latest Splunk image::

    docker pull splunk/splunk:latest

Run Splunk with Docker
----------------------

Listen on all network interfaces::

    docker run -d -p 8000:8000 -p 8088:8088 \
      -e "SPLUNK_START_ARGS=--accept-license" \
      -e "SPLUNK_PASSWORD=password1234" \
      -e "SPLUNK_HEC_TOKEN=hec-token-1234" \
      --name splunk splunk/splunk:latest

Listen on localhost only, for use behind a reverse proxy with base URL
``/splunk``::

    docker run -d -p 127.0.0.1:8000:8000 -p 127.0.0.1:8088:8088 \
      -e "SPLUNK_START_ARGS=--accept-license" \
      -e "SPLUNK_PASSWORD=password1234" \
      -e "SPLUNK_HEC_TOKEN=hec-token-1234" \
      -e "SPLUNK_ROOT_ENDPOINT=/splunk" \
      --name splunk splunk/splunk:latest

Set up a reverse proxy, e.g. Apache2::

    ProxyPass /splunk http://127.0.0.1:8000/splunk
    ProxyPassReverse /splunk http://127.0.0.1:8000/splunk

Splunk Configuration
--------------------

Access the web UI at http://127.0.0.1:8000 and log in as ``admin`` with
password ``password1234``.

Create App and Index
~~~~~~~~~~~~~~~~~~~~

- Settings > Data > Indexes: New Index

  - Index name: "email"

- The HEC token ``hec-token-1234`` should already be set up; check under
  Settings > Data > Data inputs: HTTP Event Collector
- Apps > Manage Apps: Create app

  - Name: "parsedmarc"
  - Folder name: "parsedmarc"

Create Dashboards
~~~~~~~~~~~~~~~~~

1. Navigate to the app you want to add the dashboards to, or create a new
   app called DMARC
2. Click Dashboards
3. Click Create New Dashboard
4. Use a descriptive title, such as "Aggregate DMARC Data"
5. Click Create Dashboard
6. Click on the Source button
7. Paste the content of ``dmarc_aggregate_dashboard.xml`` into the source
   editor
8. If the index storing the DMARC data is not named "email", replace
   ``index="email"`` accordingly
9. Click Save
10. Click Dashboards
11. Click Create New Dashboard
12. Use a descriptive title, such as "Failure DMARC Data"
13. Click Create Dashboard
14. Click on the Source button
15. Paste the content of ``dmarc_failure_dashboard.xml`` into the source
    editor
16. If the index storing the DMARC data is not named "email", replace
    ``index="email"`` accordingly
17. Click Save

==============
Example Config
==============

parsedmarc.ini::

    [splunk_hec]
    url = https://127.0.0.1:8088/
    token = hec-token-1234
    index = email
    skip_certificate_verification = True

Note that ``skip_certificate_verification = True`` disables security
checks; use it only for local testing.

Run parsedmarc::

    python3 -m parsedmarc.cli -c parsedmarc.ini
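For reference, an event submission built from the config above looks roughly like this. The ``/services/collector/event`` endpoint and the ``Splunk <token>`` authorization scheme come from Splunk's HEC documentation; the sourcetype and event fields are illustrative assumptions, not parsedmarc's exact payload:

.. code-block:: python

    import json

    # Values from the parsedmarc.ini example above.
    hec_url = "https://127.0.0.1:8088/services/collector/event"
    token = "hec-token-1234"

    # HEC authenticates with the "Splunk" scheme in the Authorization header.
    headers = {
        "Authorization": f"Splunk {token}",
        "Content-Type": "application/json",
    }

    # HEC wraps each event in an envelope; "event" holds the actual record.
    # The sourcetype and event body here are placeholders for illustration.
    payload = {
        "index": "email",
        "sourcetype": "dmarc:aggregate",
        "event": {"domain": "example.com", "disposition": "none"},
    }
    body = json.dumps(payload)

    # With skip_certificate_verification = True, this request would be sent
    # over TLS without validating the server certificate.
    print(headers["Authorization"])  # Splunk hec-token-1234

If events do not appear in the ``email`` index, checking that the token and index in the envelope match the HEC token and index created earlier is a good first step.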