Doc cleanup/improvement (#432)

Co-authored-by: Anael Mobilia <anael.mobilia@mydsomanager.com>
Anael Mobilia
2023-10-11 23:24:50 +02:00
committed by GitHub
parent aaf269b11b
commit 732547e622
3 changed files with 39 additions and 37 deletions


@@ -1,15 +1,13 @@
# Elasticsearch and Kibana
:::{note}
Splunk is also supported starting with `parsedmarc` 4.3.0
:::
To set up visual dashboards of DMARC data, install Elasticsearch and Kibana.
:::{note}
Elasticsearch and Kibana 6 or later are required
:::
## Installation
On Debian/Ubuntu based systems, run:
```bash
@@ -126,7 +124,7 @@ server.ssl.certificate: /etc/kibana/kibana.crt
server.ssl.key: /etc/kibana/kibana.key
```
:::{note}
For more security, you can configure Kibana to use a local network connection
to Elasticsearch:
@@ -136,6 +134,7 @@ elasticsearch.hosts: ['https://SERVER_IP:9200']
```text
elasticsearch.hosts: ['https://127.0.0.1:9200']
```
:::
```bash
sudo systemctl restart kibana
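# Optional sanity check (an addition, not from the original docs): confirm that
# Kibana answers on its default port 5601 behind the self-signed certificate
# configured above; -k tells curl to skip verification of that certificate.
curl -kI https://127.0.0.1:5601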


@@ -1,15 +1,17 @@
# Installation
## Prerequisites
`parsedmarc` works with Python 3 only.
### Testing multiple report analyzers
If you would like to test `parsedmarc` and another report processing
solution at the same time, you can have up to two `mailto` URIs in each of the `rua` and `ruf`
tags in your DMARC record, separated by commas.
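As an illustration, a DMARC record that feeds reports to both `parsedmarc` and a second analyzer could look like the sketch below (the domains and addresses are placeholders, not values from the documentation):
```text
_dmarc.example.com. IN TXT "v=DMARC1; p=none; rua=mailto:dmarc@example.com,mailto:dmarc@vendor.example; ruf=mailto:dmarc@example.com,mailto:dmarc@vendor.example"
```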
### Using a web proxy
:::{note}
If your system is behind a web proxy, you need to configure your system
to use that proxy. To do this, edit `/etc/environment` and add your
proxy details there, for example:
@@ -29,18 +31,17 @@ ftp_proxy=http://prox-server:3128
```
This will set the proxy up for use system-wide, including for `parsedmarc`.
:::
### Using Microsoft Exchange
If your mail server is Microsoft Exchange, ensure that it is patched to at
least:
- Exchange Server 2010 Update Rollup 22 ([KB4295699])
- Exchange Server 2013 Cumulative Update 21 ([KB4099855])
- Exchange Server 2016 Cumulative Update 11 ([KB4134118])
### geoipupdate setup
:::{note}
Starting in `parsedmarc` 7.1.0, a static copy of the
@@ -84,9 +85,8 @@ The latest builds for Linux, macOS, and Windows can be downloaded
from the [geoipupdate releases page on GitHub].
On December 30th, 2019, MaxMind started requiring free accounts to
access the free Geolite2 databases, in order
[to comply with various privacy regulations].
Start by [registering for a free GeoLite2 account] and signing in.
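Once you have an account and a license key, `geoipupdate` reads them from its configuration file (typically `/etc/GeoIP.conf`, although the path can vary by install method). A minimal sketch with placeholder credentials, assuming only the country-level edition is needed, might look like:
```text
AccountID 123456
LicenseKey 0123456789abcdef
EditionIds GeoLite2-Country
```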
@@ -146,7 +146,7 @@ sudo dnf install -y python39 python3-virtualenv python3-setuptools python3-devel
```
Python 3 installers for Windows and macOS can be found at
<https://www.python.org/downloads/>.
Create a system user
@@ -191,6 +191,9 @@ On Debian or Ubuntu systems, run:
sudo apt-get install libemail-outlook-message-perl
```
[KB4295699]: https://support.microsoft.com/KB/4295699
[KB4099855]: https://support.microsoft.com/KB/4099855
[KB4134118]: https://support.microsoft.com/kb/4134118
[Component "contrib"]: https://wiki.debian.org/SourcesList#Component
[geoipupdate]: https://github.com/maxmind/geoipupdate
[geoipupdate releases page on github]: https://github.com/maxmind/geoipupdate/releases


@@ -111,17 +111,17 @@ The full set of configuration options are:
- `forensic_json_filename` - str: filename for the forensic
JSON output file
- `ip_db_path` - str: An optional custom path to an MMDB file
from MaxMind or DBIP
- `offline` - bool: Do not use online queries for geolocation
or DNS
- `nameservers` - str: A comma separated list of
DNS resolvers (Default: `[Cloudflare's public resolvers]`)
- `dns_timeout` - float: DNS timeout period
- `debug` - bool: Print debugging messages
- `silent` - bool: Only print errors (Default: `True`)
- `log_file` - str: Write log messages to a file at this path
- `n_procs` - int: Number of processes to run in parallel when
parsing in CLI mode (Default: `1`)
- `chunk_size` - int: Number of files to give to each process
when running in parallel.
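To make these option names concrete, here is a rough sketch of how a few of them could appear in the `[general]` section of a `parsedmarc.ini` file (the section layout and all values are illustrative placeholders, not taken from this page):
```ini
[general]
save_aggregate = True
save_forensic = False
offline = False
nameservers = 1.1.1.1,1.0.0.1
dns_timeout = 2.0
silent = True
n_procs = 1
```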
@@ -134,7 +134,7 @@ The full set of configuration options are:
- `reports_folder` - str: The mailbox folder (or label for
Gmail) where the incoming reports can be found
(Default: `INBOX`)
- `archive_folder` - str: The mailbox folder (or label for
Gmail) to sort processed emails into (Default: `Archive`)
- `watch` - bool: Use the IMAP `IDLE` command to process
messages as they arrive or poll MS Graph for new messages
@@ -165,7 +165,7 @@ The full set of configuration options are:
:::
- `ssl` - bool: Use an encrypted SSL/TLS connection
(Default: `True`)
- `skip_certificate_verification` - bool: Skip certificate
verification (not recommended)
- `user` - str: The IMAP user
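In the same spirit, a minimal `[imap]` section might be sketched as follows; the `host` and `password` keys do not appear in the excerpt above and are assumptions, and all values are placeholders:
```ini
[imap]
host = imap.example.com
user = dmarcreports@example.com
password = replace-with-a-real-password
ssl = True
```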
@@ -188,7 +188,7 @@ The full set of configuration options are:
- `token_file` - str: Path to save the token file
(Default: `.token`)
- `allow_unencrypted_storage` - bool: Allows the Azure Identity
module to fall back to unencrypted token cache (Default: `False`).
Even if enabled, the cache will always try encrypted storage first.
:::{note}
@@ -245,14 +245,14 @@ The full set of configuration options are:
- `hosts` - str: A comma separated list of Kafka hosts
- `user` - str: The Kafka user
- `password` - str: The Kafka password
- `ssl` - bool: Use an encrypted SSL/TLS connection (Default: `True`)
- `skip_certificate_verification` - bool: Skip certificate
verification (not recommended)
- `aggregate_topic` - str: The Kafka topic for aggregate reports
- `forensic_topic` - str: The Kafka topic for forensic reports
- `smtp`
- `host` - str: The SMTP hostname
- `port` - int: The SMTP port (Default: `25`)
- `ssl` - bool: Require SSL/TLS instead of using STARTTLS
- `skip_certificate_verification` - bool: Skip certificate
verification (not recommended)
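Likewise, an `[smtp]` output section built from the options listed above might be sketched as (all values are placeholders):
```ini
[smtp]
host = mail.example.com
port = 25
ssl = False
skip_certificate_verification = False
```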
@@ -272,26 +272,26 @@ The full set of configuration options are:
:::
- `s3`
- `bucket` - str: The S3 bucket name
- `path` - str: The path to upload reports to (Default: `/`)
- `region_name` - str: The region name (Optional)
- `endpoint_url` - str: The endpoint URL (Optional)
- `access_key_id` - str: The access key id (Optional)
- `secret_access_key` - str: The secret access key (Optional)
- `syslog`
- `server` - str: The Syslog server name or IP address
- `port` - int: The UDP port to use (Default: `514`)
- `gmail_api`
- `credentials_file` - str: Path to file containing the
credentials, None to disable (Default: `None`)
- `token_file` - str: Path to save the token file
(Default: `.token`)
- `include_spam_trash` - bool: Include messages in Spam and
Trash when searching reports (Default: `False`)
- `scopes` - str: Comma separated list of scopes to use when
acquiring credentials
(Default: `https://www.googleapis.com/auth/gmail.modify`)
- `oauth2_port` - int: The TCP port for the local server to
listen on for the OAuth2 response (Default: `8080`)
- `log_analytics`
- `client_id` - str: The app registration's client ID
- `client_secret` - str: The app registration's client secret
@@ -315,7 +315,7 @@ The `nameservers` option should only be used if your network
blocks DNS requests to outside resolvers.
:::
:::{note}
`save_aggregate` and `save_forensic` are separate options
because you may not want to save forensic reports
(also known as failure reports) to your Elasticsearch instance,