Description
Now that we have integrations to ingest contextual user/group/device data from Microsoft Entra ID (formerly Azure AD), we need to ensure that users who have moved to Entra ID but are still running Active Directory on-premises can ingest the same user/group/device data.
Our Active Directory Entity Analytics integration should be able to connect to Active Directory (presumably via LDAP) and ingest data such as usernames, department, title, group membership, last login, locked-out status, last password change date, and more. We can explore the full set of fields once we've figured out the LDAP connection and what fields are available to us.
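As a rough sketch of what that LDAP connection could look like, the snippet below uses the go-ldap library to bind to a domain controller and pull the attributes mentioned above. This is only an illustration, not the input's implementation; the host, bind DN, base DN, and credential handling are all placeholder assumptions.

```go
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/go-ldap/ldap/v3"
)

func main() {
	// Assumed domain controller and service account; adjust for your environment.
	conn, err := ldap.DialURL("ldaps://dc01.example.com:636")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	if err := conn.Bind("CN=svc-entity,CN=Users,DC=example,DC=com", os.Getenv("AD_PASSWORD")); err != nil {
		log.Fatal(err)
	}

	req := ldap.NewSearchRequest(
		"DC=example,DC=com", // assumed base DN for the domain
		ldap.ScopeWholeSubtree, ldap.NeverDerefAliases, 0, 0, false,
		"(&(objectClass=user)(objectCategory=person))",
		[]string{
			"sAMAccountName",     // username
			"department",         // department
			"title",              // job title
			"memberOf",           // group membership (group DNs)
			"lastLogonTimestamp", // last login (replicated; can lag up to ~14 days)
			"lockoutTime",        // locked-out status
			"pwdLastSet",         // last password change
		},
		nil,
	)
	res, err := conn.Search(req)
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range res.Entries {
		fmt.Println(e.GetAttributeValue("sAMAccountName"), e.GetAttributeValues("memberOf"))
	}
}
```

Note that `lastLogonTimestamp` is the replicated variant of `lastLogon`, so it is convenient to read from any DC but is only approximately current; which attribute to use is one of the details to settle once the connection work starts.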
As a user of the Security Solution, I want to continuously sync user metadata from Active Directory into Elasticsearch. Data is produced in accordance with RFC 2022-09-07-user-host-entity-ingestion.
Acceptance Criteria
- Collect user data from Active Directory via a new provider in the entity_analytics Filebeat input.
- Generate one document per user that includes group membership data.
- Periodically fetch the list of users that were modified or deprovisioned and generate new documents.
- Persist sync state to disk so the input can resume from its previous position (see the sketch after this list).
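The incremental part of these criteria could work roughly as sketched below: keep a timestamp cursor on disk, and on each pass ask AD only for users whose `whenChanged` is at or after the cursor (which covers both modified and disabled/deprovisioned accounts). This is an assumption-laden sketch, not the actual provider; the state file path, DNs, and host are hypothetical.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
	"time"

	"github.com/go-ldap/ldap/v3"
)

// state is the cursor persisted to disk between runs so the input
// can resume from the previous sync point.
type state struct {
	LastSync time.Time `json:"last_sync"`
}

func loadState(path string) state {
	var s state
	if b, err := os.ReadFile(path); err == nil {
		_ = json.Unmarshal(b, &s)
	}
	return s
}

func saveState(path string, s state) error {
	b, err := json.Marshal(s)
	if err != nil {
		return err
	}
	return os.WriteFile(path, b, 0o600)
}

func main() {
	const statePath = "ad_sync_state.json" // hypothetical location
	s := loadState(statePath)
	started := time.Now().UTC()

	conn, err := ldap.DialURL("ldaps://dc01.example.com:636") // assumed DC address
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	if err := conn.Bind("CN=svc-entity,CN=Users,DC=example,DC=com", os.Getenv("AD_PASSWORD")); err != nil {
		log.Fatal(err)
	}

	// First run: full sync. Later runs: only users changed since the cursor.
	// AD filters expect generalized time, e.g. 20240102150405.0Z.
	filter := "(&(objectClass=user)(objectCategory=person))"
	if !s.LastSync.IsZero() {
		filter = fmt.Sprintf(
			"(&(objectClass=user)(objectCategory=person)(whenChanged>=%s.0Z))",
			s.LastSync.UTC().Format("20060102150405"))
	}

	req := ldap.NewSearchRequest(
		"DC=example,DC=com", // assumed base DN
		ldap.ScopeWholeSubtree, ldap.NeverDerefAliases, 0, 0, false,
		filter,
		// userAccountControl bit 0x2 (ACCOUNTDISABLE) flags deprovisioned accounts.
		[]string{"sAMAccountName", "memberOf", "userAccountControl", "whenChanged"},
		nil,
	)
	res, err := conn.Search(req)
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range res.Entries {
		// One document per user, including group membership (memberOf).
		fmt.Printf("user=%s groups=%v\n",
			e.GetAttributeValue("sAMAccountName"),
			e.GetAttributeValues("memberOf"))
	}

	// Only advance the cursor after a successful pass.
	if err := saveState(statePath, state{LastSync: started}); err != nil {
		log.Fatal(err)
	}
}
```

A real implementation would paginate large result sets and probably store richer state than a single timestamp, but the cursor-and-filter shape should carry over.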
Integration release checklist
This checklist is intended for integration maintainers to ensure consistency
when creating or updating a Package, Module, or Dataset for an Integration.
All changes
- Change follows the contributing guidelines
- Supported versions of the monitoring target are documented
- Supported operating systems are documented (if applicable)
- Integration or System tests exist
- Documentation exists
- Fields follow ECS and naming conventions
- At least a manual test with ES / Kibana / Agent has been performed.
- Required Kibana version set to: v8.13.0
New Package
- Screenshot of the "Add Integration" page on Fleet added
Dashboards changes
- Dashboards exist
- Screenshots added or updated
- Data stream filters added to visualizations
Log dataset changes
- Pipeline tests exist (if applicable)
- Generated output for at least 1 log file exists
- Sample event (`sample_event.json`) exists