Architecture
SOCTools is a collection of tools for collecting, enriching and analyzing logs and other security data, for sharing threat intelligence and for handling incidents. Many SOCs already have tools in place that they want to continue to use. A main design goal of SOCTools is therefore a flexible architecture that makes it simple to integrate existing tools, even if they are not directly supported by SOCTools. It is also easy to select which components of SOCTools to install.
High level architecture
[Figure: SOCTools high level architecture]
The high level architecture is shown in the figure above and consists of the following components:
- Data sources - the platform supports data from many common sources such as system logs, application logs and IDS alerts, and it is simple to add support for other sources. The main method for sending data into SOCTools is through Filebeat (a minimal stand-in for this is sketched after the list).
- High volume data sources - while the main platform can scale to high traffic volumes, it is in some cases more convenient to have a separate setup for very high volume data such as NetFlow. Some NRENs may also have an existing setup for this kind of data that they do not want to change. Such data sources have their own storage system. If real time processing is done on the data, the resulting alerts can be shipped to other components in the architecture.
- Data transport - Apache NiFi is the key component that collects data from the data sources, normalizes it, performs simple enrichment and then ships it to one or more of the other components in the architecture.
- Storage - in the current version all storage is done in Elasticsearch, but it is easy to change the data transport so that data is sent to other log analysis tools like Splunk or Humio.
- Manual analysis - in the current version Kibana is used for manual analysis of collected data.
- Enrichment - this component enriches the collected data either before or after storage. In the current version this is done as part of the data transport component, before data is sent to storage.
- Threat analysis - collects and analyzes threat intelligence data, and is a typical source of enrichment data. The current version uses MISP.
- Automatic analysis - automatic real time analysis of collected data, to be added in later versions of SOCTools. It can range from simple scripts checking thresholds to advanced machine learning algorithms.
- Incident response - TheHive and Cortex are used for this, and new cases can be created automatically from manual analysis in Kibana.
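As a minimal stand-in for the Filebeat-based ingestion mentioned above, the Python sketch below ships a single JSON-encoded event to a hypothetical NiFi TCP listener. The host name, port and field names are assumptions, not SOCTools defaults; in a real deployment Filebeat handles this.

```python
import json
import socket
from datetime import datetime, timezone

# Hypothetical NiFi listener endpoint; SOCTools normally receives data via Filebeat.
NIFI_HOST = "nifi.example.org"
NIFI_PORT = 5044

def send_event(message: str, source: str = "demo-app") -> None:
    """Ship one newline-delimited JSON event to the NiFi listener."""
    event = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "message": message,
        "source": source,
    }
    with socket.create_connection((NIFI_HOST, NIFI_PORT), timeout=10) as sock:
        sock.sendall((json.dumps(event) + "\n").encode("utf-8"))

send_event("user admin logged in from 192.0.2.10")
```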
Authentication
SOCTools uses Keycloak to provide single sign-on to all web interfaces of the various components.
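For illustration, a script can obtain a token from Keycloak's standard OpenID Connect token endpoint before calling one of the protected web interfaces. The base URL, realm and client ID below are placeholders for whatever the SOCTools installation configures:

```python
import requests

# Placeholder values; use the realm and client configured in your installation.
KEYCLOAK_URL = "https://keycloak.example.org"
REALM = "soctools"

def get_access_token(username: str, password: str, client_id: str = "soctools") -> str:
    """Fetch an OAuth2 access token via the OpenID Connect password grant."""
    # Newer Keycloak versions drop the "/auth" prefix from this path.
    resp = requests.post(
        f"{KEYCLOAK_URL}/auth/realms/{REALM}/protocol/openid-connect/token",
        data={
            "grant_type": "password",
            "client_id": client_id,
            "username": username,
            "password": password,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```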
NiFi pipeline
The main job of NiFi is to collect data from various sources, enrich it and send it to storage, which is currently Elasticsearch. The pipeline in NiFi is organized into two main process groups, "Data processing" and "Enrichment data".
Enrichment data
This process group is essentially a collection of "cron jobs" that run regularly to update the various enrichment data sets used by "Data processing" to enrich collected data. The current version supports the following enrichment data (a sketch of such a fetch job follows the list):
- Umbrella top 1 million domains - http://s3-us-west-1.amazonaws.com/umbrella-static/top-1m.csv.zip
- Alexa top 1 million - http://s3.amazonaws.com/alexa-static/top-1m.csv.zip
- Tor exit nodes - https://check.torproject.org/torbulkexitlist
- MaxMind GeoLite2-City database - Requires a free account. https://dev.maxmind.com/geoip/geoip2/geolite2/
- MISP - NiFi automatically downloads new IOCs from the MISP instance that is part of SOCTools. IP addresses and host names are then enriched to show whether they are registered in MISP.
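Outside NiFi, one of these scheduled updates could look like the sketch below, which refreshes the Tor exit node list from the URL given above. The output path and update interval are illustrative, not taken from SOCTools:

```python
import time
import urllib.request

# URL taken from the list above; the output path and interval are illustrative.
TOR_EXIT_LIST = "https://check.torproject.org/torbulkexitlist"
OUTPUT_FILE = "/var/lib/soctools/enrichment/tor-exit-nodes.txt"
UPDATE_INTERVAL = 3600  # seconds between refreshes

def refresh_tor_exit_nodes() -> int:
    """Download the current Tor exit node list and write it to disk."""
    with urllib.request.urlopen(TOR_EXIT_LIST, timeout=30) as resp:
        data = resp.read()
    with open(OUTPUT_FILE, "wb") as f:
        f.write(data)
    return len(data.splitlines())

if __name__ == "__main__":
    while True:
        count = refresh_tor_exit_nodes()
        print(f"Updated {count} Tor exit node entries")
        time.sleep(UPDATE_INTERVAL)
```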
Data processing
The processing group is split into three parts:
- Data input - receives data, normalizes it and converts it to JSON. This step also adds attributes to the data that specify which field names to enrich.
- Enrichment - enriches the data. It currently supports enriching IP addresses, domain names and fully qualified domain names (FQDNs); a sketch of this step follows the list.
- Data output - sends data to storage. In future versions data will also be sent to other tools doing real time stream processing of the data.
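As an illustration of what the enrichment step does with the data sets listed earlier, the sketch below adds GeoIP and Tor exit node fields to one IP address field of a JSON record. It assumes the geoip2 Python package, a downloaded GeoLite2-City database and the file layout from the earlier sketch; the actual SOCTools enrichment runs inside NiFi, not as Python:

```python
import geoip2.database  # pip install geoip2
import geoip2.errors

# Paths are illustrative; the GeoLite2 database requires a free MaxMind account.
GEOIP_DB = "/var/lib/soctools/enrichment/GeoLite2-City.mmdb"
TOR_EXIT_FILE = "/var/lib/soctools/enrichment/tor-exit-nodes.txt"

with open(TOR_EXIT_FILE) as f:
    tor_exit_nodes = {line.strip() for line in f if line.strip()}

reader = geoip2.database.Reader(GEOIP_DB)

def enrich_ip(record: dict, field: str) -> dict:
    """Add GeoIP and Tor exit node information for one IP field of a record."""
    ip = record.get(field)
    if not ip:
        return record
    record[f"{field}_tor_exit"] = ip in tor_exit_nodes
    try:
        geo = reader.city(ip)
        record[f"{field}_country"] = geo.country.iso_code
        record[f"{field}_city"] = geo.city.name
    except geoip2.errors.AddressNotFoundError:
        pass  # private or unknown address; leave the record without geo fields
    return record
```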
Each group contains a process group called "Custom ..." where it is possible to add new processors to the pipeline; these will not be overwritten when upgrading to newer versions of SOCTools.
Performance
The two components that determine the performance of SOCTools are Elasticsearch and Apache NiFi. Both are highly scalable by adding more nodes to the cluster.
There are reports of NiFi being scaled to handle petabytes of data per day in a large cluster (see the blog post "Processing one billion events per second with Apache NiFi"). The performance of NiFi depends heavily on the type and number of processors in the pipeline. The enrichment pipeline used in SOCTools is quite CPU intensive, but it uses record-based processing in NiFi, which means that multiple log entries of the same type are grouped together to improve performance.
Uninett uses Humio instead of Elasticsearch for storing logs, but has a pilot installation of Apache NiFi running the same pipeline as the one in SOCTools. The current setup is six virtual servers running on four physical servers. The hardware specification of each virtual server is:
- CPU: 12 cores
- Memory: 8GB
- Disk: 40GB
This setup processes around 7K events per second of production data during peak hours. During performance testing we were able to add a further 17K events per second of test traffic before NiFi started to show performance issues. This translates to more than 1.1TB of data per day.
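As a rough consistency check of these figures (the average event size below is an assumption chosen to match the reported volume, not a measured value):

```python
# Back-of-the-envelope check of the throughput figures above.
production_eps = 7_000   # production events per second at peak
test_eps = 17_000        # additional test traffic events per second
avg_event_bytes = 550    # assumed average event size; not a measured value

bytes_per_day = (production_eps + test_eps) * avg_event_bytes * 86_400
print(f"{bytes_per_day / 1e12:.2f} TB/day")  # ~1.14 TB/day, i.e. "more than 1.1TB"
```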