How ELSA works

ELSA stands for Enterprise Log Search and Archive. It's a powerful syslog framework built on Syslog-NG, MySQL, and Sphinx full-text search, and it's one of the main tools I rely on when using Security Onion. In a previous post I covered the big picture of Security Onion; in this one I'm going to focus on the details of how ELSA works within it. The main reason for this article is to clarify the differences between a standard ELSA installation and the one shipped with Security Onion, where people (myself included) can get confused by differing file paths and configuration details.

(ELSA architecture diagram)

Legend

  • clouds – log sources
  • big rectangles – scripts and programs
  • small rectangles – user interfaces
  • circles – data formats

Description

1. The log sources are defined in syslog-ng using the following basic template:

source s_import { file("/path/to/file" flags(expect-hostname syslog-protocol)); };

Example of network log definition:

source s_network {
    tcp();
    udp();
};

Example of import definition using a pipe:

source s_import { pipe("/nsm/elsa/data/elsa/tmp/import" flags(expect-hostname syslog-protocol)); };

Example of BRO log on disk definition:

source s_bro_dns { file("/nsm/bro/logs/current/dns.log" flags(no-parse) program_override("bro_dns")); };

2. Syslog-ng relies on the configuration file found in /etc/syslog-ng/syslog-ng.conf. The standard syslog-ng.conf file for ELSA can be found on GitHub.

Syslog-NG writes raw files to /nsm/elsa/data/elsa/tmp/buffers/ and loads them into the index and archive tables.

The definitions posted below must be used in this file so that syslog-ng knows what, how, and where to process the log files.

Logs are processed by syslog-ng using parsing templates (which rely on parsers), defined like so:

template t_db_parsed { template("$R_UNIXTIME\t$HOST\t$PROGRAM\t${.classifier.class}\t$MSGONLY\t${i0}\t${i1}\t${i2}\t${i3}\t${i4}\t${i5}\t${s0}\t${s1}\t${s2}\t${s3}\t${s4}\t${s5}\n"); };
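For reference, a record in this layout has 17 tab-separated columns: timestamp, host, program, class, message, the integer capture slots i0–i5, and the string capture slots s0–s5. The sketch below prints one such record; every field value in it is invented for demonstration:

```shell
# Illustrative only: emit one record in the t_db_parsed layout.
# All values below are made up; i0-i5 default to 0 and s0-s5 are
# shown as "-" where the parser captured nothing.
printf '%s\t%s\t%s\t%s\t%s\t0\t0\t0\t0\t0\t0\t-\t-\t-\t-\t-\t-\n' \
  1396008214 192.168.1.10 bro_dns 10 'example DNS query message'
```

Counting the tab-separated columns of lines written to a file destination is a quick sanity check that a parsing template is producing what you expect.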

Parsed logs are then sent to a destination, defined by a processing script and a parsing template, like the following:

destination d_elsa { program("perl /opt/elsa/node/elsa.pl -c /etc/elsa_node.conf" template(t_db_parsed)); };

A destination can also be a file to which the logs processed by the parsing template are saved. This is useful for checking that a parsing template works as expected, and is defined like below:

destination d_debug { file("/nsm/elsa/data/elsa/tmp/debug" template(t_db_parsed_import)); };

In order for syslog-ng to process the logs using the elements defined above (sources, parsing templates, destinations), a log statement must be defined. The following is a minimal example of a log statement in syslog-ng.conf using those elements.

log {
    source(s_bro_dns);
    parser(p_db);
    destination(d_elsa);
    log { destination(d_debug); };
};

3. One of the parsers that syslog-ng's parsing templates rely on is PatternDB, the main and default parser in Security Onion.

The parser is defined in syslog-ng.conf by referencing a file:

parser p_db { db-parser(file("/opt/elsa/node/conf/patterndb.xml")); };
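To give an idea of what such a file contains, here is a minimal illustrative ruleset — written from scratch, not taken from the actual patterndb.xml — that classifies sshd password logins and fills the s0/s1 and i0 capture slots used by the parsing template above:

```xml
<!-- Illustrative PatternDB v4 ruleset; the real patterndb.xml ships with ELSA
     and is considerably larger. Names and ids here are examples. -->
<patterndb version="4" pub_date="2014-01-01">
  <ruleset name="sshd-example" id="ruleset-sshd-example">
    <!-- matched against the $PROGRAM field of incoming messages -->
    <pattern>sshd</pattern>
    <rules>
      <rule provider="example" id="rule-sshd-accepted" class="accepted">
        <patterns>
          <!-- s0 = user, s1 = source IP, i0 = source port -->
          <pattern>Accepted password for @ESTRING:s0: @from @ESTRING:s1: @port @NUMBER:i0:@ ssh2</pattern>
        </patterns>
      </rule>
    </rules>
  </ruleset>
</patterndb>
```

The class attribute is what ends up in the ${.classifier.class} macro of the t_db_parsed template.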

The standard patterndb.xml file can be found in the ELSA GitHub repository.

4. The destination for log files is the main ELSA script (/opt/elsa/node/elsa.pl).

The script forwards the logs into the MySQL database, which stores its data in database files (/nsm/elsa/data/elsa/mysql/). The main configuration file for ELSA is elsa_node.conf (/etc/elsa_node.conf). Log entries related to the ELSA node can be found in /nsm/elsa/data/elsa/log/node.log.

5. Sphinx handles the indexing and is the main reason why ELSA queries are so fast.

Sphinx works by creating temporary and permanent indexes (/nsm/elsa/data/sphinx/). To check what Sphinx is doing, consult its log file at /nsm/elsa/data/elsa/log/searchd.log.

6. Importing into ELSA

ELSA uses the import.pl script (/opt/elsa/node/import.pl) to handle its imports. Example usage of the import script:

/opt/elsa/node/import.pl -f "bro" -d "comment" "/path/to/logfile"

The script uses a file or a pipe to import the logs; the default location is /data/elsa/tmp/import. Keep in mind that import.pl does not have its paths updated for Security Onion, so make sure you update them before using it. For example, to import Bro logs you would change the path on line 183 from /data/elsa/tmp/import to /nsm/elsa/data/elsa/tmp/import.
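If you prefer not to edit the script by hand, a sed one-liner can rewrite the path. The sketch below operates on a scratch file so it is safe to run anywhere; on a Security Onion box you would point it at /opt/elsa/node/import.pl instead (after backing the script up):

```shell
# Demonstrated on a scratch copy; substitute /opt/elsa/node/import.pl
# (backed up first) on a real Security Onion system.
script=$(mktemp)
echo 'my $pipe = "/data/elsa/tmp/import";' > "$script"  # stand-in for line 183
sed -i.bak 's|/data/elsa/tmp/import|/nsm/elsa/data/elsa/tmp/import|' "$script"
grep '/nsm/elsa/data/elsa/tmp/import' "$script"
```

The -i.bak flag keeps the original file around as script.bak in case the substitution goes wrong.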

The file or pipe is defined in the syslog-ng.conf file like below:

source s_import {
    file("/nsm/elsa/data/elsa/tmp/import" flags(expect-hostname syslog-protocol));
};

I recommend using a pipe instead of a file, or else you might experience log loss while importing. To do that, modify the source definition as below, delete the import file at that location (if it's there), and restart syslog-ng (service syslog-ng restart); the pipe should then be created automatically.

source s_import {
    pipe("/nsm/elsa/data/elsa/tmp/import" flags(expect-hostname syslog-protocol));
};

7. User interfaces

The main user interface in ELSA is a web interface running by default on port 3154. Apart from that, there is a command-line interface: a Perl script (/opt/elsa/web/cli.pl) that queries the database from the command line. The command-line query script cannot be used without the web server (apache2) running, as the query function is part of the web API. What happens during a query can be seen in /nsm/elsa/data/elsa/log/web.log.

References

  • https://code.google.com/p/enterprise-log-search-and-archive/
  • https://groups.google.com/forum/#!forum/security-onion
  • https://groups.google.com/forum/#!forum/enterprise-log-search-and-archive