Friday, October 7, 2016

Sending data from a remote LFA to Log Analysis

  1. The log file to check is GenericReceiver.log. All data that flows into Log Analysis is reported in this log file. Sometimes, when something is wrong, the failure is silent and you wouldn't know it. Other times the data will go through, but with only a very high-level message about the timestamp being wrong, or something like that. If you want to see more detailed logging, edit this file:

/opt/IBM/LogAnalysis/wlp/usr/servers/Unity/apps/Unity.war/WEB-INF/classes/log4j.properties

and change the setting for log4j.logger.UnityGenericReceiver to DEBUG, then restart Unity. You will then see EVERYTHING that comes in, so it's very verbose! Useful for knowing just what is going into the API.
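For example, the relevant line in log4j.properties should end up looking something like this (if your copy lists an appender after the level, leave that part alone; this is just the level change):

log4j.logger.UnityGenericReceiver=DEBUG

Set it back to INFO (or whatever it was before) once you're done, or the log will grow quickly.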

  2. When you configure your remote Log File Agent, these settings are the most important:
a)     conf/lo/[subnode].conf:
LogSources=/gsa/hurgsa/home/w/o/woolnom/IHS_log/error_log_2016-04-29

This is the pattern to match the log file(s) that you want to stream. The file name can be a pattern like *.log, but the directory path can't. If you have multiple directories and files, use a comma to separate each directory-and-filename entry, as in the sketch below.

ServerLocation=marco.hursley.ibm.com
ServerPort=5529

This is the Log Analysis host name and the EIF port.
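Putting part a) together, a minimal [subnode].conf might look like the following. The LogSources paths are placeholders showing the comma-separated form for multiple files; the server values are the ones from this example:

LogSources=/var/log/ihs/error_log_2016-04-29,/var/log/ihs/access_*.log
ServerLocation=marco.hursley.ibm.com
ServerPort=5529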
b)     conf/lo/[subnode].fmt
The two most important settings are:

hostname huraix.hursley.ibm.com
-file /gsa/hurgsa/home/w/o/woolnom/IHS_log/access_log_2016-04-29

The host name MUST match the EXACT host name you set in the data source properties.
The file MUST match the EXACT path you specify in the data source properties on the Log Analysis Admin Data Sources screen.
NOTE: the hostname and file values here, and the host and log path values you set in the data source, can actually be anything. As long as they are unique and the pairs match, they don't have to be the physical values. This is ONLY true when you create a Custom data source. Speaking of which....
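For reference, a minimal [subnode].fmt built around those two settings might look like this. The REGEX/text/END scaffolding is the usual catch-all pattern for an LFA format file, but check the one your agent ships with before editing:

REGEX AllRecords
(.*)
hostname huraix.hursley.ibm.com
-file /gsa/hurgsa/home/w/o/woolnom/IHS_log/access_log_2016-04-29
text $1
END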

  3. When you create a data source and you are streaming data from a remote agent, make sure you create it as a Custom data source. This is frequently an area of confusion, but the options for creating data sources boil down to this:
a)     Local: A log file that resides on the local Log Analysis file system.
b)     Remote: A log file that resides on a REMOTE node, which the embedded Log File Agent in Log Analysis will stream for you when you supply SSH credentials.
SO, LOCAL AND REMOTE DATA SOURCES ARE COUPLED TO THE LOG FILE AGENT RUNNING ON THE LOG ANALYSIS SERVER.
c)     Custom: Use this for everything else. This sets up the data source and assumes the Generic Receiver API will receive data from a separate agent installed remotely.
IF YOU ARE USING A REMOTE LOG FILE AGENT, CREATE YOUR DATA SOURCE AS A CUSTOM DATA SOURCE AND ENSURE THE HOST AND PATH SETTINGS MATCH THE LFA SETTINGS I DESCRIBE ABOVE!
If you use Logstash, you will always create a Custom data source.
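To make the Custom case concrete, here is a minimal Python sketch that sends one log line straight to the Generic Receiver, which is all a remote agent is really doing. The /Unity/DataCollector endpoint and the hostname/logpath/payload field names are my assumptions about the Data Collector API, so verify them against the documentation for your version; the URL, port, credentials, and log line are placeholders:

import requests

# Placeholders -- substitute your own Log Analysis server and credentials.
URL = "https://marco.hursley.ibm.com:9987/Unity/DataCollector"  # assumed endpoint
AUTH = ("unityadmin", "password")

# hostname and logpath MUST match the host and path on the Custom data source.
record = {
    "hostname": "huraix.hursley.ibm.com",
    "logpath": "/gsa/hurgsa/home/w/o/woolnom/IHS_log/access_log_2016-04-29",
    "payload": {"text": "sample log line\n"},
}

resp = requests.post(URL, json=record, auth=AUTH, verify=False)
print(resp.status_code, resp.text)

If the post is accepted, you will see it arrive in GenericReceiver.log, especially with the DEBUG logging turned on as described in step 1.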
I hope this helps. If you still have problems after this, it's probably down to choosing the wrong source type, or a mismatch between your data and the version the pack was written for. You will gather this from the verbose DEBUG logging.
