Product: Columbus
Can I add logging to the command line Import script?
The steps below describe how to add logging to the command line import script. Further information on the command line import method can be found in a separate FAQ post, here:
How do you Import data to Columbus using the Command Line Interface. – Revvity Signals
There are two logging options available, which can be used in combination or individually.
1. Console (Job) logging
Console logging is generated by default and provides details specific to the running job. When an import job is executed via the CLI script, a file named "console" is automatically generated within the working directory of the import script. The content of the console output is very similar to what appears in the Columbus Helper application window (or Helper log) during a standard import, and contains only lines which are relevant to the current import job, such as the start and end times for the well transfer, and any errors occurring during the run.
Redirecting the output:
By default, the console output is appended to the same "console" file each time the import script is executed. However, it may be more desirable to redirect the console output to a new file per job, which makes it much easier to query the console logs for a given job (e.g. for job updates or completions). To enable job-specific console logging, the import script must be modified so that the “useLogger” parameter is set to true.
The line…
useLogger=false);
... must be changed to...
useLogger=true);
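If preferred, the change can also be made from the shell rather than in a text editor. The one-liner below is only a sketch: it assumes the script is named import.script (as in the examples below), that the parameter appears exactly as shown above, and that GNU sed is available.
$ sed -i 's/useLogger=false/useLogger=true/' import.script    # GNU sed; on BSD/macOS use: sed -i '' ...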
Once the import script has been modified, the console output can be redirected using the "logfile" command line argument, as follows:
$ acapella -logfile /path/to/console.log
The full command syntax would look like this:
$ acapella -logfile /path/to/console.log -s User=* -s Password=* -s Host=* -s ImportType=* -s DatasetFolder=* -s ScreenName=* import.script
The example above produces console logging only, written to the file specified by the “logfile” command line argument.
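One way to keep a separate console file for each job, as suggested above, is to generate the log file name at run time, for example from a timestamp. The lines below are only a sketch: the /path/to/logs directory is a placeholder, and each * must be replaced with your actual values, as in the full command shown earlier.
#!/bin/sh
# Sketch: run the import with a timestamped console log for this job
LOGDIR=/path/to/logs
LOGFILE="$LOGDIR/console_$(date +%Y%m%d_%H%M%S).log"
acapella -logfile "$LOGFILE" -s User=* -s Password=* -s Host=* -s ImportType=* -s DatasetFolder=* -s ScreenName=* import.script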
2. Acapella (Server) logging
Acapella logging is not generated by default; it must be enabled using the “log” (to specify what gets logged) and “logfile” (to specify the output file) command line arguments, as follows:
$ acapella -log all 5 -logfile /tmp/acc.log
The above example sets the Acapella log level to 5 (Major Events) for all modules, then writes the log output to the /tmp/acc.log file.
The full command syntax would look like this:
$ acapella -log all 5 -logfile /path/to/console.log -s User=* -s Password=* -s Host=* -s ImportType=* -s DatasetFolder=* -s ScreenName=* import.script
Acapella logging is most useful when isolating server-specific problems with the import, and may be requested by the support team as part of any troubleshooting efforts.
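When gathering information for support, the Acapella log can be inspected with standard shell tools. For example, assuming the /tmp/acc.log path used above:
$ tail -f /tmp/acc.log        # follow the log while the import is running
$ grep -i error /tmp/acc.log  # list any error lines once the job has finished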
Note: if the “useLogger” parameter has been set to true in the import script and Acapella logging has been enabled, both Acapella and console logging will be written to the same file, as specified via the "logfile" command line argument.
There is a size limit for the Acapella log file (20 MB by default in Acapella 5.1); if the log reaches that limit, it is trimmed by deleting the first half of the file.
It is possible to set the size limit explicitly using the -logfilesize command-line option (specified in MB), although there is a hard limit of 2 GB. Passing 0 allows unlimited growth.
$ acapella -log all 5 -logfile /tmp/acc.log -logfilesize 0 ...
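An explicit cap can be set in the same way; for example, the following (the 100 MB value is arbitrary) would let the log grow to roughly 100 MB before it is trimmed:
$ acapella -log all 5 -logfile /tmp/acc.log -logfilesize 100 ...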