A simple script to cover your needs
In the paradigm of cloud service management, one of the crucial
aspects is log management: logs provide valuable information for
troubleshooting and system monitoring.
If your project runs on Liferay PaaS or Liferay SaaS, you don't
have much to worry about, since both offer out-of-the-box storage
and management of logs for the deployed services. However, it is
quite common to deal with clients who want to store logs in
third-party systems, such as their own infrastructure, to cover
auditing needs, for example.
Building on the tools that Liferay Cloud's infrastructure offers
for handling logs from deployed services, I am sharing a bash script
that simplifies downloading logs from services deployed in
Liferay Cloud, so they can be stored in local infrastructure for later
processing. This script is a practical solution for system
administrators and developers looking for an efficient way to extract
and store logs without complications.
What does this script do and what is it for?
This script uses the Liferay Cloud Platform CLI (LCP CLI) to access
logs and is intended to automate the daily download of logs
for the entire default service stack in Liferay Cloud,
saving them to a compressed file. It works on both
Liferay SaaS and Liferay PaaS
(a.k.a. Liferay Experience Cloud and Liferay Experience Cloud
Self-Managed). Its key features are as follows:
- Log download by service: Extracts logs from multiple services deployed in Liferay Cloud; by default Webserver, Database, Backup, Liferay, and Search, as well as custom apps deployed as services on PaaS or client extensions on SaaS.
- Customizable time range: The script can be configured to download logs from a specific time period, for example, the previous day.
- Error handling: Errors that occur during the download are captured and stored in a separate log file for later analysis.
- Automatic compression: After downloading the logs, the script compresses them into a ZIP file for easy storage and transport.
- Execution time: The script measures how long the download and storage take, giving you a clear idea of the daily duration of the process.
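
To make these features concrete, here is a minimal sketch of the core loop, not the full script from the repository. It assumes you are already authenticated with `lcp login` and that `lcp log <service>` prints the service logs to stdout; the exact arguments, project/environment selection, and any date-filtering or non-streaming flags should be verified against `lcp help log` in your CLI version. All values are placeholders:

  #!/bin/bash
  # Minimal sketch of the daily download loop (assumptions noted above).
  # If `lcp log` streams continuously by default, limit it with the
  # appropriate flag from your CLI version before scheduling this.

  PROJECT_ID="acme"
  ENVIRONMENT="prd"
  SERVICES=("liferay" "webserver" "database" "backup" "search")
  LOG_DIR="/var/liferay-cloud-logs"

  YESTERDAY=$(date -d "yesterday" +%F)   # GNU date; on BSD/macOS: date -v-1d +%F
  WORK_DIR="${LOG_DIR}/${YESTERDAY}"
  ERROR_LOG="${WORK_DIR}/errors.log"
  mkdir -p "${WORK_DIR}"

  START=$(date +%s)
  for SERVICE in "${SERVICES[@]}"; do
    # One log file per service; stderr goes to a separate error log
    # so failures can be analyzed later.
    if ! lcp log "${SERVICE}" > "${WORK_DIR}/${SERVICE}.log" 2>> "${ERROR_LOG}"; then
      echo "Download failed for ${SERVICE} in ${PROJECT_ID}-${ENVIRONMENT}" >> "${ERROR_LOG}"
    fi
  done

  # Compress the day's logs into a single ZIP and drop the working directory.
  zip -q -r -j "${LOG_DIR}/logs-${ENVIRONMENT}-${YESTERDAY}.zip" "${WORK_DIR}"
  rm -rf "${WORK_DIR}"

  echo "Process finished in $(( $(date +%s) - START )) seconds."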
How to use the script?
- Environment preparation:
  - Make sure you have access to the Liferay Cloud API (liferay.cloud) through the LCP CLI.
  - It is recommended to create a dedicated user for this task in the project and cloud environment, with the Guest role, to limit its access. This user and its credentials will be used by the script to connect to the API.
- Script configuration: Modify the variables in the script to suit your environment (see the example configuration block at the end of this section):
  - PROJECT_ID: Liferay Cloud project ID.
  - ENVIRONMENT: Environment to extract logs from. Normally prd.
  - SERVICES: List of services to retrieve logs from. It depends on whether you are on a SaaS or a PaaS project.
  - LCP_USER and LCP_PASSWORD: Credentials of the dedicated user recommended above, created with the Guest role.
  - LOG_DIR: Folder to store the ZIP output.
- Script execution: The script is designed to extract the previous
day's log files in full on a daily basis, so in a Linux
environment it can be scheduled with crontab,
using a cron expression that runs the script at 00:00 every day:
0 0 * * * /path/to/script/log_downloader.sh
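
As a reference, this is a sketch of what the configuration block at the top of the script might look like, with placeholder values (the SERVICES list shown matches the PaaS defaults mentioned above; whether it is a bash array or a space-separated string depends on the script version), followed by one way to register the crontab entry non-interactively:

  # Example configuration block -- all values are placeholders to adapt:
  PROJECT_ID="acme"                  # Liferay Cloud project ID
  ENVIRONMENT="prd"                  # environment to extract logs from
  SERVICES=("liferay" "webserver" "database" "backup" "search")
  LCP_USER="log-reader@acme.com"     # dedicated user with the Guest role
  LCP_PASSWORD="********"            # better injected via an environment variable
  LOG_DIR="/var/liferay-cloud-logs"  # where the daily ZIP files are written

  # Register the daily 00:00 run, keeping any existing crontab entries:
  ( crontab -l 2>/dev/null; echo "0 0 * * * /path/to/script/log_downloader.sh" ) | crontab -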
Where to find it?
You can get it from my
GitHub repository and contribute whatever you think
could be helpful.
To make it as intuitive as possible,
for SaaS you can use this one and for PaaS this other one;
the only difference between them is the list of default services
from which you can download the logs.