Backing Up Your DXP Cloud Environments
I had thought I would stop at four parts in my series of posts about DXP Cloud, but I was recently asked to cover backing up DXP Cloud environments. I will start with a caveat: the process of backing up DXP Cloud's UAT and DEV environments is not supported. It is on the DXPC team's roadmap, but it is not currently available. This process was originally tested and documented so that my fellow Sales Engineers could back up and restore their demos on DXP Cloud, but lately there have been requests for this information from other groups. If there is demand for it internally, there must be some desire for it externally, so here we are.
These steps were run from the DXP Cloud Admin Console shell in the Liferay service, but could also be run from the DXP Cloud CLI.
The steps here are only for use in DEV or UAT, as PRD has its own backup mechanism.
Archive the document library volume. From /opt/liferay:

cd data
tar -zcf ${FILENAME_VOLUME}.tgz document_library
Go to DEV/UAT
Go to Services -> Liferay -> Environment Variables
Go to Services -> Liferay -> Shell
It should put you in /opt/liferay
Create a dump of the database schema.
mysqldump -u$LCP_SECRET_DATABASE_USER -p$LCP_SECRET_DATABASE_PASSWORD --databases --add-drop-database $LCP_SECRET_DATABASE_NAME | gzip -c > $LCP_SECRET_DATABASE_NAME.gz
You DO NOT need to find these values. These environment variables are available in the shell and you can invoke them as the above command has done.
You can copy the above command verbatim!
This dumps the database schema currently used by that instance. If you have additional database schemas, you will need to run the dump command separately for each one, specifying its name.
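To illustrate, a loop like the following could dump several schemas in one pass. This is only a sketch: the schema names are hypothetical, and mysqldump is stubbed out so the loop logic can be shown outside the Liferay shell. Inside the shell you would delete the stub and use the real mysqldump with the LCP_SECRET_* credentials, as in the command above.

```shell
# Stub for illustration only -- inside the DXP Cloud shell, remove this
# line so the real mysqldump (with the LCP_SECRET_* variables) is used.
mysqldump() { echo "-- dump of: $*"; }

# Hypothetical schema names; replace with your actual schema names.
SCHEMAS="lportal extra_schema"

for SCHEMA in $SCHEMAS; do
  # One gzipped dump file per schema, named after the schema.
  mysqldump --databases --add-drop-database "$SCHEMA" | gzip -c > "$SCHEMA.gz"
done
```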
Tar the output *.gz file
tar -zcf $LCP_SECRET_DATABASE_NAME.tgz $LCP_SECRET_DATABASE_NAME.gz
This process is not endorsed or supported by DXP Cloud and was deemed “hacky but with a smiley face.” It works, but there may be other methods out there.
Copy the files into Tomcat for service
Create a new directory called “backup” in ${TOMCAT}/webapps

mkdir ${TOMCAT}/webapps/backup
Copy the archive files into that directory
cp ${FILENAME_VOLUME}.tgz ${TOMCAT}/webapps/backup
cp ${FILENAME_DATABASE}.tgz ${TOMCAT}/webapps/backup
Note: the file names are the ones generated in the previous two steps.
Open a command prompt to where you want the files stored on your local machine.
Use wget to download the files
wget --no-check-certificate https://webserver-${REPOS_NAME}-${DEV/UAT}.lfr.cloud/backup/${FILENAME}
If the web server's pop-up authentication is still enabled, you will need to add those flags:
wget --no-check-certificate --user=customer --ask-password ${URL}
It will prompt for the password. Paste it and press Enter; the characters will not be displayed.
Refer to the DXP Cloud account welcome email for specific credentials.
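After downloading, it's worth sanity-checking the archives before relying on them. `tar -tzf` lists an archive's contents without extracting it and exits non-zero on a truncated or corrupt download. The file name below is just a stand-in for whichever archive you pulled down; the block creates a small sample archive so the check can be demonstrated end to end.

```shell
# Create a small sample archive to demonstrate the check; in practice,
# point tar at the .tgz you just downloaded with wget.
mkdir -p document_library
echo "sample content" > document_library/file.txt
tar -zcf sample_volume.tgz document_library

# List the archive's contents without extracting; a corrupt or
# truncated download makes tar exit with a non-zero status.
if tar -tzf sample_volume.tgz > /dev/null 2>&1; then
  echo "archive OK"
else
  echo "archive corrupted"
fi
```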
This process was suggested by someone on the DXP Cloud team, but it was also deemed “hilariously hacky” so use at your own risk. Instead of using wget to download the files into the local file system, do the following.
It should put you into /opt/liferay
Use the cURL command below to push the archive files from the lower environment up to PRD’s backup system
curl -X POST https://backup-${REPO_NAME}-prd.lfr.cloud/backup/upload -H 'Content-Type: multipart/form-data' -F "database=@${FILENAME_DATABASE}.tgz" -F "volume=@${FILENAME_VOLUME}.tgz" -u email@address.com:password
If your password contains special characters such as !, they must be escaped with \; otherwise the bash shell interprets ! as history expansion and tries to recall a previous command. For example, if your password is password123!! you will need to enter it as password123\!\!
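Escaping works, but single-quoting the password is often easier: bash performs no history expansion inside single quotes, so ! passes through literally. Both forms below produce the identical string (the password is, of course, just an example):

```shell
# Backslash-escape each '!' ...
PASS_ESCAPED=password123\!\!

# ...or wrap the whole value in single quotes; '!' needs no escaping there.
PASS_QUOTED='password123!!'

echo "$PASS_ESCAPED"
echo "$PASS_QUOTED"
```

Single quotes are the safer habit when pasting passwords interactively, since you don't have to hunt for every character bash treats specially (unless the password itself contains a single quote).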
The -u email@address.com:password flag supplies your DXP Cloud login email and password. If you are using SSO, you will need to provide a token instead.
Download the backup files from the PRD Backups system.
First, let me start by saying that neither approach is really more supported than the other; both are equally unsupported. There are, however, pros and cons to each. You can still wget the DXP Cloud backups.

wget method:
- Pro: Simpler, with fewer steps
- Con: Less secure; the files are available to anyone who can find the URL, with no protection if you have removed the web server authentication
- Con: The files are non-permanent and are removed if the service is restarted

cURL-to-PRD method:
- Pro: Persistent backup files
- Pro: Tied into DXP Cloud’s backup infrastructure
- Con: Lots of shell manipulation
- Con: Extra steps; more complex
The final decision on which one to use is completely up to you. While it is more complicated, I lean slightly toward the Cloud Backup mechanism, because it keeps a record of the backup and the backup can be restored directly using DXP Cloud's restore mechanism.
As mentioned previously, PRD has its own backup mechanisms, so we don't have to do cURL commands or take manual database dumps. This process IS supported though!
Go to PRD
Go to Backups
To take a manual backup, press “Backup Now”
Download the backup files using the 3-dot menu
I'm not going to cover restoring the backup, because the whole process is outlined in Part 3.
Previous posts in this series:
- Dockerizing a Liferay Bundle Part 1
- Dockerizing a Liferay Bundle Part 2
- Migrating a Liferay Bundle to DXP Cloud
- Local Work with DXP Cloud