Migrating Your Bundle to DXP Cloud (Part 3?)
This could be considered Part 3 of "Dockerizing a Liferay Bundle," but it has very little to do with Docker, except that DXP Cloud uses Docker images. I split this into a separate entry because not everyone is interested in DXP Cloud.
This entry focuses on converting a Liferay DXP instance that uses a Docker image to DXP Cloud. The same steps also work for moving a Tomcat (on-premise style) bundle to DXP Cloud.
As we will see, having already migrated to an official Liferay Docker image means most of the work has been done for us: all the files we need are collected in one location. If you are coming from a regular on-premise bundle, you will essentially duplicate the work of migrating to a Docker image. Whether or not you take that intermediate step, following the process in the Docker migration blog entry prepares you to migrate into DXP Cloud.
To find out more about using the official Liferay Docker images, please refer to my previous two blog entries, Part 1 and Part 2. You can also look at the whitepaper about Liferay and Docker here. Most of my work on Docker was done via self-exploration.
If the original bundle is a Docker demo, then all of these files are already gathered in one location and ready to migrate.
The cloned repository will look something like this:
For the migration, we will concern ourselves with the liferay section.
Also present in the liferay directory is the gradle directory. This directory is also a Liferay Workspace, which means we can put buildable code directly in this location, but that's a different blog entry altogether.
liferay
Within this directory, we have five subdirectories:

- common: items that will be deployed to all environments
- dev, prd, uat: deployments specific to those environments
- local: creation of a local Docker image
For this example, we will use the dev environment.
If you have followed the prior blog entry about converting an existing bundle to a Docker image, it’s just copy/paste.
Copy the contents of ${DOCKER_PROJECT}/files/mount into lctmyrepos/liferay/configs/dev. Leave out the data directory (the document library is uploaded separately, below); note that the Elasticsearch configs land under osgi/configs/elasticsearch.
That takes care of our portal-ext.properties file, plugins, deployments, license keys, themes, etc. Yes, it really is just a simple copy/paste. We just need to upload our data (document library store and database).
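The copy itself can be sketched as below. The /tmp paths are stand-ins invented for this sketch; point the variables at your own Docker project and cloned DXP Cloud repository.

```shell
# Stand-in paths for this sketch; replace with your own locations.
DOCKER_PROJECT=/tmp/docker-project
CLOUD_REPO=/tmp/lctmyrepos

# (demo only) fabricate a minimal mount directory so the copy has input
mkdir -p "${DOCKER_PROJECT}/files/mount/osgi/configs/elasticsearch" \
         "${DOCKER_PROJECT}/files/mount/data"
touch "${DOCKER_PROJECT}/files/mount/portal-ext.properties"

# The migration step: copy the whole mount into configs/dev...
mkdir -p "${CLOUD_REPO}/liferay/configs/dev"
cp -R "${DOCKER_PROJECT}/files/mount/." "${CLOUD_REPO}/liferay/configs/dev/"

# ...but drop data/; the document library travels via the backup upload instead
rm -rf "${CLOUD_REPO}/liferay/configs/dev/data"
```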
This is just for the plugins that you are not building from source. If you are building from source, put that code into the Liferay Workspace! Simply go to the liferay directory and create modules, themes, and wars directories. When the files are committed to GitHub, the DXP Cloud Jenkins service will build everything and deploy it.
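Creating the workspace source directories is a one-liner; the repo path here is a stand-in for your clone.

```shell
REPO=/tmp/lctmyrepos            # stand-in for your cloned repository
mkdir -p "${REPO}/liferay"      # (demo only; the real clone already has this)

# Source code goes here; the DXP Cloud Jenkins service builds whatever it finds
mkdir -p "${REPO}/liferay/modules" "${REPO}/liferay/themes" "${REPO}/liferay/wars"
```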
Could we not just commit the data to the GitHub repository as well? You could. However, if you have an uploaded document that's over 100 MB, then you're out of luck: GitHub has a 100 MB file size limit. You would also essentially be restoring that backup every single time you restart the Liferay service, potentially overwriting any changes made in the meantime. The DXP Cloud backup restore method shown below works in all situations and persists across restarts.
DXP Cloud wants the archives as *.tgz files. It MUST be *.tgz: it does not recognize *.tar.gz as an acceptable format, and while it claims to be OK with *.zip, it's not; in my testing it threw an error trying to work with a *.zip file.
Archive the document library found in ${DOCKER_PROJECT}/mount/files/data into document_library.tgz (remember: not *.tar.gz).
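Given the *.tgz requirement above, one way to package the document library directly into an acceptable archive (the paths are stand-ins for this sketch):

```shell
DATA_DIR=/tmp/docker-project/mount/files/data   # stand-in path

# (demo only) fabricate a tiny document library so tar has input
mkdir -p "${DATA_DIR}/document_library"
touch "${DATA_DIR}/document_library/sample.bin"

# Create the archive with the .tgz extension DXP Cloud insists on
tar -C "${DATA_DIR}" -czf /tmp/document_library.tgz document_library
```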
mysqldump -u root -p --databases --add-drop-database lportal | gzip > database.gz
tar zcvf database.tgz database.gz
Windows does not always have tar as a native command (recent Windows 10 builds include it), but the archive can also be created via 7-Zip or with Linux commands in the Git Bash interface.
Since the default DXP database name is "lportal", there may be issues importing a new database over the old one, so using a name other than "lportal" is beneficial. In my testing, I used a different database name than "lportal".
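One way to rename the database inside the dump is a sed pass over the gzipped file. The target name lportal_cloud and the file paths here are invented for illustration:

```shell
# (demo only) a stand-in dump containing the default database name
printf 'CREATE DATABASE lportal;\nUSE lportal;\n' | gzip -c > /tmp/database.gz

# Rewrite every standalone "lportal" to the new (hypothetical) name
gunzip -c /tmp/database.gz \
  | sed 's/\blportal\b/lportal_cloud/g' \
  | gzip -c > /tmp/database_renamed.gz
```

Double-check the result before importing; a blind find-and-replace can touch table data as well as the database name.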
Recall that in Part 2 we had to set a database flag to force lower-case table names in the Docker image. We have to do the same thing here, except we won't be doing it via a config file. The reason is the same: if your SQL dump originated on a Windows machine, you need this flag.
LCP_GCP_DATABASE_FLAG_LOWER_CASE_TABLE_NAMES=1

All database flags that we can set start with LCP_GCP_DATABASE_FLAG_. For valid flags, please refer to the Google Cloud SQL docs here.
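The flag is set as an environment variable on the database service. A sketch of what that could look like in the database service's LCP.json; the surrounding fields are assumptions, so check your repository's actual file:

```json
{
  "id": "database",
  "env": {
    "LCP_GCP_DATABASE_FLAG_LOWER_CASE_TABLE_NAMES": "1"
  }
}
```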
In both instances, the \ (Linux/macOS) or ^ (Windows) line continuations can be dispensed with and everything written on one line. Note that Linux/macOS shells use single quotes, whilst Windows uses double quotes.
Note that only the PRD environment of the DXP Cloud instance will have the backup capability, so the POST URL will always be to the PRD environment.
In the cURL examples, I log in to DXP Cloud with an email address and password. However, DXP Cloud also offers token-based authentication via SSO; if you log in to DXP Cloud that way, you will need to supply the token instead of "-u email@address.com:password".
3. Wait

The backup should appear in the console momentarily. If not, check the Backup service logs.
As noted earlier, the database was exported under a name other than "lportal" and will be imported as such, so the Liferay service must be configured to use the new database name.
If the backup doesn't appear in the Backup page, then look in PRD -> Services -> Backup and examine the log. If there are any error messages, examine them and correct the problem. Most likely it has something to do with the files that were uploaded. Fix them and re-upload.
If the Backup Service log doesn’t show anything, then the problem may lie in the database schema. If you see a message that says “Unable to delete lportal” or something like that, then you didn’t heed my warning.
DXP Cloud’s PRD environment has an automated backup process and by default, will take daily backups of PRD.
Once we have something in DXP Cloud, we may want to pull a local backup of some kind, or even just back it up for posterity (a demo, for example).
DXP Cloud offers several mechanisms to accomplish this. First, there is the local directory in the GitHub repository, which provides a way to run the build as a local Docker image. I haven't been able to figure that out yet, so when I do, that can be a Part 4.
Much like pushing a backup into DXP Cloud, when pulling a backup, we only need the database and the document library. However, we will need to grab the backup name from the DXP Cloud Backup Service.
dxpcloud-aoeusnthaoeusnth-${TIMESTAMP}
curl -X GET https://backup-${REPO_NAME}-prd.lfr.cloud/backup/download/volume/${BACKUP_ID} --output volume.tgz -u email@address.com:password
curl -X GET https://backup-${REPO_NAME}-prd.lfr.cloud/backup/download/database/${BACKUP_ID} --output database.tgz -u email@address.com:password
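Once pulled, the archives are ordinary gzipped tars and can be unpacked for local inspection. The block below fabricates a stand-in volume.tgz so the unpack step has input:

```shell
mkdir -p /tmp/restore && cd /tmp/restore

# (demo only) fabricate a volume.tgz like the one curl would download
printf 'demo' > doc.txt
tar -czf volume.tgz doc.txt
rm doc.txt

# Unpack the pulled archive for local inspection
mkdir -p volume
tar -xzf volume.tgz -C volume
```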
So far, this is all for the PRD environment in DXP Cloud. What if we want to pull a backup of the other environments? Currently, there is no official way to pull backups of those environments. Numerous people have found some creative ways of pulling backups of DEV or UAT by using the console to create the backup files and then have Tomcat or nginx serve them for download. However, they aren’t officially supported methods!
What I've outlined here may change in the future. DXP Cloud is changing and evolving, but the migration steps, i.e. what needs to be put where, should stay the same. The restore process may become easier or simpler than it is now. Rumor has it that there will be an official migration tool to automate everything I just wrote. If so, great; until then, this is what we have to do.
I mentioned that there is a local directory in the repository for creating a local Docker image of the DXP Cloud build. Maybe that will become Part 4, but it could be a ways off.