Local Work with DXP Cloud
I guess we'll call this Part 4 in my series on doing stuff with Docker and DXP Cloud.
DXP Cloud offers a full DevOps lifecycle, but having to go through that process for every small change can be very inefficient and tedious. Every code change that gets pushed up to the GitHub repos will need to go through Jenkins CI, and then manually be deployed to the environment. A possible solution would be to have a separate local bundle running or Docker image, but that takes time to setup and can lead to some environmental differences. Is there a way to simply have everything contained in the DXP Cloud workspace?
The answer is yes. The DXP Cloud workspace has a way to generate a local testing instance without having to push everything to the cloud.
Everything shown here does not need DXP Cloud to function, and is provided as part of the Liferay Workspace, so what is described here, while in the context of DXP Cloud, will work in a regular Liferay Workspace.
The biggest assumption here is that all plugins and custom modules reside in their usual locations in the workspace. That is, compiled plugins, such as Marketplace plugins, exist in the lctmyrepos/liferay/configs/${ENV}/osgi/marketplace directory, and source code for plugins to be built resides in the lctmyrepos/liferay/modules directory.
This document will refer to concepts gathered and described in several Liferay Developer Community Blog posts. It would behoove the reader to be familiar with them. Part 1, Part 2, Part 3.
In DXP Cloud Workspaces, there are 5 directories in liferay/configs:
common
dev
local
prd
uat
Dev, uat, and prd correspond to the environments provided in the DXP Cloud Console. The common folder is for deploying things into all environments. The local folder is what we will work with here, since it represents the local environment.
Copy the necessary plugins (osgi/marketplace, etc.) into the local directory, following the folder structure.
Set portal-ext.properties as necessary
We will assume no data for now
If using a database, copy in the database driver jar into the appropriate location.
For details on how to set up everything in the local directory, please refer to Part 1 and Part 2 of the links above.
Navigate to lctmyrepos/liferay
Verify that the Docker service is running.
Run the gradle createDockerContainer command
gradlew.bat|sh createDockerContainer
Q1: Error message
A1: Docker is not started. Start Docker.
In older versions of Docker or the Gradle workspace, you may get an error message telling you to enable the Docker Desktop setting “Expose daemon on tcp://localhost:2375 without TLS”. Docker Desktop warns that this setting is not secure and could create a vulnerability. You could enable it (unsafe!), but updating Docker and the Gradle workspace removes the need for this setting.
Q2: Error message
A2: Update your Workspace version to at least 3.3.2. If you are using a newly generated workspace, this should not be an issue.
In settings.gradle, update the version number of “com.liferay.gradle.plugins.workspace” to 3.3.2
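For reference, the workspace plugin version lives in the buildscript block at the top of settings.gradle. A typical excerpt looks roughly like the following; the repository URL is an assumption and may differ in your workspace:

```groovy
// settings.gradle (workspace root) -- sketch; your repository URL may differ
buildscript {
    dependencies {
        classpath group: "com.liferay", name: "com.liferay.gradle.plugins.workspace", version: "3.3.2"
    }
    repositories {
        maven {
            url "https://repository-cdn.liferay.com/nexus/content/groups/public"
        }
    }
}

apply plugin: "com.liferay.workspace"
```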
This will have created a Docker container named “liferay-liferay”, which can be started from the Docker console.
When it’s done starting up, go to “http://localhost:8080” and test the following:
Marketplace plugins/themes available
Custom modules/portlets/themes available
It should start up a blank instance of Liferay DXP. If everything is present, then we’ll call this a victory.
For more info about using Docker in the Liferay workspace, and how to start the container via command line, look here.
Now that the local container is up, does this mean we have to rebuild it every time we want to test? We could: delete the container and rerun “gradlew.bat|sh createDockerContainer” each time. However, we don’t need to do that.
Run the dockerDeploy command
gradlew.bat|sh dockerDeploy
There’s no need to shut down the Docker container to push updated modules into it. These can be custom module or theme updates from the Liferay Workspace. Note that this will not push changes to the file system in the local directory.
Isn’t that so much better? Just one command to push code and module changes into a local instance!
So far, our usage has done everything on a blank bundle. What if we want to test with data from the Cloud? How do we import everything? Once the Marketplace plugins are placed in the local directory, do the following.
The import process shown below can, in theory, be scripted, as there is a script directory for the Docker image. I haven’t done it myself, but I have been shown some automated scripts for this purpose.
Currently, there is no supported method to take a backup of the non-prod (dev, uat) environments; only prd has the backup mechanism. If you do need to back up dev or uat, you can use the shell to generate a database dump and to archive the document_library directory, then make those files available for download.
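As a sketch of that shell-based approach, something like the following could be run from the environment's shell. The schema name, user, and document library path below are assumptions; adjust them to your environment:

```shell
#!/bin/sh
# Sketch only: manual backup of a dev/uat environment from its shell.
# Schema name, user, and document library path are assumptions.
backup_env() {
  SCHEMA="${1:-lportal}"
  STAMP=$(date +%Y%m%d-%H%M%S)
  # Compressed dump of the database schema
  mysqldump -u dxpcloud -p "$SCHEMA" | gzip -c > "$SCHEMA-$STAMP.sql.gz"
  # Archive of the document library
  tar -czf "document_library-$STAMP.tgz" -C /opt/liferay/data document_library
}
# usage: backup_env lportal
```

The resulting files can then be made available for download.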
Go to PRD
Go to Backups
To take a manual backup, press “Backup Now”
Download the backup files using the 3-dot menu
Extract the downloaded file until you get past the tar and gz layers and reach the raw SQL dump.
Add the *.sql extension to it (optional)
Import into MySQL
mysql -u name -p ${DATABASE_SCHEMA} < ${DATABASE_SCHEMA}.sql
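Concretely, the unwrap-and-import steps might look like this. The file names are stand-ins, and a small fake archive is built first so the commands can be shown end to end:

```shell
#!/bin/sh
set -e
# Build a stand-in for the downloaded backup (real ones come from the
# DXP Cloud console): a gzipped SQL dump inside a tar.gz.
printf 'CREATE TABLE t (id INT);\n' > lportal
gzip lportal                          # -> lportal.gz
tar -czf backup.tar.gz lportal.gz && rm lportal.gz

# The actual unwrap steps:
tar -xzf backup.tar.gz                # peel off the tar layer -> lportal.gz
gunzip lportal.gz                     # peel off the gzip layer -> lportal
mv lportal lportal.sql                # optional: add the .sql extension
# mysql -u name -p lportal < lportal.sql   # then import as shown above
```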
In the Archiving the Database Schema section, Step 5a, the command given to export the schema is:
mysqldump -u dxpcloud -p --databases --add-drop-database ${DATABASE_SCHEMA} | gzip -c | cat > ${FILENAME}.gz
Note that this command includes the --databases and --add-drop-database flags, which add DROP DATABASE and CREATE DATABASE statements to the dump. Upon import, MySQL will drop any existing database called “lportal” (or whatever the database schema is called), recreate it, and import the data into it.
If you are OK with this, then skip the next section. If you have an existing schema with the same name that you’d rather keep, then read on.
If you are not OK with the above auto-create and import, you have several options.
Edit the SQL file to remove the database-level statements.
Don’t do this unless you have no other option.
Make a backup copy first.
Re-export the SQL file without the --databases flag
The command would be:
mysqldump lportal > lportal.sql
Download it via the DXP Cloud backup download mechanisms.
Then import the SQL file as you would normally.
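If you go the editing route, one way to strip those statements is sketched below. It runs against a stand-in dump so the commands can be shown end to end; note that real dumps wrap these statements in /*! ... */ version comments:

```shell
#!/bin/sh
set -e
# Stand-in dump containing the statements added by
# --databases --add-drop-database (real dumps look similar):
cat > dump.sql <<'EOF'
/*!40000 DROP DATABASE IF EXISTS `lportal`*/;
CREATE DATABASE /*!32312 IF NOT EXISTS*/ `lportal`;
USE `lportal`;
CREATE TABLE t (id INT);
EOF

cp dump.sql dump.sql.bak   # always keep a backup copy first
# Delete the database-level statements, leaving only the table data:
sed -i -e '/DROP DATABASE/d' -e '/CREATE DATABASE/d' -e '/^USE `/d' dump.sql
# Now the dump can be imported into any schema:
# mysql -u name -p myschema < dump.sql
```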
Create a directory called “data” in lctmyrepos/liferay/configs/local
Extract document_library.tgz into the newly created data directory.
There should be a document_library directory after all the unzipping.
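The staging steps might look like this. Paths follow the workspace layout described above, and a stand-in archive is created so the commands run end to end:

```shell
#!/bin/sh
set -e
# Stand-in for the document_library.tgz that comes out of the backup:
mkdir -p document_library
touch document_library/placeholder.txt
tar -czf document_library.tgz document_library

# The actual staging steps:
mkdir -p lctmyrepos/liferay/configs/local/data
tar -xzf document_library.tgz -C lctmyrepos/liferay/configs/local/data
ls lctmyrepos/liferay/configs/local/data/document_library
```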
In lctmyrepos/liferay/configs/local, open the portal-ext.properties file
Change/replace the database connection properties to work with the local database and imported schema.
If you are using a local database and not another Docker container, you will need to use a special hostname to tell the container to connect to the host: host.docker.internal
jdbc.default.url=jdbc:mysql://host.docker.internal/dxpcloud?characterEncoding=UTF-8&dontTrackOpenResources=true&holdResultsOpenOverStatementClose=true&serverTimezone=GMT&useFastDateParsing=false&useUnicode=true
Navigate to the lctmyrepos/liferay directory
Run the command to create the Docker container
As mentioned above, changes to the file system in the local directory require rebuilding the Docker container. This includes making changes to portal-ext.properties.
If you are rebuilding the Docker container, you will need to delete the old one (e.g. docker rm liferay-liferay) before doing so.
Here: https://help.liferay.com/hc/en-us/articles/360029147591-Leveraging-Docker
In Part 2, I used docker-compose to link my Liferay DXP, MySQL, and Elasticsearch containers. We could do the same thing here; for this example, I have chosen not to in order to keep things separated. If I were going to use a Docker image of MySQL with this setup, I would use docker-compose to link everything together.
Navigate to lctmyrepos/liferay/build/docker
Place hot deploy files (Marketplace plugins, license XML files, JARs, etc.) into the deploy directory.
Note that, as with all containers, these deployments will go away if the container is deleted and rebuilt.
Hot deploy is good for testing something, but for more permanent usage, it’s best to put it in the OSGi directory and rebuild the image.
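For example, hot deploying a built module is just a copy. The module name here is a stand-in, and the deploy directory is created only so the sketch runs; in practice it exists once the container has been created:

```shell
#!/bin/sh
set -e
# Stand-in JAR; in practice this is a module built from the workspace.
touch my.custom.module.jar
# The deploy directory is watched by the running container; anything
# copied here is hot deployed. (Created here only so the sketch runs.)
mkdir -p lctmyrepos/liferay/build/docker/deploy
cp my.custom.module.jar lctmyrepos/liferay/build/docker/deploy/
```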
Everything above was done in the local directory, which means if we push a commit into GitHub, all of this will also be synced into the repository. Do we want this? It depends.
In lctmyrepos/liferay/.gitignore, add the entry:
/configs/local/**
This avoids committing the contents of the local directory into the GitHub repository. However, it will leave the local directory within the repository for others who may check out the repository so that they can do their own local testing.
If you go this route, it’s probably better to put shared plugins into the common directory rather than the environment-specific ones, and to reserve local for environment-specific testing.
The rationale is that each developer may have specific things to test locally and shouldn’t be cluttered with everyone else’s files, or the team’s workflow may dictate that everything gets pushed up to an environment anyway.
Don’t do anything
This means all files within configs/local will be committed to the GitHub repository. This is the case if you want everyone to have the same local testing environment.
It’s also probably a good idea to move plugins to the common directory as well, so there aren’t multiple copies of the same plugin floating around.
Ultimately, whether or not you do this depends on your team and how you want to organize things.
I hope this entry provides some insight into not only working with DXP Cloud in a local instance, but also using some of the features found in the Liferay Workspace.
Will there be a Part 5? It depends if there is another topic about DXP Cloud or Docker that I can investigate. The DXP Cloud CLI is one possibility.