Local Work with DXP Cloud

I Just Want To Test This Real Quick

I guess we'll call this Part 4 in my series on doing stuff with Docker and DXP Cloud.

DXP Cloud offers a full DevOps lifecycle, but going through that process for every small change can be very inefficient and tedious. Every code change pushed to the GitHub repo must pass through Jenkins CI and then be manually deployed to an environment. One workaround would be to run a separate local bundle or Docker image, but that takes time to set up and can introduce environmental differences. Is there a way to simply have everything contained in the DXP Cloud workspace?

The answer is yes. The DXP Cloud workspace has a way to generate a local testing instance without having to push everything to the cloud.

Nothing shown here actually requires DXP Cloud; it is all provided as part of the Liferay Workspace. So while this is described in the context of DXP Cloud, everything below will also work in a regular Liferay Workspace.

Requirements

  • DXP Cloud workspace
    • Familiarity with the workspace and why things are organized as such
    • Or regular Liferay Workspace
  • Docker
  • Liferay Blade CLI
  • Gradle
  • Local MySQL installation (or compatible)
    • Or Docker image using docker-compose.

Optional Items

  • Database backup of the environment
  • Document Store backup of the environment

Assumption

The biggest assumption here is that all plugins and custom modules reside in their usual locations in the workspace. That is, compiled plugins (such as Marketplace plugins) live in the lctmyrepos/liferay/configs/${ENV}/osgi/marketplace directory, and source code for plugins to be built resides in the lctmyrepos/liferay/modules directory.

This document will refer to concepts gathered and described in several Liferay Developer Community Blog posts. It would behoove the reader to be familiar with them. Part 1, Part 2, Part 3.

Creating the Local Container

In DXP Cloud workspaces, there are five directories in liferay/configs:

  • common

  • dev

  • local

  • prd

  • uat

Dev, uat, and prd correspond to the environments provided in the DXP Cloud Console. Common is the folder for deploying things to all environments. Local is what we will work with here, because it represents the local environment.
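For orientation, the layout this series assumes looks roughly like this (lctmyrepos is the placeholder repository name used throughout):

```
lctmyrepos/
└── liferay/
    ├── configs/
    │   ├── common/   (deployed to every environment)
    │   ├── dev/
    │   ├── local/    (used for the local container)
    │   ├── prd/
    │   └── uat/
    └── modules/      (source for custom modules to be built)
```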

Set Up the Local Directory

  1. Copy the necessary plugins (osgi/marketplace, etc.) into the local directory, following the folder structure.

  2. Set portal-ext.properties as necessary

    1. We will assume no data for now

  3. If using a database, copy the database driver JAR into the appropriate location.

For details on how to set up everything in the local directory, please refer to Part 1 and Part 2 of the links above. 

Create the Docker Container

  1. Navigate to lctmyrepos/liferay

  2. Verify that the Docker service is running.

  3. Run the Gradle createDockerContainer task

    1. gradlew.bat|sh createDockerContainer

Common Issues

Q1: Error message

* What went wrong:
Execution failed for task ':buildDockerImage'.
> com.bmuschko.gradle.docker.shaded.org.apache.hc.client5.http.HttpHostConnectException: Connect to http://127.0.0.1:2375 [/127.0.0.1] failed: Connection refused: connect

A1: Docker is not started. Start Docker. 

In older versions of Docker or the Gradle workspace plugin, you may get an error message about the Docker Desktop setting “Expose daemon on tcp://localhost:2375 without TLS”, noting the need to enable it. Docker Desktop warns that this setting is not secure and could create a vulnerability. You could enable it anyway (unsafe!), but updating Docker and the Gradle workspace removes the need for it.

Q2: Error message

* What went wrong:
Execution failed for task ':createDockerContainer'.
> {"message":"invalid volume specification: '/host_mnt/d/repos/private/lctgelesis/liferay/build/docker/deploy:rw'"}

A2: Update your Workspace version to at least 3.3.2. If you are using a newly generated workspace, this should not be an issue.

In settings.gradle, update the version number of “com.liferay.gradle.plugins.workspace” to 3.3.2.
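For reference, the relevant snippet in settings.gradle looks something like the following. This is a sketch of the workspace default, where only the version string needs to change; the repository URL may differ in your workspace:

```groovy
// settings.gradle at the workspace root
buildscript {
    dependencies {
        // Bump the workspace plugin to 3.3.2 or later.
        classpath group: "com.liferay", name: "com.liferay.gradle.plugins.workspace", version: "3.3.2"
    }
    repositories {
        maven {
            url "https://repository-cdn.liferay.com/nexus/content/groups/public"
        }
    }
}
```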

Starting the Docker Container

The createDockerContainer task creates a Docker image with the name “liferay-liferay”, which can be started from the Docker console.

When it’s done starting up, go to “http://localhost:8080” and test the following:

  • Marketplace plugins/themes available

  • Custom modules/portlets/themes available

It should start up a blank instance of Liferay DXP. If everything is present, then we’ll call this a victory.

For more info about using Docker in the Liferay workspace, and how to start the container via command line, look here.

Testing on the Local Container

Now that the local container is up, does this mean we have to rebuild it every time we want to test? We could. We’d have to delete the container every time and rerun “gradlew.bat|sh createDockerContainer” each time. However, we don’t need to do that.

Pushing module updates into the Docker Container

  1. Run the command gradlew.bat|sh dockerDeploy

There’s no need to shut down the Docker container to push the updated modules into it. These can be custom module or theme updates from the Liferay Workspace. Note that this will not push changes to the file system in the local directory.

Isn’t that so much better? Just one command to push code and module changes into a local instance!

Importing a Backup from DXP Cloud

So far, our usage has done everything on a blank bundle. What if we want to test with data from the Cloud? How do we import everything? Once the Marketplace plugins are placed in the local directory, do the following.

The import process shown below can be, in theory, scripted, as there is a script directory for the Docker image. I haven’t done it myself, but I have been shown some automated scripts for this purpose.

Pull Backups from DXP Cloud

Currently, there is no supported method to take a backup of the non-prod (dev, uat) environments; only prd has the backup mechanism. If you do need to back up dev/uat, it is possible to use the shell to generate a database dump, archive the document_library directory, and then make both available for download.
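As a rough sketch, the dev/uat shell approach could look like this; the user, schema name, and output file names here are assumptions, not exact DXP Cloud values:

```shell
# From the environment's shell: dump the schema and archive the document store.
mysqldump -u dxpcloud -p lportal | gzip -c > lportal-dev.sql.gz
tar -czf document_library-dev.tgz document_library/
# Then make both archives available for download.
```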

  1. Go to PRD

  2. Go to Backups

  3. To take a manual backup, press “Backup Now”

  4. Download the backup files using the 3-dot menu

Importing the Schema Backup

  1. Extract the file through both layers (the outer tar and the inner gz) until you reach the raw dump.

  2. Add the *.sql extension to it (optional)

  3. Import into MySQL

    1. mysql -u name -p ${DATABASE_SCHEMA} < ${DATABASE_SCHEMA}.sql
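The unpacking steps above can be sketched end to end as follows. The nested archive is simulated locally so the commands are runnable as-is; real backup file names from DXP Cloud will differ, and the final mysql import (commented out) needs a running MySQL server:

```shell
# Simulate the backup layout for illustration: a gzipped SQL dump
# wrapped in a tar.gz, as delivered by the backup download.
mkdir -p backup_demo
echo "CREATE TABLE example (id INT);" > backup_demo/lportal
gzip -c backup_demo/lportal > backup_demo/lportal.gz
tar -czf backup_demo/database.tar.gz -C backup_demo lportal.gz
rm backup_demo/lportal backup_demo/lportal.gz

# The actual unpacking: get past the tar, then the gz.
tar -xzf backup_demo/database.tar.gz -C backup_demo   # yields lportal.gz
gunzip backup_demo/lportal.gz                         # yields lportal (raw SQL)
mv backup_demo/lportal backup_demo/lportal.sql        # optional .sql extension

# Final import (requires a running MySQL; shown for completeness):
# mysql -u name -p lportal < backup_demo/lportal.sql
```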

Caveat for Importing Backup

In the Archiving the Database Schema section (Step 5a), the command given to export the schema is: mysqldump -u dxpcloud -p --databases --add-drop-database ${DATABASE_SCHEMA} | gzip -c | cat > ${FILENAME}.gz

Note, this has the --databases --add-drop-database flags, which will add the line

CREATE DATABASE /*!32312 IF NOT EXISTS*/ `lportal` /*!40100 DEFAULT CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci */;

Upon import, this line creates a database called “lportal” (or whatever the database schema is called) if it doesn’t already exist, and the dump is then imported into it. Note that --add-drop-database also adds a DROP DATABASE statement before it, so importing will replace an existing schema of the same name.

If you are OK with this, then skip the next section. If you have an existing schema with the same name that you’d rather keep, then read on.

Importing the Schema Under a New Name

If you are not OK with the above auto-create and import, you have several options.

  1. Edit the SQL file to remove that line.

    1. Don’t do this unless you have no choice.

    2. Make a backup copy first.

  2. Re-export the SQL file without the --databases flag 

    1. Command mysqldump lportal > lportal.sql

    2. Download it via the DXP Cloud backup download mechanisms.

Then import the SQL file as you would normally.
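Putting option 2 together, here is a sketch; the local user and the new schema name are examples, not prescribed values:

```shell
# On the source side: re-export without --databases, so the dump contains
# no CREATE DATABASE / DROP DATABASE statements.
mysqldump -u dxpcloud -p lportal > lportal.sql

# Locally: create a schema under a different name and import into it.
mysql -u root -p -e "CREATE DATABASE lportal_local CHARACTER SET utf8mb4"
mysql -u root -p lportal_local < lportal.sql
```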

Importing the Document Store

  1. Create a directory called “data” in lctmyrepos/liferay/configs/local

  2. Extract document_library.tgz into the newly created data directory

    1. There should be a document_library directory after all the unzipping.
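The extraction amounts to something like this, assuming the downloaded backup is named document_library.tgz:

```shell
# From lctmyrepos/liferay/configs/local:
mkdir -p data
tar -xzf /path/to/document_library.tgz -C data
# Expected layout afterwards: data/document_library/...
```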

Connecting the Database

  1. In lctmyrepos/liferay/configs/local open the portal-ext.properties file

  2. Change/replace the database connection properties to work with the local database and imported schema.

    1. If you are using a local database rather than another Docker container, you will need to use a specific host name to tell the container to connect to the host machine: host.docker.internal

      1. jdbc.default.url=jdbc:mysql://host.docker.internal/dxpcloud?characterEncoding=UTF-8&dontTrackOpenResources=true&holdResultsOpenOverStatementClose=true&serverTimezone=GMT&useFastDateParsing=false&useUnicode=true
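For example, the relevant block of portal-ext.properties might look like this. The driver class assumes MySQL Connector/J 8, the username/password are placeholders, and the schema name in the URL should match the one you imported:

```properties
jdbc.default.driverClassName=com.mysql.cj.jdbc.Driver
jdbc.default.url=jdbc:mysql://host.docker.internal/dxpcloud?characterEncoding=UTF-8&dontTrackOpenResources=true&holdResultsOpenOverStatementClose=true&serverTimezone=GMT&useFastDateParsing=false&useUnicode=true
jdbc.default.username=your_db_user
jdbc.default.password=your_db_password
```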

Create the Docker Container (Redux)

  1. Navigate to the lctmyrepos/liferay directory

  2. Run the command to create the Docker container

    1. gradlew.bat|sh createDockerContainer

As mentioned above, changes to the file system of the local image require rebuilding the Docker container. This includes changes to portal-ext.properties.

Caveat

If you are rebuilding the Docker container, you will need to delete the old one before doing so.

Additional Gradle-Based Commands and Documentation

Here: https://help.liferay.com/hc/en-us/articles/360029147591-Leveraging-Docker

Alternative Setups

In Part 2, I used docker-compose with my Liferay DXP, MySQL, and Elasticsearch containers. We can do the same thing here. For this example I chose not to, in order to keep things separate, but it could be done. If I were going to use a Docker image of MySQL with this setup, I would use docker-compose to link everything together.
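If you do go the docker-compose route, a minimal sketch might look like this; the image names, versions, and credentials are all assumptions to adjust for your environment:

```yaml
version: "3"
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: lportal
  liferay:
    image: liferay-liferay   # the image built by the workspace
    ports:
      - "8080:8080"
    depends_on:
      - mysql
```

With compose, the Liferay container would reach the database at the service name (jdbc:mysql://mysql/lportal...) instead of host.docker.internal.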

Using the Environment

Hot Deploy

  1. Navigate to lctmyrepos/liferay/build/docker

  2. Place hot deploy files (Marketplace plugins, license XML, JARs, etc.) into the deploy directory.

    1. Note that, as with all containers, these files will go away when the container is deleted and recreated.

    2. Hot deploy is good for testing something, but for more permanent usage, it’s best to put it in the OSGi directory and rebuild the image.

To gitignore or not to gitignore?

Everything above was done in the local directory, which means if we push a commit into GitHub, all of this will also be synced into the repository. Do we want this? It depends.

To gitignore

  1. In lctmyrepos/liferay/.gitignore add in the entry

    1. /configs/local/**

This avoids committing the contents of the local directory to the GitHub repository, while still leaving the local directory in place for anyone who checks out the repository, so they can do their own local testing.

If you go this route, it’s probably better to put all the shared plugins into the common directory rather than the environment-specific ones, and reserve local for specific tests.

The rationale is that everyone may have their own things to test locally and shouldn’t be cluttered with others’ files, or the workflow may simply require that everything be pushed up to an environment.

To not gitignore

  1. Don’t do anything

This means all files within configs/local will be committed to the GitHub repository. This is the case if you want everyone to have the same local testing environment.

It’s also probably a good idea to move plugins to the common directory as well, so there aren’t multiple copies of the same plugin floating around.

It Depends

Ultimately, whether or not you do this depends on your team and how you want to organize things.

Conclusion

I hope this entry provides some insight into not only working with DXP Cloud in a local instance, but also using some of the features found in the Liferay Workspace.

Will there be a Part 5? That depends on whether there is another topic about DXP Cloud or Docker that I can investigate. The DXP Cloud CLI is one possibility.