Database first is at the core of jOOQ’s design. jOOQ has been made primarily for classic systems where the database is always there, always has been, and will never leave. This is because we think “data have mass”.
This not only translates to moving logic closer to the data (see our previous posts about the cost of JDBC round trips or generating vendor agnostic procedural logic), but also to avoiding moving data around, e.g. when migrating between RDBMS products.
Compared to how “heavy” data is, applications and UIs come and go. Speaking of go, maybe you’ll replace all of your Java code with some Go code tomorrow. But you will keep the database, unless it is trivial.
With this in mind, jOOQ assumes you have a pre-existing schema, which you manage with Flyway or Liquibase, and that you then use jOOQ’s code generator to reverse engineer your updated schema.
The old days
In the old days, setting up an Oracle instance was very heavy, and also hard. I remember working at a company where we had shared development and test instances. The schema was always in flux. We couldn’t assume a stable dev version.
As such, pointing the jOOQ code generator towards a live database used to be a bit of a challenge, which is why jOOQ offers alternative, connection-free code generation modes, including:
- The JPADatabase, if you have a pre-existing JPA entity based meta model.
- The XMLDatabase, if you have some form of XML version of your schema, which you can XSL transform to jOOQ’s format.
- The DDLDatabase, which can interpret your DDL scripts, e.g. the ones you pass to Flyway, or the ones produced by pg_dump (see the sketch after this list).
- The LiquibaseDatabase, which simulates a Liquibase database migration and uses the simulated database output as a source of meta information for the code generator.
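For instance, a connection-free code generation run with the DDLDatabase could look like this minimal, programmatic sketch. It is not taken from the original example: it assumes the org.jooq:jooq-codegen and org.jooq:jooq-meta-extensions artifacts are on the classpath, the class name and “scripts” property reflect recent jOOQ versions, and the script path is illustrative.
import org.jooq.codegen.GenerationTool;
import org.jooq.meta.jaxb.Configuration;
import org.jooq.meta.jaxb.Database;
import org.jooq.meta.jaxb.Generator;
import org.jooq.meta.jaxb.Property;
import org.jooq.meta.jaxb.Target;

public class GenerateFromDDL {
    public static void main(String[] args) throws Exception {
        // Interpret a DDL script and generate code from it, without a live database
        GenerationTool.generate(new Configuration()
            .withGenerator(new Generator()
                .withDatabase(new Database()
                    // The DDLDatabase ships with the jooq-meta-extensions artifact
                    .withName("org.jooq.meta.extensions.ddl.DDLDatabase")
                    .withProperties(new Property()
                        .withKey("scripts")
                        // Illustrative path to the DDL script to interpret
                        .withValue("src/main/resources/schema.sql")))
                .withTarget(new Target()
                    .withPackageName("org.jooq.example.ddl.db")
                    .withDirectory("src/main/java"))));
    }
}
The same configuration can also be expressed in the code generator’s Maven XML, analogously to the testcontainers based setup shown below.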
But all of the above have the same limitation. You can’t really use many vendor-specific features, such as advanced stored procedures, data types, etc.
A modern approach using testcontainers
Ideally, unless you’re supporting several RDBMS products (most people don’t), you should work only with your production database product, say PostgreSQL.
Thanks to testcontainers.org, it’s very easy to start up a PostgreSQL instance of any version in a Docker container, either programmatically or via configuration. If you have a SQL script that contains your database schema, you can supply it via the testcontainers JDBC URL, e.g. like this:
jdbc:tc:postgresql:13:///sakila?TC_TMPFS=/testtmpfs:rw&TC_INITSCRIPT=file:${basedir}/src/main/resources/postgres-sakila-schema.sql
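Just to illustrate what such a URL does (this snippet is not part of the example): handing a tc: URL to DriverManager lets testcontainers’ ContainerDatabaseDriver start the container on the first connection, and you can then query it like any other JDBC data source. The actor table and the test / test credentials below are assumptions, based on the Sakila init script and on testcontainers’ PostgreSQL defaults.
import java.sql.Connection;
import java.sql.DriverManager;

import org.jooq.impl.DSL;

public class TcUrlSmokeTest {
    public static void main(String[] args) throws Exception {
        // The tc: prefix is resolved by testcontainers' ContainerDatabaseDriver,
        // which starts a postgres:13 container on the first connection
        String url = "jdbc:tc:postgresql:13:///sakila"
            + "?TC_INITSCRIPT=file:src/main/resources/postgres-sakila-schema.sql";

        try (Connection connection = DriverManager.getConnection(url, "test", "test")) {
            System.out.println(
                DSL.using(connection).fetch("select count(*) from actor"));
        }
    }
}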
For more information, see their docs about JDBC support. Now, add the testcontainers dependency to your project’s classpath, e.g.
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>postgresql</artifactId>
</dependency>
And use the ContainerDatabaseDriver instead of the actual PostgreSQL driver for your code generation configuration in jOOQ, e.g. when using Maven:
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<executions>
<execution>
<id>java-generator</id>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<jdbc>
<driver>
org.testcontainers.jdbc.ContainerDatabaseDriver
</driver>
<url>${db.url}</url>
<user>${db.username}</user>
<password>${db.password}</password>
</jdbc>
<generator>
<database>
<inputSchema>public</inputSchema>
</database>
<target>
<packageName>org.jooq.example.tc.db</packageName>
<directory>src/main/java</directory>
</target>
</generator>
</configuration>
</execution>
</executions>
<dependencies>
<!-- Junit seems a transitive dependency of testcontainers? -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13.1</version>
</dependency>
</dependencies>
</plugin>
As simple as that! Check out the jOOQ-testcontainers-example for a runnable example that uses testcontainers for code generation using the above approach.
Adding database change management
A real world example would again use Flyway or Liquibase, etc. to apply a complete database migration to your PostgreSQL instance inside of testcontainers, prior to generating code and/or running your integration tests.
This slightly complicates things, but doesn’t pose any insurmountable problems. Instead of creating throwaway containers with a single TC_INITSCRIPT, you will now have to make sure the following steps are executed consecutively in your build:
- A testcontainers instance of your database is started
- A Flyway or Liquibase migration is run inside of that database
- The jOOQ code generator reverse engineers that database
- Optionally, your integration tests also reuse the database of that container
Of course you should integration test your code! But for the sake of this discussion, that might be an optional step, as you may have different preferences on how to run those tests, e.g. more globally than just for this module. But in our example, let’s include the tests.
You can find the full example using testcontainers/flyway/jOOQ here.
Start the testcontainers instance
Unfortunately, testcontainers doesn’t ship any Maven / Gradle plugins yet to invoke container lifecycle management during a build. I’ve created a feature request for that here, which you should upvote: https://github.com/testcontainers/testcontainers-java/issues/4397.
But we can easily help ourselves by using the ever powerful Maven escape hatch that is the groovy-maven-plugin (the ideas are the same for Gradle):
<plugin>
<groupId>org.codehaus.gmaven</groupId>
<artifactId>groovy-maven-plugin</artifactId>
<version>2.1.1</version>
<executions>
<execution>
<!-- Start the container in any phase before the actual code
generation is required, i.e. at the latest in
generate-sources -->
<phase>generate-sources</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<source>
db = new org.testcontainers.containers.PostgreSQLContainer(
"postgres:latest")
.withUsername("${db.username}")
.withDatabaseName("postgres")
.withPassword("${db.password}");
db.start();
// After you've started the container, collect its generated
// JDBC URL (which contains a random port)
project.properties.setProperty('db.url', db.getJdbcUrl());
</source>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>postgresql</artifactId>
<version>1.15.2</version>
</dependency>
</dependencies>
</plugin>
So, that starts a container and keeps it running until the build terminates. I won’t show a graceful shutdown, because it’s not needed for the example, but you could implement that as well, of course.
Now, migrate your database
The above database is empty. Now, let’s run the migration, using Flyway in this example, though it would work the same way with Liquibase.
<plugin>
<groupId>org.flywaydb</groupId>
<artifactId>flyway-maven-plugin</artifactId>
<version>7.14.0</version>
<executions>
<execution>
<!-- We run the migration in the same phase, before jOOQ's
code generation -->
<phase>generate-sources</phase>
<goals>
<goal>migrate</goal>
</goals>
<configuration>
<!-- This URL has been set by groovy, above -->
<url>${db.url}</url>
<user>${db.username}</user>
<password>${db.password}</password>
<locations>
<location>
filesystem:src/main/resources/db/migration
</location>
</locations>
</configuration>
</execution>
</executions>
</plugin>
Add all the additional complexity of your migration in this configuration if you like. jOOQ wouldn’t know anything about it.
Now, generate the code
Again, nothing special here:
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<executions>
<execution>
<id>java-generator</id>
<!-- Same phase as above, but the previous plugins have already
executed, so we're generating the db post migration -->
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<jdbc>
<!-- Again, this URL has been set by groovy, above -->
<url>${db.url}</url>
<user>${db.username}</user>
<password>${db.password}</password>
</jdbc>
<generator>
<database>
<inputSchema>public</inputSchema>
</database>
<target>
<packageName>org.jooq.example.db</packageName>
</target>
</generator>
</configuration>
</execution>
</executions>
</plugin>
And finally, optionally, integration test
If you want to reuse the above container, with its migrated database, in your integration tests as well, you can just pass along the generated JDBC URL to the maven-surefire-plugin as follows:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<systemPropertyVariables>
<!-- Again, this URL has been set by groovy, above -->
<db.url>${db.url}</db.url>
<db.username>${db.username}</db.username>
<db.password>${db.password}</db.password>
</systemPropertyVariables>
</configuration>
</plugin>
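A test picking up those system properties could then look like the following minimal JUnit 4 sketch. It is not taken from the example project; the class name and the query are illustrative, and only the db.url, db.username and db.password properties from the surefire configuration above are assumed.
import java.sql.Connection;
import java.sql.DriverManager;

import org.jooq.DSLContext;
import org.jooq.impl.DSL;
import org.junit.Test;

import static org.junit.Assert.assertNotNull;

public class DatabaseTest {

    @Test
    public void connectsToMigratedDatabase() throws Exception {
        // These system properties are set by the surefire configuration above
        try (Connection connection = DriverManager.getConnection(
                System.getProperty("db.url"),
                System.getProperty("db.username"),
                System.getProperty("db.password"))) {

            DSLContext ctx = DSL.using(connection);

            // Any query against the migrated schema would do here
            assertNotNull(ctx.fetch("select 1"));
        }
    }
}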
There are many ways to achieve the same thing; this is one that works decently out of the box. You can check out a full example from GitHub and play around with it:
https://github.com/jOOQ/jOOQ/tree/main/jOOQ-examples/jOOQ-testcontainers-flyway-example
More about testcontainers
To learn more about testcontainers, see our interview with Richard North here.