October 26, 2020
Creating Functional Tests for Liquibase Extensions
At Liquibase, we use two types of tests to ensure Liquibase works with different types of databases: Integration and Functional. Integration tests evaluate whether Liquibase is doing the right thing to the database server. (Are we creating objects correctly? Are we snapshotting them correctly?) Our Director of QA, Kristyl Gomes, recently wrote about our new integration test harness.
This blog focuses on functional tests, which evaluate whether Liquibase operations like update, updateSQL, and the other functions Liquibase performs actually work against a given database.
We have some Liquibase extensions that are GREAT about functional tests. For example, MongoDB and HANA are top-notch. The contributors of both extensions provided functional tests that run against a Docker database container, so whenever a pull request is created, Travis CI automatically verifies that the pull request passes all functional tests. However, not all extensions have that level of functional testing. For example, I’ve been spending a lot of time with our Cassandra extension, and its functional tests are (were) completely lacking. We had one JUnit test and it was… underwhelming.
To resolve this lack of functional testing in the Cassandra extension, we used Spock for our functional tests. Spock lets us write simple tests in Groovy, which greatly speeds up test writing. Here’s how we did it:
Creating a Functional Test for Cassandra
To start writing functional tests for the Cassandra extension, we first need to add the following to the pom.xml file:
<dependency> <!-- use a specific Groovy version rather than the one specified by spock-core -->
    <groupId>org.codehaus.groovy</groupId>
    <artifactId>groovy-all</artifactId>
    <version>3.0.5</version>
    <type>pom</type>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-testng</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.spockframework</groupId>
    <artifactId>spock-core</artifactId>
    <version>2.0-M3-groovy-3.0</version>
    <scope>test</scope>
</dependency>
And a little later, in the <build><plugins> section:
<plugin>
    <groupId>org.codehaus.gmavenplus</groupId>
    <artifactId>gmavenplus-plugin</artifactId>
    <version>1.6.2</version>
    <executions>
        <execution>
            <goals>
                <goal>addSources</goal>
                <goal>addTestSources</goal>
                <goal>compile</goal>
                <goal>compileTests</goal>
                <goal>removeStubs</goal>
                <goal>removeTestStubs</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Now Maven knows that we’re going to be testing using Spock.
The rest is even simpler. We’ll need a changelog.xml to use for testing, so I put one in src/test/resources. Then we need to write the test case: src/test/groovy/liquibase/ext/cassandra/FirstTest.groovy. (Please don’t go looking for this test, as it has since been replaced with CassandraFunctionalTests.groovy. It’s just a way to verify that you have your pom.xml set up correctly.)
package liquibase.ext.cassandra

import spock.lang.Specification

class FirstTest extends Specification {

    def "testing something"() {
        expect:
        true == false
    }
}
Well, that’s gonna fail. And that’s great! It will make sure that your project is set up correctly. I’ll leave it as an exercise for you to set up your IDE to call this stuff correctly, but you should be able to run mvn test from the command line.
Now, let’s start some real testing!
First, fire up some databases. We use Docker containers for these tests and leverage Travis CI for our testing on PRs.
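If you want to run the tests locally, something along these lines will start a Cassandra container on the default port and create the keyspace the tests expect. The container name, image tag, and keyspace definition here are assumptions for illustration; the extension’s CI setup may do this differently.

docker run --name cassandra-test -p 9042:9042 -d cassandra:3.11

# Give Cassandra a moment to start, then create the keyspace used by the tests
docker exec cassandra-test cqlsh -e "CREATE KEYSPACE IF NOT EXISTS betterbotz WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};"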
Here’s a test for liquibase update:
def "update"() {
when:
def url = "jdbc:cassandra://localhost:9042/betterbotz;DefaultKeyspace=betterbotz"
def defaultSchemaName = "betterbotz";
def database = CommandLineUtils.createDatabaseObject(new ClassLoaderResourceAccessor(), url, null, null, null, null, defaultSchemaName, false, false, null, null, null, null, null, null, null);
def liquibase = new Liquibase("changelog.xml", new ClassLoaderResourceAccessor(), database);
liquibase.update((Contexts) null);
then:
database != null;
}
All we’re doing here is setting up Liquibase to connect to our Cassandra instance running in a Docker container. Then, we tell Liquibase which changelog to use. Finally, we do an update.
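For reference, the changelog.xml we dropped into src/test/resources can be something minimal. The changeset below is a hypothetical placeholder (the id, author, and table definition are made up for illustration), not the actual file from the extension repo:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.1.xsd">

    <changeSet id="1" author="example">
        <createTable tableName="test_table">
            <column name="id" type="int">
                <constraints primaryKey="true"/>
            </column>
            <column name="name" type="varchar(50)"/>
        </createTable>
    </changeSet>
</databaseChangeLog>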
Of course, we can do deeper testing here. Perhaps we could validate the results with a snapshot or by running liquibase status. But the fact that the update doesn’t bomb out is already a win.
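As one sketch of where that deeper testing could go: after the update, we could read Liquibase’s tracking table back through the underlying JDBC connection and assert that at least one changeset was recorded. The test name, the keyspace-qualified table name, and the COUNT(*) query below are assumptions for illustration, not code from the extension.

// Sketch only: assumes the Cassandra JDBC driver exposes a java.sql.Connection
// and accepts a COUNT(*) query against the DATABASECHANGELOG table.
// Additional import needed: liquibase.database.jvm.JdbcConnection
def "update records changesets"() {
    when:
    def url = "jdbc:cassandra://localhost:9042/betterbotz;DefaultKeyspace=betterbotz"
    def database = CommandLineUtils.createDatabaseObject(new ClassLoaderResourceAccessor(), url,
            null, null, null, null, "betterbotz", false, false, null, null, null, null, null, null, null)
    def liquibase = new Liquibase("changelog.xml", new ClassLoaderResourceAccessor(), database)
    liquibase.update((Contexts) null)

    // Count the rows Liquibase wrote to its tracking table
    def connection = ((JdbcConnection) database.getConnection()).getUnderlyingConnection()
    def resultSet = connection.createStatement().executeQuery(
            "SELECT COUNT(*) FROM betterbotz.DATABASECHANGELOG")
    resultSet.next()
    def appliedChangesets = resultSet.getLong(1)

    then:
    appliedChangesets > 0
}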
Our hope is to have functional tests verifying that all Liquibase functionality works as expected with every extension. This is one requirement for graduating an extension from Liquibase Labs into the core Liquibase binary. We’re certain that you, as an extension developer or gifted user, will have all sorts of questions on how to create and improve testing. That’s why we’ve created a number of ways to contact us. Please reach out with any and all questions!