Improving Java build speed with Develocity

We have recently started using the Develocity tool for our builds. I really love it. It is a build acceleration tool made by the company that creates Gradle. It was previously called Gradle Enterprise, but has been renamed to make clear that it works for both Maven and Gradle. It offers these features:

  1. Metrics for every build, showing which tasks were performed and how long each one took.
  2. Remote caching.
  3. Test distribution.
  4. Predictive test selection, e.g. only running a reduced test set on feature branches.
  5. Test failure analytics, e.g. a dashboard and automatic retries for flaky tests.

It works for both CI and local builds. So if code has previously been built on your CI server and the result of each task stored in the remote cache, local dev builds will not have to repeat all of those tasks! However, some work may be required before remote caching is available – task results can only be cached if the task has correctly defined inputs. Using Develocity makes it a lot easier to check your tasks and understand if they need changes to make them cacheable.
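As a minimal sketch of what “correctly defined inputs” means (the task, directory and file names here are all hypothetical), an ad-hoc Gradle task becomes cacheable once Gradle knows what it reads and writes:

// Hypothetical task: cacheable because its inputs and outputs are declared
tasks.register('generateReport') {
    def srcDir = layout.projectDirectory.dir('src/main/resources')
    def outDir = layout.buildDirectory.dir('reports')
    inputs.dir(srcDir)       // rerun only when these files change
    outputs.dir(outDir)      // output that can be stored in and restored from the cache
    outputs.cacheIf { true } // opt an ad-hoc task into the build cache
    doLast {
        def target = outDir.get().asFile
        target.mkdirs()
        new File(target, 'report.txt').text = "Resource files: ${srcDir.asFileTree.files.size()}"
    }
}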

The front page of Develocity shows your list of builds:

Click on a build to see a summary:

Then you can see a list of tasks, which you can order by longest task, so you can start optimising:

The performance tabs show you cache usage, so you can figure out if tasks aren’t being retrieved from cache correctly:

If you look at the build list, you will see there is a really nice example of a build running quickly because most tasks were retrieved from cache:


The first build took 28 minutes; the second build, which only had to rebuild a couple of modules, took 2 minutes!

For more info on Develocity, see:

https://gradle.com/develocity

Or if you would like to read more about how to fix a Gradle task that cannot be cached, see the post I wrote when we were implementing Develocity. For tasks to be cacheable, they first have to be incremental – that is, have correctly defined inputs and outputs:

Gradle incremental tasks and builds

Posted in Gradle, Java

CXF RESTful web server example

I’ve created an example of how to use Apache CXF and Spring together to create a RESTful web service:
https://github.com/hedleyproctor/cxf-restful-server-example

This example shows the key steps in creating a RESTful web server:

  • Create a REST service interface, annotated with the JAX-RS @Path annotation and (for CXF) the @WebService annotation from javax.jws.
  • Your REST service interface contains a method signature for each operation. Each one is annotated to say what its path is and the data formats for request and response.
  • You write an implementation class that contains the code to be executed when each endpoint is hit.
  • In Spring XML configuration, you define the server endpoint, configuring it with your REST service implementation bean(s) and any necessary data conversion classes.

So in my example, the REST interface looks like this:

 
package org.example;

import com.ice.fraud.Claim;

import javax.ws.rs.*;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.jws.WebService;

@Path("/hello")
@WebService
public interface HelloWorldRestService
{
    @GET
    @Path("/greet")
    @Produces(MediaType.TEXT_PLAIN)
    public Response greet();

    @POST
    @Path("/sayhello")
    @Produces(MediaType.APPLICATION_JSON)
    public Response sayHello(String input);

    @POST
    @Path("/submit")
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public Response submit(Claim claim);
}

Then each method is implemented in the implementation class, to return the appropriate data and an HTTP response code:

 
public class HelloWorldRestServiceImpl implements HelloWorldRestService
{
    public Response greet() {
        return Response.status(Response.Status.OK).
                entity("Hi There!!").
                build();
    }
    // sayHello and submit are implemented in the same way
}

The server definition is in a Spring XML configuration file called cxf-beans.xml. You give CXF the bean(s) implementing your service interfaces, plus any data providers you need. In this example, the data format is JSON, so the data provider is the JacksonJsonProvider class.

 
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:cxf="http://cxf.apache.org/jaxrs"
       xsi:schemaLocation="
       http://www.springframework.org/schema/beans
       http://www.springframework.org/schema/beans/spring-beans.xsd
       http://cxf.apache.org/jaxrs
       http://cxf.apache.org/schemas/jaxrs.xsd">

    <import resource="classpath:META-INF/cxf/cxf.xml" />
    <import resource="classpath:META-INF/cxf/cxf-servlet.xml" />

    <bean id="helloWorldRestService" class="org.example.HelloWorldRestServiceImpl" />

    <!-- By default, when CXF starts a server it starts an instance of the Jetty java web server. -->
    <!-- The endpoint for a CXF operation is composed of three sections: -->
    <!-- 1. base URL - defined here -->
    <!-- 2. service path - defined by the @Path annotation at the top of your service class -->
    <!-- 3. operation path - defined by the @Path annotation on each method -->
    <cxf:server id="helloServer" address="http://localhost:8080/cxf-rest">
        <cxf:serviceBeans>
            <ref bean="helloWorldRestService" />
        </cxf:serviceBeans>
        <cxf:providers>
            <bean class="com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider" />
        </cxf:providers>
    </cxf:server>

</beans>

The repo also contains an integration test written with JUnit 5, which shows how to test the services using an Apache HTTP client.

For other Java integration topics, see:
Error handling in Apache Camel
Type conversion exceptions in Apache Camel

Posted in CXF, Java

IntelliJ Plugin Development Cookbook

This post is intended to help people writing IntelliJ plugins, with a focus on plugins for Java projects. Although the IntelliJ docs are pretty good, this post collects together lots of small “how to” instructions that I’ve found useful when writing a custom plugin.

Introduction

IntelliJ Plugin Development docs:
https://www.jetbrains.org/intellij/sdk/docs/welcome.html
https://www.jetbrains.org/intellij/sdk/docs/basics/getting_started.html

It is really important to understand the different ways you can interact with files:

  1. As VirtualFiles, which, confusingly, are the files on the filesystem.
  2. As structured, parsed files, called PSI files, e.g. a Java file is a PSI file.
  3. As files open in the Editor.
  4. As XML files – a specialised kind of PSI file (XmlFile).
IntelliJ file docs:
https://www.jetbrains.org/intellij/sdk/docs/basics/architectural_overview/virtual_file.html
https://www.jetbrains.org/intellij/sdk/docs/basics/architectural_overview/documents.html
https://www.jetbrains.org/intellij/sdk/docs/reference_guide/editors.html

Structured files like Java or XML are called PSI:
https://www.jetbrains.org/intellij/sdk/docs/basics/architectural_overview/psi.html
https://www.jetbrains.org/intellij/sdk/docs/basics/psi_cookbook.html

How to add an action to a menu

In your plugin.xml:
 
<actions>
    <action id="com.ice.refactor.RemoveFacadeUsageAction" class="com.ice.refactor.RemoveFacadeUsageAction" text="Remove Facade Usage"
            description="Remove facade usage">
        <add-to-group group-id="RefactoringMenu4" anchor="last"/>
    </action>
</actions>
However, by far the easiest way to do this is to use IntelliJ’s own quick fix – if you have an action class not yet added to the plugin config, right click on the class name and IntelliJ will open a dialog box to help you add the config. Really cool!

How to add settings to your plugin

See this useful post: Adding plugin settings

Essentially:

  • Create a Configuration class.
  • Create a Form class that it will bind to.
  • Use the visual editor to design the form.
  • Use PropertiesComponent.getInstance() to save simple properties like booleans and strings.

Find the PSI element under the caret

  
final Editor editor = event.getData(CommonDataKeys.EDITOR);
int offset = editor.getCaretModel().getOffset();
int findElementFlags = TargetElementUtil.REFERENCED_ELEMENT_ACCEPTED
        | TargetElementUtil.ELEMENT_NAME_ACCEPTED
        | TargetElementUtil.LOOKUP_ITEM_ACCEPTED;
PsiElement psiElement = TargetElementUtil.getInstance().findTargetElement(editor, findElementFlags, offset);

Getting the text under the caret

  
final Editor editor = event.getData(CommonDataKeys.EDITOR);
int offset = editor.getCaretModel().getOffset();
Document document = editor.getDocument();
// grab a small window of text around the caret (mind the document bounds)
String textUnderCaret = document.getText(new TextRange(offset - 10, offset + 30));

Getting the VirtualFile from the Document in the editor

 
VirtualFile virtualFile = FileDocumentManager.getInstance().getFile(document);

See: Virtual file

Find classes implementing an interface

 
PsiClass interfaceClass = (PsiClass)psiElement;
PsiElement psiImplClass = DefinitionsScopedSearch.search(interfaceClass).findFirst();
PsiClass facadeImplementationClass = (PsiClass)psiImplClass;

Find usages of a method

Collection<PsiReference> usages = ReferencesSearch.search(myMethod).findAll();

Find children of a PSI element

PsiMethod firstMethod = PsiTreeUtil.findChildOfType(psiClass, PsiMethod.class);

Add an annotation to a Java class

 
JavaPsiFacade javaPsiFacade = JavaPsiFacade.getInstance(event.getProject());
PsiElementFactory elementFactory = javaPsiFacade.getElementFactory();
PsiAnnotation inputAnnotation = elementFactory.createAnnotationFromText(annotationText, psiClass);
// Note that we pass in the desired annotation, but another java object is actually created and added to the target. 
// This is what we must pass back to our caller. 
PsiAnnotation actualAnnotation = (PsiAnnotation)targetLocation.addAfter(inputAnnotation, targetLocation);

Show a message / error dialog

Messages.showMessageDialog(project, "Element under caret is not a Java class name", "Refactoring Plugin", Messages.getErrorIcon());

Permit message dialogs in tests

TestDialogManager.setTestDialog(TestDialog.OK);

Write messages to the event log

Notifications.Bus.notify(new Notification("your-plugin-group", "YourActionName", message, NotificationType.INFORMATION));

Load an XML file

Simply load as a PsiFile and then cast to XmlFile.
Posted in IntelliJ, Java

Gradle dependencies tutorial

Gradle has very powerful dependency management features. In this tutorial I will walk through creating a multi module Java project, and explain:
  • How the api and implementation dependencies work
  • How to create a custom dependency resolution strategy to:
    1. Hard fail if an unwanted dependency is found
    2. Fix a dependency version
    3. Globally exclude a dependency

I will use IntelliJ for this tutorial. Start by creating a new Gradle project. I’m using Groovy as the Gradle DSL language, and the Gradle wrapper:

The Gradle wrapper means that the project includes a small jar that will bootstrap the build process. You don’t need a version of Gradle installed; the build will download the correct version. In my project, IntelliJ generated the wrapper using version 7.6 of Gradle. I wanted a more recent version, so I opened the file gradle/wrapper/gradle-wrapper.properties and changed the distributionUrl to a later version. I’m using version 8.6.

api and implementation dependency configurations

Next create two sub projects. I’ll call these “data-access” and “service”. The idea is to simulate a multi module project, representing a layered application, where one module performs data access, using Hibernate, and the other module is the service layer, which has no direct knowledge of the Hibernate data access layer.

If you have added the sub-projects using IntelliJ, they should automatically be added to the settings.gradle file at the root of the project, but you can open this file and check.
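If you are adding them by hand, the settings file simply lists the sub-projects. A minimal sketch (the root project name here is hypothetical):

rootProject.name = 'gradle-dependencies-tutorial'
include 'data-access'
include 'service'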

In the data-access project, add the java-library plugin to the build.gradle file:

plugins {
    id 'java-library'
}
Now in the dependencies section, let’s add Spring and Hibernate:
dependencies {
    api 'org.springframework:spring-core:6.0.11'
    api 'org.springframework:spring-context:6.0.11'
    implementation 'org.hibernate.orm:hibernate-core:6.5.2.Final'
}
We’ve added Spring as an api dependency, and Hibernate as an implementation dependency, the difference being:
  • api – also on the compile classpath of consuming modules
  • implementation – will be packaged into the final application, but is not on the compile classpath of consuming modules
I think this is a great feature of Gradle, so much more powerful than Maven. It allows you to prevent later modules from being polluted by dependencies added for earlier modules. Let’s test out the access to these dependencies. In your data-access module, you can add a class that uses both libraries:
import org.hibernate.SessionFactory;
import org.springframework.context.ApplicationContext;

public class CustomerDAO {

    public void getCustomer(Long id) {
        SessionFactory sessionFactory = null;
        ApplicationContext applicationContext = null;
    }
}
Now build your project and confirm it works. (In the right nav, from your Gradle tab, you should be able to see Tasks -> build -> build.)

Now let’s add a class in the service project, and confirm that we can see Spring, but not Hibernate. In the service project build.gradle, add a dependency on the data-access project:

dependencies {
    implementation(project(':data-access'))
}
Now let’s try and add a class to the service project which uses both Spring and Hibernate:
import org.hibernate.SessionFactory;
import org.springframework.context.ApplicationContext;

public class CustomerService {

    public void processCustomer(Long id) {
        ApplicationContext applicationContext = null;
        SessionFactory sessionFactory = null;
    }

}
When you try and build this, you should get an error:
package org.hibernate does not exist
import org.hibernate.SessionFactory;
                    ^
Success! This proves that even though the data-access module is using Hibernate, and the service module depends on this module, the service module cannot use Hibernate code itself. The dependency has not bled into the service module. This is a great way to avoid accidental usage of dependencies from other modules. Now that we have proved this, delete the references to Hibernate from the CustomerService class and confirm the project can build again.

Throwing an error if a dependency is found

For the second part of this tutorial, I want to explain how you can customise dependency resolution. Again, Gradle has far more powerful mechanisms for doing this than Maven does. Let’s start by understanding what dependencies the project uses. On the command line, you can run:

./gradlew dependencies
This will show a number of different configurations, but most are blank, with only a couple of test configurations having dependencies. What is going on? The answer is that this command has only shown you dependencies for the top level project – not any of the sub projects. To see the dependencies for the data-access project, type:
./gradlew :data-access:dependencies
You should now see a much longer list, with the Spring and Hibernate dependencies included. So for Spring, as well as spring-core and spring-context, the list will show all the transitive dependencies, such as spring-jcl. Suppose we didn’t want spring-jcl: how could we detect that it was being used? The answer is to write a custom dependency resolution strategy. In the top level build.gradle file, add the following:
allprojects { Project project ->
    configurations.all {
        println "Configuration: ${name}"
        resolutionStrategy.eachDependency { DependencyResolveDetails details ->
            println "Group: ${details.requested.group} Artifact: ${details.requested.name}"
            if (details.requested.group == 'org.springframework' && details.requested.name == 'spring-jcl') {
                throw new RuntimeException("Don't want spring-jcl")
            }
        }
    }
}
Now try and build your project again. You should get a runtime exception.

Fixing a dependency version

What if you don’t want to hard fail, but rather pin the version to one specified by you? We can do that by overriding the version in the custom resolution strategy, so the above code becomes:

allprojects { Project project ->
    configurations.all {
        println "Configuration: ${name}"
        resolutionStrategy.eachDependency { DependencyResolveDetails details ->
            println "Group: ${details.requested.group} Artifact: ${details.requested.name}"
            if (details.requested.group == 'org.springframework' && details.requested.name == 'spring-jcl') {
                details.useVersion '6.0.5'
                details.because 'we need v6.0.5'
            }
        }
    }
}
If you rerun the dependencies command for the data-access module, the output should show that the version of spring-jcl has been fixed:
+--- org.springframework:spring-core:6.0.11
|    \--- org.springframework:spring-jcl:6.0.11 -> 6.0.5

Excluding a dependency

What if you simply want to exclude a dependency entirely? In this case, things are simpler. Just use the exclude command in your configurations.all closure:
allprojects { Project project ->
    configurations.all {
        println "Configuration: ${name}"
        exclude group: 'org.springframework', module: 'spring-jcl'
    }
}
You can then rerun the dependencies command and confirm spring-jcl no longer appears in the list.
For more info on Gradle dependencies, see:
https://docs.gradle.org/current/userguide/declaring_dependencies.html
https://docs.gradle.org/current/userguide/dependency_locking.html
https://docs.gradle.org/current/userguide/resolution_strategy_tuning.html

Some of my other posts on Gradle:
Dependencies and configurations in Gradle
Gradle incremental tasks and builds
Gradle Release Plugin
Code coverage with Gradle and Jacoco

Posted in Gradle, Java

Gradle – working with files

When working with files in Gradle, the key classes are:

FileCollection
FileTree – which extends FileCollection

Getting a FileCollection

You can get a file collection by using the files() method which is always available (from the Project object).

FileCollection myFiles = files("someDirectory")

Getting a filtered list of files

If you want a filtered list of files, it is probably easier to use a FileTree:

FileTree myFiles = fileTree("someDirectory").matching {
	include "*.xsd"
}

Extracting files from a jar / dependency / zip

Sometimes you might need to extract files from a dependency in order to process or consume them. Suppose you have some XSDs in a jar file and you want to extract them. First, define a custom configuration:

configurations {
    myXSDs
}

Then in your dependencies section, assign the jar file to the configuration.
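For example, with hypothetical coordinates for the jar containing the XSDs:

dependencies {
    myXSDs 'com.example:claims-model-xsds:1.0'
}

Then use the zipTree method in a Copy task to unzip the archive: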

task unzipXSDs(type: Copy) {
    from zipTree(configurations.myXSDs.singleFile).matching {
        include '**/*.xsd'
        include '**/*.xjb'
    }
    into "$buildDir/myModel"
}

Printing the files in a zip / jar

For debugging, you might want to print out the contents of a zip or jar file. You can do this by adding a custom task that uses the zipTree forEach method, like this:

task printJar {
    doLast {
        println "Printing jar files"
         zipTree('/path/to/jar/my.jar').forEach(f -> println f)
    }
}

For the Gradle docs on working with files, see:

https://docs.gradle.org/current/userguide/working_with_files.html

The FileCollection and FileTree classes both have good JavaDocs:

FileCollection

FileTree

Some of my other posts on Gradle:

Dependencies and configurations in Gradle

Gradle incremental tasks and builds

Gradle Release Plugin

Code coverage with Gradle and Jacoco

Posted in Gradle, Java

Debugging Gradle

If you are new to any tool or technology, knowing how to debug when things go wrong is a really important skill. This post gives some beginner tips on how to debug Gradle builds. Note: in the commands below, I’m assuming you are invoking Gradle via the Gradle wrapper, so all commands start with “gradlew”. If you aren’t using the wrapper, this would just be “gradle”.

Logging

By default, Gradle builds log at the “lifecycle” level, which shows relatively little. Use the -i flag to get info level logs, or -d for debug.

Knowing what tasks are available

If you are a beginner, sometimes you don’t even know what tasks are available to you in the current build. Simply run the “tasks” command and it will list the available tasks (add --all to include tasks that aren’t assigned to a group).

Running a task in a single module

gradlew :module:sub-module:task

Seeing dependencies

gradlew dependencies

This is for the top level module. For a sub module, run the dependencies task in that sub module:

gradlew :module:sub-module:dependencies

If you want to add a task to your build that will print the dependencies for all modules, you can do this in an allprojects closure in your top level build.gradle Groovy file:

allprojects {
    task printAllDependencies(type: DependencyReportTask) {}
}
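You can then run the report across every module with “gradlew printAllDependencies”.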

Debugging your own Gradle build scripts

In IntelliJ, you can right click on a Gradle task in the right nav and select the “Debug” option.
Alternatively, you can start Gradle in remote debug by adding:
-Dorg.gradle.debug=true
to the startup properties.
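For example, running “gradlew build -Dorg.gradle.debug=true” will make the build JVM suspend and wait for a remote debugger to attach (on port 5005 by default).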

Debugging gradle core

This can be done provided you are using the full distribution, not just the binary one. So in the gradle/wrapper/gradle-wrapper.properties file, set the distro:

distributionUrl=https\://services.gradle.org/distributions/gradle-7.4.2-all.zip

Debugging a third party plugin

There is currently a bug in IntelliJ whereby it will not find the source for a third party plugin. You can work around this by temporarily adding the plugin as a regular dependency to your app / module.


For Gradle docs on debugging, see:

https://docs.gradle.org/current/userguide/logging.html

https://docs.gradle.org/current/userguide/troubleshooting.html

https://docs.gradle.org/current/userguide/viewing_debugging_dependencies.html

Some of my other posts on Gradle:

Dependencies and configurations in Gradle

Gradle incremental tasks and builds

Gradle Release Plugin

Code coverage with Gradle and Jacoco

Posted in Gradle, Java

Error handling in Apache Camel

Scenarios

When coding an integration with Apache Camel, we need to be able to deal with many different kinds of error:
  • A bug / error in our own code, before we have communicated with the remote service.
  • Getting a connection to the remote service, but it returning an error response / code.
  • Failing to connect to the remote service.
  • An error inside our error handling code, e.g. an exception thrown while inside a try catch block.
  • A message being retried from the DLQ after a later message has already been sent.
  • The power going down after a message has been picked up.

Coding options available

We have multiple options for handling errors:
  1. Try catch block – for catching errors where we can do something useful, e.g. update a status to failed.
  2. On exception – can be used with a retry policy. This will save the current message state and the step that failed. It will then block the consumer thread and retry from the failed step when the configured redelivery time is up. This should not be used with long redelivery time periods as the thread is blocked. See https://camel.apache.org/manual/exception-clause.html
  3. Error handler – similar to on exception: can be used with a retry policy that will save message state and the failing step, and retry. However with onException you can specify different policies for different exception types. See https://camel.apache.org/manual/error-handler.html
  4. JMS retries. These are configured in ActiveMQ rather than Camel. In this case, the message is retried from the start of the route. This is useful if you want to retry after a long redelivery period, like 10 minutes, as the AMQ consumer thread is not blocked. Also, unlike on exception and an error handler, each time the broker retries the message, it increments a header. This means we can write logic to detect when a message has been retried a certain number of times, and then invoke error handling. (See below.)
  5. A filter that detects when retries have exceeded a threshold and invokes error handling logic. If a message has been retried multiple times, this suggests that it will never succeed, e.g. the input data is invalid. We don’t want it to repeatedly fail back to the DLQ, so we can detect the number of retries and then invoke whatever error handling logic is appropriate.
  6. JDBC save points. Within a single transaction you can record save points, and perform a partial rollback to one of these if an error occurs.

The approach you take to errors depends on whether you think the integration module needs to do anything when an error occurs. If there is nothing useful that the module can do, you can permit the message to go straight to the DLQ. If you need to implement some error handling in the module, you can wrap the integration code in a doTry block, with one or more doCatch blocks for the error conditions.

On exception

Can be used with a short redelivery period to make the route retry. This is useful in the case of a temporary network problem. You place the config inside your camel context, but outside of the routes.
NOTE: This does NOT retry from the start of the route! Camel saves a copy of the message with all headers and properties, and retries the step that failed!
<onException>
    <exception>java.io.IOException</exception>
    <redeliveryPolicy redeliveryDelay="10000" maximumRedeliveries="5"/>
    <handled>
        <constant>false</constant>
    </handled>
    <log loggingLevel="ERROR" message="Failure processing, giving up with exception: ${exception}"/>
</onException>

Using doCatch and onWhen

Camel actually has a more powerful catch mechanism than plain Java, as you can specify not just an exception type, but additional conditions. e.g.
<doTry>
    <to uri="cxfrs:bean:someCxfClient" pattern="InOut"/>
    <doCatch>
        <exception>org.apache.camel.component.cxf.CxfOperationException</exception>
        <onWhen>
            <simple>${exception.statusCode} == 401</simple>
        </onWhen>
        <!-- handle the 401 here, e.g. refresh the cached credentials -->
    </doCatch>
</doTry>
Note you MUST use onWhen here. If you use something like filter or choice you have a code bug – you will catch all exceptions of the specified type, but only handle some of them!

Catching an error, making changes and retrying

This is a common pattern. Consider a route which does the following:
  1. Performs a login.
  2. Stores login credentials in cache.
  3. Connects to remote service.
We always need to consider the possibility that the cached credentials may be invalid. In this case, we need to:
  1. Identify the specific error or return code that says the credentials are invalid.
  2. Clear the cache.
  3. Login again.
  4. Retry the remote call.
You might think that you could catch the error, clear the cache, and then rethrow the exception so that your error handler will retry the message. This won’t work! If you have added the new login details to the message, they will be ignored, because the error handler will be retrying with the saved copy of the message that was the input to the failing step! There are two ways to deal with this problem:
  1. Manual approach. Simply catch the error, make the appropriate changes, then call the remote service again. You may need to set up headers for the remote service, so you might find this easier if you move the code for setting up the headers and making the call to its own route.
  2. Group together the steps you need to be rerun into their own route, and mark this as having no error handler. Then the exception gets propagated back up to the calling route. When this gets handled by an error handler, it will retry from the start of the second route. In the caching example, you would group together getting the auth token from the cache and then making the remote call.

Filter to detect number of retries

When you retry from the AMQ broker, it updates a message header with the retry number. You can use this to detect when a certain number of retries have been attempted, and invoke your error handling code. You should place this filter as close as possible to the top of your route. If not, the code could fail again before hitting the filter, and end up in the DLQ. Sample:
<log id="logRetryCount" loggingLevel="INFO" message="JMS delivery count: ${header.JMSXDeliveryCount}"/>
<filter>
    <simple>$simple{header.JMSXDeliveryCount} > {{jms.max.retries}}</simple>
    <log message="Retry limit has been reached" loggingLevel="INFO"/>
<!-- do your error handling in here, like setting the status to failed, or placing the message on a dedicated failure queue -->
    <stop/>
</filter>
I found in testing that when I tried to set the JMS header directly, it seemed to be ignored. However you can set it in your test by using an advice to weave in extra code. As long as you have a step in the route with an id, you can insert extra code before or after it. In the sample above, we have a log statement just before the filter, so in the test code we have:
AdviceWith.adviceWith(camelContext, "myCamelRoute", a -> {
    a.weaveById("logRetryCount").before()
            .process(exchange -> {
                if (maxRetriesExceeded) {
                    exchange.getIn().setHeader("JMSXDeliveryCount", 6);
                }
            });
});
Setting the header to the retry count only makes sense if the route handles the error itself. If instead you need the message to end up in the DLQ, you would re-throw the exception. To test this behaviour, configure a parameter for the delivery count threshold, and set it to 2 for tests which retry exactly once.

Exceptions and error codes

When throwing exceptions, generally each different exception should have a different message, and potentially a unique error code. This makes it far easier to debug real failures, as you can easily find the section of code which threw the exception.

See Also

Other posts on Camel: Type conversion in Camel
Posted in Camel, Java

Gradle incremental tasks and builds

One of the things that makes a build efficient is when it can run incrementally, i.e. if you have already run one build, then change things and run another, the second build should only have to rerun the tasks whose inputs have changed. Gradle has great support for this. I recently came across an example while migrating a large build from Maven to Gradle. In this build, we have three steps that do the following:
  1. Generate JAXB java classes from XSDs
  2. Precompile these classes plus a small number of other classes
  3. Run a custom annotation processor which will add json annotations to the generated code

The custom annotation task is defined in a separate git repository, and it needed a custom classpath. I don’t believe you can change the classpath for normal tasks, but a JavaExec task runs in a new JVM, so you can configure its classpath as you wish. Hence this is the setup I used. It looked like this:

tasks.register('jacksonAnnotationTask', JavaExec) {

    classpath = sourceSets.main.compileClasspath
    classpath += files("$buildDir/classes/java/generatedSource")
    classpath += configurations.jacksonAnnotationTaskClasspath

    mainClass = 'com.ice.integration.gradle.JacksonAnnotationTask'

    args = ["$buildDir/generated/jaxb/java/claimsApiModelClasses", "com"]
}

These steps all happen before the main compile. When I did a compile, then repeated it, I was disappointed to see that the custom annotation task was rerun. What was going on?

How do you see what is going on with a Gradle build? The easiest thing to do is rerun with the -d debug flag. Once I did this, the problem was obvious – the task was rewriting the generated source files in place, therefore the inputs to the task had changed, therefore the task had to be rerun. Once I understood this, the fix was clear – the task should output the updated files to a new location. I updated the task code to do this, adding a third parameter to specify the output directory. Then I updated the JavaExec config to specify the input and output, like this:

tasks.register('jacksonAnnotationTask', JavaExec) {
    // we must declare inputs and outputs
    // otherwise Gradle will just rerun this task every time
    String outputDirectory = "$buildDir/generated/jaxb/java/claimsApiModelClassesWithJsonAnnotations"
    inputs.files(sourceSets.generatedSource.java)
    outputs.dir(outputDirectory)

    classpath = sourceSets.main.compileClasspath
    classpath += files("$buildDir/classes/java/generatedSource")
    classpath += configurations.jacksonAnnotationTaskClasspath

    mainClass = 'com.ice.integration.gradle.JacksonAnnotationTask'

    args = ["$buildDir/generated/jaxb/java/claimsApiModelClasses", "com", outputDirectory]
}

Once I made this change, rerunning the compile task told me that all tasks were up to date, nothing to be rerun! Fantastic!

For more information on incremental builds, see:
https://docs.gradle.org/current/userguide/incremental_build.html

For other blog posts on Gradle, see:
Dependencies and configurations in Gradle
Gradle release plugin
Using test fixtures in Gradle and Maven
Code coverage with Gradle and Jacoco

Posted in Gradle, Java

Gradle Release Plugin

Gradle has a release plugin that mimics the Maven release plugin behaviour, i.e. you specify a snapshot version in your build, and the plugin can update the version to a release version, then commit and push that change.

The plugin is not bundled as part of Gradle core, but is available here:

https://github.com/researchgate/gradle-release

Configuration

To use it, set a snapshot version in your gradle.properties file, like:

version=1.1-SNAPSHOT

Your build.gradle config should be as follows:


plugins {
    id 'java'
    id 'net.researchgate.release' version '2.8.1'
    id 'maven-publish'
}

group 'com.something'

repositories {
    mavenLocal()
    mavenCentral()
    maven {
        url = uri('http://your-maven-repo:8080/repository/your-repo-url/')
    }
}

java {
    withSourcesJar()
}

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
    }

    repositories {
        // define our repo for publishing.
        // Note that you do not need to include maven local here to enable publishing of snapshots
        // to the local repo
        maven {
            name = "myRepo"
            def releasesRepoUrl = "http://myRepoURL:8080/repository/releases"
            def snapshotsRepoUrl = "http://myRepoURL:8080/repository/snapshots"
            url = version.endsWith('SNAPSHOT') ? snapshotsRepoUrl : releasesRepoUrl
            allowInsecureProtocol = true
            // credentials are not stored in the project, put them in gradle.properties on your build server
            credentials(PasswordCredentials)
        }
    }
}

// publish every build to local maven, to enable local testing
publishToMavenLocal.dependsOn(check)
build.dependsOn(publishToMavenLocal)
// publish release builds to remote repo
afterReleaseBuild.dependsOn publish
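With this configuration in place, you run a release with the plugin’s release task, i.e. “gradlew release”. The plugin removes the -SNAPSHOT suffix, commits and tags the release version, then bumps the project to the next snapshot version.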

Disabling git repo caching on Bamboo

If you use the Bamboo CI server, you may hit this problem. By default, Bamboo will create a cached version of a git repository on a build agent. This causes a release to fail, as the cache appears to be created as a non-bare git repo. (Why!?) You will get an error like this:
build   12-Jul-2023 21:25:01    > Task :CPA-GJAP-JOB1:preTagCommit FAILED
build   12-Jul-2023 21:25:01    Running [git, push, --porcelain, origin, master] produced an error: [remote: error: refusing to update checked out branch: refs/heads/master       
build   12-Jul-2023 21:25:01    remote: error: By default, updating the current branch in a non-bare repository       
build   12-Jul-2023 21:25:01    remote: is denied, because it will make the index and work tree inconsistent       
build   12-Jul-2023 21:25:01    remote: with what you pushed, and will require 'git reset --hard' to match       
build   12-Jul-2023 21:25:01    remote: the work tree to HEAD.       
build   12-Jul-2023 21:25:01    remote:
build   12-Jul-2023 21:25:01    remote: You can set the 'receive.denyCurrentBranch' configuration variable       
build   12-Jul-2023 21:25:01    remote: to 'ignore' or 'warn' in the remote repository to allow pushing into       
build   12-Jul-2023 21:25:01    remote: its current branch; however, this is not recommended unless you       
build   12-Jul-2023 21:25:01    remote: arranged to update its work tree to match what you pushed in some       
build   12-Jul-2023 21:25:01    remote: other way.       
build   12-Jul-2023 21:25:01    remote:
build   12-Jul-2023 21:25:01    remote: To squelch this message and still keep the default behaviour, set       
build   12-Jul-2023 21:25:01    remote: 'receive.denyCurrentBranch' configuration variable to 'refuse'.       
build   12-Jul-2023 21:25:01    error: failed to push some refs to 'file:///opt/atlassian/bamboo_home/local-working-dir/_git-repositories-cache/75a29889fc2f629c383438079d8c939799ad3383']

You need to go into the Bamboo admin settings for that repository (not the build plan) and deselect “Enable repository caching on agents”.

Avoiding an infinite build loop with Bamboo

A second gotcha when using Bamboo! Bamboo is hard coded to understand and ignore Maven release plugin commits. This is not the case for Gradle release plugin commits, so without additional configuration you will get an infinite build loop. You can fix this by configuring the repository settings to ignore Gradle release plugin commits.

https://stackoverflow.com/questions/48692732/gradle-release-plugin-release-task-on-bamboo-cause-infinite-loop

You need to update the Bamboo admin settings for the repository so that it does not trigger builds when the commit message matches:

.*?Gradle Release Plugin.*?

initScmAdapter FAILED :Current Git branch is master not main

From v3 of the plugin onwards, the default git branch has changed from master to main. If you are still using master, you can configure this:
release {
    git {
        requireBranch.set('master')
    }
}
For other Gradle posts, see:
Dependencies and Configurations in Gradle
Using test fixtures in Gradle and Maven
Code coverage with Gradle and Jacoco
Posted in Gradle, Java

Dependencies and Configurations in Gradle

What is a Gradle configuration?

In Maven, dependencies can be assigned to a given scope:
  • compile – available on all classpaths and propagated to dependent projects
  • provided – will be provided by your container, so not packaged into the application
  • runtime – needed at runtime, but not on the compile classpath
  • test – only available on the test compile and runtime classpaths
Gradle has a much broader concept called a configuration.

https://docs.gradle.org/current/userguide/declaring_dependencies.html

  • Unlike in Maven, in Gradle it is easy to declare your own configurations.
  • A configuration is a named collection of files or dependencies.
  • Configurations have producers, which assign files to them, and consumers, which use them.
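As a minimal sketch of producing and consuming a custom configuration (the configuration name and coordinates here are hypothetical):

configurations {
    schemaFiles
}

dependencies {
    schemaFiles 'com.example:schemas:1.0'  // producer side: assign files to the configuration
}

tasks.register('listSchemas') {
    def schemas = configurations.schemaFiles
    inputs.files(schemas)
    doLast {
        schemas.files.each { println it }  // consumer side: resolve and use the files
    }
}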

Different configurations are defined in different plugins, so if you try to use a configuration and get an error, confirm you have the relevant plugin applied in your module, e.g. https://docs.gradle.org/current/userguide/java_library_plugin.html

Common configurations:

  • api – on the compile classpath for this module, and for all consuming modules.
  • implementation – on the compile classpath for this module, and the runtime classpath for the whole app, but NOT on the compile classpath of later modules. This prevents developers from accidentally coding against a library that is used internally in one module.
  • compileOnly – on the compile classpath only; not packaged into the application.
  • runtimeOnly – on the runtime classpath only.
  • testImplementation – implementation, for test code.
  • testApi – api, for test code.
  • testFixturesImplementation – used when defining a test fixture. See Using test fixtures in Gradle and Maven.
  • testFixturesApi – api, for test fixture code.
  • testFixtures – used when consuming test fixtures from another module. See Using test fixtures in Gradle and Maven.
  • annotationProcessor – for annotation processing in the compile phase.

Dependency resolution strategy and automatic dependency upgrading

Gradle performs optimistic dependency upgrading. This can cause confusion: you may have a version of a dependency explicitly specified, but if a transitive dependency pulls in a higher version, Gradle will take the higher version. This is not what Maven would do – it would always take the explicitly specified version.
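You can inspect why a particular version was chosen with the dependencyInsight task, e.g. “gradlew dependencyInsight --dependency reactor-core --configuration compileClasspath”.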

However in Gradle you can override the default dependency resolution strategy. There are multiple ways to do this:

Global fail if any conflict

configurations.all {
    resolutionStrategy {
        failOnVersionConflict()
    }
}

Custom resolution

e.g. fix version
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        if (details.requested.group == 'io.projectreactor' && details.requested.name == 'reactor-core' && details.requested.version == '3.3.0.M1') {
            details.useVersion '3.2.10.RELEASE'
            details.because 'the original version which comes with spring-integration:5.2.0.M2 is no longer available'
        }
    }
}
See: https://docs.gradle.org/current/userguide/resolution_strategy_tuning.html
Posted in Gradle, Java