
In this tutorial, we will be building a demo web application for a Dog Rescue organization that uses JdbcTemplate and Thymeleaf. For this example, we will be using a MySQL database. However, this example is not limited to MySQL and the database could be swapped out for another type with ease.

You can browse and download the code on GitHub as you follow this example.

1 – Project Structure

The project uses a typical Maven structure. You may notice I am using Spring Tool Suite, which JT is not a fan of!

Project structure of Spring Boot Thymeleaf JdbcTemplate tutorial


2 – Dependencies

Besides the typical Spring Boot starter dependencies, we include Thymeleaf and the MySQL connector.
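
The exact POM is in the GitHub project; a sketch of the dependency section might look like this (spring-boot-starter-data-jpa is an assumption here, since the repository below extends CrudRepository; versions are managed by the Spring Boot parent POM):

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-thymeleaf</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
    </dependency>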


3 – Configuration

We configure all our datasource information here in application.properties. Later, this datasource is autowired for our JdbcTemplate to use.

application.properties
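
A minimal sketch; the URL, username, and password are placeholders for your own MySQL instance:

    spring.datasource.url=jdbc:mysql://localhost:3306/dogrescue
    spring.datasource.username=dbuser
    spring.datasource.password=dbpassword
    spring.datasource.driver-class-name=com.mysql.jdbc.Driver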

4 – Database Initialization

When our application starts, these SQL files will be automatically detected and run. In our case, we will drop the table “dog” every time the application starts, create a new table named “dog”, and then insert the values shown in data.sql.

You may recall that “vaccinated” is a Boolean value in Java. In MySQL, BOOLEAN is a synonym for TINYINT(1), so we can use this data type for the column.

schema.sql
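
A plausible schema.sql, assuming the column names used later in this post:

    DROP TABLE IF EXISTS dog;

    CREATE TABLE dog (
      id BIGINT NOT NULL AUTO_INCREMENT,
      name VARCHAR(255),
      rescued DATE,
      vaccinated TINYINT(1),
      PRIMARY KEY (id)
    );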

data.sql
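
And a data.sql along these lines (Buddy and Pooch appear in the demo below; the third dog and all dates are assumptions):

    -- Buddy is intentionally unvaccinated so the "at risk" search returns a result.
    INSERT INTO dog (name, rescued, vaccinated) VALUES ('Buddy', '2017-01-01', 0);
    INSERT INTO dog (name, rescued, vaccinated) VALUES ('Pooch', '2016-05-15', 1);
    INSERT INTO dog (name, rescued, vaccinated) VALUES ('Rex', '2016-11-30', 1);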

5 – Model/Entity

Here we define the characteristics of a dog that we want to track for our Dog Rescue. The getters and setters were auto-generated; it is suggested to let your IDE do this to save time.
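
A sketch of the entity, assuming the field names used throughout this post:

    import java.util.Date;

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.GenerationType;
    import javax.persistence.Id;
    import javax.persistence.Temporal;
    import javax.persistence.TemporalType;

    @Entity
    public class Dog {

        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;

        private String name;

        @Temporal(TemporalType.DATE)
        private Date rescued;

        private Boolean vaccinated;

        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public Date getRescued() { return rescued; }
        public void setRescued(Date rescued) { this.rescued = rescued; }
        public Boolean getVaccinated() { return vaccinated; }
        public void setVaccinated(Boolean vaccinated) { this.vaccinated = vaccinated; }
    }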

6 – Repository

We extend the CrudRepository for our DogRepository. The only additional method we create is a derived query for finding a dog by name.
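
A sketch, assuming a List return type for the derived query:

    import java.util.List;

    import org.springframework.data.repository.CrudRepository;

    public interface DogRepository extends CrudRepository<Dog, Long> {

        // Derived query: Spring Data generates the implementation from the method name
        List<Dog> findByName(String name);
    }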

7 – Service

Following the SOLID principles that JT discusses on the site here: SOLID Principles, we build a service interface and then implement that interface.

DogService.java

DogServiceImpl.java

Here we implement the methods mentioned in DogService.java.

  • addADog – an example of how to add a record using JdbcTemplate’s update method. It takes three parameters: String, Date and Boolean.
  • deleteADog – an example of how to delete a record using JdbcTemplate’s update method. It takes two parameters: Long (id) and String (name).
  • atRiskDogs – an example of how to select records using JdbcTemplate’s query method with a ResultSetExtractor. It takes one parameter: Date. The method returns a List of dogs that were rescued before that date and have not been vaccinated (Boolean value of false).
  • getGeneratedKey – an example of how to insert a record using JdbcTemplate’s update method with a PreparedStatementCreator and retrieve the generated key (a long). It takes the same parameters as the other insert example: String, Date and Boolean. A sketch of these methods appears below.
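
As a hedged sketch of how these methods might look (the exact SQL and method signatures are in the GitHub project; java.sql.Date is assumed for the Date parameters):

    import java.sql.Date;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.dao.DataAccessException;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.core.ResultSetExtractor;
    import org.springframework.jdbc.support.GeneratedKeyHolder;
    import org.springframework.jdbc.support.KeyHolder;
    import org.springframework.stereotype.Service;

    @Service
    public class DogServiceImpl implements DogService {

        @Autowired
        private JdbcTemplate jdbcTemplate;

        @Override
        public void addADog(String name, Date rescued, Boolean vaccinated) {
            // update() executes the insert with the given parameter values
            jdbcTemplate.update(
                    "INSERT INTO dog (name, rescued, vaccinated) VALUES (?, ?, ?)",
                    name, rescued, vaccinated);
        }

        @Override
        public void deleteADog(Long id, String name) {
            jdbcTemplate.update("DELETE FROM dog WHERE id = ? AND name = ?", id, name);
        }

        @Override
        public List<Dog> atRiskDogs(Date date) {
            // A ResultSetExtractor walks the whole ResultSet and builds the list
            return jdbcTemplate.query(
                    "SELECT * FROM dog WHERE rescued < ? AND vaccinated = FALSE",
                    new Object[] { date },
                    new ResultSetExtractor<List<Dog>>() {
                        @Override
                        public List<Dog> extractData(ResultSet rs)
                                throws SQLException, DataAccessException {
                            List<Dog> list = new ArrayList<>();
                            while (rs.next()) {
                                Dog dog = new Dog();
                                dog.setId(rs.getLong("id"));
                                dog.setName(rs.getString("name"));
                                dog.setRescued(rs.getDate("rescued"));
                                dog.setVaccinated(rs.getBoolean("vaccinated"));
                                list.add(dog);
                            }
                            return list;
                        }
                    });
        }

        @Override
        public long getGeneratedKey(String name, Date rescued, Boolean vaccinated) {
            KeyHolder keyHolder = new GeneratedKeyHolder();
            // The PreparedStatementCreator asks the driver to return generated keys
            jdbcTemplate.update(con -> {
                PreparedStatement ps = con.prepareStatement(
                        "INSERT INTO dog (name, rescued, vaccinated) VALUES (?, ?, ?)",
                        Statement.RETURN_GENERATED_KEYS);
                ps.setString(1, name);
                ps.setDate(2, rescued);
                ps.setBoolean(3, vaccinated);
                return ps;
            }, keyHolder);
            return keyHolder.getKey().longValue();
        }
    }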

8 – Controller

DogController.java

    • @GetMapping(value = “/”) – there is an optional request parameter, a search value of type Date in yyyy-MM-dd format. This variable is called q (for “query”); if it is not null, we create an ArrayList of all dogs rescued before that date that have not been vaccinated. This ArrayList, called dogModelList, is added as an attribute known as “search”, which will be used in our Thymeleaf template. Because of its ease of use, we also use the built-in findAll method of the CrudRepository to create a list of all dogs in the repository and add it as an attribute, which Thymeleaf will use as well.
    • @PostMapping(value = “/”) – we request all the parameters that will be passed in our HTML form, and use these values to add a dog to our database.
    • @PostMapping(value = “/delete”) – we request the parameters needed to delete a dog. After the dog is deleted, we redirect the user back to our homepage.
    • @PostMapping(value = “/genkey”) – the mapping for inserting a record that returns a generated key. The generated key is printed to standard out in our example. A sketch of the controller follows.
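
A sketch of the controller (handler method names and the “dogList” attribute name are assumptions):

    import java.sql.Date;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestParam;

    @Controller
    public class DogController {

        @Autowired
        private DogService dogService;

        @Autowired
        private DogRepository dogRepository;

        @GetMapping(value = "/")
        public String index(@RequestParam(value = "q", required = false) String q, Model model) {
            if (q != null && !q.isEmpty()) {
                // Dogs rescued before the given date that are not vaccinated
                model.addAttribute("search", dogService.atRiskDogs(Date.valueOf(q)));
            }
            // findAll() from CrudRepository lists every dog for the main table
            model.addAttribute("dogList", dogRepository.findAll());
            return "index";
        }

        @PostMapping(value = "/")
        public String addDog(@RequestParam String name, @RequestParam String rescued,
                             @RequestParam Boolean vaccinated) {
            dogService.addADog(name, Date.valueOf(rescued), vaccinated);
            return "redirect:/";
        }

        @PostMapping(value = "/delete")
        public String deleteDog(@RequestParam Long id, @RequestParam String name) {
            dogService.deleteADog(id, name);
            return "redirect:/";
        }

        @PostMapping(value = "/genkey")
        public String generatedKey(@RequestParam String name, @RequestParam String rescued,
                                   @RequestParam Boolean vaccinated) {
            long key = dogService.getGeneratedKey(name, Date.valueOf(rescued), vaccinated);
            System.out.println("Generated key: " + key);
            return "redirect:/";
        }
    }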

9 – Thymeleaf template

As this is a basic example application to demonstrate approaches to JdbcTemplate, JPA, Thymeleaf, and other technologies, we have just this one page with a minimalist user interface.

      • Using th:each, we are able to iterate through all the records in our dog table.
      • Using th:text with the variable and field name, we can display each record’s values, e.g. th:text=”${dogs.id}”.
      • Using th:if=”${not #lists.isEmpty(search)}”, we prevent the web page from showing the table of search results for dogs at risk (not vaccinated) unless there are results to be shown.
      • With our forms, we map the request to a specific URI and give the inputs of each form names that match the parameters in our controller.

index.html
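
A condensed sketch of the template (the real page has more markup; the dogList and search attribute names match the controller sketch above):

    <!DOCTYPE html>
    <html xmlns:th="http://www.thymeleaf.org">
    <head><title>Dog Rescue</title></head>
    <body>
    <h2>Our Dogs</h2>
    <table>
        <tr><th>ID</th><th>Name</th><th>Rescued</th><th>Vaccinated</th></tr>
        <!-- th:each iterates over every record in the dog table -->
        <tr th:each="dogs : ${dogList}">
            <td th:text="${dogs.id}"></td>
            <td th:text="${dogs.name}"></td>
            <td th:text="${dogs.rescued}"></td>
            <td th:text="${dogs.vaccinated}"></td>
        </tr>
    </table>

    <!-- Only render the results table when the search returned something -->
    <div th:if="${not #lists.isEmpty(search)}">
        <h2>Dogs Needing Vaccination</h2>
        <table>
            <tr th:each="dog : ${search}"><td th:text="${dog.name}"></td></tr>
        </table>
    </div>

    <h2>Add A Dog</h2>
    <!-- Input names match the @RequestParam names in the controller -->
    <form action="/" method="post">
        <input type="text" name="name" placeholder="Name"/>
        <input type="text" name="rescued" placeholder="yyyy-MM-dd"/>
        <input type="text" name="vaccinated" placeholder="true or false"/>
        <input type="submit" value="Add"/>
    </form>
    </body>
    </html>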

10 – @SpringBootApplication

Our class with the main method has nothing unique in it. The @SpringBootApplication annotation takes care of autodetecting beans that are registered with the various stereotype annotations, such as @Service, etc.
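
For completeness, a sketch (the class name is a placeholder):

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;

    @SpringBootApplication
    public class DogRescueApplication {

        public static void main(String[] args) {
            SpringApplication.run(DogRescueApplication.class, args);
        }
    }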

11 – Demo

Landing page

So, I have navigated to localhost:8080, as I did not change the default port for this application. When I land on the page, you can see that it displays the current dogs in our database.

Dog Rescue homepage

Find Dogs That Need Vaccines

Imagine that instead of three dogs in this database we had a larger, less manageable number. A feature that allows employees of a dog rescue to find dogs that need vaccinations becomes far more useful as the number of dogs grows.

The search functionality takes a date and shows dogs that were rescued before that date that have not been vaccinated.

Although we know right now that Buddy is the only dog without his vaccinations, let’s show how this works.

Search for dogs without vaccine

Thymeleaf conditional statement

Add A Dog

As we know, the ID is autogenerated. So we can supply all the fields minus the ID and still successfully add a Dog to the database.
Adding a dog in thymeleaf template

Table in thymeleaf

Delete A Dog

We remove a dog from the database using the primary key ID, but we also ask for the name of the dog to verify it is the correct one.

We redirect the user back to the index, so it displays the table of dogs minus the one deleted. Below you can see I have removed “Pooch”.
Delete JdbcTemplate Thymeleaf

Spring Boot Delete Dog

Add A Dog And Retrieve Generated Key

Sometimes we need to retrieve the generated key from our database for other uses. Here in this example, we insert a dog named “Lassie” and retrieve the generated key.

Insert Lassie in database

In the console, this is printed:

Our table is once again updated

updated table

Download the code from GitHub

If you’d like, you can view and download the code from GitHub.

About Michael

Michael Good is a software engineer located in the Washington DC area who is interested in Java, cyber security, and open source technologies. Follow his personal blog to read more from Michael.


I have met many developers who refer to tests as “unit tests” when they are actually integration tests. In service layers, I’ve seen tests referred to as unit tests, but written with dependencies on the actual service, such as a database, web service, or some message server. Those are part of integration testing. Even if you’re just using the Spring Context to auto-wire dependencies, your test is an integration test. Rather than using the real services, you can use Mockito mocks and spies to keep your tests unit tests and avoid the overhead of running integration tests.

This is not to say Integration tests are bad. There is certainly a role for integration tests. They are a necessity.

But compared to unit tests, integration tests are sloooowwwww. Very slow. Your typical unit test will execute in a fraction of a second. Even complex unit tests on obsolete hardware will still complete sub-second.

Integration tests, on the other hand, take several seconds to execute. It takes time to start the Spring Context. It takes time to start an H2 in-memory database. It takes time to establish a database connection.

While this may not seem like much, it adds up quickly on a large project. As you add more and more tests, the length of your build becomes longer and longer.

No developer wants to break the build. So we run all the tests to be sure. As we code, we’ll be running the full suite of tests multiple times a day. For your own productivity, the suite of tests needs to run quickly.

If you’re writing Integration tests where a unit test would suffice, you’re not only impacting your own personal productivity. You’re impacting the productivity of the whole team.

On a recent client engagement, the development team was very diligent about writing tests. Which is good. But, the team favored writing Integration tests. Frequently, integration tests were used where a Unit test could have been used. The build was getting slower and slower. Because of this, the team started refactoring their tests to use Mockito mocks and spies to avoid the need for integration tests.

They were still testing the same objectives. But Mockito was being used to fill in for the dependency driving the need for the integration test.

For example, Spring Boot makes it easy to test using an H2 in-memory database with JPA and repositories supplied by Spring Data JPA.

But why not use Mockito to provide a mock for your Spring Data JPA repository?

Unit tests should be atomic, lightweight, and fast, exercising an isolated unit. Additionally, unit tests in Spring should not bring up a Spring Context. I have written about the different types of tests in my earlier Testing Software post.

I have already written a series of posts on JUnit and a post on Testing Spring MVC With Spring Boot 1.4: Part 1. In the latter, I discussed unit testing controllers in a Spring MVC application.

I feel the majority of your tests should be unit tests, not integration tests. If you’re writing your code following the SOLID Principles of OOP, your code is already well structured to accept Mockito mocks.

In this post, I’ll explain how to use Mockito to test the service layer of a Spring Boot MVC application. If Mockito is new for you, I suggest reading my Mocking in Unit Tests With Mockito post first.

Mockito Mocks vs Spies

In unit testing, a test double is a replacement for a dependent component (collaborator) of the object under test. The test double does not have to behave exactly like the collaborator. The purpose is to mimic the collaborator to make the object under test think that it is actually using the collaborator.

Based on the role played during testing, there can be different types of test doubles. In this post we’re going to look at mocks and spies.

There are some other types of test doubles, such as dummy objects, fake objects, and stubs. If you’re using Spock, one of my favorite tricks was to cast a map of closures in as a test double. (One of the many fun things you can do with Groovy!)

What makes a mock object different from the others is that it has behavior verification. Which means the mock object verifies that it (the mock object) is being used correctly by the object under test. If the verification succeeds, you can conclude the object under test will correctly use the real collaborator.

Spies, on the other hand, provide a way to spy on a real object. With a spy, you can call all the real underlying methods of the object while still tracking every interaction, just as you would with a mock.

Things get a bit different for Mockito mocks vs spies. A Mockito mock allows us to stub a method call, which means we can stub a method to return a specific object. For example, we can mock a Spring Data JPA repository in a service class to stub a getProduct() method of the repository to return a Product object. To run the test, we don’t need the database to be up and running – a pure unit test.

A Mockito spy is a partial mock. We can mock a part of the object by stubbing a few methods, while real method invocations are used for the others. Thus, calling a method on a spy invokes the actual method unless we explicitly stub it – hence the term partial mock.

Let’s look at mocks vs spies in action, with a Spring Boot MVC application.

The Application Under Test

Our application contains a single Product JPA entity. CRUD operations are performed on the entity by ProductRepository using a CrudRepository supplied by Spring Data JPA. If you look at the code, you will see all we did was extend the Spring Data JPA CrudRepository to create our ProductRepository. Under the hood, Spring Data JPA provides implementations to manage entities for most common operations, such as saving an entity, updating it, deleting it, or finding it by id.

The service layer is developed following the SOLID design principles. We used the “Code to an Interface” technique, while leveraging the benefits of dependency injection. We have a ProductService interface and a ProductServiceImpl implementation. It is this ProductServiceImpl class that we will unit test.

Here is the code of ProductServiceImpl.

ProductServiceImpl.java
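
A sketch of the class, with method names taken from the discussion below (the full source accompanies the original post):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Service;

    @Service
    public class ProductServiceImpl implements ProductService {

        private ProductRepository productRepository;

        @Autowired
        public void setProductRepository(ProductRepository productRepository) {
            this.productRepository = productRepository;
        }

        @Override
        public Product getProductById(Integer id) {
            // Spring Data JPA's CrudRepository supplies findOne()
            return productRepository.findOne(id);
        }

        @Override
        public Product saveProduct(Product product) {
            return productRepository.save(product);
        }

        @Override
        public void deleteProduct(Integer id) {
            productRepository.delete(id);
        }
    }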

In the ProductServiceImpl class, you can see that ProductRepository is @Autowired in. The repository is used to perform CRUD operations – making it a mock candidate when testing ProductServiceImpl.

Testing with Mockito Mocks

Coming to the testing part, let’s take up the getProductById() method of ProductServiceImpl. To unit test the functionality of this method, we need to mock the external Product and ProductRepository objects. We can do it either by using Mockito’s mock() method or through the @Mock annotation. We will use the latter option, since it is convenient when you have a lot of mocks to inject.

Once we declare a mock with the @Mock annotation, we also need to initialize it. Mock initialization happens before each test method. We have two options – using the JUnit test runner MockitoJUnitRunner, or MockitoAnnotations.initMocks(). Both are equivalent solutions.

Finally, you need to provide the mocks to the object under test. You can do it by calling the setProductRepository() method of ProductServiceImpl or by using the @InjectMocks annotation.

The following code creates the Mockito mocks, and sets them on the object under test.
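
A sketch of the setup (Product’s id type is assumed to be Integer):

    import org.junit.Before;
    import org.mockito.InjectMocks;
    import org.mockito.Mock;
    import org.mockito.MockitoAnnotations;

    public class ProductServiceImplMockTest {

        @Mock
        private ProductRepository productRepository;

        // Injects the mock repository into the service under test
        @InjectMocks
        private ProductServiceImpl productService;

        @Before
        public void setUp() {
            MockitoAnnotations.initMocks(this);
        }
    }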

Note: Since we are using the Spring Boot Test starter dependency, Mockito core is automatically pulled into our project. Therefore, no extra dependency declaration is required in our Maven POM.

Once our mocks are ready, we can start stubbing methods on the mock. Stubbing means simulating the behavior of a mock object’s method. We can stub a method on the ProductRepository mock object by setting up an expectation on the method invocation.

For example, we can stub the findOne() method of the ProductRepository mock to return a Product when called. We then call the method whose functionality we want to test, followed by an assertion, like this.
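
Something along these lines, assuming the usual static imports of org.mockito.Mockito.when and org.junit.Assert.assertEquals:

    @Test
    public void shouldReturnProduct_whenGetProductByIdIsCalled() {
        Product product = new Product();

        // Stub the repository call - no database needed
        when(productRepository.findOne(1)).thenReturn(product);

        Product retrievedProduct = productService.getProductById(1);

        // The stubbed instance is what the service hands back
        assertEquals(product, retrievedProduct);
    }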

This approach can be used to test the other methods of ProductServiceImpl, leaving aside deleteProduct(), which has void as its return type.

To test the deleteProduct(), we will stub it to do nothing, then call deleteProduct(), and finally assert that the delete() method has indeed been called.
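
A sketch, assuming static imports of doNothing, verify, and times from org.mockito.Mockito:

    @Test
    public void shouldCallRepositoryDelete_whenDeleteProductIsCalled() {
        // Explicitly stub the void method to do nothing (also the default for mocks)
        doNothing().when(productRepository).delete(1);

        productService.deleteProduct(1);

        // Behavior verification: the mock confirms delete() was invoked exactly once
        verify(productRepository, times(1)).delete(1);
    }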

Here is the complete test code for using Mockito mocks:

ProductServiceImplMockTest.java

Note: An alternative to doNothing() for stubbing a void method is to use doReturn(null).

Testing with Mockito Spies

We have tested our ProductServiceImpl with mocks. So why do we need spies at all? Actually, we don’t need one in this use case.

Outside Mockito, partial mocks were present for a long time to allow mocking only part (a few methods) of an object. But partial mocks were considered code smells, primarily because if you need to partially mock a class while ignoring the rest of its behavior, then this class is likely violating the Single Responsibility Principle by doing more than one thing.

Until Mockito 1.8, Mockito spies were not producing real partial mocks. However, after many debates & discussions, and after finding a valid use case for partial mock, support for partial mock was added to Mockito 1.8.

You can partially mock objects with thenCallRealMethod(). What it means is that without stubbing a method, you can call the underlying real method of a mock, like this.
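
A sketch of the idea, assuming the usual Mockito static imports (note the real method would need its dependencies in place to run safely):

    // An unstubbed mock method would just return a default value (null);
    // thenCallRealMethod() routes this one call to the real implementation instead.
    ProductServiceImpl mockService = mock(ProductServiceImpl.class);
    when(mockService.getProductById(1)).thenCallRealMethod();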

Be careful that the real implementation is ‘safe’ when using thenCallRealMethod(). The actual implementation needs to be able to run in the context of your test.

Another approach for partial mocking is to use a spy. As I mentioned earlier, all method calls on a spy are real calls to the underlying method, unless stubbed. So, you can also use a Mockito spy to partially mock an object by stubbing only a few methods.

Here is the code providing a Mockito spy for our ProductServiceImpl.

ProductServiceImplSpyTest.java
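
A condensed sketch of the test class (the original differs in detail, but the shape is the same):

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.mockito.Spy;
    import org.mockito.runners.MockitoJUnitRunner;

    import static org.mockito.Mockito.doReturn;
    import static org.mockito.Mockito.verify;

    @RunWith(MockitoJUnitRunner.class)
    public class ProductServiceImplSpyTest {

        @Spy
        private ProductServiceImpl productService;

        @Test(expected = NullPointerException.class)
        public void shouldThrow_whenRealMethodRunsWithoutRepository() {
            // A spy invokes the real method; no repository is wired in, hence the NPE
            productService.getProductById(1);
        }

        @Test
        public void shouldNotThrow_whenMethodIsStubbed() {
            Product product = new Product();

            // doReturn() stubs the spy without invoking the real method
            doReturn(product).when(productService).saveProduct(product);

            productService.saveProduct(product);

            // The relevant spy use case: verifying the method invocation
            verify(productService).saveProduct(product);
        }
    }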

In this test class, notice we used MockitoJUnitRunner instead of MockitoAnnotations.initMocks() for our annotations.

For the first test, we expected NullPointerException because the getProductById() call on the spy will invoke the actual getProductById() method of ProductServiceImpl, and our repository implementations are not created yet.

In the second test, we are not expecting any exception, as we are stubbing the save() method of ProductRepository.

The second and third methods are the relevant use cases of a spy in the context of our application – verifying method invocations.

Conclusion

In Spring Boot applications, by using Mockito, you replace the @Autowired components in the class you want to test with mock objects. In addition to unit testing the service layer, you will be unit testing controllers by injecting mock services. To unit test the DAO layer, you will mock the database APIs. The list is endless – it depends on the type of application you are working on and the object under test. If you’re following the Dependency Inversion Principle and using Dependency Injection, mocking becomes easy.

Use partial mocking to test 3rd party APIs and legacy code. You won’t require partial mocks for new, test-driven, and well-designed code that follows the Single Responsibility Principle. Also note that when()-style stubbing is problematic on spies, since it invokes the real method while stubbing – prefer doReturn(). Given a choice between thenCallRealMethod() on a mock and a spy, favor the former, as it is lightweight: thenCallRealMethod() on a mock does not create an actual object instance, only a bare-bones shell instance of the class to track interactions, whereas a spy creates a real object instance. Use a spy only if you want to modify the behavior of a small chunk of the API and otherwise rely mostly on actual method calls.

The code for this post is available for download here.


Container-based deployments are rapidly gaining popularity in the enterprise. One of the more popular container solutions is Docker.

Many view containers as virtual machines. They’re not. Well, kind of not. A container is a virtual walled environment for your application. It’s literally a ‘container’ inside the host OS. Thus your application works as if it is in its own self-contained environment, but it’s actually sharing operating system resources of the host computer. Because of this, containers are more resource efficient than full-blown virtual machines. You get more bang for your buck running a bare metal machine with a bunch of containers than you do running the same machine with a bunch of VMs. This is why massive cloud computing companies running tens of thousands of servers are running containers. Google, Facebook, Netflix, and Amazon are all big advocates of containers.

Introducing Docker Containers

To help you visualize the difference, here are a couple of images provided by Docker. First is the bloated architecture of a traditional virtual machine environment. A popular solution you can try is Oracle’s VirtualBox, which allows you to run a variety of operating systems on your personal machine. I personally use VMware Fusion to run Windows on my MBP (and I still feel a little dirty every time I do). If you’ve never used either of these, I recommend you give them a try.

In this graphic, note how each stack has its own Guest OS.

what-is-docker-diagram

Now for comparison, here is the same stack containerized by Docker. Here you can see that each application does not get its own operating system. This is key to why Docker containers are so efficient. You’re not providing a virtual layer to mimic the hardware for the guest OS to use. And you’re not running n+1 guest hosts either.

Docker What is a VM

Clearly this is more efficient computing. I’ve seen estimates of improved performance in the 10-25% range. But like everything else when it comes to computing performance – your mileage may vary. I’d expect lightweight Linux VMs to be closer to the 10% side of the scale, and Windows VMs probably closer to the 25% end – just because the Windows OS is so bloated in comparison.

This leads me to an important distinction about Docker – it’s Linux only. Yes, you can “run” Docker on Windows and OSX – but at this time, you can only do so via a Linux VM running in VirtualBox.

Running Spring Boot in a Docker Container

Introduction

When I first heard about running Spring Boot in a Docker container, I personally thought – “now why would you want to run a JVM in a VM, on a VM?” At first glance, it just seemed like an absolutely terrible idea from a performance standpoint. I doubt any of these solutions will ever match the performance of a JVM running on a bare metal installation of Linux. But, as I’ve shown above, running a Spring Boot application in a Docker container should have minimal performance impact. Certainly less of an impact than running in a VM – which is exactly what you’re doing when running applications in any cloud provider (see image one above).

Installing Docker

I’m not going to get into installing Docker on your OS. There is ample documentation about installing Docker on the internet. Going forward, I’m going to assume you have Docker installed. Since Docker is Linux based, my focus will be on Linux (RHEL/CentOS).

Spring Boot Example Application

For the purposes of this tutorial, let’s start with a simple Spring Boot Application. I’m going to use the completed application from my Mastering Thymeleaf course. This is a simple Spring Boot web application which is perfect for our needs.

If you want to follow this tutorial along step by step, head over to GitHub and checkout this Spring Boot project. Be sure to change to the branch “spring-boot-docker-start”.

Building a Spring Boot Docker Image

For us to run Spring Boot in a Docker container, we need to define a Docker image for it. Building Docker images is done through the use of Dockerfiles. A Dockerfile is basically a manifest of commands we use to build and configure our Docker container. To configure our Docker image to run our Spring Boot application, we will want to:

  • Start with the latest CentOS image from Docker Hub.
  • Install and configure Oracle Java.
  • Install the Spring Boot artifact – our executable JAR file.
  • Run the Spring Boot application.

I’m using CentOS for its compatibility with RHEL, which is probably the most popular Linux distribution used by enterprises. And Oracle’s Java, mainly for the same reason.

Create Our Dockerfile

In our Maven project, we need to create our Dockerfile. In /src/main/docker, create the file Dockerfile.

NOTE: As a Java developer you may be tempted to create the file as DockerFile. Don’t do this. The Maven Plugin we cover later won’t see your file if it’s CamelCase. I learned this lesson the hard way.

CentOS

We’ll start our Docker image off by using the CentOS image from Docker hub.

Dockerfile

Installing Oracle Java

The following lines in our Dockerfile will install wget into the image using the yum package installer, download the Oracle Java JDK from Oracle using wget, and then configure Java on the machine.

Dockerfile

Installing the Spring Boot Executable Jar

In this section of the Dockerfile, we are:

  • Adding a /tmp volume. Docker will map this to /var/lib/docker on the host system. This is the directory Spring Boot will configure Tomcat to use as its working directory.
  • The ADD command adds the Spring Boot executable jar into our Docker image.
  • The RUN command ‘touches’ the jar to give it a modified date.
  • The ENTRYPOINT is what will run the jar file when the container is started.

I learned about these configuration settings from a post from the Pivotal team here.

Dockerfile

Complete Dockerfile

Here is the complete Dockerfile.

Dockerfile
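
The exact Dockerfile is in the GitHub project; a sketch along the lines described above might look like this (the JDK download URL and jar name are placeholders):

    FROM centos

    # Install wget, then download and install Oracle's JDK.
    # NOTE: the JDK version and URL below are placeholders - Oracle's
    # download links change over time, so substitute a current one.
    RUN yum install -y wget
    RUN wget --no-cookies --no-check-certificate \
        --header "Cookie: oraclelicense=accept-securebackup-cookie" \
        "http://download.oracle.com/otn-pub/java/jdk/8u112-b15/jdk-8u112-linux-x64.rpm" \
        -O /tmp/jdk-8-linux-x64.rpm
    RUN yum localinstall -y /tmp/jdk-8-linux-x64.rpm

    # Spring Boot configures Tomcat to use /tmp as its working directory;
    # Docker maps this volume to /var/lib/docker on the host
    VOLUME /tmp

    # The Fabric8 assembly copies the executable jar into the maven/ directory
    # of the build context; the jar name depends on your artifact
    ADD maven/spring-boot-web.jar app.jar

    # 'touch' the jar so it has a modification date
    RUN sh -c 'touch /app.jar'

    # Run the jar when the container starts
    ENTRYPOINT ["java", "-Djava.security.egd=file:/dev/./urandom", "-jar", "/app.jar"]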

Building the Docker Image Using Maven

Naturally, we could build our Docker image using Docker itself. But this is not a typical use case for Spring developers. A typical use case for us is to use Jenkins to generate the Docker image as part of a CI build. For this use case, we can use Maven to package the Spring Boot executable jar, then have that build artifact copied into the Docker image.

There are actually several competing Maven plugins for Docker support. The guys at Spotify have a nice Maven/Docker plugin. In this example, I’m going to show you how to use the Fabric8 Docker plugin for Maven.

Fabric8

Of the Maven plugins for Docker, at the time of writing, Fabric8 seems to be the most robust. For this post, I’m only interested in building a Docker Image for our Spring Boot artifact. This is just scratching the surface of the capabilities of the Fabric8 Maven plugin. This plugin can be used to spool up Docker Images to use for your integration tests for CI builds. How cool is that!?!? But let’s learn to walk before we run!

Here is a typical configuration for the Fabric8 Maven plugin for Docker.

Fabric8 Maven Docker Plugin Configuration
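
A sketch of the plugin section (the plugin version and image name here are placeholders; check the GitHub project for the exact configuration):

    <plugin>
        <groupId>io.fabric8</groupId>
        <artifactId>docker-maven-plugin</artifactId>
        <version>0.15.9</version>
        <configuration>
            <images>
                <image>
                    <name>springframeworkguru/spring-boot-web</name>
                    <build>
                        <dockerFileDir>${project.basedir}/src/main/docker/</dockerFileDir>
                        <assembly>
                            <!-- Copies the project artifact into the Docker build context -->
                            <descriptorRef>artifact</descriptorRef>
                        </assembly>
                    </build>
                </image>
            </images>
        </configuration>
    </plugin>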

If you’re following along with the tutorial, the complete Maven POM now is:

pom.xml

Building the Docker Image

To build the Docker image with our Spring Boot artifact, run this command:
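
Based on the steps described below, the command is:

    mvn clean package docker:build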

The ‘clean’ tells Maven to delete the target directory. While this step is technically optional, if you don’t use it, sooner or later you’re going to get bit in the ass by some weird issue. Maven will always compile your classes with the package command. If you’ve done some refactoring and changed class names or packages, without the ‘clean’ the old class files are left on the disk. And in the words of IBM – “Unpredictable results may occur”.

It is very important to run the package command with the docker:build command. You’ll encounter errors if you try to run these in two separate steps.

While the Docker image is building, you will see the following output in the console:

Docker images are built in layers. The CentOS image from Docker Hub is our first layer. Each command in our Dockerfile is another ‘layer’. Docker works by ‘caching’ these layers locally. I think of it as being somewhat like your local Maven repository under ~/.m2, where Maven brings down Java artifacts once and then caches them for future use.

The first time you build this Docker image, it will take longer, since all the layers are being downloaded or built. The next time we build it, the only layers that change are the one that adds the new Spring Boot artifact and all the commands after it. The layers before the Spring Boot artifact don’t change, so the cached versions are used in the Docker build.

Running the Spring Boot Docker Image

Docker Run Command

So far, we have not said anything about port mapping. This is actually done at run time. When we start the Docker container, in the run command, we will tell Docker how to map the ports. In our example we want to map port 8080 of the host machine to port 8080 of the container. This is done with the ‘-p’ parameter, followed with <host port>:<container port>. We also want to use the ‘-d’ parameter. This tells Docker to start the container in the background.

Here is the complete command to run our docker container:
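
Assuming the image name from the Fabric8 configuration sketch above:

    docker run -d -p 8080:8080 springframeworkguru/spring-boot-web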

This command will start the Docker container and echo the id of the started container.

Congratulations, your Spring Boot application is up and running!

You should now be able to access the application on port 8080 of your machine.

Working with Running Docker Containers

Viewing Running Docker Containers

To see all the containers running on your machine, use the following command:
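
The standard Docker CLI command for this is:

    docker ps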

View Log Output

Our running Docker containers are far from little black boxes. There’s a lot we can do with them. One common thing we want to do is see the log output. Easy enough. Use this command:
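
Substitute the container id (or name) reported by docker ps:

    docker logs <container id>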

Access a Running Docker Container

Need to ssh into a Docker container? Okay, technically this really isn’t SSH, but this command will give you a bash shell:
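
Again, substituting your container id:

    docker exec -it <container id> bash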

Stopping the Docker Container

Shutting down our Docker container is easy. Just run this command:
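
One more time, with your container id:

    docker stop <container id>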

Ending Source Code

Just in case you’ve run into trouble, like always, I have a branch in GitHub with the complete working example. You can get the ending source code for this tutorial here on GitHub.

Conclusion

The default executable Jar artifact of Spring Boot is ideal for deploying Spring Boot applications in Docker. As I’ve shown here launching a Spring Boot application in a Docker container is easy to do.

In terms of technology, Docker is still fairly young. At the time of writing, Docker is only about three years old. Yet, it is rapidly catching on. While Docker is widely used by the web giants, it is just starting to trickle down to Fortune 500 enterprises. At the time of writing, Docker is not available natively on OSX or Windows. Yet. Microsoft has committed to releasing a native version of Docker for Windows, which is interesting. There are a lot of things going on around Docker at Red Hat and Pivotal too.

Docker is a fundamental paradigm shift in the way we do things as Spring developers. I assure you, if you’re developing applications in the enterprise using the Spring Framework and have not used Docker, it’s not a matter of if, but when.

As a developer, Docker brings up some very cool opportunities. Need a Mongo database to work against? No problem, spool up a local Docker container. Need a virtual environment for your Jenkins CI builds? No problem.

I personally have only been working with Docker a short time. I am honestly excited about it. My thoughts on Docker – Now we’re cooking with gas!


It’s not uncommon for computers to need to communicate with each other. In the early days, this was done with simple string messages, which was problematic because there was no standard language. XML evolved to address this. XML provides a very structured way of sharing data between systems. XML is so structured, many find it too restrictive. JSON is a popular alternative to XML. JSON offers a lighter and more forgiving syntax than XML.

JSON is a text-based data interchange format that is lightweight, language independent, and easy for humans to read and write. In the current enterprise, JSON is used for enterprise messaging, communicating with RESTful web services, and AJAX-based communications. JSON is also extensively used by NoSQL databases such as MongoDB, Oracle NoSQL Database, and Oracle Berkeley DB to store records as JSON documents. Traditional relational databases, such as PostgreSQL, are also constantly gaining more JSON capabilities. Oracle Database also supports JSON data natively with features such as transactions, indexing, declarative querying, and views.

In Java development, you will often need to read in JSON data or provide JSON data as output. You could of course do this on your own, or use an open source implementation. For Java developers, there are several options to choose from. Jackson is a very popular choice for processing JSON data in Java.

Maven Dependencies for Jackson

The Jackson library is composed of three components: Jackson Databind, Core, and Annotation. Jackson Databind has internal dependencies on Jackson Core and Annotation. Therefore, adding Jackson Databind to your Maven POM dependency list will include the other dependencies as well.
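
For example (the version shown is illustrative; use the current release):

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.8.5</version>
    </dependency>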

Spring Boot and Jackson

The above dependency declaration will work for other Java projects, but in a Spring Boot application, you may encounter errors such as this.

Jackson Dependency Conflict Error - NoSuchMethod Error

The Spring Boot parent POM includes Jackson dependencies. When you include the version number, thus overriding the Spring Boot curated dependency versions, you may encounter version conflicts.

The proper way to declare the Jackson dependency is to use the Spring Boot curated version by not including the version tag on the main Jackson library, like this.
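
For example:

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
    </dependency>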

NOTE: This problem is highly dependent on the version of Spring Boot you are using.

For more details on this issue, check out my post Jackson Dependency Issue in Spring Boot with Maven Build.

Reading JSON – Data Binding in Jackson

Data binding is a JSON processing model that allows for seamless conversion between JSON data and Java objects. With data binding, you create POJOs following JavaBeans convention with properties corresponding to the JSON data. The Jackson ObjectMapper is responsible for mapping the JSON data to the POJOs. To understand how the mapping happens, let’s create a JSON file representing data of an employee.

employee.json
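
A plausible shape for the file, with fields assumed from the discussion that follows (a name value, a personalInformation object, an address object for the Address POJO, and a phoneNumbers array):

    {
      "name": "John Doe",
      "personalInformation": {
        "gender": "MALE",
        "maritalStatus": "single"
      },
      "address": {
        "street": "155 Main Street",
        "city": "Springfield",
        "zipcode": "75005"
      },
      "phoneNumbers": ["532-111-5894", "532-456-9999"]
    }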

The preceding JSON is composed of several JSON objects with name-value pairs and a phoneNumbers array. Based on the JSON data, we’ll create two POJOs: Address and Employee. The Employee object will be composed of Address and will contain properties with getter and setter methods corresponding to the JSON constructs.

When Jackson maps JSON to POJOs, it inspects the setter methods. Jackson by default maps a key of a JSON field to the setter method name. For example, Jackson will map the name JSON field to the setName() setter method in a POJO.

With these rules in mind, let’s write the POJOs.

Address.java

Employee.java
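
As a sketch matching the JSON above (Address.java follows the same JavaBean pattern with street, city, and zipcode properties):

    import java.util.List;
    import java.util.Map;

    public class Employee {

        private String name;
        private Map<String, String> personalInformation;
        private Address address;
        private List<String> phoneNumbers;

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        public Map<String, String> getPersonalInformation() { return personalInformation; }
        public void setPersonalInformation(Map<String, String> personalInformation) {
            this.personalInformation = personalInformation;
        }

        public Address getAddress() { return address; }
        public void setAddress(Address address) { this.address = address; }

        public List<String> getPhoneNumbers() { return phoneNumbers; }
        public void setPhoneNumbers(List<String> phoneNumbers) { this.phoneNumbers = phoneNumbers; }
    }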

With the POJOs ready to be populated with JSON data, let’s use ObjectMapper of Jackson to perform the binding.

ObjectMapperDemo.java
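
A sketch of the class described below (the method name is an assumption):

    import java.io.File;

    import com.fasterxml.jackson.databind.ObjectMapper;

    public class ObjectMapperDemo {

        public Employee readJsonWithObjectMapper() throws Exception {
            ObjectMapper objectMapper = new ObjectMapper();
            // Binds employee.json to the Employee POJO through its setters
            return objectMapper.readValue(new File("employee.json"), Employee.class);
        }
    }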

In the ObjectMapperDemo class above, we created an ObjectMapper object and called its overloaded readValue() method, passing two parameters: a File object representing the JSON file as the first parameter, and Employee.class as the second parameter, the target type to map the JSON values to. The readValue() method returns an Employee object populated with the data read from the JSON file.

The test class for ObjectMapperDemo is this.

ObjectMapperDemoTest.java

The output on running the test is this.

Output of ObjectMapperDemo

Simple Data Binding in Jackson

In the example above, we covered full data binding – a variant of Jackson data binding that reads JSON into application-specific JavaBeans types. The other variant is simple data binding, where you read JSON into built-in Java types, such as Map and List, and also wrapper types, such as String, Boolean, and Number.

An example of simple data binding is binding the data of employee.json to a generic Map, like this.

ObjectMapperToMapDemo.java
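
A sketch of the class described below:

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.util.Map;

    import com.fasterxml.jackson.core.type.TypeReference;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class ObjectMapperToMapDemo {

        public Map<String, Object> readJsonToMap() throws Exception {
            ObjectMapper objectMapper = new ObjectMapper();
            try (InputStream inputStream = new FileInputStream("employee.json")) {
                // Simple data binding: JSON into a generic Map instead of a POJO
                Map<String, Object> employeeMap = objectMapper.readValue(inputStream,
                        new TypeReference<Map<String, Object>>() {});
                for (Map.Entry<String, Object> entry : employeeMap.entrySet()) {
                    System.out.println(entry.getKey() + " = " + entry.getValue());
                }
                return employeeMap;
            }
        }
    }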

In the ObjectMapperToMapDemo class above, notice the overloaded readValue() method where we used a FileInputStream to read employee.json. Other overloaded versions of this method allow you to read JSON from String, Reader, URL, and byte array. Once ObjectMapper maps the JSON data to the declared Map, we iterated over and logged the map entries.

The test class for the ObjectMapperToMapDemo class is this.

ObjectMapperToMapDemoTest.java

The output on running the test is this.
Output of ObjectMapperToMapDemo

With simple data binding, we don’t need to write JavaBeans with properties corresponding to the JSON data. This is particularly useful in situations where we don’t know the structure of the JSON data to process. In such situations, another approach is to use the JSON Tree Model that I will discuss next.

Reading JSON into a Tree Model

In the JSON Tree Model, the ObjectMapper constructs a hierarchical tree of nodes from JSON data. If you are familiar with XML processing, you can relate the JSON Tree Model with XML DOM Model. In the JSON Tree Model, each node in the tree is of the JsonNode type, and represents a piece of JSON data. In the Tree Model, you can randomly access nodes with the different methods that JsonNode provides.

The code to generate a Tree Model of the employee.json file is this.
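
A sketch, matching the description that follows:

    import java.io.File;

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonNodeDemo {

        private final ObjectMapper objectMapper = new ObjectMapper();
        private final JsonNode rootNode;

        public JsonNodeDemo() throws Exception {
            // Build the hierarchical tree of JsonNode objects from the document
            rootNode = objectMapper.readTree(new File("employee.json"));
        }

        public String readJsonWithJsonNode() throws Exception {
            // Write the tree back out as an indented string
            return objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(rootNode);
        }
    }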

In the constructor of the JsonNodeDemo class above, we created an ObjectMapper instance and called its readTree() method passing a File object representing the JSON document as parameter. The readTree() method returns a JsonNode object that represents the hierarchical tree of employee.json. In the readJsonWithJsonNode() method, we used the ObjectMapper to write the hierarchical tree to a string using the default pretty printer for indentation.

The output on running the code is this.

Next, let’s access the value of the name node with this code.
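
A sketch, with rootNode being the JsonNode returned by readTree() above:

    String name = rootNode.path("name").asText();
    System.out.println("Name: " + name);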

In the code above, we called the path() method on the JsonNode object that represents the root node. To the path() method, we passed the name of the node to access, which in this example is name. We then called the asText() method on the JsonNode object that the path() method returns. The asText() method that we called returns the value of the name node as a string.

The output of this code is:

Next, let’s access the personalInformation and phoneNumbers nodes.

A few key things to note in the code above. In Line 4, notice that we called the get() method instead of path() on the root node. Both methods perform the same function – they return the specified node as a JsonNode object. The difference is how they behave when the specified node is not present or the node doesn’t have an associated value. When the node is not present or does not have a value, the get() method returns null, while the path() method returns a JsonNode object that represents a “missing node”. The “missing node” returns true for a call to the isMissingNode() method. The remaining code from Line 5 to Line 9 is simple data binding, where we mapped the personalInformation node to a Map<String, String> object.

In the readPhoneNumbers() method, we accessed the phoneNumbers node. Note that in employee.json, phoneNumbers is represented as a JSON array (Enclosed within [] brackets). After mapping, we accessed the array elements with a call to the elements() method in Line 15. The elements() method returns an Iterator of JsonNode that we traversed and logged the values.

The output on running the code is this.

The complete code of generating the JSON tree model and accessing its nodes is this.

JsonNodeDemo.java

The test class for the JsonNodeDemo class above is this.

JsonNodeDemoTest.java

Writing JSON using Jackson

JSON data binding is not only about reading JSON into Java objects. With the ObjectMapper of JSON data binding, you can also write the state of Java objects to a JSON string or a JSON file.

Let’s write a class that uses ObjectMapper to write an Employee object to a JSON string and a JSON file.

JsonWriterObjectMapper.java

In Line 22 of the code above, we used an ObjectMapper object to write an Employee object to a JSON string using the default pretty printer for indentation. In Line 24, we called the configure() method to configure ObjectMapper to indent the JSON output. In Line 25, we called the overloaded writeValue() method to write the Employee object to the file provided as the first parameter. The other overloaded writeValue() methods allow you to write JSON output using OutputStream and Writer.

The test code for the JsonWriterObjectMapper class is this.

JsonWriterObjectMapperTest.java

In the test class above, we used the JUnit @Before annotation on the setUpEmployee() method to initialize the Address and Employee classes. If you are new to JUnit, check out my series on JUnit starting from here. In the @Test annotated method, we called the writeEmployeeToJson() method of JsonWriterObjectMapper, passing the initialized Employee object.

The output on running the test is this.
Output of JsonWriterObjectMapper

Spring Support for Jackson

Spring support for Jackson has been improved lately to be more flexible and powerful. If you are developing a Spring RESTful web service using the Spring RestTemplate API, you can utilize the Spring Jackson JSON API integration to send back a JSON response. In addition, Spring MVC now has built-in support for Jackson’s Serialization Views. Jackson also provides first class support for some data formats other than JSON – Spring Framework and Spring Boot provide built-in support for Jackson-based XML. In future posts, I will discuss more advanced JSON-based processing with Jackson – particularly the Jackson Streaming Model for JSON, and Jackson-based XML processing.

Conclusion

Jackson is one of the several available libraries for processing JSON. Some others are Boon, GSON, and Java API for JSON Processing.

One advantage that Jackson has over other libraries is its maturity. Jackson has evolved enough to become the preferred JSON processing library of some major web services frameworks, such as Jersey, RESTEasy, Restlet, and Apache Wink. Open source enterprise projects, such as Hadoop and Camel also use Jackson for handling data definition in enterprise integration.


Recently while working with Jackson within a Spring Boot project, I encountered an issue I’d like to share with you.

Jackson is currently the leading option for parsing JSON in Java. The Jackson library is composed of three components: Jackson Databind, Core, and Annotation. Jackson Databind has internal dependencies on Jackson Core and Annotation. Therefore, adding Jackson Databind to your Maven POM dependency list will include the other dependencies as well. To use the latest Jackson library, you need to add the following dependency to the Maven POM.
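
For example (the version shown is illustrative; use the current release):

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.8.5</version>
    </dependency>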

The above dependency works well in other Java projects, but unfortunately in a Spring Boot 1.3.x application, you may stumble upon this error.

Jackson Dependency Conflict Error in Spring Boot

You may see several different errors. Here are some additional examples.

This error occurs due to a Jackson dependency conflict. We are working on a Spring Boot project, and it inherits from the Spring Boot parent POM that includes Jackson. Without any Jackson dependency in the project POM, let’s print the Maven dependency tree to view the built-in Jackson dependencies.
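
Using the Maven dependency plugin:

    mvn dependency:tree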

The output is this.

As you can see above, the Spring Boot parent POM uses an older version of Jackson (2.6.5).

Now, if we add the Jackson dependency to our Maven POM, specifying the version like this:
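
For example:

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.8.5</version>
    </dependency>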

Maven will pull in older versions of jackson-annotations and jackson-core, overriding the newer ones. We can see this by running the dependency:tree command again.

I did not expect Maven to behave this way in dependency resolution. The POM for the primary Jackson artifact does call for the proper version. However, this seems to be getting overridden by the versions specified explicitly in the Spring Boot parent POM.

Ideally, when working with Spring Boot, you should leverage the curated dependencies in the Spring Boot parent POM. In this case, we drop the version from the Jackson dependency so it will get inherited from the Spring Boot parent POM.

Now, the version will be inherited from the parent POM, and the issue will be resolved.

But what if we want to use a newer version of Jackson? The proper way is to exclude the inherited dependencies and explicitly add their new versions, like this.
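
A sketch of the idea (the starter shown and the Jackson version are illustrative; apply the exclusion wherever the transitive Jackson dependency comes in):

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <exclusions>
            <exclusion>
                <groupId>com.fasterxml.jackson.core</groupId>
                <artifactId>jackson-databind</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.8.5</version>
    </dependency>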

This POM configuration will override the Jackson dependencies set in the Spring Boot parent POM.

This post is specific to Spring Boot version 1.3.3. Naturally, the Spring Boot team will be evolving the version of Jackson used in future releases.

For developers accustomed to working with Maven dependencies, it’s a very easy mistake to include the version in the dependency declaration. This can cause some unintended issues due to version conflicts. It is a bit of a paradigm shift for experienced developers to depend on the Spring Boot parent POM. Some won’t want to relinquish control, but in the long run, I expect you will be better off leveraging the Spring Boot curated dependencies.
