
In this post, I’m going to walk you through using Spring Boot to set up a Hello World example using Spring Integration and ActiveMQ. We’ll configure Spring Integration to listen on an ActiveMQ queue. For fun, I’ll use Spock to place a message on the queue, and we can watch Spring Integration receive the JMS message and print a message to the console.

Spring Boot Project Setup

Spring Initializr

To create the new project in IntelliJ, I’ll select the option to use the Spring Initializr to create my new Spring Boot project. The IntelliJ dialog makes it easy to create a Spring Boot project.

In this case, I’m selecting the latest version of Spring Boot available at the time of writing (1.3.0.M3), and the option for Spring Integration.

IntelliJ and Spring Initializr

After completing the steps in IntelliJ, I’ll have a fresh Maven project to work with for this example.

Spring Integration and ActiveMQ Dependencies

Spring Boot does a pretty good job of bringing in the basic dependencies. Using the Maven support in IntelliJ, we can look at Maven dependencies for our project. You can see that via the Spring Boot artifacts, we’re bringing in the basic dependencies for Spring Integration.

Spring Boot and Spring Integration maven dependencies in IntelliJ

At the time of writing, the Spring Initializr does not support ActiveMQ directly. I didn’t check whether the Spring Boot team has defined a Maven artifact for this. But it’s simple enough to add the dependencies we need for Spring Integration and ActiveMQ.
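Something along these lines works (the exact artifact choice is my own; spring-integration-jms brings in the Spring Integration JMS support, and activemq-broker provides the embedded broker):

    <dependency>
        <groupId>org.springframework.integration</groupId>
        <artifactId>spring-integration-jms</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.activemq</groupId>
        <artifactId>activemq-broker</artifactId>
    </dependency>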

Notice how I have not added version information to the dependencies? That gets inherited from the parent Spring Boot Maven POM.

ActiveMQ Spring Boot Configuration

Like many other things, Spring Boot makes our task of configuring ActiveMQ easier. For the purposes of our example, we want to use an embedded ActiveMQ broker. This is a common approach when developing Spring projects which use ActiveMQ. Often when developing enterprise applications using Spring, you will use an embedded ActiveMQ broker for development, and then have a configuration to use IBM’s MQSeries in production.

ActiveMQ Broker

By just having ActiveMQ on our build path, Spring Boot will automatically set up an ActiveMQ broker. We need to set a couple of properties to make it an in-memory broker without connection pooling. We can do this by setting two properties for Spring Boot.

application.properties
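A minimal sketch of the two properties, assuming the Spring Boot 1.3-era property names for the embedded ActiveMQ broker:

    # run the broker in memory, without connection pooling
    spring.activemq.in-memory=true
    spring.activemq.pooled=false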

ActiveMQ Queue Configuration

We also need to set up a queue for our example. We can do this in a Spring Java configuration class as follows.

ActiveMQConfig.java
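A minimal sketch of the configuration class; the queue name hello.world.queue and the bean method name are placeholders I chose for this example:

    import javax.jms.Queue;

    import org.apache.activemq.command.ActiveMQQueue;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class ActiveMQConfig {

        // The bean name (taken from the method name) is what the
        // Spring Integration channel adapter will use as its destination.
        @Bean
        public Queue helloWorldQueue() {
            return new ActiveMQQueue("hello.world.queue");
        }
    }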

This is all we need to do to configure ActiveMQ for our example. Spring Boot will take care of the rest.

Spring Integration Configuration

Spring Integration JMS Channel Adapter

Spring Integration comes with a number of different channel adapters. In this case, we need to configure a JMS channel adapter. This will serve as a transparent bridge between Spring Integration Messaging and JMS Messaging.

In the Spring Integration XML configuration below, I’ve defined a Spring Integration JMS channel adapter. The destination property is set to the name of the ActiveMQ queue bean we defined above. (When using Spring Java configuration, the bean name is inherited from the method name in the configuration class.) I’ve also added a Spring Integration channel to the configuration.

This acts as a bridge: messages coming from the JMS queue will get sent to the Spring Integration channel, and messages sent to the Spring Integration channel will get passed along to the JMS queue.

si-config.xml
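A sketch of the configuration; the adapter id, the channel name, and the helloWorldQueue destination (matching the queue bean defined above) are placeholders for this example:

    <?xml version="1.0" encoding="UTF-8"?>
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:int="http://www.springframework.org/schema/integration"
           xmlns:int-jms="http://www.springframework.org/schema/integration/jms"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://www.springframework.org/schema/beans
                http://www.springframework.org/schema/beans/spring-beans.xsd
            http://www.springframework.org/schema/integration
                http://www.springframework.org/schema/integration/spring-integration.xsd
            http://www.springframework.org/schema/integration/jms
                http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd">

        <!-- bridge between the ActiveMQ queue and a Spring Integration channel -->
        <int-jms:message-driven-channel-adapter id="jmsIn"
                destination="helloWorldQueue"
                channel="helloWorldChannel"/>

        <int:channel id="helloWorldChannel"/>

    </beans>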

Say Hello Service

We’ll use a simple service for our example today. It’s a Spring service component, which simply takes a string in and prints it to the console.

SayHelloService.java
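A minimal sketch of the service; the sayHello method name is an assumption:

    import org.springframework.stereotype.Service;

    @Service
    public class SayHelloService {

        // takes a string in and prints it out to the console
        public void sayHello(String name) {
            System.out.println("Hello " + name + "!!");
        }
    }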

Spring Integration Service Activator Configuration

Next we need to add a Spring Integration Service Activator. This is what Spring Integration will use to process the message. We just need to add the following to our Spring Integration XML file.

/resources/spring/si-config.xml
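A sketch of the service activator element, using the channel and bean names from the placeholders above:

    <!-- hand messages arriving on the channel to the SayHelloService bean -->
    <int:service-activator id="sayHelloServiceActivator"
            input-channel="helloWorldChannel"
            ref="sayHelloService"
            method="sayHello"/>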

Spring Boot Configuration

We need to tell Spring Boot about the Spring Integration XML configuration file. We can do this by adding an @ImportResource annotation to the Spring Boot application class as follows.

HelloWorldSiActivemqApplication.class
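A sketch of the application class, assuming the XML file lives under /resources/spring/si-config.xml as noted above:

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.ImportResource;

    @SpringBootApplication
    @ImportResource("classpath:/spring/si-config.xml")
    public class HelloWorldSiActivemqApplication {

        public static void main(String[] args) {
            SpringApplication.run(HelloWorldSiActivemqApplication.class, args);
        }
    }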

Spock Configuration

Maven Dependencies

To enable Spock support add the following dependencies to your Maven pom file.
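A sketch of the dependency entries; the versions shown are from the Spock 1.0 / Groovy 2.4 line and may need adjusting:

    <dependency>
        <groupId>org.spockframework</groupId>
        <artifactId>spock-core</artifactId>
        <version>1.0-groovy-2.4</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.spockframework</groupId>
        <artifactId>spock-spring</artifactId>
        <version>1.0-groovy-2.4</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.codehaus.groovy</groupId>
        <artifactId>groovy-all</artifactId>
        <version>2.4.4</version>
        <scope>test</scope>
    </dependency>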

Maven Compiler Plugin

Spock test classes are written in Groovy. You will need to add a Groovy compiler to your Maven build. I like to use the Groovy Eclipse Compiler. Add the following plugin to your build plugins.
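A sketch of the plugin configuration; the groovy-eclipse-compiler and groovy-eclipse-batch versions are examples and should be matched to your Groovy version:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
            <compilerId>groovy-eclipse-compiler</compilerId>
        </configuration>
        <dependencies>
            <dependency>
                <groupId>org.codehaus.groovy</groupId>
                <artifactId>groovy-eclipse-compiler</artifactId>
                <version>2.9.1-01</version>
            </dependency>
            <dependency>
                <groupId>org.codehaus.groovy</groupId>
                <artifactId>groovy-eclipse-batch</artifactId>
                <version>2.4.3-01</version>
            </dependency>
        </dependencies>
    </plugin>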

Sending a JMS Message Using Spock

There are a lot of different ways to send a JMS message. I chose to use Spock for this example mostly for fun; I enjoy using Spock. In this example, I’ve set up Spock to use the same Spring context used by the Spring Boot application. The Spring Boot application class is actually a Spring configuration class you can source into your Spring integration tests.

Spock Spring Integration Test

In this Spock integration test, using the Spring Boot configuration, I autowire in an instance of the JMS connection factory and set up a JMS producer to send a text message. This will drop a text message on the same ActiveMQ JMS queue we configured Spring Integration to listen on.

SayHellowServiceJMSIT.groovy
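A sketch of the integration test, assuming the hello.world.queue name from the ActiveMQ configuration above and using a short sleep to give the listener time to consume the message:

    import org.springframework.beans.factory.annotation.Autowired
    import org.springframework.boot.test.SpringApplicationConfiguration
    import org.springframework.jms.core.JmsTemplate
    import spock.lang.Specification

    import javax.jms.ConnectionFactory

    @SpringApplicationConfiguration(classes = HelloWorldSiActivemqApplication)
    class SayHellowServiceJMSIT extends Specification {

        @Autowired
        ConnectionFactory connectionFactory

        def "place a message on the queue for Spring Integration to pick up"() {
            given: 'a JMS producer built from the auto-configured connection factory'
            JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory)

            when: 'a text message is dropped on the queue Spring Integration listens on'
            jmsTemplate.convertAndSend('hello.world.queue', 'World')
            sleep(2000) // give the listener a moment to consume the message

            then:
            noExceptionThrown()
        }
    }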

Running the Spock Integration Test

Using IntelliJ, running the Spock integration test can be done by simply right-clicking on the test method and then clicking on ‘Run’.

Running a Spock test in IntelliJ

Test Output

Spring Integration JMS Configuration for Spring Boot

I intentionally left an error in the Spring Integration configuration to demonstrate this failure. By default, the Spring Integration configuration looks for a Spring bean called ‘connectionFactory’. Spring Boot, by default, creates the JMS connection factory using the name ‘jmsConnectionFactory’.

The solution is easy enough to implement. We just need to update the Spring Integration channel adapter to use the Spring bean ‘jmsConnectionFactory’ instead of its default value of ‘connectionFactory’.

si-config.xml
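A sketch of the updated channel adapter, now pointing explicitly at the jmsConnectionFactory bean:

    <int-jms:message-driven-channel-adapter id="jmsIn"
            connection-factory="jmsConnectionFactory"
            destination="helloWorldQueue"
            channel="helloWorldChannel"/>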

Running the Updated Configuration

When we run the Spock Integration Test again, we can see our expected hello world message in the console output.

Test Output

Conclusion

In this Spring Framework example, I’ve shown you how easy it is to use Spring Boot to configure an ActiveMQ broker for use with Spring Integration. This blog post was inspired by a real world example where I was coding an enterprise service using the Spring Framework. I needed to use Spring Integration to consume messages off an MQSeries JMS queue. MQSeries is great, but it’s not very lightweight, nor is it appropriate for integration tests. Thus, I wrote my integration tests to use an embedded ActiveMQ broker. You can see how easy it was to use Spring Boot to provide the ActiveMQ broker, and dependency injection to wire everything up.


This past week I had a real world use case for using Spring Integration Futures. I was looking into optimizing the checkout experience for a large ecommerce website. Consider what happens in a large scale website when an order is submitted. Typically, you will see a process something like:

  • Validate the place order form information
  • Verify the address against an address service
  • Verify the credit card with a payment processor
  • Verify available inventory

Each of these, in a large scale enterprise, is a service. For example, it is common for an enterprise to subscribe to an address validation service, which helps ensure shipping accuracy. Or you’ll have an inventory service to help manage inventory levels. Normally not a problem. But when you have a Black Friday shopping special and thousands of users shopping on the site, managing inventory can become tricky.

Each of these services will take some time to execute. If called in sequence, the website response time slows down and impacts the user experience in the checkout flow. However, in my example here, there is no reason these service calls couldn’t be called concurrently. By calling the services concurrently, the response time now is the longest service call, not the sum of all the service calls. If each service call takes a half second, calling the services sequentially would take 2 seconds of elapsed time. Calling them concurrently only takes a half second.

A couple of weeks back I wrote a blog post on testing Spring Integration Gateways. In this use case, Spring Integration is the perfect tool. Spring Integration has a very cool feature to support asynchronous calls. When using Spring Integration messaging gateways, if you wrap the return type in a Java Future, Spring Integration will automatically handle the request in a different thread using a thread pool. As an application developer, this makes writing a multithreaded application very easy to do. The Spring Framework and Spring Integration handle the complexity of managing the worker threads and the thread pool.

In this post, I’m going to walk you through the configuration of the Spring Integration gateways and Spring service beans used to make these four service calls supporting my ecommerce place order example.

Spring Integration Code Examples

Command Objects

In this example, I’m going to use a command object. In a web application, this would typically have values bound to it from a form post. However, in today’s omni-channel retail environment, the website is not the only client I need to worry about. This request could be coming from a native iOS mobile application, a native Android mobile application, an in-store kiosk, or maybe a stand-alone customer service application.

By using a command object, I’m decoupling my processing from the front end client. I don’t care where the request originated. Maybe it was a form post, maybe a web service call, or maybe a JMS request. It doesn’t really matter where the request originated.

Place Order Command
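A minimal sketch of the command object; the fields are assumptions for this example:

    import java.util.ArrayList;
    import java.util.List;

    public class PlaceOrderCommand {

        private Long customerId;
        private String shippingAddress;
        private String creditCardNumber;
        private String creditCardExpiration;
        private List<OrderLine> orderLines = new ArrayList<>();

        // getters and setters omitted for brevity
    }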

Domain Objects

For this example, I’ve created a couple of domain objects. These are just simple POJOs I’m using to illustrate the example. In a real application these objects would be much more robust.

Order

OrderLine

Product
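Minimal sketches of the three POJOs (each would live in its own file; the fields are assumptions for the example):

    import java.math.BigDecimal;
    import java.util.ArrayList;
    import java.util.List;

    public class Order {
        private Integer id;
        private List<OrderLine> orderLines = new ArrayList<>();
        // getters and setters omitted
    }

    public class OrderLine {
        private Integer lineNumber;
        private Product product;
        private Integer quantity;
        // getters and setters omitted
    }

    public class Product {
        private String productId;
        private String description;
        private BigDecimal price;
        // getters and setters omitted
    }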

Spring Integration Gateways

For this example, I’ve defined four different Spring Integration Messaging Gateways. Technically, I could have used just one Spring Integration Messaging Gateway, but that would have been a violation of the Single Responsibility Principle. This approach does lead to more class files. But when I need to maintain this code, I know where to look. The programming logic is clear and organized.

OrderGateway

The Order Gateway interface defines two methods. The first, placeOrder, is the start of our processing chain. This is where we will submit the command object. The second method is used in the processing of the place order command object.

Note: notice the usage of the Java Future for the return type of the validateOrder method. This is what instructs Spring Integration to perform the method call asynchronously using a thread pool.
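A sketch of the gateway interface; the channel wiring is done in the XML configuration shown later, and Errors is Spring’s validation Errors object:

    import java.util.concurrent.Future;

    import org.springframework.validation.Errors;

    public interface OrderGateway {

        // the start of the processing chain; this is where the command object is submitted
        Order placeOrder(PlaceOrderCommand command);

        // wrapping the return type in a Future tells Spring Integration to run
        // this call asynchronously on a thread pool
        Future<Errors> validateOrder(PlaceOrderCommand command);
    }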

AddressGateway

InventoryGateway

PaymentGateway
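Sketches of the remaining three gateway interfaces; the method names are assumptions, but each follows the same Future-wrapped pattern:

    import java.util.concurrent.Future;

    import org.springframework.validation.Errors;

    public interface AddressGateway {
        Future<Errors> verifyAddress(PlaceOrderCommand command);
    }

    public interface InventoryGateway {
        Future<Errors> verifyInventory(PlaceOrderCommand command);
    }

    public interface PaymentGateway {
        Future<Errors> verifyPayment(PlaceOrderCommand command);
    }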

Spring Services

Since this is a Spring project, we’ll create our services as Spring beans, and naturally we will be using dependency injection and programming to an interface.

OrderService
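A minimal sketch of the service interface:

    public interface OrderService {
        PlaceOrderCommand placeOrder(PlaceOrderCommand command);
    }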

OrderServiceImpl

The implementation of our Order Service is one of the more complex classes in this tutorial. You can see we are having Spring autowire our four Spring Integration messaging gateways into the class. In the placeOrder method, you can see I call a method on each of the four Spring Integration gateways. Each method returns a Java Future. After all four are submitted, I go back and get the value of each Future. In this case, I’m using the Spring Errors object. If all four validation steps come back without errors, in a real system I’d persist the order to the database and do any post-processing. But this is just a little sample to show off the use of Spring Integration Futures, so in this case, I’m just returning the command object either way.
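A sketch of the implementation; the gateway method names follow the assumptions above, and the exception handling is kept deliberately simple:

    import java.util.concurrent.Future;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Service;
    import org.springframework.validation.Errors;

    @Service
    public class OrderServiceImpl implements OrderService {

        @Autowired
        private OrderGateway orderGateway;

        @Autowired
        private AddressGateway addressGateway;

        @Autowired
        private InventoryGateway inventoryGateway;

        @Autowired
        private PaymentGateway paymentGateway;

        @Override
        public PlaceOrderCommand placeOrder(PlaceOrderCommand command) {
            try {
                // each call returns immediately with a Future; the work runs
                // concurrently on Spring Integration's thread pool
                Future<Errors> orderResult = orderGateway.validateOrder(command);
                Future<Errors> addressResult = addressGateway.verifyAddress(command);
                Future<Errors> inventoryResult = inventoryGateway.verifyInventory(command);
                Future<Errors> paymentResult = paymentGateway.verifyPayment(command);

                // block until each result comes back
                Errors orderErrors = orderResult.get();
                Errors addressErrors = addressResult.get();
                Errors inventoryErrors = inventoryResult.get();
                Errors paymentErrors = paymentResult.get();

                if (!orderErrors.hasErrors() && !addressErrors.hasErrors()
                        && !inventoryErrors.hasErrors() && !paymentErrors.hasErrors()) {
                    // in a real system, persist the order and do any post-processing here
                }
            } catch (Exception e) {
                throw new RuntimeException("Error validating order", e);
            }
            // for this sample, return the command object either way
            return command;
        }
    }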

Spring Integration Configuration

I had to expand on the Spring Integration configuration from our previous example. You’ll see I’m using the Spring Integration gateway tag to define the four Spring Integration gateways we’re using. Then I defined the Spring Integration channels and the appropriate Spring Integration service activators. Nothing new here over the previous example, just a little more routing to take care of.

Notice how I did not define a thread pool? By default, Spring Integration provides a thread pool for our use. You can of course define your own, or update the settings of the default thread pool if needed.

si-config.xml
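A sketch of the configuration showing the pattern for one of the gateways; the package and channel names are placeholders, and the other three gateways are wired the same way:

    <?xml version="1.0" encoding="UTF-8"?>
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:int="http://www.springframework.org/schema/integration"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://www.springframework.org/schema/beans
                http://www.springframework.org/schema/beans/spring-beans.xsd
            http://www.springframework.org/schema/integration
                http://www.springframework.org/schema/integration/spring-integration.xsd">

        <!-- the gateway proxies the AddressGateway interface onto a request channel -->
        <int:gateway id="addressGateway"
                     service-interface="com.example.gateways.AddressGateway"
                     default-request-channel="verifyAddressChannel"/>

        <int:channel id="verifyAddressChannel"/>

        <int:service-activator input-channel="verifyAddressChannel"
                               ref="addressService"
                               method="verifyAddress"/>

        <!-- the order, inventory, and payment gateways follow the same
             gateway / channel / service-activator pattern -->

    </beans>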

Running the Spring Integration Code

I’ve set up a Spock test to run the Spring Integration code in this example.

OrderGatewayTests
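A sketch of the test, assuming the XML configuration above also makes the service beans available in the context (for example through component scanning):

    import org.springframework.beans.factory.annotation.Autowired
    import org.springframework.test.context.ContextConfiguration
    import spock.lang.Specification

    @ContextConfiguration(locations = "classpath*:/spring/si-config.xml")
    class OrderGatewayTests extends Specification {

        @Autowired
        OrderService orderService

        def "exercise the place order flow"() {
            given: 'a place order command'
            PlaceOrderCommand command = new PlaceOrderCommand()

            when: 'the order is placed'
            orderService.placeOrder(command)

            then: 'the validation services run concurrently on separate threads'
            noExceptionThrown()
        }
    }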

When you run this Spock test you will see the following output:

Each of the services I’ve wired into this example does a simple print line statement to the console. Each starts its output with the ID of the thread it is running in. You can see from the output how each service runs concurrently in a different thread.

Conclusion

Spring Integration is a very powerful tool to have in your toolbelt as a programmer. You can see from this example how I’ve created a multithreaded application with a minimal amount of coding. I simply wrapped the return type I wanted on the Spring Integration messaging gateways with a Java Future. Spring Integration and Spring managed the thread pool for me; I didn’t need to worry about managing threads. The Spring Framework allowed me to focus on delivering the business solution, and took care of the complex boilerplate code.

Get The Code

I’ve committed the source code for this post to GitHub. It is a Maven project which you can download and build. If you wish to learn more about the Spring Framework, I have a free introduction to Spring tutorial. You can sign up for this tutorial in the section below.

Source Code

The source code for this post is available on GitHub. You can download it here.


One of the things I wish to do on this blog is show you realistic examples of using the Spring Framework in enterprise application development. The Spring Framework is very popular for building large scale applications. When you build an ecommerce website that might have 50,000 users on it at any given time, the scope of the application you are building changes. This type of site quickly outgrows the traditional 3-tier architecture (web server / app server / database server). The ‘website’ is no longer a simple WAR file being deployed to Tomcat. You have a data center with a small server farm: load balancers, application clusters, message queuing, ‘cloud computing’, microservices. The Spring Framework was not only built for this type of application environment, it thrives in it.

Environments

When you start developing enterprise class applications, you will need to support multiple deployment environments. You’re no longer going to be testing code on your laptop, then deploying it to the production server. Frequently in the enterprise, as a developer you won’t even have access to the production environment. Companies which need to comply with regulations such as SOX, PCI, and/or SAS-70 will have specialized teams which manage code deployments to their testing (QA/UAT) and production environments. This is known as segregation of duties, a very common business practice. From my personal experience, it is more stringent in large financial enterprises than in retail organizations, and less so in smaller companies because they simply don’t have the resources to support specialized IT staffs.

More modern development cultures will be performing CI builds and automated deployments. Teams on the bleeding edge of modern software development might even be doing continuous deployments. I feel continuous deployment is probably the holy grail of software engineering, but in reality it is as rare as a great white buffalo in the wild. I do hope to see more organizations adopt continuous deployments, but it does take a very disciplined organization to get there.

Each of these environments will have its own configuration needs. As the scope of your application grows, chances are the uniqueness of each environment will grow too. The Spring Framework has some outstanding tools which are used to manage the complexities of modern enterprise application development. First, let’s consider some common challenges in the types of environments you will need to support in a large enterprise application.

Local Development

This is your development environment, running from your laptop. This is an area I often see companies absolutely fail at. Your code should be able to run locally, without the need to connect to other servers in the enterprise. Ideally, you should be able to run the code on a plane at 30,000 feet without a wi-fi connection.

This means:

  • You cannot use an external database. You’re not going to use the Oracle development database.
  • No interaction with other web services.
  • No JMS (MQ Series, Active MQ, Oracle AQ, etc)
  • Your build artifacts need to be locally cached (hello Maven, Ivy!!)

The Grails team does an outstanding job of supporting a local development environment out of the box. When you run Grails in dev mode, it will automatically create an in-memory H2 database. Hibernate is used to generate the database tables based on your domain classes.

Continuous Integration

Continuous Integration servers can be tricky little beasts to configure for because of the different types of testing software they accommodate. You may have a project that produces a JAR file and only has unit tests, which will zip right along. You may have integration tests, like Grails does, which bring up an embedded Tomcat instance and an H2 in-memory database. Your tests might even perform functional tests using something like Spock and Geb to interact with the embedded Tomcat instance. It’s also not uncommon for CI servers to have automated deployment jobs – to another specialized environment.

Each of these scenarios is likely to drive special configuration needs into your application.

Development

Some companies elect to have a development environment. This is typically a production like server environment that the development team has control over. When you deploy into this environment, you will need to configure the application to interact with servers within that environment.

QA / UAT

QA ("Quality Assurance") and UAT ("User Acceptance Testing") are environments for non-developer resources to test the software. In some organizations you may have both QA and UAT, or you might have one or the other. If you have both, chances are the organization has formal QA engineers who work with the QA environment, and business analysts who work with the UAT environment. These environments are often managed by a configuration management team. Sometimes developers will have access to the environment, but often they will not.

Pre-Production or Stage

Pre-Production or Stage (Staging) is an application environment that works with all the production services and supporting databases. This is an environment that allows the deployment of application code, but limits the access to that code. For a website, you might see a special url or access restricted to specific IPs, or throttled by load balancers.

This environment is not as common, but some organizations use it. Facebook deploys code something along this line.

Production

Production is the environment your end users will utilize. This is the main transactional environment and the one your business partners wish to keep operational at all times.

Summary

You can see each of these environments will have its own unique requirements. You’re going to have different database servers, different database versions, often different database vendors. You’ll have different supporting services. For example, in an ecommerce website you might have a payment gateway. In dev it might be a mock, in test you might be using a testing gateway supplied by your payments provider, and then a different payment gateway for production.

Spring Framework Multi-Environment Support

The Spring Framework was developed from the ground up to support the challenges of supporting complex enterprise environments. You have a number of mature features in the Spring Framework to use in supporting the challenges of enterprise class applications.

Properties

The Spring Framework has excellent support for externalizing properties. “Properties” are simple string values which are externalized from your application. In the Spring Framework, properties can be set in the following ways:

  • In a traditional properties file. This is typically kept in the resources folder and is often named ‘application.properties’. The convention is to use <filename>.properties.
  • Using operating system environment variables. Java can read values set in the operating system as environment variables. I’ve used this in the past for things like database settings, which worked out nicely. In this case, my build artifact was easily portable between environments. Once set up, it was effectively ‘drag and drop’ for the support staff.
  • Command line variables. When starting any Java application, you have the opportunity to pass command line variables into the program. This is a handy way to make your build artifact portable. One word of caution: when examining running processes on a system, you can sometimes see the command line information which started the process, so this solution may not be the best option for password strings.

The Spring Framework has a number of options for sourcing in property values. One way is using the Spring Expression Language (SpEL). You can see some examples of this in my recent post here.
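As a sketch of how externalized properties get into your beans (the db.url and db.username keys are hypothetical), a plain Spring Java configuration might look like this:

    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.PropertySource;
    import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

    @Configuration
    @PropertySource("classpath:application.properties")
    public class DataSourceConfig {

        // resolved at startup from the externalized property sources
        @Value("${db.url}")
        private String dbUrl;

        @Value("${db.username}")
        private String dbUsername;

        // required so ${...} placeholders are resolved in plain Spring
        // (Spring Boot registers one of these for you)
        @Bean
        public static PropertySourcesPlaceholderConfigurer propertyConfigurer() {
            return new PropertySourcesPlaceholderConfigurer();
        }
    }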

Dependency Injection

Changing property values is great for things like connecting to a different database server. But often in enterprise class applications you will need behavioral changes which are beyond simple property changes. One of the core features of the Spring Framework is its support of dependency injection. Once you become accustomed to developing with dependency injection in the Spring Framework, you will see how your code becomes more modular. Your classes will naturally adhere to the Single Responsibility Principle. If you’re doing dependency injection against interfaces, it becomes very easy to swap out components.

Let’s say you have an application which needs to send a JMS message on an event such as a customer purchase. This may trigger an email to the customer about their order, and the data warehouse group might want the purchase information for their analytics. For your unit tests and your integration tests, you might be using Mockito to provide a mock. In your deployed environments you might be using the corporate standard of MQSeries for messaging. But what about doing CI builds? An embedded instance of ActiveMQ just might be the perfect solution.

So, in summary, this example has:

  • A Mockito Mock object;
  • 3 different MQSeries configurations;
  • An embedded ActiveMQ broker.

The MQSeries configurations are easy to handle with property settings. The only things changing here are the connection parameters. This is the perfect use case for externalized properties.

For your unit tests, if you want to keep them true unit tests (as I defined here), you’ll need to manage the dependency injection yourself. Testing frameworks such as Mockito and Spock make this easy to do.

If you’re performing integration tests, an easy way to manage the Spring context is through configuration composition. But you may wish to favor using Spring profiles instead. Each is easy to use, as I explain in the sections below.

Configuration Composition

Inexperienced Spring developers will place all their configuration into single XML files or configuration packages. This is often a mistake, since it limits your configuration options. In our example, all of our configuration options could be supported through configuration composition. You would place each configuration into a separate XML file, or configuration package, then selectively import it into a parent configuration. You basically import the configuration you wish to use into a parent configuration, then load the parent into your Spring context at run time. This is still a very popular technique to use for testing, where it is very easy to specify the Spring context to use at runtime.
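A sketch of what a parent configuration might look like; the imported file names are hypothetical:

    <!-- parent configuration: compose the pieces you want in this context -->
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://www.springframework.org/schema/beans
                http://www.springframework.org/schema/beans/spring-beans.xsd">

        <import resource="classpath:spring/common-config.xml"/>
        <import resource="classpath:spring/embedded-activemq-config.xml"/>
        <!-- swap the line above for mqseries-config.xml in deployed environments -->

    </beans>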

For a long time this was the only option Spring developers had. In Spring 3.1, profiles were introduced to help manage different configurations. As you’ll see in the next section, this is a very powerful feature of Spring.

Spring Framework Profiles

Spring Profiles is a very powerful feature introduced in Spring Framework 3.1. Profiles allow you to define Spring Beans and specify when they should be loaded into the context.

If you do not mark your Spring bean with a profile, it will always be loaded into the Spring context.

When you do mark your Spring bean with a profile, that profile must be set to active for the bean to be loaded into the Spring context. This makes management of the environment easy, since you can simply mark the different options with the appropriate profile; when you set that profile to active, Spring will wire up the appropriate Spring beans into your application.

There is one special profile that is rather poorly documented but very nice to use: the default profile. When you mark a Spring bean using the default profile, the bean is loaded into the context only if no other profile has been set to active. If there is an active profile, Spring will attempt to find a bean matching that profile first.

What I like about using this option is that you don’t have to set an active profile in production. Through the use of default profiles, your application can be deployed into production without an active profile being set. While this is easy to use, from experience it can cause some friction and confusion with the configuration management team.

Going back to the example we’ve been using, I would probably use a CI build profile for the CI build where I want to use an embedded instance of ActiveMQ, then set up the MQSeries option using the default profile. Having control over the CI build environment, it’s easy for me to specify an active profile, so my CI build profile objects get loaded into the Spring context. When my example application gets deployed elsewhere, no profile is set to active, so the default profile objects with the MQSeries configuration get loaded into the Spring context. While we are supporting 3 different environments with MQSeries, this can be (and should be) managed with properties, because the MQSeries objects remain the same except for the configuration of the environment the application is connecting to.
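A sketch of how the CI and default profiles might be expressed in Java configuration; the profile name ci and the bean details are assumptions:

    import javax.jms.ConnectionFactory;

    import org.apache.activemq.ActiveMQConnectionFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Profile;

    @Configuration
    public class JmsConfig {

        // loaded only when the 'ci' profile is active
        @Bean
        @Profile("ci")
        public ConnectionFactory embeddedConnectionFactory() {
            return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
        }

        // loaded only when no other profile has been set to active
        @Bean
        @Profile("default")
        public ConnectionFactory mqSeriesConnectionFactory() {
            // in a real application, build the MQSeries connection factory here
            // from externalized properties; left as a placeholder in this sketch
            throw new UnsupportedOperationException("configure MQSeries connection factory");
        }
    }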

Conclusion

The Spring Framework offers you, as an application developer, a lot of options on how you can compose your application. If you’re used to smaller scale development, the plethora of configuration options in the Spring Framework will probably seem like overkill. Why would you need such flexibility? Right? Wrong. As I’ve shown here, when developing applications in the enterprise you are going to be challenged with supporting the needs of many different environments. You are not just developing code on your laptop. No longer is the production environment the only environment you need to be concerned with. In a large enterprise you will need to support multiple environments, with different configurations and different needs. The Spring Framework has evolved over the years to support the challenging needs of enterprise application development. It’s no wonder that the Spring Framework is the most popular framework for developing enterprise class Java applications.
