How to OSGi-ify a Camel project
In this post I look at the transformation a Camel project running as a standalone Java application must undergo in order to run in an OSGi container.
Managing Camel contexts/routes using Java main() methods is straightforward for application development and testing, but it’s not suitable for complex production deployments.
For this, we need a runtime environment like Karaf where a Camel application can benefit from:
- SSH shell command line console to monitor the data flows
- Standard Camel route management (list/start/stop)
- Bespoke shell commands for specific insights
- Simple integration with 3rd party tools like hawtio
A Java standalone Camel example
As a starting point for OSGi-fication, I’m using Building a scalable exactly-once data pipeline with Kafka and Camel.
This Camel project uses Java DSL to define the routes. It then defines Java *App classes and main() methods that are invoked using Maven profiles and exec-maven-plugin.
With this setup we can execute the programs (and corresponding routes) using a command like below:
mvn compile exec:java -P <profile/program-name>
Alternatively, we could invoke the unit tests present in our project using maven-surefire-plugin and a command like below:
mvn test -Dtest=<MyUnit>Test
In both cases, our project is focused purely on the routing logic. It contains the minimum code to have it running during development.
Source code changes
One way to expose Camel routes to an OSGi container is to wrap them inside a Blueprint. While the heavy lifting is done in Java DSL, a thin XML layer is used to run these in Karaf.
<bean> tags define instances of the Java route objects (previously instantiated by the Java Main class) and <camelContext> tags group them together.
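As a minimal sketch of this arrangement (class and route names here are illustrative, not the exact ones from the project), the blueprint XML might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">

  <!-- Instantiate the route builder previously created by the Java main() class -->
  <bean id="wordGeneratorRoute" class="com.example.WordGeneratorRoute"/>

  <!-- Group the route builders into a Camel context managed by Karaf -->
  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <routeBuilder ref="wordGeneratorRoute"/>
  </camelContext>

</blueprint>
```

The blueprint extender in Karaf picks this file up from OSGI-INF/blueprint inside the bundle and wires the context at activation time.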
Another option is to expose Camel routes using declarative services. We add a Java class to our code and annotate it with the @Component annotation. We define activate() and deactivate() methods which start and stop an OsgiDefaultCamelContext instance.
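A sketch of such a component follows; the package name and the inline route are illustrative assumptions (the real class is OsgiCamelComponent in the linked repository), and it only runs inside an OSGi container with the Camel and SCR bundles present:

```java
package com.example;

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.core.osgi.OsgiDefaultCamelContext;
import org.osgi.framework.BundleContext;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Deactivate;

// Declarative services component; the SCR runtime calls activate()/deactivate()
@Component(immediate = true)
public class OsgiCamelComponent {

    private OsgiDefaultCamelContext camelContext;

    @Activate
    public void activate(BundleContext bundleContext) throws Exception {
        // Create an OSGi-aware Camel context and register a route with it
        camelContext = new OsgiDefaultCamelContext(bundleContext);
        camelContext.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Illustrative route only
                from("timer:tick?period=5000").log("tick");
            }
        });
        camelContext.start();
    }

    @Deactivate
    public void deactivate() throws Exception {
        camelContext.stop();
    }
}
```

The maven-bundle-plugin generates the corresponding service component descriptor XML from the annotations at build time.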
The two options correspond to the camel-container.xml resource and the OsgiCamelComponent class below. You can see their content on GitHub at https://github.com/lucian-d/lab/tree/master/camel-kafka-words.
In this enhanced project the blueprint approach is used to expose a Word generator route and a KafkaRcv processor route. Although they live in separate Camel contexts, together they provide an end-to-end data flow.
In addition, the project uses the declarative services approach to add a secondary KafkaRcv processor route.
Note that OSGi-fication is additive to the previous project. All previous code and capabilities remain unchanged.
Maven pom.xml changes
Both options above need the maven-bundle-plugin to create the OSGi bundle manifest, embed third-party JARs and package it all into a Karaf-deployable JAR file.
The following pom.xml extract shows the parameters required for camel-kafka-words project.
The first thing to point out is that while our routes will be running in Karaf, the Camel, Java and OSGi classes will be loaded from other OSGi bundles (i.e. it’s assumed that Karaf provides them), whereas the Microsoft JDBC classes will be loaded from our own bundle. We use <Import-Package> and <Embed-Dependency> to enforce this arrangement. (Further down you can see how we deploy the “expected” bundles.)
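An illustrative sketch of the relevant maven-bundle-plugin configuration follows; the package patterns are assumptions, and the exact values live in the project’s pom.xml:

```xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Bundle-SymbolicName>camel-kafka-words</Bundle-SymbolicName>
      <!-- Load Camel and OSGi packages from bundles provided by Karaf -->
      <Import-Package>
        org.apache.camel*,
        org.osgi*,
        *
      </Import-Package>
      <!-- Package the Microsoft JDBC driver inside our own bundle -->
      <Embed-Dependency>mssql-jdbc</Embed-Dependency>
    </instructions>
  </configuration>
</plugin>
```

The <Import-Package> patterns become Import-Package: entries in the manifest, while <Embed-Dependency> copies the matching JAR into the bundle and adds it to Bundle-ClassPath.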
OSGi package
Once the plugin has been configured with the right parameters, we can produce the bundle by executing the command below and copying the JAR output file into Karaf’s deploy folder.
mvn clean compile bundle:bundle
As per the XML configuration above, the bundle:bundle Maven goal produces the following MANIFEST.MF content:
An OSGi bundle will present itself to Karaf using Bundle-Name, Bundle-SymbolicName and Bundle-Version values.
The manifest uses the Bundle-Blueprint: and Service-Component: entries to provide Karaf with entry points.
The Import-Package: lines contain the complete list of Java packages and versions the bundle expects the OSGi container to load from external bundles.
The Bundle-ClassPath: line points to the JAR file corresponding to the mssql-jdbc dependency we wanted to package together with our Camel routes. You can see its position within the bundle JAR, along with the .properties, .MF and .xml files, in the following picture:
The bundle further states its expectations of the Karaf container on the Require-Capability: line. It requires Java SE 11 and osgi.component 1.3 and above (up to 2.0). The latter is mandated by exposing a Camel route using declarative services, coded in OsgiCamelComponent.java and packaged as db-secondary-consumer.xml.
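Putting the entries above together, the manifest has roughly this shape; the values here are an illustrative sketch, not copied from the actual build output:

```
Bundle-Name: camel-kafka-words
Bundle-SymbolicName: camel-kafka-words
Bundle-Version: 1.0.0.SNAPSHOT
Bundle-Blueprint: OSGI-INF/blueprint/camel-container.xml
Service-Component: OSGI-INF/db-secondary-consumer.xml
Import-Package: org.apache.camel;version="[3.4,4)",...
Bundle-ClassPath: .,mssql-jdbc-<version>.jar
Require-Capability: osgi.ee;filter:="(&(osgi.ee=JavaSE)(version=11))",
 osgi.extender;filter:="(&(osgi.extender=osgi.component)(version>=1.3)(!(version>=2.0)))"
```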
Deploy to Karaf
To deploy the bundle into Karaf we could simply copy it into the deploy folder. If I do this on a brand new Karaf instance I’m going to hit a few errors. In order to understand how Karaf deployment works, let’s force these errors and see how we can troubleshoot them.
After I copy the JAR and run the following command, I can see that my bundle is installed.
karaf@root()> list
The bundle is not activated yet because a few requirements are missing. I can see these by running diag with the bundle ID reported by list:
karaf@root()> diag 48
To fix this, we run the following commands to add the Camel features used by the bundle.
karaf@root()> feature:repo-add camel 3.4.4
karaf@root()> feature:install camel-blueprint camel-sql camel-kafka
If we check the deployment status of our bundle, we can now see that only two unsatisfied requirements remain.
Note that these are mandated by using OSGi declarative services (OsgiCamelComponent). Using the blueprint alone would not need them, but we can easily add them by installing the scr feature.
While trying to run the following command, I get a missing requirement error for javax.management.
karaf@root()> feature:install scr
To work around it I create a brand new Karaf instance and install the necessary features before copying the bundle.
Once camel-kafka-words-1.0-SNAPSHOT.jar is present in apache-karaf-4.2.10\deploy folder, I can see my bundle is active with no deployment errors.
Test in Karaf
With Camel routes running in Karaf, we can monitor their status and start/stop them individually.
When I first run the camel:route-list command, all bp-event-generator messages are failing because Kafka is not running.
If you’re following along with this example, first stop Karaf. Next, to get Kafka running, follow the sections Preparing Kafka broker through Create Words db schema at https://github.com/lucian-d/lab/tree/master/camel-kafka-words.
When the words2 topic is available, you should see messages being processed. You can see message stats per route.
If you type log:display, you can see log entries from the producer and both consumers.
If you connect to SQL Server database, you can see new records being posted.
Using hawtio
We can further monitor Camel routes using hawtio. To install it in Karaf, run the following commands:
feature:repo-add hawtio
feature:install hawtio-core
At installation I was getting a javax.management missing requirement error. To work around it, I had to start from a fresh Karaf instance and install hawtio first, then Camel, and finally my bundle.
Once hawtio is installed, navigate to http://localhost:8181/hawtio and enter karaf/karaf as the username/password.
Once connected we can view and start/stop Camel routes running in Karaf.