writing JAXB extension plugins
Thursday, November 01, 2012
JAXB extension plugins (1) give the user more power than simple binding alterations, and can encourage clean schemas by keeping code extensions out of the xsds. But finding up-to-date documentation on this can be a pain; most of it predates maven.
In hindsight, the process is actually quite simple and there are some good references, but it still takes collation of quite a few sites to make it work. Here I hope to capture them all for a cohesive approach.
Note that 'plugin' is a ubiquitous term and here can mean many things. For disambiguation I've added a postscript number to each use of the term. See footnotes at the bottom for meanings.
JAXB runtime
This post assumes you're using maven, and if so, the JAXB plugin (2) to use is maven-jaxb2-plugin. Add the latest version to your pom (see http://mvnrepository.com/artifact/org.jvnet.jaxb2.maven2/maven-jaxb2-plugin). Then get your schema generation working OK, as documented in plenty of places elsewhere on the web. If you're looking at this page then you're probably well past that step anyway.
Plugin project skeleton
Create a new maven project to hold your plugin (1). For this example we'll call it my-jaxb-plugin. In the pom add a dependency on jaxb xjc, eg:
<dependency>
    <groupId>com.sun.xml.bind</groupId>
    <artifactId>jaxb-xjc</artifactId>
    <version>2.2.6</version>
</dependency>
Create a new class, call it MyPlugin1 in package com.eg, extending from com.sun.tools.xjc.Plugin. For now just implement the required methods returning "XmyJaxb" from getOptionName(), and implement run() as System.out.println("***working!"); (or similar :).
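Assuming a 2.2.x jaxb-xjc on the classpath, that first cut might look something like the sketch below (the getUsage() text is just a placeholder):
package com.eg;

import org.xml.sax.ErrorHandler;
import org.xml.sax.SAXException;

import com.sun.tools.xjc.Options;
import com.sun.tools.xjc.Plugin;
import com.sun.tools.xjc.outline.Outline;

public class MyPlugin1 extends Plugin {

    @Override
    public String getOptionName() {
        // the xjc argument that activates the plugin, minus the leading '-'
        return "XmyJaxb";
    }

    @Override
    public String getUsage() {
        return "  -XmyJaxb    :  prove the plugin is being invoked";
    }

    @Override
    public boolean run(Outline outline, Options options, ErrorHandler errorHandler) throws SAXException {
        System.out.println("***working!");
        return true; // report success
    }
}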
Now create new folders META-INF/services on the classpath (eg under src/main/resources). Add a text file in there called com.sun.tools.xjc.Plugin. This file lists all the plugins that this project provides, one per line. So for now add your new plugin class:
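com.eg.MyPlugin1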
Run a maven install on the project to make it available (not needed later if you use workspace resolution with eclipse maven plugin (3))
Employ the plugin (1)
In the pom of the project doing the schema generation, configure maven-jaxb2-plugin to run your plugin (1):
<build>
    <plugins>
        <plugin>
            <groupId>org.jvnet.jaxb2.maven2</groupId>
            <artifactId>maven-jaxb2-plugin</artifactId>
            <configuration>
                <extension>true</extension>
                <args>
                    <arg>-debug</arg>
                    <arg>-XmyJaxb</arg>
                </args>
                <plugins>
                    <plugin>
                        <groupId>com.eg</groupId>
                        <artifactId>my-jaxb-plugin</artifactId>
                        <version>0.0.1-SNAPSHOT</version>
                    </plugin>
                </plugins>
            </configuration>
        </plugin>
    </plugins>
</build>
The -debug arg can be removed later as needed. You may also find it useful to use -X after your mvn install command to get verbose output from the maven build.
More
For taking your plugin (1) further, read the key post from 2005, the resources including the javadoc for jaxb-xjc, and the sun codemodel javadocs, which you'll be using to modify the code generation.
Footnotes
(1) an XJC extension plugin - the kind this post is about writing
(2) a maven build plugin
(3) an eclipse IDE plugin
use eclipse to autowrap an object
Tuesday, April 07, 2009
I'm sure there's a better design methodology for this, but I have an issue in Java where the PostgreSQL JDBC driver can't execute createBlob() (part of the JDBC 4 spec), so I want to replace its Connection. I can't subclass it, as it's returned from the call DriverManager.getConnection(...). So what I need to do is use the Wrapper design pattern, aka Decorator, aka Delegator. Oh wouldn't it be great to use via? But alas, until my cry is heard, or someone corrects me, the solution is to:
- Create a new class that also implements Connection.
- Use this class to wrap the obtained PostgreSQLConnection
- Painstakingly implement each and every method in Connection to pass through to the wrapped connection, except for the methods I want to meddle with.
- Pass out of my Connection-obtainer class, not the raw Connection, but my new wrapper.
- Accept resignedly that when the Connection interface changes, my Connection wrapper will now not fulfil the new interface and will break. Really good reason for via.
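A sketch of where that ends up (the class name and the createBlob() body here are mine, and it's declared abstract only so the fragment compiles without all the other methods; the real class has a pass-through for every method in Connection):
import java.sql.Blob;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.rowset.serial.SerialBlob;

public abstract class ConnectionWrapper implements Connection {

    private final Connection wrapped;

    public ConnectionWrapper(Connection wrapped) {
        this.wrapped = wrapped;
    }

    // the one method I want to meddle with - eg return an in-memory Blob instead
    public Blob createBlob() throws SQLException {
        return new SerialBlob(new byte[0]);
    }

    // everything else just delegates to the wrapped connection, eg:
    public Statement createStatement() throws SQLException {
        return wrapped.createStatement();
    }

    public void close() throws SQLException {
        wrapped.close();
    }

    // ...and so on for the rest of Connection
}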
To fix all wrapped method calls that return a value:
In the new class, Ctrl-F to bring up the find/replace dialog. Make sure 'Wrap search' and 'Regular expression' are ticked, and 'Scope' is 'All', then copy and paste into Find:
This searches for any lines having the word public, not followed by class, and having a return statement.
Hit Find a few times to validate it's matching correctly.
Now, in Replace:
Takes the first match (\1) and appends the method call (\2) to the field name.
Click 'Replace All' to see it happen.
To fix the remaining method calls with no return value (ie void)
In Find:
Now, in Replace:
And 'Replace All' again.
As yet the above regex statements don't remove parameter types from the calling statements, so you will have to go through and remove them, but that should be relatively little work.
use jndi with spring to access external properties file
Tuesday, January 13, 2009
On the surface, external configuration files in a JEE environment provide a simple mechanism for storing environment-specific data, but they can be a pain to access. How do you access a file when file system access isn't allowed? Since it's outside the classpath, where does the file live in each environment? And if I'm using Spring, how do I get at this movable file?
The best solution I've found is to use jndi resources. The following is a solution using Websphere (6) and Spring (2.0.1).
Step 1: Configure the jndi reference for Websphere
This step was based on information from IBM's page "Using URL resources to manage J2EE property files in IBM WebSphere Application Server V5", steps A and B. Websphere Studio isn't required, however, so briefly:
a) Navigate to Resources > URL > URLs and create a new URL. Make up a JNDI name starting with "url/". In 'specification', enter the path to the properties file as a URI, eg. "file:///E:/project.properties". So now Websphere has a URL Resource pointing to the properties file for this environment.
b) In the code, edit web.xml. Configure a new resource-ref as shown in IBM's figure 6. 'res-ref-name' is the jndi name we set up in a). Then in ibm-web-bnd.xmi, add a new resRefBindings as shown in IBM's figure 7.
That completes the configuration of Websphere and the jndi configuration in the code.
Step 2: Next Spring needs to be able to load the properties file by looking up the jndi location.
c) I'll assume that you want the properties file to be set as a property on a class 'pkg.MyClass'. To do this, we use a PropertiesFactoryBean to convert the properties file into a Properties object. The PropertiesFactoryBean takes a Resource as its location property, so we create a UrlResource bean for this, with the java.net.URL as constructor argument. The java.net.URL is the result of using a JndiObjectFactoryBean to look up the jndi name and return the URL object. The following bean config shows these conversions:
<bean class="myclass">
<property name="props">
<!-- Load from the .properties file-->
<bean class="org.springframework.beans.factory.config.PropertiesFactoryBean">
<property name="location">
<!-- Generate a UrlResource from the java.net.Url -->
<bean class="org.springframework.core.io.UrlResource">
<constructor-arg>
<!-- use jndi to look up the location of the parameters.properties file -->
<bean class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiName" value="java:comp/env/url/analysisParametersURL" />
</bean>
</constructor-arg>
</bean>
</property>
</bean>
</property>
</bean>
And that's it. In summary, Spring looks up a jndi URL reference to a properties file, configured in the JEE server. Spring beans are created that convert the URL to a URLResource to a Properties object, available for injection into your custom class.
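For what it's worth, the plain-Java equivalent of that bean chain (a rough sketch; the class and method names here are mine) is something like:
import java.io.IOException;
import java.net.URL;
import java.util.Properties;

import javax.naming.InitialContext;
import javax.naming.NamingException;

public class JndiPropertiesLoader {

    public static Properties load() throws NamingException, IOException {
        // look up the URL resource configured in the JEE server
        InitialContext ctx = new InitialContext();
        URL location = (URL) ctx.lookup("java:comp/env/url/analysisParametersURL");

        // read the .properties file the URL points at
        Properties props = new Properties();
        props.load(location.openStream());
        return props;
    }
}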
java modulus for floating point numbers
Thursday, October 09, 2008
The java modulus (or remainder) operator (%) works with floating point numbers (ie float, double), but it computes the remainder from a quotient truncated towards zero rather than rounded. If you want the IEEE 754 remainder, where the quotient is rounded to the nearest integer (and the result can be negative), use Math.IEEEremainder instead. It's fine to use % for ints.
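A quick illustration of the difference:
public class RemainderDemo {
    public static void main(String[] args) {
        double a = 5.0, b = 3.0;
        System.out.println(a % b);                    // 2.0  (quotient truncated to 1)
        System.out.println(Math.IEEEremainder(a, b)); // -1.0 (quotient rounded to 2)
    }
}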
interface fulfillment using fields - a java language proposition - part 2 of 2
Friday, July 11, 2008
I'm proposing a new keyword in Java class definitions, via. It essentially provides a methodology to simply and maintainably perform automatic delegation of interface methods to member fields. Part one introduced the concept, while part two will delve into more of the finer points.
Inheritance model integrity
The integrity of the inheritance model is maintained, as a class using via can also subclass another class, eg:
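public class Car extends Machine implements Driveable via this.vehicle {
    private Driveable vehicle;
    ...
}
(Machine here is just an arbitrary superclass; Car, Driveable and vehicle are from Part one.)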
And this class can even be subclassed itself:
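public class SportsCar extends Car {
    ...
}
(SportsCar being any ordinary subclass.)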
As seen in Part one, the classes compile to a traditional POJO, so inheritance is not adversely affected. Of course any class can still only extend one superclass.
via can wrap many classes.
An illustration: if obj.method() is invoked and Obj uses via:
- an implementation of method() defined by Obj is looked for first.
- If not found, Obj.super.method() is tried, and so on up the hierarchy to Object.method().
- If not found in the object's hierarchy up to and including Object, then the interfaces Obj fulfils via its fields are searched (and the method invoked on the corresponding field).
- If Obj's own interfaces don't define it, then work up the inheritance tree again, checking each superclass for any interfaces fulfilled with via.
- In the situation where two or more interfaces are mapped with via in the same class and define the same method, obviously the JVM wouldn't know which field to invoke the method on. In this case a compiler error will be thrown. The solution is to explicitly override the method in Obj.
IOC and DI
What about the IOC pattern? As far as I see it, via is an awesome tool to use with IOC patterns, such as the Spring Framework. The idea with the IOC pattern (or Dependency Injection (DI)) is that any implementation class that fulfils the required interface can be swapped in at runtime. As far as I know DI can't be used to inject in a class to be subclassed, as the inheritance is defined as the subclassing of a given concrete class. As we saw in Part one, via basically lets us subclass interfaces. This means that a given interface can have our extra functionality or handling (using via), but that interface can be any class, it's just whatever the IOC container passes in. Very cool.
Multi-class interface fulfilment
Part one mentioned the idea of fulfilling an interface by the combining of two or more classes. The problem this is trying to solve is basically that to implement one interface method might require the help of a number of private methods. This then encourages splitting up interfaces so that the implementation classes don't suffer from unreadability and complexity. The problem this creates is that the exposed interfaces then increase in number and each is more basic than it need be.
The idea is that an interface can be defined with as many methods as the pure design requires, without concern about the resulting complexity of any implementors. Using via, a single class can implement an interface and define a number of implementation classes that combine to fulfil it. Here's a full example:
interface PackageIface1 {
void setName(String name);
}
interface PackageIface2 {
void setNumber(int number);
}
// now the public interface that combines them
public interface PublicIface extends PackageIface1, PackageIface2 {}
// the implementation classes (package-visible)
class PackageClass1 implements PackageIface1 {
public void setName(String name) {...}
}
class PackageClass2 implements PackageIface2 {
public void setNumber(int number) {...}
}
// and the integration class that brings them all together
public class PublicClass implements PublicIface via this.pkgIface1Obj & this.pkgIface2Obj {
// class just defines the two fields and constructor(s)
}
(Note: The '&' symbol is arbitrarily chosen and could be anything that makes sense and is achievable).
So the PublicIface is the only interface that needs to be exposed publicly. The PublicClass will be a fixed implementation to use. Discrete functionality subsets of PublicIface can be changed by swapping in a different implementation of PackageIface1 or PackageIface2 that get passed to PublicClass's constructor. And there is no limit to the number of classes that can be used to fulfil the main interface, provided that each one exclusively implements an interface that is extended by the main interface.
I may be way off track, but I dreamed this up when coming across the same problem for the tenth time in an enterprise Java project using Spring, and it just seems to fit. It may be that there are techniques or patterns out there that mean I can do all this already, or just that I should be slapped for suggesting such things. All feedback welcome.
interface fulfillment using fields - a java language proposition - part 1 of 2
I'm proposing a new keyword in Java class definitions, via. It essentially provides a methodology to simply and maintainably perform automatic delegation of interface methods to member fields.
This provides:
- Encouragement for coding to interfaces over inheritance
- Object wrappers that do not need to subclass the wrapped object, yet still expose the wrapped object's methods directly (exposed as 'is-a' rather than having to manually delegate because of 'has-a')
- Can 'swap out' the effective superclass (like subclassing an interface, not a class)
- Advantages of multiple-inheritance
- and potentially: Interface fulfillment by combining two or more classes
Here's an example of what it might look like:
public class Car implements Driveable via this.vehicle {
    ...
}
Which says, I have a class, Car, which implements the interface Driveable. Car as an object definition may not fulfill all (or any) of the requirements for Driveable, but via its field 'vehicle', the contract is met.
And the code for Car might look like:
public class Car implements Driveable via this.vehicle {
    private Driveable vehicle;
    public Car() {
        this.vehicle = new DefaultCar();
    }
    public void doCarStuff() {...}
}
So we can see here that the requirements for the Driveable interface can be met by anything that can be assigned to the 'vehicle' field. In this case a new instance of DefaultCar. As with any class, Car can also define any other additional members as well (e.g. doCarStuff()), enriching the functionality of DefaultCar (as subclassing would). It is essentially a methodology for automatic and type-safe delegation. But we gain nothing in this example, we may as well just extend the DefaultCar class. To get the advantages we need to make some changes...
public class Car implements Driveable via this.vehicle {
    private Driveable vehicle;
    public Car() {
        this.vehicle = new DefaultCar();
    }
    public Car(Driveable driveableClass) {
        this.vehicle = driveableClass;
    }
    public void doCarStuff() {...}
}
In this way Car's new functionality can enrich any implementation of Driveable. At runtime we can instantiate different instances of Car, each of which could have a unique implementation of Driveable. Swapping in different driveableClass objects could be useful for endowing different classes with the same extra features. Also for testing, Mocks or test doubles can be passed to the constructor so that only the added features are under test.
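For example (RacingVehicle standing in for any other Driveable implementation):
Car defaultCar = new Car();                    // wraps a DefaultCar
Car raceCar = new Car(new RacingVehicle());    // enrich a different Driveable impl
raceCar.setSpeed(300);                         // delegated to the RacingVehicle via 'vehicle'
raceCar.doCarStuff();                          // Car's own added behaviour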
In this example Car could also override any of the Driveable methods, as with traditional class inheritance. More on overriding in Part two.
Under the hood
So how would it work? I imagine the compiler would generate bytecode representing the code below. You could write this yourself, but it would be messy and require duplication.
Firstly an example of the Driveable interface:
public interface Driveable {
    void setSpeed(int speed);
    boolean isMoving();
}
Then the effective resulting code (ie the developer wouldn't see it written like this):
public class Car implements Driveable {
    private Driveable vehicle;
    public Car() {
        this.vehicle = new DefaultCar();
    }
    public Car(Driveable driveableClass) {
        this.vehicle = driveableClass;
    }
    public void doCarStuff() {...}
    //delegation
    public void setSpeed(int speed) {
        this.vehicle.setSpeed(speed);
    }
    //delegation
    public boolean isMoving() {
        return this.vehicle.isMoving();
    }
}
Multiple inheritance
While some debate the merits of multiple inheritance, it is often imitated by subclassing one class and holding a member object that subclasses another. The via keyword, however, allows us to do this directly, eg:
public class Car extends Machine implements Runnable via this.runner {  // Machine is illustrative
    private Runnable runner;
    ...
}
The construct would not break existing Java inheritance relationships, as a class can both extend a class as well as use interface fulfillment via fields, and a class that uses that construct can itself be subclassed.
More to follow...
More benefits and some finer points will follow in Part 2, including benefit [5] hinted at earlier...
jni - call C/C++/Assembly from Java
Friday, May 23, 2008
- Create a Java class containing the native method(s) (static or instance) that define the interface with the C code:
public class Processor {
    public static native double process(double x, double y);
}
- Compile it to a Java class file.
- Run javah -jni [-o path/to/CProcessor.h] Processor to generate a C header file from the Java class. Eg. CProcessor.h:
/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class Processor */
#ifndef _Included_Processor
#define _Included_Processor
#ifdef __cplusplus
extern "C" {
#endif
/*
* Class: Processor
* Method: process
* Signature: (DD)D
*/
JNIEXPORT jdouble JNICALL Java_Processor_process
(JNIEnv *, jclass, jdouble, jdouble);
#ifdef __cplusplus
}
#endif
#endif
- Create a .c C source file - this is the stub to call the target code. Include the header <jni.h> and "CProcessor.h". The angle brackets mean it is registered as a library file for the compiler, whereas the speech marks denote a stand-alone header file. You may need to include the path to CProcessor.h within the speech marks. Here is a simple example that just sums the two input numbers, rather than calling any other C:
#include <jni.h>
#include <math.h>
#include "../headers/CProcessor.h"
#include <stdio.h>
JNIEXPORT jdouble JNICALL Java_Processor_process
(JNIEnv *env , jclass obj, jdouble val1, jdouble val2) {
return (val1+val2);
}
- Run (eg for linux):
gcc -shared -I /usr/lib/jvm/java-1.5.0-sun-1.5.0.15/include/ -I /usr/lib/jvm/java-1.5.0-sun-1.5.0.15/include/linux/ -o CProcessor.so CProcessor.c
to create the library file (indicated by the -shared flag). Any compiler errors about missing .h headers should be solved by the inclusion (-I) of the jni and jni-for-linux include paths.
- Either in a method inside the original Java class, or in a new calling class, create a static block to either load (System.load()) the .so library via its file path, or load (System.loadLibrary()) a registered library (e.g. a dll on Windows) via system-specific addressing.
- Once the static block has loaded the library, the methods are available either statically or from an object, as determined in step 1:
public class Caller {
    static {
        System.load("/home/me/_projects/JNI/C/CProcessor.so");
    }
    public static void main(String[] args) {
        double result = Processor.process(2, 3);
        System.out.println(result);
    }
}
In this case our Processor in Java calls the CProcessor in C, which adds the two doubles we passed it and returns a double. Here it is 2 + 3 with the output 5.0.
set ear file's context root in WebSphere
Thursday, March 27, 2008
WebSphere lets you set the context root of a war file directly on page 1 of the update screens. ear files are different however:
It turns out that when updating an ear, you need to select the option on the first screen to show all build/config options.
Then skip to step 8 'Map context roots for Web modules'. There the context can be specified.
lock hibernate session to avoid lazy loading exceptions in tests
Tuesday, March 04, 2008
//Lock this hibernate session so we don't get lazy loading exceptions
SessionFactory sessionFactory = (SessionFactory) appContext.getBean("sessionFactory");
Session session = SessionFactoryUtils.getSession(sessionFactory, true);
TransactionSynchronizationManager.bindResource(sessionFactory, new SessionHolder(session));
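And at the end of the test, a matching tear-down (a sketch, assuming the same Spring hibernate3 support classes as above) releases the session again:
// unbind and release the session so subsequent tests start clean
SessionHolder sessionHolder = (SessionHolder) TransactionSynchronizationManager.unbindResource(sessionFactory);
SessionFactoryUtils.releaseSession(sessionHolder.getSession(), sessionFactory);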