Software Deployment – Java applications as an RPM Linux package

Java application archives such as jar, war and ear files are the elementary distribution blocks in the Java world. In the beginning, managing all of these libraries and components was a bit cumbersome and error prone, as project dependencies depend on other libraries and all those transitive dependencies create the so-called dependency hell. To ease developers of this burden, Apache Maven (and Maven-like tools) was developed. Every artefact has so-called coordinates which uniquely identify it, and all dependencies are resolved via those coordinates in a recursive fashion.
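
For illustration, coordinates are essentially a groupId:artifactId:version triple. A minimal sketch (the artefact chosen here is purely illustrative) asking Maven to resolve an artefact and, by default, its transitive dependencies purely by coordinates:

mvn dependency:get -Dartifact=org.apache.commons:commons-lang3:3.1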

Maven eases the management at the artefact development stage but doesn't help that much when we want to deploy the application component. Often that's not such a big deal if your runtime environment is a clustered J2EE application server, e.g. a WebLogic cluster: you hand the ear or war over to your ops team and they deploy it via the cluster management console to all nodes at once. They need to maintain an archive of deployed components in case of a rollback, etc. This is the simplest case (an isolated component that doesn't deal with dependencies, e.g. libraries provided in the cluster), where management is relatively clean but relies heavily on the process. When we consider a different runtime environment – running the application as a standalone Java process without an application server (as opposed to a J2EE cluster) – things get a bit more complicated even in the simplest case. Your Java application is typically distributed as a jar file and you need to distribute it to every single Linux server where an instance of this process runs. Apart from that, a standard jar file doesn't contain its dependencies. One possible solution would be to create a shaded (fat) jar file which has all dependencies embedded. But I suppose you have a repository where all builds are archived – does it make sense to store those big archives where the major part is 3rd-party libraries? This is probably not the right way to go.

Another aspect of the rollout process is the ability to automate it. For J2EE clusters like WebLogic there is often a scripting tool provided (WLST, the WebLogic Scripting Tool). The land of plain jars is again a lot worse. You can take some advantage of Maven, but that doesn't solve all the problems. The majority of production environments in the Java world run on a Linux operating system, so why not take advantage of the standard Linux distribution management tools like yum, apt, etc. for distributing RPM packages? This system provides atomicity, dependency management between packages, an easy way to roll back (it keeps track of versions), and it minimises the number of manual steps – the potential for human error is reduced – while providing native auditing. It is pretty easy to get information about the installation history.
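
A few commands illustrating the rollback and audit capabilities mentioned above – a sketch, with the package name and transaction id purely illustrative:

yum history                          # list past install/upgrade transactions
yum history info 42                  # details of a single transaction
rpm -q --last [application_name]     # installed versions, newest first
yum downgrade [application_name]     # roll back to the previously available version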

To pack your Java jar application you need a tool called rpmbuild, which creates a Linux package from a SPEC file. A SPEC file is something like a pom in the Maven world, plus it contains instructions on how to install, uninstall, upgrade, etc. The packages containing the required and handy tools are rpm-build (providing rpmbuild), rpmdevtools and rpmlint. On a Linux OS it is simple to install them (see the example below); on Windows you need Cygwin with the same tool set. In order to build your rpm workspace, run the rpmdev-setuptree command shown after the installation example. It is highly recommended not to run it under the root account if there is no special need for it.
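
On an RPM-based distribution the tooling can typically be installed like this (a sketch – exact package names may differ slightly between distributions):

sudo yum install rpm-build rpmdevtools rpmlint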

rpmdev-setuptree

This command creates the rpmbuild folder – the place where all the RPM packaging will happen. It contains the sub-folders BUILD, RPMS, SOURCES, SPECS and SRPMS. For us the important ones are RPMS, which will contain the final rpm, and SPECS, which is where we put the SPEC file describing the installation and the content of our application.
The SPEC file is the core of RPM packaging. It contains all the information about version, dependencies, installation, un-installation, upgrade, etc. We can create a skeleton SPEC file by running the following command:

rpmdev-newspec

The majority of directives in this file are clear from their names, e.g. Name, Version, Summary, BuildArch, etc. BuildRoot requires special attention: it is a sort of proxy which mimics the root of the system under construction. For example, if I want to install my [application_name] (replace this placeholder with the actual name) to /usr/local/[application_name], I have to create this structure under BuildRoot during the installation. Then there are sections which correspond to the various phases of installation: %prep, %build and %install – the latter is the most important one for us, as we do not build from sources but just pack an already built jar file into the rpm. The last very important section is %files, which lists all files that will end up in the final rpm package and hence be installed on the target machine. Apart from that there can be additional hooks into the installation and un-installation process, such as %post, %preun and %postun, which allow you to customise the process as needed. A sample SPEC file follows:

%define _tmppath /home/virtual/rpmbuild/tmp
Name: [application_name]
Version: 1.0.2
Release: 1%{?dist}
Summary: Processor component which feeds data into the DB
Group: Applications/System
License: GPL
URL: https://jaksky.wordpress.com/
BuildRoot: %{_topdir}/%{name}-%{version}-%{release}-root
BuildArch: noarch
Requires: jdk >= 7
%description
Component which processes incoming messages and stores them in the DB.

%prep

%build

%install
rm -rf $RPM_BUILD_ROOT
mkdir -p $RPM_BUILD_ROOT/usr/local
cp -r %{_tmppath}/[application_name] $RPM_BUILD_ROOT/usr/local
mkdir -p $RPM_BUILD_ROOT/usr/local/[application_name]/logs
mkdir -p $RPM_BUILD_ROOT/etc/init.d
cp -r %{_tmppath}/[application_name]/bin/[application_name] $RPM_BUILD_ROOT/etc/init.d
mkdir -p $RPM_BUILD_ROOT/var/run/[application_name]

%files
%defattr(644,[application_name],[application_name])
%dir %attr(755,[application_name],[application_name]) /usr/local/[application_name]
%dir %attr(755,[application_name],[application_name]) /usr/local/[application_name]/lib
/usr/local/[application_name]/lib/*
%attr(755,[application_name],[application_name]) /usr/local/[application_name]/logs
%dir %attr(755,[application_name],[application_name]) /usr/local/[application_name]/conf
%config /usr/local/[application_name]/conf/[application_name]-config.xml
%config /usr/local/[application_name]/conf/log4j.properties
%dir %attr(755,[application_name],[application_name]) /usr/local/[application_name]/deploy
/usr/local/[application_name]/deploy/*
%doc /usr/local/[application_name]/README.txt
%dir %attr(755,[application_name],[application_name]) /usr/local/[application_name]/bin
%attr(755,[application_name],[application_name]) /usr/local/[application_name]/bin/*
%attr(755,root,root) /etc/init.d/[application_name]
%dir %attr(755,[application_name],[application_name]) /var/run/[application_name]

%changelog
* Wed Nov 13 2013 Jakub Stransky <Jakub.Stransky@jaksky.com> 1.0.2-1
- Bug Fixing wrong messages format
* Wed Nov 13 2013 Jakub Stransky <Jakub.Stransky@jaksky.com> 1.0.1-1
- Bug Fixing wrong messages format
* Mon Nov 11 2013 Jakub Stransky <Jakub.Stransky@jaksky.com> 1.0.0-1
- First release of [application_name]

Several things to highlight in the SPEC file example: _tmppath points to the location where the installed application layout is prepared – that is essentially what is going to be packed into the rpm. %defattr sets the default attributes for files where no specific ones are given. %config marks configuration files: on the first installation the packaged defaults are provided, and on an upgrade rpm treats locally modified copies specially instead of silently discarding customisations made for this particular instance (a modified file is saved as .rpmsave, or left untouched when %config(noreplace) is used).
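
Before building, it is worth running rpmlint (one of the handy tools mentioned earlier) over the SPEC file to catch common packaging mistakes – a quick check, assuming the file sits in the SPECS folder:

rpmlint SPECS/nameOfTheSpecFile.spec
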
Now we are ready to create the rpm package; just the last step is pending:

rpmbuild -v -bb --clean SPECS/nameOfTheSpecFile.spec

The created package can be found in the RPMS sub-folder. We can test the package locally with

rpm -i nameOfTheRpmPackage.rpm

To complete the smoke test, let's remove the package with

rpm -e nameOfTheApplication
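
It is also worth inspecting what actually went into the package and what its scriptlets do; a few standard rpm queries (paths and names are illustrative – with BuildArch noarch the package lands under RPMS/noarch):

rpm -qpl RPMS/noarch/nameOfTheRpmPackage.rpm    # list files contained in the package
rpm -qpi RPMS/noarch/nameOfTheRpmPackage.rpm    # show package metadata
rpm -q --scripts nameOfTheApplication           # show scriptlets of the installed package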

Creating a SPEC file should be a pretty straightforward process, and once you have created the SPEC file for your application, building the rpm package is a one-minute job. If you want to automate it, there is a Maven plugin which generates a SPEC file for you. It is essentially a wrapper around the rpmbuild utility, which means the plugin works fine on Linux with the tool set installed, but on a Windows machine you need Cygwin installed and a wrapper bat file mimicking the rpmbuild utility for the plugin. A detailed manual can be found, for example, here.
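
Assuming the rpm-maven-plugin (org.codehaus.mojo) is configured in the pom, the package can then be produced as part of the Maven build – a sketch using the plugin's documented goal:

mvn package rpm:rpm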

A couple of things to highlight when creating a SPEC file: prepare the package for all scenarios – install, remove, upgrade and configuration management – right from the beginning, and test it properly. It can save you a lot of trouble and manual work in case of large installations. Creating a new version of the Java application is then only about replacing the jar file and re-packaging the rpm bundle.
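
An upgrade on the target machine is then a single command, or goes through the regular repository tooling – a sketch with an illustrative file name:

rpm -Uvh nameOfTheRpmPackage-1.0.3-1.noarch.rpm   # in-place upgrade; rpm keeps track of %config files
# or, if the package is published in a yum repository:
yum update nameOfTheApplication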

In this quick walkthrough I tried to show that creating an rpm package as the unit of software deployment for a Java application is not that difficult and can neaten the rollout process. I have just scratched the surface of RPM packaging and was far from showing all the capabilities of this approach. I will conclude this post with several links which I found really useful.

Great tutorial on RPM packaging in general
Good rpmbuild manual pages
Maven rpm plugin
Maximum RPM book

Java application as a Linux service

Using standard J2EE containers for application deployment is not always a suitable option. From time to time you need to run a Java application (jar file) as a standalone, more lightweight Linux process. Using the standard java -cp … MainClass is feasible, but sooner or later you will discover that something important is missing, especially if you are supposed to run multiple components this way – it becomes really messy and hard to manage pretty soon. On a Linux system there is a much better solution: run the component as a Linux service.
Let's make it simple and easy to understand. A Linux service is essentially a "process" driven by an init script, with a defined API – a set of standard commands for managing the underlying Linux process. Those service commands look as follows (processor represents the actual name as defined in the init script, see later):

service processor start
service processor status
service processor stop
service processor restart

That's a lot simpler and easier to manage and monitor, right? You don't need to know where the particular jar file is located, etc. Examples of init scripts can usually be found in /etc/init.d/samples, or you can simply read the scripts in /etc/init.d, which contains the init scripts of the various services already present on the system.
For Java applications there is a bunch of projects which act as service wrappers. They enable you to quickly and easily turn a jar file into a regular Linux service daemon. There are wrappers even for Windows. For various reasons I was directed to use just the standard Linux server tools, so the remainder of this post is about making the service daemon the common way, via shell scripts.
First of all, it is necessary to create startup and shutdown scripts which properly manage the pid (process id) file. A good practice is to have a dedicated user to run each particular Linux service and to install it under /usr/local/xxx.
The startup script follows:

#!/bin/sh
#
# Script parameters: [Installation_Folder]
#
# JAVA_HOME Must point at your Java Development Kit installation.
# Required to run the with the "debug" argument.
#
# JRE_HOME Must point at your Java Runtime installation.
# Defaults to JAVA_HOME if empty. If JRE_HOME and JAVA_HOME
# are both set, JRE_HOME is used.
#
# JAVA_OPTS (Optional) Java runtime options used when any command
# is executed.

# Check the way the script has been called and set current directory as PROCESSOR_HOME
if [ "X$1" = "X" ]
then
  cd .. >/dev/null
  pwd >/dev/null
  PROCESSOR_HOME=$PWD
  SERVICE_INVOKE="no"
else 
  PROCESSOR_HOME=$1
  SERVICE_INVOKE="yes"
fi
echo PROCESSOR_HOME set to $PROCESSOR_HOME
# Load config
source $PROCESSOR_HOME/bin/config.sh
# Check if the invocation is according to configuration [asService | asProcess]
if [ ! "$SERVICE_INVOKE" == "$RUN_AS_SERVICE" ]
then
  echo "ERROR - Invocation is not according to configuration - run as a Lunux Service= $RUN_AS_SERVICE"
  exit 6
fi
# check installation
if [ ! -d "$PROCESSOR_HOME/bin" \
-o ! -f "$PROCESSOR_HOME/bin/config.sh" \
-o ! -d "$PROCESSOR_HOME/conf" \
-o ! -d "$PROCESSOR_HOME/deploy" \
-o ! -d "$PROCESSOR_HOME/lib" \
-o ! -f "$PROCESSOR_HOME/conf/log4j.properties" \
-o ! -f "$PROCESSOR_HOME/deploy/test1-1.0-SNAPSHOT.jar" ]; 
then
echo 
echo ERROR - Installation is not correct!
echo Expected installation package looks:
echo "$PROCESSOR_HOME/bin"
echo "$PROCESSOR_HOME/bin/config.sh"
echo "$PROCESSOR_HOME/conf"
echo "$PROCESSOR_HOME/conf/log4j.properties"
echo "$PROCESSOR_HOME/deploy"
echo "$PROCESSOR_HOME/deploy/test1-1.0-SNAPSHOT.jar"
echo "$PROCESSOR_HOME/lib"
exit 1
fi
# clean up
CLASSPATH=
JAVA_OPTS=
JAVA_PATH=
JAVA_EXEC=

# set JAVA
REQUIRED_JVM_VERSION=1.7
if [ -z "$JAVA_HOME" ]; 
then
  if [ -z "$JRE_HOME" ];
    then
      echo ERROR - either JAVA_HOME or JRE_HOME is not set!!!
      exit 1
    else
    echo Java JRE used $JRE_HOME
    JAVA_PATH=$JRE_HOME
  fi
else
  echo Java used $JAVA_HOME
  JAVA_PATH=$JAVA_HOME 
fi

# set JAVA_EXEC
JAVA_EXEC=$JAVA_PATH/bin/java
#check Java bin
if [ ! -x "$JAVA_EXEC" ];
then
  echo Java binaries not found $JAVA_EXEC
  exit 1
fi
# checkJavaVersion
JVM_VERSION=$("$JAVA_EXEC" -version 2>&1 | awk -F '"' '/version/ {print $2}')
#echo version "$JVM_VERSION"
if [[ "$JVM_VERSION" < "$REQUIRED_JVM_VERSION" ]]; 
then
  echo "ERROR - $JAVA_EXEC does not point to proper Java version $REQUIRED_JVM_VERSION"
  exit 1
fi
# setBDHISTP_MAIN
BDHISTP_MAIN=cz.jaksky.PROCESSOR.PROCESSOR
# setClasspath
CLASSPATH=$PROCESSOR_HOME/deploy/*:$PROCESSOR_HOME/lib/*
# echo Classpath set to: $CLASSPATH

# setJAVA_OPTS
JAVA_OPTS=-Dbdconf=$PROCESSOR_HOME/conf
JAVA_OPTS="$JAVA_OPTS -Dlog4j.configuration=file:$PROCESSOR_HOME/conf/log4j.properties"
#echo JAVA_OPTS set to: $JAVA_OPTS
# This is nasty as in the code there is hardcoded location to actual config file for the process
cd $PROCESSOR_HOME
runProgram() {
echo $JAVA_EXEC $JAVA_OPTS -classpath $CLASSPATH $BDHISTP_MAIN
$JAVA_EXEC $JAVA_OPTS -classpath $CLASSPATH $BDHISTP_MAIN & PROCESS_PID=$!
echo $PROCESS_PID > $PIDDIR/$PID_FILENAME
echo "new application instance started as process $PROCESS_PID"
}

if [ ! -f "$PIDDIR/$PID_FILENAME" ]
then 
  echo "I will try to start new process ..."
  runProgram
else
  PID=$(cat $PIDDIR/$PID_FILENAME)
  if ps -p $PID >/dev/null
    then
      echo "WARNING $APP_NAME already running as process $PID"
    else
      echo "process $PID is not running - will try to start a new instance of the application"
      echo " "
      runProgram 
  fi
fi
exit 0
 

The shutdown script follows:

#!/bin/sh
# Script usage:
# this script can be invoked either directly in bin folder or from different location with passing information where to locate installation folder
#
# Check the way the script has been called and set current directory as PROCESSOR_HOME
if [ "X$1" = "X" ]
then
  cd .. >/dev/null
  pwd >/dev/null
  PROCESSOR_HOME=$PWD
  SERVICE_INVOKE="no"
else 
  PROCESSOR_HOME=$1
  SERVICE_INVOKE="yes"
fi
echo PROCESSOR_HOME set to $PROCESSOR_HOME
# Load config
source $PROCESSOR_HOME/bin/config.sh
if [ -z "$PIDDIR" ]
then
  echo "ERROR - Installation configuration file config.sh not found at $PROCESSOR_HOME/bin"
  exit 1
fi
# Check if the invocation is according to configuration [asService | asProcess]
if [ ! "$SERVICE_INVOKE" == "$RUN_AS_SERVICE" ]
then
  echo "ERROR - Invocation is not according to configuration - run as a Lunux Service= $RUN_AS_SERVICE"
  exit 6
fi
if [ -f "$PIDDIR/$PID_FILENAME" ]
then
  PID=$(cat $PIDDIR/$PID_FILENAME)
  kill $PID
  RC=$?
  rm $PIDDIR/$PID_FILENAME
  echo "Application $APP_NAME - process $PID shut down successfull"
  exit $RC
else
  echo "pid file not exist $PIDDIR/$PID_FILENAME, nothing to shut down"
  exit 0
fi

Those scripts rely on the existence of an installation configuration shell script – config.sh, located in the bin folder of the installation, as follows:

#!/bin/sh 
RUN_AS_SERVICE="yes"
APP_NAME="Processor"
APP_LONG_NAME="Processor instance" 
PIDDIR="/var/run/processor"
PID_FILENAME="processor.pid"

The startup script creates the pid file in /var/run/processor – the user under which the installation runs needs appropriate privileges on that directory.
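
If the directory does not already exist (the rpm from the previous section creates it at install time), it can be prepared manually – a sketch using the user and path from config.sh:

sudo mkdir -p /var/run/processor
sudo chown processor:processor /var/run/processor
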
Finally, the init script, which needs to be placed into the /etc/init.d folder:

### BEGIN INIT INFO
# Provides: processor
# Required-Start: 
# Required-Stop: 
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: processor daemon
# Description: processor daemon
# This provides an example of how to write an init script.
### END INIT INFO
# Config to edit if needed
INSTALL_HOME=/usr/local/Processor
JAVA_HOME=/usr/java/default
SERVICE_USER="processor"
# No modification allowed from here
# Using the lsb functions to perform the operations.
. /lib/lsb/init-functions
#
# If the daemon is not there, then exit.
test -x $INSTALL_HOME/bin/startUp.sh || exit 5
test -x $INSTALL_HOME/bin/shutDown.sh || exit 5
test -x $INSTALL_HOME/bin/config.sh || exit 5
# Load config
source $INSTALL_HOME/bin/config.sh
export JAVA_HOME
PIDFILE=$PIDDIR/$PID_FILENAME
# Process name ( For display )
NAME=$APP_NAME
CURRENT_USER=`id -nu`
start(){
  echo "Starting $NAME under $SERVICE_USER user..."
  if [ "$CURRENT_USER" == "$SERVICE_USER" ]
  then
    $INSTALL_HOME/bin/startUp.sh $INSTALL_HOME >/dev/null
    RC=$?
  else
    su --preserve-environment --command="$INSTALL_HOME/bin/startUp.sh $INSTALL_HOME >/dev/null" $SERVICE_USER
    RC=$?
  fi
}
stop(){
  echo "Stoping $NAME running under $SERVICE_USER user ..."
  if [ "$CURRENT_USER" == "$SERVICE_USER" ]
  then
    $INSTALL_HOME/bin/shutDown.sh $INSTALL_HOME >/dev/null
    RC=$?
  else
    su --preserve-environment --command="$INSTALL_HOME/bin/shutDown.sh $INSTALL_HOME >/dev/null" $SERVICE_USER
    RC=$?
  fi
}
case $1 in
start)
  start
  exit $RC
;;
stop)
  stop
  exit $RC
;;
restart)
  stop
  start
  exit $RC
;;
status)
  if [ ! -f "$PIDDIR/$PID_FILENAME" ]
  then 
    echo "$NAME is NOT RUNNING"
    exit 1
  else
    PID=$(cat $PIDDIR/$PID_FILENAME)
    if ps -p $PID >/dev/null
    then
      echo "$NAME is RUNNING $PID"
      exit 0
    else
      echo "$NAME is NOT RUNNING"
      exit 1
    fi
  fi
;;
*)
# For invalid arguments, print the usage message.
echo "Usage: $0 {start|stop|restart|status}"
exit 2
;;
esac

In the init script you need to adjust INSTALL_HOME to the appropriate application installation folder, JAVA_HOME if it differs from the default, and SERVICE_USER to the user which is supposed to run this service. The service can be started under the root account, under SERVICE_USER without specifying a password, or under any other user with knowledge of the credentials.
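
On a RHEL/CentOS-style system the init script can then be registered to start on boot – a sketch (on Debian-style systems update-rc.d is the equivalent):

sudo chkconfig --add processor     # Debian equivalent: update-rc.d processor defaults
sudo chkconfig processor on
sudo service processor start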

If you have production-like experience with the Java service wrappers mentioned at the beginning of this article, don't hesitate to share it! This approach simply served the purpose in the given situation.

WebLogic classloading

Getting a java.lang.NoSuchMethodError is usually the beginning of a great exploration of your platform – in this case WebLogic. The Javadoc says:

Thrown if an application tries to call a specified method of a class (either static or instance), and that class no longer has a definition of that method.
Normally, this error is caught by the compiler; this error can only occur at run time if the definition of a class has incompatibly changed.

What the heck is going on here? The libraries used are embedded in the final archive – I did verify that! If you don't know what is happening, simply suspect the classloaders, publicly known enemies of Java developers 🙂 Rule no. 1 says: "Verify your assumptions." The fact that a class is present in the archive doesn't necessarily mean that it gets loaded, so to verify that, simply pass the -verbose or -verbose:class argument to WebLogic's JVM in the server start script in bin and you will get the origin of every loaded class.
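
A minimal sketch of wiring this into the start script – JAVA_OPTIONS is the variable name commonly used by WebLogic start scripts, so adjust to your environment:

# e.g. in the domain's bin/setDomainEnv.sh (or exported before starting the server)
JAVA_OPTIONS="$JAVA_OPTIONS -verbose:class"
export JAVA_OPTIONS
# each loaded class is then reported with its origin, e.g.
# [Loaded org.apache.log4j.Logger from file:/path/to/some.jar]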

A class loaded from WL_HOME/modules – how is that possible? To understand that, a general understanding of classloading is essential, and then an understanding of your particular J2EE implementation, e.g. WebLogic, JBoss, … This post does not pretend to expert-level knowledge of the topic, so I will rather stay with the general principles and refer to the detailed documentation.

Java has several class loaders (bootstrap, extension, …); the important fact is that they work in a hierarchy (a parent-child relationship) with a delegation scheme which says when to load a class and from where. The elementary Java delegation principle says: delegate finding classes and resources to the parent before searching your own classpath; only if the parent cannot find it is the child allowed to load it. So far so good. To complicate matters a bit more, the Java Servlet specification recommends looking at the child classloader before delegating to the parent (whether this recommendation is followed is something you need to check in the documentation of the J2EE implementation you are using – as you can see, you know nothing for sure based on these rules alone 🙂 ).

So, in my case of the WebLogic J2EE implementation, the system classloader is the parent of all the application's classloaders (details can be found in the WebLogic documentation). So how did the class get loaded from WL_HOME/modules? The framework library must be on the system classpath. But isn't the system classpath just weblogic.jar, not my framework library?
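
A quick way to check is to peek at weblogic.jar's manifest – a sketch, with the path depending on your installation (server/lib under WL_HOME in a default layout):

cd $WL_HOME/server/lib
unzip -p weblogic.jar META-INF/MANIFEST.MF | grep -A 20 "Class-Path"
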
WebLogic 10, in order to improve modularity, moved components under WL_HOME/modules, and weblogic.jar now refers to these components in the modules directory from its manifest Class-Path. That means another version of the library sits on the system classloader – the parent of all the application classloaders – and therefore the copies included in the application archives are ignored, based on the delegation scheme. (That was probably the reason why child-first delegation was recommended in the J2EE classloading scheme.) However, WebLogic does offer a way to handle this case: so-called classloader filters/interceptors, defined in a WebLogic-specific deployment descriptor either at ear level or at war level:
weblogic-application.xml:
<prefer-application-packages>
    <package-name>org.apache.log4j.*</package-name>
    <package-name>antlr.*</package-name>
</prefer-application-packages>
weblogic.xml:
<container-descriptor>
      <prefer-web-inf-classes>true</prefer-web-inf-classes>
</container-descriptor>

K-V pairs to Java bean re-mapping

From time to time you need on your projects to re-map key-value pairs to regular Java beans. One really nasty solution to this task is to do it manually. Yes, it works, but this approach is not flexible and, moreover, it is error prone: mapping each field is hard-coded, and when you add, remove or modify fields you have to correct all the hard-coded mappings, which is really awkward.
Another approach is to use some K-V-to-bean re-mapper, for example ObjectMapper from the Jackson JSON library; Commons BeanUtils offers some possibilities as well.
If for some reason you cannot use these libraries, e.g. a legal problem, or you simply don't find an implementation which suits your needs, then it is time for your own implementation.
The following example implementation re-maps to primitives, Strings and enums (from their string representation). Some highlights: Java 1.6 doesn't offer any way to find the wrapper class for a primitive, and wrapping any problem (Exception) into a bare RuntimeException is not a good approach – in real usage it is suggested to change this. In the context of this example I think it's fine.

import java.beans.IntrospectionException;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class KVBeanRemaper {
    // primitive type -> wrapper type lookup (Java 1.6 offers no built-in way to get this)
    private static final Map<Class<?>, Class<?>> wrappers = new HashMap<Class<?>, Class<?>>();

    static {
        wrappers.put(byte.class, Byte.class);
        wrappers.put(short.class, Short.class);
        wrappers.put(int.class, Integer.class);
        wrappers.put(long.class, Long.class);
        wrappers.put(float.class, Float.class);
        wrappers.put(double.class, Double.class);
        wrappers.put(boolean.class, Boolean.class);
    }

    @SuppressWarnings({"unchecked", "rawtypes"})
    public static <T> T remap(Map<String, ?> keyValue, final Class<T> classMapTo) {

        final Set<String> dataToMap = new HashSet<String>(keyValue.keySet());
        final Field[] fields;

        T res;
        try {
            res = classMapTo.newInstance();
            fields = classMapTo.getDeclaredFields();

            for (Field f : fields) {
                if (!dataToMap.contains(f.getName())) {
                    continue;
                } else {
                    final String key = f.getName();

                    if (f.getType().isEnum()) {
                        // enums are re-mapped from their (upper-cased) string representation
                        findAccessMethod(true, f, classMapTo).invoke(res,
                                Enum.valueOf((Class) f.getType(), keyValue.get(key).toString().toUpperCase()));
                        dataToMap.remove(key);
                    } else if (wrappers.containsKey(f.getType()) || f.getType() == String.class) {
                        Class<?> c = f.getType();
                        if (c.isPrimitive()) {
                            c = wrappers.get(c);
                        }
                        findAccessMethod(true, f, classMapTo).invoke(res, c.cast(keyValue.get(key)));
                        dataToMap.remove(key);
                    }
                }
            }
        } catch (Exception ex) {
            throw new RuntimeException("Error while remapping", ex);
        }
        if (dataToMap.size() > 0) {
            throw new RuntimeException("Complete fieldset hasn't been remapped");
        }
        return res;
    }

    private static Method findAccessMethod(boolean setter, final Field field, final Class<?> klazz) throws IntrospectionException {
        PropertyDescriptor pd = new PropertyDescriptor(field.getName(), klazz);
        if (setter) {
            return pd.getWriteMethod();
        } else {
            return pd.getReadMethod();
        }
    }
}

Spring JAX-WS timeout

When building a reliable, predictable solution you need to manage the "response time" as one of the core principles. This is accomplished by implementing a timeout policy – you don't want clients hanging on a connection forever. Nowadays a very common approach to integration is to take advantage of the Spring framework.

Spring's JAX-WS web service proxy (JaxWsPortProxyFactoryBean) doesn't offer a direct way to set a service timeout via one of its properties. The following lines document one possibility for coping with that requirement.

Java implementation:

import org.springframework.remoting.jaxws.JaxWsPortProxyFactoryBean;

// Extends Spring's JAX-WS proxy factory with a single "timeout" property which is passed
// to the underlying JAX-WS request context as custom properties.
public class AbstractJaxWsPortProxyFactoryBean extends JaxWsPortProxyFactoryBean {

    public static final int CONNECT_TIMEOUT = 2500;

    public void setTimeout(final int timeout) {
        // JAX WS
        addCustomProperty("com.sun.xml.ws.connect.timeout", CONNECT_TIMEOUT);
        addCustomProperty("com.sun.xml.ws.request.timeout", timeout);
        // Sun JAX WS
        addCustomProperty("com.sun.xml.internal.ws.connect.timeout", CONNECT_TIMEOUT);
        addCustomProperty("com.sun.xml.internal.ws.request.timeout", timeout);
    }

    @Override
    public void afterPropertiesSet() {
        super.afterPropertiesSet();
    }
}

Spring configuration:

     <bean id="abstractWsClient" class="AbstractJaxWsPortProxyFactoryBean" abstract="true">
        <property name="timeout" value="10000"/>
    </bean>

    <bean id="LocalServiceBinding" parent="abstractWsClient">
        <description>Client</description>
        <property name="serviceInterface" value="cz.Sample.Example.ServiceInterface"/>
        <property name="wsdlDocumentUrl" value="classpath:ServiceInterface.wsdl"/>

        <property name="namespaceUri" value="http://cz/Sample/Example/ServiceInterface"/>

        <property name="serviceName" value="ServiceInterface"/>
        <property name="endpointAddress" value="${localservice.url}"/>
        <property name="timeout" value="${localservice.timeout}"/>
        <property name="username" value="${localservice.user}"/>
        <property name="password" value="${localservice.password}"/>
    </bean>