One Xpand user less, one happy Xtend user more

This week I am consulting for a customer who introduced a model-based development approach almost 10 years ago and has used it successfully ever since. Back then, Xpand was the most powerful code generation engine, and UML models were commonly used as the source for code generation. Xtext did not even exist at that time. The customer uses Enterprise Architect and the Enterprise Architect exporter from the components4oaw. Its last release was in 2011, and the project has not been developed further. For my customer the component just did what it should, so there was no direct need to change anything about the process. The EA exporter has its flaws, though, especially since it needs an Enterprise Architect installation, which means it only works on Windows machines. For Enterprise Architect users who do model-based development, we therefore offer the YAKINDU Enterprise Architect Bridge, which scales better and can process .eap models on any platform.

My task was to help the customer modernize their tool chain. To be fair, Xpand is not the right choice anymore. The Xtend language combines the strengths of Xpand (great templating support, functional programming, static typing, polymorphic dispatch, mature Eclipse tooling) and resolves some of its weaknesses (performance, compiled instead of interpreted code, Java integration, extensibility of expressions).

For the customer, who had never used Xtend before but was quite familiar with Xpand, it was quite a surprise how close the two languages really are. Most of the concepts can be mapped 1:1 from good old Xpand to Xtend. For demonstration we created a small generator from scratch, copied over functions and templates, and translated them manually. They understood Xtend within minutes. Most of the work is monkey-see-monkey-do. For such cases I wrote a small migration script which can translate Xpand, Xtend(1) and Check code to Xtend(2) classes. We used that script for an initial translation.

One of the reasons why this migration script is not published is that it cannot translate Xpand code completely to Xtend. The tool parses Xpand templates and traverses the AST to transform the expressions to Xtend equivalents. But it does not know anything about the type system used by the generator.

And here Xtend is not as powerful as Xpand, especially when using UML. In Xpand, the UML type system adapter analyzed the applied profiles of a model to create virtual types for stereotypes. Elements with stereotypes applied could be processed as if they were of a subtype of the extended type, and tagged values became attributes. In Xtend, there is only the Java type system, and for processing UML models this means that templates have to use the UML metamodel directly. The old “feeling” of a real type system can be simulated with a set of extension functions. I usually introduce an extension class per profile which offers, for example, a method per tagged value. The creation of such an extension class can in turn be automated by generating it from the profile’s .uml model.
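
To give an impression, here is a minimal sketch of such an extension class, assuming a hypothetical profile “persistence” with a stereotype “Entity” that owns a tagged value “tableName” (the profile, stereotype, property and class names are made up; only the UML2 calls getAppliedStereotype() and getValue() are the actual API):

import org.eclipse.uml2.uml.Stereotype;

public class PersistenceProfileExtensions {

	// true if the (hypothetical) stereotype persistence::Entity is applied
	public static boolean isEntity(org.eclipse.uml2.uml.Class clazz) {
		return clazz.getAppliedStereotype("persistence::Entity") != null;
	}

	// reads the (hypothetical) tagged value "tableName" of that stereotype
	public static String tableName(org.eclipse.uml2.uml.Class clazz) {
		Stereotype stereotype = clazz.getAppliedStereotype("persistence::Entity");
		return stereotype != null ? (String) clazz.getValue(stereotype, "tableName") : null;
	}
}

Used as a static extension in an Xtend template (import static extension PersistenceProfileExtensions.*), calls like clazz.isEntity or clazz.tableName read almost like the stereotype-based navigation known from Xpand.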

Another disadvantage compared to the Xpand framework is that Xtend is not a code generation framework but a general-purpose programming language. What a generator component that invokes the templates looks like, where output is written to, how constraint checks are integrated: none of this is provided.

What I usually do here is use Xtext’s infrastructure, namely the IGenerator interface and validation based on Xtext’s AbstractDeclarativeValidator and @Check annotations. I also reuse Xtext’s MWE Reader and Generator components. However, this requires some work and advanced knowledge. The basic approach is to have Xtext recognize UML models as generic EMF resources. To actually use the Xtext components, a generator-specific Guice module has to be created which extends AbstractGenericResourceRuntimeModule and satisfies several dependencies that an Xtext language already configures by default. A good part of the approach was described by Christian Dietrich in his blog posts “Xtend2 Code Generators with Non-Xtext Models” and “Xtext 2.0 and UML“. I may go into details in a later blog post.
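
A stripped-down sketch of such a module could look like the following; the class names UmlGeneratorModule and UmlGenerator are made up for illustration, and the registered language name is an assumption that may need to be adjusted to the concrete setup:

import org.eclipse.xtext.generator.IGenerator;
import org.eclipse.xtext.resource.generic.AbstractGenericResourceRuntimeModule;

public class UmlGeneratorModule extends AbstractGenericResourceRuntimeModule {

	@Override
	protected String getLanguageName() {
		// assumption: the name under which the UML resource type is registered
		return "org.eclipse.uml2.uml.UML";
	}

	@Override
	protected String getFileExtensions() {
		return "uml";
	}

	// bind the (made-up) Xtend class that implements the generator templates
	public Class<? extends IGenerator> bindIGenerator() {
		return UmlGenerator.class;
	}
}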

Although Xtend is such a natural choice for writing code generators and is so well integrated into the Xtext ecosystem, framework support for non-Xtext models is missing. It is easy to write a trivial framework, but why should everyone start writing their own when good infrastructure already exists in the Xtext framework? From what I have experienced once again, there is a need for more framework support for this use case. I doubt that it could be part of Xtext itself, but maybe we will provide a framework for Xtend-based code generators in the future.

In the end we were able to demonstrate a migration path in a single workshop day. The customer was happy to save several days or weeks of time; the workshop costs were more than compensated for them. From a sales perspective it might not be wise to leave a customer in a state where they will not depend on our services for a while, but that is not how we work. To me, it feels right.

Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime.

Xbase Customization: Redefining operator keywords

If you use Xbase in your Xtext-based DSL, you are usually satisfied with the set of operators the expression language defines. They closely match what you are used to from Java and similar languages.

However, in a workshop a customer wished to have custom keywords for some operators. For example, the operator && should alternatively be written as AND, and the || operator as OR.

To demonstrate this customization we’ll start with Xtext’s famous Domainmodel Example. The Domainmodel.xtext grammar is derived from org.eclipse.xtext.xbase.Xbase and the Operation rule uses an XBlockExpression for the Operation’s body:

grammar org.eclipse.xtext.example.domainmodel.Domainmodel with org.eclipse.xtext.xbase.Xbase

...

Operation:
	'op' name=ValidID '(' (params+=FullJvmFormalParameter (',' params+=FullJvmFormalParameter)*)? ')' (':' type=JvmTypeReference)?
		body=XBlockExpression;

Unit Test

We’ll extend the language test-driven: first we create a unit test that uses the new feature and that will fail until we have implemented it successfully. Fortunately there is already a suitable test class in the project org.eclipse.xtext.example.domainmodel.tests.

We extend the class ParserTest.xtend:

	@Test
	def void testOverriddenKeyword() {
		val model = '''
			package example {
			  entity MyEntity {
			    property : String
			    op foo(String s) {
			    	return s != null && s.length > 0 AND s.startsWith("bar")
			    }
			  }
			}
		'''.parse
		val pack = model.elements.head as PackageDeclaration
		val entity = pack.elements.head as Entity
		val op = entity.features.last as Operation
		val method = op.jvmElements.head as JvmOperation
		model.eResource.assertNoErrors
		Assert::assertEquals("boolean", method.returnType.simpleName)
	}

Note the Operation body: the expression uses both representations of the And operator.

return s != null && s.length > 0 AND s.startsWith("bar")

When the ParserTest is executed it will now fail, of course, but only for the AND keyword:

java.lang.AssertionError: Expected no errors, but got :
ERROR (org.eclipse.xtext.diagnostics.Diagnostic.Linking) 
'The method or field AND is undefined' 
on XFeatureCall, offset 119, length 3
ERROR (org.eclipse.xtext.xbase.validation.IssueCodes.unreachable_code) 'Unreachable expression.' on XFeatureCall, offset 119, length 3

	at org.junit.Assert.fail(Assert.java:88)
	at org.eclipse.xtext.junit4.validation.ValidationTestHelper.assertNoErrors(ValidationTestHelper.java:187)
	at org.eclipse.xtext.example.domainmodel.tests.ParserTest.testOverriddenKeyword(ParserTest.java:268)
...

Overriding the operator rules

Looking at Xbase.xtext shows that Xbase defines separate data type rules for operators:

OpOr:
	'||';
OpAnd:
	'&&';

Xtext’s grammar mixin feature allows overriding those rules. The obvious customization in Domainmodel.xtext is:

OpOr:
	'||' | 'OR';
OpAnd:
	'&&' | 'AND';

Regenerating the language’s Xtext implementation makes those keywords available to the syntax. However, the unit test still fails, but now with a different error:

java.lang.AssertionError: Expected no errors, but got :
ERROR (org.eclipse.xtext.diagnostics.Diagnostic.Linking) 
'AND cannot be resolved.' on XBinaryOperation, offset 119, length 3

	at org.junit.Assert.fail(Assert.java:88)
	at org.eclipse.xtext.junit4.validation.ValidationTestHelper.assertNoErrors(ValidationTestHelper.java:187)
	at org.eclipse.xtext.example.domainmodel.tests.ParserTest.testOverriddenKeyword(ParserTest.java:268)

OperatorMapping

The missing piece is a customization of class OperatorMapping. Thus we create a subclass OperatorMappingCustom with constant QualifiedNames for the additional operator keywords and bind it in DomainmodelRuntimeModule:

@Singleton
public class OperatorMappingCustom extends OperatorMapping {
	public static final QualifiedName AND_2 = create("AND");
	public static final QualifiedName OR_2 = create("OR");
}
public class DomainmodelRuntimeModule extends AbstractDomainmodelRuntimeModule {
	[...]
	public Class<? extends OperatorMapping> bindOperatorMapping() {
		return OperatorMappingCustom.class;
	}
}

A naive approach here is to override initializeMapping(), as the Javadoc suggests (“Clients may want to override #initializeMapping() to add other operators.“).
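
A minimal sketch of that naive override (reconstructed for illustration, not the exact original code, and assuming access to the protected map field of OperatorMapping) would look roughly like this:

	@Override
	protected void initializeMapping() {
		super.initializeMapping();
		// map the new keywords to the same method names as && and ||
		map.put(AND_2, create("operator_and"));
		map.put(OR_2, create("operator_or"));
	}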

But this fails again:

com.google.inject.CreationException: Guice creation errors:

1) Error injecting constructor, java.lang.IllegalArgumentException: value already present: operator_and
  at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.<init>(Unknown Source)
  at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.class(Unknown Source)
  while locating org.eclipse.xtext.example.domainmodel.OperatorMappingCustom
  while locating org.eclipse.xtext.xbase.scoping.featurecalls.OperatorMapping
    for field at org.eclipse.xtext.xbase.util.XExpressionHelper.operatorMapping(Unknown Source)
  while locating org.eclipse.xtext.xbase.util.XExpressionHelper
    for field at org.eclipse.xtext.xbase.validation.XbaseValidator.expressionHelper(Unknown Source)
  at org.eclipse.xtext.service.MethodBasedModule.configure(MethodBasedModule.java:57)
  while locating org.eclipse.xtext.example.domainmodel.validation.DomainmodelJavaValidator
Caused by: java.lang.IllegalArgumentException: value already present: operator_and
	at com.google.common.collect.HashBiMap.put(HashBiMap.java:237)
	at com.google.common.collect.HashBiMap.put(HashBiMap.java:214)
	at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.initializeMapping(OperatorMappingCustom.java:25)
	at org.eclipse.xtext.xbase.scoping.featurecalls.OperatorMapping.<init>(OperatorMapping.java:121)
	at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.<init>(OperatorMappingCustom.java:18)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	...

Instead, the method getMethodName() must be overridden to delegate invocations for the new operators to the existing ones. The resulting OperatorMappingCustom then looks like this:

@Singleton
public class OperatorMappingCustom extends OperatorMapping {
	public static final QualifiedName AND_2 = create("AND");
	public static final QualifiedName OR_2 = create("OR");

	@Override
	public QualifiedName getMethodName(QualifiedName operator) {
		if (AND_2.equals(operator)) {
			return getMethodName(AND);
		}
		if (OR_2.equals(operator)) {
			return getMethodName(OR);
		}
		return super.getMethodName(operator);
	}
}

Finally, the unit test executes successfully.

Summary

Xbase allows customization of operators by overriding the operator’s data type rule from Xbase.xtext. This adds the operator keywords to the language’s syntax, but linking still fails at runtime. Additionally, the class OperatorMapping must be customized and the method getMethodName() overridden.

Redirecting Maven transfer messages to a file

One thing that often bothers me about Maven is the extensive logging of download messages. Usually I am not interested in these messages unless something is really wrong, and then it is important to know which URLs are accessed for download.

Before Maven 3.1 there was little chance to influence this behavior through the CLI. There were only the options “-q” (quiet) and “-B” (batch mode), which influence the TransferListener implementation used by the Maven main class MavenCli.

(from MavenCli 3.0.4):

if ( quiet )
{
    transferListener = new QuietMavenTransferListener();
}
else if ( request.isInteractiveMode() )
{
    transferListener = new ConsoleMavenTransferListener( System.out );
}
else
{
    transferListener = new BatchModeMavenTransferListener( System.out );
}

With Maven 3.1 it was decided to use SLF4J as the logging API. When using batch mode (-B), Maven uses the Slf4jMavenTransferListener for logging, which is determined by the method getBatchTransferListener().

(from MavenCli 3.2.5):

if ( quiet )
{
    transferListener = new QuietMavenTransferListener();
}
else if ( request.isInteractiveMode() && !cliRequest.commandLine.hasOption( CLIManager.LOG_FILE ) )
{
    //
    // If we're logging to a file then we don't want the console transfer listener as it will spew
    // download progress all over the place
    //
    transferListener = getConsoleTransferListener();
}
else
{
    transferListener = getBatchTransferListener();
}


protected TransferListener getBatchTransferListener()
{
    return new Slf4jMavenTransferListener();
}

By default, the SLF4J SimpleLogger is used, which can be configured by the file

<MVN_HOME>/conf/logging/simplelogger.properties

This already allows some decent influence on the message layout, thresholds, etc. Transfer messages can be suppressed by adding this property to the file:

org.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn

All info-level transfer messages will then be suppressed, but this also covers upload messages when deploying artifacts.

For a more advanced setup, the underlying logging framework can be replaced, e.g. with Log4J2. To do so, follow these simple steps:

1) delete lib/slf4j-simple-<version>.jar
2) add to the lib/ folder:
log4j-slf4j-impl-2.2.jar
log4j-api-2.2.jar
log4j-core-2.2.jar
3) add the following file log4j2.xml to the conf/logging folder:


<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="INFO">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <!-- layout see http://logging.apache.org/log4j/2.0/manual/layouts.html -->
      <PatternLayout pattern="[%level] %msg%n" />
    </Console>
    <File name="TransferLog" fileName="mvn_transfer.log" immediateFlush="false" append="false">
      <PatternLayout pattern="%msg%n"/>
    </File>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console" />
    </Root>
    <Logger name="org.apache.maven.cli.transfer.Slf4jMavenTransferListener" level="info" additivity="false">
      <AppenderRef ref="TransferLog" />
    </Logger>
  </Loggers>
</Configuration>

As a result, all transfer messages will be redirected to file “mvn_transfer.log”, while all other messages go to the console.

Fornax MWE Workflow Maven Plugin 3.5.1 published on Maven Central

The Fornax Workflow plugin is a Maven plugin that executes MWE/MWE2 workflows within Maven. It has been around for quite some years now, and whoever needed to integrate MWE/MWE2 workflows into a headless build was likely using it. The Fornax Platform has been a home for components around openArchitectureWare, Xpand and Xtext. While the other subprojects don’t play a role anymore, the Workflow plugin is still in frequent use.

Over the years we had to change the underlying infrastructure several times. The plugin was hosted on the project’s own repository server, and projects using the plugin had to configure an additional plugin repository either in their POMs or in settings.xml. This was undesirable, but in the end not really a blocker. However, with a recent change of the repository manager, users experienced problems accessing the Fornax repository http://www.fornax-platform.org/nexus/content/groups/public. Currently, this URL is redirected to a server hosted at itemis, and users might run into problems with the HTTPS connection.

It always bothered me that we had to host this plugin on a separate repository, and since it is a widely used component, it is only logical that it should be available from the Central repository. But it was never a blocker for me. Now I finally had a reason to change this.

Long story short, the plugin is now published on Maven Central as version 3.5.1. I highly recommend upgrading to this version and removing the Fornax Maven repository from your configuration. The coordinates did not change; they are still org.fornax.toolsupport:fornax-oaw-m2-plugin. I would like to change them sometime in the future (e.g. the name parts “oaw” and “m2” are no longer up to date), maybe along with moving development to another project hosting platform.

Version 3.5.1 does not differ much from 3.4.0, which is the version most users are likely on today. The main work was refactoring the POM and its parent in order to meet the requirements for deployment on Maven Central. Further, I worked on automating the release process with the maven-release-plugin.

There is one additional feature in 3.5.x: the new property useTestScope can be used to exclude test-scope dependencies when computing the Java classpath used to execute a workflow in forked mode. On Windows systems the classpath sometimes exceeds the allowed command-line length, especially since the local Maven repository is below the user’s home directory by default, which already has a rather long path prefix. The plugin now excludes these test-scope dependencies by default unless the property is configured explicitly. In 3.5.0 there was a small logical bug in this feature which made the plugin unusable, so please do not use that version. Version 3.5.1 can be used without problems by everyone who has been using 3.4.0 so far.
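
For reference, a hedged example of how the property could be set in a POM; the configuration block shown is illustrative, only the plugin coordinates and the useTestScope property name are taken from the description above:

<plugin>
	<groupId>org.fornax.toolsupport</groupId>
	<artifactId>fornax-oaw-m2-plugin</artifactId>
	<version>3.5.1</version>
	<configuration>
		<!-- illustrative: put test-scope dependencies back on the forked classpath -->
		<useTestScope>true</useTestScope>
	</configuration>
</plugin>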

svn: missing argument: --password

I am currently setting up the Maven Release Plugin for a project which is stored in an SVN repository. The plugin needs to make modifications in the repository, for which it executes an svn command. It gets the credentials from ~/.m2/settings.xml, and the password is passed with the --password parameter. In the logged command line the password is masked.

Now I ran into the problem that the svn command fails with the message

svn: missing argument: --password

The complete output is:

[INFO] Executing: /bin/sh -c cd /Users/thoms/Development/projects/fornax/ws/fornax-parent && svn --username kthoms --password '*****' --no-auth-cache --non-interactive status
[INFO] Working directory: /Users/thoms/Development/projects/fornax/ws/fornax-parent
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.031s
[INFO] Finished at: Wed Nov 26 09:39:38 CET 2014
[INFO] Final Memory: 10M/24M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-release-plugin:2.5.1:prepare (default-cli) on project fornax-parent: Unable to check for local modifications
[ERROR] Provider message:
[ERROR] The svn command failed.
[ERROR] Command output:
[ERROR] svn: missing argument: --password
[ERROR] Geben Sie »svn help« für weitere Hilfe ein.

I could not explain this, since obviously an svn command was executed and my password was passed to it. So I copied just the svn command and executed it directly from the command line:

svn --username kthoms --password MYPLAINPASSWORD --no-auth-cache --non-interactive update

Same effect.

Finally I found out that the password itself was causing the problem. Without revealing too much I can say it started with the ‘#’ character, which the shell interpreted as the start of a comment, so the actual password argument was never passed; quoting the password on the command line avoids this. In the end this is completely logical, but I did not think about that scenario when choosing the password. And the error message was a bit misleading here.

grep command to filter distinct values from XML tags

I have a ton of Oracle Forms XML export files and wanted to know which different patterns occur as values of the FormatMask XML attribute. The input looks as follows:

<Item Name="CREATION_DATE" UpdateAllowed="false" DirtyInfo="false" Visible="false" QueryAllowed="false" InsertAllowed="false" Comment="TABLE ALIAS&amp;#10;  FDA&amp;#10;&amp;#10;BASED ON TABLE&amp;#10;  TMI_FINANCIAL_DATA&amp;#10;&amp;#10;COLUMN USAGES&amp;#10;  ...    CREATION_DATE                 SEL&amp;#10;" ParentModule="OBJLIB1" Width="10" Required="false" ColumnName="CREATION_DATE" DataType="Date" ParentModuleType="25" Label="Creation Date" ParentType="15" ParentName="QMSSO$QUERY_ONLY_ITEM" MaximumLength="10" PersistentClientInfoLength="142" ParentFilename="tmiolb65_mla.olb" FormatMask="DD-MM-RRRR">

A naive grep command would print the whole matching line, including the file name. After some iterations I came to the following command, which does what I want in a single pipeline.

grep -R -h -o -e 'FormatMask="[^"]*' * | sed 's/FormatMask="//g' | sort | uniq

What the command does is:

  • grep recursively (-R) for a regular expression (-e)
  • search for FormatMask="<any-char-until-quotation>
  • print only the matching part of the line (-o). This will include the prefix FormatMask="
  • print without the file name (-h)
  • strip off the prefix with sed
  • sort the results alphabetically
  • remove duplicate lines (uniq)

The result (excerpt) is:

00
09
099
0999
0999999
09999999
0D0
0D999
0D9999
9
90
90D0
90D000
...

Enabling Spring in Scout applications

Today I am attending the first Scout User Day 2014 in Ludwigsburg, which is aligned with EclipseCon Europe 2014 starting tomorrow. Yesterday we had a pre-event dinner with some attendees and the organizers at the Rossknecht restaurant. I got into a chat with Nejc Gasper, who will give a talk titled “Build a Scout backend with Spring” today. I was a bit surprised when he told me he had not managed to get Spring’s classpath scanning working yet. Since we are doing this in our application, I think it is worth writing down what we had to do to get it working. The goal in our application is primarily to use Spring as the dependency injection container, since the customer uses Spring in all their other Java-based applications and wanted us to do so as well.

Spring Configuration

The Spring configuration files are located in the folder META-INF/spring of the *.client, *.shared and *.server projects. In these configuration files we mainly activate classpath scanning:

<?xml version="1.0" encoding="UTF-8"?>
<beans:beans xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:beans="http://www.springframework.org/schema/beans" xmlns:p="http://www.springframework.org/schema/p"
    xmlns:context="http://www.springframework.org/schema/context"
    xsi:schemaLocation="
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.1.xsd">

    <context:annotation-config />
    <context:component-scan base-package="com.rhenus.fl" />
    <!--  http://docs.spring.io/spring/docs/4.0.x/spring-framework-reference/html/validation.html#core-convert-Spring-config -->
    <beans:bean id="conversionService" class="org.springframework.core.convert.support.DefaultConversionService" />
</beans:beans>

The next important thing is to copy the files spring.handlers, spring.schemas and spring.tooling into the META-INF folder. The files can be found in the META-INF directory of the bundle org.springframework.context. Without doing this, you will get errors like the following while loading the Spring configuration:

Caused by: org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: Unable to locate Spring NamespaceHandler for XML schema namespace [http://www.springframework.org/schema/context]
Offending resource: URL [bundleresource://9.fwk1993775065:1/META-INF/spring/fl_client.xml]
	at org.springframework.beans.factory.parsing.FailFastProblemReporter.error(FailFastProblemReporter.java:70)
	at org.springframework.beans.factory.parsing.ReaderContext.error(ReaderContext.java:85)
	at org.springframework.beans.factory.parsing.ReaderContext.error(ReaderContext.java:80)
	at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.error(BeanDefinitionParserDelegate.java:316)
	at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1421)
	at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1414)
	at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:187)
	at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.doRegisterBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:141)
	at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:110)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:508)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:391)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:335)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:303)
	at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:180)
	at org.springframework.context.support.GenericXmlApplicationContext.load(GenericXmlApplicationContext.java:116)

Bundle Activator

The Spring configuration files are loaded in the Bundle Activator classes of the three “main” Scout projects (client/shared/server). The Activator can then also be used to access the ApplicationContext. We use a GenericXmlApplicationContext to initialize the context from the XML configuration above. One important thing is that this context must use the ClassLoader of the Activator; otherwise you will again get the error mentioned in the section above. The Activator class then looks as follows:


public class Activator extends Plugin {

 // The plug-in ID
 public static final String PLUGIN_ID = "com.rhenus.fl.application.client";
 public final static String SPRING_CONFIG_FILE = "META-INF/spring/fl_client.xml";

 // The shared instance
 private static Activator plugin;
 private ApplicationContext ctx;

 @Override
 public void start(BundleContext context) throws Exception {
  super.start(context);
  plugin = this;
  init(context);
 }

 @Override
 public void stop(BundleContext context) throws Exception {
  plugin = null;
  super.stop(context);
 }

 public static Activator getDefault() {
  return plugin;
 }

 private void init(BundleContext context) {
  URL url = getClass().getClassLoader().getResource(SPRING_CONFIG_FILE);

  UrlResource usr = new UrlResource(url);

  ctx = new GenericXmlApplicationContext() {
   @Override
   public ClassLoader getClassLoader() {
    return Activator.class.getClassLoader();
   }
  };
  ((GenericXmlApplicationContext) ctx).load(usr);
  ((AbstractApplicationContext) ctx).refresh();
 }

 public ApplicationContext getContext() {
  return ctx;
 }
}

Service Factory

In order to use dependency injection in Scout services, the services themselves must be instantiated through the Spring ApplicationContext. The default implementation is of course not aware of Spring, so we need to customize this. Unfortunately we have to copy the class org.eclipse.scout.rt.server.services.ServerServiceFactory. We only need to exchange a single line in the method updateInstanceCache(), where the service is instantiated, but this method is private in Scout. The line

m_service = m_serviceClass.newInstance();

is replaced by

m_service = getContext().getBean(m_serviceClass);

Since we have to provide different ApplicationContexts in the different plugins, we put this into the abstract class AbstractSpringAwareServerServiceFactory (full code):

public abstract class AbstractSpringAwareServerServiceFactory implements IServiceFactory {

 private void updateInstanceCache(ServiceRegistration registration) {
  synchronized (m_serviceLock) {
   if (m_service == null) {
    try {
     // CUSTOMIZING BEGIN
//     m_service = m_serviceClass.newInstance();
     m_service = getContext().getBean(m_serviceClass);
     // CUSTOMIZING END
     if (m_service instanceof IService2) {
      ((IService2) m_service).initializeService(registration);
     } else if (m_service instanceof IService) {
      ((IService) m_service).initializeService(registration);
     }
    } catch (Throwable t) {
     LOG.error("Failed creating instance of " + m_serviceClass,
       t);
    }
   }
  }
 }
 
 // CUSTOMIZING BEGIN
 protected abstract ApplicationContext getContext();
 // CUSTOMIZING END

}

The concrete classes implement the method getContext() by retrieving the context from the Bundle Activator:

public class ServerServiceFactory extends AbstractSpringAwareServerServiceFactory {

  /**
   * @param serviceClass
   */
  public ServerServiceFactory(Class<?> serviceClass) {
    super(serviceClass);
  }

  @Override
  protected ApplicationContext getContext() {
    return Activator.getDefault().getContext();
  }

}

plugin.xml

The service factory class implemented above must be used now to create the services. This is done in the plugin.xml file:

<service
 factory="com.rhenus.fl.application.server.services.ServerServiceFactory"
 class="com.rhenus.fl.tmi.server.tmirln010.TMIRLN010Service"
 session="com.rhenus.fl.application.server.ServerSession">
</service>

Use Dependency Injection

Now we are finally able to use dependency injection via javax.inject.Inject in Scout services.

import org.springframework.stereotype.Component;
import javax.inject.Inject;
...

@Component
@InputValidation(IValidationStrategy.PROCESS.class)
public class TMIRLN010Service extends AbstractTMIRLN010Service {
  @Inject
  protected ConversionService conversionService;
  ...
}

Go!

If everything is correct, you will see the following lines in the console when starting up the Scout application:

Okt 27, 2014 8:45:28 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from URL [bundleresource://9.fwk1993775065:1/META-INF/spring/fl_client.xml]
Okt 27, 2014 8:45:29 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from URL [bundleresource://10.fwk1993775065:1/META-INF/spring/fl_shared.xml]
Okt 27, 2014 8:45:29 AM org.springframework.context.support.AbstractApplicationContext prepareRefresh
INFO: Refreshing com.rhenus.fl.application.shared.Activator$1@b40d694: startup date [Mon Oct 27 08:45:29 CET 2014]; root of context hierarchy
Okt 27, 2014 8:45:29 AM org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor 
INFO: JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
Okt 27, 2014 8:45:29 AM org.springframework.context.support.AbstractApplicationContext prepareRefresh
INFO: Refreshing com.rhenus.fl.application.client.Activator$1@292f062b: startup date [Mon Oct 27 08:45:29 CET 2014]; root of context hierarchy
Okt 27, 2014 8:45:29 AM org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor 
INFO: JSR-330 'javax.inject.Inject' annotation found and supported for autowiring