EclipseCon Europe 2016 – I’m coming!


I’m sitting in the train now, on my way to Ludwigsburg. Of course I cannot miss this great event, where so many smart people and friends come together. Although ECE takes place at the same time every year and I have long known that I wanted to go again, it does not come at the best time for me – or rather: for my family. The reason is a really positive one: almost 3 weeks ago, on October 4th, I became a father for the second time, of a lovely daughter, Sophia.


I already miss her deeply, and my wife could use some helping hands at home right now. However, she understands that EclipseCon is important to me and supports me. I did not get much sleep over the past weeks, but the reason was mostly not Sophia – it was mainly preparing my talks, for which the late evenings and nights were the most suitable time. I will definitely enjoy EclipseCon, but this year I will also be happy when I can head home again; I will take 3 days off afterwards to recover and care for my family.

This year’s EclipseCon will be fully packed for me. Today, on Sunday, it starts with a small pre-conference event at the Rossknecht, where the Eclipse Scout community comes together. Tomorrow, on the Unconference day, I will attend the Eclipse Scout User Day, which I have already attended the past two years. At the moment I do not work with Scout, but I really enjoyed working with this framework in the past and would like to do another project with it. Scout has evolved a lot over the past year, and I am keen to catch up on all the news.

On Tuesday the Xtext developers plan to hold a BoF session. A beta of Xtext 2.11 was released this week, and there is still much work to do to round off the 2.11 release. I plan to invest quite some time in this, and we have to talk about the concrete tasks and collaboration. Since we are now a team spanning several companies, it is important to have the chance to get the whole team together at EclipseCon.

On Wednesday it is time for action. I was recently notified that the proposal for the session “Recipes to build Code Generators for Non-Xtext Models with Xtend” was picked from the waiting list. I will give the talk together with my colleague and friend Holger Schill.


We give this talk because Xtend is a very nice language for developing template-based code generators, but it is mostly used only in the context of Xtext. Xtext projects seamlessly integrate a generator infrastructure with Xtend, but it is not that common to use Xtend-based generators with models that are not Xtext DSL files. We will show how simple it can be to use Xtend for other use cases, e.g. with JSON as input.
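The core idea can be sketched in a few lines. The following plain-Java sketch stands in for an Xtend template (which would use rich strings instead of a StringBuilder); the class name, the Map-based “parsed JSON” input, and the entity/properties shape are all hypothetical examples, not the talk’s actual code – a real setup would parse the JSON with a library first:

```java
import java.util.List;
import java.util.Map;

// Sketch: a template-based generator whose input model is JSON-like data
// (here a Map, standing in for a parsed JSON document) instead of an Xtext
// DSL resource. In Xtend the template body would be a rich string.
public class JsonToJavaGenerator {

    // Generates a simple Java class from an "entity" description.
    public static String generate(Map<String, Object> entity) {
        StringBuilder sb = new StringBuilder();
        sb.append("public class ").append(entity.get("name")).append(" {\n");
        @SuppressWarnings("unchecked")
        List<Map<String, String>> props =
                (List<Map<String, String>>) entity.get("properties");
        for (Map<String, String> p : props) {
            sb.append("    private ").append(p.get("type"))
              .append(" ").append(p.get("name")).append(";\n");
        }
        sb.append("}\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> entity = Map.of(
                "name", "Person",
                "properties", List.of(
                        Map.of("name", "name", "type", "String"),
                        Map.of("name", "age", "type", "int")));
        System.out.print(generate(entity));
    }
}
```

The point is that nothing in the generator depends on an Xtext grammar – any model you can load into Java objects can drive the templates.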

After that talk we will participate in the Modeling Symposium (17:45, Theater Stage). There we will briefly (it is only a 7-minute slot) present a generator fragment that creates an extension package for Visual Studio Code to embed support for a DSL with an embedded language server. Language Server Protocol support is the main feature of Xtext 2.11. We plan to contribute the created generator fragment to the Xtext project.

On Thursday it is time for my talk “From stairway to heaven onto the highway to hell with Xtext” (11:00, Theater Stage). In this talk I will first explain why I love Xtext and why it is used successfully in so many projects, and then discuss where users run into trouble when using the framework. We see in many projects that the first steps are easily done and don’t require much experience, but as requirements grow, the complexity of DSL projects grows as well, and extensive experience with the details of Xtext and the technologies behind it becomes crucial. I hope I have compiled an informative set of issues.


One Xpand user less, one happy Xtend user more

This week I am consulting for a customer who introduced a model-based development approach almost 10 years ago and has used it with success since then. Back then, Xpand was the most powerful code generation engine, and UML models were often used as the source to generate code from. Xtext did not even exist at that time. The customer uses Enterprise Architect and the Enterprise Architect exporter from the components4oaw. Its last release was in 2011, and the project has not been developed further. For my customer the component just did what it should, so there was no direct need to change anything about the process. The EA exporter has its flaws, though – especially since it needs an Enterprise Architect installation, which means it only works on Windows machines. For Enterprise Architect users who do model-based development, we therefore offer the YAKINDU Enterprise Architect Bridge, which scales better and can process .eap models on any platform.

My task was to help the customer modernize their tool chain. To be fair, Xpand is not the right choice anymore. The Xtend language combines the strengths of Xpand (great templating support, functional programming, static typing, polymorphic dispatch, mature Eclipse tooling) and resolves some of its weaknesses (performance, compiled code instead of interpreted, Java integration, extensibility of expressions).

For the customer, who had never used Xtend but was quite familiar with Xpand, it was quite a surprise how close the two languages really are. Most concepts can be mapped 1:1 from good old Xpand to Xtend. For demonstration, we created a small generator from scratch, copied functions and templates over, and translated them manually. They understood Xtend within minutes. Most of the work is monkey-see-monkey-do. For such cases I wrote a small migration script which can translate Xpand, Xtend(1), and Check code to Xtend(2) classes. We used that script for an initial translation.

One of the reasons why this migration script is not published is that it cannot translate Xpand code completely to Xtend. The tool parses Xpand templates and traverses the AST to transform the expressions to Xtend equivalents, but it does not know about the type system used by the generator.

And here Xtend is not as powerful as Xpand, especially when using UML. In Xpand, the UML type system adapter analyzed the applied profiles of a model to create virtual types for stereotypes. Elements with stereotypes applied could be processed as if they were of a subtype of the extended type, and tagged values became attributes. In Xtend, there is only the Java type system, and for processing UML models this means that templates have to use the UML metamodel directly. The old “feeling” of a real type system can be simulated with a set of extension functions. I usually introduce one extension class per profile which offers, for example, a method per tagged value. The creation of such an extension class can in turn be automated by generating it from a profile .uml model.
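A minimal sketch of that extension-class idea, in plain Java: one helper per profile exposes the tagged values of a stereotype as ordinary accessor methods. The UmlElement type below is a simplified stand-in for the real UML2 API, and the profile name, property names, and method names are invented for illustration; in Xtend these would be extension methods, so a template could simply write element.tableName:

```java
import java.util.Map;

// Hypothetical "extension class" for a fictitious persistence profile.
public class PersistenceProfileExtensions {

    // Minimal stand-in for a stereotyped UML element with tagged values
    // (the real code would work against org.eclipse.uml2.uml.Element).
    public static class UmlElement {
        private final Map<String, Object> taggedValues;

        public UmlElement(Map<String, Object> taggedValues) {
            this.taggedValues = taggedValues;
        }

        public Object getValue(String property) {
            return taggedValues.get(property);
        }
    }

    // One accessor per tagged value restores the "real type system" feeling.
    public static String tableName(UmlElement element) {
        return (String) element.getValue("tableName");
    }

    public static boolean isEntity(UmlElement element) {
        return element.getValue("tableName") != null;
    }
}
```

Generating such a class from the profile’s .uml model is mechanical: one method per tagged value definition.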

Another disadvantage compared to the Xpand framework is that Xtend is not a code generation framework, but a general-purpose programming language. What a generator component that invokes the templates looks like, where output is written to, how constraint checks are integrated – none of this is provided.

What I usually do here is use Xtext’s infrastructure, such as the IGenerator interface and validation based on Xtext’s AbstractDeclarativeValidator and @Check annotations. I reuse Xtext’s MWE Reader and Generator components. However, this requires some work and advanced knowledge. The basic approach is to make Xtext recognize UML models as generic EMF resources. To actually use the Xtext components, a generator-specific Guice module has to be created which extends AbstractGenericResourceRuntimeModule and satisfies several dependencies that an Xtext language already has configured by default. A good part of the approach was described by Christian Dietrich in his blog posts “Xtend2 Code Generators with Non-Xtext Models” and “Xtext 2.0 and UML“. I may go into details in a later blog post.
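For orientation, the workflow wiring looks roughly like the following MWE2 sketch. The Reader and GeneratorComponent classes are the ones shipped with Xtext; the setup classes, paths, and file extension are hypothetical placeholders for your project, so treat this as an outline rather than a working configuration:

```
Workflow {
    component = org.eclipse.xtext.mwe.Reader {
        path = "model"
        // register generic EMF resource support for .uml files (placeholder setup class)
        register = my.project.UmlSupportStandaloneSetup {}
        loadResource = {
            uri = ".*\\.uml"
            slot = "model"
        }
    }
    component = org.eclipse.xtext.generator.GeneratorComponent {
        // placeholder setup providing the generator-specific Guice module
        register = my.project.GeneratorStandaloneSetup {}
        slot = "model"
        outlet = {
            path = "src-gen"
        }
    }
}
```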

Although Xtend is such a natural choice for writing code generators and is so well integrated into the Xtext ecosystem, framework support for non-Xtext models is missing. It is easy to write a trivial framework, but why should everyone write their own when good infrastructure already exists in the Xtext framework? What I experienced again shows that there is a need for more framework support for this use case. I doubt that it could become part of Xtext itself, but maybe we will provide a framework for Xtend-based code generators in the future.

In the end we were able to demonstrate a migration path in a single workshop day. The customer was happy to save several days or weeks of time; the workshop costs paid off for them many times over. From a sales perspective it might not be wise to leave a customer in a state where they will not depend on our services for a while, but that is not how we work. For me it feels right.

Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime.

Xbase Customization: Redefining operator keywords

If you use Xbase in your Xtext-based DSL, you are usually satisfied with the set of operators the expression language defines. They are closely related to what you are used to from Java and similar languages.

However, in a workshop a customer wished to have alternative keywords for some operators. For example, the operator && should alternatively be written as AND, and the || operator as OR.

To demonstrate this customization we’ll start with Xtext’s famous Domainmodel Example. The Domainmodel.xtext grammar is derived from org.eclipse.xtext.xbase.Xbase and the Operation rule uses an XBlockExpression for the Operation’s body:

grammar org.eclipse.xtext.example.domainmodel.Domainmodel with org.eclipse.xtext.xbase.Xbase

...

Operation:
	'op' name=ValidID '(' (params+=FullJvmFormalParameter (',' params+=FullJvmFormalParameter)*)? ')' (':' type=JvmTypeReference)?
		body=XBlockExpression;

Unit Test

We’ll extend the language test-driven: first we create a unit test that uses the new feature and will fail until we have successfully implemented it. Fortunately, there is already a suitable test class in the project org.eclipse.xtext.example.domainmodel.tests.

We extend the class ParserTest.xtend:

	@Test
	def void testOverriddenKeyword() {
		val model = '''
			package example {
			  entity MyEntity {
			    property : String
			    op foo(String s) {
			    	return s != null && s.length > 0 AND s.startsWith("bar")
			    }
			  }
			}
		'''.parse
		val pack = model.elements.head as PackageDeclaration
		val entity = pack.elements.head as Entity
		val op = entity.features.last as Operation
		val method = op.jvmElements.head as JvmOperation
		model.eResource.assertNoErrors
		Assert::assertEquals("boolean", method.returnType.simpleName)
	}

Note the Operation body: the expression uses both representations of the And operator.

return s != null && s.length > 0 AND s.startsWith("bar")

When the ParserTest is executed it will now fail, of course, but only for the AND keyword:

java.lang.AssertionError: Expected no errors, but got :
ERROR (org.eclipse.xtext.diagnostics.Diagnostic.Linking) 
'The method or field AND is undefined' 
on XFeatureCall, offset 119, length 3
ERROR (org.eclipse.xtext.xbase.validation.IssueCodes.unreachable_code) 'Unreachable expression.' on XFeatureCall, offset 119, length 3

	at org.junit.Assert.fail(Assert.java:88)
	at org.eclipse.xtext.junit4.validation.ValidationTestHelper.assertNoErrors(ValidationTestHelper.java:187)
	at org.eclipse.xtext.example.domainmodel.tests.ParserTest.testOverriddenKeyword(ParserTest.java:268)
...

Overloading the operator rules

Looking at Xbase.xtext shows that Xbase defines separate data type rules for operators:

OpOr:
	'||';
OpAnd:
	'&&';

Xtext’s Grammar Mixin feature allows a redefinition of those rules. The obvious customization is in Domainmodel.xtext:

OpOr:
	'||' | 'OR';
OpAnd:
	'&&' | 'AND';

Regenerating the language’s Xtext implementation makes those keywords available to the syntax. However, the unit test still fails, but now with a different error:

java.lang.AssertionError: Expected no errors, but got :
ERROR (org.eclipse.xtext.diagnostics.Diagnostic.Linking) 
'AND cannot be resolved.' on XBinaryOperation, offset 119, length 3

	at org.junit.Assert.fail(Assert.java:88)
	at org.eclipse.xtext.junit4.validation.ValidationTestHelper.assertNoErrors(ValidationTestHelper.java:187)
	at org.eclipse.xtext.example.domainmodel.tests.ParserTest.testOverriddenKeyword(ParserTest.java:268)

OperatorMapping

The missing piece is a customization of class OperatorMapping. Thus we create a subclass OperatorMappingCustom with constant QualifiedNames for the additional operator keywords and bind it in DomainmodelRuntimeModule:

@Singleton
public class OperatorMappingCustom extends OperatorMapping {
	public static final QualifiedName AND_2 = create("AND");
	public static final QualifiedName OR_2 = create("OR");
}
public class DomainmodelRuntimeModule extends AbstractDomainmodelRuntimeModule {
	[...]
	public Class<? extends OperatorMapping> bindOperatorMapping() {
		return OperatorMappingCustom.class;
	}
}

A naive approach here is to override initializeMapping(), as the Javadoc suggests (“Clients may want to override #initializeMapping() to add other operators.“):

But this fails again:

com.google.inject.CreationException: Guice creation errors:

1) Error injecting constructor, java.lang.IllegalArgumentException: value already present: operator_and
  at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.(Unknown Source)
  at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.class(Unknown Source)
  while locating org.eclipse.xtext.example.domainmodel.OperatorMappingCustom
  while locating org.eclipse.xtext.xbase.scoping.featurecalls.OperatorMapping
    for field at org.eclipse.xtext.xbase.util.XExpressionHelper.operatorMapping(Unknown Source)
  while locating org.eclipse.xtext.xbase.util.XExpressionHelper
    for field at org.eclipse.xtext.xbase.validation.XbaseValidator.expressionHelper(Unknown Source)
  at org.eclipse.xtext.service.MethodBasedModule.configure(MethodBasedModule.java:57)
  while locating org.eclipse.xtext.example.domainmodel.validation.DomainmodelJavaValidator
Caused by: java.lang.IllegalArgumentException: value already present: operator_and
	at com.google.common.collect.HashBiMap.put(HashBiMap.java:237)
	at com.google.common.collect.HashBiMap.put(HashBiMap.java:214)
	at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.initializeMapping(OperatorMappingCustom.java:25)
	at org.eclipse.xtext.xbase.scoping.featurecalls.OperatorMapping.(OperatorMapping.java:121)
	at org.eclipse.xtext.example.domainmodel.OperatorMappingCustom.(OperatorMappingCustom.java:18)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	...

Instead, the method getMethodName() must be overridden to delegate invocations for the new operators to the existing ones. The resulting OperatorMappingCustom then looks like this:

@Singleton
public class OperatorMappingCustom extends OperatorMapping {
	public static final QualifiedName AND_2 = create("AND");
	public static final QualifiedName OR_2 = create("OR");

	@Override
	public QualifiedName getMethodName(QualifiedName operator) {
		if (AND_2.equals(operator)) {
			return getMethodName(AND);
		}
		if (OR_2.equals(operator)) {
			return getMethodName(OR);
		}
		return super.getMethodName(operator);
	}
}

Finally, the unit test executes successfully.

Summary

Xbase allows customizing operators by overriding the operator’s data type rule from Xbase.xtext. This adds the operator keywords to the language, but fails at runtime. Additionally, the class OperatorMapping must be customized and its method getMethodName() overridden.

Fornax MWE Workflow Maven Plugin 3.5.1 published on Maven Central

The Fornax Workflow plugin is a Maven plugin that executes MWE/MWE2 workflows within Maven. It has been around for quite some years now, and whoever needed to integrate MWE/MWE2 workflows into a headless build was likely using it. The Fornax Platform was a home for components around openArchitectureWare, Xpand, and Xtext. While all its other subprojects no longer play any role, the Workflow plugin is still in frequent use.

Over the years we had to change the underlying infrastructure several times. The plugin was hosted on the project’s own repository server, and projects using the plugin had to configure an additional plugin repository either in their POMs or in settings.xml. This was undesirable, but in the end not really a blocker. However, with a recent change of the repository manager, users experienced problems accessing the Fornax repository http://www.fornax-platform.org/nexus/content/groups/public. Currently, this URL is redirected to a server hosted at itemis, and users might run into problems with the HTTPS connection.

It always bothered me that we had to host this plugin in a separate repository; since it is a widely used component, it should logically be available from the Central repository. But it was never a blocker for me. Now I finally had a reason to change this.

Long story short: the plugin is now published on Maven Central as version 3.5.1. I highly recommend upgrading to this version and removing the Fornax Maven repository from your configuration. The coordinates did not change; they are still org.fornax.toolsupport:fornax-oaw-m2-plugin. I would like to change them sometime in the future (e.g. the name parts “oaw” and “m2” are no longer up to date), maybe together with moving development to another project hosting platform.

Version 3.5.1 does not differ much from 3.4.0, which is the version most of the world is likely using today. The main work was refactoring the POM and its parent in order to meet the requirements for deployment to Maven Central. Further, I worked on automating the release process with the maven-release-plugin.

There is one additional feature in 3.5.x: the new property useTestScope controls whether test-scope dependencies are included when computing the Java classpath used to execute a workflow in forked mode. On Windows systems the classpath sometimes exceeds the allowed command-line length, especially since the local Maven repository is located below the user’s home directory by default, which already has a rather long path prefix. By default, the plugin now excludes these test-scope dependencies unless the user configures the property explicitly. Version 3.5.0 had a small logic bug in this feature which made the plugin unusable, so please do not use that version. Version 3.5.1 can be used without problems by everyone who has been using 3.4.0 so far.
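If you do need the test-scope dependencies on the forked classpath, the plugin configuration would look roughly like this; the coordinates and the useTestScope property are from the release described above, but the exact placement in your POM (and any goal/execution binding) depends on your existing setup:

```xml
<plugin>
  <groupId>org.fornax.toolsupport</groupId>
  <artifactId>fornax-oaw-m2-plugin</artifactId>
  <version>3.5.1</version>
  <configuration>
    <!-- opt back in to test-scope dependencies on the forked classpath -->
    <useTestScope>true</useTestScope>
  </configuration>
</plugin>
```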

Enabling Spring in Scout applications

Today I am attending the first Scout User Day 2014 in Ludwigsburg, which is co-located with EclipseCon Europe 2014 starting tomorrow. Yesterday we had a pre-event dinner with some attendees and the organizers at the Rossknecht restaurant. I got into a chat with Nejc Gasper, who will give a talk titled “Build a Scout backend with Spring” today. I was a bit surprised when he told me he had not managed to get Spring’s classpath scanning working yet. Since we are doing this in our application, I think it is worth writing down what we had to do to get it working. The goal in our application is primarily to use Spring as a dependency injection container, since the customer uses Spring in all their other Java-based applications too and wanted us to do so as well.

Spring Configuration

The Spring configuration files are located in the folder META-INF/spring of the *.client, *.shared, and *.server projects. In these configuration files, we mainly activate classpath scanning:

<?xml version="1.0" encoding="UTF-8"?>
<beans:beans xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:beans="http://www.springframework.org/schema/beans" xmlns:p="http://www.springframework.org/schema/p"
    xmlns:context="http://www.springframework.org/schema/context"
    xsi:schemaLocation="
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.1.xsd">

    <context:annotation-config />
    <context:component-scan base-package="com.rhenus.fl" />
    <!--  http://docs.spring.io/spring/docs/4.0.x/spring-framework-reference/html/validation.html#core-convert-Spring-config -->
    <beans:bean id="conversionService" class="org.springframework.core.convert.support.DefaultConversionService" />
</beans:beans>

The next important thing is to copy the files spring.handlers, spring.schemas, and spring.tooling into the META-INF folder. The files can be found in the META-INF directory of the bundle org.springframework.context. Without doing this, you will get errors like the following while loading the Spring configuration:

Caused by: 
org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: Unable to locate Spring NamespaceHandler for XML schema namespace [http://www.springframework.org/schema/context]|Offending resource: URL [bundleresource://9.fwk1993775065:1/META-INF/spring/fl_client.xml]|
	at org.springframework.beans.factory.parsing.FailFastProblemReporter.error(FailFastProblemReporter.java:70)
	at org.springframework.beans.factory.parsing.ReaderContext.error(ReaderContext.java:85)
	at org.springframework.beans.factory.parsing.ReaderContext.error(ReaderContext.java:80)
	at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.error(BeanDefinitionParserDelegate.java:316)
	at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1421)
	at org.springframework.beans.factory.xml.BeanDefinitionParserDelegate.parseCustomElement(BeanDefinitionParserDelegate.java:1414)
	at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.parseBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:187)
	at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.doRegisterBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:141)
	at org.springframework.beans.factory.xml.DefaultBeanDefinitionDocumentReader.registerBeanDefinitions(DefaultBeanDefinitionDocumentReader.java:110)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.registerBeanDefinitions(XmlBeanDefinitionReader.java:508)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.doLoadBeanDefinitions(XmlBeanDefinitionReader.java:391)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:335)
	at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:303)
	at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:180)
	at org.springframework.context.support.GenericXmlApplicationContext.load(GenericXmlApplicationContext.java:116)

Bundle Activator

The Spring configuration files are loaded in the Bundle Activator classes of the three “main” Scout projects (client/shared/server). The Activator can then also be used to access the ApplicationContext. We use GenericXmlApplicationContext to initialize the context from the XML configuration above. One important detail is that this class must use the ClassLoader of the Activator; otherwise you will again get the error mentioned above. The Activator class then looks as follows:


public class Activator extends Plugin {

 // The plug-in ID
 public static final String PLUGIN_ID = "com.rhenus.fl.application.client";
 public final static String SPRING_CONFIG_FILE = "META-INF/spring/fl_client.xml";

 // The shared instance
 private static Activator plugin;
 private ApplicationContext ctx;

 @Override
 public void start(BundleContext context) throws Exception {
  super.start(context);
  plugin = this;
  init(context);
 }

 @Override
 public void stop(BundleContext context) throws Exception {
  plugin = null;
  super.stop(context);
 }

 public static Activator getDefault() {
  return plugin;
 }

 private void init(BundleContext bundleContext) {
  URL url = getClass().getClassLoader().getResource(SPRING_CONFIG_FILE);
  UrlResource resource = new UrlResource(url);

  // subclass to force the Activator's class loader, see above
  GenericXmlApplicationContext xmlContext = new GenericXmlApplicationContext() {
   @Override
   public ClassLoader getClassLoader() {
    return Activator.class.getClassLoader();
   }
  };
  xmlContext.load(resource);
  xmlContext.refresh();
  ctx = xmlContext;
 }

 public ApplicationContext getContext() {
  return ctx;
 }
}

Service Factory

In order to use dependency injection in Scout services, the services themselves must be instantiated through the Spring ApplicationContext. The default implementation is of course not aware of Spring, so we need to customize this. Unfortunately, we have to copy the class org.eclipse.scout.rt.server.services.ServerServiceFactory: we only need to exchange a single line in the method updateInstanceCache(), where the service is instantiated, but this method is private in Scout. The line

m_service = m_serviceClass.newInstance();

is replaced by

m_service = getContext().getBean(m_serviceClass);

Since we have to provide different ApplicationContexts in the different plugins, we put this into the abstract class AbstractSpringAwareServerServiceFactory (full code):

public abstract class AbstractSpringAwareServerServiceFactory implements IServiceFactory {

 private void updateInstanceCache(ServiceRegistration registration) {
  synchronized (m_serviceLock) {
   if (m_service == null) {
    try {
     // CUSTOMIZING BEGIN
//     m_service = m_serviceClass.newInstance();
     m_service = getContext().getBean(m_serviceClass);
     // CUSTOMIZING END
     if (m_service instanceof IService2) {
      ((IService2) m_service).initializeService(registration);
     } else if (m_service instanceof IService) {
      ((IService) m_service).initializeService(registration);
     }
    } catch (Throwable t) {
     LOG.error("Failed creating instance of " + m_serviceClass,
       t);
    }
   }
  }
 }
 
 // CUSTOMIZING BEGIN
 protected abstract ApplicationContext getContext();
 // CUSTOMIZING END

}

The concrete classes implement the method getContext() by accessing the method from the Bundle Activator:

public class ServerServiceFactory extends AbstractSpringAwareServerServiceFactory {

  /**
   * @param serviceClass
   */
  public ServerServiceFactory(Class<?> serviceClass) {
    super(serviceClass);
  }

  @Override
  protected ApplicationContext getContext() {
    return Activator.getDefault().getContext();
  }

}

plugin.xml

The service factory class implemented above must now be used to create the services. This is done in the plugin.xml file:

<service
 factory="com.rhenus.fl.application.server.services.ServerServiceFactory"
 class="com.rhenus.fl.tmi.server.tmirln010.TMIRLN010Service"
 session="com.rhenus.fl.application.server.ServerSession">
</service>

Use Dependency Injection

Now we are finally able to use Dependency Injection with javax.inject.Inject with Scout services.

import org.springframework.stereotype.Component;
import javax.inject.Inject;
...

@Component
@InputValidation(IValidationStrategy.PROCESS.class)
public class TMIRLN010Service extends AbstractTMIRLN010Service {
  @Inject
  protected ConversionService conversionService;
  ...
}

Go!

If everything is correct, you will now see the following lines in the console when starting the Scout application:

Okt 27, 2014 8:45:28 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from URL [bundleresource://9.fwk1993775065:1/META-INF/spring/fl_client.xml]
Okt 27, 2014 8:45:29 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from URL [bundleresource://10.fwk1993775065:1/META-INF/spring/fl_shared.xml]
Okt 27, 2014 8:45:29 AM org.springframework.context.support.AbstractApplicationContext prepareRefresh
INFO: Refreshing com.rhenus.fl.application.shared.Activator$1@b40d694: startup date [Mon Oct 27 08:45:29 CET 2014]; root of context hierarchy
Okt 27, 2014 8:45:29 AM org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor 
INFO: JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
Okt 27, 2014 8:45:29 AM org.springframework.context.support.AbstractApplicationContext prepareRefresh
INFO: Refreshing com.rhenus.fl.application.client.Activator$1@292f062b: startup date [Mon Oct 27 08:45:29 CET 2014]; root of context hierarchy
Okt 27, 2014 8:45:29 AM org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor 
INFO: JSR-330 'javax.inject.Inject' annotation found and supported for autowiring

Scout tables with fixed columns

For tables with many columns it is often better if the first few columns stay fixed while the user scrolls the table content horizontally. Since RAP 2.0, tables and trees support fixed columns, but Scout (current: 4.0/Eclipse Luna) does not provide this feature for its Table implementation. Since Scout supports different UI frameworks (Swing, SWT, RAP) and only RAP provides a table implementation with fixed-columns support, this might be the reason the feature is missing.

The necessary additions can be made without changing the Scout sources themselves: Scout allows customizing UI components, including existing ones. We can add this feature with fragments. The code mentioned below can be fetched from GitHub (“scout-experimental”).

Client Extension Plugin

AbstractTableWithFixedColumns

First, we need a Table implementation that provides an additional configuration property fixedColumns. The value is stored using the element’s Property Support:

package org.eclipse.scout.rt.extension.client.ui.basic.table;

import org.eclipse.scout.commons.annotations.ConfigProperty;

/**
 * A table that supports Fixed Columns.
 *
 * @see http://eclipse.org/rap/developers-guide/devguide.php?topic=tree-table.html&version=2.0
 */
public class AbstractTableWithFixedColumns extends AbstractExtensibleTable {
	public static final String PROP_FIXED_COLUMNS = "org.eclipse.rap.rwt.fixedColumns"; // =RWT.FIXED_COLUMNS
	/**
	 * Configures how many of the visible columns should be fixed.
	 * @return A value greater than 0
	 */
	@ConfigProperty(ConfigProperty.INTEGER)
	protected int getConfiguredFixedColumns () {
		return -1;
	}

	public void setFixedColumns (int n) {
		propertySupport.setPropertyInt(PROP_FIXED_COLUMNS, n);
	}

	public int getFixedColumns () {
		return propertySupport.getPropertyInt(PROP_FIXED_COLUMNS);
	}

	@Override
	protected void initConfig() {
		super.initConfig();
		setFixedColumns(getConfiguredFixedColumns());
	}

}

This custom class can be specified as the default base class for tables in the Scout workspace preferences.

In the client code, the table field would be defined like this:

public class CtyTableField
      extends
      AbstractTableField<CtyTableField.Table> {
          public class Table extends AbstractTableWithFixedColumns {
             @Override
             protected int getConfiguredFixedColumns() {
                return 2;
             }
          }
}

Scout RAP UI extension fragment

RwtScoutTableExt

The class that must be customized is org.eclipse.scout.rt.ui.rap.basic.table.RwtScoutTable. I’m not good at creating names, so I call its extension simply RwtScoutTableExt.


package org.eclipse.scout.rt.ui.rap.basic.table;

import org.eclipse.rap.rwt.RWT;
import org.eclipse.scout.rt.client.ui.basic.table.columns.IColumn;
import org.eclipse.scout.rt.ui.rap.ext.table.TableEx;
import org.eclipse.swt.widgets.Composite;

public class RwtScoutTableExt extends RwtScoutTable {
	@Override
	protected void initializeUi(Composite parent) {
	    super.initializeUi(parent);
	    TableEx table = getUiField();
		// Fixed Columns Support, see
		// http://eclipse.org/rap/developers-guide/devguide.php?topic=tree-table.html&version=2.0
		// The configured fixed columns refer to visible columns only, but for
		// the RWT table the number of fixed columns also includes the
		// non-visible ones. Compute how many columns should be fixed in the
		// RWT table.
		Integer configuredFixedColumns = (Integer) getScoutObject().getProperty(
				RWT.FIXED_COLUMNS);
		if (configuredFixedColumns != null && configuredFixedColumns > 0) {
			int fixedColumns = 0;
			int visibleColumns = 0;
			for (IColumn<?> column : getScoutObject().getColumns()) {
				fixedColumns++;
				if (column.isDisplayable() && column.isInitialVisible()) {
					visibleColumns++;
				}
				if (visibleColumns == configuredFixedColumns) {
					break;
				}
			}
			table.setData(RWT.FIXED_COLUMNS, Integer.valueOf(fixedColumns));
		}
	}
}

After Scout has initialized the table component (the call to super), the table has to be configured with the number of fixed columns:


table.setData(RWT.FIXED_COLUMNS, Integer.valueOf(fixedColumns));

The number of configured fixed columns has to be specified by a property of the table field. You might wonder why we do not simply pass the configured fixed-column value through directly. The reason is that some columns might not be visible, and when configuring fixed columns you would expect that only the visible ones count.
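The mapping from visible fixed columns to the raw count that RWT expects can be illustrated with a small standalone sketch. The class and method names here are purely illustrative, not Scout or RWT API; the loop mirrors the computation in initializeUi above.

```java
// Hypothetical standalone sketch of the column-count mapping.
// RWT counts all columns (including invisible ones), while the Scout
// configuration counts visible columns only.
public class FixedColumnMapping {

	public static int toRawFixedColumns(boolean[] columnVisible, int visibleFixed) {
		int raw = 0;
		int visible = 0;
		for (boolean isVisible : columnVisible) {
			raw++;
			if (isVisible) {
				visible++;
			}
			if (visible == visibleFixed) {
				break;
			}
		}
		return raw;
	}

	public static void main(String[] args) {
		// Table layout: [Id (hidden), Iso 2 Code, Description, Population]
		boolean[] visible = {false, true, true, true};
		// Fixing the first 2 visible columns must fix 3 raw columns,
		// because the hidden Id column precedes them.
		System.out.println(toRawFixedColumns(visible, 2)); // prints 3
	}
}
```

With getConfiguredFixedColumns() returning 2 and a hidden Id column in front, RWT.FIXED_COLUMNS ends up as 3, which is exactly what the screenshots in the Result section show.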

RwtScoutTableFieldExt

The table class itself is constructed by RwtScoutTableField. In order to construct our customized class instead, RwtScoutTableField must be subclassed and its createRwtScoutTable() method overridden:

package org.eclipse.scout.rt.ui.rap.form.fields.tablefield;

import org.eclipse.scout.rt.client.ui.form.fields.smartfield.IContentAssistFieldProposalForm;
import org.eclipse.scout.rt.ui.rap.basic.table.IRwtScoutTable;
import org.eclipse.scout.rt.ui.rap.basic.table.RwtScoutTable;
import org.eclipse.scout.rt.ui.rap.basic.table.RwtScoutTableExt;
import org.eclipse.scout.rt.ui.rap.util.RwtUtility;

public class RwtScoutTableFieldExt extends RwtScoutTableField {
	@Override
	protected IRwtScoutTable createRwtScoutTable() {
		if (getScoutObject().getForm() instanceof IContentAssistFieldProposalForm) {
			return new RwtScoutTable(RwtUtility.VARIANT_PROPOSAL_FORM);
		} else {
			return new RwtScoutTableExt();
		}
	}

}

MANIFEST.MF

The created fragment is a fragment of the bundle org.eclipse.scout.rt.ui.rap. The solution described here should work at least from Scout version 3.9.0 on, and likely with earlier versions as well.

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Scout UI RAP - Extension
Bundle-SymbolicName: de.kthoms.scout.rt.ui.rap;singleton:=true
Bundle-Version: 1.0.0.qualifier
Fragment-Host: org.eclipse.scout.rt.ui.rap;bundle-version="3.9.0"
Bundle-RequiredExecutionEnvironment: JavaSE-1.7
Export-Package: org.eclipse.scout.rt.ui.rap.form.fields.tablefield

fragment.xml

The extended field class has to be configured in the fragment’s descriptor fragment.xml by using the org.eclipse.scout.rt.ui.rap.formfields extension point:


<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<fragment>
  <extension
        point="org.eclipse.scout.rt.ui.rap.formfields">
      <formField
            active="true"
            modelClass="org.eclipse.scout.rt.client.ui.form.fields.tablefield.ITableField"
            name="Table field"
            scope="global">
            <uiClass
              class="org.eclipse.scout.rt.ui.rap.form.fields.tablefield.RwtScoutTableFieldExt">
            </uiClass>
      </formField>
  </extension>

</fragment>

Note the scope="global" configuration. The interface org.eclipse.scout.rt.ui.rap.extension.IFormFieldExtension defines three scopes: default, global, and local. The description of the global scope is:

IFormFieldExtension.SCOPE_GLOBAL to indicate this extension to have a global scope (whole eclipse). Global defined extensions overwrite the default implementation.

Overwriting the default implementation is exactly what we try to achieve here.

Result

The first two columns are marked as fixed. In fact, the table has a non-displayable Id column in front of the first visible column. When the table is scrolled horizontally, the columns “Iso 2 Code” and “Description” stay fixed.

screenshot 2014-07-07 um 12.10.50
screenshot 2014-07-07 um 12.11.02

Remove “Build path specifies execution environment…” warnings from Problems View

I often have workspaces with projects that specify Java 1.5 as the minimal execution environment. There is no JDK 1.5 installed on my machine, and it turns out that getting one for Mac OS X Mountain Lion is not trivial. Actually I don’t need a JDK 1.5, since the standard 1.6 JDK is compatible. However, this causes these annoying warnings in the workspace:

screenshot 2013-05-17 um 10.32.31

In the Execution Environments preferences it is possible to mark the Java 1.6 installation as compatible with the J2SE-1.5 execution environment:

screenshot 2013-05-17 um 10.52.31

Although the JDK 1.6 is now marked as compatible, it is not “strictly compatible”, so the warning message remains.

Next you could try to disable the warning. There is a setting for this on the Java/Compiler/Building preference page:

screenshot 2013-05-17 um 10.35.17

After changing the setting you are asked to rebuild the projects. But again, the warnings do not disappear. I suspect this is a bug and raised Bug 408317 for it.

So the last resort is to filter these warnings. To do so, open the view menu of the Problems View and choose “Configure Contents”. In the “Types” selection tree, expand the “Java Build Path Problems” node and uncheck “JRE System Library Problem”.

screenshot 2013-05-17 um 11.03.53

Finally the warnings disappear from the Problems View. However, they are only filtered from the view; the projects themselves still carry these resource markers, so the project in the Package Explorer View will show a warning overlay icon even though the Problems View is empty.

But this raises the next problem: the warning is gone, yet the code is compiled with Java 1.6 and thus against the 1.6 API. This means you could accidentally use API from Java 1.6 or later. For example, a usage of String#isEmpty() would compile even though the execution environment is set to J2SE-1.5 (the execution environment only defines the lowest requirement) and even with Java source compatibility set to 1.5 in the compiler settings, since the source level restricts language features, not the class library.
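To make the API leak concrete, here is a minimal sketch. The class name is made up for illustration; the point is that String#isEmpty() only exists since Java 6, so this compiles against a 1.6 class library but would fail with a NoSuchMethodError when run on a real 1.5 JRE.

```java
public class ApiLeakExample {
	public static void main(String[] args) {
		String s = "";
		// String#isEmpty() was added in Java 6. Even with -source 1.5,
		// the compiler resolves this call against the 1.6 class library,
		// so it compiles fine, yet a 1.5 JRE would throw
		// NoSuchMethodError at runtime.
		System.out.println(s.isEmpty()); // prints "true"
	}
}
```

The 1.5-safe equivalent would be the classic s.length() == 0 check, which is exactly the kind of substitution API tooling can point you to.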

We need to detect this unwanted use of API that is not 1.5 compatible. For this, the PDE tooling offers support to install an Execution Environment Description for J2SE-1.5 and to set up API tooling. This finally allows us to detect illegal API use:

screenshot 2013-05-17 um 14.00.15

I’d like to thank Laurent Goubet and Mikael Barbero for their valuable comments on the potential API problems.