Tag: Java

During the last few weeks, a local building contractor was involved in the reconstruction of our house and the renewal of the basement drainage. As a result, new basement drainage has been installed and a new hopper for the sewage pumps has been placed. To be honest, it is a lot of heavy work performed by the builder, including digging, insulating, pounding and other dirty stuff – but in the end the system only works if a sewage pump is removing the drainage water. In contrast to the earthworks, which require experience and muscle that I don’t have, I took over the plumbing and electrical work on the pumps. In order to have a fail-over system, I installed two pumps, where the second pump is triggered if the first one fails. To remain operational even after a short circuit of the first pump, I put the second pump on a separate phase (in Europe, we have a three-phase power supply, 220V each, shifted by 120° to each other, unlike the split-phase system in the US). Having this system installed, you get some periodic work to do: you want to make sure, by regular testing, that the second pump operates if the first one has failed. Since I’m lazy and like inventing and constructing stuff more than executing regular test procedures, I decided to implement a monitoring system using some cheap electronic and computer components: a Raspberry Pi and Tinkerforge hardware.

Ranked

Just started to develop a small application in Scala running on a standard enterprise Java stack. You can find more details on GitHub: https://github.com/holisticon/ranked.

Today, I’ll post some details on the persistence layer. The idea is to use Scala case classes and JPA as a persistence layer. For simple attributes it looks very cute:

/**
 * Superclass for all persistent entities.
 */
@MappedSuperclass
abstract class PersistentEntity(
  @BeanProperty @(Column @field)(name = "ID")@(Id @field)@(GeneratedValue @field)(strategy = GenerationType.AUTO) id: Long,
  @BeanProperty @(Column @field)(name = "VERSION")@(Version @field) version: Long) {
  def this() = this(-1, -1);
}

/**
 * Represents a player. A player has a name, initial and current ELO ranking.
 */
@Entity
@Table(name = "PLAYER")
case class Player(
  @BeanProperty @(Column @field)(name = "NAME") name: String) extends PersistentEntity {

  def this() = this(null);
}

/**
 * Represents a team. A team consists of at least one player and might have a name.
 */
@Entity
@Table(name = "TEAM")
case class Team(
  @BeanProperty @(Column @field)(name = "NAME") name: String) extends PersistentEntity {

  def this() = this(null);
}

Stay tuned about the development progress.

Yesterday, I discovered a funny nuance of the Java programming language which I didn’t know before and decided to share with you. I was designing an API for transporting changes in relationships between two DTO types. Since I wanted to support batch changes, I created a class for carrying them:

class ManyToManyDelta<S extends BaseDto<?>, T extends BaseDto<?>> {

  List<SimpleRelationship<S,T>> relationshipsToAdd;
  List<SimpleRelationship<S,T>> relationshipsToRemove;
  ...
}

class SimpleRelationship<V extends BaseDto<?>, W extends BaseDto<?>> {

  // BaseDto classes are identified by the parameterized Id
  Id<V> left;
  Id<W> right;

  SimpleRelationship(BaseDto<V> one, BaseDto<W> another) {
    left = one.getId();
    right = another.getId();
  }
}

Having this structure, you can model the relationship between two instances of types A and B by an instance of SimpleRelationship<A, B>. If you want to communicate the creation of a relationship, you put the latter into the relationshipsToAdd list; if you want to model a deletion, you put it into the relationshipsToRemove list.
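For context, here is a minimal, self-contained sketch of how the surrounding types could fit together; BaseDto, Id and PlayerDto are simplified assumptions of mine, since the post does not show the original classes:

```java
// Simplified assumptions: each DTO is parameterized with its own type,
// so getId() is precisely typed (the post's real classes may differ).
class Id<T> { }

class BaseDto<T> {
    private final Id<T> id = new Id<T>();
    Id<T> getId() { return id; }
}

// Hypothetical example DTO, self-typed so it satisfies the bounds below.
class PlayerDto extends BaseDto<PlayerDto> { }

class SimpleRelationship<V extends BaseDto<?>, W extends BaseDto<?>> {
    final Id<V> left;
    final Id<W> right;

    SimpleRelationship(BaseDto<V> one, BaseDto<W> another) {
        left = one.getId();
        right = another.getId();
    }
}

public class RelationshipSketch {
    public static void main(String[] args) {
        PlayerDto a = new PlayerDto();
        PlayerDto b = new PlayerDto();
        SimpleRelationship<PlayerDto, PlayerDto> rel =
                new SimpleRelationship<PlayerDto, PlayerDto>(a, b);
        System.out.println(rel.left == a.getId()); // true
    }
}
```

The self-referential pattern (PlayerDto extends BaseDto<PlayerDto>) is what makes a DTO instance usable where a BaseDto<V> is expected.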

Now it was time to develop methods for accessing the relationship lists inside the ManyToManyDelta:

class ManyToManyDelta<S extends BaseDto<?>, T extends BaseDto<?>> {
  ...
  public void add(SimpleRelationship<S, T> toAdd) {
    if (toAdd == null) { /* react */}
    this.relationshipsToAdd.add(toAdd);
  }
  ...
}

Suppose you have a batch update (e.g. an array or list) of SimpleRelationship objects and you would like to add them with one invocation instead of a series of invocations, e.g.:

class ManyToManyDelta<S extends BaseDto<?>, T extends BaseDto<?>> {
  ...
  public void add(SimpleRelationship<S, T>[] toAdd) {
    if (toAdd == null) { /* react */ }
    this.relationshipsToAdd.addAll(Arrays.asList(toAdd));
  }
  public void add(SimpleRelationship<S, T> toAdd) {
    if (toAdd == null) { /* react */ }
    this.relationshipsToAdd.add(toAdd);
  }
  ...
}

Using the varargs feature of Java, you could also write the equivalent:

class ManyToManyDelta<S extends BaseDto<?>, T extends BaseDto<?>> {
  ...
  public void add(SimpleRelationship<S, T>... toAdd) {
    if (toAdd == null) { /* react */}
    this.relationshipsToAdd.addAll(Arrays.asList(toAdd));
  }
  ...
}

That would be nice, right? By the way, it is a good idea to write some client code during the development of an API; it uncovers potential problems early:

  ...
  A entityA = ...;
  B entityB = ...;
  ManyToManyDelta<A, B> delta = new ManyToManyDelta<A,B>();
  delta.add(new SimpleRelationship<A,B>(entityA, entityB));

Coding this results in a type-safety warning: a generic array of SimpleRelationship is created for a varargs parameter. This reveals a limitation of the Java language: you cannot create an array of a parameterized type, and as a consequence you cannot pass one as a varargs argument without this warning.

Finally, if you want to provide convenience methods for one and for many items, you have to do it the old-fashioned way, by providing overloaded methods.
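To illustrate the point, here is a minimal, self-contained sketch (the Pair class is a hypothetical stand-in for SimpleRelationship, not part of the original API). At runtime the varargs array is created with the raw component type, which is why the compiler warns; since Java 7, the @SafeVarargs annotation can suppress the warning for methods that provably do not misuse the array:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for SimpleRelationship.
class Pair<A, B> { }

public class VarargsDemo {

    // The call site builds a raw Pair[] for this parameter, hence the
    // "generic array creation" warning. @SafeVarargs (Java 7+) tells the
    // compiler the method only reads the array, suppressing the warning.
    @SafeVarargs
    static <A, B> List<Pair<A, B>> collect(Pair<A, B>... items) {
        return new ArrayList<>(Arrays.asList(items));
    }

    public static void main(String[] args) {
        List<Pair<String, Integer>> pairs =
                collect(new Pair<String, Integer>(), new Pair<String, Integer>());
        System.out.println(pairs.size()); // prints 2
    }
}
```

Note that @SafeVarargs only silences the warning; the underlying restriction that you cannot create an array of a parameterized type remains.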


I just returned from the furious event given by Adam Bien on Real World Java EE Practices. The presentation was held in the Lehmanns bookstore in Hamburg in cooperation with the JUGHH. It was a full success, with no space left in the bookstore. I think I got the last seat, and there were some people standing.

Adam made it in an hour and presented many interesting topics. He started with new features introduced in Java EE 6, like optional local interfaces, the cronjob-like Timer Service and other nice goodies. Then he covered further new stuff from Java EE like REST and CDI (Contexts and Dependency Injection). Finally, he moved on to best practices, patterns and anti-patterns. As usual, it was quick and precise: Adam answered many questions and gave a good overview of the technology.

After the presentation, the JUGHH and Lehmanns offered a glass of sparkling wine to the smaller remaining audience, and Adam mentioned the possibility of speaking about JavaFX next time. This time I left my camera at home and only had my phone with me, so sorry for the low-resolution picture…

Packaging

Abstract

Using Eclipse-based rich clients as stand-alone applications is discussed in many books and articles. In the context of enterprise systems, software development has adopted several paradigms to improve the quality of the overall architecture. This short article describes some issues in packaging such an application for use in the context of enterprise systems.

Architectural Assumptions

Designing enterprise architectures is a standard discipline for IT-consulting companies and freelancers involved in software development. One of the main characteristics of enterprise architectures is the framework-driven approach to software creation: the software has to comply with certain rules and standards adopted inside the enterprise. In order to simplify such a constrained development process, it is common to use an in-house software framework which enforces compliance with the enterprise-internal standards and acts as glue between the different technologies adopted as parts of the enterprise architecture.

Using such frameworks has major implications for the software development in general, and especially for the rich client development. The design issues are summarized in the next section.

Using an Enterprise Framework

The major goal of the enterprise in-house framework is to simplify the process of software systems development and to enforce standardization among the software systems. This usually includes the following aspects:

  • Domain-specific component framework
  • Methods for master data management
  • Infrastructure services: authentication, authorization, communication, security, printing, reporting
  • Application skeletons and launchers

The more unification and standardization the framework provides, the easier it is for a software developer to concentrate on the particular business task, and the easier the maintenance of the software system becomes.

From the previous list, the part most relevant to RCP packaging and deployment is the existence of application skeletons and launchers: when launching an application, the framework libraries are loaded and executed first and then pass control to the application-specific modules. The advantage of this approach is that the infrastructure services, which can be developed and shared among different applications, are loaded first.
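The launcher idea could look roughly like this minimal sketch (all names are hypothetical; a real framework would add configuration, error handling and lifecycle management):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical interfaces: the framework's shared infrastructure services
// and the application-specific entry point.
interface InfrastructureService { void start(); }
interface ApplicationModule { void run(); }

public class FrameworkLauncher {
    private final List<InfrastructureService> services;
    private final ApplicationModule application;

    public FrameworkLauncher(List<InfrastructureService> services,
                             ApplicationModule application) {
        this.services = services;
        this.application = application;
    }

    // Framework code runs first: bring up the shared services,
    // then hand over control to the application-specific module.
    public void launch() {
        for (InfrastructureService service : services) {
            service.start();
        }
        application.run();
    }

    public static void main(String[] args) {
        List<String> order = new ArrayList<>();
        List<InfrastructureService> services = new ArrayList<>();
        services.add(() -> order.add("auth"));
        services.add(() -> order.add("printing"));
        new FrameworkLauncher(services, () -> order.add("app")).launch();
        System.out.println(order); // [auth, printing, app]
    }
}
```

The point of the sketch is only the ordering: shared infrastructure comes up before any application-specific code runs.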

I found a strange problem running wscompile (from Sun’s Java Web Service Developer Pack 1.6) inside Ganymede (Eclipse 3.4.1). The build script execution freezes on the wscompile Ant task: it prints the following message on the console, but then nothing happens.

[wscompile] wscompile ...\env\java\1.4.2_03\jre\bin\classpath-classpath D:\workspaces\general\lib\bla.bo-0.0.1.jar; ... D:\workspaces\general\lib\... jar.0.5.5

In this line, the classpath of wscompile is printed.

The build script uses a configured Apache Ant in version 1.6.5. I tried to start it with Java versions 1.4.2 and 1.6.0_10. Both work in Europa (Eclipse 3.3.x) but don’t work in Ganymede (Eclipse 3.4.x), except for the first run. It seems that Ganymede handles Ant scripts differently: every first start of an Ant build script produces a new “External Tool Configuration” (if not already there). If this configuration already exists, the wscompile task doesn’t work!

This means my build script with the wscompile task works only once: each first time after deleting the “External Tool Configuration”. I could live with that if I didn’t need that configuration, but I need it to use a different Java version than the workspace default.

Does anyone know how to fix that?
Here is my task definition:

<taskdef name="wscompile" 
		classname="com.sun.xml.rpc.tools.ant.Wscompile"
		classpathref="class.path.jwsdp"
		/>

and here is the task usage in the script:

<wscompile fork="true" import="true" base="java/class" sourceBase="java/generated" verbose="true" features="documentliteral, wsi, searchschema, serializeinterfaces, explicitcontext" mapping="java/generated/META-INF/jaxrpc-mapping.xml" config="metadata/wsdl-config.xml" xSerializable="true">
	<classpath>
		<path refid="class.path.local" />
		<path refid="class.path.ant" />
		<pathelement path="${java.class.path}" />
	</classpath>
</wscompile>

Comments are welcome.

On the 10th of November it was time again: an Eclipse Demo Camp took place in the East Hotel in Hamburg, Germany. This time, the Demo Camp was sponsored by Itemis, it-agile, froglogic and of course the Eclipse Foundation. The organizers of the evening were Peter Friese (Itemis) and Martin Lippert (it-agile), who introduced the presenters.

Harald Wellmann of Innovative Systems GmbH (Harman/Becker Automotive Systems) talked about “Europe on a Disk – Geoprocessing for Car Navigation Systems”. He described how they use Eclipse and OSGi to build their map compiler on top of these technologies, and explained the benefits and drawbacks of this approach. Additionally, he talked about JUMP and uDig, which are used for displaying maps in the Eclipse Map Processing Toolkit. Apart from the technical point of view, the talk gave an interesting little insight into how the maps for our beloved navigation systems are created.

The second talk was given by Gerd Wütherich (independent consultant) and was about “Spring Dynamic Modules for OSGi Service Platforms”. He demonstrated how to use Spring to harness the power of OSGi’s dynamic lifecycle in enterprise applications, showing some small demos along the way. In his order service example, two persistence services were available and one went “offline”, so the other one jumped in to take over. Once the second service went down too, the application waited (with a timeout) until some persistence service became available again. As “a world in a nutshell”, this was a great demo of how to use dynamic modules.

After the second talk was a little break with italian food. (Which I did not try, so I will not comment on it, but it looked delicious.)

Miguel Garcia (TUHH) and Rakesh Prithiviraj talked about “Rethinking the Architecture of Java ORM in terms of LINQ”. This session basically covered what we (Java developers) could learn from .NET. As far as I understood, LINQ (Language INtegrated Query) lets you write queries in the programming language itself which are translated into queries against a specific underlying datasource. Visual Studio seems to provide good support for these kinds of queries, including content assist; Java, on the other hand, seems to struggle to provide comparable support. The talk covered ideas on how to at least get close, if not catch up. I honestly do not understand why such an innovative mechanism as LINQ was not introduced in Java much earlier. (Slides of the two talks)

The last talk, given by Stephan Herrmann (TU Berlin), discussed “Plugin reuse and adaptation with Object Teams: Don’t settle for a compromise!”. This was basically an introduction to Object Teams, a language extension to Java which has been developed over the past seven to eight years at the TU Berlin. The extension does not only cover the fundamental language aspects but comes with complete Eclipse tool support: content assist, debugger and, finally, the compiler. Object Teams provides something Stephan explained as inheritance on the object level (instead of on the class level). It provides the ability to modify objects (instances, not classes!) with additional behavior, so it is possible to adapt classes and change their runtime behavior with so-called Role Classes. On the method level, roles can be applied in a call-in or call-out fashion, depending on when they have to be invoked. From the point of view of software engineering and language design, this was a very interesting talk. (For more information refer to ObjectTeams; the slides are online.)

And by the end of this talk it was past 23:00 (we had started at 19:00). However, seeing many familiar faces and having pleasant conversations, together with great presentations, made it worth staying up late.