Saturday, November 18, 2006

More legacy bean mapping strategies with Spring

My last post covered some basic mapping strategies for accessing legacy beans from within Spring's bean context. This post covers three more strategies to expose legacy beans for which the mapping may not be as evident. In all these cases, wrapper code may need to be written or existing code may need to be slightly modified to expose these beans in Spring.

Exposing legacy configuration

The configuration information in the legacy web application is stored in an external directory in a bunch of properties files. By external directory, I mean that it is not in WEB-INF/classes, where you would normally expect it to be. Even though you may cringe at the thought (I know I did, when I first looked at it), there are a number of benefits. First, changing properties is easier for operations folks to do, since the WAR file does not need to be rebuilt, although the application does need to be bounced for the new configuration to take effect. Second, properties can be reused across other web and standalone applications, resulting in less duplication and creating something of an enterprise level configuration. The downside, of course, is that you need a custom strategy to load and customize properties per environment, rather than use Spring's PropertyPlaceholderConfigurer or Maven's environment based filtering that I wrote about earlier.

Properties in the legacy web application are exposed through a Config bean, which supports static calls such as this:

  String mypropertyValue = Config.getConfig("myproperty").get("key");

This call loads the property file myproperty.properties from the specified external directory (passed to the application as a system property), if it has not already been loaded in a previous invocation, and returns the value for the property named "key". The properties file itself looks something like this:

# myproperty.properties
key=value

My objective was to have this value exposed through Spring's PropertyPlaceholderConfigurer in the Spring bean context as ${myproperty.key}. I considered building a custom configurer by extending the PropertyPlaceholderConfigurer, but then I found a JIRA post on Atlassian that discussed strategies to expose configuration specified with Jakarta Commons Configuration, one of which I repurposed for my use.

Basically, what I ended up doing was creating a ConfigPropertiesExtractor class which iterates through all the properties files in the external directory and loads them into a single Properties object. The key for each property is the original key, prefixed by the basename of the properties file it came from. Once this was done, I could pass the properties to the PropertyPlaceholderConfigurer by invoking the getProperties() method on the ConfigPropertiesExtractor class. The Spring configuration for the PropertyPlaceholderConfigurer is shown below:

  <bean id="configPropertiesExtractor" class="com.mycompany.util.ConfigPropertiesExtractor">
    <property name="configDir" value="/path/to/external/config/directory" />
  </bean>

  <bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="properties">
      <bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
        <property name="targetObject">
          <ref local="configPropertiesExtractor" />
        </property>
        <property name="targetMethod">
          <value>getProperties</value>
        </property>
      </bean>
    </property>
  </bean>

The code for the ConfigPropertiesExtractor bean is shown below. It is itself a Spring bean, configured with the external directory name. It calls the legacy Config bean to get the properties and builds a consolidated Properties object, which is then injected into the PropertyPlaceholderConfigurer bean. From this point on, all properties can be referenced in the ${property_file_basename.property_key} format within the rest of the context.

// imports assume Log4J for the Logger and Jakarta Commons IO for FilenameUtils;
// the import for the legacy Config class is omitted since its package is not shown here
import java.io.File;
import java.io.FileFilter;
import java.util.Iterator;
import java.util.Properties;

import org.apache.commons.io.FilenameUtils;
import org.apache.log4j.Logger;

public class ConfigPropertiesExtractor {

  private static final Logger LOGGER = Logger.getLogger(ConfigPropertiesExtractor.class);

  public ConfigPropertiesExtractor() {
    super();
  }

  public void setConfigDir(String configDir) {
    Config.setConfigDir(configDir);
  }

  public Properties getProperties() throws Exception {
    Properties props = new Properties();
    File configDir = new File(Config.getConfigDir());
    if ((! configDir.exists()) || (! configDir.isDirectory())) {
      LOGGER.error("Config dir:[" + configDir.getAbsolutePath() + "] does not exist or is not a directory");
      return props;
    }
    // find all the properties files in the external config directory
    File[] cfFiles = configDir.listFiles(new FileFilter() {
      public boolean accept(File pathname) {
        return (pathname.getName().endsWith(".properties"));
      }
    });
    // load each file through the legacy Config bean, copying its entries into
    // the consolidated Properties object with the file basename as the key prefix
    for (File cfFile : cfFiles) {
      String prefix = FilenameUtils.getBaseName(cfFile.getName());
      Properties cfProps = Config.getConfig(prefix).getAll();
      for (Iterator it = cfProps.keySet().iterator(); it.hasNext();) {
        String key = (String) it.next();
        String value = cfProps.getProperty(key);
        props.setProperty(prefix + "." + key, value);
      }
    }
    return props;
  }
}
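
With this in place, any value from the external properties files can be referenced as a placeholder elsewhere in the context. For example, the key from myproperty.properties shown earlier could be injected into a (hypothetical) bean like this:

  <bean id="someService" class="com.mycompany.service.SomeService">
    <property name="key" value="${myproperty.key}" />
  </bean>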

Exposing a predefined DataSource

The legacy application was based on JDBC, so there was already a class that built and returned a Connection object from a pool. The DBA had spent considerable effort optimizing the connection pool for our environment, so it made sense to reuse those optimizations. One approach would have been to build our own DriverManagerDataSource using the exact same optimized configuration. The disadvantage of this approach is that the DBA would have to maintain identical information in two different places, or the developers would have to continuously play catch-up with every change. A second approach would have been to add an extra method to the class to return a DataSource object instead of a Connection (since Spring's JdbcTemplate needs a DataSource to be built). We went with the second approach. The extra code to return a DataSource was minimal, since getConnection() was already implemented as a call to DataSource.getConnection().

public class DbConnectionManager {
  ...
  // expose the internally managed, pre-tuned DataSource directly
  public static DataSource getDataSource() throws Exception {
    return _dataSource;
  }
  ...
}

The configuration is shorter than the standard one for a DriverManagerDataSource: it is just a call to a static factory method on the existing class.

  <bean id="dataSource" class="com.mycompany.util.db.DbConnectionManager" 
      factory-method="getDataSource" />
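
Once the dataSource bean is defined, it can be injected anywhere Spring expects a DataSource, for example into a JdbcTemplate. A minimal sketch, where the DAO bean and its class name are hypothetical:

  <bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <constructor-arg ref="dataSource" />
  </bean>

  <bean id="myDao" class="com.mycompany.dao.MyDao">
    <property name="jdbcTemplate" ref="jdbcTemplate" />
  </bean>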

Sometimes the legacy database ConnectionManager does not reference a DataSource object. This was the case with another third-party application, which built the Connection using traditional DriverManager calls, relying on the database driver's pooling capabilities. My solution in that case was to build a DataSource wrapper implementation whose getConnection() method delegates to the ConnectionManager's getConnection() method. Obviously, the other required methods need to have sensible defaults as well.

import java.sql.Connection;

import javax.sql.DataSource;

public class MyDataSource implements DataSource {

  private DbConnectionManager connectionManager;

  public void setConnectionManager(DbConnectionManager connectionManager) {
    this.connectionManager = connectionManager;
  }

  // delegate to the legacy ConnectionManager for the actual Connection
  public Connection getConnection() {
    return connectionManager.getConnection();
  }

  // other methods of DataSource (getLogWriter(), setLoginTimeout(), etc.)
  // implemented with sensible defaults
  ...
}

And the configuration for this would go something like this:

  <bean id="dataSource" class="com.mycompany.util.db.MyDataSource">
    <property name="connectionManager" ref="dbConnectionManager" />
  </bean>

Accessing objects from a factory

This arose out of a desire to remove some boilerplate code from my own pre-Spring code. The code parsed XML using the DOM parser. The usual pattern for parsing an XML file is as follows:

...
  public void doSomethingWithXmlFile() throws Exception {
    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
    // set properties for the factory
    dbf.setValidating(false);
    dbf.setIgnoringElementContentWhitespace(true);
    DocumentBuilder builder = dbf.newDocumentBuilder();
    // set properties for the builder
    builder.setEntityResolver(myEntityResolver);
    // finally parse the XML to get our Document object
    Document doc = builder.parse(xmlFile);
    ...
  }
...

I wanted to just pass a pre-built DocumentBuilder object to the class, and be done with the boilerplate code on top. I achieved this with the following configuration:

  <bean id="documentBuilderFactory" class="javax.xml.parsers.DocumentBuilderFactory" factory-method="newInstance">
    <property name="validating" value="false" />
    <property name="ignoringElementContentWhitespace" value="true" />
  </bean>

  <bean id="documentBuilder" class="javax.xml.parsers.DocumentBuilder"
      factory-bean="documentBuilderFactory" factory-method="newDocumentBuilder">
    <property name="entityResolver" ref="myEntityResolver" />
  </bean>
  ...
  <!-- documentBuilder can now be referenced in a bean definition -->

and the resulting code after moving the boilerplate out to Spring looked like this:

...
  private DocumentBuilder documentBuilder;

  // the setter for Spring
  public void setDocumentBuilder(DocumentBuilder documentBuilder) {
    this.documentBuilder = documentBuilder;
  }

  public void doSomethingWithXmlFile() throws Exception {
    Document doc = documentBuilder.parse(xmlFile);
    ...
  }
...
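
The last step is wiring the pre-built documentBuilder into the bean that does the parsing; a sketch, with a hypothetical bean id and class name:

  <bean id="xmlFileProcessor" class="com.mycompany.xml.XmlFileProcessor">
    <property name="documentBuilder" ref="documentBuilder" />
  </bean>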

All three of the mapping strategies described here are fairly involved, and the solutions are not readily apparent. However, the XML metalanguage provided by Spring to configure beans is quite powerful and has lots of features. The power of the metalanguage becomes most evident when one has to expose legacy beans, rather than beans which are already exposable using Spring's standard setter injection. As I dig deeper into the legacy code and have to interface with more legacy beans, I am sure I will come across more complex situations, and I will share the solutions if I think they are useful.

Sunday, November 05, 2006

Legacy Bean wrapping strategies with Spring

So far, my experience with Spring had been with new web applications, so it was easy enough to get up and running using Thomas Risberg's Spring MVC, Step by Step tutorial. However, what I wanted to do now was to introduce Spring into an existing web application built with a home-grown framework based on the JSP-as-Handler pattern described on page 337 of Martin Fowler's Patterns of Enterprise Application Architecture.

The idea is to continue with the existing framework for features which are already live, and use the Spring MVC framework and its IoC-style configuration for new features in the web application. New features should be able to reuse, as far as possible, the business logic that has been codified in the Java classes of the old framework. We could have taken the path of least resistance and just built new features with the existing framework, but we chose Spring because it is easier, more powerful and more intuitive to use, and is thus likely to enhance productivity.

This article will detail some strategies I used to wrap or reference existing (or "legacy") beans in the Spring applicationContext configuration file. The list here is quite short, and reflects only the things I have had to do so far. If you can think of other ideas or situations, please post them in the comments or point me to links, and I will include them.

Configure only what you need

I was dreading the exercise of having to figure out and configure whole stacks of beans before I started this, but then I realized that our new Spring controllers only need to declare references to the service beans they need, i.e. to the next level down in the abstraction layer. Since the existing framework did not depend on any sort of property injection, the service beans know how to populate and instantiate themselves with DAO and other lower-level beans without any help from a bean factory. The lower-level beans typically use the Locator pattern to locate configuration information from a registry.

It may be worth moving these to Spring later and dispensing with the registry altogether in favor of the Spring application context, but that can be done in stages.
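
As a (hypothetical) sketch of what this means in practice, a new controller declares only its immediate service dependency, and the service bean itself needs no properties because it knows how to bootstrap itself:

  <!-- legacy service bean: no properties needed, it bootstraps itself -->
  <bean id="orderService" class="com.mycompany.service.OrderService" />

  <bean id="orderController" class="com.mycompany.web.OrderController">
    <property name="orderService" ref="orderService" />
  </bean>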

Static Singletons

By default, Spring builds beans and sets the declared dependencies at application startup. Unless told otherwise (with the singleton="false" attribute), what you get is a reference to a singleton. If the singleton bean has a no-args constructor, then it can be configured in the context as a bean. If the singleton bean has an explicit constructor, you can use constructor injection to build a reference to the bean, and then pass the bean into your new Spring beans as a property.

It is tempting to just make a static call to the bean from within your Spring beans, but then you are not using IoC anymore.
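
If the legacy singleton is only reachable through a static accessor such as getInstance(), Spring's factory-method attribute lets the container obtain the instance, so your own code stays free of static calls. A minimal sketch, with hypothetical class and bean names:

  <bean id="legacyCache" class="com.mycompany.legacy.LegacyCache"
      factory-method="getInstance" />

  <bean id="reportService" class="com.mycompany.service.ReportService">
    <property name="cache" ref="legacyCache" />
  </bean>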

Beans with static initializers

These can generally be used as is, since the static block runs the first time the class is loaded, which happens when the Spring BeanFactory instantiates the bean. However, it may be worth moving the static block out into an init() method on the bean, and letting Spring call init() at application startup, after the bean has been fully populated with its configured setter properties, by specifying the init-method="init" attribute in the bean definition.
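
A minimal sketch of what this looks like, assuming a hypothetical legacy bean whose former static initializer has been moved into init():

  <bean id="legacyCodeTable" class="com.mycompany.legacy.CodeTable" init-method="init">
    <property name="codeFile" value="/path/to/codes.properties" />
  </bean>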

Beans configured via constructors

There are usually two approaches to IoC, one constructor based and the other setter based. There are valid arguments on both sides, but it is possible to declare beans using both constructor based and setter based injection in Spring. So if a bean needs to be built using a non-default constructor, then this can be done from within Spring.
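
A minimal sketch, with a hypothetical legacy class that takes its host name through a constructor and a timeout through a setter, showing both injection styles in one bean definition:

  <bean id="legacyMailer" class="com.mycompany.legacy.Mailer">
    <constructor-arg value="smtp.mycompany.com" />
    <property name="timeout" value="30" />
  </bean>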

Resources

Spring is very well documented, and Chapter 3 of the Spring Framework reference documentation was very helpful when trying to come up with the appropriate wrapping strategy.