
Part II: A Simple and Smart Result Mapper

I am a big fan of Spring JDBC. The reasons are simple:
  • Full control over SQL. I can optimize, tweak and tune it at will, and think in terms of tables rather than objects.
  • Cuts down my DAO code significantly.
  • Takes care of all the boilerplate code.
  • Integrates well with the other parts of my application, which mostly use other Spring components.
  • Simple and easy to set up and get running quickly.
  • Minimal learning curve.
One area where the Spring framework does not do a good job is list/finder queries that return an arbitrary number of records. Let's say you want to find the employees in a certain department. The query is executed and you get back a list of maps, but the business tier expects a list of domain objects. So you loop through the list of maps returned by Spring JDBC, create a JavaBean for each record, populate each field and add the bean to another list, which is finally returned. That is a lot of boilerplate. Look at the example code below, which fetches all users in my system.
    public List<User> listUsers() {
        String SQL_LIST_USERS = "SELECT first_name,last_name,email,user_id FROM t_user";
        List<Map<String,Object>> users = this.getSimpleJdbcTemplate().queryForList(SQL_LIST_USERS, new HashMap<String, Object>());
        List<User> userList = new ArrayList<User>();
        User userHolder = null;
        for(Map<String,Object> user : users) {
            userHolder = new User();
            userHolder.setFirstName((String)user.get("first_name"));
            userHolder.setLastName((String)user.get("last_name"));
            userHolder.setUserCode((String)user.get("email"));
            userHolder.setUserId((Integer)user.get("user_id"));
            userList.add(userHolder);
        }
        return userList;
    }
There is significant redundant code here, and if you look carefully, a column name change will quietly break it. You could move the mapping into a row mapper (sketched below), but then you still have to write the same mapping code and add a new callback class for every such list method.
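For comparison, here is a rough sketch of that row mapper approach. The class name UserRowMapper is my own illustration; the field mapping matches the example above.

import java.sql.ResultSet;
import java.sql.SQLException;
import org.springframework.jdbc.core.simple.ParameterizedRowMapper;

public class UserRowMapper implements ParameterizedRowMapper<User> {
    public User mapRow(ResultSet rs, int rowNum) throws SQLException {
        // Same column-name coupling as before, only relocated into a dedicated callback class.
        User user = new User();
        user.setFirstName(rs.getString("first_name"));
        user.setLastName(rs.getString("last_name"));
        user.setUserCode(rs.getString("email"));
        user.setUserId(rs.getInt("user_id"));
        return user;
    }
}

The DAO method then shrinks to a single call such as this.getSimpleJdbcTemplate().query(sql, new UserRowMapper()), but the per-entity mapper class remains.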
So I looked for a data mapper. The best-known solution in this space is iBatis. It is much simpler than a full ORM, has a smaller learning curve and deliberately limits what it tries to do. While playing with it, though, I was once again struck by XML metadata hell - this time mapping each SQL statement to a different result map, which adds up to a lot of XML to maintain in a decently large project. It also hits the database multiple times when I need to populate object graphs. On top of that, my attempts to integrate iBatis 3 with Maven 2 and Spring 3 failed with a ClassNotFoundException, so my journey with iBatis ended rather abruptly. I was looking for a smarter solution: trivially easy to configure, very simple, with no learning curve, efficient and intuitive.
The solution was a combination of the Spring framework, a bit of SQL trickery and an external, highly efficient bean manipulation library called JODD - http://jodd.org/doc/beanutil.html
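To give a flavour of why this BeanUtil fits the job, here is a minimal sketch of setting a bean property by its name. It assumes the classic static BeanUtil API from Jodd's bean utilities; exact method names can vary between Jodd versions.

import jodd.bean.BeanUtil;

User user = new User();
// Set a property by name - no hand-written setter call per column.
BeanUtil.setProperty(user, "firstName", "John");
// The property name is just a string, so it can come straight from result set metadata.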
As a first step towards building this simple and smart data mapper, I want to externalize my SQL statements. So I introduce an abstract DAO support class that loads the externalized SQL from a Spring configuration file. Note that I am not introducing any new XML format for this - that would lead to the same problems as iBatis. Instead it is better to stick to what I know best. Here is the first version of this abstract class.
Listing – AbstractBaseDaoSupport.java
package net.sf.webplug.spring.dao;  
import java.util.Map;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.jdbc.core.simple.SimpleJdbcDaoSupport;  
/**
* @author dhrubo
*
*/
public abstract class AbstractBaseDaoSupport extends SimpleJdbcDaoSupport{
    private Logger logger = LoggerFactory.getLogger(AbstractBaseDaoSupport.class);
    private Map<String, String> sqlMap;  
    /**
     * @param sqlMap the sqlMap to set
     */
    public void setSqlMap(Map<String, String> sqlMap) {
        this.sqlMap = sqlMap;
    }
    public String getSql(String id) {
        return this.sqlMap.get(id);
    }
}
Now I change my Dao class to use this abstract class instead.
Listing – UserDaoImpl.java
package net.sf.dms.security.dao.impl;  
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;  
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import net.sf.webplug.spring.dao.AbstractBaseDaoSupport;
import net.sf.dms.security.dao.api.UserDao;
import net.sf.dms.security.domain.User;  
/**
* @author dhrubo
* 
*/
public class UserDaoImpl extends AbstractBaseDaoSupport implements UserDao {  
    private Logger logger = LoggerFactory.getLogger(UserDaoImpl.class);  
    @Override
    public List<User> listUsers() {
        String sql = this.getSql("listUsers");
        List<Map<String,Object>> users = this.getSimpleJdbcTemplate().queryForList(sql, new HashMap<String, Object>());
        List<User> userList = new ArrayList<User>();
        User userHolder = null;
        for(Map<String,Object> user : users) {
            userHolder = new User();
            userHolder.setFirstName((String)user.get("first_name"));
            userHolder.setLastName((String)user.get("last_name"));
            userHolder.setUserCode((String)user.get("email"));
            userHolder.setUserId((Integer)user.get("user_id"));
            userList.add(userHolder);
        }
        return userList;
    }  
} 
The SQL is now moved to a Spring config file (a separate one, just for SQL statements).
Listing – sql-repository.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">  
    <bean id="secSqlMap" class="org.springframework.beans.factory.config.MapFactoryBean">
        <property name="sourceMap">
            <map>
                <entry key="listUsers">
                    <value>
                        <![CDATA[
                            SELECT first_name ,last_name ,email ,user_id  FROM t_user
                        ]]>
                    </value>
                </entry>
                
            </map>
        </property>
    </bean>  
</beans> 
Now that I have externalized the SQL, it's time to look at the main configuration file. There is something interesting here too.
Listing – applicationContext.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
    <import resource="sql-repository.xml"/>
       <bean id="secDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
        <property name="driverClassName" value="org.postgresql.Driver"/>
        <property name="url" value="jdbc:postgresql://localhost:5432/dms"/>
        <property name="username" value="postgres"/>
        <property name="password" value="postgres"/>
    </bean>
    <bean id="abstractSecBaseDaoSupport" class="net.sf.spring.dao.AbstractBaseDaoSupport" abstract="true">
            <property name="dataSource" ref="secDataSource"/>
            <property name="sqlMap" ref="secSqlMap"/>
    </bean>
    <bean id="userDao" class="net.sf.dms.security.dao.impl.UserDaoImpl" parent="abstractSecBaseDaoSupport" />
    <bean id="roleDao" class="net.sf.dms.security.dao.impl.RoleDaoImpl" parent="abstractSecBaseDaoSupport"/>
    <bean id="userRoleDao" class="net.sf.dms.security.dao.impl.UserRoleDaoImpl" parent="abstractSecBaseDaoSupport"/>
</beans> 
Note the abstract bean definition: I no longer need to configure the data source property on each DAO bean. This concludes my first clean-up step. In the next post I will show how to extend the abstract class to provide the mapper functions; a rough sketch of the direction follows.
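As a teaser, here is my own illustrative sketch of the kind of generic mapping method the abstract class could expose. This is not the final implementation; it assumes jodd.bean.BeanUtil is on the classpath and that each column is aliased in the SQL to match the bean property name (for example SELECT first_name AS firstName ...).

    protected <T> List<T> mapList(List<Map<String, Object>> rows, Class<T> type) {
        List<T> result = new ArrayList<T>();
        try {
            for (Map<String, Object> row : rows) {
                T bean = type.newInstance();
                for (Map.Entry<String, Object> column : row.entrySet()) {
                    // The column label doubles as the bean property name thanks to the SQL alias.
                    BeanUtil.setProperty(bean, column.getKey(), column.getValue());
                }
                result.add(bean);
            }
        } catch (Exception e) {
            throw new RuntimeException("Could not map rows to " + type.getName(), e);
        }
        return result;
    }

With something like this in place, listUsers() collapses to a query followed by a single mapList(users, User.class) call.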

Comments

  1. The conversion to a list of users would be more elegant if done with a RowMapper or ParameterizedRowMapper

  2. The intention here is not to create such small classes for mapping. These small classes are hazards for long-term maintenance. Instead, the goal is to devise a generic framework that does the mapping.

