Wednesday, May 31, 2017

How to Stand up a Spring Cloud Config Server?


Setup and Configure Spring Cloud Config Server Project

Spring Cloud Config Server is just another Spring Boot application. It is an infrastructure microservice that centralizes access to configuration information, backed by a version-controlled repository (at least in the case of the default Git storage).

Step 1 - Create a Spring Boot project in STS with the dependencies shown in Figure 2.

Figure 1 - Creating Spring Boot project to setup Spring Cloud Config Server


Figure 2 - Spring Cloud Config Server dependencies
Click on 'Finish' to complete the creation of the Spring Boot project in STS.
The build.gradle file is shown in the listing below. There is only one dependency, on Spring Cloud Config Server, and the Spring Cloud release train used is 'Dalston.SR1'.
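The embedded build listing is not reproduced here; a minimal sketch of the relevant build.gradle pieces (the Spring Boot and dependency-management plugins generated by STS are assumed and omitted) could look like this:

ext {
    springCloudVersion = 'Dalston.SR1'
}

dependencies {
    compile 'org.springframework.cloud:spring-cloud-config-server'
}

dependencyManagement {
    imports {
        mavenBom "org.springframework.cloud:spring-cloud-dependencies:${springCloudVersion}"
    }
}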
Step 2 - Annotate the class containing main method
The next step is to annotate the ConfigServerInfraApplication class with @EnableConfigServer.
That's all that is needed on the Java side to start the configuration server.
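The annotated class is not reproduced here; a minimal sketch (package name assumed) looks like this:

package com.example.configserver; // package name is an assumption

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.config.server.EnableConfigServer;

@EnableConfigServer // turns this Spring Boot application into a config server
@SpringBootApplication
public class ConfigServerInfraApplication {

    public static void main(String[] args) {
        SpringApplication.run(ConfigServerInfraApplication.class, args);
    }
}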


Step 3 - Final step, create the configuration file.
Finally, create a file named bootstrap.yml and configure it as shown in listing below.
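The embedded configuration listing is not reproduced here; a minimal bootstrap.yml matching the notes below might look like this (the Git URI is assumed to be the playground repository referenced at the end of this post):

spring:
  application:
    name: config-service
  cloud:
    config:
      server:
        git:
          uri: https://github.com/kdhrubo/playground   # assumed repository URI
          search-paths: config-store-infra

server:
  port: 8888

management:
  security:
    enabled: false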


There are a few things to note in the configuration.
  • The config server application has a name - 'config-service'
  • The config server uses GitHub (a public Git repository) to store the configuration information.
  • The configuration is actually stored in the folder 'config-store-infra', specified in the property 'search-paths'. This property accepts a list of folders; for the sake of simplicity, I have configured just one folder in this example. However, it is a good practice to use one folder per application/microservice.
  • For the Git repository, it is possible to add a username and password to provide secure access to the GitHub repository. That too has been omitted here to keep the example simple. For more details, I recommend reading the Spring Cloud Config Server documentation.
  • The config server runs on port 8888.
  • Management security on the config server is disabled.
Now run the ConfigServerInfraApplication class as a Spring Boot application from STS and it should start the Spring Cloud Config Server.

Creating the config-store
The config server needs data in Git (GitHub in this case) to share with the applications. To set that up, create a simple project in STS using the steps below.

1> Click on 'File' -> 'New' --> 'Project'.
2> In the New Project Wizard expand the node 'General' and select 'Project' (Figure 3) and click 'Next.'

Figure 3 - Select Project

3>  In the New Project dialog, type the name of the project and click on 'Finish'.


Figure 4 - Add project name and finish
4> Create three different files for different environments (development, staging, and production) as follows

  • lead-service-DEV.yml
  • lead-service-STAGE.yml
  • lead-service-PROD.yml

5> For testing purposes, add a simple property to the lead-service-DEV.yml file.
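For example, a hypothetical property could be:

lead-service:
  message: Hello from the DEV configuration   # hypothetical property, used only to verify the config server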

6> Finally, commit and push the config-store-infra project to the GitHub repository.

The source code for this post is also available at the GitHub location provided below.

Source Code

https://github.com/kdhrubo/playground

What's Next?

In the next post, I will show how to quickly test the Spring Cloud Config Server using a browser. Then I will review the URL patterns used by the Spring Cloud Config Server. I will then change the lead-backend microservice to use the config server.

Tuesday, May 30, 2017

Upgrading Lead Microservice - Use MariaDB and Flyway with Spring Boot



So far I have been using an in-memory H2 database or Mockito for testing the lead microservice. To make the transition towards using the Spring Cloud Config Server, I need to upgrade the micro-application to use MariaDB. I will be adding the configuration in the application.yml file, which in a subsequent post will move over to the config server store. I will also be using Flyway to make it easy to maintain database schema changes in the future, so I will use this post to introduce Flyway into the mix. Spring Boot provides first-class integration with Flyway. I am using Flyway as it is really quick and easy to get started with, has a minimal learning curve (no DSL), and I am comfortable with it, having used it in the past.

Assumptions

  1. MariaDB 10 is installed
  2. Basic familiarity with Flyway
  3. The HeidiSQL client is installed.

Step 1 - Update build.gradle to include the MariaDB JDBC and Flyway dependencies.
Do not forget to do a Gradle refresh on your IDE (I am using STS 3.8.4 on Java 8)
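The embedded listing is not shown here; the dependency additions might look roughly like this (Spring Boot's dependency management supplies the versions; pin them explicitly if it does not):

dependencies {
    compile 'org.mariadb.jdbc:mariadb-java-client'   // MariaDB JDBC driver
    compile 'org.flywaydb:flyway-core'               // Flyway database migrations
}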

Step 2 - Rename the application.properties to application.yml and add the properties shown in listing below.

The lead backend service will run on port 8080. The application now has a name - 'lead-service'. This will be required by the configuration server later. I have also set up the MariaDB data source with the default Tomcat connection pool (recommended in the Spring Boot documentation; it is very robust and I do not want to introduce another connection pool dependency like HikariCP or BoneCP). I have added the dialect configuration and the default schema name for Hibernate/JPA. Since I want to use the database defaults, Flyway will connect to the test schema, and the Flyway-managed DDL script will create the schema for the lead backend as 'lead_db'.
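The embedded listing is not reproduced here; a rough sketch of an application.yml with the properties described above might look like the following (the JDBC URL, credentials and exact Hibernate dialect class are assumptions - adjust them to your setup):

server:
  port: 8080

spring:
  application:
    name: lead-service
  datasource:
    url: jdbc:mariadb://localhost:3306/test   # Flyway connects to the default 'test' schema
    username: root                            # placeholder credentials
    password: secret
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.MySQL5InnoDBDialect   # assumed MariaDB-compatible dialect
        default_schema: lead_db

# Flyway needs no explicit configuration here; it picks up the data source and
# runs the scripts under src/main/resources/db/migration by default.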
Step 3 - Create the DDL scripts for Flyway.
The next step is to create the DDL scripts for Flyway. The scripts should be stored with the lead-backend codebase under the src/main/resources/db/migration folder. The file that creates the database schema for the first time is named 'V1.0__init.sql'. Note that there are two underscores between 'V1.0' and 'init'.
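The actual migration script is in the embedded listing; a hypothetical V1.0__init.sql might start like this (the table and its columns are purely illustrative):

CREATE SCHEMA IF NOT EXISTS lead_db;

CREATE TABLE lead_db.`lead` (
    id         BIGINT AUTO_INCREMENT PRIMARY KEY,
    first_name VARCHAR(100),
    last_name  VARCHAR(100),
    deleted    BIT DEFAULT 0
);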

Step 4 - Test
Check the Tomcat console logs to verify that Flyway and JPA worked without any issues.
Figure 1 - Checking the console for Flyway and JPA


Finally, check the database with HeidiSQL to confirm that the Flyway metadata table and the 'lead_db' schema tables were created by Flyway.

Figure 2 - HeidiSQL view

Sunday, May 28, 2017

Why do you need Spring Cloud Config server?

Last month I wrote a primer on the concepts around the 12-factor app. Before getting into the details of the Spring Cloud Config Server, let me revisit principle #3 from the list presented in that post.

3 – Configuration
Store config in the environment

Configuration information must be separate from the source code. This may seem obvious, but often we are guilty of leaving critical configuration parameters scattered in the code. Instead, applications should have environment-specific configuration files. Sensitive information like database passwords or API keys should be stored in these environment configuration files in encrypted format.

 The key takeaways from this postulate for a cloud-native microservices application are:
  1. Do not store configuration as part of the deployable unit (in the case of lead microservice - inside the jar or war if you are still deploying war like the good old days). Instead, store it in an external location and make it easily accessible during run-time. 
  2. Configuration files should be separated based on the environment where the microservice is going to run. For example - it is a common practice to maintain environment-specific configuration files like "DEVELOPMENT", "TEST","STAGING", "PRODUCTION" etc.
  3. It is a very common practice to hard code sensitive information like database user IDs and passwords in plain text. It is advised to store such information in environment-specific files, but in an encrypted format. 
  4. A change in configuration should not require the application/service to go through the entire build, test, release and upgrade cycle. 
So, you may think that you will have to develop a server based configuration management system. Wait, we already have a solution. Spring Cloud Config Server solves the problems described above and more. Essentially Spring Cloud Config Server is an infrastructure microservice accessible over HTTP. 

Key Features of Spring Cloud Config Server 
  1. Enables centralized configuration management for all environments and different applications.
  2. Provides server based configuration management accessible over HTTP/HTTPS
  3. Configuration information is stored in repositories managed by the config server. 
  4. Provides client-side support for applications to access the configuration properties at startup and cache them. 
  5. Configuration properties can be version controlled depending on the underlying repository support. 
  6. Any changes to configuration properties/values can be propagated to all the client applications. The client side support allows these changes to be applied transparently / refreshed without the need to restart the application again. 
  7. Confidential information can be encrypted. 
  8. Maps to Spring core concepts of Environment, PropertySource, Profile and Value. Thus it is easy to use in Spring applications and microservices.
  9. Facilitates continuous delivery pipelines by supporting configuration for different environments.
  10. It can be used by applications written in any language, as the config server is nothing more than a REST endpoint serving configuration managed by an underlying repository. For example, .NET clients can also use Spring Cloud Config Server (more details can be found here - https://steeltoe.io/). 
  11. Supports Git as the primary storage repository for configuration. However, other repositories like the file system and HashiCorp Vault are also supported out of the box. Support for MongoDB is in incubation as of this writing. 
  12. Monitoring of the config server is also possible. 
  13. Easy to configure and launch.
  14. Can be easily containerized. 
Limitations of Spring Cloud Config Server
  1. Properties are not cached on the server side.
  2. Each request leads to calls to the backing repository which can lead to multiple remote calls.
  3. High availability and failover features are limited. 
  4. Dynamic update of configuration properties on the server is very cumbersome. 
Note that the config server is extremely performant unless limited by the underlying store. The benefits outweigh the concerns, and hence Spring Cloud Config Server is the recommended tool to manage configuration in the microservices ecosystem.

Spring Cloud Config Server Alternatives
  1. Commons Configuration
  2. Netflix Archaius
  3. Apache ZooKeeper
  4. Kubernetes ConfigMap
  5. Consul Configuration
Note that none of these alternatives map to the Spring Environment, PropertySource or Profile concepts. Hence they require a lot of plumbing to provide the features that Spring Cloud Config Server offers out of the box. So Spring Cloud Config Server is our tool of choice for configuration data management in the cloud-native architecture. 
Taking stock
So far I have covered only a few pieces of the Spring cloud-native application architecture jigsaw. I have only written about Spring Boot for microservices development, and now I am going to write about Spring Cloud Config Server. I will gradually cover all the boxes in Figure 1 to complete the jigsaw puzzle. 

Figure 1 - Spring Cloud Jigsaw
What's next?

This post was dedicated to some of the theory behind the Spring Cloud Config Server. In the next post I will get back to hands-on work again, set up the Spring Cloud Config Server, and test it using a simple browser-based client. Stay tuned for more exciting stuff. 


Friday, May 26, 2017

Unit Test - Microservices - Business Layer


In this post, I am going to show how to unit test the business layer of the lead microservice that I am developing. There are a couple of things to keep in mind when writing unit tests for the business layer.

  1. These tests must run very fast for quick feedback.
  2. Only the business layer is involved, and hence the web layer and repository layer should not be started. In other words, the Spring web-tier beans should not be created, nor should any database connections be used. (This helps to achieve #1.) 
  3. The repository layer will be mocked using Mockito.
  4. The fluent assert from AssertJ library will be used. 
The source code of the lead business unit test is shown in listing below.

Most of the test methods are straightforward. However, the testDelete method needs some explanation. Here the findOne method is mocked/stubbed, which ensures that the correct Lead object is returned to the business layer. The other thing to note is that the verify call on the save method captures the Lead object passed to it. Finally, that captured object is checked to verify the "id" and "deleted" attributes. This ensures that the correct object was marked for soft delete by changing the "deleted" attribute.
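The full test class is in the embedded listing; a condensed sketch of the testDelete idea (class, method and field names are assumptions based on the description above) looks roughly like this:

import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;

// Lead, LeadRepository and LeadBusinessDelegate are the (assumed) classes from this series
@RunWith(MockitoJUnitRunner.class)
public class LeadBusinessDelegateTest {

    @Mock
    private LeadRepository leadRepository;

    @InjectMocks
    private LeadBusinessDelegate leadBusinessDelegate;

    @Test
    public void testDelete() {
        Lead lead = new Lead();
        lead.setId(1L);

        // Stub findOne so the delegate receives the correct Lead instance
        when(leadRepository.findOne(1L)).thenReturn(lead);

        leadBusinessDelegate.delete(1L);

        // Capture the Lead passed to save and verify the soft-delete flag
        ArgumentCaptor<Lead> captor = ArgumentCaptor.forClass(Lead.class);
        verify(leadRepository).save(captor.capture());

        assertThat(captor.getValue().getId()).isEqualTo(1L);
        assertThat(captor.getValue().isDeleted()).isTrue();
    }
}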

I will pause the microservices test series for a couple of weeks. In the next post, I will write again about the "12-factor app" and then introduce Spring Cloud Config Server and see how we can implement one key factor of a cloud-native application. So, stay tuned.

The source code is available at - https://github.com/kdhrubo/playground/tree/develop/lead-backend

Thursday, May 25, 2017

Complete Unit tests - Microservices - Repository Layer

In the previous post, I introduced the unit tests for the repository layer of the lead microservice. In this short post, I will do some cleanup and add all the unit tests. Finally, I will replace all the JUnit asserts with AssertJ asserts. AssertJ is also recommended by the Spring champions over at Pivotal.




OK, that's it for now. In the next post I will show how to unit test the business/service layer of the lead microservice.

Sunday, May 14, 2017

Microservice - Unit Test - Repository Layer

Now that we have completed the first round of development of our lead microservice, it is time to focus on some testing. I will follow a bottom-up approach and start by writing unit tests for the repository layer.

In order to write unit tests, I will make use of the Spring Test Framework and Spring Boot test support. Spring Boot makes it extremely easy to unit test slices of the application. We can use @DataJpaTest and unit test just the database layer with an embedded database like H2. However, there is a gotcha (I do not know the exact cause at this moment): @DataJpaTest does not work if you have extended Spring Data JPA to add a custom method to all the repositories. The problem I found was that when running the Spring Boot application, everything works fine, the custom method works, and the query returns the correct results. However, when the unit test is run, Spring Data JPA fails, complaining that it is not able to find an attribute on the entity/type.

So to write unit tests for the repository of the lead microservice, I will use the standard @SpringBootTest annotation. The goal here is to unit test, and hence only beans relevant to the repository layer are created. To achieve this we have to prevent unnecessary beans from being created - for example, the controller beans. Remember, unit tests have to be cheap and must run fast. This ensures quick feedback and promotes continuous delivery principles. Note that the repository unit tests are created and run against an embedded database and hence they are very fast.

Listing 1 below shows the initial version of the repository unit test class.
There are a few important items to note in Listing 1.

  • Most of the methods are not yet implemented. They will be implemented along with future posts. 
  • @Transactional - annotation ensures that the transaction which started at the beginning of the test method is rolled back after the method completes. In other words, the test data that is created is removed. 
  • @SpringBootTest(webEnvironment=WebEnvironment.NONE) - ensures that web environment is not started in other words no controllers or web layer beans are created ensuring that the test runs fast. 
  • @DatabaseSetup("lead.xml") - this is an interesting annotation. I have added the spring-test-dbunit project to integrate DbUnit utilities with Spring Boot Test. The detailed documentation of this project can be found at this link - https://springtestdbunit.github.io/spring-test-dbunit/
A few words on spring-test-dbunit
  • This project makes it extremely easy to create test data and then clean it up.
  • @DatabaseSetup - can be added at both the class and the method level. The transaction starts with the database setup process, which inserts the data from lead.xml (in this example) into the database for easy test data setup. Once the test completes and the transaction rolls back, this data is removed. When the annotation is added at the class level, it is applied to all the test methods. It can also be added to individual test methods, in which case the test data is loaded only for that particular method. 
  • In order to load the lead.xml test data file, it must be stored under the same package name as the repository class, but in the 'src/test/resources' source folder.
  • The DbUnitTestExecutionListener must also be registered to wire in spring-test-dbunit, as shown in the snippet below.
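The embedded snippet is not shown here; registering the listeners typically looks like this (other annotations on the test class - @RunWith, @SpringBootTest, @Transactional, @DatabaseSetup - are omitted for brevity, and the listener order follows the spring-test-dbunit documentation):

import com.github.springtestdbunit.DbUnitTestExecutionListener;
import org.springframework.test.context.TestExecutionListeners;
import org.springframework.test.context.support.DependencyInjectionTestExecutionListener;
import org.springframework.test.context.support.DirtiesContextTestExecutionListener;
import org.springframework.test.context.transaction.TransactionalTestExecutionListener;

@TestExecutionListeners({
        DependencyInjectionTestExecutionListener.class,
        DirtiesContextTestExecutionListener.class,
        TransactionalTestExecutionListener.class,
        DbUnitTestExecutionListener.class })
public class LeadRepositoryTest {
    // test methods elided
}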


Figure 1 - Test code and resource folder
Figure 2 - JUnit All Green
Finally, some changes are required in the build.gradle dependency management section to include the spring-test-dbunit jars in the lead microservice project. This is shown in the snippet below.
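The embedded snippet is not shown here; the test dependencies might look like this (the versions are the ones current around the time of writing - check for newer releases):

dependencies {
    testCompile 'com.github.springtestdbunit:spring-test-dbunit:1.3.0'
    testCompile 'org.dbunit:dbunit:2.5.3'
}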


Tuesday, May 9, 2017

Lead Microservice - Add the service and controller

This post continues from the last one. I will try to complete the lead microservice in this post. I will be focusing primarily on the CRUD operations, which are the primary set of operations in CRM applications.
I will start by creating the exception class. Later I will show how to use this exception class with Spring REST exception handling to build robust error handling around the API calls.
Listing 1 - BusinessException.java

The next step is to introduce the service class. Since this is CRUD, most of the functionality can be encapsulated in an abstract parent service class. Any specific feature, like converting a lead to an opportunity, can be implemented in the LeadService class.

Listing 2 - AbstractBaseBusinessDelegate.java
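The embedded listing is not reproduced here; a rough sketch of what such a generic CRUD delegate could look like (the method names and generic signature are assumptions, and the Jodd-based property copying mentioned later in this post is omitted):

import java.io.Serializable;

import org.springframework.data.jpa.repository.JpaRepository;

public abstract class AbstractBaseBusinessDelegate<T, ID extends Serializable> {

    // Concrete delegates supply their own Spring Data repository
    protected abstract JpaRepository<T, ID> getRepository();

    public T findOne(ID id) {
        return getRepository().findOne(id);
    }

    public T save(T entity) {
        return getRepository().save(entity);
    }

    public void delete(ID id) {
        // The real implementation performs a soft delete via the 'deleted' flag
        getRepository().delete(id);
    }
}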
And then finally the LeadService implementation class is shown in Listing 3.

Listing 3 - LeadBusinessDelegate.java
Note that this code will not compile yet. In order to get it to compile, you will need to add two new dependencies for the Jodd libraries. Jodd is a popular micro-framework that provides very useful utilities and a lightweight foundation for developing Java applications.

Listing 4 - build.gradle snippet
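The embedded snippet is not shown here; the Jodd additions might look roughly like this (the exact modules and version the post uses may differ):

dependencies {
    compile 'org.jodd:jodd-core:3.8.5'
    compile 'org.jodd:jodd-bean:3.8.5'
}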


Now I am going to add the Spring REST controller. This class is responsible for handling all the microservice calls. It then delegates to the business delegate to complete the business operation. As you must have already noted, the business delegate then calls the repository to retrieve any data from the backing data store. In this series, I am using a JPA store, i.e., my code is interacting with a relational database. Later I intend to add a NoSQL store as well and show if and how they can reside together.
Just like the business delegate, most of the functionality is encapsulated in the base controller class, as shown in Listing 5.

Listing 5 - AbstractBaseController.java
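The embedded listing is not reproduced here; a rough sketch of such a generic base controller (the endpoint paths and method names are assumptions) could look like this:

import java.io.Serializable;

import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;

// Concrete controllers add @RestController and a @RequestMapping base path
public abstract class AbstractBaseController<T, ID extends Serializable> {

    // Concrete controllers supply the matching business delegate
    protected abstract AbstractBaseBusinessDelegate<T, ID> getDelegate();

    @GetMapping("/{id}")
    public T findOne(@PathVariable("id") ID id) {
        return getDelegate().findOne(id);
    }

    @PostMapping
    public T save(@RequestBody T entity) {
        return getDelegate().save(entity);
    }

    @DeleteMapping("/{id}")
    public void delete(@PathVariable("id") ID id) {
        getDelegate().delete(id);
    }
}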
And finally here is the LeadController implementation class in Listing 6.

Listing 6 - LeadController.java


That's all for now. The source code is available in the GitHub repository:
https://github.com/kdhrubo/playground.git

First Microservice - Lead Service

This is a continuation of my previous post where I listed the key domain components of CRM. For the sake of simplicity, I will focus on just two key components - Lead and Opportunity. These two modules will help me explore an end-to-end microservice implementation. I will primarily be using the Spring framework's support to deliver the microservices. I am deliberating on a nice, interesting JavaScript framework for the front end. More on this later.

Goal

For the time being, I am just going to focus on building the backend and cloud-enabling it. Then slowly I am going to add the front end, and security for both the backend and the front end. I will also explore the complete Spring Cloud gamut of projects and beyond. I will then deliberate on cloud-native applications, transactions, and actually making this application suitable to run on virtually any cloud. I will also explore cloud and microservices application design patterns. Last but not least, we also need to check where this application fits in the maturity model for the 12-factor application.



Step 1 - Create a Spring Boot quickstart project using STS.

I am using the 64-bit version of STS 3.8.4 on my laptop.
The first step is to click in the following sequence:

File -> New -> Spring Starter Project. In the dialog fill out the data as shown in Figure 1.

Figure 1 - Configure Spring Starter Project
Then click on 'Next' to fill out the final dialog as shown in Figure 2. Once the final dialog is filled out with the selections click on 'Finish'.

Figure 2 - Select the dependencies

Step 2 - Add the Lead domain class.

Let's go ahead and add the lead domain class. I am assuming some familiarity with Eclipse/STS and will not show how to create a new package or a Java class.

Listing 1 - Lead.java


The common properties are in the BaseEntity class which is shown in Listing 2.
Listing 2 - BaseEntity.java 

The Address class is embedded in the Lead class. This is shown in Listing 3.
Listing 3 - Address.java
I will end the day today by adding the Lead JPA repository.

Listing 4 - LeadRepository.java
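The repository listing is embedded in the original post; in its simplest form (assuming a Long primary key) it is just a Spring Data JPA interface:

import org.springframework.data.jpa.repository.JpaRepository;

public interface LeadRepository extends JpaRepository<Lead, Long> {
    // query methods can be added later as needed
}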
The source code is hosted on Github.
GitHub Repo - https://github.com/kdhrubo/playground.git

Saturday, May 6, 2017

Breaking down the CRM monolith

In my previous posts, I have shared some theory regarding microservices. But it's time to start some implementation. I love to write code and see and feel things working. So I will start a series to refactor a monolithic CRM system and transform it into flexible, microservices-based software.

Big ball of mud.

Customer Relationship Management (CRM) is that giant software which has existed since time immemorial and is used by all companies in some form or shape. Big enterprises buy CRM software (also known as packages) from top CRM vendors like Oracle, SAP, Salesforce etc. and then employ an army of consultants to try and implement it. Most of the classic CRM systems in the market today, even if deployed on the cloud, are big monolithic balls of mud. They are gigantic pieces of software with huge feature sets. Most often those features are surplus to requirements or do not fit into the processes of the company. So the company has to hire certified consultants or "specialists" to either tailor the product to the process or tailor the process to the product. This takes the involvement of a lot of people from business, consultants, developers, architects, and managers over a long period of time. Since so many parties, ideas and opinions are involved, these programs often end in disaster (if not managed properly) after wasting months and even years, not to mention the millions of dollars that go down the drain. I have had the misfortune of being part of a couple of such failures. 

Outdated technology stack

CRM products are monoliths that have evolved over a long period of time. They mostly use outdated or proprietary technologies and are often closed source for competitive reasons. Even the modern open source CRMs are monoliths and often suffer from the same challenges as the commercial ones. They are also not based on open standards or technologies that leverage modern hardware. Most of the open source CRMs, for example, are written in PHP, which is not always best suited to leverage, for example, the power of multi-core CPUs. Hence they can be challenging in terms of performance and scaling.

Vendor lock-in

CRM and similar packaged software mean a long-term commitment to a vendor. Even if you run the newer versions of these tools, you are still completely locked in, at the mercy of the vendor and its cloud offerings. You still haplessly pay software and cloud hardware costs. This bill can be a significant dent in the overall IT budget and thus limit spending on R&D. It also makes integration with other custom or COTS products extremely challenging.

Also, you have to train and work in the proprietary technologies used by the CRM vendors. Competent resources on such technologies are hard to find and even if you find someone it's going to be very expensive.

Lack of flexibility

Often the commercial or open source CRM products lack flexibility. You cannot pick and choose the modules you need, nor can you run them on your favorite cloud. 

Key components of CRM

Now that we know some of the key challenges of CRM software, let us first understand the key modules that make up a CRM system. I will try to keep this as simple as possible for ease of explanation and understanding. The goal is not to overwhelm you with business or domain knowledge, but to see how to break down the CRM monolith and whether it can leverage the flexibility of microservices.

The key components of a CRM software are typically

  1. Lead
  2. Opportunity
  3. Contact
  4. Account
  5. Campaign
  6. Product 
  7. Quote
  8. Invoice
  9. Contract
  10. Project

The number of modules can vary and some CRMs have additional modules, but in general the ones listed above are the most common set. For the sake of simplicity, we will consider the first two modules in our example tiny CRM and build the monolith version in the next post. 

Before I end, I want to take the liberty to post a similar problem from the book - Building Microservices by Sam Newman. Sam writes

The CRM—or Customer Relationship Management—tool is an often-encountered beast that can instill fear in the heart of even the hardiest architect. This sector, as typified by vendors like Salesforce or SAP, is rife with examples of tools that try to do everything for you. This can lead to the tool itself becoming a single point of failure, and a tangled knot of dependencies. Many implementations of CRM tools I have seen are among the best examples of adhesive (as opposed to cohesive) services.
The scope of such a tool typically starts small, but over time it becomes an increasingly important part of how your organization works. The problem is that the direction and choices made around this now-vital system are often made by the tool vendor itself, not by you.
I was involved recently in an exercise to try to wrest some control back. The organization I was working with realized that although it was using the CRM tool for a lot of things, it wasn’t getting the value of the increasing costs associated with the platform. At the same time, multiple internal systems were using the less-than-ideal CRM APIs for integration. We wanted to move the system architecture toward a place where we had services that modeled our businesses domain, and also lay the groundwork for a potential migration 


In the next few posts, I will try to build a modern CRM based on a microservices architecture.