Wednesday 25 November 2020

Git: two commits both pass the build, but local compile fails

Today I ran into an interesting but kind of weird issue after pulling the latest code from the git repo.

The auto-build task passed for both commits, but the local build failed.

Commit 1 added Class1 and Class2;

Commit 2 added Class3, but renamed the above classes to class1 and class2 (lower-case). As the two commits were made at around the same time, I suppose the second developer did not pull the latest code before pushing, so his pull request also passed the build (bad code review here!).


After pulling both commits locally, a compile error occurs because Class1 and Class2 cannot be found.


Wednesday 21 October 2020

Custom operation unique id for springfox swagger lib

When using swagger, each operation can have a unique id; by default this id is the nickname of the @ApiOperation annotation, for instance,

@ApiOperation(value = "create a new user", nickname = "oper_code_create_new_user")

The oper_code_create_new_user will be used as the unique id (not sure why the lib uses nickname as a unique id, but it does). What if you do not add this annotation on the method? For instance, you may have a parent Controller class, and the sub-Controller class just reuses the method without overriding it. In this case, the unique id will be oper_code_create_new_user_1 by default (as implemented in CachingOperationNameGenerator.java in the springfox-spring-web library).
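
To make the scenario concrete, below is a minimal sketch of the parent/sub-controller layout described above; the class names, mappings, and tag values are hypothetical and only for illustration.

// Hypothetical parent controller: defines the operation and its nickname.
@Api(tags = {"aParentResource"})
@RestController
@RequestMapping("/parent")
public class ParentController {

    @ApiOperation(value = "create a new user", nickname = "oper_code_create_new_user")
    @PostMapping("/users")
    public String createUser(@RequestBody String user) {
        // ... create the user
        return user;
    }
}

// Hypothetical sub-controller: reuses the inherited method without overriding it,
// so both operations share the same nickname and springfox generates
// "oper_code_create_new_user_1" for the second one by default.
@Api(tags = {"aSubResource"})
@RestController
@RequestMapping("/sub")
public class SubController extends ParentController {
}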

If you want to customize the unique id in the sub-class, for instance to append the sub-class name or the controller-level tag name (the value defined in the @Api annotation) as a suffix, you can refer to the code snippet below.

@Component
@Order(SwaggerPluginSupport.SWAGGER_PLUGIN_ORDER + 1000)
public class OperationCustomizeUniqueIdReader implements OperationBuilderPlugin {

    @Override
    public void apply(OperationContext context) {
        Optional<ApiOperation> apiOperation = context.findAnnotation(ApiOperation.class);

        if (apiOperation.isPresent()) {
            Optional<Api> apiOptional = context.findControllerAnnotation(Api.class);
            if (apiOptional.isPresent()) {
                // controller-level tag name defined in the @Api annotation
                String tagName = apiOptional.get().tags()[0];
                ApiOperation operation = apiOperation.get();
                String nickname = operation.nickname();

                if (StringUtils.isNotEmpty(nickname) && StringUtils.isNotEmpty(tagName)) {
                    // only customize the operation we are interested in
                    if (nickname.equalsIgnoreCase("oper_code_create_new_user")) {
                        // append the tag name as a suffix to the unique id
                        context.operationBuilder().uniqueId(nickname + "_" + tagName);
                        context.operationBuilder().codegenMethodNameStem(nickname + "_" + tagName);
                    }
                }
            }
        }
    }

    @Override
    public boolean supports(DocumentationType documentationType) {
        return SwaggerPluginSupport.pluginDoesApply(documentationType);
    }
}

If you have a sub-class with @Api(tags = {"aSubResource"}), then the unique id generated
for this method will be "oper_code_create_new_user_aSubResource".

Tuesday 6 October 2020

Passing parameters at Job and Step level in Spring Batch


1. Job level

Besides passing parameters when building a job, you can use a job listener to pass extra job-level parameters which can be used throughout the job run. Specifically, create a class that implements JobExecutionListener and override the method below:

@Override
public void beforeJob(JobExecution jobExecution)

In the method, add extra parameters into the job execution context as below:

ExecutionContext jobExecutionContext = jobExecution.getExecutionContext();
jobExecutionContext.put(someKey, someValue);

In the job's steps, use the code below to inject the parameter:

@Value("#{jobExecutionContext[someKey]}")
private SomeClass someClassInstance; // the value will be someValue
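
A minimal sketch of such a listener is shown below; the key, value, and class names are placeholders. Note that the #{jobExecutionContext[...]} expression is resolved via late binding, so the component it is injected into needs to be @StepScope or @JobScope.

// Hypothetical listener that seeds the job execution context before the job runs.
@Component
public class SeedContextJobListener implements JobExecutionListener {

    @Override
    public void beforeJob(JobExecution jobExecution) {
        ExecutionContext jobExecutionContext = jobExecution.getExecutionContext();
        // "someKey"/"someValue" stand in for whatever the steps need later
        jobExecutionContext.put("someKey", "someValue");
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        // nothing to do after the job
    }
}

Remember to register the listener on the job (for instance via the job builder's listener(...) method), otherwise beforeJob will never run.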

2. Step level

How to pass parameters between Steps within a job?

It is similar to the job level: the step should implement the StepExecutionListener interface.

Say you want to pass a parameter from step1 to step2. Step 1 should implement the above listener and, in its afterStep method, add the parameter into the job execution context:

@Override
public ExitStatus afterStep(StepExecution stepExecution) {
// put a number into job execution context for future use
stepExecution.getJobExecution().getExecutionContext().putLong("aNumber", aNumber);
return null; // returning null leaves the step's exit status unchanged
}

Step 2 also implements the listener, but uses the beforeStep method to read the parameter back out.

@Override
public void beforeStep(StepExecution stepExecution) {
aNumber = stepExecution.getJobExecution().getExecutionContext().getLong("aNumber");
}

In this way you can pass parameters from one step to another.
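
For completeness, here is a sketch of how the listeners might be attached when building the job; the step names, placeholder tasklets, and listener classes (Step1Listener / Step2Listener implementing StepExecutionListener as above) are assumptions, and the configuration uses the classic JobBuilderFactory/StepBuilderFactory style.

@Bean
public Job demoJob(JobBuilderFactory jobs, Step step1, Step step2) {
    return jobs.get("demoJob")
            .start(step1)
            .next(step2)
            .build();
}

@Bean
public Step step1(StepBuilderFactory steps, Step1Listener step1Listener) {
    return steps.get("step1")
            .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED) // placeholder tasklet
            .listener(step1Listener) // afterStep puts "aNumber" into the job execution context
            .build();
}

@Bean
public Step step2(StepBuilderFactory steps, Step2Listener step2Listener) {
    return steps.get("step2")
            .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED) // placeholder tasklet
            .listener(step2Listener) // beforeStep reads "aNumber" back out
            .build();
}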








Monday 25 May 2020

Spring Boot: Create a Deployable War File

It is easy to follow https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto-create-a-deployable-war-file

But here I want to add one thing: how do you access the application context?

One way is to get the context from the return value of the method below:

ConfigurableApplicationContext applicationContext = app.run(args);
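
A minimal sketch of this in a Spring Boot main class might look like the following; MyService is a hypothetical bean, used only to show that the returned context is usable.

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(DemoApplication.class);
        // run() returns the fully started application context
        ConfigurableApplicationContext applicationContext = app.run(args);

        // the context can now be used directly, e.g. to look up a bean
        MyService myService = applicationContext.getBean(MyService.class);
    }
}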

Another case: if you deploy the .war to an external Tomcat server, the applicationContext above may be null (the main method is not used to start the app). In this case, you can get the context from the onStartup() method:

@Override
public void onStartup(ServletContext servletContext) throws ServletException {
    WebApplicationContext rootAppContext = createRootApplicationContext(servletContext);
    if (rootAppContext != null) {
        // set the context into your own class, say MySpringContext here
        MySpringContext.setContext(rootAppContext);
        servletContext.addListener(new ContextLoaderListener(rootAppContext) {
            @Override
            public void contextInitialized(ServletContextEvent event) {
                // no-op because the application context is already initialized
            }
        });
    }
    else {
        this.logger.debug("No ContextLoaderListener registered, as "
                + "createRootApplicationContext() did not return an application context");
    }
}
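
MySpringContext above is just a name for your own holder class; a minimal sketch of such a holder might look like this.

// Simple static holder so non-Spring-managed code can reach the application context.
public final class MySpringContext {

    private static WebApplicationContext context;

    private MySpringContext() {
    }

    public static void setContext(WebApplicationContext applicationContext) {
        context = applicationContext;
    }

    public static WebApplicationContext getContext() {
        return context;
    }
}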

Saturday 23 May 2020

Create multiple yml files in Azure pipeline

If you want to have multiple pipeline yml files rather than the single default one (named azure-pipelines.yml):

Step 1:
Click "pipeline" menu in Azure DevOps, and create a new pipeline yml file. You can rename the yml file with a meanful name, after modify the content, create Save button on the top-right, normally you will choose create a new branch to hold your new created yml file. If this yml is not the default one to be triggered by "- master", you can set
trigger:
none

Step 2:
Run the pipeline to see if it works as you expect. If it does, you can merge the yml file into the "master" branch.

Step 3: You can manually run the newly created pipeline as needed.

Normally we only have one yml file with the "- master" trigger; the other yml files can be triggered manually or on a schedule. As the yml file will be merged into the master branch, it can be run in the master branch and in any branches created from master afterwards.


Monday 13 April 2020

Flowable UI local development

Flowable provides 4 UI applications. Taking the modeler for instance, in order to develop it locally:

  1. Create a local dir, say my-ui-idm, and copy all contents of flowable-ui-idm into it. As all other UI apps need this idm application, we have to set up idm first.
  2. Create a local dir, say my-ui-modeler, and copy all contents of flowable-ui-modeler into it.
  3. Create a local dir to hold the flowable-spring-boot project.
  4. Modify pom.xml. Normally we use a stable version, not a SNAPSHOT version; for instance, change the <version> in <parent> to a stable version such as <version>6.5.0</version>. This change applies to all three projects above.
  5. In projects 1 and 2, you may also need to change the <relativePath> value in pom.xml so that it points to the flowable-spring-boot project.
  6. Modify the flowable-default.properties file if needed, for instance to use a different datasource. This file is in both flowable-ui-idm-app/resources and flowable-ui-modeler-app/resources.
  7. Run the flowable-ui-idm application.
  8. Run the flowable-ui-modeler application.
  9. Access the url, for instance http://localhost:8888/flowable-modeler. You may be redirected to the login page on first access; after entering user "admin" and password "test" (these are all configured in the properties file), you will see the modeler process list page.

Thursday 2 April 2020

how to make native sql update visible to jpa query

1. Background
A JPA query (Spring Data JPA, with Hibernate as the persistence provider) cannot see a preceding native SQL update made in the same transaction.


              JPA (native) update    jdbc native update
jpa query     1. not visible         2. not visible
jdbc query    3. visible             4. visible

The objective is to solve cases 1 and 2, i.e. to make sure the result of a JPA update (via native sql) or a jdbc native sql update is visible to a following JPA query.

2. Solution
2.1 case by case
If you only need to solve the issue in a specific place, and you can access the EntityManager, then em.refresh(entity) may solve your problem. For instance, we have a DemoRepository with:
@Modifying()
@Query(value = "update demo set name=:newName where id=:id", nativeQuery = true)
void updateNameNative(@Param("newName") String newName, @Param("id")int id);

// you have saved a demo instance with the name "name3", then do an update as below
demoRepository.updateNameNative("name3-1", demo1.getId());
Optional<Demo> demoFromJpa1 = demoRepository.findById(demo1.getId());
if (demoFromJpa1.isPresent()) {
    // refresh the managed entity so the native update becomes visible
    em.refresh(demoFromJpa1.get());
    System.out.println("1. jpa native update name to 'name3-1', then jpa query: " + demoFromJpa1.get());
}

If you don't refresh, the printed name will be "name3"; if you do refresh, you will see "name3-1".

2.2 Using Spring Data annotations
For the above update method, if you do not want to call the low-level em.refresh method, you can add the attributes below to the @Modifying annotation to let Spring Data flush and clear the persistence context, so the changes can be seen after native sql updates.

@Modifying(flushAutomatically = true, clearAutomatically=true)
@Query(value = "update demo set name=:newName where id=:id", nativeQuery = true)
void updateNameNative(@Param("newName") String newName, @Param("id")int id);

2.3 AOP way

Create an aspect to intercept all (or some of the) Spring Data JPA methods (or whatever you need), flush changes to the db before the query, and clear the persistence context after the query; then you will get the latest updates. This is effectively what the two attributes of the @Modifying annotation above do.

@Aspect
@Component
@Configurable
public class EntityManagerAspect {

    @PersistenceContext
    private EntityManager em;

    // match all methods declared on JpaRepository and its sub-interfaces
    @Pointcut("execution(* org.springframework.data.jpa.repository.JpaRepository+.*(..))")
    public void allMethodInJpaRepository() {
    }

    @Around("allMethodInJpaRepository()")
    public Object aroundAllMethodInJpaRepository(ProceedingJoinPoint pjp) {
        Object result = null;
        try {
            // flush pending changes to the database before the repository method runs
            em.flush();
            result = pjp.proceed();
            // clear the persistence context so stale entities are re-read afterwards
            em.clear();
        } catch (Throwable throwable) {
            throwable.printStackTrace();
        }
        return result;
    }
}

FYI: this is how Spring Data JPA supports the above two attributes internally:
this.em = em;
this.flush = method.getFlushAutomatically();
this.clear = method.getClearAutomatically();

   @Override
   protected Object doExecute(AbstractJpaQuery query, Object[] values) {

      if (flush) {
         em.flush();
      }

      int result = query.createQuery(values).executeUpdate();

      if (clear) {
         em.clear();
      }

      return result;
   }
}

Monday 30 March 2020

Java Thread CountDownLatch

When we want to wait until a thread finishes (either successfully or with a failure) before doing some logic, we can use a CountDownLatch. For instance, if we have one thread:
CountDownLatch countDownLatch = new CountDownLatch(1);
jobExecutor = new Thread() {
    @Override
    public void run() {
        try {
            // core logic
        } catch (Throwable e) {
            // Logger.error
        } finally {
            // count down on either success or failure
            countDownLatch.countDown();
        }
    }
};

// run the thread
jobExecutor.start();
// wait until the batch job finishes
if (countDownLatch != null) {
    countDownLatch.await();
}

// other logic when the thread finishes running
....


Monday 17 February 2020

sql server temporary table cannot be found

jdbcTemplate.execute("sql to create #temp_table_1");

fun() {
    jdbcTemplate.execute("sql to update data in #temp_table_1");
}

Exception: Object temp_table_1 cannot be found.

This is likely because #temp_table_1 was created by connection 1, but the update sql used connection 2. A local temporary table is visible only to the connection that created it and cannot be found by another connection.

One solution is to use the same connection to create, retrieve, and update the temporary table.
Or create a global temporary table ##temp_table, which can then be found by different connection instances.

And if you are using a datasource with a connection pool, normally you need to drop the temporary table explicitly, because the connection will not be closed but returned to the pool.
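
One way to keep everything on the same pooled connection with Spring's JdbcTemplate is a ConnectionCallback; the sketch below assumes a hypothetical #temp_table_1 layout and drops the table explicitly at the end, as mentioned above.

jdbcTemplate.execute((ConnectionCallback<Void>) con -> {
    try (Statement stmt = con.createStatement()) {
        // all statements run on the same connection, so the local temp table stays visible
        stmt.execute("create table #temp_table_1 (id int, name varchar(50))");
        stmt.execute("insert into #temp_table_1 values (1, 'demo')");
        stmt.execute("update #temp_table_1 set name = 'demo-updated' where id = 1");
        // drop explicitly, because the connection is returned to the pool rather than closed
        stmt.execute("drop table #temp_table_1");
    }
    return null;
});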

Thursday 6 February 2020

getClass info may not be correct when proxied (in Spring framework)

Say we have a class MyClass.java and an instance of it, myObject. Normally,
myObject.getClass() and MyClass.class are two ways to get the class info, but the first one may not work when the instance is a proxy rather than the original object.

When we use the Spring framework, if the instance is a Spring bean that has already been proxied, .getClass() may return the proxy class instead of the original one, so use MyClass.class as your first choice.
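
To make the difference concrete, here is a small sketch; myObject is assumed to be a proxied Spring bean (e.g. because of @Transactional), and Spring's AopUtils.getTargetClass is one way to look through the proxy when you only have the instance.

System.out.println(myObject.getClass());               // may print a proxy class such as MyClass$$EnhancerBySpringCGLIB$$...
System.out.println(MyClass.class);                     // always the real class
System.out.println(AopUtils.getTargetClass(myObject)); // looks through the proxy and returns the real class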