blinkfox / fenix

Fenix is an extension library for Spring Data JPA, built to handle complex, dynamic JPQL (or SQL) queries, and more powerful than MyBatis in this area. https://blinkfox.github.io/fenix

Home Page: https://blinkfox.github.io/fenix

License: Apache License 2.0

Language: Java (100.00%)
Topics: dynamic-sql fenix fenix-spring-boot-starter jpa-extension jpa-plus mybatis spring-data-jpa


fenix's People

Contributors

blinkfox, dependabot[bot], dokiyoloo, imhansai, kerwinguo-v, pengten


fenix's Issues

How do I control operator precedence in the generated conditions?

I want to produce: where a = true and (b = true or b is null)

<trimWhere>
        <andEqual field="a" value="true"/>
        <andIsNull field="b" />
        <orEqual field="b" value="true"/>
</trimWhere>

The XML above generates where a = true and b = true or b is null, which is not what I want. Is there any way to add precedence (parentheses)?
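The complaint mirrors plain boolean operator precedence, where AND binds more tightly than OR. A minimal, self-contained illustration in plain Java (not Fenix API, and not the generated SQL itself) of why the two forms differ:

```java
// Plain-Java illustration of why AND/OR precedence matters for the generated WHERE clause.
public class PrecedenceDemo {
    public static void main(String[] args) {
        boolean a = false;
        Boolean b = null;

        // Desired:   a AND (b = true OR b IS NULL)
        boolean desired = a && (Boolean.TRUE.equals(b) || b == null);

        // Generated: a AND b = true OR b IS NULL  (AND binds tighter than OR)
        boolean generated = (a && Boolean.TRUE.equals(b)) || b == null;

        System.out.println(desired);   // false
        System.out.println(generated); // true
    }
}
```

SQL applies the same rule, so without explicit parentheses the OR branch fires regardless of the value of a.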

@QueryFenix: ORDER BY is not stripped from the count query in native SQL pagination

Hi,
As the title says: when using this annotation with a native SQL query, the count SQL generated for pagination does not strip the ORDER BY clause. Adding a countQuery works around it for now.
I'm not sure whether this is a bug or whether stripping ORDER BY is only supported for JPQL.

Java method:

@QueryFenix(resultType = OperationLogListVo.class, resultTransformer = UnderscoreTransformer.class, nativeQuery = true)
Page<OperationLogListVo> queryLogs(@Param("qv") QueryVo qv, Pageable page);

Output log:

Hibernate: SELECT LOG.*, US.USER_CNAME AS USER_NAME FROM TL_OPERATION_LOG LOG LEFT JOIN TS_USER US ON US.USER_NO = LOG.CREATE_BY WHERE 1 = 1 ORDER BY LOG.CREATE_TIME DESC offset 0 rows fetch next ? rows only
Hibernate: select count(*) as count from  TL_OPERATION_LOG LOG LEFT JOIN TS_USER US ON US.USER_NO = LOG.CREATE_BY WHERE 1 = 1 ORDER BY LOG.CREATE_TIME DESC

The union keyword is not supported

unexpected token: UNION

provider:

Fenix.start()
                    .select("t1.orgName as orgName, t1.orgType as orgType, t3.mchName as mchName")
                    .from("OrgInfo").as("t1")
                    .leftJoin("OrgHospital").as("t2")
                    .on("t1.orgId = t2.hospitalId")
                    .leftJoin("Mch").as("t3")
                    .on("t2.mchId = t3.mchId")
                    .whereDynamic()
                    .andLike("t3.mchName", params.getMchName(), StringUtils.hasText(params.getMchName()))
                    .andLike("t1.orgName", params.getOrgName(), StringUtils.hasText(params.getOrgName()))
                    .andEqual("t1.orgType", params.getOrgType(), Objects.nonNull(params.getOrgType()))
                    .andEqual("t1.status", params.getStatus(), Objects.nonNull(params.getStatus()))
                    .between("t1.createDt", params.getStart(), params.getEnd(), params.getStart() != null && params.getEnd() != null)
                    .union()
                    .select("t1.orgName as orgName, t1.orgType as orgType, t3.mchName as mchName")
                    .from("OrgInfo").as("t1")
                    .leftJoin("OrgDrugstore").as("t2")
                    .on("t1.orgId = t2.storeId")
                    .leftJoin("Mch").as("t3")
                    .on("t2.mchId = t3.mchId")
                    .whereDynamic()
                    .andLike("t3.mchName", params.getMchName(), StringUtils.hasText(params.getMchName()))
                    .andLike("t1.orgName", params.getOrgName(), StringUtils.hasText(params.getOrgName()))
                    .andEqual("t1.orgType", params.getOrgType(), Objects.nonNull(params.getOrgType()))
                    .andEqual("t1.status", params.getStatus(), Objects.nonNull(params.getStatus()))
                    .between("t1.createDt", params.getStart(), params.getEnd(), params.getStart() != null && params.getEnd() != null)
                    .end()
                    .setResultTypeClass(OrgInfoVo.class);

An extension point for SQL rewriting

Hi. In real projects our table columns are usually made of underscore-separated words, while Hibernate projections or a custom ResultTransformer typically go through Transformers.aliasToBean(Clazz), which only assigns a value when the column alias exactly matches the property name of the result type. I think com.blinkfox.fenix.jpa.FenixJpaQuery.doCreateQuery(Object[]) could post-process the SQL right after this.querySql = this.sqlInfo.getSql(); (at that line), using a library such as Druid or JSqlParser to rewrite it and automatically add camel-case aliases for underscore columns, or accept a pluggable strategy class that customizes the column-name-to-Java-field mapping. That would give Fenix something like MyBatis's resultMap. Below is an attempt of mine based on Druid:

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import com.alibaba.druid.sql.SQLUtils;
import com.alibaba.druid.sql.ast.SQLExpr;
import com.alibaba.druid.sql.ast.SQLStatement;
import com.alibaba.druid.sql.ast.expr.SQLPropertyExpr;
import com.alibaba.druid.sql.ast.statement.SQLExprTableSource;
import com.alibaba.druid.sql.ast.statement.SQLSelectItem;
import com.alibaba.druid.sql.ast.statement.SQLTableSource;
import com.alibaba.druid.sql.visitor.SQLASTVisitorAdapter;
import com.alibaba.druid.util.JdbcConstants;

public abstract class SqlUtils {

public static enum GenAliasSrategy {
	
	NONE("use the column name itself as the alias"),
	LOWER_COLUMN_NAME("use the lower-cased column name as the alias"),
	UPPER_COLUMN_NAME("use the upper-cased column name as the alias"),
	CAMEL2UNDERLINE("camel case to underscore"),
	UNDERLINE2CAMEL("underscore to camel case");
	
	private String strategyRemark;
	
	private GenAliasSrategy(String strategyRemark) {
		this.strategyRemark = strategyRemark;
	}

	public String getStrategyRemark() {
		return strategyRemark;
	} 
	
}

/**
 * <pre>
 * Post-processes the given SQL statement: every select item without an alias gets one
 * generated according to the given strategy; items that already carry an alias are left
 * untouched (parsing relies on Druid, so the Druid dependency must be on the classpath).
 * Because the SQL is parsed statically, a "*" select item must be expanded to concrete
 * columns by the caller. Note: this should also work for JPQL select lists.
 * 
 * Example (strategy: underscore to camel case):
 *   input SQL: 
 *       SELECT d.dept_name deptNameVo, e.emp_id, e.gender FROM t_employee e left join department d ON e.emp_id=d.dept_id+200
 *       
 *   rewritten SQL: 
 *       SELECT d.dept_name AS deptNameVo, e.emp_id AS empId, e.gender AS gender FROM t_employee e LEFT JOIN department d ON e.emp_id=d.dept_id+200
 *   
 * </pre>
 * @return the rewritten SQL statement
 */
public static String autoPopulatesColumnAliases(String sql, String dbType, GenAliasSrategy strategy) {
	if(strategy == null) {
	   strategy = GenAliasSrategy.UNDERLINE2CAMEL;
	}
	
	List<SQLStatement> stmtList = SQLUtils.parseStatements(sql, dbType);
	// only the first SQL statement is processed
	SQLStatement sqlStatement = stmtList.get(0);
	
	// set up the AST visitor
	ExportTableAliasVisitor visitor = new ExportTableAliasVisitor();
	sqlStatement.accept(visitor);
	
	// all tables (table names could also be rewritten here; only meaningful for native SQL)
	Map<String, SQLTableSource> tableaAliasMap = visitor.getTableaAliasMap();
	
	// all select items (columns)
	Map<String, SQLSelectItem> columnItems = visitor.getColumnItems();
	Iterator<Entry<String, SQLSelectItem>> iterator = columnItems.entrySet().iterator();
    while(iterator.hasNext()) {
        Entry<String, SQLSelectItem> entry = iterator.next();
        // the database column name
        String name = entry.getKey(); 
        SQLSelectItem item = entry.getValue();
	    String alias = item.getAlias();
	    // auto-generate an alias when none is present
	    if(alias==null || "".equals(alias.trim())) {
	       switch(strategy) {
	          case NONE : {
				 item.setAlias(name);
	    	     break;
	          }
	          case LOWER_COLUMN_NAME : {
	        	 item.setAlias(name.toLowerCase());
	        	 
	    	     break;
	          }
	          case UPPER_COLUMN_NAME : {
	        	 item.setAlias(name.toUpperCase());
	        	 
	    	     break;
	          }
	          case CAMEL2UNDERLINE : {
	        	  String underLine = camel2UnderLine(name);
	        	  item.setAlias(underLine);
	        	  
	        	  break;
	          }
	          case UNDERLINE2CAMEL : {
	        	 String camel = underLine2Camel(name);
	        	 item.setAlias(camel);
	        	 
	    	     break;
	          }
	       }
	    }
    }
	
	// the rewritten SQL statement 
	String newSql = sqlStatement.toString();
	
	return newSql;
}

/**
 * Converts an underscore-separated name to camel case.
 */
public static String underLine2Camel(String underline){
    Pattern pattern = Pattern.compile("[_]\\w");
    String camel = underline.toLowerCase();
    Matcher matcher = pattern.matcher(camel);
    while(matcher.find()){
         String w = matcher.group().trim();
         camel = camel.replace(w, w.toUpperCase().replace("_", ""));
    }
    
    return camel;
}

/**
 * Converts a camel case name to underscore form.
 */
public static String camel2UnderLine(String camel) {
	// insert underscores before inner capitals, then lower-case everything
	String underline = camel.replaceAll("\\B([A-Z])", "_$1").toLowerCase();
	
	return underline;
}

/**
 * For internal use only.
 */
 private static class ExportTableAliasVisitor extends SQLASTVisitorAdapter /* MySqlASTVisitorAdapter */ {
	
	// maps table aliases to their table sources
    private Map<String, SQLTableSource> tableaAliasMap = new LinkedHashMap<String, SQLTableSource>();
    
    // maps column names to their select items
    private Map<String, SQLSelectItem> columnItems = new LinkedHashMap<String, SQLSelectItem>();
    
    @Override
    public boolean visit(SQLExprTableSource x) {
        String alias = x.getAlias();
        tableaAliasMap.put(alias, x);
        return true;
    }
    
	@Override
	public boolean visit(SQLSelectItem item) {
		SQLExpr expr = item.getExpr();
		if(expr instanceof SQLPropertyExpr) {
		   SQLPropertyExpr propExpr = (SQLPropertyExpr) expr;
		   String name = propExpr.getName();
		   columnItems.put(name, item);
		}

		return true;
	}

    public Map<String, SQLTableSource> getTableaAliasMap() {
        return tableaAliasMap;
    }

	public Map<String, SQLSelectItem> getColumnItems() {
		return columnItems;
	}
}
 
public static void main(String[] args) {
	// test the functionality and performance of automatic alias generation
	String select = "SELECT " 
	                + " d.dept_name deptNameVo, " // an existing alias is left untouched
	                + " e.emp_id, "               // emp_id ==> empId
	                + " e.gender "                // gender ==> gender
	                + " FROM t_employee e "
	                + " left join department d "
	                + " ON e.emp_id=d.dept_id+200 ";
	long start = System.currentTimeMillis();
	String newSql = autoPopulatesColumnAliases(select, JdbcConstants.MYSQL, GenAliasSrategy.UNDERLINE2CAMEL);
	System.out.println(newSql);
	long end = System.currentTimeMillis();
	System.out.println("Automatic alias generation took: " + (end - start) + " ms");
	
}

}

I also wrote another, half-finished approach: reuse JPA's @Column annotation, with name carrying the database column name (a dedicated custom annotation would probably be better), plus a custom ResultTransformer.

The returned VO definition:
import javax.persistence.Column;

import com.ducha.repositories.convert.impl.DateDataConvert;
import com.ducha.repositories.dao.annotation.ResultConvertAndTransformer;
import com.ducha.repositories.dao.annotation.TypeConvert;

// marks a class whose fields may need type conversion
@ResultConvertAndTransformer
public class UserVo {

@Column(name = "id")
private Integer id;

@Column(name = "username")
private String username;

@Column(name = "password")
private String password;

@Column(name = "birthDay")
@TypeConvert(converter=DateDataConvert.class, parameter = {"yyyy/MM/dd HH:mm:ss"}) // @TypeConvert is a custom annotation
private String birthDay;

@Column(name = "entryDate")
@TypeConvert(converter=DateDataConvert.class, parameter = {"yyyy-MM-dd HH:mm:ss"})
private String entryDate;

public Integer getId() {
	return id;
}

public void setId(Integer id) {
	this.id = id;
}

public String getUsername() {
	return username;
}

public void setUsername(String username) {
	this.username = username;
}

public String getPassword() {
	return password;
}

public void setPassword(String password) {
	this.password = password;
}

public String getBirthDay() {
	return birthDay;
}

public void setBirthDay(String birthDay) {
	this.birthDay = birthDay;
}

public String getEntryDate() {
	return entryDate;
}

public void setEntryDate(String entryDate) {
	this.entryDate = entryDate;
}

}

The ResultTransformer definition:
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

import javax.persistence.Column;

import org.hibernate.transform.ResultTransformer;

public class AnnotationBeanResultTransformer implements ResultTransformer {

private static final long serialVersionUID = 1638820969102403186L;

private final Class<?> resultClass;

private Map<String, Field> columnFieldMapping = new LinkedHashMap<String, Field>();

public AnnotationBeanResultTransformer(Class<?> resultClass) {
	this.resultClass = resultClass;
	
	try {
		init();
	} catch (Exception e) {
		e.printStackTrace();
	}
}

@Override
public Object transformTuple(Object[] tuple, String[] aliases) {
	Object result = null;
	try {
		result = resultClass.newInstance();
		for ( int i = 0; i < aliases.length; i++ ) {
			String alias= aliases[i];
			Field field = columnFieldMapping.get(alias);
			if(field!=null) {
				Object columnValue = tuple[i];
				field.set(result, columnValue);
			}
		}
	} catch (Exception e) {
		e.printStackTrace();
	} 
	
	return result;
}

// transformTuple(Object[], String[]) runs first for each row; by the time transformList(List) is called the list already holds entity objects, so it is returned as-is
@SuppressWarnings("rawtypes")
@Override
public List transformList(List list) {
	return list;
}

private void init() throws Exception {
	Class<?> clazz = resultClass;
	while (clazz != null) {
		  // all declared fields (public, protected and private), excluding superclass fields
		  Field[] declaredFields = clazz.getDeclaredFields();
		  for(int idx=0; idx<declaredFields.length; idx++) {
			  Field f = declaredFields[idx];
			  Column column = f.getAnnotation(Column.class);
			  if(column!=null) {
				  // the database column name
				  String name = column.name();
				  if("".equals(name)) {
					 // fall back to the field name when the annotation has no value
					 name = f.getName(); 
				  }
				  if(!columnFieldMapping.containsKey(name)) {
					  f.setAccessible(true);
					  columnFieldMapping.put(name, f);
				  }
			  }
		  }
		  
		  // move up to the superclass
	      clazz = clazz.getSuperclass(); 
	}
}

}

With this in place, a new attribute on @QueryFenix (or a brand-new annotation) could tell the framework which rewrite strategy to apply to the SQL.

Passing parameters with the semantic XML tags

In this situation, if the description parameter is missing from criteria, an error is thrown. How can I first check whether the criteria object even has a description property before doing the null/empty check?

A serious bug found in Fenix

select b from Test as b where 1=1

With this fragment, when the in clause has values and name also has a value, the generated SQL is missing an and, causing an error.

select b from Test as b where

With this fragment, when in has no value but name does, the same missing-and error occurs.

Could an IntelliJ IDEA plugin be provided?

Could an IDEA plugin be provided, similar to the MyBatis ones, with auto-completion when writing SQL in the XML files, plus repository interface <=> XML cross-links for quick navigation and validation?

Request: support for data-permission interception

When user data in a system is tied to departments or tenants, the business layer currently has to filter it by hand. Could Fenix provide automatic row filtering for JPA queries and automatic maintenance of the department/tenant ID on insert, controlled by an annotation or a global setting, as a form of data-permission management?

A question about query results

The SQL:
SELECT user_id as id, user_name as name, account as account,companyname as companyName,grouppath as groupPath
FROM user
The Java:
@QueryFenix(value = "SysUserDao.queryForRoleSelList", nativeQuery = true)
List<Map<String, Object>> queryForRoleSelList(SysUserParams params, Set orders);

How can I make the Map keys come back as id rather than ID?
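One client-side workaround (a sketch, not a Fenix feature: caseInsensitive below is a hypothetical helper) is to re-wrap each result row in a case-insensitive map, so that ID and id resolve to the same entry:

```java
import java.util.Map;
import java.util.TreeMap;

public class CaseInsensitiveRow {

    // Wraps a result row so lookups by "id" succeed even when the driver
    // returns upper-cased keys such as "ID".
    static Map<String, Object> caseInsensitive(Map<String, Object> row) {
        Map<String, Object> wrapped = new TreeMap<>(String.CASE_INSENSITIVE_ORDER);
        wrapped.putAll(row);
        return wrapped;
    }

    public static void main(String[] args) {
        Map<String, Object> row = caseInsensitive(Map.of("ID", 1, "USER_NAME", "tom"));
        System.out.println(row.get("id"));        // 1
        System.out.println(row.get("user_name")); // tom
    }
}
```

Note that the root cause is usually the database upper-casing unquoted identifiers, so this only papers over the symptom.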

Custom entity type conversion

Hi, I was really glad to find this extension; it frees up a good bit of spare time. I'd like to raise one small feature request. Suppose a MySQL column has type datetime and the entity field is declared as private Date xxx;. With projections I used to handle this by reflectively injecting a type converter class: a custom annotation on the projection's getter, and Spring's projection call chain would invoke the previously registered converter. Since Fenix can already return custom result objects, would you consider adding type conversion later on? For example, for a database column birth_day of type datetime, I'd like to write @Convert(pattern = "yyyy-MM-dd") private String birthDay; on my custom DTO.
One more small question for the author: will the semantic XML tags documented on the official site eventually support:

not in

not like

xx>= 1

xx<3

xx>=1 and xx<3

xx>1 and x<=3

xx>1 and xx<3

That would cover most common SQL conditions.

Could SELECT_COUNT = "select count(*) as count from " be made configurable?

When running sharded queries through sharding-sphere I hit a select count(*) from problem:
Caused by: java.lang.IllegalStateException: Can't find index: AggregationSelectItem(type=COUNT, innerExpression=(*), alias=Optional.absent(), derivedAggregationItems=[], index=-1), please add alias for aggregate selections

Rewriting the query via countQuery as select count(*) as count works fine.
Since com.blinkfox.fenix.jpa.FenixJpaQuery.getCountSql could replace the select count(*) fragment, could the constant SELECT_COUNT = "select count(*) as count from " be made configurable?

A problem with the where tag

Template:
SELECT * FROM
(SELECT
t1.name as name
FROM table1 t1
<where>
<andStartsWith /> (condition not met)
<andIn /> (condition not met)
</where>
) a

The generated SQL is:
select * from
(SELECT
t1.name as name
FROM table1 t1
where
) a

There is a dangling where keyword. Is a subquery like this unsupported?

Passing Pageable.unpaged() as the paging parameter throws an error

If Pageable.unpaged() is passed as the paging parameter, the following error is thrown:
java.lang.UnsupportedOperationException: null
at org.springframework.data.domain.Unpaged.getOffset(Unpaged.java:96)
at com.blinkfox.fenix.jpa.FenixJpaQuery.doCreateQuery(FenixJpaQuery.java:150)
at com.blinkfox.fenix.jpa.FenixJpaQuery.doCreateQuery(FenixJpaQuery.java:110)

@QueryFenix(nativeQuery = true)
Page<Map<String, Object>> pageList(@Param("params") Map<String, Object> params, Pageable pageable);

Version 2.3.2 compatibility with older Spring Data JPA

As we know, version 2.3.2 stays compatible with older Spring Data JPA by modifying bytecode here in FenixJpaClassWriter. A separate point: ClassLoaders are isolated from one another.
The problem: if the DAO layer is instantiated early (for example, a filter that depends on a DAO forces early instantiation), a NullPointerException occurs, because the bytecode modification of FenixQueryLookupStrategy fails and the createOldJpaQueryLookupStrategy method falls back to returning null. If the DAO layer is instantiated later, everything works.
Investigation shows the failure is caused by the ClassLoader isolation; perhaps the author can reproduce and fix this.

Behavior of andIn() when passed an empty collection

Hi. I noticed that when andIn(name, list) is given an empty collection, Fenix builds the always-true condition where 1 = 1, i.e. it returns everything. But the plain SQL select * from xxx where name in () returns nothing. The two behaviors seem inconsistent; I would expect an empty collection to produce an empty result. Is this intentional?
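A common way to reconcile the two semantics, shown here as a self-contained sketch (buildInClause is a hypothetical helper, not Fenix API), is to emit an always-false predicate for an empty collection:

```java
import java.util.List;
import java.util.stream.Collectors;

public class InClauseDemo {

    // Returns an IN clause for a non-empty list, and an always-false predicate
    // for an empty one, matching plain SQL where `name IN ()` yields no rows.
    static String buildInClause(String field, List<?> values) {
        if (values == null || values.isEmpty()) {
            return "1 = 0"; // always false: an empty IN list matches nothing
        }
        return field + " IN (" + values.stream()
                .map(v -> "?")
                .collect(Collectors.joining(", ")) + ")";
    }

    public static void main(String[] args) {
        System.out.println(buildInClause("name", List.of("a", "b"))); // name IN (?, ?)
        System.out.println(buildInClause("name", List.of()));         // 1 = 0
    }
}
```

Until the library decides either way, the caller can also guard the andIn call with an explicit non-empty condition.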

Discussion: support for incremental (partial) updates

While using Fenix I've found that incremental updates (updating only the changed fields) are a very common need. Plain JPA does not support this out of the box; as an extension library, could Fenix provide a simple, generic API for it?

Configuration error with multiple data sources

The bean 'userRepository', defined in com.shares.repository.UserRepository defined in @EnableFenix declared on SharesApplication, could not be registered. A bean with that name has already been defined in com.shares.repository.UserRepository defined in @EnableJpaRepositories declared on PrimaryConfig and overriding is disabled.

How can the XML files be hot-reloaded?

Right now, any change to the HQL written in the XML files requires a project restart. Is there a way to hot-reload them?
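If memory serves, Fenix ships a debug switch that re-reads the XML templates on every query during development; treat the property name below as an assumption and verify it against the Fenix documentation:

```yaml
# application.yml (hypothetical sketch; confirm the property name in the Fenix docs)
fenix:
  debug: true   # re-read XML templates on each call; development only, hurts performance
```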

No property *** found for type ***

Using a custom base repository that extends FenixSimpleJpaRepository, methods annotated with @QueryFenix fail:

The custom base implementation, CommonRepositoryImpl:
public class CommonRepositoryImpl<T,ID extends Serializable> extends FenixSimpleJpaRepository<T, ID> implements CommonRepository<T,ID> {

private final EntityManager entityManager;

private final JpaEntityInformation<T, ?> entityInformation;

public CommonRepositoryImpl(JpaEntityInformation<T, ?> entityInformation, EntityManager entityManager) {
    super(entityInformation, entityManager);
    this.entityManager = entityManager;
    this.entityInformation = entityInformation;
}
public CommonRepositoryImpl(Class<T> domainClass, EntityManager entityManager) {
    //super(domainClass, entityManager);
    //this.entityManager = entityManager;
    this(JpaEntityInformationSupport.getEntityInformation(domainClass, entityManager), entityManager);
}


...... (implementation methods omitted)
The application class:
@SpringBootApplication
@EnableJpaRepositories(repositoryBaseClass = CommonRepositoryImpl.class)
public class DispatchApplication {

public static void main(String[] args) {
    SpringApplication.run(DispatchApplication.class, args);
}

}

The method annotated with @QueryFenix:

public interface MyframeTaskInfoRepository extends CommonRepository<MyframeTaskInfo,String> {

@Query(value="select new Map(t.taskId as taskId,t.taskName as taskName)  from MyframeTaskInfo t where t.taskStatus = '1' and t.packageId =:packageId ")
List<Map<String, Object>> getTaskInfoByPackageId(@Param("packageId") String packageId);

@Query("select count(1) from MyframeTaskInfo t where t.relationTaskIds like :taskId")
List<Integer> findByRelationTaskIdsLike(@Param("taskId")String taskId);

@QueryFenix
List<TaskVo> getAllTaskDetailInfo();

}

Startup error:
Caused by: org.springframework.data.mapping.PropertyReferenceException: No property getAllTaskDetailInfo found for type MyframeTaskInfo!

Detecting different databases

MyBatis can load different SQL depending on the database vendor. I hope the author can consider this feature, since it would make queries much more portable.

MVEL evaluation of Integer parameters

@QueryFenix(nativeQuery = true)
List<User> queryList(Integer status);

<andEqual field="t.status" value="status" match="?status != empty" />

Whether status is null or 0, the match expression evaluates the same way. Is there a good way to handle this?
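As the report observes, the empty check collapses null and 0 into the same outcome, so a match expression along the lines of match="?status != null" should restore the distinction (hedged: verify against the MVEL and Fenix docs). The distinction itself, in plain Java:

```java
import java.util.Objects;

public class NullVsZero {

    // An `empty`-style check collapses null and 0; an explicit null check
    // keeps them apart, so 0 can remain a legal status value.
    static boolean matches(Integer status) {
        return Objects.nonNull(status); // true for 0, false only for null
    }

    public static void main(String[] args) {
        System.out.println(matches(null)); // false
        System.out.println(matches(0));    // true (0 is a real value)
        System.out.println(matches(1));    // true
    }
}
```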

Could FenixPredicateBuilder support lambda expressions for field names?

For example:

List<Blog> blogs = blogRepository.findAll(builder ->
                builder.andIn(Blog::getId, ids, ids != null && ids.length > 0)
                        .andLike(Blog::getTitle, params.get("title"), params.get("title") != null)
                        .andLike(Blog::getAuthor, params.get("author"))
                        .andBetween(Blog::getCreateTime, params.get("startTime"), params.get("endTime"))
                .build());
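The usual way libraries such as MyBatis-Plus resolve a property name from a method reference is through the serializable lambda's writeReplace hook. A self-contained sketch of that technique (the SFunction and fieldName names are illustrative, not Fenix API):

```java
import java.io.Serializable;
import java.lang.invoke.SerializedLambda;
import java.lang.reflect.Method;
import java.util.function.Function;

public class LambdaFieldDemo {

    // A Function that is also Serializable, so the JVM generates writeReplace().
    public interface SFunction<T, R> extends Function<T, R>, Serializable {}

    // Resolves "getTitle" -> "title" from a method reference like Blog::getTitle.
    static String fieldName(SFunction<?, ?> getter) {
        try {
            Method writeReplace = getter.getClass().getDeclaredMethod("writeReplace");
            writeReplace.setAccessible(true);
            SerializedLambda lambda = (SerializedLambda) writeReplace.invoke(getter);
            String method = lambda.getImplMethodName();            // e.g. "getTitle"
            String prop = method.startsWith("is") ? method.substring(2)
                                                  : method.substring(3);
            return Character.toLowerCase(prop.charAt(0)) + prop.substring(1);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("not a serializable method reference?", e);
        }
    }

    static class Blog {
        private String title;
        public String getTitle() { return title; }
    }

    public static void main(String[] args) {
        System.out.println(fieldName((SFunction<Blog, String>) Blog::getTitle)); // title
    }
}
```

The functional interface must extend Serializable, otherwise the compiler does not emit the writeReplace method this relies on.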

Entity serialization problem with the 2.7.0 features

class ... implements Serializable,
        FenixJpaModel<T, ID, R<T, ID>>,
        FenixSpecModel<T, R<T, ID>>

Implementing these two interfaces causes a serialization problem:

com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class org.springframework.data.repository.core.support.TransactionalRepositoryProxyPostProcessor$RepositoryAnnotationTransactionAttributeSource and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: java.util.ArrayList[0]->com.dfe.dfzh.smk.entity.SmkBusinessAccountInfo["repository"]->jdk.proxy2.$Proxy120["advisors"]->org.springframework.aop.support.DefaultPointcutAdvisor[3]->org.springframework.aop.support.DefaultPointcutAdvisor["advice"]->org.springframework.transaction.interceptor.TransactionInterceptor["transactionAttributeSource"])
at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1300) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.DatabindContext.reportBadDefinition(DatabindContext.java:400) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.failForEmpty(UnknownSerializer.java:46) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.serialize(UnknownSerializer.java:29) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:728) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:774) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:178) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:728) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:774) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:178) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.ObjectArraySerializer.serializeContents(ObjectArraySerializer.java:253) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.ObjectArraySerializer.serialize(ObjectArraySerializer.java:214) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.ObjectArraySerializer.serialize(ObjectArraySerializer.java:23) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:728) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:774) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:178) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:728) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:774) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:178) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:119) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:319) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ObjectWriter$Prefetch.serialize(ObjectWriter.java:1518) ~[jackson-databind-2.13.1.jar:2.13.1]
at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:1007) ~[jackson-databind-2.13.1.jar:2.13.1]
at org.springframework.http.converter.json.AbstractJackson2HttpMessageConverter.writeInternal(AbstractJackson2HttpMessageConverter.java:454) ~[spring-web-5.3.15.jar:5.3.15]
at org.springframework.http.converter.AbstractGenericHttpMessageConverter.write(AbstractGenericHttpMessageConverter.java:104) ~[spring-web-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.AbstractMessageConverterMethodProcessor.writeWithMessageConverters(AbstractMessageConverterMethodProcessor.java:290) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestResponseBodyMethodProcessor.handleReturnValue(RequestResponseBodyMethodProcessor.java:183) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.method.support.HandlerMethodReturnValueHandlerComposite.handleReturnValue(HandlerMethodReturnValueHandlerComposite.java:78) ~[spring-web-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:135) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) ~[spring-webmvc-5.3.15.jar:5.3.15]
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) ~[spring-webmvc-5.3.15.jar:5.3.15]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:655) ~[tomcat-embed-core-9.0.56.jar:4.0.FR]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) ~[spring-webmvc-5.3.15.jar:5.3.15]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:764) ~[tomcat-embed-core-9.0.56.jar:4.0.FR]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat-embed-websocket-9.0.56.jar:9.0.56]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) ~[spring-web-5.3.15.jar:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) ~[spring-web-5.3.15.jar:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) ~[spring-web-5.3.15.jar:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:540) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1732) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) ~[tomcat-embed-core-9.0.56.jar:9.0.56]
at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]

Also, it would be nice to auto-detect the primary key; it is not always named id.

Suggestions on extension points

Hello. I have read your code and the official documentation.

The official docs recommend: Spring Data JPA must be version 2.1.8.RELEASE or above, and for Spring Boot projects, Spring Boot must be 2.1.5.RELEASE or above, because later versions of Spring Data JPA changed the QueryLookupStrategy code substantially.

Before learning about your project I did some similar work, using a new @NativeSelect annotation to enhance native SQL. Lately I've also been reading the MyBatis source, hoping to port MyBatis's XML configuration capability over, though I gave up halfway. The extension points I chose were these two:

org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean.createRepositoryFactory(EntityManager)

org.springframework.data.repository.core.support.RepositoryFactorySupport.addRepositoryProxyPostProcessor(RepositoryProxyPostProcessor)

I add new aspects by extending RepositoryProxyPostProcessor; one of my implementations:

import javax.persistence.EntityManager;

import org.springframework.aop.framework.ProxyFactory;
import org.springframework.data.jpa.repository.support.JpaRepositoryFactory;
import org.springframework.data.repository.core.RepositoryInformation;
import org.springframework.data.repository.core.support.RepositoryProxyPostProcessor;

import com.ducha.repositories.dao.factory.BaseRepositoryFactoryBean.CommonRepositoryFactory;
import com.ducha.repositories.dao.interceptor.JpqlSqlMethodInterceptor;
import com.ducha.repositories.dao.interceptor.NativeSqlMethodInterceptor;
import com.ducha.repositories.dao.interceptor.ResultTransformerMethodInterceptor;
import com.ducha.repositories.dao.interceptor.TransformMethodInterceptor;

/**
 * Adds custom AOP interceptors to the Spring Data JPA method invocation chain.
 *
 * @author hexian
 */
public class EnhanceRepositoryProxyPostProcessor implements RepositoryProxyPostProcessor {

    private EntityManager entityManager;

    private JpaRepositoryFactory repositoryFactory;

    public EnhanceRepositoryProxyPostProcessor(EntityManager entityManager, JpaRepositoryFactory repositoryFactory) {
        this.entityManager = entityManager;
        this.repositoryFactory = repositoryFactory;
    }

    @SuppressWarnings("rawtypes")
    @Override
    public void postProcess(ProxyFactory factory, RepositoryInformation repositoryInformation) {
        CommonRepositoryFactory commonRepositoryFactory = (CommonRepositoryFactory) this.repositoryFactory;

        // Add the new interceptor chain.
        factory.addAdvice(new TransformMethodInterceptor());
        // @ResultTransformer: lets an interface method return a custom VO object.
        factory.addAdvice(new ResultTransformerMethodInterceptor(repositoryInformation, this.entityManager, commonRepositoryFactory, factory));
        // Dynamic native SQL: lets a method accept a ConditionWrapper parameter.
        factory.addAdvice(new NativeSqlMethodInterceptor(repositoryInformation, this.entityManager, commonRepositoryFactory, factory));
        // Dynamic JPQL: lets a method accept a ConditionWrapper parameter.
        factory.addAdvice(new JpqlSqlMethodInterceptor(repositoryInformation, this.entityManager, commonRepositoryFactory, factory));
    }
}

import java.lang.reflect.Method;

import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.springframework.core.annotation.AnnotatedElementUtils;

public class ResultTransformerMethodInterceptor implements MethodInterceptor {

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        Method method = invocation.getMethod();
        ResultTransformer resultTransformer = AnnotatedElementUtils.findMergedAnnotation(method, ResultTransformer.class);

        // Check whether the method is declared on an interface.
        boolean isInterface = method.getDeclaringClass().isInterface();
        // The method carries both @ResultTransformer and @Query, and no @Modifying.
        if (isInterface && resultTransformer != null) {

            // ** TODO: integrate the functionality you provide here (dynamic SQL parsing) **
            Object result = null; // placeholder for the transformed result
            return result;
        } else {
            // Fall back to Spring Data JPA's original interceptor chain.
            return invocation.proceed();
        }
    }
}

With this, any Spring Boot version 1.x and above should be supported (our company's legacy projects use 1.x). My extension work was done on Spring Boot 2.0.2.RELEASE with spring-data-jpa-2.0.7.RELEASE, so it can be used on fairly old versions. I checked the Spring Data JPA repository on GitHub: this extension point has been supported since spring-data-commons 1.5.x. You can look here and search for the string "postProcess(".

Adding custom methods to a JPA repository

@EnableJpaRepositories(repositoryFactoryBeanClass = BaseJpaRepositoryFactoryBean.class)
specifies my own JpaRepository base interface, which adds some custom methods (an isExist method). This error occurs:

Failed to create query for method public abstract java.lang.Boolean com.blinkfox.fenix.example.util.jpa.BaseJpaRepository.isExist(java.io.Serializable)! No property isExist found for type Book!

JPA cannot find an isExist property. How can this be solved?

Two critical problems

1. Database columns named with underscores cannot be mapped to camelCase fields in the VO class.
Temporary workaround: I overrode the following class:
package com.blinkfox.fenix.jpa;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import cn.hutool.core.bean.BeanUtil;
import cn.hutool.core.util.StrUtil;
import org.hibernate.transform.ResultTransformer;

/**
 * @author SCY
 */
public class FenixResultTransformer implements ResultTransformer {

    private final Class resultClass;

    public FenixResultTransformer(Class resultClass) {
        this.resultClass = resultClass;
    }

    @Override
    public Object transformTuple(Object[] tuple, String[] aliases) {
        Map<String, Object> map = new HashMap<>(tuple.length);
        for (int i = 0; i < tuple.length; ++i) {
            String alias = aliases[i];
            if (alias != null) {
                // Convert the snake_case column alias to camelCase before binding.
                map.put(StrUtil.toCamelCase(alias), tuple[i]);
            }
        }
        return BeanUtil.toBean(map, resultClass);
    }

    @Override
    public List<?> transformList(List list) {
        return list;
    }
}
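The snake_case-to-camelCase step that StrUtil.toCamelCase performs in the workaround above can be sketched with the plain JDK like this (an illustrative equivalent, not hutool's actual implementation):

```java
public class CamelCaseSketch {

    /** Converts a snake_case column alias such as "user_name" to camelCase ("userName"). */
    public static String toCamelCase(String name) {
        StringBuilder sb = new StringBuilder(name.length());
        boolean upperNext = false;
        for (char c : name.toCharArray()) {
            if (c == '_') {
                upperNext = true;                    // skip the underscore, uppercase the next letter
            } else if (upperNext) {
                sb.append(Character.toUpperCase(c));
                upperNext = false;
            } else {
                sb.append(Character.toLowerCase(c)); // normalize everything else to lower case
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toCamelCase("user_name"));   // userName
        System.out.println(toCamelCase("CREATE_TIME")); // createTime
    }
}
```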

2. When the project includes this dependency:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-devtools</artifactId>
    <optional>true</optional>
    <scope>runtime</scope>
</dependency>

a "cannot cast class X to class X" error appears for the same class. This problem is strange: calls that go only through JPA's built-in methods do not fail. I have no solution yet.

I hope the author can support this in the next version: drop the need to alias fields with AS, and convert underscores to lower camelCase automatically.

[Urgent] Query parameters of two threads got crossed when running the same SQL concurrently

Hello. As shown in the screenshots below, in version 2.2.0 the query parameters of two threads running the same SQL got mixed up.

1. Has the author already noticed this problem?

2. Does the fix in the latest v2.3.2 ("fixed an issue where, under asynchronous multi-threading, the JDBC connection was not released when returning a custom entity bean type; on older versions the @Transactional annotation can be used as a workaround") also cover this problem?

Please take a look, thanks.

(screenshots: fenix-1, fenix-2, fenix-3)

SQL generated inside a foreach cannot be built with placeholders and have values bound afterwards

org.mvel2.PropertyAccessException: [Error: could not access: item; in class: java.util.HashMap]
[Near : {... item.id ....}]
^
[Line: 1, Column: 2]
at org.mvel2.PropertyAccessor.getBeanProperty(PropertyAccessor.java:679) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.PropertyAccessor.getNormal(PropertyAccessor.java:178) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.PropertyAccessor.get(PropertyAccessor.java:145) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.PropertyAccessor.get(PropertyAccessor.java:125) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.ast.ASTNode.getReducedValue(ASTNode.java:187) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.MVELInterpretedRuntime.parseAndExecuteInterpreted(MVELInterpretedRuntime.java:112) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.MVELInterpretedRuntime.parse(MVELInterpretedRuntime.java:58) ~[mvel2-2.4.8.Final.jar:na]
at org.mvel2.MVEL.eval(MVEL.java:114) ~[mvel2-2.4.8.Final.jar:na]
at com.blinkfox.fenix.helper.ParseHelper.parseExpressWithException(ParseHelper.java:45) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.core.FenixXmlBuilder.renderSqlAndOtherParams(FenixXmlBuilder.java:159) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.core.FenixXmlBuilder.buildSqlInfo(FenixXmlBuilder.java:132) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.core.FenixXmlBuilder.buildNewSqlInfo(FenixXmlBuilder.java:103) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.core.FenixXmlBuilder.getXmlSqlInfo(FenixXmlBuilder.java:79) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.core.Fenix.getXmlSqlInfo(Fenix.java:104) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.jpa.FenixJpaQuery.getSqlInfoByFenix(FenixJpaQuery.java:295) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.jpa.FenixJpaQuery.doCreateQuery(FenixJpaQuery.java:128) ~[fenix-2.3.6.jar:na]
at com.blinkfox.fenix.jpa.FenixJpaQuery.doCreateQuery(FenixJpaQuery.java:110) ~[fenix-2.3.6.jar:na]
at org.springframework.data.jpa.repository.query.AbstractJpaQuery.createQuery(AbstractJpaQuery.java:227) ~[spring-data-jpa-2.4.1.jar:2.4.1]

Suggestion: getCountSql should support distinct

Problem description:
fenix/src/main/java/com/blinkfox/fenix/jpa/FenixJpaQuery.java

this.sqlInfo.getSql().replaceFirst(REGX_SELECT_FROM, SELECT_COUNT);

If this.sqlInfo.getSql() is:
select distinct t.id from t_user

it is rewritten as:
select count(*) as count from t_user

which makes the list results and the count total disagree.

The correct replacement would be:
select count(distinct t.id) as count from t_user

Suggested solutions:
1. Make the count-query regex recognize distinct; or
2. Provide a method or annotation attribute, similar to countQuery, to control manually whether distinct is used.

For now I work around it by using countQuery for the total count.
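A minimal sketch of suggestion 1, assuming the count rewrite happens on the raw SQL string (the regex, class, and method names here are illustrative, not Fenix's actual internals):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CountSqlSketch {

    // Captures an optional DISTINCT and the select list, up to the first FROM.
    private static final Pattern SELECT_FROM =
            Pattern.compile("(?i)^\\s*select\\s+(distinct\\s+)?(.*?)\\s+from\\s+", Pattern.DOTALL);

    /** Rewrites a query into a count query, preserving DISTINCT when present. */
    public static String toCountSql(String sql) {
        Matcher m = SELECT_FROM.matcher(sql);
        if (!m.find()) {
            return sql;
        }
        // select distinct t.id ... -> count(distinct t.id); otherwise count(*).
        String countExpr = (m.group(1) != null)
                ? "count(distinct " + m.group(2) + ")"
                : "count(*)";
        return new StringBuilder(sql)
                .replace(m.start(), m.end(), "select " + countExpr + " as count from ")
                .toString();
    }

    public static void main(String[] args) {
        System.out.println(toCountSql("select distinct t.id from t_user"));
        // select count(distinct t.id) as count from t_user
        System.out.println(toCountSql("select t.id from t_user"));
        // select count(*) as count from t_user
    }
}
```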

About configuring package paths on the @EnableFenix annotation

When the root package of the JPA (DAO) layer matches the root package of the startup class that carries @EnableFenix, everything works (@QueryFenix is recognized).
But when the roots differ, @EnableFenix cannot be given a different package path, so @QueryFenix is not scanned, and the repository methods fail with naming-convention errors. (Background: with @Query a repository method can be named freely because the SQL is custom, but when JPA finds no custom SQL it requires the method name to follow its naming rules and otherwise throws an error.)
Could the author add package-path configuration to @EnableFenix, like @EnableJpaRepositories(basePackages = ...)?
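For reference, a possible interim workaround might be to configure @EnableJpaRepositories directly with both a custom package path and Fenix's repository factory bean; the package name and the FenixJpaRepositoryFactoryBean class used here are assumptions, not verified against a specific Fenix version:

```java
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

import com.blinkfox.fenix.jpa.FenixJpaRepositoryFactoryBean;

@SpringBootApplication
@EnableJpaRepositories(
        // repository root package, different from the application class's own package (illustrative)
        basePackages = "com.example.dao",
        // hand the repository creation over to Fenix so @QueryFenix methods are recognized
        repositoryFactoryBeanClass = FenixJpaRepositoryFactoryBean.class)
public class Application {
}
```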

A generic type converter

Hello. While studying type conversion in Spring Data JPA I ran into a question about the method com.blinkfox.fenix.jpa.FenixJpaQuery#doCreateQuery(java.lang.Object[]):

protected Query doCreateQuery(Object[] values) {
    // ......
    if (queryFenix.nativeQuery()) {
        Class type = this.getTypeToQueryFor(jpaMethod.getResultProcessor().withDynamicProjection(
                new ParametersParameterAccessor(jpaMethod.getParameters(), values)).getReturnedType(), querySql);
        query = type == null ? em.createNativeQuery(querySql) : em.createNativeQuery(querySql, type);
    } else {
        /*
         * I don't quite understand why the author doesn't do the same as above here:
         * Class type = this.getTypeToQueryFor(jpaMethod.getResultProcessor().withDynamicProjection(
         *         new ParametersParameterAccessor(jpaMethod.getParameters(), values)).getReturnedType(), querySql);
         * query = type == null ? em.createQuery(querySql) : em.createQuery(querySql, type);
         */
        query = em.createQuery(querySql);
    }
    // ......

    // If a custom result type is configured, do extra processing of the returned results.
    String resultType = sqlInfo.getResultType();
    if (StringHelper.isNotBlank(resultType)) {
        // My guess: it is probably for use here.
        query = new QueryResultBuilder(query, resultType).build(queryFenix.nativeQuery());
    }
}

I changed it by hand to:

// If a custom result type is configured, do extra processing of the returned results.
String resultType = sqlInfo.getResultType(); // TODO: 1. moved the resultType lookup earlier.
if (queryFenix.nativeQuery()) {
    Class type = this.getTypeToQueryFor(jpaMethod.getResultProcessor().withDynamicProjection(
            new ParametersParameterAccessor(jpaMethod.getParameters(), values)).getReturnedType(), querySql);
    query = type == null ? em.createNativeQuery(querySql) : em.createNativeQuery(querySql, type);
} else {
    // TODO: 2. added this check.
    if (StringHelper.isNotBlank(resultType)) {
        query = em.createQuery(querySql);
    } else {
        Class type = this.getTypeToQueryFor(jpaMethod.getResultProcessor().withDynamicProjection(
                new ParametersParameterAccessor(jpaMethod.getParameters(), values)).getReturnedType(), querySql);
        query = (type == null) ? em.createQuery(querySql) : em.createQuery(querySql, type);
    }
}

if (StringHelper.isNotBlank(resultType)) {
    query = new QueryResultBuilder(query, resultType).build(queryFenix.nativeQuery());
}

Would this work?

What if, instead of adding resultType in the XML, a generic javax.persistence.Tuple-to-DTO conversion were provided?

I tried removing resultType from the XML, and the result came back as Object[]. A normal @Query, however, can return the javax.persistence.Tuple type. Since a Tuple is essentially a Map, a type converter could be added to the method invocation chain, which would give a generic conversion without having to write resultType="com.blinkfox.fenix.example.dto.BlogDto" everywhere.

I may not have explained this clearly; I will try to submit a PR later.

When the Pageable carries a sort field, strange ORDER BY SQL is generated

This is the final generated SQL; the sort value (fa.accountId ASC) was joined onto the WHERE clause with a comma:
WHERE
t.ENABLED_FLAG = 'Y',
fa.accountId ASC
)
WHERE
ROWNUM <= ?

This is the Fenix XML:

       WHERE
          t.ENABLED_FLAG = 'Y'
      <andLike field="t.ACCOUNT_NAME" value="criteria.accountName" match="?criteria.?accountName != empty" />
      <andLike field="t.ACCOUNT_CODE" value="criteria.accountCode" match="?criteria.?accountCode != empty" />
  </fenix>
</fenixs>
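For comparison, the expected behavior can be sketched as follows: the sort properties should be rendered into a separate ORDER BY clause appended after the full query, never comma-joined onto the WHERE predicate (the class and method names here are illustrative, not Fenix's internals):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SortSqlSketch {

    /** Appends sort properties (property -> direction) as an ORDER BY clause. */
    public static String applySort(String sql, Map<String, String> orders) {
        if (orders == null || orders.isEmpty()) {
            return sql;
        }
        StringBuilder sb = new StringBuilder(sql).append(" ORDER BY ");
        boolean first = true;
        for (Map.Entry<String, String> e : orders.entrySet()) {
            if (!first) {
                sb.append(", ");
            }
            sb.append(e.getKey()).append(' ').append(e.getValue());
            first = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> orders = new LinkedHashMap<>();
        orders.put("fa.accountId", "ASC");
        System.out.println(applySort(
                "SELECT fa.* FROM FND_ACCOUNT fa WHERE fa.ENABLED_FLAG = 'Y'", orders));
        // SELECT fa.* FROM FND_ACCOUNT fa WHERE fa.ENABLED_FLAG = 'Y' ORDER BY fa.accountId ASC
    }
}
```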
