crud-wizard's Issues

Invoking code before and after generic service like Aspects

GenericServiceDelegator should invoke classes/beans that run before and after the generic service.
This will work like Aspects.
There should be a global configuration (for all endpoints) with an ordering of the aspects.
Every endpoint/service should be able to set up its own list of aspects.
It should also be possible to configure aspect code for the data storage methods.
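A minimal sketch of what such a hook could look like (the interface and method names below are assumptions, not existing crud-wizard API):

// hypothetical aspect hook; names are illustrative only
public interface GenericServiceAspect {

    // invoked before the generic service handles the request
    default void beforeInvocation(Object request) {
    }

    // invoked after the generic service produced a result
    default void afterInvocation(Object request, Object result) {
    }

    // lower values run first, covering the "order of aspects" mentioned above
    default int getOrder() {
        return 0;
    }
}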

create layout for page

The default page layout should contain a menu on the left side, an area on the right side, and a top bar with left and right sections.

Every endpoint should be able to declare where it is placed, in which menu, or whether it is a child of some other menu item.
So a menu item can have children and a parent.
Some entities should be able to appear in several menus.

Remove additional properties in all DTO classes

The additionalProperties field should exist only in the super class. Find a way to use @SuperBuilder in the DTOs.
Create a separate additional-properties class for the model and another one for the DTO.
For the super builder see:
https://stackoverflow.com/questions/52734182/lombok-superbuilder-example-with-json-annotations
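A minimal sketch of the Lombok + Jackson setup that link describes, assuming Lombok's @SuperBuilder and @Jacksonized are available; the class layout is illustrative and AdditionalPropertyDto is a hypothetical element type:

import java.util.List;
import lombok.Getter;
import lombok.experimental.SuperBuilder;
import lombok.extern.jackson.Jacksonized;

// base class that is the only place holding additionalProperties
@Getter
@SuperBuilder
class WithAdditionalPropertiesDto {
    private final List<AdditionalPropertyDto> additionalProperties;
}

// concrete DTO inherits the field and still gets a working builder and Jackson deserialization
@Getter
@SuperBuilder
@Jacksonized
class ClassMetaModelDto extends WithAdditionalPropertiesDto {
    private final String name;
}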

Create a separate model for the default configuration.
Create separate tables with additional properties for every model.
Use a join table:
https://stackoverflow.com/questions/13047483/hibernate-one-to-many-using-a-join-table-and-hibernate-annotations
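A sketch of the join-table mapping from that link, assuming javax.persistence (JPA) annotations; the entity, table and column names here are assumptions, and AdditionalPropertyEntity is hypothetical:

import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.JoinTable;
import javax.persistence.OneToMany;

// additional properties live in their own table and are attached through a join table
@Entity
class ClassMetaModelEntity {

    @Id
    @GeneratedValue
    private Long id;

    @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
    @JoinTable(
        name = "class_meta_model_additional_property",
        joinColumns = @JoinColumn(name = "class_meta_model_id"),
        inverseJoinColumns = @JoinColumn(name = "additional_property_id"))
    private List<AdditionalPropertyEntity> additionalProperties;
}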

Changes for translations, translations on fields in metamodels.

There should be translations for:
ClassMetaModelDto - the name
FieldMetaModelDto - the field names
ValidatorMetaModelDto - the validator names
and for generic enum values as well - #5 should be done first

In the app message source there should probably be methods that take a locale and message arguments. Should Hibernate-style interpolation be used only for the methods that take a map of arguments?

Instead of throwing an exception when a property key cannot be found, a big hash map could be built with all properties per language together with their defaults.
Do a performance test to check how much faster that would be.
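A rough sketch of that lookup-map idea; the names below are assumptions, not the current message source implementation:

import java.util.Locale;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// all properties per language resolved up front; a missing key falls back to the defaults
// instead of throwing, which is what the performance test would compare against
class CachedMessages {

    private final Map<Locale, Map<String, String>> messagesByLocale = new ConcurrentHashMap<>();
    private final Map<String, String> defaults;

    CachedMessages(Map<String, String> defaults) {
        this.defaults = defaults;
    }

    void register(Locale locale, Map<String, String> messages) {
        messagesByLocale.put(locale, messages);
    }

    String get(Locale locale, String key) {
        return messagesByLocale
            .getOrDefault(locale, Map.of())
            .getOrDefault(key, defaults.getOrDefault(key, key));
    }
}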

ObjectsJoinerVerifier with validation during create new endpoint + join type

The ObjectsJoinerVerifier interface should have a method that validates during creation of a new endpoint.
For example EqObjectsJoiner should check only the types of the data.
For example LowerJoiner should check that the data types can be compared: numbers with numbers, numbers convertible from float to int and so on; it should also be able to check dates, times, periods, durations etc.
The same goes for GreaterObjectsJoiner.
DataStorageResultsJoinerDto should allow setting inner_join, left_join, outer_join...
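A sketch of how the validation hook might look; the interface and signature below are assumptions, not the current ObjectsJoinerVerifier contract:

// proposed type check executed while a new endpoint is created
public interface JoinerTypeVerifier {

    // e.g. EqObjectsJoiner checks only type equality; LowerJoiner/GreaterObjectsJoiner check
    // that the two types are comparable (numbers with numbers, dates, times, periods, durations)
    void verifyTypes(Class<?> leftType, Class<?> rightType);
}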

Support for field types in all layers: frontend, metamodels and metamodels in data storages

In the GUI it should be possible to
provide a name for the metamodel and the name of that field in the datastorage metamodel; the backend should then build the mapping for those names and types.

It should be possible to add constraints like max, min, size etc. in all layers; this might be based on some validations.
By default, when some validation has its counterpart in another layer, it should be added there too.
But it should be possible to disable those in the other layers and add different ones as well.

On the frontend side it should be possible to write some JavaScript for validation, or just point to some JS files, together with the field types and the JS type (number, string, object).
On the Java side it should be possible to write validations as Groovy scripts, and to declare which real Java type or which other metamodel a field is.
In the datastorage metamodel it should be possible to write native code; for example for a DB datastorage it should be possible to write SQL or Liquibase code, and to declare which Java type and which DB type a field is (the type in the real data storage like blob, varchar, int, float, real, clob; examples: https://www.journaldev.com/16774/sql-data-types).

Worth noting:
the datastorage metamodel is like a Hibernate entity; it does not describe real DB types like varchar.

Should be opportunity for recording changes in endpoints/metamodels

So every change in the endpoint metamodels will be recorded,
and the whole session can be saved to some JSON file which will work like Liquibase.
This file should allow writing native scripts for Liquibase or for another datastorage change language.

For a create, the whole JSON that goes via POST should be recorded.
For updates, a POST with the information that some entry (some field, translation, changed type, validation) was added, removed or updated.
When some endpoint is removed, the same applies.
When some metamodel is removed, the same applies, plus removing all values for that metamodel and its tables in DB datastorages or in other datastorages.

add a new metamodel which will be the metamodel for enums

Add validation during conversion from JSON to objects based on class metamodels.
There should be validation during creation of a new class metamodel: at least one enum value, and a valid additional property name with an index.

Forward generic validator to javax validation for Dtos

There should be an implementation of a generic validator which will invoke javax validation on some DTO.
The metamodel for the validator should contain the group on which that validation should be run (via an additional property).
The path built by javax should be concatenated onto the earlier path from the generic validator.
It should be possible to disable generic validation for some endpoints.
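A minimal sketch of delegating to javax validation; the class name is an assumption, and reading the group from the validator metamodel plus the path concatenation are only hinted at in comments:

import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

class JavaxValidationDelegate {

    private final Validator validator = Validation.buildDefaultValidatorFactory().getValidator();

    // 'groups' would be read from the validator metamodel's additional property;
    // each violation's propertyPath would then be concatenated onto the generic validator's path
    Set<ConstraintViolation<Object>> validate(Object dto, Class<?>... groups) {
        return validator.validate(dto, groups);
    }
}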

Permissions for endpoint

Permission support should exist on the frontend side: some buttons should be not editable or simply not shown.
On the Java side there should be support for verifying permissions.

Generation metamodels based on real DTOs

Create a metamodel based on some DTO.
Some Java class will be provided and based on it the metamodels with fields and the whole frontend and datastorage mappings will be generated.
How to deal with that DTO later changing in Java: there should be information/notification that the current metamodel fields etc. should be updated to match whenever some fields were updated in the Java code.
In this case the number of fields from the DTO should be the same as in the metamodels, because the JSON value will be mapped to that DTO.
The second way is to generate the metamodel based on some Java class, but when that class changes, the metamodel is updated field by field to match that DTO. Then the JSON value will be converted to some Map.
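A rough sketch of the generation step itself, building field metamodels from a Java class via reflection; the FieldMetaModelDto builder properties used here are assumptions:

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

class MetaModelFromClassGenerator {

    // one field metamodel per declared field of the DTO class
    List<FieldMetaModelDto> generateFields(Class<?> dtoClass) {
        List<FieldMetaModelDto> fields = new ArrayList<>();
        for (Field field : dtoClass.getDeclaredFields()) {
            fields.add(FieldMetaModelDto.builder()
                .fieldName(field.getName())
                .fieldType(field.getType().getCanonicalName())
                .build());
        }
        return fields;
    }
}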

permission management GUI and structures

With synchronization to Keycloak via some plugin;
it should be possible to use other SSO providers as well.

Permission management and administration should be either a separate microservice application or built into our application.

Invoke some method with mapped object.

So: invoking some method in a class where the method's input does not match the object in the generic controller.
I need to check whether the current generic service already supports that.

For example a POST is invoked with some object, e.g. some PersonDto, and some method expects a SimplePersonDto. Then an invocation mapper is necessary.

Invocation of a few methods is available, and the results can be combined with results mapping just like with a few datastorages. So service methods should behave like datastorages.

Maybe some extension for a datastorage which can read additional properties and then knows which method to invoke or what URL to hit with which HTTP method.

it should be possible to create a few endpoints for some classMetamodel via one endpoint

example:
apiTag: 'users'
baseUrl: 'users'
classMetaModel: userMetaModel

  1. POST
    baseUrl = baseUrl
    apiTag = apiTag
    payloadModel = userMetaModel

  2. PUT
    baseUrl = baseUrl + '/{id}'
    apiTag = apiTag
    payloadModel = userMetaModel

  3. GET
    baseUrl = baseUrl + '/{id}'
    apiTag = apiTag
    responseModel = userMetaModel

  4. DELETE
    baseUrl = baseUrl + '/{id}'
    apiTag = apiTag
    dataStorageConnectors[0].classMetaModelInDataStorage= userMetaModel

  5. GET - list
    baseUrl = baseUrl
    apiTag = apiTag
    responseModel = LIST

There should be a plugin for generating DTOs based on some metamodels.

In the plugin it should be possible to declare which metamodel name maps to which class, for example:

person=pl.jalokim.domain.model.Person

But when somebody wants to have other fields and methods in that generated class, then a file named Person.cw_java should be kept in the repo
with the content:

private String newField;

public String doSomething() {
  return newField + name;
}

Controllers for Update, delete endpoints, classmetamodels

There should be new endpoints for updating endpoints.
For every field of them it should be possible to add some values to a list, remove or update some field, and it should be possible to remove endpoints.
Separate endpoints should exist for class metamodels, for update and for delete.
Remove here means marking it as deleted.

unique names during save some metamodels

ApiTagDto.name should be unique globally.
ClassMetaModelDto.name should be unique globally.
DataStorageMetaModelDto.name: verify that the data storage with that name exists when trying to use an existing data storage; the name should be unique within the scope of the DataStorage class.
EndpointMetaModelDto.operationName should be unique globally.

Generic service

Invoke the mappers, get the mappers' results, send the mapped objects to all data storages. Put the mapped results from every data storage into a map.
Then map those results to the final result.

Fix the datastorage API:
Data storages should accept and return objects, not just maps. For example in some JPA datastorage it should be possible to save some entity...
It should be possible to extract the id field from a given object for save/update.

(mapper for returns from datasources) Support for resolving the type from generic services.

Something like a mapper above the service.
So the generic service will invoke mapping/datastorages and by default will return a combination of all returned datastorage responses.

So by default it will do:

db-datastorage:

{
"id": 1,
"name":  "name value"
}

mongo-datastorage:

{
"uuid": "uuid value",
"surname": "surname value"
}

will combine to:

{
"id": 1,
"name": "name value",
"uuid": "uuid value",
"surname":  "surname value"
}

This should be based on #1.
So when a field name duplication occurs, write a mapper script like below:

=*dataStorages // it will map from all data storages; this is the default mapping from datastorages to the service response
id=$mappingContext['db-datastorage'].id // it will put the id from the db-datastorage result into the id field

But this mapper will have an additional function like "=*dataStorages" which under the hood will do:

=$mappingContext['db-datastorage']
=$mappingContext['mongo-datastorage']

It should be possible to provide code/a script for this mapper or to point to a normal Java class for it.

UrlMetamodel extension for validation of customer/12/order/1000 during an HTTP request

For example, when the given URL is

customer/12/order/1000

then UrlMetamodel should describe for which datastorage metamodels a given path variable value can be used during a query.
For example, when somebody wants to do a DELETE on that URL, then first it should be checked that a customer with id 12 exists in all datastorages configured for that variable name, and next it should be checked (somehow) that the order with id 1000 belongs to that customer with id 12...

MetaModels should be thread safe.

One solution is to create all metamodels as immutable, but then a reload will be required on every change...
Another is some wrapper with an AtomicReference holding an immutable object inside.
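A minimal sketch of that wrapper, assuming the metamodel snapshot itself is immutable; the class name is illustrative:

import java.util.concurrent.atomic.AtomicReference;

// readers always see a complete snapshot; every change swaps the whole immutable object at once
class MetaModelHolder<T> {

    private final AtomicReference<T> current = new AtomicReference<>();

    T get() {
        return current.get();
    }

    void reload(T newImmutableSnapshot) {
        current.set(newImmutableSnapshot);
    }
}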

After updating some class metamodel, regenerate the dependent mappers

When a change occurs on some class model, then all mappers which are assigned to it should be regenerated as well.
This applies to changes which come via the endpoint dedicated to making changes on class models.

Every mapper metamodel should know for which class changes it should be regenerated.

For example, when some mapper maps personDb to personView, then when a change occurs on one of them the mapper should be regenerated.
For example, personDb has a field document with the documentDb type; when a change occurs in documentDb then the mapper should be regenerated as well.

Mapper between Service and DataStorage

types:

  • from metamodel (map) to other metamodel (map)
  • from metamodel (map) to Dto
  • from Dto to metamodel (map)
  • from Dto to Dto

places where mapper can be:

  • service -> datastorage, datastorage -> service
  • mapping the final response in the service using values from all datastorages or from just one. Creation/update will have one default strategy but GET will have another strategy.

additional fields for metadata for mappers:

  • disableAutoMapping - disables auto mapping, which is enabled by default. Auto mapping is mapping between string and int etc., using other conversion services.
  • ignoredFields - target field names to ignore. A list of field names.
  • ignoreMappingProblem - disabled by default; it should report when there is a problem with some field, e.g. when a value cannot be converted from String to int or from another metamodel.

languages for mappers:

  • CWML - crud wizard mapping language

example:

person = personData // default mapping dto/map map/dto
person.uuid = uuid // mapping of some fields
address = personData.address // default mapping dto/map map/dto
personType = @springBeanName.methodName($headers['person-type']) // mapping to a value returned by the spring bean named "springBeanName" via the method named "methodName"
otherPerson= @mapperName(personData) // mapping by another named mapper "mapperName"
otherPerson.nestedField= @mapperName(otherPersonData.otherField) // mapping by another named mapper "mapperName" (a mapper with type CWML or a groovy script with name "mapperName")
=person // put the value of the whole node named "person" into the root of the object
=otherPersonField // put the value of the whole node named "otherPersonField" into the root of the object; duplication of field names should raise an exception... a field duplication can be overridden below, and when it is overridden the exception should not be thrown
somefield= // put the whole source value into the field named "somefield"
somefield=$headers['some-key']
list=#innerMethodWithCreateWholeList // TODO check whether this method is used directly, or whether its result is iterated and each element put into the target list
otherList.*=#innerMethodForMapEachElement // TODO this should be checked...
someField=java(headers.get("x-somecookie") + "sometext") // maybe in the future other languages will be supported

For mapping a list, one of the inner methods will be used when it exists; otherwise an inner method for mapping each element will be generated from that method.

all available variables (should be used with the prefix "$"):
rootSourceObject - the source object which will be mapped to the other object (useful in nested expressions). '$rootSourceObject' is equal to '$sourceObject' in the main mapper method. In an inner method, $rootSourceObject means the source object of the main mapper.
sourceObject - the source object which will be mapped to the other object (useful in nested expressions). '$sourceObject' is equal to '' (when this is an inner mapper method)
headers - http headers from the rest request
pathVariables - path params from the URL
requestParams - http get params
mappingContext - values from other datastorages

The fields above and the source come from the class GenericMapperArgument:

class GenericMapperArgument {
    Object sourceObject;
    Map headers;
    Map pathVariables;
    Map requestParams;
    Map mappingContext;
}

mapping fields from GenericMapperArgument in CWML:
GenericMapperArgument.sourceObject.fieldName in CWML is "fieldName"
GenericMapperArgument.headers['cookie'] is "$headers['cookie']" etc.
The left side is always a field value or the root of the target object.

There should be default mappers for concrete metamodels, so if some mapper does not exist, everything is read field by field, ignoring the fields which do not exist in the target or which cannot be mapped.
So there will be auto mapping for the 4 mapper types. Mappers should use a Spring converter or Jackson to convert types if a certain mapper is not provided or no default exists for it.
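A sketch of that fallback using Spring's ConversionService; the surrounding wiring and class name are assumptions:

import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;

// used when no dedicated or default mapper exists for a field: try a plain type conversion
class AutoMappingFallback {

    private final ConversionService conversionService = new DefaultConversionService();

    Object convertOrNull(Object sourceValue, Class<?> targetType) {
        if (sourceValue == null || !conversionService.canConvert(sourceValue.getClass(), targetType)) {
            return null;
        }
        return conversionService.convert(sourceValue, targetType);
    }
}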

java
In a mapper the method argument could be used like in the generic service.

So when choosing the mapper metamodel during endpoint create/edit:

  • java - you need to provide the class name, the bean's name (optional) and the method name; if this is not a Spring bean then an instance is created via the default constructor, or via a constructor whose parameters can be injected from the Spring context (inject only by constructor) - a sketch follows after this list
  • groovy - pass a script for it, it can be mixed with CWML (the method argument is GenericMapperArgument)
  • CWML - pass a script for it (the method argument is GenericMapperArgument)
  • kotlin - can be mixed with CWML
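A sketch of the "java" option from the list above: a plain class (optionally a Spring bean) whose mapping method receives GenericMapperArgument; the mapper class name and the copied field names are purely illustrative:

import java.util.HashMap;
import java.util.Map;

class PersonViewMapper {

    Map<String, Object> map(GenericMapperArgument argument) {
        Map<?, ?> source = (Map<?, ?>) argument.sourceObject;
        Map<String, Object> target = new HashMap<>();
        target.put("name", source.get("name"));                    // copied from the source payload
        target.put("requestedBy", argument.headers.get("x-user")); // taken from an http header
        return target;
    }
}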

Generic mechanism for generate frontend

It should generate code for:
pages to create, delete, update and list objects, or to view the data of some row.

This should be a generic mechanism which will be used to create pages for updating metamodels and for updating the specific custom class metamodels added by the user.

The pages should be generated based on class metamodels, fields and validators.

The generated code should be based on JHipster.

Create metamodels for HTTP GET.

It should be possible to define data retrieval based on some DSL similar to SQL.
But there should also be the possibility to write in the native language of a concrete datastorage.

Example of CWQL - crud wizard query language:

select
  p.id,
  p.name,
  p.adresses, -- returns all fields from the address metamodel; a 1:N relation will always create a separate query in SQL, only 1:1 will be in one query
  p.insurance.company,  -- reads person.insurance.company; the json field name will be "insuranceCompany"
  p.insurance.car.value as 'car insurance value',  -- the text after 'as' will be the json field name
  p.insurance,  -- will get all fields from insurance
  (  -- example of how to get concrete fields from another property
    select
     ins.car.value,
     ins.home.value,
     (
       select
         ins.company
       from ins
     ) as ins_company,
     (
       select
         registerNumber,  -- prefix is omitted
         vin  -- prefix is omitted
       from ins.car
     ) as ins_car -- a nested select always needs the 'as' keyword and a name after it
    from
      p.insurance as ins
   ) as shortInsuranceInfo
from
  person as p -- when the 'as p' alias is omitted then p.id can be written as "id"

the above will return a list of person objects like:

[{
        "id": 1,
        "name": "John",
        "adresses": [{
                "street": "first",
                "city": "NY"
            }, {
                "street": "second",
                "city": "LA"
            }
        ],
        "insuranceCompany": "some company",
        "car insurance value": "120.54$",
        "insurance": {
            "company": "some company",
            "car": {
                "value": "120.54$",
                "vin": "129384jiohjs",
                "registerNumber": "FX 652G"
            },
            "home": {
                "value": "2000S",
                "number": "kg840dsj6fnfugp"
            }
        },
        "shortInsuranceInfo": {
            "ins_company": "some company",
            "ins_car": {
                "registerNumber": "FX 652G",
                "vin": "129384jiohjs"
            }
        }
    }
]

when you want to return just a single object, write "select one"
when you want to remove duplicated results, write "select distinct"

CWQL join
another CWQL example, with a join

select one
 person.name,  -- needs the 'person' prefix because of ambiguous field names
 surname, -- from person
 shortName -- from type; even when it returns one element it will be an array in json because of the join
from
 person join type on type_id = type.id -- 'type.id' because otherwise it would be ambiguous which "id" is meant, the one from the person metamodel or from the type metamodel
where person.id = 1

it will return:

{
    "name": "John",
    "surname": "Doe",
    "shortName": ["type1", "type2"] 
}

A join is always mapped as 1:N. What happens when there is a person but no type assigned to it?
Then the result will be "shortName": [], so the join is always an "outer left join": the left side can be returned without a matching join on the right side.

CWQL join1

select one
 person.name,  -- needs the 'person' prefix because of ambiguous field names
 surname, -- from person
 shortName -- from type; due to join1 it will be returned as a 1:1 relation
from
 person join1 type on type_id = type.id -- 'type.id' because otherwise it would be ambiguous which "id" is meant, the one from the person metamodel or from the type metamodel
where person.id = 1

it will return:

{
    "name": "John",
    "surname": "Doe",
    "shortName": "short name of type" 
}

join1 is always mapped as 1:1, but when the query does not return a 1:1 result then an exception will be thrown

metamodels:

class person  {
  int id
  int type_id 
}

class type {
  int id 
  String shortName
  String name
}

where
The where values should come from GET params or from URL path variables.
GET params for search and URL path variables should always be defined;
GET params can be required or optional.
The query will then be generated dynamically on the storage side, for example:

where
p.id = $personId -- '$' stands for a url path variable
and
 p.name like #name -- '#' stands for the get param named "name"; when the value of "#name" is null then this search condition is omitted
and
 p.type eq #personType -- '#' stands for the get param named "personType"; when "#personType" is null then this search condition is omitted
and
 p.login eq ^login -- '^' stands for a param from the header named "login"
and
 p.lastloginTime is null -- always in the query
and
p.desc eq nullable(#desc)  -- will not be omitted from the query when #desc is null; it will then search for 'desc is null', so this condition is always in the query
and
p.permissions in &'name_of_previous_ds_result'.permissions.name  -- '&' stands for previousQueryResultsContext: it reads results from a previous data storage query

Which GET params are required will be configured in the endpoint query params; the same applies to URL path variables and headers.

Build metamodels for those queries
based on #11
During creation of that GET view, a translation key should be provided for fields which do not come from a metamodel but are created for the needs of the query.
So when the select uses a whole metamodel, the current metamodel will be used; but when concrete select fields are used, a new metamodel should be created, and for the fields which do not exist in an existing metamodel translation keys are required etc.

But this is the case when the GET view model = the datastorage metamodel, and there is one metamodel.

Mapping combined results from few datastorages
based on #11
When the GET view uses a few datastorages, then the metamodel should be created based on the metamodel of the service output mapper.

mapper example:

=$mappingContext['db-datastorage'] // will put all fields from returned result of query from db-datastorage
=$mappingContext['mongo-datastorage'] // will put all fields from returned result of query from mongo-datastorage
otherField=$headers['cookie']

So the final metamodel will be a combination of the fields from the db-datastorage query, the mongo-datastorage query and otherField.
The types, names and translations of those fields will be based on the metamodels returned from the respective datastorage metamodels.

When some fields do not come from a current metamodel, then a translation key and values are required.

A problem arises when an existing field gets a changed translation key; that should be reflected in the query result metamodel. So the field metamodel should keep the information about which metamodel it comes from, and the translation should be fetched from there.

When a field name is changed in the source metamodel, that field should somehow be reflected in the query.
But removing a field should not be allowed, or maybe only after somebody clears the related queries first.
