
Snowflake plugins for Maven and Gradle

These are open source and community supported tools. Support is provided on a best effort basis by project contributors.

Overview

This repo contains the source code for the Snowflake Maven and Gradle plugins, which help developers publish User-Defined Functions (UDFs) and stored procedures to Snowflake. The plugins can create a stage on Snowflake, copy your build artifact and dependency .jar files to the stage, and run the CREATE ... DDL to create your UDF or stored procedure in the account.

Interested in contributing? See the Contributing Guide for guidance.

Maven

Maven Prereqs

Tool  | Required Version
JDK   | 11
Maven | 3

Maven Installation

Put the following Maven coordinates in the <plugins> block of the POM file.

<plugin>
    <groupId>com.snowflake</groupId>
    <artifactId>snowflake-maven-plugin</artifactId>
    <version>0.1.0</version>
</plugin>
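
If your POM does not already have a <plugins> section, the declaration goes inside the standard <build> element:

<build>
    <plugins>
        <plugin>
            <groupId>com.snowflake</groupId>
            <artifactId>snowflake-maven-plugin</artifactId>
            <version>0.1.0</version>
        </plugin>
    </plugins>
</build>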

Authentication

You can provide your account authentication information using a properties file, or by specifying your account parameters individually in the plugin configuration:

Properties File

Create a file, profile.properties, in the root of the project with information to establish a JDBC connection to your Snowflake account:

# profile.properties
URL=https://MY_ACCOUNT_NAME.snowflakecomputing.com:443
USER=username
PASSWORD=password

# Optional properties:
ROLE=ACCOUNTADMIN
WAREHOUSE=DEMO_WH
DB=MY_DB
SCHEMA=MY_SCHEMA

Then specify this file using the <propertiesFile> tag in the auth section:

<plugin>
    <groupId>com.snowflake</groupId>
    <artifactId>snowflake-maven-plugin</artifactId>
    <version>0.1.0</version>
    <configuration>
        <auth>
            <propertiesFile>profile.properties</propertiesFile>
        </auth>
    </configuration>
</plugin>

Auth fields

Alternatively, you can specify your account information directly in the plugin using the url, user, and password fields. The role, db, and schema fields are optional. An example is shown below.

<plugin>
    <groupId>com.snowflake</groupId>
    <artifactId>snowflake-maven-plugin</artifactId>
    <version>0.1.0</version>
    <configuration>
        <auth>
            <url>https://MY_ACCOUNT_NAME.snowflakecomputing.com:443</url>
            <user>myUsername</user>
            <password>${env.MY_PASSWORD}</password> <!-- Env var injection for secrets -->
            <!-- optional auth configuration -->  
            <role>accountadmin</role>
            <db>${env.MY_ORG_DB}</db>
            <schema>${env.SCHEMA}</schema>
        </auth>
    </configuration>
</plugin>

If both a properties file and auth fields are specified in the plugin, the values provided directly in the plugin configuration take priority.

Object properties

Specify the UDF and stored procedure objects that should be created on Snowflake by adding a <function> tag under <functions> or a <procedure> tag under <procedures> for each object. The arguments follow the CREATE FUNCTION and CREATE PROCEDURE syntax:

  • <name> is the name of the UDF/stored proc to be assigned on Snowflake
  • <handler> is packageName.className.methodName for the handler method
  • <args> is a list of <arg> which each require a <name> and <type>
  • <returns> is the return type
  • <stage> is the name of the internal stage that will be created (if it doesn't exist) and where files will be uploaded. Note: Choose a new stage name or an existing stage where artifact and dependency .jar files can be uploaded.

Example plugin configuration on POM:

<plugin>
    <groupId>com.snowflake</groupId>
    <artifactId>snowflake-maven-plugin</artifactId>
    <version>0.1.0</version>
    <configuration>
        <auth>
            <propertiesFile>profile.properties</propertiesFile>
        </auth>
        <stage>STAGE_NAME</stage>
        <functions>
            <function>
                <name>funcNameOnSnowflake</name>
                <handler>PackageName.ClassName.MethodName</handler>
                <args>
                    <arg>
                        <name>firstArg</name>
                        <type>integer</type>
                    </arg>
                    <arg>
                        <name>secondArg</name>
                        <type>string</type>
                    </arg>
                    <!-- More args go here... -->
                </args>
                <returns>string</returns>
            </function>
            <!-- More functions go here.. -->
        </functions>
        <procedures>
            <procedure>
                <name>procNameOnSnowflake</name>
                <handler>PackageName.ClassName.SomeMethodName</handler>
                <args>
                    <arg>
                        <name>a</name>
                        <type>string</type>
                    </arg>
                    <!-- More args go here... -->
                </args>
                <returns>string</returns>
            </procedure>
            <!-- More procedures go here.. -->
        </procedures>
    </configuration>
</plugin>
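
For reference, here is a minimal sketch of a matching handler class, borrowing the combineStrings function that also appears in the deploy logs further down (the method body is illustrative). This backs a function with <handler>org.example.udf.Function.combineStrings</handler>, two string args, and a string return type:

package org.example.udf;

public class Function {
    // Backs: CREATE FUNCTION combineStrings(a string, b string) RETURNS string
    // SQL string maps to java.lang.String; the handler may be static or an instance method
    public String combineStrings(String a, String b) {
        return a + b;
    }
}

Note that Java stored procedure handlers additionally take a com.snowflake.snowpark_java.Session as their first parameter.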

Maven usage

After configuration, run:

mvn clean package snowflake:deploy

mvn clean package builds the project, and snowflake:deploy executes the plugin goal, deploying your objects to Snowflake.

Usage in CI pipelines

As mentioned in auth fields, your account properties can be read directly from the environment variables of your CI pipeline. This helps keep secrets out of source control and lets you deploy to different environments (QA, UAT, production) by changing only the env vars in each pipeline.
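
For example, a CI step might inject the secret as an environment variable and deploy in one command (a sketch; MY_PASSWORD matches the ${env.MY_PASSWORD} reference in the auth example above):

# MY_PASSWORD is provided by the CI system's secret store
export MY_PASSWORD="********"
mvn clean package snowflake:deploy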

Command Line Usage:

Auth parameters can optionally be provided as arguments when running the plugin from the CLI. Values from CLI arguments will override any values set in the properties file or the POM:

mvn snowflake:deploy \
  -Ddeploy.auth.user="username" \
  -Ddeploy.auth.password="password" \
  -Ddeploy.auth.url="myaccount.snowflakecomputing.com" \
  -Ddeploy.auth.role="myrole" \
  -Ddeploy.auth.db="mydb" \
  -Ddeploy.auth.schema="myschema"

A single function or procedure can also be specified through command line arguments. The command line function/procedure will be created along with any objects defined in the POM. The arguments have the following syntax:

mvn snowflake:deploy \
  -Ddeploy.type="{procedure | function}" \
  -Ddeploy.name="<name>" \
  -Ddeploy.args="[ <arg_name> <arg_data_type> ] [ , ... ]" \
  -Ddeploy.handler="<class>.<handler>" \
  -Ddeploy.returns="<data_type>"

As an example:

mvn clean package snowflake:deploy \
  -Ddeploy.type="procedure" \
  -Ddeploy.name="mvnStringConcat" \
  -Ddeploy.args="a string, b string" \
  -Ddeploy.handler="SimpleUdf.stringConcat" \
  -Ddeploy.returns="string"
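
For reference, a deploy like this produces a statement of roughly the following shape (a sketch; the IMPORTS list depends on your artifact and dependencies, and mirrors the full statements shown in the issue logs below):

CREATE OR REPLACE procedure mvnStringConcat (a string, b string)
RETURNS string
LANGUAGE java
PACKAGES = ('com.snowflake:snowpark:latest')
HANDLER = 'SimpleUdf.stringConcat'
IMPORTS = ('@STAGE_NAME/libs/<your-artifact>.jar', ...);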

Gradle

Gradle Prereqs

Tool | Required Version
JDK  | 11

Gradle Installation

The Gradle plugin has not yet been published to Maven Central or the Gradle Plugin Portal. You can install it locally using the instructions below. Alternatively, you can use this Gradle plugin developed by Snowflake Data Superhero Stewart Bryson.

Clone this repository and publish it to your local .m2 repository:

git clone https://github.com/your-username/snowflake-maven-gradle-plugins.git
cd snowflake-maven-gradle-plugins/
gradle publishToMavenLocal

Specify the following at the top of settings.gradle:

pluginManagement {
    repositories {
        mavenLocal() // local Maven .m2 repository
    }
}

Authentication

You can provide your account authentication information using a properties file, or by specifying your account parameters individually in the plugin configuration:

Properties File

Create a properties file profile.properties in the root of the project with information to establish a JDBC connection to your Snowflake account:

# profile.properties
URL=https://MY_ACCOUNT_NAME.snowflakecomputing.com:443
USER=username
PASSWORD=password

# Optional properties:
ROLE=ACCOUNTADMIN
WAREHOUSE=DEMO_WH
DB=MY_DB
SCHEMA=MY_SCHEMA

In your build.gradle, provide the auth configuration to the plugin:

snowflake {
 auth {
  propertiesFile = "profile.properties"
 }
}

Auth fields

Alternatively, you can specify your account information directly in the plugin using the url, user, and password fields. The role, db, and schema fields are optional. An example is shown below.

snowflake {
 auth {
  url = 'https://MY_ACCOUNT_NAME.snowflakecomputing.com:443'
  user = 'myUsername'
  password = System.getenv('SNOWFLAKEPWD') // Env var injection for secrets
  // Optional:
  role = 'accountadmin'
  db = 'myDB'
  schema = System.getenv('SNOWFLAKESCHEMA')
 }
}

If both a properties file and auth fields are specified in the plugin, the values provided directly in the plugin configuration take priority.

Gradle Plugin Configuration

Specify UDFs and stored procedures that should be published to Snowflake by creating a function closure in the functions block or a procedure closure in the procedures block for each object.

The arguments follow the CREATE FUNCTION and CREATE PROCEDURE syntax:

  • functionName or procedureName is the name to be used on Snowflake
  • handler is packageName.className.methodName for the handler method
  • args is a list of argument strings for the function which are formatted as "[ <arg_name> <arg_data_type> ] [ , ... ]"
  • returns is the return type
  • stage is the name of the internal stage that will be created (if it doesn't exist) and where files will be uploaded. Note: Choose a new stage name or an existing stage where artifact and dependency .jar files can be uploaded.

Example plugin configuration in build.gradle:

plugins {
 id 'com.snowflake.snowflake-gradle-plugin'
}

snowflake {
 auth {
  propertiesFile = './path/to/file'
 }
 stage = 'STAGE_NAME'
 functions {
  functionName {
   args = ["a string", "b int"]
   returns = "string"
   handler = "PackageName.ClassName.methodName"
  }
  // More functions here
 }
 procedures {
  procedureName {
   args = ["a string", "b string"]
   returns = "string"
   handler = "PackageName.ClassName.methodName"
  }
  // More procedures here
 }
}

Gradle Usage

After configuration, run the following to publish your functions and procedures:

gradle snowflakeDeploy

The snowflakeDeploy task will trigger the jar task if jar is not up to date.

Usage in CI pipelines

As mentioned in auth fields, your account properties can be read directly from the environment variables of your CI pipeline. This helps keep secrets out of source control and lets you deploy to different environments (QA, UAT, production) by changing only the env vars in each pipeline.

Command Line Usage

Auth parameters can optionally be provided as arguments when running the plugin from the CLI. Values from CLI arguments will override any values set in the properties file or gradle build file:

gradle snowflakeDeploy \
  --auth-url="myaccount.snowflakecomputing.com" \
  --auth-user="username" \
  --auth-password="password" \
  --auth-role="myrole" \
  --auth-db="mydb" \
  --auth-schema="myschema"

A single function or procedure can also be specified through command line arguments. The command line function/procedure will be created along with any objects defined in build.gradle. The arguments have the following syntax:

gradle snowflakeDeploy \
  --deploy-type="{procedure | function}" \
  --deploy-name="<name>" \
  --deploy-args="[ <arg_name> <arg_data_type> ] [ , ... ]" \
  --deploy-handler="<class>.<handler>" \
  --deploy-returns="<data_type>"

As an example:

gradle snowflakeDeploy \
  --deploy-type="procedure" \
  --deploy-name="mvnStringConcat" \
  --deploy-args="a string, b string" \
  --deploy-handler="SimpleUdf.stringConcat" \
  --deploy-returns="string"
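
After a successful deploy, the new procedure can be invoked from any Snowflake session:

CALL mvnStringConcat('Hello, ', 'world');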

Notes

Dependency reuse

When uploading to stage, the plugin will structure dependency artifacts like a local .m2 cache, with directories following an artifact's organization name and version.

By default, build artifacts are overwritten on each publish, but existing dependency files will not be uploaded again unless their version changes.
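
For example, after a deploy the stage might contain the following (paths mirror the stage-path mapping shown in the deploy logs below; names are illustrative):

@STAGE_NAME/libs/my-project-0.0.1.jar                        <-- build artifact, overwritten each publish
@STAGE_NAME/com/snowflake/snowpark/1.8.0/snowpark-1.8.0.jar  <-- dependency, keyed by organization and version
@STAGE_NAME/org/slf4j/slf4j-api/1.7.32/slf4j-api-1.7.32.jar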

Contributors

Special thanks to...

  • Stewart Bryson for guidance and providing a reference in his own Gradle plugin for Snowflake
  • Jonathan Cui for bootstrapping the project during his internship at Snowflake

Issues

Skip Upload of Snowpark and Transitive dependencies

Snowpark is a large dependency of stored procedure projects and doesn't need to be uploaded, since Snowpark is already available on the Snowflake backend. We can skip uploading Snowpark and its transitive dependencies (and remove them from the IMPORTS section of the CREATE PROCEDURE statement) to speed up deployments from the plugin.

SNOW-781962: Publish the Gradle plugin to Gradle Portal

Today, a Gradle user needs to add Maven Central to their settings.gradle file:

pluginManagement {
    repositories {
        mavenCentral()
    }
}

The Gradle plugin should be available on the Gradle Plugin Portal as well so that users don't need to do this.

SNOW-815039: Validate argument types on client before starting upload

In my pom.xml I had the following:

...
<procedure>
  <name>hello_world_proc</name>
  <handler>org.example.procedure.App.run</handler>
  <args></args>
  <returns>long</returns> <!-- problem -->
</procedure>
...

And it turns out long was incorrect (which may be a separate issue, because we really should be inferring the SQL types from the Java types). It went through the whole upload process, then ultimately failed at the CREATE step:

(base) jfreeberg@YKQ2Q2CMYG snowpark-java-template % mvn snowflake:deploy 
[INFO] Scanning for projects...
[INFO] 
[INFO] ----------------< com.snowflake:snowpark-java-template >----------------
[INFO] Building Java 11 Project Template for Snowflake 0.0.1
[INFO] --------------------------------[ jar ]---------------------------------
[INFO] 
[INFO] --- snowflake-maven-plugin:0.1.0:deploy (default-cli) @ snowpark-java-template ---
[INFO] Execute copy dependencies. Destination: /Users/jfreeberg/Documents/GitHub/snowpark-java-template/target/dependency
[INFO] com.snowflake:snowpark:jar:1.8.0 already exists in destination.
[INFO] org.scala-lang:scala-library:jar:2.12.11 already exists in destination.
[INFO] org.scala-lang:scala-compiler:jar:2.12.11 already exists in destination.
[INFO] org.scala-lang:scala-reflect:jar:2.12.11 already exists in destination.
[INFO] org.scala-lang.modules:scala-xml_2.12:jar:1.0.6 already exists in destination.
[INFO] commons-io:commons-io:jar:2.11.0 already exists in destination.
[INFO] javax.xml.bind:jaxb-api:jar:2.2.2 already exists in destination.
[INFO] javax.xml.stream:stax-api:jar:1.0-2 already exists in destination.
[INFO] javax.activation:activation:jar:1.1 already exists in destination.
[INFO] org.slf4j:slf4j-api:jar:1.7.32 already exists in destination.
[INFO] org.slf4j:slf4j-simple:jar:1.7.32 already exists in destination.
[INFO] commons-codec:commons-codec:jar:1.15 already exists in destination.
[INFO] net.snowflake:snowflake-jdbc:jar:3.13.28 already exists in destination.
[INFO] com.github.vertical-blank:sql-formatter:jar:1.0.2 already exists in destination.
[INFO] com.fasterxml.jackson.core:jackson-databind:jar:2.13.4.2 already exists in destination.
[INFO] com.fasterxml.jackson.core:jackson-core:jar:2.13.2 already exists in destination.
[INFO] com.fasterxml.jackson.core:jackson-annotations:jar:2.13.2 already exists in destination.
[INFO] Can't extract module name from scala-xml_2.12-1.0.6.jar: scala.xml.2.12: Invalid module name: '2' is not a Java identifier
[INFO] Mapped dependencies to stage paths: {scala-compiler-2.12.11.jar=org/scala-lang/scala-compiler/2.12.11, activation-1.1.jar=javax/activation/activation/1.1, slf4j-simple-1.7.32.jar=org/slf4j/slf4j-simple/1.7.32, jackson-core-2.13.2.jar=com/fasterxml/jackson/core/jackson-core/2.13.2, scala-library-2.12.11.jar=org/scala-lang/scala-library/2.12.11, commons-codec-1.15.jar=commons-codec/commons-codec/1.15, jackson-annotations-2.13.2.jar=com/fasterxml/jackson/core/jackson-annotations/2.13.2, slf4j-api-1.7.32.jar=org/slf4j/slf4j-api/1.7.32, jackson-databind-2.13.4.2.jar=com/fasterxml/jackson/core/jackson-databind/2.13.4.2, stax-api-1.0-2.jar=javax/xml/stream/stax-api/1.0-2, scala-reflect-2.12.11.jar=org/scala-lang/scala-reflect/2.12.11, commons-io-2.11.0.jar=commons-io/commons-io/2.11.0, sql-formatter-1.0.2.jar=com/github/vertical-blank/sql-formatter/1.0.2, snowflake-jdbc-3.13.28.jar=net/snowflake/snowflake-jdbc/3.13.28, snowpark-1.8.0.jar=com/snowflake/snowpark/1.8.0, scala-xml_2.12-1.0.6.jar=org/scala-lang/modules/scala-xml_2.12/1.0.6, jaxb-api-2.2.2.jar=javax/xml/bind/jaxb-api/2.2.2}
[INFO] set JDBC client memory limit to 10240
[INFO] Creating connection to snowflake at url: jdbc:snowflake://https://pm.snowflakecomputing.com:443
[INFO] Snowflake Session established!
[INFO] Creating stage STAGE_NAME if not exists
[INFO] Stage located or created!
[INFO] Uploading artifact JAR: /Users/jfreeberg/Documents/GitHub/snowpark-java-template/target/snowpark-java-template-0.0.1.jar
May 12, 2023 11:40:34 AM net.snowflake.client.jdbc.internal.amazonaws.util.Base64 <clinit>
WARNING: JAXB is unavailable. Will fallback to SDK implementation which may be less performant.If you are using Java 9+, you will need to include javax.xml.bind:jaxb-api as a dependency.
[INFO] Artifact JAR uploaded!
[INFO] Uploading dependency JARs from: /Users/jfreeberg/Documents/GitHub/snowpark-java-template/target/dependency
[INFO] Uploading scala-compiler-2.12.11.jar
[INFO] Uploading activation-1.1.jar
[INFO] Uploading slf4j-simple-1.7.32.jar
[INFO] Uploading jackson-core-2.13.2.jar
[INFO] Uploading scala-library-2.12.11.jar
[INFO] Uploading commons-codec-1.15.jar
[INFO] Uploading jackson-annotations-2.13.2.jar
[INFO] Uploading slf4j-api-1.7.32.jar
[INFO] Uploading jackson-databind-2.13.4.2.jar
[INFO] Uploading stax-api-1.0-2.jar
[INFO] Uploading scala-reflect-2.12.11.jar
[INFO] Uploading commons-io-2.11.0.jar
[INFO] Uploading sql-formatter-1.0.2.jar
[INFO] Uploading snowflake-jdbc-3.13.28.jar
[INFO] Uploading snowpark-1.8.0.jar
[INFO] Uploading scala-xml_2.12-1.0.6.jar
[INFO] Uploading jaxb-api-2.2.2.jar
[INFO] Dependency JARs uploaded!
[INFO] Running create function statement: 
[INFO] CREATE OR REPLACE function combineStrings (a string, b string)
RETURNS string
LANGUAGE java
HANDLER = 'org.example.udf.Function.combineStrings'
IMPORTS = ('@STAGE_NAME/libs/snowpark-java-template-0.0.1.jar', '@STAGE_NAME/org/scala-lang/scala-compiler/2.12.11/scala-compiler-2.12.11.jar', '@STAGE_NAME/javax/activation/activation/1.1/activation-1.1.jar', '@STAGE_NAME/org/slf4j/slf4j-simple/1.7.32/slf4j-simple-1.7.32.jar', '@STAGE_NAME/com/fasterxml/jackson/core/jackson-core/2.13.2/jackson-core-2.13.2.jar', '@STAGE_NAME/org/scala-lang/scala-library/2.12.11/scala-library-2.12.11.jar', '@STAGE_NAME/commons-codec/commons-codec/1.15/commons-codec-1.15.jar', '@STAGE_NAME/com/fasterxml/jackson/core/jackson-annotations/2.13.2/jackson-annotations-2.13.2.jar', '@STAGE_NAME/org/slf4j/slf4j-api/1.7.32/slf4j-api-1.7.32.jar', '@STAGE_NAME/com/fasterxml/jackson/core/jackson-databind/2.13.4.2/jackson-databind-2.13.4.2.jar', '@STAGE_NAME/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar', '@STAGE_NAME/org/scala-lang/scala-reflect/2.12.11/scala-reflect-2.12.11.jar', '@STAGE_NAME/commons-io/commons-io/2.11.0/commons-io-2.11.0.jar', '@STAGE_NAME/com/github/vertical-blank/sql-formatter/1.0.2/sql-formatter-1.0.2.jar', '@STAGE_NAME/net/snowflake/snowflake-jdbc/3.13.28/snowflake-jdbc-3.13.28.jar', '@STAGE_NAME/com/snowflake/snowpark/1.8.0/snowpark-1.8.0.jar', '@STAGE_NAME/org/scala-lang/modules/scala-xml_2.12/1.0.6/scala-xml_2.12-1.0.6.jar', '@STAGE_NAME/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar');
[INFO] Running create function statement: 
[INFO] CREATE OR REPLACE procedure hello_world_proc ()
RETURNS long
LANGUAGE java
PACKAGES = ('com.snowflake:snowpark:latest')
HANDLER = 'org.example.procedure.App.run'
IMPORTS = ('@STAGE_NAME/libs/snowpark-java-template-0.0.1.jar', '@STAGE_NAME/org/scala-lang/scala-compiler/2.12.11/scala-compiler-2.12.11.jar', '@STAGE_NAME/javax/activation/activation/1.1/activation-1.1.jar', '@STAGE_NAME/org/slf4j/slf4j-simple/1.7.32/slf4j-simple-1.7.32.jar', '@STAGE_NAME/com/fasterxml/jackson/core/jackson-core/2.13.2/jackson-core-2.13.2.jar', '@STAGE_NAME/org/scala-lang/scala-library/2.12.11/scala-library-2.12.11.jar', '@STAGE_NAME/commons-codec/commons-codec/1.15/commons-codec-1.15.jar', '@STAGE_NAME/com/fasterxml/jackson/core/jackson-annotations/2.13.2/jackson-annotations-2.13.2.jar', '@STAGE_NAME/org/slf4j/slf4j-api/1.7.32/slf4j-api-1.7.32.jar', '@STAGE_NAME/com/fasterxml/jackson/core/jackson-databind/2.13.4.2/jackson-databind-2.13.4.2.jar', '@STAGE_NAME/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar', '@STAGE_NAME/org/scala-lang/scala-reflect/2.12.11/scala-reflect-2.12.11.jar', '@STAGE_NAME/commons-io/commons-io/2.11.0/commons-io-2.11.0.jar', '@STAGE_NAME/com/github/vertical-blank/sql-formatter/1.0.2/sql-formatter-1.0.2.jar', '@STAGE_NAME/net/snowflake/snowflake-jdbc/3.13.28/snowflake-jdbc-3.13.28.jar', '@STAGE_NAME/com/snowflake/snowpark/1.8.0/snowpark-1.8.0.jar', '@STAGE_NAME/org/scala-lang/modules/scala-xml_2.12/1.0.6/scala-xml_2.12-1.0.6.jar', '@STAGE_NAME/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar');
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  01:13 min
[INFO] Finished at: 2023-05-12T11:41:41-07:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal com.snowflake:snowflake-maven-plugin:0.1.0:deploy (default-cli) on project snowpark-java-template: Error creating function or procedure.: SQL compilation error:
[ERROR] Unsupported data type 'LONG'.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

The upload process can take a while (which is a separate area we can improve), so we should try to catch these unsupported argument types before we start the JAR upload.
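
A lightweight client-side check could reject unknown type names before any upload begins. A minimal sketch, assuming a hand-maintained whitelist (the set below is illustrative, not exhaustive):

import java.util.Locale;
import java.util.Set;

public final class TypeValidator {
    // Illustrative subset of Snowflake SQL data types; 'long' would be rejected here
    private static final Set<String> SUPPORTED_TYPES = Set.of(
            "string", "varchar", "number", "integer", "int", "float",
            "double", "boolean", "date", "timestamp", "variant", "binary");

    public static void validate(String sqlType) {
        if (!SUPPORTED_TYPES.contains(sqlType.toLowerCase(Locale.ROOT))) {
            throw new IllegalArgumentException("Unsupported data type '" + sqlType + "'");
        }
    }
}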

SNOW-800801 Ignore "@" in stage name

When a user specifies the stage name in the pom.xml, they may prepend the "@" symbol since that's the common syntax for other clients and in the DDL. The Maven plugin should ignore this character.

This is very similar to situations in other libraries where a user provides a URL which may contain a trailing slash (https://google.com vs https://google.com/) and a good client will handle both versions correctly.
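
The fix could be a one-line normalization wherever the stage name is read from configuration; a sketch (the method name is hypothetical):

// Treat "@MY_STAGE" and "MY_STAGE" identically by stripping a single leading '@'
static String normalizeStageName(String stage) {
    return stage.startsWith("@") ? stage.substring(1) : stage;
}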

SNOW-815068: Idempotent JAR Uploads

It looks like the Maven plugin is uploading the dependency JARs when I re-run mvn snowflake:deploy, even if nothing has changed between the first and second runs. I think it would be great to check whether a dependency already exists on the stage before uploading it, to reduce the overall deployment time.

SNOW-1050643: Why Packages are omitted for function creation?

In the documentation it's written, that:

For Snowflake system packages, such as the Snowpark package, you can specify the package with the PACKAGES clause rather than specifying its JAR file with IMPORTS. When you do, the package JAR file need not be included in an IMPORTS value.

But in the code, PACKAGES is always an empty string for functions. Why is that? Also, why can't I specify an exact package version, i.e. com.snowflake:snowpark:1.8.0?
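
For context, the procedure DDL in the logs above always pins PACKAGES = ('com.snowflake:snowpark:latest'), while the generated function DDL omits the clause entirely. The behavior requested here would produce something like the following (a sketch, with the version pinned for illustration):

CREATE OR REPLACE function combineStrings (a string, b string)
RETURNS string
LANGUAGE java
PACKAGES = ('com.snowflake:snowpark:1.8.0')
HANDLER = 'org.example.udf.Function.combineStrings';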

SNOW-781801: Infer the parameters and return types for UDFs/sprocs

Today, the user must manually specify the argument types, argument names, and return type for their UDFs and stored procedures. This can be tiresome to keep in sync as the implementation of the UDF changes, and it's a lot of XML to write.

It would be better if we could use reflection or code scanning to infer the argument types, argument names, and return types for a UDF or stored procedure. Example:

<functions>
    <function>
        <name>funcNameOnSnowflake</name>
        <handler>ClassName.MethodName</handler>
        <!-- No argument names, types, or return types required. -->
    </function>
</functions>

The user should still be allowed to manually specify those fields, and if they are specified in the pom.xml or build.gradle, then they take priority over the inferred values.
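
A rough sketch of the reflection-based inference this proposes, assuming the compiled classes are on the plugin's classpath (the Java-to-SQL type mapping is illustrative and incomplete; recovering argument names would additionally require compiling with -parameters):

import java.lang.reflect.Method;
import java.util.Map;
import java.util.StringJoiner;

public final class SignatureInference {
    // Illustrative Java-to-SQL type mapping
    private static final Map<Class<?>, String> SQL_TYPES = Map.of(
            String.class, "string",
            int.class, "integer",
            long.class, "number",
            double.class, "double",
            boolean.class, "boolean");

    // Find the handler method by name and print the inferred SQL signature
    public static void infer(String className, String methodName) throws Exception {
        for (Method m : Class.forName(className).getMethods()) {
            if (m.getName().equals(methodName)) {
                StringJoiner args = new StringJoiner(", ", "(", ")");
                for (Class<?> p : m.getParameterTypes()) {
                    args.add(SQL_TYPES.getOrDefault(p, "variant"));
                }
                System.out.println(args + " RETURNS " + SQL_TYPES.getOrDefault(m.getReturnType(), "variant"));
                return;
            }
        }
        throw new NoSuchMethodException(className + "." + methodName);
    }
}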
