
amazon-kinesis-data-analytics-examples's Introduction

Amazon Managed Service for Apache Flink examples


⚠️ This repository is obsolete. Please refer to the new Amazon Managed Service for Apache Flink examples repo.



🚨 August 30, 2023: Amazon Kinesis Data Analytics has been renamed to Amazon Managed Service for Apache Flink.


Example applications in Java, Python and SQL for Kinesis Data Analytics, demonstrating sources, sinks, and operators.

License Summary

This sample code is made available under the MIT-0 license. See the LICENSE file.

amazon-kinesis-data-analytics-examples's People

Contributors

anandshah123, antovespoli, awsalialem, babakc, dannycranmer, darenwkt, dependabot[bot], fletpatr, flo-mair, franmorilloaws, jeremyber-aws, jpeddicord, jyusov, karthitect, nicusx, patelpratik4job, psolomin, rajdban, spaethp, z3d1k


amazon-kinesis-data-analytics-examples's Issues

Python examples: please clarify which kinesis connector JAR should be used

In this codebase, the JAR listed in the application properties of several Python examples is one of the following:

  1. amazon-kinesis-connector-flink-2.0.0.jar
  2. flink-sql-connector-kinesis_2.12-1.13.2.jar
  3. amazon-kinesis-sql-connector-flink-2.0.3.jar
  4. and possibly other variations

For example: https://github.com/aws-samples/amazon-kinesis-data-analytics-examples/blob/master/python/S3Sink/application_properties.json

In the AWS documentation, however, the JAR mentioned is flink-sql-connector-kinesis_2.12-1.13.1.jar in some places, and the 1.13.2 version elsewhere within the same article.

Located here: https://docs.aws.amazon.com/kinesisanalytics/latest/java/gs-python-createapp.html

Please clarify which of these dependencies should be included in the uploaded ZIP file
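
For reference, the Python examples point the runtime at the connector JAR through the application properties; below is a rough sketch of the relevant fragment of application_properties.json (the script and JAR names are placeholders, not a recommendation of a specific version):

[
  {
    "PropertyGroupId": "kinesis.analytics.flink.run.options",
    "PropertyMap": {
      "python": "streaming-app.py",
      "jarfile": "flink-sql-connector-kinesis_2.12-1.13.2.jar"
    }
  }
]

Whichever JAR the maintainers confirm is the one that should be named here and packaged in the uploaded ZIP.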

java.lang.IllegalArgumentException: Invalid lambda deserialization

Hi all,

amazon-kinesis-data-analytics-java-examples/HudiConnector/

I don't understand why we can't run the code below in the Hudi connector example:

private static StreamingFileSink<UserInfo> createS3SinkFromStaticConfig() {

        final StreamingFileSink<UserInfo> sink = StreamingFileSink
                .forBulkFormat(new Path(local), ParquetAvroWriters.forReflectRecord(UserInfo.class))
                // Use hive style partitioning
                .withBucketAssigner(new DateTimeBucketAssigner<>("'year='yyyy'/month='MM'/day='dd'/hour='HH/"))
                .withOutputFileConfig(OutputFileConfig.builder()
                        .withPartSuffix(".parquet")
                        .build())
                .build();
        return sink;
    }

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(
                env, EnvironmentSettings.newInstance().useBlinkPlanner().build());
        Configuration configuration = tableEnv.getConfig().getConfiguration();
        configuration.setString("execution.checkpointing.interval", "1 min");

        LOG.info("main started");
        DataStream<String> input = createSourceFromStaticConfig(env);

        LOG.info("read from source");
        DataStream<UserInfo> UserInfoDataStream = input.map((MapFunction<String, UserInfo>) CopData::getDataObject);
        UserInfoDataStream.addSink(createS3SinkFromStaticConfig()).name("S3 Parquet Sink");
        env.execute("Flink S3 Streaming Sink Job");
    }

The problem is only in this method, createS3SinkFromStaticConfig.

I get this exception:

Caused by: java.io.IOException: unexpected exception type
	at java.base/java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1641)
	at java.base/java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1271)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2205)
	at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
	at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2464)
	at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2358)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2196)
	at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
	at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2464)
	at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2358)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2196)
	at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
	at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2464)
	at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2358)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2196)
	at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
	at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2464)
	at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2358)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2196)
	at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
	at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2464)
	at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2358)
	at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2196)
	at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1679)
	at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:493)
	at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:451)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
	at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
	at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperatorFactory(StreamConfig.java:322)
	... 12 more
Caused by: java.lang.reflect.InvocationTargetException
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at java.base/java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:237)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at java.base/java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1265)
	... 41 more
Caused by: java.lang.IllegalArgumentException: Invalid lambda deserialization
	at org.apache.flink.formats.parquet.avro.ParquetAvroWriters.$deserializeLambda$(ParquetAvroWriters.java:40)
	... 51 more

When I remove the dependency below from the POM, it works:

<dependency>
			<groupId>org.apache.hudi</groupId>
			<artifactId>hudi-flink-bundle_${scala.binary.version}</artifactId>
			<version>${hudi.version}</version>
			<exclusions>
				<exclusion>
					<groupId>com.esotericsoftware.kryo</groupId>
					<artifactId>kryo</artifactId>
				</exclusion>
			</exclusions>
		</dependency>

NOTE: nothing has been changed from the example POM.

Please help.
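
A first diagnostic step that might narrow this down (a sketch, not a confirmed fix): since the failure happens while deserializing a lambda defined in ParquetAvroWriters, check whether two different copies of that class end up on the classpath once the Hudi bundle is added, for example with jar tf (jar names and versions below are placeholders):

jar tf target/<application-jar>.jar | grep ParquetAvroWriters
jar tf hudi-flink-bundle_2.12-<hudi.version>.jar | grep ParquetAvroWriters

If the Hudi bundle ships a different version of org/apache/flink/formats/parquet/avro/ParquetAvroWriters.class than the one the job was compiled against, a lambda serialized against one copy cannot be deserialized against the other, which would match the "Invalid lambda deserialization" above.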

S3ParquetSink example

Presently the repository has an example that writes to S3 in text mode. This issue tracks adding an example that writes to S3 in a bulk format such as Parquet.
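
A minimal sketch of what such an example could look like, reusing the bulk-format StreamingFileSink API that already appears in the Hudi issue above (the S3 path passed in is a placeholder):

import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.avro.ParquetAvroWriters;
import org.apache.flink.streaming.api.functions.sink.filesystem.OutputFileConfig;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;
import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.DateTimeBucketAssigner;

public class S3ParquetSinkSketch {
    // Writes POJOs of type T to S3 as Parquet files, bucketed hive-style by year/month/day/hour.
    static <T> StreamingFileSink<T> createParquetSink(Class<T> recordClass, String s3Path) {
        return StreamingFileSink
                .forBulkFormat(new Path(s3Path), ParquetAvroWriters.forReflectRecord(recordClass))
                .withBucketAssigner(new DateTimeBucketAssigner<>("'year='yyyy'/month='MM'/day='dd'/hour='HH/"))
                .withOutputFileConfig(OutputFileConfig.builder().withPartSuffix(".parquet").build())
                .build();
    }
}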

Issue with Pom dependency for getting started project

The repository location for flink-connector-kinesis_2.11 referenced by the Getting Started POM has changed. The Getting Started POM needs to be changed to:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kinesis_2.11</artifactId>
    <version>1.11.2</version>
</dependency>

While building, it throws an error that the location for the JAR cannot be found.
Due to the licensing issue, the flink-connector-kinesis_2.11 artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kinesis.html

java.io.NotSerializableException: Non-serializable lambda

When trying to start a Kinesis Data Analytics application with my Flink code, I'm getting the following error: org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Cannot serialize operator object class org.apache.flink.streaming.runtime.operators.sink.SinkWriterOperatorFactory, caused by java.io.NotSerializableException: Non-serializable lambda. I'm using exactly the same code provided in the examples for creating the Kinesis Data Streams sink, and the same lines in build.sbt. What could be the issue?

Seems like it's the lambda from the setPartitionKeyGenerator method.

What could be a workaround?

def createSink: KinesisStreamsSink[String] = {
  val applicationProperties = KinesisAnalyticsRuntime.getApplicationProperties
  val outputProperties = applicationProperties.get("ProducerConfigProperties")

  KinesisStreamsSink.builder[String]
    .setKinesisClientProperties(outputProperties)
    .setSerializationSchema(new SimpleStringSchema)
    .setStreamName(outputProperties.getProperty(streamNameKey, defaultOutputStreamName))
    .setPartitionKeyGenerator((element: String) => String.valueOf(element.hashCode))
    .build
}
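
One workaround that often sidesteps Scala lambda serialization problems is to pass an explicit, serializable implementation instead of a lambda. A sketch, assuming the connector's org.apache.flink.connector.kinesis.sink.PartitionKeyGenerator interface and the same streamNameKey/defaultOutputStreamName values as in the snippet above:

import com.amazonaws.services.kinesisanalytics.runtime.KinesisAnalyticsRuntime
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.connector.kinesis.sink.{KinesisStreamsSink, PartitionKeyGenerator}

// Explicit replacement for the lambda passed to setPartitionKeyGenerator.
class HashCodePartitionKeyGenerator extends PartitionKeyGenerator[String] with Serializable {
  override def apply(element: String): String = String.valueOf(element.hashCode)
}

def createSink: KinesisStreamsSink[String] = {
  val outputProperties = KinesisAnalyticsRuntime.getApplicationProperties.get("ProducerConfigProperties")

  KinesisStreamsSink.builder[String]
    .setKinesisClientProperties(outputProperties)
    .setSerializationSchema(new SimpleStringSchema)
    .setStreamName(outputProperties.getProperty(streamNameKey, defaultOutputStreamName))
    .setPartitionKeyGenerator(new HashCodePartitionKeyGenerator)
    .build
}

Whether the lambda really is the culprit needs confirming against the KDA runtime, but an explicit class keeps the generator out of Scala's lambda serialization path.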

Issue with the Beam example documentation

Hi aws-samples maintainers & community, thank you for the examples, they help a lot.

I found a small issue in one of the examples, specifically amazon-kinesis-data-analytics-java-examples/Beam. The code itself is fine, but the corresponding documentation is not.

Page https://docs.aws.amazon.com/kinesisanalytics/latest/java/examples-beam.html#examples-beam-resources recommends to:

  • use Apache Flink 1.11 cluster version
  • and build fat jar using this command: mvn package -Dflink.version=1.11.1 -Dflink.version.minor=1.8

If you build the example, upload the .jar file, and try to run your Kinesis Data Analytics app, you will get this error:

Type 'org/apache/flink/streaming/api/transformations/TwoInputTransformation' (current frame, stack[4]) is not assignable to 'org/apache/flink/streaming/api/transformations/StreamTransformation'

The Beam documentation for the Flink runner (https://beam.apache.org/documentation/runners/flink/#flink-version-compatibility) states, though:

The Flink cluster version has to match the minor version used by the FlinkRunner. The minor version is the first two numbers in the version string, e.g. in 1.8.0 the minor version is 1.8.

Thus flink.version.minor must be 1.11, not 1.8. The full set of POM properties that works for me with the latest version on the master branch:

<properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <java.version>1.11</java.version>
        <scala.binary.version>2.12</scala.binary.version>
        <kda-runtime.version>1.2.0</kda-runtime.version>
        <beam.version>2.28.0</beam.version>
        <jackson.version>2.10.2</jackson.version>
        <flink.version>1.11.3</flink.version>
        <flink.version.minor>1.11</flink.version.minor>
    </properties>
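
With these properties, the fat jar can be built with the corrected command, e.g.:

mvn package -Dflink.version=1.11.3 -Dflink.version.minor=1.11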

Scala NoSuchMethodError: 'scala.collection.mutable.Buffer (and all other Scala packages)

Steps to reproduce:

  • Pull the most recent revision of the project
  • Open Scala S3Sink project
  • add the following (see below) to the main method, along with the import
  • run sbt assembly (or your usual step to build the jar)
    import scala.jdk.CollectionConverters._

    val a = util.Arrays.asList(1,2,3,4,5)
    val b = a.asScala.toList.map { x =>
      x^2
    }

    log.info(s"Does it work?: $b")

Error in runtime: java.lang.NoSuchMethodError: 'scala.collection.mutable.Buffer scala.collection.convert.AsScalaConverters.asScala$(scala.collection.convert.AsScalaConverters, java.util.List)'

Please note: while inspecting the jar you can see that all the required libraries are already included; for example, scala.collection.mutable.Buffer and scala.collection.convert.AsScalaConverters are already part of the jar and the runtime. You can inspect the jar with the command jar tf <file>. However, it does not work when you build and run it as a Kinesis Data Analytics Flink application.
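
A possible explanation, offered as an assumption rather than a confirmed diagnosis: scala.jdk.CollectionConverters only exists from Scala 2.13 onwards, so if the jar is built against 2.13 while the Flink runtime in KDA resolves a 2.12 scala-library first, the 2.13 converter methods cannot be found at runtime even though the classes are inside the jar. A sketch of a 2.12-compatible variant of the test snippet, for a project built with scalaVersion 2.12.x:

// Uses the converters available in Scala 2.12, matching a 2.12 scala-library on the runtime classpath.
import scala.collection.JavaConverters._

val a = java.util.Arrays.asList(1, 2, 3, 4, 5)
val b = a.asScala.toList.map(x => x * x) // note: x ^ 2 in the original snippet is bitwise XOR in Scala, not a square
println(s"Does it work?: $b")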

Couldn't deploy python getting started example

I was trying to put together the Getting Started example; however, I'm constantly getting the same error:

"message": "org.apache.flink.runtime.rest.handler.RestHandlerException: Could not execute application.
	at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:248)
	at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930)
	at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907)
	at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
	at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
	at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
	... 6 more
Caused by: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
	at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:140)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
	at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
	at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84)
	at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70)
	at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:239)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
	... 6 more
Caused by: java.lang.RuntimeException: Python process exits with code: 1
	at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:130)
	... 17 more
", "messageType": "ERROR", "messageSchemaVersion": "1", "errorCode": "CodeError.InvalidApplicationCode"

I'm using flink-sql-connector-kinesis-1.15.2.jar

Locally the setup works perfectly; I can see the code running.

Could you point me to the next steps to check?

Running python example "GettingStarted" with flink-sql-connector-kinesis_2.12-1.13.2 and amazon-kinesis-sql-connector-flink-2.3.0

When running the Getting Started (Python) example, the application does not start in KDA.

I have read this issue, where it is recommended to use 'amazon-kinesis-sql-connector-flink-2.3.0.jar' as the JAR file passed in the Python script, but with no luck. The error I am getting is the following:

Traceback (most recent call last):
  File "/home/bluetab/Escritorio/OCS/Proyectos/new-nrt-dashboard-core/getting-started.py", line 154, in <module>
    main()
  File "/home/bluetab/Escritorio/OCS/Proyectos/new-nrt-dashboard-core/getting-started.py", line 143, in main
    table_result = table_env.execute_sql("INSERT INTO {0} SELECT * FROM {1}"
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/pyflink/table/table_environment.py", line 876, in execute_sql
    return TableResult(self._j_tenv.executeSql(stmt))
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/py4j/java_gateway.py", line 1285, in __call__
    return_value = get_return_value(
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/pyflink/util/exceptions.py", line 146, in deco
    return f(*a, **kw)
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o4.executeSql.
: org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'default_catalog.default_database.input_table'.

Table options are:

'aws.region'='eu-west-1'
'connector'='kinesis'
'format'='json'
'json.timestamp-format.standard'='ISO-8601'
'scan.stream.initpos'='LATEST'
'sink.partitioner-field-delimiter'=';'
'sink.producer.collection-max-count'='100'
'stream'='input-kinesis-apache-flink'
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:150)
at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:116)
at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:82)
at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3585)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2507)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2144)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:177)
at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:169)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:1057)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:1026)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:301)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlInsert(SqlToOperationConverter.java:639)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:290)
at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:101)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:736)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.NoSuchMethodError: 'org.apache.flink.table.catalog.CatalogTable org.apache.flink.table.factories.DynamicTableFactory$Context.getCatalogTable()'
at software.amazon.kinesis.connectors.flink.table.KinesisDynamicTableFactory.createDynamicTableSource(KinesisDynamicTableFactory.java:66)
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:147)
... 31 more

When running the same code but with the original .jar file ('flink-sql-connector-kinesis_2.12-1.13.2') used in the official documentation, I get the application to run, but it ends up failing with the following error:

orio/OCS/Proyectos/new-nrt-dashboard-core/getting-started.py
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.java.ClosureCleaner (file:/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/pyflink/lib/flink-dist_2.11-1.14.4.jar) to field java.util.Collections$SingletonList.serialVersionUID
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.java.ClosureCleaner
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Traceback (most recent call last):
  File "/home/bluetab/Escritorio/OCS/Proyectos/new-nrt-dashboard-core/getting-started.py", line 154, in <module>
    main()
  File "/home/bluetab/Escritorio/OCS/Proyectos/new-nrt-dashboard-core/getting-started.py", line 147, in main
    table_result.wait()
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/pyflink/table/table_result.py", line 76, in wait
    get_method(self._j_table_result, "await")()
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/py4j/java_gateway.py", line 1285, in __call__
    return_value = get_return_value(
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/pyflink/util/exceptions.py", line 146, in deco
    return f(*a, **kw)
  File "/home/bluetab/Escritorio/OCS/Proyectos/venvs/new-nrt/lib/python3.8/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o90.await.
: java.util.concurrent.ExecutionException: org.apache.flink.table.api.TableException: Failed to wait job finish
at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
at org.apache.flink.table.api.internal.TableResultImpl.awaitInternal(TableResultImpl.java:129)
at org.apache.flink.table.api.internal.TableResultImpl.await(TableResultImpl.java:92)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.flink.table.api.TableException: Failed to wait job finish
at org.apache.flink.table.api.internal.InsertResultIterator.hasNext(InsertResultIterator.java:56)
at org.apache.flink.table.api.internal.TableResultImpl$CloseableRowIteratorWrapper.hasNext(TableResultImpl.java:370)
at org.apache.flink.table.api.internal.TableResultImpl$CloseableRowIteratorWrapper.isFirstRowReady(TableResultImpl.java:383)
at org.apache.flink.table.api.internal.TableResultImpl.lambda$awaitInternal$1(TableResultImpl.java:116)
at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1736)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
... 1 more
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
at org.apache.flink.table.api.internal.InsertResultIterator.hasNext(InsertResultIterator.java:54)
... 7 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:137)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$1(AkkaInvocationHandler.java:258)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.util.concurrent.FutureUtils.doForward(FutureUtils.java:1389)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$null$1(ClassLoadingUtils.java:93)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$guardCompletionWithContextClassLoader$2(ClassLoadingUtils.java:92)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$1.onComplete(AkkaFutureUtils.java:47)
at akka.dispatch.OnComplete.internal(Future.scala:300)
at akka.dispatch.OnComplete.internal(Future.scala:297)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:224)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:221)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$DirectExecutionContext.execute(AkkaFutureUtils.java:65)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:621)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:24)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:23)
at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444)
at jdk.internal.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
... 5 more
Caused by: java.lang.NoSuchMethodError: 'org.apache.flink.metrics.MetricGroup org.apache.flink.api.common.functions.RuntimeContext.getMetricGroup()'
at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.(KinesisDataFetcher.java:416)
at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.(KinesisDataFetcher.java:365)
at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.createFetcher(FlinkKinesisConsumer.java:536)
at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:308)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:67)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:323)
Suppressed: java.lang.NullPointerException
at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.close(FlinkKinesisConsumer.java:421)
at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:41)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.close(AbstractUdfStreamOperator.java:114)
at org.apache.flink.streaming.api.operators.StreamSource.close(StreamSource.java:124)
at org.apache.flink.streaming.runtime.tasks.StreamOperatorWrapper.close(StreamOperatorWrapper.java:141)
at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.closeAllOperators(RegularOperatorChain.java:127)
at org.apache.flink.streaming.runtime.tasks.StreamTask.closeAllOperators(StreamTask.java:1035)
at org.apache.flink.streaming.runtime.tasks.StreamTask.runAndSuppressThrowable(StreamTask.java:1021)
at org.apache.flink.streaming.runtime.tasks.StreamTask.cleanUp(StreamTask.java:928)
at org.apache.flink.runtime.taskmanager.Task.lambda$restoreAndInvoke$0(Task.java:940)
at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958)
at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:940)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:766)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:575)
at java.base/java.lang.Thread.run(Thread.java:829)

When decompiling both .jars I see the following packages (image attached in the original issue).

From my point of view, it looks like the new .jar (amazon-kinesis-sql-connector-flink-2.3.0) is missing the org.apache.flink package that did appear in the old .jar (flink-sql-connector-kinesis_2.12-1.13.2), but in that case I would expect a "ClassNotFoundException" instead of a "NoSuchMethodError"...

I am using apache-flink==1.14.4.

Thanks for your help!

Kafka example at https://docs.aws.amazon.com/managed-flink/latest/java/example-msk.html is not working

The example does not work either with or without SSL.
The problem is that the code does not take any property except the bootstrap URL and the topic.
Hence the SSL configuration does not work.
If you use plaintext, then you get this error:

org.apache.kafka.common.KafkaException: Unexpected error in InitProducerIdResponse; The transaction timeout is larger than the maximum value allowed by the broker (as configured by transaction.max.timeout.ms).\
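
A sketch of one way to address both points, under the assumption that the application properties are exposed through a property group (arbitrarily named "KafkaSink" here, with assumed key names) and forwarded wholesale to the producer, with transaction.timeout.ms capped below the broker's transaction.max.timeout.ms:

import com.amazonaws.services.kinesisanalytics.runtime.KinesisAnalyticsRuntime;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.base.DeliveryGuarantee;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;

import java.io.IOException;
import java.util.Properties;

public class KafkaSinkSketch {
    static KafkaSink<String> createKafkaSink() throws IOException {
        // Property group name "KafkaSink" and the key names below are assumptions.
        Properties props = KinesisAnalyticsRuntime.getApplicationProperties().get("KafkaSink");
        // Keep the transaction timeout below the broker's transaction.max.timeout.ms
        // (15 minutes by default), which is what the InitProducerIdResponse error complains about.
        props.putIfAbsent("transaction.timeout.ms", "300000");

        return KafkaSink.<String>builder()
                .setBootstrapServers(props.getProperty("bootstrap.servers"))
                .setKafkaProducerConfig(props) // SSL / SASL settings pass through here
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic(props.getProperty("topic"))
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .setDeliveryGuarantee(DeliveryGuarantee.EXACTLY_ONCE)
                .setTransactionalIdPrefix("msk-example")
                .build();
    }
}

Forwarding the full Properties object is what lets the SSL configuration reach the Kafka client; the explicit transaction timeout addresses the error quoted above.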

Job fails for a Python process due to RowTime field being null.

I am trying to follow the tutorials and also the closed issue #4 that helped me fix an earlier problem.

I get the following error when trying this example: https://github.com/jeremyber-aws/amazon-kinesis-data-analytics-java-examples/blob/master/python/TumblingWindow/tumbling-windows.py

java.lang.RuntimeException: RowTime field should not be null, please convert it to a non-null long value.
	at org.apache.flink.table.runtime.operators.wmassigners.WatermarkAssignerOperator.processElement(WatermarkAssignerOperator.java:115)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:717)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:692)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:672)
	at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:52)
	at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:30)
	at StreamExecCalc$23.processElement(Unknown Source)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:717)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:692)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:672)
	at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:52)
	at org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:30)
	at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
	at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collectWithTimestamp(StreamSourceContexts.java:111)
	at software.amazon.kinesis.connectors.flink.internals.KinesisDataFetcher.emitRecordAndUpdateState(KinesisDataFetcher.java:946)
	at software.amazon.kinesis.connectors.flink.internals.KinesisDataFetcher.access$000(KinesisDataFetcher.java:108)
	at software.amazon.kinesis.connectors.flink.internals.KinesisDataFetcher$AsyncKinesisRecordEmitter.emit(KinesisDataFetcher.java:329)
	at software.amazon.kinesis.connectors.flink.internals.KinesisDataFetcher$SyncKinesisRecordEmitter$1.put(KinesisDataFetcher.java:346)
	at software.amazon.kinesis.connectors.flink.internals.KinesisDataFetcher$SyncKinesisRecordEmitter$1.put(KinesisDataFetcher.java:343)
	at software.amazon.kinesis.connectors.flink.internals.KinesisDataFetcher.emitRecordAndUpdateState(KinesisDataFetcher.java:930)
	at software.amazon.kinesis.connectors.flink.internals.ShardConsumer.deserializeRecordForCollectionAndUpdateState(ShardConsumer.java:194)
	at software.amazon.kinesis.connectors.flink.internals.ShardConsumer.lambda$run$0(ShardConsumer.java:120)
	at software.amazon.kinesis.connectors.flink.internals.publisher.polling.PollingRecordPublisher.run(PollingRecordPublisher.java:115)
	at software.amazon.kinesis.connectors.flink.internals.publisher.polling.PollingRecordPublisher.run(PollingRecordPublisher.java:102)
	at software.amazon.kinesis.connectors.flink.internals.ShardConsumer.run(ShardConsumer.java:112)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)

I hit the same issue while building another KDA application with the following tables:

def create_source_table(table_name, stream_name, region, stream_initpos):
    return """CREATE TABLE {0} (
                id VARCHAR(64), 
                key VARCHAR(64), 
                network VARCHAR(64), 
                status_code INTEGER,
                duration BIGINT,
                request_count BIGINT, 
                response_size BIGINT, 
                duration_min BIGINT, 
                duration_max BIGINT, 
                duration_sum BIGINT,
                ts BIGINT,
                ts_time AS TO_TIMESTAMP(FROM_UNIXTIME(ts)),
                WATERMARK FOR ts_time AS ts_time - INTERVAL '5' SECOND
            ) 
            PARTITIONED BY (id)
            WITH (
                'connector' = 'kinesis',
                'stream' = '{1}',
                'aws.region' = '{2}',
                'aws.credentials.provider' = 'ASSUME_ROLE',
                'aws.credentials.role.sessionName' = 'test',
                'aws.credentials.role.arn' = 'arn:aws:iam::1234567890:role/test',
                'scan.stream.initpos' = '{3}',
                'format' = 'json',
                'json.timestamp-format.standard' = 'ISO-8601'
            )""".format(table_name, stream_name, region, stream_initpos)
def create_sink_table(table_name, bucket_name):
    return """CREATE TABLE {0} (
                id VARCHAR(64), 
                key VARCHAR(64), 
                network VARCHAR(64), 
                status_code INTEGER,
                duration BIGINT,
                request_count BIGINT, 
                response_size BIGINT, 
                duration_min BIGINT, 
                duration_max BIGINT, 
                duration_sum BIGINT,
                ts_time TIMESTAMP(3),
                WATERMARK FOR ts_time AS ts_time - INTERVAL '5' SECOND
            ) 
            PARTITIONED BY (id)
            WITH (
                'connector' = 'filesystem',
                'path' = 's3a://{1}/',
                'format' = 'csv',
                'sink.partition-commit.policy.kind' = 'success-file',
                'sink.partition-commit.delay' = '1 min'
            )""".format(table_name, bucket_name)

and inserting:

table_env.execute_sql(
        """INSERT INTO {0}
            SELECT
                id,
                key,
                network,
                status_code,
                COUNT(*) as request_count,
                SUM(response_size) as response_size,
                MIN(duration) as duration_min,
                MAX(duration) as duration_max,
                SUM(duration) as duration_sum,
                TUMBLE_START(ts_time, INTERVAL '1' MINUTE)
            FROM {1}
            GROUP BY TUMBLE(ts_time, INTERVAL '1' minute), id, key, network, status_code
        """.format(output_table_name, input_table_name)
    )
Source: TableSourceScan(table=[[default_catalog, default_database, input_table]], fields=[id, key, network, status_code, duration, request_count, response_size, duration_min, duration_max, duration_sum, ts]) -> Calc(select=[id, key, network, status_code, duration, request_count, response_size, duration_min, duration_max, duration_sum, ts, TO_TIMESTAMP(FROM_UNIXTIME(ts)) AS ts_time]) -> WatermarkAssigner(rowtime=[ts_time], watermark=[(ts_time - 5000:INTERVAL SECOND)]) -> Calc(select=[ts_time, id, auth_key, network, status_code, response_size, duration]) (1/1) 
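
The error message asks for a non-null long rowtime, so one possible adjustment (a sketch, not a verified fix) is to make the computed event-time column in create_source_table non-null, for example by defaulting a missing ts:

-- hypothetical change to the source table definition above
ts BIGINT,
ts_time AS TO_TIMESTAMP(FROM_UNIXTIME(COALESCE(ts, 0))),
WATERMARK FOR ts_time AS ts_time - INTERVAL '5' SECOND

Defaulting to the epoch keeps the watermark assigner from failing but effectively turns such records into very late data; filtering out records with a NULL ts upstream may be preferable if they can legitimately occur.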

scala examples do not work for packaging applications which use the TableAPI

Using scala/GettingStarted as a base, I added code that uses the Table API and a Kafka connector, and then received this error:

Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option: 'connector'='kafka'

The example build.sbt does not support assembling an uber jar that preserves the Service Provider Interface (SPI) files required by Flink.

I suggest that the build.sbt examples be updated, or that a new Scala example be created, demonstrating how to use the Table API.
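
For reference, a sketch of what such an update could look like, assuming sbt-assembly is the plugin in use: concatenating the META-INF/services entries preserves the ServiceLoader (SPI) files that Flink's connector factory discovery relies on.

// build.sbt (sketch; strategy names may differ between sbt-assembly versions)
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", "services", _*) => MergeStrategy.concat  // keep connector factory SPI files
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case _                                    => MergeStrategy.first
}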

As long as Java 11 or above, then fine

As long as Java 11 or above is used:

mvn -v
Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
Maven home: /Users/[user]/apache-maven-3.6.3
Java version: 19.0.1, vendor: Oracle Corporation, runtime: /Library/Java/JavaVirtualMachines/jdk-19.jdk/Contents/Home
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "13.0.1", arch: "x86_64", family: "mac"

then mvn package -Dflink.version=1.15.2 is fine for this tutorial.

[INFO] Scanning for projects...
[INFO]
[INFO] -----------< com.amazonaws:aws-kinesis-analytics-java-apps >------------
[INFO] Building aws-kinesis-analytics-java-apps 1.0
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ aws-kinesis-analytics-java-apps ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ aws-kinesis-analytics-java-apps ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[INFO] Compiling 1 source file to /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ aws-kinesis-analytics-java-apps ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (default-testCompile) @ aws-kinesis-analytics-java-apps ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ aws-kinesis-analytics-java-apps ---
[INFO] No tests to run.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ aws-kinesis-analytics-java-apps ---
[INFO] Building jar: /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/target/aws-kinesis-analytics-java-apps-1.0.jar
[INFO]
[INFO] --- maven-shade-plugin:3.2.1:shade (default) @ aws-kinesis-analytics-java-apps ---
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-artifact-transfer/0.10.0/maven-artifact-transfer-0.10.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-artifact-transfer/0.10.0/maven-artifact-transfer-0.10.0.pom (12 kB at 31 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-components/31/maven-shared-components-31.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-shared-components/31/maven-shared-components-31.pom (5.1 kB at 75 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm/7.0/asm-7.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm/7.0/asm-7.0.pom (2.9 kB at 48 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-commons/7.0/asm-commons-7.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-commons/7.0/asm-commons-7.0.pom (3.7 kB at 78 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-tree/7.0/asm-tree-7.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-tree/7.0/asm-tree-7.0.pom (3.1 kB at 69 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-analysis/7.0/asm-analysis-7.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-analysis/7.0/asm-analysis-7.0.pom (3.2 kB at 67 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/vafer/jdependency/2.1.1/jdependency-2.1.1.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/vafer/jdependency/2.1.1/jdependency-2.1.1.pom (11 kB at 166 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm/7.0-beta/asm-7.0-beta.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm/7.0-beta/asm-7.0-beta.pom (2.9 kB at 52 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-analysis/7.0-beta/asm-analysis-7.0-beta.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-analysis/7.0-beta/asm-analysis-7.0-beta.pom (3.2 kB at 66 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-tree/7.0-beta/asm-tree-7.0-beta.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-tree/7.0-beta/asm-tree-7.0-beta.pom (3.1 kB at 60 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-commons/7.0-beta/asm-commons-7.0-beta.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-commons/7.0-beta/asm-commons-7.0-beta.pom (3.7 kB at 88 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-util/7.0-beta/asm-util-7.0-beta.pom
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-util/7.0-beta/asm-util-7.0-beta.pom (3.7 kB at 82 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/google/guava/guava/19.0/guava-19.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/google/guava/guava/19.0/guava-19.0.pom (6.8 kB at 154 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/19.0/guava-parent-19.0.pom
Downloaded from central: https://repo.maven.apache.org/maven2/com/google/guava/guava-parent/19.0/guava-parent-19.0.pom (9.9 kB at 219 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-utils/3.1.0/plexus-utils-3.1.0.jar
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm/7.0/asm-7.0.jar
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-commons/7.0/asm-commons-7.0.jar
Downloading from central: https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-artifact-transfer/0.10.0/maven-artifact-transfer-0.10.0.jar
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-tree/7.0/asm-tree-7.0.jar
Downloaded from central: https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-utils/3.1.0/plexus-utils-3.1.0.jar (262 kB at 2.7 MB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-analysis/7.0/asm-analysis-7.0.jar
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm/7.0/asm-7.0.jar (114 kB at 806 kB/s)
Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/maven/shared/maven-artifact-transfer/0.10.0/maven-artifact-transfer-0.10.0.jar (128 kB at 907 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-util/7.0-beta/asm-util-7.0-beta.jar
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-tree/7.0/asm-tree-7.0.jar (50 kB at 355 kB/s)
Downloading from central: https://repo.maven.apache.org/maven2/org/vafer/jdependency/2.1.1/jdependency-2.1.1.jar
Downloading from central: https://repo.maven.apache.org/maven2/com/google/guava/guava/19.0/guava-19.0.jar
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-analysis/7.0/asm-analysis-7.0.jar (33 kB at 235 kB/s)
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-commons/7.0/asm-commons-7.0.jar (80 kB at 515 kB/s)
Downloaded from central: https://repo.maven.apache.org/maven2/org/ow2/asm/asm-util/7.0-beta/asm-util-7.0-beta.jar (81 kB at 414 kB/s)
Downloaded from central: https://repo.maven.apache.org/maven2/org/vafer/jdependency/2.1.1/jdependency-2.1.1.jar (186 kB at 843 kB/s)
Downloaded from central: https://repo.maven.apache.org/maven2/com/google/guava/guava/19.0/guava-19.0.jar (2.3 MB at 6.7 MB/s)
[INFO] Including com.amazonaws:aws-kinesisanalytics-runtime:jar:1.2.0 in the shaded jar.
[INFO] Including org.apache.flink:flink-connector-kinesis:jar:1.15.2 in the shaded jar.
[INFO] Including joda-time:joda-time:jar:2.5 in the shaded jar.
[INFO] Including commons-io:commons-io:jar:2.11.0 in the shaded jar.
[INFO] Including commons-lang:commons-lang:jar:2.6 in the shaded jar.
[INFO] Including org.apache.commons:commons-lang3:jar:3.3.2 in the shaded jar.
[INFO] Including commons-logging:commons-logging:jar:1.1.3 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-core:jar:2.13.2 in the shaded jar.
[INFO] Including com.google.guava:guava:jar:29.0-jre in the shaded jar.
[INFO] Including com.google.guava:failureaccess:jar:1.0.1 in the shaded jar.
[INFO] Including com.google.guava:listenablefuture:jar:9999.0-empty-to-avoid-conflict-with-guava in the shaded jar.
[INFO] Including org.checkerframework:checker-qual:jar:2.11.1 in the shaded jar.
[INFO] Including com.google.errorprone:error_prone_annotations:jar:2.3.4 in the shaded jar.
[INFO] Including com.google.j2objc:j2objc-annotations:jar:1.3 in the shaded jar.
[INFO] Including org.apache.flink:flink-core:jar:1.15.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-shaded-guava:jar:30.1.1-jre-15.0 in the shaded jar.
[INFO] Including org.apache.flink:flink-connector-base:jar:1.15.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-table-common:jar:1.15.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-annotations:jar:1.15.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-metrics-core:jar:1.15.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-shaded-asm-9:jar:9.2-15.0 in the shaded jar.
[INFO] Including com.esotericsoftware.kryo:kryo:jar:2.24.0 in the shaded jar.
[INFO] Including com.esotericsoftware.minlog:minlog:jar:1.2 in the shaded jar.
[INFO] Including commons-collections:commons-collections:jar:3.2.2 in the shaded jar.
[INFO] Including org.apache.commons:commons-compress:jar:1.21 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-annotations:jar:2.13.2 in the shaded jar.
[INFO] Including com.ibm.icu:icu4j:jar:67.1 in the shaded jar.
[INFO] Including commons-codec:commons-codec:jar:1.15 in the shaded jar.
[INFO] Including com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:jar:2.13.2 in the shaded jar.
[INFO] Including com.fasterxml.jackson.core:jackson-databind:jar:2.13.2.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-shaded-force-shading:jar:15.0 in the shaded jar.
[INFO] Including org.objenesis:objenesis:jar:2.1 in the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.7.32 from the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar.
[INFO] Including org.apache.flink:flink-connector-aws-kinesis-streams:jar:1.15.2 in the shaded jar.
[INFO] Including org.apache.flink:flink-connector-aws-base:jar:1.15.2 in the shaded jar.
[INFO] Including software.amazon.awssdk:netty-nio-client:jar:2.17.247 in the shaded jar.
[INFO] Including io.netty:netty-codec-http:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-codec-http2:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-codec:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-transport:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-resolver:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-common:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-buffer:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-handler:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-transport-classes-epoll:jar:4.1.77.Final in the shaded jar.
[INFO] Including io.netty:netty-transport-native-unix-common:jar:4.1.77.Final in the shaded jar.
[INFO] Including org.reactivestreams:reactive-streams:jar:1.0.3 in the shaded jar.
[INFO] Including software.amazon.awssdk:sts:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:aws-query-protocol:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:kinesis:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:aws-cbor-protocol:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:third-party-jackson-dataformat-cbor:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:third-party-jackson-core:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:aws-json-protocol:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:json-utils:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:protocol-core:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:profiles:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:sdk-core:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:auth:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.eventstream:eventstream:jar:1.0.1 in the shaded jar.
[INFO] Including software.amazon.awssdk:http-client-spi:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:regions:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:annotations:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:utils:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:aws-core:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:metrics-spi:jar:2.17.247 in the shaded jar.
[INFO] Including software.amazon.awssdk:apache-client:jar:2.17.247 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpclient:jar:4.5.13 in the shaded jar.
[INFO] Including org.apache.httpcomponents:httpcore:jar:4.4.13 in the shaded jar.
[WARNING] Discovered module-info.class. Shading will break its strong encapsulation.
[WARNING] flink-connector-aws-base-1.15.2.jar, flink-connector-kinesis-1.15.2.jar define 2 overlapping classes:
[WARNING]   - org.apache.flink.connector.aws.table.util.AsyncClientOptionsUtils
[WARNING]   - org.apache.flink.connector.aws.table.util.AWSOptionUtils
[WARNING] flink-connector-kinesis-1.15.2.jar, flink-connector-aws-kinesis-streams-1.15.2.jar define 21 overlapping classes:
[WARNING]   - org.apache.flink.connector.kinesis.sink.KinesisStreamsException
[WARNING]   - org.apache.flink.connector.kinesis.table.KinesisPartitionKeyGeneratorFactory
[WARNING]   - org.apache.flink.connector.kinesis.table.util.KinesisStreamsConnectorOptionsUtils$KinesisProducerOptionsMapper
[WARNING]   - org.apache.flink.connector.kinesis.table.KinesisDynamicTableSinkFactory
[WARNING]   - org.apache.flink.connector.kinesis.sink.KinesisStreamsSinkBuilder
[WARNING]   - org.apache.flink.connector.kinesis.table.RandomKinesisPartitionKeyGenerator
[WARNING]   - org.apache.flink.connector.kinesis.sink.KinesisStreamsSinkElementConverter
[WARNING]   - org.apache.flink.connector.kinesis.sink.KinesisStreamsStateSerializer
[WARNING]   - org.apache.flink.connector.kinesis.sink.KinesisStreamsException$KinesisStreamsFailFastException
[WARNING]   - org.apache.flink.connector.kinesis.table.KinesisDynamicSink$KinesisDynamicTableSinkBuilder
[WARNING]   - 11 more...
[WARNING] third-party-jackson-core-2.17.247.jar, jackson-dataformat-cbor-2.13.2.jar, flink-connector-kinesis-1.15.2.jar, jackson-core-2.13.2.jar, third-party-jackson-dataformat-cbor-2.17.247.jar, jackson-databind-2.13.2.2.jar define 1 overlapping classes:
[WARNING]   - META-INF.versions.9.module-info
[WARNING] maven-shade-plugin has detected that some class files are
[WARNING] present in two or more JARs. When this happens, only one
[WARNING] single version of the class is copied to the uber jar.
[WARNING] Usually this is not harmful and you can skip these warnings,
[WARNING] otherwise try to manually exclude artifacts based on
[WARNING] mvn dependency:tree -Ddetail=true and the above output.
[WARNING] See http://maven.apache.org/plugins/maven-shade-plugin/
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/target/aws-kinesis-analytics-java-apps-1.0.jar with /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/target/aws-kinesis-analytics-java-apps-1.0-shaded.jar
[INFO] Dependency-reduced POM written at: /Users/[user]/github/amazon-kinesis-data-analytics-java-examples/GettingStarted/dependency-reduced-pom.xml
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  13.185 s
[INFO] Finished at: 2022-12-10T18:41:56-06:00
[INFO] ------------------------------------------------------------------------
(base) LM-SJC-11024946:GettingStarted [user]$ ls *target
aws-kinesis-analytics-java-apps-1.0.jar			maven-archiver
classes							maven-status
generated-sources					original-aws-kinesis-analytics-java-apps-1.0.jar

Beam example fails to compile because of unused avro code in directory

Within Beam/, executing:
mvn install -Dflink.version=1.13.2 -Dflink.version.minor=1.13

returns:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile (default-compile) on project basic-beam-app: Compilation failure: Compilation failure: 
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[220,7] no suitable constructor found for SpecificRecordBuilderBase(org.apache.avro.Schema,org.apache.avro.specific.SpecificData)
[ERROR]     constructor org.apache.avro.specific.SpecificRecordBuilderBase.SpecificRecordBuilderBase(org.apache.avro.Schema) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     constructor org.apache.avro.specific.SpecificRecordBuilderBase.SpecificRecordBuilderBase(org.apache.avro.specific.SpecificRecordBuilderBase<samples.trading.avro.TradeEvent>) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     constructor org.apache.avro.specific.SpecificRecordBuilderBase.SpecificRecordBuilderBase(samples.trading.avro.TradeEvent) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[248,7] no suitable constructor found for SpecificRecordBuilderBase(org.apache.avro.Schema,org.apache.avro.specific.SpecificData)
[ERROR]     constructor org.apache.avro.specific.SpecificRecordBuilderBase.SpecificRecordBuilderBase(org.apache.avro.Schema) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     constructor org.apache.avro.specific.SpecificRecordBuilderBase.SpecificRecordBuilderBase(org.apache.avro.specific.SpecificRecordBuilderBase<samples.trading.avro.TradeEvent>) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     constructor org.apache.avro.specific.SpecificRecordBuilderBase.SpecificRecordBuilderBase(samples.trading.avro.TradeEvent) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[391,31] cannot find symbol
[ERROR]   symbol:   class AvroMissingFieldException
[ERROR]   location: package org.apache.avro
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[417,3] method does not override or implement a method from a supertype
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[419,3] method does not override or implement a method from a supertype
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[430,3] method does not override or implement a method from a supertype
[ERROR] /Users/jtrtj/workspace/kinesis_data_analytics_spike/amazon-kinesis-data-analytics-java-examples/Beam/src/main/java/samples/trading/avro/TradeEvent.java:[433,51] cannot find symbol

This is caused by unused generated Java files in /Beam/src/main/java/samples/trading/avro/.
After removing that directory and /Beam/src/main/resources/, I was able to compile the example app successfully.

Run python process failed when running tumbling-window.py on KDA

When running the tumbling window sample (Python), the application does not start on KDA. I followed the instructions in the docs. I see the following logs in CloudWatch (a minimal local sanity check is sketched after the logs):

{
    "locationInformation": "org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:107)",
    "logger": "org.apache.flink.client.python.PythonDriver",
    "message": "Run python process failed",
    "throwableInformation": "java.lang.RuntimeException: Python process exits with code: 1\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:104) ~[flink-python_2.12-1.11.1.jar:1.11.1]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]\n\tat jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]\n\tat java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:294) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:201) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:203) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) [?:?]\n\tat java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]\n\tat java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]\n\tat java.lang.Thread.run(Thread.java:834) [?:?]\n",
    "threadName": "Flink-DispatcherRestEndpoint-thread-2",
    "applicationARN": "arn:aws:kinesisanalytics:us-east-1:XXXXXXXXXXXX:application/pythonsample",
    "applicationVersionId": "5",
    "messageSchemaVersion": "1",
    "messageType": "ERROR"
}
{
    "locationInformation": "org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:206)",
    "logger": "org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler",
    "message": "Error occurred when trying to start the job",
    "throwableInformation": "org.apache.flink.client.program.ProgramAbortException\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:111) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]\n\tat jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]\n\tat java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:294) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:201) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:203) ~[flink-dist_2.12-1.11.1.jar:1.11.1]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) ~[?:?]\n\t... 6 more\nWrapped by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException\n\tat java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314) ~[?:?]\n\tat java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319) [?:?]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702) [?:?]\n\tat java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]\n\tat java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]\n\tat java.lang.Thread.run(Thread.java:834) [?:?]\n",
    "threadName": "Flink-DispatcherRestEndpoint-thread-2",
    "applicationARN": "arn:aws:kinesisanalytics:us-east-1:XXXXXXXXXXXX:application/pythonsample",
    "applicationVersionId": "5",
    "messageSchemaVersion": "1",
    "messageType": "ERROR"
}
{
    "locationInformation": "org.apache.flink.runtime.rest.handler.AbstractHandler.handleException(AbstractHandler.java:210)",
    "logger": "org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler",
    "message": "Exception occurred in REST handler: Could not execute application.",
    "threadName": "Flink-DispatcherRestEndpoint-thread-2",
    "applicationARN": "arn:aws:kinesisanalytics:us-east-1:299003995950:application/pythonsample",
    "applicationVersionId": "5",
    "messageSchemaVersion": "1",
    "messageType": "ERROR"
}
{
    "applicationARN": "arn:aws:kinesisanalytics:us-east-1:XXXXXXXXXXXX:application/pythonsample",
    "applicationVersionId": 5,
    "message": "org.apache.flink.runtime.rest.handler.RestHandlerException: Could not execute application.\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:207)\n\tat java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930)\n\tat java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907)\n\tat java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:834)\nCaused by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException\n\tat java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)\n\tat java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)\n\t... 6 more\nCaused by: org.apache.flink.client.program.ProgramAbortException\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:111)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.base/java.lang.reflect.Method.invoke(Method.java:566)\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:294)\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:201)\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149)\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78)\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67)\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:203)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)\n\t... 6 more\n",
    "messageType": "ERROR",
    "messageSchemaVersion": "1",
    "errorCode": "CodeError.InvalidApplicationCode"
}
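
The logs above only report that the Python process exited with code 1, which typically means the script failed before Flink could build a job graph, for example because the runtime properties it expects are missing or because the python/jarfile entries in the kinesis.analytics.flink.run.options property group do not match the contents of the uploaded zip. The sketch below is a minimal, hedged way to exercise the property-loading step locally; the file path, the IS_LOCAL environment variable, and the property group ids are assumptions modelled on the Python samples in this repository, not the exact tumbling-window code.

# Minimal sketch for checking runtime-property loading, NOT the full
# tumbling-window sample. Assumptions: on the managed service the samples read
# /etc/flink/application_properties.json; locally we fall back to a file named
# application_properties.json in the working directory when IS_LOCAL is set.
import json
import os

APPLICATION_PROPERTIES_FILE_PATH = "/etc/flink/application_properties.json"  # assumed KDA path
LOCAL_PROPERTIES_FILE_PATH = "application_properties.json"                   # assumed local path


def get_application_properties():
    """Return the list of property groups, or [] if the file cannot be found."""
    path = LOCAL_PROPERTIES_FILE_PATH if os.environ.get("IS_LOCAL") else APPLICATION_PROPERTIES_FILE_PATH
    if not os.path.isfile(path):
        print(f"Properties file not found at {path}; the Python driver will exit with a non-zero code")
        return []
    with open(path, "r") as props_file:
        return json.loads(props_file.read())


def get_property_group(props, group_id):
    """Pick a single property group out of the list (the ids used below are illustrative)."""
    for group in props:
        if group.get("PropertyGroupId") == group_id:
            return group.get("PropertyMap", {})
    return {}


if __name__ == "__main__":
    props = get_application_properties()
    # Replace the group ids with the ones your application_properties.json actually uses.
    print("consumer config:", get_property_group(props, "consumer.config.0"))
    print("producer config:", get_property_group(props, "producer.config.0"))

If this prints empty dictionaries when run on the managed service, the property groups configured for the application do not match what the script expects, which is a common way to end up with "Python process exits with code: 1".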

Run Python Process Failed - Again

I followed the Python S3Sink tutorial, but the application fails to sink data to S3. I've included the error logs below; a minimal local sanity-check sketch follows them. Appreciate any help.

Record 1
@timestamp: 2022-02-14T07:23:35.330Z
message:
org.apache.flink.runtime.rest.handler.RestHandlerException: Could not execute application. at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:247) at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930) at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907) at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506) at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705) at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) Caused by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1 at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314) at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319) at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702) ... 6 more Caused by: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1 at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:134) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) ... 6 more Caused by: java.lang.RuntimeException: Python process exits with code: 1 at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124) ... 17 more
@ingestionTime: 1644823416407
@log: 606015248547:/aws/kinesis-analytics/MyApplication
@logstream: kinesis-analytics-log-stream
@message:
{"applicationARN":"arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication","applicationVersionId":2,"message":"org.apache.flink.runtime.rest.handler.RestHandlerException: Could not execute application.\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:247)\n\tat java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930)\n\tat java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907)\n\tat java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1\n\tat java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)\n\tat java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)\n\t... 6 more\nCaused by: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:134)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.base/java.lang.reflect.Method.invoke(Method.java:566)\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84)\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70)\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)\n\t... 6 more\nCaused by: java.lang.RuntimeException: Python process exits with code: 1\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124)\n\t... 17 more\n","messageType":"ERROR","messageSchemaVersion":"1","errorCode":"CodeError.InvalidApplicationCode"}
@timestamp: 1644823415330
applicationARN: arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication
applicationVersionId: 2
errorCode: CodeError.InvalidApplicationCode
message:
org.apache.flink.runtime.rest.handler.RestHandlerException: Could not execute application.
at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:247)
at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:930)
at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:907)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
... 6 more
Caused by: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:134)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84)
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70)
at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
... 6 more
Caused by: java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124)
... 17 more
messageSchemaVersion: 1
messageType: ERROR

Record 2
@timestamp: 2022-02-14T07:23:35.000Z
message: Run python process failed
@ingestionTime: 1644823447272
@log: 606015248547:/aws/kinesis-analytics/MyApplication
@logstream: kinesis-analytics-log-stream
@message:
{"locationInformation":"org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:127)","logger":"org.apache.flink.client.python.PythonDriver","message":"Run python process failed","throwableInformation":"java.lang.RuntimeException: Python process exits with code: 1\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124) ~[flink-python_2.12-1.13.2.jar:1.13.2]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]\n\tat jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]\n\tat java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) [?:?]\n\tat java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]\n\tat java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]\n\tat java.lang.Thread.run(Thread.java:829) [?:?]\n","threadName":"Flink-DispatcherRestEndpoint-thread-1","applicationARN":"arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication","applicationVersionId":"2","messageSchemaVersion":"1","messageType":"ERROR"}
@timestamp: 1644823415000
applicationARN: arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication
applicationVersionId: 2
locationInformation: org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:127)
logger: org.apache.flink.client.python.PythonDriver
message: Run python process failed
messageSchemaVersion: 1
messageType: ERROR
threadName: Flink-DispatcherRestEndpoint-thread-1
throwableInformation:
java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124) ~[flink-python_2.12-1.13.2.jar:1.13.2]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) [?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:829) [?:?]
3
2022-02-14T07:23:35.000Z
Error occurred when trying to start the job
@ingestionTime
1644823447272
@log
606015248547:/aws/kinesis-analytics/MyApplication
@logstream
kinesis-analytics-log-stream
@message
{"locationInformation":"org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:244)","logger":"org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler","message":"Error occurred when trying to start the job","throwableInformation":"java.lang.RuntimeException: Python process exits with code: 1\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]\n\tat jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]\n\tat java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) ~[?:?]\n\t... 6 more\nWrapped by: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1\n\tat org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:134) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]\n\tat jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]\n\tat jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]\n\tat java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]\n\tat org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) ~[flink-dist_2.12-1.13.2.jar:1.13.2]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) ~[?:?]\n\t... 
6 more\nWrapped by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1\n\tat java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314) ~[?:?]\n\tat java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319) [?:?]\n\tat java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702) [?:?]\n\tat java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]\n\tat java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]\n\tat java.lang.Thread.run(Thread.java:829) [?:?]\n","threadName":"Flink-DispatcherRestEndpoint-thread-1","applicationARN":"arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication","applicationVersionId":"2","messageSchemaVersion":"1","messageType":"ERROR"}
@timestamp: 1644823415000
applicationARN: arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication
applicationVersionId: 2
locationInformation: org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$4(JarRunOverrideHandler.java:244)
logger: org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler
message: Error occurred when trying to start the job
messageSchemaVersion: 1
messageType: ERROR
threadName: Flink-DispatcherRestEndpoint-thread-1
throwableInformation:
java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:124) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) ~[?:?]
... 6 more
Wrapped by: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
at org.apache.flink.client.python.PythonDriver.main(PythonDriver.java:134) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:84) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:70) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler.lambda$handleRequest$3(JarRunOverrideHandler.java:238) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700) ~[?:?]
... 6 more
Wrapped by: java.util.concurrent.CompletionException: org.apache.flink.client.program.ProgramAbortException: java.lang.RuntimeException: Python process exits with code: 1
at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314) ~[?:?]
at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319) [?:?]
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702) [?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:829) [?:?]
4
2022-02-14T07:23:35.000Z
Exception occurred in REST handler: Could not execute application.
@ingestionTime
1644823447272
@log
606015248547:/aws/kinesis-analytics/MyApplication
@logstream
kinesis-analytics-log-stream
@message
{"locationInformation":"org.apache.flink.runtime.rest.handler.AbstractHandler.handleException(AbstractHandler.java:251)","logger":"org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler","message":"Exception occurred in REST handler: Could not execute application.","threadName":"Flink-DispatcherRestEndpoint-thread-1","applicationARN":"arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication","applicationVersionId":"2","messageSchemaVersion":"1","messageType":"ERROR"}
@timestamp: 1644823415000
applicationARN: arn:aws:kinesisanalytics:us-west-2:606015248547:application/MyApplication
applicationVersionId: 2
locationInformation: org.apache.flink.runtime.rest.handler.AbstractHandler.handleException(AbstractHandler.java:251)
logger: org.apache.flink.runtime.webmonitor.handlers.JarRunOverrideHandler
message: Exception occurred in REST handler: Could not execute application.
messageSchemaVersion: 1
messageType: ERROR
threadName: Flink-DispatcherRestEndpoint-thread-1

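Because the managed service only surfaces "Python process exits with code: 1", it usually helps to run the same table definitions locally first, where the real Python traceback is printed to the console. Below is a minimal sketch under stated assumptions, not the repository's S3Sink sample itself: it swaps the Kinesis source for Flink's built-in datagen connector and the S3 path for a local directory, so it runs without the connector jar or AWS credentials. Once the local run produces files, switch the path back to the s3a:// URI from your application properties and re-add the Kinesis source table.

# Minimal local sanity check for the table pipeline, NOT the S3Sink sample
# verbatim. Assumptions: PyFlink is installed locally (pip install
# apache-flink==1.13.2) and we sink to a local directory instead of S3 so no
# filesystem plugin or credentials are needed.
from pyflink.table import EnvironmentSettings, StreamTableEnvironment

env_settings = EnvironmentSettings.new_instance().in_streaming_mode().build()
table_env = StreamTableEnvironment.create(environment_settings=env_settings)

# The filesystem sink commits files on checkpoints, so enable checkpointing.
table_env.get_config().get_configuration().set_string(
    "execution.checkpointing.interval", "5 s"
)

# A bounded datagen source stands in for the Kinesis stream.
table_env.execute_sql("""
    CREATE TABLE input_table (
        ticker STRING,
        price DOUBLE,
        event_time TIMESTAMP(3)
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5',
        'number-of-rows' = '200'
    )
""")

# Locally this writes JSON files under /tmp/s3_sink_test; on the managed
# service the path would be the s3a:// URI taken from the application properties.
table_env.execute_sql("""
    CREATE TABLE output_table (
        ticker STRING,
        price DOUBLE,
        event_time TIMESTAMP(3)
    ) WITH (
        'connector' = 'filesystem',
        'path' = 'file:///tmp/s3_sink_test',
        'format' = 'json'
    )
""")

# wait() blocks until the bounded job finishes so the committed files are visible.
table_env.execute_sql(
    "INSERT INTO output_table SELECT ticker, price, event_time FROM input_table"
).wait()

If the local run works but the managed-service run still exits with code 1, the usual suspects are the zip layout and the jarfile/python values in the kinesis.analytics.flink.run.options property group not matching the actual paths inside the uploaded zip.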