
dapr-logicapps-extension's Introduction

Dapr LogicApps Extension - Run Cloud-Native Workflows using Dapr and LogicApps

This project is a lightweight host that lets developers run cloud-native workflows locally, on-premises, or in any cloud environment using the Azure Logic Apps workflow engine and Dapr.

⚠️ ARCHIVED
This project is now archived and transferred to the fork under the Azure organization in GitHub: https://github.com/Azure/dapr-logicapps-extension

Watch this video for an overview of Dapr Workflows.

Build Status License: MIT

Contents

Benefits

By using a workflow engine, business logic can be defined in a declarative, no-code fashion so application code doesn't need to change when a workflow changes. Dapr Workflows allows you to use workflows in a distributed application along with these added benefits:

  • Run workflows anywhere - on your local machine, on-premises, on Kubernetes or in the cloud
  • Built-in tracing, metrics and mTLS through Dapr
  • gRPC and HTTP endpoints for your workflows
  • Kick off workflows based on Dapr bindings events
  • Orchestrate complex workflows by calling back to Dapr to save state, publish a message and more
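To illustrate the declarative, no-code style, here is a minimal, hypothetical workflow definition (not one of the bundled samples) with an HTTP request trigger and a static response. Changing the behavior means editing only this JSON, not application code:

```json
{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
      "manual": {
        "type": "Request",
        "kind": "Http",
        "inputs": { "schema": {} }
      }
    },
    "actions": {
      "Reply": {
        "type": "Response",
        "inputs": {
          "statusCode": 200,
          "body": { "value": "Hello from a declarative workflow" }
        },
        "runAfter": {}
      }
    },
    "outputs": {}
  }
}
```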

How it works

New to Dapr? Learn more about Dapr with this overview

Dapr Workflows hosts a gRPC server that implements the Dapr Client API.

This allows users to start workflows using gRPC and HTTP endpoints through Dapr, or start a workflow asynchronously using Dapr bindings. Once a workflow request comes in, Dapr Workflows uses the Logic Apps SDK to execute the workflow.

Diagram

Example

Dapr Workflows can be used as the orchestrator for many otherwise complex activities. For example, invoking an external endpoint, saving the data to a state store, publishing the result to a different app or invoking a binding can all be done by calling back into Dapr from the workflow itself.

This works because Dapr runs as a sidecar next to the workflow host, just as it would next to any other app.

Examine workflow2.json as an example of a workflow that does the following:

  1. Calls into Azure Functions to get a JSON response
  2. Saves the result to a Dapr state store
  3. Sends the result to a Dapr binding
  4. Returns the result to the caller

Since Dapr supports many pluggable state stores and bindings, the workflow becomes portable between different environments (cloud, edge or on-premises) without the user changing the code - because there is no code involved.
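For instance, the "save the result to a Dapr state store" step can be expressed as a plain HTTP action that POSTs to the sidecar's state API. The sketch below is illustrative, not copied from workflow2.json: the action name, the state store name "statestore", and the preceding "Get_data" action are all hypothetical.

```json
"Save_state": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "http://localhost:3500/v1.0/state/statestore",
    "headers": { "Content-Type": "application/json" },
    "body": [
      { "key": "result", "value": "@body('Get_data')" }
    ]
  },
  "runAfter": { "Get_data": [ "Succeeded" ] }
}
```

Swapping the state store component (say, from Azure Table Storage to Redis) requires no change to this action at all; only the Dapr component definition changes.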

Get Started

Prerequisites:

  1. Install the Dapr CLI
  2. Azure Blob Storage Account

Supported Dapr Version: 0.10.0 and above

Self hosted (running locally)

Deploy Dapr

Once you have the Dapr CLI installed, run:

dapr init

Invoke Logic Apps using Dapr

First, set up the environment variables containing the Azure Storage Account credentials:

Mac / Linux

export STORAGE_ACCOUNT_KEY=<YOUR-STORAGE-ACCOUNT-KEY>
export STORAGE_ACCOUNT_NAME=<YOUR-STORAGE-ACCOUNT-NAME>

Windows

set STORAGE_ACCOUNT_KEY=<YOUR-STORAGE-ACCOUNT-KEY>
set STORAGE_ACCOUNT_NAME=<YOUR-STORAGE-ACCOUNT-NAME>

Then, on either platform, change to the project directory and start the workflow host through Dapr:

cd src/Dapr.Workflows

dapr run --app-id workflows --protocol grpc --port 3500 --app-port 50003 -- dotnet run --workflows-path ../../samples

Invoke the workflow:

curl http://localhost:3500/v1.0/invoke/workflows/method/workflow1

{"value":"Hello from Logic App workflow running with Dapr!"}

Rejoice!

Kubernetes

Make sure you have a running Kubernetes cluster and kubectl in your path.

Deploy Dapr

Once you have the Dapr CLI installed, run:

dapr init --kubernetes

Wait until the Dapr pods have the status Running.

Create a Config Map for the workflow

kubectl create configmap workflows --from-file ./samples/workflow1.json

Create a secret containing the Azure Storage Account credentials

Replace the account name and key values below with the actual credentials:

kubectl create secret generic dapr-workflows --from-literal=accountName=<YOUR-STORAGE-ACCOUNT-NAME> --from-literal=accountKey=<YOUR-STORAGE-ACCOUNT-KEY>
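For reference, a secret like this is typically consumed by the deployment through environment variables. The sketch below shows the usual wiring; the actual deploy/deploy.yaml may differ, and the image tag is illustrative. The environment variable names match those used in the self-hosted instructions above:

```yaml
containers:
- name: host
  image: daprio/workflows:0.2.2
  env:
  - name: STORAGE_ACCOUNT_NAME
    valueFrom:
      secretKeyRef:
        name: dapr-workflows
        key: accountName
  - name: STORAGE_ACCOUNT_KEY
    valueFrom:
      secretKeyRef:
        name: dapr-workflows
        key: accountKey
```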

Deploy Dapr Workflows

kubectl apply -f deploy/deploy.yaml

Invoke the workflow using Dapr

Create a port-forward to the Dapr Workflows container:

kubectl port-forward deploy/dapr-workflows-host 3500:3500

Now, invoke logic apps through Dapr:

curl http://localhost:3500/v1.0/invoke/workflows/method/workflow1

{"value":"Hello from Logic App workflow running with Dapr!"}

Rejoice once more!

Invoking workflows using Dapr bindings

First, create any Dapr binding of your choice.

See this How-To tutorial and sample to get started.

For Dapr Workflows to start a workflow from a Dapr binding event, simply give the binding the same name as the workflow you want it to trigger. Couldn't get any simpler!

Here's an example of a Kafka binding that will trigger a workflow named workflow1:

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: workflow1
spec:
  type: bindings.kafka
  metadata:
  - name: topics
    value: topic1
  - name: brokers
    value: localhost:9092
  - name: consumerGroup
    value: group1
  - name: authRequired
    value: "false"

Self hosted

Place the binding yaml file above in a components directory at the root of your application.
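Assuming an application rooted at myapp/ (a hypothetical name), the layout would be:

```
myapp/
├── components/
│   └── my_binding.yaml   # the Kafka binding above, named after the workflow it triggers
└── ...
```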

Kubernetes

kubectl apply -f my_binding.yaml

Seeing events triggering logic apps

Once an event is sent to the binding component, check the logs of Dapr Workflows to see the output.

In standalone mode, the output will be printed to the local terminal.

On Kubernetes, run the following command:

kubectl logs -l app=dapr-workflows-host -c host

Supported workflow features

Supported Actions and Triggers

Supported Control Workflows

Supported Data Manipulation

Not supported

Build

Make sure you have .NET Core installed on your machine. At minimum, you need the .NET Core SDK 3.1 to build.

  1. Clone the repo
  2. Inside the top-level directory, run: dotnet build

Build Docker Image

Make sure you have Docker installed on your machine.

Compile to release mode:

dotnet publish -c Release -r linux-x64 --self-contained false

Build image:

docker build -t <registry>/<image> .

Push image:

docker push <registry>/<image>

dapr-logicapps-extension's People

Contributors

aaroncrawfis, amanbha, artursouza, greenie-msft, jthin, laveeshb, tcnghia, yaron2, youngbupark


dapr-logicapps-extension's Issues

Clarify the relationship between the Logic Apps Preview and Dapr Workflows in the docs

Q: Is there a relationship between the Logic Apps Preview that runs in Docker (https://docs.microsoft.com/en-au/azure/logic-apps/create-stateful-stateless-workflows-visual-studio-code?s=09)
and Dapr Workflows, which allows running Logic Apps workflows (https://github.com/dapr/workflows)?

A: Dapr Workflows runs the Logic Apps engine (the .NET binaries). Well before this announcement, you could already run Logic Apps workflows in containers using Dapr. The announcement just showed that Logic Apps workflows also run on the Functions host runtime.

Recurrence Trigger Doesn't Start

I was able to get a workflow to run with a manual trigger, but if I use only a recurrence trigger, the workflow loads correctly yet no events fire.

With a manual trigger, the workflow runs as expected. Is the recurrence trigger not implemented correctly?

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Function": {
        "type": "Http",
        "inputs": {
          "method": "POST",
          "uri": "http://localhost:3503/v1.0/invoke/receiver/method/log",
          "headers": {
            "Content-Type": "application/json"
          }
        },
        "runAfter": {}
      }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {},
    "triggers": {
      "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
          "frequency": "Minute",
          "interval": 2
        }
      }
    }
  }
}

EDIT: I thought it worked, but it was a red herring. I see InternalServerErrors for the action in Table Storage, so it seems the ability to execute an action on the recurrence callback is broken. Perhaps some context is missing in memory, but it looks like a problem of that kind. The following workflow does work:

{
  "definition": {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "actions": {
      "Until": {
        "type": "Until",
        "actions": {
          "Delay": {
            "type": "Wait",
            "inputs": {
               "interval": {
                  "count": 15,
                  "unit": "Second"
               }
            },
            "runAfter": { }
         },
         "FunctionTwo": {
          "type": "Http",
          "inputs": {
            "method": "POST",
            "uri": "http://192.168.1.224:3502/v1.0/invoke/receiver/method/log",
            "body": [
    
            ],
            "headers": {
              "Content-Type": "application/json"
            }
          },
          "runAfter": {
            "Delay": [
              "Succeeded"
            ]
          }
          }
        },
        "expression": "@equals(1, 0)",
        "limit": { 
           "count": 1000,
           "timeout": "PT1H"
        },
        "runAfter": { }
     }
    },
    "contentVersion": "1.0.0.0",
    "outputs": {},
    "parameters": {},
    "triggers": {
      "manual": {
        "inputs": {
          "schema": {}
        },
        "kind": "Http",
        "type": "Request"
      }
    }
  }
}

So it seems something is unavailable in the callback, and that is what causes the error.

How to use workflows in the fully On-Premises scenario

I plan to use workflows to track the state of complex, distributed executions of multiple actions running across multiple microservices. My software product can run in the Azure cloud (AKS), but also in a completely closed on-premises environment (a local K8s or MicroK8s cluster). For Azure, I understand that I can use Logic Apps, and there are videos, documentation, etc.

What is not yet clear to me is how (if it is possible at all) I can use workflows in a completely closed environment with no connection to Azure (no Logic Apps) or any cloud. It must be possible, but I don't see any example.

Also, since my microservices will execute the workflow steps and I only need to track the state of the execution, do I need anything like Logic Apps at all, even in Azure? Can I just use workflows themselves to track state?

Please help me understand and clarify this.

Invoking state store fails

You can find the repo attached:
dapr-logic-apps.zip

Scenario
A simple request/response Logic App for testing integration with a state store (Azure Table Storage).
The sample runs in minikube.

Reproduce

  • Deploy Azure table storage state store
    kubectl apply -f components/customers-state-store.yaml

  • Create config map with the Logic Apps workflow definition
    kubectl create configmap workflow-customer-masterdata-config --from-file ./workflows/customer-masterdata.json

  • Deploy the workflow container with Dapr side car
    kubectl apply -f deploy/workflow-customer-masterdata-app.yaml

  • Configure port forwarding for testing
    kubectl port-forward deploy/workflow-customer-masterdata-app 3500:3500

  • An exception occurs when trying to update the state store. This exception is returned in the Logic Apps response:

Status code: 400
Body: {
    "errorCode": "ERR_STATE_STORES_NOT_CONFIGURED",
    "message": ""
}

Expected result

The HTTP Action "Update_customers_state_store_with_master_data" is configured with an HTTP POST on http://localhost:3500/v1.0/state/customers

This should update the state store named "customers" with the key "CUST001".

Thanks for looking into this!
Toon

Asynchronous execution

Hello,

According to your samples and my tests, I have the impression that workflow execution is fully synchronous, unlike the Microsoft-hosted Logic Apps, which return a tracking ID. Is my understanding correct, or am I missing something? If it is correct, is there a plan to also support asynchronous execution?

Best regards

Sample - Workflow2

Hi,
I ran workflow1 successfully, but workflow2 reports this error:

{
  "error": {
    "code": "NoResponse",
    "message": "The server did not received a response from an upstream server. Request tracking id '08585995089540340895731183776CU00'."
  }
}

How can I run workflow2?
Do I need to create a Service Bus and a storage container? They aren't mentioned in the deploy file or in the docs. Could this be the problem?
Thanks a lot

Write tests

Integration tests should cover invoking Icarus via Dapr, testing validation, correct/incorrect inputs, and success/error codes.

Improve logging when workflow names are misconfigured

Issue #60 describes the problem. I went through the sample but renamed the workflow file. Instead of calling it workflow1.json (which yields the URL http://localhost:3500/v1.0/invoke/workflows/method/workflow1), I named it wf.json, not realizing that the URL should then have been http://localhost:3500/v1.0/invoke/workflows/method/wf. Logging was enabled, but nothing was reported other than a 400 (Bad Request) error code.

Ideally, for such issues:

  • Some extra logging info should be available
  • Instead of a 400 for this particular case, a 404 would be more appropriate, since the workflow was not found

Best Regards

Allow loading multiple workflows

Currently only a single workflow can be loaded.

Icarus needs to support loading N workflows, each activated using a name that identifies it.

Pass HTTP Body to Logic Apps workflow

Hi!

I have written an HTTP-triggered Logic Apps workflow and it's running successfully in Dapr. It gets invoked like this:

POST http://localhost:3500/v1.0/invoke/workflows/method/<workflowname>

Unfortunately, when I try to access the HTTP request body in the Logic Apps workflow, it appears to be empty. I use the @triggerBody() expression in the response shape to check its value, but it's always empty. The same expression works fine in Azure.

I tried several variations on the HTTP body, but none of them works...

Any pointers?
Thank you
T.

Write documentation

Create detailed documentation that covers the following:

  • Project goals
  • Features
  • Known limitations
  • Getting started
  • Troubleshooting

Storage Account - Kind

Really excited about this functionality!

One thing I ran into when setting up the sample: my storage account "kind" was BlobStorage, and the sample would fail. When I tried a different storage account set to Storage (general purpose), it worked.

With BlobStorage, the error was "Microsoft.WindowsAzure.Storage.StorageException: Unexpected HTTP status code 'NotImplemented'."


Add components folder to gitignore

Add the components folder to .gitignore; this folder and the files in it are generated by the CLI and don't need to be checked in. Delete these files from the repo as well.

Allow for custom configuration

Today, the configuration is taken from the resources.resx file. We need to support external configuration that holds, for example, the Azure Storage Account.

Create CI/CD pipeline

  • GitHub Actions needs to be configured to build the Icarus host into a Docker image
  • Versioning needs to be supported
  • A release pipeline should output the Icarus dotnet publish results for Linux, Darwin and Windows.

Rename dlls, .csproj, namespaces

Currently the .csproj file, namespaces, and DLLs are named Microsoft.Dapr.LogicApps.ExecutionEnvironment. Remove Microsoft from the name; it can just be called Dapr.LogicApps.ExecutionEnvironment.

Support Dapr as a Logic Apps Action Type

A typed Logic Apps action would provide a more natural Logic Apps experience for developers, rather than plain HTTP calls to Dapr. It would be a better developer experience for accessing properties and methods.

Logic Apps designer support

Hi,

With the Logic Apps public preview release, we now have the capability to design a logic app locally in VS Code, which also offers a designer. Do you have any plans for Logic Apps designer support in the near future?


Azure ContainerApp provisioning "failed"

Hi,

Sorry if I'm posting this in the wrong place; if I am, please redirect me to where I can get help.

I'm trying to publish daprio/workflows:0.2.2 to Azure Container Apps, and it's failing to provision.

The logs state that the Dapr sidecar spins up fine and waits for the application. The image then runs as expected, and the logs appear as:

"Loading Configuration
Creating Edge Configuration
Registering Web Environment
Loading workflow: email-orchestrator-workflow.json
Flow Created
Loading workflow: email-sender-workflow.json
Flow Created
Dapr LogicApps Server listening on port 50003"

It then continues to repeat the above output multiple times, and is still doing so now.

I replicated this locally by spinning up a Docker container using the following command:
docker run -p 50003:50003 --env STORAGE_ACCOUNT_KEY=abc --env STORAGE_ACCOUNT_NAME=abc -v C:\Users....\workflows:/workflows daprio/workflows:0.2.2 dotnet app/Dapr.Workflows.dll --workflows-path /workflows

The image runs with the same output as above.

I then run a sidecar:
dapr run --app-id workflows --app-protocol grpc --dapr-http-port 3500 --app-port 50003 --log-level debug

The Dapr sidecar log indicates it is still waiting for port 50003 to start listening.

If I try to invoke the gRPC server on port 50003, I get status code "14 unavailable". I feel like I'm missing something simple here, and it should all work as expected.

If I run the project outside of a containerized environment, it works fine:
dapr run --app-id workflows --app-protocol grpc --dapr-http-port 3500 --app-port 50003 -- dotnet run --workflows-path ../../workflows

Am I missing something about running containerized? Hopefully you can help. I hoped copying the example deploy file would just work as expected.

Timeline GA

Hi, we are experimenting with Logic Apps in Dapr.
What is the timeline for GA of Logic Apps in Dapr?

Bad request

Hello,

I've just tried the workflow1 sample in Kubernetes. I keep ending up with a Bad Request (400) with no error in the body.

The workflow engine is running alongside daprd:

dapr-workflows-host-76f9fc7b9b-x8hfj 2/2 Running 0 2m33s

Daprd logs are here

time="2020-07-30T15:27:57.39857996Z" level=info msg="starting Dapr Runtime -- version 0.9.0 -- commit 6babe85-dirty" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:27:57.398763266Z" level=info msg="log level set to: debug" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:27:57.399080878Z" level=info msg="metrics server started on :9090/" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.metrics type=log ver=0.9.0
time="2020-07-30T15:27:57.3996595Z" level=info msg="loading default configuration" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:27:57.399803705Z" level=info msg="kubernetes mode configured" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:27:57.399903309Z" level=info msg="app id: workflows" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:27:57.400008413Z" level=info msg="mTLS is disabled. Skipping certificate request and tls validation" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:27:57.436658671Z" level=info msg="application protocol: grpc. waiting on port 50003.  This will block until the app is listening on that port." app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.14391819Z" level=info msg="application discovered on port 50003" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.14474632Z" level=warning msg="either no actor state store or multiple actor state stores are specified in the configuration, actor stores specified: 0" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.144783922Z" level=info msg="Initialized name resolution to kubernetes" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.181670489Z" level=warning msg="actors: state store must be present to initialize the actor runtime" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime.actor type=log ver=0.9.0
time="2020-07-30T15:28:00.181897197Z" level=warning msg="failed to init actors: state store does not support transactions which actors require to save state - please see https://github.com/dapr/docs" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.182062103Z" level=info msg="enabled monitoring middleware" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime.grpc.api type=log ver=0.9.0
time="2020-07-30T15:28:00.182202808Z" level=info msg="API gRPC server is running on port 50001" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.182366815Z" level=info msg="enabled monitoring middleware" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime.grpc.internal type=log ver=0.9.0
time="2020-07-30T15:28:00.18252792Z" level=info msg="internal gRPC server is running on port 50002" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.182801731Z" level=info msg="enabled cors http middleware" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime.http type=log ver=0.9.0
time="2020-07-30T15:28:00.182904034Z" level=info msg="enabled metrics http middleware" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime.http type=log ver=0.9.0
time="2020-07-30T15:28:00.182997738Z" level=info msg="enabled tracing http middleware" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime.http type=log ver=0.9.0
time="2020-07-30T15:28:00.183091941Z" level=info msg="http server is running on port 3500" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0
time="2020-07-30T15:28:00.183188345Z" level=info msg="dapr initialized. Status: Running. Init Elapsed 2783.390141ms" app_id=workflows instance=dapr-workflows-host-76f9fc7b9b-x8hfj scope=dapr.runtime type=log ver=0.9.0

I have enabled the debug level, but nothing special is reported. I created the secret and configmap as stated in the readme. The logs of the host are:

Loading Configuration
Creating Edge Configuration
Registering Web Environment
Loading workflow: wf.json
Flow Created
Dapr LogicApps Server listening on port 50003

However, when doing the port forwarding and invoking http://localhost:3500/v1.0/invoke/workflows/method/workflow1, I get a 400 back with no error in the response, so I'm not sure what is bad about the request. Not a single piece of extra logging info can be found in either daprd or the host logs. Note that I have disabled mTLS because I'm also using Linkerd, but I suppose that is not related? Whether I inject the pod with Linkerd makes no difference anyway.

Any clue?

Thanks

Demo Sample

Need to have a compelling LA workflow scenario example that shows off the power of LA and Dapr combined.

Is this project still maintained?

Hi, @yaron2
We are very interested in this project. If workflows can run in standalone mode, it will solve two problems for us:

  1. Build the workflow based on Workflow 2 json.
  2. Service aggregation.

Not working with 1.0.0-rc.3

Hello,

Despite your own statement:

Supported Dapr Version: 0.10.0 and above

I doubt it is supported with 1.0.0-rc.3. The first reason is that in the deploy folder you still use "id" and "port" in the Dapr annotations instead of "app-id" and "app-port", though this could just be a problem with the sample.

The second reason is that I've upgraded my cluster to run 1.0.0-rc.3 (extract from dapr dashboard -k):

Name | Pod | Namespace | Healthy | Status | Version | Age | Created
-- | -- | -- | -- | -- | -- | -- | --
dapr-dashboard | dapr-dashboard-84b6c769f5-tck7p | dapr-system | True | Running | 0.6.0 | 2h | 2021-02-02 09:46.03
dapr-operator | dapr-operator-659f98bf8b-d8mvx | dapr-system | True | Running | 1.0.0-rc.3 | 2h | 2021-02-02 09:46.03
dapr-placement-server | dapr-placement-server-0 | dapr-system | True | Running | 1.0.0-rc.3 | 2h | 2021-02-02 09:46.03
dapr-sentry | dapr-sentry-57fc976b7c-gtztm | dapr-system | True | Running | 1.0.0-rc.3 | 2h | 2021-02-02 09:46.03
dapr-sidecar-injector | dapr-sidecar-injector-7c4cbcd9c6-j6nm8 | dapr-system | True | Running | 1.0.0-rc.3 | 2h | 2021-02-02 09:46.03

and since then, the workflow engine is broken. Whatever workflow I try to invoke (including your demo workflow1.json) returns the following (truncated for brevity):

{"error": "client error: error when reading response headers: EOF. Buffer size=63, contents: "\x00\x00\x18\x04\x00\x00\x00\x00\x00\x00\x04\x00@\x00\x00\...""}

I'm invoking the workflow directly through HTTP using Fiddler ==> Busybox injected with Dapr ==> workflow engine

I have checked the following logs:

  • daprd of busybox
  • daprd of workflow engine
  • workflow engine host

and there is absolutely nothing, not even a trace of the call, although it is provisioned with the debug level.

Thanks
