
adpe2e's Issues

Lab 1 - Copy table from NYCDataSets database to Synapse Analytics data warehouse.

This is a fantastic workshop and very helpful.

I went through the setup in Lab 0 and Lab 1 and was checking where I could find the steps to load the NYCDataSets data into the OperationalSQL_NYCDataSets SQL Database.

Without this, I can't trigger the Data Factory pipeline to copy the table from OperationalSQL into SynapseSQL.

Thanks

error deploying

Hi, thanks for the training.

As an old-school BI consultant/developer, this looks very interesting to me. Unfortunately I can't deploy the solution and am getting an error. My region is West Europe.

thx,
GJ

Role of Databricks within the Solution - Lab 4: Add AI to your Big Data Pipeline with Cognitive Services

Hi @fabragaMS,

What an amazing set of labs you have created. Absolutely super valuable. In my efforts to learn how to become a better Azure Analytics Solutions Architect, I had a few questions about your chosen architecture.

In Lab 4: Add AI to your Big Data Pipeline with Cognitive Services, we use Databricks as the platform to call the Azure Computer Vision API.

Link to lab: https://github.com/fabragaMS/ADPE2E/blob/master/Lab/Lab4/Lab4.md

  1. Why have you chosen Databricks to provide that aspect of the solution?

  2. What role would you say Databricks serves within your solution design?

  3. Are there other options (within the Azure ecosystem) that could perform the same role as Databricks for this particular use case?

  4. Within the “Create Databricks Linked Service in Azure Data Factory” section of the lab, we use Data Factory to run our Databricks notebook, which calls the Computer Vision service.

Does that mean the Databricks cluster that provides the compute for this aspect must be active indefinitely (or at least as long as we need to use the solution we are implementing)? [link to section within lab: https://github.com/fabragaMS/ADPE2E/blob/master/Lab/Lab4/Lab4.md#create-databricks-linked-service-in-azure-data-factory ]

error deploying

Hello

When I deploy, it fails with an error at the "Review & Create" step:

{"code":"InvalidTemplateDeployment","details":[{"code":"InvalidParameter","target":"imageReference","message":"The following list of images referenced from the deployment template are not found: Publisher: MicrosoftWindowsDesktop, Offer: Windows-10, Sku: rs5-pro, Version: latest. Please refer to https://docs.microsoft.com/en-us/azure/virtual-machines/windows/cli-ps-findimage for instructions on finding available images."}],"message":"The template deployment 'Microsoft.Template-20211012172212' is not valid according to the validation procedure. The tracking id is '729a9123-5be9-45a9-97f0-e5655ab6361a'. See inner errors for details."}

[screenshot: error_deploying_ADPE2E]

Add Container [Update]

Hello team,

I am following your instructions, and it looks like they no longer match the portal exactly, as there may have been some updates to the platform.

[screenshot: AddContainer]

I am following the instructions - Add Container

On the Add Container blade, enter the following details:

  • Database id > Create new: NYC
  • Container id: ImageMetadata
  • Partition key: /requestId
  • Throughput: 400
  • Unique keys: /requestId

As I look through mine, it has more options, and I had to click "+ Add unique key" and enter /requestId myself, as it didn't originally give me a field labelled "Unique keys: /requestId".
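For reference, the portal blade values from the lab can be sketched as the container definition the Cosmos DB API ultimately receives. This is a minimal sketch: the JSON field names follow the Cosmos DB container resource shape, and only the values come from the lab instructions above.

```python
# Container definition matching the lab's "Add Container" step, expressed
# as the body shape the Cosmos DB API expects for a container resource.
container_definition = {
    "id": "ImageMetadata",                # Container id
    "partitionKey": {
        "paths": ["/requestId"],          # Partition key
        "kind": "Hash",
    },
    "uniqueKeyPolicy": {
        "uniqueKeys": [
            {"paths": ["/requestId"]},    # Unique keys (added via "+ Add unique key")
        ]
    },
}
throughput_rus = 400                      # Throughput (RU/s), set separately from the definition
```

Whichever way the portal UI evolves, these are the values that matter: the partition key and the unique key both point at /requestId.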

France region is not supported despite being listed in the recommended regions

When I hit the deploy button to initiate deployment and select my existing resource group located in France Central, I get a message saying that the France Central region is not supported.

[screenshot]

However ./deploy/deploy.md states that France Central is supported.

[screenshot]

It seems that Databricks workspaces are not supported in the France Central location.

The obvious workaround is to switch my resource group to another location.

rgds

Issue with Lab 0 and Lab 1

Hi,

I found a few issues.

  1. The ARM template deployment gets stuck, mainly on the Operational SQL resource.
  2. Further, when we execute Restore NYCDataSets.sql, we get a "SQL version not supported" error.

Operation results in exceeding quota limits of Core. Maximum allowed: 10, Current in use: 0, Additional requested: 12

{"telemetryId":"beeb0598-e318-44f3-8b81-977d482700a9","bladeInstanceId":"Blade_2aa034b766dd4a1da428a619e9442540_3_0","galleryItemId":"MyGalleryItem","createBlade":"DeployToAzure","code":"InvalidTemplateDeployment","message":"The template deployment 'bill_moderndw.net.mdw-lab' is not valid according to the validation procedure. The tracking id is 'b4967577-95eb-4e30-9001-d08a75dddf80'. See inner errors for details. Please see https://aka.ms/arm-deploy for usage details.","details":[{"code":"QuotaExceeded","message":"Operation results in exceeding quota limits of Core. Maximum allowed: 10, Current in use: 0, Additional requested: 12. Please read more about quota increase at https://aka.ms/ProdportalCRP/?#create/Microsoft.Support/Parameters/{"subId":"f2b5de09-a02a-45a6-b5eb-dab4b57ac6d2","pesId":"15621","supportTopicId":"32447243"}."}]}

What do I need to request?

Lab 4 - HTTPError: 403 Client Error: Forbidden for url: https://australiaeast.api.cognitive.microsoft.com/vision

While working through Lab 4, I came across the following issue.
After importing the NYCImageMetadata-Lab notebook and replacing subscription_key and vision_base_url, I used "https://petlifetoday.com/wp-content/uploads/2018/06/wireless-dog-fence.jpg" as the "Image URL" and hit Run/Run All.
I got the following error (image attached below):
"HTTPError: 403 Client Error: Forbidden for url: https://australiaeast.api.cognitive.microsoft.com/vision/v2.0/analyze?visualFeatures=Categories%2CDescription%2CColor%2CBrands%2CTags%2CObjects&details=Landmarks%2CCelebrities"

[screenshot]

I opened the Microsoft Cognitive Services API console for Computer Vision and passed the same subscription key and the same image URL, and I got a response:
[screenshot]

"403: Forbidden..." seems to be very common issue. There are number of suggested solutions, but none related to my particular case. There was a suggestion to pass more parameters to the request header, not just 'Ocp-Apim-Subscription-Key'. I added 'User-Agent' and 'Content-Type', but that didn't change anything (and I didn't expect it to).
The fact that I can establish connection and run the service with my subscription key from the MS API console, and not from Databricks tells me something is wrong in the communication between Databricks and ComputerVision resource in Azure with my account.
Any help is highly appreciated.
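For anyone debugging a 403 like this, it can help to rebuild the request by hand and confirm the endpoint matches the error URL. The sketch below assembles the same analyze URL, headers, and parameters the notebook would send; the region, key, and image URL are placeholders, and the actual POST is left commented out. A common cause of 403 is a subscription key issued for a different region than the endpoint being called.

```python
# Sketch of the Computer Vision v2.0 analyze request from the lab notebook.
# Placeholders: region, subscription_key, and the image URL are assumptions
# to be replaced with your own values.
region = "australiaeast"                      # must match the region of YOUR resource
subscription_key = "<your-subscription-key>"  # placeholder
vision_base_url = f"https://{region}.api.cognitive.microsoft.com/vision/v2.0/"
analyze_url = vision_base_url + "analyze"

# The key travels in this header; a 403 usually means the key and the
# endpoint region do not match, or the key belongs to a different resource.
headers = {"Ocp-Apim-Subscription-Key": subscription_key}
params = {
    "visualFeatures": "Categories,Description,Color,Brands,Tags,Objects",
    "details": "Landmarks,Celebrities",
}
body = {"url": "https://example.com/some-image.jpg"}  # placeholder image URL

# Using the third-party `requests` library, the call would be:
# response = requests.post(analyze_url, headers=headers, params=params, json=body)
# response.raise_for_status()   # raises HTTPError on 403
```

Since the same key works from the API console, comparing the console's endpoint host against `analyze_url` above is a quick way to spot a region mismatch.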

ForEach loop in ADF passes a path with a space while calling the image function (NYCImageMetadata-Lab) - here is a fix...

There is a problem in how ADF passes the list of images when calling the function created in the Databricks notebook (Lab 4, Step 18): a bug introduces a space between the image container path and the image name. To fix it, set this in the ForEachImage base parameters: @concat(trim(variables('ImageMetadataContainerUrl')), trim(item().name))
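The effect of that expression can be illustrated in plain Python (not ADF expression language): Python's str.strip() plays the role of ADF's trim(). The container URL below is a hypothetical example, not a value from the lab.

```python
# Why the fix works: stray whitespace in either part ends up in the middle
# of the assembled URL, and trimming both parts before concatenating
# removes it. Mirrors:
#   @concat(trim(variables('ImageMetadataContainerUrl')), trim(item().name))
container_url = "https://example.blob.core.windows.net/nycimages/ "  # hypothetical; note trailing space
image_name = "image001.jpg"

broken = container_url + image_name                  # space lands mid-path
fixed = container_url.strip() + image_name.strip()   # trim both, then concat

assert " " in broken
assert " " not in fixed
```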

#8 Dataset [Update]

Ola!

I am just confirming, because it looks like Azure Data Factory has gone through an update, and I want to know whether to tick or leave blank the box under step 8. Please refer to the attached screenshot. I am referring to the box in this section:

[screenshot: ChildItems]

Select the Get Metadata activity and enter the following details:

  • General > Name: GetImageFileList
  • Dataset: SynapseDataLake_NYCImages_Binary
  • Source > Field list: Child Items

I am enjoying these labs!

Tash :)

Yellow Trip Data CSV Files have a double header

First of all, fantastic workshop, thank you so much for putting this together.

Lab 2 - Azure Data Factory - Copy NYC Taxi Data to Data Warehouse fails.

I looked at the source files on the blob storage, and they have a double header:

[screenshot]

The duplicated first line is what causes the error. I tried downloading the files from blob storage, but it was taking too long.

I wanted to let you know.
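Until the source files are fixed, one workaround is to drop the repeated header while reading. A minimal sketch with the standard csv module, using hypothetical column names as stand-ins for the Yellow Trip Data schema:

```python
import csv
import io

# Keep the first header row, then discard any later row identical to it
# (the duplicated header described in the issue above).
raw = io.StringIO(
    "VendorID,PickupTime,FareAmount\n"
    "VendorID,PickupTime,FareAmount\n"   # duplicated header line
    "1,2019-01-01 00:00:00,9.5\n"
)

reader = csv.reader(raw)
header = next(reader)                            # first (real) header
rows = [row for row in reader if row != header]  # drop repeated headers
```

In the lab's copy pipeline, the equivalent fix is to skip the extra header line in the source dataset settings rather than in code, but the filtering idea is the same.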
