azure / test-drive-azure-synapse-with-a-1-click-poc
License: MIT License
Hi there,
I was getting the following error.
{
"status": "Failed",
"error": {
"code": "CreateWorkspaceError",
"message": "An error has occured while creating the workspace. Correlation Id: ef82716a-21ec-44dc-a7e8-2f4ecf67d8e1"
}
}
Not sure how to resolve this.
Could someone here please help? Thanks
You can still make the rest of the notebook work by commenting out this line:
lrModel.save(logRegDirfilename)
And this is the error:
Py4JJavaError: An error occurred while calling o1156.save.
: Operation failed: "This request is not authorized to perform this operation using this permission.", 403, HEAD, https://mstbniiuq2crxcdapoc.dfs.core.windows.net/dlsmstpocfs1/user/trusted-service-user/lrModel_08-03-2021-1628019120?upn=false&action=getStatus&timeout=90
at org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:166)
at org.apache.hadoop.fs.azurebfs.services.AbfsClient.getPathStatus(AbfsClient.java:414)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:551)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:430)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1627)
at org.apache.spark.ml.util.FileSystemOverwrite.handleOverwrite(ReadWrite.scala:696)
at org.apache.spark.ml.util.MLWriter.save(ReadWrite.scala:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Traceback (most recent call last):
File "/opt/spark/python/lib/pyspark.zip/pyspark/ml/util.py", line 244, in save
self.write().save(path)
File "/opt/spark/python/lib/pyspark.zip/pyspark/ml/util.py", line 183, in save
self._jwrite.save(path)
File "/opt/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 69, in deco
return f(*a, **kw)
File "/opt/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o1156.save.
: Operation failed: "This request is not authorized to perform this operation using this permission.", 403, HEAD, https://mstbniiuq2crxcdapoc.dfs.core.windows.net/dlsmstpocfs1/user/trusted-service-user/lrModel_08-03-2021-1628019120?upn=false&action=getStatus&timeout=90
at org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:166)
at org.apache.hadoop.fs.azurebfs.services.AbfsClient.getPathStatus(AbfsClient.java:414)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:551)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:430)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1627)
at org.apache.spark.ml.util.FileSystemOverwrite.handleOverwrite(ReadWrite.scala:696)
at org.apache.spark.ml.util.MLWriter.save(ReadWrite.scala:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
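Commenting out the save is a workaround; if you would rather keep model saving working, a 403 like this usually means the identity running the notebook lacks data-plane access to the ADLS Gen2 account. A sketch of the usual fix (the object ID is a placeholder; the storage account and resource group names are taken from this thread and may differ in your deployment — RBAC changes can take a few minutes to propagate):

```shell
# Look up the storage account's resource ID (names here match this thread's POC;
# substitute your own storage account and resource group).
STORAGE_ID=$(az storage account show \
  --name mstbniiuq2crxcdapoc \
  --resource-group 1-Click-POC \
  --query id -o tsv)

# Grant data-plane read/write on blobs to the user (or the Synapse workspace
# managed identity) that runs the notebook. The object ID is a placeholder.
az role assignment create \
  --assignee "<user-or-managed-identity-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "$STORAGE_ID"
```

Note that Owner/Contributor on the subscription is not enough for ABFS access; the data-plane role ("Storage Blob Data Contributor") is what the Hadoop ABFS driver checks.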
I ended up registering them manually; you will need at least Microsoft.Network, Microsoft.Sql, and Microsoft.Synapse. (And maybe others?)
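For anyone hitting the same thing, the resource providers can be registered from the Azure CLI. A sketch (Microsoft.Storage is added here as a guess, since the template also deploys a data lake; the exact list may vary by subscription):

```shell
# Register the resource providers the template depends on.
for ns in Microsoft.Network Microsoft.Sql Microsoft.Synapse Microsoft.Storage; do
  az provider register --namespace "$ns"
done

# Registration is asynchronous; poll until the state reads "Registered".
az provider show --namespace Microsoft.Synapse --query registrationState -o tsv
```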
There are important files that all Microsoft projects should have which are not present in this repository. A pull request has been opened to add the missing file(s). When the PR is merged, this issue will be closed automatically.
Microsoft teams can learn more about this effort and share feedback within the open source guidance available internally.
The pipeline run failed because of the error below:
{
"errorCode": "2100",
"message": "ErrorCode=SqlFailedToConnect,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot connect to SQL Database. Please contact SQL server team for further support. Server: 'conuwwzj3yu5h2vcpocws1', Database: 'conuwwzj3yu5h2vcpocws1p1 (conuwwzj3yu5h2vcpocws1/conuwwzj3yu5h2vcpocws1p1)', User: 'sqladminuser'. Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server),Source=.Net SqlClient Data Provider,SqlErrorNumber=53,Class=20,ErrorCode=-2146232060,State=0,Errors=[{Class=20,Number=53,State=0,Message=A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server),},],''Type=System.ComponentModel.Win32Exception,Message=The network path was not found,Source=,'",
"failureType": "UserError",
"target": "Create Schema If Does Not Exists",
"details": []
}
I tried the database name without the parentheses part, but it still did not work. Any ideas?
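SqlErrorNumber 53 ("server was not found or was not accessible") from the integration runtime often means the workspace firewall is blocking it rather than the database name being wrong. One thing worth checking, sketched with the workspace name from this error and a placeholder resource group, is whether the "allow Azure services" rule exists on the Synapse workspace:

```shell
# The special rule name AllowAllWindowsAzureIps with a 0.0.0.0 start/end range
# enables "Allow Azure services and resources to access this workspace".
az synapse workspace firewall-rule create \
  --name AllowAllWindowsAzureIps \
  --workspace-name conuwwzj3yu5h2vcpocws1 \
  --resource-group <resource-group> \
  --start-ip-address 0.0.0.0 \
  --end-ip-address 0.0.0.0
```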
If I am reading this correctly, the resume time defaults to 9 PM and the pause time to 5 PM. Do you mean for the resume time to be 9 AM?
"defaultValue": "09:00 PM ( 21:00 )",
should be
"defaultValue": "09:00 AM ( 09:00 )",
When deploying these resources to Azure, I get this error:
{
"status": "Failed",
"error": {
"code": "BadRequest",
"message": "Content-Type header value missing.",
"details": []
}
}
Type:
Microsoft.Synapse/workspaces/sqlPools/transparentDataEncryption
Resource ID:
/subscriptions/1f66d329-d23b-4303-acbd-cc4b98f9456e/resourceGroups/1-Click-POC/providers/Microsoft.Synapse/workspaces/conesxkzpg6mdfo2pocws1/sqlPools/conesxkzpg6mdfo2pocws1p1/transparentDataEncryption/current
Deployment Correlation ID:
f81c36f2-f701-4205-b4ed-f71f36fcaaba
Timestamp:
12.8.2021, 12:41:02
How do I fix this repo to make it work?
Wouldn't it be much easier to provide a pre-provisioned service for registered users with a given ACR budget?
Hi,
Very good material, congratulations.
I would like to suggest including an access policy in the Key Vault for the subscription owner; otherwise, they won't be able to see the secret created during provisioning.
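Until the template is changed, the policy can also be added after deployment. A sketch with placeholder names (vault name and owner UPN are not in this thread):

```shell
# Grant the subscription owner read access to secrets in the vault.
# <key-vault-name> and the UPN are placeholders for your deployment.
az keyvault set-policy \
  --name <key-vault-name> \
  --upn <owner@contoso.com> \
  --secret-permissions get list
```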
Regards,
Tiago Moraes
Hi,
I got the following error from the Logic App:
{
"error": {
"code": "ResourceNotFound",
"message": "The Resource 'Microsoft.Synapse/workspaces/mstpocws1/sqlPools/mstpocws1p1' under resource group '1-click-POC' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"
}
}
As you can see, the workspace name is wrong.
I checked the script:
https://github.com/tgpmoraes/Test-Drive-Azure-Synapse-with-a-1-click-POC/blob/main/nestedtemplates/pausetemplate.json
I noticed the variable is defined differently from the one in the main template:
pausetemplate.json - "synapseName": "[toLower(concat(parameters('companyTla'),parameters('deploymentType')))]"
azuredeploy.json - "synapseName": "[toLower(concat(parameters('companyTla'),uniquestring(resourceGroup().id),variables('deploymentType')))]"
After I fixed the name, it worked fine.