
Comments (31)

helderpinto commented on July 18, 2024

Thank you for reporting this issue. Unfortunately, I do not have access to a US Gov Cloud environment to reproduce the issue. Did you call the script with the -AzureEnvironment parameter? For example:

./Deploy-AzureOptimizationEngine.ps1 -AzureEnvironment AzureUSGovernment

Alternatively, if it still does not work, can you try to deploy in a PowerShell prompt other than the Cloud Shell, after cloning the repository and installing the requirements as documented?

justinharty1 commented on July 18, 2024

I had not seen much documentation on what I could pass to the script file, but I just tried the unmodified deploy-azureoptimizationengine.ps1 and got the original issue:

"WARNING: Interactive authentication is not supported in this session, please run cmdlet 'Connect-AzAccount -UseDeviceAuthentication'."

This means the script cannot log in, and it stops.

My modified script, which adds '-UseDeviceAuthentication' to the login section (sketched below), invoked as
./Deploy-AzureOptimizationEngine.ps1 -AzureEnvironment AzureUSGovernment

does allow me to log in, with the following warning message; I follow the link, authenticate, and continue.

'WARNING: To sign in, use a web browser to open the page https://microsoft.com/deviceloginus and enter the code to authenticate.'
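
The modification is roughly the following (a sketch, not the script's exact code; $AzureEnvironment stands for whatever value is passed on the command line):

# Original login call only supports interactive browser authentication:
# Connect-AzAccount -Environment $AzureEnvironment
# Modified to use the device code flow, which works in sessions without a browser:
Connect-AzAccount -Environment $AzureEnvironment -UseDeviceAuthentication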

The script then proceeds until:

Checking Azure Automation Run As account...
Generating a RSA private key
.........................................................+++++
.....+++++
writing new private key to 'xxxxxx'

New-AzADAppCredential: /home/username/azureoptimizationengine/Deploy-AzureOptimizationEngine.ps1:52
Line |
52 | … redential = New-AzADAppCredential -ApplicationId $Application.Applica …
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| Application with AppId '18fd9058-18db-425f-bb40-05c37180cdef' does not exist.

justinharty1 commented on July 18, 2024

Note:
In an attempt to isolate where the problem was, I also tried it from a local Windows 10 client with the Azure CLI (still having to use the modified code), and hit slightly different issues due to 'internet explorer' and changed command output/reading.

This line elicited a warning:
$Application = New-AzADApplication -DisplayName $ApplicationDisplayName -HomePage ("http://" + $applicationDisplayName) -IdentifierUris ("http://" + $keyId)

WARNING: Upcoming breaking changes in the cmdlet 'New-AzADApplication' :

  • The parameter : 'IdentifierUris' is changing.
  • Change description : The value will be considered valid only if it exists as a verified domain in a tenant.
    Note : Go to https://aka.ms/azps-changewarnings for steps to suppress this breaking change warning, and other information on breaking changes in Azure PowerShell.

I added a line prior to the $Application assignment to suppress the warning: Set-Item Env:\SuppressAzurePowerShellBreakingChangeWarnings "true"

At that point, the PowerShell command line would stall and give no further output after a certain set of notification lines, while Azure would report the deployment as finished successfully.

helderpinto commented on July 18, 2024

For this error

Application with AppId '18fd9058-18db-425f-bb40-05c37180cdef' does not exist

I guess Azure AD in US Government has a different behaviour that I cannot reproduce. You can work around it by:

  1. going to the Automation Account created by the failed deployment and manually generating the Run As Account (see how to);
  2. redeploying the solution with the exact same deployment options (it will skip the part that failed, because the Run As Account already exists).

For this warning

WARNING: Upcoming breaking changes in the cmdlet 'New-AzADApplication'

it does not have any impact on the deployment, and it can also be safely ignored after creating the Run As Account manually.

justinharty1 commented on July 18, 2024

I have re-run the script after manually creating the Run As Account from the Automation Account.

The script then came to a spot where the hard-coded Azure public cloud SQL address needed to be changed to US Gov cloud addressing:

Write-Host "Deploying SQL Database model..." -ForegroundColor Green

$sqlPassPlain = (New-Object PSCredential "user", $sqlPass).GetNetworkCredential().Password    
#Sql Server Address Change for USgov cloud from windows.net to usgovcloudapi.net
$sqlServerEndpoint = "$sqlServerName.database.usgovcloudapi.net"
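
A more portable alternative (just a sketch, assuming $AzureEnvironment and $sqlServerName are already in scope and that Get-AzEnvironment exposes the SQL DNS suffix for the selected cloud) would be to derive the suffix instead of hard-coding it:

# Derive the SQL Database DNS suffix from the chosen cloud instead of hard-coding it
$azEnv = Get-AzEnvironment -Name $AzureEnvironment
$sqlSuffix = $azEnv.SqlDatabaseDnsSuffix.TrimStart('.')   # e.g. database.windows.net or database.usgovcloudapi.net
$sqlServerEndpoint = "$sqlServerName.$sqlSuffix"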

After another re-run

Granting Azure AD Global Reader role to the Automation Run As Account (look for the login window that may have popped up)...
The expression after '&' in a pipeline element produced an object that was not valid. It must result in a command name, a script block, or a CommandInfo object.
Could not grant role. If you want Azure AD-based recommendations, please grant the Global Reader role manually to the aoe4-auto-runasaccount Service Principal.
Deployment completed!

Given that, I cannot adequately adjust the code to compensate, so I will just add the Global Reader role manually.
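
For the manual grant, something along these lines should work (a sketch using the AzureAD module; the service principal name is taken from the message above, and the Global Reader role may need to be activated in the tenant first):

# Connect to Azure AD in the US Government cloud
Connect-AzureAD -AzureEnvironmentName AzureUSGovernment

# Find the Run As service principal created by the deployment
$sp = Get-AzureADServicePrincipal -SearchString "aoe4-auto-runasaccount"

# Global Reader only appears in Get-AzureADDirectoryRole after it has been activated
$role = Get-AzureADDirectoryRole | Where-Object { $_.DisplayName -eq "Global Reader" }
if (-not $role) {
    $template = Get-AzureADDirectoryRoleTemplate | Where-Object { $_.DisplayName -eq "Global Reader" }
    Enable-AzureADDirectoryRole -RoleTemplateId $template.ObjectId
    $role = Get-AzureADDirectoryRole | Where-Object { $_.DisplayName -eq "Global Reader" }
}

# Add the service principal as a member of the Global Reader role
Add-AzureADDirectoryRoleMember -ObjectId $role.ObjectId -RefObjectId $sp.ObjectId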

helderpinto commented on July 18, 2024

Great! Thanks for the feedback. The solution definitely needs to be tested against sovereign clouds. I just opened issue #37 for this SQL URL problem. A new version is going to be released in the upcoming weeks. In the meantime, there is a known issue with the ingestion of data into Log Analytics (it impacts US Government and other sovereign clouds), which is already solved in the development branch. You should replace the Ingest-OptimizationCSVExportsToLogAnalytics runbook code in your deployment with this one.

justinharty1 commented on July 18, 2024

With my install mostly done, I am moving on to the next steps.
I have imported the new ingest-optimizationcsvexportstologanalytics runbook, but have to wait for it to run, and I am unsure what it will do, since all the other runbooks are failing with authentication issues.

Looking at the runbooks and schedules, all of them had failed with the same authentication-style issue.

From the 'Exception' tab of the scheduled run details of any runbook currently in the AOE set

"The running command stopped because the preference variable "ErrorActionPreference" or common parameter is set to Stop: ClientCertificateCredential authentication failed: AADSTS900382: Confidential Client is not supported in Cross Cloud request. Trace ID: 001e12c8-65e1-4d49-931f-f531f0bceb01 Correlation ID: cecbca67-98a8-45af-9269-a0bb74b3f9ea Timestamp: 2021-05-25 21:33:16Z

Looking into the error code:
Azure AD authentication & authorization error codes | Microsoft Docs
AADSTS90082 OrgIdWsFederationNotSupported - The selected authentication policy for the request isn't currently supported.

Edit:
Diving into the code and variables... I changed the AzureOptimization_CloudEnvironment string variable (was AzureCloud) to AzureUSGovernment.
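
The same change can be scripted (a sketch; the resource group and Automation Account names are placeholders for your deployment):

# Point the engine at the right cloud; the variable name is the one discussed in this thread
Set-AzAutomationVariable -ResourceGroupName "<aoe-resource-group>" -AutomationAccountName "<aoe-automation-account>" `
    -Name "AzureOptimization_CloudEnvironment" -Value "AzureUSGovernment" -Encrypted $false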

helderpinto commented on July 18, 2024

Yes, updating the AzureOptimization_CloudEnvironment variable would make it work. Will fix that in the deployment script.

justinharty1 commented on July 18, 2024

Started getting successful runbook runs (after editing the variable to the correct Azure cloud), with a warning coming up in some of them:

Unable to set default context 'Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureContext'.

justinharty1 commented on July 18, 2024

The argavailsetexports storage container has a CSV from what I assume are other successful jobs, but the runbook reports this for every blob name it goes through in the schedule:

AOE module ingest-optimizationcsvexportstologanalytics
Failed
Could not find a valid ingestion control row for argavailsetexports (Could not find a valid ingestion control row for argavailsetexports)
Logging in to Azure with RunAsAccount...

Getting blobs list from aoe4sa storage account (argavailsetexports container)...

helderpinto commented on July 18, 2024

Can you check the following:

  • Is the AzureOptimization_SQLServerHostname variable correctly set to the actual SQL Server name?
  • Does the LogAnalyticsIngestControl table in the azureoptimization SQL database have rows in it? If it doesn't, then can you run this script against the database?
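
A quick way to check the second item (a sketch; requires the SqlServer PowerShell module, and the server suffix and credentials are placeholders for your deployment):

# Count the ingestion control rows in the AOE database
Invoke-Sqlcmd -ServerInstance "<sql-server-name>.database.usgovcloudapi.net" -Database "azureoptimization" `
    -Username "<sql-admin>" -Password "<sql-password>" `
    -Query "SELECT COUNT(*) AS ControlRows FROM [dbo].[LogAnalyticsIngestControl]"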

Thanks!

justinharty1 commented on July 18, 2024

I have checked azureoptimization_sqlserverhostname; it is correct.

The table exists in the DB, and I ran the script: 0 rows affected.

justinharty1 commented on July 18, 2024

I took a look at the current code in the ingest .ps1 file, and it is different from the one that got loaded into my Automation runbooks (it did not have the "could not find a valid ingestion control row" write-out error in the code at line 196).

I copied the newer version in and got further, but hit an error:

Failed
ScriptHalted (ScriptHalted)
Logging in to Azure with RunAsAccount...

Getting blobs list from aoe4sa storage account (argavailsetexports container)...

Found 1 new blobs to process...
About to process 20210526-availsets-all.csv...

Failed to upload 1 rows

justinharty1 commented on July 18, 2024

I went back to setup-loganalyticsworkspaces.ps1; it wasn't working for me in either the Azure portal (Search-AzGraph) or the Azure CLI (az graph).
Through trial and error, I got the workspace variables to be passed from the graph query.

On or around line 48:
$workspaces = Search-AzGraph -Query $argQuery -First $ARGPageSize -Subscription $subscriptions
would return null data for any variable check like $workspace.properties.id or $workspace.subscriptionId.

I had to adjust the query line so the data is assigned to the variable correctly:
$workspaces = (Search-AzGraph -Query $argQuery -First $ARGPageSize -Subscription $subscriptions).Data

helderpinto commented on July 18, 2024
  1. For the ingest script, can you check whether the AzureOptimization_LogAnalyticsWorkspaceId and AzureOptimization_LogAnalyticsWorkspaceKey variables have correct values for the LA workspace you configured for the solution?

  2. For the setup-workspaces script, this is a known issue introduced by v. 0.10 of the Az.ResourceGraph module. Will be addressed in the next release.

justinharty1 commented on July 18, 2024

AzureOptimization_LogAnalyticsWorkspaceId and AzureOptimization_LogAnalyticsWorkspaceKey are there; I re-entered the workspace key, with similar results:

Failed
ScriptHalted (ScriptHalted)
Logging in to Azure with RunAsAccount...

Getting blobs list from aoe4sa storage account (consumptionexports container)...

Found 2 new blobs to process...
About to process 2021-05-19-subidnamehere.csv...

Failed to upload 184 rows

justinharty1 commented on July 18, 2024

I just re-ran the ingest CSV runbook again, after adding some Write-Output statements to the OMS data function to try to ascertain where it was failing, and it completed successfully.

A second ingest was tested, and it is just working.
No major changes can account for it; it just 'worked', other than the few Write-Output lines I added to follow the path.

helderpinto commented on July 18, 2024

I am not convinced it worked until you confirm that you find AzureOptimization* logs actually ingested in Log Analytics, because when you write output inside a function, everything you write is normally returned as part of the function's output and may influence the outcome of the calling code, i.e., it may just have 'thought it worked'.
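
A minimal, generic illustration of that behaviour (this is not AOE code):

function Get-RowCount {
    Write-Output "about to count rows"   # this string becomes part of the function's return value
    return 42
}
(Get-RowCount).GetType().Name            # Object[] - two items, not the Int32 the caller expected

function Get-RowCountQuiet {
    Write-Host "about to count rows"     # goes to the host stream, not the pipeline
    return 42
}
(Get-RowCountQuiet).GetType().Name       # Int32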

justinharty1 commented on July 18, 2024

Attempting to use Power BI seems to support your theory that the ingests didn't work: I am connecting with the Power BI .pbix file, and it has zero data under the columns to load/transform.

I just removed all the extra marker Write-Outputs and re-ran the individual ingest jobs manually; each "completed" and found 0 new blobs to process (checking the blobs, there are daily CSV files going back to May 26th). I am using this as a baseline to see how it works going forward.

Edit: the first container I checked had daily CSV files; the rest of the ARG containers only had a CSV from May 26th.
Is this a timer thing, 7 days?
Container - current CSV files (all dates are from May):
aadobjectsexports - 26th through 31st
advisorexports - 27th
argappgwexports - 26th
argavailsetexports - 26th
argdiskexports - 26th
arglbexports - 26th
argvhdexports - 26th
argvmexports - 26th
consumptionexports - 26th through 31st
recommendationsexports - 27th

The Export-ARG jobs have 'failed' after their first successful runs on May 26th.
The jobs themselves show no 'errors' but do have an exception: "You cannot call a method on a null-valued expression. (You cannot call a method on a null-valued expression.)"

Since nothing has likely changed with these export queries, I assume this is the likely (or known?) issue.

justinharty1 commented on July 18, 2024

Is there a way to reset the processing times so it will ingest everything again?
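
(For my own notes, I am guessing something along these lines would do it, but I have not tried it; the table and LastProcessedDateTime column are the ones visible later in this thread, so the schema should be double-checked and the UPDATE scoped to a single container before running it.)

# Push the ingestion watermark back so previously processed blobs are picked up again
Invoke-Sqlcmd -ServerInstance "<sql-server-name>.database.usgovcloudapi.net" -Database "azureoptimization" `
    -Username "<sql-admin>" -Password "<sql-password>" `
    -Query "UPDATE [dbo].[LogAnalyticsIngestControl] SET LastProcessedDateTime = '1901-01-01T00:00:00'"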

helderpinto commented on July 18, 2024

The best way to check whether the log ingestion process is working is to run the following query in the Log Analytics workspace you associated/created for AOE:

union AzureOptimization* | where TimeGenerated > ago(7d) | summarize LastIngested=max(TimeGenerated), TotalRowsCount=count() by Type

If it returns rows and these were ingested recently, then we can move forward in the troubleshooting process. Otherwise, the Ingest-OptimizationCSVExportsToLogAnalytics runbook definitely isn't working in the US Government cloud and must be fixed.

helderpinto commented on July 18, 2024

In the meantime, I found an issue with the Ingest-OptimizationCSVExportsToLogAnalytics runbook. Can you please fully replace the Ingest-OptimizationCSVExportsToLogAnalytics runbook code with this one and give it a new try? If it completes successfully, then you should see logs landing in the LA workspace after some minutes (use the query I shared above to validate).

justinharty1 commented on July 18, 2024

I was unable to get the union query to work; Azure SQL didn't like it.
I ran the SQL query 'select * from [dbo].[LogAnalyticsIngestControl]' and got the first attached image.
[screenshot: LogAnalyticsIngestControl rows]

Note: last night's log runs had two ingest failures (consumptionexports and aadobjectsexports) that, from the output, appear to have processed blobs, while the rest were successes and processed 0 new blobs.

Re-Ran new ingest runbook on aadobjectsexports and consumptionexports
LastProcessedDateTime changed

[screenshot: updated LastProcessedDateTime values]

Re-Ran on one of the other storage containers and got 0 new blobs processed, no date change on it

justinharty1 commented on July 18, 2024

The Export Jobs that populate all the Argxxxyyy storage containers are failing after the initial run.
[screenshot: failed Export-ARG job runs]
Example Test Run:
Export-ARGManagedDisksPropertiesToBlobStorage
Current failed runs as of 5/27 through today
Failed
1 Warning: Unable to set default context 'Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureContext'.
OUTPUT:
You cannot call a method on a null-valued expression. (You cannot call a method on a null-valued expression.)
Logging in to Azure with RunAsAccount...

Environments

{[AzureChinaCloud, AzureChinaCloud], [AzureCloud, AzureCloud], [AzureGermanCloud, AzureGermanCloud], [AzureUSGovernme...

Name : Solutions Team (redacted) - redacted -
redacted
Account : redacted
Environment : AzureUSGovernment
Subscription : redacted
Tenant : redacted
TokenCache :
VersionProfile :
ExtendedProperties : {}
Getting subscriptions target
Querying for ARM Managed Disks properties
Found 1 Managed Disk entries

Original successful run 5/26:
Logging in to Azure with RunAsAccount...

Environments

{[AzureChinaCloud, AzureChinaCloud], [AzureCloud, AzureCloud], [AzureGermanCloud, AzureGermanCloud], [AzureUSGovernme...

Name : Solutions Team (redacted) - redacted -
redacted
Account : redacted
Environment : AzureUSGovernment
Subscription : redacted
Tenant : redacted
TokenCache :
VersionProfile :
ExtendedProperties : {}

Getting subscriptions target

Querying for ARM Managed Disks properties

Found 22 Managed Disk entries

ICloudBlob : Microsoft.Azure.Storage.Blob.CloudBlockBlob
BlobType : BlockBlob
Length : 15004
IsDeleted : False
BlobClient : Azure.Storage.Blobs.BlobClient
BlobBaseClient : Azure.Storage.Blobs.Specialized.BlockBlobClient
BlobProperties : Azure.Storage.Blobs.Models.BlobProperties
RemainingDaysBeforePermanentDelete :
ContentType : text/csv
LastModified : 5/26/2021 8:33:24 PM +00:00
SnapshotTime :
ContinuationToken :
VersionId :
IsLatestVersion :
AccessTier : Cool
TagCount : 0
Tags :
Context : Microsoft.WindowsAzure.Commands.Storage.AzureStorageContext
Name : 20210526-disks-all.csv

helderpinto commented on July 18, 2024

The union query I shared is meant to be run in the Log Analytics workspace, not in the SQL database. From the second screenshot you shared, it seems the ingestion is working properly now. Regarding the ARG export failures, can you please check the Az.ResourceGraph module version that was installed in the Automation Account? It should be 0.8.0.
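
To check and, if necessary, downgrade the module in the Automation Account, something like this should do it (a sketch; the resource group and account names are placeholders):

# Check which Az.ResourceGraph version is imported into the Automation Account
Get-AzAutomationModule -ResourceGroupName "<aoe-resource-group>" -AutomationAccountName "<aoe-automation-account>" `
    -Name "Az.ResourceGraph"

# Remove the newer import and pull 0.8.0 from the PowerShell Gallery instead
Remove-AzAutomationModule -ResourceGroupName "<aoe-resource-group>" -AutomationAccountName "<aoe-automation-account>" `
    -Name "Az.ResourceGraph" -Force
New-AzAutomationModule -ResourceGroupName "<aoe-resource-group>" -AutomationAccountName "<aoe-automation-account>" `
    -Name "Az.ResourceGraph" -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/Az.ResourceGraph/0.8.0"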

justinharty1 commented on July 18, 2024

az.resourcegraph is at version 0.10.0

Deleted it and re-imported 0.8.0.
Re-ran the Export-ARGxxx runbooks, then the Ingest-xxx runbooks that rely on them.
Re-ran the union query in Log Analytics:
[screenshot: union query results]

Looks like it is now working.

Thanks for your assistance with this part.
Now on to implementing the rest.

justinharty1 commented on July 18, 2024

Notes:
The updated program files and new installer produced a successful first-run deployment (in Azure US Government) for a different subscription.

The first run of the scheduled runbook items all went through with success.
The union query pulled the following:
[screenshot: union query results]

helderpinto commented on July 18, 2024

Yes, a new release was published early this week. I am glad everything now worked perfectly at the first run. Your inputs were very important for this release. Very much appreciated. Once you get the first recommendations ingested into the SQL Database, you should be able to connect the Power BI report and visualize all the recommendations. Please, let me know how it goes.

justinharty1 commented on July 18, 2024

Reporting in on the Power BI View:
Results have been consistent across the initial version and subsequent updates.

Tabs that report Data:
Overview
Cost
High Availability
Security
Performance
Operational Excellence
Progress

Tabs that have no Data:
VM Right-size Overview
VM Right-size Exploration
Rec. Details (drillthrough)
Rec. History (drillthrough)

helderpinto commented on July 18, 2024

The VM right-size pages have no data because Azure Advisor is not reporting oversized VMs. If you read the FAQs in the README.md, you'll find out why and how you can adjust Advisor to increase the chances of getting recommendations.
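
If the adjustment in question is Advisor's low-CPU threshold, it can also be set from PowerShell with the Az.Advisor module (a sketch; 5 is the most aggressive value):

# Lower Advisor's average CPU utilization threshold so more VMs qualify as right-size candidates
Set-AzAdvisorConfiguration -LowCpuThreshold 5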

The Rec. Details/History drillthroughs are available only when you right-click on a recommendation line in the Cost/HA/Security/Perf/OpEx pages tables.

Since the solution is now running according to the expectations, please let me know if this issue can be closed.

justinharty1 commented on July 18, 2024

We can close this thread/issue. I'll go through and see how things compare to expectations.

Thanks for the prompt responses to queries and help requests.

Thanks,
