Monitor performance, fairness, and quality of a WML model with AI OpenScale APIs

Home Page: https://developer.ibm.com/patterns/monitor-performance-fairness-and-quality-of-a-wml-model-with-ai-openscale-apis

License: Apache License 2.0

Monitor WML Model With Watson OpenScale

In this Code Pattern, we will use German Credit data to create, train, and deploy a machine learning model with Watson Machine Learning. We will then create a data mart for this model with Watson OpenScale, configure OpenScale to monitor the deployment, and inject seven days' worth of historical records and measurements for viewing in the OpenScale Insights dashboard.

When the reader has completed this Code Pattern, they will understand how to:

  • Create and deploy a machine learning model using the Watson Machine Learning service
  • Set up a Watson OpenScale Data Mart
  • Bind Watson Machine Learning to the Watson OpenScale Data Mart
  • Add subscriptions to the Data Mart
  • Enable payload logging and the performance monitor for subscribed assets
  • Enable the Quality (Accuracy) monitor
  • Enable the Fairness monitor
  • Score the German credit model using Watson Machine Learning
  • Insert historic payloads, fairness metrics, and quality metrics into the Data Mart
  • Use the Data Mart to access table data via subscriptions

(Architecture diagram)

Flow

  1. The developer creates a Jupyter Notebook on Watson Studio.
  2. The Jupyter Notebook is connected to a PostgreSQL database, which is used to store Watson OpenScale data.
  3. The notebook is connected to Watson Machine Learning, and a model is trained and deployed.
  4. The notebook uses Watson OpenScale to log payloads and to monitor the model's performance, quality, and fairness (a minimal sketch of this flow follows the list).
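
The notebook drives step 4 of this flow through the Watson OpenScale Python client. As a rough orientation, here is a minimal sketch of that sequence, assuming the deprecated ibm-ai-openscale client that was current when this pattern was written; the variable names (AIOS_CREDENTIALS, WML_CREDENTIALS, model_uid) are placeholders for values set in earlier notebook cells, and the exact method signatures may differ from the notebook's version.

# Minimal sketch of the notebook's OpenScale flow (ibm-ai-openscale client).
# AIOS_CREDENTIALS, WML_CREDENTIALS, and model_uid are defined in earlier notebook cells.
from ibm_ai_openscale import APIClient
from ibm_ai_openscale.engines import WatsonMachineLearningInstance, WatsonMachineLearningAsset

ai_client = APIClient(AIOS_CREDENTIALS)

# 1. Create (or reuse) the OpenScale data mart: free internal DB or your own PostgreSQL instance.
ai_client.data_mart.setup(internal_db=True)

# 2. Bind the Watson Machine Learning instance to the data mart.
binding_uid = ai_client.data_mart.bindings.add(
    'WML instance', WatsonMachineLearningInstance(WML_CREDENTIALS))

# 3. Subscribe the deployed German credit model so OpenScale can track it.
subscription = ai_client.data_mart.subscriptions.add(WatsonMachineLearningAsset(model_uid))

# 4. Enable the monitors that feed the Insights dashboard.
subscription.payload_logging.enable()
subscription.performance_monitoring.enable()
# The Quality (accuracy) and Fairness monitors are enabled the same way, with thresholds and
# the monitored attributes (Sex, Age) passed as parameters -- see the notebook for the exact calls.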

Watch the Video

(Video link)

Prerequisites

  • An IBM Cloud account
  • IBM Watson Studio (the notebook in steps 5 and 6 is created and run there)

Steps

  1. Clone the repository
  2. Use the free internal DB or create a Databases for PostgreSQL instance
  3. Create a Watson OpenScale service
  4. Create a Watson Machine Learning instance
  5. Create a notebook in IBM Watson Studio
  6. Run the notebook in IBM Watson Studio
  7. Set up OpenScale to utilize the dashboard

1. Clone the repository

git clone https://github.com/IBM/monitor-wml-model-with-watson-openscale
cd monitor-wml-model-with-watson-openscale

2. Use the free internal DB or create a Databases for PostgreSQL instance

If you wish, you can use the free internal Database with Watson OpenScale. To do this, make sure that the cell for KEEP_MY_INTERNAL_POSTGRES = True remains unchanged.

If you have, or wish to use, a paid Databases for PostgreSQL instance, follow these instructions:

Note: Services created must be in the same region and space as your Watson Studio service.

  • Using the IBM Cloud Dashboard catalog, search for PostgreSQL and choose the Databases for PostgreSQL service.
  • Wait a couple of minutes for the database to be provisioned.
  • Click on the Service Credentials tab on the left and then click New credential + to create the service credentials. Copy them or leave the tab open to use later in the notebook.
  • Make sure that the cell in the notebook that has:
KEEP_MY_INTERNAL_POSTGRES = True

is changed to:

KEEP_MY_INTERNAL_POSTGRES = False
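
If you use your own PostgreSQL instance, the credentials you copied are pasted into the notebook as DB_CREDENTIALS (the variable name referenced in step 6). A minimal sketch of what those cells might look like; the dictionary contents are a placeholder for your actual service-credentials JSON, not its real schema:

# Sketch of the external-database cells in the notebook.
# Paste the full JSON from the Service Credentials tab as a Python dict;
# the body below is a placeholder, not the exact credential schema.
DB_CREDENTIALS = {
    # ... contents of the "New credential" JSON go here ...
}

# Switch the notebook from the free internal DB to your own PostgreSQL instance.
KEEP_MY_INTERNAL_POSTGRES = False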

3. Create a Watson OpenScale service

NOTE: At this time (March 27, 2019), you must use an instance of Watson OpenScale deployed in the Dallas region. This is currently the only region that sends events about scoring requests to the message hub, which is read by OpenScale to populate the payload logging table.

4. Create a Watson Machine Learning instance

  • Under the Settings tab of your Watson Studio project, scroll down to Associated services, click + Add service, and choose Watson:

  • Search for Machine Learning, verify that the service is being created in the same region and space as your Watson Studio service, and click Create.

  • Alternatively, you can choose an existing Machine Learning instance and click Select.

  • The Watson Machine Learning service is now listed as one of your Associated Services.

  • In a different browser tab go to https://cloud.ibm.com/ and log in to the Dashboard.

  • Click on your Watson Machine Learning instance under Services, click on Service credentials and then on View credentials to see the credentials.

  • Save the credentials in a file. You will use them inside the notebook.
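
Those saved credentials are later pasted into the notebook as WML_CREDENTIALS (the variable name used in step 6). A minimal sketch of that cell, assuming the classic Watson Machine Learning credential format; copy the exact JSON from your own View credentials page rather than these placeholder keys and values:

# Sketch of the Watson Machine Learning credentials cell.
# Replace every value with the fields from your own "View credentials" JSON;
# the keys shown reflect the classic WML credential format and may differ slightly.
WML_CREDENTIALS = {
    "apikey": "<your-wml-api-key>",
    "instance_id": "<your-wml-instance-id>",
    "url": "https://us-south.ml.cloud.ibm.com"
}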

5. Create a notebook in IBM Watson Studio

  • In Watson Studio, click New Project + under Projects or, at the top of the page, click + New, choose the Data Science tile, and then click Create Project.
  • Using the project you've created, click + Add to project and then choose the Notebook tile, or, in the Assets tab under Notebooks, choose + New notebook to create a notebook.
  • Select the From URL tab. [1]
  • Enter a name for the notebook. [2]
  • Optionally, enter a description for the notebook. [3]
  • Under Notebook URL provide the following url: https://raw.githubusercontent.com/IBM/monitor-wml-model-with-watson-openscale/master/notebooks/OpenScale.ipynb [4]
  • For Runtime select the Default Python 3.6 Free option. [5]
  • Click the Create notebook button. [6]

(Screenshot: OpenScale notebook creation)

6. Run the notebook in IBM Watson Studio

Follow the instructions for Provision services and configure credentials:

Your Cloud API key can be generated by going to the Users section of the Cloud console.

  • From that page, click your name, scroll down to the API Keys section, and click Create an IBM Cloud API key.
  • Give your key a name and click Create, then copy the created key and paste it below.

Alternatively, from the IBM Cloud CLI:

ibmcloud login --sso
ibmcloud iam api-key-create 'my_key'
ibmcloud resource service-instance <Watson_OpenScale_instance_name>

The last command prints the details of your Watson OpenScale instance, including its GUID.

  • Enter the AIOS_GUID and CLOUD_API_KEY in the next cell for the AIOS_CREDENTIALS (a sketch of this cell appears at the end of this step).

  • Add the Watson Machine Learning credentials for the service that you created in the next cell as WML_CREDENTIALS.

  • Either use the internal database, which requires no changes, or add your DB_CREDENTIALS after reading the instructions preceding that cell, and change the cell KEEP_MY_INTERNAL_POSTGRES = True to KEEP_MY_INTERNAL_POSTGRES = False.

  • Move your cursor to each code cell and run the code in it. Read the comments for each cell to understand what the code is doing. Important: when the code in a cell is still running, the label to the left changes to In [*]. Do not continue to the next cell until the code has finished running.
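
As referenced above, here is a minimal sketch of how the OpenScale credentials cell might be filled in. AIOS_GUID comes from the ibmcloud resource service-instance output (or the Cloud console) and CLOUD_API_KEY is the API key you just created; the dictionary keys and the service URL shown are assumptions about the credential format rather than values taken from the notebook:

# Sketch of the Watson OpenScale credentials cell.
# AIOS_GUID is the GUID of your Watson OpenScale instance; CLOUD_API_KEY is your IBM Cloud API key.
# The dictionary keys and URL below are assumptions, not copied from the notebook.
CLOUD_API_KEY = "<your-ibm-cloud-api-key>"
AIOS_GUID = "<your-watson-openscale-instance-guid>"

AIOS_CREDENTIALS = {
    "instance_guid": AIOS_GUID,
    "apikey": CLOUD_API_KEY,
    "url": "https://api.aiopenscale.cloud.ibm.com"
}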

7. Set up OpenScale to utilize the dashboard

Now that you have created a machine learning model, you can use the OpenScale dashboard to gather insights. Follow the steps below to configure the OpenScale dashboard.

Sample Output

  • Go to the instance of Watson OpenScale that you created and click Manage on the menu and then Launch Application. Choose the Insights tab to get an overview of your monitored deployments, Accuracy alerts, and Fairness alerts.

(Screenshot: Watson OpenScale Insights dashboard)

  • Click on the tile for the Spark German Risk Deployment to see graphs for Fairness, Accuracy, and Performance (Avg. Requests/Minute). Click on a portion of a graph to bring up a detailed view. We'll describe the data we clicked on; yours will vary.

(Screenshot: detailed monitoring graphs)

  • You can see from the image above that, at this time, our model is receiving 4.6 requests per minute, with an accuracy of 72%. We have a 13% credit risk (and 87% no risk) due to the fact that this individual is age 18 to 25, and a 28% credit risk (and 72% no risk) due to the fact that this individual is female. This latter statistic is flagged as bias.

  • Now click View details.

(Screenshot: transaction detail for the age attribute)

  • We can see that, for the age category, 87% of 18 to 25 year olds received No Risk compared to 82% of 26 to 75 year olds. This is not flagged as biased.

  • Now click on the tab marked age and change it to sex. We can see that only 71% of the female group received the outcome of No Risk, compared to 78% of the male group. This is flagged as BIAS (a quick back-of-the-envelope check of that ratio follows the screenshot below). Now click on View Transactions.

(Screenshot: bias detail for the sex attribute)
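
OpenScale's fairness score for an attribute is essentially the rate of favorable outcomes in the monitored group divided by the rate in the reference group. Here is a quick back-of-the-envelope check with the percentages quoted above; the 0.95 alert threshold used here is an illustrative assumption, not a value taken from this pattern:

# Rough fairness check using the percentages quoted above.
# fairness_ratio is the favorable-outcome rate of the monitored group (female)
# divided by that of the reference group (male); the 0.95 threshold is illustrative only.
female_no_risk = 0.71   # share of the female group that received "No Risk"
male_no_risk = 0.78     # share of the male group that received "No Risk"

fairness_ratio = female_no_risk / male_no_risk
print(f"fairness ratio: {fairness_ratio:.2f}")   # ~0.91

if fairness_ratio < 0.95:
    print("Below the illustrative threshold -- this attribute would be flagged as biased.")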

  • In the Explain a transaction window, we can see the details of which features contributed specific amounts to the overall assessment of Risk vs. No Risk, as well as the Minimum changes for another outcome and the Minimum factors supporting this outcome.

(Screenshot: Explain a transaction detail)

License

Apache 2.0

