
An opinionated tooling platform for managing compliance as code, using continuous integration and NIST's OSCAL standard.

Home Page: https://oscal-compass.github.io/compliance-trestle

License: Apache License 2.0


compliance-trestle's Introduction

Compliance-trestle (also known as trestle)


Trestle is an ensemble of tools that enable the creation, validation, and governance of documentation artifacts for compliance needs. It leverages NIST's OSCAL as a standard data format for interchange between tools and people, and provides an opinionated approach to OSCAL adoption.

Trestle is designed to operate as a CI/CD pipeline running on top of compliance artifacts in git, providing transparency into the state of compliance across multiple stakeholders in an environment friendly to developers. Trestle passes the generated artifacts on to tools that orchestrate the enforcement, measurement, and reporting of compliance.

It also provides tooling to manage OSCAL documents in a more human-friendly manner. By splitting large OSCAL data structures into smaller and easier to edit sub-structures, creation and maintenance of these artifacts can follow normal git workflows including peer review via pull request, versioning, releases/tagging.

Trestle provides three separate but related functions in the compliance space:

  • Manage OSCAL documents to allow editing and manipulation while making sure the schemas are enforced
  • Transform documents from other formats to OSCAL
  • Provide support and governance to author compliance content as markdown and drawio.

Trestle provides tooling to help orchestrate the compliance process across a number of dimensions:

  • Help manage OSCAL documents in a more human-friendly manner by splitting the large OSCAL data structures into smaller and easier to edit sub-structures while making sure the schemas are enforced.
  • Transform documents from other formats to OSCAL
  • Provide governance for markdown documents and enforce consistency of format and content based on specified templates
  • Tooling to manage authoring and governance of markdown and drawio files within a repository.
  • Support within trestle to streamline management within a managed git environment.
  • An underlying object model that supports developers interacting with OSCAL artifacts.
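The split/merge workflow described above can be sketched generically: carving one top-level element of a JSON model out into its own file, and folding it back in later. This is a minimal illustration with plain JSON and hypothetical helper names, not trestle's actual implementation:

```python
import json
from pathlib import Path

def split_model(model_path: Path, element: str) -> Path:
    """Split one top-level element of a JSON model into its own file."""
    model = json.loads(model_path.read_text())
    part = model.pop(element)
    part_path = model_path.parent / f"{element}.json"
    part_path.write_text(json.dumps(part, indent=2))
    model_path.write_text(json.dumps(model, indent=2))
    return part_path

def merge_model(model_path: Path, element: str) -> None:
    """Inverse of split_model: fold the element file back into the model."""
    part_path = model_path.parent / f"{element}.json"
    model = json.loads(model_path.read_text())
    model[element] = json.loads(part_path.read_text())
    model_path.write_text(json.dumps(model, indent=2))
    part_path.unlink()
```

Because each sub-structure lives in its own small file, edits show up as focused diffs that suit normal git review workflows.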

Important Note:

The current version of trestle supports NIST OSCAL 1.1.2 as well as previous versions 1.1.x and 1.0.x. All files created by trestle will be output as OSCAL version 1.1.2.

There was a breaking change in OSCAL moving from version 1.0.0 to 1.0.2, mainly due to prop becoming props in AssessmentResults. Those who require strict OSCAL 1.0.0 should use trestle version 0.37.x. That version is stable but will not have any features added, and we encourage all users to move to OSCAL 1.1.2. OSCAL version 1.0.0 files are still handled on import, but any AssessmentResults must conform to the props form of the OSCAL specification.

Why Trestle

Compliance suffers from being a complex topic that is hard to articulate simply. It involves complete and accurate execution of multiple procedures across many disciplines (e.g. IT, HR, management) with periodic verification and audit of those procedures against controls.

While it is possible to manage the description of controls and how an organisation implements them in ad hoc ways with general tools (spreadsheets, documents), this is hard to maintain for multiple accreditations and, in the IT domain at least, creates a barrier between the compliance efforts and the people doing daily work (DevOps staff).

Trestle aims to reduce or remove this barrier by bringing the maintenance of control descriptions into the DevOps domain. The goal is to have changes to the system (for example, updates to configuration management) easily related to the controls impacted, and to enable modification of those controls as required in concert with the system changes.

Trestle implicitly provides a core opinionated workflow driven by its pipeline to allow standardized interlocks with other compliance tooling platforms.

Machine readable compliance format

Compliance activities at scale, whether size of estate or number of accreditations, require automation to be successful and repeatable. OSCAL as a standard allows teams to bridge between the "Governance" layer and operational tools.

By building human managed artifacts into OSCAL, Trestle is not only able to validate the integrity of the artifacts that people generate - it also enables reuse and sharing of artifacts, and furthermore can provide suitable input into tools that automate operational compliance.

Supported OSCAL elements and extensions

trestle implicitly supports all OSCAL schemas for use within the object model. The development roadmap for trestle includes adding opinionated workflow around specific elements / objects.

Supported file formats for OSCAL objects.

OSCAL supports xml, json and yaml with their metaschema tooling. Trestle natively supports only json and yaml formats at this time.

The future roadmap anticipates that support for xml import and upstream references will be enabled. However, it is expected that full support will remain only for json and yaml.

Users needing to import XML OSCAL artifacts are recommended to look at NIST's XML to json conversion page here.

Python codebase, easy installation via pip

Trestle runs on almost all Python platforms (e.g. Linux, Mac, Windows), is available on PyPi and can be easily installed via pip. It is under active development and new releases are made available regularly.
To install run: pip install compliance-trestle
See Install trestle in a python virtual environment for the full installation guide.

Complete documentation and tutorials

Complete documentation, tutorials, and background on compliance can be found here.

Agile Authoring

A trestle-based agile authoring repository setup tool, documentation and tutorial can be found here.

Agile authoring comprises the following beneficial features:

  • based on OSCAL documents behind-the-scenes
  • employs Git for document control and access
  • exposes text (markdown) and spreadsheets (csv) to ease management of compliance artifacts
  • implements compliance digitization for improved audit readiness and cost effectiveness

Demos

A collection of demos utilizing trestle can be found in the related project compliance-trestle-demos.

Development status

Compliance trestle is currently stable and is based on NIST OSCAL version 1.1.2, with active development continuing.

Community meetings and communications

Scheduled meetings

Please attend! All are invited.

When:

Every other Tuesday starting on April 23, 2024 · 11:00 – 11:30am ET convert to your local time

Where: Google Meet Link

Dial in: (US) +1 402-627-0247 PIN: 535 362 764#
More phone numbers

What: Meeting agenda and notes Google Docs

Chat anytime

Slack: #oscal-compliance-trestle-agileauthoring-c2p

  • Note: You can login to Slack using another account like Google, Apple

Contributing to Trestle

Our project welcomes external contributions. Please consult contributing to get started.

License & Authors

If you would like to see the detailed LICENSE click here. Consult contributors for a list of authors and maintainers for the core team.

# Copyright (c) 2020 IBM Corp. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

compliance-trestle's People

Contributors

aj-stein-nist, alejo2995, anebula, be-code, bradh, brunomarq, butler54, compliance-trestle-1, deenine, degenaro, enikonovad, folksgl, fsuits, guyzyl, hukkinj1, imgbot[bot], jayhawk87, jeffdmgit, jpower432, jrubinstein-dev, leninmehedy, mab879, mrgadgil, pritamdutt, srmamit, stevemar, vikas-agarwal76


compliance-trestle's Issues

Implement `trestle split`

Issue description / feature objectives

Implement trestle split according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Be able to split subcomponent of an OSCAL model including simple JSON object, arrays and objects of type additionalProperties.

Implement `trestle validate`

Issue description / feature objectives

Implement trestle validate according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Validate contents of OSCAL model as per specifications.

scripts/fix_any.py does not work together with make gen-oscal

Describe the bug
scripts/fix_any.py is embedded in scripts/gen_oscal.sh, which should be run by make gen-oscal. When doing so, the path assumptions within scripts/fix_any.py fail with the following error:

Traceback (most recent call last):
  File "scripts/fix_any.py", line 123, in <module>
    with open(out_name, 'w') as out_file:
FileNotFoundError: [Errno 2] No such file or directory: 'out_trestle/oscal/ssp.py'
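The failure is the classic open-for-write into a directory that does not yet exist; a hedged sketch of the usual fix (the helper name is hypothetical, not the actual fix_any.py code):

```python
import os

def safe_write(out_name: str, text: str) -> None:
    # Create the parent directory first so open(..., 'w') cannot fail
    # with FileNotFoundError, regardless of the current working directory.
    os.makedirs(os.path.dirname(out_name) or ".", exist_ok=True)
    with open(out_name, "w") as out_file:
        out_file.write(text)
```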

To Reproduce
Steps to reproduce the behavior:

  1. make gen-oscal

Expected behavior
make gen-oscal updates oscal models AND overwrites current models so it can be run cleanly.

Add trestle duplicate to CLI spec doc

Issue description / feature objectives

In discussion trestle duplicate appears to drive foundational behaviour for other commands. Build out docs for duplicates.

Completion Criteria

  • Complete documentation
  • Create issue for implementation.

Minimize duplication in referencing model names as strings

Issue description / feature objectives

There is currently significant duplication in the code when referencing model names. We should probably have a dictionary in a constants file that maps the reference model names to the pydantic model objects and use that dictionary in all trestle commands, for example:

  • during the creation of the directory structure in trestle init
  • during instantiation of the pydantic model object

Completion Criteria

  • have a single place for reference model names that is used everywhere else in the code
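The proposed constants dictionary could look roughly like this (the class names are stand-ins, not the actual generated pydantic models):

```python
# Stand-ins for the generated pydantic model classes.
class Catalog: ...
class Profile: ...
class SystemSecurityPlan: ...

# Single source of truth: model name -> model class.
MODEL_REGISTRY = {
    'catalog': Catalog,
    'profile': Profile,
    'system-security-plan': SystemSecurityPlan,
}

def model_class_for(name: str):
    """Look up the model class; commands share this instead of string literals."""
    try:
        return MODEL_REGISTRY[name]
    except KeyError:
        raise ValueError(f'unknown model name: {name}') from None
```

trestle init could then build its directory structure by iterating over MODEL_REGISTRY keys, and instantiation code would call model_class_for rather than hard-coding names.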

Bug: Conflicting dependencies between markdown-it-py and attrs

Describe the bug
When following the instructions in the README to install Trestle from source, the following error is thrown after trying to install dependencies:
ERROR: markdown-it-py 0.4.9 has requirement attrs~=19.3, but you'll have attrs 20.2.0 which is incompatible.

To Reproduce
Steps to reproduce the behavior:

  1. git clone https://github.com/IBM/compliance-trestle.git
  2. cd compliance-trestle
  3. python3 -m venv venv
  4. . ./venv/bin/activate
  5. pip install -q -e ".[dev]" --upgrade --upgrade-strategy eager

Expected behavior
All dependencies for dev should be installed without errors.

Pydantic appears to not be correctly enforcing types for some structures.

Issue description / feature objectives

In investigating some typing behaviour I noticed that there are some issues with the pydantic models. When we have a structure such as

{
  "named-uuid-element_1": {"component": ...},
  "named-uuid-element_2": {"component": ...}
}

the generated models fall back to Any-typed enforcement (see the two examples below):

class ComponentDefinition(BaseModel):
    metadata: Metadata
    import_component_definitions: Optional[List[ImportComponentDefinition]] = Field(
        None, alias='import-component-definitions'
    )
    components: Optional[Dict[str, Any]] = None
    capabilities: Optional[Dict[str, Any]] = None
    back_matter: Optional[BackMatter] = Field(None, alias='back-matter')
class InventoryItem(BaseModel):
    asset_id: str = Field(
        ...,
        alias='asset-id',
        description='Organizational asset identifier that is unique in the context of the system. This may be a reference to the identifier used in an asset tracking system or a vulnerability scanning tool.',
        title='Asset Identifier',
    )
    description: Description
    properties: Optional[List[Prop]] = None
    annotations: Optional[List[Annotation]] = None
    links: Optional[List[Link]] = None
    responsible_parties: Optional[Dict[str, Any]] = Field(
        None, alias='responsible-parties'
    )
    implemented_components: Optional[Dict[str, Any]] = Field(
        None, alias='implemented-components'
    )
    remarks: Optional[Remarks] = None

Expected behavior would be something closer to this:

class ComponentDefinition(BaseModel):
    metadata: Metadata
    import_component_definitions: Optional[List[ImportComponentDefinition]] = Field(
        None, alias='import-component-definitions'
    )
    components: Optional[Dict[str, Component]] = None
    capabilities: Optional[Dict[str, Capability]] = None
    back_matter: Optional[BackMatter] = Field(None, alias='back-matter')

Completion Criteria

Models are correctly generated.

Model generation as part of the CICD build process.

Issue description / feature objectives

One of the original goals for trestle was to automate the build process for keeping up to date with OSCAL as the standard evolves.

One concern I have is that our 'drift' today is inconsistent. We have not made a firm decision on 'which' version of OSCAL we use (apart from the latest) - and differences are mainly driven by when we decide that we need to mess with the OSCAL models.

I can see a few approaches that could be taken.

We change our generation script to look for a release explicitly: https://github.com/usnistgov/OSCAL/releases.
The laziest way is we do this manually in a 'set and forget' mode, however, that may expose us to drift.

A second approach could be to put a check in the CICD pipeline that we are always using the 'latest' version of oscal. This could be defined one of two ways.

  1. The latest tag / release
  2. The latest in trunk

Given the way the OSCAL team manages their repo, it's possible to converge both options so the latest in trunk will be the latest tag (as they have working and release directories in the repo).

Once we have a decision here, the next question is how to deploy it. My gut feel is that the best way to check would be to run the 'update' code in the CICD pipeline and fail the check if it produces a change. That way the user would need to do an update locally before a PR could be merged.

Completion Criteria

  • Strategy decided and documented in contributing.md
  • CICD / other changes are implemented.

Deep run time type conversion for pydantic models across different pydantic 'modules' representing OSCAL objects.

Issue description / feature objectives

The current approach for trestle is to create one module of models per OSCAL schema. OSCAL schemas are overlapping in nature, which means we have actual or near-duplicate object definitions in various modules.

Discussion with NIST team (usnistgov/OSCAL#731) is unresolved and we are unlikely to see shared models in the short term w/o changing our model structures.

This gives us a requirement: I want a simple interface to do a deep copy of pydantic models across 'module spaces'.

E.g. Humans know that trestle.oscal.catalog.Metadata is equivalent to trestle.oscal.profile.Metadata and should be deep-copyable; however, pydantic's type enforcement causes errors.

Ideally what we would want to have is something similar in functionality to this:

import trestle.oscal.catalog as c
import trestle.oscal.profile as p

my_catalog: c.Catalog = catalog_from_disk_or_elsewhere


# Following line does not work
# profile: p.Profile = p.Profile(metadata=my_catalog.metadata)

# Something like this would be nice(ish)
 profile: p.Profile = p.Profile(metadata=my_catalog.metadata.cast_to(p.Metadata))
# or 
 profile: p.Profile = p.Profile(metadata=my_catalog.metadata.cast_to(p))
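Independent of pydantic, the cast amounts to serializing the source object to a plain dict and re-constructing it as the target class. A flat sketch using stdlib dataclasses as stand-ins for the two structurally identical Metadata classes (recursion into nested objects is omitted; names are illustrative):

```python
from dataclasses import dataclass, asdict

# Two structurally equivalent classes from different "module spaces".
@dataclass
class CatalogMetadata:
    title: str
    version: str

@dataclass
class ProfileMetadata:
    title: str
    version: str

def cast_to(obj, target_cls):
    """Deep-copy obj into target_cls via a dict round-trip.

    An inconsistent field raises TypeError, matching the proposal that
    mismatches should fail loudly by default.
    """
    return target_cls(**asdict(obj))
```

With the actual pydantic models, `target_cls(**source.dict())` would play the same role and additionally re-run validation on construction.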

Completion Criteria

Deep copy routine completed for a generic usecase including associated unit tests.

Expected Behavior

See above plus the notes below

  1. It should be recursive - Field introspection in pydantic should allow the exploration of the underlying objects.
  2. By default inconsistent fields should throw exceptions - allowing users to specially program / handle
  3. It may be worth exploring option behaviour which is 'permissive' / opportunistic.
  4. At this point there should be no assumptions on transformation in the workflow. Generic copy capabilities come first.

Actual Behavior

  • None - users currently have to hand-code deep copies.

Create codecov.yml which provides realistic coverage metrics

The automatically generated classes in trestle.oscal should not need to be individually tested to 100% coverage. UT coverage for any behaviour w.r.t. pydantic should occur over our OscalBaseModel, which dictates pydantic behavior.

  • Create a codecov.yml file which realistically computes coverage.

  • Potentially move trestle.oscal.base_model such that trestle.oscal is only generated code.

  • Put in place minimum coverage metrics as well with codecov.
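A hedged sketch of what such a codecov.yml could look like (the ignore path and coverage target are illustrative, not agreed settings):

```yaml
# Exclude generated code from coverage computation.
ignore:
  - "trestle/oscal/*"

coverage:
  status:
    project:
      default:
        target: 80%   # minimum acceptable project coverage
```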

Formalize support for various formats

Issue description / feature objectives

To date the project has been a little vague on format support within the context of trestle. This is a proposal for documenting what our planned support will be for xml vs json vs yaml.

Context:

Reading the files in the OSCAL repo it looks like yaml is a bit of a second class citizen. Given this the proposal is the following:

  1. XML is supported in the following ways:

    • For import and as an optional output for assemble
    • To be read as an external reference (e.g. via href)
    • Not supported for split / merge
  2. For split / merge both json and yaml are supported

  • Json is created by default
  • yaml output is an optional override
  • Trestle may / MUST error when 'hybrid' use is created

On the last point there are technically no issues, so I'm not really sure what we should do - I would just like it to look clean.

  3. Users wanting to transition from json to yaml or vice versa should do so by re-importing the file trees.

Completion criteria

  • Agreed / document decision in spec and/or README.md
  • Issues created for future work items.

Evaluate tooling options for xccdf conversion into json and/or converting the xccdf schema to json

XCCDF is another NIST standard for representing compliance information as part of scap (https://csrc.nist.gov/Projects/Security-Content-Automation-Protocol/Specifications/xccdf)

The definition of xccdf is xml only.

Evaluate options for representing xccdf in high fidelity as json including:

  1. Creating a derivative json schema
  2. Translation tools

Ideally these tools should be callable from trestle (aka python based).

Even more ideally: this should be represented as pydantic models.

Provide initial feedback then we will determine a path forward.

Refactor / Enhancement: Consolidating model IO into OscalBaseClass and/or pydantic

Issue description / feature objectives

As a trestle developer I would like to have one place which presents a consistent set of IO features and abstractions. Today pydantic presents IO utilities for json which are very good and have wrappers which help support the use of optimized json libraries. This functionality does not exist for XML and/or yaml; however, it could be created.

In order to streamline this the recommendation is to consolidate functionality into the OscalBaseClass where json and other formats can be treated in a similar method.

This will allow us to explore whether:

  1. Yaml support we build generically and can potentially upstream to pydantic
  2. XML support could potentially be added.

Completion Criteria

trestle.oscal.utils functionality is streamlined into OscalBaseClass

Support and testing for windows based running of trestle.

Issue description / feature objectives

This is an optimistic issue; however, I think we should think about this now before the refactors get too big (as in, it may be containable now).

Compliance officers may be using windows. We should check for at least the editing functionality whether installs and builds work properly on windows. Based on the experience of @fsuits this should be true now.

Completion Criteria

Create a binary distribution of trestle for airgapped / environments which have 'allowed software' lists.

Compliance is most important in sensitive environments. One property of those environments is that applications (and source code) typically must be vetted before use (e.g. https://www.cyber.gov.au/acsc/view-all-content/essential-eight/essential-eight-explained)

  • Create an approach to creating binary distributions of trestle such that one object (not trestle and all dependencies) needs to be approved.

  • Previous reviews had identified the approach taken by the aws cli (see quoted content below) as a good approach.

  • Explore whether code can also be signed.

Objectives:

We need to package the trestle cli so that it is easier for end users to download and use rather than going through the usual complex process of setting up python requirements as outlined here: https://packaging.python.org/tutorials/installing-packages/#installing-requirements

As said here: Python’s flexibility is why the first step in every Python project must be to think about the project’s audience and the corresponding environment where the project will run. It might seem strange to think about packaging before writing code, but this process does wonders for avoiding future headaches. https://packaging.python.org/overview/
Findings

AWS-CLI is the best role model for us and we can follow their approach on delivering the python based cli to end-users:

AWS CLI github: https://github.com/aws/aws-cli
AWS CLI has the install script and self contained installer script here: https://github.com/aws/aws-cli/tree/develop/scripts (edited)
Mac: https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-mac.html
Linux: https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-linux.html
windows: https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-windows.html

Implement `trestle add`

Issue description / feature objectives

Implement trestle add according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Add subcomponent to existing OSCAL model as per specifications.

Implement `trestle assemble`

Issue description / feature objectives

Implement trestle assemble according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Assemble all subcomponents of an OSCAL model and place the resulting file under the dist folder as per specifications.

Implement `trestle init`

Issue description / feature objectives

Implement trestle init according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Directory structure created for trestle project according to specification.

parser.wrap_for_output fails to insert the object into the wrapper's field correctly due to an incorrect name reference

Describe the bug
Calling wrap_for_output fails for classes which have hyphens in their names.

   wrapper = parser.wrap_for_output(tdn)
  File "/Users/chris/opt/anaconda3/lib/python3.7/site-packages/trestle/core/parser.py", line 144, in wrap_for_output
    wrapped_model = wrapper_model(**{class_to_oscal(class_name, 'field'): model})
  File "pydantic/main.py", line 346, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for TargetDefinition
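The bug comes down to the mapping between OSCAL's hyphenated names, Python class names, and field names. A hypothetical converter (not trestle's actual class_to_oscal) illustrating the mapping that has to hold for the wrapper model to accept the keyword argument:

```python
import re

def class_name_to_field(class_name: str) -> str:
    """'TargetDefinition' -> 'target-definition' (the OSCAL wrapper key)."""
    return re.sub(r'(?<!^)(?=[A-Z])', '-', class_name).lower()

def field_to_attribute(field_name: str) -> str:
    """'target-definition' -> 'target_definition' (a valid Python attribute)."""
    return field_name.replace('-', '_')
```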


Intra-document validation for OSCAL artifacts.

Issue description / feature objectives

The current pydantic models do not sufficiently describe the constraints required for an OSCAL schema.

NIST defines two sets of IDs

  1. UUIDs: UUIDs must be globally unique within their object type. An acceptable solution is to test whether
    a) UUIDs conform to the required schema
    b) UUIDs are unique (and raise an error identifying the conflict if one exists)

Note: Many UUIDs are optional - I think we should have an option to populate more UUIDs as required.

  2. NIST also defines ID fields. ID fields are scoped within the referenced document, e.g. two catalogs can have colliding ID fields. Suggest two tiers of validation:
    a) Compulsory: For a given object type (e.g. control) no id fields collide
    b) Best-practice: For a given schema document where IDs are defined, no IDs collide

Note: (1) and (2) explicitly point to defining UUIDs / IDs and not references to UUIDs.

The above functionality should be generic such that it can operate on any OSCAL object, presuming uuid and id are special fields that exist.
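A minimal sketch of the UUID portion of such a validator, operating generically over any nested OSCAL-like object (stdlib only; function names are illustrative, not trestle's API):

```python
import uuid

def collect_uuids(node, found=None):
    """Recursively collect every value stored under a 'uuid' key."""
    if found is None:
        found = []
    if isinstance(node, dict):
        for key, value in node.items():
            if key == 'uuid':
                found.append(value)
            else:
                collect_uuids(value, found)
    elif isinstance(node, list):
        for item in node:
            collect_uuids(item, found)
    return found

def validate_uuids(document):
    """Return a list of error strings: malformed UUIDs and duplicates."""
    errors = []
    seen = set()
    for value in collect_uuids(document):
        try:
            uuid.UUID(value)
        except (ValueError, TypeError):
            errors.append(f'malformed uuid: {value!r}')
            continue
        if value in seen:
            errors.append(f'duplicate uuid: {value}')
        seen.add(value)
    return errors
```

A real implementation would additionally scope uniqueness per object type and distinguish UUID definitions from references, as described above.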

Completion Criteria

  • Completed functionality
  • Unit test coverage across all new code
  • tested for catalog, profile, target, and SSP at a minimum.
  • This functionality should work for both assembled and 'distributed' artifacts.
  • validation should be able to operate over all files in a repository.

Make a decision: Use of mypy validation within the project

Mypy may make our development cleaner / safer, however, does require a concerted effort across the project.

  1. Demonstrate mypy is viable including validation (including for IDEs) or not

  2. IF so upstream and ensure mypy compilation is part of devops process.

Create optimisation function / flag for trestle directory trees.

Issue description / feature objectives

The default behaviour of trestle split and trestle merge is that they do not create a single file within its own directory

e.g.
trestle split -e metadata
on

catalog.json

would result in

catalog.json
metadata.json

NOT

catalog.json
metadata/metadata.json

However, it's possible a user can end up in a situation where they DO have single-file directories like this, based on using different contexts to perform different operations.

E.g.
running trestle merge -e version
from

metadata.json
version.json

would merge in the version.json file. However, looking at the root directory for the catalog would give you a tree such as

catalog.json
metadata/metadata.json

which is unnecessary. trestle optimise would take a root file and optimise the directory tree below it.

e.g. trestle optimise -f catalog.json on the above directory would result in

catalog.json
metadata.json

Of course this would need to be recursive.
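The optimise pass could be sketched as a bottom-up walk that collapses any directory holding a single file named after the directory (a hypothetical sketch, not an agreed implementation):

```python
from pathlib import Path

def optimise_tree(root: Path) -> None:
    """Collapse dirs that hold exactly one file named after the directory."""
    # Walk deepest paths first so children collapse before their parents.
    for directory in sorted(root.rglob('*'), key=lambda p: -len(p.parts)):
        if not directory.is_dir():
            continue
        children = list(directory.iterdir())
        if (len(children) == 1 and children[0].is_file()
                and children[0].stem == directory.name):
            # e.g. metadata/metadata.json -> metadata.json
            children[0].rename(directory.parent / children[0].name)
            directory.rmdir()
```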

Completion

  • Update and agree on spec
  • Implement appropriate commands

Implement `trestle merge`

Issue description / feature objectives

Implement trestle merge according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • The inverse behavior of trestle split.

  • Example of expected behavior:

SCENARIO #1

cd nist80053
├── catalog.json
├── catalog
│   ├── groups.json
│   ├── metadata.json
│   ├── groups
│   │   ├── 0000__group.json
│   │   ├── 0001__group.json
│   │   ├── 0002__group.json
│   │   ├── 0003__group.json
│   │   ├── 0004__group.json
│   │   ├── 0005__group.json
│   │   ├── 0006__group.json
│   │   ├── 0007__group.json
│   │   ├── 0008__group.json
│   │   ├── 0009__group.json
│   │   ├── 0010__group.json
│   │   ├── 0011__group.json
│   │   ├── 0012__group.json
│   │   ├── 0013__group.json
│   │   ├── 0014__group.json
│   │   ├── 0015__group.json
│   │   ├── 0016__group.json
│   │   ├── 0017__group.json
│   │   ├── 0018__group.json
│   │   ├── 0019__group.json
│   ├── metadata
│            ├── parties.json
│            ├── parties
│                ├── 0000__party.json

----

cd catalog/groups
/catalogs/nist80053/catalog/groups$ tsl merge -e *
> Invalid. Merge path needs to have at least 2 parts.

----

cd catalog
/catalogs/nist80053/catalog$ tsl merge -e groups.*
├── catalog.json
├── catalog
│   ├── groups.json
│   ├── metadata.json

or

/catalogs/nist80053/catalog$ tsl merge -e groups
> Invalid. Merge path needs to have at least 2 parts.

or

/catalogs/nist80053/catalog$ tsl merge -e metadata.*
├── catalog.json
├── catalog
│   ├── groups.json
│   ├── metadata.json
│   ├── groups
│   │   ├── 0000__group.json
│   │   ├── 0001__group.json
│   │   ├── 0002__group.json
│   │   ├── 0003__group.json
│   │   ├── 0004__group.json
│   │   ├── 0005__group.json
│   │   ├── 0006__group.json
│   │   ├── 0007__group.json
│   │   ├── 0008__group.json
│   │   ├── 0009__group.json
│   │   ├── 0010__group.json
│   │   ├── 0011__group.json
│   │   ├── 0012__group.json
│   │   ├── 0013__group.json
│   │   ├── 0014__group.json
│   │   ├── 0015__group.json
│   │   ├── 0016__group.json
│   │   ├── 0017__group.json
│   │   ├── 0018__group.json
│   │   ├── 0019__group.json

or

cd ..
/catalogs/nist80053$ tsl merge -e catalog.metadata
├── catalog.json
├── catalog
│   ├── groups.json
│   ├── groups
│   │   ├── 0000__group.json
│   │   ├── 0001__group.json
│   │   ├── 0002__group.json
│   │   ├── 0003__group.json
│   │   ├── 0004__group.json
│   │   ├── 0005__group.json
│   │   ├── 0006__group.json
│   │   ├── 0007__group.json
│   │   ├── 0008__group.json
│   │   ├── 0009__group.json
│   │   ├── 0010__group.json
│   │   ├── 0011__group.json
│   │   ├── 0012__group.json
│   │   ├── 0013__group.json
│   │   ├── 0014__group.json
│   │   ├── 0015__group.json
│   │   ├── 0016__group.json
│   │   ├── 0017__group.json
│   │   ├── 0018__group.json
│   │   ├── 0019__group.json

SCENARIO #2

----
├── catalog.json
├── catalog
│   ├── groups.json
│   ├── metadata.json
│   ├── groups
│   │   ├── 0000__group.json
│   │   ├── 0001__group.json
│   │   ├── 0002__group.json
│   │   ├── 0003__group.json
│   │   ├── 0004__group.json
│   │   ├── 0005__group.json
│   │   ├── 0006__group.json
│   │   ├── 0007__group.json
│   │   ├── 0008__group.json
│   │   ├── 0009__group.json
│   │   ├── 0010__group.json
│   │   ├── 0011__group.json
│   │   ├── 0012__group.json
│   │   ├── 0013__group.json
│   │   ├── 0014__group.json
│   │   ├── 0015__group.json
│   │   ├── 0016__group.json
│   │   ├── 0017__group.json
│   │   ├── 0018__group.json
│   │   ├── 0019__group.json
│   ├── metadata
│            ├── parties.json
│            ├── parties
│                ├── 0000__party.json
----

/catalogs/nist80053$ tsl merge -e catalog.*
├── catalog.json
-----
/catalogs/nist80053$ tsl merge -e catalog
> Invalid. Merge path needs to have at least 2 parts.
-----
/catalogs/nist80053$ tsl merge -e *
> Invalid. Merge path needs to have at least 2 parts.
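The path validation shown above could be sketched as a small helper. This is a hypothetical illustration of the rule (the function name is made up, not the actual trestle implementation):

```python
def validate_merge_path(element_path: str) -> None:
    """Reject merge element paths with fewer than 2 dot-separated parts,
    e.g. 'catalog' or '*', matching the error message shown above."""
    if len(element_path.split('.')) < 2:
        raise ValueError('Invalid. Merge path needs to have at least 2 parts.')
```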

Implement `trestle import`

Issue description / feature objectives

Implement trestle import according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Contents of the OSCAL model in the imported file are decomposed and placed in the expected folders per the specification.

Review parser functionality as parser.parse_model and parser.to_full_model_name have overlapping functionality

Issue description / feature objectives

In reviewing the functionality in #60 I noticed that the functionality in that PR is a little confusing. Specifically, is there ever a situation where we need the full model name as provided by to_full_model_name rather than the class itself?

We may be able to simplify the process and eliminate parser.parse_model by returning the class rather than the class path.

Also, to_full_model_name makes some fairly strong assumptions that do not necessarily hold, especially when we have split models.
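A minimal sketch of the proposed simplification, collapsing the two steps into one function that resolves a dotted path straight to the class (the function name is hypothetical):

```python
import importlib

def parse_to_class(full_model_name: str) -> type:
    """Resolve a dotted model name, e.g. 'trestle.oscal.catalog.Catalog',
    directly to the class object instead of returning the path string and
    parsing it again in a second step."""
    module_name, class_name = full_model_name.rsplit('.', 1)
    return getattr(importlib.import_module(module_name), class_name)
```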

Demonstrate generation of dynamic models which manipulate input OSCAL objects.

Issue description / feature objectives

trestle split and trestle merge will create intermediate OSCAL objects that are non-compliant, e.g. they are missing mandatory fields.

Demonstrate the ability to define partial objects for use within the split / merge while STILL being able to validate the documents successfully

Completion Criteria

  • Demonstration class / methods complete
  • Test class / methods complete.
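One way to demonstrate this, assuming the generated models are pydantic classes: derive a 'partial' variant with every field made optional, so a split fragment that omits mandatory fields still validates. The model and helper below are illustrative stand-ins, not trestle code:

```python
from typing import Optional

from pydantic import BaseModel, create_model

class Metadata(BaseModel):
    """Stand-in for a generated OSCAL model with mandatory fields."""
    title: str
    version: str

def make_partial(model_cls: type) -> type:
    """Derive a dynamic model whose fields are all optional, so a split
    fragment that omits mandatory fields can still be validated structurally."""
    fields = {name: (Optional[typ], None) for name, typ in model_cls.__annotations__.items()}
    return create_model(f'Partial{model_cls.__name__}', **fields)
```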

Raise an issue with datamodel-code-generator about strange behavior

Issue description / feature objectives

#12 created a temporary fix for pydantic model creation. The fix will be brittle. Underlying generation in datamodel-code-generator needs to be fixed.

Completion Criteria

  • Issue raised with datamodel-code-generator
  • Ideally create a fix, however, reporting will be sufficient.
  • If the underlying issue is fixed, remove the hacky script from #12

XCCDF and OPA results transformation to OSCAL findings results

Issue description / feature objectives

With most cloud APIs treating JSON as a first-class citizen, we need a consistent mechanism for converting 'test results' into OSCAL.

The current assumption is that assessment-results in OSCAL is the appropriate location. The question is how we capture 'rules' and other objects that may or may not be 1:1 with an objective and represent an individual result of a technical implementation.

Use case a: XCCDF, where 1 'rule' == 1 objective.

Use case b: NIST, where N rules == 1 objective.

For this we need to test our ability to map a limited subset of 'results' formats into OSCAL findings.

  1. SCAP XCCDF test results from Compliance Operator (or similar): compliance_operator_AssessmentResult 7-15.json.zip
    1a) OpenSCAP XCCDF results running on a Linux OS (CentOS / RHEL)

  2. OPA inspired / generated results (https://github.ibm.com/cocoa/evidence-summary)

  3. IBM S&CC results (based on spreadsheets).
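A sketch of the N:1 aggregation in use case b, with an assumed rule-to-objective mapping (the rule IDs, objective IDs, and status vocabulary below are made up for illustration):

```python
from collections import defaultdict

# Hypothetical mapping; N rules may contribute to one objective.
RULE_TO_OBJECTIVE = {
    'xccdf_rule_ssh_disable_root': 'AC-6(a)',
    'xccdf_rule_ssh_idle_timeout': 'AC-12',
    'xccdf_rule_password_max_age': 'AC-12',
}

def results_to_findings(rule_results: dict) -> dict:
    """Aggregate per-rule 'pass'/'fail' results into a status per objective.
    An objective is satisfied only if every contributing rule passed."""
    grouped = defaultdict(list)
    for rule_id, status in rule_results.items():
        grouped[RULE_TO_OBJECTIVE[rule_id]].append(status)
    return {obj: ('satisfied' if all(s == 'pass' for s in statuses) else 'not-satisfied')
            for obj, statuses in grouped.items()}
```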

Completion Criteria

Implement `trestle remove`

Issue description / feature objectives

Implement trestle remove according to specifications.

Completion Criteria

Complete implementation and testing.

Expected Behavior

  • Be able to remove a sub-element of an OSCAL model from a file or model
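A minimal sketch of the removal behavior on a plain dict representation of a model, with the path syntax assumed to mirror the split/merge element paths (function name hypothetical):

```python
def remove_element(model: dict, element_path: str) -> dict:
    """Remove the sub-element named by a dotted path such as
    'catalog.metadata.parties'; the first part names the model itself."""
    parts = element_path.split('.')
    node = model[parts[0]]
    for key in parts[1:-1]:
        node = node[key]
    node.pop(parts[-1])
    return model
```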

Implement `trestle create`

Issue description / feature objectives

Implement trestle create according to specifications.

Completion Criteria

Complete implementation and testing for all trestle create subcommands.

Expected Behavior

  • Sample content created for each trestle create subcommand.
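As a sketch of what sample content might look like for one subcommand: a minimal catalog skeleton with placeholder values (the field names follow the OSCAL catalog schema; the helper itself is an assumption, not trestle's implementation):

```python
import datetime
import uuid

def sample_catalog() -> dict:
    """Minimal placeholder content a 'trestle create' for a catalog might emit;
    all values are placeholders for the user to replace."""
    return {
        'catalog': {
            'uuid': str(uuid.uuid4()),
            'metadata': {
                'title': 'REPLACE_ME',
                'last-modified': datetime.datetime.now(datetime.timezone.utc).isoformat(),
                'version': '0.0.1',
                'oscal-version': '1.0.0',
            },
        }
    }
```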

Code signing for binary release.

Setup infrastructure to automate code signing for a binary release.

Issue description / feature objectives

We need to sign the code with a digital certificate (possibly with an IBM CA root) to ensure the authenticity of code open sourced by IBM.

Completion Criteria

codesign -dv --verbose=4

returns the expected Authority and Hash
