Mapper Creation

Environment Set Up 

Node.js (a JavaScript runtime environment) and Yarn (a package manager for Node.js) are used extensively throughout this guide. The following section details how to install them.

Linux/Mac OS:

  1. Install nvm.

1a. Use either of the following commands to install nvm:

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
wget -qO- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash

1b. Either restart the terminal or run the following commands to use nvm:

export NVM_DIR="$([ -z "${XDG_CONFIG_HOME-}" ] && printf %s "${HOME}/.nvm" || printf %s "${XDG_CONFIG_HOME}/nvm")"

[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
  2. Run the following command to install and use Node.js v16:

nvm install 16

  3. Install Yarn:

npm install --global yarn

Windows:

Install Node.js v16 via the installer. If v16 is not available from the page, use this archive.

Install Yarn:

npm install --global yarn

 

*-to-HDF Creation Guide 

  1. Fork/branch a development repository from the main Heimdall2 GitHub repository.

  • SAF team developers have write access to the main repository and should create a branch on the primary development repository. Non-SAF team developers should instead create a fork of the main repository and create a development branch there.

  2. Create a draft pull request for your development branch against the main repository branch.

  3. Have a rough, high-level outline of how your export translates to HDF. For an example of this process, refer to the HDF Schema Mapping Example Walkthrough section.
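As an illustration of such an outline, the sketch below maps fields from a fictional "AcmeScan" export to target locations in the HDF schema. Every field name here is hypothetical; your export's structure will differ.

```typescript
// Hypothetical high-level outline for a fictional "AcmeScan" export.
// Each entry records where a field from the export should land in HDF.
const outline: Record<string, string> = {
  'scan.projectName': 'profiles[0].name',
  'findings[].id': 'profiles[0].controls[].id',
  'findings[].severity': 'profiles[0].controls[].impact (via IMPACT_MAPPING)',
  'findings[].description': 'profiles[0].controls[].desc',
  'findings[].discoveredAt': 'profiles[0].controls[].results[].start_time'
};
console.log(`${Object.keys(outline).length} planned mappings`);
```

Even an informal table like this makes the later skeleton-filling step largely mechanical.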

  4. Set up for *-to-HDF mapper development.

4a. Install the necessary dependencies for Heimdall2. Under the heimdall2 directory, enter the following command in the terminal:

yarn install

4b. Create a blank TypeScript file under the src directory in hdf-converters. It should be named:

{YOUR-EXPORT-NAME-HERE}-mapper.ts

4c. Select the appropriate mapper skeleton for your export type from the Mapper Creation Skeletons section. Place it in the file created in step 4b. Replace names (skeleton by default) as necessary.

4d. Export your mapper class created in the previous steps by specifying its export in the index.ts file. Add the following line:

export * from './src/{YOUR-EXPORT-NAME-HERE}-mapper';

4e. Create a new directory named {YOUR-EXPORT-NAME-HERE}_mapper under the sample_jsons directory in hdf-converters. Create another directory named sample_input_report in the directory you just made. The file structure should look like this:

+-- sample_jsons
|   +-- {YOUR-EXPORT-NAME-HERE}_mapper
|   |   +-- sample_input_report

4f. Place your sample export under the sample_input_report directory. Your sample export should be genericized to avoid any leaking of sensitive information. The file structure should now look like this:

+-- sample_jsons
|   +-- {YOUR-EXPORT-NAME-HERE}_mapper
|   |   +-- sample_input_report
|   |   |   +-- {YOUR-SAMPLE-EXPORT}

4g. Create a blank TypeScript file under the test/mappers/forward directory in hdf-converters. It should be named:

{YOUR-EXPORT-NAME-HERE}_mapper.spec.ts

4h. Select the appropriate mapper testing skeleton for your export type. Place it in the file created in step 4g. Replace names (skeleton by default) as necessary.

  5. Add fingerprinting to identify your security service scan export.

5a. Go to the file report_intake.ts under the heimdall2/apps/frontend/src/store directory.

5b. Import your mapper file. You should be able to add the name of your mapper class to a pre-existing import statement pointing at hdf-converters as follows:

import {
  ASFFResults as ASFFResultsMapper,
  BurpSuiteMapper,
  ...
  {YOUR-MAPPER-CLASS-HERE}
} from '@mitre/hdf-converters';

5c. Instantiate your mapper class in the convertToHdf switch block. Add the following lines:

case '{YOUR-EXPORT-SERVICE-NAME-HERE}':
  return new {YOUR-MAPPER-CLASS-HERE}(convertOptions.data).toHdf();

5d. Navigate to the file fingerprinting.ts in the src/utils directory in hdf-converters. Add keywords that are unique to your sample export to the fileTypeFingerprints variable. It should be formatted as follows:

export const fileTypeFingerprints = {
  asff: ['Findings', 'AwsAccountId', 'ProductArn'],
  ...
  {YOUR-EXPORT-SERVICE-NAME-HERE}: [{UNIQUE KEYWORDS AS STRINGS}]
};
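Conceptually, fingerprinting scores each known service by how many of its unique keywords appear in the uploaded file and picks the best match. The following is a simplified sketch of that idea, not the actual Heimdall2 implementation:

```typescript
// Simplified sketch of keyword-based fingerprinting (illustrative only):
// score each service by how many of its unique keywords appear in the
// raw file text, and return the highest-scoring service.
const fingerprints: Record<string, string[]> = {
  asff: ['Findings', 'AwsAccountId', 'ProductArn'],
  twistlock: ['complianceDistribution', 'vulnerabilityDistribution']
};

function guessType(rawFile: string): string | undefined {
  let best: string | undefined;
  let bestScore = 0;
  for (const [service, keywords] of Object.entries(fingerprints)) {
    const score = keywords.filter((k) => rawFile.includes(k)).length;
    if (score > bestScore) {
      bestScore = score;
      best = service;
    }
  }
  return best;
}

console.log(guessType('{"Findings": [], "AwsAccountId": "1"}')); // asff
```

This is why the keywords you add must be unique to your export: generic keys shared with other formats would cause misidentification.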
  6. Create the *-to-HDF mapper.

6a. Return to the {YOUR-EXPORT-NAME-HERE}-mapper.ts file, which should now contain the generic skeleton mapper selected for your export type.

6b. Certain security services produce exports that are not immediately usable by the skeleton mapper. In that case, pre-processing of the export and post-processing of the generated HDF file are necessary to ensure compatibility.
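As a hypothetical example of pre-processing, a service that emits a bare JSON array of findings could be wrapped in an object so the path-based skeleton mapper has a stable root to address (the findings key here is illustrative, not part of any real export):

```typescript
// Hypothetical pre-processing step: wrap a bare top-level array in an
// object so path lookups like 'findings[0].id' have a stable root.
function preprocess(exportJson: string): string {
  const parsed = JSON.parse(exportJson);
  return Array.isArray(parsed)
    ? JSON.stringify({findings: parsed})
    : exportJson;
}

console.log(preprocess('[{"id": "CVE-0000-0001"}]'));
// {"findings":[{"id":"CVE-0000-0001"}]}
```

A step like this would typically run in the mapper's constructor, before the parsed export is handed to base-converter.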

6c. The skeleton mapper and base-converter are designed to provide the base functionality needed for *-to-HDF mapper generation. For most developers, mapper creation is limited to assigning objects from the export structure to the corresponding attributes in the mapper according to the HDF schema.

6d. Commit your changes with the signoff option and push them to your development branch. This queues the GitHub Actions pipeline, which includes a Netlify instance of Heimdall2 that you can use to test whether your mapper generates an HDF file correctly.

  7. Set up and use regression testing on your mapper.

7a. Uncomment the commented-out lines in the {YOUR-EXPORT-NAME-HERE}_mapper.spec.ts file created in step 4g. This allows the regression tests to automatically generate an HDF output file whenever you run them. The commented-out lines should look similar to the following:

// fs.writeFileSync(
//   'sample_jsons/skeleton_mapper/skeleton-hdf.json',
//   JSON.stringify(mapper.toHdf(), null, 2)
// );

7b. Using the terminal, cd into the hdf-converters directory and run the following command, which runs your mapper against the sample export file in sample_jsons and checks that the output is generated as expected:

yarn run test --verbose --silent=false ./test/mappers/forward/{YOUR-EXPORT-NAME-HERE}_mapper.spec.ts

7c. Your tests should generate HDF output files both when --with-raw is not set (default behavior) and when it is (denoted by -withraw in the filename). They will also compare the contents of these generated files with a temporary mapper instance created in the test itself. Review the test output to ensure that all tests pass, and review the HDF output files to ensure that the contents of the mapping are generated correctly.

7d. Re-comment the lines uncommented in step 7a.

  8. Document your new mapper in the README for hdf-converters under the Supported Formats section. It should be formatted as follows:
{#}. [{YOUR-EXPORT-NAME-HERE}] - {MAPPER INPUT DESCRIPTION}

  9. Commit your final changes and mark your pull request as 'ready for review'. Request a code review from members of the SAF team and edit your code as necessary. Once approved, your mapper will be merged into the main development branch and scheduled for release as an officially supported conversion format for HDF Converters.

  10. Create a development branch against the SAF CLI repository and create a draft pull request for your new branch.

  11. Set up for SAF CLI mapper integration.

11a. In the package.json file, update the versions of @mitre/hdf-converters and @mitre/heimdall-lite to the latest release of Heimdall2.

11b. In the src/commands/convert directory, create a blank TypeScript file. It should be named:

{YOUR-EXPORT-NAME-HERE}2hdf.ts

11c. In the test/sample_data directory, create a directory named {YOUR-EXPORT-NAME-HERE}. Underneath it, create a directory named sample_input_report. The file structure should now look like this:

+-- sample_data
|   +-- {YOUR-EXPORT-NAME-HERE}
|   |   +-- sample_input_report

11d. Place your sample export under the sample_input_report directory. Your sample export should be genericized to avoid any leaking of sensitive information. Under the {YOUR-EXPORT-NAME-HERE} directory, place your output HDF files generated during the testing phase of step 7c. The file structure should now look like this:

+-- sample_data
|   +-- {YOUR-EXPORT-NAME-HERE}
|   |   +-- sample_input_report
|   |   |   +-- {YOUR-SAMPLE-EXPORT}
|   |   +-- {YOUR-EXPORT-NAME-HERE}-hdf.json
|   |   +-- {YOUR-EXPORT-NAME-HERE}-hdf-withraw.json

11e. In the test/commands/convert directory, create a blank TypeScript file. It should be named:

{YOUR-EXPORT-NAME-HERE}2hdf.test.ts
  12. Integrate your mapper with the SAF CLI.

12a. Insert the skeleton for integrating an HDF mapper with the SAF CLI. Replace names (skeleton by default) as necessary.

12b. Insert the skeleton for a convert command test for the SAF CLI. Replace names (skeleton by default) as necessary.

12c. Navigate to the index.ts file under the src/commands/convert directory. Import your mapper using the existing import block as follows:

import {
  ASFFResults,
  ...
  {YOUR-MAPPER-CLASS-HERE}
} from '@mitre/hdf-converters'

12d. Under the switch block in the getFlagsForInputFile function, add a case for your export service name as defined in step 5d for fingerprinting with the generic convert command. If the convert command for your mapper has any additional flags beyond the standard --input and --output flags, return the entire flag block in the switch case. This is demonstrated as follows:

switch (Convert.detectedType) {
  ...
  case '{YOUR-EXPORT-SERVICE-NAME-HERE}':
    return {YOUR-CLI-CONVERT-CLASS}.flags  //Only add if special flags exist
  ...
    return {}
  }
  13. Edit the README file to reflect your newly added conversion command under the To HDF section. It should be formatted as follows:
##### {YOUR-EXPORT-NAME-HERE} to HDF

```
convert {YOUR-EXPORT-NAME-HERE}2hdf       Translate a {YOUR-EXPORT-NAME-HERE} results {EXPORT-TYPE} into
                                              a Heimdall Data Format JSON file

OPTIONS
  -i, --input=input          Input {EXPORT-TYPE} File
  -o, --output=output        Output HDF JSON File
  -w, --with-raw             Include raw input file in HDF JSON file

EXAMPLES
  saf convert {YOUR-EXPORT-NAME-HERE}2hdf -i {INPUT-NAME} -o output-hdf-name.json
```
  14. Commit your changes and mark your pull request as 'ready for review'. Request a code review from members of the SAF team and edit your code as necessary. Once approved, merged, and released, your mapper will be callable using the SAF CLI.

 

Mapper Creation Skeletons 

  • Skeleton for a general file-based *-to-HDF mapper:
import {ExecJSON} from 'inspecjs';
import _ from 'lodash';
import {version as HeimdallToolsVersion} from '../package.json';
import {
  BaseConverter,
  ILookupPath,
  impactMapping,
  MappedTransform
} from './base-converter';

const IMPACT_MAPPING: Map<string, number> = new Map([
  ['critical', 0.9],
  ['high', 0.7],
  ['medium', 0.5],
  ['low', 0.3]
]);

export class SkeletonMapper extends BaseConverter {
  withRaw: boolean;

  mappings: MappedTransform<
    ExecJSON.Execution & {passthrough: unknown},
    ILookupPath
  > = {
    platform: {
      name: 'Heimdall Tools',
      release: HeimdallToolsVersion,
      target_id: null  //Insert data
    },
    version: HeimdallToolsVersion,
    statistics: {
      duration: null  //Insert data
    },
    profiles: [
      {
        name: '',              //Insert data
        title: null,           //Insert data
        maintainer: null,      //Insert data
        summary: null,         //Insert data
        license: null,         //Insert data
        copyright: null,       //Insert data
        copyright_email: null, //Insert data
        supports: [],          //Insert data
        attributes: [],        //Insert data
        depends: [],           //Insert data
        groups: [],            //Insert data
        status: 'loaded',      //Insert data
        controls: [
          {
            key: 'id',
            tags: {},             //Insert data
            descriptions: [],     //Insert data
            refs: [],             //Insert data
            source_location: {},  //Insert data
            title: null,          //Insert data
            id: '',               //Insert data
            desc: null,           //Insert data
            impact: 0,            //Insert data
            code: null,           //Insert data
            results: [
              {
                status: ExecJSON.ControlResultStatus.Failed,  //Insert data
                code_desc: '',                                //Insert data
                message: null,                                //Insert data
                run_time: null,                               //Insert data
                start_time: ''                                //Insert data
              }
            ]
          }
        ],
        sha256: ''
      }
    ],
    passthrough: {
      transformer: (data: Record<string, any>): Record<string, unknown> => {
        return {
          auxiliary_data: [{name: '', data: _.omit([])}],  //Insert service name and mapped fields to be removed
          ...(this.withRaw && {raw: data})
        };
      }
    }
  };
  constructor(exportJson: string, withRaw = false) {
    super(JSON.parse(exportJson), true);
    this.withRaw = withRaw;
  }
}
  • Skeleton for a general test for a *-to-HDF mapper in HDF Converters:
import fs from 'fs';
import {SkeletonMapper} from '../../../src/skeleton-mapper';
import {omitVersions} from '../../utils';

describe('skeleton_mapper', () => {
  it('Successfully converts Skeleton targeted at a local/cloned repository data', () => {
    const mapper = new SkeletonMapper(
      fs.readFileSync(
        'sample_jsons/skeleton_mapper/sample_input_report/skeleton.json',
        {encoding: 'utf-8'}
      )
    );

    // fs.writeFileSync(
    //   'sample_jsons/skeleton_mapper/skeleton-hdf.json',
    //   JSON.stringify(mapper.toHdf(), null, 2)
    // );

    expect(omitVersions(mapper.toHdf())).toEqual(
      omitVersions(
        JSON.parse(
          fs.readFileSync(
            'sample_jsons/skeleton_mapper/skeleton-hdf.json',
            {
              encoding: 'utf-8'
            }
          )
        )
      )
    );
  });
});

describe('skeleton_mapper_withraw', () => {
  it('Successfully converts withraw flagged Skeleton targeted at a local/cloned repository data', () => {
    const mapper = new SkeletonMapper(
      fs.readFileSync(
        'sample_jsons/skeleton_mapper/sample_input_report/skeleton.json',
        {encoding: 'utf-8'}
      ),
      true
    );

    // fs.writeFileSync(
    //   'sample_jsons/skeleton_mapper/skeleton-hdf-withraw.json',
    //   JSON.stringify(mapper.toHdf(), null, 2)
    // );

    expect(omitVersions(mapper.toHdf())).toEqual(
      omitVersions(
        JSON.parse(
          fs.readFileSync(
            'sample_jsons/skeleton_mapper/skeleton-hdf-withraw.json',
            {
              encoding: 'utf-8'
            }
          )
        )
      )
    );
  });
});
  • Skeleton for SAF CLI mapper conversion integration:
import {Command, Flags} from '@oclif/core'
import fs from 'fs'
import {SkeletonMapper as Mapper} from '@mitre/hdf-converters'
import {checkSuffix} from '../../utils/global'

export default class Skeleton2HDF extends Command {
  static usage = 'convert skeleton2hdf -i <skeleton-json> -o <hdf-scan-results-json>'

  static description = 'Translate a Skeleton output file into an HDF results set'

  static examples = ['saf convert skeleton2hdf -i skeleton.json -o output-hdf-name.json']

  static flags = {
    help: Flags.help({char: 'h'}),
    input: Flags.string({char: 'i', required: true, description: 'Input Skeleton file'}),
    output: Flags.string({char: 'o', required: true, description: 'Output HDF file'}),
    'with-raw': Flags.boolean({char: 'w', required: false}),
  }

  async run() {
    const {flags} = await this.parse(Skeleton2HDF)
    const input = fs.readFileSync(flags.input, 'utf8')

    const converter = new Mapper(input, flags['with-raw'])
    fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
  }
}
  • Skeleton for a convert command test for the SAF CLI:
import {expect, test} from '@oclif/test'
import tmp from 'tmp'
import path from 'path'
import fs from 'fs'
import {omitHDFChangingFields} from '../utils'

describe('Test skeleton', () => {
  const tmpobj = tmp.dirSync({unsafeCleanup: true})

  test
  .stdout()
  .command([
    'convert skeleton2hdf',
    '-i',
    path.resolve(
      './test/sample_data/skeleton/sample_input_report/skeleton_sample.json',
    ),
    '-o',
    `${tmpobj.name}/skeletontest.json`,
  ])
  .it('hdf-converter output test', () => {
    const converted = JSON.parse(
      fs.readFileSync(`${tmpobj.name}/skeletontest.json`, 'utf8'),
    )
    const sample = JSON.parse(
      fs.readFileSync(
        path.resolve('./test/sample_data/skeleton/skeleton-hdf.json'),
        'utf8',
      ),
    )
    expect(omitHDFChangingFields(converted)).to.eql(
      omitHDFChangingFields(sample),
    )
  })
})

describe('Test skeleton withraw flag', () => {
  const tmpobj = tmp.dirSync({unsafeCleanup: true})

  test
  .stdout()
  .command([
    'convert skeleton2hdf',
    '-i',
    path.resolve(
      './test/sample_data/skeleton/sample_input_report/skeleton_sample.json',
    ),
    '-o',
    `${tmpobj.name}/skeletontest.json`,
    '-w',
  ])
  .it('hdf-converter withraw output test', () => {
    const converted = JSON.parse(
      fs.readFileSync(`${tmpobj.name}/skeletontest.json`, 'utf8'),
    )
    const sample = JSON.parse(
      fs.readFileSync(
        path.resolve('./test/sample_data/skeleton/skeleton-hdf-withraw.json'),
        'utf8',
      ),
    )
    expect(omitHDFChangingFields(converted)).to.eql(
      omitHDFChangingFields(sample),
    )
  })
})
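The IMPACT_MAPPING constant in the file-based skeleton is consumed by the impactMapping helper from base-converter. Conceptually it behaves like the sketch below (the toImpact function is a hypothetical stand-in, not the actual base-converter API):

```typescript
// Sketch of severity-to-impact lookup, as IMPACT_MAPPING is used in the
// skeleton: normalize the severity string and fall back to 0 if unknown.
const IMPACT_MAPPING: Map<string, number> = new Map([
  ['critical', 0.9],
  ['high', 0.7],
  ['medium', 0.5],
  ['low', 0.3]
]);

function toImpact(severity: string): number {
  return IMPACT_MAPPING.get(severity.toLowerCase()) ?? 0;
}

console.log(toImpact('HIGH')); // 0.7
```

Adjusting the map's keys to match your service's severity labels is usually all that is needed for the impact attribute.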

 

Best Practices 

This section is a work in progress.

 

*-to-HDF Mapper Construction Example 

The following is an example of an implemented HDF mapper for Twistlock, created using a high-level Twistlock-to-HDF mapping and tools from base-converter.

const IMPACT_MAPPING: Map<string, number> = new Map([
  ['critical', 0.9],
  ['important', 0.9],
  ['high', 0.7],
  ['medium', 0.5],
  ['moderate', 0.5],
  ['low', 0.3]
]);

export class TwistlockMapper extends BaseConverter {
  withRaw: boolean;

  mappings: MappedTransform<
    ExecJSON.Execution & {passthrough: unknown},
    ILookupPath
  > = {
    platform: {
      name: 'Heimdall Tools',
      release: HeimdallToolsVersion,
      target_id: {path: 'results[0].name'}
    },
    version: HeimdallToolsVersion,
    statistics: {},
    profiles: [
      {
        path: 'results',
        name: 'Twistlock Scan',
        title: {
          transformer: (data: Record<string, unknown>): string => {
            const projectArr = _.has(data, 'collections')
              ? _.get(data, 'collections')
              : 'N/A';
            const projectName = Array.isArray(projectArr)
              ? projectArr.join(' / ')
              : projectArr;
            return `Twistlock Project: ${projectName}`;
          }
        },
        summary: {
          transformer: (data: Record<string, unknown>): string => {
            const vulnerabilityTotal = _.has(data, 'vulnerabilityDistribution')
              ? `${JSON.stringify(
                  _.get(data, 'vulnerabilityDistribution.total')
                )}`
              : 'N/A';
            const complianceTotal = _.has(data, 'complianceDistribution')
              ? `${JSON.stringify(_.get(data, 'complianceDistribution.total'))}`
              : 'N/A';
            return `Package Vulnerability Summary: ${vulnerabilityTotal} Application Compliance Issue Total: ${complianceTotal}`;
          }
        },
        supports: [],
        attributes: [],
        groups: [],
        status: 'loaded',
        controls: [
          {
            path: 'vulnerabilities',
            key: 'id',
            tags: {
              nist: ['SI-2', 'RA-5'],
              cci: ['CCI-002605', 'CCI-001643'],
              cveid: {path: 'id'}
            },
            refs: [],
            source_location: {},
            title: {path: 'id'},
            id: {path: 'id'},
            desc: {path: 'description'},
            impact: {
              path: 'severity',
              transformer: impactMapping(IMPACT_MAPPING)
            },
            code: {
              transformer: (vulnerability: Record<string, unknown>): string => {
                return JSON.stringify(vulnerability, null, 2);
              }
            },
            results: [
              {
                status: ExecJSON.ControlResultStatus.Failed,
                code_desc: {
                  transformer: (data: Record<string, unknown>): string => {
                    const packageName = _.has(data, 'packageName')
                      ? `${JSON.stringify(_.get(data, 'packageName'))}`
                      : 'N/A';
                    const impactedVersions = _.has(data, 'impactedVersions')
                      ? `${JSON.stringify(_.get(data, 'impactedVersions'))}`
                      : 'N/A';
                    return `Package ${packageName} should be updated to latest version above impacted versions ${impactedVersions}`;
                  }
                },
                message: {
                  transformer: (data: Record<string, unknown>): string => {
                    const packageName = _.has(data, 'packageName')
                      ? `${JSON.stringify(_.get(data, 'packageName'))}`
                      : 'N/A';
                    const packageVersion = _.has(data, 'packageVersion')
                      ? `${JSON.stringify(_.get(data, 'packageVersion'))}`
                      : 'N/A';
                    return `Expected latest version of ${packageName}\nDetected vulnerable version ${packageVersion} of ${packageName}`;
                  }
                },
                start_time: {path: 'discoveredDate'}
              }
            ]
          }
        ],
        sha256: ''
      }
    ],
    passthrough: {
      transformer: (data: Record<string, unknown>): Record<string, unknown> => {
        let resultsData = _.get(data, 'results');
        if (Array.isArray(resultsData)) {
          resultsData = resultsData.map((result: Record<string, unknown>) =>
            _.omit(result, [
              'name',
              'collections',
              'complianceDistribution',
              'vulnerabilities',
              'vulnerabilityDistribution'
            ])
          );
        }
        return {
          auxiliary_data: [
            {
              name: 'Twistlock',
              data: {
                results: resultsData,
                consoleURL: _.get(data, 'consoleURL')
              }
            }
          ],
          ...(this.withRaw && {raw: data})
        };
      }
    }
  };
  constructor(twistlockJson: string, withRaw = false) {
    super(JSON.parse(twistlockJson), true);
    this.withRaw = withRaw;
  }
}
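The transformers in this mapper are plain functions over the parsed export, so their behavior can be previewed in isolation. The snippet below restates the title transformer's logic as a standalone function for illustration (simplified to avoid the lodash dependency):

```typescript
// Standalone restatement of the Twistlock title transformer: join the
// scan's collections into a display name, or fall back to 'N/A'.
function projectTitle(data: Record<string, unknown>): string {
  const collections = data['collections'];
  const projectName = Array.isArray(collections)
    ? collections.join(' / ')
    : 'N/A';
  return `Twistlock Project: ${projectName}`;
}

console.log(projectTitle({collections: ['host', 'container']}));
// Twistlock Project: host / container
console.log(projectTitle({})); // Twistlock Project: N/A
```

Testing transformers this way, against small hand-written inputs, is a quick sanity check before running the full regression suite.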

Copyright © 1997-2026, The MITRE Corporation. All rights reserved.

MITRE is a registered trademark of The MITRE Corporation. Material on this site may be copied and distributed with permission only.