6.2 System integration
To integrate Code Checker for MATLAB into your (continuous integration) systems, you may want to use the additional capabilities described in this section. The feature to generate a JSON report is not available with a Pro license by default; if you wish to use it, please contact us at info@monkeyproofsolutions.nl. The feature to generate a JSON summary is available with a Pro license.
6.2.1 Generating JSON reports
By adding '-generateJSON' to the inputs of your call to monkeyproof.cc4m.start, a JSON report is generated instead of an HTML report. The versatility of JSON allows you to integrate the results with other tools or programming languages. Together with the command window interface described in Chapter 6.1, your code can be checked automatically (for example using hooks or periodic triggers) and the results can easily be stored and interpreted at a later point in time. This facilitates analyzing your coding rule compliance over time, which can be especially advantageous to your development process when running Code Checker for MATLAB on a designated computer.
The JSON report is saved to the same location as HTML reports would be. This location can be obtained using monkeyproof.cc4m.getReportsFolder() and changed using monkeyproof.cc4m.setReportsFolder(<The/Folder>). The report has extension .json. This option is not available from the graphical user interface.
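In a CI pipeline, the checker is typically invoked headlessly through MATLAB's batch mode. The sketch below builds such a command in Python; the use of `matlab -batch`, the folder argument to monkeyproof.cc4m.start, and the command layout are illustrative assumptions, not a documented interface of the product.

```python
def build_cc4m_command(target_folder):
    """Build a 'matlab -batch' command that runs Code Checker for MATLAB
    with JSON output. The exact arguments passed to start() are
    illustrative; only '-generateJSON' is taken from the manual."""
    matlab_code = "monkeyproof.cc4m.start('{}', '-generateJSON')".format(target_folder)
    return ["matlab", "-batch", matlab_code]

cmd = build_cc4m_command("src")
print(cmd)
```

In a CI script you would then execute the command, for example with `subprocess.run(cmd, check=True)`, and pick up the resulting .json file from the reports folder.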
The elements of the JSON report are as follows:
- MetaData: Global information regarding the run.
  - CheckPrioritiesRun: The priorities of the checks that were run.
  - MatlabPath: Can assume one of several forms:
    - "Custom" indicates that a custom MATLAB path was provided for the Code Checker for MATLAB run.
    - "Local" means that no changes were made to the path for the Code Checker for MATLAB run.
    - "xxxx.txt", where xxxx is the absolute path to a text file, indicates that the MATLAB path was provided using a text file.
  - Version: Code Checker for MATLAB release number.
- MatlabErrors: Information on which files could not be checked due to errors in them.
  - FileName: Path of the file containing the error(s).
  - ErrorMessages: One or more error messages and the location of their source.
    - Msg: The error message as given in the MATLAB editor.
    - LineStart: The line number at which the error is shown.
    - LineEnd: The end line of the cause of the error.
    - ColumnStart: The start column at which the error is shown.
    - ColumnEnd: The end column of the error.
- CheckResults: The results, per checked file.
  - FileName: Path to the file that was checked.
  - Checks: List of checks and their results.
    - Name: Unique name of the check.
    - CheckResult: Status of the check, for example "fail".
    - RuleID: Identifier of the rule.
    - RuleLink: URL of the rule.
    - Priority: Priority of the check as configured. Can be "Recommended", "Strongly recommended", or "Mandatory".
    - SeverityLevel: Severity level of the check as configured. The severity level of a rule can range from 1 to 10; the most important issues are of severity level 1, and higher levels are reserved for more pedantic issues.
    - DefaultConfig: true if the default values of a check or parameter were used due to problems with the configuration. This is the case when a check is configured multiple times, if it is missing from the configuration file, or if a check's parameters are configured more than once or not at all.
    - ReferencedConfig: If applicable, the referenced set of configurations the check configuration came from. This field is absent if the check configuration is defined in the active set of configurations.
    - Results: Information on the possible violations of the check. Per violation:
      - Msg: Description of what went wrong.
      - LineNr: Line number at which the violation was detected.
      - ColNr: Column at which the violation was detected.
      - IsExempt: Whether or not the violation is exempt from the rules.
      - ExemptionReason: The comment on the line of code after the exemption (if any).
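As an illustration of processing the report from another language, the Python sketch below counts the non-exempt violations. The field names follow the list above; the assumption that CheckResults, Checks, and Results are JSON arrays is for illustration only.

```python
def count_open_violations(report):
    """Count non-exempt violations in a parsed Code Checker for MATLAB
    JSON report. Field names follow the manual; the array nesting
    (CheckResults -> Checks -> Results) is an assumption."""
    total = 0
    for file_result in report.get("CheckResults", []):
        for check in file_result.get("Checks", []):
            for violation in check.get("Results", []):
                if not violation.get("IsExempt", False):
                    total += 1
    return total

# Minimal hand-made example in the documented shape:
sample = {
    "CheckResults": [
        {
            "FileName": "foo.m",
            "Checks": [
                {
                    "Name": "exampleCheck",
                    "CheckResult": "fail",
                    "Results": [
                        {"Msg": "issue", "LineNr": 3, "ColNr": 1, "IsExempt": False},
                        {"Msg": "waived", "LineNr": 9, "ColNr": 1, "IsExempt": True,
                         "ExemptionReason": "legacy code"},
                    ],
                }
            ],
        }
    ],
}
print(count_open_violations(sample))  # → 1
```

In practice you would obtain `report` from the generated file, for example with `json.load(open(path_to_report))`.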
6.2.2 Generating JSON summary
By adding '-generateJSONSummary' to the inputs of your call to monkeyproof.cc4m.start, a JSON summary is generated together with the HTML report. The JSON summary can easily be integrated with other tools or programming languages. Together with the command window interface described in Chapter 6.1, your code can be checked automatically (for example using hooks or periodic triggers). This facilitates analyzing your coding guideline compliance over time. If, based on the information in the JSON summary, you want to see all detailed results, you can open the HTML report corresponding to the run.
The JSON summary is saved to the same location as HTML reports. The summary has extension .json. This option is not available from the graphical user interface, but unlike the JSON report, this feature is available with a Pro license by default.
The elements of the JSON summary are as follows:
- MetaData: Global information regarding the run.
  - RunDuration: Message containing the number of violations for X rules in Y files out of Z files, with the elapsed time.
  - TimeOfCheck: Date and time the run was executed.
  - UserName: Name of the user that executed the run.
  - UserComment: A comment that the user added to the report's content (if any), for example the pull request URL related to the run.
  - Settings: Settings for the run, such as: file/folder/project, including subfolders, all files or only local changes, priorities selected, etc.
  - EnvironmentInfo: Information about the environment.
    - CodeCheckerVersion: Code Checker for MATLAB release number.
    - MATLABVersion: MATLAB release number.
    - OperatingSystem: The operating system.
  - RootFolder: Common root of the files checked.
  - ReportsFolder: Folder that contains both the JSON summary and the HTML report.
  - HTMLReport: Link to the HTML report that contains the complete set of results.
  - TrialInfo: If applicable, information about the expiring trial license.
- Results: The results.
  - FilesChecked: Paths to the files that were checked.
  - NrChecksRun: The total number of checks run.
  - NrViolations: The total number of violations.
  - PerCheck: Results per check.
    - RuleID: Rule identifier the check is related to.
    - CheckName: Name of the check.
    - Priority: Priority of the rule.
    - SeverityLevel: Severity level of the rule.
    - NrViolatedFiles: The number of files that violate this rule.
    - NrViolations: The number of violations.
  - PerPriority: Results per priority.
    - Mandatory: The number of violated rules with priority "Mandatory".
    - StronglyRecommended: The number of violated rules with priority "Strongly recommended".
    - Recommended: The number of violated rules with priority "Recommended".
  - FilesWithError: List of files that could not be checked due to errors in them.
- Configuration: Information about the configuration used.
  - Type: The type of the configuration used; Predefined or Custom.
  - Name: The name of the configuration used.
  - NrChecks: The number of enabled checks and reports after applying priority filtering.
  - NrReferencedChecks: The number of check configurations that came from referenced sets of configurations (if any).
  - NrReferencedConfigs: The number of referenced sets of configurations used for the currently active set of check configurations (if any).
  - ReferencedConfigs: A list of referenced sets of configurations used for the currently active set of check configurations (if any).
  - NrCustomChecks: The number of custom checks in the configuration.
  - NrDisabledChecks: The number of disabled checks and reports.
  - NrMissingConfigs: The number of missing check configurations.
  - MissingConfigs: The missing check configurations.
  - NrDuplicateConfigs: The number of duplicate check configurations.
  - DuplicateConfigs: The duplicate check configurations.
  - NrMissingParamConfig: The number of missing parameter configurations.
  - MissingParamConfig: The missing parameter configurations.
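A common CI use of the summary is gating a build on the per-priority counts. The Python sketch below shows one way to do this; the field names follow the list above, but the exact JSON nesting and the gating policy are illustrative assumptions.

```python
def gate_on_priorities(summary, allow_recommended=True):
    """Decide whether a CI run should pass, based on the PerPriority
    counts in a Code Checker for MATLAB JSON summary. Field names follow
    the manual; the nesting (Results -> PerPriority) is an assumption."""
    per_priority = summary["Results"]["PerPriority"]
    if per_priority.get("Mandatory", 0) > 0:
        return False
    if per_priority.get("StronglyRecommended", 0) > 0:
        return False
    if not allow_recommended and per_priority.get("Recommended", 0) > 0:
        return False
    return True

# Hand-made example: only "Recommended" rules were violated.
sample = {"Results": {"PerPriority": {"Mandatory": 0,
                                      "StronglyRecommended": 0,
                                      "Recommended": 2}}}
print(gate_on_priorities(sample))  # → True
```

A CI job could load the summary with `json.load`, call this gate, and exit non-zero on failure; the HTML report linked under MetaData.HTMLReport then provides the detailed results.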