#Bentley Cloud

#Introduction

Open iT supports Bentley Cloud usage reporting. The following sections guide you through the required configuration for license manager utility polling and log file parsing.

#Configuring License Manager Utility Polling

Open iT polls the license servers at regular intervals to get the current status of its license use and availability. This data collection utilizes the license server's built-in license administration utility, which collects and processes license statuses for reporting.

For this collection, the data source is through an API. The data collector/preprocessor initiates the license status utility every hour using a 5-minute sample interval, triggering the data collection process. The license status utility requests the current license usage data from the license manager portal. After the license manager portal provides the requested data, the license status utility passes this information to the data collector/preprocessor. The data collector/preprocessor processes the data, preparing it for transmission. Finally, the preprocessed data is sent to the Core Server every night, according to the client's timezone, for further storage, completing the license usage data collection and processing.

License Manager Utility Polling Workflow through API
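
The hourly aggregation described above can be sketched in Python. This is an illustration only, not part of the Open iT toolchain: `max_in_use_per_hour` and the sample values are hypothetical, showing how 5-minute in-use samples roll up into per-hour maxima.

```python
from collections import defaultdict
from datetime import datetime

def max_in_use_per_hour(samples):
    """Aggregate 5-minute (timestamp, licenses_in_use) samples into hourly maxima."""
    hourly = defaultdict(int)
    for ts, in_use in samples:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        hourly[hour] = max(hourly[hour], in_use)
    return dict(hourly)

# Four made-up samples spanning two hours:
samples = [
    (datetime(2024, 6, 1, 9, 0), 3),
    (datetime(2024, 6, 1, 9, 5), 7),
    (datetime(2024, 6, 1, 9, 55), 5),
    (datetime(2024, 6, 1, 10, 0), 2),
]
hourly_max = max_in_use_per_hour(samples)
```

The per-hour maxima are what a "max in use" report like the one in the Sample Reports section would plot.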

This will produce the following aggregated data types used for reporting in Power BI:

#Requirements

  • An Open iT Client connected to an Open iT Server or a coexistent Open iT setup
  • Activated LicenseAnalyzer collection and license poller
  • Non-SSO Bentley Cloud account with Administrator access to the Bentley Portal
  • Chromium must not be blocked or restricted by endpoint security policies, firewalls, or application whitelisting tools in the company’s environment

#Initializing Bentley Cloud Data Collection

Before configuring data collection, follow the steps below:

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Run the command:

    Command Syntax
    openit_bentleycloud -c <bentley_admin_email> <bentley_admin_password> -s

    where:

    • -c <bentley_admin_email> <bentley_admin_password> - Use this to specify the credentials to access the Bentley portal.
    • -s - Use this to save the credentials.
    Parameters for Initializing Bentley Cloud Data Collection

    Example
    openit_bentleycloud -c jsmith@email.com bentleyAdm!n123 -s

    This example initializes Bentley Cloud data collection using the administrator account jsmith@email.com with password bentleyAdm!n123.

    Optional Parameters
    • -h - Use this to display the help message.
    • -d - Use this to turn on debug logging.

#Configuring Data Collection

These are the required steps to activate and configure collection of Bentley Cloud usage data.

  1. Go to the Components directory, which is by default in C:\Program Files\OpeniT\Core\Configuration\Components, and back up the licpoll.xml configuration file.

  2. Open a command prompt with Administrator level privileges.

  3. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  4. Once in the directory, activate the collection of Bentley Cloud data by running the command:

    Command Syntax
    openit_confinit -c "licpoll.license-types.genericlicense-bentleycloud-sample.active=true"
  5. Set any preferred argument for the openit_bentleycloud binary by running the command:

    Command Syntax
    openit_confinit -c "licpoll.license-types.genericlicense-bentleycloud-sample.status-command.arguments=<argument>"
    Optional Parameters
    • -t "<TEMPDIR>" - Specifies where the program will store temporary files, where <TEMPDIR> is the path to the temporary directory; the default path is C:\ProgramData\OpeniT\Data\temp\Bentley.
    • -o "<OUTDIR>" - Specifies where the program will save its output, where <OUTDIR> is the path to the output directory; the default path is C:\ProgramData\OpeniT\Data\temp\Bentley\out.
    • -ci <COUNTRYISO> - Lists the country ISO code(s) where the collection will be made, where <COUNTRYISO> is the ISO code per country.
    Example
    openit_confinit -c "licpoll.license-types.genericlicense-bentleycloud-sample.status-command.arguments=-t C:\ProgramData\OpeniT\Data\temp\Bentley"
  6. Run the following command to update the configuration files:

    Command Syntax
    openit_confbuilder --client

    Make sure no errors are encountered.

Advanced Configuration

Refer to the Bentley Cloud Data Collection Configuration table to learn more about Bentley Cloud configuration in licpoll.xml.

  • active - Boolean (true or false). Setting this to true activates Bentley Cloud usage data collection.
  • type - String (i.e., GenericLicense). The license manager type.
  • interval - Timespan (e.g., P30S, P5M, P1H). The span of time between each polling round (it is recommended to set a value no less than P1M).
  • offset - Timespan (e.g., P30S, P5M, P1H). The span of time the aligned poll time decided by interval is shifted.
  • product-name - String (e.g., server;daemon). This object is defined if a vendor license name other than the default GenericLicense=%hosttype% will be used.
  • license-server - String (e.g., hou105lin). The Bentley Cloud License Server name.
  • status-command - FileName (e.g., ${OpeniT.directories.bin}/openit_bentleycloud.exe). The binary used to obtain status from the license manager.
  • status-command.arguments - String (i.e., -a). The arguments used for the status command.
Bentley Cloud Data Collection Configuration
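
The interval, offset, end-timeout, and quarantine values all use the compact timespan notation shown in the examples (P30S, P5M, P1H). As a minimal sketch, assuming S, M, and H stand for seconds, minutes, and hours as the examples suggest, such a value could be parsed like this (`parse_timespan` is a hypothetical helper, not an Open iT binary):

```python
import re
from datetime import timedelta

# Assumed unit semantics, based on the table's examples (P30S, P5M, P1H):
_UNITS = {"S": "seconds", "M": "minutes", "H": "hours"}

def parse_timespan(value):
    """Parse a licpoll-style timespan such as 'P5M' into a timedelta."""
    m = re.fullmatch(r"P(\d+)([SMH])", value)
    if not m:
        raise ValueError(f"unrecognized timespan: {value!r}")
    amount, unit = int(m.group(1)), m.group(2)
    return timedelta(**{_UNITS[unit]: amount})
```

For example, `parse_timespan("P5M")` yields a five-minute timedelta, matching the 5-minute sample interval described earlier.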

#Verifying Data Collection

After configuration, you can verify that the data is collected by following these steps:

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Run the command:

    Command Syntax
    openit_licpoll -# 1
  4. Verify that the temp directory, which is by default in C:\ProgramData\OpeniT\Data\temp, contains a LicPoll directory containing .data and status-*.log files.
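
The check in step 4 can also be scripted. This is a hypothetical helper mirroring the manual verification, not an Open iT tool; the default path is the one given above:

```python
from pathlib import Path

def licpoll_output_present(temp_dir=r"C:\ProgramData\OpeniT\Data\temp"):
    """Return True if temp_dir\LicPoll holds both .data and status-*.log files."""
    licpoll = Path(temp_dir) / "LicPoll"
    has_data = any(licpoll.glob("*.data"))
    has_status = any(licpoll.glob("status-*.log"))
    return has_data and has_status
```

If the function returns False, re-check the activation steps above before moving on.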

#Next Step?

Continue reading if you need to configure log file parsing. If not, proceed to Power BI Connection Configuration.

#Configuring Log File Parsing

Open iT collects log files and converts them to Open iT format.

For this collection, the data collector initiates the raw data collector to request license usage history logs from the license manager every 5 minutes, triggering the data collection process. Once the logs are received, the raw data collector passes them to the data collector. The collected data is then sent to a data preprocessor for processing. After preprocessing, the final preprocessed data is stored in the Core Server.

Log File Parsing Workflow with Raw Collector

This will produce the following aggregated data types used for historical reporting:

#Requirements

  • An Open iT Client connected to an Open iT Server or a coexistent Open iT setup
  • Non-SSO Bentley Cloud account with Administrator access to the Bentley Portal

#Setting up Bentley Cloud Data Collection

These are the required steps to set up data collection from the Bentley portal.

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Run the command:

    Command Syntax
    openit_bentleycloudstat init --username <bentley_username> --password <bentley_password>

    where:

    • --username <bentley_username> - Use this to specify the username for accessing the Bentley portal.
    • --password <bentley_password> - Use this to specify the password for accessing the Bentley portal.
    Parameters for Setting Up Bentley Cloud Data Collection

    Optional Parameters
    • --use-e365-usage-page - Use this to export e365 data from the Bentley portal's details page.
    • --add-date-format "<date_format>" - Use this to add date formats to accept. The supported date formats are:
      • %Y-%m-%d %H:%M:%S.%f
      • %Y-%m-%dT%H:%M:%S
      • %Y-%m-%d %H:%M:%S
      • %m/%d/%Y %I:%M:%S %p
      • %Y-%m-%d
      • %d/%m/%Y %H:%M:%S
      • %d/%m/%Y
      where:
      • %Y - Year
      • %m - Month
      • %d - Day
      • T - ISO 8601 separator between the date and the time
      • %H - Hour (00-23, 24-hour clock)
      • %I - Hour (01-12, 12-hour clock)
      • %M - Minute
      • %S - Second
      • %f - Microsecond
      • %p - AM/PM indicator
    • --debug - Use this to turn on debug logging.
    • -h, --help - Use this to display the help message.

    Example
    openit_bentleycloudstat init --username jsmith@email.com --password bentleyAdm!n123

    This example sets up Bentley Cloud data collection using the administrator account jsmith@email.com with password bentleyAdm!n123.

    Subcommands
    • export - use this to export Bentley Cloud data from the portal.

      Example
      openit_bentleycloudstat export <param 1> <param 2> ... <param n>
      • --data <data> - Use this to specify the data to export.
      • --from <YYYY-MM-DD> - Use this to specify the date for which to start collection.
      • --to <YYYY-MM-DD> - Use this to specify the date for which to end collection.
      • --last <Q> - Use this to specify the last period of the data to be collected.
      • --dir <export_dir> - Use this to specify the directory where the exported data will be saved.
      Parameters for Exporting Bentley Cloud Data

    • query - use this to query Bentley Cloud data from the portal.

      Example
      openit_bentleycloudstat query <param 1> ... <param n>
      • --data <data> - Use this to specify the data to query.
      • --output-file <output_file> - Use this to specify the output file.
      Parameters for Querying Bentley Cloud Data

    • parse - use this to parse the exported Bentley Cloud data from the portal.

      Example
      openit_bentleycloudstat parse <param 1> <param 2> ... <param n>
      • --dir <dir> - Use this to specify the directory containing the CSV files to parse.
      • --data <data> - Use this to specify the data to parse.
      • --subscription-file <subscription_file> - Use this to specify the file containing feature details.
      • --target-dir <target_dir> - Use this to specify the directory where the parsed data will be saved.
      Parameters for Parsing Extracted Bentley Cloud Data

    • collect - use this to collect Bentley Cloud data from the portal.

      Example
      openit_bentleycloudstat collect <param 1> <param 2> ... <param n>
      • --data <data> - Use this to specify the data to collect.
      • --parse-data <parse_data> - Use this to specify the file containing feature details.
      • --from <YYYY-MM-DD> - Use this to specify the date for which to start collection.
      • --to <YYYY-MM-DD> - Use this to specify the date for which to end collection.
      • --last <Q> - Use this to specify the last period of the data to be collected.
      • --dir <export_dir> - Use this to specify the directory where the exported data will be saved.
      • --target-dir <target_dir> - Use this to specify the directory where the parsed data will be saved.
      Parameters for Collecting Bentley Cloud Data

    • --use-usage-details-page - use this to export the intervals data from the Bentley portal's details page.

    • recollect - use this to recollect Bentley Cloud data from the portal.

      Example
      openit_bentleycloudstat recollect <param 1> <param 2> ... <param n>
      • --data <data> - Use this to specify the data to recollect.
      • --parse-data <parse_data> - Use this to specify the file containing feature details.
      • --from <YYYY-MM-DD> - Use this to specify the date for which to start collection.
      • --to <YYYY-MM-DD> - Use this to specify the date for which to end collection.
      • --last <Q> - Use this to specify the last period of the data to be collected.
      • --dir <export_dir> - Use this to specify the directory where the exported data will be saved.
      • --target-dir <target_dir> - Use this to specify the directory where the parsed data will be saved.
      Parameters for Recollecting Bentley Cloud Data

    • remove - use this to remove Bentley Cloud data from archive.

      Example
      openit_bentleycloudstat remove <param 1> <param 2> ... <param n>
      • --data <data> - Use this to specify the data to remove.
      • --from <YYYY-MM-DD> - Use this to specify the date for which to start removal.
      • --to <YYYY-MM-DD> - Use this to specify the date for which to end removal.
      • --last <Q> - Use this to specify the last period of the data to be removed.
      Parameters for Removing Bentley Cloud Data

    • cleanup - use this to clean up the exported csv.

      Example
      openit_bentleycloudstat cleanup
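
The subcommands above all take the same --long-flag option shape. As an illustration only, a small Python helper (hypothetical, not shipped with Open iT) could assemble such an argument vector, for example before handing it to subprocess:

```python
def build_bentleycloudstat_cmd(subcommand, **options):
    """Assemble an openit_bentleycloudstat argument vector.

    Keyword names map to --long-flags: underscores become hyphens, and a
    trailing underscore (e.g. from_, since 'from' is a Python keyword) is
    dropped. Pass True for bare flags such as use_e365_usage_page.
    """
    cmd = ["openit_bentleycloudstat", subcommand]
    for name, value in options.items():
        cmd.append("--" + name.rstrip("_").replace("_", "-"))
        if value is not True:  # bare flags carry no value
            cmd.append(str(value))
    return cmd

export_cmd = build_bentleycloudstat_cmd(
    "export", data="intervals", from_="2024-01-01", to="2024-06-01")
```

The helper only builds the command line; it does not run the binary or validate option names against the tool.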

#Initially Collecting and Sending Bentley Cloud Data to the Server

The collect_bentleycloud.bat script automates the initial stage of data handling by collecting data from the specified sources and sending it to the Open iT server.

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Run the command:

    Command Syntax
    collect_bentleycloud.bat --from <YYYY-MM-DD> --to <YYYY-MM-DD>
    Example
    collect_bentleycloud.bat --from 2024-01-01 --to 2024-06-01
    Optional Parameters
    • --data <data> - Use this to specify the data to collect.
    • --from <YYYY-MM-DD> - Use this to specify the date for which to start collection.
    • --to <YYYY-MM-DD> - Use this to specify the date for which to end collection.
    Optional Parameters for Collecting and Sending Bentley Cloud Data to the Server

  4. Verify that there are archiver*.in files created on the server in the archiver directory, which is by default in C:\ProgramData\OpeniT\Data\incoming\archiver.

#Activating Bentley Cloud Data Collection

These are the required steps to activate collection of Bentley Cloud data.

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Once in the directory, activate the collection of Bentley Cloud data by running the command:

    Command Syntax
    openit_oconfinit -u "collect_license_bentleycloud.root.scheduler.jobs.collect_bentleycloud.general.active=true"
Advanced Configuration

The collection runs every midnight by default. To configure the intervals, locate the instances attribute under collect_bentleycloud, collect_bentleycloud_licenselogscsv, parse_bentleycloud_licenselogscsv, transfer_bentleycloud_licenseevents, transfer_bentleycloud_token, or cleanup_bentleycloud in the collect_license_bentleycloud.oconf configuration file and configure the attributes.

Refer to the Bentley Cloud Data Collection Job Scheduler Instances Configuration table to learn the attributes used to configure Bentley Cloud data collection and transfer.

  • max-instances - Uint (e.g., 5, 8, 9). The number of instances allowed to run at the same time.
  • max-handling - String (end-oldest, end-all-old, or end-new). The action done upon reaching the maximum number of instances:
    • end-oldest - Specify this option to stop/kill the oldest instance and start a new one.
    • end-all-old - Specify this option to stop/kill all running instances before starting the new one.
    • end-new - Specify this option to prevent a new instance from starting.
  • end-timeout - Timespan (e.g., P30S, P5M, P1H). The maximum waiting time before terminating a running instance.
  • quarantine - Timespan (e.g., P30S, P5M, P1H). The waiting time before starting a new instance after a previous one.
Bentley Cloud Data Collection Job Scheduler Instances Configuration
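
The three max-handling policies can be sketched as list operations on the currently running instances. This is a hypothetical illustration of the decision logic, not Open iT code:

```python
def apply_max_handling(running, max_instances, policy):
    """Decide what happens when a new job instance wants to start.

    running: ids of running instances, oldest first.
    policy: 'end-oldest', 'end-all-old', or 'end-new'.
    Returns (instances_to_stop, start_new).
    """
    if len(running) < max_instances:
        return [], True               # under the limit: just start it
    if policy == "end-oldest":
        return [running[0]], True     # stop the oldest, start the new one
    if policy == "end-all-old":
        return list(running), True    # stop everything, start the new one
    if policy == "end-new":
        return [], False              # keep running instances, skip the new one
    raise ValueError(f"unknown policy: {policy}")
```

For example, with two instances already running and max-instances = 2, end-oldest stops the first instance and lets the new one start, while end-new leaves both running and suppresses the new instance.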

#Activating Bentley Cloud Data Recollection

Recollecting the Bentley Cloud data is required to ensure that the token-based data is in sync with the recalculated license information from the portal. These are the required steps to activate recollection of Bentley Cloud data.

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Once in the directory, activate the recollection of Bentley Cloud data by running the command:

    Command Syntax
    openit_oconfinit -u "recollect_license_bentleycloud.root.scheduler.jobs.recollect_bentleycloud.general.active=true"
Advanced Configuration

The recollection runs every Saturday by default. The primary data is then transferred to the server according to the client timezone for processing. To configure the intervals, locate the instances attribute under recollect_bentleycloud, transfer_bentleycloud_token_recollect, or cleanup_bentleycloud_recollect in the recollect_license_bentleycloud.oconf configuration file and configure the attributes.

Refer to the Bentley Cloud Data Recollection Job Scheduler Instances Configuration table to learn the attributes used to configure Bentley Cloud data recollection and transfer.

  • max-instances - Uint (e.g., 5, 8, 9). The number of instances allowed to run at the same time.
  • max-handling - String (end-oldest, end-all-old, or end-new). The action done upon reaching the maximum number of instances:
    • end-oldest - Specify this option to stop/kill the oldest instance and start a new one.
    • end-all-old - Specify this option to stop/kill all running instances before starting the new one.
    • end-new - Specify this option to prevent a new instance from starting.
  • end-timeout - Timespan (e.g., P30S, P5M, P1H). The maximum waiting time before terminating a running instance.
  • quarantine - Timespan (e.g., P30S, P5M, P1H). The waiting time before starting a new instance after a previous one.
Bentley Cloud Data Recollection Job Scheduler Instances Configuration

#Verifying Bentley Cloud Data Collection

After configuration, you can verify that the data is collected and sent to the server by following these steps:

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Run the command:

    Command Syntax
    openit_executor -r collect_license_bentleycloud
  4. Verify that there are archiver*.in files created on the server in the archiver directory, which is by default in C:\ProgramData\OpeniT\Data\incoming\archiver.

#Sample Reports

#Max Available vs Max in Use

This sample report compares max in-use licenses against max available licenses.
It offers several key benefits:

  • Optimized License Allocation – helps ensure you are not over-purchasing licenses you don't need or under-provisioning.
  • Cost Savings – identifies opportunities to downgrade or redistribute licenses, reducing unnecessary expenses.
  • Usage Trends & Capacity Planning – shows peak usage patterns, allowing better forecasting for future needs.
  • Avoiding Service Disruptions – helps prevent situations where users cannot access software due to reaching the license limit.
  • Compliance & Audit Readiness – provides a usage record to ensure compliance with vendor agreements and avoid penalties.
  • Performance & Productivity Insights – helps assess whether certain teams or departments are under-utilizing or over-utilizing software.

Max Available vs Max in Use Licenses per Feature

#Token Licenses Used per User

This sample report provides insights into how software tokens are allocated and consumed across an organization.
It offers several key benefits:

  • Optimized License Utilization – helps ensure you efficiently distribute tokens, reducing waste and preventing unnecessary purchases.
  • Cost Management – identifies users or departments consuming the most tokens, allowing better budgeting and cost control.
  • Usage Monitoring & Trends – tracks patterns in token usage to forecast future demand and adjust license purchases accordingly.
  • Fair Resource Allocation – ensures that tokens are available to those who need them most, preventing access issues for critical users.
  • Compliance & Audit Readiness – provides a detailed record of token usage to meet vendor compliance requirements and prepare for audits.
  • Productivity & Performance Insights – reveals whether users are under-utilizing or over-utilizing tokens, which may indicate training needs or inefficiencies.

Token Licenses Used per User

#Troubleshooting

This section lists possible errors that may occur when running openit_bentleycloudstat and the suggested solutions.

#Cannot Log In to the Bentley Cloud Portal

#Problem

This issue may be encountered when the credentials used in setting up the Bentley Cloud data collection are incorrect, showing the error An error occurred: Login failed. Please check your username and password.

#Resolution

Verify that the credentials are correct, then repeat the setup following the instructions in the Setting up Bentley Cloud Data Collection section.

#Cannot Collect Data from the Bentley Cloud Portal

#Problem

This may be encountered when there is an error during log-in, showing the error An error occurred: Waiting for selector "#HFProfileAnchor" failed: timeout 30000ms exceeded. This may be due to a network issue or incorrect credentials used in setting up the Bentley Cloud data collection.

#Resolution

Verify that there are no network issues and that the credentials are correct, then repeat the setup following the instructions in the Setting up Bentley Cloud Data Collection section.

#Cannot Parse Collected Data from the Bentley Cloud Portal

#Problem

This may be encountered when the date format from the Bentley Cloud data does not match any of the following formats:

  • %Y-%m-%d %H:%M:%S.%f
  • %Y-%m-%dT%H:%M:%S
  • %m/%d/%Y %I:%M:%S %p
  • %Y-%m-%d %H:%M:%S
  • %Y-%m-%d
  • %d/%m/%Y %H:%M:%S
  • %d/%m/%Y
  • %Y%m%d

showing an error similar to An error occurred: time data '7/15/2024 12:00:00 AM' does not match format '%Y-%m-%d %H:%M:%S.%f'.

#Resolution

  1. Go to the scheduler directory, which is by default in C:\Program Files\OpeniT\Core\Configuration\scheduler, and open collect_license_bentleycloud.oconf.

  2. Locate root.scheduler.jobs.collect_bentleycloud.operation.arguments and set its value to collect --add-date-format <date_format>.

    Where <date_format> is the date format from the Bentley Cloud data.

    collect_license_bentleycloud.oconf
    48| }
    49| arguments
    50| {
    51|   type=string
    52|   value=collect --add-date-format "%m.%d.%Y %H:%M:%S"
    53| }
  3. Save the changes.
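
The %-directives above follow standard strptime conventions, so the failure and its fix can be reproduced in plain Python. The sample value matches the error quoted in the Problem section; nothing here touches the Open iT configuration itself:

```python
from datetime import datetime

value = "7/15/2024 12:00:00 AM"

# Parsing with a non-matching directive set raises the same kind of error:
try:
    datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f")
    matched_default = True
except ValueError:
    matched_default = False  # time data does not match format

# The matching directive set parses the value cleanly (12:00:00 AM is midnight):
parsed = datetime.strptime(value, "%m/%d/%Y %I:%M:%S %p")
```

Once you have identified the directive set that parses your data, pass it via --add-date-format as shown in the resolution steps.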

#Cannot Find Job in List

#Problem

This issue may arise from unidentified problems within the Bentley portal, resulting in an error message Cannot find job <export_job_id> in the list (e.g., Cannot find job 0000ab12-0000-cd00-e34f-567890g9876hi in the list).

#Resolution

  1. Open a command prompt with Administrator level privileges.

  2. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  3. Set the Bentley Cloud data collection to export the intervals data from the Bentley portal's details page. Run the command:

    Command Syntax
    openit_bentleycloudstat init --username <bentley_username> --password <bentley_password> --use-usage-details-page

    where:

    ParameterDescription
    --username <bentley_username>Use this to specify the username for accessing the Bentley portal.
    --password <bentley_password>Use this to specify the password for accessing the Bentley portal.
    --use-usage-details-pageUse this to export the "intervals" data from the Bentley portal's details page.
    Parameters for Setting Up Bentley Cloud Data Collection

    Example
    openit_bentleycloudstat init --username jsmith@email.com --password bentleyAdm!n123 --use-usage-details-page
  4. Open bentleycloudhistorical.ini in C:\ProgramData\OpeniT and verify that the line use_usage_details_page = True is at the end of the file:

    bentleycloudhistorical.ini
    1| [DEFAULT]
    2| username =
    3| password =
    4| use_usage_details_page = True
  5. Verify that data is collected. Run the command:

    Command Syntax
    openit_executor -r collect_license_bentleycloud
  6. Verify that the temp directory, which is by default in C:\ProgramData\OpeniT\Data\temp, contains a LogFileCollector directory containing raw-bentleycloud-*.parsed and *.status files.

#Column and Data Size Does Not Match

#Problem

This issue may arise when succeeding raw Bentley Cloud log file collector files do not contain the expected columns listed in an existing csv-column-header.temp file in the Bentley Cloud LogParser directory, which is by default in C:/ProgramData/OpeniT/Data/temp/Bentley.Cloud/LogParser. This results in an error message Columns_Not_Found: <function_name>: Column and data size does not match. Please check the raw log. (e.g., Columns_Not_Found: BentleyRcdParser::getCSVDetails: Column and data size does not match. Please check the raw log.).

#Resolution

  1. Check whether the column headers in csv-column-header.temp match the existing raw-*bentleycloud-*.data files in the LogFileCollector directory, which is by default in C:/ProgramData/OpeniT/Data/temp/LogFileCollector.

  2. If the mismatch is confirmed, remove the existing csv-column-header.temp.

  3. Open a command prompt with Administrator level privileges.

  4. Go to the bin directory, which is by default in C:\Program Files\OpeniT\Core\bin, and run the command:

    Command Syntax
    cd $BIN_DIR
    Example
    cd C:\Program Files\OpeniT\Core\bin
  5. Make sure that the existing log parser data are sent to the server by running the following:

    Command Syntax
    openit_apicontroller -t upload_bentleycloud_licenseevents
  6. Run the data parsing job:

    Command Syntax
    openit_logparserbentleycloud.exe --matchobjects "C:/Program Files/OpeniT/Core/configuration/matchobjects-record-bentleycloud.oconf" --srcdir "C:/ProgramData/OpeniT/Data/temp//LogFileCollector" --srcpattern "*-bentleycloud-*.data" --srcfilehandling rename --trgdir "C:/ProgramData/OpeniT/Data/temp//Bentley.Cloud/LogParser" --datatype bentleycloud --module license --resolution PT1H --disable-statlogging --debug warning --no-zero-usage

    Make sure no errors are encountered.

  7. Verify that the column headers in csv-column-header.temp now match the existing raw-*bentleycloud-*.data files in the LogFileCollector directory, which is by default in C:/ProgramData/OpeniT/Data/temp/LogFileCollector.

  8. Run the data transfer and clean-up jobs to ensure structured data:

    Command Syntax
    openit_apicontroller -t upload_bentleycloud_licenseevents
    Command Syntax
    openit_bentleycloudstat cleanup
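
The mismatch check in step 1 amounts to comparing field counts between the saved header and each raw data line. A minimal sketch, with made-up header and row values for illustration:

```python
def column_counts_match(header_line, data_line, sep=","):
    """Compare field counts between a csv-column-header.temp header line
    and a line from a raw-*bentleycloud-*.data file."""
    return len(header_line.split(sep)) == len(data_line.split(sep))

# Hypothetical header and rows (not the real Bentley Cloud schema):
header = "user,feature,tokens,start,end"
good_row = "jsmith,MicroStation,10,2024-07-15 08:00,2024-07-15 09:00"
bad_row = "jsmith,MicroStation,10"
```

A row like bad_row, with fewer fields than the header, is the situation that triggers the Columns_Not_Found error above.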

#Next Steps?

  • Renaming Vendor License
  • Renaming Features
  • Create and Add Report
  • License Monitor
