- Introduction
- Single Application Test
- Application Test report
- Multiple Application Tests
- PDF report breakdown
- PDF report samples
Introduction
You can view the results of individual Application Tests and compare up to five Application Test instances. The Application Test Results page provides a comprehensive overview of measurements and detailed events for each Test. You can also download both single and multiple Test results as PDF reports for easy sharing and analysis. See the instructions below for more details.
Single Application Test
When an Application Test has completed, you can review its results. To do this, in the Login Enterprise sidebar, navigate to Results > Application testing.
The Overview page displays a list of previously executed Application Tests. The columns show the following information:
- Time - Displays the date and time of the Test.
- Name - Displays the name and sequence number of the Test.
- App. Failures - Shows the number of applications that failed relative to the total number of applications tested.
- App. Perf - Indicates how many measurements were out of bounds compared to the configured thresholds. The number on the right shows the total number of configured measurements.
- Comment - Displays the configured comment for the Test.
To view a summary of the Test, click the arrow button next to each Test result to expand it.
You can view the details of the events that took place during the Test. Clicking View details opens a pop-up with more information.
For Application errors, you can click to expand the details, click the download button to download the log file, or click the camera icon to open the screenshot, if applicable.
Results page
To view the Test results for a single Application Test, select a Test and click View. The Results overview page will display the Platform summary and Application summary for the selected Test.
You can also compare Application Tests by selecting checkboxes for multiple Tests. For details, see Comparing Multiple Application Tests.
Test summary
This section shows the Test summary with the Test information described above (time, name, failures, performance, and comment).
Overview section
Platform summary
- Logon performance
- Actual: The actual time it took to log in.
- Threshold: The login time threshold set in the Test.
- Execution: Green if the login was successful, regardless of the time taken.
- Performance: Green if the actual login time was below the threshold or if no threshold was set; red if the actual time exceeded the threshold.
For more information on Logon performance, see Configuring Logon Components.
- Latency
- Actual: The actual latency in milliseconds (ms).
- Threshold: The latency threshold set in the Test, in milliseconds (ms).
- Execution: Green if latency was measured; red if no latency could be measured.
- Performance: Green if the actual latency was below the threshold or if no threshold was set; red if the actual latency exceeded the threshold.
To learn more about Latency, see Monitoring Latency.
Application summary
- Application(s)
- Execution: Green if the script was executed successfully; red if the script failed during execution.
- Performance: Green if all measurements/timers in the script were below the threshold or if no threshold was set; red if one or more measurements exceeded the threshold.
- Screenshot: If a screenshot was taken, click the camera icon to open a pop-up displaying it. If multiple screenshots are available, arrows appear in the pop-up to navigate through them. A download button is also available to save the screenshot to your computer.
- Measurements (Timers)
- Actual: The actual time recorded for the measurement.
- Threshold: The threshold set in the Test for the specific measurement.
- Status: The status indicator is green if the actual time is below the threshold or if no threshold is set; it is red if the actual time exceeds the threshold.
By default, the measurements (timers) include the app start time, which is automatically recorded when the START function in the workload scripts is executed. This function is used by default to launch target apps. Additionally, any custom timers defined in the workload scripts will also be displayed in the UI.
To learn how to configure custom timers, see the Scripting Functions Overview.
You can also retrieve Application Testing results through the API. For information, see Using the Public API.
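As a minimal sketch, an authenticated Public API request could be built as shown below. Note that the base URL, the API token, the bearer-token authentication scheme, and the results endpoint path in the usage comment are all illustrative assumptions, not confirmed API details; consult the Public API documentation for the exact values in your product version.

```python
# Sketch of an authenticated call to the Login Enterprise Public API.
# NOTE: the base URL, token, and the endpoint path in the usage comment
# are illustrative assumptions -- check the Public API documentation for
# the real endpoint names in your product version.
import urllib.request


def build_request(base_url: str, path: str, token: str) -> urllib.request.Request:
    """Build a GET request with the API token in the Authorization header."""
    url = base_url.rstrip("/") + path
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )


# Usage (hypothetical endpoint path):
# req = build_request("https://myle.example.com", "/publicApi/v8-preview/test-runs", "MY_TOKEN")
# with urllib.request.urlopen(req) as resp:
#     results = resp.read()
```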
Hiding, showing, and sorting columns
You can customize your Application Test results table by adding, hiding, or sorting columns. Here's how to manage your columns effectively:
Configuring your columns
1. Configuration icon: In the top right corner of the Application Test results table, you'll find a configuration icon. Click it to open a pop-up for column management.
2. Adding columns: You can add additional columns to your table. The following columns are available:
- App Failures
- App Performance
- Comment
- Test duration
- Connector
3. Hiding columns: If there are columns you don’t want to see, you can hide them using the configuration options.
4. Sorting columns: To change the column order, drag and drop columns into your preferred position.
Notes and considerations
- The Time and Test Name columns are fixed and cannot be hidden or moved.
- Your configuration settings are saved in your local storage. This means your column preferences will persist when you navigate away from the page or log out and back in.
- The configuration may reset if you use a different browser or clear your browser's stored site data.
Application Test report
Generated automatically
An Application Test PDF report with the results of a single Test is automatically generated once the Test is finished. You can download the PDF report by clicking the corresponding Download button next to the Test.
This is the PDF report for which you can set email notifications. For more information, see the Report settings.
Generated manually
You can also generate a new single Application Test PDF report manually or using the Public API. To generate a PDF report in the Login Enterprise UI:
1. In the Application Testing results, click “>” (right arrow) next to the Test you’re interested in.
2. Click Generate PDF report (a browser pop-up will open).
3. In the browser pop-up, click Save to save the PDF report.
Alternatively, select the checkbox next to the specific Test, and in the top menu toolbar, click Generate PDF report.
Email notifications for a single Application Test report generated manually aren’t supported.
Multiple Application Tests
You can select up to five Application Tests to compare.
Options for comparison
- Select a Baseline Test: Choose a baseline Test to which the other Application Tests will be compared.
- Add/Remove Tests from comparison: Use the checkboxes to add or remove Application Tests from the comparison.
Results overview
The results are divided into two parts, just like for a single Application Test:
- Platform summary: This section displays the login time and latency, along with the percentage difference between the compared Tests. For details, see the Platform summary.
- Application summary: This section compares the results of the Application scripts. For details, see the Application summary.
Notes and considerations
- The results of the baseline Test are pinned in the first column. The other Test columns are sorted from oldest to newest.
- If the results do not fit the screen, scrollbars will appear in the table. The baseline Test column remains pinned and does not move with the scrollbar.
- The results of the Baseline Test are the primary reference. All Applications and timers from the baseline are displayed. If a selected Test has additional Applications or timers not present in the Baseline Test, those will only be shown when that Test is designated as the baseline.
Generating an Application Test report
You can generate and download a PDF report with the results comparing up to five Application Tests. In the Login Enterprise UI, you can do this in one of the following ways:
a. Using the toolbar on the Application Test results page.
b. Using the Generate PDF report button on the Compare page.
Please note that an Application Test report comparing multiple Tests is not generated automatically. You can generate the report manually using either the UI, as described earlier, or the Public API. Also, email notifications for this type of report aren’t supported.
Downloading a PDF report via the Public API
You can download a single Application Test report and an Application Test report comparing multiple Tests via the Public API using the following endpoint: /publicApi/v8-preview/reports/{reportId}/pdf. For more information, see Accessing the Public API.
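The endpoint above can be called from a small script. The following Python sketch uses the endpoint path from this article; the base URL, the bearer-token authentication scheme, and the report ID in the usage comment are placeholder assumptions you must replace with your environment's values.

```python
# Sketch: downloading a generated PDF report via the Public API endpoint
# /publicApi/v8-preview/reports/{reportId}/pdf (path taken from this article).
# The base URL and authentication scheme shown here are assumptions;
# replace them with the values used by your Login Enterprise environment.
import urllib.request


def report_pdf_url(base_url: str, report_id: str) -> str:
    """Build the PDF download URL for a given report ID."""
    return base_url.rstrip("/") + f"/publicApi/v8-preview/reports/{report_id}/pdf"


def download_report(base_url: str, report_id: str, token: str, dest_path: str) -> None:
    """Fetch the PDF report and write it to dest_path."""
    req = urllib.request.Request(
        report_pdf_url(base_url, report_id),
        headers={"Authorization": f"Bearer {token}", "Accept": "application/pdf"},
    )
    with urllib.request.urlopen(req) as resp, open(dest_path, "wb") as f:
        f.write(resp.read())


# Usage (placeholder values):
# download_report("https://myle.example.com", "<reportId>", "MY_TOKEN",
#                 "application-test-report.pdf")
```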
PDF report breakdown
The PDF report summarizes the findings obtained from the Application Testing process. It provides an overview of the Test specifics, results, and measurements. The report is divided into the following sections:
1. Introduction
This section provides an overview of Application testing and the metrics used during this type of testing.
2. Test Specifics
This section provides an overview of the key details of the test setup:
- Product version: The Virtual Appliance version.
- Connector: The specific Connector used for the Test.
- Launcher group(s): The Group(s) of Launchers involved in the Test.
- Test duration: The total duration of the Test.
- Date: The date on which the Test was conducted.
- Workload: The type of Workload or Test scenario used.
3. Test Summary
This section highlights key metrics for the Test:
- Application failure: Displays the number of failed Applications (for example, a script that failed to run).
- Application performance: This shows the number of Applications where the actual time exceeded the performance threshold.
4. Platform Summary
This section provides insights into the platform's performance based on latency and login performance:
- Login performance: Indicates the performance during login operations.
- Latency: This shows any latency issues that occurred during the Test.
Both Login Performance and Latency include the following:
- Test name: The name of the Test conducted.
- Bar chart: A visual representation of the performance data.
- Actual time: The time taken for the specific Test.
- Threshold: The predefined threshold, if applicable.
- Execution: The result of the Test execution.
- Performance: The performance analysis based on the Test outcome.
5. Application Results
This section dives deeper into the results for each Application with the following subsections:
- Summary: A high-level result of the Application's performance during the Test.
- Measurements: Includes a table for each measurement or timer taken during the Test. Each table will include:
- Test name: The name of the Test.
- Bar chart: A visual representation of the performance data.
- Actual time: The actual time recorded for the measurement.
- Threshold: The threshold value (if applicable).
- Status: The status of the Test.
- Screenshots: This section includes the screenshots taken of a particular Application under Test.
PDF report samples
As an example, see the attached Application Test report samples for single and multiple Application Test results.