Launchers: Overview and best practices

The role of the Launcher in Login Enterprise

The Login Enterprise Launcher is a host that acts as an endpoint, relaying communication between the Login Enterprise virtual appliance running a test and the targets (VMware, Citrix, etc.). We support Windows and Linux (including IGEL, which runs Linux).

To learn where the Launchers architecturally reside in the Login Enterprise setup, see the Architecture overview.

What Launchers do and how they do it

During a Login Enterprise test, the test configuration directs the Launcher to establish a connection to a designated target environment. The Launcher receives instructions on what to invoke, including which credentials (test user/password) to use for connecting to the specified resource through a broker. The Launcher then displays a window, commonly referred to as a console, showing the appearance and activity of the target desktop or published application throughout the test. A Launcher can initiate and manage up to 30 concurrent target test sessions. When the test users complete their defined tests and log off, the console windows associated with the targets close automatically, terminating the connections.

Why does Login Enterprise support different Launchers?

Login Enterprise performs equally well regardless of whether the Launcher software is installed on Windows or Linux machines; there's no discernible advantage between the two platforms.

While Microsoft Windows remains the preferred choice among our customers, it typically entails higher licensing costs compared to Linux. Linux often provides free or more affordable licensing options, and some of our clients either have existing infrastructure tailored for Linux or simply prefer Linux endpoint hosts due to experience, contractual agreements, or security policy.

Additionally, we provide support for IGEL endpoint clients through our partnership with them.

What does the Launcher represent?

An endpoint

The Launcher functions as the physical access point for virtualized clients, acting as an endpoint. Therefore, it's crucial to consider the Launcher's location based on testing objectives to ensure accurate round-trip latency between the Launcher and the target (EUC client). It's also essential to verify that the Launcher endpoint has access to the target, as this can reveal potential production downtime requiring attention.

An endpoint host refers to a specific network address or server enabling connectivity for a software application, service, or device to communicate with a target. This target could be another application, service, or device that the endpoint host needs to interact with.

For example, within a client-server architecture, the endpoint host typically represents the server where clients connect to access resources or services. In a peer-to-peer network, an endpoint host could be any participating node that communicates directly with other nodes.

Launcher groups and locations

To launch test sessions, you need to add Launchers to a Launcher Group.

Launcher Groups specify which Launchers should be used for a particular Test; creating a Test requires at least one Launcher Group. Groups are also helpful for logically dividing Launchers, for example, by geographic location.

For large environments, creating or managing a large number of Launchers can be a labor-intensive process. To automate many operations, such as creating Launcher Groups, adding Launchers to a Launcher Group, or assigning Launcher Groups to Tests, use the Public API along with PowerShell.

For more information about the Public API, see the Public API documentation.
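As a sketch of the automation described above: the example below uses Python rather than PowerShell purely for brevity, and the appliance URL, token, endpoint path, and payload field names are all placeholders, not documented routes. Consult the Public API documentation served by your appliance for the actual endpoints and schema.

```python
import json
import urllib.request

APPLIANCE = "https://myenvironment.example.com"  # placeholder appliance URL
TOKEN = "YOUR-API-TOKEN"                         # placeholder access token

def build_group_payload(name, launcher_names):
    """Request body for a Selection-type Launcher Group.
    The field names here are illustrative assumptions; check the
    Public API reference on your appliance for the real schema."""
    return {
        "name": name,
        "type": "Selection",
        "launcherNames": list(launcher_names),
    }

def create_launcher_group(name, launcher_names):
    # The endpoint path below is a placeholder, not a documented route.
    req = urllib.request.Request(
        f"{APPLIANCE}/publicApi/launcher-groups",
        data=json.dumps(build_group_payload(name, launcher_names)).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The same pattern extends to adding Launchers to an existing Group or assigning Groups to Tests: build the payload, POST it with the bearer token, and check the response.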

Creating a Launcher group

There are two ways to create a Launcher Group in Login Enterprise: through a Filter, or by Selection:

  • Filter lets you define criteria for the Launchers you want to add to the group. You can also use wildcards so you don't have to select several Launchers individually. The accepted wildcards are * and ?.
  • Selection lets you select Launchers to add to the new group manually.

Note that if you add a Launcher to a Launcher Group by Selection (that is, you choose it specifically rather than by wildcard) and that Launcher goes offline, it remains part of the Launcher Group. It will not appear in the Launchers list and will not be available for Tests, but when it comes back online, it is immediately available in that Group and in any Tests that include that Launcher Group.
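The * and ? wildcards follow familiar glob semantics: * matches any run of characters and ? matches exactly one character. As an illustration only (this is not how Login Enterprise implements its filtering), Python's fnmatch shows how such patterns would select Launchers by name:

```python
from fnmatch import fnmatch

# Hypothetical Launcher names for illustration
launchers = ["AMS-LAUNCHER-01", "AMS-LAUNCHER-02", "NYC-LAUNCHER-01"]

# '*' matches any run of characters, '?' matches exactly one:
ams_group = [name for name in launchers if fnmatch(name, "AMS-*")]
single_digit = [name for name in launchers if fnmatch(name, "*-LAUNCHER-0?")]
```

Here `ams_group` would contain only the two Amsterdam Launchers, while the `*-LAUNCHER-0?` pattern matches all three names.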

To create a group, in Launcher Groups, click the green "+", and select your method for grouping.

Frame 34.png

The following steps show how to add a Launcher group via the Selection method:

  1. In Create a new selection group, give your group a name and, optionally, a description, then click Next.

Frame 41.png

2. In the following window, click Show all Launchers.

Frame 42.png

3. Select the checkboxes next to the Launchers you want to add to the group, then click Save.

Frame 43.png

You have successfully created your Launcher group. These Launchers can now receive and execute new Test connections to the target EUC stacks and platforms. Next, you need to add Launchers to a Test.

Adding a Launcher to a Test

Launchers write a log file at %TEMP%\LoginEnterprise\Launcher\Logs. The log will be created as soon as the Launcher starts and will record successes and errors in its operations. The Launcher also displays a table in its window showing the Tests that have been assigned to it since it started this current session. This list is kept in memory, so it is lost on every program restart.

To add a Launcher to a Test:

  1. In the Sidebar menu, navigate to Configuration > Manage tests, and select the Test, for example, Continuous Test.
  2. Click on the particular Test instance, scroll down, and click Settings.
  3. In Settings > Launchers, select the checkbox next to each Launcher you want to add to the Test.
  4. Click Save to apply the changes.

Frame 45.png

5. Go back to the Continuous Tests page.

6. Next to each Test instance, in Schedule, select the toggle so that it’s turned on.

Frame 46.png

When the Tests are turned on, you can now:

a. See the target session being connected to in a console window:

Frame 47.png

b. Confirm in the Launcher application that the connection was made and view the connection information:

Frame 54.png

c. View the connection sessions under Launchers > Sessions:

Frame 48.png

When this particular Test ends, a new one starts via the Launcher application. Launcher operation is autonomous and needs no human intervention.

Adding Launchers to Locations

Launchers can be added to defined Locations. This more closely represents the connection success and latency that real-life users would experience in, for example, a branch office. Setting up a Launcher in each location that needs to be monitored is a best practice. It also enables targeted troubleshooting: if one location is not connecting to targets successfully, you can narrow the problem down to that site.

In a Load Test scenario, the best practice ratio for simultaneous Test connections per Launcher is 30 to 1. So, if running a 100-user Load Test, you need 4 Launcher hosts.

If Launchers are attached to configured locations (see the screenshot below) when a Continuous Test type is in progress, these Launchers will be visible in the map view.

Frame 49.png

Observing Launchers and sessions in map view

  1. In the Sidebar menu, navigate to Dashboard.
  2. Click on the running Test.
  3. In View by, select the map view.

Frame 50.png

Now, you can view the Launcher location plotted on the world map. The hoverable tooltip shows connection and Test successes and failures.

Frame 51.png

Initiating a session

When the Login Enterprise appliance wants to initiate a session, it checks the Test for the Launcher Group and Account Group, picks an available one from each, and sends a message to that Launcher. The Launcher receives the session information and Connector configuration and begins the process of trying to launch a login for the user through the configured Connector.

It determines whether the connection was successful or not by looking for specific events and return codes, and then alerts the Login Enterprise appliance. It will also report if the session terminates unexpectedly. The Target is responsible for reporting back to the Login Enterprise appliance that the session login is completed. All the Launcher can say is whether the process is still running. If the Target has not checked in with the appliance within 5 minutes (configurable), the Launcher will terminate the session and the appliance will mark it as failed.

Once the Target checks in, the appliance tells the Launcher and the Launcher turns off that 5-minute timer and switches to simply monitoring the status of the remote session client’s process ID. It will report to the appliance if the session terminates itself. The appliance will tell the Launcher if the Target has reported completion and thus that the client is expected to terminate itself. So, between the appliance and the Launcher and the Engine, they know when a process starts, if it terminates unexpectedly, and when the termination is expected.
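To make the handshake above easier to reason about, the Launcher's side can be modeled as a small state machine. This is an illustrative sketch, not Login Enterprise code; only the 5-minute check-in window and the failure behavior come from the description above, and all names are hypothetical.

```python
CHECKIN_TIMEOUT = 5 * 60  # seconds; configurable on the appliance

class LaunchedSession:
    """Illustrative model of the Launcher's view of one session."""

    def __init__(self, started_at):
        self.started_at = started_at
        self.target_checked_in = False
        self.terminated = False

    def on_target_checkin(self):
        # The appliance relays the Target's check-in; the Launcher
        # turns off the timeout timer and just monitors the process.
        self.target_checked_in = True

    def poll(self, now):
        """Return the status the Launcher would report to the appliance."""
        if self.terminated:
            return "terminated"
        if not self.target_checked_in and now - self.started_at > CHECKIN_TIMEOUT:
            self.terminated = True   # Launcher kills the client process
            return "failed"          # appliance marks the session failed
        return "running"
```

In this model, a session that never sees a Target check-in flips to "failed" after the timeout, while one that has checked in keeps reporting "running" until its process exits.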

Launcher performance across different locations

Monitoring latency

Login Enterprise Launchers play a crucial role in capturing round-trip latency metrics between the Launcher endpoint host and the virtualized target. The target represents the location where the actual workload for Login Enterprise performance testing occurs. This latency metric quantifies the delay a user experiences when accessing the virtualized target; for example, it measures how long a user's action, such as clicking an item on the virtualized desktop, takes to register. By collecting these metrics, Login Enterprise offers insights into latency fluctuations throughout a workday. Moreover, administrators can set a latency threshold; if it is exceeded, Login Enterprise can generate alerts, enabling prompt action to address performance issues.
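Conceptually, such a threshold check amounts to flagging latency samples that exceed a configured ceiling. The sketch below is purely illustrative (Login Enterprise evaluates thresholds on the appliance, not with this code), and the 200 ms default is an arbitrary example value:

```python
def latency_alerts(samples, threshold_ms=200):
    """Flag (timestamp, latency_ms) samples above the threshold.
    Illustrative only; the threshold value is a made-up example."""
    return [(t, ms) for t, ms in samples if ms > threshold_ms]
```

Given a day's worth of samples, the returned list shows exactly when latency breached the configured ceiling.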

Continuous Testing scenario

For Continuous Testing, which typically involves testing against production infrastructure, it's crucial to deploy a Launcher in each physical location to emulate real users accessing virtualized resources. Note that locations and resources multiply: when two locations each access two different published resources, you need to configure four distinct Test scenarios.
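This multiplication effect is simply the Cartesian product of locations and published resources. A quick sketch, using hypothetical location and resource names:

```python
from itertools import product

locations = ["Amsterdam", "New York"]                # example branch offices
resources = ["Published Desktop", "Published App"]   # example published resources

# One Continuous Test per (location, resource) combination:
tests = [f"{loc} -> {res}" for loc, res in product(locations, resources)]
```

With two locations and two resources, `tests` contains four entries, one per required Test configuration.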

Application Testing and Load Testing scenarios

While Continuous Testing typically focuses on production infrastructure, the positioning of Launcher endpoint hosts is equally crucial in Application and Load Testing, depending on the testing objectives and setup.

For example, during a Load Test conducted outside of regular work hours, when actual users aren't accessing the target EUC infrastructure (which remains in a production state), strategically placing the Launchers to mimic real user access can help identify any potential issues with the Launchers themselves before they impact production—an outcome best avoided.

Conversely, if the primary goal of the Load Test is to stress the target environment and evaluate its performance, with less emphasis on stressing the endpoints, precise representation of where actual users access the targets may be less critical.

In Application Testing, selecting real user locations can offer benefits, despite being a single-user Test. For instance, when validating an image before scaling up for a Load Test, utilizing a Launcher in the desired location representing an endpoint accessing a target resource can facilitate early detection and resolution of any connection issues.

Lifecycle of the Launcher

  • Set up the Launcher host:
    • Install the Launcher software on all Launcher hosts, regardless of whether they run on Windows or Linux. Each Launcher, configured with a minimum of 2 vCPUs and 4GB memory, is capable of concurrently initiating up to 30 target sessions.
  • Install Launcher software:
    • Maintain multiple Launchers to distribute workload and ensure redundancy from a capacity perspective. For more information, see Launcher Groups and Locations.
    • Use descriptive naming conventions for Tests, Launchers, Groups, and Accounts to enhance traceability and visibility. This ensures they accurately represent the physical locations where actual users access the endpoints connecting to the target infrastructure. Subsequently, the Launchers will be viewable and plotted on a world map within the Login Enterprise UI, providing comprehensive information about connection success and performance for each location. For more information, see Locations.
    • Install endpoint software, such as VMware's or Citrix's, on the Launchers.
    • Avoid domain joining for Launchers to maintain isolation, unless necessary due to organizational security policies.
    • Implement least privilege principles by avoiding the use of administrative accounts. Instead, use standard user accounts for domain users, ensuring membership in the local machine's user group. However, ensure compatibility with the Login Enterprise Launcher software, allowing seamless operation and connection establishment via the connection clients. Maintain updated endpoint client software on the Launchers, ensuring alignment with the protocols and standards used by actual endpoints accessed by users.
  • Run the software and keep running:
    • Disable power-saving modes to ensure continuous availability and performance consistency.
    • The Launcher software on the Launcher hosts can be configured as a startup item. This ensures that after a reboot, the Launcher automatically starts running once a user logs into the system.
    • Before initiating significant or large-scale Tests, we recommend conducting dry runs with a few users per Launcher. This step ensures that all settings are configured correctly and validates the Launcher's capability to handle concurrent sessions successfully.
  • Periodically restart the Launcher host for optimal performance:
    • Monitor system resource utilization, ensuring CPU and memory stay within acceptable thresholds (e.g., 85% CPU utilization, 80% memory utilization).
    • To mitigate performance degradation caused by prolonged uptime, especially during heavy utilization like repeated Load Tests, it's beneficial to periodically reboot Launchers. Setting up a Windows scheduled task or implementing automation for rebooting Launchers can alleviate this issue. Since not all Launchers share identical configurations or run the same policies, there's no fixed requirement for the reboot schedule. However, based on our observations with customers, a weekly reboot schedule has proven effective.
    • To prevent false positive alerts regarding Launcher unavailability, ensure that in Continuous Testing configurations, testing is disabled during the scheduled reboot. This ensures that any downtime associated with the reboot doesn't trigger unnecessary alerts. For more information, see Daily and weekly schedules.
  • Update launcher upon new Login Enterprise releases:
    • When setting up a new version of Login Enterprise, we recommend installing the new Launcher version software over the previous version as a best practice. However, this is not mandatory unless the version difference constitutes a major update. In such cases, failure to update may result in the Launchers being unable to initiate sessions until they are updated to the latest version.
  • After the last step, return to "Run the software and keep running"; rinse and repeat.

Best practices

  • Before running a test on a Launcher, verify that you can manually invoke a connection to the target. To do this:

    1. Access the target gateway: On the Launcher, connect to your target gateway. Depending on your setup, this may involve accessing a web front end or using a different method, such as the RDS client (mstsc) for Remote Desktop Services. Ensure you are familiar with the specific access method required for your environment.

    2. Log in: Use a test user account to log in.

    3. Open the published resource: Attempt to manually open the published resource.

    4. Evaluate the result: Check if the connection is successful.

If the connection works manually, proceed with using the Launcher. If not, troubleshoot the issue. Common problems might include insufficient Citrix licenses, downed bare-metal servers hosting the target desktop, etc.

This manual verification is crucial for both initial setups and diagnosing sudden connection failures. It helps identify underlying issues that could be affecting the Launcher’s performance. This best practice applies to all Launchers, regardless of the operating system (Windows, Linux, or IGEL).

  • Prefer thin clients over laptops for optimal performance and manageability. However, you might want to use a laptop or desktop in an office as a Launcher to simulate a real endpoint connecting to a target: if office users connect to remote targets from laptops and desktops, this Launcher "model" more accurately represents their experience.
  • During a Load Test, Login Enterprise restricts the Launchers designated for the Test from participating in other testing scenarios. This means that for the duration of the Load Test, which may extend to several hours, the designated Launchers are unavailable for launching Continuous or Application Tests, even if those Tests remain enabled or are initiated during the Load Test. Upon completion of the Load Test, the Launchers automatically resume their other assigned testing tasks. Therefore, it is advantageous to allocate dedicated Launchers for Load, Application, and Continuous testing, and to organize them within Login Enterprise Launcher Groups. For more information, see Launcher Groups and Locations.
  • Automate Launcher deployment and configuration to ensure consistency and efficiency.
  • Standardize the Launcher installation process through documented procedures and automation.

Important considerations

1. Dedicated use for Load Tests

  • Launchers running Load Tests cannot launch other sessions during the Test. This is by design.
  • Recommendation: Avoid using Launchers for mixed testing scenarios to ensure accurate Load Testing results.

2. Session launch rate

  • A Launcher will only initiate a new session every 20 seconds by default.
  • If multiple Continuous Tests are enabled simultaneously and they attempt to launch sessions on the same Launcher, the Launcher will still adhere to the 20-second interval.
  • Additional sessions will enter a scheduling queue, waiting to be launched sequentially.
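The queueing behavior above can be modeled as a FIFO queue with a fixed spacing between launches. The sketch below is an illustrative model, not Launcher code; only the 20-second default interval comes from the text above:

```python
LAUNCH_INTERVAL = 20  # seconds; the Launcher's default per-session spacing

def schedule_launches(request_times):
    """Given the times (in seconds) at which session requests arrive,
    return the times at which a single Launcher would start them,
    spacing launches at least LAUNCH_INTERVAL apart (FIFO order)."""
    launch_times = []
    next_free = 0
    for t in sorted(request_times):
        start = max(t, next_free)   # wait for the interval to elapse
        launch_times.append(start)
        next_free = start + LAUNCH_INTERVAL
    return launch_times
```

For example, three session requests arriving at the same moment would be launched at 0, 20, and 40 seconds.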

3. Load Test scenario ratio

  • In a Load Test scenario, the best practice ratio for simultaneous test connections per Launcher is 30 to 1. So, if running a 100-user Load Test, you need 4 Launcher hosts.
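The sizing arithmetic behind this ratio is a simple ceiling division:

```python
import math

SESSIONS_PER_LAUNCHER = 30  # best-practice maximum per Launcher host

def launchers_needed(concurrent_sessions):
    """Minimum number of Launcher hosts for a Load Test."""
    return math.ceil(concurrent_sessions / SESSIONS_PER_LAUNCHER)
```

For a 100-user Load Test, `launchers_needed(100)` returns 4, matching the example above.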

Tips and tricks

  • Putting the Launcher shortcut in the Launcher host's startup folder makes restarting the Launcher easier if it ever needs to be restarted.
  • Pinning the Launcher to the Windows taskbar can make it easier to start the Launcher application if it's ever closed.
  • If you’re using the Login Enterprise Launcher (Windows-based), the Launcher can be installed over a network (see Silent installation). This lets you deploy multiple Launchers easily.

Related terms and concepts

Some other related terms and concepts that you’ll find in this article include:

  • Account: a username/password pair used to trigger connections and identify connections once they start (usually an AD account).
  • Appliance: a Linux-based Login Enterprise control center.
  • Connector: Login Enterprise software used to interact with a Connection Broker to launch a remote connection to a target desktop. A Connector may also relate to VMware connections and custom connections that Login Enterprise supports. For more information on Connectors and connection options, see Connectors and Connection Configurations.
  • Target: an individual end-user machine where a Test will run.
  • Test: a collection of Application scripts that will perform a series of actions and take measurements as defined in those Application scripts. Tests also include taking other metrics outside the application scripts (workloads), such as Protocol latency, Logon times, EUX measurements, and Session Metrics.

For more terms and concepts, see the Login Enterprise Glossary.

Additional resources