Citrix Web User Agents Test
Users can connect to web applications using many client devices – web browsers, mobile phones, tablets, etc. Each such client device is called a user agent. Users on certain types of client devices/user agents may be engaged in bandwidth-intensive communication over the web, marring the experience of other users of the web applications. Similarly, certain types of client devices – e.g., some browser types such as Firefox, Chrome, etc. – may not be compatible with certain web applications. Users on such devices may therefore experience slowness when some web pages are rendered.
To assure users of the best experience with web applications at all times, administrators should track requests from the different client devices/user agents in use, measure the bandwidth usage and page rendering time of each device type, and share with users the list of client devices that may impact user experience with the web applications. The Citrix Web User Agents test helps administrators come up with such a list!
This test auto-discovers the types of client devices that are used for connecting to web applications, and reports the requests received from and the bandwidth used by each device type. This way, the test leads administrators to those device types that are popular among users and those that are consuming bandwidth excessively. Additionally, the test also measures and reports the time taken by each client device type to render web pages, thus pointing to those device types that are much slower than the rest at page rendering. With the help of these metrics, administrators can arrive at a list of client devices that may be incompatible with their web applications.
Target of the test: An AppFlow-enabled ADC appliance
Agent deploying the test: A remote agent
Outputs of the test: One set of results for each user agent/client device type that is connecting to the target ADC appliance
Parameter | Description
---|---
Test period | How often should the test be executed. It is recommended that you set the test period to 5 minutes, because the eG AppFlow Collector is capable of capturing and aggregating AppFlow data related only to the last 5 minutes.
Host | The host for which the test is to be configured.
Cluster IPs | This parameter applies only if the ADC appliance being monitored is part of an ADC cluster. In this case, configure this parameter with a comma-separated list of the IP addresses of all other nodes in that cluster. If the monitored ADC appliance is down/unreachable, the eG AppFlow Collector uses the Cluster IPs configuration to determine which other node in the cluster it should connect to for pulling AppFlow statistics. The collector attempts to connect to each IP address configured against Cluster IPs, in the same sequence in which they are specified, and pulls metrics from the first cluster node with which it successfully establishes a connection.
Enable Logs | This flag is set to No by default. This means that, by default, the eG agent does not create AppFlow logs. You can set this flag to Yes to enable AppFlow logging. If this is done, the eG agent automatically writes the raw AppFlow records it reads from the collector into individual CSV files. These CSV files are stored in the <EG_AGENT_INSTALL_DIR>\NetFlow\data\<IP_of_Monitored_ADC>\webappflow\actual_csv folder on the eG agent host, and provide administrators with granular insights into the web AppFlows, thereby enabling effective troubleshooting. Note: By default, the eG agent creates a maximum of 10 CSV files in the actual_csv folder; beyond this point, older CSV files are automatically deleted by the eG agent to accommodate new files with current data. Likewise, by default, a single CSV file can contain a maximum of 99999 records; if the records to be written exceed this limit, the eG agent automatically creates another CSV file to write the data. If required, you can override these default settings. For this, do the following:
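The Cluster IPs failover behaviour described above – try each configured node in order, and pull metrics from the first one that responds – can be sketched as follows. This is an illustrative sketch only, not eG's actual implementation; the function name, port, and timeout are hypothetical.

```python
# Illustrative sketch (NOT eG's actual code) of the Cluster IPs failover
# logic: attempt a TCP connection to the primary node first, then to each
# configured cluster IP in the order specified, and return the first node
# that accepts the connection. Port and timeout values are assumptions.
import socket


def first_reachable_node(primary_ip, cluster_ips, port=80, timeout=2.0):
    """Return the first node (primary first, then Cluster IPs in the
    configured sequence) that accepts a TCP connection, or None if
    every node is down/unreachable."""
    for ip in [primary_ip] + list(cluster_ips):
        try:
            with socket.create_connection((ip, port), timeout=timeout):
                return ip  # metrics would be pulled from this node
        except OSError:
            continue  # node down/unreachable -- try the next configured IP
    return None
```

In this sketch, order matters: nodes are tried in exactly the sequence they appear in the Cluster IPs configuration, mirroring the behaviour the parameter description attributes to the collector.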
Measurement | Description | Measurement Unit | Interpretation
---|---|---|---
Hits | Indicates the number of requests received from this type of client. | Number | Compare the value of this measure across client types to identify the client type that is used by the widest cross-section of application users.
Bandwidth | Indicates the total amount of data received from clients of this type. | KB | Compare the value of this measure across client types to know which type of client has been consistently consuming more bandwidth than the rest.
Avg render time | Indicates the elapsed time from when the browser on this device starts to receive the first byte of a response until either all page content has been rendered or the page load action has timed out. | msecs | Compare the value of this measure across client device types to know which type of device is seeing the maximum page rendering time.
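To illustrate how the measures above relate to the underlying AppFlow records, the sketch below aggregates per-user-agent statistics from a list of raw records. The field names (`user_agent`, `bytes`, `render_time_ms`) are hypothetical and do not necessarily match the eG agent's actual CSV schema.

```python
# Illustrative sketch: derive the Hits, Bandwidth (KB), and Avg render
# time (msecs) measures per user agent from raw AppFlow-style records.
# Record field names are assumptions, not the eG agent's real schema.
from collections import defaultdict


def aggregate_by_user_agent(records):
    """records: iterable of dicts with 'user_agent', 'bytes', and
    'render_time_ms' keys. Returns {user_agent: measures} where the
    measures mirror the test's Hits, Bandwidth, and Avg render time."""
    totals = defaultdict(lambda: {"hits": 0, "bytes": 0, "render_ms": 0})
    for rec in records:
        t = totals[rec["user_agent"]]
        t["hits"] += 1                          # one request = one hit
        t["bytes"] += rec["bytes"]              # data received from client
        t["render_ms"] += rec["render_time_ms"] # summed for averaging
    return {
        ua: {
            "Hits": t["hits"],
            "Bandwidth (KB)": t["bytes"] / 1024,
            "Avg render time (msecs)": t["render_ms"] / t["hits"],
        }
        for ua, t in totals.items()
    }
```

Comparing the resulting per-agent dictionaries side by side is exactly the cross-client-type comparison the Interpretation column recommends.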