On 18th Oct 2017

End-to-End Testing, Monitoring and Load Testing for Cloud Environments

To meet changing business needs and scale quickly, most IT organizations now support applications on-premises, in the cloud or in hybrid environments. In fact, 451 Research expects 76% of workloads to run in the cloud within two years, so a growing number of organizations are making the move. Are you ready to keep delivering optimal user experiences, regardless of your deployment model?

In a culture of fierce competition and high expectations, end-user experience rules. Users expect, and demand, that their virtual applications and desktops match or exceed the performance of their physical counterparts. Yet delivering virtualized applications or desktops from the cloud adds a new layer of complexity when troubleshooting the root cause of performance issues. Not only are you dealing with complex virtualization technology whose multi-tier components affect user experience, but you also rely on a cloud platform that can introduce its own performance degradation. Even a slight glitch can result in user disconnects, slow logons, keystroke lag or screen freezes, all extremely frustrating for your users. To top it off, unlike an underperforming physical desktop that frustrates a single user, a virtualization performance issue affects many users at once.

5 STRATEGY PILLARS TO CONSIDER

Here are the 5 pillars to consider as you build an optimal End-to-End Cloud Testing and Monitoring strategy:

1. IMAGE RECOGNITION

Since you are streaming applications within a Citrix or RDS-based environment running on-premises or in the cloud, you need a testing and monitoring approach based on image recognition (object recognition will not work!). All user actions should be automated, tested and measured, so look for a solution that lets you capture, store and edit baseline bitmap images to define your test and monitoring scripts.
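
As a rough illustration (not ICTestAutomation's own syntax), capturing such a baseline bitmap can be as simple as saving a screenshot of the region you care about; the sketch below uses the open-source PyAutoGUI library, and the file names and screen coordinates are hypothetical:

```python
# Sketch: capture a baseline bitmap that later drives image-recognition scripts.
# PyAutoGUI is used purely for illustration; file names and coordinates are
# hypothetical.
import os
import pyautogui

os.makedirs("baselines", exist_ok=True)

# Grab the region of the published application's logon button as it currently
# appears inside the Citrix/RDS session window (left, top, width, height).
baseline = pyautogui.screenshot(region=(400, 300, 120, 40))
baseline.save("baselines/logon_button.png")

# Test and monitoring scripts can then wait for this exact bitmap to appear
# before acting on it (see the later sketches).
```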

2. TEST AUTOMATION

With so many potential failure points, it is time to revamp quality and test automation. Functional testing, smoke testing, and performance and load testing should be continuously embedded in your deployment cycles. As users place added demand on your Citrix or RDS environments, you need to ensure readiness to handle it, regardless of your deployment model.
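
One way to embed those checks in a deployment cycle, sketched here with placeholder check functions rather than any particular tool's API, is to run a smoke-test gate that fails the pipeline on the first regression:

```python
# Sketch of a smoke-test gate invoked as a CI/CD pipeline step.
# The individual checks are placeholders for real automated scripts.
import sys

def check_logon():
    # e.g. launch the published desktop and wait for the start-menu image
    return True

def check_order_entry():
    # e.g. open the order-entry app and verify the main form renders
    return True

SMOKE_CHECKS = [check_logon, check_order_entry]

def main():
    failures = [check.__name__ for check in SMOKE_CHECKS if not check()]
    if failures:
        print("Smoke gate failed: " + ", ".join(failures))
        return 1           # a non-zero exit code fails the pipeline stage
    print("Smoke gate passed")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```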

3. TEST, BENCHMARK AND COMPARE

When migrating technology to the cloud or rolling out a new application, a good approach is to load test and establish a baseline prior to roll-out, then load test post-deployment and compare the two to ensure no performance degradation for your users. Look for the ability to reuse testing scripts for production monitoring to ensure cross-functional alignment.
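
A minimal sketch of that before-and-after comparison, assuming you have exported client-side response times (in seconds) from a pre-migration baseline run and a post-deployment run, might look like this:

```python
# Sketch: compare post-deployment response times against a pre-migration baseline.
# The sample numbers are made up; in practice they come from your load-test results.
from statistics import quantiles

baseline_s = [1.1, 1.2, 1.3, 1.1, 1.4, 1.2, 1.5, 1.3]    # pre-migration run
candidate_s = [1.2, 1.4, 1.6, 1.3, 1.7, 1.5, 1.9, 1.4]   # post-deployment run

def p95(samples):
    # 95th percentile of client-side response times
    return quantiles(samples, n=20)[-1]

regression = (p95(candidate_s) - p95(baseline_s)) / p95(baseline_s)
print(f"p95 baseline: {p95(baseline_s):.2f}s, candidate: {p95(candidate_s):.2f}s")
if regression > 0.10:    # allow at most 10% degradation, for example
    raise SystemExit("Performance regression detected - block the rollout")
```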

4. PROACTIVELY TEST AND MONITOR END-TO-END FROM YOUR USER PERSPECTIVE

Simulate and measure the availability and response time of critical end-to-end user transactions on a 24x7x365 basis to find problems before your users are impacted. Look for proactive, user-centric monitoring that periodically executes synthetic transactions, taking SLA response time measurements along the way to ensure that your applications are working effectively and alerting you at the first sign of trouble.
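
At its core, such a proactive check simply times each step of a synthetic transaction against its SLA and raises an alert on the first breach; the sketch below uses hypothetical step functions and thresholds:

```python
# Sketch: time each step of a synthetic transaction against its SLA and
# alert at the first breach. Step functions and SLA values are placeholders.
import time

def logon():
    time.sleep(0.2)        # stand-in for the real scripted logon step

def open_application():
    time.sleep(0.4)        # stand-in for launching the published app

def run_report():
    time.sleep(0.9)        # stand-in for a critical business transaction

TRANSACTION = [            # (step, SLA in seconds)
    (logon, 5.0),
    (open_application, 8.0),
    (run_report, 15.0),
]

def alert(message):
    print(f"ALERT: {message}")     # hook into email/Slack/paging in practice

for step, sla in TRANSACTION:
    start = time.perf_counter()
    step()
    elapsed = time.perf_counter() - start
    if elapsed > sla:
        alert(f"{step.__name__} took {elapsed:.1f}s (SLA {sla:.1f}s)")
        break              # stop at the first sign of trouble
```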

5. FOCUS ON PERFORMANCE ANALYSIS

Performance optimization boosts external-facing application speed and employee productivity, and drives more revenue. Therefore, you should benchmark and analyze application performance on an ongoing basis to identify areas for improvement.

Your users have the same expectations regardless of your behind-the-scenes architecture and how your applications are deployed and consumed.


In a nutshell, a Remote Desktop Services (RDS) platform runs applications or user desktops on the server rather than on user workstations. No underlying objects or controls are delivered to the client. Instead, the RDS server sends screen images of the user desktop to the end-user workstation, and keystrokes and mouse clicks are returned to the server. This adds a new layer of complexity and challenge: since you stream your applications, many test automation tools will not work because they rely on object recognition. Instead, you need a test automation solution that uses image recognition and visually examines the desktop, responds to changes and uses the keyboard and mouse just like a real user does. And this is exactly what ICTestAutomation does. Use TrendIC solutions for complete end-to-end testing and monitoring, as outlined below.

BUILD YOUR TEST SCRIPTS AND AUTOMATE MANUAL, FUNCTIONAL, SMOKE AND PERFORMANCE TESTING

Easily automate all user actions (clicking, comparing, verifying, waiting for images to display, and more) with minimal effort. ICTestAutomation utilizes an advanced proprietary image recognition system to replicate the actions of an actual user and visually analyze every aspect of the desktop. Just like a user, ICTestAutomation does not need to ‘see’ an image in the same location each time. It looks at the entire desktop and, when an image needs to be clicked, moves the mouse to the desired image and issues the appropriate mouse action, just as a user would. ICTestAutomation also automates keystrokes in the same way that an actual user types. The simplicity of these three elements (desktop visualization, keyboard and mouse) gives ICTestAutomation the power and flexibility to work with any application, so you can easily automate all your testing activities.
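
For readers unfamiliar with this style of automation, the sketch below shows roughly what one such user action looks like. It uses the open-source PyAutoGUI library purely as a stand-in (ICTestAutomation has its own scripting environment), and the baseline image and typed values are hypothetical:

```python
# Sketch: drive a streamed application using only the three elements a real
# user has - the screen, the mouse and the keyboard. PyAutoGUI stands in for
# the real tool; the baseline image and typed values are hypothetical.
import pyautogui

# Scan the whole desktop for the logon button, wherever it currently appears.
# (The confidence= option needs the optional opencv-python package; older
# PyAutoGUI versions return None when the image is not found.)
location = pyautogui.locateCenterOnScreen("baselines/logon_button.png",
                                          confidence=0.9)
if location is None:
    raise RuntimeError("Logon button not visible on the desktop")

pyautogui.click(location.x, location.y)                    # click, just like a user
pyautogui.write("test.user@example.com", interval=0.05)   # type keystrokes
pyautogui.press("tab")
pyautogui.write("********", interval=0.05)
pyautogui.press("enter")
```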

TEST YOUR APPLICATION UNDER LOAD

Use ICTestAutomation scripts and ICTestAutomation Virtual User to ensure readiness for peak traffic and real-world conditions in all Citrix, RDS and Microsoft Terminal Services environments. You can test, measure and validate application response time at the client UI under various controllable load levels with non-intrusive load testing. “Load generating machines” generate user load against the server environment under test, while “measuring machines” actively measure response times at the client UI and gather server performance metrics on the back-end. Each ICTestAutomation Virtual User script executes in its own “desktop” and opens its own client connection to the server under test, just as a group of real users would. ICTestAutomation Virtual User automatically compares what it “sees” on the screen to baseline response images (created during test script development), then measures and reports response times at the client GUI to immediately identify underperforming components. Validate complex multi-step scenarios end-to-end (e.g. find an item, add it to the shopping cart, enter credit card information and complete payment) and easily customize load levels and virtual user ramp-up times as defined in your test plans.
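
Conceptually, each virtual user is an independent session started on a ramp-up schedule and timed at the client side. The simplified, single-machine sketch below (threads and a placeholder transaction, not the product's actual load engine) illustrates the idea:

```python
# Sketch: ramp up N virtual users and record client-side response times.
# run_transaction() is a placeholder for a scripted session against the
# Citrix/RDS environment under test.
import random
import threading
import time

RESULTS = []
LOCK = threading.Lock()

def run_transaction(user_id):
    # Placeholder: open a session, perform the scripted steps, and return
    # the elapsed time as measured at the client UI.
    start = time.perf_counter()
    time.sleep(random.uniform(0.5, 1.5))
    return time.perf_counter() - start

def virtual_user(user_id):
    elapsed = run_transaction(user_id)
    with LOCK:
        RESULTS.append(elapsed)

USERS = 10
RAMP_UP_S = 20                      # spread session starts over 20 seconds

threads = []
for i in range(USERS):
    t = threading.Thread(target=virtual_user, args=(i,))
    t.start()
    threads.append(t)
    time.sleep(RAMP_UP_S / USERS)   # stagger the next virtual user

for t in threads:
    t.join()

print(f"{len(RESULTS)} transactions, worst response {max(RESULTS):.2f}s")
```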

MONITOR RESPONSE TIME FROM A USER PERSPECTIVE

Reuse the same ICTestAutomation scripts to proactively monitor any application: ICTestAutomation periodically executes end-to-end transactions, taking response time measurements along the way. By driving any application just like a real user, ICTestAutomation can validate whether all critical aspects of an application are available and working within limits. If they are not, it generates a real-time alert to help you find and resolve problems before your users are impacted. Screenshots and movies are also captured when problems are identified to help you analyze the root cause. Any end-to-end transaction can be simulated and measured in absolute values, percentages or statistical deviations to help you identify problems early on.
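
A bare-bones version of that monitoring loop, with a placeholder transaction and PyAutoGUI used only to capture the failure screenshot, could look like this:

```python
# Sketch: periodically run a synthetic end-to-end transaction, alert on
# failure or slow response, and keep a screenshot for root-cause analysis.
# run_transaction() and the thresholds are illustrative placeholders.
import datetime
import time

import pyautogui

def run_transaction():
    # Placeholder for a scripted end-to-end transaction; returns elapsed seconds.
    start = time.perf_counter()
    time.sleep(0.3)
    return time.perf_counter() - start

SLA_S = 10.0
INTERVAL_S = 300          # run every 5 minutes

while True:
    try:
        elapsed = run_transaction()
        if elapsed > SLA_S:
            print(f"ALERT: transaction took {elapsed:.1f}s (SLA {SLA_S:.1f}s)")
    except Exception as exc:
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        pyautogui.screenshot(f"failure-{stamp}.png")   # evidence for root cause
        print(f"ALERT: transaction failed: {exc}")
    time.sleep(INTERVAL_S)
```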

Are you ready to boost quality and customer satisfaction? Start testing and monitoring ALL your applications today, without any code changes or production impact! All you need is access from Windows to any target application.