Interrogative Testing – The Future of Application Assessment

Greg Lambert
February 6, 2024
4 minutes

You may not like the following statement, but here goes:

“Testing is essential – and for most application use cases it is essentially useless”

I am not a fan of automated testing. Over the past ten or so years we have found that the results are often inconsistent, and the high investment cost of coding, training and scripting each test leads to a much reduced return on per-application investment. Add the “brittleness” of most testing platforms in the face of constant application changes, and you will see why most people consider automated testing a “good thing” in theory only. Other than developers or application owners testing a SINGLE application across multiple platforms, scripting your entire application portfolio is generally considered a “no go”.

At Readiness we have taken a different and novel approach. Yes, sure we can automate the basic “smoke testing” – you know, running an application, exercising the shortcuts and running through the install, update and uninstall process. We don’t even consider this a major feature anymore – it has to happen, but it should be just assumed that it happened.

Why is application testing needed? There are a few good reasons, including:

  • Validating your assumptions that your installation went as expected
  • Ensuring a deployment process is working as expected
  • Ensuring the latest changes and additions are working as expected
  • Making sure that your application is uninstalled or removed as expected.

The key words here are “as expected”. But what does that really mean?

Testing applications does not operate in a vacuum. What we are really doing is comparing the results of change against a baseline. Software testing has been defined as:

the act of examining the artifacts and the behaviour of the software under test by validation and verification.

At Readiness, we think there is an extra step: validating the results against a planned, programmed or documented baseline set of behaviours, artifacts and outcomes. It’s not just testing; it’s a structured comparison exercise.
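To make that extra step concrete, here is a minimal sketch of validating observed install artifacts against a documented baseline. All names and data here are hypothetical illustrations, not the Readiness implementation:

```python
# Hypothetical sketch: validate an install's observed outcomes against
# a documented baseline of expected behaviours and artifacts.

def validate_against_baseline(baseline: dict, observed: dict) -> list:
    """Return a list of deviations from the baseline; empty means 'as expected'."""
    deviations = []
    for check, expected in baseline.items():
        actual = observed.get(check, "<missing>")
        if actual != expected:
            deviations.append(f"{check}: expected {expected!r}, got {actual!r}")
    return deviations

# Illustrative baseline and observed results for one application install.
baseline = {"service MyAppSvc": "running", "desktop shortcut": True, "exit code": 0}
observed = {"service MyAppSvc": "stopped", "desktop shortcut": True, "exit code": 0}

for deviation in validate_against_baseline(baseline, observed):
    print(deviation)
```

Only the deviation (the stopped service) is reported; everything that matched the baseline stays silent, which is what “as expected” means in practice.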

Let me provide a few scenarios.

  1. You are migrating to a new target platform, such as a more modern desktop or server platform.
  2. Your team is converting your application installation packages to a new virtualization format (e.g. from App-V to MSIX)
  3. It’s Patch Tuesday, and the core build of your production environment will change in 72 hours.
  4. There is a new security baseline imposed, and you need to determine what will happen to installation, application exercise and removal of your portfolio.
  5. Here’s an easy one: a user request for a small, super easy to install, self-updating 3rd-party utility that no one else will notice – honest, we promise. Really.

So how do you test? 

At Scale and at speed? 

And generate prescriptive, actionable next steps for your team?

You need an automated installation and application smoke-testing system that operates at a portfolio level (thousands of applications) and provides you with a “Delta” report. That is why Readiness developed the X-check technology with our proprietary Delta report.

What’s a “Delta Report”? 

It’s simple – just show me the differences. We don’t generally need to know that 200,000 registry keys or thousands of files are installed. We (really) do want to know the following:

  • What applications installed on one platform but failed on another
  • Which services did not start on a particular platform
  • Did a firewall change on one platform get implemented successfully?
  • Did all of the printers get installed? For each application?
  • How were Microsoft Defender Exclusions handled?
  • Did all of the scheduled tasks for each application get handled properly on each platform?
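A Delta report of this kind boils down to a set difference between two per-platform snapshots. Here is a rough sketch with hypothetical data, assuming each snapshot maps artifact categories (services, firewall rules, and so on) to the items observed; this is an illustration of the idea, not the X-check internals:

```python
# Hypothetical sketch of a "Delta" report: given per-platform snapshots of
# observed artifacts, report only what differs between the two platforms.

def delta_report(snap_a: dict, snap_b: dict, name_a: str, name_b: str) -> list:
    lines = []
    for category in sorted(set(snap_a) | set(snap_b)):
        a = snap_a.get(category, set())
        b = snap_b.get(category, set())
        for item in sorted(a - b):
            lines.append(f"{category}: {item} present on {name_a} only")
        for item in sorted(b - a):
            lines.append(f"{category}: {item} present on {name_b} only")
    return lines

# Illustrative snapshots from installing the same package on two builds.
win10 = {"services started": {"GoogleUpdater"}, "firewall rules": {"chrome-inbound"}}
win11 = {"services started": set(), "firewall rules": {"chrome-inbound"}}

for line in delta_report(win10, win11, "Windows 10", "Windows 11"):
    print(line)
```

The firewall rule matched on both builds, so it never appears; only the service that failed to start on one platform makes it into the report. That suppression of the matching 200,000 registry keys and thousands of files is the whole point.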

I have included a sample image of a common application – the X-check or Delta report of the results of testing against two platforms. The image contains the results of installing Chrome on two different builds: Windows 10 and Windows 11.

[Image 1: X-check Delta report for a Chrome install on Windows 10 vs. Windows 11]

This is a great example, though not all of the data is shown here. As you can see, things can look very different on two different builds – and we think that the only way to absolutely know is to use X-Check from Readiness.

Get guaranteed results – and let us show you how.

Greg Lambert

CEO, Product Evangelist
Greg Lambert is the CEO and product evangelist for Application Readiness Inc. Greg is a co-founder of ChangeBASE and has considerable experience with application packaging technology and its deployment.
