Exploratory testing for non-functional requirements - Xray Blog

Written by Ivan Filippov | Mar 28, 2024 5:05:25 PM

Overview

In the past, testing for non-functional requirements (NFRs) was often neglected, squeezed in at the end of the life cycle to the extent time allowed. For modern development teams, that is no longer an option - to deliver high-quality applications, they have to implement a comprehensive testing strategy that allocates resources to both functional and non-functional components.

Furthermore, exploratory testing (ET) is being used more frequently to ensure compliance and find more convoluted flaws and bottlenecks in intricate software systems. The trend is not surprising as the cost of compliance is increasing, both from the customer and the regulatory perspective. 

As we expect an increase in demand for expertise in exploring non-functional areas of software products, we have prepared this guide to help you get started.

 

Note: here, we assume familiarity with the basics of exploratory testing; if you need a refresher, check our previous exploratory testing blog posts and tutorials.

 

NFR Testing Basics

Before diving into the exploratory aspects, let’s briefly talk about non-functional requirements and their testing overall. The high-level goal is to validate product properties that matter for internal and external stakeholders but do not fit the functional category.

 

           Non-functional Requirements         Functional Requirements

Focus      Product attributes, the “how”       Product features, the “what”

Types      Performance, security,              Unit, UI, API, SIT
           usability, etc.

 

The ISO/IEC 250XX standards (e.g., 25002, 25010, 25019) cover many foundational aspects. It is common to group NFRs and evaluate categories as a whole, e.g.:

  • Operational NFRs, like security, accessibility, and usability;
  • Revisional NFRs, like flexibility and scalability;
  • Transitional NFRs, like portability, reusability, and interoperability.

When it comes to testing NFRs, there are plenty of established scripted methods, especially automated ones. They often require a suite of specialized tools or internal telemetry to validate compliance - for instance, security testing tools like SonarQube, Snyk, and ImmuniWeb, and performance ones like WebLOAD, BlazeMeter, and k6.

However, as systems and requirements evolve, those scripted checks become insufficient - the creative aspects of exploratory testing could provide significant benefits.

 

Exploratory Testing for NFR

ET is a versatile approach that is especially useful for complex, high-risk areas. It complements automated testing by allowing testers to uncover unexpected issues, usability problems, or edge cases that scripted tests might miss. Of the many NFR testing types, in this beginner’s guide we are going to focus on the ones we consider more suitable for exploratory testing, namely:

  • Usability;
  • Security;
  • Compatibility;
  • Performance.

You would follow the process described below:

Step 1. Understand the landscape

There are two “challenges” you will need to overcome:

Knowledge about the product

Going into exploratory sessions with a completely “blank slate” can sometimes be useful, but in general, we recommend skimming through product requirements and/or documentation in advance. That gives you an opportunity to ask critical questions and suggest improvements early in the process. It is also important to collaborate with different groups of stakeholders at this stage to understand user desires as well as technical perspectives.

Some of the best practices you want to see from the non-functional requirements are:

  • Quantified expectations based on your product KPIs, including both units and measurement methods (e.g. “within 2.1 seconds including rendering of images” instead of “page loads quickly”);
  • Scope specificity, when applicable (e.g. “accept payment through cards except Discover" instead of “accept payment”);
  • Risk and business impact assessment to help with prioritization.
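Quantified expectations like these can be encoded as data and checked mechanically. Below is a minimal sketch of that idea; the requirement IDs, descriptions, and budget values are hypothetical, not taken from any real product.

```python
# Minimal sketch: quantified NFR budgets as checkable data.
# All requirement IDs and limits below are illustrative.

NFR_BUDGETS = {
    # requirement id: (description, limit in seconds)
    "page_load": ("Page loads within 2.1 s, including image rendering", 2.1),
    "search": ("Search results return within 1.0 s", 1.0),
}

def check_budget(req_id: str, measured_seconds: float) -> bool:
    """Return True if a measured value meets the stated budget."""
    _, limit = NFR_BUDGETS[req_id]
    return measured_seconds <= limit

print(check_budget("page_load", 1.8))  # True: 1.8 s is within the 2.1 s budget
print(check_budget("search", 1.4))     # False: exceeds the 1.0 s budget
```

Keeping budgets in one place like this also makes it obvious when a requirement lacks a unit or a measurement method.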

With that said, sometimes requirements do not quite follow those best practices, which is when exploratory testing and its fluid nature can shine the most. We suggest leveraging common standards for guidance (or at least inspiration), some of the examples are:

  • Usability - ISO 9241-210, WCAG, iOS or Android app guidelines;
  • Security (often industry-specific) - ISO 27001, PCI DSS, HIPAA, GDPR;
  • Compatibility - ISO/IEC 25010;
  • Performance - Google’s PageSpeed Insights and Core Web Vitals, USWDS, MDN, category-specific like Office 365, ISO 5055.

One recent example: Computer Software Assurance (CSA), a new set of guidelines from the U.S. FDA that includes recommendations on when to use unscripted testing in regulated environments.

To improve clarity, it can help to arrange this information graphically in whatever format works best for you (e.g. workflows/diagrams or mind maps), both to comprehend how the product functions end-to-end and to highlight important ideas and scenarios.

 

Knowledge about the scripted side

As we mentioned, exploratory testing is a complementary approach, meaning you must identify which NFR aspects are already covered with automated or manual scripted testing. Then, you have to decide which of them are critical enough to double-check in your sessions. At the same time, you need to note down the gaps you would focus on.

Possessing both pieces of knowledge will allow you to more clearly understand the role of your exploratory sessions at the project level and to avoid mismatched expectations across stakeholders.

 

Step 2. Plan your exploratory journey

Again, a completely unstructured approach can be useful at times, but it struggles with coverage and traceability, especially in large and complex projects. We recommend applying at least some structural elements to your exploratory testing, for example session-based test management (SBTM).

One important aspect to keep in mind: your non-functional exploratory sessions will often be multifaceted - focused on a certain NFR type, but not restricted by it. For example, performance and usability will always be “in the background” whenever you are exploring security or compatibility.

We suggest using test charters (with or without mind maps) to guide your exploration. Some of the commonly used sections are:

  • Objective;
  • Scope (and, optionally, out-of-scope);
  • Constraints (e.g. timeboxing).

Last but not least, based on those charter sections, identify whether there is any tool preparation you need - real devices, virtual environments, or accessibility helpers like screen reader programs. 
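If you prefer to keep charters machine-readable (for templating or reporting), the sections above can be sketched as structured data. The field names and example values below are illustrative:

```python
from dataclasses import dataclass, field

# Minimal sketch of an SBTM-style charter with the sections listed above.
# Field names and example values are illustrative.

@dataclass
class Charter:
    objective: str
    scope: list[str] = field(default_factory=list)
    out_of_scope: list[str] = field(default_factory=list)
    timebox_minutes: int = 60

charter = Charter(
    objective="Assess the personal account section for usability issues",
    scope=["Registration", "Login/Logout", "Account Management"],
    out_of_scope=["VIP section"],
    timebox_minutes=40,
)

print(charter.timebox_minutes)  # 40
```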

 

Charter example

Objective: Assess the personal account section of the updated e-commerce platform to discover any issues. 

Primary focus = usability, especially around accessibility aspects like color contrast and alt text.

Secondary focus = security and performance.

Scope: Registration, Login/Logout, all Account Management features (see Constraints) using Win 11 and Mac desktops with Chrome/Safari browsers at 1920 × 1080 resolution.

 

Constraints:

  • Resolution is restricted for this release, but not zoom;
  • VIP section of the personal account is not ready for ET;
  • Timebox of the session = 40 minutes.

Notes:

  • According to an updated study by Portent, “A site that loads in 1 second has a conversion rate 3x higher than a site that loads in 5 seconds.” Will keep in mind for performance evaluation.
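Since the charter calls out color contrast, here is the WCAG 2.x contrast-ratio formula as a quick reference implementation - a sketch for spot-checking suspicious color pairs during a session, not a replacement for a full accessibility audit:

```python
# WCAG 2.x contrast ratio between two sRGB colors, for ad-hoc checks.

def channel(c: int) -> float:
    """Linearize an 8-bit sRGB channel per the WCAG definition."""
    s = c / 255
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple) -> float:
    r, g, b = rgb
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white yields the maximum ratio of 21:1;
# WCAG AA requires at least 4.5:1 for normal text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```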

 

Step 3. Explore and report

Your priority will often be “unhappy” paths that automated tests may not cover (sufficiently or at all) - “why would a user do that” kind of scenarios that heavily rely on creativity and domain knowledge.

As you explore:

  • Collect detailed evidence, applying the same “quantified, unambiguous” principles we mentioned for NFRs;
  • Focus on identifying defects rather than simply confirming function (critical thinking); 
  • Analyze not only what’s broken but also what’s missing;
  • Continuously modify the testing approach as new information emerges.

 

Feedback loop speed is one of the key advantages of exploratory testing, so reporting is not really a separate step - document and report any issues, insights, and observations immediately. A tool like Xray Exploratory App, which lets you create evidence and defects without disrupting the session, helps tremendously.

For better analysis, you can apply a combination of labels, components, and custom fields in Jira to assist with defect categorization. Consistent report formats, along with production monitoring, will allow you to track your NFR testing trends.

Next, we will talk about specific scenarios by NFR type, sticking with our e-commerce charter (though they apply to most software applications).

 

Checklists of scenarios

Usability

We start with usability because it is arguably the least automatable area. It involves many aspects defining user experience with the product - ergonomics, intuitiveness, accessibility, etc. One popular approach by Nielsen Norman Group breaks it down into five dimensions:

  • Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design?
  • Efficiency: Once users have learned the design, how quickly can they perform tasks?
  • Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?
  • Errors: How many errors do users make, how severe are these errors, and how easily can they recover from the errors?
  • Satisfaction: How pleasant is it to use the design?

Sample scenarios - “How does an application handle…”:

  • Non-standard resolution/zoom (within the boundaries discussed with the development team) when it comes to data truncation and formatting issues?
  • Repeated authentication and password reset failures? What about recreating an account from scratch and checking redundancy?
  • Out-of-sequence, back-and-forth user interactions?
  • The situation where the user edits an order and forgets to save it? What if the order process was interrupted by a brief connection failure?
  • Mixed payment options (e.g. card + points)?
  • The onboarding experience on the 2nd, 3rd, and subsequent visits?

Sample metrics to keep an eye on:

  • Number of seconds/steps to reach most frequently used features;
  • Amount of time to recover from errors and error frequency;
  • Amount of time to get onboarded vs the perceived knowledge/skill gained;
  • Number of media items without alt text.

For accessibility specifically, you can leverage WCAG checklists based on the conformance level your product aims to achieve. For learnability, an often overlooked aspect is documentation testing (including any onboarding tours or videos, if applicable).
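The “media items without alt text” metric above can be spot-checked against a page source with a few lines of standard-library Python; the HTML snippet here is illustrative:

```python
from html.parser import HTMLParser

# Minimal sketch: count <img> tags that lack a non-empty alt attribute.

class AltTextAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt or not alt.strip():
                self.missing += 1

html = """
<img src="logo.png" alt="Company logo">
<img src="banner.png">
<img src="icon.png" alt="">
"""

auditor = AltTextAuditor()
auditor.feed(html)
print(auditor.missing)  # 2: one missing alt, one empty alt
```

Note that an empty `alt=""` is legitimate for purely decorative images, so flagged items still need human judgment - a good example of where exploration complements automation.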

Security

The degree of rigor will depend on the volume and nature of sensitive information in your product, if any. Scripted testing typically covers an “average user”, which means the two personas you want to pay the most attention to are:

  • A novice with little cybersecurity awareness;
  • An expert hacker.

Study the latest vulnerabilities and exploits, then try to recreate the most critical and creative ones. In addition to universal standards we mentioned earlier, you can also consider CIS and OWASP benchmarks.

 

Sample scenarios:

  • Script injection for the password validation after reset;
  • Access to internal/customer-specific links in logged-out state;
  • Access to authorization-protected assets after a permission level change;
  • Timeout and recovery;
  • Electronic signatures/approval processes.
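For the first scenario, a quick way to triage candidate inputs is to check whether a response reflects the payload back unescaped. The sketch below uses a deliberately unsafe stand-in function; in a real session you would send the payloads through the UI or API instead:

```python
# Minimal sketch: flag responses that echo injection payloads back verbatim.
# naive_reset_message is a hypothetical, deliberately unsafe stand-in.

INJECTION_PAYLOADS = [
    "<script>alert(1)</script>",
    '" onmouseover="alert(1)',
    "'; DROP TABLE users; --",
]

def naive_reset_message(new_password: str) -> str:
    # Unsafe on purpose: echoes raw input into the response.
    return f"Password '{new_password}' accepted."

def is_reflected(payload: str, response: str) -> bool:
    """True if the payload appears unescaped in the response."""
    return payload in response

for payload in INJECTION_PAYLOADS:
    print(is_reflected(payload, naive_reset_message(payload)))  # True each time
```

Reflection alone does not prove exploitability, but it is a cheap signal of where to dig deeper.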

 

Compatibility

This testing evaluates how well your product functions with other systems and how easy it is to interact with your product across different environments. Basic tests - like installation on popular platforms - should be handled by the scripted side, so the multitude of external/integration factors is your focus.

Sample scenarios:

  • Different sequences of upgrading your product and the corresponding platform (including upgrades by more than one version); pay attention to the ease of conflict merging/resolution;
  • Common end-to-end workflows with mismatching versions of involved tools (i.e. the latest version of your product with outdated integration apps and vice versa);
  • If there is a companion app (or just cross-platform support), simultaneous and/or sequential actions between the versions (e.g. start the order in the desktop browser, finish/edit in the mobile app, and vice versa);
  • Reaction to unexpected change of environmental variables (e.g. a sudden drop in the network quality and the corresponding switch of the connection protocol from ethernet to WiFi or from 5G to 4G);
  • Complete migration from one platform to another (e.g from an on-prem version to cloud).


This testing is rather demanding when it comes to device access, so you may want to consider a virtual solution, e.g. BrowserStack.
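Whether you use real or virtual devices, it helps to enumerate the environment combinations up front and prune the impossible ones. A minimal sketch with illustrative platform lists:

```python
from itertools import product

# Minimal sketch: build a compatibility matrix and prune invalid combinations.
# The platform and version lists are illustrative.

os_versions = ["Windows 11", "macOS 14"]
browsers = ["Chrome", "Safari", "Firefox"]
app_versions = ["current", "previous"]

matrix = [
    (os_, browser, app)
    for os_, browser, app in product(os_versions, browsers, app_versions)
    if not (os_ == "Windows 11" and browser == "Safari")  # Safari is macOS-only
]

print(len(matrix))  # 10 valid combinations out of 12
```

From here, risk and usage data can rank which cells deserve a dedicated session.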

 

Performance

As we mentioned, this NFR type will accompany all your exploratory activities as at least a background thought. But there are also a few dedicated scenarios to keep in mind, especially around speed and resiliency components. While scripted tests will often take care of the volume aspects, you would focus on performance in “awkward” and really stressful conditions.

Sample scenarios:

  • Check that the application’s response time stays under 3 seconds once the user profile has 10+ pages of history;
  • Check that the application’s response time stays under 3 seconds with the browser already handling 5+ other tabs;
  • Check the application’s processing time when making multiple simultaneous changes to the calculation factors several times in a row;
  • For desktop and mobile applications, restart the platform unexpectedly (including unplugging/dead battery).
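The timing checks above can be scripted as lightweight probes. Below is a minimal sketch that times an operation while several run concurrently, echoing the “5+ other tabs” scenario; `simulated_request` is a stand-in for a real call to the application:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Minimal sketch: time a (hypothetical) operation under concurrent load.
# simulated_request stands in for a real call to the application under test.

def simulated_request() -> float:
    start = time.perf_counter()
    time.sleep(0.05)  # stand-in for real work
    return time.perf_counter() - start

BUDGET_SECONDS = 3.0

with ThreadPoolExecutor(max_workers=6) as pool:
    durations = list(pool.map(lambda _: simulated_request(), range(6)))

print(all(d <= BUDGET_SECONDS for d in durations))
```

In an exploratory session, a probe like this frees you to watch for the qualitative symptoms (janky rendering, stale data) while the numbers are collected for you.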

 

Building a strong foundation in non-functional requirements testing with Xray

Non-functional testing keeps growing in importance, helping teams achieve lower production risk and expenses related to the product’s security, performance, usability, etc. Exploratory techniques add crucial human unpredictability, substantially increasing the chances of delivering high-quality products that users will enjoy.

With a comprehensive testing strategy involving functional and non-functional tests, and both scripted and exploratory techniques, it is more important than ever to have visibility across all of your testing efforts. A dedicated test management tool like Xray, in conjunction with the Xray Exploratory App and Jira, enables you to consolidate test entities and reports and to boost end-to-end traceability along with insight generation.


You can also check out the on-demand Xray walkthrough and the Xray Exploratory App intro videos.