Membrane Transport 1.0.0-rc.2 Release Candidate Spot-Check Test

by StackCamp Team

Hey guys! We've got a new release candidate for the Membrane Transport sim, version 1.0.0-rc.2, and it's time to put it through its paces. This spot-check test is crucial to ensure everything's running smoothly before the official release. So, let's dive in and make sure this sim is top-notch!

Mentions: @KatieWoe @Nancy-Salpepi @vperezcadavid99 @brettfiedler @terracoda @kathy-phet @samreid

Simulation Links

Here are all the important links you'll need for this test:

Test Matrices

We have a few different test matrices to cover all aspects of the sim. Make sure to sign up and complete a test!

General Test:

  • [ ] Tester = , Platform = , Time =
  • [ ] Tester = , Platform = , Time =

PhET-iO Test:

  • [ ] Tester = , Platform = , Time =
  • [ ] Tester = , Platform = , Time =

Additional Features Test:

  • [ ] Tester = , Platform = , Time =
  • [ ] Tester = , Platform = , Time =

Features Included

This release candidate includes several key features:

  • [x] PhET-iO: For advanced interactive simulations and data tracking.
  • [x] Dynamic Locale: Supports multiple languages and localization.
  • [x] Alternative Input: Accessibility features for various input methods.
  • [x] UI Sound: Sound effects to enhance the user interface.
  • [x] Sonification: Converting data into sound for accessibility.
  • [x] Description: Text descriptions for interactive elements.
  • [x] Voicing: Screen reader compatibility for enhanced accessibility.

Focus and Special Instructions

Alright, team, let's focus on what's important! The main purpose of this test is to ensure that all the new features and bug fixes are working correctly and that the simulation is stable. This is a crucial milestone before the final release, so your thorough testing is vital. Here's what we need to concentrate on:

  • QA Credits: Please check the QA credits! Verify that names, contributions, and other details are accurate and complete, and open a new issue for any discrepancies or omissions, with clear explanations and suggested corrections. We want everyone to get the recognition they deserve.
  • Specific Features: Pay close attention to the features listed above (PhET-iO, Dynamic Locale, the accessibility features, etc.) and confirm each works as expected across different scenarios and use cases. For PhET-iO, ensure data tracking and interactivity work seamlessly; for Dynamic Locale, verify that language switching is smooth and accurate; test the accessibility features rigorously with various input methods and screen readers.
  • Platforms: Test the sim on various platforms (Windows, macOS, ChromeOS, iOS, Android) and browsers (Chrome, Firefox, Safari, Edge) to ensure cross-compatibility. The sim should work flawlessly no matter the user's setup.
  • Non-Standard Tests: Unexpected issues often arise from unusual user behavior, so also try non-standard scenarios such as specific query parameters, unusual screen sizes, or unconventional input methods. Document these scenarios clearly and report any issues you find.
  • PhET-iO Diff Wrapper: If you want to test the PhET-iO diff wrapper against a prior version, let me know and I'll provide the details and link. The diff wrapper compares the behavior of the new and previous versions, focusing on PhET-iO aspects such as data streams, events, and state setting, to catch unintended changes or regressions. Report any discrepancies as separate issues.

Issues to Verify

These issues should have the "status:ready-for-review" label. Unless an issue says to close after verifying, assign the issue back to the developer.


For QA...

Okay, QA team, this section is all for you! Let's break down the specifics of what needs testing.

General Features

What to Test

For the general features, we want to make sure everything works as expected from a user perspective. This means testing all the interactive elements, input methods, and overall functionality of the simulation. It's about ensuring the sim is intuitive, stable, and doesn't have any glaring issues that might detract from the user experience. Here’s a detailed breakdown of what we need to cover:

  • Click Every Button: Start with the basics: click every button and interactive element in the sim. Do they trigger the correct actions? Is the visual feedback appropriate? Are there any buttons that don't seem to do anything? Working through them systematically ensures no interactive element is overlooked; document any that don't function correctly.
  • Test All Possible Forms of Input: Users interact with sims in many ways, and the sim should work seamlessly regardless of input device. If an input method doesn't behave as expected, document the issue with specific details.
    • Mouse/Trackpad Inputs: Test clicks, drags, hovers, and other mouse actions. Confirm that dragging and dropping works, that hover states are visually clear, and that inputs are precise and responsive. Document any unresponsive clicks or inaccurate dragging.
    • Touchscreen Inputs: If the sim is designed for touch devices, test taps, swipes, pinches, and multi-finger gestures on different screen sizes and devices. Document misrecognized gestures or unresponsive taps.
  • Sound Check: If the sim includes sound, confirm that the sound effects are appropriate, volume levels are balanced, and there are no unexpected noises or glitches. Report missing sounds, distorted audio, or incorrect volume levels.
  • No Lost Elements: Make sure interactive elements and data can't be lost. Can you accidentally drag an object off-screen? Can you reset the sim to a state where something is missing? Report any instance where an element can disappear or become inaccessible.
  • Play Normally: Explore the sim as an ordinary user would, without a strict test plan. Unstructured testing often reveals issues that structured tests miss; note anything that feels clunky, confusing, or broken.
  • Try to Break It: Now put on your "evil tester" hat. Input extreme values, perform actions in unexpected sequences, and push the sim to its limits. Document any crashes, errors, or unexpected behavior.
  • Query Parameters: Explore the sim with different query parameters, which modify its behavior in various ways (for example, language settings, debugging options, or feature flags). Try different combinations and report any issues that arise with specific parameters.
  • Previous Versions: When reporting an issue, check whether it was present in a previously published version of the sim. Knowing whether an issue is new or a regression helps developers prioritize fixes; a regression indicates that a recent change introduced the bug.
  • Browser Versions: Include the browser name and version number in your issue reports. Browsers interpret code differently, and this information is essential for reproducing and diagnosing browser-specific problems.
  • Console Errors: If a developer console is available (e.g., the browser's developer tools), check it for errors and include any output (error messages, stack traces) in the problem description. Console output often pinpoints the source of a bug.
  • Maintenance Issues: As the RC begins and ends, check the sim repository for open maintenance issues and notify developers if you find anything critical or urgent. Addressing maintenance issues helps prevent future bugs and improves overall stability.
  • QA Credits: As the RC ends, compile a list of everyone who contributed to testing and notify the developer of any new QA credits that need to be added. Giving credit where it's due is super important!
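
The query-parameter exploration above can be made systematic by generating the URL variants up front. Here is a minimal sketch in Python; the base URL is a placeholder, and the parameter names shown (`locale`, `stringTest`) are illustrative examples, so substitute the real RC link and the parameters actually supported by this sim:

```python
from itertools import product
from urllib.parse import urlencode

# Placeholder base URL for the sim under test; substitute the real RC link.
BASE_URL = ("https://example.colorado.edu/html/membrane-transport/"
            "1.0.0-rc.2/phet/membrane-transport_all.html")

# Example query parameters to exercise; None means "omit the parameter".
PARAMS = {
    "locale": [None, "es"],          # Dynamic Locale spot checks
    "stringTest": [None, "double"],  # string stress testing
}

def url_variants(base, params):
    """Yield one URL per combination of the given query-parameter values."""
    keys = list(params)
    for values in product(*(params[k] for k in keys)):
        query = {k: v for k, v in zip(keys, values) if v is not None}
        yield f"{base}?{urlencode(query)}" if query else base

variants = list(url_variants(BASE_URL, PARAMS))
for url in variants:
    print(url)
```

Working from a generated list like this makes it easy to record exactly which parameter combinations were covered in your test report.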

PhET-iO Features

What to Test

The PhET-iO features are all about making our sims even more powerful and adaptable. We're testing the integration between the simulation and the PhET-iO framework, ensuring that everything works smoothly for advanced use cases like data collection, customized simulations, and accessibility enhancements. This involves a deeper dive into the sim's functionality, focusing on how it interacts with the PhET-iO API and ecosystem.

  • Public vs. Private Files: Verify that public files are accessible without password protection and that private files require credentials. Security matters: sensitive data should be locked down tight, while resources meant for everyone remain easily accessible. Use a private browser window so cached credentials or cookies don't skew the results, and document any discrepancies in access control.
  • Standalone Sim: Confirm the standalone sim runs independently, without relying on external servers or services, and that all features and functionality are available. This matters for offline use and limited-connectivity environments; download the standalone version and run it locally.
  • Wrapper Index: Check that the wrapper index works properly: all links and navigation should function and lead to the expected resources, so users can launch the different versions and configurations of the sim. Document any broken links or navigation issues.
  • Wrapper Functionality: Launch each wrapper and verify it loads the sim correctly and provides its expected functionality. Pay close attention to any features or configurations specific to a given wrapper, and document any issues or inconsistencies.
  • XSS Prevention: Launch the simulation in Studio with ?stringTest=xss and make sure the sim doesn't navigate to YouTube. This confirms that injected strings can't be exploited for cross-site scripting (XSS) attacks. Report any instance where the sim navigates to an unexpected or malicious website.
  • Basic Wrapper Example: For newer PhET-iO wrapper indices, save the "basic example of a functional wrapper" as a .html file and open it. The sim should load without crashing or throwing errors; if it doesn't, document the specific errors and circumstances.
  • Login Wrapper: Load the login wrapper to make sure it works, using this path from the sim's deployed root: /wrappers/login/?wrapper=record&validationRule=validateDigits&numberOfDigits=5&promptText=ENTER_A_5_DIGIT_NUMBER. Verify that the login mechanism prompts the user correctly; document any incorrect prompts or failed validation.
  • Recording Test to Metacog: Conduct a recording test to Metacog (detailed instructions in the QA Book) on iPadOS + Safari and one other platform. This verifies the sim can accurately capture and transmit data for research and analysis. Document any issues with data recording or transmission.
  • Memory Test: Conduct a memory test on the standalone sim wrapper (rc.1). Monitor the sim's memory usage over time and document any excessive usage or leaks, which can lead to performance problems and crashes.
  • Debug Mode: Test one platform combination with ?phetioDebug=true on the Studio and State wrappers. Debug mode exposes detailed logging, error messages, and other diagnostic data; document any unusual behavior or errors observed.
  • Pan/Zoom with State: If Pan/Zoom is supported, make sure it works when set with PhET-iO State, i.e., that pan and zoom can be dynamically controlled and configured through the PhET-iO API. Set different Pan/Zoom states via the API and verify the sim responds correctly; document any issues.
  • Offline Functionality: Test that the sim works offline:
    • Click the link to the phet-io zip file (at the top of the issue) to download it.
    • Unzip it to a spot locally.
    • Open index.html by double-clicking it on your desktop or in a Finder window.
    • It should look like the standalone version of the sim in PhET-iO brand. If the sim fails to load or functions incorrectly offline, document the specific issues and errors.
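
Because the login wrapper link above is assembled by hand, it's easy to introduce a typo (a doubled `&`, a misspelled parameter). A quick sanity check with Python's standard library can confirm the URL parses into the expected parameters before you load it; the deployed root below is a placeholder, while the wrapper path and parameters come from the login wrapper instructions above:

```python
from urllib.parse import urlsplit, parse_qs

# Placeholder deployed root; substitute the real sim root from the links above.
ROOT = "https://example.colorado.edu/sims/membrane-transport/1.0.0-rc.2"

LOGIN_PATH = ("/wrappers/login/?wrapper=record&validationRule=validateDigits"
              "&numberOfDigits=5&promptText=ENTER_A_5_DIGIT_NUMBER")

url = ROOT + LOGIN_PATH
query = parse_qs(urlsplit(url).query)  # each value comes back as a list

# The parameters the login wrapper instructions above call for.
expected = {
    "wrapper": "record",
    "validationRule": "validateDigits",
    "numberOfDigits": "5",
    "promptText": "ENTER_A_5_DIGIT_NUMBER",
}
for key, value in expected.items():
    assert query.get(key) == [value], f"unexpected {key}: {query.get(key)}"
print("login wrapper URL parameters look correct")
```

If an assertion fires, fix the URL before filing an issue, so reported failures reflect the wrapper itself rather than a malformed link.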

Accessibility Features

What to Test

Accessibility is a core principle in PhET simulations, and we're committed to making our sims usable by everyone. This part of the test focuses on the accessibility (a11y) features of the simulation, ensuring that users with disabilities can interact with and learn from the sim effectively. Accessibility testing is crucial for ensuring that the simulation is inclusive and usable by people with a wide range of abilities. Let's make sure we're not just building sims, but building sims that everyone can use!

  • No Negative Impact: Make sure the accessibility (a11y) features being tested don't negatively affect the sim in any way. A11y enhancements should help users with disabilities without detracting from the experience for anyone else. Enable and use each feature while monitoring the sim's overall behavior and performance, and document any negative impacts with detailed descriptions and context.
  • Supported Features: Here is a list of features that may be supported in this test:
    • Alternative Input: Verify that all interactive elements can be accessed and manipulated using alternative input methods (keyboard, touch screens, assistive technology), so users with motor impairments or other input limitations can use the sim effectively. Report any issues with input responsiveness or accessibility.
    • Interactive Description: Navigate the sim with a screen reader and verify that every interactive element has a meaningful, informative description that accurately reflects its state and functionality, so users with visual impairments can access the same information as sighted users. Document missing, inaccurate, or unclear descriptions.
    • Sound and Sonification: Verify that sound effects give clear, appropriate feedback on user actions and that sonification meaningfully represents data and relationships in audio. Report any glitches, volume problems, or sounds that don't convey useful information.
    • Pan and Zoom: Verify that users can pan and zoom smoothly using mouse, touch, and keyboard, that zoom levels are appropriate, and that panning introduces no visual artifacts or performance issues. These features help users with visual impairments or anyone who needs to focus on a specific area of the sim.
    • Mobile Description: Verify that descriptions on mobile devices are brief, accurate, and give enough context to understand the sim's elements and interactions despite limited screen space. Report any mobile-specific issues or inconsistencies.
    • Voicing: Navigate the sim with a screen reader and verify that every element is correctly announced and that announcements convey each element's state and functionality. Document missing announcements, incorrect descriptions, or navigation problems.
  • Input Methods: As in the general features test, cover all input methods so the accessibility features work seamlessly regardless of how a user interacts with the sim:
    • Test all mouse/trackpad inputs.
    • Test all touchscreen inputs.
    • Test all keyboard navigation inputs (if applicable).
    • Test all forms of input with a screen reader (if applicable).
  • Website Updates: If this sim is not in this list or is out of date there, open an issue in the website repository asking whether the PhET research page links need updating, and assign it to @terracoda and @emily-phet. Keeping accessibility information, research pages, and links current ensures users can find the resources they need.

Screen Readers

This sim may support screen readers. If you are unfamiliar with screen readers, ask Katie for an introduction. If you just need a refresher, consult the QA Book, which has all the information you need as well as a link to a screen reader tutorial made by Jesse. Otherwise, look over the a11y view before opening the simulation, then open the simulation and make sure alerts and descriptions work as intended.

Platforms and Screen Readers to Be Tested

  • Windows 10 + Latest Chrome + Latest JAWS
  • Windows 10 + Latest Firefox + Latest NVDA
  • macOS + Safari + VoiceOver
  • iOS + Safari + VoiceOver (only if specified in testing issue)

Critical Screen Reader Information

Known screen reader bugs are tracked here. If you find a screen reader bug, please check it against this list before reporting it.

Keyboard Navigation

This sim supports keyboard navigation. Please make sure it works as intended on all platforms by itself and with a screen reader.

Pan and Zoom

This sim supports pan and zoom with pinch and drag gestures on touch screens, keyboard shortcuts, and mouse/wheel controls. Please test pan and zoom and make sure it works as intended and fits the simulation's use cases. Due to the way screen readers handle user input, pan and zoom is NOT expected to work while using a screen reader, so there is no need to test that case.


FAQs for QA Members
There are multiple tests in this issue... Which test should I do first? Test in order! Test the first thing first, the second thing second, and so on.

How should I format my issue? Here's a template for making issues:
<b>Test Device</b>

blah

<b>Operating System</b>

blah

<b>Browser</b>

blah

<b>Problem Description</b>

blah

<b>Steps to Reproduce</b>

blah

<b>Visuals</b>

blah

<details>
<summary><b>Troubleshooting Information</b></summary>

blah

</details>

Who should I assign? We typically assign the developer who opened the issue in the QA repository.

My question isn't in here... What should I do? You should:
  1. Consult the QA Book.
  2. Google it.
  3. Ask Katie.
  4. Ask a developer.
  5. Google it again.
  6. Cry.