
Virtual lab software is no longer a nice-to-have. It's becoming essential infrastructure for science departments. But with dozens of options on the market, how do you choose the right one?
This guide is for heads of science, IT leads, and procurement teams. It covers what to look for, what questions to ask vendors, and what red flags should make you walk away.
Why This Decision Matters
Lab access directly affects student outcomes. Research consistently shows that practical work improves conceptual understanding in science (Hofstein & Lunetta, 2004). Students who regularly engage with hands-on experiments develop stronger problem-solving skills and retain information longer.
But physical labs have real constraints. Equipment breaks. Chemicals run out. Time slots fill up. Many schools simply don't have enough lab periods to give students adequate practice. A 2023 survey by the Wellcome Trust found that over 40% of UK secondary schools reported insufficient lab time for their science curriculum.
Virtual labs can fill this gap. But only if you choose one that actually works. A poor choice means wasted budget, frustrated teachers, and students who click through animations without learning anything meaningful.
Key Features to Look For
1. Physics Accuracy (Not Just Animations)
This is the most important feature, and the one most vendors get wrong.
Many "virtual labs" are just pre-recorded videos with clickable hotspots. Students watch a titration happen the same way every time. They can't make mistakes. They can't explore. They're not learning to do science. They're learning to follow a script.
Look for software that uses real physics simulation. When a student adds too much acid, the pH should overshoot. When they heat a substance, the temperature curve should follow real thermodynamics. Research shows that physics-based simulations significantly improve conceptual understanding compared to simplified animations (Finkelstein et al., 2005).
Ask vendors: "What happens if a student does the experiment wrong?" If the answer is "the simulation guides them to the correct procedure," that's a red flag. Real labs let you fail. Good virtual labs should too.
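To make the distinction concrete, here is a minimal sketch of what "real physics" means for a titration: even a few lines of acid-base chemistry produce the characteristic pH jump, and overshooting the endpoint actually overshoots. The function name and parameters are illustrative, not taken from any vendor's product.

```python
import math

def titration_ph(acid_conc, acid_vol_ml, base_conc, base_vol_ml):
    """pH of a strong acid titrated with a strong base (25 degC, Kw = 1e-14)."""
    mol_acid = acid_conc * acid_vol_ml / 1000.0
    mol_base = base_conc * base_vol_ml / 1000.0
    total_litres = (acid_vol_ml + base_vol_ml) / 1000.0
    excess = mol_acid - mol_base
    if excess > 0:        # acid still in excess: acidic pH
        return -math.log10(excess / total_litres)
    if excess < 0:        # student added too much base: pH overshoots past 7
        return 14.0 + math.log10(-excess / total_litres)
    return 7.0            # exact equivalence point

# 25 ml of 0.1 M HCl titrated with 0.1 M NaOH: watch the jump at 25 ml,
# and the overshoot when one drop too many is added.
for added_ml in (0, 24, 25, 26):
    print(added_ml, round(titration_ph(0.1, 25, 0.1, added_ml), 2))
```

A scripted animation cannot produce that last line; a simulation cannot avoid it.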
2. AI Tutoring and Assessment
Practical work is hard to assess at scale. Watching thirty students perform titrations and giving individual feedback takes hours. Most teachers simply don't have that time.
AI can help here, but implementation matters. Some systems just check if students got the right answer. Better systems track the entire process: Did they rinse the burette? Did they swirl the flask properly? Did they approach the endpoint slowly?
Research on AI in education emphasises the importance of formative feedback during learning, not just summative assessment at the end (du Boulay, 2019). The best virtual lab software provides real-time guidance while students work, not just a score when they finish.
Questions to ask:
- Does the AI assess technique, or just final answers?
- Can teachers customise what the AI focuses on?
- Is AI feedback available in real-time, or only after submission?
- Can teachers override AI assessments?
3. Curriculum Alignment
This seems obvious, but many vendors sell products designed for different education systems. A platform built for American AP Chemistry won't map cleanly to GCSE or A-Level specifications.
Ask for a curriculum mapping document. Good vendors will show you exactly which required practicals their platform covers for your specific exam board. Great vendors will have worked with teachers who teach your curriculum.
Check whether the platform covers the required practicals that students must complete for their qualifications. In the UK, these are specified by exam boards and are non-negotiable for assessment.
4. Accessibility Features
Science education should be accessible to all students. This includes those with visual impairments, motor difficulties, or cognitive differences.
Look for:
- Screen reader compatibility: Can students navigate the interface using assistive technology?
- Keyboard navigation: Can all interactions be completed without a mouse?
- Colour contrast: Are visual elements distinguishable for students with colour vision deficiency?
- Adjustable pace: Can students slow down or pause simulations?
- Text scaling: Does the interface work with browser zoom and text enlargement?
Many schools have legal obligations under equality legislation. Beyond compliance, accessible design simply makes better software for everyone.
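Colour contrast, at least, is easy to verify yourself rather than take on trust: WCAG 2.x defines a contrast ratio from the relative luminance of two colours, and AA conformance requires at least 4.5:1 for normal text. A minimal sketch of that check:

```python
def _linearise(channel_255):
    """Convert an 8-bit sRGB channel to linear light, per WCAG 2.x."""
    c = channel_255 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearise(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b):
    """WCAG contrast ratio, always >= 1 (order of arguments doesn't matter)."""
    lighter, darker = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)),
        reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(contrast_ratio((255, 255, 255), (0, 0, 0)))        # black on white: 21.0
print(contrast_ratio((150, 150, 150), (255, 255, 255)))  # light grey on white: fails AA
```

Sampling a few screenshots from a vendor demo through a check like this takes minutes and reveals a lot about how seriously they take their accessibility documentation.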
5. Data Privacy and Security
After the PowerSchool breach in 2024 exposed millions of student records, data security should be at the top of your checklist. EdTech companies are attractive targets precisely because they hold sensitive information about children.
Key questions:
- Where is data stored? For UK schools, data should ideally stay within the UK or EU to comply with UK GDPR.
- Is data encrypted? Both in transit (HTTPS) and at rest (encrypted databases).
- What data is collected? Does the platform need to know students' names, or can it work with anonymous IDs?
- Is student data used for AI training? Many companies use customer data to train their models. This raises significant privacy concerns.
- What happens to data when you cancel? Can you request complete deletion?
Ask for the vendor's data processing agreement. If they don't have one ready, that's a red flag.
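On the "anonymous IDs" question above, it helps to know what a good answer looks like. One common pattern is pseudonymisation: the school derives a stable identifier from its own records with a key it never shares, so the vendor's platform sees consistent IDs but no real names. This sketch is illustrative; the key and function names are invented for the example.

```python
import hashlib
import hmac

# Illustrative key: in practice this stays on the school's side, never with the vendor.
SCHOOL_KEY = b"kept-by-the-school-never-shared"

def pseudonymous_id(student_record: str) -> str:
    """Stable, non-reversible identifier derived from a student record."""
    digest = hmac.new(SCHOOL_KEY, student_record.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# The same student always maps to the same ID, so progress tracking still works,
# but the ID itself reveals nothing without the school's key.
print(pseudonymous_id("Ada Lovelace, Year 10"))
```

A vendor who can explain a scheme like this clearly probably has thought about data minimisation; one who insists on full names and dates of birth probably has not.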
6. Teacher Customisation
No platform will perfectly match every teacher's approach. The question is: can you adapt it?
Look for tools that let teachers:
- Modify experiment parameters
- Create custom assessments
- Adjust difficulty levels for different classes
- Add their own instructions or scaffolding
- Design entirely new experiments (if they want to)
Research consistently shows that teacher autonomy correlates with both job satisfaction and student outcomes (Pearson & Moomaw, 2005). Software that forces teachers into rigid workflows undermines their professional expertise.
Questions to Ask Vendors
Beyond the features above, here are direct questions that reveal how a vendor really operates:
- "Can we trial the full platform with real students?" Demos are curated. You need to see how it works in actual classroom conditions.
- "What does onboarding look like?" Will teachers get training? Is there ongoing support?
- "What's your roadmap for the next year?" Is the product actively developed, or have they moved on to other projects?
- "Can we talk to other schools using your platform?" References matter. Ask specifically for schools similar to yours.
- "What happens if we have technical issues during an assessment?" Downtime during exams is catastrophic. What's their SLA?
- "How do you handle feature requests?" Will they listen to your teachers, or is feedback ignored?
Red Flags to Avoid
Walk away if you see these warning signs:
- No free trial. Legitimate vendors let you test before buying. If they won't, ask why.
- Long-term lock-in contracts. Be wary of multi-year agreements, especially for new products.
- Vague answers about data. If they can't clearly explain where your data goes, don't give them any.
- No references in your country. Educational contexts vary significantly between countries. A product that works in Texas may fail in Manchester.
- Promised features "coming soon." Buy what exists, not what's on a roadmap.
- Animations instead of simulations. Ask for a technical explanation of their physics engine. If they can't provide one, it's probably just videos.
- No accessibility documentation. If they haven't thought about accessibility, they haven't thought about your students.
Making the Decision
Once you've narrowed down your options, involve the people who'll actually use the software:
- Teachers: Have them run actual lessons with the trial platform. Their feedback is crucial.
- Students: Watch how students interact with the software. Are they engaged or frustrated?
- IT team: Can they support this platform? Does it integrate with your existing systems?
Don't rush. A bad choice will haunt you for years. A good choice will transform how your science department operates.
Why We Built WhimsyLabs the Way We Did
We designed WhimsyLabs to meet every criterion in this guide. Our physics engine runs real simulations, not animations. Students can make mistakes, explore, and learn from failure. Our AI tutor, WhimsyCat, provides real-time feedback on technique, not just answers. Teachers can customise experiments or build their own.
We're transparent about data: student information stays isolated per school, never used for AI training, fully GDPR compliant. We offer flexible contracts because we know schools need to evaluate before committing.
WhimsyLabs isn't the cheapest option on the market. But it's designed to actually work. And for schools with limited budgets, we actively support grant applications. Many UK schools have funded their subscriptions through Royal Society Partnership Grants and similar programmes.
If you're evaluating virtual lab software, we'd welcome the chance to show you how WhimsyLabs compares. Get in touch to arrange a demo with your science team.
References
- du Boulay, B. (2019). Escape from the Skinner Box: The case for contemporary intelligent learning environments. International Journal of Artificial Intelligence in Education, 29(4), 573-601.
- Finkelstein, N. D., Adams, W. K., Keller, C. J., Kohl, P. B., Perkins, K. K., Podolefsky, N. S., Reid, S., & LeMaster, R. (2005). When learning about the real world is better done virtually: A study of substituting computer simulations for laboratory equipment. Physical Review Special Topics - Physics Education Research, 1(1), 010103. https://doi.org/10.1103/PhysRevSTPER.1.010103
- Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education, 88(1), 28-54. https://doi.org/10.1002/sce.10106
- Pearson, L. C., & Moomaw, W. (2005). The relationship between teacher autonomy and stress, work satisfaction, empowerment, and professionalism. Educational Research Quarterly, 29(1), 37-53.
