This framework provides a step-by-step approach to conducting AI-assisted evaluation using the 10 Usability Heuristics by NN Group. With ready-to-use prompts, it helps designers and researchers identify usability issues early in the design process. The method is suitable for mobile, desktop, and web platforms and can be integrated into design reviews, educational workshops, or self-guided learning.
Before a usability test or a review session, you can grab 3–5 screens from a single flow, run them through one of Nielsen’s 10 usability heuristics, and get instant feedback on:
- What’s working well
- Where the design might trip up users
- Actionable suggestions to make it better
- You don’t need to study all 10 heuristics at once - just pick one at a time
- Each heuristic comes with 4 easy, high-impact questions - no overwhelming checklists
- We’ve got ready-to-use prompts for the AI - just drop in your screens and the right heuristic, and it will walk through the review for you.
- Faster iterations - Catch the main usability issues before usability testing
- Better decisions - You’ll see exactly which parts of your flow might confuse or slow down users.
- Continuous learning - The more you use it, the more you’ll spot patterns in your own designs.
Think of it as a design health check - quick, painless, and you walk away knowing exactly what to fix and what to keep.
1. Choose your flow
- The best approach is to select 3–5 sequential screens for each evaluation
- If you have more than 5 screens, split them into logical chunks
- More than 5 screens leads to too much information per prompt, which can cause mistakes or shallow responses
- Take a screenshot of the selected screen(s) and download it
- We recommend combining your screens into a single screenshot, because you can attach only one image in RBI Chat
2. Submit to RBI Chat
- Write a short description of the selected flow (e.g. “this is the KYC flow from onboarding into the daily banking app”)
- Upload your screenshot
- Paste your selected prompt from the templates below
- Repeat for the next heuristic until the set is complete
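A complete submission for one heuristic might look like this (the flow description is illustrative, replace it with your own):
- Description: “This is the KYC flow of a daily banking app, from account creation to first login (4 screens).”
- Attachment: one screenshot containing all 4 screens
- Prompt: the “Visibility of System Status” template from the list below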
3. Review and act
- Group AI suggestions into Quick Wins (easy fixes) and Larger Improvements (require more design effort)
- Decide what to implement in the next iteration
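For example, feedback from a Visibility of System Status review might be grouped like this (the findings are illustrative, not from a real review):
- Quick Wins: add a “Step 2 of 4” label to the progress indicator; show a confirmation message after a document is uploaded
- Larger Improvements: redesign the rejected-document state so users can re-submit without restarting the whole flow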
We recommend going heuristic by heuristic: for AI evaluation of screens, it’s usually best to submit one heuristic at a time rather than dumping all 10 at once (an optional automation sketch follows this list). Here’s why:
- Better focus: When you submit one heuristic at a time, the AI can go deep on that specific principle without mixing issues from different categories
- More actionable feedback: Suggestions will be directly tied to each heuristic, and you can fix problems in a structured way
- Easier to digest: A huge list of mixed issues from all heuristics can be overwhelming and hard to act on. Smaller chunks are easier to review and prioritize
- Fewer false negatives: Going one heuristic at a time avoids skipping subtle problems that might be overshadowed when evaluating everything together
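If your team has API access to a vision-capable model outside RBI Chat and wants to batch this heuristic-by-heuristic loop, a minimal sketch is below. Everything in it is an assumption rather than part of this framework: it presumes the `openai` Python package, an OpenAI-compatible endpoint with a vision model, and that you fill `HEURISTIC_PROMPTS` with the ten templates from this page.

```python
# Minimal sketch: run one screenshot through the heuristic prompts, one request per heuristic.
# Assumptions: `pip install openai`, OPENAI_API_KEY is set, and the chosen model accepts images.
import base64
from openai import OpenAI

client = OpenAI()

# Fill this with the ten templates from this page (one entry per heuristic).
HEURISTIC_PROMPTS = {
    "Visibility of System Status": "Evaluate the following screens for Visibility of System Status using these questions: ...",
    # ...remaining nine heuristics
}

def review_flow(screenshot_path: str, flow_description: str) -> dict[str, str]:
    """Send the screenshot and flow description through every heuristic prompt."""
    with open(screenshot_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    results = {}
    for heuristic, prompt in HEURISTIC_PROMPTS.items():
        response = client.chat.completions.create(
            model="gpt-4o",  # assumption: any vision-capable model you have access to
            messages=[{
                "role": "user",
                "content": [
                    {"type": "text", "text": f"{flow_description}\n\n{prompt}"},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
                ],
            }],
        )
        results[heuristic] = response.choices[0].message.content
    return results
```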
You can also use this approach for your competitor analysis.
- Collect the competitor’s screens
- Copy the same prompts
You will gather “how they do it” knowledge from competitor screens as inspiration for your new project.
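For example, you could run a competitor’s onboarding screens through the same “Consistency and Standards” template, prefixing it with one line of context such as “These are 4 screens from a competitor’s loan application flow” (the wording is a suggestion, adjust it to your case).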
1) Visibility of System Status
Evaluate the following screens for Visibility of System Status using these questions:
- Is the system’s current state clear to the user?
- Is there meaningful feedback when a task is complete?
- Is the user aware of their position or progress in a multi-step process?
- Are feedback messages timely and relevant?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
2) Match Between System and Real World
Evaluate the following screens for Match Between System and Real World using these questions:
- Does the system use familiar language, concepts, and metaphors?
- Are actions and labels consistent with real-world meaning and user expectations?
- Are menus, navigation, and content organized in a natural way?
- Do features and actions match the user’s context and goals?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
3) User Control and Freedom
Evaluate the following screens for User Control and Freedom using these questions:
- Can users easily undo, redo, or cancel actions — especially irreversible ones?
- Can users navigate freely without losing progress or data?
- Are action consequences clear before the user commits?
- Does the system protect users from losing work?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
4) Consistency and Standards
Evaluate the following screens for Consistency and Standards using these questions:
- Are interface elements and patterns used consistently across screens?
- Are terms, labels, and actions named consistently throughout the system?
- Does the design follow established conventions and domain standards?
- Is navigation clear, predictable, and consistent?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
5) Error Prevention
Evaluate the following screens for Error Prevention using these questions:
- Are risky or irreversible actions clearly confirmed before execution?
- Does the system prevent errors through constraints, defaults, and guidance?
- Is input validation immediate and clear?
- Do error and prompt messages help the user recover?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
6) Recognition Rather Than Recall
Evaluate the following screens for Recognition Rather Than Recall using these questions:
- Does the interface make available actions and options clearly visible?
- Is information organized to reduce memory load?
- Does the system leverage history, context, or personalization to help users?
- Is data displayed where it can be acted upon or edited directly?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
7) Flexibility and Efficiency of Use
Evaluate the following screens for Flexibility and Efficiency of Use using these questions:
- Does the system remember user preferences, settings, and recent activity?
- Are common and likely actions easy and fast to perform?
- Does the interface anticipate and support the user’s next likely step?
- Can users save progress and return later without losing work?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
8) Aesthetic and Minimalist Design
Evaluate the following screens for Aesthetic and Minimalist Design using these questions:
- Is all content and functionality essential, without unnecessary elements or distractions?
- Is the layout clean, scannable, and visually clear?
- Are text, icons, and visuals clear and easily understood?
- Do visuals and copy support the user’s goals?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
9) Help Users Recognize, Diagnose, and Recover from Errors
Evaluate the following screens for Help Users Recognize, Diagnose, and Recover from Errors using these questions:
- Are error messages clear, specific, and neutral in tone?
- Do error messages explain how to fix the problem?
- Are errors shown in the right place without removing correct input?
- Can users easily recover from errors?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.
10) Help and Documentation
Evaluate the following screens for Help and Documentation using these questions:
- Is help easy to find and access without disrupting the task?
- Is help content relevant, clear, and task-focused?
- Is help available in context when needed?
- Are alternative support channels clearly provided?
I need you to
- Answer each question with Yes / No / Partially and a short explanation.
- Identify the most critical issues first.
- Provide specific, actionable suggestions for improving the screen based on the identified issues.