As AI technology rapidly evolves, new tools emerge weekly — but how do you know which ones are truly useful for your design needs?
Enter UX+AI, a platform designed to help you discover and learn about AI tools for UX design.
Role: UX Designer
Skills/Tools: User testing, UX/UI
As an IxDA volunteer, I helped transform the UX+AI initiative from a concept into a fully implemented platform. This involved conducting market research, rapid prototyping, and user testing. Through user interviews and tests, we identified the UX community's needs for finding UX AI tools, which guided the development of the final UX+AI website.
With new AI tools emerging every week, it’s hard for UX designers to know which ones are genuinely useful. Many websites showcase the latest AI tools, but they’re often broad, unorganized, and lack real insights from UX professionals. UX+AI changes that by creating a community-driven platform where designers can discover, rank, and share their experiences with AI tools. By leveraging collective insights, we help UX designers find tried-and-tested tools that actually enhance their workflow—so they can spend less time experimenting and more time designing.
To better understand the landscape of AI tools for UX design, we analyzed three existing platforms that help users explore AI or design-related tools. Each had strengths, but none fully addressed the need for a well-organized, community-driven resource tailored to UX designers.
There's An AI For That
✅ Strengths: A vast database covering AI tools across various industries.
⚠️ Limitations: The interface is cluttered, making it hard to navigate. Tools lack meaningful reviews, and rankings are influenced by paid promotions, making it difficult to determine a tool’s true usefulness.
Product Hunt
✅ Strengths: Well-organized categories for AI and design tools, with an easy-to-navigate UI.
⚠️ Limitations: No dedicated section for AI tools specifically for UX design. Reviews can be biased, as product developers often invite users to leave feedback, potentially skewing ratings.
AI for UX Professionals: A Resource Directory
✅ Strengths: A great curated database focused specifically on design-related AI tools. The categorization and tagging of tools are very useful.
⚠️ Limitations: Lacks filtering options, making it difficult to find relevant tools. No community-driven insights—there are no reviews or rankings to indicate the best tools. The Airtable platform also does not support discussion interactions.
While existing platforms offered large databases or design-focused tools, none provided a structured, unbiased, and community-driven way for UX designers to discover and evaluate AI tools. We saw an opportunity to create a platform that not only organizes AI tools effectively but also ensures transparency through community engagement and real user insights.
Since this project focused on implementing a pre-defined platform concept, we began by researching effective ways to manage the website and its community, drawing comparisons from platforms with a strong community-contributed aspect like Product Hunt, Wikipedia, and There’s An AI For That.
While aiming for a self-sustaining system, maintaining consistent content quality was our top priority. Since most content would be user-generated, we recognized the need for human moderation to keep spam and low-quality submissions out. My research revealed that all successful platforms had quality control measures, from dedicated review teams to community moderators. As a small team, we decided to manually review submissions initially, with plans to involve community moderators as we grow.
We anticipated three types of user scenarios:
1. “Read-only” users who browse, then leave to try an AI tool
2. Users who browse and leave reviews
3. Users who want to suggest an AI tool
I created user flows based on these scenarios to identify the most important features and webpages for the user's experience on our platform, such as an informative homepage listing the top ten AI tools, a product page, and ways to submit tools and leave reviews.
User flow for "read-only" users that only browse the site then leave to try new AI tools.
With key features in mind, we created low-fidelity wireframes to organize each webpage. I wanted a straightforward search experience that helps users find the right AI tool quickly, so I opted for natural-language, form-like dropdown filters to prompt users to start their search, with categories for the design stage they were in and the design activity they were doing, plus an option to sort by name, rating, and more.
My initial design for the homepage with a simple filter search to start the search process.
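The dropdown filters described above map naturally onto a small filter-and-sort step. Below is a minimal sketch of that idea; the record shape and field names (`stage`, `activity`, `rating`) are illustrative assumptions, not the platform's actual schema:

```typescript
// Illustrative tool record; field names are assumptions, not the real schema.
interface AiTool {
  name: string;
  stage: string;     // design stage, e.g. "research", "ideation", "testing"
  activity: string;  // design activity, e.g. "user interviews"
  rating: number;    // average community rating, 0–5
}

type SortKey = "name" | "rating";

// Apply the dropdown selections as filters, then sort the results.
function searchTools(
  tools: AiTool[],
  filters: { stage?: string; activity?: string },
  sortBy: SortKey = "rating"
): AiTool[] {
  return tools
    .filter(t => !filters.stage || t.stage === filters.stage)
    .filter(t => !filters.activity || t.activity === filters.activity)
    .sort((a, b) =>
      sortBy === "name" ? a.name.localeCompare(b.name) : b.rating - a.rating
    );
}
```

Leaving a filter undefined means "any", which mirrors how a user could skip a dropdown and still get useful results.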
After a team discussion, we decided to merge features from my colleague's and my designs to create our first prototype in Balsamiq. We had two homepage versions: one with the natural-language dropdown filters, and another with a GPT-style search input.
We designed our first user test with exploratory questions to learn how people discover new AI tools and what motivates them to leave reviews, and conducted an A/B test of the two homepages. Participants were given a scenario of looking for AI tools for user interviews and provided feedback through a think-aloud process.
[A/B Test] Left - Homepage "A" with dropdown filter search; Right - Homepage "B" with GPT search bar.
💡Our biggest finding was that most people very rarely or never leave reviews on other platforms.💡
Unless they had an exceptional experience, were offered monetary or other special rewards, or were helping a friend out, writing a review was too much effort.
But community contribution was the very basis of how our platform would work; we were mostly relying on their altruism to help inform each other.
Now then, how can we motivate them to leave reviews? Do we want the rating system at all then? But if there’s no community to provide ratings, how would people know which tools are the best?
🤔
We explored alternative ways to share top AI tools beyond a website, considering a browser extension with a community-shared bookmark list. However, tracking tool popularity through bookmarks proved to be technically challenging with our resources.
While a curated bookmark list by our team seemed more feasible, it would require significant effort to maintain and to build credibility, as curators need to be knowledgeable and reputable to gain trust from the community. We decided to forego this direction due to the operational challenges.
A sample of the public bookmark list concept where the community can suggest new UX AI tools and it'll be shown as a compiled list with descriptive tags.
Another design direction was to create a GPT to automatically compile reviews and ratings into a top ten list, eliminating the need for user accounts and traditional reviews. The GPT would summarize user feedback for each tool and streamline new tool submissions, automating many processes. Of course, the team would still handle quality control to ensure accuracy.
Design Direction 2 - using ChatGPT's capabilities to help designers search for the right tools, and to help gather and summarize reviews to users.
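The summarization step in Design Direction 2 can be pictured as a simple prompt-assembly stage. The actual GPT configuration isn't part of this write-up, so the sketch below only shows how per-tool feedback might be packaged for a language model; the function name and wording are hypothetical:

```typescript
// Hypothetical: bundle a tool's raw user feedback into a summarization prompt.
// Sending this prompt to an actual language model is out of scope here.
function buildSummaryPrompt(toolName: string, feedback: string[]): string {
  const numbered = feedback.map((f, i) => `${i + 1}. ${f}`).join("\n");
  return [
    `Summarize the following user feedback for the UX AI tool "${toolName}".`,
    `Highlight recurring strengths and weaknesses in 2-3 sentences.`,
    ``,
    numbered,
  ].join("\n");
}
```

Keeping the prompt construction separate from the model call also makes the team's manual quality control easier, since the exact input to the model is inspectable.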
A third direction was an updated interface of our first website prototype, taking out the filtering options on the side and simplifying it further. We also included an AI search bar for prompt-type searches.
Design Direction 3 - an updated UI of the first website prototype, with a table view to display more tools and an AI search bar.
With new prototypes of a GPT version and an iteration of our initial website, we conducted a second A/B test similar to the first one. We also focused our questions on how to motivate users to contribute to our platform.
Our initial idea of using GPT as an informative way to explore AI tools didn't pan out as expected. Key insights were:
Users preferred a simple, straightforward search with easily skimmable information. All participants favored the website with filter search over the GPT version.
We quickly returned to the website prototype, refining it to make tool information more accessible at a glance. We then conducted our final user test to finalize the design.
In the third test, the prototype featured only rating stars as a low-effort way for users to contribute, but participants struggled to interpret these ratings without knowing why a tool was rated well or poorly. This highlighted the need for comments alongside ratings to add depth to reviews, so we reimplemented comments.
In prototype 3, we displayed the tool's description as a popup for quick viewing and only included rating stars as a low-effort way for users to determine the usefulness of a tool, and for us to collect reviews.
Concerned about community engagement, we asked participants whether they preferred a curated top ten AI tools list or community reviews. We found they trusted the community and people from their own network (e.g., their LinkedIn connections) more, so we decided to keep LinkedIn login and LinkedIn-verified reviews.
Users need comments to provide depth in reviews and tend to trust the opinions of their network of peers more compared to strangers online. This made us reimplement comments and LinkedIn login for review verification.
So how does the platform help users stay up to date with the ever-growing number of AI tools for UX design?
From user test 2, people preferred a simple, user-friendly platform that enables quick tool discovery and presents clear, easily digestible information at a glance.
The final platform refines the initial website prototype, showcasing numerous UX AI tools in a grid layout with feature tags for easy skimming and comparison. Users can conduct a broad search using filters or find specific tools with the search bar.
For those who want to find the best options without extensive research, a top ten list on the side allows them to find the most popular tools as decided by community ratings.
Final homepage with an overview of AI tools available and a popular tools list.
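Deriving the top ten list from community ratings can be as simple as averaging each tool's star ratings and keeping the ten highest. A minimal sketch of that aggregation, with hypothetical names:

```typescript
// Hypothetical review record: a tool name and a 1–5 star rating.
interface Review {
  tool: string;
  stars: number;
}

// Average the star ratings per tool and return the ten best-rated tools.
function topTen(reviews: Review[]): { tool: string; avg: number }[] {
  const sums = new Map<string, { total: number; count: number }>();
  for (const r of reviews) {
    const s = sums.get(r.tool) ?? { total: 0, count: 0 };
    s.total += r.stars;
    s.count += 1;
    sums.set(r.tool, s);
  }
  return [...sums.entries()]
    .map(([tool, s]) => ({ tool, avg: s.total / s.count }))
    .sort((a, b) => b.avg - a.avg)
    .slice(0, 10);
}
```

A real ranking would likely also weight by review count so a single five-star review doesn't outrank a well-reviewed tool, but the plain average shows the idea.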
From user tests 1 and 3, most people rarely or never leave reviews on other platforms, highlighting the need for a low-effort feedback system. However, they value comments for adding depth to reviews and trust opinions from their peers more than from strangers online.
We kept the comments. But to encourage reviews, we made the review button more prominent and added a popup message on the tool's product page, prompting users to leave a review if they've used it before. Review verification with LinkedIn was also kept to help users identify credible opinions from their network.
Tool description page with a popup message to prompt users to review the tool if they've used it before.
With this platform, the design community can stay informed of new AI tools and collaboratively discover the best ones for their UX workflow.
We are currently building the platform at ux-ai.org and are excited to launch it soon.
The rapid prototyping and testing process taught me a lot about community-driven platforms. I realized that building them is not easy, as they depend on a strong sense of community. It made me appreciate platforms like Reddit, where users feel a sense of belonging, can share their passion for a specific area of interest, and want to engage with others.
This project came full circle, which isn't unusual, since design processes are often unpredictable. Learning to pivot quickly when things aren't working was crucial to avoid wasting time and effort on something unsuited to our goals and users. Instead, it's better to keep an open mind and focus on finding the appropriate solution.
We’d love for all designers to check out our UX+AI platform! We hope it helps with your UX design process and that you’ll join our community to share and discover the best UX AI tools together.