Corporate IQ
Redesign project of Corporate IQ – a Gen AI-powered enterprise tool that generates answers for complex, large-scale bid proposal questions, with a focus on AI-human interaction design to enhance AI explainability, transparency, accuracy, and trust.

My Role
Lead UX/UI designer, user researcher
Timeline
3 months
The Team
1 lead UX designer/researcher (me), 2 AI engineers, 2 product managers, 1 frontend developer, 2 backend developers
Tools
Overview
The Solution - Corporate IQ
Corporate IQ is an AI-powered B2B enterprise tool designed to solve this problem. Using a trained AI model, it generates accurate, AI-driven answers for hundreds or even thousands of bid questions in seconds. Instead of manually working through spreadsheets, users can bulk-generate responses and collaborate seamlessly within the platform, improving efficiency, consistency, and accuracy across the bid response process.
The Problem - Bid Proposal Preparation
A bid proposal is a formal document companies submit to compete for contracts, projects, or business opportunities. In large enterprises, this process is highly complex, often requiring answers to hundreds or even thousands of customer-requested questions. In my organization, Sales, Bid Management, and other teams manually complete these extensive bid proposals by exporting question lists into Excel sheets and collaborating across multiple versions. This time-consuming workflow leads to inefficiencies, inconsistencies, and missed opportunities for automation.
What Makes This Redesign Project Unique?
A Deep Dive into AI-Human Interaction
This wasn’t just another UX redesign. Corporate IQ is a Gen AI-powered tool, and that meant my role wasn’t just about making screens look better—it was about going beyond traditional UX/UI and diving into AI-driven experiences, rethinking how humans interact with AI-generated content in a way that feels intuitive, trustworthy, and efficient.

Before I joined the team…
A Not-So-Successful MVP
Before I joined the Corporate IQ team, the product had already gone through an MVP launch. The MVP introduced two key features:
Quick Q&A – A chatbot-like module where users could input individual bid questions and receive AI-generated answers in real time.
Bid Q&A – A bulk-answering module that allowed users to upload large bid documents and generate AI-powered responses for multiple questions at once.
A Not-So-Good User Response
After the MVP launch, the team conducted internal user surveys and feedback sessions to assess how well Corporate IQ was performing. The results made it clear—the tool wasn’t meeting user expectations, and adoption was lower than anticipated.
<25%
Adoption rate
31%
User satisfaction rate
“The AI-generated answers are a good starting point, but right now, modifying responses feels restrictive.”

My mission in the redesign…
My Mission – A Redesign with Urgency and Strategy
When I joined the Corporate IQ team, it was clear that the MVP had fallen short of user expectations. The tool had potential, but low adoption rates, usability frustrations, and poor consideration of AI-human interaction were holding it back. The situation was urgent—the team needed immediate fixes to address critical UX issues while also laying the foundation for a long-term, scalable solution.
My Approach – A Three-Month Agile Roadmap
With only 3 months to deliver design solutions, I knew that strategic planning and problem-solving were key. The timeline was tight, the workload was intensive, and the expectations were high. To ensure we stayed on track, I quickly drafted a clear, structured roadmap, allowing us to prioritize, iterate, and execute efficiently without losing sight of the bigger picture.
My Core Focus – Enhancing AI-Human Interaction
Since Corporate IQ is an AI-powered tool, improving AI-human interaction was at the heart of the redesign. It wasn’t just about making the tool more usable—it was about building trust, transparency, and efficiency in how users interact with AI-generated answers. My main focus areas were:
AI Explainability & Transparency
Human Control & Customization
Feedback Loops & Continuous Learning

Step 1. UX Audit
I began with a comprehensive UX audit, evaluating the MVP to uncover navigation issues, usability gaps, and experience challenges. I aimed to pinpoint specific areas requiring improvement and gather valuable insights that would inform the redesign and optimization efforts.
Heuristic Analysis - Screen By Screen Evaluation
I conducted a UX audit of the Corporate IQ MVP using Nielsen Norman Group’s 10 Usability Heuristics to systematically evaluate navigation, AI-human interaction, transparency, and overall usability. Below is the detailed evaluation, highlighting strengths, weaknesses, and key problem areas that guided the redesign.
Key Problems Identified
I distilled the most critical UX issues that needed to be addressed:
Outdated UI & Lack of Brand Alignment
The interface felt outdated and did not align with the company’s branding.
Unintuitive Navigation & Poor Site Structure
The navigation was confusing, making it difficult for users to find key features and complete tasks.
Limited Human Control Over AI-Generated Content
Users had no easy way to edit, refine, or override AI responses, increasing friction in their workflow.
No Feedback Loops for AI-Generated Answers
The lack of a mechanism for users to provide feedback on AI content negatively impacted user trust.
Lack of AI Transparency & Explainability
Generated answers came with no references, leading to low trust in the AI and increased manual verification effort.

Step 2. User Research
Although the team had already gathered initial user feedback through surveys about the MVP, which gave me a broad understanding of user frustrations, it didn’t fully capture why these issues were happening or how users were interacting with AI in real workflows. To bridge this gap, I conducted one-on-one user interviews to gain a deeper, qualitative understanding of user behaviors, expectations, and challenges.
My Deeper Investigation Through User Interviews
My Goals
My goal was to go beyond surface-level feedback and uncover:
How users perceived AI-generated answers – Did they trust them? Did they find them useful?
Where AI was falling short – Were answers accurate? Did users struggle to refine them?
How users actually worked within the tool – What obstacles slowed them down? What workarounds were they using?
Participants
Analyzing My Notes

I discovered some new needs…
The Power of Real Conversations
After analyzing my notes from the user interviews, I found that all pain points identified in the UX audit were validated by real users. However, these deeper conversations also revealed new user needs that were not as apparent in the initial evaluation. In one-on-one conversations, users opened up about their frustrations and went deep into the issues they were facing, helping me see the gaps between how the product was designed and how people actually needed to use it.

Step 3. Define Design Goals
This phase wasn’t just about defining design goals—it was about working collaboratively with the team to balance urgent usability fixes with a long-term strategy that would make Corporate IQ a scalable, AI-powered tool for bid management. With a tight timeline and a growing list of critical UX and AI-human interaction issues, I needed to strategically prioritize problems to ensure we delivered meaningful improvements within the project’s scope.
Defining the Redesign Goals – A Collaborative Effort
Through a collaborative process with product managers, AI engineers, and stakeholders, we aligned on six key redesign goals—balancing quick usability wins with a long-term strategy for AI-driven workflows.
Refresh & Modernize UI
Revamping and modernizing the UI to align with company branding, ensuring a cohesive, professional look.
Improve Site Navigation
Optimizing the navigation structure to improve accessibility and workflow efficiency.
Enhance Human Control Over AI-Generated Content
Improving the AI answer editing experience, allowing users to modify, refine, and override AI-generated responses more efficiently.
Improve AI Customization with Prompt Engineering Capabilities
Enabling users to tailor AI-generated responses by creating custom prompts aligned with industry and business needs.
Enhance AI Accuracy with a Feedback Loop
Implementing a rating and feedback system where users can rate and comment on AI-generated answers.
Enhance AI Transparency with Referencing
Enhancing AI explainability and transparency by surfacing reference sources, allowing users to see how the AI generated an answer and verify its accuracy.

We left something in the backlog…
Making the Tough Calls
The need for collaboration features was raised during user research: users wanted a way to collaborate on the complex review process within the tool. However, after digging deeper, we realized this wasn’t just a simple feature—it was an entire workflow that would require extensive research, testing, and time. Given our three-month timeline, it became clear that forcing it in now would compromise the quality of everything else we were improving.
Instead of stretching our resources too thin, we made a strategic decision: focus on the critical usability and AI experience fixes first, then tackle collaboration in the next iteration. We documented the findings, added them to the backlog, and ensured the redesign would be scalable for future improvements.

Step 4. Ideate & Conceptualization
This phase involved ideating and conceptualizing potential solutions for the redesign. A significant part of my effort went into AI experience design, from researching and learning to implementing—a challenging yet valuable learning journey for me.
Ideate AI-related Enhancements
I started by focusing on one of the most critical aspects of the redesign—enhancing AI-human interaction. This included giving users more control over AI-generated content, improving AI accuracy through feedback loops, and increasing trust by making AI decision-making more transparent (aligned with design goals 3, 4, 5, and 6).
Researching Best Practices
Since designing AI-driven experiences comes with unique challenges, I started by researching how other AI tools approach these problems. My goal was to analyze best practices, identify gaps, and bring those learnings into Corporate IQ to create an experience that feels both powerful and user-friendly.
What Solutions Inspired Me
After exploring how other AI-powered tools handle AI-human interaction, I started connecting the dots. It helped me shape my own solutions to improve AI usability, control, and trust within the bid proposal workflow. Here’s how the insights I gathered inspired key design decisions in the redesign:
Bringing Structure to Navigation – A Scalable, Intuitive Solution
With a clear direction on AI-driven improvements, I shifted my focus to another critical pain point—site navigation. From both user research and the UX audit, it was clear that the existing navigation wasn’t intuitive, lacked structure, and made key features hard to find. Since this issue directly impacted usability and efficiency, fixing it was a top priority in the redesign.
Sketch New Sitemap
I first mapped out a new sitemap to get a clear, structured view of how users navigate across the platform. This helped me identify gaps in the old navigation, streamline access to key modules, and ensure a logical, scalable flow. By visualizing the hierarchy, I was able to design a navigation system that keeps users oriented, reduces friction, and supports future growth.
A Persistent Left Panel
My goal was to create a navigation framework that not only fixes current usability issues but also scales with future enhancements. By introducing a more structured, persistent navigation, I ensured that as Corporate IQ expands, new features can be seamlessly integrated without disrupting the user experience.

As the strategy took shape, we realized…
We Needed An Admin Interface
As we collaborated to ideate solutions for improving AI accuracy and interaction, something became clear: we needed an accessible interface for tracking ratings and feedback on AI-generated answers. Instead of relying on backend data exports or scattered reports, we needed a dedicated space in the tool where admins or super users (with role-based access) could monitor question inputs, user ratings, and feedback trends. Beyond solving an immediate need, this would also provide a scalable foundation for more advanced data tracking and AI monitoring in the future.
Prioritize & Define What To Design
Since this feature wasn’t in our original roadmap, we had to prioritize key functionality while keeping scope realistic. Rather than building a fully-featured admin suite upfront, we focused on what would make the biggest impact now while keeping the solution flexible for future expansion.
Total Questions Tracked – Monitor how many questions users are submitting to understand tool engagement.
Trending Questions & Frequency – Identify the most frequently asked questions, helping refine AI responses for commonly requested topics.
AI Answer Ratings & User Feedback – Track how users are rating AI-generated answers and analyze comments to pinpoint recurring issues or improvement areas.
Fast Wireframing As Blueprint
With a tight timeline and a focus on rapid iteration, I created wireframes as a rough guide to shape the prototype. Instead of perfecting every detail, my goal was to lay out key interactions, structure content placement, and ensure a smooth user flow before moving into high-fidelity design. These wireframes served as a foundation for testing and iteration, helping align the team on functionality without getting stuck on visuals.

Step 5. Design & Deliver
It’s finally the exciting time where all the ideas begin to take shape. This phase, where research meets design, is critical: it translates user insights and conceptual ideas into tangible design solutions.
Establishing the Visual Direction – UI Kit Creation
To ensure consistency, professionalism, and alignment with the company’s branding, I created a UI Kit that served as the foundation for the entire redesign.
Branding Alignment
I started by reviewing existing company design guidelines to ensure Corporate IQ felt integrated within the broader brand ecosystem.
Balancing Functionality & Aesthetics
While making the UI modern and sleek, I ensured that functionality remained the priority, keeping designs clean and focused on usability.
Collaboration & Iteration
I worked closely with stakeholders and engineers, iterating on component designs to ensure they were both visually refined and technically feasible.

I had limited time for testing…
Testing While Iterating – Designing in Real Time
In an ideal world, I would have had a dedicated phase for formal usability testing after completing my high-fidelity prototype. But in a fast-moving, agile team, waiting for a perfect moment wasn’t an option. Instead, I embraced a highly iterative, real-time feedback approach—testing while designing, refining while building.
A Collaborative Testing Approach
I made user feedback an integrated part of my design process. I presented my incremental progress daily to the product team, which included stakeholders, AI engineers, solution architects, and bid managers—actual end users of the tool. Instead of a separate usability study, they became my real-time testing participants, giving immediate feedback on how the AI-powered experience felt in action.
Delivered Solutions
See below how my delivered solutions directly addressed each of the defined goals, showcasing my ability to execute impactful UX strategies that drive both user satisfaction and business success.
Refresh & Modernize Interface
Improve Site Navigation
To address these issues, I redesigned the navigation system by introducing a persistent left panel menu, ensuring that users always had clear, structured access to all key sections of the platform.

Here comes the AI part…
Enhance Human Control Over AI-Generated Content
To empower users with control to edit, refine, or track modifications over AI-generated content, I introduced three major enhancements:
Improving AI Customization with Prompt Engineering Capabilities
To enhance AI accuracy customization and adaptability, I introduced Prompt Engineering Capabilities, allowing users to select, modify, and create their own AI prompts for more tailored responses.
Enhance AI Accuracy with a Feedback Loop
By integrating a structured feedback loop, Corporate IQ now ensures that user insights directly shape AI learning. This redesign not only enhances AI accuracy and relevance but also builds trust—users know their feedback is valued and contributes to a smarter, more reliable system.
By implementing a two-sided system—one for users to submit feedback and one for admins to track insights—this solution enables AI to evolve based on real-world input, improving accuracy and trust over time.
Enhance AI Transparency with Referencing
In the MVP version, users had no way to verify how AI-generated answers were constructed. To make AI responses more explainable, I designed a “Thought Process” feature, allowing users to see how the AI reached its answers.
By integrating the "Thought Process" feature, I transformed Corporate IQ into a more transparent and trustworthy tool. Users no longer have to take AI-generated answers at face value—they can validate, refine, and make informed decisions with confidence.

How did our solutions solve the problems…
My Dual Perspective Approach For Designing AI-Human Interaction
In my redesign of Corporate IQ, I focused on not just improving the user interface but fundamentally enhancing the AI-human relationship. Each feature was designed to improve how AI functions, while also ensuring it delivers meaningful impact on the human experience. Below is how each solution contributes from both AI and human perspectives: