7+ years designing content strategies that bridge complex systems and human needs
Hi, I'm Richard Walls. I'm a Content Designer who transforms complex enterprise systems into intuitive user experiences. My journey began deep in the technical trenches, documenting intricate application server integrations and system architectures. This foundation gave me something unique: the ability to truly understand the complexity users face and design content that bridges that gap.
Over those 7+ years, I've evolved from explaining how systems work to designing how users experience them. I specialize in information architecture optimization and user-centered content strategies that drive engagement with enterprise AI software. My technical communication background, combined with practical application of privacy and cybersecurity principles, enables me to create secure, ethical, and human-centric solutions.
What sets me apart is technical empathy: I've wrestled with the same integration challenges developers face, and that experience informs every design decision I make today. Whether I'm simplifying onboarding for an AI governance platform or creating contextual guidance for generative AI evaluations, I draw on that deep systems knowledge to anticipate user needs and remove friction before it happens.
I create content strategies that don't just inform: they accelerate adoption, reduce support burden, and deliver measurable business outcomes through operational efficiency.
Challenge: The onboarding process for an AI governance platform was complex and overwhelming for new users, leading to confusion and slower adoption.
Approach: Audited existing user flows and conducted user interviews to design a guided, step-by-step onboarding experience with contextual help, tooltips, and clear, accessible language.
Outcome: Onboarding time for new users dropped noticeably, setup-related support tickets decreased, user confidence grew, and platform adoption accelerated.
Challenge: AI developers and validators needed a way to efficiently compare and evaluate multiple AI assets (such as prompt templates) to select the best one for their use case. The manual process was slow and error-prone.
Approach: Co-designed the end-to-end experience for configuring and running AI asset comparisons, developing intuitive in-product language and guidance that enabled users to evaluate multiple assets simultaneously.
Outcome: Users were able to evaluate and compare multiple prompt templates more efficiently, saving significant time and accelerating AI development cycles.
Challenge: The organization lacked any standardized process for evaluating prompt templates and generative AI models. Users had no systematic way to assess the quality, performance, or effectiveness of their prompts, leading to inconsistent results and limited confidence in AI-generated outputs.
Approach: Developed a comprehensive evaluation framework from the ground up, including a research-based methodology, plain-language documentation, and end-to-end workflows. Created accessible tools and guidance that enabled both technical and non-technical users to assess prompt performance systematically, establishing institutional knowledge where none had existed.
Outcome: Launched the organization's first systematic approach to prompt template evaluation, enabling data-driven decision-making. Expanded platform accessibility to non-expert users, established a scalable foundation for ongoing optimization, and significantly reduced user confusion through clear, structured evaluation pathways.
Ready to discuss your next content design project? I'd love to hear about your challenges and explore how we can create user-centered solutions together.