Exploring where AI, wellbeing, and the future of mental health collide - and what that means for how we support, scale, and safeguard human care in a digital age.


FOLQ: The Voice-First Platform Revolutionizing Human Communication, Anytime, Anywhere
Human communication is naturally voice-based. We speak to share ideas, ask questions, and connect with others. Yet many digital platforms still rely heavily on text, which can feel slower and less natural. FOLQ changes this by putting voice first, creating a platform that fits how people really communicate. At the same time, it remains friendly to text users and keeps conversations private and accessible whenever you want…
Chris Edwards
Feb 28 · 2 min read


Taming the Monkey Mind: How FOLQ Can Help You Sleep Peacefully at 2AM
You lie awake at 2AM, your mind racing. Thoughts jump from one worry to the next, fears grow bigger, and your brain replays worst-case scenarios. This mental chatter is often called the "monkey mind": a restless, noisy mind that refuses to settle. It's a common experience, yet it can feel isolating and exhausting when sleep won't come. The good news is that this restlessness is normal, and there are simple ways to calm it down…
Chris Edwards
Feb 15 · 2 min read


Navigating Existential Crossroads and Midlife Crises with FOLQ.ai
Feeling stuck in your career? Facing a midlife crisis that clouds your sense of direction? You are not alone. Many people reach points in their lives where the path forward seems unclear or overwhelming. These moments can feel like roadblocks, but they also offer opportunities for growth and change. With the right tools and support, you can turn uncertainty into clarity. FOLQ.ai is designed to help you explore your options, understand your strengths, and make confident decisions…
Chris Edwards
Feb 2 · 3 min read


Designing Safer AI for Mental Wellbeing: Reducing Hallucinations and Sycophancy
Artificial intelligence is becoming a key tool in supporting mental wellbeing, offering personalized guidance and companionship. Yet AI systems can sometimes produce misleading or overly agreeable responses, which puts user safety and trust at risk. Designing AI that minimizes hallucinations (false or fabricated information) and sycophancy (excessive agreement or flattery) is essential to creating safer, more reliable mental health support…
Chris Edwards
Jan 28 · 2 min read


Podcast: Are AI Chatbots in Mental Health Support Truly Inclusive for Everyone?
AI chatbots are becoming a common tool in mental health support, offering users quick access to emotional help anytime. But are these digital helpers designed to serve everyone equally? In a recent podcast episode, host Chris Rhyss Edwards speaks with psychologist and researcher Dr. Gale Lucas about the blind spots in chatbot design that can leave many users feeling misunderstood or excluded. Listen on Spotify.
Chris Edwards
Sep 28, 2025 · 2 min read


Podcast: Exploring Synthetic Relationships and the Role of AI in Emotional Support
Loneliness affects millions worldwide, and many are turning to artificial intelligence for emotional support. What happens when AI becomes more than a tool and steps into the role of a confidant? Doctoral researcher Chris Rhyss Edwards explores this question in a revealing conversation with Dr. Rachel Wood, a therapist and cyberpsychology researcher. Their discussion sheds light on the rise of synthetic relationships, the reasons behind this growing trend, and the urgent need…
Chris Edwards
Sep 22, 2025 · 2 min read