
RealmIQ: SESSIONS
RealmIQ: SESSIONS is the podcast where we dive deep into the world of generative AI, cutting-edge news, and its impact on society and business culture. Listen in on conversations with leading AI experts from around the world. Our relationship with technology has undergone a captivating transformation. Machines have transcended the role of mere aides; they are now instrumental in fundamentally reshaping our cognitive processes. In this context, AI evolves beyond an intellectual collaborator; it becomes a catalyst for change. Hosted by Curt Doty, brand strategist, AI expert and AI evangelist.
RealmIQ: SESSIONS with CHRIS LI
This Realm IQ Sessions episode features Chris Li, a serial entrepreneur and educator, discussing mental health innovation for Gen Z. As the co-founder of Say!t Mental Health, Chris shares his journey from teaching at UCLA and managing billion-dollar portfolios to launching a nonprofit that fuses peer-led campus clubs with a gamified emotional check-in app.
Chris and host Curt Doty explore the role of ChatGPT in mental health, the emotional intelligence gap in AI, and how data from real human experiences—collected ethically—could be used to train AI with empathy and soul. The conversation also critiques traditional university mental health services and addresses the alarming rise in depression and suicide among youth aged 10–24. Say!t's mission is not only to provide access but to train AI to understand human emotion by prioritizing trust, transparency, and Gen Z leadership.
Topics Discussed
- Why Gen Z uses ChatGPT for emotional support
- Emotional Intelligence (EQ) vs IQ in the age of AI
- The dangers and limitations of AI as a mental health tool
- Peer-led mental health support vs traditional therapy
- Gen Z’s achievement pressure and emotional struggles
- Ethical and transparent data sharing for AI training
- Training LLMs to be emotionally intelligent (Theory of Mind)
- Opt-in privacy practices modeled after GDPR
- The role of Gen Z in shaping emotionally aware AI
- Challenges in mental health funding and access
- Critique of current mental health infrastructure on campuses
- Big Tech’s failure to build emotionally resonant AI
Quotes
“ChatGPT is a smart parrot. It has no EQ.”
— Chris Li
“The success of any technology is trust. It’s the only thing that matters when you're dealing with Gen Z.”
— Chris Li
“They don’t need another lecture. They need a safe place to be real.”
— Chris Li on peer-led clubs
“We’re not building an LLM. We’re building trust.”
— Chris Li
“Gen Z is the AI generation. It’s their responsibility to train it with empathy.”
— Chris Li
“The problem isn’t access. It’s that the services don’t speak Gen Z’s language.”
— Chris Li
“AI needs soul. And soul comes from real human data—not performative social media.”
— Curt Doty
“At age 15, kids are having panic attacks. That’s just not right.”
— Chris Li
“Let’s go beyond what’s legal and do what’s right.”
— Chris Li on data privacy
“We modeled our opt-in system after GDPR and Apple’s transparency. No hidden third parties.”
— Chris Li
Website:
Sponsor:
https://www.opus.pro/?via=b5cf75
Sponsor our show - https://www.realmiq.com/sponsors
Receive our weekly newsletter: Subscribe on LinkedIn https://www.linkedin.com/build-relation/newsletter-follow?entityUrn=7024758748661391360
Sign up for one of our workshops: https://www.realmiq.com/workshops
Are you an AI Founder? Learn about our AIccelerator: https://www.realmiq.com/startups
LinkedIn: https://www.linkedin.com/in/curtdoty/
Need branding or marketing? Visit: https://curtdoty.co/
Okay, so look, I've talked a lot about innovation for good on this show, but we've never had a guest focused on health, and specifically mental health, before. So today we will speak with Chris Li. He's a serial tech-for-good entrepreneur and the co-founder and executive director of Say!t Mental Health, a nonprofit that fuses peer-led campus clubs with a gamified app whose pilot cut Gen Z students' reported daily stress by 29%. Amazing stats. A five-time founder with two successful exits, Chris has spent the past decade teaching business management at UCLA while shaping the next wave of emotionally intelligent technology. Early in his career, he advised Fortune 500 leaders in Deloitte's strategy practice and managed multi-billion-dollar portfolios for leading Chinese investment funds.
Today, Chris channels that blend of business rigor, academic insight, and startup grit into one mission: training AI so digital tools can understand, support, and empower real human feelings, especially for the generation that needs it the most. So, let's listen in.
Welcome to the show, Chris. Good morning. Yeah, great to have you on the show. Thank you for having me. Yes, we've had some conversations, and I love what you're doing, but let's start with mental health and ChatGPT. There are numerous stories of people using it as an online therapist. What's the good and bad around that reliance and exposure?
You know, it's interesting. I read some reports on that as well, that many students, many Gen Z students, are turning to ChatGPT as a companion. And I got a chance to speak with some of the engineers from OpenAI.
And this is what they told me: this was a very pleasant surprise. They didn't plan for ChatGPT to be used this way. Right? Yeah. So here are my thoughts. On one hand, anytime a student, or anyone really, wants to take charge of their own mental health and do something about it, I think that's a really, really good thing.
And for me, the interesting question is, maybe we should ask: why are the students, why is Gen Z, doing this? I can think of a couple of reasons. One is that, hey, it's always there, right? It's on your phone, it's on your laptop, maybe it's on your desktop. So, it's accessible 24/7. It's very useful.
And for me, I think the biggest reason the students turn to it is that, unlike other social media platforms, ChatGPT is basically nonjudgmental. It just listens to you. Mm. And it strikes me that many students I speak to say basically the same thing: oftentimes they feel alone, and there's no one for them to talk to.
And currently, social media platforms are not really structured that way; the students don't feel comfortable sharing what they truly feel. They have to be performative, pretend to be someone else. So that really speaks to a lack of platforms for them to be who they really are. Hmm. Now, I am a big, big fan of peer support.
I think in most cases, having a friend, someone who has gone through a shared experience with you, is really, really meaningful, right? And look, we all experience some temporary distress from time to time, particularly when you're a student. Okay? You're going through a final, a midterm, you're dating someone, you break up, you're applying to college, you're applying for your internship, right?
All of this, yeah, causes some distress. So having a shared experience with a friend who goes through it with you, I think, is really powerful. Which brings up the negative for something like ChatGPT. ChatGPT is a smart parrot, but it has no EQ. It doesn't have human emotion, right? It hasn't gone through the same experiences that you have gone through.
Yeah, so that could be a problem, and it could be a really big problem, particularly if you are someone who might have a more serious mental health condition, a trauma case, maybe you have some negative thoughts about hurting yourself. Something like ChatGPT, without that emotional awareness, could actually be very harmful.
Yeah. Can you, for our listeners, can you tell us what EQ is? Well, EQ is basically what we call "emotional intelligence," right? So, there's IQ and EQ. In my experience as a teacher, I found that IQ is important, but perhaps EQ is a better indicator of how well you actually perform in life.
Because at the end of the day, most of us need to work with other people on a team, right? So, having that emotional awareness, knowing how to build rapport with people, is something that AI doesn't really have. AI is pure logic, right? It's really good at doing calculations. It's good at summarizing information, gathering information.
It doesn't really have that EQ component to understand who we really are. So, for instance, going back to your question: if you use ChatGPT as a therapist, ChatGPT will basically respond to exactly what you type. It will oftentimes miss the nuances of how you phrase certain things.
Right. Maybe you're being sarcastic. It doesn't know that. Right? Right. And maybe you're hinting, hey, I'm experiencing something really, really terrible. Again, without that EQ ability, it might not catch all those nuances the way we often do when we interact with each other. So, for instance, if I'm talking to students and I notice that a student is a little upset, maybe something is on their mind, I might speak a little bit slower, right?
Vice versa, if I notice that a student is really excited about something, I may speak a bit faster, I may speak a bit louder, just to make sure we capture the excitement of the moment. Yeah. Well, you speak to the texting generation, right? They're just used to communicating via words and text on their smartphones.
And there isn't emotion assigned to those words unless they put a fricking emoji next to it. And that is somehow supposed to sublimate some emotional context. But you know, this younger generation, they may not know how to speak emotionally and open up, because they're just used to texting everybody.
And certainly COVID, and the isolation of COVID, the environment they grew up in as children, had to have an effect on those communication skills, let alone their EQ, right? Not just the technology, but what is the EQ of this generation?
That's an interesting point, right? So, I think there's the assumption that, oh, okay, Gen Z, they are the digital generation, right? They all grew up with the iPhone and iPad, something that we didn't quite have when we were kids, right? So, no question that they are very technologically savvy, that they can find information.
They're really good at using that as a tool for all aspects of their lives. Like you mentioned, even communications with friends, right? The texting, the messaging. But you also point to the fact that, yeah, for them, sometimes texting's not enough. Right? Words are not enough. They do use emoji to express how they feel, and I actually think Gen Z are very emotional.
I call them basically hippies 2.0. They exhibit all the things that, well, I wasn't part of the hippie generation, but from what I've read and what I've seen, right, they are very emotional. They care about the world; they care about each other. They're very much into social causes. Right. And back to what you're saying about COVID:
It surprised me. I really thought that, oh, the kids would just love it, right? Going online for classes, they don't have to come to class. And for a while that was absolutely true, but after a while, some of the students actually told me, hey, Professor Chris, we miss coming to class. And what I realized was that, okay, they do want in-real-life interaction, no question about that.
But we have to give them a reason to come to class, right? If all I do is read a lecture from my lecture notes or from the books, they'd say, why should we come to class? I could just read it on my own. I don't need you to read it to me. So, they're very particular: they want positive, engaging interactions.
If we can provide that in real life, they want it just like anyone else. Yeah. Okay. So, I love your background, and you're a founder now of an AI company, and as with any startup, what problem are you trying to solve? And how did you arrive at your insights to create your platform?
Okay, so first of all, we don't call ourselves an AI company. That journey was kind of a byproduct of trying to understand what our users really need. We started out, and our main mission is, how do we create a better mental health platform to deliver services to Gen Z, to the students in high school, to the students in college?
That's what we set out to do. Along the way, we recognized that there are certain things we can provide, and when we provide those services... okay, so let me backtrack. Our team is all Gen Zs. Some of them are my current students, some of them my former students.
We came to the realization that what's lacking in mental health services for Gen Z is not so much access to services. There are plenty of services on campus, at the college level, even at the high school level. What's missing is that these services were oftentimes created by a different, older generation, and the way they deliver services doesn't really make sense.
They don't speak to the current Gen Z students right now. So, case in point: at UCLA, where I teach, the school's been around for a long time, right? Over a hundred years. It's one of the top public universities in the country. Mm-hmm. And we have over 50,000 students, undergrads, grads, and also professors on campus.
And the budget for UCLA, I kid you not, Curt, is $11 billion a year. So that's a lot of money. Right, right. So, you would imagine that with that many resources, with that kind of history, with that many students coming through for a hundred years, they would really understand student health and mental health services really well.
Well, no, not even close, right? If you want mental health counseling, you have to go to a sub-basement of this really old building. There are no windows, no natural light. I think it's very depressing. And you walk in, they give you this plastic card. The first thing it says on the card is: we need to verify your student eligibility.
So, think about it from the student's perspective, right? When you are a student, when you're Gen Z, when you're a teenager, asking for help is difficult. Even for us as adults, asking for help is always embarrassing, right? It's personal in nature. So, when someone walks down there, you need to recognize that and say, hey, you're so brave to come down here and actively seek help.
Welcome. How can we help you? Yes, I get that we are a university. Yes, I understand there are processes, but the verification of your status can be done later on. Let's get them comfortable first. Right. So, when we did a little study, we sat in the clinic and watched how many students actually come down, and by the time they receive the card, most of them just walk away, like, oh my gosh, these people don't get me.
Right. So, our team and I decided, hey, the way to do this is: can we redesign the delivery of the services? And the way to do this, what we need, is an app, because that's what they really want, right? Just like with ChatGPT, if something is available to them 24/7, they feel comfortable using it. So, one of the things we do on the app is ask them how they feel every day.
A very sophisticated emotional check-in to check the vibe. And from there, we encourage them to express the feeling, however good or bad. Okay? We're not trying to fix them. They can do it in different creative ways: maybe through art, maybe writing a story, maybe even suggesting a song that they could relate to their emotion.
So, we allow them the freedom to really freely express who they are and to share that expression, maybe comment on how other people do it. And then during the day, when things slow down a little bit, we ask, hey, maybe you want to reflect upon your experience. So, during this process, we came to understand: oh my gosh, we are capturing a lot of information about mental health, about emotion.
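To make the check-in flow Chris describes concrete, here is a purely hypothetical sketch of what a daily emotional check-in record might capture: a mood rating, a free-form creative expression, and a later reflection. The class and field names are illustrative assumptions, not Say!t's actual data model.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical sketch of a daily emotional check-in record.
# All names here are illustrative, not Say!t's actual schema.

@dataclass
class CheckIn:
    user_id: str
    day: date
    mood: int                          # e.g. 1 (very low) .. 5 (very good)
    expression: str = ""               # free-form: a story, artwork, song pick
    reflection: Optional[str] = None   # filled in later in the day

    def is_complete(self) -> bool:
        # A check-in counts as complete once the reflection is added.
        return self.reflection is not None

def weekly_mood_trend(checkins: list[CheckIn]) -> float:
    """Average mood over a set of check-ins; useful for spotting dips."""
    if not checkins:
        raise ValueError("need at least one check-in")
    return sum(c.mood for c in checkins) / len(checkins)

week = [
    CheckIn("u1", date(2025, 8, 4), mood=2, expression="wrote a short story"),
    CheckIn("u1", date(2025, 8, 5), mood=3, expression="shared a song"),
    CheckIn("u1", date(2025, 8, 6), mood=4, reflection="midterm went okay"),
]
print(weekly_mood_trend(week))  # 3.0
```

Even a simple aggregate like this shows why the data is valuable: it captures emotional state over time rather than a single performative snapshot.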
And that started us on a different journey: maybe this information could be very useful to train AI. Because AI is a big topic, isn't it? I think it's a huge technological advance that is going to stay with us. It's not a fad. It's going to be here for a long time. Maybe we can use the information we are gathering to train AI to be more empathetic.
Hmm. So that's our journey. Okay. Yeah. So that's the problem you're solving. The problem we're trying to solve. And look, I think that many researchers much smarter than us, and many folks around the world, are all looking at this issue, right? And we all know that if you look at the evolution of AI, this is where they want to go.
So, in the industry, they refer to this as theory of mind, ToM. And theory of mind is basically a psychological term that asks: can you relate to another person? Can I step into your shoes and understand how you're feeling? Yeah, so the goal, the objective, is that, hey, if we want AI to be truly useful, beyond just being a very smart parrot that mimics what we say or summarizes what we said, we need to add this emotional layer to it.
So, the LLMs, the large language models that we have right now, are really good at the language component, right? And many of them have a really strong logical-layer component as well. But what's missing is this emotional layer. So, along with us, many, many smart researchers around the world are all trying to figure out how we can get information to train AI.
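To picture what "adding an emotional layer" can look like in practice: one common, general approach (an assumption on my part, not necessarily what Say!t or its partners would do) is supervised fine-tuning on emotion-annotated examples, often stored as JSON-lines records. The schema below is a hypothetical sketch.

```python
import json

# Hypothetical emotion-annotated training examples for supervised
# fine-tuning. The field names are illustrative; real pipelines vary.
examples = [
    {
        "input": "I guess the midterm went *great*... I'm done with this class.",
        "emotion": "frustration",   # annotated by a human rater
        "cues": ["sarcasm"],        # the nuance a plain LLM often misses
        "target": "That sounds rough. Do you want to talk about what happened?",
    },
    {
        "input": "I got the internship!!",
        "emotion": "excitement",
        "cues": ["exclamation"],
        "target": "Congratulations! How are you going to celebrate?",
    },
]

# Serialize to JSONL, a common interchange format for fine-tuning sets.
jsonl = "\n".join(json.dumps(e) for e in examples)
print(jsonl.count("\n") + 1)  # 2 records
```

The point of the annotations is exactly what the conversation describes: pairing language with labeled emotional context, including cues like sarcasm, which text alone does not carry.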
To be smarter, so it will be more useful and beneficial to us. Yes. So that's the second problem you're solving. The first problem was giving access, right? For troubled college-age students, let's say, to seek therapy in a way that's responsive and accessible. The second problem you just described is trying to train AI, the LLMs, to be more empathetic and create an emotional layer.
Yes, yes. And, you know, that's not really talked about a lot. When people talk about AI, they talk about AGI, they talk about reasoning, they talk about it being smarter than humans, doing human tasks better than humans do. But there's never talk about the emotional component.
And myself, I'm a creative using AI, and, you know, from the creative community, creativity is something that's very soulful and very human, humanistic. And so the critique is that a lot of the AI, certainly the imagery that's coming out, lacks authenticity. It doesn't have soul, right?
That's been the criticism: lack of soul. Art is soul. Music is soul. Yes, absolutely. Yeah. And so that's been a big critique, but what you're saying is we need to train it not only to think better than a human, but also to feel. Curt, I think I do agree with that now.
There are talks about AGI, right? And part of AGI does involve the emotional layer. And remember that AI as we know it right now, the agents, okay, the ChatGPTs of the world, the Claudes of the world, is relatively new, right? It's only been around for a few years. The foundational technologies that got us to this point have been around much, much longer.
It started almost at the very beginning of the PC era, right? So, people have been building upon the technology for a long, long time. All these agents are built on what we call LLMs, large language models. The first challenge was that, for the longest time, when we talked to the machine, the machine didn't understand us.
Like the name implies: language, right? So, the first step was, okay, let's make sure that it understands us. So now, when we talk to ChatGPT, ChatGPT is actually pretty smart, right? In many ways it's much smarter than Siri. Sometimes Siri doesn't get what we're asking, but ChatGPT does, right? So, language has been the focus, and the way we've been doing that is basically scraping all the information from the internet, text-based, right? And we just run statistical models: okay, what does it really mean in context? I think we got that really, really well.
And after that, the next challenge was, okay, now that we can summarize, can we help the user do certain things? We need certain logic for how things are done, right? They've been working really hard on that. Again, it's getting smarter and smarter over time. The next challenge, to really get to what we call AGI, artificial general intelligence, is the emotional layer. The challenge for these companies is this:
It's like. There is a lack of emotional data available for them to get. Now they scrape all the information from internet already, right? There are very few things for them to Scrape, and also internet Doesn't really have emotional information right now. You may challenge me by send, Hey, what about social media platform?
There's a lot of stuff out there, right? Absolutely. But remember that, many studies have shown that most of us, many of us, okay, we are in a way lying our asses off. Can I say that? I'm sorry that we're being performative right on social media. So, the information you get may not be real. Right, and often, and also that there's an issue with the information you get on social media tend to be typically a snapshot in time that, okay, there's this moment, this person saying this, you really don't know what led to that before or what's after, right?
There's no institutional data available, so that's why has been a challenge for all these companies, and they recognize that, how do we get stated? And also, the other challenge is that we as user. Particular Gen Zs, we are getting smarter, we are more protective about our data or privacy than ever. Right?
Yeah. They don't trust a lot of the big tech companies. Right. But they don't want to freely give information away knowing that, hey, for many tech companies, the model is, is about that. They want to suck out data so they can sell more advertising. Yeah. It's not, it's very rare about using data. Don't to really help us.
I don't think it's advertising. I think it's about training. It could be about advertising, but, you know, there's no advertising on ChatGPT. No, but what I'm talking about is social media, right, and maybe search engines. So, we've been trained, and Gen Z in particular, to be very careful now about how we share our data, especially with big tech companies, knowing that the trend has been for them to suck up the data, maybe to train, but more often than not to sell us something.
To sell our information. So, they're very protective of that information. Sure. So that makes it a really big challenge for big tech companies: how do you get this raw information? And I think that we are unique in the sense that we didn't start out as a way to collect information. We started out asking, how can we help the students?
Right? How can we create something for them? Getting information is just a byproduct. And this is something from my own family upbringing. My parents taught me that, hey, you know what? You have to give more than what you're getting, right? And I think it's so true today, if you want to be a consumer-facing entity, an app or a company, right?
You have to give more than what you're getting, or else you're not going to build trust with your target consumers, because we've all been burned before. Yeah. Right. Now, building trust, that's a very good point, because the success of any technology is trust. Absolutely. Right. I always use the analogy, or metaphor, of the light switch.
I trust that pretty much every time I flip the light switch, it's going to turn the light on. And I don't know how it works, right? I don't know AC, DC, currents, okay. But I trust that it works, and I like the results. Right? Light. You can see the light on my face. It's like, yeah, I love light.
You remind me of the quote from The Matrix, right? Remember, they talk about when they go back to Zion: yeah, we don't know how the machine works, but we trust that we turn it on and there's water and filtration and air. Yep. I get you. That's a whole other world. Yeah. So, okay. Let's just telescope a little into trust, because, okay.
With your platform and the technology that you're building, you're building these relationships. You're building, ostensibly, conversations. Yeah, that's a good way to look at it. Yes. And they're emotional conversations. I mean, they're the real, tough conversations. It isn't performative, as in social media.
It feels like, I would imagine, this person, the student, the subject, is having a real conversation, because they're seeking help, and your technology is responding in a way that gives them help and advice and is empathetic. Meanwhile, all this is being documented, recorded, as data.
We are the data. And you're saying that with that data you're able to train. Is it your LLM, or are you built on big tech's LLMs? And who are you training, really? Okay, so let me backtrack just one step. With our platform, we are not therapists; none of us are really trained medical professionals.
So, our goal is not to give advice, okay? What we do is create a platform to really encourage peer support. Back to what I was saying earlier: I strongly believe in peer support. We believe that using peer support is a great way to basically teach certain skills to the students, for them to really manage their own mental health.
So again, we don't direct the conversation; we simply allow them to discover their own path, their own community. Right? And yes, we do moderate the conversation to make sure it is appropriate, but we let them take their own steps. We are not creating our own LLM. That is a very resource-expensive, time-consuming endeavor, and we are a small little nonprofit. We don't have billions of dollars behind us to build another LLM. And it does cost a lot of money, right? A lot of GPU expenses and a lot of smart engineers; we don't have that. So, our goal is basically to make our data set available to certain partners, if they meet our criteria, for them to use our data to train their LLMs. Okay? And our criteria are reasonably simple. We want to partner with AI companies with the intent of using our information, using our emotion data set, to train the LLMs to be more empathetic, not using our data set for them to commercialize, to sell more stuff to the users.
Right, right. Yes. Okay, I get that. So, you have a certain standard or requirement for any collaboration with big tech in order to release or provide access to share? Yeah. We want to be the company that sets the high bar, because again, look, we are dealing with students who oftentimes are underage, right?
And second, we are talking about mental health information. Okay? Very personal, okay? And we have to treat it with the highest regard possible. And the way I encourage the team to look at it: I told my team, I said, guys, we have to go beyond what's required by the law.
We have to do what is right. Yeah. So, for us, for instance, we all agree that we hate the current practice of opting out. It's a messy process. It's hard to opt out sometimes. We want everything to be opt-in, and we want to be very transparent: okay, what data do you allow us to share, and with whom are we sharing it?
A big pet peeve we have: we hate these terms like "third parties" or "affiliates," right? We don't know what that means. Who are they? I think a lot of companies hide behind those terms. So, we don't do that. Look, we want to be very transparent, very clear, that if we share information, if you allow us to do that, this is the specific company.
This is how this specific company will use the information. That's fantastic. I love that transparency. So literally, in your terms and conditions, opting in, you would say: do you want to share with OpenAI? Do you want to share with Google? Do you want to share with Grok or Llama? You would have those choices.
So, we would explain to our users in very understandable terms: hey, we'd like to ask if you'd be interested in sharing the data. First of all, we have to say why we need to share the data. Now again, like I mentioned before, Gen Z is the digital generation, and maybe the new term should be that they are the AI generation.
Right? And I think that maybe it's their duty and responsibility to really train AI's EQ, so that it will be better not only for their generation, but for the rest of humanity. Hmm. They are very in tune with technology. They grew up with it. They have a very clear understanding of what they want the technology to do for them, and how it can benefit everyone.
So, we want to align our mission with their interests, right? So, the first step is making sure we say why we are doing this. And then we give them options: okay, what information would they like to allow us to share? Again, sharing is not digital, not all-or-nothing, right? To me it's more analog. It's a spectrum.
Because we're collecting so much different information, we give the user the opportunity to decide what they want to share and what they don't want to share. And finally, we let them know: okay, if you agree, these are the partners. I don't know yet whether we are going to have five different partners or maybe one partner, but what we do know is that, hey, we want to be very specific and name who the partner is we are sharing with.
And this is what they're doing with your data. Yeah. So, we actually model this after the EU standard, the General Data Protection Regulation. And this is also something Apple does if you participate in Apple research on the phone: they are very clear and transparent as to how your data will be shared, how it will be used, and what you agreed to share and not to share.
Yeah. Okay, I get it. So, that's the GDPR, right? Yes, GDPR. Yeah, okay, I get that. Now, it looks like we're headed towards no governance of AI in the United States here, yet the EU AI Act literally just went into effect this past weekend. And it's August 2025 here.
You know, no one's really talking about the implications of that as it relates to these international companies, which are big tech: if they want to operate in the EU, they need to comply with its regulations. Yes, Meta was the only one that didn't comply with anything. Most of the others complied with at least half, if not all, of the requirements, which is good.
Are you taking into consideration those new laws as you're developing your permissions and/or transparency? Well, again, I think obviously we need to be in compliance, right? But our standard is that, hey, we sometimes need to do more than what the law requires; let's do what's right. And I really believe that what we do actually exceeds the compliance requirements.
Yeah. What it means in real terms is that, yeah, it does cost us a little bit more from a resource standpoint. Mm-hmm. But if you build it in with that mindset going in as a startup, it's manageable. And again, I do believe that trust is paramount. It's the only thing that matters when you are dealing with Gen Z, and we have to do everything within our capabilities to make sure that we don't lose that trust.
Yeah. Okay. Speaking of trust, what is some of the top-line data around college-age depression? Oh gosh. Okay. Because that really sets the table in terms of the problem you're solving, versus just access on a college campus. What is the real data around the real problem, the real numbers?
Curt, you hit it right on the head. This is really why we started this journey to create this nonprofit for students, right? The way I look at the data is not just at college, but also before college and after college. So, we are looking at the age group 10 to 24. In the US, 10 to 24, which is mostly Gen Z and some Gen Alpha, is a big number: 70 million.
Globally the number is close to almost 2 billion, right? So, it's a large segment of the population, something that we need to address. And that generation will ultimately inherit this planet, right? In 5, 10, 15 years they'd be us; they'd be running the planet.
So, it is imperative that we help support their mental health, and mental health has been a big issue, one of the biggest issues. We talk about depression not only in college but also, I think, starting in high school. School has become very, very competitive over time, and once again, careers are getting very, very competitive.
So, case in point: at UCLA last year, in 2024, there were over 150,000 applications for roughly 13,000 undergraduate spots, not counting the professional schools. That's less than 9%. What it means is that, hey, even if you have a 4.0 GPA, A's in all your classes, your application will merely be looked at.
Yeah. But there's no guarantee of admission anymore. Right. And a place like USC, okay, even though it's super expensive, we're talking about 92,000 bucks for one year of tuition, right? 88,000 people apply for roughly maybe 9,000 spots. Again, about 10%, right? So, it's really competitive. And what I also notice,
something we don't really talk a lot about, is this so-called achievement pressure. You see that a lot in middle-class, maybe upper-middle-class families, mm-hmm, and also in high-performance high schools. Right? So, there's this heightened expectation that the kids have to do really well, have to be almost perfect.
In everything they do, they have to get perfect grades, perfect test scores. They have to have perfect extracurricular activities just so that they can be competitive. Yeah. To get into the right college and then maybe get the right job. Right. So, a lot of people say that, oh, you know, achievement pressure is a good thing.
Right? So, you've probably heard the term "tiger mom" in Asian culture. Right? Well, I don't think any kind of undue pressure is good, right? Because it has been shown to lead to anxiety and depression. And the studies we look into, from the CDC and also the Substance Abuse and Mental Health Services Administration under the health department, show that 42% of people aged 10 to 24 feel hopeless.
They are feeling strong anxiety about their life. A lot of that has to do with academic achievement pressure, right? And if you look further, almost one third of that age group has exhibited signs of depression already. One terrible statistic is that nowadays some kids are having panic attacks at the age of 15.
Look, when do we have panic attacks, right? Earlier this morning, before I started the podcast with you, my very first podcast, I went, oh my gosh, what am I going to do? Right? I'm having my panic attack. But kids are now having them at age 15. To me, that's unreasonable. Okay? They should be having the time of their life.
And in the US, again for ages 10 to 24, suicide is now the number two cause of death. So, all this is just so much pressure. Yeah, and then you add in the cultural component, right? In a lot of cultures, myself included coming from an immigrant family, there's heightened achievement pressure, and on top of that, in certain cultures, like Chinese and Chinese American culture, mental health is taboo.
You don't talk about it with your parents. You don't discuss it when you meet the family. It's just, no, no, you just suck it up. Right? So, all this essentially creates a really negative environment for students, both in high school trying to get into college and in college trying to do really well so you can get ready for the job market.
Yeah. Yeah. And that job market is also looking more competitive now. Wow. Yeah. So, here's another interesting point I want to share. At UCLA we have this really amazing program called the Sharpe Fellowship. What it does is allow first-year freshmen at UCLA to apply for roughly 120 spots.
And if you get in, we pair you with an alum who's done really well in maybe investment banking, finance, technology, or management consulting, right? Essentially the student will shadow the alum, him or her, for the next two, maybe three years, getting a lot of hands-on information and mentoring on how to succeed in an industry.
So, about 120 spots, but almost 5,000 kids applied last year. Oh, wow. All these kids applying already have a 3.9 GPA at UCLA in their first year. Yeah. We have to turn away so many qualified applicants. Again, it just speaks to the pressure: you get into college, then you start the next boss fight. You're already thinking ahead about how to get into the job market, and I just feel terrible that they're missing out on maybe the best time of their lives.
Right, because they are under this constant pressure to perform, to get ready. Yeah. And the services available to them are not speaking to them. That's why I said, look, there's still a way we can help them. Yeah. Yeah. No, I love what you're doing. I talk a lot about innovation for good, and you are doing that. You have the goal, you have your mission, and it is based on things other than, you know, greed and money and some type of race for a feature set, which is what big tech is:
who's going to win that feature-set war? That has nothing to do with the real needs, solving the real problems of humanity. And you're tackling that. I truly appreciate that. Well, thank you. I guess maybe that's the optimist in me, right? I do believe in the good of humanity despite some of the things I see from day to day.
Right. I do believe that internally we are good, given the chance. And I'm just so happy that we have a group of amazing, amazing team members, Gen Z students, some of them actually in high school. Okay, some of them are just sophomores in high school. They see this as an important mission, and they all want to participate and do good to make it better for their generation and for everyone else around them.
Yeah, and you speak to that generation being the key to our future, because they're the ones that could and should be training the AI to develop that emotional layer. It's up to them, not just us old guys. Well, I'm speaking for myself, but I do have hope in the next generation, and I'm glad you're enabling it: empowering them, employing them. They're equally part of the mission, as is
the technology you're using to achieve your goals. So, kudos to you. Thank you. Please talk about the name of your platform, how people can find it, and any other things you want to plug here as we're getting to the end of our podcast. Okay. The name of our organization is Say!t Mental Health; you can find us at sayitmentalhealth.org.
We are a 501(c)(3) nonprofit. That's really how we started. As we mentioned, we have a beta app; we're finalizing the version that should come out early next year, and we have an amazing team working on that. And in addition to the app, we also believe, like I mentioned before, in human interaction.
So, we have peer-backed campus clubs. We ran a couple of beta sites over the last couple of years at two universities locally here in Los Angeles and two high schools. This year we are going to expand to about 15 different locations, and hopefully, with your support and others' support, we can expand to over 100 campuses on the West Coast next year.
So, the peer-led campus club is basically a trust circle. No white coats, okay, no lecturing. We model it after mutual-benefit organizations such as AA, Alcoholics Anonymous, and OA, Overeaters Anonymous. I look at what they have done, I have seen some of my friends go through those programs, and I think those are amazing programs that have helped so many people.
So, we try to model ourselves after that, to provide the level of peer support that I think is so crucial in any journey, particularly your mental health journey. Finding the right support, the right support buddy, and the right community makes a big, big difference. Yeah. So where are you in your funding rounds? As you mentioned, you're a startup.
We are always raising funds. It's a challenge, no question about that. We are fortunate that we have a couple of corporate sponsors already, Adobe among them. Thank you very much; we appreciate it. But we continue to reach out to other foundations and other corporations, because this is an important mission.
It is an important mission, and we just hope we have the resources we need to get to where we want to go and to really help as many students as possible. Okay. Well, that is awesome. Okay, we're going to wrap up our show here. Chris, it was so fantastic to have you on. Thank you. You're my first guest that's really focused on healthcare, let alone mental health, and it's a big issue.
Yeah, it is a big issue. And hopefully a lot of people will see this show, adopt your platform when it launches, and at least track your success. We're wishing you all the best in the world. And thanks to all of you for tuning in; catch more of our RealmIQ: SESSIONS on your favorite podcast platforms.
And please follow and smash that subscribe button. You can also follow us on TikTok, LinkedIn, and now Bluesky. Thanks, Chris. Take care. Thank you, listeners. Thank you so much. Hope to hear from you all. Bye-bye.