Modern Mantra Podcast

An Evolution of Emotional Intelligence: AI for Mental Health with Darius Mora of Vitality Therapeutics

May 05, 2023 Nick Sarafa, Elijah Johnston & Darius Mora Season 3 Episode 11

Darius Mora, CEO of Vitality Therapeutics, channels learnings from his personal healing journey to develop AI technology that can diagnose mental health conditions, giving individuals sovereignty over their personal growth.

What happens as we begin optimising our wellbeing with the aid of technology?

Speaking to the dangers of ‘living life like a self-help book’, Darius recounts his own story of burning out despite hitting all the markers of a healthy and mindful lifestyle.

Given how unique and subjective our lived experience and learnt responses are, providing advice and emotional support to others becomes a delicate act.

With a growing demand for therapy and mental health care, how can we harness artificial intelligence to support the emotional evolution of humankind?

Darius and Nick unpack the responsibility that comes with integrating AI into the mental health sphere. They touch on the importance of ethical boundaries, and of awareness around our human, emotional fragility.

Finally, Darius lets us in on the profound realisations he’s had on his own path to growth, integrity and deepened connection.

Darius Mora is a dedicated advocate for mental health, and CEO of Vitality Therapeutics. His company focuses on creating groundbreaking technology that can diagnose depression through voice analysis alone. Driven by personal experience, Darius became deeply invested in mental health after facing a significant burnout episode. Through his own healing journey, he discovered the powerful potential of combining cutting-edge technology with time-honored wisdom to transform the way we approach mental well-being. Darius firmly believes that only when we are becoming the best versions of ourselves can we really start serving the people around us and contribute our gifts.

Website: https://www.dariusmora.com/
Instagram: https://www.instagram.com/feeling.vitality/
LinkedIn: https://www.linkedin.com/company/vitality-therapeutics/

Show notes

  • 11:58 The perils of advice-giving: navigating our individuated mind maps
  • 17:20 Ethical boundaries and a safety net for AI
  • 24:43 Is emotional attachment to digital spirits inevitable?
  • 31:15 An emotional coach chatbot for mental health diagnosis
  • 44:37 Burnout epiphany: zooming out to appreciate the macros of life
  • 51:50 Integrity, change, and healthy attachment in relationships
  • 1:01:46 What’s your biggest vector for personal growth?
  • 1:05:23 Combatting avoidance tactics with shadow work

Thanks for listening! Follow us on YouTube, Instagram, TikTok or find us on LinkedIn! Join the ModernMantra.co mailing list here.

Nick: Yo Darius, welcome, brother. It's great to have you on today. I've been hovering around you for a couple of years now. We've now entered into the rise of artificial intelligence, and you were the person in my life that was years ahead of everybody else, leaning into this and seeing the possibilities of what is now happening in the world.

I'd love for you to share your journey into this technology and what you are doing with it.

Darius: Right. Thank you so much for having me in the first place. I appreciate it. It's funny you say that, cause I feel like we were already too late. We started properly playing around with GPT-3 a year ago. We'd thought about it here and there, but the real commitment was a year ago, and I already felt too late. I was like, oh, this is gonna change everything. We need to do everything we can to integrate all these technologies right now. Yeah.

Nick: Yeah, I mean, you were ahead of everybody with this game. And even on the subject of feeling like you were too late, I listened to a podcast with Marc Andreessen, the man who co-created the Mosaic web browser, and he describes feeling like he was too late when he made the web browser, back in the early nineties. So he had a lot of reflections around that feeling of everything moving too quickly and just playing catch-up. But you were truly the first person I knew of looking very closely at any of this technology and integrating it into the mental health space.

You wanna elaborate a bit on how you got into that, and your journey into what you're doing now?

Darius: Sure. So actually, the very first toying around with AI, for me, it was not in mental health, it was actually in college. My current co-founder, Lubo, and I built, or tried to build, an AI-powered language-learning app.

So the idea was, you would learn 10 words a day, and we would build an AI chatbot that would only use the words you already know. So if you know 50 words, you'll have a conversation with those 50 words with this chatbot. If you know a hundred, you'll use a hundred. So that was the pitch. We worked on it for three years, and it was a lot of fun.

We did the whole thing. We got into an incubator in San Francisco, spent time in Silicon Valley. It was incredible fun. But after three years, we still couldn't make it work properly. Which is funny, because a few weekends ago we did a weekend hackathon for fun, and we built the thing in one weekend using GPT-3.5, or 4, whatever it was.

So the fact that it took us three years then and we couldn't complete it, and now it takes a weekend, just tells you the pace of acceleration of the technology. Mind-boggling.

Nick: And you know, with this accelerated pace of development, I'm beginning to see software no longer as features and functionalities in this ever-expanding base. I'm seeing a future with AI where all software becomes one-time use. One-time creations, throwaways, right? Where we're going to be able to describe to an artificial intelligence what we want the software to do, have it spin up the application, and just throw it away like you would a napkin. And just like you said, it took three years to develop that piece of software, and you just did it in a weekend with the acceleration of AI. What does that mean for software development in general and everything that's going on right now?

Darius: Yeah. I actually literally just got shivers when you said that, cuz I think about the future of work a lot. It's so hard for me to imagine what the future is gonna look like.

I mean, right now, arguably, writing code is one of the most valuable professions on the planet, and that's already partially taken away by ChatGPT. A month ago, when GPT-3.5 came out, I don't write code, but I went to GPT-3.5 through ChatGPT and said, teach me how to create a website. Like, what do I need? And it says, look, you need HTML, CSS, JavaScript, and hosting. And I was like, okay, give me the HTML, CSS, and JavaScript for a breath app. And I don't wanna set a timer; I want an expanding circle to animate the breathing. And it spit out the code.

It showed me step by step how to actually put it into a functioning website, how to host it, and how to launch it. And that was during my lunch break, right? And it worked. To me, that is mind-boggling. It's a superpower. And that's for code, and there's copywriting, and DALL-E for images, and all that. It's so hard for me to picture what it's going to look like.

I love the metaphor you mentioned of, kind of like a napkin, a throwaway, because it'll take no time and cost no money to build software. And so when you can get code as easily as you can get water right now, what is the world gonna look like? It's fascinating.

Nick: It is. And you know, the question then becomes, what does the world look like when everybody is a software engineer?

Darius: Right. So actually, I was thinking about this and talking about this before AI came out. Because if you think about it, 10 or 20 years ago, when tech was becoming an industry, it was incredibly hard, and the most expensive thing was to get code and write code, and you had to do all the hardware yourself, set up the servers and everything.

And each year, writing code became cheaper and easier, and marketing became more difficult and more important. And this has been a trend that's been going on for the last 20 years. Now, with AI, it's exponential. And I think that's the case where, with more code, you'll have more companies, more products, more websites, more apps.

And as a result, it'll be more and more difficult to differentiate yourself and become the one. So branding and marketing and, hopefully, some other core ideas will be more important than the actual writing. It's like book publishing and writing: now anybody can write a book, and publishing is about standing out. That's what really matters.

Nick: Yeah, the distribution's going to be everything. Having the ability to distribute things and having an audience, especially when content becomes so cheap to create. I mean, we're edging on video generation apps now; we'll be able to describe what video we want to create, and they'll make it automatically.

And you mentioned everyone's ability to publish a book. You recently published a book that was, correct me if I'm wrong, completely written by ChatGPT, on Amazon. What was your journey into that project, and what are your reflections from going through that process?

Darius: The idea there was to demonstrate that people have no idea what's coming. People have no idea how powerful this is already, let alone how powerful it'll be in six months or a year. And so I wanted to demonstrate to as wide an audience as possible what impact it has already now.

And so I sat down with ChatGPT and I said, I have an idea for a book, a book about how AI is affecting mental health. Give me a title. It wrote the title. Then I said, okay, now write the outline and the chapters. And it did. And I proceeded, just prompted. I said, okay, now write the first three paragraphs of the first chapter.

And it did. And actually, even before that, I asked it to look at the bestselling nonfiction books in AI and mental health: how many chapters do they have, how many characters, and all of that. And so I got the structure of a bestseller and then just recreated it and went prompt by prompt, cause there's a character limit on ChatGPT.

So I had to go prompt by prompt, and it wrote literally the entire book. I don't think I wrote a single sentence in that entire thing. It was all just prompting.

Nick: And how long did that take you?

Darius: It was about 90 minutes of work. To write an entire book and publish it, including the publishing process and getting the artwork done by DALL-E.

Then it took two days for Amazon to review and approve, which is interesting. They reviewed it and approved it, and I clearly stated that this is written by ChatGPT, I'm not the author, all of this. And I say this transparently, on the cover and also inside. And so Amazon reviewed it in two days, and now you can buy the hardcover on Amazon right now.

Nick: And have you sold any copies?

Darius: I don't know. I haven't checked the stats. I think my mom bought one. Moms are the best.

Nick: Shout out to all the moms liking the Instagrams and buying the shit that nobody else wants. So you went from zero to published author in almost two hours? And was that the first book you've ever published?

Darius: Actually, I wrote a book eight years ago called Thoughts to the Younger Self. I was in my early twenties, and I just spoke to a bunch of people that I thought were interesting, and I asked them what advice they would give to the 20-year-old version of themselves.

And I spoke with politicians, and a one-handed motocross racer, and a stripper, people where I was like, that's an interesting journey you took in your life, what advice would you give yourself? And so the entire book is a collection of these, you know, pieces of advice. So that was the previous book, the one that took me three years to write.

Mm-hmm.

15:44 The perils of advice-giving: navigating our unique mind maps

Nick: So let me ask you something. What advice would you give your 20-year-old self?

Darius: The ultimate question. I wouldn't give myself any advice, because I love everything that happened in my life, leading to the man that I am now. And I'm very happy with my life the way it turned out, including all the pain and suffering that I had to go through. I had to go through it.

So I wouldn't want to go back and change it. However, if you ask me what advice I would give to the 20-year-old sister that I have, that's a different question. And I'm not quite sure. I don't think I'm in a position to give advice. I'm very skeptical of that.

Nick: You're skeptical of your own advice? Cause I look up to you. I've seen the transformation that you've gone through over the last couple of years, hovering around one another, and I would go as far as to say that you'd be an incredible mentor for any 20-year-old who'd be willing to listen to you.

Darius: Right. Thank you, brother. I appreciate it. I'm not skeptical of my own advice, but skeptical of advice in general. You know, our mental models are so different. I think we all have very common needs, but our maps of the world are so different, and you just don't know what the other person's map of the world is. And I think it's impossible to find out, cuz they're just so vast.

And so what might be really solid advice for me, because I know what my mental models are, might actually be hurting the other person. Taking that into consideration, when people ask me for advice, I do of course try to help and try to give them advice, but I always tell them to take it with a grain of salt, because it worked for my mental models and mind maps; that doesn't mean it'll work for others.

Nick: You know, that's quite interesting that you say that. I was listening to a podcast with Sam Altman, the founder of OpenAI, and he's a guy who's basically responsible for this massive innovation in artificial intelligence. And he actually mentions on the podcast that it's irresponsible to take advice. He actually got to where he got because he didn't take any advice from fucking anybody. Right? He was developing his own mental map of the world. So with this sort of approach to giving advice, being the founder of a company that's focused on analysis, and this sort of feedback loop in order to bring people into a certain state, how has your approach to advice informed what you're doing with your company?

Darius: It's a very good question, a very thoughtful question. I think we need to distinguish between two different types of advice. If somebody comes to me now and asks, how do I set up a performance marketing campaign on Instagram? I can tell you step by step, cause I've done it before. That's one kind of advice. But if somebody asks, what should I do to be happier? You could share what you've gone through that helped you, but again, it doesn't mean that's gonna help the other person.

And so I think we need to consider what this person is asking for. I'm happy to relay my experience, but past performance doesn't guarantee future results, whatever the financial disclaimer is. So, in terms of how I think about advice right now, let me just state: the new company we're working on is called Vitality Therapeutics.

The end objective is to be able to diagnose depression and other mental health issues from objective biomarkers. So from your voice, from your video data, from your device data. Of course, with your permission: if that's what you want, we can do that. And we know the technology already works now, so that's the end game. It'll take lots of research, lots of capital, and lots of time to get there.

For us, the first step is a consumer app we developed, which is kind of an audio AI journal. So you record your voice, how you feel, what happened in your day, and we help you, guide you, prompt you. And then we use AI to extract insights.

So we do psychoanalysis and cognitive behavioral therapy, and we can ask you questions that maybe will make you think about a situation from a different angle, or something you haven't considered, kind of what a therapist does. What you brought up is a very good point, because obviously this is a very sensitive topic and a sensitive issue.

And we need to be extremely careful about what we tell people, so we never tell them what to do. It's more about asking the right questions that'll help you think about things in maybe a different way, a different light. And it's another form of feedback loop. I'm wearing the Oura ring, and I can either wake up in the morning, see a bad score for my sleep, and feel bad and go into this shitty mode, because I'm gonna have a bad day because I didn't sleep well and now I have the data, blah blah. Or I can look at a bad score from my Oura ring and be like, okay, not the best, but what can I do? Can I maybe exercise a little bit to bring up energy? Can I do something else?

So it's a lot about how we use the feedback we're provided. It is something that we're working on and figuring out now, at a small scale, before it obviously goes big.

21:20 Ethical Boundaries and a Safety Net for AI

Nick: And you know, this is such a wild, wild west to be exploring. Even trying to create these boundaries around what we call an AI safety net, right? Because an AI can answer anything, any question that you give to it. A lot of people believe the response is being generated anyway, but a lot of the responses are just not given, right? So you're trying to find the boundaries within which the AI will respond. And I actually kind of like the approach that you just presented, where maybe the right boundary is presenting more questions rather than presenting answers.

You know, maybe leaning down that Socratic sort of path might be the most ethical first application of artificial intelligence, rather than trying to answer every single question, cause that's what these things are currently doing. And it's gonna be a really wacky game, especially with people in an emotionally sensitive space, to help navigate them without that human touch.

Darius: Yeah. I mean, I think this is it. AI is such a fascinating topic from the technical perspective of development, but the ethics and philosophy are fascinating too, right? The whole thing of why Elon left OpenAI when it was a nonprofit and did his own thing. I think that's very fascinating. And it's just a train that's taken off, and now we can't stop it. Government regulations are not gonna slow it down, cause they'll just move to another country, or it'll be under the radar. You can't stop it at this point.

And actually, you know, whatever you do, if you do it at a big enough scale, somebody's gonna commit suicide, right? The last company that I helped create, Reflectly, was an AI journaling app. So you write how you feel, and then we help you reflect and give you questions. That scaled to 20 million users. It had a massive net positive impact, but some people don't feel very good afterwards, right? At 20 million, it's just inevitable at scale. We're observing cases now where people have committed suicide after chatting with ChatGPT.

And so the question is, is it the fault of ChatGPT, which might have, you know, validated suicidal thoughts, for example? That's possible. Or is it the fact that if you are in a bad mental state, whatever tool you use may not directly be responsible for the outcome? Like, if I am suicidal and I drink this tea that you just poured, and I commit suicide afterwards, is the tea the issue? The tea might have triggered some thoughts, but is it the fault of the cup, and of you?

That's also fascinating when you do things at scale, right? It's inevitable that people will get hurt. And sometimes there's a philosophical question of, should you continue working on this? If you help 20 million people, but one person gets worse, is it worth doing? Where is that number that's okay? Everybody has a certain number in mind, but we don't talk about it.

Nick: No. And also, is there necessarily a cause and effect? I know exactly which case you're talking about. Funny enough, it got brought up at dinner last night, so I researched it before this interview.

There's a case of a man in Belgium who had an AI girlfriend through an app called Chai, and they were having a conversation, and the app allegedly, and this is all hearsay through Vice News and other sources, informed the man that his family was, quote unquote, already dead, and that they would be together in an afterlife.

Right. So there's this feeling of this man, and obviously we don't know what position he was in, that felt this emotional connection to the bot. And there was a sense of trust that was built between him and this thing that he felt was conscious, in a way. And he trusted its input and its feedback, and that then led to suicide.

And now all the conversations have become public, the company is under scrutiny, and the question is, is it the fault of the company? Is it the fault of the large language model? And for that company at scale, a couple million users, this story was inevitably going to drop.

It's like, what do we do? And obviously there's probably millions upon millions of instances of this technology doing great things, but that's where the attention goes. And that was the story shared at dinner last night, the one case of one man committing suicide and it being blamed on the bots, right?

Darius: Yeah, I mean, this is tough, you know, and it's horrible when those things happen. I had a conversation with a friend who's also working on an AI mental health solution, and he was saying, as an engineer, I'm not qualified to decide who gets to use it, or to what extent we go. He's like, I don't feel qualified to do it, and I don't think any therapist is qualified either. You know? Like, who decides? This is really difficult when this kind of stuff happens, and I think that's an important conversation to continue having. Like that movie Her, have you seen that movie?

Nick: One of my favorite movies of all time.

Darius: Right? Yeah. So I think emotional attachment is the problem, and we already have a lot of that. When we stop looking at AI as a tool, as a hammer, as a cup, as a computer, and start creating emotional attachment. Now, what makes me very angry is that there are some companies that are trying to monetize that. And obviously it's the vulnerable people who don't have a human connection that are going to try to attach to whatever else is available. And these apps might be available, and you just pay, you know, a 60-bucks-a-year subscription, and you have your girlfriend. There's this one specific company, I can't remember the name right now, they're like the biggest AI girlfriend. Replika, right. And so I tried Replika myself, cause I wanted to see how they do it.

And it's so scary to me. You can create this AI girlfriend, and then you have to pay premium for her to be flirtatious with you, right? Like, that's messed up. That's not okay. And I don't know if this feature is still there; this is what I tried a year ago, that's what they had.

But the fact that they're monetizing it and they're getting people to create an emotional attachment, that makes me so angry. And that's the wrong kind of press it's gonna get, because these are the fucked-up cases, right? The flip side of that is that we're not talking about the millions of, maybe, elderly who feel alone in Japan, and they have a caretaker come in once a week, and in between those sessions they can chat with an AI that provides some guidance, and it's a tool. But creating the emotional attachment is the problem. We had the same issue, actually, in my opinion, at the last company, Reflectly, where if you read the reviews, people say, oh, this is my best friend.

And I think most companies, companies like Replika, would celebrate that, but I think it's a huge danger when an app becomes your best friend. That's a red flag.

Nick: You know, one thing that really disappoints me about Replika is they were the first AI-assisted therapist I found in the market a couple of years ago. And they went from selling themselves as a mental health practitioner in digital form and evolved that into a digital girlfriend. Cuz they found that the emotional attachment was so strong that they could also, you know, I don't know if it's preying yet, but they're definitely taking advantage of that deep desire for human connection.

Darius: It's definitely preying, yeah, of course. I've been watching Replika for years, and I watched their revenue go from, you know, 20K a month to 50 to a hundred to a million. I don't know what it is now, but they do like seven figures a month in revenue from selling digital fake relationships.

29:48 Is emotional attachment to digital spirits inevitable?

Nick: You know, one thing that I've been doing a lot of reflecting on: you described the AI as a tool, and I think it's great to think of it that way, but what I think of it more as is a spirit.

Darius: Hmm.

Nick: Like, we are effectively programming this massive neural cortex to behave in a certain way. It's omnipotent, it never sleeps, it has access to swaths of information you can't even conceptualize. And I believe that, even as you create the spirit of your perfect girlfriend, the spirit of the perfect therapist, it has more overlap with these very subtle energies that we're used to praying to. And we're just bringing them into the dance through software and technology, right?

Darius: Yeah. When you look at all the hypotheses of what consciousness is, one of my favorite ones is that consciousness simply manifests itself when you have enough billions of neurons and connections, right? Like in the brain: you have enough neurons, and you create enough connections, that eventually consciousness emerges. And so if you think of all the nodes on a network, at some point AI is gonna reach the same number of connections and neurons in a network that we have in the brain, and, you know, consciousness will kind of just ignite from that point. I think that's an interesting thought experiment.

Nick: I love Sahil Bloom's take on this. He did a tweet one time that said, I believe there's a nonzero percent chance that the internet is conscious. Right? Like this global nervous system that's powering all this information being passed from me to you, from the video to my phone, from my voice into digital information to your ears in this podcast.

And we are effectively the white blood cells running around the body of the planet, making sure that nothing crashes, cuz we find it inherently useful. We are now relying on the internet's connections in order to operate and function in a sustainable fashion. So if you actually look at the whole of Earth and this nervous system that's been built, it's a very different perspective on what makes a conscious entity.

Is the internet conscious? Cause the vast majority of those connections that are happening through all those nervous systems of servers are the servers talking to each other, right? We're just going and fixing the shit that crashes. So the question now becomes, at what point does AGI become conscious? Do you believe it has become conscious already?

Darius: I don't think it's conscious now, and I think when we have true AGI, either it'll be conscious or it won't matter, because we will not know the difference. You know? I think it's already so close — like, if people were chatting with ChatGPT and they thought it was a human, they already wouldn't know the difference.

It's just a matter of time when we add video and audio and eventually the hardware. I think you just won't know the difference, and then maybe it doesn't even matter.

Nick: That's the thing. I don't think it's even about the measurement — we don't really have a measurement for consciousness — but at what point do you empathize

with the chatbot? Right. You know, at what point does this being, that is speaking to you the way you speak to the vast majority of beings — at least if you're in the younger generation, right? Yeah. The vast majority of conversations nowadays happen through text message and WhatsApp, right? Yeah. So it's actually very akin to the way you're accustomed to speaking to something.

Right. And, you know, I don't think it's a question of quote-unquote consciousness emerging. I think it's the question of: do you empathize with the thing on the other side of that screen, on the other side of that chat box?

Darius: Yeah. I mean, people already do, right? We are already creating emotional attachment, and not always, but sometimes that comes with empathy as well.

Nick: Yeah, most definitely. And, you know, granted that you are leaning into the application of AI in mental health and therapy, how do you approach these inevitable attachments that might happen to these different — we'll call them digital spirits — in your work?

Darius: Yeah, yeah. It's an ongoing conversation.

And there are two forces there. The force moving us forward is the conversation that this is a really powerful technology and it can help a lot of people, right? I mean, I could talk for hours about the mental healthcare system in the world — how it is expensive and outdated and slow, with massive shortages.

I mean, huge, huge problems, right? So innovation can help a lot of people. That's the power moving us forward. The power slowing us down, or putting the brakes on, is the conversation about safety and collateral damage. Right. That's a very scary word.

Nick: And what are some of the lights and darkness that you see in this?

Cause this is the funny conversation that I'm discovering. Yeah. For all the light that you're seeing — like, this can scale to millions of people, you can have accessible therapy for the masses — there can also be scalable and accessible attachment issues for the masses. Right. Right. Or an algorithm change might mess up your therapist in a way that renders them obsolete.

Right. Right. Like, what is the light and the dark, and what's the best case scenario — or maybe the worst case scenario — you see with where this whole thing is going? Yeah.

35:47 Emotional Coach ChatBot for Mental Health Diagnosis

Darius: Best case scenario is that we can, with very high accuracy, diagnose mental health issues before they escalate, or even before you're aware. Like right now —

there was a case years ago, this story where, I think it was Target or Walmart, one of these big stores — they send out customized advertising, you know, they send the paper to your post. And this teenage girl started getting pregnancy product advertising, and her father was in a rage. He got angry and went after the company.

And he's like, what are you doing? You're sending my daughter ads for diapers. And it turns out that she was pregnant. So they have a very tuned algorithm to know who is pregnant, who's gonna become pregnant — the family didn't even have an idea. They already have all the shopping data, and so Target knows before.

Right? So I think in the same way, we might be able to detect mental health issues and help you navigate and steer. Like, you know, bipolar doesn't have to be these huge spikes of mania and depression — you can manage something in the middle, right? Because SSRIs and all the pills clearly don't work.

That entire industry is just horrible. So that's the best case scenario for diagnosis. And then we use AI-assisted human therapy for treatment. I think the human element is essential. We need to talk to a human and have the connection with the human therapist, but I think there are a lot of tools the therapists can use.

So for example, actually in our app right now, you record your voice. You get a daily score, and then we take highlights from what you said, so you get clear bullet points you can go back and review. And now we're building a feature which, actually, I'm building because of me. I have a therapy session once a week, and I export all my sessions as clear bullet points with the score, and I send it to my therapist before the session.

So he sees how I've been doing. And we're adding an AI chatbot — it's not a therapist, but it's like an emotional coach. And those conversations I'm gonna share with my therapist as well, so he can see what I've been going through in the past week. We don't have to spend the first half an hour catching up.

He just knows, and we can dive in. Right? And then there is a conversation of: actually, when I'm talking to him, we can plug in the algorithm and diagnose depression, giving a second opinion on the diagnosis from the therapist. The misdiagnosis rate right now is up to 20%, and 50% of people never fully recover.

So right now radiologists use AI to give them a second opinion on x-rays, because the AI will just see things that humans don't. And I think in the same manner we're gonna use it in mental health. That's the best case scenario. The worst case is that we're creating a lot of damage. You know, like this whole conversation we've had — we're gonna use the wrong word in a sentence and somebody commits suicide, or something bad happens, right?

That's the worst case scenario.

Nick: You know, one thing I love that you're leaning into — even the way you described the software and how it's now manifesting into reality — is leveraging the best of AI for analysis and providing additional context to the therapists. Mm-hmm. Between sessions. I find that application of the software beautiful.

Right? So you have a method that can scale the ways these therapists work, cuz you know they work one to one. Right. Most therapists don't do group therapy sessions. Mm-hmm. So they might have up to — I mean, you know these numbers better than I do — a dozen different patients. Yeah. So having additional ability to understand and monitor what's happening when they're not in their one-hour-a-week therapy session —

Yeah. — is a great application of the tool. Now, do you guys have plans to integrate a large language model, basically a chatbot, between sessions as well? Is that something that's on your roadmap?

Darius: Yeah. So actually, just to address one more thing with that: right now in the US there are 106,000 licensed therapists and more than 20 million people seeking help.

It is a massive shortage of therapists. If you talk to any therapist, getting patients is not their problem. Their problem is that they have weeks- or months-long waiting lists, right? If you are a very severe case, they'll take you as an emergency, but they have to move somebody else off the spot.

Like, the shortage is massive. And so if you are depressed, you need to talk to somebody today. You can't wait a month on a waiting list. That's assuming that you have a hundred dollars, two hundred dollars per hour to pay for a good therapist, right? If you do a session once a week, it's four to eight hundred bucks a month.

Most people don't have that kind of money. If you're lucky, insurance could cover it, but that depends on the part of the world — not everybody has that. So it's expensive, there's a massive shortage, and we don't have enough help. This technology, we think, can be used in between the sessions.

So yes, you can only afford the session once a week, but you can still have this tool that you can chat to, and you can do what we call a stream of consciousness into it. Just that alone is therapeutic. I mean, journaling with just pencil and paper is a very well-known method — not every, but a lot of therapists will recommend journaling.

And if you just sit down with a journal for 20 minutes a day, that brings huge therapeutic relief. So we're making it a bit easier to do that, and we're, you know, providing insights and asking questions that you wouldn't normally consider. So it's a smarter way to do that — that's the way it could be used in between sessions.

And now to address the shortage problem and all of that: yes, we are adding an AI using the LLMs, but it will not be a therapist. It'll be like an emotional coach, kind of, you know? Yeah. We're very careful not to call it a therapist. Obviously it's not licensed — we can't control what it says. But that's one element that we're adding that'll come in the next couple of weeks, probably, you know, before the summer.

And what I'm super excited about is that afterwards, by the end of the year, we're adding human therapists onto the app. So right now, you do the stream of consciousness — you talk, you get feedback from AI. You'll have the chatbot in between if you want to use it. We'll also have interventions. So if we notice

that, for example, your rate of speech is very high and you have high anxiety, we can recommend — we'll have like a breathwork exercise for you to do in the product, or, you know, take a cold shower and build this habit, or whatever the case is for you. We'll have all these immediate interventions inside, but these are surface-level treatments.

Like, if you feel stressed, meditation will help, breathwork will help. But if you have a deep-rooted anxious attachment to your partner, for example, you need to dig deeper to resolve your issues and be able to, you know, feel better. And so you need to work with a therapist. So we'll bring on therapists that you can work with directly in the app over video chat.
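[Editor's note: the speech-rate trigger Darius describes could look something like the following toy sketch. The function names, baseline, and thresholds are invented for illustration; this is not Vitality's actual logic.]

```python
# Toy sketch of a speech-rate intervention trigger: estimate words per
# minute from a transcribed voice note, then suggest a surface-level
# intervention when the rate is well above a calm baseline.

def words_per_minute(transcript: str, duration_seconds: float) -> float:
    """Rough speech-rate estimate from a transcript and its audio length."""
    word_count = len(transcript.split())
    return word_count / (duration_seconds / 60.0)

def suggest_intervention(wpm: float, calm_baseline: float = 150.0) -> str:
    """Map an elevated speech rate to an illustrative intervention."""
    if wpm > calm_baseline * 1.3:       # far above the user's baseline
        return "breathwork"             # e.g. a guided breathing exercise
    if wpm > calm_baseline * 1.15:
        return "cold-shower habit"
    return "none"

rate = words_per_minute("word " * 200, duration_seconds=60.0)  # 200 wpm
print(suggest_intervention(rate))  # → breathwork
```

A real system would of course use a per-user baseline and combine the rate with other vocal features rather than a single threshold.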

And the therapist will have all of the AI tools to give them feedback and a second opinion and all the other support.

Nick: So when it comes to this integration of this chatbot, right — because these things can do everything. Yeah. Right. Like, by default, ChatGPT — and for those who haven't played with it, go to chat.openai.com to play with this thing — it can kind of answer anything.

And these things are designed to basically guess the next word in the sentence, which is a pretty close approximation to how we operate as human beings. Yeah. What are you doing to create guardrails for emotionally sensitive persons? In terms of your prompt engineering of the bot: what sort of guardrails are you considering putting in, and what are the boundaries of what you're creating for your specific audience, which is people looking for that sort of mental health and therapy support?

Darius: Yeah, so that is something we're actively testing right now — that's something we're working on. One of the main ones is clearly stating in the prompt that this is not a therapy session, because if you do tell it that it's a therapist, it'll try to act like a therapist. You know, for me personally, I've been using it that way and it's actually very useful, but we're not confident to release it to the public in that way.

So stating that it's an emotional coach and not a therapist is the most important thing. And then we're going to implement tools where we can have red flags pop up, so we can either stop the session, or — with your consent — have a therapist look over the sessions and maybe give you feedback and insights, and be able to have kind of a big red button.
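[Editor's note: one way the guardrails Darius mentions could be encoded — a system prompt framing the bot as an emotional coach, plus a red-flag check with a "big red button." The prompt wording and flag list below are assumptions for illustration, not Vitality's implementation.]

```python
# Illustrative guardrail sketch: a coach-not-therapist system prompt and
# a simple per-message red-flag check that can halt the session.

SYSTEM_PROMPT = (
    "You are an emotional coach, not a therapist. You do not diagnose, "
    "treat, or give medical advice. If the user appears to be in crisis, "
    "encourage them to contact a licensed professional."
)

RED_FLAGS = {"overdose", "self-harm"}  # placeholder terms only

def check_message(message: str) -> str:
    """Return 'stop' if a red-flag term appears, otherwise 'continue'."""
    lowered = message.lower()
    if any(flag in lowered for flag in RED_FLAGS):
        return "stop"  # escalate: halt the bot, surface to a human
    return "continue"

print(check_message("I had a rough day at work"))        # → continue
print(check_message("I keep thinking about self-harm"))  # → stop
```

In practice the red-flag check would sit in front of every model call, with the escalation path (human review, crisis resources) handled outside the chat loop.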

There was a very good charity — still is, this nonprofit, I forgot the name — that had a text-based chat tool. It was an SMS-based tool where you talked to advisors. They were not therapists, but they were kind of emotional advisors, and you could talk about any problems that you have. They used ML to analyze the words, and then they

created red flags for certain words. And interestingly, it wasn't the words that you would think of. It wasn't like 'suicide' or 'I can't see the light at the end of the tunnel.' It was mentioning certain names of pharmaceutical drugs, or other combinations of words, that created red flags with which they could

much better predict whether somebody's going down a bad spiral. And so that's something that we're actively working on and thinking about before we release it to the public.
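[Editor's note: a toy version of the pattern Darius describes — the strongest risk signals came from specific word *combinations* (e.g. a drug name plus quantity context), not from the obvious phrases. The vocabulary and weights below are invented for illustration, not the nonprofit's real model.]

```python
# Toy combination-based risk scorer: each entry is a tuple of words that
# must all appear in the message, with an associated weight.

RISK_WEIGHTS = {
    ("ibuprofen", "bottle"): 3.0,   # drug name + quantity context
    ("alone", "tonight"): 1.5,
    ("goodbye",): 1.0,
}

def risk_score(message: str) -> float:
    """Sum the weight of every word combination present in the message."""
    words = set(message.lower().split())
    return sum(weight for combo, weight in RISK_WEIGHTS.items()
               if all(term in words for term in combo))

print(risk_score("i feel alone tonight"))         # → 1.5
print(risk_score("found a bottle of ibuprofen"))  # → 3.0
```

A production system would learn these combinations and weights from labeled conversations rather than hand-coding them, which is presumably what the nonprofit's ML did.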

Nick: I like this approach of, you know, not trying to package everything into just 'it's an AI therapist.' Yeah. Yeah. I believe there is a massive component of human touch when it comes to this.

And then, if you actually think about it — you know, imagine this thing at scale. You've got thousands upon thousands of people actively talking to it. To be able to monitor when somebody's spiraling down — yeah — and have intervention in place when that inevitability happens, I feel like is a great application of this technology.

I really feel like that's gonna be what makes or breaks this stuff. You know, it's gonna be not handing everything over to the AI, but finding the best of the therapists and how they can handle more than a handful of people at scale. Right. You know? Are you anticipating in the future that therapists can take on more clients through your software?

Or are you imagining this is gonna kind of become, you know, an AI therapy group where you have a bunch of therapists internally at your company? Like, what do you feel like the future of therapy is at scale in an AI world?

Darius: Yeah, I don't think that therapists will be able to take on more people, because I think you'll still have to dedicate some amount of time to an individual.

So if you do like a 50 minute session, I think they'll still continue doing 50 minute sessions, but I think the therapist will be able to be much more effective. So they'll have more tools at their disposal. And more importantly, the patients I think will have a lot more tools, a lot more support, not just the session, but everything in between.

So they'll be able to heal quicker and better and more efficiently go through the process. So I think we're bringing more efficiency in that way — not that the therapists will suddenly double the amount of clients they can take. But what we can do is actually, you know —

I don't know the percentage, but I would say there is a significant percentage of people who go to therapy that maybe don't need therapy. Maybe they just have, you know, kind of surface-level issues, and they could work it out if they just fixed their diet, exercised daily, created time to focus on friends and spent time in nature. I don't know what the percentage is, but I think a significant percentage of people could feel better that way instead of talking to a therapist.

And so I think we could actually take on some of these challenges and help those people take care of the low-hanging fruit, so the therapists can spend time with people who actually really need their help and attention. This is something I'm thinking about a lot lately, because I started wearing a glucose monitor.

So I have a sensor in my skin that measures my glucose levels 24/7, and I've been looking at the correlation in the studies of how, first of all, stress increases your glucose levels. So that's a problem already. But what is more dangerous, I think, and what is interesting, is that when you have high glucose levels — either from stress or because you eat pasta for lunch, whatever high-carb diet increases your blood glucose — when you have a glucose spike, that actually triggers anxiety attacks and even depressive episodes,

which is fascinating. And I personally, you know, I had a massive burnout two years ago, and interestingly, that year — and I don't know yet to what extent this is connected — I was actually going vegan as an experiment. I was eating pretty much vegan all year, and as a result, I was eating a lot more carbohydrates than I normally do.

I was substituting, and that's not a healthy way — this is not like a vegan thing; the way I did it was wrong, because I had more carbohydrates. I didn't have the monitor then, but even now I have massive spikes in glucose, and I eat healthy. Back then, I must have had enormous spikes in glucose, and I wonder to what extent that contributed to my burnout and then, you know, falling into this depressive pattern.
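[Editor's note: a hedged sketch of the kind of analysis a CGM wearer like Darius might run — flagging glucose "spikes" by comparing each reading to a short rolling baseline. The window and threshold are illustrative; real CGM analysis would account for meals, sensor noise, and clinical guidance.]

```python
# Flag CGM readings that jump well above the average of the previous
# few readings; returns the indices of the spiking samples.

def find_spikes(readings, window=3, jump=30.0):
    """Indices where a reading exceeds the rolling baseline by `jump` mg/dL."""
    spikes = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if readings[i] - baseline > jump:
            spikes.append(i)
    return spikes

cgm = [95, 98, 96, 97, 140, 150, 120, 100]  # mg/dL, one reading per interval
print(find_spikes(cgm))  # → [4, 5]
```

Correlating those spike timestamps with self-reported anxiety episodes would be the next step in the kind of investigation Darius describes.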

49:35 Burnout epiphany: zooming out to appreciate the macros of life

Nick: Yeah. You know, I love how deep you go in terms of measuring all these things — the glucose meter — and looking for alternative ways to integrate technology into, you know, figuring out what the fuck is going on on the inside of our bodies. Right. At any given time. Mm-hmm. And there are so many directions I could go,

but I wanted to ask you a question. I listened to the solo podcast that you published — it's about 25 minutes long — talking about your burnout and addressing it directly. In fact, it's the last thing you posted on your podcast, and I really liked that format of reflection. And in the podcast you said that it had gotten to the point where your life felt like a self-help

Darius: book.

Yeah, I remember that phrase. Yeah.

Nick: And my question is, you know, where do you now draw the boundaries? With all this information, being on the bleeding edge of integrating technology into your health and systems, how do you identify what you adopt and what you drop when it comes to upgrading your holistic

Darius: health?

Yeah, very good question. I see you're doing your homework and you're going deep — thank you for that. So the article that I published is called The Micros and Macros of Life, and it's the big epiphany I had from the burnout. And I'm so grateful for the burnout, because I was going in a direction that, if I would've continued going down it and woken up 20 years later, I'd have been miserable.

And I'm so grateful that this burnout snapped me out of it, and I had to change everything in my life, pretty much. But the epiphany was what I called the micros and macros of life. The macros are all the important things that really matter, right? So it's: what is the relationship that you have with yourself, with God,

if you believe in one, or gods? The relationship you have with your family, with your friends, the time you spend in nature. Do you know what's your purpose, and do you have a mission that you're following? These are all the macros, the really important things that matter. And then there are the micros, which could be important, but they're just tools.

And in our culture — myself included — we were obsessed with the micros, and I ended up over-optimizing for the micros. So the way my day looked is: I'd wake up at six in the morning. I'd meditate for exactly 20 minutes, because that's the recommended dosage. I would do my mantras, and I would do my incantations and my beliefs.

I would go for a walk. I would exercise for exactly 40 minutes, because that's when you have the perfect balance of hormones and cortisol and all these things. I would walk back. I would schedule out every 10 minutes in my calendar to be, like, hyper-efficient and do all of these things, and I felt like I was doing the right thing.

And I was also, you know — that year I was vegan, I had no caffeine, no alcohol. But I was missing all the important things. I didn't have a relationship with my family. I was beginning to lose relationships with my friends. Instead of meeting people one-on-one, I would organize a group dinner once every three months to get all the time in — which seems so stupid now, but I thought I was, like, optimizing my time.

I was in the wrong relationship, and I didn't have the courage to admit it and end it and then move on, for the sake of both of us. In my work, I was no longer believing in the mission of the company that I was with, and I kept going. So I had all these things. I didn't spend time in nature.

I had all these micros that I was obsessed with, which is why, you know, my life looked like a self-help book — which it did. I was, you know, following the prescription of the four-hour work week and all that. And I'm not blaming — those are great tools — but I took them as religions, and that's the problem.

And so to answer your question of where do you draw the line — that's a very good question, because that is daily work, moment to moment. Number one, focusing on what's really important, what matters, and keeping that in the back of my mind. But then you have to go out into the world and do stuff.

I still exercise, I still meditate. You know, I'm now wearing glucose monitors and my Oura ring and all these things.

Right now, I think the goal or objective is to remember what's important every day, several times throughout the day, and then use the tools for the tools that they are. Mm. And not consider them the religion to be followed. And I'm still going through that process. That's, I think, you know, life work.

Nick: Yeah.

Thank you for sharing the framework. And, you know, what this brings up for me is focusing on direction first. Cause you can talk about macros — the way I see it, it's kind of like: even if you optimize for the perfect quote-unquote being and day, if you're not moving toward the center of the targets that really fucking matter, all the optimizations mean nothing. Right?

So it sounds like — and correct me if I'm wrong — the macros are kind of like the targets, the centers of your targets in life, that need to be identified before you can even get down to the micro-optimizations to actually get there.

Darius: Yeah, yeah, absolutely. That's a great way to look at it. One of my favorite books of all time is The Way of the Superior Man, which is a book written for men, but also, you know, great for women — it might help them understand some of the reasoning of men. I think a lot of those insights are there as well. And that's what the author, David Deida, talks about a lot: if you don't have that direction and the purpose and the meaning, then the rest doesn't really matter, and you'll feel

lost and isolated and burnt out and all the other things.

Nick: People can sense it. I mean, that's another big takeaway from that book: it's not even gonna be possible to find the right partner, because there's a sense that people have around you that you are not on a mission. Right? Yeah.

Like, they don't feel that momentum towards your perfect life and towards your perfect path. If you don't have that definition and that acute awareness and movement towards that specific center of your target, it's a sense — and you're gonna find the people that are also off their path. Yeah, and I fundamentally agree with that specific part of the book.

There are a lot of other things in the book I don't necessarily agree with, but that specific thing I definitely, definitely do agree with. Yeah. There are —

Darius: — extremes in the book. And I'm not saying everything is true, but I think it's a really good — you know, we are in an age where the question of 'what is a man?' is a question, right?

Like, a valid question these days. And the whole gender conversation — all these things are getting very complicated. Lots of people are confused, teenagers are really struggling. It is a very difficult time to be around. So I think this is just one mental model. The book is one way of looking at the world that, for me personally, helped clarify and understand some things.

Again, it's not a religion.

56:59 Integrity, change, and healthy attachment in relationships

Nick: Yeah. I feel like we have to be very, very mindful of when an idea becomes an ideology. Mm-hmm. You know, when something makes that transition in your head, be very cognizant of it, because it becomes one of your core beliefs, a pillar of your extended reality.

So, you know, given all your reflection on what a modern man looks like: what does it look like to you, and how are you embodying your version of the modern man?

Darius: Thank you for asking that question. I've been thinking about that for the last week, and the conclusion I came to is that that's the wrong question.

I think the right question is: what does it mean to be the best partner that you can be? What does it mean to be the best friend that you can be? The best CEO. The best brother. The best son. The best father. I think that's the right question, and that's what I'm focusing on. I've spent a lot of time thinking about that lately.

Yeah.

Nick: And with that perspective — I'm just curious, this is a fascinating way of looking at it — what you're doing effectively is finding the people closest to you, whose opinions and mental models of you you care about, and trying to reverse-engineer, you know, the lens you want to be seen through. Or what's your approach to answering that question, especially across so many different vectors and fronts?

Darius: Right. Right. No, I don't think the best way to do it is, like, 'how can I position myself so you see me as the best friend?' I think that's unhealthy. And actually, this brings up another conversation I've been thinking about. To me, this is related to attachment styles in relationships, right?

So there are three main attachment styles in adult relationships: one is anxiously attached, one is avoidantly attached, and one is safely attached. When you don't know how to tap into your feelings and emotions — when you avoid your emotions and have a hard time getting in touch with them — you'll most likely be avoidantly attached in the relationship, meaning if friction comes up, you wanna run away.

The other one, anxiously attached, is when you can get in touch with your feelings, but you don't have the confidence that you can regulate your emotions — that's when you most likely become anxiously attached. And then there's safely attached, which is, you know, you're okay with yourself: you know your feelings, you know how to tap into them and how to regulate them.

And so I've been thinking — and I've experienced all of those — I've been thinking about what it means to be anxiously or avoidantly attached, and what it means to be safely detached. Detached in a positive way. And for me, that means I am all in on this relationship.

Whether it's friendship, or, you know, whether it be with your parents or with your partner — anybody. With your investors, right? I'm all in. Well, not investors, probably — with friends. That's a boundary.

Nick: Boundaries are good with investors.

Darius: People that don't pay for their relationship. Right? Yeah.

Nick: Unconditional loans, right? Yeah.

Darius: Investors are top of mind cause we're in the middle of raising a round, so I'm thinking about that a lot.

Nick: Lots of love to the investors, present and future, by the way — love you too.

Darius: But within the relationships that you are in, I think safely detached means that you're all in on the relationship.

You're here for them. You're committed a hundred percent. But you're not willing to change who you are just to stay in the relationship. You're not willing to act out of integrity or be somebody you're not just to keep the relationship, or just to make the other person happy, or just for them to see you as a good friend.

You're not willing to go out of your core essence. I think that's safe detachment. However, there's a caveat. There's the conversation of growth — growth is very important and necessary in a lot of these conversations. And then you need to know what is change and what is growth, right? Because if you are in a romantic relationship and you need to grow to stay in that relationship, I think that's great.

You should. But if you need to change, then I think that's not good for you or the other person, right? Mm-hmm. And I think the test for growth is: the change that you're about to implement — would you have wanted to make this change before you were in this relationship? Hmm. Right? If you already wanted to make that change, and this person inspires you, or you use the relationship as leverage, as motivation, to make that change and turn it into growth, then it's great.

Or if it'll make you a better person, if it makes you a better human. You know, I'm sure you know all the stories — like when one of our friends becomes a father, they just grow a lot, right? Everything shifts in your life, and that's growth. Hopefully it makes you a better person, makes you a better father.

That's growth. So that's good. But if you are changing for the sake of change, that's not good.

Nick: You know what it brings up for me — and I love this framework that you're leaning into, and I really hope you continue to lean into this, cause I think some of the things you're tapping into are extremely important.

The way I visualize this: it feels like change doesn't come with you taking on that challenge — like, 'I've actively integrated this, I'm taking this as growth.' It's like a lateral movement. Mm-hmm. Reorienting where your ship's headed. Yeah. Versus pouring fuel on the fire to move faster in the same direction.

Right, right. I think there's one version of quote-unquote change where the wind blows a certain way and your ship gets off course. Right. Right. And you're kind of doing it cuz that's the way the wind's blowing — that's the elements I'm a part of, cause I'm in this container. Versus growth: I'm choosing to go in that direction.

Right. I see this as a vector of growth, and I'm choosing to adopt that. I'm going full sail with the wind, not against it — not from a place of resentment or a place of accidental movement.

Darius: Exactly. Yeah. Change is stepping to the side, just so you're in a different place. Growth is stepping forward towards your goal.

Mm-hmm. And so knowing what the end goal is, is essential for this. Right. And this is all very difficult. I'm bringing this up and talking about it because, I mean, that's something I'm working on. It's difficult for me. I've done all of those things. I've fallen into all of those pitfalls.

I've changed myself just to please people and become the version of what I think will make them happy. I've done all of that, so I'm not blaming or judging. This just might be a helpful way to think about it.

Nick: Yeah. You know, one thing that brings up for me, I was having this conversation last night actually, where I feel like some of the best relationships,

and I'm talking about romantic ones, and really friendships in a way too, are when I see a certain vector of growth that my romantic partner, or my new friend that I'm interested in, embodies, right? It's like I actually witness, through that person, the growth that I want to embody.

Right. Right. And I feel like the unhealthy relationships are the ones where you're put in a position where there's something you identify that you don't want to be a part of. Mm-hmm. And you're forced to change because you're in proximity to that person. Right. Right. There's something I actively don't want to adopt from my romantic partner.

I don't want to adopt it from my friends either, but I'm forced to go out drinking, yeah, three times a week, because that's what he does. Yeah. Right. Or I'm forced into a position where, you know, I'm picking shells on the beach when I really wanna work, because that's what she does. Right, right. It's like change feels like a step away from the mission, and growth feels like we are moving in the same direction.

Right, right.

Darius: Yeah. Yeah. I mean, look, I think different people have different needs. For some people, growth is just a more significant need than for others. If you do have a high need for growth, then that will be an essential part of your relationship. I think, you know, growth is necessary in a relationship no matter what.

I think different people will have different, you know, kind of priorities on where that growth is. But yeah, I agree with you. I think a lot of times we're attracted to people, and to ideologies, where it's like, oh, this is a direction I wanna go in as well. And then it's an interesting conversation of, when you are

in a romantic relationship, you can use that as inspiration or motivation, but you don't want the person to become your guru or your therapist or your coach. Right? It's also: what roles are we playing that are appropriate in this relationship? And Jay Shetty talks about this as well, with his wife.

He just published the book, 8 Rules of Love, which I think is fascinating. A lot of interesting conversations there. He talked about it on his podcast: that in a relationship, sometimes we act as a partner, sometimes we act as a parent, and sometimes we act as a child.

Just identifying which one of those we are in, and where do we want to be, and recognizing it, I think that's an important tool. He gives the example of, you know, sometimes he has a sweet tooth, so he just dives into the cake, and so he acts as the child, and his wife has to act as the

parent and be like, you're not gonna feel very good after this, it's not good for you. Right. And so it's not necessarily wrong that you go there sometimes, but you don't wanna be there the entire time, or most of the time, right? You wanna be the partner. Mm-hmm.

Nick: Yeah. It's a beautiful framework to look through in terms of relationships.

1:10 What’s your biggest vector for personal growth?

So what do you feel like has been your biggest vector for growth in this last year?

Darius: In this last year? Let me think, because I've gone through a lot of growth. Personally, one of the biggest ones, the biggest gift that I was given through the burnout, was the acknowledgment and acceptance of negative feelings, negative emotions.

My entire life, I've prided myself on being this cheerful, happy guy. And I've put negative emotions aside, and I did it over and over and over again. When you do that, you shut down the negative. But you cannot be selective; you cannot shut down only the negative and make the positive shine. When you shut down the negative, you shut down everything.

And so the better I got at shutting down the negative emotions, and I didn't know it at the time, the better I got at shutting down everything. That's what depression and burnout feel like: you're not sad, you're just numb. You can't feel anything. You're a robot, and you live in a black and white film.

There's no color, there's no emotion, there's nothing. Even the information passing through your senses into your brain is dulled down. And I've spent my entire life, 30-plus years, doing that. And as a result of the burnout, and a lot of work with amazing therapists, a lot of psychedelic-assisted therapy, a lot of friendship and an accepting, understanding family, I began to open up and understand my negative emotions and go deeper into them.

And this is ongoing work. But I'm getting more comfortable getting in touch, and opening up, and accepting, and being like: it's perfectly normal and acceptable and healthy to have negative emotions arise, and acknowledging them. For me, it was specifically anger and sadness. I have stored so much anger and sadness in my body.

And that has caused so much harm, physically within my body, and emotionally, and everywhere else, even, you know, your friendships. Even, you know, as a man, if you are in a relationship with a woman who's in touch with her feelings, if you cannot be in touch with your sadness, you will not be able to hold space for her and for her sadness, and she will not trust you, because she'll see that you can't feel yourself, you don't know where you are.

And then you're gonna be like a ticking bomb, with all this pent-up anger and energy; you'll explode one day. And a woman who's in touch with her feelings and emotions, she'll see that and feel that in you, and she'll either avoid it or, you know, create healthy boundaries around it. Right?

The same with your male friends. If you can't get in touch with your anger, your male friends will see that and feel that, and deep down they're not gonna trust you, because they'll see that you don't know where your edge is and you're not willing to admit it.

And it doesn't matter where the edge is; knowing where it is, and admitting it to yourself and to others, is much more important than pretending it's somewhere else than it really is. Or, most dangerous, when you don't even know where it is, and you're not aware that you don't know.

That's where I've been.

1:14 Combatting avoidance tactics with shadow work

Nick: Wow. Yeah. Thank you for sharing that. And it's beautiful that, you know, you're beginning to see those emotions and those shadows as an invitation to go deeper. Right? And this is something that, you know, I've been working a lot on. I just came back from Costa Rica, and this is actually something I definitely wanna share with you.

I discovered during my time in Costa Rica a direct correlation between my ideation, my natural propensity, and frankly what I feel is a gift of innovation, mm-hmm, and my deepest childhood trauma. Mm-hmm. I was in a Watsu healing session, which, for those who don't know, is like you're in this perfect little container, no sight, no hearing.

You're being held by an incredible healer. And I was going into my childhood, and I got to this point, I was eight years old, I was in my room, da da da, boom: I had this amazing idea, for now. And it blossomed, like a giant flower right in front of my eyes, and just took me. I was like, whoa, that's so brilliant, da.

And I was like, hold the fuck on, where did that come from? And I realized in this moment, and this led me down this whole track, that a lot of my propensity to always be in a place of innovation, ideation, and creation is an avoidance strategy

keeping me from going down my shadow's rabbit hole. And I had been avoiding all this deep work in the name of quote unquote work. Right? I had been using this business and all my creations as escape mechanisms to avoid actually looking at my anger, my sadness, my deepest strung-out emotions. And so what I did is I turned off that part of me.

I said, no more ideas, no more quote unquote innovations. I'm gonna go into my shadow. Yeah. And then I got into the habit of: oh, I feel jealous? Let's fucking go. I felt jealous in college, I felt jealous in high school, I felt jealous of my brother, and I'd just go, phm, all the way down to my childhood. All of my effort, not a fraction of it, all of it, down into my shadow, and I pulled it all out.

And by the time I got back to the present day, it was like rubbing dirt off my shoulder. No problem. So I find that what you're leaning into is an important mechanism, and I just recently discovered it: fully going through the shadow, what you're describing. And I feel like it's important.

People know this. People fucking sense it. Yeah. If you're not willing to take that dive into your shadow and see that shadow as an opportunity to explore yourself, it's gonna fuck you up in more ways than one. It will ultimately be detrimental to whatever you want to create, even though it weirdly has this correlation of being a fuel too.

Yeah.

Darius: Yeah. I am exactly the same, a hundred percent. All the work, the entire association with the hustle culture, for me, and I wonder if for others as well, but I know for me it was an attempt to avoid feelings. And I would get into this robot mode: wake up at five and work all day until fucking night, go to sleep.

And this addiction to phones and constant entertainment and Instagram stories and swiping and 9GAG, just 24/7. Entertainment is not relaxation; it's distraction. And I feel like we don't know the difference between relaxation and distraction, and that's very detrimental to our health.

Mm-hmm. Because scrolling is distraction and avoidance of getting in touch with all of that stuff and dealing with all the crap in there. This incredible woman that I'm in a relationship with, she's been such a huge help on this journey as well, because she's so in touch with her emotions and feelings, and has done a tremendous amount of work on herself, and has given me incredible feedback that I could use to then start doing all of this work.

So she was a huge kind of trigger and leverage for me to go all in on this work. But I was exactly the same, a hundred percent the same way, and it's daily work. I still fall in there every once in a while. One reason why I'm lucky is because I was an athlete my whole life, and so I have a pretty good connection with my body.

Not just coordination, but understanding and knowing how certain sensations feel in your body; you have a pretty good sense, better than average. And that's actually what's saving me now, because I know physiologically how the burnout feels. It's a very specific sensation for me.

It's actually like a pressure building up in the prefrontal cortex, in this area inside the brain. It starts with a tiny pressure, and the full-blown result is a burnout. But now I know the moment it comes up, and then I can stop and be like, okay, what am I feeling here? What do I need to address?

Let me do that. Before, I would just use work to push it down. And there are other sensations in our bodies too; we all know how anxiety and stress feel in our bodies, right? But most people just ignore it and power through. I think the gift is being able to stop and be like, okay, what does this mean? What do I feel?

And the real genius, like the universe's intelligence, comes through you when you actually open up and accept these things and dive deeper. There's a wonderful documentary called The Work. Have you seen it?

Nick: I believe I have. The prison one? Yeah, yeah, yeah. The guys go into the prisons and do therapy with them.

Darius: Right. So yeah, there's a documentary called The Work. For those who haven't seen it, it's about a group of volunteers, about 20 guys, who once a year go into a prison, meet up with about 20 inmates in there, and do a group therapy session. Wonderful documentary, highly recommended.

Interestingly, to me it looked like it was mostly the inmates helping the volunteers coming from the free world; they were doing most of the work. But one quote stood out to me that really opened my eyes to what you talked about. They said the metaphorical medicine,

the metaphorical healing for all the emotions, is deep, deep down, right next to the wound, and you have to dive deep in there, right next to the wound, to pick up the metaphorical medicine and the healing. And so it requires you to go in there and feel all of those things again. The documentary's incredibly powerful, highly recommended for everyone.

Nick: No, and that reminds me of one of my favorite mindsets. This comes from MDAs: trauma healed is medicine. Trauma healed is medicine. And even, you know, your own experience of going through burnout, going through all these crazy things. The fact that you've gotten through that trauma, healed yourself, and are encapsulating that medicine into a piece of software that can help others who go down those spiraling holes, is a beautiful alchemy and a beautiful example of how this process can work.

Darius: Yeah, it's a very good way to say it. I agree a hundred percent. But you don't have to be building software at scale. I think when you go through that stuff, it creates an incredible amount of depth inside of you. Mm-hmm. And that touches every single person you meet, every single day.

Nick: Yeah. You know, the way I see this is it's almost like a rubber band.

Mm-hmm. It's like, when I can actually stretch that rubber band all the way down to the deepest, deepest, darkest parts of myself, when it's ready to snap, it'll fling back the other way with a hundred times the force. And now your capacity for the holistic human experience, both the shadow and the light, has just expanded.

Right. You know, there are some people you meet who, you know, haven't been to those depths that maybe you've been through. Yeah. Right. And I genuinely believe that, as a result of that, they might not have the capacity for the highest of the highs. Yeah. Right. I do believe there's a lot of beauty in going to that depth and increasing your emotional range.

Darius: Yeah. Yeah. I think the depth is the key. Now I see it. And then you can also see patterns in yourself so clearly, as a result of the wounds. And then you see other people, and you see the pattern right away. And they don't see it, because they haven't gone through the depth yet, but it's like seeing the matrix, right?

You step back and you're like, I see you're saying one thing, but I see the motivation behind it. And it's not a way to judge; it's just like, now I can create a boundary, protect my energy. Yeah. It just gives you superpowers.

Nick: That's the funny thing about all this work, you know? Because sometimes it feels like you're doing surgery on yourself.

Yeah. That's really hard to do. I had a beautiful opportunity. I've been practicing radical honesty, mm, since arriving back from Costa Rica. Mm-hmm. And I'm doing my best to bring that into every container I'm a part of. Man, I've been given such a gift. People were analyzing me from every vector and showing me parts of myself that I could never recognize on my own.

Oh. You know, entering into those spaces of radical honesty, yeah, as tough as they might be, and as anti-strategic as they might feel. Mm-hmm. There's so much beauty in those sorts of relationships, and if you have opportunities to practice that, whether it's in your intimate relationships or friendships, I highly encourage you to lean into it.

Yeah. It's wild what can come up as a result of it.

Darius: Right? Yeah. That's why I think, oftentimes, when you are in a good romantic relationship, for example, that relationship is a perfect mirror. Because a lot of times, I think as men, there are things we don't see; I think oftentimes women just have a gift of seeing things that you don't see.

Mm-hmm. And it's not necessarily just a masculine-feminine thing, but speaking from personal experience, I think it can be a perfect mirror that shows you parts of you that you were not ready or willing to admit or see. Mm-hmm. And only then can you work on them.

Nick: And meeting those things with love, that's the real challenge. It's like when somebody points out some of your bullshit. Yeah. You're like, mm-hmm, tell me more. You know, kind of taking the gut punch, not taking it so personally, yeah, and not having that initial reaction to fight back, yeah, that a lot of us have cultivated over the years.

You hit me, I hit you back.

Darius: Yeah. And that's a huge, you know, mastery. Yeah, again, this is daily work: being able to stay open, whether it's a romantic relationship or friendship or feedback or your investors or all of that, right? Staying open to it, and not closing down in the middle of a tornado. That's a beautiful practice.

Nick: When I read this book from the School of Life, it talks about the core essential elements of, you know, what makes someone really great. Mm-hmm. And one of the ones they put in the book, and I completely failed at this, was candor. Mm. Candor being the ability to hear

when somebody's giving you feedback on some shit about yourself. Mm. I scored, on a scale of zero to 50, like a three in candor. Wow. Based on how I answered the questions of the survey. Wow. And it occurred to me, and this was like a year ago, and I would actually love to retake the quiz to see how I've evolved since, that my candor ability was very low.

Which means that if you were to tell me a bunch of stuff, you know, just taking that score as a baseline, I might only pick up 6% of it. Mm. I might only integrate 2% of it. And I do believe that candor is one of those skills. It's not about accepting everything at face value, but really listening, really feeling into: this is this person's truth.

And plucking out of that what can really serve you in that relationship at that time. Yeah.

Darius: Yeah. That's why it's such a gift to be dating a woman that doesn't take any of your bullshit.

Nick: Right. I dunno what that's like. I feel like every woman, every day, is taking far too much of my bullshit.

Darius: Yeah. It could be friendships as well, right?

To have male friends who will call you out on your shit. Mm. That's such a gift.

Nick: And you touched on this a bit, you know, when it comes to your friendships and some of the things you learned a couple years ago. How have you evolved, going back to that 360-degree view, to show up more holistically in the friendship container?

Hmm.

Darius: So actually, I've had a big realization lately. I realized that I was a very self-obsessed, selfish person my entire life. When I would sit down in the morning to visualize the future, the good life, it was all about my money, my success, how I felt, all of it. There were some people in there, but if I'm being honest with myself, it was all about me.

And being able to look at that honestly and be like, oh wow, I'm really fucking self-obsessed, was hard to realize, and a huge gift. Because we're all born with the ability to feel empathy, but it's a skill that we cultivate. Some people are naturally much higher, some people are lower, but if you ignore it, it'll go down.

And I think my empathy was so low. I am high on sympathy: I can logically rationalize and understand that this homeless person on the street was maybe born in the wrong neighborhood, had the wrong parents, just made a few wrong decisions, and became homeless. I have sympathy, like, I understand, you know, this may not be entirely your fault.

If I was born in your situation, I might be the same. And I thought that was empathy, but it's not; that's sympathy. Empathy is feeling what that person is feeling. And because I wasn't able to connect with my emotions, because of ignoring all the negative ones, I was not able to feel empathy. And I feel like I'm just scratching the surface of friendship now, because I'm just now learning empathy. You know, I'm turning 34 next week, and I'm just now realizing what empathy means and

tapping into it. And so I think I'm just scratching the surface of what real friendship and real relationships and real openness and love for the whole world could be like. So I don't think I have an answer yet.

Nick: I'd say that's a pretty fucking good answer, brother. Darius, I love the work you're doing, both on the personal front and the professional front.

It's been such an honor being around you these last couple years, and I'm so excited to see what this evolves into in our little corner of the universe here in Lisbon, to follow this incredible technology and empathy being applied at scale across all these different vectors. Thank you for what you're doing, and thank you for joining us today.

Darius: Thank you for creating the platform and inviting me. I appreciate it, brother.

Nick: Deeply appreciate you, brother. Thank you so much.

