Rich Fernandez
Welcome again to the Return on Intelligence podcast. I'm your host, Rich Fernandez, and I am so excited and delighted today to host my friend and colleague, Doctor Nikki Mirghafori. I think you're in for a treat. She brings a really unique background as an AI researcher, academic, and entrepreneur, as well as a senior Buddhist teacher. Nikki sits on the board and is a stewarding teacher of the Spirit Rock Meditation Center, and she is also a principal author on over 40 papers on artificial intelligence.
Rich Fernandez
She advises startups here in the Silicon Valley area and is also an instructor in artificial intelligence at UC Berkeley, which is where she got her PhD in artificial intelligence. Today we're going to be talking about the future of work and artificial intelligence, and about how we as humans can be at our best in this changing environment in which artificial intelligence is increasingly becoming part of our workflows.
Rich Fernandez
Nikki, welcome. It's delightful to have you here. Thank you for joining us today.
Nikki Mirghafori
Thank you, Rich, it's such a delight to be here. Thank you so much for inviting me to be part of this conversation with you today.
Rich Fernandez
Absolutely. Your background is so interesting: in some ways a dual pursuit of deep understanding of artificial intelligence, being a principal author on over 40 research articles and teaching artificial intelligence at UC Berkeley, but also being a meditation and mindfulness instructor in the Theravada Buddhist tradition at the Spirit Rock Meditation Center and the Insight Meditation Society.
Rich Fernandez
It's such an interesting background. So I'm curious, and perhaps our audience is curious: how did you get here, Nikki? If you wouldn't be so kind as to tell us a little bit about your life and work journey.
Nikki Mirghafori
Thanks, Rich. Gosh. I'll give you the CliffsNotes version. You know, in some ways the one-word version is: life.
Nikki Mirghafori
Right? Life doesn't exactly go as we expect or predict. But what I would say is that I'm a nerd, through and through. Growing up, I loved math and physics and puzzles, and at a young age I knew that I wanted to become a scientist. That was my dream. I remember reading picture books and thinking, "I want to be a scientist someday."
Nikki Mirghafori
And I also had a deep curiosity about life, not just about science, but about the nature of reality. What is this? Why are we here? I was born in Tehran, Iran, so growing up I was reading the poetry of Rumi and Hafez in Farsi, in the original language they were written in, and having a sense that these poets know something I don't, that they see reality in a particular way that I don't.
Nikki Mirghafori
They have access to a kind of knowledge that is not just scientific. And Einstein has been another idol of mine, someone who was both a scientist and a mystic in some ways, not in a woo-woo way, but in a way of deep, deep respect and curiosity for this world, for this life.
Nikki Mirghafori
So, with these influences and being a nerd, I studied computer science and got my bachelor's, master's, and PhD in AI. And I'm going to date myself: I did my PhD 25, almost 26, years ago. I was ten years old, just kidding. It was actually as an undergrad that I decided to study AI, because at the time I was reading about neural networks and how they were a way to perhaps model the way the brain works, in a very, very simplified form.
Nikki Mirghafori
And I remember an undergrad seminar where we were reading papers on studies in which various simulations of human behavior were being created by these systems. So that's what I wanted to do. I wanted to study how the mind worked, the inner workings of the mind.
Nikki Mirghafori
That's why I wanted to study AI. It seemed like it really brought my interests in computer science, philosophy of mind, and psychology together; philosophy and psychology were my minors. So I studied AI, and then for my PhD I worked on speech recognition. Nowadays we all kind of take it for granted with Siri and Alexa and all their friends; we take the power of speech recognition for granted.
Nikki Mirghafori
But it was clearly not where it is today. So I did a lot of basic research in that area, in language modeling, training language models. And now, given that we have so many more GPUs, so much more processing power, cheaper memory, and so on, all of that processing power, in addition to some algorithmic shifts and changes, not a lot, but some, has enabled these algorithms to perform so much more powerfully.
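To make "training language models" a bit more concrete for readers, here is a minimal sketch of a bigram model in Python. This is an editorial illustration only, not the systems described in the conversation; the tiny corpus is invented for the example.

```python
# Minimal sketch of a bigram language model: count which word follows which
# in a tiny corpus, then turn the counts into next-word probabilities.
# Real speech and language systems are vastly larger, but the core idea,
# learning statistics of word sequences, is the same.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def next_word_probs(prev):
    """Estimate P(next word | previous word) from the counts."""
    counts = bigram_counts[prev]
    total = sum(counts.values())
    return {word: c / total for word, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Modern large language models replace the counts with neural networks, far more data, and far more compute, but the underlying task, predicting the next word from what came before, is the same.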
Nikki Mirghafori
And we also have a lot more training data. But anyway, I'm veering into technical description; coming back to the path. I was a scientist at Berkeley, then went to Silicon Valley for a few years and did research at startups, then came back to Berkeley, my alma mater, where I was doing research, leading collaborations with PhD students and postdocs, and teaching. At that point, I decided that this quest, this existential curiosity that I had, was something I wanted to devote some time to.
Nikki Mirghafori
So I took a year's leave of absence. By that time I had been meditating and had done many silent meditation retreats, increasing their length from one day to a week, to ten days, to a month. And I was ready to delve deeper. You mentioned Spirit Rock, the meditation center where I'm a teacher now and also serve in various ways.
Nikki Mirghafori
Spirit Rock was my home, my home practice center. So I took that year of leave and practiced a lot, including on the East Coast at IMS on three-month silent retreats. But at the end of one year, I wasn't quite done. It seemed like it was just getting juicy.
Nikki Mirghafori
Finding some answers to the questions I had had really shifted things for me, shifted my perspective. It felt like it poured cool water on the fire of existential angst, and the questions became answered; in a way, they became irrelevant. So shifts happened, various shifts and changes, and I continued to love to practice.
Nikki Mirghafori
And I was called to teach, invited to teach more and more in the Buddhist context and in the mindfulness context, and I taught more and more. There's another thing I will mention, which is that a chronic illness entered the picture around that time, and I had limited energy. My own study and practice and teaching were so important to me that the limited energy I had, that's where it went.
Nikki Mirghafori
So I spent less time doing research and less time on that engagement. I'm not teaching at Berkeley right now anymore. Primarily I've been teaching various courses, both on, as I think you mentioned, mindful leadership training and on various Buddhist practices, and I've also been engaged with the ethics of AI for the past few years, but not hands-on AI research or teaching at this point.
Nikki Mirghafori
Anyway, that's a little bit of the story.
Rich Fernandez
We could probably end the podcast right there. Thank you so much. What's still resonating in my head is, I think you said, pouring cool water on the fires of existential angst. That's kind of a lifetime learning, and aspiration, I think, for so many of us. And I so appreciate you bringing together the threads as you have, so many different influences from a young age: mystic poets like Rumi and Hafez together with Einstein.
Rich Fernandez
Those were your heroes. That's amazing, and it informs your journey as a scientist, entrepreneur, and AI researcher, as well as, very much, a meditation teacher. It's kind of astounding to hear about a three-month silent meditation retreat; that's perhaps hard to fathom for many. I guess you had to do a PhD and do such long retreats just to really max out on the learning, it sounds like.
Nikki Mirghafori
You're kind, Rich. I don't think about it as maxing out. You know, I'm a learner too. The more I learn, really, the more I know that I know so little. Yeah, I know so little about so little.
Rich Fernandez
That's beautifully said. So, bringing it to today: you've shared a little bit about what you're working on, but especially given your background, and thinking about AI and AGI, artificial general intelligence, and its increasing prominence in our lives, given all of these different threads, what's your perspective on that?
Rich Fernandez
I mean, where are we going? What do we need to learn? I'm especially curious about your take on the ethics of AGI and how it affects not only our lives but our work.
Nikki Mirghafori
Yeah, thank you for that question. I'm so glad to be talking about this. So let's define what AI is. Let's just define that first, so we're on the same page. AI is defined as a system that performs an action, you know, input to output, an action that we as humans generally associate with human intelligence.
Nikki Mirghafori
That's the definition of AI. So let's unpack that a little bit. We consider speech one of the marks of human intelligence. We think that animals don't have speech, even though, asterisk, they do in a limited way, and there's research on that. But in general, speech is the way we communicate as humans, so we think of NLP, natural language processing, and natural language understanding as a domain of AI.
Nikki Mirghafori
And of course, with ChatGPT, we see that the largest, or say the most public, applications are natural language processing; more about that in a moment. Other intelligent behaviors include planning and vision; computer vision, for example, falls under AI, as does speech recognition, which, as I mentioned, is my field of expertise: recognizing speech.
Nikki Mirghafori
Just recognizing speech; understanding speech is yet another. Transcribing speech, what people say, that's intelligent behavior. Understanding text in context, that's another. So I mentioned vision, and basically hearing, which is speech recognition; speaker recognition, which recognizes who is speaking; and many other kinds of pattern recognition, basically, which have...
Rich Fernandez
Been with us for a while. I think that's the other thing: we've been living with AI for a long time, actually.
Nikki Mirghafori
A long time, exactly. Yeah. So these are definitions of AI. Now let's define the difference between narrow, or weak, AI and general, or strong, AI. Weak AI refers to applications where a human being defines the input, defines the output, defines what the desirable behavior is, what a desirable output is, what's correct and what's incorrect.
Nikki Mirghafori
And the AI system is basically relying on a human being to define what its learning task is. That's all the systems that we have had up to this point and continue to have: working autonomous vehicles, Siri...
Rich Fernandez
Google Translate.
Nikki Mirghafori
Google Translate, thank you, yes, of course. ChatGPT. All of these are examples. Oh yeah, recommendation systems, let's not forget recommendation systems.
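To make the narrow-AI point concrete for readers: in every one of these systems, a human has already fixed what the inputs are, what the outputs are, and what counts as a correct answer. Here is a minimal sketch, assuming scikit-learn is installed; the tiny sentiment task and its labels are invented for the example.

```python
# Narrow ("weak") AI in miniature: a human defines the task (sentiment),
# the inputs (short texts), and the correct outputs (labels). The system
# learns only the mapping it was given; it cannot invent a new task.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts  = ["great product", "terrible service", "love it", "awful experience"]
labels = ["positive", "negative", "positive", "negative"]  # human-defined answers

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)                     # learn only this human-defined mapping
print(model.predict(["love this product"]))  # likely ['positive']
```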
Rich Fernandez
We've been living with those: "Here's what we have for you, you know, we learned what you like."
Nikki Mirghafori
In fact, the first one was Netflix. Netflix started the race; there's a story there. When I was a graduate student, actually, I think I was already part of the research faculty at that time, I can't remember the exact year, but anyway, they started the very first bake-off.
Nikki Mirghafori
They would give $1 million to the group anywhere in the world that came up with the best recommendation system. It was an open competition, and many labs were tempted, so graduate students and postdocs were working on it in their spare time to come up with a better recommendation system.
Nikki Mirghafori
So it's pattern recognition: based on what people liked, recommend what else they might like. Anyway, long story short, you've been exposed to AI for decades, in so many ways. And people... yeah, I know, it's not new. You might have been exposed to it without realizing.
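The Netflix-style idea can be sketched in a few lines: find users whose ratings look like yours and recommend what they liked. The users, movies, and ratings below are made up purely for illustration.

```python
# Toy collaborative filtering: recommend a movie by weighting other users'
# ratings by how similar their taste is to the target user's.
import numpy as np

users = ["ana", "ben", "cam"]
movies = ["MovieA", "MovieB", "MovieC", "MovieD"]
ratings = np.array([
    [5, 0, 0, 1],   # ana (0 = not yet rated)
    [4, 5, 1, 1],   # ben
    [1, 0, 5, 4],   # cam
], dtype=float)

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def recommend_for(user_idx):
    """Score the user's unrated movies by similar users' ratings."""
    sims = np.array([cosine(ratings[user_idx], ratings[j]) if j != user_idx else 0.0
                     for j in range(len(users))])
    scores = sims @ ratings                   # similarity-weighted ratings
    scores[ratings[user_idx] > 0] = -np.inf   # don't re-recommend rated movies
    return movies[int(np.argmax(scores))]

print(recommend_for(0))  # likely "MovieB": ben, whose taste matches ana's, loved it
```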
Rich Fernandez
Exactly. So, from my Google days, as you probably know, I sat in a lot of rooms where we were talking about the Knowledge Graph and what we derived from the Knowledge Graph, that web of information people are constantly generating to indicate their interests and what they're searching for, and how we could auto-feed that.
Rich Fernandez
Or then it became the social graph, as we brought in social media and Facebook was coming up, and then Google was trying to invent the competitor, Google+. So: how do we harness the knowledge graph to create exactly that, the recommendation engines that help people feel like they're getting targeted information in their social media landscape? And so, let's see where we are now.
Rich Fernandez
Yeah. Decades.
Nikki Mirghafori
I think it's been decades. For example, in the case of speech and speaker recognition, the research started maybe, I would say, 60 or 70 years ago. So it's been going on for a long time; it's definitely not new. And all of these examples are narrow AI, and they're very powerful.
Nikki Mirghafori
We need to be very concerned about narrow AI because, when we're talking about the ethics of AI, we can do a lot of damage. For example... actually, I'll get back to this. There are so many footnotes. Yeah.
Nikki Mirghafori
Let's come back to this. Okay. Now, general AI. AGI, artificial general intelligence, is a hypothesis. It's not a reality. It's a hypothesis that someday there will be machines, AI systems, that can learn on their own. They are not dependent on human beings to define the input and the output, and they can actually define the algorithm, the learning algorithm, themselves.
Nikki Mirghafori
They can shift and change. So that's one hypothesis. And then there's another layer of hypothesis on top of that: that, oh yes, this machine someday can also become conscious and sentient and have feelings, et cetera. So those are the two different levels, but let's just stick to the simpler one, which is that they can basically think and decide and learn for themselves.
Rich Fernandez
Yeah. And I think that's the one for our audience, who are probably interested in professional development, learning, and leadership, these attributes of learning or education, especially in an organizational context. I think that application is the one that's really front and center as we consider the future of work: what does this mean?
Rich Fernandez
You know, AI learning how to do these tasks, and what does that mean for the workforce, for upskilling? Well...
Nikki Mirghafori
Actually, Rich, I'm going to disagree with you here, because we don't need AGI for that at all. Narrow AI can really help with these learning tasks as long as we're domain-specific. I think narrow AI is sufficient for these learning processes as long as it's within a particular task.
Nikki Mirghafori
What really distinguishes AGI is that it can define a new task in a new domain. So, for example, if it's a computer vision system, now it would figure out how to gather data from different written corpora, how to clean up the data, and how to train itself to do speech recognition, for example.
Nikki Mirghafori
So, domain independence; that's what I mean. Right. So yeah, we don't need that. For what you're pointing out, people who are interested in learning and development and culture, et cetera, narrow AI is good enough, and that's what we have today. Because with AGI, many scientists don't believe it's even possible; we're not there, and we may never be there.
Nikki Mirghafori
And even if we ever get there, it's going to be years and years away. So let's not bank on it, and we don't need it. And there are so many ethical issues and questions about AGI, and many people are actually afraid, or rather freaked out, about AGI, which we don't even have.
Nikki Mirghafori
We may not ever have it. And that's where people think about AI systems becoming overlords and learning how to do everything, superintelligence, et cetera; that's where it gets scary and juicy, but we're not there. So in this way, I'm trying to calm the fears of people who are afraid of AI, because we don't have AGI.
Nikki Mirghafori
We may never have AGI; again, opinions differ, and even for people who think it's around the corner, "around the corner" is, you know, ten or twenty years away at least. But we don't need AGI, which is really a different animal. For example, if we come back to the example of ChatGPT, which many people might have a subscription to and use on a daily basis, it feels like, oh yeah, it's learning.
Nikki Mirghafori
I'm telling it, you know, giving it some examples, and it's doing its research. But it's pattern recognition. It is pattern recognition; it has just seen so many different patterns, and it's generating a new pattern within this domain. Within this domain. It's not learning on its own. And when it's being adopted by various companies, and various people are coming up with new applications,
Nikki Mirghafori
they are defining what the domain is. They are figuring out how it needs to behave in a particular domain, I don't know, the law domain or this domain or that domain. They're defining it. So that's a really important distinction to keep in mind. And with narrow AI, actually, before I go into the ethical issues, I'm just going to pause for a moment.
Rich Fernandez
Yeah, thank you for the pause, because I think even if we focus on what we're calling narrow AI, the ChatGPT example, or other models that are able to take the large sources they've been trained on and create content, there is still a way they may be put, perhaps, to nefarious uses.
Rich Fernandez
Right. We're seeing a lot of concern here in the United States, for example, about false images, false video, et cetera. We had the case in Pakistan, I think, with Imran Khan, the former head of state who was imprisoned, where artificial intelligence was used to generate his voice and put out a variety of communications to the public.
Rich Fernandez
And, you know, even ChatGPT and other models like it are all trained by humans and can be applied in ways good and bad. So I think that's one concern from an ethics perspective. But there's also this concern about the future of work, because even these narrow AI models can replace people. Let's take the simplest example:
Rich Fernandez
ChatGPT. Why would we need journalists when we could just train ChatGPT to go report on the issue? Right? So there is some concern, even with these narrow models, about replacement of the workforce, or a new way of needing to work with artificial intelligence. Maybe you're a marketing professional who needs to harness ChatGPT, and I'm just saying ChatGPT, but there are so many other tools out there, to create content, for example.
Nikki Mirghafori
Yes, exactly. So we need to be concerned about the ethical uses of narrow AI, because it's actually quite scary how easy it is. I was hearing a report some time ago about the duration of speech you need from someone's voice to artificially create longer sentences, a longer audio segment, in their voice.
Nikki Mirghafori
It's scarily short, and it's become shorter and shorter and shorter. We need to be concerned about the ethical application. What kind of data is it being trained on? Is it inclusive? You know, there have been cases, I think maybe it was Google, or maybe it was some other company,
Nikki Mirghafori
I can't remember now, but the image recognition was not trained on a diversity of skin tones, so it was not even seeing darker skin tones in the dark; it wasn't finding those faces. Or take the case of scanning resumes and deciding who might get looked at by a human being or not.
Nikki Mirghafori
You know, the same biases that human beings have: oh, this person has an ethnic name, or, oh, it's a woman. The same biases will be propagated. And what happens to the data? How is the data saved? How is it shared? How is it protected?
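One way the "same biases get propagated" point shows up mechanically: if a group is barely represented in the training data, a model's accuracy on that group usually suffers, with no ill intent required. A small synthetic sketch, with invented data and groups, assuming scikit-learn and NumPy:

```python
# Synthetic illustration of dataset bias: the classifier sees 95% of its
# training examples from group A and only 5% from group B, whose pattern
# differs, so accuracy diverges between the two groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)  # group-specific ground truth
    return X, y

# Training set: heavily skewed toward group A.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=3.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on balanced held-out data from each group.
Xa_test, ya_test = make_group(1000, shift=0.0)
Xb_test, yb_test = make_group(1000, shift=3.0)
print("accuracy on group A:", model.score(Xa_test, ya_test))
print("accuracy on group B:", model.score(Xb_test, yb_test))
# Typically group A scores far better, purely because of who was in the data.
```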
Nikki Mirghafori
How is it used? I mean, there are so many issues. In fact, the AI Bill of Rights came out, and if you haven't seen it, and if the audience hasn't seen it, maybe it could be part of the show notes. It's great for seeing the different areas that people need to be concerned about.
Nikki Mirghafori
And companies need to show some accountability. For example, the areas that we've mentioned: data use, data security, what data the model has been trained on, explainability. Can we explain how the algorithm is making this decision, or is it just like, oh yeah,
Nikki Mirghafori
it's a black box, it's like a neural network, I don't know, data in, answer out?
Nikki Mirghafori
There's no explainability. And we're not even getting to the nefarious uses; of course, the nefarious uses are a case of their own. But even the unintended uses: propagating our biases, not safeguarding data, sharing or selling it. And basically, as scientists, as engineers, we're always optimizing for something.
Nikki Mirghafori
With these algorithms, we're optimizing some variable for something. And as humans, what are we optimizing for? Are we optimizing for maximum benefit, or maximum dollars? Are we optimizing for the greater good? And the greater good may look different from, oh yeah, let's just take whatever data we have and not care about this population.
Nikki Mirghafori
So, what are we optimizing for? What are we optimizing for? Yes, we need to be concerned in the long term about AGI, but we should really have a lot more concern about narrow AI, what is here and now. And this is a perspective that's also shared by others, for example Fei-Fei Li, who is a Stanford professor, and some people consider her to be the godmother of AI in this generation. It's the same perspective:
Nikki Mirghafori
yes, let's be really concerned about narrow AI right now, and let's not be freaked out about AGI, because it may or may not ever be here; let's keep an eye out. So together with her, I want to shift the conversation. I want to shift the concern that people have from being freaked out about AGI to being concerned, and really taking action, with narrow AI.
Nikki Mirghafori
So that was a long segue. Now I want to come back to the second point you made, which was about the future of work. What does it mean in terms of replacement? For example, you gave the broad example of journalists being replaced. What I'm seeing as a scientist right now, and also as an advisor, is that people are using these systems to help them, to help them accomplish more.
Nikki Mirghafori
For example, with ChatGPT, it can help me craft a better text. It comes up with a first version, and yeah, the first version is not exactly what I want it to be, and I edit it. But it's not producing the final version. It doesn't have the creativity.
Nikki Mirghafori
It has a lot more information, perhaps, but it doesn't have that nuanced way of seeing. And I'm just seeing that over and over and over again. Not to say that at some point it might have that, but maybe what we need to think about is not being scared of this technology, but how we can learn to actually empower ourselves to become more, and not just more productive.
Nikki Mirghafori
Productivity is not an end-all god that we need to pray to. But how can I be more empowered to be the best version of myself with these tools that I have available?
Rich Fernandez
That's right, that's right. And in much the same way, building on what we've been using them for all along, right? If you've ever used search, whether it's Google or Safari or Firefox, you're using AI, and you have been all along. And hasn't it helped you search for the things you've needed to create what you wanted to?
Rich Fernandez
You know, I also did my PhD, and Google wasn't around when I did it, so I spent way too much time in libraries looking at articles and trying to find journals that were bound and published in analog form. And today, when I write articles, the research is much more effective, efficient, and broad-ranging than anything I could have found by looking through the files in a library and trying to track down a book on the shelves.
Nikki Mirghafori
Yeah, you said it, Rich. I love that example. I remember going to the dusty basements, you know, the stacks. I remember the stacks.
Rich Fernandez
Paying by the sheet to try to photocopy the damn article. Because, you know, this is not ancient history. I mean, I'm somewhat ancient, but we're talking about just a couple of decades ago.
Nikki Mirghafori
We're both ancient, Rich, comparatively.
Yeah.
Rich Fernandez
Before Google and after Google. Well, in fact, what I would love to do is perhaps invite a slight pivot here and think about how, underneath a lot of what we've been talking about, the use of artificial intelligence in its narrow way, and this might apply to a future AGI as well, there is perhaps a consideration of the ethics of the thing.
Rich Fernandez
Right. So what I'd love to invite is some consideration of how we might take an ethical approach to the use of AI. And here I also invite you to bring in, I know one thing we haven't talked much about thus far, your very deep background in mindfulness and meditation and Theravada Buddhism, which has its own ethical basis as well,
Rich Fernandez
and foundation. So how do we think about all of these things from your perspective, bringing in some aspects of your background? How do we think about benevolent AI, or at least the ethical deployment of AI?
Nikki Mirghafori
Yeah, yeah. Thanks, Rich. Thanks for the question. So, the question of ethics goes back to values. What is important? What are we trying to optimize, as I put it in nerdy terms, in engineering terms? What are we optimizing for? And can we take a large perspective instead of just a small perspective?
Nikki Mirghafori
Oh, I need to become famous, or get more money. That smaller perspective is often very entangled with our sense of self: me, me, me, it's all about me, what do people think about me, how am I doing? When we take a bigger perspective, we care about ourselves and we also care about others.
Nikki Mirghafori
We care about others, and about the impact that my actions have on others and on the bigger picture. So it's as if we take a bird's-eye view of the actions we are taking. It's easier said than done, always, but we take that pause, take a wider perspective: what are the implications of this?
Nikki Mirghafori
Are we just optimizing shareholder return without any regard or care for particular communities, for the environment, for who's being left out? So, to bring that into our perspective, to bring that into the conversation. In that way, the ethics of AI, or the ethics of any kind of work we do, becomes tied to personal ethics, to the way that we want to show up and have an impact in the world.
Nikki Mirghafori
And it's value-driven. If I were to die, and, you know, bringing in the death card is actually quite important, if I were looking back at my life from my deathbed, if I were to die, say, in a week or a month, not pushing it way back, decades from now.
Nikki Mirghafori
How would I look back at this moment and evaluate my decision? Is this kind? Is this compassionate? Is this for the benefit of myself and others, all beings? And really take that bigger, wider, bird's-eye perspective on our actions, instead of the narrow perspective that we usually take, just me and my community, my this, my company. A wider perspective: how can this be of service?
Nikki Mirghafori
How can this be of service?
Rich Fernandez
Yeah, I love that. And there's an "and" for me, because it occurs to me that, of course, this takes place in a context in which a lot of the fuel for these efforts is provided with expectations attached. Here I'm talking about venture capital, private equity, or simply reporting your quarterly earnings to Wall Street. These efforts occur in a model which socially rewards people for driving growth almost by any means necessary.
Rich Fernandez
And so how do you understand that conflict, and how might we skillfully work through it?
Nikki Mirghafori
Yeah, yeah. I think it's that "by any means necessary"; that is where the rub is. If it's presented as a conflict that just cannot be solved, if that is the way it is, then, you know, we can't do anything; we just have to sit and cross our arms. But I don't think that is the case. I don't think it is by any means necessary. And there are plenty of companies and plenty of entrepreneurs and plenty of people who are doing wonderful work in the world, still within this system that tries to push in a particular way.
Nikki Mirghafori
And I'm not saying that it doesn't push in a particular way; of course it does. Sometimes we're going within the stream but a little against the stream, trying to still be within the system, the capitalistic system, and yet shift and change and turn, instead of just putting all our values and intentions in our backpack and saying it's always just about making money, and whoever has a bigger paycheck and a bigger company and bigger this, whoever has the most shoes when they die, wins.
Nikki Mirghafori
I mean, we know that's not the case. We know that's not satisfying on a personal level, and that it leads to burnout and to a sense of, yeah, perhaps even nihilism. Like, what's the point of doing more and more and more, and bigger and bigger?
Nikki Mirghafori
Without the connection to meaning and values, on the personal level and on the systemic, team, and company level, if we lose that and it's just about more and more, it's not as satisfying.
Rich Fernandez
It sounds to me, and this is my perspective too, so I want to be authentic to this idea, that it is deeply personal. A lot turns on the personal foundation of leaders, the folks who are creating these systems, training these systems, deploying these systems: their personal formation, their personal values and ethics base as they engage in this work. Because they have to bring that as they confront,
Rich Fernandez
at the same time, frankly, incentive structures that reward the opposite: by any means necessary, get as many users as you can as fast as you can, so we can get our payday or IPO or whatever it is. So they're working against that in some ways, and it requires, I think, an internal compass, perhaps.
Rich Fernandez
Which brings us to the other aspect of your work, the mindful leadership aspect, the cultivating of internal qualities. So can you share with us, Nikki, a little bit about where you see the intersection of that inner journey and the outer work we do in the world?
Nikki Mirghafori
Yeah, absolutely. Thanks, Rich. I know this is an area that is of great interest to both of us, serving in this particular way in the world. To your earlier point, leaders shape cultures and companies in their own image. I know you've seen this over and over again: whatever their values, their personality, whatever they're optimizing for, it shapes the culture of the team, of the company.
Nikki Mirghafori
And so culture shift begins at the internal, personal level. There are a lot of ways to shift and turn and be transformed into better versions of ourselves. It's not just, oh, this is all I have, these are the tools I have, there's no learning to do. Of course there are tools and ways we can learn.
Nikki Mirghafori
I know that I'm a different person than I was before this path of practice, before the ways that I have shifted and changed through my practice. Maybe just to say a few words about the practice of mindfulness as a way of observing and seeing, and not shying away from what is being seen, internally and externally.
Nikki Mirghafori
Often, in the West, we think of mindfulness as just closing your eyes, sitting cross-legged perhaps, and going internal. But mindfulness is both internal and external. When we go back to the Buddhist roots, it is internal and external all the time, always. So it's not just becoming aware internally of what's happening in our mind, the mind state, heart state, the landscape of the mind, but also observing what's happening around us, being aware of the people, of the environment, of what is happening around us.
Nikki Mirghafori
So, with the power of stabilizing ourselves in a sense of embodied awareness, we actually show up from an embodied place of intentionality, instead of being like a headless chicken, as we often are, disembodied. One metaphor I like to use often is those pop-up dolls that have sand in the bottom, a large pop-up doll.
Nikki Mirghafori
If you knock them over, they stand up again because they have sand in the bottom. But if that sand were at the top, in the head, the doll would topple over and not get up; it would be imbalanced. So I like to think of mindful and compassionate leadership as something we grow into.
Nikki Mirghafori
It's not something we do; it's something that we become through practice. We embody being a leader who is stable, grounded, kind, and compassionate toward ourselves and the people around us. We have sand in the bottom. Embodiment, mindfulness of the body, is a big part of what I like to bring into our curriculum: the sense of embodiment for when difficult, challenging news comes, or a decision needs to be made fast.
Nikki Mirghafori
Do we have the sense of embodiment, of kindness, compassion, and wisdom that we can draw from? Or does the "what's going on?" moment topple us over so that we don't get up? So it really starts with that sense of connection: connection to our values, connection to our bodies, connection to our core of stability.
Nikki Mirghafori
And we can lose it; of course we will lose it. But can we come back to it? Can that be something we cultivate on a regular basis? And can we recognize it? Because for many leaders, it's not even recognizable. Like, whoa, what do you mean, be in my body, or feel my feet, or,
Nikki Mirghafori
while I'm having a conversation, be embodied? So, training leaders in these ways to actually have this embodiment: connection to their goodwill, connection to their values, connection to their bodies, and stability when they are engaging in work. Personal work, when they're sitting writing emails and making decisions; relational work, when they're in relationship, having difficult conversations, which we all have.
Nikki Mirghafori
Of course, it's part of being human; we have these courageous, difficult conversations. Can we be rooted in that? And when we're interacting with bigger organizations that are shifting and forming, can we be rooted in that? So in this way, this sense of personal values and ethics and connection, and the training that we do, the training that I do, can show up and serve in ways where I'm not trying to act in a particular way.
Nikki Mirghafori
It's not, "Oh yeah, Nikki said something about being," like, what does that even mean? You've practiced it enough times that it just shows up. You become it.
Rich Fernandez
Yeah. I think, being a practitioner, or at least an aspiring practitioner, of these practices, perhaps what we're talking about here is a certain groundedness, a certain agency, and a certain ability to have clarity and intentionality around the work, rather than being kind of like a flag on a flagpole, right?
Rich Fernandez
Whipped around by the wind. Instead, being the flagpole, the rooted core, the center, so that whatever direction the wind blows, whatever comes your way, as you said with that pop-up doll, it kind of rights itself. Because the complexity of the challenges and the demands will always be there for leaders, right?
Rich Fernandez
It's almost like a continuous state of being challenged. And...
Nikki Mirghafori
I like "stretch."
Rich Fernandez
Yes, exactly. It's just kind of what it is. It's a feature of our lived experience, being alive today with all the churn that's going on in the world, and it's even more amplified for leaders. It's a feature, not a bug: being stressed and challenged and disrupted is just going to happen on a continuous basis.
Nikki Mirghafori
Yeah, yeah, absolutely. It's part of the deal. And actually, hallelujah. The reason why I was suggesting the word "stretch" is because when I feel it, oh yeah, this is really stretching me, I welcome being stretched, because now I can contain more. I'm containing more; I have more capacity than I did before.
Nikki Mirghafori
Whereas if I just think I'm being challenged, it's like, oh my God, I'm being challenged again, oh, yet another challenge. Instead, it's: no, I'm being stretched. So for me, that metaphor personally works better.
Rich Fernandez
So you rise to meet the stretch, I think. Yeah, there you go.
Rich Fernandez
Yeah, I think we could probably talk forever about all of these things and so many more. But I just want to be cognizant of the time we have together. And we'll share the information also, but if folks want to find you, tell us where you are, how to find you.
Nikki Mirghafori
Yeah. Thanks, Rich. They can find me online on my website, https://nikkimirghafori.com. My first name, unpronounceable last name, dot com.
Nikki Mirghafori
Or, in Google or any other search engine, if they just put in "Nikki meditation" or "Nikki meditation AI," I seem to be the first organic hit that pops up, and they can find the classes I teach, talks, lectures, various offerings. And, yeah.
Rich Fernandez
Thanks. Well, I can't thank you enough for sharing your deep expertise and wisdom, and for bringing both of those today. Thank you so much for being here, my friend. It's a joy, as always. And I hope, as we said at the beginning of the podcast, that everything we talked about will be of service and benefit.
Nikki Mirghafori
Yeah. Thank you, Rich. Thank you so much. It's been so much fun to connect and to chat today. And as you said, we could just keep talking, you and I. This has been fun. I really appreciate you inviting me to be a guest and connecting with you today. And may it be of service in ways that we cannot foresee.
Nikki Mirghafori
Yeah. Thank you so much.