S.04: E.03

The Innovator Ethos

with Rebecca Reser

October 22, 2019 • 36:52 min




Rebecca Reser has a certain affinity for systems design—at one time, she was on a path to international policy. Now an experience strategist, she uses the same suite of techniques to inform her approach to innovating the enterprise omnichannel experience. Spoiler alert: the process to create effective policy is not far removed from the one to design effective experiences. In this conversation, we explore how technology can augment human ethics, designing business infrastructure to support innovative experiences, and why design thinking is the key to solving the world's toughest problems.

To hear more from Rebecca, find her on:
Twitter: @rebeccareser
Instagram: @rebeccareser

Written, produced, and hosted by Kenzie Haynes.

Kenzie: Welcome back to It's Worth Doing Right, a collection of conversations about the creative side of strategy. I'm your host, Kenzie Haynes. Today on the show, we're talking to Rebecca Reser, top experience strategist and one of those people you go to when you have a deeply philosophical yet highly technical quandary to unravel. Rebecca has a fascinating background touching most culture-shaping industries and pulls on all of it to define sophisticated future-forward omnichannel experiences. We talk the talk on everything from the ethical nature of feedback loops to the future of the world. Let's dive in. Welcome to the show, Rebecca.

Rebecca: Thanks, Kenzie. I'm excited to be here today.

Kenzie: So, just to kind of set the scene, can you describe your career, kind of the main milestones of where you started and where you are today?

Rebecca: Sure. So, in school, I was a European studies major and I always thought I was going to go and do some sort of idealistic, make-the-world-a-better-place thing with public policy or international trade, and that's kind of where I started my career. After doing that for a couple of years, I saw how technology was changing the world a lot faster than politics was. Things like Facebook were coming into the core of how we're interacting day-to-day. So I made the shift over, worked in a small agency, hopped to another one, and in the process started my own company. For about three years, I ran my own company and did consulting in the digital strategy and human-centered design spaces. Then I got scouted to come to a Fortune 100 financial services company. And I've been there for the last four years.

Kenzie: And I understand that the company actually went from the Fortune 200 to the Fortune 100 while you were there.

Rebecca: Yeah, that is true. I was in corporate strategy when that happened. We were pretty excited about it, but there was a lot of momentum that led to that. So, we probably shouldn't take all of the credit being in strategy, but it was an exciting time.

Kenzie: Yeah, we'll connect that dot. So today, you've moved from kind of traditional design roles, experience design, more into the business design role. Can you talk about what that means for you? What is business design to you?

Rebecca: Yeah, sure. When I started in the digital space, I was really just doing what you could at a kind of entry level. So I was an account manager type who would convey what the technologists were doing to the clients. And that got me familiar with their tactics and with how to talk to a non-technical audience about their own business goals and how those could translate into the digital space. Really, even back then, we were talking about a little more than just digital. It was digital plus phones. So, advertising campaigns: you have people call in, you have people who might want to send a message online. It was kind of like the start of an omni lens, but from my own perspective, I thought it was all just digital.

Rebecca: And so I think that as I've gone through my career, I've broadened out from a real digital focus to what we call an omnichannel kind of focus. So, now I do omnichannel strategy. And that's not just designing for the digital space, it's looking at the customer or the user in every plane they're in: walking around day-to-day with distributed devices like Alexa, text messaging, calling on the phone, interacting on social media, going directly to websites. So, there are obviously a lot of channels that are sometimes happening in isolation, sometimes happening simultaneously. And so, it's that broader perspective as to how you systematically approach designing an experience as the destination, as opposed to a channel as the destination.

Rebecca: So, now that I'm sort of in the business and able to do strategy, I'm using things like design thinking methods to facilitate robust conversations across the company in order to gain alignment and do it in a conflict-free way. It helps us rapidly move through what could be a lot of bureaucracy or silos of the past and truly go take an integrated approach to marching towards our strategic direction. Getting strategic alignment, breaking that down into workable problems, and getting people oriented in the right direction with the research to give them a strong launch. When you're in strategy, you have a lot of unknowns. And so, I think it's kind of fitting actually that I ended up in this space because while I love the building of experiences, the philosopher side of me gets excited about the fact that I can apply that innovation, that creative lens to every aspect of designing the business.

Rebecca: Where I like to start is still with that human-centered design lens. So, making sure that whatever we're doing as a business fundamentally starts with a human need and is going to address that in the most innovative kind of way. Then from understanding what that target vision is, deconstructing all of the things you would need in order to create that and so that those become like a backlog of problems to solve.

Kenzie: And what I love about kind of how you position your role is it's not just identifying the problems and laying out a roadmap to solve them, but also looking at the business infrastructure and laying the groundwork that the business can support the solution once it's in place.

Rebecca: Yeah, I think I'm lucky that I work in an organization large enough to recognize the persistent need for investment in the infrastructure. And I guess maybe that's just the times that we're in. But having been in consulting, working with many startups or solopreneurs, it's just not always possible financially for companies to kind of take that approach. We know that innovation is so critical to staying competitive and that the landscape is changing very quickly. And of course, when you're in any sort of enterprise-scale business, you encounter legacy technology debt. So, I think in some ways, it's almost not an option to not innovate at this point because we just have so much to overcome.

Kenzie: It's almost a liability.

Rebecca: It is. And so, these functions exist. I didn't know how to deconstruct infrastructure and map that to the strategy before I got to where I'm at. I did it on a smaller scale before, like I would talk to a client and do the research and design a website hypothetically and then sell it back to the client. Like, "Okay, this is what I think we need and this is how we would build it and this is how many hours I think it will take. And I think it would meet these needs for these users. And we're going to position your brand like this."

Rebecca: Now it's really kind of shifting gears into the space where there are literally people whose jobs it is to just... You have technical architects, but you also have process architects and information architects and human resources architects. And you're taking all of these lenses to where you're trying to go and making sure that everything that you would need would be accounted for down the road as you're going there. Now, it's not saying that we're going to build all these things, and then we're going to build the product or whatever the experience would be. It's, again, solving the problems in the right order to deliver just enough value to continue to march down that road. If you stopped delivering value, then you need to pivot.

Kenzie: From your perspective, where are we now in AI? What are we able to use it for? What are we using it for? And what is the untapped potential that we haven't unlocked yet?

Rebecca: We're very high up on the hype curve of AI, and it's just a natural progression from having the big data explosion. And then it's like, "Well, what do we do with all this data that we've collected now?" So you start to-

Kenzie: How can humans possibly parse through all of this?

Rebecca: Right, and you start to try to get these insights from crunching algorithms. But when this topic comes up, I really have to mention Tricia Wang, because I've learned a lot from her about the integration of the quant side with the qual side that I'm maybe more familiar with, coming from a design background, and how they work beautifully together. Tricia Wang has a term for the qual equivalent of big data in the quant world: thick data. And she talks a lot about how humans are able to perceive trends that data might not be collecting yet. So we might have instrumentation, but it's not oriented at the newest trend. And so without a human lens, you're not going to see the signal in the noise about the next thing that's going to shift your business model.

Rebecca: So, where I see AI coming into play is really the convergence of those two lenses. And it'll be very interesting to see the art of the possible as AI, general AI, gets developed and is able to account for both of those lenses. Because again, that's taking the human lens and the machine lens and marrying them, even opening new doors of what could be possible.

Kenzie: Can we talk about behavioral design?

Rebecca: Yes.

Kenzie: Yes. Okay. So, something that really fascinates me is this notion of behavioral design. I think as an industry, we've gone through a decade of being really excited about experience design and really pushing that, brands designing these experiences that ultimately fold back into the brand and push their KPIs. And we've seen how powerful that is. We've seen how successfully design can push an experience and kind of manipulate behavior. Now we're kind of seeing that play out in these really public venues and how that can go wrong, how that can go right. And I think now we're sort of recoiling on ourselves and thinking, "Well, if we can manipulate these experiences and manipulate behavior so profoundly, maybe we should pivot that towards focusing on ethical ways of doing so that we are only focused on the consumer and benefiting them. And in turn, that's going to fold back into the brand and benefit the brand." What's your perspective on that?

Rebecca: It's just so timely. Because absolutely, everything you just said. In 2018, I went to the Rosenfeld Media Enterprise Experience Conference, and ethics was a huge topic. And there were some great metaphors to medicine and the Hippocratic oath, like maybe we're at a time when the maturity of the design industry has reached a place where we ought to create some sort of equivalent to a Hippocratic oath. Because we are the ones who are creating these systems, essentially, that now have huge implications when the design can scale. You're designing something once and then it can scale to a billion interactions. Or more. Being thoughtless about designing that one thing could have billions of ripples of implications in people's lives. And especially when we get into behavioral design, designing systems that actually are intended to influence people's behavior for one reason or another, that's where you really get into these huge ethical constructs. And so there is plenty of work to be done in this space.

Rebecca: But ultimately, I mean, if you're working at a company right now, and you're being pushed in this direction of behavioral design, and you're feeling uncomfortable about the ethics, I think that's the time to really align with those you're working with around the core values you have for this product or project, and make sure that everybody is going to walk forward in integrity around those values, and that you've thought through the ways you're going to control for those values. So, whether it's things you introduce into user testing or feedback loops once things have been released into production: making sure that you're not just designing the thing, but thoughtfully considering all of the ways in which the design of that thing could go wrong, and what signals and feedback loops you have set up to make sure you know when those things are going wrong early enough to be able to course-correct.

Kenzie: From a philosophical perspective, how can designers of these systems make these ethical judgment calls? How do you know that the experience you're designing is in the end user's best... Oh, my God.

Rebecca: Interests?

Kenzie: In their best interests. How are we qualified to make that judgment call?

Rebecca: It's a big question. And I think relativism is always real, because no one has an objective perception of reality. So, I think the answer to your question is, it's not possible. The best we can do is the best we can do as humans, plus the machines that we're working with. So, in a bank, for example, I can tell you that you are regulated to have controls in place to know when certain things are performing correctly and when they're not. And that's actually one of the best learnings I've had working in financial services: the importance of building controls, because otherwise, things go wrong and you don't know that they're going wrong. Whereas many industries don't have the level of oversight that requires that level of infrastructure and maturity in the constructs of whatever processes they're putting in place.

Rebecca: As we develop more industries that are in these very sensitive parts of our life, either by regulation or by social pressure, we need to make sure that those companies have a basic level of expectation from their consumer to be responsible about the way that they're designing those types of experiences. So, I think what's difficult is that we are in a place of such high apathy when it comes to our consumer world, it's like... I remember in college I was, again, a European studies major and I focused a lot on kind of modern European history and thinking about surveillance in the Soviet Union or Soviet-occupied Eastern Europe and how that impacted people's lives on a day-to-day basis. And their political alliances because they were constantly living under the fear of observation and having that information used against them in a very dangerous way.

Rebecca: Today, it's like we've pivoted into a world where that is still happening, but it's happening so far removed from my direct perception that I've lost my worry about it. But when you-

Kenzie: Well, and we've also built a consumer world where observation is the value. Like social media: the value of social media is that other people can observe your thoughts, your life, anything you want to be observed, and maybe things you don't want to be observed, like your data.

Rebecca: Right. But a general consumer doesn't understand that some type of computer could then take that data that you've given to that social media platform and essentially reverse-engineer your psychographic constructs to the extent that you could actually be manipulated into voting for a totally different person than you might've voted for. I think that's the level of manipulation and data utilization that is out of our perception when we use these types of platforms. And most consumers are not aware of the sophisticated backend inferences that can be made.

Rebecca: There's a great example. A couple of years ago, Netflix had some sort of crowdsourced project where they released specific data from their users that was anonymized. I think it was some sort of competition to develop a new algorithm for providing recommendations. Netflix actually ended up getting sued by one of the people whose data was released, even though it was anonymized, because based off of what she watched and where she lived, you could actually triangulate who she was and that she was gay. She basically said, "You have exposed me." So, it's that level of, sometimes just a few data points, even when anonymized, are still exposing people. Again, we're not really equipped for that as consumers. Normally you're just thinking, "I'm just using Netflix so I can watch my favorite shows." You're not thinking, "And this is going to out me as gay in my community."
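The re-identification risk Rebecca describes can be illustrated with a small sketch. The data, names, and fields below are entirely made up for illustration; the point is only that a few shared quasi-identifiers can link an "anonymized" record to a public one:

```python
# Hypothetical illustration of re-identification from "anonymized" data:
# a few quasi-identifiers (here, a zip code and activity dates) can be
# enough to link an anonymized record back to a named individual.

anonymized_ratings = [
    {"user_id": "u_4821", "zip": "05401", "rated_on": ["2006-03-01", "2006-03-04"]},
    {"user_id": "u_1190", "zip": "94110", "rated_on": ["2006-05-12"]},
]

# A second, public dataset sharing the same quasi-identifiers
# (analogous to reviews posted elsewhere under a real name).
public_reviews = [
    {"name": "Jane Doe", "zip": "05401", "posted_on": ["2006-03-01", "2006-03-04"]},
]

def reidentify(anon, public):
    """Link anonymized records to named ones via shared quasi-identifiers."""
    matches = []
    for a in anon:
        for p in public:
            if a["zip"] == p["zip"] and set(a["rated_on"]) & set(p["posted_on"]):
                matches.append((a["user_id"], p["name"]))
    return matches

print(reidentify(anonymized_ratings, public_reviews))  # [('u_4821', 'Jane Doe')]
```

With only two data points per record, the overlap already singles out one person, which is why "anonymized" releases can still expose people.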

Rebecca: It's actually all coming to the surface, and I think in government right now too, you're seeing how the big companies, Google, Facebook, their executives are being called before Congress. And there's more interest in oversight of these platforms, obviously with what happened with our elections. There's more scrutiny here, but I think you're seeing the slow ahas coming to the forefront of our collective consciousness as a culture. And I think the next step is, what are we going to do about it? Just like when banks went wild and there was the Great Depression, or the markets crashed on whatever the derivatives market is made out of, a fantasy of a fantasy of a real financial instrument that somehow crashes our economy: those types of things play out, and then all of a sudden, unfortunately, the way we tend to operate is there's this crisis and then we realize, "Oh, we shouldn't have done it like that." So then there becomes more focus on, "How do we prevent that?"

Rebecca: So, that's where I think that if we want to be good designers, we can think about, again, what are the ways in which we can create these systematic rules and controls to ensure that the right outcomes are taking place and the intended outcomes are taking place? And that if they're not, that we have these signals and that they're all built around a core value system. I think eventually, we might get there where our government is sophisticated enough that it could be enforcing regulations within corporate America. But I mean, it's generally reactive in our history. So I would anticipate it's going to continue to be generally a reactive kind of approach to regulation.

Rebecca: And so when you're out front, when you're an innovator who's dealing with these questions, I think again, that's when you need sessions that are kind of like, "What could go wrong?" Then you protect the invention by making sure that as you're designing that project, that thing, whatever you're designing, you have mitigated those risks. And you're not just saying, "Well, we thought about the risks once and I think we accounted for them." You have consistent risk management and risk mitigation, and that is part of your ethos as a company, part of your operational cadences, and you have management routines around it. That's a level of maturity we need to get to with the type of power we're being handed in behavioral design. Because unfortunately, conversations around ethics are so few and far between. It's like the uncomfortable thing that nobody wants to bring up, unless you're Elizabeth Warren and then that's your brand. It's like, "We're going to talk about all the ways that these people are risky and how we protect the consumer."

Kenzie: So, why do you think that there is sort of this reticence to talk about it? Is it the lack of mental framework? We don't teach this in schools anymore unless you seek it out and then you're a weird philosophy major. But is it that we don't have the mental framework to sort of approach it and sort of understand it or is it that we are afraid of what we're going to find if we explore it too thoroughly, that it may shut down our innovative ideas, that it may kind of expose weaknesses in our business that may not be on the agenda for this quarter?

Rebecca: Yeah, I mean, not to be the behavioral science dork, but I think it goes back to human bias. Actually, I did a lot of research on people's biases, and there are thousands of biases built into the human brain. But particularly when it comes to thinking about negative outcomes, we have lenses that make us think, "Well, it will never happen to me." That's the optimism bias. It's not convenient. It's not necessarily what people enjoy thinking about or aspire to in life. But at the same time, it is critical, and I think there is something to be said about education, because that's one of the main mechanisms we have to combat human bias, to instill things in people, and to create forums for discussion.

Rebecca: And so, as opposed to taking all that we have learned from all of this science that we're using to market to consumers and turning it around to actually better humanity, we're only using it to market to consumers. We could take that same lens and apply it to, "How do we innovate education?" And we're just not doing that. So, this actually gets into a totally different topic, but it's something I think about in my spare time: the art of the possible in taking the problem-solving methodologies that we get through experience design for private organizations and Fortune 100 companies and turning them towards public problems.

Rebecca: And I just think we're at a time, you and I, I mean, we're millennials. We were kind of born into a world where it felt like the third act of a Shakespearean tragedy. It's like, "Will there be a future?" I don't know. Global warming is, or climate change is very real to us. In our generation, it's kind of like if we all step up to do something about it, if we don't just live our lives to make money, but if we live our lives to actually try and impact the world in a positive way, like pick a problem that is a very large-scale social problem and make sure that your career is not just about you making money but you also solving that problem, then maybe our generation will get to live into a four-act Shakespearean comedy.

Rebecca: So, it seemed like it was going to end in tragedy, and then we've changed the narrative arc. But if we don't, there is a sense of urgency that things will not work out well, you know? And so that's where I think we absolutely have to start to take these types of methodologies that are so effective in problem-solving, make them more ingrained in the regular education process, and provide those same types of skills to public problems so that we can actually move the world forward. And not just keep taking these old approaches to problem-solving that are, frankly, mired in gridlock and bureaucracy and corruption and on and on. I just feel like we're on the cusp of that, though. I really do.

Kenzie: Can we behaviorally design ourselves to focus on these big issues and really value outcomes over profit?

Rebecca: Well, it's funny too because one of the first jobs I had, I actually really thought about that because it was around a carbon exchange. So, if you're not familiar with what a carbon exchange is, in some countries, they actually keep track of how much large entities are polluting, how much carbon they're putting into our atmosphere, and therefore are taxed for exceeding what they have been allowed to produce in terms of carbon. Therefore, that creates a way to monetize both the buying and selling of carbon. And so, you can actually then create incentives for people to change behavior because you've actually valued the thing. So, to get less abstract, how high of a carbon tax would we have to have in order to create some sort of app or ecosystem experience where, as I live a more green life, I am actually financially rewarded for it?

Rebecca: I mean, maybe that's not going to be the way that we solve climate change and we won't need to do it, but it is one way you can approach solving that social problem. And so, there's other ways where you can take behavioral design and look at a social problem and think about what would we have to do from a policy standpoint in order to shift the way that society values X in order to create Y outcome. Taking again, these design thinking principles: rapid problem-solving, ideation, analogous inspiration, and applying those types of innovation techniques to these problems that I feel like tend to get a different type of problem-solving approach that hasn't been as effective.
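The carbon-pricing incentive Rebecca sketches reduces to a toy calculation: once carbon has a price, emitting less than a baseline becomes a direct payout. The price and emissions figures below are hypothetical, purely for illustration:

```python
# A toy sketch of a carbon-pricing incentive: savings relative to a
# baseline become a financial reward, and excess emissions a charge.
# The rate and the tonnage figures are made up for illustration.

CARBON_PRICE_PER_TONNE = 50.0  # hypothetical price, in dollars per tonne

def green_reward(baseline_tonnes: float, actual_tonnes: float) -> float:
    """Reward (positive) or charge (negative) versus a baseline."""
    return (baseline_tonnes - actual_tonnes) * CARBON_PRICE_PER_TONNE

# A household that cuts emissions from 10 to 7 tonnes earns $150;
# one that exceeds its baseline pays instead.
print(green_reward(10.0, 7.0))   # 150.0
print(green_reward(10.0, 12.0))  # -100.0
```

An app built on this kind of mechanism would, as Rebecca suggests, make the abstract value of "living green" concrete enough to shift behavior.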

Kenzie: For you, it sounds like design thinking has been a really useful tool, but I hear a lot from my peers that design thinking is overused. People don't know what design thinking is. It's kind of this esoteric term, catch-all term that is being thrown around. So for you, what is it and how do you utilize it to great effect?

Rebecca: What I find valuable about design thinking is probably because of the way I have seen it applied in institutions. And I think that a lot of the problems that our society faces are blockaded by bureaucracy. In an enterprise where there is a lot of bureaucracy, I've seen the success of design thinking methods in moving solutions forward and breaking silos. So, one, I think that it levels the playing field between introverts and extroverts. And so you, first of all, can get more solutions out of people in a shorter amount of time and give them an equal voice. I mean, and obviously, it depends on who is facilitating and how sophisticated their design thinking background is.

Rebecca: So, for sure, if no one's even affinity-mapping and you've just put ideas on a wall and you don't even take them forward, that is a waste of time. I can defend every form of design thinking. But for those of us who are skilled practitioners and have taken the time to really think through and test and iterate these mythologies of going from problem space to working solution, and not only that, but aligning all of the people involved that have to bring them that solution to life at scale and getting the momentum behind that. Also, I wrote my undergrad thesis around, how do you create the most legitimately perceived policy by the people that it would govern? How do you create good governance essentially, and at the time, the research basically said that those who are governing and those who will be governed ought to come together to create a common space for the conversation around that policy to be created.

Rebecca: And you set the vision from the top down, but you also provide the lens of the landscape with giving the ground up voices as well. You create policy in the language of those who are going to be governed. The constituents can't be speaking a different language than the policy that's governing them. They have to understand it in order to perceive it as legitimate, and they have to be involved in the creation of that policy as well.

Kenzie: This sounds familiar.

Rebecca: Right, right. And so, exactly. So, you pivot to 10 years later, here I am in my career, and I'm essentially applying what I learned in my undergraduate thesis research to what I do day-to-day. It's kind of funny, full circle. Taking teams that can sometimes be off in 10 different directions, bringing them together, creating a container for them to feel safe enough to share their ideas, heal from past discrepancies that might've happened. A lot of times too, in corporate America, we just silo, because how else do you operate at scale? When I first got to the company I'm at, I had no idea of any of the departments that existed outside of the one I worked in. You're just trying to learn the job. You're not necessarily trying to take in the whole entire thing. That's totally understandable, but it also creates problems. So, by using these methods, we're able to create a new way of deciding and essentially governing. We're able to move forward in a more democratic way. And ultimately, I think people then feel like whatever the outcome is, they have more ownership in it.

Kenzie: It's legitimized.

Rebecca: It's legitimized and they feel bought-in to keep it moving forward.

Kenzie: I think a big value that I've seen is, like you're saying, it breaks down the silos. It brings everyone together, versus just having an unstructured brainstorm, like let's get everyone in a room and we'll just throw ideas around, where I think all the cats quickly run in different directions and people aren't talking about the same thing, though they may think they are. The conversation loses momentum as those different threads branch out. Whereas with design thinking, it's usually based around a very specific agenda driving at a very specific outcome: we need to decide X through this process. So you've already established the outcome and the agenda to get there, and you break down the pieces so that you start at square one. You don't assume everyone knows square one. You may know square one as the primary stakeholder, but so-and-so from marketing, or this person from accounting, they don't know square one.

Kenzie: So let's get everyone on the same page on square one. Great. We agree, we have a shared understanding of that. Now let's move forward. And then the other part of that is we're establishing this kind of context and shared understanding, but we're also giving a framework for it that, like I said, there's an outcome. We need your input and here's kind of a constructive way to provide that input instead of everyone sort of going through their therapy session. Because I think we've all been in those great work therapy sessions where everyone has a solution to a problem that no one else is experiencing or just a problem to vent. This is kind of saying, yes, there are problems, but let's walk through methodically and kind of arrive at a solution through our shared understanding.

Rebecca: What is strategy without execution, right? I mean, and the other thing is culture eats strategy for breakfast. So, you can have great strategy, but again, if it's not something that is the status quo, if it doesn't fit to the status quo, if it doesn't even have some sort of reverse-engineering to make sure that the status quo somehow bridges over to fit this, it's not going to go forward. It's just too foreign and so, you have to take that into consideration not just in the deliverable but in the way that the deliverable is produced.

Kenzie: In the broad scheme of things, in your career, in what you see possible with technology or design or customer experience, what are you most excited about? What's got you on edge? Like, "I can't wait to see more from this, or to see this evolve further."

Rebecca: Well, I mean, quantum computing for one. I think we barely understand the laws of physics that exist in this universe, and what quantum computers could potentially reveal to us, and how that could shift our perception of reality. That's one whole area of fascination for me. I also get excited about the art of the possible with distributed ledger technology applied in our society. Hopefully not controlled by the powers that be today, but if leveraged in certain ways, it could be how we get to true democracy without corruption, how we get to levels of ethics built into the everyday systems that are backing every type of interaction we have, in a way that, again, corruption is not possible. Ethics are baked into the code.

Rebecca: So, for example, today we can have a voting system online. And it can be corrupted if somebody takes the code and just manipulates it slightly, where it's like every third person who votes no, change their vote to yes. Unless again, you have the right controls in place to totally insure from end to end, that system is monitored for any sort of corruption. It is possible that corruption will happen. But if you're operating in a different type of system where there is no ability to erase something, ever, then you have complete auditability. And therefore, you can create these types of closed systems that can be as ethical as we could design them. And so-

Kenzie: Ethical in their incorruptibility?

Rebecca: Yes, yes. So, you could then design a system where everyone's vote is recorded and it's immutable. Someone could try to change it, but there would be a record of that change happening. I think, again, this is exciting in the possibility for certain use cases, and it's terrifying in the application towards others. Because I think humans operate with a certain level of gray ethics day-to-day. I wouldn't want every thought I had to be immutable and public. If we apply it to problems like voting, it's an exciting future to think about. And yet if we apply it to surveillance technology, where you can take something, turn it around, manipulate the construct by which it's portrayed, and do evil with it, there are still ethical problems there. So, yeah, I think those are a couple of things that I get very excited about. Just the ethical problems.
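The auditability Rebecca describes can be sketched as a hash-chained log, where each entry commits to the one before it, so altering any vote breaks every hash after it. This is a teaching sketch, not any real voting or distributed-ledger system:

```python
# A minimal hash-chained log: each entry's hash commits to the previous
# entry, so any tampering with an earlier vote is detectable later.
import hashlib

def entry_hash(prev_hash: str, vote: str) -> str:
    """Hash of this entry, chained to the previous entry's hash."""
    return hashlib.sha256((prev_hash + vote).encode()).hexdigest()

def build_chain(votes):
    """Record votes as a chain of hash-linked entries."""
    chain, prev = [], "genesis"
    for v in votes:
        h = entry_hash(prev, v)
        chain.append({"vote": v, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Recompute every hash; any edited entry breaks the chain."""
    prev = "genesis"
    for e in chain:
        if e["prev"] != prev or entry_hash(prev, e["vote"]) != e["hash"]:
            return False
        prev = e["hash"]
    return True

chain = build_chain(["yes", "no", "yes"])
print(verify(chain))      # True
chain[1]["vote"] = "yes"  # flip a "no" to "yes", as in Rebecca's example
print(verify(chain))      # False: the tampering is detectable
```

Real ledgers add distribution and consensus on top of this, but the core tamper-evidence property is just this chaining of hashes.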

Rebecca: I'll say one more thing too. I do wonder what it will be like in 20 years, or maybe longer than that, when we get to a place where, say, all coding is done by machines. How are younger generations going to know how to think about these problems philosophically? Because if they weren't the ones who designed or had to build the systems, they might not be sophisticated enough to think about how to protect themselves from the potential evils those systems could create. So, I guess we're ending in a singularity. I don't know.

Kenzie: Wow.

Rebecca: Apropos, right?

Kenzie: Well, Rebecca, thank you so much for coming on. It's been an absolute pleasure, meandering through the art of the possible with you.

Rebecca: Thank you so much for having me on, Kenzie.

Kenzie: That's our show. If you'd like to learn more from the Accomplice team, visit us at itsworthdoingright.com. Or drop us a line at podcast@itsworthdoingright.com. See you next time.