S3E30: Bethany Sonefeld Envisions a Healthy, Balanced Tech World

Bethany joins us today to discuss her latest project, Create with Conscience. We also talk about the balance between technology and a healthy lifestyle, and the difficulties end-users face when trying to build that balance.

Listen

Apple Podcasts
Spotify
Amazon Music

Show Notes

In today’s episode, Bethany joins us to discuss her latest project, Create with Conscience. We talk about the balance between technology and a healthy lifestyle, the difficulties end-users face when trying to build that balance, and how creators can use Create with Conscience to build ethical technological systems. Bethany shares the three principles of Create with Conscience that help create healthier technology, some of the ways you can start to build healthy boundaries and values into your tech design, and how businesses as a whole might be incentivized to implement these ethical changes. We also get into how modern tech products are designed to manipulate and control end-users’ time, and how the battle for attention across all sorts of tech products and design patterns has contributed to the attention crisis in the modern world.


In this episode you will hear: 

  • What is Create with Conscience?
  • How creators can use Create with Conscience
  • Principles for creating healthier technology
  • What makes for healthy boundaries
  • Giving control back to the users
  • Tackling the addictive power of modern tech products
  • The attention crisis of modern society
  • What are the business incentives for these changes?

Bethany is a designer who specializes in systems thinking, detail-oriented design, and scalable enterprise solutions. Her past roles include leading IBM’s Carbon Design System and building a team for Cloudflare’s Zero Trust product.

Currently, Bethany is working as a Design Manager at Duo Security while also building Create with Conscience, a space dedicated to educating and committing to designing healthier technology.

Connect with Bethany Sonefeld:

Website: createwithconscience.com

Email: [email protected]

LinkedIn: linkedin.com/in/bethany-sonefeld

Twitter: twitter.com/bsonefeld

Connect with R Blank and Stephanie Warner: 

For more Healthier Tech Podcast episodes, and to download our Healthier Tech Quick Start Guide, visit https://www.healthiertech.co and follow https://instagram.com/healthiertech

Additional Links:

Transcript

Bethany Sonefeld 0:00
figure out what’s going to work for you to change that behaviour, if it is something that’s affecting you in a negative way. I think a worry I have is that we’re going to become so engrossed in these devices, because they offer this immediate dopamine hit, that the world around us is becoming less exciting, and the people around us are becoming less exciting. So I am worried about the long-term effects. And I think it starts with just kind of reflecting on your own personal use, and understanding what you want your relationship with tech to be.

Announcer 0:33
Welcome to The Healthier Tech Podcast, the show about building a healthier relationship with modern technology. Now, here are your hosts, R Blank and Stephanie Warner.

R Blank 0:46
So that conversation with Bethany, well, it’s got to be the most aligned conversation I think we’ve ever had on the podcast. Her language, her mission, her goals. I just can’t wait for the listeners to get introduced to Bethany and the Create with Conscience movement.

Stephanie Warner 1:04
Yeah, I totally agree. And I think our listeners are going to love the tips that she brings to the table.

R Blank 1:09
Excellent. So let’s get into it with Bethany Sonefeld. Bethany Sonefeld is the founder of Create with Conscience, a space dedicated to educating and committing to building healthier technology. Create with Conscience was something Bethany developed out of interest in creating a healthier balance of technology in her own life. Bethany is a design manager at Duo Security and was previously at Cloudflare, RetailMeNot, and IBM. Welcome to The Healthier Tech Podcast, Bethany.

Bethany Sonefeld 1:39
Thanks, y’all. I’m so excited to be here with you two.

R Blank 1:43
So, just a random guess: you’re a Texan?

Bethany Sonefeld 1:47
Well, I call myself an adopted Texan, because I am actually from Michigan originally, but I’ve been in Texas going on eight years.

R Blank 1:57
Oh, wow. Yes. So here’s the most pressing question that we have for you today: what kind of dog is Oakley?

Bethany Sonefeld 2:06
Oakley? Oh, gosh, she is a mix of a bunch of different breeds. I was a crazy dog mom and did one of the DNA tests for her. And it came back that she’s part chow chow, part border collie, and part... what was the third one? I can’t remember. It was a whole bunch of different percentages. But she’s got this black fluffy coat, and just this really cute, Rottweiler-looking face.

R Blank 2:34
She’s, uh, she’s adorable, and her sunglasses are adorable.

Stephanie Warner 2:39
We will put the image up in the show notes so everyone can enjoy.

Bethany Sonefeld 2:44
Good show.

R Blank 2:46
So you founded Create with Conscience, which I mentioned in the intro, out of an interest in creating a healthier balance with technology in your own life, as a creator and a consumer. So, a couple of questions on that. First is, how would you describe Create with Conscience? What is it? What kind of project is it?

Bethany Sonefeld 3:04
Yeah, great question. So right now, in the state that it’s in, it is a space that can hopefully empower creators. And I call them creators because I don’t want to say designers or engineers or product managers; creators are anyone working in the tech space. It is there to help empower people to build healthier technology, whatever that might look like in their organisation. I like to keep it pretty broad to start out, just so that there are guidelines and principles to follow, and research that I’ve shared with folks that can hopefully help people gain an understanding of why this is important, and how to get started with implementing something like this in your everyday life, like you mentioned, as a consumer and as someone that’s creating this technology.

R Blank 3:56
And this is part of what I find so interesting. I mean, the issues you’re talking about, we talk about a lot of them on the show and in our community, and they’re growing in awareness and acceptance, but they’re still, you know, not dominant. And in particular, I don’t often encounter people with these views and opinions from within the tech world. So what started you, and I know this comes from your own personal experience, but in a little more detail, what started you on this journey and thinking in these terms, with this kind of language?

Bethany Sonefeld 4:28
You’re right, it started with a personal problem: I wanted to create a healthier balance with technology in my own life. I found myself on my device all the time, spending hours on Instagram, and it just wasn’t making me feel great. And so I started researching Tristan Harris’s work. I call him, like, the godfather of ethical design in tech; he co-founded the Center for Humane Technology, and some folks may know him from The Social Dilemma on Netflix. And it was really his work that inspired me to think that, in order to make change, it really has to start with the people that are creating these products. Because as much as we as consumers can try to build healthier boundaries with tech in our own lives, you have a thousand engineers on the other side of the platform working against you, working to build algorithms that are designed on purpose to be addictive and to meet business goals, which oftentimes are focused around user engagement. And so for me, it was, gosh, with my ethical values, I can’t continue along being a product designer and building products in a way that is not catered towards the health of our end users. Because in some way, shape, or form, we are all end users of some product.

R Blank 5:57
Yeah. And so, you talked about yourself from your individual perspective as a creator. What can individual contributors working in the tech sector do with the community and the tools and resources that you are building with Create with Conscience?

Bethany Sonefeld 6:16
Yeah, I think it starts with just educating yourself on best practices and around, you know, what has been done that’s deemed unethical or unhealthy. And that’s kind of a weird balance to ride right now, because it’s very opinion-based, and there are no regulations around tech as it stands today. Not yet, at least. So I would say it starts with just educating yourself on how you use products and technology and taking stock of how it makes you feel. And then, in order to actually implement change in your organisation and start these conversations, I like to encourage folks to just build a set of values, or a playbook of sorts. It doesn’t have to be, you know, you working all on your own; you can pull from some of your organisational values. So for example, at the company that I work for, one of our values is being kinder than necessary. And I think that goes a long way. If we’re building products, and we want to create a healthy balance with our users and the amount of time they spend using our products, being kinder than necessary is step one in that journey. So it doesn’t have to be this cumbersome process of, you know, creating an ethical document per se. But I think it just starts with asking the right questions of your product team and the people that you’re working with: is this the right decision for end users, or is this a decision that’s rooted in what the business needs?

Stephanie Warner 7:51
Great, I love that. I love, you know, being kinder than necessary. What a great goal for everybody, even just regular people who aren’t necessarily creators. But I wanted to take a little step backwards and talk a little bit more about what a creator would find on Create with Conscience. Like, what is it exactly? What kind of tools? And how are you amplifying and influencing the conversation?

Bethany Sonefeld 8:17
Yeah, great question. So Create with Conscience right now offers a set of principles that can hopefully guide folks to building healthier technology. The first principle is building in healthy boundaries. And that can look like having a snooze option. Bumble, the dating app, does this, where they give users the opportunity to step away from the platform if they need some time away. So thinking through, how do we build healthy digital wellbeing experiences into our own products? That’s principle number one. Number two is, on the flip side, anticipating what sort of unhealthy behaviours your end users might be going through, and how they might use your own product to potentially harm themselves. I think a lot of times what I’ve seen working in tech is we tend to focus on the happy paths, and what happens when our users love using our product and everything works out great. But that’s not the reality of who we are as human beings. We are complex people; we go through extremely complex emotions, like addiction and depression. And exclusion is one of them as well, where a lot of products unfortunately exclude certain groups of folks; they weren’t meant to be used in that way, but humans have taken them into their own hands, and unfortunately that’s kind of the nasty side of tech. So it’s digging deep into what these unhealthy behaviours are, having a conversation about them, talking openly, and making sure that we’re elevating that into our product development lifecycle. That’s principle number two. And then the last principle is empowering change. This is what I was touching on earlier: it’s great that we have these sets of principles, but how do we put them into practice? How do we match, you know, our business goals with creating something that’s healthy for our end users? And unfortunately, it’s not one-size-fits-all; I don’t have an easy answer for how to do that other than just continuing to have these hard conversations, and continuing to push the boundaries of what it is that we’re building. Because I think that creators have gotten less creative. For example, people say all the time that Instagram is copying TikTok, and we don’t see any sort of new patterns from a design perspective as much anymore, because we have this system that we can pull from. We have, you know, proven that push notifications get users to open your app. And so creators aren’t really pushing the boundaries of the technology we have anymore, because we have this set of default things that we know work. So it’s really about having those hard conversations and challenging each other to think outside of the box.

R Blank 11:22
So I’ve spent some time digging through your site, and I also watched this video of yours from, I think it was the Figma conference, and it’s really fantastic. We’ll put links to both of those things in the show notes. There’s a lot that I want to cover; I’m sure we won’t get to all of it, and I feel like I’m probably going to be jumping all over the place. But you were just talking about boundaries, and it reminds me, when I was going through your site, you talked about one of the boundaries being “all caught up.” And I remember just the feeling of reading those words. It was like this Pavlovian trigger of relief, because I immediately associated that feeling with that notification. I can’t remember which apps it’s in, because it’s in a bunch, but when you’re all caught up, you’re like, ah, you get this palpable sense of, you know, visceral relief. So before we move on, can you talk a little bit about “all caught up” and how that factors into your set of boundaries?

Bethany Sonefeld 12:29
Definitely. So we know that people are spending lots and lots of time on social media. So I think, given that knowledge, the “all caught up” design, my assumption is it was built to help users have that sense of, oh, okay, there’s nothing else for me to see here, because I’ve already read everything in my feed, right? So it’s building in these, I call them checkpoints: moments where you create a space for the user to pause and think about what they want to do next, versus an app like Twitter that operates off of infinite scroll. There was a study that found that if you implement infinite scroll on any sort of platform, your users will spend about 50% more time on that app, which is great for Twitter, right? They want users, you know, interacting on the platform. But I think there’s a balance there, where you can still build in healthy boundaries and have moments of pause. And your users may very well just continue to scroll, but you’re giving them a choice, and I think that’s the most important thing about that.
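
As an aside on the checkpoint pattern Bethany describes here, a minimal TypeScript sketch of a feed that ends in an explicit “all caught up” marker, instead of scrolling forever, might look like the following. The types and function names are hypothetical and not taken from any real product.

interface FeedItem {
  id: string;
  postedAt: Date;
}

type FeedEntry =
  | { kind: "post"; item: FeedItem }
  | { kind: "checkpoint"; message: string }; // a pause point rendered instead of more content

// Build the visible feed: show everything newer than the user's last visit,
// then emit a single "all caught up" checkpoint and stop, rather than
// silently appending older or recommended posts the way infinite scroll does.
function buildFeed(items: FeedItem[], lastSeenAt: Date): FeedEntry[] {
  const unseen = items
    .filter((item) => item.postedAt.getTime() > lastSeenAt.getTime())
    .sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime());
  const entries: FeedEntry[] = unseen.map((item): FeedEntry => ({ kind: "post", item }));
  entries.push({ kind: "checkpoint", message: "You're all caught up." });
  return entries;
}

// Example: two posts since the last visit on May 1, then the checkpoint.
const feed = buildFeed(
  [
    { id: "a", postedAt: new Date("2022-05-02") },
    { id: "b", postedAt: new Date("2022-05-03") },
    { id: "c", postedAt: new Date("2022-04-01") },
  ],
  new Date("2022-05-01"),
);
console.log(feed.map((entry) => entry.kind)); // [ "post", "post", "checkpoint" ]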

R Blank 13:39
So yeah, I’m not a big user of social media, but I am of, you know, Netflix. And so I guess the equivalent there would be when an episode ends and it automatically goes to the next one, versus when it stops and says, “Are you still watching?” and you have that opportunity to break the endless cycle of sitting on your couch and watching TV. So speaking of that, a lot of what you talk about kind of fits into this perspective of control, right? Who is in control? Is it you that’s in control, or is it your technology that’s in control? Can you speak a little bit about what you mean by control in this context, and how it influences Create with Conscience?

Bethany Sonefeld 14:26
Definitely. So I think that a lot of products, unfortunately, have taken away the choice for users. I have seen, I think on YouTube, they’ve now implemented it where you can disable autoplay, but it’s, you know, hidden, it’s hard to find. The products that we’re using, unfortunately, many of them have been designed to be manipulative and to control our time, because they know how to do that through certain patterns in the UI. And so the goal with Create with Conscience is to enable creators to give control back to their end users. And that can look different across the different sectors that you’re working in. I am working in cybersecurity currently, and so while I have a lot of strong thoughts on social media, it differs from what it might look like at my own organisation. So it’s really, I think, just giving back the choice to your users and building in opportunities for them to make a decision that’s going to be best for their own personal life.

Stephanie Warner 15:30
That’s great. That actually leads me to another question, on the user side. How can individuals know if their relationship with their tech is healthy? Like, are there tools or exercises that our listeners can use to help them understand better?

Bethany Sonefeld 15:49
Definitely. I always encourage folks to lean into any sort of native, OS-level patterns that have been implemented to help users create boundaries. So an example is Apple: in their latest iOS version, they really expanded on Focus modes. It started with just Do Not Disturb, where you could turn this on and your phone wouldn’t light up at all, and that was a great first step. And I think what we’re seeing now is big companies like Apple and Google are focusing more on digital wellbeing and expanding on certain things. You know, they’re giving users the option; not everyone is going to use Focus modes, but things like that, and Screen Time, are just helpful to get an understanding, from a data perspective, of how much time am I spending on my phone. And for some people it’s like, okay, I’m spending four hours on Instagram, what do I do with this data? There are tools to help you create boundaries, but I think it’s honestly a habit-breaking kind of cycle: if you want to create a better boundary with tech in your own life, it has to start with breaking the habit of scrolling on your phone before bed. So maybe set those limits for yourself; say, I’m going to spend 10 minutes, and then I’m going to put my phone away, or I’m going to go for a walk. I always encourage folks to just find what works best for them. Because I think that our phones should be an extension of who we are as human beings, and that’s not how they’re being used, in full, today.

R Blank 17:31
Yeah, so that gets back to the question of control, right? Because you would need to have intentional control over that relationship in order to think, oh, 10 minutes is all that I need, it’s all that makes me feel good; beyond that, I don’t feel good, so I need to put it down after 10 minutes, and I need to track 10 minutes. And that seems to be, I mean, there’s definitely more people talking about that these days, but that seems to be really hard when these devices and experiences are, as you often talk about, engineered for addiction. So what is a step that someone can take, and there may not be a good answer here, but what is a step that someone can take to start taking back control?

Bethany Sonefeld 18:17
I think for me, personally, what’s been successful is disabling a lot of push notifications that are unnecessary to what I may be doing on a day-to-day basis. I’m sure you all can relate: whenever you get a push notification, it’s like, hey, look at me, pay attention, open this app. And sometimes it’s really beneficial. I imagine folks that, you know, have children in daycare, they need to have their phone on, they need to be notified of anything like that. But there are some cases, like, you know, GrubHub or Uber or DoorDash, where it’s promotions that are typically sent in the form of a push notification. Whereas I think the purpose of a push notification is to alert your user that something needs their attention, and that’s not the way that they’re being used today. So I would say that’s step one; that helped me create a lot of healthier boundaries, because there wasn’t the temptation there. And then secondly, I’ll, you know, double-tap on the Focus modes. Love those; they have been a godsend in just allowing me to get into a flow state during the day. And even at night, when I’m unwinding, I don’t want to be bothered or notified, and so those have really allowed me to build healthy boundaries for myself.

R Blank 19:43
My phone, when it’s not in aeroplane mode, which is a lot of the time, is always in Do Not Disturb unless I’m expecting an important call or an Uber pickup. And I disable a lot of notifications just as soon as they pop up, in general, and I find that really helps my personal relationship with my phone, because otherwise it’s like a nagging spouse, you know, like, what? Why? Enough already, I don’t need to know this. So does this contribute to, and when I say this, I’m talking about the constant stream of notifications, does that relate to what you refer to as an attention crisis? Can you talk about what that is?

Bethany Sonefeld 20:38
Definitely. So I’m going to give a plug: I highly recommend everyone read Stolen Focus by Johann Hari. He did a deep dive into why we’re in an attention crisis and what that actually means. And I was surprised to learn that it’s not just because of technology; there are a lot of environmental factors that have contributed to this as well. But the section where he did a deep dive on the technology front was that, across the board, in any sort of software we’re using, there’s always this fight for something to get your attention. So an example is, in Google, I use Google Chrome as a browser and Gmail for my email communications, and there’s a number that pops up in the tab window that tells me I have new email. And my brain is immediately signalled: there’s something new here, what is it, who reached out to me? So it’s a human nature thing to find something that is, I guess, for lack of a better term, dopamine-inducing, because there are feelings of dopamine involved here whenever we see a notification, or we’re opening our email and seeing new things in our feeds.

R Blank 21:59
So I know what you’re talking about in terms of the dopamine triggers tied to notifications and badges. What I find interesting, and I don’t think I’m the only one, and I don’t know, at a physiological level, if this is tied to dopamine, but when I see badges, notification badges, to me that feels like a trigger for anxiety, not for joy. Do you know what I’m talking about when I say that, Bethany?

Bethany Sonefeld 22:28
Definitely. And unfortunately, the badge pattern originally was meant to alert users of things needing their immediate attention. And patterns like that have now kind of been manipulated: hey, we know we can get users on our platform if we ask them to turn on notifications, or turn on badging, or some other form of, you know, us contacting you; it’s much easier to get the engagement number to come back at a higher rate, which equates to, you know, a good business metric that they can point to. So it’s things like that, where I think the use of certain design patterns has been manipulated, and we’re not really questioning the status quo anymore in terms of, is this what’s best?

R Blank 23:15
So that gets to something I wanted to ask you since you agreed to come on the show, which is, and I totally understand where your personal motivation comes from, and I love that you are reaching other creators, and I understand where their motivation might come from. But where does the business incentive come from? Like you were saying, it’s great for Twitter to have infinite scroll, because people are going to spend 50% longer on there. Where does the incentive come for a company like Twitter to say, well, let’s put some stop points in there? Because, I mean, businesses are going to do what makes money absent regulatory pressure and/or consumer demand; that’s my perspective. So, obviously, you’re devoting a lot of time and energy to this mission. Where do you think the incentive will come from for companies, not just individuals, but for companies, to be making these changes?

Bethany Sonefeld 24:18
I’m going to take y’all maybe two to three years down the road, to what I hope will happen. I think that eventually, my hope is that there will be some sort of guidelines that companies need to comply with in order to be deemed ethical, and maybe that’s not the right word. But are you all familiar with the WCAG guidelines for accessibility? Okay, it stands for Web Content Accessibility Guidelines, and these were released probably 20 years ago, shortly after the internet kind of hit its boom. These guidelines were built to help disabled users and people using screen readers and other assistive technologies, so that they were able to use the products that people were building, because that wasn’t always the case. So what that looks like from the design perspective now is we have a checklist, per se: make sure your contrast meets a certain ratio, so that, you know, the text on a dark background really stands out; things like keyboard tabbing, which is really helpful for people using screen readers. Pretty much every single tech company should comply, and they will be sued if they are not complying with these guidelines. Wegmans, I think, got sued maybe three to four years ago by a person in Florida who was just unable to use their website because they weren’t meeting these guidelines. So I’m hesitant to use the word regulations, because I think whenever people hear that term it’s like, oh, you know, don’t regulate my tech, and things like that. But there have to be some guidelines in place. It’s crazy to me that, you know, the internet has existed for this long and there are not certain guidelines around protecting end users. And there are things like this that are starting to come into practice; there’s the DETOUR Act, which has not yet been passed by the Senate, but it would basically outlaw dark patterns. So that’s a great start. And going back to your question, it’s really, really difficult to implement this change at an organisation as large as Google and Apple and Facebook and Twitter and all of these tech giants. So my goal with Create with Conscience is, let’s start with the people on the ground, and advocate for change from policymakers. And I think that there’s going to be this blend eventually, where those two can come together, and there’ll be a great marriage of: there are regulations in place, and guidelines we need to follow because we’ve all agreed to this as a company; and then, at the creator level, we are no longer driving towards, you know, and profiting off of, this attention-based economy.
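
To make the contrast checklist item Bethany mentions concrete, here is a minimal TypeScript sketch of the WCAG 2.x contrast-ratio calculation. The function names are ours, not from any design tool; only the luminance and ratio formulas come from the guidelines.

// Relative luminance of an sRGB color (0–255 channels), per the WCAG 2.x definition.
function relativeLuminance(r: number, g: number, b: number): number {
  const linearize = (channel: number): number => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio = (L1 + 0.05) / (L2 + 0.05), where L1 is the lighter luminance.
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const lumA = relativeLuminance(...fg);
  const lumB = relativeLuminance(...bg);
  return (Math.max(lumA, lumB) + 0.05) / (Math.min(lumA, lumB) + 0.05);
}

// Black text on white is 21:1, well above the 4.5:1 that WCAG AA asks of normal body text.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
// A mid gray (#767676) on white lands right around the AA threshold.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2)); // about "4.54"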

Stephanie Warner 27:15
I love that. I love that. And I feel like you’re really ahead of the game, because these conversations are happening. We’re talking about the damage that social media is doing, particularly to young people. And I think your trajectory of a few years in the future is pretty accurate, because we’re at a point where, okay, we’ve talked about this, what can we do? And I feel like what you’re doing now, your organisation and your effort, is the answer, the response to these issues. Because, honestly, I never really thought about, on the design level, what could be done to kind of curb some of these behaviours. And I think there is a way to marry the business goal with the ethical approach. And I think it is going to take pressure from regulation and from users; this conversation isn’t going away, it’s only going to get bigger, and I think it’s only going to get worse, because so many people are being harmed, and we’re talking about it now. So, yeah, go ahead.

R Blank 28:22
No, I think I agree with where the conversation ended up. Bethany, I totally understand why the word regulatory is maybe seen as, oh, I don’t even know the right word, but unpleasant, by a lot of Americans in particular, when it comes to these sorts of changes. And tech, by the way, is one of the least regulated industries that there is, because the history of tech has been such that you want it free, you want innovation, and in order to have the most innovation, you need the least regulation. But, as you talked about in your video, you know, the congressional hearings where Frances Haugen, the Facebook whistleblower, testified, I think this all signifies a sort of turning point, though it’ll be a slow process. What I was going to say is, based on how difficult it is to regulate things in the United States, and how difficult it is in general to get anything passed at the federal level, I feel like we’re going to see this happen in Europe first, where we have seen much stronger digital privacy protections enabled. They have the right to be forgotten in multiple countries; they have the right to disconnect in multiple countries, and soon throughout the entire EU, which is another topic we talked about in another episode. And so, I don’t even consider myself an optimist, but if I were an optimist, I would be looking to Europe as the place where a lot of these issues are going to be resolved. And once companies have to start complying with laws in Europe, it’s going to become easier to get them to comply with similar laws in the United States.

Bethany Sonefeld 30:02
Well, I want to go back to something you said, because I think it’s really interesting, just diving deep into the fact that tech for the majority of the last 10 years has been innovate, build fast, and break things. That’s kind of the mentality that I’ve heard from the design side of things: you know, let’s just iterate really quickly, get something out the door, and try to move fast. And when you move fast, it doesn’t leave time to have the hard conversations, and it doesn’t make space for proper research to vet your solutions. So what I think is going to happen in the next three to five years is a gradual slowdown of what we’re building, and a mentality that what we’re putting out there in the world needs to be considered and really thought through, rather than just “ship it.”

Stephanie Warner 30:59
Move fast and break things; at some point, you have to fix the things you’ve broken.

R Blank 31:07
Yeah, they forgot that part of the slogan. Cool. So Bethany, we’ve talked about a lot, and we always like leaving listeners with a specific actionable tip. Now, I know we’ve covered a bunch of specific actionable tips in this conversation, and I know you have a lot more, but if possible, is there one specific piece of advice for our listeners, for them to start finding a healthier balance with the technology in their lives?

Bethany Sonefeld 31:41
I can only choose one thing?

R Blank 31:43
You can do more than one; you can break the rules.

Bethany Sonefeld 31:49
I would say reflection is a big practice in my life, and it’s helped me to take stock of what’s working and what’s not. And I think something that was kind of a wake-up call for me was just noticing and being present, and watching other people be nose-deep in their devices. And I’m not saying that phones are bad. I think that’s definitely the message I want to hit on: I don’t think devices are bad, I do think it’s the software that’s, you know, being built into these devices that doesn’t have our best interests in mind. So just take stock of where you as an individual are with your own technology use. And that can mean, you know, with your TV, with your laptop or monitor, with your own personal device; just take stock of how you feel afterwards. Do you feel more relaxed after using this tech? Or do you feel more on edge? Do you feel stressed out because you saw something on Twitter or Instagram, or social media, that made you feel kind of nasty? And then figure out what’s going to work for you to change that behaviour, if it is something that’s affecting you in a negative way. I think a worry I have is that we’re going to become so engrossed in these devices, because they offer this immediate dopamine hit, that the world around us is becoming less exciting, and the people around us are becoming less exciting. So I am worried about the long-term effects. And I think it starts with just kind of reflecting on your own personal use, and understanding what you want your relationship with tech to be.

Stephanie Warner 33:32
Right. Thank you for saying “how you feel,” because I think, for myself, that’s been a huge indicator of how I use my devices: you know, not accepting the status quo of, well, this is just something I do, but really asking myself, how does this make me feel? Is this affecting me negatively? Is this keeping me up all night? How is this technology, or this habit that I’ve created, helping or hindering? And if it’s hindering, taking a look at that. I appreciate that.

R Blank 34:03
You know, I think that was beautifully said. And then the second part of that is, you know, remembering, as we talked about earlier in the conversation, that you are in control. So once you learn the lessons from your self-reflection, remember that you are actually in control of all of the things that are leading to those feelings that you’re experiencing, and so you have the power to modify those behaviours and modify your relationship with the technology around you. So yeah, thank you very much for that answer; that was beautiful. How would you like our listeners to learn more about you and Create with Conscience?

Bethany Sonefeld 34:39
They can visit createwithconscience.com. On there, I have a little bit of background on some of the things that I touched on here, in terms of the attention crisis, and the set of principles is on there as well, so give those a read. I’ve included specific examples in there, because I tend to find that having those examples, and having a framework to work off of, is really important. And then you can follow Create with Conscience on Twitter. I post all of the... I know, right?

Stephanie Warner 35:14
Yeah, it has its good things.

R Blank 35:16
I totally get it. We have social handles too. But it is funny.

Bethany Sonefeld 35:23
Yes, yes. So I say that because I post about the conferences that I’m speaking at. And I’m at the point where I’m kind of revamping some of my original work that I did with Create with Conscience. So I’d like to share just daily reminders on there, like, hey, have you taken time away from your device today? So it’s not just a grab to get your attention.

R Blank 35:45
No. Cool. Well, Bethany, thanks. This has been a wonderful discussion. I’m super happy to have stumbled on your work, and that you decided to take the time to come join us here on The Healthier Tech Podcast. Thank you so much.

Bethany Sonefeld 35:59
I’m so honoured to be here. Thank you both.

Announcer 36:03
Thank you so much for listening to this episode of The Healthier Tech Podcast. Remember to check the show notes for all the links and resources mentioned in the show. Please like and subscribe to The Healthier Tech Podcast on Apple, Spotify, or your podcast platform of choice. Get your free quickstart guide to building a healthy relationship with technology, and our latest information, at healthiertech.co.

Transcribed by https://otter.ai


R Blank

R Blank is the founder of Healthier Tech and the host of “The Healthier Tech Podcast,” available on iTunes, Spotify, and all major podcasting platforms.

R has a long background in technology. Previously, R ran a software engineering firm in Los Angeles, producing enterprise-level solutions for blue chip clients including Medtronic, Apple, NBC, Toyota, Disney, Microsoft, the NFL, Ford, IKEA and Mattel.

In the past, he served on the faculty at the University of Southern California Viterbi School of Engineering where he taught software engineering, as well as the University of California, Santa Cruz.

He has spoken at technology conferences around the world, including in the US, Canada, New Zealand and the Netherlands, and he is the co-author of “AdvancED Flex Development” from Apress.

He has an MBA from the UCLA Anderson School of Management and received his bachelor’s degree, with honors, from Columbia University. He has also studied at Cambridge University in the UK; the University of Salamanca in Spain; and the Institute of Foreign Languages in Nizhny Novgorod, Russia.

Connect with R on LinkedIn.
