
season 2 episode 8 – depersonalizing digital platforms | Ben Grosser

In the second episode with Ben Grosser, an artist focused on the cultural, social, and political effects of software, he talks to us about taking the algorithms out of different digital platforms, teaching students how to analyze digital platforms’ systems, and doom scrolling.

How did he hack TikTok? Should Spotify and Netflix rethink their algorithms? Nir and Ben discuss this and more.

 

 

The Artian Podcast

Transcripts

This transcript was produced by an AI; mistakes may appear.

Nir Hindi: I want to ask you another question.

If I may: you are teaching at the University of Illinois, in the Department of Art and Design. And I think one of the roles of educators is to push the way their students think. How do you challenge your students to think differently?

Ben Grosser: So, I mean, I do it in different ways. The way I tend to think about it is this:

I am not teaching them how to use a piece of software. You know, I teach digital art for the most part. I am not [00:42:00] teaching them how to press a button in Photoshop. I am not teaching them how to code a bouncing ball. Although I do those things, what I am trying to teach them is to develop the ways through which they can understand that they are humans living within a system, or a set of systems.

And it is about developing a way of looking and developing a way of listening to the world. How do they see, how do they hear? And so that practice means we are constantly taking our everyday, the things within our everyday environment. For me, it often is technology, say a social media platform.

So, you know, take an interface, take a software interface, put it in front of the class and look at it and analyze the tiniest, smallest little bits of it. How does it look? Why is it this way? Who wants it this way? Who is it in service of? How might we make it different? [00:43:00] What if we could change the rules?

You know, in other words, there is a system and we are in the middle of it. How can you see it? And then how can you change it? That is really what I am trying to do when I am teaching. And I use the things that they are most familiar with as a way of keeping them engaged and showing them that it is not apart from them.

It is where they live every day. So, we will look at Instagram, we will look at TikTok, we will look at Microsoft Word, you know, we will look at Gmail, we will look at search, whatever is in front of them. We can take any platform, any piece of software, any piece of hardware, and analyze it from within that frame.

That is how I tend to think about my role.

Nir Hindi: I love it. I mean, it is spot on to what I always say: artists lead with questions. And in just one exercise, I think you invited maybe 10, 15 questions like this. So, you touched one of the topics I really wanted to speak about, which is TikTok, [00:44:00] and we already had a conversation about it.

And when I joined TikTok, I think it was at the beginning of 2019, so almost two years ago, very, very early on in the platform's life. I saw it for the first time in China, and it was kind of hooking. I do not know what it was about it, but it is like something that catches. When I joined at the beginning, it was like pure creativity; everyone did something different.

And then I left the platform. And I think that when I read what you wrote about TikTok, and what you developed for TikTok, which in a second you will tell us about, that is the moment I kind of went: okay, now I understand why I left TikTok. And you said that TikTok invites us, or makes us, find the best path to mimicry and conformity, with basically everyone copying each other.

What are your thoughts on that? First, what do you think about TikTok, and then explain to us [00:45:00] this mimicry and conformity?

Ben Grosser: Yeah, so, I mean, I came to TikTok later than you. It was probably late 2019 for me, when I first downloaded it, having learned about it from my students, a few of whom were quite excited about it.

And well, the first thing I noticed about TikTok was: this is what all my students are excited about, and I do not get it. I do not know why this is interesting to them. But, you know, the next day I loaded it up again and found that an hour later I was still scrolling through this feed, without realizing I had spent that much time at it.

And then I kept finding myself doing that repeatedly. As someone interested in the design of software and its cultural effects, I am like, well, this is interesting, right? But even by that point, you know, and you and I were talking earlier about how it probably changed from your first experience to my first experience, one of the things that was most prevalent on the platform at that moment, and still is, [00:46:00] is lip-syncing and dance.

And so, Charli D'Amelio, I do not know that she was then the star she is now. She is the most-followed TikToker, or at least I think she is; she certainly was at one point. But, you know, she would do a dance, and then all kinds of other people would try to reproduce the same dance to the same song in the same way.

And part of what I really started to think about, with that culture that was becoming visible to me on TikTok, is that it was a platform that, in the media's description of it in 2020, gets talked about as the hot new video-based platform where extreme creativity is happening in ways that are not possible on Instagram and Snapchat and whatnot.

So, kind of the hot new social media [00:47:00] space. But for me, I just kept seeing that what the platform seems to produce so much of is a space where creativity is measured in the ability to conform to what someone else is doing, and where the creativity happens is in minor variation from it. So, can I do the exact same dance?

There is a lot of value placed on being able to exactly mimic Bella Poarch and her, you know, "M to the B" kind of face-zoom videos, and you have got just video after video of people trying to do the exact same facial expressions and motions to the music, in the same way that Bella Poarch did. And then perhaps some creativity emerges from tiny variations.

So: I did it the same way, but I added this tiny little thing. That is a strange way to describe the most creative opportunity in social media, from my [00:48:00] perspective. Now, that is not the only thing going on there, of course, but it seemed to me, the more I watched over time, that I was seeing increasing amounts of this kind of material, such that.

I would scroll these 15-second to one-minute videos, flip, flip, flip, flip, flip, and so many of them looked so similar and sounded the same as every other one. And that is a strange situation. Also, along those same lines, in terms of the scroll, I have tried to think a lot about, I mean, it is so funny.

Like, I can easily find myself start to scroll TikTok, and an hour, two hours, three hours later, I am like, oh my God, I have been doing this for two or three hours. And I am like, really? What did I do for two or three hours? But it is so common an experience with the platform that it is a meme itself, one that people then reproduce: people stuck on TikTok, like, I am just going to look at TikTok for a minute, and then flash.

And now it is [00:49:00] morning, it is light outside, and they have been doing it for eight hours. I have seen that meme over and over, reproduced by people on TikTok. So, it is a common experience, and there is something about the way in which the videos get lined up on your automated algorithmic feed that I think produces that inability to step away, even when I am tired, even when I am not finding that many videos I am interested in anymore. Because there is always one there, you know; there is one more right below where you are right now.

Nir Hindi: And it is so easy to get to it.

Ben Grosser: One flip of the thumb, one sensuous, you know, kind of gesture of your thumb brings the next one. And that one might be the best TikTok you have ever seen. And let us face it.

It is 15 seconds. You can spend three seconds on it, five seconds on it. So, it is not like the commitment is very large. Do you not have three more seconds to see if you could see the best video you have ever seen in your life? But of [00:50:00] course, 99% of them are not that. And you looked at all 99% of those to maybe get to the one.

Or maybe, on a Thursday evening when I spent three hours doing it, I might not have seen any great videos. So, there is a way in which the design of the interface really keeps me there.

Nir Hindi: Yeah. You responded to that. You created, how do you call it, Not For You, which is kind of the opposite of the For You feed on TikTok that everyone gets.

What is this Not For You? How does it work?

Ben Grosser: Yeah. So, just in case there are people out there in the world who are not TikToking: the nomenclature for the algorithmic feed on TikTok is the For You page. It is the stream that was created for you, and kind of the claim to fame of ByteDance as a company is that they have come up with an [00:51:00] artificially intelligent profiling system for giving you content that will make it difficult for you to step away.

And I think we could challenge whether it is as capable as that; I think it is more the other interface-design things that we have talked about a bit. But the For You page is where, like, you load up TikTok and what you see is the For You page, and it is videos that have been queued up for you. And you do not have to follow anybody. You can, but you will have an endless stream of videos no matter what, and it just watches what you do and gives you more of what they say.

It gives you more of what you want to see. But one thing we know from plenty of research at this point about algorithmic feeds, search engines, and other kinds of profiling technologies that try to be predictive of what a human wants is that they tend to produce what Eli Pariser called a filter bubble: you start to see the world from a narrower and narrower point of view.

And [00:52:00] you will not really see all these other things out there, such that you start to think that is the world and there is not anything else out there. And I think this is part of what has happened with Facebook and groups and misinformation. It is certainly prevalent with Google and the ways that Google does things.

And I think it is very common on TikTok as well. And I think it accounts, in some ways, for the way in which the feed became less and less interesting to me the more I played with it, because I was not getting that variation that maybe is even out there. You know, I do not know how many times I have been on YouTube, for example.

And if I look at a car video, like a car review or something, well, now my feed is full of car reviews and I cannot even find anything else. It just thinks all I want to see is cars. Not that I like cars or whatever; I do not like watching videos about cars. I do not spend much time with them otherwise.

So, this kind of algorithmic profiling that all these platforms are using to keep us engaged, I think, produces all kinds of negative effects, but also [00:53:00] does not even give me what might be most engaging. And so that is what led me to create this piece called Not For You. Not For You is a browser-extension-based work.

It is what I call an automated confusion system for TikTok. You log into TikTok in the web browser, and then you turn Not For You on, and now it drives TikTok for you. It browses the For You page, and it likes videos, and it clicks on them, and it follows hashtags, and it then looks at someone who commented and goes and looks at their videos, and kind of looks at a few of those.

And it sees what song they used, and it clicks on that song. And then it follows, like, well, who else is making things with that song? And it just does this forever. The idea is that it is producing data for the For You algorithm that has nothing to do with who I am, or how I respond to the interface, or what I think I want, as a way of surfacing [00:54:00] material that is different from what my profile thinks of me, but also as a way of potentially surfacing material that TikTok maybe is not so excited about, maybe is trying to suppress. There has been some good reporting on some of the ways the content moderation policies at ByteDance work and how they make decisions about certain content that should be elevated or not.
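To make the mechanism concrete, here is a toy sketch in Python. This is emphatically not Grosser's extension (the real Not For You drives TikTok's web interface); the topics, numbers, and functions are all invented for illustration. It models an engagement-driven profile that narrows under normal use, then shows how flooding that profile with random interactions, the way an automated confusion system would, drowns the behavioral signal in noise:

```python
import random
from collections import Counter

# Hypothetical topics; a profile counts interactions per topic, and the
# feed recommends topics in proportion to those counts (rich-get-richer).
TOPICS = ["dance", "cars", "cooking", "news", "music", "travel", "art", "sports"]

def recommend(profile, k=5):
    """Pick k topics weighted by past interactions, with a small floor
    so every topic keeps a nonzero chance."""
    weights = [profile[t] + 0.1 for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=k)

def organic_user(profile, steps=300):
    """A user who watches whatever is served: every view reinforces the
    profile, so it narrows over time."""
    for _ in range(steps):
        for topic in recommend(profile):
            profile[topic] += 1

def confusion_bot(profile, steps=300):
    """Not-For-You-style automation: interact at random, regardless of
    what the profile says, burying the real signal in noise."""
    for _ in range(steps):
        profile[random.choice(TOPICS)] += 5

random.seed(1)
narrowed = Counter({"cars": 3})   # watched a few car videos once...
organic_user(narrowed)
top_share = narrowed.most_common(1)[0][1] / sum(narrowed.values())

confused = Counter({"cars": 3})
confusion_bot(confused)
flat_share = confused.most_common(1)[0][1] / sum(confused.values())

print(f"feedback loop alone: top topic is {top_share:.0%} of the profile")
print(f"with confusion bot:  top topic is {flat_share:.0%} of the profile")
```

Run it and the organic feedback loop concentrates the profile on one topic, while the confusion bot keeps the distribution close to flat, which is the filter-bubble dynamic and its antidote that the conversation describes.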

So, I find it a fascinating thing to watch run, because you just try it for a little bit of time and suddenly I am seeing things I never knew were on TikTok at all. The most striking thing I realized is how localized my For You page typically is: there is all this international and foreign-language content from all around the world on TikTok that I never saw.

And maybe I could go search for it if I knew what to search for, but the For You page is not thinking I am interested in that content at all. And it looks at things differently; it does not all [00:55:00] conform to this, at least for me, very Western, Americanized kind of sensibility.

Nir Hindi: Hearing you talk about it, I think it suddenly makes TikTok exciting for me again, to go and check, to be able to get exposed.

And yeah, I mean, I think we talked about it in the past: what I hate about Spotify and Netflix is that they just profile me, and then I get the same. So, if I listen to jazz, that is it; jazz is the only music in my life I want to hear, and I cannot even find anything else. And if I watched one sports series on Netflix, that is it; for the rest of my life I am doomed to watch sports and drama.

And I do not understand why those platforms do not give you the option to delete my profile, mix my profile, or not profile me for three months. I think in many ways that would create a better experience, somehow balancing it, rather than [00:56:00] 100% profiling. Give me the option to explore: get exposed to new music, get exposed to new artists, get exposed to new TV shows, get exposed to new movies.

So, I do not know if I can raise it as a request: if you can develop Not For You slash Netflix, Not For You slash Spotify, I will highly appreciate it. I am positive that there are more people in the audience who would love it.

Ben Grosser: I will get right on it. You know, there are so many things in what you just said.

One of the things we talked about early in this conversation is, you know, growth as a state of being, and part of what I think the software developers are trying to contend with is the explosion, the endless growth, of content: how can they take this endless trove of things you could listen to, or this endless trove of things you could watch, and somehow make your experience simpler?

It is like they all forgot what it is [00:57:00] like to go to the video store. I mean, I know we do not go to video stores anymore, but some of my best experiences came out of the new release section not having the video that I came to the store for, and going: okay, what do I get? And what is going to happen now?

It is like they never went to a library, went to the place where the book they wanted was, and it was not there, but they still looked at the shelf and found things that were interesting, and maybe found their way to something they never would have been looking for in the first place.

And we have gotten to a point where, as users of algorithmic platforms, we all are carrying around in our heads, whether we realize it or not, kind of an internalized picture of how all these different platforms' algorithms work. Like, you have some sense of: well, I should not listen to that on my Spotify account, because if I do, it is going to pollute my Spotify account in this way.

And, I mean, I have like eight Netflix profiles, [00:58:00] just so I can keep starting over. One of them is called Netflix Bankruptcy, because I just need to start over; I get stuck in a niche and I do not want to be in it. You know, I have really been enjoying the Criterion Channel as a streaming platform lately.

Not only because I like the movies, but because it is not trying to figure out what I want. It is just: here is what we have, and you can drill down. It is like the library; it is not like the newsfeed. It is like: here is all the stuff, you can just click through it all, and then we change what is there, and our role is as curators of the material.

And then you can just choose from everything we have selected for the month. And, you know, I think there are going to be more opportunities down the road, I mean, if there is an opportunity out there in streaming music, for a platform to do something more along those lines, rather than just continuously trying to predict, because I find Spotify just so bad at figuring out what I am interested in.

I do not know how they are [00:59:00] making their decisions, but it is like: oh, well, here are a lot of things like that thing you just listened to. And I listen to the list, and none of it has anything to do with why I listened to that original track. However they are characterizing things, it has nothing to do with the way I listen.

Nir Hindi: if you cannot

integrate into anyone.

Ben Grosser: Yeah.

You know, I mean, Spotify is a little more complicated, right? Because, well, if I were to literally automate clicks in Spotify, it would have monetary effects on the company. I mean, I suppose there are bandwidth issues with TikTok, but, you know, it is not liking very much stuff, just like I do not like very much stuff.

Nir Hindi: You know, it is funny, because on TikTok I think I liked maybe two, three, four, five videos of how to use Instagram to engage users, and that is it, I am doomed. All the videos that I get are this. Okay, I saw five, I am [01:00:00] sorry, I apologize.

So, kind of a workaround for me now is that I just say, I am not interested in this video, so those videos do not come again. It is such a fascinating conversation, and I cannot leave it without asking you about another technology that kind of dominates our life, which is the Siri slash Google Home.

What do you think about these technologies? And then again, you always respond to basically every technology out there, at least the ones that I know of.

Ben Grosser: As somebody interested in technologies, one of the things I do is watch a lot of sci-fi; movies are certainly a joy of mine, and there is such a strong history and relationship between sci-fi and the technology that ends up getting built.

And it has been a Holy Grail, for as long as I have been watching sci-fi, to just be able to have a conversation with the computer through language. And speech synthesis has come so [01:01:00] far, and machine learning now has made automated understanding, or I should say automated transcription, possible in ways that it was not just five, ten years ago.

And, you know, I have an iPhone, and so I speak to Siri, and I ask her very simple questions, really. I do not have a big conversation with Siri. I ask her to set timers, I ask her what the weather is, and maybe a few other things. And it seems like as much of the time as I do that, I am also just saying: Siri, I was not talking to you.

I was having a conversation with someone else. So those parts of that experience, of course, will likely improve over time. But personally, you know, one of the areas of focus for me as an artist has been thinking about the ways in which data collection has negative effects for us as individuals and collectively as a [01:02:00] society.

The more of what we say on a platform, what we look at, what we search for in Google, or which video we spend the most time looking at on TikTok that gets captured, the more it not only helps those companies give us more of what they think we want, but also creates an endless trove of data that can be used in nefarious ways.

You know, I have a work called ScareMail that came out after Edward Snowden revealed the existence of illegal mass surveillance by the National Security Agency in the United States. And it is still out there; you can use it if you want. It is a piece that works with Gmail, and every time you hit compose, it will generate a unique nonsense narrative and attach it to the end of your email, with a list of probable NSA search keywords embedded within it.

So, it is a way of inserting noise and nonsense into the NSA's data banks, but it also [01:03:00] attracts the attention of NSA algorithms, making them feel like they have something to pay attention to. It is a way of wasting resources at the NSA, is another way to think about it. But most important:

It is a way of getting us humans to think: are you willing to use a work like ScareMail or not? And if not, why not? Why is adding nonsense to the end of your email dangerous, when it is truly nonsense, when it truly means nothing? It gets at this aspect of surveillance. And so I cannot help but think about these new intelligent agents through that lens, personally. They create a new potential for interaction.
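The gesture ScareMail makes can be sketched in a few lines of Python. This is a crude stand-in, not ScareMail's actual generator or its real keyword list; the templates and keywords below are invented for illustration, and the real work produces far richer narratives:

```python
import random

# Hypothetical stand-ins for the kind of watchlist terms ScareMail
# embeds; the actual piece draws on a much longer published list.
KEYWORDS = ["plot", "facility", "virus", "leak", "cloud", "agent", "exercise"]

# Invented sentence templates; truly nonsense, which is the point.
TEMPLATES = [
    "He considered the {kw} carefully before breakfast.",
    "The {kw} was nothing anyone could explain.",
    "She walked past the {kw} without stopping.",
    "Nobody remembered where the {kw} had gone.",
]

def scare_text(n_sentences=5, seed=0):
    """Build a unique nonsense narrative with probable search keywords
    scattered through it."""
    rng = random.Random(seed)
    return " ".join(
        rng.choice(TEMPLATES).format(kw=rng.choice(KEYWORDS))
        for _ in range(n_sentences)
    )

def compose(body):
    """Mimic the Gmail extension: append the noise below the real message."""
    return body + "\n\n-- the following text is nonsense --\n" + scare_text()

print(compose("Lunch on Friday?"))
```

The real message stays readable on top; the keyword-laden noise rides along underneath, meaning nothing to a human and something, perhaps, to a surveillance algorithm.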

And it is a very natural-feeling mode of interaction when it works, but it also creates the conditions under which we become comfortable being subject to computational listening machines everywhere, all [01:04:00] the time. And it reflects on other aspects of surveillance and observation, including video surveillance and face detection, and everything we enter into a messaging program and our email programs and our this and our that.

So, I think about, you know, the Alexas, and the fact that they get cheaper and cheaper, and I keep getting offers occasionally: yeah, we will give you a free Google whatever and a free Amazon whatever, just buy this other thing.

And it is to their advantage for us to have microphones throughout our homes and in our pockets and in our cars and in the office, because it produces a lot of useful data. That doesn't mean that I think they're all sitting around going, how can I ruin the world by capturing data and surveilling you. Most technologies don't start that way. Mark Zuckerberg didn't sit down and go, how could I create a platform that would lead to the election [01:05:00] of a fascist head of the United States and, you know, head all the way to an attempted insurrection at the Capitol, an attempted coup. That wasn't Mark Zuckerberg's idea when he thought: make the world more open and connected. But it's one of the outcomes of trying to connect everyone and trying to create a feed that will give everybody what it is they're most engaged by and exclude any contrary views.

And so, I cannot help but think that having all these listening machines all around us is concerning, you know, in terms of where that eventually leads.

Nir Hindi: Great. So, you know, I live in a multi-Alexa house. I am interested to understand how you responded to Alexa, and maybe to voice surveillance platforms, or to algorithms, with the Doom Scroller.

What is this Doom Scroller, and why should we use it?

Ben Grosser: The Endless Doom Scroller is the name of a work [01:06:00] that I made in the middle of 2020, and it really was responding to the conditions so many of us found ourselves within from March 2020 onward, where the world had rapidly and radically changed in terms of our agency to be fluid across it. Doomscrolling as a term, although it has been around, I think, two or three years, really came into popular conversation in 2020.

It is a term to describe the way that we, in the pandemic, often find ourselves regularly, and in some cases I would say involuntarily, scrolling bad news headlines on our phones, often for hours at night before we go to bed, and then waking up and going right back to the phone. I mean, I am certainly describing my own experience here, where I would stay up later than I intended, [01:07:00] reading, scanning bad news headlines, and wake up and go right back to it.

And it was not a productive experience. I still can find myself engaged in it. Certainly, the realities of the pandemic necessitate a level of vigilance for the purposes of personal safety; we need to be paying attention, to some extent, to what the news is. But doomscrolling, this condition I am describing, is not just a natural reaction to the news of the day.

It is the result of a perfect yet evil marriage between a populace stuck online, social media interfaces designed to game and hold our attention, and the realities of the existential global crisis that we find ourselves in. So yes, it may be hard to look away from bad news in any format, but it is extremely difficult to do so when it is presented to us [01:08:00] by interfaces designed to hold our attention.

And the thing that has most held and attracted our attention for almost a year now is COVID-19. It is the pandemic: where are things going? When is it going to get better? How much worse is it going to get? After finding myself stuck in this condition, like so many other people, I started to think about, well, what is the mechanism behind this software-induced condition of despair? Why can't I step away from the interface?

Why does my mind keep scrolling bad news headlines? The truth is, you know, especially in those first few months, we kind of knew what to do. I mean: put your mask on, social distance, work at home if you can; if you must work out in the world, then do it in as safe a way as possible. It was not changing all that much, yet we are just constantly looking for that thing.

And so, I wanted to step back and think about the mechanisms at play, [01:09:00] and there were two mechanisms, really. One is the endless scroll, the infinite scroll, that has become such a feature of TikTok and Facebook and Instagram and all the other platforms designed to keep us engaged.

And then also thinking about headlines and the media, the way that headlines get composed, and what it is a headline does. And so, I decided to distill those down to their barest essentials. I took headlines over the course of a month or more; I would just read headlines every day and abstract them into simple versions.

And then I created this interface where all it does is present those headlines to you, and you literally can never scroll to the bottom. They just never stop coming. You can scroll as fast as you want or as slow as you want, but it is literally an endless doom scroller. And for me, it is a way of producing [01:10:00] some bit of mindfulness about what it is we are doing when we want to engage in this activity.
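The interface mechanic itself is tiny. Here is a sketch in Python, with invented placeholder headlines rather than Grosser's actual distillations: a feed that cycles forever, so there is always one more headline below where you are:

```python
import itertools

# Invented placeholders in the spirit of the piece, not the headlines
# the Endless Doom Scroller actually uses.
HEADLINES = [
    "Experts Warn It Could Get Worse",
    "Why This Is Not Over Yet",
    "Officials Sound Alarm Again",
    "What the Latest Numbers Really Mean",
]

def endless_feed():
    """The whole trick: an iterator that never runs out, so every flick
    of the thumb is rewarded with one more headline."""
    return itertools.cycle(HEADLINES)

feed = endless_feed()
for _ in range(6):       # six flicks of the thumb...
    print(next(feed))    # ...and the bottom is no closer
```

However fast or slowly you consume it, `next(feed)` always has another headline ready, which is exactly the condition the work makes visible.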

I even talk about it as enabling a sort of exposure or substitution therapy. A common reaction I get to the work is that someone plays with it for a while, and their reaction is: oh, this is what I have been doing. I am not even clicking on these articles. I am just scrolling, looking for, maybe, the good news amid the bad.

That is probably a good possibility for what we are looking for. But just like with the TikTok videos that we talked about earlier, even if you see a good headline at the bottom of the screen, there might be a bad one right below, and it is so hard not to go just a little bit further. And then you see the bad one, and now you are stuck scrolling for a good one again, and the cycle never ends.

So that is what the work is: a simple interface for [01:11:00] presenting bad news headlines endlessly.

Nir Hindi: You

know, one of the features that I discovered in Alexa is a skill that called good news. And they read you good news because I am, I am avoiding news. I do not know if it is an ignorant act or action, or, but I just can.

Bear. The news is just, you know, it depresses me put me into stress and the news are just the same, same politician, same, same things in the same hours. And everything is the same. And everything is with the purpose to kind of scare you. There is no hope like, okay, I do not want it, normally. What I do is in the morning I ask Alexa, tell me good news.

And then they tell you about the species that suddenly managed to survive and become popular again, or they tell you about the key, the data. So, I wish we could have more algorithms that kind of search [01:12:00] for this good news and create a channel for good news.

Ben Grosser: Man, I think you are hitting on something important about the news and the way it gets written, and headlines too, which is this:

The classic kind of media response to why do you show so much bad news, why do you not show good news, is always: well, that is what people want to read. That is what people want to hear about. That is what gets the most clicks. But again, maybe we need news sources that are not only designed to get more clicks.

What if I want a news source that is designed to get the least clicks? What if there was an award for: I wrote the headline that nobody wanted to read? I mean, "nobody" maybe is not going to do much for you, but it is a different way of thinking about this problem, right? You know, even when I talk to journalists, maybe it is an interview for an article they are writing, and then the piece comes out.

That is a conversation I have had multiple times as well: I did not write the headline. The journalists, you know, [01:13:00] do not; it is the editor that writes the headline, and sometimes journalists maybe do not love the headline that got attached to their story. Those are two different roles: the news, and the headline that is designed to attract your attention.

But I think what is even weirder now, and I am just as guilty of this as anybody, is that so often these days I do not read the articles; I just read the headlines and maybe the tagline below them. I go to the New York Times, I scan the page, I think that has given me something. I go to the Washington Post.

I scan the page, I think I have got it. I go to the Guardian, whatever it is. I go to Facebook, I scan my feed, I read the headlines of the things that have been posted by my friends. And that is a weird view of the world, because these are blips that are designed to keep us focused on the interface, not to keep us informed.

And that is a pretty big difference of motivation. I mean, I even end up having conversations with my spouse where I will say, well, did you see [01:14:00] that headline about such and such? Yeah. Did you read the article? No, I did not. Does the article really reflect that headline? I do not know. But yeah, you know, good news, right?

It is like we could use some, and of course there are opportunities for it. When all these systems are built to advance whatever material it is that keeps us engaged and focused, rather than informed and happy or optimistic, then we end up with what we have, which is a planet full of people doomscrolling too late into the night and not getting enough sleep.

Nir Hindi: At least now our listeners have some tools to avoid it, at least on the computer. I mean, I do not know if it works on the apps themselves. But yeah, one of the things that I try to do is just put the phone on airplane mode and put it outside my room. I try; it is not every day or every night that I manage to do [01:15:00] so. I always give myself the excuse that my alarm is on the iPhone, even though alarm clocks existed long before.

Ben Grosser: We make these deals with ourselves all the time around technology, right? I did the same thing. I got to the point where I was getting so little sleep and I was just so worn out, in like late March, early April, whatever it was, that I started putting my phone in the kitchen at night, plugging it in in the kitchen, and going into the bedroom.

And I had the best month of sleep I had had in a long time. And, I do not know, somehow I rationalized letting it creep back into the bedroom, and now I am on TikTok, you know, till six in the morning, or whatever it is; for me, that is not that late. But they are so attractive; there are so many ways in which they attract us to them that I think it takes deliberate action.

It takes intention, and it takes creativity, to find a way to regain some agency [01:16:00] in a world where we are embedded in, surrounded by, and, quite frankly, for many of us have no choice but to interact with all these technologies and software platforms.

Nir Hindi: Ben, I think it is a great message to kind of finish our conversation on.

It has been such a wonderful time to chat with you, I mean, so fascinating, and I really hope we will have the chance to have another podcast when you have new technologies and new works to share. For everyone interested in Ben's work, you can find him on his website, bengrosser.com, and you can follow him under the same name on Twitter.

So, he is available over there. Ben, thank you very, very much.

Ben Grosser: Yeah, it was a pleasure to chat. Thanks for reaching out.

 
