Airport layover musings: AI, tech, internet culture & giving space for constructive criticism

I’m writing this during another long airport layover after attending the IEEE Conference on Games 2024 to give a keynote… I found a place here, out of the way of the crowds, where I can intently tap the touchscreen keyboard without feeling noticed.
The American couple next to me forgot their medicine. I feel bad for the occasional person sprinting past, obviously about to miss their flight, but they might make it?

Airports are a weird headspace to be in.

So…

It’s been on my mind… how social media doesn’t facilitate constructive criticism; it facilitates reaction.
Attention is a currency online, almost as valuable as real money. Mining reaction is a good way to get that.
The culture on these platforms is not one that encourages conversations where you might disagree or learn. It’s built to thrive on controversy. It rewards sarcasm, screenshotting, and insincerely dragging people, as part of a numbers feedback loop that can get addictive but offers very little meaningful change.

Like… I’ve seen this dynamic often where an expert in a field will share something and either be intentionally misunderstood, or just ignored in favor of punchy sarcastic posts from others far less informed.
As someone who lives mostly online, it’s a difficult culture not to internalize. I forget that the real world is a much more nuanced, conversational, forgiving, and empathetic place. People like talking. Conversations don’t just center the one person who said the punchline.
People in the real world can disagree about things, and grow from that. Online, not so much. You’ll just block and move on.

I suppose context for this is important because someone might read this and (in a typical knee-jerk reactionary sense) think I’m advocating for politely disagreeing with Nazis (online discourse always has to function in “oh so you are” extremes).

I’m really thinking more about how important it is to facilitate space for constructive criticism of technology, tools, or trends. The kind of criticism that’s given space in the very places it’s critical of.
I was invited to give a keynote at IEEE Conference on Games (IEEE CoG). I knew about IEEE but didn’t know about the games portion. Of course I said yes. I pitched that I would talk about mainstream trends like AI (and what corporations have been doing) in a critical light, while advocating for the values and philosophies of hobbyist work.
This same talk was rejected at another venue for being “too critical” and stabby at capitalism, so I asked them a few times if this topic was OK.
They kept giving a green light, so I put together a very thorough critical talk about tech culture, AI, etc…
(You can read the keynote here.)

Initially I didn’t know that this was such an AI-focused conference, so imagine how nervous I was at the idea of presenting criticism.
Given the way AI has become a controversial social media buzzword, one that people get canceled over, I had no idea what to expect.
I always loved AI and the history behind that part of computers. I’m worried about the trajectory it has taken recently. What will the future for this branch of computers look like?

Attending this, I realized that it’s vital for these kinds of respectful “cross-pollination” conversations to happen. You can “preach to the choir” for only so long. Social media is enough of that kind of echo chamber.
The black-and-white tendency of social media has normalized writing things off as “evil”. Online discourse loves its villains. It makes things seem not worth participating in meaningfully, especially in a way where you can believe in positive change.

AI is such a buzzword right now that it has kind of lost its meaning.
This type of thing has been part of computers forever. Game developers work with AI all the time. If you narrow your hate of AI down to “generative AI”, that’s still a very broad spectrum of things, including generative art.
Which has also been around for a long time.
Self-proclaimed Twitter/X and YouTube commentators don’t really help educate people on these distinctions (it’s just a talking point to them).
The fear and ignorance surrounding it isn’t helping anyone.

In my keynote I talked a lot about how what AI “threatens” to be is more of a cultural issue in tech. If it isn’t AI then it would be the next buzzword to hype up (scare everyone else with), and then move on from.
I’m already seeing plenty of articles about how AI isn’t delivering a return on investment and is “too expensive”, so I think reality is starting to set in: the promise to replace artists is unrealistic. There’s this bigger cultural issue that keeps reducing people to data to mine, treating our online lives as a resource, unchecked exploitation… that’s the problem. It also leads to a lot of the abuse of AI technology that we’re seeing.

Either way, attending a tech conference where I was given space to be critical, with people who work on the thing I’m criticizing being receptive, gives the impression that we all want to have these conversations. We need these conversations; otherwise, into whose hands are we surrendering this? Bridges need to be built.
It won’t go away.
No technology is the villain. However, the capitalism driving certain use-cases turns it into one.

I forgot how normal it was to show good use-cases for how automation helps the artistic process. In light of everything, you kinda forget that it can be better.
There was one keynote (I paraphrase all this because I don’t want trouble for the speakers) about how they had a very complex and tedious process between departments to make their games. They built a tool that automates a lot of that back and forth, and made the process better.

If anything, attending a (sincere) conference about (sincere implementations of) AI has made me realize that automation does not replace artistry. It can’t. Art needs artists. Technology empowers artists.

There’s a level of arrogance to thinking human artists are expendable and “can be replaced by machines”, and I don’t think that’s as much an AI issue as it is a tech-bro issue.
In a way I view AI as a kind of victim in this too, because it’s the new thing to use for exploitation (and, in the process, it is also being exploited). All the decades of innovation, the history AI comes out of, Ada Lovelace… none of that deserves this bad reputation either.

AI, automation, simplifying long and tedious tasks… empowers.
Ultimately digital art was a good thing and it did not kill art. I’m old enough to remember when digital art was a “new evil threat to art”, and it was going to destroy creativity. I was told this by my teachers. At the time I thought that bullshit would never end.
It did end. Digital art became normal. Digital art is still here because it centers the needs of the artistic process. That’s why it stayed. It empowered people. It mattered to people.
That should go without saying. I think current mainstream culture is losing sight of that. Maybe it has for a long time. Consider mass layoffs happening across the game industry right now.

Investor-focused “tech bro culture” is ultimately bad for tech because it requires a constant cycle of inflating something, overpromising, then crashing once there’s no more hype to mine.
It means there’s always the next trend on the horizon before the last one has actually followed through on its promises. It needs to take certain tech and put it on an unrealistic pedestal that won’t last long.
If you’ve been here long enough, you’ve seen this cycle.

Someone I know who works more on the business side of tech jokingly asked if Adobe does stock buybacks when I talked about Adobe’s recent decisions. The answer was yes, by the way.

It’s just the structure of things and it’s imperative that we create alternative avenues of existing and creating here (largely the point of my keynote).

“There’s a responsibility for us, as hobbyists, artists, and developers, to be conscientious about the tools we use and promote through that use.
When we choose tools, we should also choose in a manner that empowers tool developers who are offering alternatives.
Similarly, as developers, we have the power to create alternatives. That know-how, ability, means, and access… to create something like a tool and distribute it is not to be taken lightly or for granted. Even the smallest efforts matter to people.
The internet is a participation, and we can build better venues for participation.
Our participation needs to be a conscious decision.”

Closing remarks of my keynote

Who does your tech, tool, data… start with?
Who does it give preferential treatment to?
Who does it consider a default starting point? Nothing is ever completely divorced from the politics, or social structures, that it exists in.
Not even games are separate from the politics of the industry that they are made in, and I think it’s time to stop pretending that it’s possible to “keep politics out of…”

“Yet as students flock to STEM careers like computer science, they are losing the grounding that the humanities provide in helping them understand their role in society and the impact their creations have in shaping and being shaped by that society.”

“Why Computer Science Needs The Humanities”, Kalev Leetaru, Forbes

There’s been a long-standing discussion about how the humanities need to be more of a part of teaching computer science, and I couldn’t agree more.

Abstraction can be impractical. It can also lead to harm through ignorance. Theory, ideas, tech, tools, apps… need to live in reality, and acknowledge the contexts they exist under.
Nothing exists in a bubble.

Tech that is (or believes itself to be) divorced from social responsibility, systemic power structures, issues of inequality… will always result in harm, for the simple reason that there will always be aspects of alienation and other issues (either intentionally or unintentionally) built into it.
I see a lot of that with current developments in tech, AI being no exception.
As long as there is bias, structural inequality, or other issues of power and alienation that influenced how something was built… all of that makes the thing fundamentally non-neutral.

There are more than enough examples of this in other areas of tech…

“As technology becomes more advanced, so does the scope and harm of these blunders. An essay titled the State of Black America 2018 noted how the digital revolution is increasingly leaving their black consumers behind.”

BIGOTRY ENCODED: RACIAL BIAS IN TECHNOLOGY by Taylor Synclair Goethe

Or even older examples…

“However, although things have improved somewhat, male dummies are still the norm, especially when it comes to determining the safety of the driver versus the passenger. In that situation, as discussed in the ABC.com article, male dummies are still the most tested and there is, therefore, far less data available regarding how automobile crashes affect women drivers”

Female Crash Test Dummies Now Regularly Being Used

Non-diverse teams cannot create diverse output.
You need to consider what biases are baked into your tool, tech, app, platform… based on who you are initially alienating.

There were some definite highlights to IEEE CoG and Aleena Chia’s “Race-Making in Real-Time 3D” was one of them. Aleena made critical points. I don’t want to paraphrase so I’ll just encourage you to please follow this person’s work.
I wish everyone could hear it. It’s a talk I’ll be thinking about for a long time…

…Ok so there’s a mouse that just ran past me, where I’m sitting in the “recharge your phone” corner now. Airport wildlife is a thing? I suppose I’m running out of layover time too… So I’ll stop rambling.

Create space for criticism to be heard.

Social responsibility is a critical aspect of technology. It is necessary.

Make, support, platform, and empower alternatives because we cannot count on corporations to do that.

Explore alternative systems.