
Expansion of AI and political / social impacts...

AGI doesn't exist and there's no sign at all that it will any time soon. Even with that said, though, abundance doesn't negate scarcity. We don't have as much poverty/hunger/homelessness as we do because we're incapable of increasing the supply of and more fairly distributing resources/food/homes; we have it because there's less profit in doing so. AI models built by some of the most predatory companies on earth, and shilled by some of the most delusional crackpots on it (Thiel, Altman, Musk, Bezos etc.), aren't going to go in any utopian direction.

In case what I've said is being misunderstood: I don't believe the technology driven by tech corps ALONE is going to take us in a utopian direction. I'm suggesting that if the maximalist predictions are true, it will be so disruptive across the class spectrum that it creates a historical upset to the order, which creates an opportunity for change from below.

"AGI doesn't exist and there's no sign at all that it will any time soon." <debatable, but only time not youtubers will tell
 
What we have is tech driven by tech corps, usually partnered with increasingly supine state actors who don't care to understand new developments beyond the reassurance that they can be mobilised towards more invasive and authoritarian ends.

If you're aware of any actual evidence of our current LLMs leading towards AGI then by all means share it; afaik there is none.
 
Here's one...

I like Wes Roth as he isn't breathless and he has a good overview of different actors and analysts (ignore the clickbait thumbnails and titles).
 
Skimming through your bookmark, he seems to be working from the same, slightly nonsensical, definitions of AGI as a lot of AI accelerationists. For one, he misuses the idea of 'understanding', taking it just to mean the production of more plausible results (a dog that looks like a dog). This is consistently a huge one for GAI advocates: GAI can generate a professional-looking image, therefore it can create art; for them the understanding is in the structuring of the product. But it still doesn't understand what's actually in that image beyond a narrow set of labels that it distills from initially human interactions with the dataset. It still has no idea what that dog actually is because there is nothing backing up that supposed 'understanding', just the collation of data. There is no interaction with the physical, cultural, psychological, emotional etc. entity of 'dog', just a vast archive of images, sentences, videos etc. that are labelled with that name.

Granted, AGI is a contentious label in itself. He talks about Google's qualifications for the term later on, where you could maybe say he's right, but drawn into any kind of public discussion or interdisciplinary context it collapses completely. Again, the recent Gemini fiasco speaks to how weak the tech-efficiency-only view of AGI as a concept is. Railroaded into the task of offering diversity, GAI was immediately lost: it could be programmed to increase the frequency of certain labels in the output, but it has absolutely no understanding of how or why that should be attempted, or what that means, so what it outputs is nonsense.

There are all sorts of effects AI may have in replacing or displacing labour or human effort, but it's only really the hype men who frame that as a manifestation of an AGI which contains any real sort of understanding or 'self'. And it's that misunderstanding, which to be cynical is often intentional, that has poisoned the well of AI research so much and that also makes AI interactions with reality so jarring (and usually inefficient).

*Should also say that all he seems to offer are opinions and speculation from people speculating about their opinions. Evidence, imo, would be tangible instances of AI displaying something that really shows 'understanding', 'imagination' or 'autonomy'.
 
...there's no evidence at present as it doesn't exist yet... the whole conversation is based on benchmarks repeatedly being reached ahead of projections, and on signals from insiders about what is happening in the proverbial 'basement', behind closed doors.
 
So hype from major corporations and startups angling for payouts...? Surely after the Metaverse, Sam Bankman-Fried, NFTs, a million cryptocurrencies, Musk's yearly 'AI/self-driving/robots/Mars is just around the corner' speeches, VR et al, a bit more cynicism about Silicon Valley rumours would be sensible.
 
Very fair :D
Though I never bought into any of that*... I don't know why this seems different to me.

*robotic advances are clearly objectively impressive
 
I see it as the other way round. Film production is already the preserve of the wealthy. AI opens up access to creativity for the lower orders.
You keep saying that, but you don't explain how.

Also an awful lot of those big budgets goes on ordinary people's wages.
 
Because you can take out a Runway account for £30 a month, and then produce a four-second clip that would cost you ten times that for labour alone, without considering equipment costs.
That opens up possibilities for the layman (such as myself) who simply can’t afford filming budgets but wants to be creative with film.
 
Ah. Yes, a clip that lasts less time than the average burp and looks like everyone else's is definitely going to revolutionise film-making.
 
Because you can take out a Runway account for £30 a month, and then produce a four-second clip that would cost you ten times that for labour alone, without considering equipment costs.
That opens up possibilities for the layman (such as myself) who simply can’t afford filming budgets but wants to be creative with film.

You're not being creative with film; you're using a wish fulfillment tool which distills a load of stolen content down to an artless product that, presumably, offers you some gratification. Although even that certainly isn't a product of any real creativity.

It's also not affordable to a lot of people here, never mind others around the world whose labour has been exploited and whose actual, real creative work has been stolen so you can imitate a process you're not in the slightest involved with.
 
As if traditional film production doesn’t exploit foreign workers in many ways ffs.
 
Few points on that...

1: GAI is massively exploitative of workers. Pretty much every publicised instance of it is built on stolen content, and beyond that is reliant on low-paid workers, often in the Global South, facing shitty conditions to categorise data. Something like Amazon's MTurk, for example, isn't just exploitative; it's a whole new model of dehumanisation for labour which isolates workers and eradicates their agency, both through algorithmic management and through the removal of basic human identity within the production process. That's a model that's ubiquitously replicated in building training datasets. And that's before you get to the exploitation and destruction in the material supply chains needed to build the infrastructure necessary for AI to function.

2: Sooner or later we will see a surge in copyright claims against GAI models for their blatant theft; those claims will come from corporate media platforms and a minority of wealthier artists, not from your average working creator. Already companies are both expanding their training base and getting ahead of those claims by striking deals with platform operators and institutions which completely bypass any notion of consent to (or even awareness of) inclusion in these datasets. Reddit is the most recent one to sell off its users' work en masse, but within the last few weeks universities have been signing up for it too, and there was a whole SAG strike about the potential for Hollywood to sell off artists' work without consent or recompense.

3: GAI isn't emerging as a free and open system of tools; it's being built by companies (OpenAI, Meta, Microsoft (through OpenAI) & Google) whose sole purpose is profit. While there's value in offering free iterations to generate investment and popular interest, they're still going to monetise those tools ultimately, especially when it comes to robust implementations for commercial use. Cost, access to tech and (to a lesser degree) training will all be barriers to the 'lower orders' getting involved.

4: Even where it does allow access on a fairly open level, both the companies supporting GAI and the corporate environment they exist in are built on platform capitalism. Any liberatory quality to these tools (which I'm cynical about anyway) will be usurped by the realities of distribution and the attention economy. Anyone(ish) can make music on their laptop, but the Creative Commons/free/indie music models were all still gutted by Spotify and YouTube. The culture of corporate centralisation is as present with GAI as it is with everything else (OpenAI has deals with numerous tech companies, hence stuff like Microsoft Copilot).

5: AI in general, including GAI, is absolutely atrocious at dealing with inclusivity and bias. It's built on opaque training data which is held as a corporate secret, and it uses algorithms which, even where visible, are usually incomprehensible to the layman. The recent stuff over Gemini is a case in point: it was railroaded into a model of corporate diversity which led to some pretty absurd results. It was in no way representative, and the industry generally isn't, but it did offer plenty of fodder to racists insisting that AI is 'woke'. And even if that was a fairly daft little scandal (of sorts), it still reflects how ill-considered GAI generally is.

6: It's wish fulfillment above all else. GAI doesn't enable 'creativity'; it enables exploitative requests. It has no creative process or craft, just a demand for immediate gratification fed by underpaid, invisible, largely unpaid labour. It's a glorified version of typing 'big tits' into Google.

7: GAI actively undermines real human creativity. One of the big claims of GAI evangelists is always 'now anyone can be an artist!', which basically translates as 'now everyone can create content which we feel matches up with a commercial definition of "proper", which is to say profitable'. GAIs with built-in house styles and processes built on theft inevitably narrow the scope of potential, leaving no space for the originality that emerges in learning about art and crafts, as well as demeaning and denying the more universal forms of creativity that genuinely are liberatory. If you're just starting to paint, for example, you might experiment and discover some new and personal way to do it, or you might just look at the GAI and go 'well, shit, I can be a master tomorrow, why bother being bad?'



AGI doesn't exist and there's no sign at all that it will any time soon. Even with that said, though, abundance doesn't negate scarcity. We don't have as much poverty/hunger/homelessness as we do because we're incapable of increasing the supply of and more fairly distributing resources/food/homes; we have it because there's less profit in doing so. AI models built by some of the most predatory companies on earth, and shilled by some of the most delusional crackpots on it (Thiel, Altman, Musk, Bezos etc.), aren't going to go in any utopian direction.

If food and housing were plentiful, capital would simply create sufficient scarcity to create demand.

See “Veblen goods”…
 
As if traditional film production doesn’t exploit foreign workers in many ways ffs.

So your great argument for GAI is based on its liberatory capacity to make exploiting workers possible for anyone who can afford £30 a month? What a brave new world. Even better, most of those who have their work stolen aren't even aware of it, so no organising, no legal recourse, no potential for the withdrawal of labour. Great stuff.
 
I didn’t say it was going to revolutionise film making. I said it opened up opportunities for the lower orders.

You said it opened up film making to the lower orders. What you've described absolutely does not do anything remotely like that.

An awful lot of the "lower orders" work in film making, especially as crew and in ancillary services, and they get paid for it.
 
I’ve made a film btw. And I did pay crew. Some of them were trying to actively ruin the project out of jealousy. That’s my experience of it.
 
So your great argument for GAI is based on its liberatory capacity to make exploiting workers possible for anyone who can afford £30 a month? What a brave new world. Even better, most of those who have their work stolen aren't even aware of it, so no organising, no legal recourse, no potential for the withdrawal of labour. Great stuff.
It’s great how neo-luddites are all about copyright.
It’s possible to make clips without stealing work btw.
 
I’ve made a film btw. And I did pay crew. Some of them were trying to actively ruin the project out of jealousy. That’s my experience of it.

Well then that must mean all films should be replaced by seconds-long, stolen, repetitive clips! That's so creative
 
It’s great how neo-luddites are all about copyright.
It’s possible to make clips without stealing work btw.

Not using GAI it isn't. And if you object to copyright then feel free to ignore it; GAI also ignores Creative Commons licences and any and all notions of consent over data usage.
 
And copyright?
What kind of stuff do you think I want to make?

No idea. But anything you make using GAI will be the product of training on copyrighted, Creative Commons and non-consensually taken work. Unless you're working exclusively from your own curated dataset, which, with something like Runway, you aren't.
 