
Feminism and a world designed for men

I bought my car 5 years ago and I can adjust seat height, pitch, steering wheel angle, temperature on different sides of the car etc, but not the height of the seatbelt. If I make the seat high enough that it isn't cutting into the side of my neck, it's a really uncomfortable (and rubbish) driving position. It would cost very little to make it adjustable, but it's not considered necessary :(

I believe you get it on some cars, but it's considered a "luxury" feature.

Until embarrassingly recently I thought the little "switch" on the mirror was a "Mum/Dad" button, because in my folks' cars when I was growing up, the change in tilt was exactly right to adjust it for their relative heights.
 
Ok. Do you think that systematic discrimination against women built into these systems is there by conscious design or by naivety?
How about unconscious design, or neglectful behaviour? In general it's rare that people actively set out to discriminate with AI, but they don't examine their own prejudices, and they are (or should be) fully aware that their prejudices will affect the final system. These ideas aren't _new_. I doubt seatbelt or body armour manufacturers actively set out to discriminate either, yet here we are.
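As a crude illustration (entirely hypothetical, not any real recruiter's system): train even the simplest model on historical hiring outcomes that were already skewed, and the skew comes straight back out as a "rule", without anyone ever writing an explicitly discriminatory line of code.

```python
# Hypothetical sketch: a naive model that scores candidates purely by how
# often people like them were hired in the past. Nobody codes "penalise
# women", but a skewed history produces exactly that behaviour.

from collections import Counter

# Toy historical data: (gender, hired?) -- deliberately skewed, as many
# real-world hiring datasets are.
history = ([("male", True)] * 80 + [("female", True)] * 20
           + [("male", False)] * 20 + [("female", False)] * 80)

hired = Counter(g for g, h in history if h)   # hires per group
total = Counter(g for g, h in history)        # applicants per group

def score(gender):
    # "Chance of being hired" learned purely from past outcomes.
    return hired[gender] / total[gender]

print(score("male"))    # 0.8
print(score("female"))  # 0.2 -- yesterday's bias becomes tomorrow's rule
```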
 
Ok. Do you think that systematic discrimination against women built into these systems is there by conscious design or by naivety?
Why do you think that question is the central one to this thread? Do you think human systematic discrimination against women is always by conscious design? The whole point is that it is a side-effect of the way the system is set up, rather than being malevolent by intent.
 
Seriously hope we’re not blaming the patriarchy for putting cupboards and shelves at the wrong height.
Ffs.
On the other hand, I just bought an ironing board from Aldi. TOTALLY Right-hand user orientated. Bloody up/down lever on the WRONG side.
 
Seriously hope we’re not blaming the patriarchy for putting cupboards and shelves at the wrong height.
Ffs.
On the other hand, I just bought an ironing board from Aldi. TOTALLY Right-hand user orientated. Bloody up/down lever on the WRONG side.

Been here for a year with just 13 messages to his name and now chooses to speak in order to pour scorn on a feminist thread :hmm:
 
I have to wear a small men's polo shirt for uniform because the mechanic's shirts are not available in a women's cut. It's like a tent on me.
A female colleague mentioned this today. We both started work recently at a software company and got a t-shirt when we joined. When we were asked for our size, I said medium and, of course, got a man's medium which is far too big. My colleague -- who's very petite -- asked for small and, of course, the men's small is bloody enormous on her. Not that I'd ever wear it (though it might come in handy for painting); I guess it's more the lack of thought.
 
A small derail...

I am a massive GoT fan and I cannot believe how many fanboys are hating on Arya this evening. Sorry boys, girls can kick ass.

Bet her armour fit
 
Yeah, you could set up a firm which tells companies what their *real* company culture is like, but I doubt you'd get many takers.
Yeah there are a fair few organisations attempting to make money from helping others to understand and deal with various biases, many of them charities.
Some outfits have been trying to make an effective 'business case' against discrimination for decades. Scoping the problem and proposing solutions is much easier than actually dealing with the problem.
 
Yeah there are a fair few organisations attempting to make money from helping others to understand and deal with various biases, many of them charities.
Some outfits have been trying to make an effective 'business case' against discrimination for decades. Scoping the problem and proposing solutions is much easier than actually dealing with the problem.
Yeah, that's true. It's all about perception. Like the gender pay gap issue. Everyone looks really good now that they've 'addressed' it.
 
I believe you get it on some cars, but it's considered a "luxury" feature.

Until embarrassingly recently I thought the little "switch" on the mirror was a "Mum/Dad" button, because in my folks' cars when I was growing up, the change in tilt was exactly right to adjust it for their relative heights.
things you should have worked out years ago thread >>>>>
 
You really are struggling with this, aren't you. Do you think your binary options and assumption that the examples given are 'cherry-picked' are naive or consciously anti-feminist?
I don't think it's somehow anti-feminist to be skeptical about claims made for how AI behaves in 2019 based on evidence from some years ago. That evidence led to incomplete data and introduced bias being widely identified as the central challenges. I certainly don't want to appear to defend all things AI, but I recognise that the machines are learning, as are the humans who manipulate the algorithms and provide the core data sets. The Amazon software mentioned was withdrawn as not fit for purpose, partly because it discriminated against women, yet other recruitment AI is now in use. Claims of systematic bias against non-Caucasian faces have been around for years, yet AI recognition is starting to be deployed for document-free passage through high-security airports. It's not anti-anybody to want evidence of current practice being as discriminatory as it was a few years ago.
 
It's not anti-anybody to want evidence of current practice being as discriminatory as it was a few years ago.
True. But you seem to resist the idea that progress is not being made. The point of many of these threads is to look at the continuing discriminatory environment in recognition that many things are not improving at all.
 
True. But you seem to resist the idea that progress is not being made. The point of many of these threads is to look at the continuing discriminatory environment in recognition that many things are not improving at all.
I don't doubt the generality of your latter point and have no desire to frustrate that intention. This sub-thread grew out of a radio programme that cited specifics which imo don't really appear to stack up any more, that's all.
 
Luckily there are sufficient examples outside that radio programme for this to be a serious conversation elsewhere, interlinked with related topics as indicated upthread.
To break the cycle of gender imbalance, it is critical to ensure that women at all stages of their careers are being inspired to actively take part in the development and use of new technologies. As outlined above, our research suggests that this is currently not the case in AI, and we see similar trends in other new technologies such as blockchain – as new skills emerge, the old biases persist.
from here
Will AI make the gender gap in the workplace harder to close?
And
Gartner predicts that by 2022, 85% of AI projects will deliver erroneous outcomes due to bias in data, algorithms or the teams responsible for managing them
From here Why we need to solve the issue of gender bias before AI makes it worse

Here's a bit more info on the Turkish translation example; surely you can't insist on waiting for 'evidence' about the improvements made in 2019 before taking the info behind these (hastily googled) discussions seriously.
Artificial intelligence is demonstrating gender bias – and it’s our fault
 
I don't think it's somehow anti-feminist to be skeptical about claims made for how AI behaves in 2019 based on evidence from some years ago. That evidence led to incomplete data and introduced bias being widely identified as the central challenges. I certainly don't want to appear to defend all things AI, but I recognise that the machines are learning, as are the humans who manipulate the algorithms and provide the core data sets. The Amazon software mentioned was withdrawn as not fit for purpose, partly because it discriminated against women, yet other recruitment AI is now in use. Claims of systematic bias against non-Caucasian faces have been around for years, yet AI recognition is starting to be deployed for document-free passage through high-security airports. It's not anti-anybody to want evidence of current practice being as discriminatory as it was a few years ago.

Huh? You're actually discounting the Amazon software because they decided to stop using it? I mean, it's good that they stopped, but that doesn't mean their previous use of it ceased to exist or that all recruitment AI is suddenly better than the one used by an enormous company with tons of money.

"Incomplete data." :D
 