
Feminism and a world designed for men

A side issue, but a very interesting one: the finding that lab mice and rats fear male technicians more than they do female technicians, which has probably skewed findings for decades. Some of what I've read suggests that this has long been recognised, but nothing was done about it, either to study or to remedy the problem, and I can't help but wonder if that inertia was based in some kind of patriarchal lockdown.

Lab mice fear men but not women, and that's a big problem for science

That's interesting - and makes sense from an evolutionary point of view.
 
I love that on a thread about feminism and a world designed for men, so many posts are about how ACTUALLY it's really hard being tall and hot and male :D
Wait until we start discussing sexual assault in a few weeks time...
 
I'm 6'3", so all sinks and basins, standard height worksurfaces and office furniture is too low. I understand the points made upthread about cupboards and so on being too high, but frankly I'd love the option to get a comfortable working height by something as simple as standing on steps. For the most part there really isn't any solution, so, like a lot of tall people I too get backache from using space designed for those shorter than us.

You can also get backache and shoulder problems from constantly reaching up, and steps aren't always safe, especially in a kitchen which is likely to have floors that aren't the safest to place steps on. Plus you're more likely to be hit by a falling object because you're having to grab and swipe at things rather than simply reach for them.

I agree it's a difficult situation though, where few people will win. But if you're going to have an average at all (and sometimes you do have to) it should be an average that actually reflects the average user rather than the average designer or installer. Or, with office furniture, they should install items that can be adjusted for the user. Chairs have been adjustable for a long time, but there's not much point in adjusting the chair if it means you can't use the keyboard or screen properly.
 
One exception I can think of is when I went to Uni, my first year was in a women's hall which had just gone mixed. The bath was too small, not in a "looks silly" kind of way, but just enough to be kind of painful after a bit for an average-sized bloke (which I am).

Would be annoying to feel slightly the wrong size almost all the time, so I can identify.
 
What is most disturbing is that this is getting to be institutionalised in algorithms, such as the Turkish translation example.

It's moving beyond men creating gender bias to algorithms doing it as part of the way they have been set up.
I think the algorithms are nothing new or 'beyond'. Gender bias is part of the way we are all set up, as many of the non-tech examples on this thread suggest.

No doubt, but for AI to do its job properly it simply can't ignore 50% of the population or fail to recognise all non-Caucasian facial features. Which makes me wonder to what extent these cited examples are historic, or cherry-picked (as with the Turkish translation).
The AI 'gaffes' we hear about are entirely a result of doing the job as specified. 50% of the population are used to putting up with badly designed stuff. Why assume the examples are biased?
 
One exception I can think of is when I went to Uni, my first year was in a women's hall which had just gone mixed. The bath was too small, not in a "looks silly" kind of way, but just enough to be kind of painful after a bit for an average-sized bloke (which I am).

Would be annoying to feel slightly the wrong size almost all the time, so I can identify.

Odds are that was just crappy design though. Baths aren't better smaller just because you're small.
 
Can't recall if it was that programme or another R4 one that was talking about the inbuilt prejudice in the way computers discriminated against women, e.g.:
- not selecting CVs of perfectly qualified women for a job shortlist
- or translation programs that, when asked to translate 'He is a nurse' & 'She is president' into Turkish, which hasn't got gendered pronouns, and then back into English, turned them into 'She is a nurse' & 'He is president'
- also in not recognising non-white faces as human (but that's for another thread)
There seems to be an issue of blatant sexism as well as gender bias going on inside tech companies that has far-reaching effects, as we all rely so much on the technology.
That was a different programme. I heard that one. Quite shocking.
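For what it's worth, the mechanism behind that Turkish round trip is easy to sketch. Here's a toy in Python (not any real translation system; the probabilities are invented for illustration): Turkish 'o' covers both 'he' and 'she', so a statistical translator going back to English has to pick a pronoun, and a naive one picks whichever gender co-occurs most often with the profession in its training text.

```python
# Toy model of the English -> Turkish -> English round trip. This is NOT a
# real translation API; the corpus statistics are invented for illustration,
# and professions are left as English words to keep the toy readable.

# Hypothetical corpus statistics: how often each pronoun co-occurs with
# each profession in the (biased) training text.
PRONOUN_STATS = {
    "nurse": {"he": 0.10, "she": 0.90},
    "president": {"he": 0.95, "she": 0.05},
}

def to_turkish(sentence: str) -> str:
    """English -> Turkish: the gendered pronoun collapses to genderless 'o'."""
    profession = sentence.rstrip(".").split()[-1]
    return f"o bir {profession}"  # 'o' means he/she/it; gender is lost here

def to_english(sentence: str) -> str:
    """Turkish -> English: a gender must be invented; pick the likeliest."""
    profession = sentence.split()[-1]
    stats = PRONOUN_STATS.get(profession, {"he": 0.5, "she": 0.5})
    pronoun = max(stats, key=stats.get)  # the stereotype wins
    return f"{pronoun.capitalize()} is a {profession}."

for original in ("He is a nurse.", "She is a president."):
    print(original, "->", to_english(to_turkish(original)))
# He is a nurse. -> She is a nurse.
# She is a president. -> He is a president.
```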
 
Odds are that was just crappy design though. Baths aren't better smaller just because you're small.

If I was smaller it would have been fine. I really think it was just that.
They'd put a new block on recently and the baths in that bit were fine... I found out... incidentally...
 
You can also get backache and shoulder problems from constantly reaching up, and steps aren't always safe, especially in a kitchen which is likely to have floors that aren't the safest to place steps on. Plus you're more likely to be hit by a falling object because you're having to grab and swipe at things rather than simply reach for them.

I agree it's a difficult situation though, where few people will win. But if you're going to have an average at all (and sometimes you do have to) it should be an average that actually reflects the average user rather than the average designer or installer. Or, with office furniture, they should install items that can be adjusted for the user. Chairs have been adjustable for a long time, but there's not much point in adjusting the chair if it means you can't use the keyboard or screen properly.
I'd like to think there's a way round this, but I'm not that hopeful. I worked in a place with (very expensive) adjustable desks once but very seldom saw one that had been changed. To start with I'd raise mine, but it was always back to standard when I came in for the next shift so eventually I gave up.
 
I'd like to think there's a way round this, but I'm not that hopeful. I worked in a place with (very expensive) adjustable desks once but very seldom saw one that had been changed. To start with I'd raise mine, but it was always back to standard when I came in for the next shift so eventually I gave up.

You could have a fixable, adjustable keyboard and mouse stand - they exist and are cheaper than adjustable desks - and allow the monitor to be adjusted in height (which is usually fairly easy but not always done). Then you have the desks slightly high, so that people don't have the uncomfortable situation where their legs don't fit under the desk, but the lowest position of the keyboard stand etc. is comfortable for short people and can be made higher for taller people. The other stuff on the desk is unlikely to be much of an issue.

Hotdesking makes this much more difficult obvs but I loathe hotdesking anyway.

It's the sort of thing that companies advising on disability adjustments recommend, but it applies to everyone.
 
If I was smaller it would have been fine. I really think it was just that.
They'd put a new block on recently and the baths in that bit were fine... I found out... incidentally...

It would have been fine, but it's not like the women would have requested a smaller bath.
 
The AI 'gaffes' we hear about are entirely a result of doing the job as specified. 50% of the population are used to putting up with badly designed stuff. Why assume the examples are biased?

No company is going to deploy phone or door-opening security software that fails to perform its basic function because it doesn't recognise any black people, are they? Nor are many employers likely to buy CV-scanning software that routinely ignores candidates with a female-sounding name. That's speculation: donation to the server fund if you can find any current, up-to-date examples of real-world AI that systematically works like that.
 
No company is going to deploy phone or door-opening security software that fails to perform its basic function because it doesn't recognise any black people, are they? Nor are many employers likely to buy CV-scanning software that routinely ignores candidates with a female-sounding name. That's speculation: donation to the server fund if you can find any current, up-to-date examples of real-world AI that systematically works like that.

Is Amazon a big enough company to count?

Amazon scraps secret AI recruiting tool that showed bias against women - Reuters

And although I think AI failing to recognise people with darker skin (which it absolutely does, and it should never have been put on the market with those biases in its functioning) is an interesting topic, I would personally prefer it if this thread were about women rather than becoming a general thread about discrimination.
 
But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.

After years of using it. And if they were, you can bet other companies were.

And a lot of companies either wouldn't care about software that screened out female-sounding names or would think it was fantastic. Great, now we don't ever have to pay maternity pay and we can blame it on the software!
 
After years of using it. And if they were, you can bet other companies were.

And a lot of companies either wouldn't care about software that screened out female-sounding names or would think it was fantastic. Great, now we don't ever have to pay maternity pay and we can blame it on the software!

Why was the machine doing that? :confused:
 
Why was the machine doing that? :confused:

The article explains it. It looked at the CVs of previous successful and unsuccessful applicants and filtered positively for the words that were used in the successful CVs, and filtered negatively for the words used in the unsuccessful ones. One of the main things it discovered was that resumes that included the word "women", as in "women's chess club", belonged to candidates who tended not to get hired in the past, because women have always had problems being hired in the tech industry, so it flagged those terms as negative. So it perpetuated a previously existing problem.

It was more subtle than that, too:

Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said.

Men are socialised to use more "aggressive" language than women are. If anyone tries to deny that I'm not going to bother to argue with them.

It would be possible to teach female applicants to use those words, but that depends on their knowing which words to use, and if you tell everyone which words to use then the AI becomes useless.
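To make the mechanism concrete, here's a minimal sketch with toy data (nothing to do with Amazon's actual code or data; the CVs and decisions below are invented): train a standard text classifier on past hiring outcomes and it dutifully learns "women's" as a negative signal, and verbs like "executed" and "captured" as positive ones.

```python
# Minimal sketch of a CV screener learning historical bias. The CVs and
# hiring decisions are invented; the point is only the mechanism.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

cvs = [
    "captain chess club executed large migration project",
    "executed backend rewrite captured market data pipeline",
    "captain women's chess club led large migration project",
    "member women's coding society organised data pipeline work",
]
hired = [1, 1, 0, 0]  # past (biased) decisions the model learns from

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(cvs)
model = LogisticRegression().fit(X, hired)

# The learned coefficients: "women" comes out negative purely because the
# CVs mentioning it were the rejected ones in the training data.
weights = dict(zip(vectoriser.get_feature_names_out(), model.coef_[0]))
for word in ("executed", "captured", "women"):
    print(f"{word}: {weights[word]:+.3f}")
```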
 
After years of using it. And if they were, you can bet other companies were.
The article says they started work in 2014 and withdrew the software in 2015. We all know there *were* systematic discrimination problems with AI; I asked whether there *are now*.
 
Literally helped a friend with an AI photo system that kept rejecting her headshot as "too dark" last year.

The systemic issues that get integrated into AI haven't gone away so the problem hasn't gone away. The worst part is all the systems that don't make such direct, observable decisions though, which are increasingly in use with big data. Do we really trust a finance risk indicator or behavioural threat level predictor for the police to be unbiased? I don't (and I have a degree in AI and worked with big data).
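One basic check you can run without opening the black box is comparing outcome rates across groups. A minimal sketch with invented numbers (the 0.8 cut-off is the "four-fifths rule" heuristic from US employment-discrimination practice, not anything from the thread):

```python
# Minimal audit sketch: compare "high risk" rates across groups for an
# opaque scorer. Scores and groups are invented for illustration.
from collections import defaultdict

predictions = [  # (group, risk score) pairs as a black-box model emits them
    ("group_a", 0.91), ("group_a", 0.35), ("group_a", 0.40), ("group_a", 0.20),
    ("group_b", 0.88), ("group_b", 0.75), ("group_b", 0.82), ("group_b", 0.30),
]
THRESHOLD = 0.7  # scores above this get flagged "high risk"

flagged = defaultdict(int)
totals = defaultdict(int)
for group, score in predictions:
    totals[group] += 1
    flagged[group] += score > THRESHOLD

rates = {g: flagged[g] / totals[g] for g in totals}
ratio = min(rates.values()) / max(rates.values())
print(rates)                                   # {'group_a': 0.25, 'group_b': 0.75}
print(f"disparate impact ratio: {ratio:.2f}")  # 0.33 -- well below 0.8
```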
 
The article explains it. It looked at the CVs of previous successful and unsuccessful applicants and filtered positively for the words that were used in the successful CVs, and filtered negatively for the words used in the unsuccessful ones. One of the main things it discovered was that resumes that included the word "women", as in "women's chess club", belonged to candidates who tended not to get hired in the past, because women have always had problems being hired in the tech industry, so it flagged those terms as negative. So it perpetuated a previously existing problem...

Ah, so the data they fed in was based on who they had hired in the past.
Seems it worked perfectly in that sense. Just told them a few things about themselves that they weren't expecting along with it.

edit: Smash, bludgeon, grrr!!!
 
Ah, so the data they fed in was based on who they had hired in the past.
Seems it worked perfectly in that sense. Just told them a few things about themselves that they weren't expecting along with it.
One of the best uses for this sort of AI/statistical analysis: exposing implicit prejudice.
 
Literally helped a friend with an AI photo system that kept rejecting her headshot as "too dark" last year.

The systemic issues that get integrated into AI haven't gone away so the problem hasn't gone away. The worst part is all the systems that don't make such direct, observable decisions though, which are increasingly in use with big data. Do we really trust a finance risk indicator or behavioural threat level predictor for the police to be unbiased? I don't (and I have a degree in AI and worked with big data).
OK. Do you think that systematic discrimination against women built into these systems is there by conscious design or by naivety?
 
There is a big difference between being an outlying male (which is what the men on this thread are complaining about) and being an average female. Average women are discriminated against not because we are outliers but simply because we are female.

I bought my car 5 years ago and I can adjust seat height, pitch, steering wheel angle, temperature on different sides of the car etc, but not the height of the seatbelt. If I make the seat high enough that it isn't cutting into the side of my neck, it's a really uncomfortable (and rubbish) driving position. It would cost very little to make it adjustable but it's not considered necessary :(

It was quite hard trying to find a pair of steel toe-capped boots too - I only found one shop that sold them.
 