
I. Misgendering = Misrepresentation

Optical lenses symbolize biases and approximations, representing the compression and distortion of the information flow. The total bias of machine learning is represented by the central lens of the statistical model, through which the perception of the world is diffracted.

AGR (Automatic Gender Recognition) is a subfield of facial recognition that aims to algorithmically identify the gender of individuals from photographs or videos.

  • Identifying gender
  • Not in the pattern
  • Impossible to identify
  • Misgendering = Misrepresentation

II. Model

AI is not a thinking automaton but an algorithm that performs pattern recognition.

Pattern

A visual pattern is recorded as an impression on a network of artificial neurons that are firing up in concert with the repetition of similar images and activating one single output neuron.
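
A hypothetical sketch of this idea: a single output neuron can be modelled as a weight vector whose 'impression' is the repeated pattern itself, so the neuron fires only when an incoming image sufficiently overlaps the imprint. The pattern, weights and threshold below are illustrative assumptions, not any particular architecture.

```python
import numpy as np

# A hypothetical 3x3 binary "cross" pattern that repeated exposure
# has imprinted on the neuron's weights (a Hebbian-style caricature).
pattern = np.array([[0, 1, 0],
                    [1, 1, 1],
                    [0, 1, 0]], dtype=float)
weights = pattern.flatten()
threshold = 4.0  # the neuron fires only when most of the pattern is present

def output_neuron(image):
    """Return True when the single output neuron fires."""
    activation = weights @ image.flatten()  # weighted sum of inputs
    return bool(activation > threshold)

print(output_neuron(pattern))           # the imprinted pattern activates it
print(output_neuron(np.zeros((3, 3))))  # a blank image does not
```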

Neural network

Artificial neural networks started as simple computing structures that evolved into complex ones which are now controlled by a few hyperparameters that express millions of parameters.
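
A minimal illustration of that ratio, assuming a fully connected network with invented layer sizes: two hyperparameters (width and depth) fix millions of individual weights.

```python
# Hyperparameters: a handful of numbers chosen by the designer.
width, depth = 1024, 8
input_size, output_size = 784, 10   # e.g. 28x28 images, 10 classes

# Parameters: the weights and biases those hyperparameters imply.
# Each fully connected layer has (in * out) weights plus out biases.
sizes = [input_size] + [width] * depth + [output_size]
n_params = sum(i * o + o for i, o in zip(sizes, sizes[1:]))

print(n_params)  # a few hyperparameters express millions of parameters
```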

Black box

The black box effect is a real issue in deep neural networks (which filter information so much that their chain of reasoning cannot be reversed), but it has become a generic pretext for the opinion that AI systems are not just inscrutable and opaque, but even 'alien' and out of control.

Ghost work

The problem of bias has mostly originated from the fact that machine learning algorithms are among the most efficient for information compression, which engenders issues of information resolution, diffraction and loss.

III. Model learning

Machine learning is a term that, as much as 'AI', anthropomorphizes a piece of technology: machine learning learns nothing in the proper sense of the word, as a human does.

Brute force approximation

Neural networks are said to be among the most efficient algorithms because these differential methods can approximate the shape of any function given enough layers of neurons and abundant computing resources.

Interpolation and extrapolation

The statistical model of machine learning algorithms is also an approximation in the sense that it guesses the missing parts of the data graph: either through interpolation, which is the prediction of an output y within the known interval of the input x in the training dataset, or through extrapolation, which is the prediction of output y beyond the limits of x, often with high risks of inaccuracy.
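
A sketch of the difference, using a polynomial fit as a stand-in statistical model (the function, noise level and polynomial degree are illustrative assumptions): predictions inside the known interval of x stay close to the truth, while predictions beyond its limits diverge badly.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 50)             # the known interval of x
y = np.sin(x) + rng.normal(0, 0.05, x.size)   # noisy training observations

model = np.poly1d(np.polyfit(x, y, deg=7))    # the fitted statistical model

# Interpolation: predicting y inside the known interval stays accurate.
err_inside = abs(model(np.pi) - np.sin(np.pi))

# Extrapolation: predicting y beyond the limits of x goes wildly wrong.
err_outside = abs(model(4 * np.pi) - np.sin(4 * np.pi))

print(err_inside, err_outside)
```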

Curve fitting

‘Curve fitting’ imposes a statistical culture and replaces the traditional episteme of causation (and political accountability) with one of correlations blindly driven by the automation of decision making.

Model fitting

The challenge of guarding the accuracy of machine learning lies in calibrating the equilibrium between data underfitting and overfitting, which is difficult to do because of different machine biases.
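
The trade-off can be simulated with polynomial models of increasing degree fitted to noisy data (the cubic signal, noise level and degrees below are arbitrary illustrative choices): too low a degree underfits, too high a degree overfits, and error on fresh data is minimised in between.

```python
import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(-1, 1, 30)
y_train = x_train**3 - x_train + rng.normal(0, 0.1, x_train.size)

x_test = np.linspace(-0.99, 0.99, 200)   # fresh points from the same interval
y_true = x_test**3 - x_test              # noiseless ground truth

def held_out_error(degree):
    """Mean squared error of a degree-n polynomial fit on unseen points."""
    coeffs = np.polyfit(x_train, y_train, degree)
    return float(np.mean((np.polyval(coeffs, x_test) - y_true) ** 2))

errors = {d: held_out_error(d) for d in (1, 3, 25)}
print(errors)  # degree 1 underfits, degree 25 overfits, degree 3 balances
```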

Architecture algorithm

The algorithm starts as a blank slate and, during the process called training, or 'learning from data', adjusts its parameters until it reaches a good representation of the input data.
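
A minimal sketch of this process, assuming the simplest possible model (a line with two parameters) trained by gradient descent on synthetic data: the parameters start uninformative and are adjusted step by step until they represent the data well.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, 100)
y = 3.0 * x + 0.5 + rng.normal(0, 0.01, x.size)  # data to be represented

w, b = 0.0, 0.0   # the blank slate
lr = 0.1          # learning rate

for _ in range(500):                       # 'learning from data'
    pred = w * x + b
    grad_w = 2 * np.mean((pred - y) * x)   # gradients of mean squared error
    grad_b = 2 * np.mean(pred - y)
    w -= lr * grad_w                       # adjust the parameters
    b -= lr * grad_b

print(w, b)  # close to the generating values 3.0 and 0.5
```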

Ghost work

  • Choosing parameters
  • Defining model
  • Algorithmic bias + the further amplification of historical bias and dataset bias by machine learning algorithms.

IV. Training data

The quality of training data is the most important factor affecting the so-called ‘intelligence’ that machine learning algorithms extract.

Training dataset

  • Production + labour or phenomena that produce information.
  • Capture + encoding of information into a data format by an instrument.
  • Formatting + organization of data into a dataset.
  • Labelling + in supervised learning, the classification of data into categories (metadata).
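
The four stages above can be sketched as a toy pipeline (all function names and example data are invented for illustration):

```python
def produce():
    # Production: labour or phenomena that generate information.
    return ["a cat sits on a mat", "a dog runs in a park"]

def capture(events):
    # Capture: an instrument encodes information into a data format.
    return [e.encode("utf-8") for e in events]

def format_dataset(records):
    # Formatting: organisation of data into a dataset.
    return [{"id": i, "text": r.decode("utf-8")} for i, r in enumerate(records)]

def label(dataset):
    # Labelling: classification of data into categories (metadata).
    for row in dataset:
        row["label"] = "cat" if "cat" in row["text"] else "dog"
    return dataset

training_data = label(format_dataset(capture(produce())))
print([row["label"] for row in training_data])  # ['cat', 'dog']
```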

Cultural construct > Ghost work

The training dataset is a cultural construct, not just a technical one. Cognitive bias + a systematic error in thinking that occurs when people process and interpret information in the world around them, affecting the decisions and judgments that they make.

Ghost work > Select data

Selecting data = data bias + Dataset bias is introduced through the preparation of training data by human operators. The training dataset is a cultural construct, not just a technical one.

Labelling > Ghost work > Data posing

Defining data and labelling bias + Labelling bias occurs when the set of labelled data is not fully representative of the entire universe of potential labels. This is a very common problem in supervised learning, stemming from the fact that data often needs to be labelled by hand, which is difficult and expensive.
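
A hypothetical sketch of the mechanism: when the set of permitted labels omits categories that exist in reality, every out-of-vocabulary case is forced into a wrong label by construction. The label set and fallback rule below are invented for illustration.

```python
# The only labels the annotation guidelines allow (an illustrative choice).
ALLOWED_LABELS = ["man", "woman"]

def annotate(self_identified: str) -> str:
    # An annotator must pick from the allowed set, even when none applies.
    if self_identified in ALLOWED_LABELS:
        return self_identified
    return ALLOWED_LABELS[0]  # forced, and therefore wrong, by design

print(annotate("woman"))       # representable: labelled correctly
print(annotate("non-binary"))  # not representable: misgendered by the label set
```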

Taxonomies

Raw data does not exist: all data is dependent on human labour, personal data, and social behaviours that accrue over long periods, through extended networks and controversial taxonomies.

V. Representation bias

Representation bias occurs when the development sample under-represents some part of the population, and subsequently fails to generalize well for that subset of the population the model is used on.
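
The effect can be simulated with synthetic data (the groups, distributions and thresholds are invented for illustration): a decision threshold tuned on a development sample that under-represents group B generalizes well for group A and poorly for group B.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample(group, n):
    # Hypothetical populations: the feature-label relation differs by group.
    x = rng.normal(0, 1, n)
    y = (x > 0.0) if group == "A" else (x > 0.8)
    return x, y.astype(int)

# Development sample: group B is heavily under-represented.
xa, ya = sample("A", 950)
xb, yb = sample("B", 50)
x_dev = np.concatenate([xa, xb])
y_dev = np.concatenate([ya, yb])

# Fit the single threshold that maximises accuracy on the skewed sample.
candidates = np.linspace(-2, 2, 401)
best_t = candidates[np.argmax([np.mean((x_dev > t) == y_dev) for t in candidates])]

# Evaluate on fresh, balanced data: the model fails to generalize for B.
xa2, ya2 = sample("A", 1000)
xb2, yb2 = sample("B", 1000)
acc_A = float(np.mean((xa2 > best_t) == ya2))
acc_B = float(np.mean((xb2 > best_t) == yb2))
print(round(acc_A, 2), round(acc_B, 2))
```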

Gender Model Bias The model of 'doing gender', now widespread within the social sciences, concerns not just how people model and gauge the gender of others, but how gender itself is believed to be modeled. As would be expected in a schema where gender derives from sex, gender is treated as:

  • Binary + man or woman.
  • Immutable + assigned a category.
  • Physiological + physical characteristics.

Historical Bias Historical bias (or world bias) is already apparent in society before technological intervention. Nonetheless, the naturalisation of such bias, that is, the silent integration of inequality into an apparently neutral technology, is by itself harmful.

Historical bias persists through the employment of new technologies that reflect and reproduce existing inequalities, but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.

Cultural Bias Cultural bias classifies gender based on a person's behaviour and social role.

(man or woman)

Biological Bias Biological bias classifies sex based on anatomy, chromosomes and hormones.

(male or female)

Social Bias Social bias is discrimination for, or against, a person or group, or a set of ideas or beliefs, in a way that is prejudicial or unfair.

Sociologists of gender, particularly ethnomethodologists, study how individuals understand and reproduce the roles and constructs of society.

Gender Bias Gender Bias is a preference for one gender over the other. Gender bias is usually a result of an individual’s ingrained beliefs about gender roles and stereotypes.

The assumption that sex dictates gender, in other words, that it mandates social roles, combinations of behaviours and traits, and aspects of presentation and identity, fails to capture the existence of people whose gender does not follow from their assigned sex.

To combat gender bias in algorithms, it is crucial to address the underlying societal factors that perpetuate such biases. Taking a comprehensive approach is essential for rectifying this issue.

Thank you for reading.