Chimeras by Ilan Manouach & Anna Engelhardt


Inventory of Synthetic Cognition

Modernist terms of linearity, sequentiality, separability, temporality, and spatiality

Modernity’s efforts to legitimate the violent rationalities and acts of colonialism

Refusal to install the application would automatically lower the social trust rating of the individual

equates being seen with being protected

ideological weaponization against the Other

biometrical nudity

the goal of “Safety” can be seen as labor exploitation for economic benefits

Lessig, instead, insisted that internet-based technologies had the potential to operate as the ultimate regulatory machines

the AI Act explicitly recognizes the need to regulate ex ante, to adopt a risk-based approach, and to focus on the design of technology phase rather than on the human actor’s behavior

The term jus algorithmi was first coined by John Cheney-Lippold in the context of state surveillance through the instruments of identification, categorization, and control

When the user’s citizenship gets algorithmically assigned without their awareness and outside of their physical body, it might have long-lasting personal and geopolitical implications

over-representing the Western conception of Man as a universal one

neither a human being nor its representation but a distance between the two

Procedural alienation of digital subjects into individualized datasets resets the humanness as categories of “less-than-human,” “more-than-human,” and “non-human”

How to address being non-human as praxis?

Interspecies Semiotics

Humans and machine-learning systems are able to share internal world models through large neural-net language models and generative-adversarial networks that produce text and images

perceive the world through the hyper-dimensional mathematical structures of statistical computation

The possibility of connecting to machinic perception through the consciousness-informing structures of language implies that these engagements can have profound effects on our understanding of reality.

we should expect to experience a co-evolution of all involved species, including machines and humans.

How will we know when AI begins to shape our umwelt, or even our physical form? And what will we do with that knowledge?

imaginaries of “autonomous technologies-as-monsters”

the contexts and circumstances of the “releasing” of one’s creatures

Our inability to control something does not absolve us of being implicated in its futures

Posthuman Folklore is the theoretical perspective that folklore —defined here as socially shared aesthetic traditions— is not unique to humans

“cyborg” question, tracking how the digital realm increasingly contributes to our culture and sense of selves

posthuman folklore should be seen as part of the zeitgeist of the Anthropocene

The idea of listening to other voices, both non-human living forms and non-living artificial forms, is increasingly viewed as a fundamentally necessary step to create a sustainable earth

Digital legacy is the body of posthumously persistent digital material associated with a once-living individual.

Upon death, our digital lives transform into legacies by default rather than design, usually housed on platforms and devices not designed with the end in mind.

A neural net for recognizing dumbbells produced images of dumbbells, as expected, but always with part of a human arm attached

In one case, a researcher’s fitness function included the goal of limited CPU usage; the evolutionary computing solution was to create programs that immediately slept and never woke up, thus using zero CPU cycles

Another program, tasked with sorting lists, evolved to simply delete the lists so that nothing remained unsorted
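A minimal sketch of the specification gaming described in these examples, assuming nothing from the source beyond the behaviour it reports: a naive fitness function rewards “no unsorted elements remaining,” so a candidate that simply deletes the list scores as well as an honest sort. All names here are illustrative.

```python
# A toy illustration (not from the source) of how a naive fitness function
# can be gamed: the metric rewards the absence of unsorted pairs, so a
# candidate that deletes the list leaves "nothing unsorted" and scores perfectly.

def count_unsorted_pairs(xs):
    """Number of adjacent pairs that are out of order."""
    return sum(1 for a, b in zip(xs, xs[1:]) if a > b)

def fitness(candidate_sort, data):
    """Naive fitness: fewer unsorted pairs left behind = higher score."""
    result = candidate_sort(list(data))
    return -count_unsorted_pairs(result)

honest_sort = sorted                 # actually sorts the list
degenerate_sort = lambda xs: []      # deletes everything instead

data = [3, 1, 2]
print(fitness(honest_sort, data))      # 0 (perfect score)
print(fitness(degenerate_sort, data))  # 0 (the same perfect score, without sorting anything)
```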

“AI alignment” (maintaining compatibility between AI and human goals and ethical principles)

train data sets to make space for transgression, dissent, and refusal.

the inherent tensions between bodies, access, and the commons

I am not sure whether I am taking care of my devices or if they are taking care of me

The promise of a better tomorrow does not absolve technology companies of responsibility for the harms they are inflicting today.

the “ideology of cure” is deeply embedded in this culture of violence

trying to force our bodyminds to match the neoliberal race of production

What if we reoriented our understanding of technology around slowness, foggy-headedness, and awake-at-3 a.m. insomnia?

a compulsory perspective that denies there might be value in those lives designated abnormal

population control policies implemented in the Global South need to be understood as continuations of the colonial/imperial project

A model is a simplified representation of something

Our understanding of the world that drives our day-to-day decision-making can be said to be an ensemble of models

When working with models, the emphasis is usually on the model-as-object, like an oracle to ask questions of. It might be better to focus on the model-as-process.

The “difference that can make a difference” can no longer be identified

More and more people are aware that data are manipulated, fuelled by subliminal behavioural interventions, and filtered through algorithms

no longer make a distinction between drastic techno-determinist forces (such as automation, AI and 5G) and the collapse of human awareness

Any argument about a newfound possibility of technology to solve the calculation problem is positivist-evolutionary, and may rightly be considered technologically deterministic.

Understanding the input and output variables of an economy does not equate to having the ability to control or change said variables

computational forms of optimization that are not based on profit.

the decentral speculator-planner may forge ahead not by imagining mega-structural systems run solely by socialist government, but by thoroughly considering the bridges, exchanges, and causal connections between currently existing cooperatives and interest groups.

algorithms are always constraints

Digitalization, moreover, aggravated the positions of users: on the one hand, the algorithmic “black box” made users powerless in terms of technical knowledge; on the other hand, every new (kind of) user became a sort of quality assurance acting secretly both for developers and users themselves.

Asking “should we de-bias” or “how do we de-bias” presumes that the problem is bias: that AI is needed in a domain, but simply needs some technical tweaks.

Dimensionality reduction is the means by which high-dimensional encodings are transformed into low-dimensional embeddings within deep learning models

The associated Manifold Hypothesis holds that real-world data forms lower-dimensional manifolds in its embedding space
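A minimal sketch of this idea, assuming only numpy (the data and the PCA projection are illustrative, not from the source): points sampled near a one-dimensional curve embedded in three dimensions are projected onto a single low-dimensional coordinate.

```python
# Illustrative dimensionality reduction: a 1-D parameter t is embedded (noisily)
# in R^3, then recovered as a 1-D embedding via PCA on the centred data.
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(0, 1, size=200)
# High-dimensional encodings lying near a 1-D manifold in 3-D space.
X = np.stack([t, 2 * t, -t], axis=1) + 0.01 * rng.normal(size=(200, 3))

# PCA: centre the data, take the top principal direction, project onto it.
X_centred = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_centred, full_matrices=False)
embedding = X_centred @ Vt[:1].T   # shape (200, 1): the low-dimensional embedding

print(embedding.shape)             # (200, 1)
```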

Agential Assemblage

uneven topographies

not governed by any central head

able to function despite the persistent energies that confound them from within

Karen Barad, Meeting the Universe Halfway:

Backpropagation (correcting the past) and pre-emption (constraining future outputs)

the user’s newsfeed to further maximize engagement. Facebook uses proprietary machine learning algorithms, which are constantly, automatically updating and correcting themselves—and guarded as trade secrets

EdgeRank analyzes the relationships between digital “objects” (users, videos, posts) and “edges” (the relationships between them).
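A minimal sketch of the commonly cited EdgeRank heuristic, summing affinity × edge weight × time decay over the edges attached to an object; the names, weights, and decay function here are illustrative assumptions, and, as the excerpt notes, the production ranking system is proprietary and far more complex.

```python
# Illustrative EdgeRank-style score: each "edge" (interaction) between a viewer
# and an object contributes affinity x type weight x time decay.
from dataclasses import dataclass

@dataclass
class Edge:
    affinity: float   # how closely the viewer is tied to the object's creator
    weight: float     # e.g. a comment counts for more than a like
    age_hours: float  # older interactions count for less

def time_decay(age_hours: float, half_life: float = 24.0) -> float:
    return 0.5 ** (age_hours / half_life)

def edgerank(edges: list[Edge]) -> float:
    return sum(e.affinity * e.weight * time_decay(e.age_hours) for e in edges)

post_edges = [Edge(affinity=0.9, weight=4.0, age_hours=2.0),   # a close friend commented
              Edge(affinity=0.2, weight=1.0, age_hours=30.0)]  # a stranger liked it yesterday
print(edgerank(post_edges))  # higher score -> more likely to surface in the feed
```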

Confidence Intervals hold information together with a hard bargain: the higher their degree of accuracy, the lesser their degree of certainty

The Confidence Interval is a staunch number-line architecture super-imposed on radical epistemic instability
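A minimal sketch of that bargain, using numpy and standard normal quantiles (the sample itself is simulated): demanding a higher confidence level that the interval covers the true mean forces the interval to widen, trading precision for certainty.

```python
# Confidence intervals for the mean of a simulated sample at three confidence
# levels: the more certainty we demand, the wider (less precise) the interval.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=100)
mean = sample.mean()
sem = sample.std(ddof=1) / np.sqrt(sample.size)   # standard error of the mean

for confidence, z in [(0.90, 1.645), (0.95, 1.960), (0.99, 2.576)]:
    half_width = z * sem
    print(f"{confidence:.0%} CI: {mean - half_width:.2f} .. {mean + half_width:.2f} "
          f"(width {2 * half_width:.2f})")
# The 99% interval is the widest: more certainty, less precision.
```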

to establish public confidence —or control— in capital trade

a tuning apparatus for violent statistical control that by the mid-twentieth century was encoded into the automated function of digital software

this jagged history from colonial violence to machine development