Diella, Albania’s first artificial minister: the trap of feminizing AI

For the first time in history, artificial intelligence has entered a government in Albania. Beyond questions about the role of AI in public decision-making, Diella’s appointment as the minister responsible for public procurement raises concerns about the almost systematic feminization of AI avatars. This deceptive practice perpetuates gender stereotypes, reinforces the objectification of women, and facilitates manipulation.


The Albanian government has surprised everyone by appointing Diella, an artificial intelligence (AI), as Minister of Public Procurement. Presented as an asset in the fight against corruption, Diella is to be responsible for analyzing tenders, identifying conflicts of interest, and ensuring the impartiality of public decisions.

This unprecedented initiative marks a historic milestone: for the first time, an AI has officially joined a government, in this case in the guise of a female digital avatar. But beyond the media hype and the ethical questions the appointment itself poses (can we really govern with an AI?), it raises fundamental questions about the almost systematic feminization of AI agents.

Why is Diella an artificial woman? And what are the implications of this feminization of AI?

Diella: a problematic case study

AI has already been used as a tool of governance. Some cities, for example, use algorithms to optimize transportation or detect fraud. But by appointing an AI minister, Albania is taking a major symbolic step: more than a tool, she becomes a public female figure, meant to embody values of transparency and justice.

The promise is seductive: even if an AI can reproduce or amplify the biases of those who programmed it, a machine cannot, in theory, accept bribes or favor relatives. It appears to offer a guarantee of impartiality in a country where corruption scandals taint political life. Albania ranks 80th out of 180 countries in Transparency International’s Corruption Perceptions Index.

But this vision obscures a central problem: the ethical consequences of the feminization of AI are far from trivial.

Why are AIs almost always female?

Since Siri (Apple), Alexa (Amazon), Cortana (Microsoft) and Sophia, the first robot to obtain Saudi citizenship in 2017, most virtual assistants and intelligent robots have been given a female voice, face, body or first name. This is no coincidence.

In earlier research on this issue, we showed that we perceive female bots as warmer, more trustworthy, and even more human than their male counterparts.

Why? Because women are, on average, perceived as warmer and more likely to experience emotions than men, and these are precisely the qualities machines lack. Feminizing AI objects therefore helps to humanize them.

This feminization relies on deep-rooted stereotypes: women are “naturally” gentler, more attentive, and more empathetic. By endowing their machines with these attributes, designers compensate for the coldness and artificiality of algorithms and facilitate their acceptance and adoption.

When feminization becomes manipulation

But this practice raises major ethical issues, which I explored in a recent article published in the Journal of Business Ethics.

This article compares the ethical implications of using gendered and sexualized feminine attributes in two contexts. On the one hand, advertising, where idealized female representations have long been used to attract consumers. On the other, AI agents, which are now adopting these same codes. This comparison shows that, in both cases, feminization generates three major dangers: deception, objectification, and discrimination.

  • Deception and manipulation

Artificially assigning human and feminine characteristics to machines exploits our unconscious, automatic responses to neotenic traits (youthful features often associated with femininity, such as large round eyes and soft, rounded contours), which subconsciously evoke innocence and, by extension, honesty and sincerity.

This subtle manipulation could facilitate the acceptance of potentially problematic algorithmic decisions. A feminized AI appears more human, more empathetic, more “trustworthy.” Yet it should not be forgotten that it is a computer program without emotions or consciousness (a question that is beginning to be debated), and that its decisions can be biased or even exploited.

  • Literal objectification

Unlike advertising, which metaphorically compares women to objects, artificial intelligence goes further: it literally turns women into programmable objects (machines, algorithms). Female AIs reduce feminine attributes to mere service tools: obedient machines, permanently available. This mechanization of femininity reproduces and amplifies the advertising logic of objectification, but adds a new dimension: interactivity.

As a result, researchers note the persistence of aggressive and sexualized remarks in interactions with these assistants, normalizing abusive behavior towards “machine women” that risks carrying over to real women. Ultimately, the humanization and feminization of AI can paradoxically lead to increased dehumanization of women.

  • The perpetuation of stereotypes

At first glance, Diella might seem like a symbolic victory: a woman—even a virtual one—has reached a ministerial position. In a country where politics remains dominated by men, and where most female AIs are assistants, some will see this as a sign of equality.

But this naive, optimistic reading obscures a paradox. While real women struggle to reach the highest positions in many governments, it is an artificial woman who embodies integrity in power. Nicknamed “the servant of public procurement,” she is in reality a woman with no power to act. Here we find an old archetype: “the artificial Eve,” fashioned to fit an ideal of docility and purity. A perfect minister, because she is obedient and unalterable, and because she will never question the system that created her.

Female AI: devoted saint or manipulative Eve?

The feminization of AI is actually based on two tropes deeply rooted in our imagination, which reduce female identity to the archetype of the devoted saint or the manipulative Eve.

The devoted saint is the image of the pure, obedient woman, entirely devoted to others. In Diella’s case, this manifests itself through a promise of transparency and absolute loyalty, a figure of incorruptible virtue in the service of the state and its people.

Diella’s visual representation is also strongly reminiscent of the iconography of the Virgin Mary: a gentle face, downcast gaze, humble attitude, and white veil. These religious aesthetic codes associate this AI with a figure of purity and absolute devotion. But by making AI an idealized and docile female figure, we fuel a benevolent sexism that confines real women to these same stereotypes.

The manipulative Eve is the opposite figure: in popular culture, trust in a feminized AI turns into suspicion of deception or danger. A prime example is the science fiction film Ex Machina, in which the hero is duped by an AI with whom he falls in love.

If Diella were to serve as a political instrument to justify certain opaque decisions, she too could be perceived through this prism: no longer as a guarantor of transparency, but as a figure of dissimulation.

These two contradictory representations – the sacrificial virgin and the treacherous seductress – continue to structure our perceptions of women and are now projected onto technological artifacts, feeding a loop that in turn influences how real women are perceived.

For a non-humanized and non-gendered AI

Rather than humanizing and gendering AI, let’s embrace it as a new technological species: neither male nor female, neither human nor divine, but a distinct tool, designed to complement our abilities, not mimic them. This means giving it a non-human appearance and voice, to avoid confusion, deception, and manipulation.

AI development should be based on complete transparency, representing AI for what it really is: an algorithm.

Finally, designers should make public the composition of their teams, their target audiences, and their design choices. Because, behind the apparent neutrality of algorithms and their interfaces, there are always human, cultural, and political decisions.

Diella’s arrival in the Albanian government should open a fundamental debate: how do we want to represent AI? As these technologies occupy an increasingly important place in our lives, it is urgent to reflect on how their representation shapes our democracies and human relations.

Author Bio: Sylvie Borau is Professor of Ethical Marketing at TBS Education
