Artificial intelligence has emerged as a profoundly transformative force within contemporary creative industries, fundamentally altering the ontological status of the artwork and the traditional role of the artist. Over the last decade, AI-assisted modes of creation have moved from niche technical experiments to a central pillar of global art discourse, redefining the relationship between human agency and computational systems. Within this paradigm shift, international award platforms such as The Lumen Prize serve as vital barometers for recognizing and legitimizing technology-driven practices. Established in 2012, The Lumen Prize has become a global benchmark for digital excellence, and its 2025 edition highlights a marked surge in AI-driven projects, signaling that algorithmic creativity is no longer a futuristic curiosity but a matured medium. This research investigates the AI-supported works featured in the 2025 selection, asking how they navigate the intersections of data politics, identity, and ecological narratives. The central problem addressed is how artificial intelligence functions not merely as a generative tool but as an active epistemological agent that reshapes the conceptual boundaries of art, challenging traditional notions of authorship, identity, and data politics.

The theoretical background of the study contextualizes the 2025 selection within a historical trajectory, noting that early experiments in algorithmic art introduced computational logic to visual composition. This trajectory gained momentum with autonomous art systems such as Harold Cohen's AARON, which exemplified rule-based creation from the 1970s through the 2010s. A significant rupture occurred in 2014, however, with the introduction of Generative Adversarial Networks, which shifted the focus from programmed rules to data-driven learning.
The current research updates this literature by focusing on the 2021–2026 era, dominated by Latent Diffusion Models and Large Language Models. Theoretically, the study synthesizes three critical perspectives: Lev Manovich's concept of AI Aesthetics and the database as a cultural form; Kate Crawford's critique of AI as a registry of power and a carrier of systemic bias; and Arthur I. Miller's vision of augmented creativity, in which the machine acts as a symbiotic co-author rather than a passive instrument.

Methodologically, the study employs a qualitative approach centered on Erwin Panofsky's Three-Stage Iconological Method. This framework allows for a multi-layered deconstruction of the works, beginning with pre-iconographical description, which analyzes the primary technical level, including generative models and datasets. This is followed by iconographical analysis, which identifies the secondary level of conventional themes such as borders, citizenship, and biological mutation. Finally, iconological interpretation uncovers the deep intrinsic meaning, exploring how the work functions as a cultural symptom that critiques technological authority or proposes post-humanist modes of existence. Primary materials for this analysis include official Lumen Prize documentation, artist interviews, and award catalogs.

In the analysis of specific awarded works, Cumulus represents a significant technical integration utilizing a custom computer vision program and NOAA satellite data. In the pre-iconographical stage, the work relies on high-altitude atmospheric data, while iconographically, it monitors the Mexico–US border for clouds rather than human movement. Iconologically, it invokes the Overview Effect, using AI to expose the invisibility of human-made borders from a planetary perspective and suggesting that national boundaries are ultimately irrelevant to ecological realities.
Turning to The Sylphs, the work technically feeds Spanish texts and their English translations into an AI to visualize linguistic gaps. It identifies how large datasets are inherently biased toward dominant languages and, at an iconological level, treats AI as a lens that exposes cultural asymmetry, suggesting that translation is never a neutral act. Furthermore, the experiential work Deutsch / Nicht Deutsch employs biased face-recognition models in interactive stations to simulate a citizenship evaluation process. By forcing viewers to undergo an automated naturalization test, it moves into the iconological realm by critiquing AI as a flawed judge and a tool of algorithmic exclusion. This work serves as a practical manifestation of the dangers of using technology to classify human identity on the basis of arbitrary datasets. In a similar vein of autonomous production, Words Beyond Words utilizes a fine-tuned Large Language Model and meta-poetic code to create an infinitely regenerative poem. This represents a shift in creative subjectivity in which the AI acts as an autonomous poet consulting its own machinic muse, deconstructing the myth of the solitary human genius and proposing a post-human poetics. Finally, the Self-contained series uses iterative crossbreeding algorithms to simulate biological mutation within digital images. Its iconological climax occurs when digital data is encoded into actual synthetic DNA, merging biological and digital evolutionary timelines into a singular hybrid artifact that challenges the distinction between the organic and the artificial.

A cross-analysis demonstrates three transformative shifts. First, an aesthetic transformation occurs as AI introduces new visual paradigms in which images are dynamic outputs of probability. Second, a thematic transformation is evident, wherein AI becomes a critical lens for investigating climate change and linguistic justice.
Third, a subjectivity transformation shows the artist's role evolving from form-maker to system-designer who collaborates with an algorithmic partner. Connecting these findings to the literature, the analyzed works serve as practical validations of Crawford's theories on algorithmic bias and Miller's vision of co-creativity. Furthermore, the reliance on complex datasets reaffirms Manovich's claim that contemporary art is defined by the transformation of the database into a distinct cultural form.

In conclusion, The Lumen Prize 2025 selection confirms that AI-assisted art has reached a state of conceptual and ethical maturity. Artificial intelligence is an active conceptual partner facilitating a post-humanist poetics in which intuition and logic intertwine. Based on these findings, the study offers critical recommendations. At an international level, art institutions should prioritize algorithmic transparency and ethical conceptualization over technical spectacle. At a national level, particularly for Turkey, Fine Arts curricula should integrate theoretical courses on Data Ethics and Algorithmic Aesthetics alongside technical training. Finally, future research should investigate the legal implications of autonomous authorship as seen in regenerative works, identifying emerging global standards for AI-driven creativity in the digital age.