
Advanced Lab: Synthetic Media (IMA NYU, Spring 2024-26)

Synthetic Media — Course Overview

Overview

Course number: INTM-SHU 306

Course Title: Synthetic Media

Course Description:
This course investigates emerging trends in machine learning and artificial intelligence for generating media content – images, video, and sound. It explores how artists, designers, and creators can use machine learning in their own research, production, and development processes. Students will learn machine-learning techniques and use them to generate creative media content. We will cover a range of platforms and models and also experiment with implementing interaction within our work.

Instructional Mode: In-person

Co-requisite or prerequisite: Communications Lab, What’s New Media, OR Emerging Technologies & Computational Arts

Class meeting days and times: Meets once a week, Mondays 15:45-18:30 N405

Learning Outcomes:
Upon completion of this course, students will be able to:

  • understand the concept of synthetic media and the technologies it involves,
  • learn about machine learning and generative A.I. and how to apply it creatively,
  • analyze the creative and artistic dimensions of synthetic media,
  • work with state-of-the-art platforms and models for generating synthetic media content,
  • experiment with implementing the generated media content with platforms for interaction design,
  • develop critical thinking skills through analyzing and reflecting on the implications of synthetic media in society and culture.

Course Policies

Attendance and Tardiness

Students are expected to attend all scheduled classes. Students who are unable to attend a class must notify the instructor before the class.

Absences and Grades

  • 4 absences will lead to an F for your participation grade.
  • 6 absences will lead to a 25% reduction in your final grade.
  • 8 absences will lead to failure of the course.

Absence Exceptions

Observance of Religious Holidays: You may miss class for the observance of religious holidays. If you anticipate being absent because of religious observance, notify me in advance so we can create a plan for making up missed work.  For more on this policy: https://www.nyu.edu/about/policies-guidelines-compliance/policies-and-guidelines/university-calendar-policy-on-religious-holidays.html

Competitions, Conferences, Presentations: You are permitted to be absent from classes to participate in competitions, conferences, and presentations, either at home or out of town, as approved by the Associate Provost for Academic Affairs.  Review the Undergraduate Bulletin for the conditions you must meet to obtain approval for this kind of absence.

Extended Illness: A student with an injury or medical condition that requires ongoing accommodations (temporary or permanent) should contact the NYU Moses Center for Student Accessibility (CSA). If an accommodation is recommended by the Moses Center, then Academic Affairs may communicate on behalf of students to advocate for excused absences/ extensions. Reasonable accommodations, considering the course objectives, student learning, and fair standards, are ultimately decided by the professor.

Tardiness

Punctual arrival is mandatory for this class. Students need to be on time and should not leave in the middle of class except in an emergency.

Late Assignments

Assignments are due at the date and time indicated on this syllabus. The late penalty for all assignments is one-third of a letter grade per day (an A becomes an A-, etc.). All other late assignments will earn an F.

Electronic Devices

Mobile Devices: Students may not use mobile devices in class unless otherwise indicated.

Recording Class: To ensure the free and open discussion of ideas, students may not record classroom lectures, discussions, and/or activities without the instructor’s advance written permission; any such recording can be used solely for their own private use. If a student has approved accommodations from the Office of Disability Resources permitting the recording of class meetings, the student must present the accommodation letter to the instructor in advance of any recording. On any days when classes will be recorded, the instructor will notify all students in advance. Distribution or sale of class recordings is prohibited without the written permission of the instructor and other students who are recorded.

Instructional Technology

Email Communication: The course instructor will contact students regularly via email. Students should check for emails from the instructor covering reminders, logistics, updates, and so on. Please note that the instructor will try to respond to all emails within 24 hours. Students should not expect immediate responses to emails sent late at night, during holidays, or on weekends.

Assignment Notification: All assignments will be posted on the course website. Each student is responsible for reviewing the website and its resources. After each class, students should check the site for the next homework assignment and any other requirements and responsibilities related to the course.

Instructional Technology Tools and Assistance: If you need background on specific instructional technology tools, such as Zoom, NYU LMS (Brightspace) and Voicethread, check the RITS Student Toolkit. You may also email [email protected] for assistance.

Academic Honesty/Plagiarism

Carefully read NYU Shanghai’s Statement on Academic Integrity (in the Undergraduate Bulletin).   Breaches of academic integrity could result in failure of an assignment, failure of the course, or other sanctions, as determined by the Academic Affairs office.

Disability Disclosure Statement

NYU Shanghai is committed to providing equal educational opportunity and participation for students with disabilities. It is NYU Shanghai’s policy that no student with a qualified disability is excluded from participating in any NYU Shanghai program or activity, denied the benefits of any NYU Shanghai program or activity, or otherwise subjected to discrimination with regard to any NYU Shanghai program or activity. Any student who needs reasonable accommodation based on a qualified disability should register with the Moses Center for Student Accessibility for assistance. Students can register online through the Moses Center and can contact the Academic Accommodations Team at [email protected] with questions or for assistance.

Title IX Statement

Title IX of the Education Amendments of 1972 (Title IX) prohibits discrimination on the basis of sex in educational programs. It protects victims of sexual or gender-based bullying and harassment and survivors of gender-based violence. Protection from discrimination on the basis of sex includes protection from being retaliated against for filing a complaint of discrimination or harassment. NYU Shanghai is committed to complying with Title IX and enforcing University policies prohibiting discrimination on the basis of sex. Mary Signor, Executive Director of the Office of Equal Opportunity, serves as the University’s Title IX Coordinator. The Title IX Coordinator is a resource for any questions or concerns about sex discrimination, sexual harassment, sexual violence, or sexual misconduct and is available to discuss your rights and judicial options. University policies define prohibited conduct, provide informal and formal procedures for filing a complaint, and a prompt and equitable resolution of complaints.

Links to the Title IX Policy and related documents:

Academic Resources

ARC Services

The Academic Resource Center (ARC) offers individual, one-on-one tutoring as well as group sessions, in a variety of formats and courses. You can log on to WCOnline to book an appointment with a Global Writing & Speaking Fellow or a Learning Assistant (LA). The Global Writing & Speaking Fellows conduct individual consultations on writing, speaking, reading, and academic skills coaching. LAs provide both individual and small-group tutoring support in over 30 STEM, Business, Economics, IMA/IMB, and Chinese Language classes. Visit shanghai.nyu.edu/arc for more information about ARC services.

Library Services

The Library is available to support your research needs. They have access to over 27,000 print resources, 2,000 DVDs, and 1,000 databases (including over a million ebooks, as well as streaming audio and video and image databases).

Librarians with expertise in your research topic are available to meet either in person or online by appointment or by email to help you navigate the research process. Our library team features experts in Business, Arts & Humanities, STEM, Social Sciences & Economics, and data tools & resources. Ask us how we can assist you in developing a research question and formulating a research strategy, selecting databases, requesting materials, and citing your sources. Visit shanghai.nyu.edu/library for more information on:

  • 24/7 access to e-books, e-journals, streaming media, and databases
  • Booking one-on-one consultations for research help

Electronic Reserves

Students can access course readings for the courses they are currently enrolled in, using their NYU credentials, at https://ares.library.nyu.edu/.

Interlibrary Loan Service

For materials not immediately available to you, you can request scanned copies of a book chapter or journal article through our Interlibrary Loan (ILL) service. If you don’t know which chapter you need, you can request a table of contents through ILL.

Assignments & Grading

For all assignments, you are required to demonstrate three important skills:

  • Divergence: Showcase thorough research, investigation, and experimentation. It is often impossible to reach superb results if the initial research is limited and lacks depth or quality of resources and information.
  • Criticality: It is paramount to critically reflect on the researched or practiced content and identify what to keep and what to discard. This skill is sharpened by frequent exposure to a sufficient amount of work by other artists and practitioners, and by understanding their methods and motivations in more detail.
  • Convergence: After you have done a wide series of experiments and critically reflected on the content, you need to showcase your convergence skills. At this stage, it is essential to focus on the optimization and refinement of your content. Rough outputs that lack numerous iterations show poor results, even if the previous stages are completed perfectly. Plan ahead for the refinements that will showcase strong final results.


Class Participation & Attendance

Active participation and attendance are essential in this course. Students will take part in various teaching and learning activities during class, such as discussions, debates, practical exercises, and more.

Homework & Reading Responses

The course necessitates the completion of homework such as readings, writings, and practical exercises. These are important elements that are required for achieving the learning outcomes of this course.

Marking Elements

Class Participation & Attendance: 20%
Homework & Reading Responses: 15%
Assignment 1: 15%
Assignment 2: 20%
Assignment 3: 30%

Letter Grades

Letter Grade – Percentage
A: 95% – 100%
A-: 90% – 94.99%
B+: 87% – 89.99%
B: 83% – 86.99%
B-: 80% – 82.99%
C+: 77% – 79.99%
C: 73% – 76.99%
C-: 70% – 72.99%
D+: 67% – 69.99%
D: 63% – 66.99%
F: below 63%
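As a quick sanity check of the band boundaries in this table, the percentage-to-letter mapping can be sketched as a small shell helper. `letter_grade` is a hypothetical name used only for illustration; it is not part of any course tooling.

```shell
# Hypothetical helper: maps a numeric percentage to the letter grade above.
# Each band starts at its lower bound (e.g. 83 <= p <= 86.99 is a B);
# anything below 63 is an F.
letter_grade() {
  awk -v p="$1" 'BEGIN {
    if      (p >= 95) g = "A";
    else if (p >= 90) g = "A-";
    else if (p >= 87) g = "B+";
    else if (p >= 83) g = "B";
    else if (p >= 80) g = "B-";
    else if (p >= 77) g = "C+";
    else if (p >= 73) g = "C";
    else if (p >= 70) g = "C-";
    else if (p >= 67) g = "D+";
    else if (p >= 63) g = "D";
    else              g = "F";
    print g;
  }'
}

letter_grade 84.5   # prints: B
```

Note that the comparisons use the lower bound of each band, so a 94.99 is an A- and a 95.00 is an A.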

  • Assignment 1 – Triptych Portraits

    (medium: generative image | work individually)

    Prompt: Create a triptych of the real YOU (your “portrait”). This assignment is inspired by the triptychs of Francis Bacon. The artist used the format to split a single subject, conveying a visual metaphor for the multifaceted and sometimes fragmented nature of his own identity and capturing layered states of being such as internal conflicts, vulnerability, and the passage of time. He used distorted, almost dismembered forms to suggest a struggle with inner demons and existential anxiety. Through his signature unsettling, raw imagery, he conveys a fascination with the flesh, depicting a candid exploration of human emotions, the inevitability of decay, mortality, and redemption.

    You are free to interpret the prompt in a way that makes sense to you; consider viewing this with honesty, courage, and depth. This is not a psychological profiling but rather a snapshot of the holistic perception of who you are.

    Technically speaking, you need to provide the triptych in a printed format (~A3 size) and submit a folder with the final images (.png or .jpeg format, 2K/4K resolution). The images need to contain the workflow (or you can submit the workflow as additional files in JSON format). Your workflows need to have (a) a two-pass sampling system, (b) a ControlNet implementation, (c) an IPAdapter with multiple images, and (d) at least one LoRA. You may optionally use MiDaS for depth estimation or an inpainting sequence.

    Upload the final content HERE (add a folder as: SurnameName)

S1. An Introduction to Synthetic Media
  • General Introductions & Course Overview
S2. Text-To-Image Generation & Prompt Engineering
  • Readings Discussion
  • Homework (deadline: Mon 17 Feb, by 12 noon)
    • Practice: Develop 20 high-resolution images, each exploring different prompt-engineering techniques, different art styles, and/or different input images. Make sure that the images contain the ComfyUI workflow as well (so that we can see both the image and the workflow).

      Upload Final Images Here: Submission Folder (SurnameName-1.png, etc)
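Because the graded images must carry their ComfyUI workflow, it is worth checking the exported files before uploading. The sketch below rests on one assumption: ComfyUI's default PNG export embeds the workflow JSON as an uncompressed text chunk inside the file, so a plain-text search can detect it. `check_workflow` is a hypothetical helper name, not part of the course tooling.

```shell
# Hypothetical check: flags exported images that lost their embedded
# ComfyUI workflow (assumes the default PNG export, which stores the
# workflow JSON as an uncompressed text chunk inside the file).
check_workflow() {
  if grep -aq 'workflow' "$1"; then
    echo "$1: workflow found"
  else
    echo "$1: NO workflow - re-export with metadata enabled"
  fi
}

# Usage: check every file you are about to upload.
for f in "$@"; do
  check_workflow "$f"
done
```

Images that were re-saved in an external editor or captured as screenshots will typically fail this check, since most editors strip the embedded metadata.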

S3. Advanced Techniques for Generative AI
  • Readings Discussions
  • Computing Resources
    • 4 PCs with NVIDIA RTX 3080 (N306)
      • Speed: Very Fast
      • Booking: Via Sheet
      • Note: at the moment, 2 of them are being updated by the vendor
    • 5 iMacs (N306)
      • Speed: A bit slow
      • Booking: –
    • 1 PC with NVIDIA RTX 4080 (Emerging Media Lab)
      • Speed: Extremely Fast
      • Booking: Via Sheet
    • 1 MacMini (Emerging Media Lab)
      • Speed: Very Fast
      • Booking: Via Sheet
    • Tensor.Art
      • Available as a 3-month subscription (purchase on your own and send your electronic receipt to sd163@nyu.edu)
    • RunPod (serverless access)
      • This will be available to all after week 6
  • Homework (next week): First Draft/Iteration for Assignment 1
S4. Strategies for Project Development (Image)
S5. Assignment 1 Presentations
  • Assignment 1 Presentations

  • Assignment 2 – Machines & Motion

    (medium: generative video & sound | work individually OR groups of two)

    Prompt: In this assignment, you are asked to create a 1- to 2-minute video that explores motion in computational systems, machine hallucinations, or unconscious states of A.I. Unlike traditional animation, movement here is not just about objects shifting in space but about patterns emerging, structures unfolding, and logic evolving over time. How does an algorithm express and understand motion? How does data transform when set into motion? Your video should capture motion as a generative process – whether through fluid transitions, recursive loops, glitch disruptions, or emergent formations.

    In terms of technical details, you need to submit a final video and a folder with your project files. The video must be in .mp4 format (1920×1080 or higher, up to 4K), between 1 and 2 minutes, and include a workflow file documenting the processes used. Your workflows must incorporate: (a) a generative motion process, (b) a structured sound layer, which could be generative music, algorithmic soundscapes, or fragmented digital textures, and (c) (optional) a text or narration component, which may be an AI-generated script, a poetic reflection, or a conceptual statement.

    Upload the final content HERE (add a folder as: SurnameName)

S6. Synthetic Video
  • Workflows
    • S6-1 – Workshop File (base)

    • S6-2 – AnimateDiff Text-To-Video
    • S6-3 – AnimateDiff Prompt Travel
    • S6-4 – Frame Interpolation
  • RunPod Tutorial (Cloud Service)

    In RunPod, you can set up your custom machine learning, generative AI, or ComfyUI projects. Below are the commands used in this tutorial (for easier copy-pasting during setup).


    1️⃣ Set Up Storage Space

    • Allocate 100GB of storage before creating the pod.

    2️⃣ Create a New Pod

    1. Attach Storage: When deploying a pod, add your storage space (this option appears at the top of the Deploy GPU Pod page).
    2. Select a GPU: A GPU costing around $0.50/hour (with at least 20GB VRAM) should be sufficient for generative video tasks.
    3. Choose a Template:
      • Select RunPod Pytorch 2.1.
      • Click Edit Template, then under Expose HTTP Ports, add:
        8888,8188

    3️⃣ Set Up the Environment

    Once the pod is set up and ready, go to Connect → Jupyter Lab and follow these steps:

    1. Open a Terminal and install PyTorch:
      pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/xpu
    2. Clone ComfyUI Repository:
      git clone https://github.com/comfyanonymous/ComfyUI.git
    3. Install ComfyUI Requirements:
      cd ComfyUI/
      pip install -r requirements.txt
    4. Install ComfyUI Manager:
      cd custom_nodes/
      git clone https://github.com/ltdrdata/ComfyUI-Manager.git
      cd ComfyUI-Manager
      pip install -r requirements.txt
      cd ../..
    5. Download a Model Checkpoint:
      cd models/checkpoints/
      wget -O dreamshaper8LCM.safetensors "https://civitai.com/api/download/models/252914?type=Model&format=SafeTensor&size=pruned&fp=fp16"
      cd ../..
    6. Run ComfyUI:
      python3 main.py --listen

    4️⃣ Access ComfyUI

    • Go back to your pod, click Connect, then select HTTP Service.
    • ComfyUI should open in a new browser window.
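For convenience, the terminal steps above can be collected into one script. This is only a sketch: the commands are reproduced verbatim from the tutorial (including its pip index URL), and the script defaults to a dry run that just prints each command, so you can review the plan before executing it on a pod.

```shell
#!/usr/bin/env bash
# Dry-run-by-default sketch of the ComfyUI setup steps above.
# DRY_RUN=1 (the default) only prints each command; DRY_RUN=0 executes it.
DRY_RUN="${DRY_RUN:-1}"
PLAN=""

# Records and prints each command; executes it only when DRY_RUN=0.
run() {
  PLAN="${PLAN}+ $*
"
  echo "+ $*"
  [ "$DRY_RUN" = "1" ] || "$@"
}

run pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/xpu
run git clone https://github.com/comfyanonymous/ComfyUI.git
run cd ComfyUI
run pip install -r requirements.txt
run cd custom_nodes
run git clone https://github.com/ltdrdata/ComfyUI-Manager.git
run cd ComfyUI-Manager
run pip install -r requirements.txt
run cd ../..
run cd models/checkpoints
run wget -O dreamshaper8LCM.safetensors "https://civitai.com/api/download/models/252914?type=Model&format=SafeTensor&size=pruned&fp=fp16"
run cd ../..
run python3 main.py --listen
```

Run it once with the default DRY_RUN=1 to check the plan, then again with DRY_RUN=0 to perform the installation; because `run` is a shell function, the `cd` steps change the current directory only when actually executing, matching the tutorial's flow.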
S7. Strategies for Project Development (Video)
  • Workflows

    • S7 – VideoFull

    • S7-1 PostProcessing Video

S8. Assignment 2 Tutorials & Guest Talk
  • ComfyCon | Shanghai Climate Week
  • Assignment 2 Tutorials
    1. Catherine Rong + Tina Song
    2. Katherine Guo & Elaine He
    3. Vicky
    4. Kiri
    5. Hanwen Hu & Leander Bai
    6. Jaidyn
    7. CC
    8. Marissa Lori Moreno
    9. Azaliia
    10. Emy
    11. Anya
  • Guest Talk (5:30–6:30 PM, @N306)

    Guest Talk by Chantal Matar: AI, Immersive Media & Spatial Design

    Chantal Matar is a multidisciplinary Lebanese-British architectural and generative designer specializing in the intersection of mixed media and architecture. Her work is deeply influenced by art, nature, and emerging technologies, particularly Artificial Intelligence, which she integrates into her research, digital aesthetics, and built environments. She has led workshops and delivered lectures at some of the world’s leading institutions and conferences, including MIT, EPFL, and UCL Bartlett. Her work has been showcased internationally, including at the Venice Biennale and in galleries and exhibitions around the world.

  • Homework (7 April)
    • Installation: Unity Engine & Review of Basics

      To start developing a project with Unity, you first need to install it on your system:

      • Unity Hub and an LTS Unity Editor (recommended: 6000.0.xx)

      Following the installation of the required software, you can open the Unity Hub, which is a tool that allows you to manage your Unity projects as well as manage and install different versions of Unity in one centralized location.

      Unity Hub

      When you open Unity Hub for the first time, you will be prompted to log in with your Unity ID. If you don’t have one, you can create one for free. In Unity Hub’s Preferences menu, select License, and activate a Personal license (which is free). A Personal license expires after a few days, but you can reactivate it through the same process.

      The Unity Hub will show you a list of all Unity versions installed on your computer (you can have multiple versions of Unity if you want). If you don’t have any versions installed, you can click on the “Installs” button to install a new version.

      Here are the main tabs that you need to know about Unity Hub:

      • Projects – To open an existing project or create a new one, click on the “Projects” button and then select the option that you want.
      • Installs – To manage and install different versions of Unity, click on the “Installs” button. Here you can see all the versions of Unity installed on your computer, and you can also install new versions or delete old ones.
      • Learn – To access Unity documentation and tutorials, click on the “Learn” button.
      • Other options include the Community, UOS, and Developer Services (which are not going to be covered).

      Creating a Project with the Unity Hub

      To create a new project in Unity, open the Unity Hub and click on the “New Project” button. In the “New Project” window, select the appropriate version for your project from the drop-down menu – 6000.0.xx. Next, select the URP (Universal Rendering Pipeline) or the HDRP (High Definition Rendering Pipeline) template from the list of templates.

      In the right side panel, set the project name and location. You can also choose to create a new folder for your project if you want to. Finally, uncheck the checkbox for “Enable PlasticSCM”. This will prevent Unity from using the PlasticSCM version control system for your project.

      Click on the “Create Project” button to create your new Unity 3D project. Once the project is created, you will be taken to the Unity editor where you can start working on your application.

      Unity Editor

      The Unity editor is divided into several main areas. The most important are the Scene view, the Game view, the Hierarchy, the Project panel, the Inspector, and the Console.

      • The Scene view is where you can see and edit your 3D objects in your scene. You can navigate and manipulate the camera, as well as select and move objects in the Scene view.
      • The Game view is where you can see what your game or application will look like when it is running. You can run the “Game” by clicking on the Play/Stop buttons at the top of the screen.
      • The Hierarchy is where you can see all the objects in your scene organized in a hierarchical tree structure. You can select, move, and delete objects from the Hierarchy.
      • The Project is where you can access all the assets and files in your project. You can import new assets, create new folders, and access the scripts and prefabs in your project.
      • The Inspector is where you can see and edit the properties of the currently selected object. You can change the object’s position, rotation, scale, as well as access its materials, scripts, and other components.
      • The Console is where we can check if everything runs correctly with our project. This helps us to debug code or other issues that may occur during execution.

      Other important panels and features include the Toolbar, Lighting, and more.

      • The Toolbar at the top of the Unity editor is where you can access different tools such as the move, rotate, and scale tools. You can also access different windows and settings from the Toolbar.
        • The Hand Tool allows panning around the Scene.
        • Press key Alt (Windows) or Option (Mac) and left-click and drag to Orbit the Camera around the current pivot point.
        • Scroll (mouse or keypad) to zoom in and out the scene.
        • The Move, Rotate, Scale, Rect Transform and Transform tools allow you to edit individual GameObjects.
        • Shortcuts: Q-W-E-R-T-Y
      • The Lighting window is where you can set up and adjust lighting for your project. You can add and adjust lights, as well as change the environment settings.
      • Additional windows and panels can be found on the main Unity Menu/Windows.

        Primitives

        To add 3D objects to the scene, go to the Hierarchy panel and either click the + sign or right-click on the Hierarchy panel area to open the sub-menu. Select 3D Object, then choose the type of object you want to create, such as a cube, sphere, or plane. The new object will appear in the Hierarchy panel and the Scene view.

        You can move, rotate, and scale the object in the Scene view (using Unity’s Transform tools), as well as from the number boxes that appear in the Inspector panel.

        Each GameObject can be configured from the Inspector window. Properties such as position, rotation, size are set here. If the GameObject includes materials or additional properties, they can all be accessed from here as well.

  • Assignment 3 – Paradigms of A(I)rt

    (medium: agnostic | work individually OR groups of two/three)

    • Prompt: For the final project, you are given more freedom to explore a topic of your own choosing related to synthetic media. You may choose to create a new generative model using machine learning techniques, implement a creative application of synthetic media for a specific purpose, or design an interactive experience that showcases the potential of synthetic media in a philosophical, sociopolitical, or artistic context. You are encouraged to incorporate the concepts and techniques covered throughout the course and are expected to produce a significant output, such as an immersive experience, an interactive installation, or an audiovisual composition. The final project will be presented in a public showcase/exhibition, allowing all of you to share your work with broader audiences.
S9. Integration with Unity (textures, video, 3D)
  • Unity Basics

    A review of Unity Hub, the Unity Editor layout, and primitives – the same material covered in the S8 homework above.

  • Adding Videos (render textures)

    In Unity, a Render Texture is a special type of texture that is created and updated at runtime. It allows you to render the output of a video (or camera) to a texture, which can then be used as the input for another camera, a material, or even a GUI element.

    In this example, we want to use a surface of a 3D model to display the contents of a video file. A render texture will help us create this effect. To do this we need to complete the following steps:

    • Create a Video Player object – A Video Player can be added to our project by selecting GameObject / Video / Video Player. This will add a new video object on the Hierarchy panel. From the Inspector we have the option to load a video file to the object (the video file has to be already added in our Assets folder; .mp4 file format is strongly suggested).

    • Create and assign a Render Texture – A Render Texture is a type of texture that Unity creates and updates at run time. To use a Render Texture, you first need to create a new Render Texture asset in the Assets panel. This can be done either by right-clicking in the Assets folder, or by selecting in the main menu: Assets / Create / Render Texture. When the texture is created, we can open its Inspector and set its size to the value needed (e.g. SD, HD, etc.). After this is set, go back to the Video Player and set its Target Texture option to the render texture we have just created.

    • Create a Geometry & Material – In this instance we want the video to appear on a geometry: a 3D object with a material that displays the video frames. To complete this task, we can create a new 3D GameObject (e.g. a plane) and position it where needed. Finally, we have to create a new material, as this will give the 3D object its final surface look. Here, we need to assign the Render Texture we have just created to the material’s Base Map / Albedo setting. To do this, drag the Render Texture file from the Assets folder onto the little square next to the Base Map / Albedo label inside our new material.
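    The same three steps can also be wired up in a script. The following sketch (the class and field names are illustrative; it assumes a VideoClip and a Render Texture asset assigned in the Inspector) connects a Video Player to a Render Texture and applies it to a 3D object’s material:

    using UnityEngine;
    using UnityEngine.Video;
    
    public class VideoOnSurface : MonoBehaviour
    {
        public VideoClip clip;              // assign an .mp4 imported into Assets
        public RenderTexture renderTexture; // assign the Render Texture asset
    
        void Start()
        {
            // Step 1: create and configure the Video Player
            VideoPlayer player = gameObject.AddComponent<VideoPlayer>();
            player.clip = clip;
    
            // Step 2: send the video frames to the Render Texture
            player.renderMode = VideoRenderMode.RenderTexture;
            player.targetTexture = renderTexture;
    
            // Step 3: show the Render Texture on this object's material
            // (equivalent to dragging it onto the material's Base Map / Albedo slot)
            GetComponent<Renderer>().material.mainTexture = renderTexture;
            player.Play();
        }
    }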

S10. Integration with Unity (interaction, shaders, VFX)
  • Scripting with C#

    Scripting in Unity refers to the process of creating code to control the behavior and functionality of GameObjects in a Unity scene. Unity supports scripting in C#; older versions also supported UnityScript (a variant of JavaScript) and Boo (a language similar to Python), but both have been deprecated.

    Scripts in Unity are attached to GameObjects as components and contain code that modifies the GameObject’s behavior and properties in response to events in the game. A script can be used to handle user input, control animations, manage physics, and perform many other tasks.

    To use a script in Unity follow this simple flow:

    1. Create a new script: To create a new script in Unity, right-click in the Assets panel and select Create > C# Script.
    2. Attach the script to a GameObject: Drag the script from the Assets panel onto the GameObject in the Scene or Hierarchy panel. The script will appear as a component in the Inspector panel.
    3. Steps 1 and 2 can also be done directly from within the Inspector of a GameObject. Click on the GameObject in the Hierarchy panel, and in the Inspector click on Add Component > New Script.
    4. Write code: The created script appears both in the Inspector window of the GameObject we have attached it to, and in our Assets folder in the Project window. Double-clicking the file either from the Project window or from the GameObject’s Inspector opens the script in Visual Studio Code, where we can edit and save it.
    5. Update the code and run: After we edit the code and save it, we need to run the program, during which the code will execute. The GameObject‘s behavior and properties are updated according to the code we have written.

    In the following example, we attach a new script to a Prefab. Our objective is to write an instruction that continuously rotates the GameObject around its axis. The code for the rotation of the GameObject around its X,Y,Z axis is as follows:

    // Imports
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    
    // Name of our newly created script (Rotation.cs)
    public class Rotation : MonoBehaviour
    {
        // Public variables (these appear in the Inspector)
        public float rotationSpeedX = 10f;
        public float rotationSpeedY = 10f;
        public float rotationSpeedZ = 10f;
    
        // Start is called before the first frame update
        void Start()
        {
        }
    
        // Update is called once per frame
        void Update()
        {
            // transform.Rotate is a method for controlling GameObject rotation
            transform.Rotate(Vector3.right * Time.deltaTime * rotationSpeedX);
            transform.Rotate(Vector3.up * Time.deltaTime * rotationSpeedY);
            transform.Rotate(Vector3.forward * Time.deltaTime * rotationSpeedZ);
        }
    }

    In this example, the rotationSpeedX, rotationSpeedY, and rotationSpeedZ variables determine the speed of the rotation along the x, y, and z axes, respectively. The transform property refers to the Transform component of the GameObject, which contains its position, rotation, and scale.

    The Rotate method rotates the GameObject around the specified axis by the specified angle. In this case, Vector3.right represents the x-axis, Vector3.up represents the y-axis, and Vector3.forward represents the z-axis. The rotation angle is determined by Time.deltaTime * rotationSpeed, where Time.deltaTime is the time in seconds since the last frame, and rotationSpeed is the speed of the rotation in degrees per second.

    This script will continuously rotate the GameObject around its x, y, and z axes at the specified rotation speeds. You can adjust the rotationSpeedX, rotationSpeedY, and rotationSpeedZ values to control the speed of the rotation along each axis from the Inspector panel of the GameObject.

    Scripts can be attached to Prefabs as well. In the following example, a Prefab has been created that includes multiple cubes nested together. The previous script has been adjusted to generate random rotation speed values for each cube. Therefore, upon the initialization of the code, each cube will be controlled with a different rotational speed. The randomization of the values is done with the use of the Random.Range function.

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    
    public class RotationRandom : MonoBehaviour
    {
        public float rotationSpeedX = 0f;
        public float rotationSpeedY = 0f;
        public float rotationSpeedZ = 0f;
    
        // Start is called before the first frame update
        void Start()
        {
            //Generate random values for rotational speed (X,Y,Z);
            rotationSpeedX = Random.Range(-60.0f, 60.0f);
            rotationSpeedY = Random.Range(-60.0f, 60.0f);
            rotationSpeedZ = Random.Range(-60.0f, 60.0f);
        }
    
        // Update is called once per frame
        void Update()
        {
            transform.Rotate(Vector3.right * Time.deltaTime * rotationSpeedX);
            transform.Rotate(Vector3.up * Time.deltaTime * rotationSpeedY);
            transform.Rotate(Vector3.forward * Time.deltaTime * rotationSpeedZ);
        }
    }


    Unity’s Scripting Manual and Scripting API reference provide comprehensive details regarding scripting in Unity.

    In the next class, we will focus more comprehensively on coding, and generative art & design.

    • Iterations (For-Loops)

      A for loop is a control structure in programming that allows you to execute a block of code a specified number of times. It is defined with a starting value, a limit, and an increment value (step), and executes the code within the loop until the limit is reached.

      In Unity, for loops can be used to perform repetitive tasks, such as initializing an array, updating elements in a list, or creating multiple instances of a GameObject.

      Here’s an example of how a for loop can be used to initialize an array in Unity.

      using UnityEngine;
      
      public class forLoops : MonoBehaviour
      {
          // Start is called before the first frame update
          void Start()
          {
              int[] numbers = new int[10];
              for (int i = 0; i < 10; i++)
              {
                  numbers[i] = i;
                  Debug.Log(numbers[i]);
              }
          }
      }

      In this example, the for loop starts with the value of i being 0, and continues to loop until i reaches 9 (the limit of 10 – 1). Each time the loop iterates, i is incremented by 1 (the increment value). The code inside the loop sets the value of the numbers array at the index i to i (0, 1, 2, …, 9).


      • Code Example: 1D For-Loop

        The following example demonstrates a for loop used to instantiate multiple instances of a prefab.

        using UnityEngine;
        
        public class PrefabInstantiator : MonoBehaviour
        {
            public GameObject prefab;
        
            void Start()
            {
                for (int i = 0; i < 10; i++)
                {
                    Vector3 position = new Vector3(i * 2, 0, 0);
                    Instantiate(prefab, position, Quaternion.identity);
                }
            }
        }

        Here, the for loop starts with the value of i being 0, and continues the loop until i reaches 9 (the limit of 10 – 1). Each time the loop iterates, i is incremented by 1 (the increment value). The code inside the loop calculates a new position based on the value of i and uses the Instantiate function to create a new instance of the prefab at that position.

      • Code Example: 2D For-Loop

        Nested loops in Unity allow you to loop through multiple dimensions and instantiate prefabs in a grid-like or matrix-like pattern. For example, you could use nested loops to create a grid of prefabs, with each prefab representing a cell in the grid. Here’s an example of how you can use nested loops to instantiate prefabs in a 5×5 grid pattern:

        using UnityEngine;
        
        public class PrefabInstantiator : MonoBehaviour
        {
            public GameObject prefab;
        
            void Start()
            {
                for (int i = 0; i < 5; i++)
                {
                    for (int j = 0; j < 5; j++)
                    {
                        Vector3 position = new Vector3(i * 2, 0, j * 2);
                        Instantiate(prefab, position, Quaternion.identity);
                    }
                }
            }
        }

        In this example, the outer for loop starts with the value of i being 0, and continues to loop until i reaches 4 (the limit of 5 – 1). The inner for loop starts with the value of j being 0, and continues to loop until j reaches 4. Each time the inner loop iterates, j is incremented by 1. The code inside the inner loop calculates a new position based on the values of i and j and uses the Instantiate function to create a new instance of the prefab at that position.

      • Code Example: 3D For-Loop

        The same logic can be used for creating three-dimensional nested loops (see code below).

        using UnityEngine;
        
        public class forLoopsPrefab3D : MonoBehaviour
        {
            public GameObject prefab;
        
            void Start()
            {
                for (int i = 0; i < 5; i++)
                {
                    for (int j = 0; j < 5; j++)
                    {
                        for (int k = 0; k < 5; k++)
                        {
                            Vector3 position = new Vector3(i * 2, k * 2, j * 2);
                            Instantiate(prefab, position, Quaternion.identity);
                        }
                    }
                }
            }
        }

        The code creates an instance of a prefab in a 3D grid pattern. It does this by using three nested for-loops that generate a grid of prefab instances.

        • The first for-loop (int i = 0; i < 5; i++) will run 5 times, with the value of i incrementing from 0 to 4 each time the loop runs.
        • The second for-loop (int j = 0; j < 5; j++) will also run 5 times, with the value of j incrementing from 0 to 4 each time the loop runs.
        • The third for-loop (int k = 0; k < 5; k++) will also run 5 times, with the value of k incrementing from 0 to 4 each time the loop runs.

        At each iteration of these three loops, a new Vector3 position is calculated by multiplying i, j, and k by 2 (position = new Vector3(i * 2, k * 2, j * 2)). Then, a new instance of the prefab is created at this calculated position using the Instantiate() method Instantiate(prefab, position, Quaternion.identity).

        This will result in a grid of 125 instances of the prefab, with each instance being placed 2 units apart from each other.

      • Code Example: Adding Instances as Children

        One additional modification that you can make to this example is to parent all new instances under the main prefab GameObject, so that it is easier to control all instances at once. This can be accomplished by adding a fourth argument to the Instantiate function, so that it reads as below:

        Instantiate(prefab, position, Quaternion.identity, transform)

        Passing transform as the fourth argument parents each newly created instance under the GameObject that runs the script (named Prefab3D-Rotate in the following image). Now, if we want to automate the behavior of this parent (Prefab3D-Rotate), we can add an additional script (Rotation.cs) that rotates the parent (and its children, that is, the prefab instances).
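        Applied to the earlier 3D loop, the parented version looks as follows (a sketch; the class name is illustrative and the behavior is otherwise unchanged):

        using UnityEngine;
        
        public class forLoopsPrefab3DParented : MonoBehaviour
        {
            public GameObject prefab;
        
            void Start()
            {
                for (int i = 0; i < 5; i++)
                {
                    for (int j = 0; j < 5; j++)
                    {
                        for (int k = 0; k < 5; k++)
                        {
                            Vector3 position = new Vector3(i * 2, k * 2, j * 2);
                            // The 4th argument parents each instance to this GameObject
                            Instantiate(prefab, position, Quaternion.identity, transform);
                        }
                    }
                }
            }
        }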

      • Code Example: Rotating the Parent

        Finally, if we add a random rotation script within the original prefab (which exists on the Project window), we will get a much more organic behavior.

  • VFX
    • Introduction to the VFX Graph

      The VFX Graph in Unity is a tool that allows developers and artists to create complex visual effects in real time. It is a node-based system that uses a series of connected nodes to create particle systems and other types of visual effects.

      The VFX Graph allows users to create effects with a high level of detail, including dynamic lighting, physics-based motion, and custom shaders. It also includes a range of pre-built templates and effects that can be easily modified and customized to suit individual needs.

      Some key features of the VFX Graph include support for GPU instancing, which allows for the efficient rendering of large numbers of objects, and the ability to create custom nodes and effects using C# code. It also includes support for the HDRP (High Definition Render Pipeline) and URP (Universal Render Pipeline), which are Unity’s rendering pipelines designed to support high-quality graphics and real-time rendering.

      Overall, the VFX Graph is a powerful tool for creating high-quality visual effects in Unity, and its node-based approach makes it accessible to developers and artists of all levels of experience.

    • Installations

      The VFX Graph works with the High Definition Render Pipeline (HDRP) and it also supports materials and shaders (also compatible with the Shader Graph). VFX Graph operates on the GPU, which allows it to have a much more efficient performance in contrast to other particle systems that are rendered on the CPU.

      Package Manager & Settings

      VFX Graph comes together with the installation of an HDRP project in Unity (2021). If a project is in URP, the VFX Graph can be installed via the Package Manager.

      Additional sample content can be found in the package Visual Effect Graph Additions, under Samples in the Package Manager (ensure that the Packages scope is set to Unity Registry).

      After installation, the additions can be found under Assets/Samples/VisualEffectGraph.

      💡
      Another configuration that may be useful at times is activating the Experimental Features of the VFX Graph. You can enable these features via Preferences > Visual Effects > Experimental Operators/Blocks, as shown below.

      VFX Asset & GameObject

      A VFX Graph is a file that can be created and stored in our Assets directory and can be accessed and used as any other asset. There are two main ways to add a VFX Graph to our scene:

      • Using the Assets window’s Create menu, we can select the option Visual Effects > Visual Effect Graph. This will create a new asset that can be dragged into our Scene window.
      • Using the Hierarchy window’s Create menu, we can select the option Visual Effects > Visual Effect. Following that, we need to set in the Inspector window a new graph in the option Asset Template. This will create a new VFX Graph that will be added to the Assets folder as well.

      When the VFX GameObject has been set up, we can open the graph either by double-clicking the asset file or by selecting the Edit button under Asset Template in the Inspector.

    • VFX Graph Window

      The VFX Graph Window contains the following elements:

      • Toolbar: Provides access to Global Settings, as well as to toggle several panels (sub-windows)
      • Node workspace: This is where we compose and edit the VFX Graph.
      • Blackboard: Manages exposed properties and global variables that are reusable throughout the graph.
      • VFX Control panel: Modifies the playback of the attached GameObject.

      The VFX Graph uses an interface similar to other node-based tools, such as the Shader Graph. To develop a particle system, we need to make use of the nodes that exist within this window and develop our own network of relationships.

      A user may press the spacebar or right-click to create a new graph element. With the mouse over the empty workspace, we can select Create Node to create a graph’s Context, Operator, or Property. If for example we select Create Node > System > Empty Particle System, a new system will appear in our window (same as the empty example that Unity creates for us every time we create a new VFX graph).

      It is often necessary to use the Inspector window, as it reveals various properties important for the nodes. Make sure to keep it open and visible when you are working with the VFX Graph.

    • System, Contexts, Blocks

      A standard VFX Graph consists of a vertical stack called System. A System defines standalone portions of the graph and encompasses several Contexts. Each System is denoted by the dotted line that frames the Contexts.

      For each graph we have the following elements: Spawn, Initialize, Update, and Output. Inside every Context, we find individual Blocks with independent Attributes (for size, color, velocity, etc.) for its particles and meshes.

      The flow between the Contexts determines how particles spawn and simulate, with the processing of the data happening from top to bottom (vertical logic). Each Context defines one stage of computation:

      • Spawn: Determines how many particles you should create and when to spawn them (e.g., in one burst, looping, with a delay, etc.)
      • Initialize: Determines the starting Attributes for the particles, as well as the Capacity (maximum particle count) and Bounds (volume where the effect renders)
      • Update: Changes the particle properties each frame; here you can apply Forces, add animation, create Collisions, or set up some interaction, such as with Signed Distance Fields (SDF)
      • Output: Renders the particles and determines their final look (color, texture, orientation); each System can have multiple outputs for maximum flexibility.

      Blocks are nodes that define the behavior of a Context. You can create and reorder Blocks within a Context and, when Unity plays a visual effect, Blocks execute from top to bottom. You can use Blocks for many purposes, from simple value storage (for example, a random Color) to high-level complex operations such as Noise Turbulence, Forces, or Collisions.

      💡
      When the mouse pointer is on the Context element, you may press Space (on your keyboard) to have the blocks menu appear. The menu will show all available blocks that you can use within this context.

    • VFX Example 1

      This example demonstrates how to create a basic particle system in which its motion is affected by turbulence and vector force fields.

      Spawn & Initialize

      The first step we need to take is to set the Rate of the Spawn context, in this case to 1000. This value is the constant spawn rate that the particle system is going to use. Set the Capacity of the Initialize context to a similar value.

      Initially, the Spawn Rate and the Initialize Capacity may seem similar (if not the same), but they are not:

      • Spawn Rate: How many particles we spawn (generate) per second; e.g. a Constant Spawn rate will continuously create the set amount of particles.
      • Capacity: The maximum number of particles that can be alive in the system at once; once capacity is reached, no new particles spawn until older ones die.
      💡
      Use the Spawn Rate and the Initialize Capacity carefully. Very large numbers will slow down your application and/or crash the computer. Start with reasonable values (for example, 100 or 1000), and adjust gradually.

      [Block] Set Velocity Random: This block randomizes the velocity of the particles for the X, Y, and Z axes. The values A and B refer to the minimum and maximum values within which the system will generate a random value and assign it to each particle. In our example then, all 1000 particles will instantly get a random velocity value (that is why the particles that appear in the Scene view all have different motion speeds).

      [Block] Set Lifetime Random: This block randomizes a float value (within a minimum and maximum range) and assigns the result to the lifetime of each particle. In this example, each particle will have a life span between 2 and 30 seconds.

      Output

      In the VFX Graph, we have multiple options for an Output, that is, the settings we provide for the look and final properties of the particle system. In this example, we are using an Output Particle Quad. This Output context includes options for adding a shader with Shader Graph, Texture, UVs, and Blend Mode, which allow us to set the basic appearance of our particles.

      [Block] Orient Face Camera Plane: The Orient Block makes particles orient themselves to face a certain way. With the Face Camera Plane mode, the particles orient to face forward towards the camera plane. This mode works for most use cases; however, the facing illusion can sometimes break at the edges of the screen.

      [Block] Set Size over Life: This block allows us to control the size of the particle from birth to death. For example, we can set the size to a value of 1 at position 0 (birth) and a value of 0.1 at position 1 (death). The particle will then start at its maximum size and gradually shrink to 1/10th of its original size over its lifetime.

      [Block] Set Color over Life: With this block, we can control the color of the particle from birth to death. The block allows us to create color gradients: we define the color that the particle has when it first appears in the system, and it gradually moves from one color value to the next as it approaches its death (in this instance from pink, to light blue, and then to transparent).

      Update

      So far in this example, the particles move in a linear way, from the center outwards until they fade out. In the Update context, we can add blocks that apply more sophisticated functions to the particle system. For example, we can add various physics simulations to alter the way the particle system behaves.

      [Block] Turbulence: The Turbulence block generates a noise field which it applies to the particles’ velocity. This Block is useful for adding natural-looking movement to particles. For more information on the types of noise, see the Value Curl, Perlin Curl, and Cellular Curl noise Operators.

      [Block] Vector Force Field: The Vector Force Field block uses a vector field to apply a force to the particles. This Block is useful for applying specific forces created in advance and stored in vector field assets.

      Further development practices:

      1. Adjust the number of particles (with caution! start from low values, like 500).
      2. Add a new block to the Initialize context.
      3. Change the turbulence and vector force field settings.
      4. Create and assign a custom color gradient.
      5. Change the particle texture to a custom image (consider a low- to mid-size .PNG with transparency).
        Example result

    • Properties & Operators

      Properties

      Properties are editable fields that you can connect to graph elements using the property workflow. They can be found on graph elements such as Contexts, Blocks, and Operators. Properties in the VFX Graph can be of any type, including the following:

      • Boolean
      • Integer
      • Float
      • Vectors
      • Textures
      • Animation Curves
      • Gradient

      Properties that are made of multiple components (such as Vectors, or Colors) can display every component individually in order to connect these to other properties of compatible type. Use the arrow next to the property to unfold the components.

      Operators

      Just as Systems form much of the graph’s vertical logic, Operators make up the “horizontal logic” of its property workflow. They help you pass custom expressions or values into your Blocks.

      Operators flow left to right, akin to Shader Graph nodes. You can use them for handling values or performing a range of calculations. Use the Create Node menu (right-click or press the spacebar) to create Operator Nodes. These Operators from the Bonfire sample, for instance, compute a random wind direction.

      Uniform Operators are Nodes that accept a single input of a variable type. For example, you can take the absolute value of a float, a Vector3, or an Integer. The output type of any Uniform Operator is always the same as its input type; connecting a new input of a different type will automatically change the output type of the Operator.

      In addition to the Uniform Operators, some Operators can handle multiple inputs of variable types. These Nodes are called Unified Operators; e.g., the Lerp Operator can interpolate between two Vectors either uniformly, based on a float, or per component, using a Vector of the same length.

    • Blackboard & Exposed Properties

      The Blackboard is a utility panel that manages Global properties, which can appear multiple times throughout the graph as Property Nodes. Properties in the Blackboard are either:

      • Exposed: The green dot to the left of any Exposed Property indicates that you can see and edit it outside of the graph. Access an Exposed Property in the Inspector, or via script using the ExposedProperty class.
      • Constant: A Blackboard property without a green dot is a Constant. It is reusable within the graph but does not appear in the Inspector.
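      Exposed Properties can also be driven at runtime from a C# script through the VisualEffect component. The following is a minimal sketch (the property names “Rate” and “Use Colormap” are illustrative and must match the names in your Blackboard):

      using UnityEngine;
      using UnityEngine.VFX;
      
      public class VFXPropertyDriver : MonoBehaviour
      {
          void Start()
          {
              VisualEffect vfx = GetComponent<VisualEffect>();
      
              // Set exposed properties by their Blackboard names,
              // checking first that each property exists on the graph
              if (vfx.HasFloat("Rate"))
                  vfx.SetFloat("Rate", 500f);
      
              if (vfx.HasBool("Use Colormap"))
                  vfx.SetBool("Use Colormap", true);
          }
      }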

      To create a Property Node:

      • Drag the Node from the Blackboard Panel into the Workspace.
      • To convert a Property Node to an Inline Node of the same type, right-click the property Node and select Convert to Inline.
      • When you delete a property from the Blackboard, Unity also deletes its property Node instances from the graph.

    • VFX Example 2

      In this example, we will see how to use properties, operators, and the blackboard.

      Spawn & Initialize

      First, the Spawn Context is set to a Constant Rate of 10000.

      The Initialize Particle has a Capacity of 10000. The Set Lifetime Random is set to 2 and 10.

      For the Set Position, we use a Shape: Arc Sphere, with Position Mode to Surface, and Spawn Mode to Random.

      Operators

      For the Set Color, we use a set of operators that allow for two behaviors: the first lets us select a texture and randomly sample pixel values from it to control the color of the particles; the second creates an autonomous colorization using hue rotation.

      The Coloring network uses the following nodes:

      • Texture2D: Exposed property that appears on the Blackboard, and we can access it from the Inspector window.
      • Use Colormap: Exposed property that appears on the Blackboard, and we can access it from the Inspector window.
      • Periodic Total Time: Returns a recurring lapse of time every N seconds, within a given range.
      • Sample Texture2D: Samples a texture at a given position and returns the corresponding value.
      • HSV to RGB: Converts an HSV vector to an RGB value.
      • Branch: Selects one branch or another based on a boolean condition.

      Here, the main idea is that based on the conditions set by Use Colormap, which is a boolean that appears on the Inspector, we can set the color of the particle system either based on an image/texture, or using an automatic rotation (through HSV to RGB).

      Update & Output

      The Update context uses a Turbulence block (same as Example 1). Regarding the Output, we have made the following adjustments:

      • Set Size over Life: Here, the size of the particle starts from 0, grows to 0.5 at mid-life, and returns to 0 when the life of the particle is over.
      • Multiply Color over Life: This block is used to control the color fade of the particle as it approaches its death. As long as we multiply the color values by white, all original values assigned previously stay intact. However, as we gradually multiply by darker values toward black, the original pixel values are reduced accordingly (multiplying a pixel value by 0 results in 0).
      • Multiply Size: In a similar way, we multiply the size of all particles by a variable. According to the value of the block, all particles scale up/down uniformly.

      The result of this process appears below. From the Inspector, we can select a new texture from which the new particles will extract their pixel color values. Also, if we set the exposed property Use Colormap to true, the particle system uses a continuous rotation of color values.

  • Proposal for Assignment 3 (Deadline Wednesday 16 April, 1pm)

    Complete this form with the details of your project proposal

  • Homework (21 April)

    First Draft of Assignment 3 – Template, Structure, Assets (we will review in class via 1:1 meetings)

S11. Assignment 3: Drafts & Resources

S12. Assignment 3: First Iteration

S13. Assignment 3: Second Iteration
S14. Final Presentations & IMA Show

Course Readings / References
  • Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
  • Kissinger, H. A., Schmidt, E., & Huttenlocher, D. (2021). The Age of AI: And Our Human Future. Little, Brown and Company.
  • Lee, K.-F., & Qiufan, C. (2021). AI 2041: Ten Visions for Our Future. Currency.
  • Miller, A. I. (2019). The Artist in the Machine: The World of AI-Powered Creativity. The MIT Press.
  • Manovich, L., & Arielli, E. (2023). Artificial Aesthetics: A Critical Guide to AI, Media, and Design.
  • Sautoy, M. D. (2019). The Creativity Code: Art and Innovation in the Age of AI. Harvard University Press.