DomoNovus Artifacts

Case Studies for the New Home

Speculative Research Artifacts, Doctoral Research (UK, CN)

Speculative Design, 3D Printing, Robotics, Sensing Ecologies, Microelectronics, Video, Generative AI
2025

DomoNovus Artifacts is a collection of speculative design cases that frame a radical transformation of domestic space through computational intelligence and adaptive technologies (DomoNovus, the research project they belong to, is contextualized here). The artifacts visualize an alternative present (or anticipated future) in which homes evolve beyond static shelters into living, breathing organisms that learn, adapt, and grow alongside their inhabitants. Through this symbiotic relationship, DomoNovus reimagines the dwelling as an active participant in daily life, creating personalized environments that oscillate between physical and digital, individual and collective, present and remembered experiences.

DomoNovus contains integrated 3D printing systems that fabricate furniture and architectural elements on demand. Using materials like plastic, wood, metal, and glass, the house responds to real-time conditions. In this open-source ecosystem, furniture stores become filament selections and home accessories are downloadable files shared across online communities. Objects exist both physically and digitally, enabling continuous recycling and adaptation. The house documents every transformation, creating a living archive of how domestic space evolves over time.

The MemCore uses an array of depth cameras and spatial-audio microphones integrated throughout the living space to create comprehensive media captures of daily life. Each sensor unit combines scanning with stereoscopic video to map environments and activities in real time. When the system’s intelligence units detect significant moments, such as elevated voices, unusual activity patterns, abnormally large gatherings of people, or preset trigger events, it automatically saves its full volumetric recordings to the central storage hub. These captures can be viewed through specialized headsets and AR glasses, allowing inhabitants to walk through recorded memories as three-dimensional holograms projected into their field of view.
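The trigger logic described above can be sketched as a simple decision over fused sensor readings. This is an illustrative sketch only: the threshold names, values, and event tags are assumptions, not part of the MemCore design.

```python
from dataclasses import dataclass

# Illustrative thresholds; these names and values are assumptions,
# not MemCore internals.
VOICE_DB_THRESHOLD = 70.0      # elevated voice level (dB)
OCCUPANCY_THRESHOLD = 4        # unusually large gathering
PRESET_EVENTS = {"birthday", "doorbell", "alarm"}

@dataclass
class SensorFrame:
    voice_db: float        # peak voice level in this frame
    occupant_count: int    # people detected by the depth cameras
    event_tags: set        # preset trigger events fired this frame

def should_capture(frame: SensorFrame) -> bool:
    """Decide whether to persist the volumetric recording buffer."""
    return (
        frame.voice_db >= VOICE_DB_THRESHOLD
        or frame.occupant_count >= OCCUPANCY_THRESHOLD
        or bool(frame.event_tags & PRESET_EVENTS)
    )
```

In practice such a gate would sit between a rolling capture buffer and the central storage hub, so only flagged intervals are archived.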

NeuroBalance analyzes biometric data from bathroom and kitchen sensors to create personalized chemical compounds for each inhabitant. The automated synthesis unit combines vitamins, nootropics, and mood-regulating molecules based on real-time health markers, dispensing cognitive enhancers, serotonin precursors, or calming compounds for stress peaks. These custom formulations are delivered through morning beverages or absorbed during therapeutic baths, maintaining optimal mental clarity and emotional balance through precision biochemistry tailored to each person’s immediate needs (the domestic space as nurse, doctor, and therapist).
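The marker-to-compound mapping could be sketched as a rule set over a few biometric signals. All marker names, cutoffs, and compound labels below are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative mapping from real-time health markers to compound
# classes NeuroBalance might dispense; marker names and cutoffs are
# assumptions, not the project's actual formulation logic.

def select_compounds(hrv_ms: float, cortisol_nmol: float,
                     focus_score: float) -> list:
    """Pick compound classes for one dispensing cycle."""
    compounds = []
    if cortisol_nmol > 550 or hrv_ms < 30:       # stress peak detected
        compounds.append("calming_adaptogen")
    if focus_score < 0.4:                         # low cognitive clarity
        compounds.append("cognitive_enhancer")
    if not compounds:                             # nothing flagged
        compounds.append("baseline_multivitamin")
    return compounds
```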

The HoloSense Interface displays volumetric projections when physical presence triggers it. Light-field technology and diffractive waveguides render the imagery with multi-plane depth and occlusion so that graphics stay stable as the user moves. Ultrasonic arrays sculpt mid-air “touch,” letting inhabitants sense clicks and textures through focused acoustic pressure. The system receives emotional state reports for each identified user and dynamically adjusts the presented content by softening colors during stress, amplifying haptic warmth during isolation, or orchestrating full sensory immersion with synchronized light, sound, and touch patterns.
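The state-driven adaptation could be modeled as a lookup from a reported emotional state to presentation parameters. The state names and parameter values here are illustrative assumptions, not HoloSense internals.

```python
# Sketch of HoloSense-style content adaptation: a reported emotional
# state selects display and haptic parameters. All names and values
# are assumptions for illustration.

def adapt_presentation(state: str) -> dict:
    presets = {
        "stress":    {"color_saturation": 0.4, "haptic_warmth": 0.5, "immersion": 0.3},
        "isolation": {"color_saturation": 0.7, "haptic_warmth": 0.9, "immersion": 0.6},
        "neutral":   {"color_saturation": 0.7, "haptic_warmth": 0.5, "immersion": 0.5},
    }
    # Unknown states fall back to the neutral preset.
    return presets.get(state, presets["neutral"])
```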

GForge is a closed-loop food fabrication system in which a nutrition-aware AI formulates intelligent ingredients and prints meals directly. Drawing on bio-sensing (HRV, glucose proxies, activity, microbiome profiles), pantry inventories, and taste history, the model composes nutrient targets and transforms base stocks (protein isolates, plant gels, mycelial slurries, micro-encapsulated vitamins, probiotic cultures) into printable matrices with tuned rheology and flavor release. Real-time spectroscopy verifies amino acid balance, lipid quality, and allergens, while a micro-fermentation bay develops custom umami and aroma precursors on demand.
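Composing nutrient targets from bio-sensing inputs could be sketched as below. Every coefficient and name is an assumption for illustration (e.g. ~1.6 g protein per kg body weight per day, split across three meals), not the project's actual model.

```python
# Hypothetical sketch of deriving per-meal macronutrient targets from
# bio-sensing inputs; all coefficients are illustrative assumptions.

def meal_targets(weight_kg: float, activity_kcal: float,
                 glucose_rising: bool) -> dict:
    """Return gram targets for one of three daily meals."""
    protein_g = round(1.6 * weight_kg / 3, 1)        # ~1.6 g/kg/day over 3 meals
    meal_kcal = (22 * weight_kg + activity_kcal) / 3  # rough daily energy, split
    remaining = meal_kcal - protein_g * 4             # protein: 4 kcal/g
    carb_share = 0.35 if glucose_rising else 0.55     # back off carbs if glucose trends up
    return {
        "protein_g": protein_g,
        "carbs_g": round(remaining * carb_share / 4, 1),       # carbs: 4 kcal/g
        "fat_g": round(remaining * (1 - carb_share) / 9, 1),   # fat: 9 kcal/g
    }
```

A printer controller would then translate these targets into deposition ratios of the base stocks.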

DataGrid is a multi-depth environmental sensing system that continuously scans the home’s air and microclimate with research-grade detail. Fast sensors track particulates (PM₁/₂.₅/₁₀), CO₂, and reactive gases (NO₂/O₃/CO), while an electronic-nose VOC array fingerprints chemical patterns; periodic deep scans use micro-GC/FTIR (and optional PTR-MS/IMS) to speciate off-gassing, spills, and solvents. Fixed long-wave infrared cameras map heat leaks, appliance anomalies, and surface safety, joined by temperature, humidity, pressure, airflow, sound, and illuminance sensors for a full comfort profile. Edge fusion converts these streams into actionable features (chronic trends, thermal anomalies), triggering ventilation, filtration-grade shifts, ERV damper changes, and maintenance tickets, while tagging the 3D memory of rooms with contextual annotations. All capture is consent- and privacy-scoped: redaction for guests, local storage by default, and independent hardwired alarms for life-safety events.
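The edge-fusion step, mapping fused readings to actions, can be sketched as a small rule engine. The thresholds and action names below are illustrative assumptions (loosely inspired by common indoor air-quality guidance), not DataGrid internals.

```python
# Minimal sketch of DataGrid-style edge fusion: fused sensor readings
# are mapped to ventilation/filtration/maintenance actions. Thresholds
# and action names are assumptions for illustration.

def fuse_and_act(pm2_5_ugm3: float, co2_ppm: float,
                 voc_index: float, surface_temp_c: float) -> list:
    actions = []
    if pm2_5_ugm3 > 35 or voc_index > 250:    # particulate or VOC spike
        actions.append("raise_filtration_grade")
    if co2_ppm > 1000:                         # stale air: bring in fresh supply
        actions.append("open_erv_damper")
    if surface_temp_c > 70:                    # LWIR thermal anomaly on a surface
        actions.append("file_maintenance_ticket")
    if not actions:
        actions.append("nominal")
    return actions
```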

Exhibitions

  • ISEA 2016
  • Ars Electronica 2019

Funding & Support

  • Alexander S. Onassis Public Benefit Foundation
  • DeTao Masters Academy (Shanghai Institute of Visual Arts)
  • Institute of Digital Arts & Technology, University of Plymouth