🚐Nomad Dreamscapes . creator files


Where Nature and Technology Collide.

🚐 NOMAD Dreamscapes > Redreaming our trips of reality > Where the physical space meets the Infinite ♾️

effort . strength . attitude . discipline . power . unity . drive

marea + webcam node + MediaPipe gestures:
https://drive.google.com/drive/folders/1DzyTwmkXfXk1AwXIAjSmWc0iCmo2VvEv?usp=sharing

NOMAD Reskinning 🇦🇷🌎

Creators as our Art Directors: "Creator" Clips 🎬🚐

Standing as an artist and a player of creative software 🎮🎨: when I think of StreamDiffusion + Daydream...
I think of my own movies > it is about using parameters to mold and sculpt our piece of art.

because this track is for +++ Artists +++, and because the aesthetic eye is what defines us, allowing us to craft and share these unique screens created with Daydream

Imagine creating your environment, moving through it, and dyeing it with your dreams.

A gift for you: some clips from my latest journeys in NOMAD Dreamscapes <3

Status Update IV : mapping and workflow files 🚐💻

Well, the pipeline has finally reached the mapping stage: controlling StreamDiffusion through different ports in Daydream, and handling Advanced Outputs via MIDI controllers or Layer Masks > in this case, a quick branded grid that mixes Daydream and Nomad Dreamscapes.

The core idea behind this AI program was to master this specific workflow, learn about inputs, and think in artistic screens
and.....

What about the Output? 🔩♾️🌞

Starting with a Three.js environment that allows us to move within our input and generate unique POVs, ensuring our journey is a true traversal from "second zero".
> By texturizing it with our own photo references and mixing it with audio-reactive and environment plugins, we achieved a LIVE mapping of our file.
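The post itself has no code, but the traversal idea above can be sketched in a few lines of Python. The function and parameter names here are illustrative only; in the real project this logic would live inside the Three.js scene:

```python
import math

def orbit_povs(radius, height, steps):
    """Camera positions on a circle around the scene origin, each looking
    at the center -- a scripted traversal "from second zero"."""
    povs = []
    for i in range(steps):
        angle = 2 * math.pi * i / steps
        povs.append({
            "position": (radius * math.cos(angle), height, radius * math.sin(angle)),
            "look_at": (0.0, 0.0, 0.0),
        })
    return povs

path = orbit_povs(radius=10.0, height=2.0, steps=8)
# each entry could feed a Three.js camera (camera.position.set / camera.lookAt)
```

Each generated POV is just a position and a look-at target, which is all a camera rig needs to replay the traversal deterministically.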

I am thrilled to share the methodology behind these connections. To become and live as creators means using tools, just as our ancestors did to forge and shape our reality 🪴

In this instance, we map screens through our visual and gaming journey, using TouchDesigner and StreamDiffusion to generate the textures and colors.

<and as the main premise>

to meticulously care for the art and aesthetics of the screens and the precise blending of layers.
We are building a living system fueled by the creator, their ideas, and infinite possibilities.

<3 mapping for this AI program

Mapping and Records : IRL videos 🚐💻

Remote Presence: we are currently building a modular immersive system for living spaces, designed for the digital nomad lifestyle.
It transforms a physical space (in this case, the cockpit of a motorhome) into a dynamic live environment that reacts and evolves with the traveler's journey.

It leverages Daydream & StreamDiffusion in TouchDesigner to generate infinite visuals derived from the user's daily reality: a prototype for a "Digital Nomad Interface," bridging the gap between wild environments and high-end cloud computing.

🌞 By day, we collect REALITY: photos of landscapes, textures, emotions, and sounds.
By night, we use our system's .toe project to reinterpret these memories, projecting an infinite world from inside our cabin.
This project explores the concept of "The Beauty of Nodes": using technology to curate our immediate surroundings.

Using our space as a proxy for any other, and powered by real-time texturizing with SD + TD, this isn't just video playback or a pre-rendered loop > it is a "Redreaming of Reality".


We use gestures🙋🏻‍♂️🙆🏻‍♂️  to sculpt our environment like an ARTIST: a wave of the hand turns a desert landscape into a particle storm;  🏖️🌄🏜️🗻a pinch between fingers increases the intensity of the nostalgia, flooding the room with water. It is about building a safe, yet ever-changing home, no matter where the road takes us.  


Technical Workflow  🎨⚙️ Input > Process > Output

> Inputs (The Senses) 🙋🏻‍♂️🖖🏻

  • Visual Sampling _day: the photos we take during the day define the night. Examples like "Patagonian Pines," "Red Clay Walls of the Desert," or "Mountain Snow" serve as style references for our neural network. The system "remembers" the day to generate the night.
  • Audio-Reactive Atmosphere _day > creating a synchronized audiovisual ecosystem 🤘: driven by the playlist that made our hearts beat during the trip, we perform real-time analysis on kicks, snares, lows, and highs, transforming the atmosphere into something immersive that reacts and evolves with the sound 🤘
  • Gestures (MediaPipe) > replacing keyboards with organic movement to sculpt the environment: we coordinate our hands / body / objects to modulate parameters without a keyboard, finding an organic flow with our body to control the visuals. On a bed, in a van, under an open sky... imagine
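As a minimal sketch of the gesture bullet above: MediaPipe Hands reports fingertip landmarks in normalized 0..1 image coordinates, and a pinch can be turned into a control value from the thumb-index distance. The function name and the `max_dist` threshold are my own assumptions, not the project's code:

```python
def pinch_intensity(thumb_tip, index_tip, max_dist=0.25):
    """Map the distance between thumb and index tip landmarks (normalized
    0..1 image coords, as MediaPipe Hands reports them) to a 0..1 control
    value: fingers together -> 1.0, fully spread -> 0.0."""
    dx = thumb_tip[0] - index_tip[0]
    dy = thumb_tip[1] - index_tip[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return max(0.0, min(1.0, 1.0 - dist / max_dist))

# fingers almost touching -> a near-maximum "nostalgia intensity"
value = pinch_intensity((0.50, 0.50), (0.52, 0.50))
```

In a live patch, the resulting 0..1 value would be wired to whichever diffusion parameter the gesture should sculpt.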

> Processing (The Brain + TD)🧠

  • TouchDesigner & StreamDiffusionTD: signals are processed locally 🧬 in our TD "brain" and sent to the Daydream cloud backend <3 siuuuu, visuals that never repeat.
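Raw gesture and audio signals are jittery, so a common trick before they drive diffusion parameters is a one-pole exponential smoother. This is a generic sketch of that idea, not the project's actual .toe logic:

```python
class Smoother:
    """One-pole exponential smoother for noisy control signals
    (gesture or audio values) before they drive diffusion parameters."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha  # 0..1, higher = faster response
        self.value = None

    def update(self, x):
        if self.value is None:
            self.value = x  # first sample initializes the state
        else:
            self.value += self.alpha * (x - self.value)
        return self.value

s = Smoother(alpha=0.5)
outputs = [s.update(x) for x in (0.0, 1.0, 1.0)]
# outputs == [0.0, 0.5, 0.75]
```

Lower `alpha` gives slower, dreamier parameter drifts; higher `alpha` keeps the visuals snappy to the hand.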

> Output (The Canvas) 🚐

  • Mapping & Architecture: using Spout and NDI, the signal is distributed to specific windows and surfaces. The user creates the layout, resizing and texturing their physical view digitally; mapping to specific windows allows flexible resizing to fit any vehicle geometry or space.
  • The Interface: this is a functional prototype of a Digital Nomad Interface, native, dynamic, and evolving, making us feel safe but expansive while collecting reality to travel through latent space.
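A hedged sketch of the window-mapping math implied above: given a physical window rectangle inside the full projection canvas, compute the normalized UV crop of the master texture that should land on it. Names and units here are assumptions, not the project's actual setup:

```python
def window_uv(window, canvas):
    """Given a physical window rect and the full projection canvas
    (both as (x, y, w, h) in the same units, e.g. pixels), return the
    normalized (u, v, w, h) crop of the master texture for that window."""
    cx, cy, cw, ch = canvas
    x, y, w, h = window
    return ((x - cx) / cw, (y - cy) / ch, w / cw, h / ch)

# a cockpit side window occupying the right half of a 1920x1080 canvas
crop = window_uv((960, 0, 960, 1080), (0, 0, 1920, 1080))
# crop == (0.5, 0.0, 0.5, 1.0)
```

Because the result is normalized, the same layout survives resizing the canvas for a different vehicle geometry.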

🤝 Community 

Credits & Resources / shared

Built by the community of creators. This represents a new way of sharing knowledge.

  🙏🏻 Some say knowledge is power; I believe power is actually sharing knowledge

This project is built to be open; the goal is to release a clean, accessible .tox framework that allows any creator to map their own room or vehicle, adhering to the DD ethos of accessible computing.

Daydream allows me to create in total virtuality, bridging the gap between the wild environment and high-end computation. What you see here is both a progress report and the DNA of our evolution in creating immersive experiences.

Thanks to our teachers / Credits: Torin Blankensmith (MediaPipe) + the Daydream Team for the StreamDiffusionTD GPU implementation.

----

📸 updates from the field > project caption

This week, spending 🍾🎄🎅🏽 Christmas and the holidays at 3,200 meters above sea level in the Cordillera de los Andes, Patagonia, Argentina, I conducted the first on-site test.
[I've been thinking and testing, but clearly this week I need to finish executing the tested workflow correctly]

Powered by solar energy, connected via satellite internet, and visualizing through LED mapping in the MIDDLE of the mountains > this is where Nature and Technology Collide.


----


Status Update II: Midnight Labs 🌑🧪

#gamedev #threejs

Last night <3 Saturday session, I was deep into testing real-time texturing on a Three.js montage with normal/basic animations; I'm currently learning how to manipulate different environments using specific parameters to control the mood instantly.

This feeds the vision of developing a "micro game-dev ecosystem inside the motorhome": a space to play and drive around a procedural city generator.
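To make the procedural-city idea concrete, here is a toy generator. The grid layout, road spacing, and seed are illustrative assumptions, not the project's implementation:

```python
import random

def city_grid(size, seed=42, road_every=4):
    """Seeded procedural block map: 0 = road cell, otherwise a building
    height in 1..8 -- a toy version of a drivable procedural city."""
    rng = random.Random(seed)
    grid = []
    for row in range(size):
        cells = []
        for col in range(size):
            if row % road_every == 0 or col % road_every == 0:
                cells.append(0)                  # road
            else:
                cells.append(rng.randint(1, 8))  # building height
        grid.append(cells)
    return grid

g = city_grid(8)  # same seed -> same city, so a drive can be replayed
```

Seeding the generator is what lets the same "city" be revisited, re-textured, and re-dreamed across sessions.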

This process highlights the massive importance of TESTING AESTHETICS LIVE, applied to objects, characters, cars, and environments. This workflow validates my belief: dynamic, real-time parameters are the undeniable future of processing + rendering.

#RealTimeRendering #TouchDesigner #StreamDiffusion #DigitalNomad 

currently developing


Status Update III : Sunday and weekend labs 🚐🎄💻 While the holidays were for grounding and family good times, my research algorithm had other plans: X kept feeding me Three.js games and environment crafting.

So I decided to dive in and create my own Input for the TouchDesigner <3 StreamDiffusion magic workflow.

Experiment > I built a custom micro-game experience, not just to play, but to generate the perfect Input.

The narrative: craft and play your trip, driving through a procedural void.
Features > pneumatics, variable lighting, day/night cycle, vehicles, and NPCs to make the road community feel alive.
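The day/night feature could be driven by something as simple as a sine curve over the hour of day. A minimal sketch under that assumption, not the game's actual code:

```python
import math

def daylight(hour):
    """Map hour-of-day (0..24) to a 0..1 light intensity: a sine curve
    that rises at 6:00, peaks at noon, and clamps to 0 overnight."""
    return max(0.0, math.sin(math.pi * (hour - 6.0) / 12.0))

# noon is full brightness, midnight is dark
levels = [round(daylight(h), 3) for h in (0.0, 6.0, 12.0)]
# levels == [0.0, 0.0, 1.0]
```

The same scalar can tint the sky, scale shadow strength, and cue the NPC traffic density so the whole road feels alive on one clock.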

Merging nature, art, and technology <3 I'm learning from the community's polished workflows to elevate our own artist interface.

redreaming our trips of reality > where the physical space meets the infinite

 
**Workflow** Game-dev Logic + Environment + Interaction (Three.js) ➡️ Visual Sampling with TouchDesigner ➡️ AI Redreaming <3 StreamDiffusion

We usually capture reality with cameras and memories through our trip; this time, the Input is our digital journey.
I used the powerful, magical, craziest "style references folder", curated from thousands of km on the road, to feed the engine and shape our dreams.
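One hedged sketch of how such a style-references folder might feed the engine: pick one curated image per cycle as the style reference. The folder layout, extensions, and function name are my assumptions:

```python
import os
import random

def pick_style_reference(folder, seed=None):
    """Pick one image from a curated style-references folder to feed the
    engine; returns None if the folder holds no images."""
    rng = random.Random(seed)  # optional seed makes the pick repeatable
    images = [f for f in sorted(os.listdir(folder))
              if f.lower().endswith((".jpg", ".jpeg", ".png"))]
    return os.path.join(folder, rng.choice(images)) if images else None
```

A fixed seed replays the same reference sequence; no seed lets each session redream the road differently.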

🎮 Instead of a camera, the project built a WebGL environment.
Output "Canvas" > 🚐 The goal is to generate specific frames and camera moves that are impossible to film; the interface evolves.

next steps <3 homework for this cohort:

polishing the pipeline + audio-environment alignment "coming next"

coding my own REALITY as an Input
