Building the edges of immersive computing — from spatial experiences on Apple Vision Pro to brain-computer interfaces, AI systems, and published Quest games.
I'm Andre "Dre" Smith, an XR and AI developer pushing the boundaries of immersive technology. I craft experiences that blur the line between the physical and digital — from published Meta Quest games to real-time neurofeedback systems.
My work spans VR, AR, Mixed Reality, Spatial Computing on Apple Vision Pro, Brain-Computer Interfaces with EEG sensors, and AI-driven interactive systems. Every project starts with a question: what seems impossible today?
Currently building the next wave of spatial and AI experiences under the drenerdo brand.
A published Mixed Reality game on the Meta Quest platform. Players interact with mystical orbs in their real-world environment, blending physical space with immersive gameplay mechanics built on Meta's Presence Platform.
Infrascan is a spatial oracle that uses computer vision and augmented reality to verify industrial maintenance and log it as a permanent, tamper-proof record on the blockchain.
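The tamper-proof logging idea behind Infrascan can be sketched simply: hash each maintenance record into a fixed digest, and anchor that digest on-chain so any later edit to the record is detectable. This is a minimal illustration, not Infrascan's actual schema or contract code; the field names are hypothetical:

```python
import hashlib
import json

def record_fingerprint(record: dict) -> str:
    """Hash a maintenance record into a fixed-length digest.

    Serializing with sorted keys makes the hash deterministic, so the
    same record always yields the same digest. (Field names below are
    illustrative, not Infrascan's real schema.)
    """
    payload = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Anchoring this digest on-chain proves the log existed, unaltered,
# at anchor time; any later edit produces a different digest.
entry = {"asset": "pump-7", "task": "seal inspection", "ts": "2024-05-01T09:30Z"}
digest = record_fingerprint(entry)
```

Only the 64-character digest needs to live on-chain; the full record can stay off-chain and be re-verified against it at any time.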
Hydroverse is a naval combat game set on a sun-drenched festival waterway. Players pilot armed speedboats across open water, battling rivals with eight distinct weapons — from homing missiles and flamethrowers to freeze rounds that lock enemies in place and ricochet shots that bounce off obstacles for bonus damage. Choose Free-For-All or Team Deathmatch, dial in your difficulty, and fight for score dominance before the clock runs out.
Real-time brain-computer interface systems using EEG sensors including the NextMind device. Translating live neural signals into interactive digital control inside immersive XR environments.
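The signal chain behind systems like this reduces a raw EEG frame to band power, then maps that power to a control value the XR layer can consume. The sketch below is illustrative only (synthetic samples and the Goertzel algorithm for single-bin power, not NextMind's actual SDK):

```python
import math

def goertzel_power(samples, fs, freq):
    """Power of one frequency bin, computed with the Goertzel algorithm."""
    n = len(samples)
    k = round(freq * n / fs)  # nearest DFT bin for the target frequency
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def alpha_control(samples, fs):
    """Map alpha-band (8-12 Hz) power to a 0..1 control value."""
    alpha = sum(goertzel_power(samples, fs, f) for f in range(8, 13))
    total = sum(goertzel_power(samples, fs, f) for f in range(1, 41))
    return alpha / total if total else 0.0

# Synthetic one-second "EEG" frame: a dominant 10 Hz (alpha) rhythm.
fs = 256
frame = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
control = alpha_control(frame, fs)  # near 1.0 for a pure alpha tone
```

In a real pipeline the control value would be smoothed over several frames before driving anything in the scene, since raw per-frame estimates are noisy.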
A catalog of immersive experiences across HTC Vive, Oculus, Gear VR, and Daydream platforms. From AR+AI-driven applications to Web3-integrated virtual worlds and blockchain-verified digital ownership.
Full-cycle VR, AR, and Mixed Reality development across Meta Quest, Apple Vision Pro, HTC Vive, and WebXR platforms.
Real-time neurofeedback systems using EEG sensors including NextMind — translating brainwaves into interactive digital control.
Building AI-driven experiences combining LLMs, computer vision, and generative systems into immersive products.
Native visionOS development with RealityKit and SwiftUI — designing interfaces that inhabit and respect physical space.
Published games on the Meta Quest store. Designing core loops, physics interactions, and multiplayer systems in Unity.
Smart contracts, NFT integration, and on-chain digital ownership — combined with AR and AI to build the next layer of interactive, verifiable immersive experiences.
Available for freelance, collaboration, and full-time opportunities in XR, AI, and spatial computing.