
Let’s build something awesome together! 

Got an idea, a project, or just want to talk design, tech, or bikes? Let’s chat!

Human-Centered Robotics –
A UI/UX Take on Machine Interaction

Bridging the gap between human intuition and robotic precision.

• Robotic Concept   • Human-Machine Interaction   • RXD

Start Now

Overview

Part 1 – Prototype

Started as a passion project—a basic 3D-printed robotic arm built out of curiosity and hobby interest.


Part 2 – Concept

Applied my UX skills to imagine a smarter, more natural way to interact with robotics using modern interfaces.

Part 1 – Prototype

I was inspired by Iron Man’s Dum-E bot—the way it intuitively helped Tony Stark really stuck with me. About a year ago, I 3D-printed a prototype out of curiosity, but it sat untouched in my room due to a busy schedule. With the rise of AI tools and browser-based tech, I saw an opportunity to bring it to life—reimagining it as a web-controlled, intuitive, and futuristic system.


What I Built

Using a 6-axis robotic arm and an ESP8266/ESP32 microcontroller, I developed a working UI prototype with:

• Real-time control of each joint

• A clean, modular, button-based interface

• Hosting on a personal domain, accessible from any browser

• An ESP-based backend built with HTML/CSS/JS and PWM control for the servos

 

This interface is ideal for education, prototyping, and remote robotics experiments. 


Real-World Use Cases

• Small-Scale Automation: Empowers startups and MSMEs (Micro, Small, and Medium Enterprises) to automate tasks like sorting, pick-and-place, or light assembly using affordable robotic arms—no coding required.

• Remote Robotics: Control robots in hazardous or remote environments via browser. Ideal for virtual training, testing, or distance learning.

• Assistive & Rehab Tech: Enables gentle, therapist-designed movements and opens doors for low-cost, accessible robotic aids in healthcare.


Research and highlights

Part 2 – Neural-Robotic Integration Concept

The goal was to transform robotic control into a seamless, human-centered experience—moving beyond complex technical inputs. I designed an interface that lets users operate a robotic arm using brain-computer interface (BCI) signals, with real-time feedback on neural sync, latency, and accuracy.

Features:

• Multi-modal Control: Supports multiple inputs—BCI, joystick, voice, and gesture—for flexible, inclusive interaction.

• Modular Workflow: Drag-and-drop modules for customizing tasks like path planning, target selection, and grip control.

• 3D Visualization: Real-time robotic arm view with position tracking and speed feedback for precise control.

• Agentic Assistance: AI-powered interface that collaborates with users—pausing, confirming steps, and automating sequences.

• Live Monitoring: Displays task progress, ETA, and playback control for smoother operations.

• Accessible UI: Clean, futuristic design focused on usability across diverse user groups.


A Futuristic Dashboard


Conclusion & Learnings

This journey was full of errors, fried boards, and late-night rabbit holes into AI forums I barely understood. But every mistake taught me something—from wiring basics to how AI models actually think (kind of). Now that all this knowledge is out there, I’m just trying to use it in the coolest, most meaningful way I can.
