manipulusai-sudo/manipulus

Manipulus

Intent‑Validated Non‑Verbal Computing

Manipulus is an open‑source research project exploring a future beyond the Command Era of technology. Instead of issuing explicit commands (apps, buttons, wake‑words), Manipulus investigates how human presence, gaze, and gesture can be translated into validated intent — locally, privately, and invisibly.

This repository documents the concept, architecture, and prototype direction of Manipulus. It is not a consumer product, and it does not collect or transmit personal data.


Vision

To liberate humans from the Command Era of technology.

We believe computing should adapt to humans — not the other way around. A home, device, or system should respond naturally to where you are, what you’re attending to, and what you intend — without forcing you to reach for an app, speak a command, or surrender privacy.

Manipulus aims to become an Intent‑Validation Operating Layer:

  • Invisible by default
  • Local‑first and private
  • Activated only by sustained, verifiable human intent

What Manipulus Is

Manipulus is an open‑source Non‑Verbal Communication (NVC) platform.

You can think of it as:

External Telepathy for the 99%

It explores BCI‑like capabilities (attention, intent, context) without implants, surgery, cloud dependence, or biometric storage.

At its core, Manipulus answers one question:

“Is this movement random — or is this intentional?”

Only when intent is validated does the system act.


What Manipulus Is Not

  • ❌ Not a gesture‑control toy
  • ❌ Not always‑listening
  • ❌ Not cloud‑dependent
  • ❌ Not a surveillance system
  • ❌ Not a consumer‑ready product

Manipulus deliberately avoids twitchy, reactive automation. No action occurs without temporal consistency and contextual confirmation.
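One way to picture "temporal consistency" is a gate that opens only when a signal has held steady for an entire observation window. The sketch below is illustrative only, not Manipulus code; the class name, window size, and boolean signal are all assumptions made for the example.

```python
from collections import deque

class TemporalGate:
    """Allow an action only after a signal has been stable for a full window.

    A single positive frame (a twitch) never triggers; the gate opens only
    when every frame in the window agrees.
    """

    def __init__(self, window_size: int = 15):
        # Fixed-size window: old frames fall out automatically.
        self.window = deque(maxlen=window_size)

    def update(self, signal_present: bool) -> bool:
        self.window.append(signal_present)
        # The window must be both full and unanimous before the gate opens.
        return len(self.window) == self.window.maxlen and all(self.window)

gate = TemporalGate(window_size=3)
print(gate.update(True))   # False: window not yet full
print(gate.update(True))   # False
print(gate.update(True))   # True: sustained for the full window
print(gate.update(False))  # False: consistency broken, gate closes
```

A real system would feed this gate per-frame detections (gesture present, gaze in zone) rather than hand-written booleans, but the shape of the filter is the same: reactive twitches are absorbed; only sustained signals pass.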


Core Concept: Intent Validation

Traditional automation reacts to events.

Manipulus reacts to validated intent.

Intent is inferred only when multiple signals align over time, such as:

  • Gaze direction
  • Skeletal posture
  • Gesture persistence
  • Spatial context
  • Duration thresholds

Example (conceptual):

If gaze is maintained toward a defined zone and a gesture is sustained beyond a validation window → intent is confirmed → action is allowed.

This reduces false positives and preserves human agency.
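The conceptual rule above (gaze toward a zone, gesture sustained past a validation window) can be sketched in a few lines. Everything here is hypothetical: the `Frame` fields, the one-second window, and the function name are assumptions for illustration, not a Manipulus API.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    gaze_in_zone: bool   # gaze currently directed at a defined zone
    gesture_held: bool   # gesture currently detected
    timestamp: float     # seconds since stream start

def validate_intent(frames, validation_window: float = 1.0) -> bool:
    """Confirm intent only if gaze AND gesture align continuously
    for at least `validation_window` seconds."""
    aligned_since = None
    for f in frames:
        if f.gaze_in_zone and f.gesture_held:
            if aligned_since is None:
                aligned_since = f.timestamp
            if f.timestamp - aligned_since >= validation_window:
                return True          # sustained, verifiable intent
        else:
            aligned_since = None     # any break resets the window
    return False

# A glance plus a brief gesture (0.5 s of alignment) is treated as noise:
noise = [Frame(True, True, t / 10) for t in range(6)]
print(validate_intent(noise))      # False

# The same alignment sustained past the window confirms intent:
sustained = [Frame(True, True, t / 10) for t in range(12)]
print(validate_intent(sustained))  # True
```

The key property is the reset on any break in alignment: intent must be continuous, not merely frequent, which is what keeps false positives low and agency with the human.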


Prototyping Direction

Manipulus progresses through small, verifiable prototypes whose sole purpose is to test whether intent can be distinguished from noise using non-verbal signals — locally and privately.

These prototypes are intentionally minimal, disposable, and exploratory. Details are abstracted to avoid premature coupling to specific hardware, vendors, or implementations.


Privacy & Safety Principles

Manipulus is designed around strict constraints:

  • No biometric databases
  • No facial recognition
  • No identity profiling
  • No cloud streaming by default
  • No data resale

Raw video is processed ephemerally for inference and is not stored.
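"Processed ephemerally" can be expressed structurally: each raw frame lives only for one loop iteration, and only derived, non-identifying features survive. The sketch below assumes a stand-in `extract_pose` function; a real pipeline would run an on-device pose model there, but the privacy property comes from the control flow, not the model.

```python
def extract_pose(frame: bytes) -> dict:
    # Stand-in for local skeletal inference: reduce raw pixels to a few
    # non-identifying keypoints. (The modulo is a placeholder, not a model.)
    return {"keypoints": len(frame) % 17}

def process_stream(frames):
    """Yield derived features one at a time. Each raw frame goes out of
    scope after its iteration; nothing is written to storage."""
    for frame in frames:
        yield extract_pose(frame)
        # `frame` is released here; only the feature dict is retained.

fake_frames = [bytes(100), bytes(200)]   # dummy stand-ins for video frames
features = list(process_stream(fake_frames))
print(features)
```

Because `process_stream` is a generator, no list of raw frames ever accumulates in memory; downstream code sees only the feature stream.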

This repository contains no keys, credentials, endpoints, or production secrets.


Repository Scope

This repo may include:

  • Concept documentation
  • Architectural diagrams
  • Prototype logic descriptions
  • Research notes
  • Experimental code (clearly labeled)

Anything that could compromise privacy or security is intentionally excluded.


Status

Manipulus is early‑stage research.

  • APIs may change
  • Concepts may evolve
  • Nothing here should be considered production‑ready

The purpose is exploration, validation, and open discussion.


Open Source & Contribution

Manipulus is built in public to invite critique, not hype.

If you’re interested in:

  • Non‑verbal interfaces
  • Local‑first AI
  • Human‑computer interaction
  • Post‑app computing

…you’re welcome to observe, question, or contribute thoughtfully.


Disclaimer

Manipulus is a research and experimental project.

It does not make medical, cognitive, or behavioral claims. It is not a brain‑computer interface. It is not intended for surveillance, monitoring, or behavioral enforcement.

Any demonstrations are conceptual prototypes only.


Manipulus explores what computing could feel like when technology learns to wait for human intent — instead of interrupting it.

About

Manipulus is a sovereign, 4-layer infrastructure stack for embodied intent. It pioneers a local-first, air-gapped protocol that bridges kinetic human movement to digital execution without the cloud, building toward a future of private, non-verbal intelligence.
