AI Assistant Digital Project

Voice-to-Action Command Interpreter | Personal / Academic Project


Built a mini AI-style assistant that translates natural-language commands into a structured DSL, constructs an AST, and executes actions such as graph generation and code output. Designed the grammar, parsing logic, and execution flow end-to-end to explore how conversational interfaces connect to deterministic systems.

  • This project explores how voice-style user input can be translated into structured, executable actions.


    I designed and built a mini AI assistant that parses natural-language commands, converts them into a domain-specific language (DSL), builds an abstract syntax tree (AST), and executes actions such as generating graphs, producing code, or running system-level commands.

    The goal was to understand how language, structure, and execution connect in real-world AI and automation systems.

  • Key Capabilities

    • Accepts voice-style or text-based commands (e.g., “graph x squared from -10 to 10”)

    • Translates flexible natural-language input into a structured DSL

    • Builds and evaluates an abstract syntax tree (AST) to represent intent

    • Executes actions such as:

      • Generating mathematical graphs

      • Producing Python class definitions

      • Running container or system commands

    • Handles flexible phrasing and input normalization (e.g., synonyms, implied operators)
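The normalization step above can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the synonym table and the `normalize` helper are hypothetical, showing how phrasing like "plot" or "squared" could be rewritten into canonical DSL tokens with an implied operator attached.

```python
import re

# Hypothetical synonym table; the project's real mappings may differ.
SYNONYMS = {
    "plot": "graph",
    "draw": "graph",
    "squared": "^2",
    "cubed": "^3",
}

def normalize(command: str) -> str:
    """Lowercase the input and rewrite known synonyms into canonical tokens."""
    tokens = command.lower().split()
    text = " ".join(SYNONYMS.get(t, t) for t in tokens)
    # Attach implied exponent operators to the preceding operand: "x ^2" -> "x^2"
    return re.sub(r"\s+\^", "^", text)

# normalize("Plot x squared from -10 to 10") -> "graph x^2 from -10 to 10"
```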

  • How It Works

    1. User input is preprocessed to normalize language and symbols

    2. A custom grammar parses the input into a domain-specific language

    3. An abstract syntax tree (AST) is constructed to represent command intent

    4. The AST is evaluated to trigger the corresponding execution logic

    5. Output is generated (visuals, code, or commands) and returned to the user

    This mirrors how real AI assistants translate user intent into deterministic system actions.
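The parse-to-execute flow can be sketched as follows. The grammar, the `GraphCommand` node, and the handler are illustrative assumptions: the project's actual DSL covers more command types, but the shape of the pipeline (match against a grammar, build an AST node, dispatch it to execution logic) is the same.

```python
import re
from dataclasses import dataclass

# Hypothetical AST node for one command type; the real DSL has several.
@dataclass
class GraphCommand:
    expression: str
    start: float
    end: float

# Toy grammar for a single production: "graph <expr> from <n> to <n>"
GRAMMAR = re.compile(r"graph (?P<expr>\S+) from (?P<start>-?\d+) to (?P<end>-?\d+)")

def parse(command: str) -> GraphCommand:
    """Match a normalized command against the grammar and build an AST node."""
    m = GRAMMAR.fullmatch(command)
    if m is None:
        raise ValueError(f"unrecognized command: {command!r}")
    return GraphCommand(m["expr"], float(m["start"]), float(m["end"]))

def execute(node: GraphCommand) -> str:
    """Dispatch the AST node to its handler; here the 'action' is a description."""
    return f"plotting {node.expression} on [{node.start}, {node.end}]"
```

In the full system, `execute` would route each node type to its own handler (graphing, code generation, system commands) rather than returning a string.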

  • My Role

    • Designed the command language and grammar structure

    • Implemented parsing, AST construction, and evaluation logic

    • Built execution handlers for multiple output types

    • Debugged grammar ambiguities and edge cases

    • Documented system behavior and design decisions


    I owned the full lifecycle of the assistant, from language design to execution and output.

  • Technologies

    • Python

    • Custom DSL & grammar parsing

    • Abstract Syntax Trees (ASTs)

    • Regular expressions

    • Matplotlib / NumPy (for graph generation)
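A graph-generation handler along the lines used here can be sketched with NumPy and Matplotlib. This is a simplified stand-in: `render_graph` takes the expression as a Python callable, whereas the real project compiles the DSL expression before plotting.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is required
import matplotlib.pyplot as plt

def render_graph(expression, start, end, path="graph.png"):
    """Sample `expression` over [start, end] and save the plot to `path`.

    `expression` is a callable here (an assumption of this sketch);
    the project derives it from the parsed DSL command.
    """
    x = np.linspace(start, end, 200)   # 200 evenly spaced sample points
    y = expression(x)
    fig, ax = plt.subplots()
    ax.plot(x, y)
    fig.savefig(path)
    plt.close(fig)
    return x, y
```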

  • Key Learnings

    • How natural language can be constrained into structured systems

    • Tradeoffs between flexible input and deterministic execution

    • Designing grammars that balance usability and precision

    • Debugging parsing logic and ambiguous user input

    • How AI assistants rely on structure beneath conversational interfaces

View Code (GitHub)