Controlling Agents with Natural Language
Jared Allen
2005
University of Arkansas EiA Agent team
Background
● Increasing dependency upon machines - grocery stores, households, etc.
● Complexity of devices - examples: thermostats, TV remote controls
● EiA vision - Everything is Alive
Natural Language
● Objective: an intuitive means of communicating with and controlling electronic devices.
● Benefits - ease of use, computational power, limited hardware
● Limitations - implementation viability, scope of the language
Menu-Based Natural Language Interface (MBNLI)
● Format - cascading menus, guided completion dialogue, absolute grammar coverage (sketched below)
● Advantages - avoids under-shooting or over-shooting the allowable language, retains ease of use
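As a rough illustration of the guided-completion idea, the Python sketch below stores a toy grammar as nested choices and computes exactly the menu entries that may legally follow a partial command, so the interface can neither under-shoot nor over-shoot the allowable language. The grammar contents and function names are invented here for illustration, not the team's actual implementation.

# Toy grammar: each level maps an accepted word to the choices that may follow it.
# None marks the end of a complete command.  Contents are illustrative only.
TOY_GRAMMAR = {
    "Robot,": {
        "move": {
            "forward": {"5 feet.": None, "10 feet.": None},
            "backward": {"5 feet.": None, "10 feet.": None},
        },
        "turn": {"left.": None, "right.": None},
        "grip.": None,
        "release.": None,
    },
}

def menu_options(words_so_far):
    """Return the menu entries that may legally follow the words chosen so far."""
    node = TOY_GRAMMAR
    for word in words_so_far:
        node = node[word]            # descend along the path chosen so far
        if node is None:             # the command is already complete
            return []
    return sorted(node)

print(menu_options(["Robot,"]))          # ['grip.', 'move', 'release.', 'turn']
print(menu_options(["Robot,", "move"]))  # ['backward', 'forward']

Because the menus are generated directly from the grammar, every command the user can build is parseable and every parseable command is reachable.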
MBNLI
Components
● Grammar: the set of allowable input commands.
● Parser: the software component that keeps track of instruction progress, future options, and output to the real-world device (sketched below).
● Interface: the visual element of the system that guides user input.
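A minimal sketch of how the Parser component might track progress through such a grammar; the class and method names are assumptions, not the actual EiA code.

class MenuParser:
    """Tracks the user's position in the grammar, the remaining options,
    and the finished command string that is handed on for translation."""

    def __init__(self, grammar):
        self.grammar = grammar
        self.reset()

    def reset(self):
        self.node = self.grammar     # current position in the grammar
        self.words = []              # menu selections accepted so far

    def options(self):
        """Menu entries the Interface should display next."""
        return [] if self.node is None else sorted(self.node)

    def accept(self, word):
        """Advance by one menu selection; only grammatical words are accepted."""
        if self.node is None or word not in self.node:
            raise ValueError(f"'{word}' is not allowed at this point")
        self.words.append(word)
        self.node = self.node[word]

    def is_complete(self):
        return self.node is None

    def command_string(self):
        return " ".join(self.words)

The Interface would simply render options() as the next cascading menu and call accept() on each selection until is_complete() returns True.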
Input/Output
● Input: Grammar, Command String, Translation Definitions
● Output: Machine Instruction (for our purposes, this will be in XML; an example translation is sketched below)
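The slide lists a grammar, a command string, and translation definitions as inputs and an XML machine instruction as output. The sketch below shows one plausible translation step; the function, element, and attribute names are assumptions, not the project's actual instruction schema.

import xml.etree.ElementTree as ET

def translate(command_words):
    """Map a completed command, e.g. ["Robot,", "move", "forward", "10 feet."],
    onto a hypothetical XML machine instruction."""
    action = command_words[1].rstrip(".")
    instruction = ET.Element("instruction", device="robot", action=action)
    if len(command_words) > 2:
        ET.SubElement(instruction, "direction").text = command_words[2].rstrip(".")
    if len(command_words) > 3:
        value, unit = command_words[3].rstrip(".").split(" ")
        ET.SubElement(instruction, "distance", unit=unit).text = value
    return ET.tostring(instruction, encoding="unicode")

print(translate(["Robot,", "move", "forward", "10 feet."]))
# Prints, on one line:
# <instruction device="robot" action="move"><direction>forward</direction><distance unit="feet">10</distance></instruction>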
Current Implementation: AmigoBots
Robot Functionality
● Equipment - sonar range devices, wireless internet adaptor, gripper
● Functionality - move, turn, grip, release, check sonar, etc. (sketched below)
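A hedged sketch of the device side: a thin wrapper that exposes the robot's operations as methods. The print statements stand in for calls to the actual AmigoBot control library, whose API is not reproduced here.

class AmigoBotWrapper:
    """Placeholder for the software that drives the physical robot."""

    def move(self, direction, distance_mm):
        print(f"moving {direction} {distance_mm:.0f} mm")   # real code would command the motors

    def turn(self, direction, degrees=90):
        print(f"turning {direction} {degrees} degrees")

    def grip(self):
        print("closing gripper")

    def release(self):
        print("opening gripper")

    def check_sonar(self):
        print("reading sonar ranges")
        return []          # the real wrapper would return the sonar range readings

Isolating device-specific calls behind one wrapper keeps the grammar and parser reusable when other devices are targeted.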
Example
“Robot, move forward 10 feet.”
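To tie this example to the pipeline, a self-contained sketch that parses the hypothetical XML machine instruction (the same assumed schema as in the Input/Output sketch) and dispatches it; the conversion of feet to millimetres for the robot is also an assumption.

import xml.etree.ElementTree as ET

# Hypothetical machine instruction for "Robot, move forward 10 feet."
INSTRUCTION = (
    '<instruction device="robot" action="move">'
    '<direction>forward</direction>'
    '<distance unit="feet">10</distance>'
    '</instruction>'
)

def dispatch(xml_instruction, move):
    """Parse the instruction and invoke the supplied move callback."""
    root = ET.fromstring(xml_instruction)
    if root.get("action") == "move":
        direction = root.findtext("direction")
        feet = float(root.find("distance").text)   # assumes unit="feet"
        move(direction, feet * 304.8)              # feet -> millimetres (assumed target unit)

dispatch(INSTRUCTION, lambda direction, mm: print(f"move {direction} {mm:.0f} mm"))
# prints: move forward 3048 mm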
For the future...
● Larger collection of grammars
● Cooperative grammars
● Intelligent Agents (“Smart Wrappers”)
● Remote access