1 Attention Control & the Disappearing Computer J. G. Taylor Department of Mathematics, King’s College, Strand, London WC2R 2LS, UK & Lobal Technologies, Huntsworth Mews, London

2 General
Towards Intelligent Agents: But how defined?
Generally accepted:
- Construct own goals
- Use to develop responses to attain them
Develop intelligent software agents in DC ORESTEIA project
Very apposite to DC:
- Act as guidance systems for humans carrying wearables

3 General II
Take account of transmitted data from ambient intelligent environment (AMBi fitted with sets of sensors to detect features of relevance to the human user)
Vision:
- Reverse engineering of attention from human brain
- Attention needed for complex tasks
Ultimate vision:
- Aware systems through Attention

4 CONTENTS
1. ORESTEIA Problems & Architecture
2. ORESTEIA Level 3 Architecture
3. Problems going to Level 4
4. Multi-modal Human Attention
5. ORESTEIA Level 4 Architecture
6. Conclusions

5 1. ORESTEIA Problems & Architecture
ORESTEIA: Software agents give guidance to user with wearable sensors
Sensor type:
- Physiological information from user’s body (Guidance on health state of user)
- Information from environment (Cameras on outside of car, monitors measuring state of movement of car; guidance on hazards)
Input from set of modalities (Up to four considered for ORESTEIA)

6 ORESTEIA Problems & Architecture II Output: Advice to user (Medical state, if critical level; or future dangers needing increased user care) How to fuse various modalities to produce useful ORESTEIA car-driving agent? By attention control (From car, from car environment, & from driver’s physiology) Overall architecture of ORESTEIA agent: 4 levels

7 ORESTEIA Problems & Architecture III
Level 1: Sensors & low power generation systems (Allows communication to central control system by Levels 2 & 3)
Level 2: Pre-processing stage (Time series from sensors smoothed, & techniques for preparing time series for higher level analysis)
Level 3: Single modality attention artefact layer (Independent Level 3 artefact for each modality)
Level 4: Multi-modal overall attention control (Handles attention control competitive processes between modalities & overall responses)
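To make the layering concrete, here is a minimal Python sketch of the Level 2-4 data flow just listed; the class and method names are illustrative assumptions, not the project's actual interfaces.

```python
# Minimal sketch of the four-level data flow described above; names are
# illustrative assumptions, not the ORESTEIA project's actual API.

class Level2Preprocessor:
    """Level 2: smooth raw sensor time series for higher-level analysis."""
    def smooth(self, series, window=5):
        # simple moving average as a stand-in for the project's smoothing step
        out = []
        for i in range(len(series)):
            chunk = series[max(0, i - window + 1):i + 1]
            out.append(sum(chunk) / len(chunk))
        return out

class Level3Artefact:
    """Level 3: single-modality attention artefact (one per modality)."""
    def __init__(self, modality):
        self.modality = modality
    def attention_index(self, smoothed):
        # placeholder alerting value: largest absolute deviation from the series mean
        mean = sum(smoothed) / len(smoothed)
        return max(abs(x - mean) for x in smoothed)

class Level4Controller:
    """Level 4: multi-modal attention control across the Level 3 artefacts."""
    def attend(self, indices):
        # attend to (and respond for) the modality with the highest alerting value
        return max(indices, key=indices.get)
```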

8 ORESTEIA Problems & Architecture IV
Criteria:
- Service availability
- Maintaining logical consistency of data flows
- Responds in ‘rational’ manner (User expects system’s behaviour predictable, final outcome ‘close’ to what user would have done)
System first learns behavioural patterns
Then acts in user’s preferred way (Not necessarily the rationally correct way)

9 ORESTEIA Problems & Architecture V
Agent requirements on overall system, not only on each Level 3 artefact
Architecture problems at Levels 3 & 4:
- Computational model for artefact
- State representations and context modelling
- Classifier systems for state evaluation and User Profile
- Attention Control for achieving fast response

10 ORESTEIA Problems & Architecture VI
- Multiple sensor handling and data fusion
- Adaptation to the user
- Decision generation
Implied requirements: System validation and evaluation?
Multiple sensor handling and data fusion (More acute at Level 4: Fusion of data across range of sensors & across different modalities harder than in given modality)

11 ORESTEIA Problems & Architecture VII Clear when combining visual inputs, across the visual space, compared to combining them with auditory and/or haptic inputs (Visual inputs overlap with each other, so levels of signal intensity, noise, speed, pre-processing, etc can be developed and used in unified manner. But there is different pre-processing across vision & audition, & especially different processing speeds)

12 ORESTEIA Problems & Architecture VIII So overall computational model needed for ORESTEIA agent, combining all possible Level 3 artefacts, as well as organizing information flow & attention control, specific to Level 4 (To prioritize responses of overall agent) Sensor fusion, decisions and implied requirements all must proceed smoothly at all levels System validation not only for separate Level 3 artefacts but for overall system

13 2. ORESTEIA Level 3 Architecture The Level 3 Architecture:

14 ORESTEIA Level 3 Architecture II Variety of modules in Level 3 artefact Sensors feed into native state map (Cartesian product of input time series, buffered to a suitable depth) State map feeds intermediate world state module from native world state (To develop simple product state into one with correlations analysed and possible independent components selected by PCA or by ICA – for data compression)
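As an illustration, a small sketch (assuming NumPy, with PCA done via the SVD) of buffering sensor streams into a native state map and compressing it; function names and the buffer depth are assumptions:

```python
import numpy as np

def native_state_map(sensor_streams, depth=20):
    """Buffer each input time series to a fixed depth and stack them: a simple
    stand-in for the Cartesian-product native state map.
    Assumes every stream has at least `depth` samples."""
    return np.stack([np.asarray(s[-depth:], dtype=float) for s in sensor_streams])

def compressed_state(state, n_components=3):
    """PCA-style compression of the buffered state into a few components
    (ICA could be substituted for the independent-component variant)."""
    centred = state - state.mean(axis=1, keepdims=True)
    u, _, _ = np.linalg.svd(centred, full_matrices=False)
    # project onto the leading principal directions for data compression
    return u[:, :n_components].T @ centred
```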

15 ORESTEIA Level 3 Architecture III Observer and Goals modules create historic estimate (In terms of the various probability distributions of the input data series) Provide levels of stability (By the means of each time series) Provide levels of deviance from that stability (By the variances) Monitor module calculates deviation of incoming values compared to historic values
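A minimal sketch of the historic estimate and the Monitor's deviation measure, assuming the history is summarised by per-series means and variances (function names are illustrative):

```python
import numpy as np

def historic_estimate(series):
    """Observer/Goals sketch: summarise a series' history by its mean (stability)
    and standard deviation (level of deviance from that stability)."""
    x = np.asarray(series, dtype=float)
    return {"mean": float(x.mean()), "std": float(x.std(ddof=1))}

def monitor_deviation(new_value, history):
    """Monitor sketch: how far an incoming value sits from the historic estimate,
    in units of the historic standard deviation."""
    return abs(new_value - history["mean"]) / (history["std"] + 1e-9)
```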

16 ORESTEIA Level 3 Architecture IV Thereby provides Attention Controller module with ‘alerting’ values Generates Attention Index for each independent series Used to generate response by the Rules module (Developed partly by expert system and neuro-fuzzy techniques & by direct strength of signal) Gives feedback to aberrant sensors Gives alert to user or overall guidance system for large deviations
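A hedged sketch of how such alerting values might be mapped to an Attention Index and then to rule-driven actions; the thresholds and action names are assumptions, not the artefact's actual expert-system or neuro-fuzzy rules:

```python
def attention_indices(deviations):
    """Attention Controller sketch: turn per-series deviations (e.g. z-scores)
    into alerting values saturating at 1.0. The 3-sigma scaling is an assumption."""
    return {name: min(dev / 3.0, 1.0) for name, dev in deviations.items()}

def rules_module(indices, alert_threshold=0.8, feedback_threshold=0.4):
    """Rules sketch: large deviations alert the user or the overall guidance
    system; moderate ones feed back to the aberrant sensor."""
    actions = []
    for name, idx in indices.items():
        if idx >= alert_threshold:
            actions.append(("alert_user", name))
        elif idx >= feedback_threshold:
            actions.append(("feedback_to_sensor", name))
    return actions
```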

17 3. Problems going to Level 4
Combine several Level 3 artefacts to Level 4
Face new problems:
- Specificity of processing
- Balance of power
- Breadth of knowledge
Using principle (as in brain) ‘divide and conquer’:
- Level 3 = ‘divide’
- Level 4 = ‘conquer’: who, how, where?

18 Problems going to level 4 II
Still to solve:
- Scaling problem (Combining information coming from many different sensor types, each possibly with a large number of data streams, as in a 2-D visual system with many thousands of pixels, as well as from complex environments in which there are many distracting objects)
- Combinatorial Explosion problem (Handled by relegating low-level pre-processing [edge detection, motion detection, etc] to Level 2, then proceeding to concept formation at Level 3)

19 Problems going to level 4 III
- Concept = useful cluster of inputs detected by suitable statistical clustering methodology (k-Nearest Neighbours or similar)
- Concepts form reduced codes for objects in environment, more easily handled at Level 3 than at Level 4 in the agent
- But still fusion at Level 4: turn to human attention fusion
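For illustration, a nearest-prototype sketch of such concept coding (the prototypes are assumed already learned by a k-NN-style clustering step):

```python
import numpy as np

def concept_codes(inputs, prototypes):
    """Nearest-prototype sketch of concept coding: each pre-processed input
    vector is replaced by the index of the closest concept prototype, giving
    the reduced code handled at Level 3."""
    inputs = np.asarray(inputs, dtype=float)          # (n_inputs, n_features)
    prototypes = np.asarray(prototypes, dtype=float)  # (n_concepts, n_features)
    dists = np.linalg.norm(inputs[:, None, :] - prototypes[None, :, :], axis=2)
    return dists.argmin(axis=1)                       # one concept label per input
```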

20 4. Multi-modal Human Attention How is control of Attention to different modalities achieved? Is there a separate Attention Control system for each modality? Plus separate region dedicated to combining overall attention demands? Alternatively one overall global attention system, used jointly by any and all modalities at once? Single overall global system unlikely (Loss of specificity for modalities)

21 Multi-modal Human Attention II Experiments show the single global possibility untrue: Experiment showed presence of common areas of brain involved in the control of movement of attention to any of the modalities (touch, hearing or vision), plus separate modality-specific areas. Subjects were presented with 3 simultaneous streams of stimuli, 1 in each modality

22 Multi-modal Human Attention III Change in one stimulus modality at a time, causing the redirection of attention to that modality Common regions found in temporoparietal junction (TPJ), the middle temporal gyrus, the insula, and in the inferior frontal gyrus (Would expect TPJ involved in overall attention redirection, the second as a site of common state representation, the third involving common sensory attributes, so additional state map, and the last functioning as common goal state, to attend to the new sensory stimulation)

23 Multi-modal Human Attention IV A multimodal cortical network for the detection of changes in the sensory environment (Jonathan Downar et al.), Nature Neuroscience 3:277-283 (2000)

24 Multi-modal Human Attention VI Fig. 2: Surface rendering of brain regions activated by transitions of the visual, auditory and tactile stimuli

25 Multi-modal Human Attention VI-1 Note that unimodal activations are bilateral and correspond predominantly to association cortices, whereas multimodal activations are strongly lateralized to the right hemisphere and correspond to the temporoparietal junction, inferior frontal gyrus and insula. On the left medial wall, the supplementary and cingulate motor areas are also prominently activated, even though subjects were not required to make responses or movements during task performance. Activations represent averaged data from 10 subjects, superimposed on the standardized brain of one subject. L, left; R, right.

26 Multi-modal Human Attention VII Another experiment: Measure, by PET, effect on brain activity of attending to vision versus touch

27 Multi-modal Human Attention VII-1

28 Multi-modal Human Attention VII-2 Anatomy and rCBF plots (mean adjusted to 50 ml/dl per min for the whole brain, ±SEM) for the two areas showing sensitivity to the rate of spatial attentional shifts independently of the sensory modality stimulated. PET activations of the contrasts high rate versus low rate (3A.1), or low rate versus high rate (3A.2), were superimposed on the structural magnetic resonance image of the MNI brain. Sections are taken through the maxima. The rCBF plots of the maxima show that the differential response depending on the rate of attention shifting was independent of the modality stimulated. (B) Anatomical localisation of the activation in the left hippocampus, with rCBF plot. (H high shift rate; L low shift rate)

29 5. ORESTEIA Level 4 Architecture Overall ORESTEIA Architecture to emulate humans

30 ORESTEIA Level 4 Architecture II Observer module missing in Level 4: should be there, but used for training ‘Fused’ denotes modules performing a combining or fusion analysis of inputs from modules of Level 3 (Based on the windowed correlation coefficients from pairs of time series across different modalities) ‘Overall’ denotes modules created by overall logic of decision making for whole ORESTEIA agent

31 ORESTEIA Level 4 Architecture III Need information from Level 3 information descendants Input to the Level 4 artefact only from lower level artefacts of Level 3, not from sensors: Level 4 non-modal (Due to the need to have ‘divide’ process achieved for separate modalities by Level 3 artefacts; Level 4 artefact only processes modality-processed data from Level 3 artefacts of each modality)

32 ORESTEIA Level 4 Architecture IV Only one Level 4 artefact, compared to many at Level 3 Final decision-making taken over by sole highest-level processor If other Level 4 artefacts, then decision-making and competition between them for possibly scarce resources would have to be organised all over again Expected for combination of ORESTEIA agents (Into politics of agent communities: will undoubtedly arise in future)

33 ORESTEIA Level 4 Architecture V Consider particular set of specific modules for ORESTEIA Demo 2 (driver in a car)

34 ORESTEIA Level 4 Architecture VI
4 modalities:
- Physiological data (of a user in the car)
- Car environment data (involving weather and road data)
- Internal car data (velocity, acceleration, friction, wheel turning angle)
- Proximity data (other cars, pedestrians, animals or children running onto the road, etc)

35 ORESTEIA Level 4 Architecture VII Data as sets of time series Consider overall hazard function H(sensor inputs,t) at time t (which may be a vector of several values) Hazard level determines level of attention to be paid to driving situation at time t+1 Measure of danger to driver (So of possible guidance to be given to them, by choosing among possible actions reducing H)
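A minimal sketch of such a hazard function and of choosing a guidance action to reduce it; the per-modality weights, action names and reduction estimates are illustrative assumptions, not project values:

```python
import numpy as np

def hazard(sensor_inputs, weights):
    """Sketch of an overall hazard vector H(sensor inputs, t): one weighted
    hazard component per modality, from that modality's recent time series."""
    return {m: weights[m] * float(np.mean(np.abs(sensor_inputs[m])))
            for m in sensor_inputs}

def guidance(hazard_vector, candidate_actions):
    """Pick the guidance action predicted to reduce total hazard the most.
    `candidate_actions` maps an action name to its estimated hazard reduction."""
    total = sum(hazard_vector.values())
    best = max(candidate_actions, key=candidate_actions.get)
    return best, total - candidate_actions[best]
```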

36 ORESTEIA Level 4 Architecture VIII States at Level 3 assumed independent of those in all other modalities Not the case for Demo 2: Physiological states (depending on the subject) modified considerably by the level of closeness of other cars, or speed, etc of the car itself Dependent case: Need to construct fused state representation at Level 4 Such state represents activities correlated across two (or more) modalities

37 ORESTEIA Level 4 Architecture IX Extract underlying state variables representing independent variables (The psychological state of the driver, correlated with the hazard level) Simpler way: define fused state as cross-correlation between separate modality variables (Zero for independent time series, so reduces back to the independent case above) Overall goal maps follow (as at Level 3, by taking histories) Also fused Monitor and Overall IMC modules
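A sketch of this simpler, cross-correlation definition of the fused state, computed over sliding windows; the window length is one of the open design details noted on the next slide, so the value here is only an assumption:

```python
import numpy as np

def windowed_correlation(x, y, window=50):
    """Fused-state sketch: Pearson correlation of two modality series over a
    sliding window. Near zero for independent series, so the fused state then
    reduces to the independent Level 3 case."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    corrs = []
    for t in range(window, len(x) + 1):
        corrs.append(float(np.corrcoef(x[t - window:t], y[t - window:t])[0, 1]))
    return np.array(corrs)
```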

38 ORESTEIA Level 4 Architecture X Details still being analyzed (Suitable lengths of windows for calculation, and thresholds appropriate for scaling attention across fused and independent state variables) Rules module at Level 4 constructed from known features of response hazards Denoted ‘Overall’ since involves knowledge possibly outside correlations If certain states in a separate Level 3 module occur, then rule might be ‘warn driver’
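An illustrative sketch of such an ‘Overall’ rule; the correlation threshold and the Level 3 state names are hypothetical, chosen only to show the structure:

```python
def overall_rules(fused_correlation, level3_states, corr_threshold=0.6):
    """'Overall' Rules sketch at Level 4: combine the fused correlation with
    knowledge that may lie outside the correlations themselves."""
    warnings = []
    if abs(fused_correlation) > corr_threshold:
        warnings.append("warn driver: physiological state tracking the hazard level")
    if level3_states.get("proximity") == "object_on_road":
        # rule triggered by a state in a single Level 3 module, as described above
        warnings.append("warn driver: obstacle ahead")
    return warnings
```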

39 6. Conclusions Solutions to AMBi/Guidance Agent problems based on principles being extracted from wealth of information arising from brain, especially attention Assured of existence theorem; may be no uniqueness theorem

40 Conclusions II Extraction of principles from millions of years of genetic programming should lead to efficient solution to Level 3 and overall Level 4 architecture problem Higher levels: Too many hard competition problems Intelligent Agent communities: Need to develop emotions/linguistic powers

41 Conclusions III
References:
- [1] http://www.image.ntua.gr/ORESTEIA
- [2] Deliverable ND 1.2, at [1]
- [3] Macaluso E, Frith CD & Driver J (2001) Multimodal mechanisms of attention related to rates of spatial shifting in vision and touch. Exp Brain Res 137:445-454
- [4] Downar J, Crawley AP, Mikulis DJ & Davis KD (2000) A multimodal cortical network for the detection of changes in the sensory environment. Nature Neuroscience 3:277-283

