Activity Recognition: Linking Low-level Sensors to High-level Intelligence
Transcript of Activity Recognition: Linking Low-level Sensors to High-level Intelligence
![Page 1: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/1.jpg)
Activity Recognition: Linking Low-level Sensors to High-level Intelligence
Qiang Yang, Hong Kong University of Science and Technology, http://www.cse.ust.hk/~qyang/
1
![Page 2: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/2.jpg)
What’s Happening Outside AI?
• Pervasive Computing
• Sensor Networks
• Health Informatics
• Logistics
• Military/Security
• WWW
• Computer-Human Interaction (CHI)
• GIS
• …
2
![Page 3: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/3.jpg)
What’s Happening Outside AI?
3
Wii Apple iPhone
Ekahau WiFi Location Estimation
![Page 4: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/4.jpg)
Theme of The Talk
• Activity Recognition:
– What it is
– Linking low-level sensors to high-level intelligence
• Activity recognition research: Embedded AI
– Empirical in nature
– Research on a very limited budget
4
![Page 5: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/5.jpg)
A Closed Loop
5
Eating, Resting, Cooking, Doing Laundry, Meeting, Using the telephone, Shopping, Playing Games, Watching TV, Driving …
Cooking: Preconditions: (…), Postconditions: (…), Duration: (…)
(From Bao and Intille, Pervasive 04)
![Page 6: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/6.jpg)
Activity Recognition: A Knowledge Food Chain
Action Model Learning
◦ How do we model the user's actions?
Activity Recognition
◦ What is the user doing / what will she do next?
Localization & Context
◦ Where is the user?
◦ What's around her?
6
• Knowledge Food Chain
• The output of each level acts as input to the level above it, in a closed feedback loop
![Page 7: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/7.jpg)
Basic: Knowing Your Context
Locations and Context:
Where are you?
What's around you?
Who's around you?
How long are you there?
Where were you before?
Status of objects (door open?)
What is the temperature like?
…
7
![Page 8: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/8.jpg)
8
![Page 9: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/9.jpg)
Focusing on locations
Input: sensor readings
WiFi, RFID, audio, visual, temperature, infrared, ultrasound, magnetic fields, power lines
[Stuntebeck, Patel, Abowd et al., Ubicomp2008]…
Localization Models
Output: predicted locations
9
Dr. Jie Yin @ work (HKUST)
![Page 10: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/10.jpg)
Location-based Applications: Indoor
• Healthcare at home and in hospitals
• Logistics: Cargo Control
• Shopping, Security
• Digital Wall: collaboration with NEC China Lab
10
![Page 11: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/11.jpg)
How to obtain a localization model?
• Propagation-model based
– Models the signal attenuation
• Advantages: less data-collection effort
• Disadvantages:
– Need to know emitter locations
– Uncertainty
• Machine-learning based
– Advantages:
• Models uncertainty better
• Benefits from sequential information
– Disadvantages:
• May require a lot of labeled data
11
RADAR [Bahl and Padmanabhan, CCC2000]
![Page 12: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/12.jpg)
Using both labeled and unlabeled data in subspace learning
• LeMan: Location-estimation w/ Manifolds [J. J. Pan and Yang et al., AAAI2006]
• Manifold assumption: similar signals have similar labels
• Objective: Minimize the loss over labeled data, while propagating labels to unlabeled data
12
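The objective on this slide (fit the labeled points while propagating labels smoothly over the signal manifold) can be sketched with a graph-Laplacian regularizer. This is a toy sketch, not the LeMan algorithm itself; the k-NN graph construction, the Gaussian edge weights, and the parameter values are all assumptions made for illustration.

```python
import numpy as np

def manifold_regression(X, y, labeled_mask, k=5, lam=1.0):
    """Toy manifold-regularized location estimation.

    X: (n, d) signal vectors; y: (n, 2) coordinates, trusted only where
    labeled_mask is True.  Minimizes ||J(f - y)||^2 + lam * f^T L f,
    where L is the Laplacian of a k-NN similarity graph in signal space,
    so predicted locations vary smoothly between labeled anchors.
    """
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]       # skip self at position 0
        W[i, nbrs] = np.exp(-d2[i, nbrs])
    W = np.maximum(W, W.T)                      # symmetrize the graph
    L = np.diag(W.sum(axis=1)) - W              # graph Laplacian
    J = np.diag(labeled_mask.astype(float))     # selects labeled rows
    # Optimality condition of the objective: (J + lam * L) f = J y
    return np.linalg.solve(J + lam * L, J @ y)
```

With only the two ends of a corridor labeled, the unlabeled readings in between receive interpolated coordinates, which is the intuition behind LeMan's reduced calibration effort.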
![Page 13: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/13.jpg)
LeMan [J.J. Pan and Yang et al., AAAI2006]
• Supervised vs. Semi-Supervised in a 4m x 5m testbed
• To achieve the same accuracy under 80cm error distance
13
| | Supervised | Semi-supervised |
|---|---|---|
| Model | RADAR | LeMan |
| Percentage of labeled data used | 100% | 23% |
![Page 14: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/14.jpg)
Adding sequences: Graphical Model
• Conditional Random Fields [Lafferty, McCallum, Pereira, ICML2001]
– Undirected graph; a generalization of the HMM
14
State=locations
Observations = signals
| | Support vector regression (supervised) | CRF (supervised) | SemiCRF (semi-supervised) |
|---|---|---|---|
| Uses sequential information | No | Yes | Yes |
| Accuracy | 67.33% | 83.67% | 85.67% |
CRF based localization [R. Pan, Zheng, Yang et al., KDD2007]
![Page 15: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/15.jpg)
What if the signal data distribution changes?
• Signal may vary over devices, time, spaces …
• A -> B: the localization error may increase
15
Transfer Learning!
![Page 16: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/16.jpg)
Our work to address the signal variation problem
• Transfer Learning
– Problem 1: Transfer Across Devices [Zheng and Yang et al., AAAI2008a]
– Problem 2: Transfer Across Time [Zheng and Yang et al., AAAI2008b]
– Problem 3: Transfer Across Spaces [S. J. Pan and Yang et al., AAAI2008]
16
![Page 17: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/17.jpg)
Transferring Localization Models Across Devices [Zheng and Yang et al., AAAI2008a]
Input:
Output: The localization model on the target device
17
S=(-30dbm, .., -86dbm), L=(1, 3)S=(-33dbm, .., -90dbm), L=(1, 4)…S=(-44dbm, .., -43dbm), L=(9, 10)S=(-56dbm, .., -32dbm), L=(15, 22)S=(-60dbm, .., -29dbm), L=(17, 24)
S=(-37dbm, .., -77dbm), L=(1, 3)S=(-41dbm, .., -83dbm), L=(1, 4)…S=(-49dbm, .., -34dbm), L=(9, 10)S=(-61dbm, .., -28dbm), L=(15,22)S=(-66dbm, .., -26dbm), L=(17, 24)
S=(-33dbm, .., -82dbm), L=(1, 3)…S=(-57dbm, .., -63dbm), L=(10, 23)
Source devices have plentiful labeled data; the target device has only a few labeled data.
D-Link Buffalo CISCO
![Page 18: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/18.jpg)
Transferring Localization Models Across Devices [Zheng and Yang et al., AAAI2008a]
Model: Latent Multi-Task Learning [Caruana, MLJ1997]
Each device is a learning task: minimize its own localization error, while devices share common constraints in a latent space.
Regression from signals x to locations y.
18
Localization on each wireless adapter is treated as a learning task.
y_t = ⟨w_t, x⟩ + b_t, with w_t = w_0 + v_t, where w_0 is shared across the device tasks and v_t is device-specific.
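Read as a formula, the slide's latent multi-task model fits, for each device t, a regression with a shared weight component w_0 and a device-specific deviation v_t. The following is a minimal alternating-ridge sketch of that decomposition only; the bias term is dropped for brevity, the regularization weights are made-up values, and the latent feature mapping of the actual AAAI 2008 model is not reproduced.

```python
import numpy as np

def latent_mtl_fit(tasks, lam_shared=0.1, lam_task=1.0, iters=50):
    """Fit y_t ~ X_t @ (w0 + v_t) for each task t, with w0 shared.

    tasks: list of (X_t, y_t) arrays. Alternates two ridge solves:
    per-task parts v_t given w0, then the shared part w0 given all v_t.
    A small lam_shared lets w0 absorb what the devices have in common,
    while lam_task keeps the device-specific deviations v_t small.
    """
    d = tasks[0][0].shape[1]
    w0 = np.zeros(d)
    vs = [np.zeros(d) for _ in tasks]
    for _ in range(iters):
        # Solve each task-specific v_t against the residual left by w0.
        for t, (X, y) in enumerate(tasks):
            r = y - X @ w0
            vs[t] = np.linalg.solve(X.T @ X + lam_task * np.eye(d), X.T @ r)
        # Solve the shared w0 against the residuals left by all v_t.
        A = lam_shared * np.eye(d)
        b = np.zeros(d)
        for t, (X, y) in enumerate(tasks):
            A += X.T @ X
            b += X.T @ (y - X @ vs[t])
        w0 = np.linalg.solve(A, b)
    return w0, vs
```

Because both steps minimize the same convex objective, this coordinate descent converges; the learned w0 carries signal structure common to all adapters, which is what makes the few target-device labels sufficient.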
![Page 19: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/19.jpg)
Transferring Localization Models Over Time [Zheng and Yang et al., AAAI2008b]
20
S=(-30dbm, .., -86dbm), L=(1, 3)
S=(-44dbm, .., -43dbm)L=(9, 10)
S=(-60dbm, .., -29dbm)L=(17, 24)
S=(-33dbm, .., -82dbm), L=(1, 3)…S=(-57dbm, .., -63dbm), L=(10, 23)
S=(-42dbm, .., -77dbm)
S=(-43dbm, .., -52dbm)
S=(-71dbm, .., -33dbm)
S=(-49dbm, .., -41dbm)L=(1, 3)
Input:
– The old time period: plentiful labeled sequences
– The new time period: some (non-sequential) labeled data + some unlabeled sequences
Output: the localization model for the new time period.
PhD Student Vincent Zheng @ Work
![Page 20: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/20.jpg)
Transferring Localization Models Over Time [Zheng and Yang et al., AAAI2008b]
Model: Transferred Hidden Markov Model
21
Reference points (RPs)
Radio map
Transition matrix of user moves
Prior knowledge on the likelihood of where the user is
| | Transfer | No transfer |
|---|---|---|
| Accuracy under 3m error distance | 85% | 73% |
![Page 21: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/21.jpg)
Transferring Localization Models Across Space [S. J. Pan and Yang et al., AAAI2008]
22
Input:
– Area A: plentiful labeled data (red dots in the picture)
– Area B: few labeled data & some unlabeled data
Output: localization model for Area B
![Page 22: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/22.jpg)
Summary: Localization using Sensors
• Research Issues
– Optimal sensor placement [Krause, Guestrin, Gupta, Kleinberg, IPSN2006]
– Integrated propagation and learning models
– Sensor fusion
– Transfer learning
– Location-based social networks
24
Locations
◦ 2D / 3D physical positions
◦ Locations are a type of context
Other contextual information
◦ Object context: nearby objects + usage status
Locations and Context: where you are, who's around you, how long you are there, status of objects (door open?), what the temperature is like
![Page 23: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/23.jpg)
Activity Recognition
• Action Model Learning
– How do we explicitly model the user's possible actions?
• Activity Recognition
– What is the user doing / trying to do?
• Localization and Context
– Where is the user?
– What's around her?
– How long / duration?
– What time / day?
25
Events
![Page 24: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/24.jpg)
Steps in activity recognition
26
sensors → Loc/Context Recognition → Action Recognition → Goal Recognition
• Also: Plan, Behavior, Intent, Project …
![Page 25: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/25.jpg)
Activity Recognition: Input & Output
• Input:
– Context and locations
• Time, history, current/previous locations, duration, speed
• Object usage information
– Trained AR model
• Training data from calibration
• Calibration tool: VTrack
• Output:
– Predicted activity labels
• Running? Walking? Tooth brushing? Having lunch?
27
http://www.cse.ust.hk/~vincentz/Vtrack.html
![Page 26: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/26.jpg)
Activity Recognition: Applications
• GPS-based location-based services
– Inferring transportation modes/routines [Liao, Fox, Kautz, AAAI2004]
– Unsupervised; bridges the gap between raw GPS and the user's mode of transportation
– Can detect when the user misses a bus stop and offer help
• Healthcare for elders
– Example: the Autominder system [Pollack et al., Robotics and Autonomous Systems, 2003]
– Provides users with reminders when they need them
• Recognizing activities with cell phones (video)
– Chinese Academy of Sciences (Prof. Yiqiang Chen and Dr. Junfa Liu)
28
![Page 27: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/27.jpg)
Microsoft Research Asia: GeoLife Project [Zheng, Xie, WWW2008]
• Inferring transportation modes, and
• Computing similarity based on itineraries to link people in a social net: GeoLife video
29
Segment[i-1] (Car): P(Car)=75%, P(Bus)=10%, P(Bike)=8%, P(Walk)=7%
Segment[i] (Walk): P(Bike)=62%, P(Walk)=24%, P(Bus)=8%, P(Car)=6%
Segment[i+1] (Bike): P(Bike)=40%, P(Walk)=30%, P(Bus)=20%, P(Car)=10%
Segment[i].P(Bike) = Segment[i].P(Bike) * P(Bike|Car)
Segment[i].P(Walk) = Segment[i].P(Walk) * P(Walk|Car)
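The update above reweights each segment's mode probabilities by the probability of transitioning into that mode from the previous segment's mode. A small sketch of that step; the transition-matrix values and the final renormalization are illustrative assumptions, not numbers from the GeoLife paper.

```python
def smooth_segment(probs, prev_mode, transition):
    """Multiply each candidate mode's probability by the probability of
    transitioning into it from the previous segment's mode, as in
    Segment[i].P(Bike) = Segment[i].P(Bike) * P(Bike | Car),
    then renormalize so the result is a distribution again."""
    weighted = {m: p * transition[prev_mode][m] for m, p in probs.items()}
    total = sum(weighted.values())
    return {m: w / total for m, w in weighted.items()}

# Segment[i] probabilities from the slide; hypothetical transitions out
# of "Car" (walking after parking is common, switching straight from a
# car to a bike is not).
seg_i = {"Bike": 0.62, "Walk": 0.24, "Bus": 0.08, "Car": 0.06}
transition = {"Car": {"Bike": 0.05, "Walk": 0.45, "Bus": 0.20, "Car": 0.30}}
smoothed = smooth_segment(seg_i, "Car", transition)
```

With these assumed transitions, the raw top guess "Bike" is demoted and "Walk" is promoted, which matches the slide's labeling of Segment[i] as Walk.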
![Page 28: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/28.jpg)
Activity Recognition (AR): ADL
• ADL = Activities of Daily Living
• From sound to events in everyday life [Lu and Choudhury et al., MobiSys2009]
• iCare (NTU): digital home support, early diagnosis of behavior changes
– iCare project at NTU (Hao-hua Chu, Jane Hsu, et al.): http://mll.csie.ntu.edu.tw/icare/index.php
• Duration patterns and inherent hierarchical structures [Duong, Bui et al., AI Journal 2008]
30
![Page 29: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/29.jpg)
Early Work: Plan Recognition
• Objective [Kautz 1987]:
– Inferring the plans of an agent from (partial) observations of his actions
– Input:
• Observed actions (K, L)
• Plan library
– Output:
• Recognized goals/plans
31
![Page 30: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/30.jpg)
Review: Event Hierarchy in Plan Recognition
• The Cooking Event Hierarchy [Kautz 1987]
• Some works:
– [Kautz 1987]: graph inference
– [Pynadath and Wellman, UAI2000]: probabilistic CFG
– [Geib and Steedman, IJCAI2007]: NLP and plan recognition
– [Geib, ICAPS2008]: string-rewriting techniques
32
Abstraction relationship
Actions
Step 2 of Make Pasta Dish
![Page 31: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/31.jpg)
A Gap?
33
![Page 32: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/32.jpg)
AR: Sequential Methods
• Dynamic Bayesian Networks [Liao, Fox, Kautz, AAAI2004] [Yin, Chai, Yang, AAAI2004]
• Conditional Random Fields [Vail and Veloso, AAAI2008]
• Relational Markov Networks [Liao, Fox, Kautz, NIPS2005]
34
![Page 33: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/33.jpg)
Intel [Wyatt, Philipose, Choudhury, AAAI2005] : Incorporating Commonsense
• Model = commonsense knowledge
– Work at Intel Seattle Lab / UW
– Calculate object-usage information from Web data: P(Obj | Action)
– Train a customized model
• HMM: parameter learning [Wyatt et al., AAAI2005]
• Mine the model from the Web [Perkowitz, Philipose et al., WWW2004]
35
![Page 34: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/34.jpg)
Datasets: MIT PlaceLab http://architecture.mit.edu/house_n/placelab.html
• MIT PlaceLab Dataset (PLIA2) [Intille et al. Pervasive 2005]
• Activities: Common household activities
36
![Page 35: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/35.jpg)
Datasets: Intel Research Lab
• Intel Research Lab [Patterson, Fox, Kautz, Philipose, ISWC2005]
– Activities performed: 11 activities
– Sensors: RFID readers & tags
– Length: 10 mornings
37
Picture excerpted from [Patterson, Fox, Kautz, Philipose, ISWC2005].
Now: Intel has better RFID wristbands.
![Page 36: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/36.jpg)
Complex Actions? Reduce Labels?
Complex actions: multiple activities with complex relationships [Hu and Yang, AAAI2008]
◦ Concurrent and interleaving activities
Label reduction: what if we are short of labeled data in a new domain? [Zheng, Hu, Yang, et al., Ubicomp 2009]
◦ Use transfer learning to borrow knowledge from a source domain (where labeled data are abundant)
◦ For recognizing activities where labeled data are scarce
38
![Page 37: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/37.jpg)
Concurrent and Interleaving Goals [Hu, Yang, AAAI2008]
39
Concurrent Activities
Interleaving Activities
![Page 38: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/38.jpg)
Concurrent and Interleaving Goal and Activity Recognition [Hu, Yang, AAAI2008]
40
Use the long-distance dependencies in Skip-Chain Conditional Random Fields to capture the relatedness between interleaving activities.
Factors for linear-chain edges
Factors for skip edges
![Page 39: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/39.jpg)
Concurrent and Interleaving Goal and Activity Recognition [Hu, Yang, AAAI2008]
| | Our Approach | Only Concurrent | Only Interleaving |
|---|---|---|---|
| MIT PlaceLab Dataset | 86% | 73% | 80% |
41
Concurrent goals: a correlation matrix S between different goals, learned from training data. Example: "attending invited talk" and "browsing WWW".

S (lower triangular):
1
0.32  1
0.93  0.27  1
0.48  0.13  0.72  1
![Page 40: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/40.jpg)
Cross Domain Activity Recognition [Zheng, Hu, Yang, Ubicomp 2009]
• Challenges:
– A new domain of activities without labeled data
• Cross-domain activity recognition:
– Transfer some available labeled data from source activities to help train the recognizer for the target activities
42
CleaningIndoor
Laundry
Dishwashing
![Page 41: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/41.jpg)
Calculating Activity Similarities
How similar are two activities?
◦ Use Web search results
◦ TF-IDF: traditional IR similarity metrics (cosine similarity)
◦ Example: mined similarity between the activity "sweeping" and "vacuuming", "making the bed", and "gardening"
(Chart: calculated similarity with the activity "Sweeping")
43
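The TF-IDF cosine computation named above can be sketched as follows. The toy word lists and the smoothed IDF formula are illustrative assumptions; the paper mines Web search results for each activity name rather than using hand-made documents.

```python
import math
from collections import Counter

def tfidf_cosine(doc_a, doc_b, corpus):
    """Cosine similarity between the TF-IDF vectors of two word lists.

    corpus: list of word lists used to estimate document frequencies.
    Words occurring in every document get zero IDF weight, so only
    distinctive shared vocabulary contributes to the similarity.
    """
    n = len(corpus)
    df = Counter()
    for doc in corpus:
        df.update(set(doc))

    def vec(doc):
        tf = Counter(doc)
        return {w: tf[w] * math.log((1 + n) / (1 + df[w])) for w in tf}

    va, vb = vec(doc_a), vec(doc_b)
    dot = sum(va[w] * vb.get(w, 0.0) for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

In this scheme "sweeping" and "vacuuming" pages would score high because they share distinctive cleaning vocabulary, while "sweeping" and "gardening" share little and score near zero.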
![Page 42: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/42.jpg)
How to use the similarities?
44
Source Domain Labeled Data: <Sensor Reading, Activity Name>, e.g. <SS, "Make Coffee">
↓
Similarity Measure (mined from the Web), e.g. sim("Make Coffee", "Make Tea") = 0.6
↓
Target Domain Pseudo-Labeled Data, e.g. <SS, "Make Tea", 0.6>
↓
Weighted SVM Classifier
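In code, the pseudo-labeling step of this pipeline might look like the following sketch. The threshold value, the data shapes, and the dictionary layout are assumptions; the final weighted SVM training, where the similarity scores become instance weights, is left out.

```python
def make_pseudo_labels(source_data, similarities, threshold=0.3):
    """Turn labeled source-domain examples into weighted pseudo-labeled
    target-domain examples: each (sensor_reading, source_activity) pair
    becomes (sensor_reading, target_activity, weight) for every target
    activity whose Web-mined similarity clears the threshold.
    """
    pseudo = []
    for reading, src_activity in source_data:
        for tgt_activity, sim in similarities.get(src_activity, {}).items():
            if sim >= threshold:
                pseudo.append((reading, tgt_activity, sim))
    return pseudo

# The slide's example: <SS, "Make Coffee"> with
# sim("Make Coffee", "Make Tea") = 0.6 yields the pseudo training
# datum <SS, "Make Tea", 0.6>; dissimilar activities are dropped.
pseudo = make_pseudo_labels(
    [("SS", "Make Coffee")],
    {"Make Coffee": {"Make Tea": 0.6, "Do Laundry": 0.1}},
)
```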
![Page 43: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/43.jpg)
Cross-Domain AR: Performance

| Dataset | Mean Accuracy with Cross-Domain Transfer | # Activities (Source Domain) | # Activities (Target Domain) | Baseline (Random Guess) |
|---|---|---|---|---|
| MIT Dataset (Cleaning to Laundry) | 58.9% | 13 | 8 | 12.5% |
| MIT Dataset (Cleaning to Dishwashing) | 53.2% | 13 | 7 | 14.3% |
| Intel Research Lab Dataset | 63.2% | 5 | 6 | 16.7% |
45
Source- and target-domain activity sets were generated over ten random trials; mean accuracies are reported.
![Page 44: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/44.jpg)
How Does AR Impact AI?
• Action Model Learning
– How do we explicitly model the user's possible actions?
• Activity Recognition
– What is the user doing / trying to do?
• Localization
– Where is the user?
46
![Page 45: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/45.jpg)
Relationship to Localization and AR
• From context:
– state descriptions from sensors
• From activity recognition:
– activity sequences
• Learning action models
• Motivation:
– solve new planning problems
– reduce the knowledge-engineering effort for planning
• Can even recognize goals using planning [Ramirez and Geffner, IJCAI2009]
47
![Page 46: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/46.jpg)
What is action model learning?
Input: activity sequences
◦ Sequences of labels/objects. Example: pick-up(b1), stack(b1,b2), etc.
◦ Initial state, goal, and partial intermediate states. Example: ontable(b1), clear(b1), etc.
Output: action models
◦ Preconditions of actions. Example: preconditions of "pick-up": ontable(?x), handempty, etc.
◦ Effects of actions. Example: effects of "pick-up": holding(?x), etc.
TRAIL [Benson, ICML1994]: learns Teleo-operator models (TOP) with domain experts’ help.
EXPO [Gil, ICML1994]: learns action models incrementally by assuming partial action models known.
Probabilistic STRIPS-like models [Pasula et al. ICAPS2004]: learns probabilistic STRIPS-like operators from examples.
SLAF [Amir, IJCAI2005]: learns exact action models in partially observable domains.
48
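As a toy illustration of the input/output above (this is not ARMS or any of the systems listed; ARMS encodes such regularities as weighted constraints and solves them with MAX-SAT): a naive learner can propose candidate preconditions for an action by intersecting the predicates observed to hold immediately before each of its occurrences.

```python
def propose_preconditions(traces):
    """Propose candidate preconditions per action by intersection.

    traces: list of plan traces, each a list of (state, action) pairs,
    where state is the set of predicate names holding just before the
    action executes.  A predicate survives as a candidate precondition
    of action a only if it held before every observed occurrence of a.
    """
    candidates = {}
    for trace in traces:
        for state, action in trace:
            if action in candidates:
                candidates[action] &= set(state)  # keep only always-true
            else:
                candidates[action] = set(state)   # first occurrence
    return candidates
```

More observed traces prune the candidate sets further; weighting and solving constraints, as ARMS does, additionally handles noisy and partially observed states that defeat a plain intersection.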
![Page 47: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/47.jpg)
ARMS [Yang et al., AIJ2007]: An Overview

Activity sequences, plus sensor states and object usage, are used to build constraints:
• Information constraints: what can be in the preconditions/postconditions
• Plan constraints
The constraints are solved with weighted MAX-SAT / MLN to produce action models.
Each relation has a weight that can be learned.
49
![Page 48: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/48.jpg)
Evaluation by Students @ HKUST
Lego-Learning-Planning (LLP) system design: execute the learned actions
50
Control Command
Robot Status/ Data
Internet
Activity recognition & planning
Robot
PDA
Web Server
Notebook
Bluetooth
![Page 49: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/49.jpg)
A Lego Planning Domain
Relations, given by sensors / the physical map:
◦ (motor_speed), (empty), …, (across x-loc y-loc z-loc)
Actions, known to the robot:
◦ (Move_forw x-loc y-loc z-loc), …, (Turn_left x-loc y-loc z-loc)
Initial state:
◦ (empty) (face grid0) …
Goal:
◦ … (holding Ball)
Collection of activity sequences (video 1: robot; video 2: human)
51
![Page 50: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/50.jpg)
Activity Sequences
A human manually achieves the goal:
◦ 0: (MOVE_FORW A B C)
◦ …
◦ 4: (MOVE_FORW D E F)
◦ 5: (MOVE_FORW E F W)
◦ 6: (STOP F)
◦ 7: (PICK_UP F BALL)
◦ …
◦ 10: (STOP D)
◦ 11: (TURN_LEFT D W E)
◦ 12: (PUT_DOWN BALL D)
◦ 13: (PICK_UP D BALL)
52
Activity Recognizer
ARMS: Action Model Learning
![Page 51: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/51.jpg)
Learned Action Models
• (:action Stop
    :parameters (?x - loc)
    :precondition (and (motor_speed) (is_at ?x))
    :effect (and (empty) (face ?x) (not (motor_speed))))
• …

53
This is an error.
Dr. Hankz Hankui Zhuo @ HKUST
![Page 52: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/52.jpg)
LLP Solves a new Lego Planning Problem
• (:init (empty) (face B) (is_at A) (at ball F) … (across C D W) (across D E F) (across E F W))
• (:goal (and (is_at F) (holding Ball)))
54
Grid map: cells A, B, C, D, E, F, W. Init: the robot is at A, facing B; the ball is at F. Goal: the robot is at F, holding the ball.
![Page 53: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/53.jpg)
Solving the Planning Problem
Generate a plan using a planner:
◦ 0: (MOVE_FORW A B C)
◦ …
◦ 4: (MOVE_FORW D E F)
◦ 5: (MOVE_FORW E F W)
◦ 6: (STOP F)
◦ 7: (PICK_UP F BALL)
◦ …
◦ 10: (STOP D)
◦ 11: (TURN_LEFT D W E)
◦ 12: (PUT_DOWN BALL D)
◦ 13: (PICK_UP D BALL)
◦ 14: (MOVE_FORW D E F)
◦ 15: (MOVE_FORW E F W)

Execution:
◦ Sometimes it succeeds!
◦ But sometimes it fails…
More feedback on learning required
55
![Page 54: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/54.jpg)
Closing the Loop in the Knowledge Food Chain
Close the feedback loop:
1. Signal traces collected
2. Location, context, and activities predicted
3. Action models learned
4. New plan generated and executed
5. Errors found
6. Human intervention to correct plans
7. New plans
End loop
56
![Page 55: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/55.jpg)
Activity Recognition: Truly Multidisciplinary
57
• Computer Networks
• Pervasive/Ubiquitous Computing
• Logistics
• Intelligent Transportation
• Urban Design and Planning
• Health Informatics / Public Health
• Mobile Commerce/Services
• Mobile Social Nets
• Geographical Information Systems
• HCI, Data Mining
• Software Engineering
• Computer Graphics
![Page 56: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/56.jpg)
Open issues in Activity Recognition
• User privacy [Klasnja, Consolvo, Choudhury, et al., Pervasive2009]
• False positives
– Cost-sensitive applications
• Market study
– Business models?
• Many users
– Mobile social networks
– Multi-person AR (cooperation? competition?)
• Transfer learning
– Between users
– Between activities
– Between different types of sensors
58
![Page 57: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/57.jpg)
Conclusions
Future:
– Cheaper and more ubiquitous sensors will bring a new era for AI through activity recognition research and applications
• Acknowledgement …

Theme of the Talk
• Activity Recognition:
– What it is
– Linking low-level sensors to high-level intelligence
– Closed loop
• Activity recognition as Embedded AI:
– Empirical in nature
– Research on a very limited budget
59
![Page 58: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/58.jpg)
Theme of The Talk
60
![Page 59: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/59.jpg)
Acknowledgement
• Students (former) @ HKUST
– Vincent W. Zheng, Derek H. Hu, Hankz H. Zhuo, Sinno J. Pan
– Jie Yin, Dou Shen, Jeffrey J. Pan, Rong Pan
• Collaborators
– Drs. Junhui Zhao and Yongcai Wang (NEC China Lab)
– Drs. Yiqiang Chen and Junfa Liu (CAS)
– Drs. Xing Xie and Yu Zheng (MSRA)
61
![Page 60: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/60.jpg)
References (Localization)
• [Bahl and Padmanabhan, CCC2000] RADAR: An in-building RF-based user location and tracking system.
• [Caruana, MLJ1997] Multi-task Learning.
• [Ferris, Fox and Lawrence, IJCAI2007] WiFi-SLAM using Gaussian process latent variable models.
• [Fox and Hightower et al., PervasiveComputing2003] Bayesian filtering for location estimation.
• [Krause, Guestrin, Gupta, Kleinberg, IPSN2006] Near-optimal sensor placements: maximizing information while minimizing communication cost.
• [Ladd et al., MobiCom2002] Robotics-based location sensing using wireless Ethernet.
• [Ni et al., PerCom2003] LANDMARC: indoor location sensing using active RFID.
• [J. J. Pan and Yang et al., IJCAI2005] Accurate and low-cost location estimation using kernels.
• [J. J. Pan and Yang et al., AAAI2006] A Manifold Regularization Approach to Calibration Reduction for Sensor-Network Based Tracking.
• [R. Pan, Zheng, Yang et al., KDD2007] Domain-Constrained Semi-Supervised Mining of Tracking Models in Sensor Networks.
• [S. J. Pan and Yang et al., AAAI2008] Transferring Localization Models Across Space.
62
![Page 61: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/61.jpg)
References (Localization)
• [Stuntebeck and Patel et al., Ubicomp2008] Wideband powerline positioning for indoor localization.
• [Zheng and Yang et al., AAAI2008a] Transferring Multi-device Localization Models using Latent Multi-task Learning.
• [Zheng and Yang et al., AAAI2008b] Transferring Localization Models Over Time.
63
![Page 62: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/62.jpg)
References (Activity Recognition)
• [Bao and Intille, Pervasive2004] Activity Recognition from User-Annotated Acceleration Data.
• [Bui et al., AAAI2008] The Hidden Permutation Model and Location-Based Activity Recognition.
• [Chiang and Hsu, IJCAI2009] Probabilistic Models for Concurrent Chatting Activity Recognition.
• [Choudhury and Basu, NIPS2004] Modeling Conversational Dynamics as a Mixed-Memory Markov Process.
• [Kautz 1987] A Formal Theory of Plan Recognition.
• [Geib and Steedman, IJCAI2007] On Natural Language Processing and Plan Recognition.
• [Geib et al., ICAPS2008] A New Probabilistic Plan Recognition Algorithm Based on String Rewriting.
• [Hu, Yang, AAAI2008] CIGAR: Concurrent and Interleaving Goal and Activity Recognition.
• [Klasnja, Consolvo, Choudhury et al., Pervasive2009] Exploring Privacy Concerns about Personal Sensing.
• [Liao, Fox, Kautz, AAAI2004] Learning and Inferring Transportation Routines.
64
![Page 63: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/63.jpg)
References (Activity Recognition)
• [Liao, Fox, Kautz, NIPS2005] Location-Based Activity Recognition.
• [Lu and Choudhury et al., MobiSys2009] SoundSense: scalable sound sensing for people-centric applications on mobile phones.
• [Patterson, Fox, Kautz, Philipose, ISWC2005] Fine-Grained Activity Recognition by Aggregating Abstract Object Usage.
• [Pollack et al., 2003] Autominder: an intelligent cognitive orthotic system for people with memory impairment.
• [Pynadath and Wellman, UAI2000] Probabilistic State-Dependent Grammars for Plan Recognition.
• [Vail and Veloso, AAAI2008] Feature Selection for Activity Recognition in Multi-Robot Domains.
• [Wyatt, Philipose and Choudhury, AAAI2005] Unsupervised Activity Recognition Using Automatically Mined Common Sense.
• [Yin, Chai, Yang, AAAI2004] High-level Goal Recognition in a Wireless LAN.
• [Zheng, Hu, Yang, Ubicomp2009] Cross-Domain Activity Recognition.
• [Zheng, Xie, WWW2008] Learning transportation mode from raw GPS data for geographic applications on the web.
65
![Page 64: Activity Recognition: Linking Low-level Sensors to High-level Intelligence](https://reader036.fdocuments.net/reader036/viewer/2022081514/5681584a550346895dc5a0f3/html5/thumbnails/64.jpg)
References (Action Model Learning)
• [Amir, IJCAI2005] Learning Partially Observable Deterministic Action Models.
• [Benson, ICML1994] Inductive Learning of Reactive Action Models.
• [Gerevini, AIPS2002] A Planner Based on Local Search for Planning Graphs with Action Costs.
• [Gil, ICML1994] Learning by Experimentation: Incremental Refinement of Incomplete Planning Domains.
• [Pasula et al., ICAPS2004] Learning Probabilistic Planning Rules.
• [Yang et al., AIJ2007] Learning action models from plan examples using weighted MAX-SAT.
• [Zhuo et al., PAKDD2009] Transfer Learning Action Models by Measuring the Similarity of Different Domains.
66