Published in: Proc. SPIE 9116, Next-Generation Robots and Systems, 4 June 2014

Needs and Emerging Trends of Robotic Sensing

Michael McNair*a

aUniversity of Texas at Arlington Research Institute, 7300 Jack Newell Blvd., S., Fort Worth, TX, USA 76118

*[email protected]; phone 1 817 272-5852; fax 1 817 272-5952; uta.edu

ABSTRACT

From the earliest need to be able to see an enemy over a hill to sending semi-autonomous platforms with advanced sensor packages out into space, humans have wanted to know more about what is around them. Issues of distance are being minimized through advances in technology to the point where remote control of a sensor is useful but sensing by way of a non-collocated sensor is better.

We are not content to just sense what is physically nearby. However, it is not always practical or possible to move sensors to an area of interest; we must be able to sense at a distance. This requires not only new technologies but new approaches; our need to sense at a distance is ever changing with newer challenges. As a result, remote sensing is not limited to relocating a sensor but is expanded into possibly deducing or inferring from available information.

Sensing at a distance is the heart of remote sensing. Much of the sensing technology today is focused on analysis of electromagnetic radiation and sound. While these are important and the most mature areas of sensing, this paper seeks to identify future sensing possibilities by looking beyond light and sound. By drawing a parallel to the five human senses, we can then identify the existing and some of the future possibilities. A further narrowing of the field of sensing causes us to look specifically at robotic sensing. It is here that this paper will be directed.

Keywords: Overview, sensing, robotic sensing, analysis

1. OVERVIEW OF SENSING

1.1 Definitions and Context

Before focusing on robotic sensing it is important to have a general description of some of the key terms and concepts. There are a number of key components in sensing as an activity. To begin with, there are the objects to be sensed. These exist in any number of states and configurations and are positioned and oriented uniquely at a given time. Even so, these sensed objects (or targets) can be classified by the amount of incident radiation being absorbed, reflected, or transmitted. Two forms of reflected energy can be identified: specular reflection and diffuse reflection. Targets with reflective sensing can be considered to be either passive or active. For clarity, absorbed energy is demonstrated by the containment of incident energy, transmitted energy is passed through the target, and reflected energy is redirected from the target. Specular reflection occurs when a surface reflects almost all of the incident energy in a single direction, whereas diffuse reflection occurs when the incident energy is reflected almost uniformly in all directions. Passive sensing refers to redirected energy from a source not particularly involved with the sensing system, whereas active sensing refers to redirected energy from a source that is a part of the sensing system. Figures 1 and 2 provide a visual representation of these characteristics.
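Because a target partitions incident energy among absorption, reflection, and transmission, the three fractions must sum to one. The helper below is a hypothetical illustration (the function name and labels are my own, not the paper's) of classifying a target by its dominant interaction:

```python
def classify_target(absorbed: float, reflected: float, transmitted: float) -> str:
    """Label a target by its dominant interaction with incident energy.

    The arguments are fractions of the incident energy; by conservation
    of energy they are assumed to sum to 1.
    """
    total = absorbed + reflected + transmitted
    if abs(total - 1.0) > 1e-9:
        raise ValueError(f"fractions must sum to 1, got {total}")
    # Pick the interaction carrying the largest share of the energy.
    labels = [("absorbing", absorbed), ("reflective", reflected),
              ("transmissive", transmitted)]
    return max(labels, key=lambda pair: pair[1])[0]
```

A shiny metal plate, for example, might be modeled as `classify_target(0.1, 0.8, 0.1)`, coming out "reflective"; no real target is purely one class, as the next paragraph notes.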

Next-Generation Robots and Systems, edited by Dan O. Popa, Muthu B. J. Wijesundara, Proc. of SPIE Vol. 9116, 91160A · © 2014 SPIE · CCC code: 0277-786X/14/$18 · doi: 10.1117/12.2058278


Figure 1, Depiction of Reflection, Transmission and Absorption. (T and E stand for target and emitter respectively.)

Figure 2, Depiction of Passive and Active Sensing. (T, E and S stand for target, emitter and sensor system respectively.)

In reality, these characteristics rarely occur in isolation. In other words, a single target may exhibit more than one of these characteristics at any given time. For example, there are no perfect mirror surfaces (specular reflection) and no object completely absorbs all radiation. Even with this in mind, there are other classes of effects that prevent what would otherwise be a clear and unobstructed view of a target object via a sensor. Many of these can be classified as obscurants and countermeasures, or effects that interfere with either the radiation of incident energy or the receipt of reflected energy.

With the addition of the effects of obscurants and countermeasures to surface characteristics, a single target may itself take on combinations of each of the effects listed above. For example, the diffuse reflection from a target may itself appear as an emitter in passive sensing. Scattering of light through smoke, dust, or fog also creates a condition of diffuse reflection.

1.2 Sensor Classification

Electromagnetic Sensing

There are many different kinds of sensors on the commercial market and in research labs. Many of these sensors can be categorized by the information they provide. Three basic forms of information are available from sensors: spatial information, intensity information, and spectral information. Combinations of these information types then form the basis for sensor classes.

Likewise, there are a number of sensor characteristics, but even here there is a sense of organization to the types of sensors. At a first level, sensors can be typed as either active or passive. Passive sensors rely on energy already existing in the sensing area, whereas active sensors emit the energy they then use in sensing. A familiar example is a camera: under ordinary lighting a camera is usually thought of as a passive sensor; however, under low lighting conditions a flash attachment may be used, and with the flash in use the camera is an active sensor.
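The camera example can be sketched as a toy model (my own illustration, not from the paper): the active/passive classification follows directly from whether the sensor supplies its own energy.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    """Toy model of the camera example: the same device is passive or
    active depending on whether it emits the energy it senses."""
    flash_enabled: bool = False

    @property
    def mode(self) -> str:
        # A flash supplies the scene's illumination, making the sensor
        # active; otherwise the camera relies on ambient light (passive).
        return "active" if self.flash_enabled else "passive"
```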


Passive sensors can be subdivided into scanning and non-scanning. Either of these can be imaging sensors, but non-scanning sensors can also be non-imaging. Active sensors can likewise be subdivided into scanning and non-scanning, where non-scanning active sensors are non-imaging and scanning active sensors are imaging. Imaging sensors create a plot or even a visual layout (for example, a picture) of a sensed area, while non-imaging sensors accumulate the sensed energy and provide an overall characteristic value representative of the energy. A scanning sensor moves either the emitter or the detection system components to acquire more data than would be available in a static, or non-scanning, configuration. This motion brings the useful capability of creating a 3-dimensional representation.
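The subdivision above can be summarized as a small lookup table. The encoding below is my own paraphrase of the taxonomy, not notation from the paper:

```python
# Each (energy source, scanning behavior) pair maps to the output types
# the taxonomy described above allows for that class of sensor.
SENSOR_TAXONOMY = {
    ("passive", "scanning"): {"imaging"},
    ("passive", "non-scanning"): {"imaging", "non-imaging"},
    ("active", "scanning"): {"imaging"},
    ("active", "non-scanning"): {"non-imaging"},
}

def possible_outputs(source: str, scanning: str) -> set:
    """Return the output types available to a given sensor class."""
    return SENSOR_TAXONOMY[(source, scanning)]
```

For example, `possible_outputs("active", "non-scanning")` yields only the non-imaging class, matching the rule stated in the text.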

Non-Electromagnetic Sensing

While electromagnetic technologies have yielded a wide variety of sensing devices, there are a number of additional sensing technologies available. As a simple step toward understanding this more complete world of sensing, it is useful to review analogies to the five human senses: sight, smell, taste, hearing, and touch. The sensing associated with sight is best paralleled with the description above of electromagnetic-based sensing and as such will not be further examined.

The sense of touch parallels the ability to measure forces. A number of devices exist to measure force, many times as measurements of weight. Force can be measured in terms of pressure or resistance to force. A variety of technologies and techniques have been created to measure force, ranging from simple spring devices to torque sensors and the like.

Sensors that are analogous to the sense of smell include devices like gas detectors. These generally rely on vapors and the interaction of these vapors across various materials. Vapors are particularly convenient to detect, as opposed to solids or liquids, since the molecules can be moved across sensing devices. This movement facilitates the detection outcome and aids in increased sensitivity.

While this may seem loosely related, the sense of taste is analogous to detection of chemicals, generally in liquids but also in gases and solids. By analyzing a sample liquid, for example, it may be possible to devise a sensor that can identify the components of that liquid and report on not only their existence but also their concentration.

Finally, the sense of hearing lends itself to an analogous collection of audio devices that are, and have been, common on the commercial market for a very long time. Microphones, whether for general audio or made specifically for selected frequency bands and environments, are the key sensor. Microphones are available for a wide variety of applications and are too numerous to list completely here.

2. ROBOTIC SENSING TECHNOLOGIES

2.1 Overview

Before identifying some of the key robotic sensing systems and methods it is important to develop an understanding of what constitutes robotics. In developing a definition of robotics it is important to understand that specialized definitions have been developed for particular applications. For example, ISO 8373 defines an industrial robot as “an automatically controlled, reprogrammable, multipurpose manipulator programmable in three or more axes, which may be either fixed in place or mobile for use in industrial automation applications.” Many experts in the field have provided their own definitions. For the purpose of this paper, we will need a general definition that identifies some of the key attributes of robotic technologies.

Robots are not just automation devices, since automation is a mechanization of tasks that can be repeated. This mechanization can also be configurable, as in settings on a clothes washer. Automation devices sometimes include sensing systems that facilitate their operation. Depending on what is measured through sensors, operation of the device may then be dynamically configured per a pre-configured set of tasks. Robots move beyond this by including decision making in their operation. There may be a high degree of automation with sensing (usually multiple sensors), which when combined provide the information needed to support autonomous decision making.

Robots are implemented utilizing a set of relevant technologies. A sample decomposition of a robotic system is shown in Figure 3.


Figure 3, Generic robot architecture. (The figure depicts Command and Control, Communication, Cooperative/Swarm, Autonomy, Human-Robot Interaction, Navigation, Manipulation, Sensing, Payloads, Perception/Planning, Mobility, Locomotion, Manipulators & End Effectors, and Sensors.)

Using this as a representative architecture model, we begin to see the role of sensors in robotics and specifically across the key robotic technologies. While sensing can occur across all of the subsystems of a robotic system, it is primarily utilized in support of the human-robotic interface (HRI), navigation, manipulation, and payloads. Each of these subsystems operates in a cooperative fashion to support autonomy of the device itself. An additional layer of specialized autonomy includes collaborative or swarm behaviors. While sensors are used in support of this function, this paper will only concentrate on functions that are resident to a single robot and are used by that robot.

2.2 Specific Robotic Sensing Technologies

The sections below discuss key sensing technologies in three specific areas: human-robotic interface, navigation, and

manipulation.

2.2.1 Human-Robotic Interface Sensing Technologies

HRI exists generally to increase the effectiveness of human-robot operation. Robots can utilize proximity sensors, force sensors, visual cameras, haptic equipment, and other devices in order to support not only human control of robots but also feedback of robot status and operation to the human. Many devices have been created to support voice recognition, gesture control, and keyboard/joystick control. Even smart phones and tablets are providing advanced interfaces between humans and robots.

Status, feedback, and information will, of necessity, be provided to a human in terms of the five human senses. Sight is the most commonly used sense and is stimulated through displays, indicators, images, video, etc. Hearing is likely the next most often utilized sense; tones, speech, music, and other similar means are provided through a number of types of audio speakers and sound generators. Recently the sense of touch has been exploited through haptic devices, which provide sensory feedback typically through vibration and application of force. Touch devices can also be used to convey information to a human through temperature, although few devices utilize this specific option. The two remaining senses of taste and smell have been comparatively ignored, with apparently no significant commercial devices.

While the human senses are utilized for status, feedback, and information, this is not the case with human control of robots. Input and control devices have for some time typically been based on a keyboard/keypad and joystick/mouse/trackball. Recently, with the affordability of touch-sensitive devices, smart phones, tablets, and touch-sensitive screens are being used to augment a point-and-select style of interface. Additionally, voice commanding has matured to a point of being very reliable, where earlier techniques required “training” of the voice recognition algorithms. In the domain of assistive devices, in particular, additional means of user input are being explored. Use of the tongue, puffs of air, and other means are now possibilities. In short, any physical action that can be performed is a candidate means of effecting control of a device.

2.2.2 Navigation Sensing Technologies

Robotic navigation is best described on a scale from manual operation through total autonomy. The Autonomy Levels for Unmanned Systems (ALFUS) Framework, published through NIST, identifies a scale from 0 through 10 for autonomy. Level of Autonomy (LOA) 0 describes 100% human involvement in controlling a device, while LOA 10 describes full, intelligent autonomy. In this model, LOA 1-3 identifies a high level of HRI, low-level tactical behavior, and operations typically in a simple environment. LOA 4-6 identifies a mid-level HRI with mid-complexity functions operating in a moderate environment. LOA 7-9 identifies a low level of HRI, collaboration, and high complexity in a generally difficult environment.
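As a minimal sketch of this scale, the helper below maps an LOA value to the coarse bands summarized above; the band labels are shorthand for this discussion, not official ALFUS terminology.

```python
def alfus_band(loa: int) -> str:
    """Map an ALFUS Level of Autonomy (0-10) to a coarse descriptive band.

    The band names are informal summaries of the LOA groupings
    discussed in the text, not official ALFUS labels.
    """
    if not 0 <= loa <= 10:
        raise ValueError("LOA must be between 0 and 10")
    if loa == 0:
        return "fully manual"          # 100% human involvement
    if loa <= 3:
        return "high HRI, simple environment"
    if loa <= 6:
        return "mid HRI, moderate environment"
    if loa <= 9:
        return "low HRI, difficult environment"
    return "fully autonomous"          # full, intelligent autonomy
```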

Selection of the appropriate sensor(s) for navigation involves detection of obstacles and identification of navigable paths. Generally, the higher the fidelity of obstacle and path detection, the higher the level of autonomy that is possible. Many devices exist to facilitate this function, ranging from SONAR (Sound Navigation and Ranging) and RADAR (Radio Detection and Ranging) to LIDAR (Light Detection and Ranging). Additional technologies involving infrared cameras (IR, FLIR, etc.) and other means are also used to help in obstacle detection.
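SONAR, RADAR, and LIDAR all share the same time-of-flight principle: the range to a target is half the propagation speed multiplied by the round-trip echo time. A minimal sketch (the constants are standard approximate values, not figures from this paper):

```python
SPEED_OF_SOUND_AIR = 343.0      # m/s, approximate at 20 degrees C (SONAR)
SPEED_OF_LIGHT = 299_792_458.0  # m/s in vacuum (RADAR, LIDAR)

def time_of_flight_range(round_trip_s: float, speed_m_per_s: float) -> float:
    """Distance to a target from a round-trip echo time.

    The pulse travels out and back, so the one-way range is half of
    speed multiplied by the measured round-trip time.
    """
    return 0.5 * speed_m_per_s * round_trip_s
```

A 20 ms sonar echo, for instance, corresponds to an obstacle roughly 3.43 m away.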

Some of these same technologies also assist in obstacle identification. Whether it is the development of a 3D point cloud, recognition by thermal signature, or some other approach, objects in a field of view may not only be detected but also recognized. Depending on how objects are recognized, the navigation system on a robot may choose to avoid the object or ignore the object. This element of perception aids in increasing the autonomous capability of a robotic platform.

2.2.3 Manipulation Sensing Technologies

Most sensing involved with manipulator subsystems includes force sensors and/or cameras. Force sensors are, of necessity, made a part of the end effector in order to have a tactile input. For safety reasons, force sensors may also be included in the manipulator design. Cameras can be included in the manipulator, as a part of the end effector, or mounted on the manipulator host (usually the robot). Regardless of the mounting location, cameras provide the means to detect objects that must be avoided as well as objects to be manipulated.

One of the key challenges in manipulation systems involves the navigation of the end effector through a spatial volume. Path planning, obstacle avoidance, and other technologies usually thought of for navigation are equally applicable to manipulation. The complexity of multiple degrees of freedom afforded by multiple manipulator joints, and the addressability in space of the end effector, require special control and feedback techniques in order to provide precision and safety.
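To illustrate why addressing the end effector in space requires such care, consider the forward kinematics of the simplest planar two-link arm (a textbook construction, not a model from this paper): the end-effector position depends nonlinearly on every joint angle, and each added joint compounds the coupling.

```python
import math

def forward_kinematics_2dof(l1: float, l2: float,
                            theta1: float, theta2: float):
    """End-effector (x, y) for a planar two-link arm.

    l1, l2 are link lengths; theta1 is the shoulder angle and theta2
    the elbow angle measured relative to the first link (radians).
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

With both joints at zero the arm lies straight along the x-axis, so two unit links reach (2, 0); small changes in either angle move the tip along curved, coupled paths, which is what feedback control must compensate for.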

2.3 Data Analysis

With the push for more and more localized and on-board processing, there is an increasing need to design robotic subsystems that can not only sense but also make decisions based on inferred sensed data. This kind of work has been done for years in data correlation and data fusion systems. In essence, these systems take inputs from multiple sources and combine those data streams in such a way as to increase the reliability of the collected data, recognize objects not necessarily sensed by any single source, or aid in identification of objects by reviewing multiple characteristics.

In navigation this becomes particularly important when operating around people. Due to safety concerns, a system may need to autonomously detect an object as a potential person and then proceed to identify that object as a person. By combining not only visual profile data but also sensed temperature, the combined data may indicate not only that a person is present but also, potentially, the health of that person. Assistive devices would likewise benefit from the combination of data from multiple sources in order to assist in diagnosis, configurability, or other functions.
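A toy rule-based fusion of the two cues just mentioned might look like the following. The thresholds, score range, and category names are purely illustrative assumptions, not values from any fielded system:

```python
def classify_detection(silhouette_score: float, surface_temp_c: float) -> str:
    """Fuse a visual cue with a thermal cue to classify a detection.

    silhouette_score: 0..1 confidence that the visual profile is
    human-like (illustrative). surface_temp_c: sensed surface
    temperature in Celsius. Agreement between the two independent
    cues yields a stronger classification than either cue alone.
    """
    looks_human = silhouette_score > 0.7          # illustrative threshold
    human_temp = 30.0 <= surface_temp_c <= 40.0   # rough skin-temperature band
    if looks_human and human_temp:
        return "person"
    if looks_human or human_temp:
        return "possible person"
    return "object"
```

The point of the sketch is the fusion logic itself: either sensor alone can only raise a "possible person" flag, while the combination supports the stronger decision.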

3. FUTURE DIRECTIONS

3.1 Reviewing the Current State

Referencing back to our generic robot architecture and taking into account the summary of current technology, we begin to see areas that are addressed better than others in current technology. Working our way through the key common robot subsystems, we can look more closely at where we stand as an industry.

Human-Robotic Interaction has benefited greatly from devices that involve the senses of hearing, sight, and touch. Whether it is in the form of status and feedback from a robot or a means of input to a robot, there is a wide array of devices that facilitate this interaction, and more devices are being developed at a significant rate. Robots that can provide feedback through smell and/or taste are notably lacking. In some sense it is no wonder, as any device that would generate smells and tastes would likely require a chemical means of generating the effects. The delivery could be through a gas (particularly for smell), or through liquids or solids being made available to the nose or tongue. While this brings up hygienic questions, we can certainly note the gap and possibly look to future research to address it.


Navigation has greatly benefited from, and is centered on, the analysis of sensed sound or electromagnetic energy, whether actively generated from the sensing device or sensed passively. The electromagnetic spectrum is being utilized more and more with different kinds of sensors, and ranging has been implemented not just with sound but also with radio waves and lasers. Visual analysis for object identification continues to mature as point clouds and object models are investigated further. Force detection has been used for some time and would be expected to be utilized for obstacle detection, proximity sensing, and other uses. At one point it was unclear how smell and taste would be utilized in navigation. If we assume gas monitors and detectors are analogous to smell sensors, then we have yet another means of potentially identifying objects in the navigation environment.

Manipulation has generally been associated with force sensors. As a means of controlling dexterity and being able to control applied force when gripping, this has been a necessity. Recently, with smaller and higher-resolution cameras, we now have the means to visually sense manipulation and detect manipulated objects, either from the robot itself or from a point on an end effector. Moving select processing capabilities to manipulators and end effectors allows for smarter manipulation overall. The analogous senses of hearing, smell, and taste generally have not been exploited, but this need not be the case. For example, a smell or scent detector could be useful in select applications in conjunction with an end effector.

3.2 Review of Trends and Needs

Attempting to identify all the current areas of research related to robotic sensing would be too great a task for this paper. Instead, a selected overview will be presented. The interested reader can then proceed into the areas that seem most suitable for their application. Continuing with the architecture model identified earlier, we will identify some select future possibilities.

Human-Robotic Interaction is extendable beyond simple control and feedback. Co-robots are designed to work alongside people and as such require not just visual and force sensing but new approaches to how these sensor types are used. Human factors principles based on human-machine interaction will need to be adapted to human-robot interaction. With every robot there seems to be a unique controller; with the advent of smart phones and tablets there is a popular push to control devices from central points like these smart devices. Continued “app” development will allow for increased convenience in robot control.

Navigation historically has been based on 2-dimensional modeling. With lower-cost 3-dimensional visual sensors, that trend has changed dramatically. Point clouds and other 3D model representations are the latest means of utilizing these new 3D sensors. Additional sensing will need to be developed beyond what is available on board a robot. It is likely that sensing built into the operable environment will need to be developed. For example, driverless vehicles will likely be operated en masse by utilizing not only on-board sensors but also sensors at traffic intersections and other points along roadways.

Manipulation will need to expand beyond industrial applications. Compliant actuation will be a key growth area as close interaction with humans becomes more commonplace. Today's industrial robots work with materials that generally do not require “softer” handling. While many manipulation techniques can be used in such applications, the end effectors themselves will need to be adapted for more “gentle” uses. For today's robots, manipulator subsystems are very expensive in comparison to the host robotic platform. Manipulators with high degrees of freedom and general-purpose grippers that can operate around people will need to decrease in cost in order to be commercially viable on a large scale.

The focus above has been on various technologies, but there is a need to also review the emerging application areas for robotics. Precision agriculture will no doubt benefit from specialized sensing that can detect horticultural health, detect disease and pestilence, and assess crop damage due to floods, drought, fire, or other hazards. Deep sea exploration, just as with space exploration, entails the use of sensors that can survive in very austere and difficult environments. For example, visual sensors that can operate in conditions with sediment or other interfering agents will need to be developed and matured.

4. ACKNOWLEDGEMENTS

Thanks are due to Dr. Jeongsik Shin at the UTA Research Institute for his review and comments on this paper as it was developed.


REFERENCES

[1] “Fundamentals of Remote Sensing”, Natural Resources Canada – Center for Remote Sensing, http://www.ldeo.columbia.edu/res/fac/rsvlab/fundamentals_e.pdf.

[2] “The Remote Sensing Tutorial”, NASA – Goddard Space Flight Center, http://www.fas.org/irp/imint/docs/rst/Front/tofc.html.

[3] “ISO 8373, Robots and Robotic Devices – Vocabulary”, International Organization for Standardization, 2012.

[4] Huang, H.-M., Pavek, K., Albus, J., and Messina, E., “Autonomy Levels for Unmanned Systems (ALFUS) Framework: An Update”, 2005 SPIE Defense and Security Symposium, 2005.
