Final Report 2015
WIRELESS BOMB DEFUSING ROBOTIC ARM
USING HAPTIC GESTURES
A SEMINAR REPORT
Submitted by
SHAH DISHA CHANDRAKANT (110340111007)
THAKKAR AYUSHI JAYESHBHAI (110340111009)
SHAH RIYA ANISH (110340111018)
In fulfilment of the requirements for the award of the degree
of
BACHELOR OF ENGINEERING
In
ELECTRONICS & COMMUNICATION ENGINEERING
NARNARAYAN SHASTRI INSTITUTE OF TECHNOLOGY, JETALPUR
Gujarat Technological University, Ahmedabad
December 2014
NARNARAYAN SHASTRI INSTITUTE OF TECHNOLOGY
ELECTRONICS & COMMUNICATION ENGINEERING
2014
CERTIFICATE
Date:
This is to certify that the dissertation entitled Wireless Bomb Defusing Robotic
Arm Using Haptic Gestures has been carried out by Shah Disha Chandrakant,
Thakkar Ayushi Jayeshbhai and Shah Riya Anish under my guidance, in fulfilment
of the degree of Bachelor of Engineering in Electronics & Communication
Engineering (7th Semester/8th Semester) of Gujarat Technological University,
Ahmedabad, during the academic year 2014-2015.
Mrs. Kinjal Kapadiya Mr. Jignesh Patolia
(Internal guide) (External guide)
NSIT, Jetalpur
Mr. Saurabh Patel Mrs. Shardadevi Mandalappu
(Head of Department) (Principal)
E.C. Department NSIT, Jetalpur
NSIT, Jetalpur
ACKNOWLEDGEMENT
The success and final outcome of this project required a great deal of guidance and
assistance from many people, and we are extremely fortunate to have received it
throughout the completion of our project work. Whatever we have achieved is due to
such guidance and assistance, and we would like to thank everyone involved. We also
thank all the faculty members of Narnarayan Shastri Institute of Technology for
their critical advice and guidance, without which this project would not have been
possible. We express our special gratitude and thanks to Mrs. Kinjal Kapadiya for
giving us her attention and time.
Last but not least, we owe a deep sense of gratitude to our family members and our
friends, especially Abhijeet Satani, who have been a constant source of inspiration
during the preparation of this project work.
ABSTRACT
Since bomb defusal is a very hazardous task, robots are increasingly used in this
field. The main aim of this project is to provide safety to the members of bomb
disposal squads. In the present law and order situation, where thousands of people
are killed and injured by bomb explosions, our robot helps the defence team defuse a
bomb by means of haptic gestures, without physically touching it. It also provides
security to the common man in crowded areas.
To accomplish these objectives we designed a robotic arm whose motion tracks a
human operator's arm motion, including open-close hand tracking by the
end-effector. A joystick-style interface is used for controlling the robot's movement,
i.e. forward, backward, left and right. When the operator moves a hand over the
sensors, the hand movement is captured and a signal is sent to the robotic arm,
which mimics the action accordingly. A camera is mounted on the robotic arm so
that the user can inspect the bomb circuit on a screen; since the camera has night
vision, it can also detect a bomb at night and be used for surveillance. Bombs can be
defused using various technologies such as radio control, remote control and voice
control, but we make use of IR sensors, as they best serve our objective.
LIST OF TABLES
Table No Table Description Page No.
1 ANALYSIS OF HAND AND ROBOTIC ARM ACTIONS 25
LIST OF FIGURES
Figure No. Figure Description Page No.
1 BLUETOOTH 16
2 MOTOR DRIVER CIRCUIT 17
3 IR SENSOR 18
4 CHASSIS 19
5 ROBOTIC ARM 20
6 TRANSMITTER DEVICE 23
7 RECEIVER DEVICE 24
8 BUSINESS MODEL CANVAS 27
9 PRODUCT DEVELOPMENT CANVAS 28
10 IDEATION CANVAS 29
11 L293D 31
12 STATE DIAGRAM OF ROBOTIC ARM 33
13 INTERFACING OF L293D 34
14 ORCAD DESIGN 35
15 ROBOTIC BASE 37
16 ARM MECHANISM 38
17 SUBSYSTEM DIAGRAM 39
18 RELATIONSHIP 41
Abbreviations Used:
RF Radio Frequency
EN Enable pin
IR Infrared
TX Transmitter
RX Receiver
PC Personal Computer
AM Amplitude Modulation
DC Direct Current
TTL Transistor-Transistor Logic
RTL Resistor-Transistor Logic
IC Integrated Circuit
AI Artificial Intelligence
IP Internet Protocol
AVR Advanced Virtual RISC
RC Remote Control
LED Light Emitting Diode
TABLE OF CONTENTS
ACKNOWLEDGEMENT ... 3
ABSTRACT ... 4
LIST OF TABLES ... 5
LIST OF FIGURES ... 6
ABBREVIATIONS USED ... 7
[1] INTRODUCTION
THE CENTRAL IDEA ... 12
1.1 PROBLEM DEFINITION ... 13
1.2 EXPECTED OUTCOME ... 13
1.3 LITERATURE SURVEY ... 14
USED COMPONENTS ... 15
1.4 HARDWARE ... 15
1.4.1 BLUETOOTH ... 15
1.4.2 MOTOR DRIVER CIRCUIT ... 16
1.4.3 SHEET OF IR SENSORS ... 16
1.4.4 ANTENNA ... 16
1.4.5 RF TRANSCEIVER PAIRS ... 17
1.4.6 MOTORS ... 17
1.4.7 CHASSIS ... 18
1.4.8 ROBOTIC ARM ... 18
1.4.9 WIRE CUTTER ... 19
1.4.10 METAL DETECTOR ... 19
1.5 SOFTWARE ... 20
1.5.1 BLUETOOTH RC CONTROLLER ... 20
1.5.2 IP CAMERA ... 20
1.5.3 ORCAD SOFTWARE FOR PCB DESIGNING ... 20
1.5.4 PROTEUS SOFTWARE FOR SIMULATION ... 20
1.5.5 ARDUINO 1.0.5 ... 20
[2] DESIGN ANALYSIS AND IMPLEMENTATION
BLOCK DIAGRAM ... 23
2.1 TRANSMITTER SIDE ... 23
2.2 RECEIVER SIDE ... 24
WORKING ... 25
BUSINESS MODEL CANVAS ... 27
PRODUCT DEVELOPMENT CANVAS ... 28
IDEATION CANVAS ... 29
[3] INTERFACING AND SIMULATION
3.1 MOTOR DRIVER L293D ... 31
3.2 ARCHITECTURE MODEL ... 36
3.2.1 SOFTWARE ARCHITECTURE ... 36
3.2.2 ROBOTIC ARCHITECTURE ... 36
3.3 SYSTEM OVERVIEW ... 39
3.4 DATA DESCRIPTION ... 41
3.4.1 MAJOR DATA OBJECTS ... 41
3.5 SYSTEM INTERFACE DESCRIPTION ... 42
3.5.1 EXTERNAL MACHINE INTERFACE ... 42
3.5.2 EXTERNAL SYSTEM INTERFACE ... 42
[4] MICROCONTROLLER AND WORKING
4.1 COMMON FEATURES OF AVR ... 44
4.1.1 PROGRAM MEMORY ... 44
4.1.2 INTERNAL DATA MEMORY ... 44
4.1.3 INTERNAL REGISTERS ... 44
4.1.4 GPIO PORTS ... 45
4.1.5 EEPROM ... 45
4.2 WORK DONE ... 46
4.3 ADVANTAGE ... 47
4.4 DISADVANTAGE ... 47
4.5 APPLICATIONS ... 47
[5] TESTING AND FUTURE SCOPE
5.1 SYSTEM TEST AND PROCEDURE ... 50
5.2 INTEGRATION TESTING ... 50
5.3 VALIDATION TESTING ... 50
5.4 TESTING TOOLS AND ENVIRONMENT ... 50
5.5 FUTURE SCOPES ... 52
5.6 CONCLUSION ... 52
REFERENCES ... 53
CHAPTER 1
INTRODUCTION
THE CENTRAL IDEA
The central idea behind creating a bomb-defusing robotic arm is to provide a cost-effective
and safe solution to the critical task of bomb defusal. Bombs are weapons of mass
destruction and have long been a threat to the peace and prosperity of the world as a whole.
We set out to provide an alternative to the risky process of handling a bomb manually by
placing a robotic arm between the operator and the device. When we look at the possible
obstacles to developing a robotic arm, we find cost to be a major hindrance. Costly
equipment can of course be installed when it comes to the safety of high-value sites such as
military installations and government properties, but for private institutions the finances
become a real concern. We therefore aimed to make the arm as simple and cost-friendly as
possible, focusing only on economical techniques of bomb defusal.
Beyond cost, size was our next major concern: terrorists and others engaged in such
antisocial activities often hide bombs in places that are very narrow and have limited access.
We therefore designed the robot arm with dimensions as small as possible, so that it can
reach the target destination with ease and high mobility. Weight was another major issue, as
extra weight hinders the mobility of the robot. We built the controls around an ATmega
controller, as it is easy to implement and has features that make it suitable for our project;
it also allowed us to design the arm with the least possible external hardware, making the
whole robot a simple yet efficient means of bomb defusal.
WHAT IS AN EMBEDDED SYSTEM?
In an embedded system, software and hardware devices such as microprocessors and
microcontrollers are combined and controlled in such a way that a particular task is
accomplished as efficiently as possible. We program the controller or processor so that it
works according to the given aim. Processors differ from controllers in that processors are
general-purpose devices which do not have their own memory, so external peripherals must
be added according to the task. We therefore chose an ATmega controller over a
microprocessor, to reduce circuit complexity and bulkiness.
1.1 PROBLEM DEFINITION:
Bomb defusal is very difficult, and it puts lives at risk if humans defuse bombs directly. For
the safety of human life, robots are involved in the field of bomb defusal. The main aim of
this project is to provide safety to the members of bomb disposal squads. Taking into
consideration the present-day law and order conditions, where thousands of people are
killed and injured by bomb explosions, our robot helps the defence team defuse a bomb by
means of haptic gestures, without physically touching it.
1.2 EXPECTED OUTCOME:
We have designed a robotic arm whose motion tracks a human operator's arm motion,
including open-close hand tracking by the end-effector. A joystick-style interface controls
the robot's movement, i.e. forward, backward, left and right. When the operator moves a
hand over the sensors, the hand movement is captured and the signal is sent via an RF link
to the robotic arm, which mimics the action accordingly. A camera is also mounted on the
robotic arm so that the user can inspect the bomb circuit on a screen.
1.3 LITERATURE SURVEY:
After a good deal of research and consideration we decided to implement the hand gesture
mechanism using IR sensors. In the process we consulted many websites and arrived at the
implementation methodology and circuit diagram. The original idea was adapted from a
research article entitled "Microcontroller Based Gesture Recognition System for Handicap
People", published in the Journal of Engineering Research and Studies. The goal of that
project was to develop a computerized Indian Sign Language recognition system. We also
considered MATLAB for the same purpose, but since the software is not open source we
discarded that option. Another idea involves the use of IR proximity sensors for hand
gesture recognition: such a system allows a user to interact with mobile devices using
intuitive gestures, without touching the screen or wearing or holding any additional device.
Evaluation results show that such systems are low-power and able to recognize 3D gestures
with over 98% precision in real time.
Although the idea of our project is original, a number of projects with similar functionality
can be found. For example, the British police have a bomb disposal robot, the Israeli army
has one, and similar robots are used by bomb disposal squads in a number of US states. The
main idea of this robot is to protect the bomb disposal squad from the risks they face every
day. The bomb disposal squads of Pune have metal detectors and other equipment for bomb
detection and disposal, but they must risk their lives by approaching the bomb or suspicious
packet without any safety precautions.
Our robot provides an extra layer of protection to the bomb disposal squad. Mobile robots
reduce or eliminate a bomb technician's time-on-target. A robot takes the risk out of
potentially deadly scenarios and lets the bomb technician focus on what to do to an
explosive device rather than on the immediate danger to life and limb. Even if a robot
cannot reach an item for disruption, it can still relay information to aid in tool and procedure
selection before moving downrange. In addition, events recorded by a robot's camera can
provide evidence for further analysis.
USED COMPONENTS:
We have used technologically advanced components in order to reduce the processing time
and the instruction delay to a minimum. Both hardware and software components were
needed to accomplish the project.
They are described, with photographs, below:
1.4. HARDWARE:
1.4.1 BLUETOOTH:
Bluetooth is a wireless technology used for short-range communication. It works in the
2.4 GHz band, using frequency-hopping spread spectrum (FHSS) signalling at 1600 hops
per second. We first thought of controlling the robot's movement by hand gestures along
with the arm movement, but that increased the circuit complexity and the cost of the
project. We therefore decided to use a Bluetooth module interfaced with an Android app
named Bluetooth RC Controller. This simplified the control of the robot's movement, as it
is very easy to interface a Bluetooth device with the ATmega controller and then connect
the output to the motor driver circuits.
FIG 1.4.1
1.4.2 MOTOR DRIVER CIRCUITS:
The microcontroller output is not sufficient to drive the DC motors, so current drivers are
required for motor rotation. The L293D is a quadruple high-current half-H driver designed
to provide bidirectional drive currents of up to 600 mA at voltages from 4.5 V to 36 V,
which makes it easy to drive DC motors. The L293D consists of four drivers. Pins IN1
through IN4 and OUT1 through OUT4 are the input and output pins, respectively, of
drivers 1 through 4. Drivers 1 and 2, and drivers 3 and 4, are enabled by enable pin 1
(EN1, pin 1) and enable pin 2 (EN2, pin 9), respectively. When EN1 is high, drivers 1 and
2 are enabled and the outputs corresponding to their inputs are active. Similarly, EN2
enables drivers 3 and 4.
FIG 1.4.2
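The enable/input behaviour described above can be sketched as a small C++ model of the L293D's channel logic. This is a plain software simulation of the datasheet truth table, not driver firmware; the function and state names are our own:

```cpp
#include <string>

// One L293D half-H channel: with the enable input high the output
// follows the input; with enable low the output is high-impedance.
enum class OutState { Low, High, HighZ };

inline OutState l293dOutput(bool enable, bool input) {
    if (!enable) return OutState::HighZ;           // channel disabled
    return input ? OutState::High : OutState::Low;
}

// A DC motor wired across OUT1/OUT2 (drivers 1 and 2, both gated by
// EN1): opposite inputs spin it, equal inputs brake, disabled coasts.
inline std::string motorDirection(bool en1, bool in1, bool in2) {
    if (!en1) return "coast";        // both outputs high-Z
    if (in1 && !in2) return "forward";
    if (!in1 && in2) return "reverse";
    return "brake";                  // both outputs at the same level
}
```

For example, pulling IN1 high and IN2 low with EN1 high drives the motor forward; swapping the two inputs reverses it.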
1.4.3 SHEET OF IR SENSORS:
An infrared sensor is an electronic device that works by emitting infrared radiation and
responding to the light reflected back onto it. We wanted our project to have hand-gesture
control of the robotic arm's movement, so we planned to interface these sensors with the
controller; for the emission of the radiation, we used the antenna of a remote-controlled
car. These sensors sense our hand movements and emit radiation to the receiver according
to the magnitude of the reflection.
FIG 1.4.3
1.4.4 ANTENNA:
An antenna (or aerial) is an electrical device which converts electrical signals into radio
waves, and vice versa. It is usually used with a transmitting or receiving device. In
transmission, a radio transmitter supplies an electric current oscillating at radio frequency
(i.e. a high-frequency alternating current, AC) to the antenna's terminals, and the antenna
radiates the energy from the current as electromagnetic waves. In reception, an antenna
intercepts some of the power of an electromagnetic wave to produce a tiny voltage at its
terminals, which is applied to a receiver to be amplified; the receiver then converts the
radio waves back into an electrical signal and passes the energy on.
1.4.5 RF TRANSCEIVER PAIRS:
Radio frequency transceiver pairs are widely used for wireless communication. They are
used when transmission must be carried out without a line of sight. With complex circuits,
however, these pairs can prove somewhat inefficient, as they are very difficult to program;
in our project in particular, the intensity of hand movements needs to be very consistent
and within range for precise implementation.
1.4.6 MOTORS:
A motor is a device which converts electrical energy into mechanical energy. Motors are
used in our project for the motion of the robot on which the arm is mounted. They come in
various types, such as DC motors, servo motors, Johnson motors, etc. We have used DC
motors for our purpose, as they are fast and efficient for robot mobility.
1.4.7 CHASSIS:
The chassis is the base on which the robot is built, and it also supports the mounted robotic
arm. It acts as a base for the arm as well as the body of the robot. It is made of a metal
sheet folded into a curved shape on two sides; two such pieces are joined to make the robot.
FIG 1.4.4
1.4.8 ROBOTIC ARM:
The robotic arm is a device in the shape of an arm which moves under the command of the
controller in order to reach the position of the bomb and defuse it. We have fitted servo
motors in the arm as required for its movement, so that the arm can rotate in the direction
the controller directs.
FIG 1.4.5
1.4.9 WIRE CUTTER:
For cutting the bomb's circuit, we have attached a wire cutter (a simple blade), so that
when the bomb is reached and the gripper is closed, the bomb circuit gets cut and the bomb
is defused.
1.4.10 METAL DETECTOR:
Almost every bomb has a metal part. Thus, in order to detect the bomb, we have placed a
simple metal detector at the front of the arm, so that it senses the bomb and the controller
knows its location.
1.5 SOFTWARE:
1.5.1 BLUETOOTH RC CONTROLLER APPLICATION FOR
ANDROID:
The Bluetooth RC Controller application is an Android application used to move the robot
in all directions and to operate the gripper. It works over Bluetooth and is a cost-effective
solution for controlling the robot's movement.
1.5.2 IP CAMERA:
As installing a dedicated camera or spy cam would cost too much, we used an Android
application known as IP Camera, which gives a view of the site that the user can observe
on a laptop interfaced with the application.
1.5.3 ORCAD SOFTWARE FOR PCB DESIGNING:
OrCAD is PCB design software which we used to design the printed circuit board for our
project. We designed the circuits for the motor drivers and the ATmega controller.
1.5.4 PROTEUS SOFTWARE FOR SIMULATION OF
CIRCUIT:
Proteus is circuit simulation software. It has all the necessary interfacing devices and
controllers, which can be connected up to check whether the programming of the device is
correct or not.
1.5.5 ARDUINO 1.0.5:
We have used an ATmega328 controller, so for programming it we used the open-source
Arduino 1.0.5 software. It has features like serial communication, analog-to-digital
conversion, and I2C and SPI support. We chose this software because it is very simple and
user friendly, and it easily uploads the program into the controller.
CHAPTER 2
DESIGN ANALYSIS
AND
IMPLEMENTATION
BLOCK DIAGRAM
2.1 TRANSMITTER SIDE:
FIG 2.1 (block diagram: hand gestures → sheet of IR sensors → ATmega controller →
antenna)
2.2 RECEIVER SIDE:
FIG 2.2 (block diagram: input from the antenna and from the Bluetooth RC Controller
application → ATmega controller → robotic arm movement, robot movement and metal
detector; cellphone camera → IP Camera Android application → viewers)
2.3 WORKING:
The working of our project can be understood by dividing it into two modules, namely the
transmitter and the receiver. The transmitter consists of two vertical sheets of infrared
sensors, two horizontal sheets of infrared sensors, an ATmega controller, and a transmitting
antenna with an RF transmitter. The infrared sensors are interfaced with the digital
input/output pins of the controller. When the user moves a hand over a sheet, the controller
receives signals according to the position of the hand. We have programmed the controller
so that it registers the output of the IR sensors when the value of the reflected radiation lies
in a particular range, making the operation of the arm depend on the distance from the
user's hand. A Bluetooth-based application, on the other hand, is used to control the motion
of the robot in all directions. As the robot moves, the metal detector detects the presence of
the bomb circuitry and notifies the user of the bomb's presence. The user then moves the
robotic arm by controlling it with hand movements, while simultaneously viewing the
position of the bomb with the help of the IP Camera application.
TABLE 2.2.1
HAND ACTION (from user's side)    ROBOTIC ARM ACTION
Hand moves left                   Robotic arm moves left
Hand moves right                  Robotic arm moves right
Hand stops                        Robot stops
Hand moves upwards                Robotic arm moves upwards
Hand moves downwards              Robotic arm moves downwards
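In code, the mapping of Table 2.2.1 could look like the following C++ sketch. It assumes four IR sensor sheets (left, right, up, down) whose readings rise when a hand is above them; the threshold value and the function name are illustrative, not taken from our firmware:

```cpp
#include <array>
#include <string>

// Pick the arm action from Table 2.2.1: the sheet with the strongest
// above-threshold reading wins; with no hand detected the robot stops.
// Reading order is assumed to be {left, right, up, down}.
std::string armActionFor(const std::array<int, 4>& readings, int threshold) {
    static const char* actions[4] = {"move left", "move right",
                                     "move upwards", "move downwards"};
    int best = -1;
    int bestValue = threshold;       // a reading must exceed this to count
    for (int i = 0; i < 4; ++i) {
        if (readings[i] > bestValue) {
            bestValue = readings[i];
            best = i;
        }
    }
    return best < 0 ? "stop" : actions[best];
}
```

Taking the strongest sheet rather than the first one over the threshold avoids ambiguity when a hand partially covers two sheets at once.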
The receiver consists of an ATmega controller, a Bluetooth module, the robot, and the arm
mounted on top of the robot chassis. The robot is moved by motors interfaced with the
digital output pins of the controller, guided by the Bluetooth application. The arm moves as
per the user's gestures: the antenna on the transmitter emits electromagnetic radiation, and
the receiver of the transceiver module converts the signal and guides the arm to move in
that direction. Finally, when the arm reaches the position of the bomb, the gripper,
controlled by the Bluetooth app, holds the wire in its grip and cuts it, so the bomb is
defused.
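The end-to-end sequence on the receiver side can be summarised as a small state machine. This is our own sketch of the workflow just described; the state and event names are illustrative, not from the firmware:

```cpp
#include <string>

// States of the defusal workflow: drive to the site, detect metal,
// position the arm by gestures, grip the wire, cut it.
enum class State { Driving, MetalDetected, Positioning, Cutting, Defused };

// Advance the workflow one step on a given event; unknown events
// leave the state unchanged.
State nextState(State current, const std::string& event) {
    switch (current) {
        case State::Driving:
            return event == "metal" ? State::MetalDetected : current;
        case State::MetalDetected:
            return event == "gesture" ? State::Positioning : current;
        case State::Positioning:
            return event == "grip" ? State::Cutting : current;
        case State::Cutting:
            return event == "cut" ? State::Defused : current;
        case State::Defused:
            return current;          // terminal state
    }
    return current;
}
```

Modelling the sequence this way makes it explicit that, for example, the gripper cannot close before the metal detector has located the bomb.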
Fig 2.3 Business model canvas
The Business Model Canvas is a strategic-management and lean-startup template for
developing new business models or documenting existing ones. It is a visual chart with
elements describing a firm's or product's value proposition, infrastructure, customers, and
finances.
Value Propositions:
Innovative
Cost Reduction
Risk Reduction
Newness
Security
Safety
Customer Relationships:
ATS
Bomb Squad Services
Military
Automated Services
Fig 2.4 Product development canvas
The Product Canvas is a simple, yet powerful agile tool that helps you create a product with a
great user experience and the right features.
Product Features:
Compact Design
Video Feedback
Artificial Intelligence
Improved Reliability
Easily movable
Product Functions:
Surveillance and Spying
Bomb Defusal
Allows interactivity in real time
Fig 2.5 Ideation canvas
CHAPTER 3
INTERFACING AND
SIMULATION
3.1 MOTOR DRIVER L293D:
FIG 3.1
PIN DESCRIPTION:
1 - Enable 1
2 - Input 1
3 - Output 1
4, 5, 12, 13 - Ground
6 - Output 2
7 - Input 2
8 - VCC2 (motor supply)
9 - Enable 2
10 - Input 3
11 - Output 3
14 - Output 4
15 - Input 4
16 - VCC1 (logic supply)
FIG 3.2 STATE DIAGRAM OF ROBOTIC ARM
FIG 3.3 INTERFACING OF L293D
FIG 3.4 ORCAD DESIGN
3.2 Architecture Model:
The architecture of the robot is based on a modular approach. For every function there is
a separate module, which performs its individual action. The architecture consists of the
following modules:
3.2.1 Software Architecture:
The application is divided into the following modules according to the functions they
perform:
Video Interface Module: A camera is placed on the robotic arm to provide live video feedback, which is
transmitted from the robot to the control application.
Serial Interface Module: Serial communication is accomplished using this module. It monitors all the data
sent through the serial port, and all synchronization between the microcontroller
and the serial port is handled by this module.
Application Control Module: All the events occurring while using the software application are controlled
through this module. All the application-level design and the inputs of the end user
are translated into control signals, which are transmitted to the microcontroller. All
the user interface and error control is done in this module.
3.2.2 Robotic Architecture:
The architecture of the robot has also been designed keeping the modular approach in
mind. It is divided into the following modules:
The Robotic Base: The base is made of a rectangular chassis. In the middle, supporting beams have
been welded to support the platform on top of the structure of the base. The base
contains the following mechanisms, which are responsible for driving and turning:
FIG 3.5 ROBOTIC BASE
Driving DC Motors: The driving DC motors are 12-volt DC motors connected to each wheel of the
robot base. Torque from each DC motor is transferred to the individual wheel that
drives the robot.
Steering DC Motor: Steering of the DC motors is done by functions and commands programmed into
the microcontroller.
The Robotic Arm: The robotic arm has been designed using metallic strips. The power-to-weight
ratio was kept in mind in its design, as it has to be light enough to be moved by the
stepper and DC motors, yet strong enough to move objects of nominal weight. It is
a jointed-arm type robotic arm with 360-degree rotation. It consists of:
o A DC motor for the shoulder of the robotic arm. This DC motor is mounted
on a circular plate onto which the robotic arm has been mounted.
o At the next joint, the elbow, a DC motor which is responsible for the pitch
of the robotic arm, i.e. its vertical movement.
o A final joint connected to a DC motor which controls the movement of the
gripper. The gripper is a two-finger type and can pick up objects with ease.
o A cutter placed over the gripper to cut the wire.
FIG 3.6 ARM MECHANISM
The Wireless Module: The purpose of the wireless module is to provide wireless communication between
the control application and the robot. Data from the control application is
transmitted by the transmitter, which is connected to the serial port of the PC, and
received at the robot by the receiver, which is connected to the serial port of the
microcontroller. The data transmission is done serially. The link is based on radio
frequency using amplitude modulation at 433 MHz.
3.3 Subsystem Overview:
Figure 3.7: Subsystem Diagram
IR Sensor: The IR sensor is used as the input device; it recognizes hand gestures, and the
system works accordingly. By using an LED which produces light at the same
wavelength the sensor is looking for, one can look at the intensity of the received
light. When an object is close to the sensor, the light from the LED bounces off
the object and into the light sensor, resulting in a large jump in intensity, which
can be detected using a threshold.
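The threshold test just described can be sketched in a few lines of C++. This is a simulation only; the baseline and margin values are assumptions, and real readings depend on the sensor and ambient light:

```cpp
#include <vector>

// Average of readings taken with no hand present, used as the
// ambient-light baseline for the reflected-IR detection test.
int ambientBaseline(const std::vector<int>& samples) {
    if (samples.empty()) return 0;
    long sum = 0;
    for (int s : samples) sum += s;
    return static_cast<int>(sum / static_cast<long>(samples.size()));
}

// An object (a hand) is reported when the reading jumps well above
// the baseline; `margin` sets how large the jump must be.
bool objectDetected(int reading, int baseline, int margin) {
    return reading > baseline + margin;
}
```

Calibrating the baseline at start-up makes the threshold robust to different ambient lighting conditions.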
Video Control: The video control turns on and off the video signal from the wireless camera. It
provides features such as:
Displaying live camera feedback
Turning the video transmission on or off.
Transceiver System: The transmitter being used works at a frequency of 434 MHz. Most transmitters
available in the market work at either 434 MHz or 868 MHz; some also use
900 MHz for data transmission. This transmitter gives our robot a range of about
300 feet (approximately 100 metres). The receiver has the same characteristics and
pairs with the transmitter to form a transceiver system.
Wireless Camera: The wireless camera has its own transmitter and receiver system, which provides
a wireless link from the robot to the control application. It is mounted on the
robotic arm and provides a live view of the remote site. The application can record
the video input from this camera for further analysis. Initially the camera has been
mounted on the robotic arm, although it can be mounted anywhere on the robot
according to the user's need. An IP cam is used for this purpose.
Gripper: The gripper is used to manipulate the object, which poses a threat or is suspicious and
can be moved from a critical place to a much safer place. It is constructed to be a two-
finger type gripper. It is controlled using a DC motor that allows a firm grip on the
object.
DC Motor: A DC motor works by applying DC power, which makes the shaft spin.
Depending on the polarity of the electricity applied, the shaft spins in one
direction or the other. We have used eight DC motors: four for driving the robot,
one at the robot arm and one for the gripper.
Power Supply: The robot has its own power supply. Three transformers have been used as the source
voltage for the power of different modules on board the robot.
3.4 Data Description:
3.4.1 Major data objects:
The major data objects in this system include the application, which generates data
signals. These signals are control signals that are generated by the application depending on the
input from the user.
User Input Data: Signals are generated when the user inputs data into the application. This data is
dependent on the function that has to be performed by the robot. For controlling
different functions of the robot different signals are generated.
Serial Data Object: The data generated by the input from the user has to be transmitted serially to the
microcontroller. This function is performed by the serial data object.
Wireless Transmission Object: The data from the serial data object has to be transmitted to the robot without using
any kind of wired communication. The wireless transmission object transmits data
using a wireless link over radio waves.
FIG 3.8 RELATIONSHIP
3.5 System Interface Description:
The system has many internal and external interfaces. The internal interfaces include
The Video input interface control
The data input wireless control interface
The serial data input/output control
3.5.1 External Machine Interfaces:
The system application has the following external machine interfaces:
The robot itself
The IP camera
3.5.2 External System Interfaces:
The system is connected to the following external system interfaces:
The wireless transmitter
Wireless receiver
Bluetooth module
CHAPTER 4
MICROCONTROLLER
AND WORKING
4.1 COMMON FEATURES OF AVR
AVR stands for Advanced Virtual RISC. AVR microcontrollers are Harvard-architecture,
8-bit, reduced instruction set computer (RISC) single-chip microcontrollers. They were
developed by Atmel in 1996. AVRs use on-chip flash memory for program storage; the
program and data are stored in two different memories. AVRs are classified into:
tinyAVR, megaAVR, XMEGA, application-specific AVR, FPGA AVR, and other 32-bit AVRs.
TinyAVRs have 0.5-16 kB of program memory and come in 6-32 pin packages with a
limited peripheral set.
The megaAVRs have 4-512 kB of program memory and come in 28-100 pin packages. We
have used a megaAVR in our project, as it has an extended instruction set. The XMEGA
AVRs support features like cryptography, DMA and an event system.
The AVR device architecture has flash, EEPROM and SRAM all integrated onto a single
chip, so an AVR does not need any external memory.
4.1.1 Program memory:
Program instructions are stored in non-volatile flash memory. Each instruction takes one or
two 16-bit words.
The size of the program memory is indicated in the device name itself (e.g., the ATmega64x
line has 64 kB of flash, while the ATmega32x line has 32 kB).
There is no off-chip program memory; all code executed by the AVR core must reside in
the on-chip flash.
4.1.2 Internal data memory:
The data address space consists of the register file, the I/O registers and SRAM.
4.1.3 Internal registers:
The AVRs have 32 single-byte registers and are classified as 8-bit RISC devices.
4.1.4 GPIO ports:
Each GPIO port on an AVR drives up to eight pins and is controlled by three 8-bit
registers: DDR, PORT and PIN.
DDR: Data Direction Register. Configures the pins as either inputs or outputs.
PORT: Output port register. Sets the output value on pins configured as outputs, and
enables or disables the pull-up resistor on pins configured as inputs.
PIN: Input register, used to read an input signal. This register can also be used for pin
toggling: writing a logic one to a PIN bit toggles the corresponding bit in PORT,
irrespective of the setting of the DDR bit.
4.1.5 EEPROM:
Almost all AVR microcontrollers have internal EEPROM for semi-permanent data storage. Like
flash memory, EEPROM retains its contents when electrical power is removed.
In most variants of the AVR architecture, this internal EEPROM is not mapped into the
MCU's addressable memory space. It can only be accessed the way an external peripheral
device is, using special pointer registers and read/write instructions, which makes EEPROM
access much slower than access to internal SRAM.
4.2 WORK DONE:
In building this robotic arm for bomb defusal, we first sketched the basic structure and visualised the appearance of the arm, then designed the arm and its moving parts according to our requirements. Next came the technique for cutting a wire with the gripper. We initially considered modelling it on a human wrist, but that would have cost us much more, and the extra fingers served no purpose while increasing circuit complexity. We therefore designed the gripper in the form of a scissor and fitted a blade inside it to cut the wire.

The next problem was the logic to control the arm. At first we decided to control both the arm and the robot's motion by gestures, but we found it difficult to achieve the precision and accuracy needed. While considering alternatives, the idea of controlling the robot's motion with an Android application struck us. We weighed its chances of success and agreed to adopt the technique, as it was efficient and simple; the control of the robot's motion was thus decided and implemented.

Then came the critical part: the logic to control the arm movement. Since the arm was to be controlled by hand gestures, we decided to interface two sheets of infrared sensors per axis instead of one, because a single sheet would not be as precise and accurate as we wanted. We therefore used four sheets, two for horizontal movement and two for vertical, and interfaced them with the ATmega controller through its I/O pins. The controller's output feeds the transmitting antenna, which radiates the signal; the receiver picks up this signal, converts it to digital form, and drives the motors interfaced to the arm joints. We installed an Android application named IP Camera, which gives us a view of the site, and to locate the bomb we mounted a metal detector so that its presence can be detected.
4.3 ADVANTAGES:
Compact design
Quick movement
Improved reliability
Removable multi-gripper robotic arm
Artificial intelligence
Night vision camera
Time saving
Reduction in the number of people needed in the field
It can be altered to suit the needs of the user
Robust
It can handle different loads
It can be controlled remotely
It has video feedback
Life saving and safe
It allows real-time interactivity between the operator and the robotic arm
4.4 DISADVANTAGES:
Robots are not suitable for making complicated decisions.
Auxiliary controls are required to move the workspace of the device to a new location.
Full operation of the robot depends on the range of the low-power RF transceiver module, i.e. up to 100 meters.
4.5 APPLICATIONS:
In Chemical Plants
In Atomic Research Labs and Atomic Power Plants
In Mines
In Material Handling
Police: in hostage situations
Military: for reconnaissance missions, bomb defusal, and battlefields, to save soldiers' lives
Fire: to provide video feedback of the site for analysis
Nuclear: for handling hazardous or radioactive materials
CHAPTER 5
TESTING AND
FUTURE SCOPE
TEST PLAN:
5.1 System Test & Procedure:
The DC Motor Circuit:
A DC motor is connected to each wheel of the robot for its movement; one more motor drives the
elbow and one the gripper.
This circuit has also been tested on a breadboard.
The Serial Interface Circuit:
Serial interfacing was first tested by connecting the L293D to the serial port of the PC and
to the microcontroller. HyperTerminal was used to generate the control signals, which were
tested by turning an LED on and off.
5.2 Integration Testing:
After the individual testing, the robot underwent integration testing. All circuits were
mounted on the robot, connected to the microcontroller circuit and tested. Control signals from
the serial port of the PC were generated using the control application and tested by turning on or
off the appropriate motor.
5.3 Validation testing:
Validation testing required the signals generated by the control application to be
correct. This was verified by checking whether the required motor moved on the appropriate
signal, according to the control-signal-to-motor mapping.
5.4 Testing tools and environment:
The code written for the microcontroller was tested by using a test circuit. Every time a
code needed to be tested, it was programmed in the microcontroller and was tested on this
circuit.
The tools required for this testing are:
A spare microcontroller
Multimeter
Power Supply (+5 V and +12 V)
LED
Resistors
Capacitors
Serial Cable
5.5 FUTURE WORK:
Though we have completed the project to a satisfactory extent, there are situations where it does not work as efficiently as we would like, so we are planning a more capable version of the robotic arm.

We plan to make the arm function as a night-vision disposal arm, for which a special night-vision camera has to be interfaced. We also plan to extend the connectivity range using more advanced technologies, and to make the robotic arm operate under water, which would require waterproof circuits and a waterproof coating to keep water from accumulating inside the circuitry.

Further, we plan to control the robot with brain waves, using NeuroSky for the implementation. NeuroSky is a technology for controlling instruments through brain waves, and we have planned to apply it to the motion of the robot as well as the robotic arm and the gripper; however, the constraints of keeping the system user-friendly and economical have kept us from implementing it yet. We also plan to use a quadcopter for the robot's motion, as the robot cannot climb steps and some sites are presently unreachable.

Overall, we aim to make the arm a more accurate and safe solution to the risky task of bomb defusal by improving it in all aspects, for which we will have to become more technologically sound. We can also restrict control of the arm to genuine, authenticated personnel by verifying the user through a fingerprint scanner or an authentication key. We have plans for future implementation of the project, but we need more resources, both financial and intellectual.
5.6 CONCLUSION:
This project describes the design and working of a system useful in areas concerning
security, surveillance and the like. The assembly and setup would help spread awareness and thus
prevent the panic among the public that many malicious objects in the vicinity can cause.
Moreover, hand-gesture recognition makes the robot more user-friendly and less prone to
accessibility hindrances. It is an effort to reduce crime levels and, in turn, curb terrorism.