
Demonstration of a Low Cost C.O.T.S. Rendezvous Depth Sensor for Satellite Proximity Operations

Final Report

Author:

Richard DUKE BSc


Department of Electronic Engineering

Faculty of Engineering and Physical Sciences

University of Surrey

Guildford, Surrey, GU2 7XH, UK

August 2014

Supervised by:

Dr C. P. BRIDGES

Professor P. PALMER

©Richard Duke 2014


Declaration of Originality

I confirm that the project dissertation I am submitting is entirely my own work and that any

material used from other sources has been clearly identified and properly acknowledged and referenced.

In submitting this final version of my report to the JISC anti-plagiarism software resource, I confirm

that my work does not contravene the university regulations on plagiarism as described in the Student

Handbook. In so doing I also acknowledge that I may be held to account for any particular instances

of uncited work detected by the JISC anti-plagiarism software, or as may be found by the project

examiner or project organiser. I also understand that if an allegation of plagiarism is upheld via an

Academic Misconduct Hearing, then I may forfeit any credit for this module or a more severe penalty

may be agreed.

Dissertation Title:

Demonstration of a Low Cost C.O.T.S. Rendezvous Depth Sensor for Satellite Proximity Operations

Author Name:

Richard Duke

Author Signature: Date: August 19, 2014

Supervisor’s Name:

Dr. C. P. Bridges

Professor P. Palmer


Acknowledgements

I would like to express my heartfelt appreciation to my supervisor Chris Bridges who encouraged

me to take up this project. I would like to thank him for his constant encouragement and motivation

during the project, especially during the most difficult and frustrating periods.

I would like to acknowledge Phil Palmer, and William Beckwidth for their assistance and help

in relation to the air bearing test hardware set-up. I would also like to thank Hirotaka Niisato for

developing the driver used in this project, and his support in comparing driver output between his

testbed, and the one used in this project.

Finally, I would like to thank my family and friends for their constant support through this dissertation and return to university. Special thanks go to my wonderful mother for proofreading

the final report.


Word Count

Number of pages: 66

Number of words: 20535


Abstract

With the quest to undertake ambitious tasks in space, proximity operations involving multiple

spacecraft are expected to increase. A vital part of any proximity operation is the relative range

and pose measurement between spacecraft and target. In the past, sensors have been expensive

and only capable of being used with cooperative targets. Recent years have seen the start of a

technology race to bring low cost depth sensors (LIDARS) to consumer devices. The aim of this

project is to demonstrate that one of these low cost C.O.T.S. depth sensors could be used in close

proximity operations to facilitate rendezvous and docking operations for CubeSat sized satellites.

This report details the selection of a low cost LIDAR, the Softkinetic DS325. A Linux ARM

driver is developed to interface this with a Raspberry Pi to process depth data and obtain range

and pose estimation. Test hardware is developed to facilitate in-the-loop testing using simulator

satellites on air bearings simulating a 2D frictionless environment. The report demonstrates that

it would be possible to use a low cost LIDAR on a CubeSat sized mission, and highlights the

difficulties that this technology brings, especially in interfacing with a relatively low capability

processing board. In addition, the driver modified as part of this project becomes the first correctly

working open source driver for the Softkinetic range of depth sensors.


Contents

Declaration of Originality
Acknowledgements
Word Count
Abstract

1 Introduction
1.1 Background
1.2 Context
1.3 Aim and Objectives
1.3.1 Aim
1.3.2 Assumptions and Constraints
1.3.3 Detailed Objectives
1.4 Novelty
1.5 Challenges
1.6 Structure of Report
1.7 Definitions
1.8 Abbreviations

2 Literature Review
2.1 Introduction
2.2 Sensors used in Spacecraft Proximity Operations
2.3 Proximity Operations involving CubeSats
2.4 Low Cost Depth Sensors
2.4.1 Comparison and SWOT Analysis of Low Cost LIDARs
2.4.2 Softkinetic DS325 Time of Flight Sensor
2.5 Interface and Testing
2.5.1 Raspberry Pi
2.5.2 C++
2.5.3 Universal Serial Bus
2.5.4 LibUSB Library
2.5.5 TCP/UDP Protocols


3 Hardware Development and Set-up
3.1 Softkinetic DS325
3.2 Raspberry Pi
3.3 Power Supply
3.4 Structure
3.5 Test Platform and Air Bearing Tables

4 Systems Integration
4.1 Interface Driver Development
4.1.1 Assessment of Softkinetic Interface
4.1.2 Development of Packet Replay Software
4.1.3 OpenNI2DS325 Driver
4.1.4 Conversion of Raw Data
4.1.5 Final Driver
4.2 Vision Processing Software
4.2.1 Basic Structure
4.2.2 Network Interface Development
4.2.3 Acquisition of Initial Data and Analysis of Development Data
4.2.4 Filtering
4.2.5 OpenCV
4.2.6 Range and Pose Calculation

5 Testing and Validation
5.1 Failure of LIDAR
5.2 Power Analysis
5.3 Accuracy and Precision
5.4 Basic Timing Analysis
5.5 Network Protocol Timing Analysis
5.6 Filtering Analysis
5.7 Range Detection

6 Conclusion
6.1 Reflection vs Objectives
6.2 Achievements
6.3 Project Evaluation
6.4 Future Work


6.5 Final Thoughts

References


List of Figures

1.1 LIDAR Illumination
1.2 Coherent Signal Method for Time of Flight LIDAR
1.3 Image Showing Structured Light Speckle Pattern as used in the Microsoft Kinect
1.4 Example Depth Image
2.1 Graphics Depicting the LIRIS Experiment
2.2 Primesense Family
2.3 Softkinetic Family
2.4 Combined SWOT Analysis of Microsoft Kinect and Softkinetic DS325
2.5 Raspberry Pi
2.6 USB Data Flow Schematic
2.7 Isochronous and Control Transfer Packet Block Diagram
3.1 Block Diagram of System
3.2 Schematic of System
3.3 Picture of Final Structure
3.4 The Full Air Bearing System with LIDAR Attached
4.1 Depth Image using Softkinetic Software on Microsoft Windows
4.2 Example USB Enumeration Data
4.3 Schematic of DS325 Endpoints
4.4 Screenshot of Packet Sniffing using Wireshark
4.5 Identification of Packet Type and Contained Data
4.6 Screenshot of Packet Replay Software
4.7 Image Showing Basic Test Output of Depth Stream
4.8 Plot Showing Actual Distance vs Reported Distance from the OpenNI2DS325 Driver
4.9 Example of Raw Depth Data
4.10 Plot Showing Raw Data Values vs Distance from Object
4.11 Plot Showing Final Output vs Measured Distance
4.12 Comparison of Original OpenNIDS325 Driver Output to that of the Final Driver with Improved Algorithm
4.13 Flowchart Describing Execution Flow of main.cpp
4.14 Example of Timing Output
4.15 Consecutive Images of Matlab Streaming Output on Air Bearing Table
4.16 Example of Histogram Data
4.17 Example of Cross Section Output


5.1 Plot of Accuracy and Standard Deviation Data
5.2 Results for the Erosion Filter when Run a Different Number of Times
5.3 Segmentation Algorithm Output for Different Threshold Values
5.4 Output from Combined Erosion and Segmentation Algorithms
5.5 Image Showing Output of Full Filter Algorithm
5.6 Screenshot Showing Output of Range and Bearing Data to Target


1 Introduction

Proximity operations involving multiple spacecraft have been an important part of space technology

from the beginning of the space era. Most of these operations, up to the present day, have focused on

manned space travel. Demand has started to grow for a wide variety of applications that require an

increase in proximity operations as well as their automation to reduce the need for real-time manual

control in flight. This significantly reduces both costs and risks to human personnel. Developing this

key technology is critical to allow more complex operations in space. It is a key enabler for the docking of spacecraft, allowing large structures to be constructed in space. It will allow spacecraft

to be inspected, repaired and maintained whilst in orbit. It will also be necessary for orbital debris

removal, as well as more exotic missions, such as collecting a Mars sample return capsule.

In close operations between two spacecraft, the sensors that determine relative navigation data

between the spacecraft are critical. The sensors provide data to on-board GNC (Guidance, Navigation

and Control) systems to control the orbit and attitude of the spacecraft. Sensors need to be reliable

and accurate. Inaccuracies could result in incorrect positioning of the spacecraft resulting in mission

failure or collision. As with any other space instrument, sensors are required to stand up to the harsh

space environment.

One form of sensor, a LIDAR (LIght Detection and Ranging), is the focus of this project. This is

an active sensor which uses the backscattering of light emitted from the sensor to map the distance

between the sensor and scattering surface. LIDARs have advantages over a variety of other sensors

as they can directly measure the distance to an object.

Recently, low cost LIDARs have started making an appearance in consumer electronics. As with

any technology, mass production leads to lower production costs. The ability to use commercial off

the shelf (C.O.T.S.) products would substantially reduce the cost of producing a LIDAR system for

space. However, these low cost LIDARs are designed with terrestrial applications in mind and are

not tested to confirm they can withstand the extra demands of the space environment. Specifically, C.O.T.S. hardware is not designed or tested to withstand conditions such as lack of atmosphere, extreme temperatures, radiation exposure, and strong ambient lighting. The accuracy and reliability of a system may also need to be far higher for a space application, where the ramifications of any error could be serious.

An important aspect to consider when using these types of sensors is the processing requirement. Due to the real-time nature of the GNC algorithms, the processing of the data must be done on board the spacecraft. Despite remarkable advances in computing, restrictions on power, mass, and volume limit the amount of data that can be processed in real time. This means algorithms need to be carefully selected to be effective whilst coping with the


limited processing capability available.

The aim of this project is to demonstrate the feasibility of using a low cost commercial off the

shelf (C.O.T.S.) depth sensor as a method of detecting relative range and pose (orientation) between

two spacecraft during proximity operations (with focus on rendezvous and docking). It is a proof

of concept study. It is intended that the results from the project will lead to a better knowledge of

the feasibility of this type of system, and will inform future work leading to the building of flight

hardware and algorithms.

First, a review of currently available low cost LIDARs will be conducted. Different sensors will

be compared and possible candidates suggested. After an evaluation of candidate sensors, a suitable

sensor will be selected and purchased. An interface will be developed to stream the LIDAR sensor

data to a small control board based on the low cost Raspberry Pi computer [1]. Two approaches to this are investigated: reverse engineering of a driver, and using a recently developed driver based on the

Open NI framework [2].

Basic vision processing algorithms will be investigated and developed to identify a target spacecraft,

and obtain range and pose information intended to be passed on to a spacecraft control system. Where possible, different methods will be developed and contrasted. Both their effectiveness and processing

requirements will be looked at to inform future work.

A frictionless 2D test bed will be used as a means of gathering initial development data as well

as to test the outcome of any algorithms. Both hardware and software interfaces will be developed to

mount the system to a 3U air bearing satellite and simulate frictionless movement. This will build up

capability for future complete demonstrations of 2D simulated operations.

1.1 Background

From the early days of the space program, proximity operations between spacecraft were given

a high priority. It was recognised that any attempt to land on the moon, one of the primary aims of

the space program at the time, would require the separation and joining of different spacecraft. The

first challenge was the closure of distance between two spacecraft to where orbital mechanics have

only a small effect. At this point, simpler control laws can be used and close proximity operations

can be undertaken with only small drifts from the nature of orbital mechanics to account for. An

initial rendezvous attempt between Gemini 4 and its Titan II upper stage failed. This was due to a lack of appreciation of the counter-intuitive nature of orbital mechanics. In 1965, Gemini 6 and 7

became the first spacecraft to successfully rendezvous. The first docking (joining) of spacecraft was

successfully completed a few months later in 1966, when Neil Armstrong docked Gemini 8 to an Agena target vehicle. [3]


The next year, the Soviet Union managed the first automated (unmanned) docking between two Cosmos satellites. Two years later, they matched the United States' manned capability with the docking

of two Soyuz spacecraft.

After the technology was demonstrated, rendezvous and docking of two spacecraft became routine

despite the challenges. It featured heavily in the Apollo program between the lunar lander and the

command module. The docking of two spacecraft even became a symbol of international cooperation

between the two cold war nations of the United States and Soviet Union, in the form of the Apollo-Soyuz

test program. As human spaceflight moved towards orbital space stations, docking between spacecraft

was required at regular intervals to allow for resupply and crew rotation. Where the United States

used the space shuttle capability to construct the US segment of the ISS (International Space Station),

Russia, through its space station programs, chose to use automated docking of spacecraft to achieve

the same result. This led to an increase in automation of space technology.

Automated proximity operations in robotic space flight have been few and far between; however, there are now numerous missions on the horizon that will require development of this technology. These cover a wide variety of applications, from exploration, such as landing on asteroids, to commercial services, such as spacecraft maintenance. Maintenance and inspection are expected to feature in many future missions, allowing satellites to be repaired, refuelled and their lifetimes extended, lowering replacement costs. The increased concern over orbital debris in low Earth orbit is expected to lead to missions to remove such debris, resulting in a safer environment for other spacecraft.

The success of rendezvous and docking in the past has generally depended on relative navigation

sensors such as laser reflectors, as well as human vision, to inform control decisions for the last

few meters of closure. Whilst these methods are well tested, new technologies that are able to take

advantage of modern processor technology are being investigated. These have the potential to input

and analyse more data points in real time, allowing more flexibility, such as supporting operations with non-cooperative satellites and other spacecraft which have not previously been prepared in advance.

One of these new technologies is Imaging LIDAR. LIDAR is similar in its physics to the better known RADAR, but uses infrared and ultraviolet frequencies, which are closer to the visible spectrum. LIDAR requires an electromagnetic signal to be sent from the sensor towards the area being observed, as shown in Figure 1.1. Depending on the surface properties of the object, a portion of this 'illumination' will be backscattered towards the sensor.

Figure 1.1: LIDAR Illumination [4]

Figure 1.2: Coherent Signal Method for Time of Flight LIDAR [5]

LIDAR has been used in its simplest form for scientific purposes to infer surface type purely from the intensity of light scattered back by the surface. LIDAR really comes into its own when it is used to make a distance calculation to the incident surface. The time taken for light emitted by the sensor to travel out and be received back is measured. If this time is known, then the distance can be worked out simply using Equation 1.1, where ∆t is the round-trip travel time of the light and ∆d is the calculated distance:

\Delta d = \frac{c \, \Delta t}{2} \tag{1.1}
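As a minimal illustration of Equation 1.1 (a C++ sketch for this discussion only; the function name and the example timing value are illustrative, not from the report):

    #include <iostream>

    // Sketch of Equation 1.1: range from round-trip travel time.
    // The factor of 2 accounts for the light travelling out to the
    // surface and back to the sensor.
    double rangeFromTravelTime(double deltaT /* seconds */) {
        const double c = 299792458.0; // speed of light in m/s
        return (c * deltaT) / 2.0;
    }

    int main() {
        // A round trip of ~6.67 ns corresponds to roughly 1 m of range.
        std::cout << rangeFromTravelTime(6.67e-9) << " m" << std::endl;
        return 0;
    }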

In reality, measuring light travel time can be difficult, especially with any noise interference.

To solve this, the light is transmitted as a coherent signal. This signal can then be compared with a

reference signal to determine the amount of time taken to traverse the distance, as shown in Figure 1.2.

In recent times, LIDAR technology has expanded to include Imaging LIDARs. This is where the returning light is received by a CMOS sensor similar to those used in digital cameras. This allows thousands of measurements to be received at the same time, building up a fuller picture of the distances to surrounding objects. These Imaging LIDARs have started to make an appearance

in consumer electronics. The first commercial success of these depth sensors was in the Microsoft

Kinect controller developed for use with the Xbox 360. The Kinect LIDAR uses a different method

to calculate depth data compared to the traditional time of flight method. The depth map is calculated

by analysing the deformation of a speckle pattern of the infrared laser light as shown in Figure 1.3.

The technique of analysing a known pattern is called structured light.


Figure 1.3: Image Showing Structured Light Speckle Pattern as used in the Microsoft Kinect

The Kinect allows users to interact with the games console using 3D body movement. Relatively

quickly the USB interface to the Kinect was reverse engineered by the open source community and

a new driver and source code made available. This led to a large community of hobbyists writing

software for the sensor for many varied applications.

Due to the success of the Kinect, several other companies have been developing their own depth

sensors. Most applications still focus around user interaction, but other applications are starting to

increase, such as their use as scanners to record room dimensions, scanning of objects for 3D printers

and suchlike.

1.2 Context

The University of Surrey has a rich heritage of spacecraft design with a focus on bringing C.O.T.S.

technology, like LIDAR’s into the spacecraft industry. Starting in 1981 when its first satellite UoSAT-1

was launched, Surrey has specialised in the design of small satellites. By using C.O.T.S. technology,

Surrey aims to reduce cost and increase technology development. Satellite proximity operations,

leading to the rendezvous and docking of small spacecraft is an active area of research. Two missions

are being developed to test new functionalities within this field, and are of direct relevance to this

project.

The STRaND-2 mission is a technology demonstrator mission to launch twin CubeSats with the

aim of testing rendezvous and docking technology [6]. It builds upon the successful STRaND-1

mission, which was the first to test a smart phone in space. This mission is being developed with

Surrey Satellite Technology, a spin-off from the Surrey Space Centre (SSC) at the University of


Surrey.

AAReST (Autonomous Assembly of a Reconfigurable Space Telescope)[7], a joint mission between

Surrey and CalTech, aims to reconfigure a group of CubeSats whilst in orbit to demonstrate that a

space telescope could be constructed from multiple satellites. This is a key enabler for future astronomy

as newer missions will require mirrors which are too big to be launched on current rockets. The

CubeSats are expected to be launched together, but on orbit will undock and redock with one another

to place them in the correct configuration for the telescope. As the alignment between the mirror

segments on each satellite will not be perfect, adaptive optics will be used allowing the mirror to

be focused. The spacecraft bus systems, propulsion and rendezvous and docking technologies will

be developed by SSC including a novel magnetic docking system for CubeSats. Caltech will be

responsible for the mirror structure.

1.3 Aim and Objectives

1.3.1 Aim

The main aim of the project is a proof of concept investigation for a low cost rendezvous sensor.

This research problem is formally stated as:

“To demonstrate that a Low Cost C.O.T.S. LIDAR can be used as a rendezvous depth

sensor for Satellite Proximity operations.”

1.3.2 Assumptions and Constraints

• The scope of this project will focus on selecting a suitable sensor as well as developing and

assessing the interface between the sensor and the processing board.

• The project will assume that the sensor suite will be used for CubeSat type missions.

• Mission requirements will be assumed as similar to those for the AAReST [7] and STRaND-2 [6]

spacecraft concepts, as these are the most likely upcoming missions for the LIDAR to be included

as a payload.

• It is assumed that the sensors will have to provide navigation data on the order of several

meters until the point of capture by the docking system. The docking system specifications are

assumed to be similar to the magnetic docking system designed by Craig Underwood at the

SSC.

• The sensor will interface with a processing board based on the Raspberry Pi processor [1]. The set-up of the Raspberry Pi will only be required to facilitate the development and testing portion of the


project. Further flight preparations such as boot loaders, operating system optimisation and

similar will not be considered in this project. (This is being investigated by other researchers

within the SSC.)

1.3.3 Detailed Objectives

1. To research and select a suitable depth camera. This will involve a literature review of different

sensors and a SWOT analysis to assess the merits of candidates.

2. To develop suitable hardware interfaces to support development data acquisition and testing.

This involves designing a suitable hardware structure to contain the LIDAR and control board,

provision of a suitable power supply, as well as designing a physical interface to the test air-bearing

satellites.

3. To develop the software interface between the LIDAR and control board. Assessment of raw

processing abilities and processing requirements.

4. To investigate and develop filters that allow detection of a target satellite.

5. To implement visual processing algorithms that will detect range and pose of target satellite.

6. To demonstrate the system on air bearing simulator satellites.

7. To characterise basic capabilities and restrictions of the system. Power budgets, processing

frame rate, and effect of ambient lighting conditions will be looked at with the intention of

being able to inform future work.

1.4 Novelty

The novelty in this project is the electrical, mechanical and software integration of the different

systems required for the full design. The project makes use of a Low Cost Time of Flight Imaging

LIDAR. These are new to the market and are now at the forefront of depth sensor applications. The

project will also make first use of the 3U air bearing satellite and will be one of the first uses of this

technique in the SSC.

1.5 Challenges

• Low cost depth sensors are new technology. As with any infant industry, there is a lot of flux

and rapid change within the time-scale of the project. For some sensor types there are gaps in the literature, due to the relatively short time these sensors have been available and the proprietary

nature of the hardware.


• Reverse engineering of a driver was required. This is an extremely challenging procedure. It

carries significant risk in relation to schedule delays as well as risk in relation to achieving the

main aim of the project.

• Due to the restrictions of the processing board to be used on the presumptive mission, vision processing algorithms will need to be very efficient.

1.6 Structure of Report

The report structure is generally time-line based; however, different aspects of the project are grouped into separate sections to aid clarity. First, a literature review is conducted, which identifies different LIDAR candidates and compares them in a SWOT analysis. A sensor is chosen and a hardware interface developed. The test apparatus is looked at and progressed to a usable state. The report then

moves on to the software interface development and vision algorithm implementation. Testing and validation results are provided, leading to a review of the achievements of the project and an outline of future work.

1.7 Definitions

Depth Image

Also known as a 'Depth Map', 'Z Image', or 'Z Map'. This is a standard name for a set of data which encodes depth (distance) data. It is often displayed as an image with the depth encoded as a colour or intensity, as shown in Figure 1.4. For this project, when displayed as an image, pixels are referenced on an x-y 2D Cartesian coordinate system with the origin at the top left pixel. It should be

noted that the encoded depth is a direct range from the sensor to the backscattering surface and not

necessarily in the direction perpendicular to the x-y plane.

Point Cloud

This is a collection of data which contains x, y, and z Cartesian co-ordinates of one or multiple

points. When point clouds or Cartesian co-ordinates are discussed in this project it is assumed that

we are dealing with a standard right-handed Cartesian system with the sensor at the origin. The positive Z direction is defined as looking outwards from the sensor, the positive Y direction is defined in the sensor 'up' direction, and the positive X direction is given using the right-hand rule.
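As a hypothetical sketch of how these two definitions relate (the pinhole camera model and the intrinsic parameters fx, fy, cx, cy are assumptions for illustration, not DS325 calibration values), a depth-image pixel can be back-projected into a point in the sensor frame described above:

    #include <cmath>

    struct Point3D { double x, y, z; };

    // Back-project a depth-image pixel (px, py) and its measured range
    // into the right-handed sensor frame (+Z out of the sensor, +Y up).
    // fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    // The depth value is a direct range along the pixel's ray rather than
    // a distance perpendicular to the image plane, hence the division by
    // the ray length to recover the Z component first.
    Point3D backProject(int px, int py, double range,
                        double fx, double fy, double cx, double cy) {
        double rx = (px - cx) / fx;   // ray slope in X
        double ry = (cy - py) / fy;   // image y grows downwards, sensor Y up
        double rayLen = std::sqrt(rx * rx + ry * ry + 1.0);
        double z = range / rayLen;    // convert direct range to Z depth
        return { rx * z, ry * z, z };
    }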


Figure 1.4: Example Depth Image [6]


1.8 Abbreviations

Abbreviation Description

AAReST Autonomous Assembly of a Reconfigurable Space Telescope

ARM ARM Processor Architecture (ARM Holdings plc)

COTS Commercial Off The Shelf

DHCP Dynamic Host Configuration Protocol

DS325 Softkinetic DepthSense 325

ESA European Space Agency

GNC Guidance Navigation and Control

GPIO General Purpose Input/Output (Connector Pins)

I2C Inter-Integrated Circuit

IP Internet Protocol

LIDAR LIght Detection and Ranging

JPL Jet Propulsion Laboratory

NASA National Aeronautics and Space Administration

SSC Surrey Space Centre

SSH Secure SHell

SSTL Surrey Satellite Technology Ltd

STRaND Surrey Training, Research, and Nanosatellite Demonstrator

SWOT Strengths, Weaknesses, Opportunities, and Threats

TCP Transmission Control Protocol

TOF Time Of Flight

UDP User Datagram Protocol

USB Universal Serial Bus

VNC Virtual Network Computing

X86 80386 Compatible CPU


2 Literature Review

2.1 Introduction

In this section, a review is undertaken into the current uses of technology in spacecraft proximity operations, moving on to a review of possible suitable sensors. Available low cost sensors, specifically LIDARs, are researched and a comparison between them is undertaken before a candidate is selected. Finally, we finish with an introduction to, and review of, the literature available for some of the technologies that will be used in implementing the sensor interface.

A wide range of literature is required to make a project of this kind successful, ranging from

details of previous space missions, to instruction guides for C++ programming. One of the challenging

aspects of the literature review is the lack of information, due to the novelty of both the sensors and their intended application. Many papers and other information only became available towards the end of the project. For example, LIDAR testing using the ATV (European Automated Transfer Vehicle)

took place within the last week of the project. In respect of the chosen LIDAR, basic information

only was available until a more detailed overview was published in the last few weeks.

2.2 Sensors used in Spacecraft Proximity Operations

In the past, manned spacecraft have typically made use of the human ability to integrate information from multiple sources such as video guidance systems, video overlays, and laser range finders [8].

On the Space Shuttle, the Trajectory Control System (TCS) was a laser-based system that tracked

retro-reflectors located on the ISS to provide bearing, range and closing rate information. The system

was ultimately controlled by a pilot. Soyuz, Progress and the European Automated Transfer Vehicle

(ATV) were designed to dock automatically, but relied on manual control as a backup. In the case

of the ATV this involved a videometer that monitored laser pulses sent from the ATV and returned

back via retro-reflectors [9]. This was similar to the system used by the TCS. Russian vehicles use a radar-based system called Kurs, which also uses a reflector to determine range.

An understanding of these systems is important to appreciate the current operations, but they bear

little resemblance to the new depth image type sensors, and therefore little of this research is directly

applicable.

Research papers referencing the newer scanning LIDAR sensors tested in the last days of the

shuttle are able to provide some relevant information. The sensors envisioned in this project are

of a different design, but the underlying physics is shared. The basic methods of processing data

generated by the sensors and converting this into useful navigation data will be similar. One of the

best sources for information on this is a NASA provided paper summarising the test demonstrators.


[dto] The paper provides an evaluation of the performance of several sensors, including TriDAR,

STORM, and DragonEye. It provides a detailed assessment of each one, which can inform this project

of how the chosen sensor could be assessed both qualitatively and quantitatively. It also informs about

several issues that affect these sensors. The TriDAR system, in certain modes, identified only part of

the ISS, which caused a navigation error in the order of 20m. This suggested that when designing

the visual processing algorithms, initial detection of the object needs to be well thought through and

must detect the whole of the object to get good bearing and range information. The report also notes

that all sensors were affected by spurious reflections off the corners of the ISS. As the physics is

broadly the same, this would be something that should be assessed for low cost sensors as well. A

paper by Christian et al [10] also supports the NASA paper and is broadly based on similar data from

the same tests.

The NASA documents are supported by additional papers, such as Zhu et al [11] and Ruel et al

[12], which focus on the individual tests. Both papers state the advantages of depth information

sensors, as well as giving detailed information about the capabilities and basic design of the sensors.

With many of the sensors being proprietary, these papers offer a good insight into an area which is

normally difficult to access.

A similar test being conducted by ESA (European Space Agency) is attempting to gather data

on LIDAR imaging. ATV-5 Georges Lemaître is in orbit at the time of writing and carries the LIRIS

(Laser InfraRed Imaging Sensors) depicted in Figure 2.1. While only a passenger payload at the

current time, the aim is to gather test data to demonstrate that a LIDAR can be used as a rendezvous

sensor for a non-cooperative target. This is equivalent to the aim for this project, but on a larger scale.

Whilst little information other than press releases is currently available, the mission illustrates that the importance of this subject is recognised by different space agencies, and some of the uses for the capability. [ATV]

Outside manned space missions, there is one relevant mission to consider. NASA’s DART mission

used a video guidance sensor to demonstrate automatic rendezvous. The mission was not successful

and resulted in a collision between the two spacecraft. Whilst the sensor is of a different type to

the depth sensors being investigated, NASA's mishap report [13] is worthy of study and highlights

the importance of how the proximity navigation data is integrated into the control system. It also

highlighted the importance of assessing the confidence of the measurements from the system and

taking correct actions in the event of a sensor system failure.


Figure 2.1: Graphics Depicting the LIRIS Experiment

2.3 Proximity Operations involving CubeSats

The PRISMA mission [14] is a demonstrator mission similar to the missions this project is

expected to influence. In 2010, two satellites called Mango and Tango were launched together. Both

satellites were slightly larger than traditional CubeSats, but sufficiently similar to provide comparisons.

The aim of the mission was to undock and perform proximity operations between the two spacecraft.

Despite spacecraft failures, mission objectives were completed. The sensor used in the relative navigation was a vision-based camera as opposed to a LIDAR. Though mostly successful, the PRISMA mission highlighted the disadvantages of vision-based systems, specifically dealing with changeable light conditions during sunlit operations and availability during darkness.

At the time of writing, no other CubeSat sized spacecraft have successfully rendezvoused and

docked. However, there are several missions on the drawing board that will be applicable to the demonstration that we are trying to achieve. The papers and information of most use to this project

relate to the STRaND-2 and AAReST spacecraft which this project hopes to directly influence. The

STRaND-2 paper [6] contains information on the mission concept and spacecraft design. This paper

gives us a basis of the size and capability of the spacecraft and will form the assumptive mission and

spacecraft that the sensor will be developed for. Whilst the aim of these new depth sensor techniques

is to achieve detection without any cooperative aspects of the target, there must be some knowledge


of the target's dimensions, which this paper can support. Information on the AAReST [7] spacecraft

can also provide similar requirements. Specifically, the AAReST mission concept has detailed plans

for the docking system. This provides us with a capture limit for the docking system and therefore the

sensor range requirement for this project that would allow navigation data all the way to the capture

point. Much of the detailed information is contained within presentation and reviews on AAReST

which are available within the SSC.

2.4 Low Cost Depth Sensors

The past few years have seen the development and commercialisation of low cost depth sensors.

As is common with young industries in the technology arena, there is a high level of turmoil and

change within the companies involved. Even within the relatively short time-scale of this project, it

is expected that the technology and players involved in developing and marketing the products will

change. The first commercial success of low cost depth sensors was in the Microsoft Kinect controller

developed for use with the Xbox 360. This was a structured-light 3D scanner type sensor [15]. The chipset responsible for the depth sensing was developed by the Israeli company Primesense. The sensor

was developed primarily for skeleton relative movement recognition, to allow users to interact with

games by moving their body. After an open source driver was developed by hobbyists, applications

for the Kinect grew, and with it the industry-led OpenNI framework was built to develop further

applications. Microsoft followed by releasing their own framework for developers. Many papers

have been released for systems that use the Kinect sensor to achieve depth perception. However, very

few are directly applicable to this project as they focus on skeleton tracking and relative movement as

opposed to absolute distance measuring.

Following on from the release of the Kinect, Primesense started to release its own family of products based on the same technology, starting with the Carmine. In addition, they started to licence their technology to other companies, such as ASUS with the ASUS Xtion. The family is depicted in

Figure 2.2. As they have nearly identical chipsets, all are very similar in capability.

An open source driver exists for the Primesense family based on OpenNI, with very good community support. However, after research was done on support forums, no examples of the sensor working successfully on a Raspberry Pi could be confirmed, with most comments suggesting that the processing

speed is not high enough to support successful functioning. This would be a risk to the project.

(NOTE: As the project has progressed, the Primesense family is now no longer in production.)

The second family of depth sensors available is from the Belgian company Softkinetic. The first

sensor, the DS311 was built as a Kinect competitor. It featured a long and short range capability

which allows for long range skeletal tracking, as well as short range finger tracking, an ability that the Kinect does not have.

Figure 2.2: Primesense Family

All the Softkinetic sensors use time of flight to create a depth map. The follow-up DS325 sensor has a much smaller form, but is focused on close finger tracking only [16]. This means its range is shorter than that of the Primesense sensors. It is able to detect objects as close as 15 cm.

Initially, the new Softkinetic DM536 was considered as it was the smallest. However, the module is not currently on sale. After contacting the company, it was found that the DM536 was identical to the DS325 (except for the casing), and so the DS325 could be disassembled to achieve the same objective, with a waiver obtained from Softkinetic. In terms of interface drivers, Softkinetic have released drivers for both Windows and Linux. Currently, no Linux ARM version has been released, meaning that a driver for the Raspberry Pi would have to be developed. The Softkinetic family is depicted in Figure 2.3.

Softkinetic has a strong working relationship with Intel, and as part of this Intel (Creative) helped develop the Senz3D depth camera based on the Softkinetic chipset. This formed the demonstration hardware for Intel's Perceptual Computing Framework [17]. While there are small differences in how the DepthSense and Senz3D sensors communicate with their host, requiring different drivers, for all intents and purposes the hardware is identical.

General documentation related to the Softkinetic family of sensors is more limited. This is due to their more recent development compared to the Primesense family, and the use of wholly proprietary drivers has limited the developer user base. However, there are some research papers which do provide some very useful insight. The University of Waikato performed an analysis of the Softkinetic DepthSense DS325, measuring the accuracy and precision of the sensor. The stated results mean that a similar undertaking may not be required for this project, although checks still need to be made in the case of a different driver being used. In this case, the results can be used to compare the difference between drivers. From online support pages on the Softkinetic website, development of the sensor is ongoing within the same model designation, hence the results may not be strictly valid for more recent purchases, so this needs to be borne in mind [16].


Figure 2.3: Softkinetic Family

A good deal of information about the chipset can be obtained from Texas Instruments, who have begun to licence Softkinetic's technology. Several useful white papers, made available late in the project, provide data on the design and operation of the sensor [5]. Some of this material has been written by Softkinetic themselves but only released through Texas Instruments.

Clarkson et al [18] provides information on how to correct for distortion in the Kinect system. The

techniques could also be used for the time of flight system to correct for lens distortion. Depending

on the time-line of the project, this could be used to improve the accuracy of the system once tested.

2.4.1 Comparison and SWOT Analysis of Low Cost LIDARs

A major decision point of the project was the choice of sensor. During research, it was determined that there were five main contenders across the two families of depth sensors. The Kinect was the presumptive sensor at the start and the one sensor that was easy to source, and so this was chosen

to represent the Primesense family. The DS325 was chosen as the Softkinetic family candidate.

A SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) was performed. The sensors’

advantages and disadvantages from an integration perspective were assessed, as well as the ability of

the research into each sensor to contribute to the community. The SWOT analysis was significantly

different for each sensor and the comparison is shown in Figure 2.4.

There were two main factors that led to the decision to purchase the Softkinetic DS325 depth sensor for the project: capability, and how the knowledge gained would apply to industry applications in the future.


Microsoft Kinect (Structured Light)

Strengths:
• Extensive documentation.
• Well known USB interface.
• USB driver exists.
• Good range up to 4 meters.

Weaknesses:
• High mass.
• Dimensions are large, almost the full length of a 3U CubeSat.
• Power consumption high.
• No examples of the driver working with a Raspberry Pi.

Opportunities:
• Well known, good public relations.
• Market leader for several years.

Threats:
• Uncertain future for the sensor and similar sensors in the same family; the company chose not to support future open source development.
• Replacement sensor heavily proprietary, likely unable to be used.

Softkinetic DS325 (Time of Flight)

Strengths:
• Suitable mass for a CubeSat.
• Suitable volume for a CubeSat.
• Range overlaps the docking system capture distance.
• Low power usage.
• Field of view better in the vertical.

Weaknesses:
• No driver exists for the Raspberry Pi; risk to the project due to reverse engineering.
• Lack of documentation.

Opportunities:
• Good prospects for the future by forming a basis for the Intel Perceptual Computing framework.
• Sensors are new; any research will be adding to basic knowledge.
• Newer improved sensors on the horizon.

Threats:
• Newer sensor with less market share.
• Less well known company.

Figure 2.4: Combined SWOT Analysis of Microsoft Kinect and Softkinetic DS325



In terms of hardware capability, the Softkinetic DepthSense was the best choice in most regards. It was the lightest and smallest, and required the lowest power consumption; all perfect for a CubeSat. It uses true time of flight, which provides good precision when the return signal is strong [19]. While the far range was not as good as that of the Primesense family, it is capable of operating at ranges as close as 150 mm. This overlapped with the docking capture distance expected for either of the two reference missions.

However, the main driving force behind the decision was the future applicability of the research.

Whilst for the past few years the Primesense family had been the industry leader in depth sensing, the

company's buyout by Apple introduced uncertainty into its future. This could have meant that the end result of the research was less applicable in the future, as the research might have needed to be repeated with another sensor. While this was by no means a certain argument at the time the depth sensor choice was made, as the project matured the decision proved a good one. By the end of the project, the Primesense company had been shut down; the original Kinect and similar sensors were no longer in production; and associated support frameworks had been lost or left unsupported. The Softkinetic sensor, on the other hand, has become more prominent in the industry and is now the base technology for both Intel and Texas Instruments.

For all its benefits, the lack of an available driver for the Raspberry Pi was seen as a large risk factor in the project. This risk was accepted due to the greater research benefits of the chosen sensor.

2.4.2 Softkinetic DS325 Time of Flight Sensor

Little information on the selected sensor, the Softkinetic DS325, is provided by the manufacturer; for most applications detailed information is not needed, as the company provides its own filtering and a software development kit to access the sensor at a high level. However, as stated earlier, Texas Instruments now provides the technology under licence from Softkinetic. Whilst the products will not be exactly the same, we can assume that the broad design is comparable.

From the two Texas Instruments white papers [5][20] we learn that the sensors are time of flight sensors which use a coherent modulated light source to illuminate an object or scene. Typically the light source is in the near infrared range, at a wavelength of circa 850nm. There are two different methods of measuring the reflected light, depending on whether the emitted light is pulsed or comes from a continuous wave source. Lefloch et al [21] state that the Softkinetic chipsets are based around the continuous wave method. In this case the light is modulated before emission. Multiple samples are then taken of the returned light, each 90 degrees out of phase. According to [19], from these measurements the phase angle change ψ between the emitted light and the light received back can be calculated using


Equation 2.1:

    ψ = arctan((Q3 − Q4) / (Q1 − Q2))    (2.1)

and from the phase angle ψ, the distance d the light travelled can be calculated with Equation 2.2, where f is the modulation frequency of the emitted light:

    d = (c / (4πf)) · ψ    (2.2)

One issue that occurs with this set-up is aliasing. The phase angle is measured, but it wraps around after 2π. Aliasing (repeated values) therefore occurs at the distance given by Equation 2.3:

    d_ambiguity = c / (2f)    (2.3)

To solve this, the sensor uses multiple frequencies. As the ambiguity distance is dependent on frequency, the only real detection will be where each of the different frequencies reports the same phase change.

The uncertainty of the distance measurement is given by Equation 2.4, where qe is the quantum efficiency, fm is the modulation frequency, Deff is the demodulation frequency, It is the total intensity, and Is is the active light signal intensity [5]:

    ΔD = (c / (2·fm)) · (1 / √8) · √(qe·It) / (Deff·qe·Is)    (2.4)

This means that, of the things we can influence, the sensor is more accurate when more reflected light is backscattered from the laser, while ambient light introduces a noise factor that reduces the accuracy [20].
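To make the continuous wave calculation concrete, the following C++ sketch implements Equations 2.1 to 2.3 for a single pixel. The four sample values and the modulation frequency are illustrative placeholders, not values taken from the DS325.

    #include <cmath>
    #include <cstdio>

    int main() {
        const double c = 299792458.0;   // speed of light (m/s)
        const double f = 50.0e6;        // assumed modulation frequency (Hz)

        // Four samples of the returned light, 90 degrees apart (placeholders).
        double Q1 = 0.80, Q2 = 0.20, Q3 = 0.65, Q4 = 0.35;

        // Equation 2.1: phase shift between emitted and received light.
        // atan2 keeps the correct quadrant where a plain arctan would not.
        double psi = std::atan2(Q3 - Q4, Q1 - Q2);
        if (psi < 0.0) psi += 2.0 * M_PI;        // wrap into [0, 2*pi)

        // Equation 2.2: the 4*pi (rather than 2*pi) accounts for the round trip.
        double d = c / (4.0 * M_PI * f) * psi;

        // Equation 2.3: distance at which the phase wraps and readings repeat.
        double d_ambiguity = c / (2.0 * f);

        std::printf("phase %.3f rad -> depth %.3f m (ambiguity every %.2f m)\n",
                    psi, d, d_ambiguity);
        return 0;
    }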

2.5 Interface and Testing

This section reviews some of the technologies and related information that will be used in this

project to interface and process data from the depth sensor.

2.5.1 Raspberry Pi

The Raspberry Pi computer released in 2013 as shown in Figure 2.5 was developed by Mark Upton

and associates in order to provide a cheap way of “getting young people interested in technology and

programming”. Based on a Broadcom 2835 module with a ARM11 processor, it is able to run a

complete Linux based operating system, whilst being around the size of a credit card. [1]


Figure 2.5: Raspberry Pi

Over recent years it has become very popular with hobbyists. This popularity, as well as the low cost, led to its selection for flight test development by the Surrey Space Centre, and therefore its use in this project.

The Raspberry Pi merges an impressive range of capabilities from different classes of device. It can act as a low level 'electronic control' type processor, such as an Arduino, with features such as GPIO (General Purpose Input/Output) pins for basic electronic commanding. At the other end of the spectrum, it can also run a fully functioning Linux based operating system with more complex interfaces such as USB, HDMI and LAN. This is important for the project, as it allows a direct USB interface to the LIDAR, as well as the ability to program the visual processing algorithms in C++. Of specific use for space, the Raspberry Pi also directly supports the I2C protocol, which is often used for spacecraft bus systems.

Due to the academic target market of the Raspberry Pi, a large community of developers has formed, resulting in a lot of available support. The Raspberry Pi Foundation provides a wealth of information via its website [22]. This contains detail on the hardware as well as community forums and resources for software configuration, troubleshooting and developer information.

The processing speed of the Raspberry Pi is not as great as some of the newer ARM platforms now available. This is important, as it is likely to restrict the type of vision processing algorithms that can be run, limiting it to less intensive ones. Graphics capabilities are very good, equivalent to a modern games console; however, basic processor performance is equivalent to a 300MHz Pentium II, very slow compared to modern personal computers.


2.5.2 C++

C++ is one of the most used programming languages in the world and is used across almost all applications. Its roots date to 1979, when the C language began to be extended with classes at Bell Laboratories. Over the years it evolved to the current standards, and it makes C based programming more efficient and flexible.

C++ is largely compatible with its C ancestor (albeit sometimes with some modifications). This makes it a good choice for this project, as many visual processing libraries make use of basic C or C++, and the two can therefore be used together. More flexible and faster to program in than C, due to the ability to use classes, streams and suchlike, C++ is nonetheless one of the more complex languages. It allows pointers to memory to be set up, so care must be taken to allocate and reference memory correctly, otherwise memory leaks and exceptions can occur. Nevertheless, this lower level approach often allows C++ to compile programs with faster execution times than most other languages. This is of huge benefit to this project, given the high processing requirement on a board with only moderate processing ability compared to a modern PC. Therefore C++ will be the programming language of choice for the project.

There are many IDEs (integrated development environments) available; however, when working with the language on Linux, often just a good text editor is used. Compilers such as g++ are used to compile the C++ language into machine code and link the program together. On Linux, compilation and linking are often facilitated with 'makefiles': scripts that automate the building process. Simple scripts need only contain one or two lines calling g++ with the location of the code as an argument, but makefile scripts can be extended with variables, multiple options and nested scripts, allowing for more complex scenarios.
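As an illustration only, a minimal makefile of the kind described might contain a single target and compiler call (the file names are assumed, not taken from the project):

    viewer: main.cpp
        g++ -o viewer main.cpp

Running 'make' then rebuilds viewer whenever main.cpp changes; note that makefile recipe lines must begin with a tab character.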

Due to its widespread use, there is an overwhelming amount of documentation available for C++ on the internet and in books. Specialist books such as Practical Computer Vision Using C [23] give good guidance on how to convey mathematical vision processing concepts into practical code. The book specifically covers basic algorithms such as thresholding, geometry recognition, and classification of objects.

2.5.3 Universal Serial Bus

The Universal Serial Bus is a comprehensive standard that controls both the hardware and software protocols used to communicate between hosts (the Raspberry Pi) and devices (the sensor). It comprises a very high level protocol which allows fast data speeds, connection of multiple devices, and multiple simultaneous logical data streams. One of the big advantages of USB is device enumeration. When a device is plugged in, the host is able to query it for information. The device


Figure 2.6: USB Data Flow Schematic

then provides the host with information such as the type of device, how its interface is structured, and the protocols and data speeds it can support. This allows the host operating system to select the correct software to interface with it.

After enumeration, data can be sent from the host to various 'endpoints' on the device, and vice versa. These endpoints are different for each device, but are communicated to the host via enumeration. Each endpoint has a defined type which controls how data is transferred.

Control Transfers: Used for command and status operations, e.g. commands to switch a device on.

Interrupt Transfers: Used as priority messages, e.g. computer mouse / keyboard data.

Isochronous Transfers: Used to stream large amounts of data where dropped packets are not critical, e.g. video streams.

Bulk Transfers: Used to transfer large amounts of data requiring guaranteed delivery, e.g. printer data.

At a lower level, each transfer is broken down into separate packets depending on the transfer type. The data flow and packet specification of a Control Transfer are shown in Figures 2.6 and 2.7 respectively.

Like all widespread protocols, there is a wealth of information spread across the web. One critical source of information is the USB-IF [24], which contains the formal specification of the standard. USB in


Figure 2.7: Isochronous and Control Transfer Packet Block Diagram

a Nutshell [25] provides a much simpler description of the standard, which is more accessible from a practical perspective.

2.5.4 LibUSB Library

The LibUSB library simplifies access to USB devices without the need for very complex kernel-mode drivers. This simplifies and speeds up the development of drivers, and also makes the solution more robust, as any issue with a kernel-mode driver could cause instability in the entire system. The library is used in many different open source projects; despite this, there is very little information available on how to implement it. The only information worthy of note is provided by the developers on their website [26]. This contains basic information on each function, but does not go into detail. An extensive search was performed on the internet; however, any additional information found was effectively a clone of what was available on the original website, and therefore did not add to it.
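To give a flavour of the library, the sketch below opens a device by vendor and product ID and performs a single control transfer using libusb-1.0. The IDs, request number and values are placeholders, not the DS325's actual parameters.

    #include <libusb-1.0/libusb.h>
    #include <cstdio>

    int main() {
        libusb_context *ctx = nullptr;
        if (libusb_init(&ctx) != 0) return 1;

        // Vendor/product IDs are placeholders; real values come from enumeration.
        libusb_device_handle *dev =
            libusb_open_device_with_vid_pid(ctx, 0x1234, 0x5678);
        if (!dev) { libusb_exit(ctx); return 1; }

        libusb_claim_interface(dev, 0);          // take ownership of interface 0

        // One vendor-specific control transfer; request/value/index illustrative.
        unsigned char buf[64];
        int n = libusb_control_transfer(dev,
            LIBUSB_ENDPOINT_IN | LIBUSB_REQUEST_TYPE_VENDOR | LIBUSB_RECIPIENT_DEVICE,
            0x01 /* bRequest */, 0 /* wValue */, 0 /* wIndex */,
            buf, sizeof(buf), 1000 /* timeout, ms */);
        std::printf("control transfer returned %d\n", n);

        libusb_release_interface(dev, 0);
        libusb_close(dev);
        libusb_exit(ctx);
        return 0;
    }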

2.5.5 TCP/UDP Protocols

There are various protocols available for streaming data over networks; two are by far the most popular in current use with IP technology. TCP (Transmission Control Protocol) is a client-server based transmission protocol. Clients connect to a listening server via an open port, and data can then be streamed in both directions between the two. Each packet is acknowledged, so a transmitted packet is guaranteed to reach its destination or an error will be flagged. UDP (User Datagram Protocol) is a more basic protocol than TCP. Implementation is more straightforward and the transmission overhead is much smaller; however, data is transmitted without any guarantee of delivery. Extra information often needs to be sent to provide control of the data, as there are no internal control mechanisms


to regulate the flow. So whilst the UDP protocol itself is more bandwidth friendly than TCP, the extra control data that often has to be sent may lead to using more bandwidth than TCP, which has flow control built in.

There is a multitude of information available on the web for programming TCP and UDP sockets. Two sources are used in this project: 'Linux How Tos' [27], which gives a detailed description of how to implement sockets in C++, and the Matlab help files [28], used to implement the Matlab portion of the project, as the Matlab implementation is very specific to that program.
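For reference, a minimal TCP sender using the POSIX socket calls described in [27] might look as follows; the IP address, port and payload are placeholders.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstring>

    int main() {
        int sock = socket(AF_INET, SOCK_STREAM, 0);           // TCP socket
        if (sock < 0) return 1;

        sockaddr_in server{};
        server.sin_family = AF_INET;
        server.sin_port = htons(5000);                        // placeholder port
        inet_pton(AF_INET, "192.168.1.2", &server.sin_addr);  // placeholder address

        if (connect(sock, reinterpret_cast<sockaddr*>(&server), sizeof(server)) < 0)
            return 1;

        const char msg[] = "frame payload would go here";
        send(sock, msg, std::strlen(msg), 0);   // acknowledged, ordered delivery
        close(sock);
        return 0;
    }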

3 Hardware Development and Set-up

The aims of the hardware design were to interface the LIDAR with the Raspberry Pi, satisfy all power requirements, and allow the components to be securely mounted onto the air-bearing simulator satellites. After investigating the test apparatus, it was decided that the most logical place to mount the LIDAR and associated hardware was on top of the 3U simulator satellite. The design needed to be completely self contained: any wires feeding the system would impart torques on the test apparatus, negating its frictionless aspect. For this reason a power source also needed to be integrated into the system. Similarly, external communication for testing necessitated a wireless solution, so a wireless dongle was used to transmit data to a nearby Windows client PC. It is expected that the flight design will replace this with a wired I2C connection from the Raspberry Pis to the spacecraft's control system. A basic system design is shown in Figure 3.1.

3.1 Softkinetic DS325

As stated in the literature survey, the DS325 can be disassembled into a smaller version of the sensor by removing the casing and support structure. A waiver was obtained from Softkinetic, and disassembly was conducted according to the manufacturer's instructions. This left the sensor and lenses contained in a smaller metal enclosure. Further disassembly to circuit board level was considered but excluded, as it would have increased schedule risk through possible damage to components and was not necessary for the test purposes. It remains open work for the future.

The USB cable connecting the sensor was circa 1 meter long and was therefore prohibitive to fitting on the simulator satellites from both a volume and a mass point of view. It was consequently reduced in length and soldered to a new USB connector to recreate the interface.


Figure 3.1: Block Diagram of System

3.2 Raspberry Pi

For testing, the Raspberry Pi Model B (512MB memory) was used. The Raspbian operating system was installed with standard options [29].

Wireless capability was provided by a WiPi network adapter from the company Element 14 [30]. Several different wireless configurations were tried to find the most reliable network connection for both test and development. Setup via the University's network proved problematic due to a frequently changing IP address, so an ad-hoc (direct) wireless set-up between the Pi and the PC client was preferred. Setting up an ad-hoc network was achieved by editing the "/etc/network/interfaces" file, which contains the network interface settings. ISC DHCP [31] was downloaded and installed; however, the DHCP service proved very intermittent and often required multiple restarts to obtain an IP address for the client PC. Setting a static IP address on the ad-hoc network for both the Raspberry Pi and the client PC was therefore the more suitable way forward.
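For illustration, an ad-hoc stanza in /etc/network/interfaces with a static address might look like the following; the address, network name and channel here are assumptions, not the values used in the project:

    auto wlan0
    iface wlan0 inet static
        address 192.168.1.1
        netmask 255.255.255.0
        wireless-mode ad-hoc
        wireless-essid PiAdHoc
        wireless-channel 6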

Most access to the Raspberry Pi was conducted through the command line, remotely via SSH (Secure Shell Protocol). This was a significant time saving. To aid development, a VNC (Virtual Network Computing) server [32] was installed to allow the desktop environment to be accessed from the client PC.

GEdit [33], a text editor, was also installed to aid code development, and Samba [34] was installed to allow file transfer between the Raspberry Pi and the Windows client PC.


3.3 Power Supply

In order to identify the correct battery size to power both the LIDAR and the Raspberry Pi, a basic power analysis was performed to approximate the power consumption. As the power supply in this project is based on the USB specification, a supply voltage of 5V was assumed for all components. Approximate requirements were gathered from hardware datasheets and are summarised in Table 3.1.

                    Voltage   Current
Softkinetic DS325   5V        <500mA (<2.5W) [35]
Raspberry Pi        5V        700-1000mA [36]

Table 3.1: Listed Power Supply Requirements

From this, a maximum current consumption of 1.5A was assumed for the system and used to size the battery. A 12000mAh lithium ion battery pack designed to supply USB devices was sourced that would fit inside a CubeSat. This provided ample power: at the assumed maximum draw, a full charge gives roughly 12000mAh / 1500mA ≈ 8 hours of use.

The importance of a correct power supply for the sensor was made apparent during testing of the interface driver. Initially the LIDAR was plugged directly into the Raspberry Pi USB port; however, after much painstaking troubleshooting it was found that the Raspberry Pi was not providing enough power to the LIDAR, resulting in a detection range of circa 20cm instead of 1m. (Troubleshooting was made harder by the presence of a second issue, with the output from the interface driver itself.) Whilst the Raspberry Pi can provide current up to 500mA, the documentation states that the USB power was designed for 100mA usage and that "...while it is possible to plug a 500mA device into a Pi and have it work with a sufficiently powerful supply, reliable operation is not guaranteed." [37] This proved to be the case. To solve the issue, power was fed directly from the battery to the LIDAR via the +Vcc line of the USB cable. The GND (ground) connection was left connected to the Raspberry Pi to provide a common ground for the signal lines. The full power supply arrangement of the system is shown in Figure 3.2.

3.4 Structure

Several alternatives were considered for a structure to support the components. Initially the aim was to compress all the necessary parts into a purpose built 1U box, to simulate the constraints of the flight hardware, and a 1U cube was provided for this purpose. Velcro was chosen as the method of attaching it to the top of the simulator satellite; however, this did not give the stability required. It also became apparent that the equipment would not fit in


Figure 3.2: Schematic of System

without significant modification. A major problem was the Raspberry Pi: whilst only measuring 8cm long itself, it requires 15cm once the USB cables and flash card are in place.

The final flight hardware will look very different from the hardware used in this project. The Raspberry Pi will likely be the 'Compute' model with a much smaller footprint, the LIDAR will be stripped down to circuit board level, and the battery may not be required due to the availability of power from the spacecraft bus. It was therefore decided to design a structure that was conducive to testing. Cardboard mockups were made. A key design driver was ease of access to all components and their interface sockets.

The final design took inspiration from the simulator satellites, where four threaded rods support the major components. In this design, nuts and washers were used to space 100mm by 100mm platforms hosting the components, giving flexibility in the spacing between levels. The four supports were offset from the corners to allow the sensor to be located along one edge. M6 threaded rods were used, for compatibility with the main support structure of the simulator satellites.

To keep the cost low, several 100mm by 100mm offcuts of steel were sourced free of charge and the metalwork was completed by the author. Weight was not a major issue during this project; however, in future the steel layers would be expected to be swapped for aluminium to reduce the payload weight. The bottom plate was used as an adapter, allowing the structure to be bolted onto the load bearing threaded rods of the air bearing satellites, which had a different cross section. Velcro was used to attach the components to the structure, allowing easy removal whilst providing adequate stability.

This overall design, shown in Figure 3.3, proved very good for testing both this payload


Figure 3.3: Photograph of the Final Structure

and other test payloads. It provides a rigid test platform, whilst allowing connection to and disconnection from the simulators with four extended nuts.

3.5 Test Platform and Air Bearing Tables

In general the air-bearing simulator satellites and granite table were in good condition, but required some assembly to bring them up to a working state. Work was also required to be able to run the simulators, partially due to a lack of set-up information as well as some issues with the hardware.

Initial tests showed that the run time was only circa one minute, compared to the three minutes expected. A basic leak test found a leak from the pressurised system. Further checkout showed that one thruster was not firing correctly, and it became obvious that the supply to that thruster was the source of the leak. The pipe to the thruster was replaced and an overnight leak test confirmed that the leak was fixed.

A separate lithium ion battery was used to power the on-board Arduino controlling the thruster solenoids. This battery was non-rechargeable, so for the final testing it was replaced with a USB cable powered from the same battery as the LIDAR.

Some basic code for the Arduino was provided at the start of the project. However, as this was only very basic testing code, a more user friendly control method needed to be implemented. After setting up the wireless connection using two Arduino XBees, serial commands could be sent to the Arduino. Thrusters were set to fire in pairs for better control, and were programmed to respond to different keystrokes as shown in Table 3.2. This was developed with William


Thruster Pair Effect                                ASCII Key
Forward                                             8
Reverse                                             2
Left                                                4
Right                                               5
Clockwise Rotation (Fast - 2 thruster pairs)        9
Anti-Clockwise Rotation (Fast - 2 thruster pairs)   7
Clockwise Rotation (Slow - 1 thruster pair)         1
Anti-Clockwise Rotation (Slow - 1 thruster pair)    3

Table 3.2: Control Thruster Mapping to Keystrokes

Beckwith (undergraduate student). The main keystrokes correspond to the secondary arrow keys on most numeric keypads.

Optimisation of the thruster firing period was also required. It was found that if a thruster fired for too long (on the order of half a second or more), the internal pressure of the compressed air would fall and affect the air cushioned feet. This would cause the simulator satellite to catch on the table, introducing a torque. Some cleaning of the granite air hole feet was also done, to reduce torques resulting from uneven air flow.


Figure 3.4: The Full Air Bearing System with LIDAR attached

4 Systems Integration

4.1 Interface Driver Development

The development of the interface driver was the main focus of this project, and its most challenging aspect. After the selection of the sensor, it was apparent that acquiring an interface driver, or information on the sensor's USB protocol, would be extremely advantageous. Extensive correspondence was had with Softkinetic via email and phone regarding a Linux driver for the Raspberry Pi. They confirmed that they had a development driver, but that they would not be moving it to a release version. The main reason was that their internal tests showed the combination could only return data at a rate of one frame per second. This would not be useful for most applications, as even the basic vision processing filters needed would slow the refresh rate


further. Further discussions were had with the manufacturer over the course of several weeks to try to obtain the source code of the driver, in order to make improvements or to gain interface information which would help in writing a new driver from scratch. Offers were made to improve their driver and release the code back, and to sign any non-disclosure agreements; however, these efforts ultimately proved fruitless.

At the same time as the negotiations with Softkinetic, alternative options for the driver were investigated. Softkinetic Linux drivers for the x86 platform, as well as Android based drivers for the ARM platform, were compiled and tested. It was unlikely that these drivers would work, due to the binary libraries used in their compilation, but the attempt was made because of the massive time saving it could have brought. Several generic O/S-provided webcam and audio drivers were also tested. The audio drivers were capable of accessing and streaming the on-board microphone data; however, no depth or image sensor data could be streamed.

With no driver available, work started on creating one. Reverse engineering can be an extremely difficult task with a lot of uncertainty. It is more akin to a science research project than an engineering one, as the main task is to understand, from observations and measurements (mainly of USB data packets), how the system works in detail. There was a huge element of risk in this, as in some cases reverse engineering can be virtually impossible, especially if any digital rights encryption has been added. The only precedent for reverse engineering this type of sensor was the Microsoft Kinect. However, the Softkinetic interface turned out to be much more complex, both because more commands are required to switch on the LIDAR, and because more processing is done on board the Kinect, giving it a more straightforward interface.

Legalities

Reverse engineering can be a legal minefield. However, European law contains a clear and specific exemption which permits reverse engineering to allow interoperability between different systems. This is stated in Directive 2009/24/EC of the European Parliament and of the Council on the Legal Protection of Computer Programs (paragraph 15):

The unauthorised reproduction, translation, adaptation or transformation of the form of

the code in which a copy of a computer program has been made available constitutes

an infringement of the exclusive rights of the author. Nevertheless, circumstances may

exist when such a reproduction of the code and translation of its form are indispensable

to obtain the necessary information to achieve the interoperability of an independently

created program with other programs. It has therefore to be considered that, in these

limited circumstances only, performance of the acts of reproduction and translation by


or on behalf of a person having a right to use a copy of the program is legitimate and

compatible with fair practice and must therefore be deemed not to require the authorisation

of the right-holder. An objective of this exception is to make it possible to connect all

components of a computer system, including those of different manufacturers, so that

they can work together. Such an exception to the author’s exclusive rights may not

be used in a way which prejudices the legitimate interests of the right holder or which

conflicts with a normal exploitation of the program. [38]

The purpose of reverse engineering the driver is to provide interoperability between the Raspberry Pi and the DS325, for which there is no commercially available driver. (As discussed previously, the manufacturer was contacted and stated that they do not intend to provide one in the future.) The reverse engineering is therefore permitted under European law.

4.1.1 Assessment of Softkinetic Interface

The first course of action was to assess how the hardware and interface worked with a known set-up, to provide valuable information on how the hardware communicated. A Microsoft Windows based PC and the Softkinetic-provided Windows driver were used. A basic graphical viewer was included along with the driver. This was very easy to install and provided a quick and easy demonstration, as shown in Figure 4.1.

These initial investigations revealed some interesting functions of the Softkinetic driver:

• The user must register before being able to stream data.

• Only one user at a time can change settings on the device; a user must select a control setting to be able to do this.

• Output included the expected depth map in both integer and floating point formats. Also included were a depth confidence map, an audio microphone stream, a high definition colour video stream and a vertices map.

This information was combined with several tests investigating the activity on the USB bus when selecting different functions. It was determined that the driver used a server and client based architecture for interfacing to the hardware. While this could be a powerful method of streaming the data to multiple clients, it meant there were additional functions built into the driver that were not required. This led to the belief that a driver could be built with a higher frame rate than suggested by the Softkinetic developers.


Figure 4.1: Depth Image using Softkinetic Software on Microsoft Windows

Figure 4.2: Example USB Enumeration Data


USBlyzer [39] and Device Monitoring Studio [40] were then used to interrogate the USB interface and determine how the device was enumerated. Several hundred lines of data were painstakingly analysed to pin down critical details such as the endpoint address of the depth sensor. Examples of this data are shown in Figure 4.2. A basic understanding of the enumeration of the device could then be obtained, as shown in Figure 4.3.

The amount of data recovered by the programs was encouraging, not only for the information it provided, but also because it hinted that the device followed most of the USB standards [24]. Following this, more in-depth interrogation of the packet data could begin.

USB Packet Sniffing

The next step was to look at the packet data passing between the client and the device; this is known as "protocol reverse engineering". Several 'sniffer' programs were investigated for this purpose, and four were chosen to analyse the data. Each was good at converting data from binary to hexadecimal and ASCII characters to aid clarity, but each had strengths in different areas: each provided a different part of the data, and all had different filtering and search capabilities. The four sniffer programs used were:

• Device Monitoring Studio [40]

• Bushound [41]

• Simple USB Logger [42]

• Wireshark + Winpcap [43][44]

As understanding of the packet data improved, Wireshark + Winpcap were used increasingly, as this combination allowed access to all the raw data. Winpcap was used to 'sniff' the data, as this feature is not available in Wireshark itself. Wireshark was then able to display the data, as in Figure 4.4.

Multiple tests were performed and the packet data was recorded and analysed. It was apparent that most data transmission occurred in three stages:

1. Plugging the device into the USB socket.

2. Opening the DepthSense Viewer (and hence starting the DepthSense server).

3. Clicking the Run command to stream depth information to the viewer.


Figure 4.3: Schematic of DS325 Endpoints


Figure 4.4: Screenshot of Packet Sniffing using Wireshark

All three stages consisted of standard USB command packets, with commands moving data in both directions. The third stage, after clicking the Run command, additionally included incoming data in isochronous packets. These isochronous packets were extremely large and obviously contained large amounts of depth or video data. When only depth data was selected, only isochronous packets from endpoint 82 were observed. This confirmed suspicions from the earlier enumeration analysis that depth data was indeed linked to endpoint 82.

An attempt was made to write a program whose purpose was to connect to the device and start an isochronous stream from endpoint 82. The libusb-win32 library [26] was used to access the USB hub. Connection to the device was accomplished and the correct configuration parameter set. The interface settings were set to those seen in the packet data analysis. Initially the packet size was determined by a brute force method of trying different sizes until a successful connection was made; this suggested a 64 byte packet. Eventually a more direct method was found, making calculations based on the observed offset values in the header packets, which were reasoned to be a multiple of the packet size.
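For illustration, a minimal sketch of setting up such an isochronous read with the libusb-1.0 asynchronous API is shown below. The packet count is illustrative, dev is assumed to be an already opened and claimed handle, 0x82 is assumed to be the hexadecimal form of the endpoint referred to above, and an event loop calling libusb_handle_events() elsewhere is needed to drive completions.

    #include <libusb-1.0/libusb.h>
    #include <cstdio>
    #include <vector>

    // Completion callback: report per-packet byte counts, then resubmit
    // the transfer so that the stream keeps running.
    static void on_complete(libusb_transfer *xfer) {
        for (int i = 0; i < xfer->num_iso_packets; ++i)
            std::printf("packet %d: %u bytes\n",
                        i, xfer->iso_packet_desc[i].actual_length);
        libusb_submit_transfer(xfer);
    }

    // 'dev' is an already opened and claimed handle; 64-byte packets as
    // suggested by the brute force search described above.
    void start_iso_stream(libusb_device_handle *dev) {
        const int packets = 32, packet_size = 64;
        static std::vector<unsigned char> buf(packets * packet_size);

        libusb_transfer *xfer = libusb_alloc_transfer(packets);
        libusb_fill_iso_transfer(xfer, dev, 0x82 /* IN endpoint */, buf.data(),
                                 static_cast<int>(buf.size()), packets,
                                 on_complete, nullptr, 1000 /* timeout, ms */);
        libusb_set_iso_packet_lengths(xfer, packet_size);
        libusb_submit_transfer(xfer);
    }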

A good connection and stream were established, but only a very small amount of data was returned, most of it zero valued. It was noticed that the laser used to illuminate the scene was not switching on, so no correct depth data could have been streaming out. When using the Softkinetic viewer, the laser was activated during the third stage, when the Run command was


Figure 4.5: Identification of Packet Type and Contained Data

selected. More packet data was analysed to identify possible commands to switch the laser on. About 20 likely candidates were selected and sent, both individually and in combination, with no success. This was an extremely time consuming process, with several thousand packets to analyse, akin to looking for a needle in a haystack. A new, more comprehensive approach was therefore attempted.

4.1.2 Development of Packet Replay Software

With the set of roughly 1000 packet commands known to activate the laser, a method of recording the packets and replaying them was developed. Once the device was working with an independent program and data, it would then only be a matter of filtering the packets down to work out which were the important ones. To get this working, the USBPcap program was used to capture the packets while using the Softkinetic driver. A code library called WpdPack [45] was used in a C++ program to read the packets back. WpdPack was written for C and needed some slight modification to work in Visual C++, due to its use of a C++ reserved word. The packets could be replayed to the USB using the LibUSB library as before, but custom translation and filtering could also be written to help identify and filter packets more efficiently. An example is shown in Figure 4.5.

Packets were then replayed to the device according to the following rules:

• Incoming (device to host) packets were ignored.

• Control packets used to alter configuration and interface settings were recognised, and the commands were sent using specific LibUSB functions.

• Commands requesting data from the device were replayed.

Commands sending data towards the device were at first treated in a similar way to commands flowing to the host. However, when the packet replay method failed, a comparison was made between the known working packet format and the actual format the packets were sent in. An inconsistency was


Figure 4.6: Screenshot of Packet Replay Software

spotted in the way Winpcap recorded the data: for command packets, both incoming and outgoing data were being listed as incoming only. A set of rules based on previous packets was created to allow the replay software to handle the correct direction for each data transfer.

When the packet replay software was complete, packets could be replayed back as shown in Figure 4.6. Approximately 95% of all control packets were reported as successfully received, the laser was activated, and depth data was streamed.

4.1.3 OpenNI2DS325 Driver

At the same time as the packet replay software was bearing fruit, a new driver appeared on GitHub. Developed in Japan and named OpenNI2DS325 [2], it was designed to act as a driver for the DS325 while making use of the OpenNI middleware. The code behind it looked comprehensive, so the driver and all its dependencies were installed on the Raspberry Pi. Much of the driver bore resemblance to the reverse engineering that had already been completed, including the use of the LibUSB library to access the USB protocol. Due to the number of libraries that needed linking, and the large number of compile options that needed to be set, one of the example programs was duplicated for use as a custom test program.

The OpenNI based driver had a complex set of makefiles controlling the compilation of the software. These were examined and adjusted to allow the building of a new program. Initial tests proved very positive, with depth information successfully streamed. While the data received was similar to that from the packet replay software, using the new driver saved time in narrowing down the important commands. The new driver also took advantage of the extensive error handling of the standard OpenNI framework, so it was seen as a more reliable way forward. In addition, although the replay software could stream the data, that data still had to be understood and decoded to yield useful depth values.


Basic test output was displayed as an ASCII depth map where 1 represented 100mm, 2 represented 200mm, and so on. This was activated using the command line argument display when running the program. Due to the limited number of characters, only every 16th pixel was sampled. This was adequate to show that the depth data was streaming, as shown in Figure 4.7.

Figure 4.7: Image Showing Basic Test Output of Depth Stream

Despite the initial promise, it became apparent that the depth information being reported by the driver was in some cases incorrect. Strange 'quantisation' effects were being seen beyond about 800mm, where the data seemed to be grouped. A test was conducted to prove this. The middle pixel of the camera was used, as this is in the middle of the field of view and should look directly forward. The depth value from this pixel was then plotted against the real distance of the sensor from a target. At least 10 readings were taken every 2cm from the target.


Figure 4.8: Plot showing Actual Distance vs Reported Distance from the OpenNi2DS325 Driver.

The results, plotted in Figure 4.8, demonstrate large inaccuracies which render the output unusable for spacecraft rendezvous. The 'quantisation' effect is clearly seen when the sensor is more than 800mm from the target, and even in closer proximity the results cannot be declared accurate. After examining the driver source code in the DepthDS325Stream.cpp file, it became apparent that the conversion algorithm taking the raw data from the sensor and converting it to depth values was the cause, and that it was not something that could be rectified quickly.

The process of reverse engineering therefore had to continue, to develop a new algorithm from the raw data shown in Figure 4.9.


Figure 4.9: Example of Raw Depth Data

4.1.4 Conversion of Raw Data

Again, measurements were taken every 2 centimetres and the raw data for the centre pixel was monitored for trends. It was determined that each pixel was represented by 4 bytes of raw data, containing both the infrared intensity data and the depth data. The depth data consisted of 3 bytes, separated into three different values:

• Byte 1 - changes at short range, up to circa 300mm.

• Byte 2 - changes at intervals generally marked by byte 3 rolling over.

• Byte 3 - fine depth data; increments byte 2 when its value rolls over.


Figure 4.10: Plot Showing Raw Data Values vs Distance from Object

All the raw data was plotted, as shown in Figure 4.10. A combined value was also plotted: this took the value of byte 3 and added 256 (16²) each time byte 2 incremented, forming a single combined value. When plotted, values above 300mm were shown to have a non-linear correlation. Several best fit equations were attempted, and a quadratic of the form Ax² + Bx + C = 0 was found to model the data best.

The coefficients of the best fit quadratic equation were calculated by Matlab to be A = 0.00023, B = 0.81 and C = 83.

    x = (−b ± √(b² − 4ac)) / (2a)    (4.1)

The quadratic formula shown in Equation 4.1 was then implemented to return the good output values from the combined byte 2 and byte 3 data. This resulted in a marked improvement over the original algorithm, as shown in Figure 4.11.
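Expressed in code, the conversion combines bytes 2 and 3 and inverts the fitted quadratic using the positive root of Equation 4.1. The sketch below is a simplified reading of the method, assuming the fit maps distance in millimetres to the combined raw value; the byte packing shown is illustrative.

    #include <cmath>
    #include <cstdint>
    #include <cstdio>

    // Fitted coefficients from the Matlab best fit: raw ~ A*d^2 + B*d + C.
    const double A = 0.00023, B = 0.81, C = 83.0;

    // Combine the two depth bytes into one value (byte packing simplified here).
    uint32_t combine(uint8_t byte2, uint8_t byte3) {
        return static_cast<uint32_t>(byte2) * 256u + byte3;
    }

    // Invert the quadratic using the positive root of Equation 4.1 to recover
    // the depth in millimetres; negative discriminants are treated as errors.
    double raw_to_depth_mm(uint32_t raw) {
        double disc = B * B - 4.0 * A * (C - static_cast<double>(raw));
        if (disc < 0.0) return 0.0;
        return (-B + std::sqrt(disc)) / (2.0 * A);
    }

    int main() {
        // combine(4, 99) = 1123, which maps back to roughly 1000 mm.
        std::printf("raw 1123 -> %.0f mm\n", raw_to_depth_mm(combine(4, 99)));
        return 0;
    }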


Figure 4.11: Plot Showing Final Output vs Measured Distance

4.1.5 Final Driver

The final form of the driver used for visual processing took OpenNI2DS325 as a base, with its raw data to depth algorithm replaced by the basic algorithm developed above. This combined a more mature driver with a much better output. The improvement can be seen visually when comparing depth images taken with the original driver to those taken with the final one, as shown in Figure 4.12.


Figure 4.12: Comparison of Original OpenNI2DS325 Driver Output to that of the Final Driver with Improved Algorithm

4.2 Vision Processing Software

Perhaps the most difficult aspect of implementing visual processing algorithms for an embedded system is being able to visualise each step. When processing thousands of data points it is impractical to print the data to the console session, and it is generally difficult to create graphical interfaces for these types of system. Programs such as Matlab allow images to be displayed with ease, and also allow quick prototyping of new algorithms. It was therefore decided to stream data from the Raspberry Pi directly to Matlab on a high powered PC during the development phase. Algorithms could then be developed quickly on the powerful machine before being deployed back to the Raspberry Pi.

4.2.1 Basic Structure

First, the basic structure of the driver / vision processing software was implemented. The program continued to make use of the OpenNI2DS325 makefiles for compilation. A new directory was created in the structure and the makefiles adjusted to compile it into a separate executable. The main function loop described in Figure 4.13 was then implemented in the main.cpp entry source file. Functions to read and write files are located in textio.cpp, and general functions such as logging data to screen are located in general.cpp.


Figure 4.13: Flowchart Describing Execution Flow of main.cpp.

Timing code was developed in the main loop, calling functions from timing.cpp. The aim was to provide basic statistics on the time taken to execute the various steps of the program. The total time for each loop was displayed, broken down into waiting time (time waited before a new frame of data is available), processing time (time taken for visual processing), and display time (time taken to display or transmit data). An example of the timing output is given in Figure 4.14.

4.2.2 Network Interface Development

Three different methods of transmitting image data over the network were implemented to assess which was best.

The first option implemented was the simplest: all data was output to the console session


Figure 4.14: Example of Timing Output

using the argument displayfull. The results were then sent to the client PC using a program called Socat [46]. While this did work, the transmission rate was extremely slow; this was traced to the amount of time it took to print the data to the console screen.

The other two options implemented sent the data via the TCP and UDP protocols. Several functions were written to open TCP and UDP sockets, transmit data, and then close the sockets again; these functions are contained in the socketio.cpp file. Although TCP is normally the more complicated protocol to implement for small amounts of data, with the large amounts being streamed here UDP turned out to be the more complicated. This was primarily due to the maximum datagram size of 65,535 bytes: each frame of data was a minimum of 240 x 320 (resolution) x 5 (bytes per pixel) = 384,000 bytes, so each frame had to be broken down, sent a few lines at a time, and reconstructed at the other end. TCP was able to handle this automatically. Output via TCP or UDP was selected using the tcp or udp arguments respectively.
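A sketch of the UDP chunking this implies is shown below. The per-datagram row count and the absence of a sequencing header are simplifications, and the socket set-up mirrors the TCP example in Section 2.5.5.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <cstdint>
    #include <vector>

    // Send one 240x320 frame (5 bytes per pixel) as one UDP datagram per 8 rows:
    // 320 * 5 * 8 = 12,800 bytes per datagram, well under the 65,535-byte limit.
    void send_frame_udp(int sock, const sockaddr_in &dest,
                        const std::vector<uint8_t> &frame) {
        const int rowBytes = 320 * 5, rowsPerPacket = 8;
        for (int row = 0; row < 240; row += rowsPerPacket) {
            const uint8_t *chunk = frame.data() + row * rowBytes;
            sendto(sock, chunk, rowBytes * rowsPerPacket, 0,
                   reinterpret_cast<const sockaddr*>(&dest), sizeof(dest));
            // A real implementation would prepend a row index to each datagram
            // so the receiver can rebuild the frame if packets arrive out of order.
        }
    }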

On the client PC side, built-in Matlab commands were used to set up TCP and UDP sockets to read the data. This proved straightforward to set up, and data was streamed successfully using both methods. Example output is shown in Figure 4.15.

4.2.3 Acquisition and Initial Analysis of Development Data

Development data was acquired by running the system on the air-bearing satellites whilst outputting the depth data to the Raspberry Pi flash card. This data could then be 'replayed' into the system via the testfile argument to aid development. To assist this, the data was also streamed via TCP and shown live


Figure 4.15: Consecutive Images of Matlab Streaming Output On Air Bearing Table

using the test_stream_tcp.m file (test_stream_udp.m could have been used instead).

Several Matlab scripts were written to help understand and process the data. The test_histogram.m file produces a histogram plot of all the data points, as shown in Figure 4.16. This was useful in getting a general feel for the incoming data and was pivotal in discovering the flaws in the OpenNI2DS325 driver.

Figure 4.16: Example of Histogram Data

The file test_crosssection.m was used to get a feel for how the data changes across the


image. It also gave a visual feel for how useful gradient based processing such as Sobel analysis would be, and a visual approximation of the accuracy of the data.

Figure 4.17: Example of Cross Section Output

4.2.4 Filtering

The first step in filtering was to remove any error values set by the driver. After careful analysis of the driver and the data coming from it, it was found that certain detected errors would be assigned a particular value (in fact a small range of values). The function filter_error() was used to set these values to 0. (All filter functions are located in the filter.cpp file.)

Two filters were then considered to process the data so as to reveal only the target satellite. First, an erosion method was considered. This consisted of checking whether each non-zero pixel was bounded by more than 4 other non-zero pixels. It is a very quick filter to eliminate noise, as most noise in the image occurred only in small groups of pixels.
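A sketch of this neighbour counting test is shown below, assuming an 8-connected neighbourhood (the report does not specify which neighbourhood was used):

    #include <cstdint>

    // Keep a non-zero pixel only if more than 4 of its 8 neighbours are non-zero;
    // 'in' and 'out' are w x h depth images stored row-major, borders left at zero.
    void erode(const uint16_t *in, uint16_t *out, int w, int h) {
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                int idx = y * w + x;
                if (in[idx] == 0) { out[idx] = 0; continue; }
                int neighbours = 0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx)
                        if ((dx || dy) && in[(y + dy) * w + (x + dx)] != 0)
                            ++neighbours;
                out[idx] = (neighbours > 4) ? in[idx] : 0;
            }
    }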

The second filter developed was a type of segmentation algorithm. With the expectation of certain types of future processing algorithm, such as an ellipse bounding method, it was felt important to implement a filter that would return only one region. Filters such as the erosion filter or a standard deviation filter always carry the possibility of returning data other than the main object, and procedures such as ellipse bounding, which look for the maximum and minimum extremities of a region, then become wildly inaccurate. Many segmentation algorithms are built upon region growing; however, a slightly different method was developed, which is believed to be more efficient for this type of scenario, where there should only be one object detected.

Each pixel in the image was examined one by one and placed into a group. A threshold was set which stated the maximum variation between the depths of adjacent pixels for the two to be considered part of the same object. The following algorithm was used (a code sketch follows the list):


1. If the difference from the pixel directly above is below the threshold, then the pixel's group becomes the same as that of the pixel above.

2. If step 1 is false, then the pixel directly to the left is considered instead. If that difference is below the threshold, then the pixel's group becomes the same as that of the left pixel.

3. If both step 1 and step 2 are false, then the pixel is placed into a new group.

4. Finally, if the left and above pixels are in different groups but both are below the threshold, then the two groups are joined and merged into one.
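A compact sketch of these rules, using a union-find structure for the merging in step 4, is shown below; the threshold parameter and the image layout are assumptions.

    #include <cstdint>
    #include <cstdlib>
    #include <vector>

    // Union-find lookup with path halving, used when rule 4 merges two groups.
    static int find(std::vector<int> &parent, int g) {
        while (parent[g] != g) g = parent[g] = parent[parent[g]];
        return g;
    }

    // Label each pixel with a group id using the four rules; zero-depth pixels
    // (errors already filtered out) are given the label -1.
    std::vector<int> segment(const std::vector<uint16_t> &d, int w, int h,
                             int threshold /* max depth step within one object */) {
        std::vector<int> label(w * h, -1), parent;
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x) {
                int i = y * w + x;
                if (d[i] == 0) continue;
                bool up   = y > 0 && d[i - w] != 0 &&
                            std::abs(d[i] - d[i - w]) < threshold;
                bool left = x > 0 && d[i - 1] != 0 &&
                            std::abs(d[i] - d[i - 1]) < threshold;
                if (up)        label[i] = find(parent, label[i - w]);   // rule 1
                else if (left) label[i] = find(parent, label[i - 1]);   // rule 2
                else {                                                  // rule 3
                    label[i] = static_cast<int>(parent.size());
                    parent.push_back(label[i]);
                }
                if (up && left) {                                       // rule 4
                    int a = find(parent, label[i - w]);
                    int b = find(parent, label[i - 1]);
                    if (a != b) parent[b] = a;
                }
            }
        return label;  // caller keeps only the largest group and zeroes the rest
    }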

Once the pixels were grouped, a filter could be applied to the image: only pixels from the largest group were carried through, and all other pixels were set to 0. This is a reasonable assumption, as it was sufficient during testing and would also hold for final use in space, where no other objects are expected to be in close proximity. The benefit of this method is that only one object is detected; the downside is that the processing time increases significantly the more potential objects or noise are present.

4.2.5 OpenCV

Several attempts were made to compile the OpenCV library on the Raspberry Pi. This was not possible, since the compilation consistently ran out of memory. (The compilation was so slow that it took around 24 hours to reach this point each time.) Workarounds such as manually increasing the virtual memory size did not rectify this.

OpenCV was therefore cross compiled on a faster Windows based laptop using the MinGW development environment [47] and a Windows version of CMake [48]. This generated Linux binaries which were copied back to the Raspberry Pi. However, this workaround was only found late in the project, too late for the library to be used for algorithm development.

4.2.6 Range and Pose Calculation

A very basic range and direction algorithm was implemented, using a 'centre of mass' or 'moments' concept to calculate the central point of the object [49]. This is effectively calculating the mean x and y location of all the object's points. The range to this point, as well as to the closest point of the object, was then printed to the console session.
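In code, the centre of mass computation reduces to a mean over the surviving non-zero pixels; the sketch below returns the centroid and the depth read at the nearest whole pixel (the types and image layout are assumptions):

    #include <cstdint>
    #include <vector>

    struct Centroid { double x, y, range_mm; bool valid; };

    // First-moment ("centre of mass") estimate: the mean x and y position of
    // all non-zero pixels, with the depth read back at the nearest whole pixel.
    Centroid centre_of_mass(const std::vector<uint16_t> &d, int w, int h) {
        double sx = 0.0, sy = 0.0;
        long n = 0;
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x)
                if (d[y * w + x] != 0) { sx += x; sy += y; ++n; }
        if (n == 0) return {0.0, 0.0, 0.0, false};
        int cx = static_cast<int>(sx / n), cy = static_cast<int>(sy / n);
        return {sx / n, sy / n, static_cast<double>(d[cy * w + cx]), true};
    }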


5 Testing and Validation

5.1 Failure of LIDAR

At the start of the final data gathering phase, in the last week of the project, the LIDAR failed. Several attempts were made to communicate with it, without luck. The LIDAR was also unresponsive to a Windows PC running the Softkinetic driver. As the most likely explanation lay in the altered wiring of the USB cable, this was inspected closely; no obvious signs of damage or other issues could be seen. The entire USB cable was stripped out and replaced, to no avail. A multimeter was also used to make sure the connector was passing voltage through correctly. With no improvement, this narrowed the issue down to the circuit board. Due to the late time scale, and the lack of available expertise during the summer holiday period, it could not be rectified nor a new sensor sourced in time. Fortunately some data had been collected in the lead-up to the failure, on which the following results are based.

5.2 Power Analysis

To measure the current, a small USB extender was spliced and an ammeter placed in the +Vcc line. The current was monitored during all stages, from Raspberry Pi boot-up through to streaming data from the sensor. Maximum readings were taken to indicate peak power requirements. Readings during steady streaming were stable enough for an average to be determined visually, giving the expected long-term power usage.

Measurement                           Current (A)
Raspberry Pi Bootup (Max)             0.56
Raspberry Pi TCP Streaming (Ave)      0.54
Raspberry Pi TCP Streaming (Peak)     0.60
DS325 (Ave)                           0.43
DS325 (Peak)                          0.44

Table 5.1: Current Consumption Tests

This data gives a combined average current draw of 0.99 A (roughly 5 W at the nominal 5 V USB supply), with a maximum possible peak of 1.04 A. This is within the range a CubeSat could supply for relatively short-term use. The maximum consumption of the Raspberry Pi is lower than that stated by the manufacturer, possibly because no peripheral other than the wireless adapter was attached. When streaming, the CPU was at a steady 100%; however, as the consumption was lower than anticipated, several


attempts were made to drive the Pi's consumption higher with extra processing, but this was not possible. The power consumption of the LIDAR matched the manufacturer's information.

5.3 Accuracy and Precision

Due to the LIDAR failure it was impossible to perform a full analysis of its accuracy and precision. However, some data existed from the initial development of the driver and could be used. In this case only data for the centre pixel was recorded. Multiple samples were taken for each data point, so that both the accuracy and the precision (as standard deviation) could be plotted.

Figure 5.1: Plot of Accuracy and Standard Deviation Data

The results show the accuracy is worse close to the sensor. This was expected, as no account has yet been taken of the first byte of raw data, which affects values in this region. The trend of approximately 20 mm accuracy is slightly worse than reported in the literature [19]. This may be partly due to the driver being slightly less accurate than the manufacturer's, but is most likely down to the method of capturing the data: a metal ruler was used to measure the true distance, which could introduce up to 5 mm of error, and this would appear in the figure above. The measured standard deviation also grew with distance from the LIDAR.

5.4 Basic Timing Analysis

A basic timing analysis of the driver was performed with each of the different output functions; the results are shown in Table 5.2.


Metric                         No Output   Display Summary   Display All   Output to File   Processing Test
No of frames                   5918        3568              8             413              7347
Test duration (s)              291         351               422           300              458
Time per frame (ms)            49          98                52763         727              62
Frames per sec (Hz)            20          10                <1            1                16
Avg time per frame (ms)        48          97                52763         725              61
Max time per frame (ms)        206         1429              54173         2303             3125
Min time per frame (ms)        44          2                 50708         351              24
Avg wait per frame (ms)        48          72                44            <1               37
Max wait per frame (ms)        206         611               359           60               246
Min wait per frame (ms)        44          <1                <1            <1               2
Avg process per frame (ms)     <1          <1                <1            3                23
Max process per frame (ms)     6           16                <1            51               79
Min process per frame (ms)     <1          <1                <1            <1               18
Avg display per frame (ms)     <1          24                52717         722              <1
Max display per frame (ms)     7           1278              54172         2275             2
Min display per frame (ms)     <1          2                 50707         350              <1
Approx CPU usage (%)           50          100               100           100              80

Table 5.2: Driver Timing Analysis for Different Output Modes

This showed that about 20 frames per second is possible (far more than the 1 frame per second from the SoftKinetic development driver). The rate drops off significantly if the results are printed to the command window or to a file.
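A hypothetical sketch of how the per-frame wait, process, and display times in Table 5.2 can be separated with std::chrono; the driver's actual instrumentation is not reproduced, and the stage functions here are stubs:

    #include <chrono>
    #include <cstdio>

    using Clock = std::chrono::steady_clock;

    static long long msBetween(Clock::time_point a, Clock::time_point b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    }

    // Stand-in stages (assumed names): block on USB, filter, then output.
    bool waitForFrame()  { return true; }
    void processFrame()  {}
    void displayFrame()  {}

    int main() {
        for (int frame = 0; frame < 100; ++frame) {
            auto t0 = Clock::now();
            if (!waitForFrame()) break;  // wait: time blocked on the sensor
            auto t1 = Clock::now();
            processFrame();              // process: filtering / segmentation
            auto t2 = Clock::now();
            displayFrame();              // display: console, file, or network
            auto t3 = Clock::now();

            std::printf("frame %d: wait %lld ms, process %lld ms, display %lld ms\n",
                        frame, msBetween(t0, t1), msBetween(t1, t2), msBetween(t2, t3));
        }
        return 0;
    }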

5.5 Network Protocol Timing Analysis

Timing analysis was also performed when streaming over both the TCP and UDP protocols.


Metric                         TCP    UDP
Time per frame (ms)            664    598
Avg display per frame (ms)     661    594
Max display per frame (ms)     1251   977
Min display per frame (ms)     469    360
Approx CPU usage (%)           100    100

Table 5.3: Network Protocol Timing Analysis

The results show that the UDP protocol was still the faster of the two. However, the TCP stream was far more reliable, so it was the primary protocol used for the project.
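A minimal sketch of a blocking TCP frame sender of this kind on Linux, assuming a single client and a raw 16-bit frame; the project's actual streaming server is not reproduced, and error handling is pared down:

    #include <cstdint>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <vector>

    // Listens on 'port' and blocks until one client connects.
    int acceptOneClient(uint16_t port) {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = INADDR_ANY;
        addr.sin_port = htons(port);
        if (bind(srv, (sockaddr*)&addr, sizeof(addr)) != 0) return -1;
        if (listen(srv, 1) != 0) return -1;
        return accept(srv, nullptr, nullptr);
    }

    // Sends one depth frame; TCP may accept partial writes, so loop until done.
    bool sendFrame(int client, const std::vector<uint16_t>& frame) {
        const char* p = reinterpret_cast<const char*>(frame.data());
        size_t left = frame.size() * sizeof(uint16_t);
        while (left > 0) {
            ssize_t n = write(client, p, left);
            if (n <= 0) return false; // client gone or socket error
            p += n;
            left -= static_cast<size_t>(n);
        }
        return true;
    }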

5.6 Filtering Analysis

Several tests were performed to find the filter best able to distinguish a spacecraft from background noise. The first results are from the erosion filter.


Figure 5.2: Results for the Erosion Filter when Run a Different Number of Times

Test                 Average (ms)  Max (ms)  Min (ms)
Error Filter Only    7             23        4
Run x1               15            42        11
Run x2               23            71        18
Run x3               28            58        22

Table 5.4: Erosion Filter Timing

The first time the algorithm is run, it is very effective at removing noise, and with each further run the extraneous data points reduce. This example was chosen specifically because it demonstrates that even after three runs, some noise still remains. This is not ideal, as several algorithms likely to work with this data to detect range and pose require a single shape outline. Noise


could cause issues in those circumstances. The results show a fairly steady progression, with the total processing time increasing the more times the algorithm is run. In terms of overall processing, however, the algorithm is not processor intensive.
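A sketch of a single erosion pass, assuming the convention used throughout this chapter that zero marks an empty pixel, and an assumed 4-neighbour kernel (the project's exact kernel may differ):

    #include <cstdint>
    #include <vector>

    void erodeOnce(std::vector<uint16_t>& depth, int width, int height) {
        std::vector<uint16_t> src = depth; // read the original, write in place
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x) {
                const int i = y * width + x;
                if (src[i] == 0) continue;
                const bool onBorder = x == 0 || y == 0 ||
                                      x == width - 1 || y == height - 1;
                // Remove any pixel missing one of its four direct neighbours:
                // isolated noise erodes away; solid regions shrink by one pixel.
                if (onBorder || src[i - 1] == 0 || src[i + 1] == 0 ||
                    src[i - width] == 0 || src[i + width] == 0)
                    depth[i] = 0;
            }
    }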

Next, the segmentation algorithm was analysed. It was first run for different threshold values. The threshold value controls how close the depth values of two points have to be for them to be considered part of the same group. The colours in the images in Figure 5.3 indicate the group each pixel belongs to.

Figure 5.3: Showing Segmentation Algorithm for Different Threshold Values

The results show that low thresholds allow many groups to be created, splitting the main object into separate groups. Higher thresholds manage to capture the whole object, but risk absorbing more noise into the group. The threshold judged near-optimal was 100 mm. This seemed a good value from the start, being of the same order as the 100 mm x 300 mm target spacecraft. From the graphs it is also apparent that the 100 mm threshold captured the whole object as one group while leaving out some of the pixels around the edge; edge pixels are often inaccurate due to spurious reflections. This threshold was a good middle path that grouped exactly what was required.


Threshold   Average (ms)  Max (ms)  Min (ms)
10 mm       123           192       92
50 mm       249           308       82
100 mm      98            145       44
500 mm      52            91        35

Table 5.5: Segmentation Algorithm Timing for Different Threshold Values

The timing results show that the time taken generally decreases as larger thresholds are used. The main time dependency of this algorithm is its final step: when the top and left pixels are in different groups but are deemed connected by the new pixel, the groups must be combined. The lower the threshold, the more groups exist, the more often this merging occurs, and the slower the algorithm runs.

The segmentation algorithm gives the best output, but its timing is dependent on background noise. To optimise this, the erosion algorithm was run once before the segmentation.

Test                                                Average (ms)  Max (ms)  Min (ms)
100 mm Segmentation Algorithm                       98            145       44
100 mm Segmentation + Erosion Filter Run Once       50            84        34
Difference                                          -48           -61       -10
Difference (%)                                      51            58        77

Table 5.6: Segmentation Timing With and Without an Erosion Pre-Filter

This proved a very efficient way of obtaining a reliable result while roughly halving the processing requirements of the segmentation.


Figure 5.4: Output from Combined Erosion and Segmentation Algorithms

Finally, the largest group was taken: all pixels not in that group were set to zero, while pixels in the group retained their depth data. This is shown in Figure 5.5, where the target has clearly been identified against the background noise.

Figure 5.5: Image Showing Output of Full Filter Algorithm


Test             Average (ms)  Max (ms)  Min (ms)
Full Algorithm   49            98        37

Table 5.7: Full Filter Algorithm Timing

The full algorithm takes under 50 milliseconds to run on average, which is quite respectable for the result obtained.

5.7 Range Detection

The basic range detection algorithm could not be fully tested with live data, and no accuracy tests were possible as no measurement of the physical layout could be taken. It was, however, tested against previously collected test data. The console output of one of these tests is shown in Figure 5.6.

Figure 5.6: Screenshot Showing Output of Range and Bearing Data to Target

6 Conclusion

6.1 Reflection vs Objectives

The aim of the project, as stated at the outset, was:


"To demonstrate that a Low Cost C.O.T.S. LIDAR can be used as a rendezvous depth sensor for Satellite Proximity operations."

This was not demonstrated outright, due to the failure of the LIDAR during testing. However, a full system has been demonstrated that includes returning data from the sensor, processing it on the presumptive processing board, and producing range and bearing data. The individual objectives are reviewed below.

• To research and select a suitable depth camera, involving a literature review of different sensors and a SWOT analysis to assess the merits of candidates. This was successfully completed and the SoftKinetic DS325 was chosen. Whilst it had limitations, it was able to fulfil the majority of the aims and objectives of this project. The SWOT analysis proved worthwhile, as it highlighted the uncertainty surrounding the PrimeSense family, which would have proven to be a problem for the future applicability of the project.

• To develop suitable hardware interfaces to support development, data acquisition, and testing. This involved designing a suitable hardware structure to contain the LIDAR and control board, providing a suitable power supply, and designing a physical interface to the air-bearing test satellites. This objective was comprehensively met. The air-bearing satellite test apparatus was brought into full operation, a comprehensive test program was developed that could perform several different functions (including streaming live data to a client PC in real time), and a good hardware design was constructed, one that will be a good template for future demonstrations.

• To develop the software interface between the LIDAR and control board, with an assessment of raw processing abilities and processing requirements. This objective was the hardest and most time consuming to meet. However, a driver was developed and modified using data from reverse engineering. The information gathered is being shared with the authors of the OpenNI2DS325 driver and is likely to make a significant contribution to the open source community. The driver's processing requirements were also measured as part of the results, fulfilling the objective.

• To investigate and develop filters that allow detection of a target satellite. Again, this objective was met, with a low processing cost method of filtering the satellite in question from noise and other objects.

• To implement visual processing algorithms to detect the range and pose of a target satellite, giving a basic range both to the centre of the target and to the closest point of contact with the satellite. Only a basic algorithm was implemented, and whilst the position of the target was


obtained (via range and bearing information), the pose of the target was not successfully achieved, due to the extra time required by the interface stage of the project.

• To demonstrate the system on the air-bearing simulator satellites. The intention was to demonstrate the full system at the same time; this did not occur due to the late failure of the LIDAR. However, whilst not ideal, the objective was met in two parts. The system was demonstrated on the air-bearing tables streaming live image data, in some instances to an audience of industry representatives. The visual processing algorithms developed later could not be tested live, but were able to make use of recorded data from the earlier demonstrations.

• To characterise the basic capabilities and restrictions of the system: power budgets, processing frame rate, and the effect of ambient lighting conditions, with the intention of informing future work. Again, this was affected by the failure of the LIDAR; however, several key tests were completed. Power consumption tests were completed, as well as basic proofs of some of the vision processing algorithms. Testing of the test hardware and software, and of the transmission of data to a client PC, was comprehensively completed.

6.2 Achievements

The main aim of this project, demonstrating that a low cost LIDAR could be used as a rendezvous sensor, was achieved. It was met on a basic level and will need more testing before moving to any flight hardware.

The modification and development of the interface driver has enabled the integration of a SoftKinetic LIDAR with a Raspberry Pi; without it, it would simply not be possible to run these two systems together. This driver will be of use to the wider depth sensor community as a whole. Talks are continuing with the authors of the original driver to incorporate the algorithm used here. The potential demand is demonstrated by several requests via online forums for a working driver, as well as a request from a former University of Surrey student whose master's project used the Kinect system but who would like to migrate it to the SoftKinetic system.

Another achievement is that the finished driver, plus all the algorithms required to detect a satellite, ran at a faster frame rate than the manufacturer thought possible: after its own tests, SoftKinetic could only provide one frame of data per second, with no visual processing.

Whilst the full vision algorithms will require future work, a good initial filter was demonstrated that successfully filtered out noise and guaranteed only one object for further processing, and did so with a relatively low processing requirement.

The test platform, both software and hardware, was set up with a design that could be reused for further tests of the LIDAR or other imaging payloads. The air bearing test apparatus, including the LIDAR and


the live transmission of data, was demonstrated several times during industry visits, showing the profile this technology demonstration can achieve.

6.3 Project Evaluation

Easily the most challenging aspect of this project has been the adjustment of final goals in response to unexpected challenges encountered along the way. The beginning of the project saw ambitious but realistic goals being set, working towards a full demonstration of the sensor on the air bearing simulators. After an in-depth review of compatibility with currently available LIDARs, it was realised that the interface driver would be far more complicated than originally thought. There was a significant delay during talks with the manufacturer of the LIDAR while trying to get a development driver. When this proved impossible, it was decided to reverse engineer a driver. This effectively put all time-scales on hold, as it was completely unknown how long this would take, or even whether it was possible; the project was effectively de-scoped to just the interface driver development. In May, seven months into the project and just as the reverse engineering was starting to bear fruit, a new open source driver became available. Although the reverse engineering was going well and the risk had reduced, it was decided that the best contribution to the technology development was to use the new driver and try to meet the original goals, which were thought still achievable despite the time constraints. However, after a few weeks it was realised that the new driver was fundamentally flawed and required work, putting even more pressure on the time-line. It was realised at this time that while it might still be possible to meet the original goals, it would be on a best-effort basis, and they would likely not be met as comprehensively as they otherwise might have been. The goals were made harder still to confirm by the failure of the LIDAR in the final week of the project, which required a programming change to replay test data and recover some of the missing measurements.

Whilst the main delays to the project were unavoidable, given the novelty of integrating this specific hardware, several things could have saved time overall. Whilst planning ahead is often a good thing, a lot of time was lost researching and testing algorithms that, in the end, there was no time to implement.

Also, it took several weeks to realise that the new driver was flawed, for two reasons. First, depth data between 30 cm and 1 m, where most of the testing was done, seemed reasonable when viewed as an image. Second, whilst the images were not as well defined as those from the SoftKinetic viewer, this was discussed and believed to be due to the MATLAB plotter rather than the LIDAR itself. This resulted not only in a delay of several weeks before work started on the modified driver, but also in development data being acquired, and algorithms worked on, that were later deemed incorrect for the more accurate data. In hindsight, the accuracy of the new driver should have been checked rigorously before starting on


the vision processing algorithms.

6.4 Future Work

In some ways the success of the project has been in determining the areas that will need more development going forward. Many of the problems encountered during the project were not foreseen. By forging on to the final goals, this project has provided a lot of information that will help in developing future systems and significantly reduce the risk to those developments.

As the depth sensor industry is moving ahead so quickly, any follow-up project is likely to use different hardware that would make development much easier and improve the overall system. As it currently stands, the use of low cost LIDARs has been shown to be possible, but quite difficult to fully implement. There are several possible improvements or alterations that would be game changers and significantly improve the system as a whole.

1. The development of a stripped-down ARM driver by the manufacturer. This is unlikely, and a driver has now been implemented, but it stands to reason that a purpose-built driver would have more features available and be more reliable and accurate.

2. A new model of depth sensor built for measurement. Current LIDARs are built for detecting movement and are therefore not as accurate as models likely to appear in the near future.

3. A change of processing board to a more powerful model. Whilst this may not be desirable, it would allow standard libraries to be used, which may speed up algorithm development.

In the event the project is taken forward with similar hardware, the following work and recommendations are suggested:

• More work should be done to assess the effect of the first byte of the raw data from the sensor. This would increase the accuracy of the sensor in the 0-300 mm range, where the quadratic correlation breaks down.

• Further work to optimise the driver. Currently the driver is based on the OpenNI framework, most of whose functions are not needed. Whilst the framework would be useful if changing to a similarly compatible sensor, stripping the driver down to its basic components (informed by the reverse engineering) would help increase the frame rate of the system.

• More work should be done on the vision processing. Only a short amount of time was spent on this due to the various delays. Whilst the basic detection of the CubeSat looks very promising,


range and pose detection could be improved significantly. An investigation of using the GPU

processing capability on the Raspberry Pi may also provide a more powerful means of processing

the data.

• A recommendation for the air bearing simulator satellites: a single regulator controls the air pressure to both the thrusters and the air cushions on the feet. When the thrusters are activated, a drop in pressure occurs which the regulator cannot fully compensate for, so pressure to the air cushions falls and the feet catch on the table, inducing a torque. To compensate for this, the regulator is turned to maximum, which reduces the flight time. Separate regulation of the feet and thrusters should solve this issue and result in a longer flight time.

• Finally, a full hardware-in-the-loop test with feedback control could not be completed, due to the time constraints as well as the LIDAR failure. This should be a primary goal for any long term development.

6.5 Final Thoughts

This project has been very multidisciplinary, so there are several areas in which I have gained personal experience. Whilst the issues with the drivers caused a lot of headaches, they provided a great educational experience in really understanding hardware and software interfaces, and a great feeling of achievement when the reverse engineering bore fruit. Several specific achievements are:

• The ability to work with C++ in a non-Windows environment, and in-depth knowledge of Linux, especially executable compilation, linking, and optimisation.

• Insight into the challenges of reverse engineering, and some in-depth knowledge of the USB standard.

• Working with visual processing for the first time, which proved a good opportunity to put lecture material into practice.

• Learning and using LaTeX for the first time!

I would finally like to state my appreciation for the opportunity to learn and work on a piece of active research, and I fully appreciate the support and equipment made available for this project, such as the air bearing table apparatus.


References

[1] Eben Upton and Charles Severance. Eben Upton: Raspberry Pi. 2013.

[2] SIProp. Openni2ds325. https://github.com/SIProp/openni2ds325 [Online: accessed 19-08-2014]. 2014.

[3] Barton C. Hacker and James M. Grimwood. On the Shoulders of Titans: A History of Project Gemini. NASA Special Publication. NASA, 1977.

[4] Mark Entwistle and Mark Itzler. "Focal-Plane Arrays". In: Laser Focus World, pp. 5-7.

[5] Larry Li. Time-of-Flight Camera: An Introduction. 2014.

[6] L. Pryce, C. P. Bridges, S. Kenyon and R. Bird. "STRaND-2: Visual Inspection, Proximity Operations & Nanosatellite Docking". In: IEEE Aerospace Conference, 2013.

[7] Craig Underwood and Sergio Pellegrino. "Autonomous Assembly of a Reconfigurable Space Telescope (AAReST) for Astronomy and Earth Observation". In: Applied Optics, Vol. 52, Issue 22.

[8] Cornelius J. Dennehy, NESC. "Relative Navigation Light Detection and Ranging (LIDAR) Sensor Development Test Objective (DTO) Performance Verification". In: NASA Technical Reports Server, May.

[9] ESA. ATV-5 Set to Test New Rendezvous Sensors. http://www.esa.int/Our_Activities/Human_Spaceflight/ATV/ATV-5_set_to_test_new_rendezvous_sensors [Online: accessed 07-08-2014].

[10] John A. Christian and Scott Cryan. "A Survey of LIDAR Technology and its Use in Spacecraft Relative Navigation", pp. 1-7.

[11] X. Zhu, I. C. Smith, and F. Babin. "A hybrid 3D sensor (NEPTEC TriDAR) for object tracking and inspection". In: Laser Radar Technology and Applications. Ed. by Gary W. Kamerman and Monte D. Turner, pp. 621407-1-621407-8.

[12] Stephane Ruel and Tim Luu. "STS-128 On-Orbit Demonstration of the TriDAR Targetless Rendezvous and Docking Sensor". In: (2010), pp. 1-7.

[13] NASA. DART Mishap Overview Report. http://www.nasa.gov/pdf/148072main_DART_mishap_overview.pdf [Online: accessed 07-08-2014].

[14] M. Richard, L. Kronig, F. Belloni, S. Rossi, V. Gass, S. Araomi, I. Gavrilovich, and H. Shea. "Uncooperative Rendezvous and Docking for MicroSats". In: 6th International Conference on Recent Advances in Space Technologies, June, pp. 12-14.

[15] Miles Hansard. Time-of-Flight Cameras: Principles, Methods and Applications. 2012.

[16] SoftKinetic. DepthSense Cameras. http://www.softkinetic.com/products/depthsensecameras.aspx [Online: accessed 30-12-2013].

[17] Intel. Intel Perceptual Computing SDK. https://software.intel.com/sites/landingpage/perceptual_computing/documentation/html/ [Online: accessed 30-01-2014].

[18] Sean Clarkson, Jon Wheat, Ben Heller, James Webster, and Simon Choppin. "Distortion Correction of Depth Data from Consumer Depth Cameras". In: 44.0.

[19] Michael J. Cree, Lee V. Streeter, Richard M. Conroy, and Adrian A. Dorrington. "Analysis of the SoftKinetic DepthSense for Range Imaging". In: Lecture Notes in Computer Science, pp. 668-675.

[20] Rogier Scheffer. "3D Imaging and Gesture Recognition". 2014.

[21] Damien Lefloch, Rahul Nair, Frank Lenzen, Henrik Sch., Michael J. Cree, Reinhard Koch, and Andreas Kolb. "Technical Foundation and Calibration Methods for Time-of-Flight Cameras". In: (2013), pp. 1-20.

[22] Raspberry Pi Foundation. Raspberry Pi. http://www.raspberrypi.org/ [Online: accessed 26-07-2014].

[23] J. R. Parker. Practical Computer Vision Using C. John Wiley and Sons Inc, 1994.

[24] Jan Axelson. USB Complete: The Developer's Guide. 4th ed. Lakeview Research, 2009.

[25] Beyond Logic. USB in a NutShell. http://www.beyondlogic.org/usbnutshell/ [Online: accessed 30-01-2014].

[26] Johannes Erdfelt. libusb. http://www.libusb.org/ [Online: accessed 05-03-2014].

[27] LinuxHowtos.org. Sockets Tutorial. http://www.linuxhowtos.org/C_C++/socket.htm [Online: accessed 19-08-2014].

[28] MathWorks Inc. Using TCP/IP Server Sockets. http://www.mathworks.co.uk/help/instrument/using-tcpip-server-sockets.html.

[29] Raspbian O/S. http://www.raspbian.org/ [Online: accessed 19-08-2014].

[30] Element 14. WiPi Datasheet. http://www.farnell.com/datasheets/1669935.pdf [Online: accessed 07-07-2014].

[31] Internet Systems Consortium. ISC DHCP. http://www.isc.org/downloads/dhcp/ [Online: accessed 19-08-2014].

[32] VNC Server. http://gettingstartedwithraspberrypi.tumblr.com/post/24142374137/setting-up-a-vnc-server [Online: accessed 19-08-2014].

[33] GEdit. https://wiki.gnome.org/Apps/Gedit [Online: accessed 30-01-2014].

[34] Samba. www.samba.org [Online: accessed 14-04-2014].

[35] SoftKinetic. DS325 Datasheet, Version 4.0. http://www.softkinetic.com/Portals/0/Documents/PDF/WEB_20130527_SK_DS325_Datasheet_V4.0.pdf [Online: accessed 19-08-2014].

[36] Raspberry Pi Foundation. Raspberry Pi Power Documentation. http://www.raspberrypi.org/documentation/hardware/raspberrypi/power/README.md [Online: accessed 19-08-2014].

[37] Raspberry Pi Foundation. Raspberry Pi USB Documentation. http://www.raspberrypi.org/documentation/hardware/raspberrypi/usb.md [Online: accessed 19-08-2014].

[38] European Union. Directive 2009/24/EC of the European Parliament and of the Council on the Legal Protection of Computer Programs. 2009.

[39] USBlyzer. www.usblyzer.com [Online: accessed 19-08-2014].

[40] HHD Software. Device Monitoring Studio. http://www.hhdsoftware.com/ [Online: accessed 19-08-2014].

[41] Perisoft. Bus Hound. http://perisoft.net/bushound/.

[42] Incentives Pro. Simple USB Logger. http://www.incentivespro.com/usb-logger.html [Online: accessed 19-08-2014].

[43] Wireshark. https://www.wireshark.org/ [Online: accessed 19-08-2014].

[44] WinPcap. http://www.winpcap.org/ [Online: accessed 19-08-2014].

[45] WpdPack. https://github.com/engina/uip-1.0-win/tree/master/WpdPack [Online: accessed 05-05-2014].

[46] SOCAT. http://www.dest-unreach.org/socat/ [Online: accessed 19-08-2014].

[47] MinGW. www.mingw.org [Online: accessed 19-08-2014].

[48] CMake. www.cmake.org [Online: accessed 05-04-2014].

[49] ACM SIGPLAN Notices, Volume 28, No. 3, March 1993.
