Autonomous Shopping Cart

ME102B: Mechatronics Design, Spring 2016
Team 23 "The Dream Team": Shayaan Abdullah, Avee Arvind, Byron Ho, Yuyao Lian, Manjap Singh, Minhao Zhou
Final Report, due Tuesday, May 10, 2016
Professor Liwei Lin
GSIs: Sina Akhbari, Eric Sweet, and Ilbey Karakurt



ME102B Spring 2016
Executive Summary
When we shop, the main objective is typically to purchase the supplies we need. We focus on saving money on the items we are looking for and on making the shopping experience as efficient as possible. The shopping cart in which we place our groceries and valuables is an essential part of this experience, and our group wants to make it easier and more convenient for the customer.
For this reason, our group aims to develop an autonomous shopping cart that follows a customer wirelessly as he/she purchases groceries and supplies. Our target audience is all shoppers at various stores, but more specifically, elderly individuals, handicapped or disabled persons, and store managers. As such, our target market is mainly grocery and department stores, but our concept can be extended to other markets such as airports and hotels.
In terms of project specifications, we aim to meet or exceed the following basic customer needs: robustness, ease of use, and smooth operation. This translates to a shopping cart constructed from strong and durable material, reliable sensors with high connectivity and accuracy, and motors programmed to move the cart in a smooth, controlled fashion. We plan to borrow a shopping cart from a local supermarket and use it as the foundation for our project. In doing so, we can focus on the controls and sensors used to drive the shopping cart and aim to fabricate a truly inspirational mechanism.
There are quite a few challenges in developing such an innovative project; as such, we plan to perform in-depth analyses of all major aspects of our shopping cart. Some potential challenges include maintaining continuous connectivity between the shopping cart and the user, adjusting for changes in load weight, and programming the cart to avoid obstacles.
In order to create a truly valuable shopping cart, we first plan to obtain customer user needs to understand what features are important and valuable. To do so, we aim to interview various clients and users, perform independent field observations, and conduct surveys through a variety of mediums. Next, we aim to generate a plethora of concepts that would meet or exceed these user needs; afterwards we will perform a concept selection to focus on a specific concept that we would like to pursue in greater detail.
Thereafter we plan to prototype this chosen concept using CAD software. In doing so, we will be able to refine any design parameters and begin the fabrication of our mechanism. We will program the feedback controls of the motors, adjust the sensors for optimal vision, and ensure the cart operates smoothly. Lastly, we plan on presenting our design to the class during the ME102B Design Exposition.
1.0 Introduction
The objective of shopping in stores is to find and purchase products one needs and to save money while doing so. The general shopping procedure consists of customers going through aisles and grabbing supplies they need. However, there are components of today’s shopping experience that can further be improved in order to make grocery shopping more efficient and enjoyable. We are a student group that is trying to accomplish such a task by optimizing the standard shopping cart.
The purpose of our project is to build an autonomous shopping cart that follows a customer wirelessly in large supermarkets or other stores like Ikea, Target, Costco, etc. Please refer to Figure 1 for a visual concept. Customers would therefore have a more optimal shopping experience allowing them to focus their full attention on scanning aisles for products without having to worry about lugging a heavy cart around.
Figure 1: Visual illustration of the concept
This project is directed at all shoppers in general, with an emphasis on the elderly, the disabled and handicapped, and those who have difficulty pushing and pulling heavy loads. The cart should maintain a safe distance from the customer at all times while still being easily accessible for placing items. The cart should tailor its velocity to the speed of the user and be able to turn with the user.
We wish to accomplish this by driving the wheels of the cart with motors to allow propulsion and turning. Sensors will be implemented in order to measure distance from the cart to the user and other objects. A visual detection system will also be utilized in combination with the distance sensors as a tracking system and to ensure that the cart follows the user. Lastly, safety modules will be added for crash prevention purposes.
2.0 Specifications
The autonomous shopping cart is a straightforward idea for customers to understand. Since we all use shopping carts in daily life, this familiarity allowed the customers we interviewed to easily come up with their specifications for our project. In this case, our team members are also target customers. Therefore, we gathered the customer specifications by interviewing each other as well as customers in local supermarkets.
The engineering requirements were obtained by conducting research and combining the results with the customer specifications. For example, having the cart follow the coupled customer is a vague requirement, and we translated it as follows: the shopping cart is to maintain an appropriate distance from the user while he/she searches the aisles for supplies, and to stop when the user approaches the cart. We therefore determined that the corresponding engineering requirements are for the cart to keep a constant distance from the customer while he/she is scanning for supplies, and to stop when the visual detection system predicts that the customer is about to place supplies in the cart. Specifically, referring to the interview data, we set the target distance to six feet so that the cart is close enough for the customer to store items without interfering with the shopping experience. Other engineering specifications were obtained similarly: the customer needs are the guidelines for the engineering requirements, while the engineering requirements provide executable, practical design goals that fulfill the customer needs.

2.1 Quality Function Deployment (QFD)
The QFD chart was obtained by going over the customer specifications with corresponding engineering requirements. The team gathered together and discussed the value of each column in the QFD chart. It not only helped us better understand our project, but also provided a better way to visualize our thinking process. Since there are no mature products in the current market, we used several demo products that we found online as our potential benchmark competitors. After carefully studying their features, we compared them with our design in the QFD chart.
The QFD chart is attached in Appendix 14.2.

2.2 Hardware Specifications
The controls algorithm required image and distance data in order to identify the user of the cart and maintain a safe distance from him/her at all times; image processing was therefore necessary. In our project, we used a Raspberry Pi camera to capture images, along with a Raspberry Pi Model B to process them. In addition, we required a microcontroller to drive the motor that propels the cart and the servo that steers it (we used an Arduino for this purpose). To power the Raspberry Pi and the Arduino, we used a portable phone charger as a 5 V DC source. Furthermore, a PING))) ultrasonic sensor was used to complement the image data and was mounted to the front of the cart.
The camera needed to be mounted with precise tolerances in order to ensure optimal image data. As such, we purchased and used the Raspberry Pi camera housing that was specifically designed as a camera mount.
The rear wheels of the cart provide propulsion. The propulsion system consists of an axle connected to the rear wheels and driven by a motor. The radius of the cart wheels is 2.16 in. Measured with a spring scale, approximately 36 lbf is required to initiate cart movement, while 24 lbf is needed to maintain its steady state. Therefore, the motor needs a minimum torque of 311 oz·in for cart propulsion (details are discussed in the Parameter Analysis section). Also, due to safety concerns, the motor needs to achieve maximum performance under 12 V. Following these guidelines, and with the help of Tom Clark, we chose our propulsion motor: a Dunkermotoren G63x55 DC motor.
The front wheels are used for steering the cart. An Ackermann steering system, discussed in more detail in the Parameter Analysis section, was used to determine the steering parameters. The system is regulated by a servo motor for precise position control. By calculation, the servo motor needs a steering angle of 32 degrees and a minimum torque of 9.52 kg·cm for a sufficient turning range and power. The servo motor needs to achieve maximum performance at a voltage of 5 V. We therefore selected the Adafruit servo motor (Product No. 1142), which is able to generate a maximum torque of 10 kg·cm.
The circuit board consists of an H-bridge circuit that allows direction and velocity control of the drive motor. The Arduino provides a PWM signal to the H-bridge to control the voltage across the motor and hence its angular velocity. The motor is powered via the H-bridge, which is supplied with 12 V DC from a lantern battery and 5 V from a modified phone charger. The 12 V battery will need to be rechargeable for the project to be feasible. Currently, the circuit board is mounted on the baby seat of the shopping cart; in the future, it will be enclosed in a manufactured case to prevent damage to the electronics.

2.3 Algorithm (Software) Specifications
The algorithm specifications may be broadly split into three sections and will be distributed across two platforms, namely the Arduino and the Raspberry Pi. The Raspberry Pi will be primarily responsible for processing image data and determining the control setpoints that the Arduino uses to actuate the motor and servo. The algorithm must balance response time against accuracy.
Person Detector
The Raspberry Pi runs the person-detector algorithm. The algorithm must control the camera, capture images, and then determine where a person is relative to the centre of the captured image. Using this information, the algorithm determines an appropriate duty cycle for the H-bridge and servo. The person detector is the most computationally intensive portion of the project software, and hence trade-offs were made to balance the response time of the cart against detection accuracy. A frame rate of 5 to 10 frames per second (fps) is desired.
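As a rough way to check the 5-10 fps target, the capture-and-detect loop can be timed. The sketch below is illustrative only: `process_frame` is a placeholder for one real capture-plus-detection pass, which the report does not show here.

```python
import time

def measure_fps(process_frame, n_frames=20):
    """Estimate the loop's frame rate by timing n_frames calls to
    process_frame (a stand-in for one capture + detection pass)."""
    start = time.perf_counter()
    for _ in range(n_frames):
        process_frame()
    elapsed = time.perf_counter() - start
    return n_frames / elapsed
```

On the Pi, `process_frame` would capture a camera frame and run the OpenCV detector on it; if the measured rate falls below 5 fps, the detector parameters discussed later (stride, resolution) can be relaxed.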
Motor and Servo Driver
The Arduino will actuate the motor and servo in response to the control signal computed by the Raspberry Pi. The Arduino will hence have to generate two PWM signals of the specified duty cycle to be fed into the H-bridge, ultimately controlling motor speed and direction. Similarly, the Arduino will have to generate a PWM signal for the servo with a duty cycle based on the servo's desired position.
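The duty-cycle-to-voltage relationship behind this scheme can be sketched as follows. This is a simplification that ignores motor load and back-EMF; the 12 V rail comes from the hardware description above, and the 8-bit mapping mirrors Arduino's `analogWrite` scale.

```python
def duty_cycle_for_voltage(v_target, v_supply=12.0):
    """Duty cycle (0.0-1.0) that yields a given average motor voltage
    from an H-bridge switching a v_supply rail."""
    if not 0.0 <= v_target <= v_supply:
        raise ValueError("target voltage must lie in [0, v_supply]")
    return v_target / v_supply

def duty_to_analogwrite(duty):
    """Map a 0.0-1.0 duty cycle onto Arduino's 8-bit analogWrite scale."""
    return round(duty * 255)
```

For example, an average of 6 V on the 12 V rail corresponds to a 50% duty cycle.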
Serial Communication
The setpoints computed on the Raspberry Pi need to be communicated to the Arduino. This will be done via serial communication. Hence, a program on the Raspberry Pi will need to convert the control setpoint into bytes to be sent serially to the Arduino. A function on the Arduino will be required to listen to the serial port and sequentially parse the values sent from the Raspberry Pi. Ideally, the Arduino should echo these values back to the Raspberry Pi for debugging purposes.

3.0 Concept Generation
Since our project is based on modifying a current shopping cart instead of designing one from scratch, our concept generation process was detail-focused and feature-specific rather than free-form. As stated in the Specifications section, our concept generation focused on the following especially important requirements: wheel powering, steering, maintaining an appropriate distance from the user, crash prevention, strong connection, cost, and lastly, aesthetics.
The system logic for our project is straightforward: an on-cart sensor/visual tracking system picks up the location of the user and provides that information to the controller. The controller translates the customer's position into the speed and steering angle that the cart needs, and then drives the motor with a varying current to control its rotational speed.
Each team member brainstormed six ideas, and we categorized them with respect to the subfunction of the cart. We took advantage of each team member’s different expertise to work on some technical features, and we used everyone’s
creativity for topics such as aesthetics. We used intuition and a pros/cons analysis when coming up with the concepts. For example, when exploring ways of preventing crashes, we found that ultrasonic sensors are commonly used for this task in other robotics applications and are relatively cheap; their accuracy is limited, but sufficient for our task. This is how we generated our specific concepts.
All 25 generated concepts are documented in Appendix 14.1. Similar concepts were combined.

4.0 Concept Selection
We went through the generated concept list and combined similar ideas. We then categorized them into nine main concepts, as listed below. These nine concepts were put into the concept screening matrix (shown below in Table 1). According to the QFD, the top seven specification requirements were used as our selection criteria.

4.1 Concepts
1. Four wheels for driving, steering, and braking; multiple ultrasonic sensors with wearable receiver devices for connection; one motor for each wheel for more accurate torque generation.
2. Three wheels, including two rear wheels for driving and one front wheel for steering; one motor for torque generation and another for steering; camera and wearable devices of strong colors (not necessarily receivers) for connection and distance control; emergency brake for crash prevention.
3. Two hub motors for driving, steering and braking; one front wheel for stability; camera and wearable devices of strong colors (not necessarily receivers) for connection and distance control; bumpers around the carts for crash prevention.
4. Hydraulic powering system for driving with a stepper motor in front wheel for turning; infrared beacons with receivers for connection; bumpers around the carts for crash prevention.
5. Three wheels, including two rear wheels for driving and one front wheel for steering; shop surveillance camera with wearable receiver device for customer-cart relative location estimation; bumpers around the cart for crash prevention.
6. Three wheels, including two rear wheels for driving and one front wheel for steering; Preset tape in supermarkets for direction guiding; camera for connection and distance control; bumpers around the carts for crash prevention.
7. Four wheels, with two powered rear wheels and two unpowered front wheels for steering; camera that recognizes human features so the cart follows the user; ultrasonic sensor that senses distance to prevent the cart from crashing into the user.
8. Four wheeled design with rear wheels powered in fixed positions; front wheels are allowed to steer similar to an automobile; user recognition uses a mechanical radar constructed from an ultrasonic sensor mounted on a servo motor.
9. Tracked vehicle, powered and steered by just two motors; uses a camera to sense the user; bumpers used to prevent damage from obstacles.
4.2 Concept Screening Matrix

Reference: one current demo autonomous driving cart (corresponding video linked).
Selection Criteria                     1    2    3    4    5    6    7    8    9    Ref.
Smooth Driving                         +    0    +    +    0    0    0    +    +    0
Smooth Steering                        +    +    0    +    +    +    +    +    -    0
Appropriate Cart-Customer Distance     0    0    0    -    -    -    +    0    0    0
Crash Prevention                       0    0    -    -    0    0    +    0    0    0
Aesthetics                             -    0    +    +    +    0    0    0    0    0
Cost                                   -    0    -    -    -    -    -    -    -    0
Pluses                                 2    1    2    3    2    2    4    3    2
Minuses                                2    0    2    4    2    3    2    1    2
Same                                   3    6    3    0    3    2    1    3    3
Net                                    2    1    2    3    2    2    4    3    2
Rank                                   7    9    5    3    4    6    1    2    8
Continue?                              Y    N    Y    Y    Y    Y    Y    Y    N

Table 1: Concept Screening Matrix
4.3 Concept Scoring Matrix
The top seven qualified concepts were then put into the concept scoring matrix (shown below in Tables 2a and 2b) for further screening. Although all selection criteria were important, they were reweighed according to the interview feedback. Each concept was ranked by the sum of its ratings weighted by the selection-criteria weights, and the five concepts with the highest total scores were selected.

How we determined the weight of each selection criterion: to improve the customers' shopping experience, it is important that the cart follows the coupled customer smoothly while the customer can access the cart with ease. These features are the ones that make the autonomous shopping cart superior. Therefore, the four related criteria each take 20%.
If the cost of implementing such an autonomous shopping cart system is too high, few merchants would be willing to replace traditional shopping carts, which makes our product meaningless. Therefore, the cost takes 12.5%.
Given that our autonomous shopping cart is able to maintain an appropriate distance with the coupled customer, the chance of unwanted crashes would be minimal. However, it is still important to make sure the cart won’t damage any goods or injure customers. Therefore, crash prevention takes 5%.
Finally, the cart has to possess an appealing look so customers would buy the idea. Therefore, aesthetics takes 2.5%.
Selection Criteria                  Weight    Rating (Weighted Score)
Smooth Driving                      20%       4 (0.8)    3 (0.6)    3 (0.6)    3 (0.6)    3 (0.6)
Smooth Steering                     20%       3 (0.6)    3 (0.6)    4 (0.8)    3 (0.6)    3 (0.6)
Strong Connection                   20%       3 (0.6)    4 (0.8)    2 (0.4)    1 (0.2)    1 (0.2)
Appropriate Cart-Customer Distance  20%       3 (0.6)    4 (0.8)    3 (0.6)    2 (0.4)    2 (0.4)
Cost                                12.5%     2 (0.25)   2 (0.25)   2 (0.25)   2 (0.25)   2 (0.25)
Crash Prevention                    5%        3 (0.15)   4 (0.2)    4 (0.2)    2 (0.1)    2 (0.1)
Aesthetics                          2.5%      3 (0.075)  4 (0.1)    3 (0.075)  3 (0.075)  3 (0.075)
Rank                                          3          4          5          6          7
Continue?                                     Y          Y          Y          N          N

Table 2a: Concept Scoring Matrix
Selection Criteria                  Weight    Rating (Weighted Score)
Smooth Driving                      20%       4 (0.8)    4 (0.8)
Smooth Steering                     20%       4 (0.8)    4 (0.8)
Strong Connection                   20%       4 (0.8)    4 (0.8)
Appropriate Cart-Customer Distance  20%       3 (0.6)    3 (0.6)
Cost                                12.5%     2 (0.25)   2 (0.25)
Crash Prevention                    5%        4 (0.2)    4 (0.2)
Aesthetics                          2.5%      3 (0.075)  3 (0.075)
Total Score                                   3.525      3.525
Rank                                          1          2
Continue?                                     Y          Y

Table 2b: Concept Scoring Matrix (top two concepts)
5.0 Concept Description
After interviewing both customers and shop managers, we concluded that the manner in which the cart interacts with the coupled customer is the most challenging part. Specifically, the cart should be able to distinguish and even predict the customer's behavior so that it follows him/her at an appropriate velocity within a certain distance, as mentioned in the Specifications section.
For the purposes of this project, we decided that the best approach is to use the Raspberry Pi camera with a pedestrian-detection algorithm (OpenCV), found in Appendix 14.3. This algorithm detects any moving bodies in the camera's line of sight. Once the user has been detected, a bounding box is placed around him/her. The control setpoint for the cart is then determined from the centroid of this bounding box relative to the centroid of the image.
In addition, the algorithm will detect whether the user has moved laterally in the line of sight. If this is the case, the algorithm will calculate the angle that the user has deviated from the camera’s line of sight. This angle will be used to determine the steering servo motor’s motion. This will allow the cart to turn so that it is aligned with the user’s motion again. The feedback algorithm will constantly maintain this alignment to create the “follower” function that we are aiming for.
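That deviation angle can be sketched with a pinhole-camera model. The focal length in pixels below is an assumed placeholder, since the report does not give the camera intrinsics:

```python
import math

def deviation_angle_deg(person_cx, image_width, focal_px=500.0):
    """Angle (degrees) between the camera's line of sight and the user,
    computed from the x-centroid of the user's bounding box under a
    pinhole-camera model. focal_px (focal length in pixels) is an
    assumed value, not taken from the report."""
    offset_px = person_cx - image_width / 2.0
    return math.degrees(math.atan2(offset_px, focal_px))
```

The sign of the result tells the steering servo which way to turn; the feedback loop then drives the angle back toward zero to keep the cart aligned with the user.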
The motor is powered by a 12 V lantern battery. It drives a timing belt connected to a sprocket on the rear-wheel axle. This drivetrain transfers power to the rear wheels for propulsion.
The servo motor turns a lever mounted in a connector piece. This lever pulls or pushes on two rods connected to the front wheels. The motion of the servo motor dictates which direction the cart will turn, utilizing an Ackermann steering geometry.
Figure 2: Shopping cart design CAD model
Figure 3: Current CAD drawing without camera
After various experiments with the motor, sensors, and cameras, our overall design is based on three general modules: two hardware modules and one software (algorithm) module.

6.1 Propulsion

The propulsion module consists of the motor and the drivetrain that turns the axle connecting the two rear wheels. To power our shopping cart, we used a Dunkermotoren G63x55 motor with a Worm Gearbox SG80 attachment. This provides 1133 oz·in of torque and a maximum rotating speed of roughly 134 RPM. Referring to the hardware specifications, this motor is able to generate enough driving power. The motor is mounted to a wooden board lying on the lower rack of the shopping cart. An 8 mm pitch belt pulley with 20 teeth was attached to the drive shaft of the motor. This pulley drives a 760 mm belt that wraps around another 8 mm pitch sprocket with 22 teeth. This sprocket sits on a 1/2 in diameter steel axle supported by the shopping cart's four flange bearings located on the rear-wheel casters. Aluminum hub assemblies connect the wheels to the axle.

6.2 Steering
Our steering system resembles an Ackermann steering system. A servo motor is mounted to the lower rack and linked to a hinged connector piece that joins the two front-wheel casters through two steering rods. This connector piece sways left or right depending on the motion of the servo motor, causing the front wheels to turn left or right. The theoretical maximum steering angle is 32 degrees. An illustrative diagram is shown below in Figure 4.
Figure 4: Diagram of the front steering system

6.3 Algorithm (Software)
As mentioned in Section 2.3, the algorithm may be divided into three components: the person detector, the DC motor and servo motor driver, and the serial communication function.
The person detector makes use of a built-in pedestrian detector in OpenCV. The pedestrian detector consists of a Support Vector Machine (SVM) pre-trained on the Histograms of Oriented Gradients of sample images. At the beginning of each loop, the Python script on the Raspberry Pi captures an image using the Raspberry Pi camera. The built-in detector is run on the image, and if a person is detected, a bounding box is drawn
around them. In addition, if more than one bounding box is found and the boxes overlap with one another, non-maximum suppression is used to obtain a single larger bounding box. The centroid of the bounding box is found, and its location relative to the centre of the image is determined. This determines the control mode that is communicated to the Arduino over a serial connection.

The control mode is an integer that determines whether the cart should move to the right, move straight, move to the left, or remain stationary. The cart may only be in one of these modes at any given point in time. If no person is detected, the control mode corresponding to the cart remaining stationary is communicated to the Arduino. If the x-centroid of the bounding box lies within a tolerance (deadband) of the centre of the image, the control mode is the integer corresponding to the mode 'straight'. Similarly, if the centroid is to the left or right, the integer corresponds to the control mode of left or right respectively. Once this integer has been determined, it is converted into a byte and sent over a serial connection to the Arduino via the Python pyserial module.
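The box-to-mode decision described above can be sketched as follows. The mode integers and the deadband width are assumptions (the report only says each mode maps to one integer); the OpenCV capture and HOG/NMS calls are elided, and only the mapping from detected boxes to a control mode is shown.

```python
# Control-mode integers (assumed encoding; the report does not give
# the actual values used on the Arduino).
STATIONARY, LEFT, STRAIGHT, RIGHT = 0, 1, 2, 3

def control_mode(boxes, image_width, deadband_px=40):
    """Map post-NMS bounding boxes (x, y, w, h) from the HOG person
    detector to a control-mode integer."""
    if not boxes:
        return STATIONARY                              # no person: stay put
    x, _, w, _ = max(boxes, key=lambda b: b[2] * b[3])  # largest box
    offset = (x + w / 2.0) - image_width / 2.0          # x-centroid offset
    if abs(offset) <= deadband_px:                      # inside deadband
        return STRAIGHT
    return LEFT if offset < 0 else RIGHT
```

On the Pi, `cv2.HOGDescriptor` with `cv2.HOGDescriptor_getDefaultPeopleDetector()` would supply `boxes`, and the chosen integer would then be written to the serial port as a single byte.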
At the start of the Arduino loop, the serial buffer is read and the integer corresponding to the latest control mode is assigned to a variable using a built-in parser. The Arduino also echoes this value back to the Raspberry Pi to make debugging easier. Using a switch statement, the PWM signal for each control mode is generated. For instance, if the control mode is 'right', the motor is driven forward and the servo is directed to the right.
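The Pi-side half of that exchange might look like the sketch below. The device path and baud rate in the usage comment are assumptions; pyserial is the module the report names.

```python
def send_mode(port, mode):
    """Write one control-mode byte to the Arduino and check the echo it
    sends back for debugging. `port` is any object exposing pyserial's
    write()/read() interface."""
    port.write(bytes([mode]))
    echo = port.read(1)          # the Arduino echoes the byte back
    return len(echo) == 1 and echo[0] == mode

# Typical wiring on the Pi (assumed device path and baud rate):
# import serial
# ser = serial.Serial('/dev/ttyACM0', 9600, timeout=1.0)
# send_mode(ser, 2)   # 2 = 'straight' in this sketch's encoding
```

Checking the echo on every write gives an immediate signal when the serial link drops or the Arduino falls behind.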
In order to reduce the cart's response time, parameter adjustments must be made. Parameters that may be altered include the window stride length for the SVM filter, the resolution of the captured image, and the scale used in non-maximum suppression. However, changing these parameters may sacrifice the accuracy of our algorithm.
In order to prevent the accumulation of errors due to the decreased accuracy of our algorithm, we programmed the Arduino to return to the default stationary state at the end of a specified time interval. Hence, although the control mode is updated every t seconds by the Raspberry Pi, the Arduino only implements this control mode for a fraction of that time. For the remainder of the interval, it returns to the stationary control mode, with the motor voltage reduced to 0 and the servo positioned to keep the cart straight. This reduces the cart's speed but prevents the cart from veering off track to the point where the person leaves the field of view.
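This fail-safe timing can be sketched as a simple duty-limiting rule. The active fraction and period below are assumed values; the report specifies only the behavior, not the constants.

```python
def mode_to_apply(requested, t_since_update, active_s=0.3, period_s=1.0,
                  stationary=0):
    """Apply the Raspberry Pi's requested control mode only for the
    first active_s seconds of each period_s interval; otherwise fall
    back to the stationary mode (motor off, servo centered)."""
    return requested if (t_since_update % period_s) < active_s else stationary
```

On the Arduino this would be implemented with `millis()` inside the main loop; the Python form above just makes the timing rule explicit.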
Decreasing the accuracy of the algorithm also increased the number of false positives, especially when the person was close to the camera. To ensure safety, it is necessary that the cart not move forward when a person is close to it. We therefore added a PING))) ultrasonic sensor to the front of the cart. Although such sensors are inaccurate over distances beyond 2 meters, they are reliable over shorter distances, which is precisely when a person is close to the cart. The last piece of the Arduino code therefore determines the distance of the closest object in line with the cart using the ultrasonic sensor. If this distance is smaller than the specified threshold, the cart is set to
stationary mode irrespective of the control decision from the Raspberry Pi. The final Arduino circuit schematic is shown in Figure 5 below, with connections made to the motor (via the H-bridge), the servo, and the ultrasonic sensor.
Figure 5: Arduino Circuit Schematic
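The distance check in that last piece of Arduino code amounts to the following logic, shown here in Python for consistency with the other sketches. The 0.5 m threshold is an assumed value; the report does not state the exact figure.

```python
SPEED_OF_SOUND_M_S = 343.0   # at roughly 20 degrees C

def echo_to_distance_m(round_trip_s):
    """Distance to the nearest object from the ultrasonic sensor's
    round-trip echo time (the pulse travels out and back, hence /2)."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def safety_override(mode, round_trip_s, threshold_m=0.5, stationary=0):
    """Force the stationary mode whenever the nearest obstacle is inside
    the threshold, regardless of the Pi's control decision."""
    return stationary if echo_to_distance_m(round_trip_s) < threshold_m else mode
```

Because the override runs on the Arduino, it still protects the user even if the Raspberry Pi's detector produces a false positive or stalls.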
It may be noted that the algorithm is unable to deal with the situation of multiple bounding boxes or multiple detected people. However, this can be addressed in the future by way of a more intricate person detection algorithm.
6.4 Overall Design

The design specifications of our final product were based on the guiding principles above. Since the most important part of our project is the shopping cart's ability to follow the user, the design had to be successful in achieving that objective. We were confident that the design would be mechanically and electronically sound. As long as any remaining bugs are fixed properly, the user-detection code should interact with the cart properly. The final product is shown below in Figure 6:
Figure 6: Photo of the final product

Despite any minor defects in the design, we were confident that this is the best design given our timeframe. Many alternatives were proposed using different sensor types, but we found cameras, as opposed to ultrasonic sensors, to be the most appropriate primary sensor for our current design: affordable ultrasonic sensors do not react as quickly as the cameras we possess, and responsiveness is important when tracking a shopping customer throughout the supermarket. We therefore committed to using a camera as our main sensor.

The dimensions of the design are reasonable because all of our components fit within the confines of the shopping cart; the overall assembly is identical in size to the shopping cart itself. As long as we do not interfere with the main shopping rack of the cart, the device is sized appropriately.
Figure 7 below is the bill of materials for our assembly. At the moment, the total price of the device is reasonable due to the careful selection of affordable parts and the salvaging of spare motors and material from users who no longer need them. The stock material was machined to the dimensions given. We also machined the belt sprocket because its bore was too small to fit our half-inch diameter axle; machining this part made our components match. Overall, the concept satisfies all of our design needs and allows us to accomplish the task within a reasonable timeframe.
Figure 7: Project bill of materials

7.0 Parameter Analysis

7.1 Propulsion
The nature of the autonomous shopping cart requires that it move at a steady, reasonable pace, which corresponds to a relatively small but consistent power output from the driving unit, in our case an electric DC motor.
Specifically, we chose to use a DC motor since it is easier to control, safer to operate and more convenient to encode.
We measured the weight of the cart to be 29.8 kg with all wheels attached and zero extra load. Measured with a spring scale, approximately 36 lbf is required to initiate cart movement, while 24 lbf is needed to maintain its steady state. With the assumptions that the belt transmission has an efficiency of 98% and that there is no other power loss in the system, we can calculate the torque required at each wheel using Equation 1 below:
τ = F · r    (1)

where τ is the torque at each wheel [oz·in], F is the force per wheel [oz], and r is the wheel radius [in].
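Plugging in the measured values gives a quick check of the requirement. This sketch assumes the starting force splits evenly across the four wheels and, like the text, neglects the small belt loss:

```python
def per_wheel_torque_ozin(total_force_lbf, wheel_radius_in, n_wheels=4):
    """Torque at each wheel (oz-in) needed to overcome a measured drag
    force, with the load shared evenly across n_wheels. Conversion:
    1 lbf = 16 ozf."""
    per_wheel_force_oz = total_force_lbf * 16.0 / n_wheels
    return per_wheel_force_oz * wheel_radius_in

required = per_wheel_torque_ozin(36.0, 2.16)   # about 311 oz-in
margin = 1133.0 / required                      # roughly 3.6x headroom
```

The roughly 3.6x margin between the motor's rated output and the requirement is what underlies the extra-load claim in the text.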
The torque requirement is determined from the measured force and wheel radius. With these data, we find that the motor needs to output 311 oz·in to power this four-wheel shopping cart smoothly. Our motor produces about 1133 oz·in, which is sufficient for the purpose of our project. Given the linear relationship between the required motor torque and the mass of the cart, we are confident that the cart can carry at least another 60 kg of extra load, which is far more than typical groceries weigh.

7.2 Steering
To steer our shopping cart, we considered various options during the concept generation and selection process, and we chose the Ackermann steering mechanism (Figure 8) because of its simplicity and reliability. The classic Ackermann diagram is shown below.
Figure 8: Ackermann steering system [2]
We performed the calculation to help us find the steering angle. With the steering angles, we can further determine the length and mounting position of the steering rods, which are connected to the servo motor in the front of the cart.
For calculation purposes, we assumed that the difference between the steering angles of the outer and inner wheels is negligible, and that the thickness of the wheels is also negligible. The steering angle can be obtained from the inverse tangent of the ratio of the cart length to the desired turning radius. Usually the turning radius of a shopping cart is very small because the front wheels are free to rotate 360 degrees. However, in our case we are limiting the motion of the front wheels, and we also wish to have a steadier turning process. Combining these considerations, we set the turning radius to be about 3 times the cart length, and Equation 2 below gives us about 32 degrees.
θ = tan⁻¹(L/R) (2)

L: cart length; R: turning radius

The contact surface between wheel and ground can be modeled as plastic-on-plastic, which has a static friction coefficient of 0.4. Assuming the weight of the cart is evenly distributed over the four wheels, the force required to turn each front wheel is calculated using F = (m/4) × g × μ.
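The friction estimate in this subsection can be reproduced numerically. The 32 mm moment arm and the kg−cm torque unit are our assumptions (the report's arm length and units did not survive cleanly); with them, the result lands close to the 9.52 figure quoted for the servo.

```python
G = 9.81        # m/s^2
MU = 0.4        # plastic-on-plastic static friction, from the report

def turn_force_n(cart_mass_kg, mu=MU, wheels=4):
    """F = (m/4) * g * mu: force to pivot one front wheel when the
    cart's weight is split evenly over its four wheels."""
    return cart_mass_kg / wheels * G * mu

def steer_torque_nmm(force_n, arm_mm):
    """Torque = force x moment arm at the steering-rod attachment."""
    return force_n * arm_mm

f = turn_force_n(29.8)
print(round(f, 1))                       # 29.2 N per front wheel
tau = steer_torque_nmm(f, 32)            # 32 mm arm is an assumption
print(round(tau), round(tau / 98.07, 2)) # ~935 N-mm, i.e. ~9.5 kg-cm
```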
Assuming the mounting position of the steering rod is 2 from the center of the front wheel, the required torque is calculated using the governing equation τ = F × d, which tells us the servo motor needs to generate 9.52 kg−cm of torque.

7.3 Problems Analysis
One of the primary specifications is that the cart should follow the user at a safe distance while the user moves along the aisle scanning items, but remain stationary while the user retrieves items and places them in the cart. The cart can hence be considered to be in one of two modes: the first occurs as the user scans for an item, and the second when the user selects an item and chooses to place it in the cart. Our chosen concept tackles the above by using a camera.
The difficult part of this design is that the cart needs to be able to distinguish between the two modes via information gathered from the sensors mounted. Currently, our cart is coded so it stops when the person it follows is within a certain distance in front of the camera (approximately six feet). But ideally in the future, we would like to achieve this function by modifying the camera algorithm so it not only detects a human but also recognizes the gestures that he/she is making. For instance, we may determine that the user would like to place an item in the cart if the user’s most recent action was reaching out for an item and picking it up. The latter part of this problem is particularly difficult and will only be tackled once the first portion has been accomplished.
A user would also like the cart to operate and transition smoothly between the two modes. This requires that the cart be able to alter direction and speed easily. Our concept achieves this by running two separate control loops: one for the direction of the cart and the other for its velocity. Although the two loops share the same information sources (the visuals from the camera and the output of the ultrasonic sensor), they extract different inputs from them: the direction loop acts on the x-y position of the user in the image, while the velocity loop takes into account the depth of the user. However, this may still be insufficient to guarantee 'smoothness', since our algorithm must also run in near real time.
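The two loops described here can be sketched as follows. The 1/2/3 command codes match the serial codes sent in the Appendix 14.3 listing, and the ~6 ft (≈183 cm) stop distance is the one quoted earlier; the proportional gain is purely illustrative.

```python
def steering_command(x_px, img_width=200, deadband_px=15):
    """Direction loop: bang-bang on the user's horizontal pixel position.
    Returns the serial codes used in the appendix code:
    1 = left, 2 = right, 3 = straight."""
    centre = img_width / 2
    if abs(x_px - centre) < deadband_px:
        return 3
    return 2 if x_px > centre else 1

def speed_command(distance_cm, stop_cm=183, gain=0.01):
    """Velocity loop: proportional on depth from the ultrasonic sensor,
    stopping once the user is within ~6 ft. Gain is illustrative only."""
    error = distance_cm - stop_cm
    return 0.0 if error <= 0 else min(1.0, gain * error)

print(steering_command(140))          # 2: user is right of centre
print(steering_command(100))          # 3: inside the deadband
print(round(speed_command(250), 2))   # 0.67 duty at 2.5 m
```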
8.0 Manufacturing Plan
At the moment, all the concepts have been determined. For the sensing/connectivity issue mentioned in Design Review 2, we decided to tackle it
using both a Raspberry Pi camera and an ultrasonic sensor, after several trials with the Xbox Kinect camera system. Since Design Review 2, we have met the following milestones:
Fabrication: All parts were machined three weeks before the deadline to leave enough time for debugging and modification. This three-week window also covered the time-consuming assembly processes and unexpected challenges. Since the project was based on an existing shopping cart, several compromises still had to be made after machining so the parts could be adapted to the original design. Unlike projects in which everything follows the team's original design, many changes and design modifications were made here as we moved forward with the manufacturing process. The bottom line was to have everything ready for code testing by April 24th, 2016.
Arduino and Raspberry Pi Programming (Electronics Setup): This task ran continuously from March 2 to April 29. The connection between the Raspberry Pi and Arduino boards was successfully established, and we are able to drive the motor using both boards. We drove the motor with appropriate acceleration and RPM using a Raspberry Pi camera image-processing algorithm, and our shopping cart was controlled properly by the algorithm during the design expo.
The overall plan is also illustrated in the updated Gantt Chart below:
Chart 1: Project Milestones

9.0 Manufacturing and Test Results
The functionality of the autonomous shopping cart can be divided into many simple, independent tasks. Although achieving the final goal requires these tasks to work together, they can still be tested independently. Testing
independently saves a lot of time, since the tests can be performed in parallel. As discussed earlier in the design section, the testing was divided into the same components: the propulsion, steering, and electronics systems.

9.1 Propulsion System
The propulsion system was designed with a rear-wheel drive concept. The existing wheels from the cart were adapted so they would be compatible with the Dunkermotoren G63x55 DC motor that we are using to power our shopping cart.
The wheels are attached to the drive axle using two hub assemblies (Figure 9). Using the mills available in the student machine shops, four clearance holes for ¼-20 screws were drilled 1.25 inches from the center on each rear wheel. These holes are used to mount the hub assemblies.
Figure 9: Hub assemblies attached to the rear wheels
The two-piece hub assemblies were manufactured using the OMAX Water Jet
Cutter available in the student machine shop and the finer detailing was performed using the lathe machine. A 3-inch diameter bottom hub base piece was manufactured with four ¼-20 clearance holes that aligned with the drilled holes located on the
shopping cart wheels. In addition, four 10-32 tapped holes were located 1.15 inches from the center. These equally spaced holes were offset 45° from the ¼-20 clearance holes.
This piece attaches directly to the wheel and acts as a platform for the top hub piece which was fixed to the drive axle. The top hub piece possesses four 10-32 clearance holes that line up with the corresponding holes on the bottom hub piece. The drive axle is a 12 mm diameter, 28-inch long steel rod. Set screws were used to secure the hub assembly to the drive axle.
The axle is threaded through two bearing fixtures that protrude from the rear wheel casters of the shopping cart. Each bearing fixture consists of a 2-inch by 3-inch by 0.25-inch aluminum plate. A ¼-20 and a 10-32 clearance hole were aligned with the two holes located on each rear wheel caster and drilled into each aluminum plate. Bolts of corresponding sizes were used to secure each of these two aluminum plates to the wheel casters. Two 12 mm McMaster flange bearings were bolted to both sides of each aluminum plate using 10-32 bolts and nuts. These two bearing fixtures allowed the steel drive axle to be mounted to the shopping cart while still spinning freely.
Compression springs of roughly ¾-inch diameters were placed between the bearing fixtures and the wheel hubs to ensure that the axle does not slide laterally within the flange bearings.
An 8 mm pitch, 24-tooth steel timing pulley with a 20 mm thickness and 12 mm bore diameter was installed concentrically on the drive axle. An 8-32 hole was drilled in the side of the pulley hub, and the pulley was placed at the center of the axle's horizontal length and fixed in position with an 8-32 set screw.
A 760 mm long timing belt with an 8 mm pitch interfaces the drive axle with the Dunkermotoren G63x55 motor. The motor's SG80 worm gearbox carries a 22-tooth steel sprocket that drives the timing belt, transferring power to the drive axle. The drive motor was mounted on a 12-inch by 18-inch wooden platform cut to fit the rear region of the lower rack of the shopping cart. A 6-inch by 3-inch notch was cut at the center of the wooden platform with a handsaw to make space for the motor sprocket, and the timing belt is fed through this gap onto the sprocket. To ensure that the timing belt and motor do not interfere with any features of the shopping cart, several steel bars forming the lower rack were removed using bolt cutters; these bars were not crucial to the structural integrity of the cart.
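The belt-drive numbers in this section imply the following geometry. Everything below is derived only from the tooth counts, the 8 mm pitch, and the 760 mm belt length given in the text; the centre-distance formula drops the small wrap-correction term.

```python
from math import pi

PITCH_MM = 8.0

def pitch_dia_mm(teeth, pitch=PITCH_MM):
    """Pitch diameter of a timing pulley: pitch * teeth / pi."""
    return pitch * teeth / pi

motor_d = pitch_dia_mm(22)   # gearbox sprocket
axle_d = pitch_dia_mm(24)    # axle pulley
ratio = 24 / 22              # slight further reduction onto the axle
belt_teeth = 760 / PITCH_MM  # 95 teeth on the 760 mm belt
# Approximate centre distance (wrap-correction term neglected):
centre = (760 - pi * (motor_d + axle_d) / 2) / 2
print(round(ratio, 3), int(belt_teeth), round(centre, 1))  # 1.091 95 288.0
```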
The main cylindrical body of the G63 motor was mounted into a shallow groove milled into a 6-inch by 7-inch by 2-inch wooden block. This allowed two plastic brackets to secure the motor to the wooden block. Two ¼-20 clearance holes were drilled on both sides of the motor to secure the wooden block, motor brackets, and
wooden board. These three components were bolted together with four 2.5-inch long ¼-20 aluminum bolts. In addition, a sheet of thin neoprene rubber was placed between the brackets and the motor to increase friction and prevent the motor from sliding within the fixture.
Figure 10: Motor secured to wooden plate
The wooden platform is mounted to the shopping cart using Extra Heavy-Duty zip
ties placed at twelve locations. These zip ties are threaded through drilled holes in the wooden board and around the steel bars forming the lower rack of the shopping cart.
Figure 11: Wooden plate used for the rear motor system
9.2 Steering System
As shown in Figure 12 below, the steering system has a fairly complicated design. A total of twenty parts (excluding nuts and bolts) were designed and machined so that the shopping cart could be steered with a single servo motor. The OMAX Water Jet Cutter was used to cut out the mounting plate for the steering system, the Delrin steering arm, and the Delrin servo arm extender. These parts were then taken to the drill press to clean up the clearance holes. To match the servo arm extender, the original servo arm was drilled with the same hole pattern as the extender. The standoffs and steering rod were manufactured from ½-inch aluminum rod on a lathe. The wheel flange was cut out on the waterjet and then formed to its final shape on the sheet metal bender. The part drawings can be found in Appendix 14.8.
Figure 12: 2D Drawing of front steering system
After all parts were manufactured, assembly onto the shopping cart followed a technique similar to the propulsion system's. A ¼-inch-thick DFM board was cut to shape and mounted to the shopping cart chassis using zip ties, as shown in Figure 13 below. Holes were then drilled in the board to match the bolt pattern on the steering plate, which was attached using bolts and locking nuts. Four
standoffs were used between the two steering plates to create space for the servo motor and the steering arm. After those parts were secured with screws and nuts, a threaded male rod was inserted into the wheel end of the steering arm to control the two steering rods. The steering rods were fitted with ball joints at each end to ensure smooth transmission of motion.
Figure 13: The DFM plate used for mounting steering system
The first test performed validated the smooth operation of the steering system. Since the shopping cart's weight would significantly increase the required steering torque, the test was done with the front wheels lifted off the ground. The steering system was assembled and the servo motor was connected to the controller, which was commanded to repeat a 30-degree oscillating motion. The goal of this test was to see whether the turning of the servo motor directly controlled the turning motion of the front wheels. Testing without load protected the servo from overloading and made the results easier to observe.
This test exposed several problems in our system. The most serious issue was that the desired range of motion of the wheels could not be achieved. When the servo motor turned approximately 30 degrees, the steering arm also turned about 30 degrees; however, this motion failed to transfer to the front wheels, which turned only about 6 degrees, far less than the desired system specification. Upon investigation, we found that, due to the design of the front wheel casters, the wheels were inherently unstable. As shown in Figure 14, when the steering rod pushes the front wheel, the wheel is supposed to turn around its central axis, indicated by the dashed line. However, the wheel had significant freedom to swing to the sides, as shown by the phantom images in the figure. When pushed, most of the input motion was first consumed by this swinging, so the steering angle of the front wheel was significantly reduced. Tests were even run with the shopping cart on the ground to reduce the instability of the front wheels; however, the steering angle remained small.
Figure 14: The instability of the front wheel when pushed by the steering rod (exaggerated for clarity)
Another issue spotted during testing was that the teeth on the plastic servo arm were stripped. As a result, the arm failed to remain engaged with the servo motor. As shown in Figure 15, the teeth are the key to connecting the servo motor and the servo arm. Once the teeth on the servo arm started to slip, the arm lost its grip on the servo output and could no longer transfer any motion from the servo. The problem was believed to be caused by inappropriate handling during the assembly and testing phases.
Figure 15: The teeth used to connect servo and servo arm [3]
To fix the issues noticed during testing, several modifications were made to the design. First, since servo arm replacement parts were hard to find, another servo motor was purchased. The new servo motor could generate a maximum torque of 16 kg−cm. We believed a more powerful servo could actuate the steering movement with more ease and thus reduce wear on the arm teeth. Proper handling of the servo arm was also explained to each member of the team to prevent the same wear from happening again. As for the biggest problem (wheel wobbling), after consulting several machinists in the student machine shop, we concluded that replacing the front wheels would be the best solution. However, that would have taken extra time and budget, compromising the project schedule. Since the problem with the wheels could not be fixed directly, the next best option was to increase the steering arm length. By doing so, the steering rod could push the front wheel through a larger distance for the same servo steering angle. As shown in Figure 16, the distance from the steering arm axle to the steering rod directly controls the length of the steering rod's path. Increasing the rod's path length also increases the steering angle of the front wheels, since the wobbling motion of the wheels has limited travel.
Figure 16: How steering arm length affects steering arm path
New steering arm parts and the corresponding steering plate parts were redesigned and manufactured. With the same servo motor turning-angle limit, the length from the steering arm axle to the wheel-end attachment was increased from 1.2 in to 2.15 in, an increase of 79%. As a result, the length of the steering rod path increased by 12% and the steering angle of the front wheel increased to 20 degrees. One disadvantage of the design change was that a larger force was required to push the wheels, because the force arm from the steering arm axle to the wheel end was lengthened, as shown by Figure 17 and Equation 3 below.
F_servo × d_servo = F_steering_rod × d_steering_rod (3)
Figure 17: Force analysis on the steering arm
However, this issue was addressed by the stronger replacement servo motor mentioned earlier. The comparison between the old and new steering system designs is shown in Figure 18 below.
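The lever trade-off captured by Equation 3 can be quantified. The 1.2 and 2.15 arm lengths (taken here as inches) come from the redesign; treating the replacement servo's rating of 16 as kg−cm is our unit assumption.

```python
from math import radians

IN_TO_MM = 25.4
KGCM_TO_NMM = 98.07

def rod_travel_mm(arm_mm, servo_deg):
    """Arc length swept by the steering-rod attachment point."""
    return arm_mm * radians(servo_deg)

def rod_force_n(servo_torque_nmm, arm_mm):
    """Equation 3 rearranged: the force available at the rod for a
    given servo torque falls as the moment arm grows."""
    return servo_torque_nmm / arm_mm

old_arm, new_arm = 1.2 * IN_TO_MM, 2.15 * IN_TO_MM
torque = 16 * KGCM_TO_NMM   # 16 kg-cm servo rating (unit assumed)

# Longer arm: ~1.79x more rod travel per 30-degree servo swing...
print(round(rod_travel_mm(new_arm, 30) / rod_travel_mm(old_arm, 30), 2))
# ...but the available rod force drops by the same 1.79x factor,
# which is why the stronger servo was needed.
print(round(rod_force_n(torque, old_arm) / rod_force_n(torque, new_arm), 2))
```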
Figure 18: Comparison between old (bottom) and new (top) steering arm design

Another feature was added to the design to aid the transfer of torque from the servo motor to the front wheels. In the original design, the steering rods were connected through drilled holes on the existing wheel caster. The space on the wheel caster was very limited and provided only a small distance from the attachment point to the wheel center, meaning more force was required to turn the wheel since the force arm was small. A wheel flange was therefore added to provide a longer force arm for the steering rod to push the wheel. As shown in Figure 19, the flange was made from a 3/16-inch aluminum sheet.
Figure 19: Comparison between with (right) and without (left) wheel flange design
With all these modifications, we performed more tests of the steering system. The same method was used: the cart was lifted off the ground and the linkage motion tested. The steering angle increased significantly from the first test, to 20 degrees. Even though this was smaller than the original design target, we found it sufficient to drive the cart in field tests.

9.3 Camera and Other Electronics
The person detection algorithm was first tested on a laptop using a webcam, where it ran at approximately 10 fps with the default parameters and the camera capture resolution set to 250 by 400 pixels. If the image resolution was increased, the lag time became unsuitable for a near real-time application. When the same algorithm was run on the Raspberry Pi, the frame rate dropped to 0.25 fps, which was unacceptable based on our design specifications. This decrease may be attributed to the lower processing power of the Raspberry Pi. To compensate, the window stride over which the SVM filter is applied was set to 2 pixels in the x and y directions. Furthermore, the scale of the detection pyramid was increased to 1.2 from its default value of 1.05. Prior to these adjustments, the Raspberry Pi took approximately 4 seconds to capture and process an image; with them, the time was reduced to the range of 0.2 to 0.3 seconds. This is an approximate frame rate of 4 to 5 fps, which lies in our desired range.

The servo was tested using a potentiometer and an Arduino. The potentiometer's position was read by the Arduino and mapped to a range between 0 and 180 degrees; this value was then used to drive the servo to the desired position. The code in Appendix 14.6 achieves this and also echoes the servo set point for debugging purposes. By adjusting the servo position via the potentiometer and reading the corresponding set point, we were able to find the set points for the straight, right, and left positions of the wheels. These values were then incorporated into the motor and servo driver code so that the cart responds correctly in each control mode.
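The speed-up from coarsening the pyramid scale can be estimated from first principles. A HOG person detector slides a 64 x 128 window over an image pyramid; the level count below, with the 400 px capture height, gives a rough sense of why scale 1.2 is so much cheaper than 1.05. This is a back-of-the-envelope model, not the report's measurement.

```python
from math import floor, log

def pyramid_levels(img_h, win_h=128, scale=1.05):
    """How many pyramid levels a sliding-window detector visits before
    the detection window no longer fits in the downscaled image."""
    return floor(log(img_h / win_h) / log(scale)) + 1

slow = pyramid_levels(400, scale=1.05)  # the OpenCV default scale
fast = pyramid_levels(400, scale=1.2)   # the tuned value
print(slow, fast, round(slow / fast, 1))  # 24 7 3.4
```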
Prior to mounting the motor to the cart, the H-bridge circuit and motor were tested using the motor test code shown in the Appendix 14.5. The Arduino code turns the motor in one direction for 2 seconds, brakes the motor, and then spins it in the opposite direction at full speed.
Similarly, the ultrasonic sensor was tested prior to being included in the final code. The raw signal from the ultrasonic sensor is converted into centimeters using the conversion factor provided in the component datasheet. The code is shown in Appendix 14.7.
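The report does not name the sensor or its conversion factor, but for a typical HC-SR04-style ultrasonic module the centimeter conversion looks like this (the 0.0343 cm/µs figure is the speed of sound at room temperature, halved for the round trip):

```python
SOUND_CM_PER_US = 0.0343   # speed of sound at ~20 C

def echo_to_cm(echo_us):
    """Convert a round-trip echo time in microseconds to distance in cm.
    Equivalent to the common 'divide by 58' rule of thumb."""
    return echo_us * SOUND_CM_PER_US / 2

print(round(echo_to_cm(580)))    # ~10 cm
print(round(echo_to_cm(10672)))  # ~183 cm, the cart's 6 ft stop distance
```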
10.0 Discussion
As with any engineering project, there are a number of improvements that could be made in the future to develop a more polished design. Nonetheless, many aspects of our final project worked very well, and overall we were all very happy with the end product.
Our project accomplished its intended goal: the cart correctly followed individuals as they walked through the Soda-Jacobs Hall breezeway. This reflects a correctly programmed algorithm, an effectively machined shopping cart, and correct use of all sensors. Specifically, the Ackermann steering mechanism at the front of the cart allowed smooth steering and positioning of the cart toward the user's position. The rear propulsion system allowed steady, tranquil driving that prevented jerky motion when the cart needed to stop. The Raspberry Pi camera provided a strong connection between the cart and the user and did not require the user to wear anything to be detected, an added convenience. In addition, the overall control system maintained a cart-user distance that was safe, appropriate, and crash-preventative. Lastly, our shopping cart was both cost-effective and aesthetically appealing, as it was designed with frugality and all user needs in mind. Since we borrowed the shopping cart from Safeway and the rear motor from Tom Clark, our out-of-pocket costs were relatively low compared to other ME102B Spring 2016 project teams. Moreover, with the various equipment in the Mechanical Engineering Machine Shop, we were able to refine our cart and make it visually attractive.
If we were given more time than just one semester, however, there are a number
of aspects we would pursue to make our design even better. Specifically, we would start by developing a more rigid steering system, swapping the original cart wheels for more rigid wheel assemblies. The wheels and casters included with the shopping cart were not designed to be an integral part of the steering. As a result, the bearings that give the wheel casters free rotation were rather loose, making our steering system less effective than we would have liked.
We would also re-machine the steering plate that holds the servo motor to allow a wider steering angle, which would in turn give the cart a tighter turning radius. For the rear motor system, we would purchase and implement a stronger motor that provides more power for propulsion. We would also redesign this rear system to allow backward motion and to better track human motion, so that the new motor can respond to the sudden, jerky movements of an average human.
Moreover, we would also like our shopping cart to follow only the designated user and not accidentally start following another person. This happened several times during the exposition; for example, when an individual was testing the cart's abilities and another person entered the Raspberry Pi camera's field of view, the cart would
stop and follow the wrong user. We would like to improve this by making our camera follow a designated item instead of a person; perhaps users could wear a wristband that the shopping cart can detect and follow. We would also like to implement an obstacle detection algorithm to prevent accidents during the shopping experience. In terms of the Raspberry Pi camera, we would also like to extend its range for future shopping cart designs, buying a camera and sensing system that provides wider coverage of users.
Overall, if we were to redesign the hardware for our project, we would opt to
fabricate our own shopping cart rather than modify an existing one, because it is much more difficult to implement features into an already established design. Instead, we would keep all of the desired features in mind while designing the cart itself from the start. However, in the timeframe we were given, we may not have been able to accomplish this ideal task.
In addition, our shopping cart exhibited considerable lag time during the
exposition. This means that the controls system with our current prototype needs to be redesigned to provide for a more responsive and quick cart. We can accomplish this by purchasing and using a Raspberry Pi camera with a higher processing speed and more RAM than our current design. We could also switch the algorithm we used. Instead of using the OpenCV Pedestrian Detector for the image processing, we could research other types of detectors used in industry (i.e. Google’s self-driving car’s algorithm) and use them. We could also implement a more sophisticated detector that could separately identify individual users. In doing so, we hope the lag that was prominent in our current design will not be too much of an issue for future modifications. We could also make our shopping cart weigh less in order to reduce the associated lag. One way we could do this is to use a shopping cart from CVS instead of Safeway; CVS’s shopping carts are smaller and use less material than Safeway’s which should allow for smoother driving. In addition, we could also improve the lagging experience by purchasing and using a more powerful propulsion motor.
Lastly, the positioning of the camera on our cart can be improved. The camera
was placed at the back, on the child seat, in our current design. However, this means the cart cannot "see" directly in front of itself (e.g., a can of soup lying in front of its wheels). This is not a safe design, and we would like to improve it by repositioning the camera to the front of the cart. In addition, we would add sensors to the sides and back of the cart so that the algorithm has access to a full 360 degrees of vision. This would allow us to program a shopping cart that understands its position with respect to the user, other customers, shelves, children, and so on. In doing so, we can create a powerful design that effectively fulfills all of the user needs.
Overall, we were very happy with our results. Our project worked amazingly and
we received a lot of positive feedback from professors, GSIs, fellow students, the machinists in the Machine Shop, and other visitors during the exposition. We were
especially glad to be able to graduate from college having delivered such an awesome capstone project and potential startup idea. It is designs like this shopping cart that continue to inspire and motivate us to pursue occupations that improve the infrastructure of our society and enhance the quality of life.

11.0 Conclusion
The goal of this project was to improve the shopping experience at stores by optimizing the standard shopping cart. We are a group of ambitious mechanical engineering students whose aim was to build an autonomous shopping cart that follows customers wirelessly as they grocery shop. The cart is targeted to benefit all shoppers at various stores, with an emphasis on the elderly, disabled and handicapped persons, and store managers. It allows customers to focus on scanning aisles for needed supplies without having to laboriously drag a cart around, making grocery shopping a more efficient and enjoyable experience overall.
We were able to successfully design and manufacture the autonomous shopping
cart. The fundamental mechanical design consisted of a propulsion system driving the back wheels, a steering system turning the front wheels, and a camera used as a visual tracking system. The camera locates users by drawing a bounding box around them in the image data, which is then processed by a Raspberry Pi and passed to an Arduino microcontroller that commands the cart's motors. The final product proved effective and aesthetically pleasing. The cart was able to follow users in multiple directions and maintain a safe distance from them at all times. In addition, the cart was easy to use and performed smoothly.
Nonetheless, there were also aspects of the cart with room for improvement. In future designs, slack in the steering system will need to be considered further to ensure the cart's full turning capability. The vision system will need to be optimized to detect only the designated user, as opposed to detecting others in its sight as it does currently. In addition, a more powerful processor will be needed to improve lag times, and the camera position will need to be optimized to perfect the visual tracking system's distance sensing.
Overall, the autonomous shopping cart project was a success and we are very pleased with the results. This project proved to be a potential next step in changing the shopping experience of the future.
12.0 ACKNOWLEDGEMENTS

We would like to personally thank Professor Lin, the Graduate Student Instructors (Sina Akhbari, Eric Sweet, and Ilbey Karakurt), the machine shop experts (Brien, Mick, Jacob, Dennis, Jesse, and Scott), and Tom Clark for assisting and providing clarifications on our project during lab times and office hours. Without their help, this project would not have been possible.

13.0 References

[1] "This Shopping Cart of the Future Creepily Follows You Around Stores." Entrepreneur, 31 Dec. 2014. Web. 09 May 2016.

[2] Hedrick, Karl. "ME131 Vehicle Dynamics and Control." HW2: Ackermann's Geometry. University of California, Berkeley, Berkeley, CA. Reading.

[3] "Micro Servo - High Torque Metal Gear." Adafruit Industries. N.p., n.d. Web. 09 May 2016.
14.0 APPENDIX

14.1 Concept Generation List

I. Wheel powering:
4-wheel drive (4 motors): One motor for each wheel, for maximum torque to handle the weight in the cart.
3-wheel drive (2 motors): One motor drives the back two wheels to propel the cart forwards and backwards; the other motor drives the front wheel and allows turning.
Hub motors: Motors built into the cart's wheels for drive, stability, and aesthetic appeal.

II. Steering:
4-wheel direction: The cart turns by outputting different velocities to the motors.
One front wheel turning: Dedicating one wheel to turning and the others to propelling the cart.
Stepper motor for front wheel turning: A stepper motor drives the wheels while constraining them from turning uncontrollably.
Hub motors: Motors built into the cart's wheels for drive, stability, and aesthetic appeal.
Rack and pinion mechanism.

III. Appropriate distance:
Infrared LED and photodiode: Infrared sensor for measuring the distance from cart to user.
Ultrasonic system: Ultrasonic transducer system to measure the distance between the cart and objects in its path, feeding the information back so the motor generates the corresponding acceleration.
Shop surveillance camera to locate the customer: Use the shop surveillance system to locate customers and follow their movements.
Install multiple sensors for better data retrieval.
Color belt/wrist band: The user wears a piece of clothing that a sensor on the cart can detect for tracking purposes.
Beacon and infrared sensor: Infrared distance measurements are fed back to the cart's control system to ensure that an appropriate distance from cart to user is maintained at all times.

V. Crash prevention:
Brake on each wheel: A brake for each wheel to optimize safety.
Emergency brakes: Pads that quickly stop the wheels from turning in case of an impending crash.
Bumpers: Bumpers around the cart provide cushioning in case it crashes into an object.
Ultrasonic: An ultrasonic transducer system measures the distance between the cart and an object in its path and feeds that information back so the motor can slow down.
Hydraulic braking system: A braking system similar to a car's for crash prevention.

VI. Aesthetic:
Aerodynamic shape: Improvements to the standard shopping cart design for a futuristic look.
Customized chassis: A personally designed chassis that deviates from the standard shopping cart design.
Modular design: A modular design for items of different shapes and weights.
Streamlined design: An aerodynamic look for optimal aesthetic appearance.
LEDs: LEDs to make the cart stand out and give it a futuristic look.
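The "appropriate distance" options above all reduce to the same control idea: measure the gap to the user and feed it back to the drive motor. The following is a minimal sketch of that idea as a proportional controller; the function name, setpoint, gain, and speed limit are illustrative choices of ours, with only the 34 cm hard-stop threshold taken from the Arduino code in the appendix.

```python
# Hypothetical proportional follow controller: distance reading in,
# signed motor speed command out.  Setpoint, gain, and speed limit
# are illustrative values, not the cart's actual tuning.
def follow_speed(distance_cm, setpoint_cm=100.0, kp=0.5, max_speed=100.0):
    """Return a speed command (positive = forward) from an ultrasonic
    range reading; stop entirely when an object is too close."""
    if distance_cm < 34:                   # hard-stop threshold, as in the firmware
        return 0.0
    error = distance_cm - setpoint_cm      # positive when the user pulls ahead
    speed = kp * error                     # drive faster the larger the gap
    return max(-max_speed, min(max_speed, speed))   # clamp to motor range

print(follow_speed(140.0))   # 20.0: user ahead, drive forward gently
print(follow_speed(20.0))    # 0.0: obstacle inside the stop threshold
```

A real cart would additionally filter the raw ultrasonic readings and rate-limit speed changes to meet the "smooth operation" requirement.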
Figure 20: Quality Function Deployment Chart
14.3 Raspberry Pi Code

# import the necessary packages
from imutils.object_detection import non_max_suppression
from imutils import paths
from picamera.array import PiRGBArray
from picamera import PiCamera
import numpy as np
import imutils
import cv2
import time
import serial
import struct

####### Parameters to be Tuned ##########
device = "pi"
send_data_boolean = True
decision_horizon = .2  # units of seconds
decision_threshold = 1
deadband = 15
framerate = 10
img_width = 200
winStride_parameter = (2, 2)
scale_parameter = 1.1  # original value is 1.05

if device == "pi":
    address = '/dev/ttyUSB0'
else:
    address = 'COM3'

if send_data_boolean:
    ser = serial.Serial(address, 9600)
    test_bytes = bytearray([10])
    ser.write(test_bytes)

def send_data(ser, data_list):
    # write the command bytes only when the serial buffer is clear;
    # otherwise read back and print what the Arduino echoed
    if ser.inWaiting() == 0:
        bytes = bytearray(data_list)
        ser.write(bytes)
    else:
        read = ser.read()
        print struct.unpack('B', read)

def draw_arrow(image, p, q, color, arrow_magnitude=9,
               thickness=1, line_type=8, shift=0):
    # adapted from
    # draw arrow tail
    cv2.line(image, p, q, color, thickness, line_type, shift)
    # calc angle of the arrow
    angle = np.arctan2(p[1] - q[1], p[0] - q[0])
    # starting point of first line of arrow head
    p = (int(q[0] + arrow_magnitude * np.cos(angle + np.pi/4)),
         int(q[1] + arrow_magnitude * np.sin(angle + np.pi/4)))
    # draw first half of arrow head
    cv2.line(image, p, q, color, thickness, line_type, shift)
    # starting point of second line of arrow head
    p = (int(q[0] + arrow_magnitude * np.cos(angle - np.pi/4)),
    rects = np.array([[x, y, x + w, y + h] for (x, y, w, h) in rects])
    # feed array of rectangles to non_max_suppression to get a single
    # overlapping box for the person
    pick = non_max_suppression(rects, probs=None, overlapThresh=0.4)
    # draw the rectangle in the image for the picked rectangles
    for (xA, yA, xB, yB) in pick:
        cv2.rectangle(image, (xA, yA), (xB, yB), (0, 255, 0), 2)
    if len(pick) != 0:
        if len(pick[0]) == 4:
            # find the centroid of the bounding box assuming there is only
            # one person in the image, i.e. one bounding box
            centre = find_centroid(pick[0], img_width)
            decision_time_end = time.time()
            decision_diff = decision_time_end - decision_time_start
            y_diff = centre[1] - y_array[-1]
            x_diff = centre[0] - x_array[-1]
            x_array = np.append(x_array, centre[0])
            y_array = np.append(y_array, centre[1])
            image_centre = (img_width / 2)
            if abs(x_array[-1] - image_centre) < deadband and send_data_boolean:
                send_data(ser, [3])
            elif (x_array[-1] > image_centre) and send_data_boolean:
                send_data(ser, [2])
            elif x_array[-1] < image_centre and send_data_boolean:
                send_data(ser, [1])
    else:
        send_data(ser, [0])
    end_time = time.time()
    delta_t = end_time - start_time
    print "This took %f" % (delta_t)
    cv2.imshow("After NMS", image)
    # out2.write(image)
    # out.write(frame)
    rawCapture.truncate(0)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# cap.release()
# out.release()
# out2.release()
cv2.destroyAllWindows()
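The steering decision in the script above can be isolated as a pure function, which makes the deadband behaviour easy to test off the cart. This is our re-expression of the if/elif chain, not additional functionality; the command codes match what the script sends to the Arduino (3 = hold straight within the deadband, 2 = centroid right of the image centre, 1 = left of centre).

```python
# Standalone restatement of the deadband steering decision from the Pi script.
def steer_command(centroid_x, img_width=200, deadband=15):
    """Map the tracked person's centroid x-coordinate to a command byte."""
    image_centre = img_width / 2
    if abs(centroid_x - image_centre) < deadband:
        return 3          # inside the deadband: no steering correction
    elif centroid_x > image_centre:
        return 2          # centroid right of centre
    else:
        return 1          # centroid left of centre

print(steer_command(100))   # 3: dead centre
print(steer_command(180))   # 2: far right
print(steer_command(20))    # 1: far left
```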
14.4 Arduino Code

#include <Servo.h>

#define trigPin 12
#define echoPin 13

Servo myservo;
int turn_centre = 87;
int turn_deviation = 30;
int H_bridge_1 = 11;
int H_bridge_2 = 10;
int cart_state = 9;
int output_byte;
unsigned long starttime;
unsigned long endtime;
unsigned long delta_t = 0;

void setup() {
  Serial.begin(9600);
  myservo.attach(9);
  pinMode(13, OUTPUT);
  pinMode(H_bridge_1, OUTPUT);
  pinMode(H_bridge_2, OUTPUT);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  starttime = millis();
}

void loop() {
  long duration, distance;
  digitalWrite(trigPin, LOW);   // Added this line
  delayMicroseconds(2);         // Added this line
  digitalWrite(trigPin, HIGH);
  // delayMicroseconds(1000); - Removed this line
  delayMicroseconds(10);        // Added this line
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  distance = (duration / 2) / 29.1;

  int inputbyte;
  int motors_off = 2;
  // when the state is set to 0 read the buffer, else echo it back
  inputbyte = echo();
  if (inputbyte < 4) {
    cart_state = inputbyte;
  }
  if (distance < 34) {
    cart_state = 0;
  }
  if (delta_t > 10 && (cart_state == 2 || cart_state == 1 || cart_state == 3)) {
    starttime = endtime;
    motors_off = 1;
  }
  endtime = millis();
  delta_t = endtime - starttime;
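The distance conversion at the top of loop() assumes sound travels roughly 29.1 microseconds per centimetre, with the echo pulse covering the round trip to the obstacle and back. A quick check of that arithmetic (in Python; the function name is ours):

```python
def echo_to_cm(duration_us):
    """Convert a round-trip ultrasonic echo time (microseconds) to cm,
    mirroring the Arduino expression (duration / 2) / 29.1."""
    return (duration_us / 2) / 29.1

print(round(echo_to_cm(2000), 1))   # 34.4 -- roughly the cart's 34 cm stop threshold
```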
14.6 Servo Motor Test Code

#include <Servo.h>

Servo myservo;       // create servo object to control a servo
int potpin = 0;      // analog pin used to connect the potentiometer
int val;             // variable to read the value from the analog pin
int old_val = 0;

void setup() {
  myservo.attach(9);   // attaches the servo on pin 9 to the servo object
  Serial.begin(9600);
}

void loop() {
  val = analogRead(potpin);          // reads the value of the potentiometer (value between 0 and 1023)
  val = map(val, 0, 1023, 0, 180);   // scale it to use it with the servo (value between 0 and 180)
  if (old_val != val && old_val != val - 1 && old_val != val + 1) {
    Serial.println("New Val is:");
    Serial.print(val);
  }
  myservo.write(val);   // sets the servo position according to the scaled value
  old_val = val;
  delay(15);            // waits for the servo to get there
}

14.7 Ultrasonic Sensor Test Code

#define trigPin 12
#define echoPin 13
#define led 11
#define led2 10

delayMicroseconds(2); // Added this line
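The map() call in the servo test scales the 10-bit potentiometer reading onto the servo's 0 to 180 degree range using integer arithmetic. The equivalent computation, sketched in Python for clarity (the function name is ours; the formula matches Arduino's map() for positive ranges, where floor division and truncation agree):

```python
def arduino_map(x, in_min, in_max, out_min, out_max):
    """Integer re-scaling equivalent to Arduino's map() for positive ranges."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

print(arduino_map(512, 0, 1023, 0, 180))   # 90: mid-scale pot -> mid-travel servo
```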
Figure 21: Front Wheel Steering Flange
Figure 22: Steering Plate
Figure 24: Servo Motor Slider Piece
Figure 28: Steering Arm for the Steering System