Homework 2: Autonomous Mobile Robots

A document containing the answers and figures should be uploaded to Gradescope as HW2writeup.pdf. All the code and the files needed to run the code are to be uploaded to Canvas as a zip file named HW2code.zip. Specific functions, as indicated below, should be uploaded to their corresponding assignments on Canvas.

Notes about autograded assignments:

    • We highly recommend you develop and debug the autograded functions in Matlab before submitting; the error messages the autograder provides are less explicit than Matlab's.

    • You may submit as many times as you like until the deadline.

    • Make sure to also include the functions in the zip file you upload to Canvas.

    • Reusing code: We encourage you to reuse your prior functions in the new functions. For the auto-graded assignments, if you want to use a previous function you have written (for example robot2global), simply copy-paste it below the function you are writing.

Dead Reckoning (45 Points)

The iRobot Create encoder information can be read from the distance and angle packets (using functions DistanceSensorRoomba and AngleSensorRoomba, respectively). These packets provide the distance the robot traveled and the angle it rotated since the sensor was last read.
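
For reference, reading and accumulating the encoder packets can be as simple as the sketch below. This is a sketch only: the serial-port handle name serPort and the 0.1 s polling period are assumptions, and the units returned by each function should be checked against the toolbox documentation.

    % Poll the Create's encoder packets and accumulate them (sketch)
    distSoFar = 0; angleSoFar = 0;
    for k = 1:50
        d   = DistanceSensorRoomba(serPort);   % distance traveled since last read
        phi = AngleSensorRoomba(serPort);      % angle rotated since last read
        distSoFar  = distSoFar + d;
        angleSoFar = angleSoFar + phi;
        pause(0.1)                             % polling period (arbitrary choice)
    end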


    1. Given a known initial configuration (x, y, θ) of the robot within a global frame, the distance traveled d and the angle the robot turned Δθ, compute the new configuration of the robot. Explain your calculation. Note that you cannot assume a "turn then move" scheme. (Hint: Assume that the robot's wheels turn at a constant rate between sensor readings.)

    2. Edit the function integrateOdom.m to integrate the robot odometry as calculated in part 1 (this is known as "dead reckoning"). Submit this function in the autograded assignment Homework 2 integrateOdom.m on Canvas. (A sketch of one possible implementation appears at the end of this section.)

    3. Generate a trajectory and gather data: No need to submit the control code or the simulator map and config files.

        (a) Define a map for the simulator. You may use the map from Homework 1 or define a new one.

        (b) Write a control program that will drive the robot in the simulator. The robot motion should include simultaneous non-zero forward and angular velocity (meaning the robot motion should include arcs). The program should read the odometry, either by periodically calling DistanceSensorRoomba and AngleSensorRoomba and storing the data, or by periodically calling the function readStoreSensorData. Make sure the robot runs into a wall at least once. The program should be deterministic, that is, the trajectory should be the same in repeated runs (see the control-loop sketch at the end of this section).

        (c) Using ConfigMakerGUI, create a config file that defines errors on the odometry.

        (d) Run the simulation without a config file (no noise is added to the sensors). Save the data.

        (e) From the same initial pose as the previous run, run the simulation with a config file. Save the data.
    4. Plot in the same figure the true trajectory as captured by the overhead localization, the trajectory from the integrated noiseless odometry (using the function integrateOdom.m with data from the first run), and the trajectory from the integrated odometry that contains errors (using the function integrateOdom.m with data from the second run). Specify what error parameters were used. (A plotting sketch appears at the end of this section.)

    5. Did the integrated trajectory from the first run match the overhead localization? Did you expect it to? Explain.

    6. How does sensor noise affect the quality of the localization?

    7. What might cause errors in odometry measurements?
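
For questions 1 and 2, if the wheels turn at constant rates between readings, the robot traces a circular arc of (signed) radius R = d / Δθ, giving the update x' = x + R (sin(θ + Δθ) - sin θ), y' = y - R (cos(θ + Δθ) - cos θ), θ' = θ + Δθ; as Δθ → 0 this reduces to the straight-line update x' = x + d cos θ, y' = y + d sin θ. A minimal sketch of this update follows; the input/output format of the autograded integrateOdom.m may differ, so treat the signature here as an assumption.

    function finalPose = integrateOdom_sketch(initPose, d, phi)
    % Dead reckoning under the constant-curvature (arc) motion model.
    %   initPose  3x1 initial configuration [x; y; theta]
    %   d, phi    1xN distances traveled and angles turned between readings
    %   finalPose 3xN integrated poses
    N = numel(d);
    finalPose = zeros(3, N);
    pose = initPose(:);
    for k = 1:N
        th = pose(3);
        if abs(phi(k)) < 1e-6
            % Negligible rotation: straight-line limit of the arc model
            pose(1) = pose(1) + d(k)*cos(th);
            pose(2) = pose(2) + d(k)*sin(th);
        else
            R = d(k)/phi(k);    % signed arc radius
            pose(1) = pose(1) + R*(sin(th + phi(k)) - sin(th));
            pose(2) = pose(2) - R*(cos(th + phi(k)) - cos(th));
        end
        pose(3) = th + phi(k);
        finalPose(:,k) = pose;
    end
    end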
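
For question 3(b), a deterministic open-loop controller can command fixed forward/angular velocity pairs for fixed durations. The sketch below assumes the toolbox command SetFwdVelAngVelCreate(serPort, v, w) for simultaneous forward and angular velocity; the velocities and durations are arbitrary choices, and whether the robot reaches a wall depends on your map.

    function dataStore = driveArcs_sketch(serPort)
    % Drive a fixed sequence of arcs while logging odometry (sketch).
    dataStore = struct('odometry', []);
    segments = [0.2 0.3; 0.2 -0.5; 0.3 0.0];  % [v (m/s), w (rad/s)] per segment
    tic
    for s = 1:size(segments, 1)
        SetFwdVelAngVelCreate(serPort, segments(s,1), segments(s,2));
        tSeg = toc;
        while toc - tSeg < 5                  % fixed 5 s per segment: deterministic
            d   = DistanceSensorRoomba(serPort);
            phi = AngleSensorRoomba(serPort);
            dataStore.odometry(end+1,:) = [toc, d, phi];
            pause(0.1)
        end
    end
    SetFwdVelAngVelCreate(serPort, 0, 0);     % stop the robot
    end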
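
For question 4, the comparison plot can then be produced along the following lines. The truthPose column layout [time x y θ] and the variables odomClean and odomNoisy (the logged [time d Δθ] rows from the noiseless and noisy runs) are assumptions; adjust them to your own logging format.

    % Integrate both odometry logs from the true initial pose (sketch)
    pose0     = dataStore.truthPose(1, 2:4)';
    trajClean = integrateOdom(pose0, odomClean(:,2)', odomClean(:,3)');
    trajNoisy = integrateOdom(pose0, odomNoisy(:,2)', odomNoisy(:,3)');
    figure; hold on
    plot(dataStore.truthPose(:,2), dataStore.truthPose(:,3), 'k-')
    plot(trajClean(1,:), trajClean(2,:), 'b--')
    plot(trajNoisy(1,:), trajNoisy(2,:), 'r:')
    legend('overhead localization', 'noiseless odometry', 'noisy odometry')
    xlabel('x (m)'); ylabel('y (m)'); axis equal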

Expected Depth Measurement (30 Points)

In this section you will write a function that, given the robot pose and the map, predicts the depth measurements. Note that the RealSense sensor provides a depth image that corresponds to the depth of obstacles along the sensor-fixed x-axis, as shown in Fig. 1.



[Figure 1: Depth information. The figure shows the Create with the sensor-fixed x- and y-axes and the plane of the camera; for an object in view, the range is the straight-line distance from the camera to the object, while the depth is the component of that range along the sensor-fixed x-axis.]



    1. Edit the file depthPredict.m to calculate the expected depth measurements (based on the pose) for a robot operating in a known map. These depth measurements correspond to the distance between the sensor and the closest obstacle. Assume the sensor reference frame is not rotated with respect to the robot-fixed reference frame. You may want to use the provided function intersectPoint.m, which calculates the intersection point of two line segments. Hint: why do we need to provide the sensor-fixed frame origin?

Note that the Field of View (FOV) of the depth image is centered around the sensor-fixed x-axis; the first element in the vector is the leftmost point and the last element is the rightmost point. We recommend looking at the documentation of the RealSenseDist function regarding the format of the depth measurements. Submit this function in the autograded assignment Homework 2 depthPredict.m on Canvas. (A sketch of one possible structure appears at the end of this section.)

    2. You may continue to work with the same control program and config file from the previous section, or create new ones. Your config file should have non-zero values for the RealSense noise (in the "St. Dev." column). Make sure you are collecting and saving the following data (in dataStore):

        ◦ truthPose (measurements from overhead localization)

        ◦ rsdepth


    3. In the simulator, load a map (you may use the box map), load the config file, and run your control function. Save the data.

    4. Based on the map and dataStore.truthPose, and given that the sensor (in the simulator) is positioned 0.16 m along the x-axis of the robot-fixed frame, calculate the expected depth measurements. Plot the actual (simulated) depth measurements and the expected depth measurements on the same graph. Are they identical? Do you expect them to be? Explain. (A comparison sketch follows.)
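
For question 1 above, one way to structure depthPredict.m is to cast one ray per element of the depth vector across the FOV, intersect each ray with every wall segment, keep the nearest hit, and convert range to depth by projecting onto the sensor-fixed x-axis (depth = range * cos α, where α is the ray's angle from that axis). In the sketch below, the map format [x1 y1 x2 y2] per wall and the interface of intersectPoint (a hit flag plus intersection coordinates) are assumptions; check the provided intersectPoint.m for its actual signature.

    function depth = depthPredict_sketch(robotPose, map, sensorOrigin, angles)
    % Expected depth along each ray for robotPose = [x y theta] (sketch).
    %   map          one wall segment [x1 y1 x2 y2] per row
    %   sensorOrigin [xs ys], sensor origin in the robot-fixed frame
    %   angles       1xK ray angles (rad) from the sensor x-axis, leftmost first
    x = robotPose(1); y = robotPose(2); th = robotPose(3);
    % Sensor origin expressed in the global frame
    sx = x + sensorOrigin(1)*cos(th) - sensorOrigin(2)*sin(th);
    sy = y + sensorOrigin(1)*sin(th) + sensorOrigin(2)*cos(th);
    K = numel(angles);
    depth = inf(K, 1);                 % stays inf if a ray hits nothing
    maxRange = 10;                     % ray length; longer than any map dimension
    for k = 1:K
        a  = th + angles(k);           % global direction of ray k
        ex = sx + maxRange*cos(a);
        ey = sy + maxRange*sin(a);
        for w = 1:size(map, 1)
            [isect, ix, iy] = intersectPoint(sx, sy, ex, ey, ...
                                map(w,1), map(w,2), map(w,3), map(w,4));
            if isect
                range = hypot(ix - sx, iy - sy);
                % Depth is the range projected onto the sensor x-axis
                depth(k) = min(depth(k), range*cos(angles(k)));
            end
        end
    end
    end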
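
For question 4, the expected depths can then be computed row by row from dataStore.truthPose with the sensor origin at [0.16 0] (from the assignment). The nine-ray, 54-degree FOV and the rsdepth column layout below are assumptions about the simulator; check the RealSenseDist documentation for the actual values, and note that the argument order follows the sketch above.

    % Compare expected and simulated depth measurements (sketch)
    fov    = 54*pi/180;                    % assumed horizontal FOV
    angles = linspace(fov/2, -fov/2, 9);   % leftmost ray first
    sensorOrigin = [0.16 0];               % given in the assignment
    n = size(dataStore.truthPose, 1);
    expDepth = zeros(n, numel(angles));
    for i = 1:n
        expDepth(i,:) = depthPredict_sketch(dataStore.truthPose(i,2:4), ...
                            map, sensorOrigin, angles);
    end
    figure; hold on
    plot(dataStore.rsdepth(:,3), 'r')      % leftmost ray, simulated (layout assumed)
    plot(expDepth(:,1), 'b')               % leftmost ray, expected
    xlabel('sample index'); ylabel('depth (m)')
    legend('simulated', 'expected')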
