# HW3-6 Adding noise - Distance

In the commented-out directions for HW3-6, we are told to incorporate noise into steering by choosing our steering angle from a Gaussian distribution of steering angles. For bearing, we are supposed to just add noise drawn from a Gaussian distribution. We aren't given any specific instructions for incorporating noise into distance.

I tried it both ways (separately): choosing the distance from a Gaussian distribution centred at the true distance, and adding noise drawn from a Gaussian distribution centred at 0. Both used `distance_noise` as the spread parameter. I was surprised to find that they performed differently: adding the noise gives slightly better results than drawing the distance from a Gaussian. The comparison was made by computing the True/False rate of the checker over 2000 runs; the success rate was ~88% for adding noise and ~84% for drawing the measurement from a Gaussian.

I can't see why one should be better than the other in this situation. Can anyone explain the difference? Also, which one is more true to reality? I submitted my assignment with the better results, but that's not to say that's how things play out in the real world.

UPDATE: I reran the program 10,000 times each; for the `n + random.gauss(0, noise)` case I got 90%, but for the `random.gauss(n, noise)` case I got 80.88%.

asked 09 Mar '12, 08:52 by AME (accept rate: 23%)
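For reference, here is a minimal sketch of the two variants being compared (the function names are mine, not from the assignment). One detail worth noting: `random.gauss` takes a *standard deviation* as its second argument, not a variance.

```python
import random

def distance_with_added_noise(d, distance_noise):
    # Variant 1: add zero-mean Gaussian noise to the true distance.
    # random.gauss(mu, sigma) -- sigma is a standard deviation.
    return d + random.gauss(0.0, distance_noise)

def distance_drawn_from_gaussian(d, distance_noise):
    # Variant 2: draw the noisy distance from a Gaussian centred
    # on the true distance.
    return random.gauss(d, distance_noise)
```

Both produce values distributed around the true distance `d` with spread `distance_noise`, so any measured performance gap between them should vanish as the number of runs grows.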

To be sure that I get the question right: you are saying that `n+random.gauss(0,noise)` is not the same as `random.gauss(n,noise)`? I checked the implementation of `gauss` (Python 2.7) in random.py:

```python
def gauss(self, mu, sigma):
    """Gaussian distribution.

    mu is the mean, and sigma is the standard deviation.  This is
    slightly faster than the normalvariate() function.

    Not thread-safe without a lock around calls.
    """
    # When x and y are two variables from [0, 1), uniformly
    # distributed, then
    #
    #    cos(2*pi*x)*sqrt(-2*log(1-y))
    #    sin(2*pi*x)*sqrt(-2*log(1-y))
    #
    # are two *independent* variables with normal distribution
    # (mu = 0, sigma = 1).
    # (Lambert Meertens)
    # (corrected version; bug discovered by Mike Miller, fixed by LM)

    # Multithreading note: When two threads call this function
    # simultaneously, it is possible that they will receive the
    # same return value.  The window is very small though.  To
    # avoid this, you have to use a lock around all calls.  (I
    # didn't want to slow this down in the serial case by using a
    # lock here.)

    random = self.random
    z = self.gauss_next
    self.gauss_next = None
    if z is None:
        x2pi = random() * TWOPI
        g2rad = _sqrt(-2.0 * _log(1.0 - random()))
        z = _cos(x2pi) * g2rad
        self.gauss_next = _sin(x2pi) * g2rad

    return mu + z*sigma
```

As you can see, `mu` is added at the very end, so there is essentially no difference between `n+random.gauss(0,noise)` and `random.gauss(n,noise)`. If I misunderstood your question, please clarify.

answered 09 Mar '12, 16:51 by jkp-2

Comment: Yeah, that's the comparison I was trying to make. I thought they should be the same too. I just reran my code 10,000 times each and for the `n+random.gauss(0,noise)` case I got 90%, but for the `random.gauss(n,noise)` case I got 80.88%. Regardless of the success rate aspect, is `random.gauss(n,noise)` more computationally efficient? It does only one operation compared to `n+random.gauss(0,noise)`, which does two.
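A quick way to convince yourself of this (a small sketch using only the standard library): reseeding the generator before each run makes the two expressions consume the same underlying random draws, and because `gauss` only adds `mu` at the very end, the results come out identical.

```python
import random

n, noise = 50.0, 5.0

random.seed(373)  # arbitrary seed
added = [n + random.gauss(0, noise) for _ in range(10)]

random.seed(373)  # same seed -> same underlying draws
drawn = [random.gauss(n, noise) for _ in range(10)]

print(added == drawn)  # -> True: the two forms give identical values
```

So any difference in checker pass rates between the two forms must come from run-to-run sampling variation, not from the noise model itself.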
(10 Mar '12, 02:53) AME

Well, it's one addition you save, so yes, it's more efficient, but it's so little that you will hardly notice. (10 Mar '12, 07:10) jkp-2
I can't see how it can be better; 2000 runs is probably not enough. Changing the mean of a Gaussian just moves its graph along the x axis; the shape of the distribution stays the same. So if `random.gauss(0, 5.0)` returns values in roughly ±25, say, you'd expect `random.gauss(50, 5.0)` to return values in roughly 25 to 75.

As for the real world, "it depends" is probably the best answer. But intuitively you might expect that the further you went, the bigger the error would be, so a fixed ±25 regardless of distance seems unlikely. Then again, in the real world your robot or car isn't likely to travel large distances without a sense-and-resample step in any case, so at road speeds with typical sensing and resampling rates it might work.

answered 09 Mar '12, 09:13 by Michael F

Comment: I reran the program 10k times each and for the `n+random.gauss(0,noise)` case I got 90%, but for the `random.gauss(n,noise)` case I got 80.88%. You may be right that it might not matter too much in a real-world situation where so many measurements are made that our d is very small. It's still weird! (10 Mar '12, 02:59) AME
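The intuition above, that error should grow with distance travelled, can be modelled by scaling the standard deviation with the distance. This is a hypothetical variant for illustration, not something the assignment asks for:

```python
import random

def noisy_distance(d, noise_fraction=0.05):
    # Hypothetical model: the standard deviation of the distance
    # error grows in proportion to the distance travelled, so a
    # long motion is proportionally as uncertain as a short one.
    return d + random.gauss(0.0, noise_fraction * abs(d))
```

With this model a 1 m move has a spread of about 0.05 m while a 100 m move has a spread of about 5 m, instead of the same fixed spread for both.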
From my point of view they are exactly the same. I wouldn't say that a 4% difference is a conclusive result. If you run the test again (2000 more times), maybe you will get inverted results.

answered 10 Mar '12, 03:20 by André August...

Comment: I just updated the original post with the latest results (they appeared in comments rather than the OP before): "I reran the program 10k times each and for the `n+random.gauss(0,noise)` case I got 90%, but for the `random.gauss(n,noise)` case I got 80.88%." (10 Mar '12, 03:29) AME

@AME Have you run the tests again with the same results? I tried with smaller numbers and both ways behave basically the same. (11 Mar '12, 14:31) Margarita Ma...

@marga I'm not sure which numbers you made smaller. Do you mean smaller alpha and d in the motion vectors? If so, I did mess around with alpha a little, but not d. If you meant the number of iterations: I started off running them at small numbers and they were getting results about the same as each other, but they were so close to the 80% threshold that I wanted to be a little more sure they were >80%, so I ran them more and more, and they began drifting apart, with `n+random.gauss(0,noise)` becoming clearly more successful. It wasn't until I ran them in the thousands that there was a definite winner, so to speak. (11 Mar '12, 17:11) AME