Each OFDM symbol in an 802.11 transmission is followed by a guard interval, which keeps successive symbols from interfering with one another. Normally this interval is 800 ns, giving multipath reflections of one symbol time to die down before the next symbol arrives. 802.11n introduced the short guard interval (SGI), cutting the timer to 400 ns. This leaves less settling time between symbols, which can cause intersymbol interference within a single device's own transmissions. In tests done in a controlled environment, there was a definite increase in packet error rate: without SGI there was little to no packet error, while with SGI we saw a 30% increase in packet error rate. This is a significant increase, meaning far more symbol corruption is occurring. However, the throughput of a BSS using SGI was still significantly better than that of a BSS without it. While the error increased as predicted, it was not enough to offset the much higher rates that SGI provides. Is this true in an uncontrolled environment?
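The throughput gain comes directly from the timing arithmetic. As a sketch (not part of the tests above), an 802.11n OFDM symbol carries 3.2 µs of useful data plus the guard interval, so shrinking the GI from 800 ns to 400 ns shortens each symbol from 4.0 µs to 3.6 µs, about an 11% rate increase:

```python
SYMBOL_US = 3.2     # useful OFDM symbol duration, microseconds
LONG_GI_US = 0.8    # standard 800 ns guard interval
SHORT_GI_US = 0.4   # 802.11n short guard interval


def phy_rate(bits_per_symbol: float, gi_us: float) -> float:
    """PHY data rate in Mbps for a given guard interval."""
    return bits_per_symbol / (SYMBOL_US + gi_us)


# Example: 802.11n MCS 7 on a 20 MHz channel carries 260 bits
# per symbol (52 data subcarriers x 6 bits x 5/6 coding rate).
BITS_MCS7_20MHZ = 260

long_gi = phy_rate(BITS_MCS7_20MHZ, LONG_GI_US)    # 65.0 Mbps
short_gi = phy_rate(BITS_MCS7_20MHZ, SHORT_GI_US)  # ~72.2 Mbps
print(f"long GI:  {long_gi:.1f} Mbps")
print(f"short GI: {short_gi:.1f} Mbps")
print(f"gain: {short_gi / long_gi - 1:.1%}")
```

This ~11% raw-rate gain is the margin the observed packet errors would have to eat through before SGI stopped paying off.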