S4GRU

Sprint LTE Coverage Maps via Sensorly


ATTENTION S4GRU MEMBERS: We have partnered with Sensorly to collaborate on a thread here at S4GRU. Above is an embedded Sensorly Sprint LTE map for our members' use. In the thread below, feel free to post comments or questions related to Sensorly coverage, the Sensorly app, or other relevant topics. Sensorly will stop by occasionally to participate in the conversation and answer questions as they have time.

 

Please be respectful and constructive with your questions and comments. We appreciate being able to have direct access with these guys, as the Sensorly app is such an integral part of what we wireless enthusiasts do to track Network Vision/LTE deployment.

 

Robert


Thanks! I have usually not been able to view the Sensorly map from my desktop at my usual daytime location because it is blocked by the local proxy server. (And Sensorly on my phone is almost useless there until NV reaches the area with some usable bandwidth.) But I can view the embedded version above.


This is perfect considering I sometimes have trouble with sensorly.com when using my iPad. I'll start using this version if it continues to cooperate.

 

Thanks a ton!


I love this map. As a Sensorly user who has mapped a small amount of the 3G coverage in my area, it lets me know which areas near me I can still map. Specifically, it appears the park I was just at yesterday isn't really mapped. I will go back and map it once they open up the ball fields again.


A number of threads here and elsewhere discuss the relative strengths/weaknesses of various devices' radios. Most of these discussions lack hard numbers to back up the assertions. Perhaps the data acquired by Sensorly could provide the empirical data to compare pairs of devices, or to compare a single device against the "average" device's radio performance.

 

For example, there has been some discussion that the HTC EVO 4G LTE phone's LTE radio doesn't perform as well as some other devices. An extended query on the coverage map could overlay, for a particular radio such as the LTE radio, the coverage reported by the HTC EVO 4G LTE with the coverage reported by the Samsung Galaxy S III. I expect that when viewing the overlay at a high zoom level (lots of detail), one of the devices will show either a larger coverage area or a stronger signal in the same coverage area.

 

Another interesting metric would be something along the lines of average signal strength per unit of coverage area for various devices for a specific radio. I expect a better radio will show a better average signal strength than a weaker radio. This metric would obviously require a fair amount of computation, but because the results won't change often the computation could be run as needed. Perhaps such a report could goad certain cellular companies into fixing problems with a device's radio that aren't obvious without munching a lot more data than is available to an individual?
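To make the proposed metric concrete, here is a minimal sketch of "average signal strength per unit of coverage area": snap each reading to a grid cell, average within a cell so heavily sampled spots don't dominate, then average across cells per device. The sample data, device names, and cell size are all hypothetical, and Sensorly's actual aggregation almost certainly differs.

```python
from collections import defaultdict

# Hypothetical Sensorly-style samples: (device, latitude, longitude, dBm)
samples = [
    ("EVO LTE", 31.5491, -97.1467, -95),
    ("EVO LTE", 31.5502, -97.1455, -102),
    ("Galaxy S III", 31.5493, -97.1469, -88),
    ("Galaxy S III", 31.5601, -97.1502, -110),
]

def avg_signal_per_cell(samples, cell=0.001):
    """Per device: average dBm within each grid cell, then across cells.

    Averaging per cell first keeps one heavily driven road from
    dominating the device's overall figure.
    """
    cells = defaultdict(lambda: defaultdict(list))
    for device, lat, lon, dbm in samples:
        key = (round(lat / cell), round(lon / cell))  # snap to ~100 m grid
        cells[device][key].append(dbm)
    result = {}
    for device, grid in cells.items():
        cell_means = [sum(v) / len(v) for v in grid.values()]
        result[device] = sum(cell_means) / len(cell_means)
    return result
```

With real data the cell size and the minimum samples per cell would need tuning, but the shape of the computation would be roughly this.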

 

Bob


A number of threads here and elsewhere discuss the relative strengths/weaknesses of various devices' radios. Most of these discussions lack hard numbers to back up the assertions. Perhaps the data acquired by Sensorly could provide the empirical data to compare pairs of devices, or to compare a single device against the "average" device's radio performance.

 

That is an interesting proposition, but I do not think that it would produce particularly meaningful results. In most locations, Sensorly simply does not aggregate enough data to produce valid averages across multiple devices. Moreover, Sensorly does not track location with high enough resolution to compare data sets acquired at similar locations.

 

For example, device A is in location X and reports signal strength of -80 dBm. Device B is 15 feet away from location X and reports signal strength -70 dBm. Per Sensorly reporting, both handsets are in the same location. So, does device B offer superior RF performance, or is device A simply in a fade due to multipath?
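The resolution problem described above can be illustrated with a toy quantizer: two handsets roughly 15 feet apart collapse into the same map tile, so their -80 dBm and -70 dBm readings become indistinguishable by location. The tile size and coordinates here are illustrative assumptions, not Sensorly's actual binning.

```python
def tile_key(lat, lon, tile_deg=0.0005):
    """Quantize coordinates to a map tile roughly 50 m on a side (assumed size)."""
    return (round(lat / tile_deg), round(lon / tile_deg))

# Device A and device B about 15 feet (~0.00004 degrees of latitude) apart
device_a = (38.89510, -77.03640, -80)  # dBm
device_b = (38.89514, -77.03640, -70)  # dBm

# Both readings land in the same tile, so the 10 dB gap could be a better
# radio or simply a multipath fade -- the map cannot tell them apart.
same_tile = tile_key(device_a[0], device_a[1]) == tile_key(device_b[0], device_b[1])
```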

 

In the end, the RF performance data that you seek can really only be acquired in a highly controlled lab environment.

 

AJ

A number of threads here and elsewhere discuss the relative strengths/weaknesses of various devices' radios. Most of these discussions lack hard numbers to back up the assertions. Perhaps the data acquired by Sensorly could provide the empirical data to compare pairs of devices, or to compare a single device against the "average" device's radio performance.

 

For example, there has been some discussion that the HTC EVO 4G LTE phone's LTE radio doesn't perform as well as some other devices. An extended query on the coverage map could overlay, for a particular radio such as the LTE radio, the coverage reported by the HTC EVO 4G LTE with the coverage reported by the Samsung Galaxy S III. I expect that when viewing the overlay at a high zoom level (lots of detail), one of the devices will show either a larger coverage area or a stronger signal in the same coverage area.

 

Another interesting metric would be something along the lines of average signal strength per unit of coverage area for various devices for a specific radio. I expect a better radio will show a better average signal strength than a weaker radio. This metric would obviously require a fair amount of computation, but because the results won't change often the computation could be run as needed. Perhaps such a report could goad certain cellular companies into fixing problems with a device's radio that aren't obvious without munching a lot more data than is available to an individual?

 

Bob

 

I have the data. I have done dozens of tests on 10 Sprint devices, including radio performance for 1x, EVDO, WiMax and LTE. I was prepared to do an article series on my testing, but now I'm holding it because the Jelly Bean update on the EVO seems to have altered its LTE connectivity with weak signals. The EVO's LTE signal used to bounce around by 10-12 dBm when it got weak, and it was impossible to keep weak LTE signals unless it was in LTE-only mode.

 

That being said, I can tell you that in my radio performance testing, the EVO LTE is not a top RF performer. In 1x and EVDO, my testing yielded middle-of-the-pack performance. In LTE, the EVO performed pretty well with strong signals, but near the bottom of the pack with midrange and weak signals.

 

But this may have changed with the improvements in Jelly Bean. Because the signal strength is no longer bouncing around, it will likely keep the LTE connection and perhaps even improve in performance.

 

Robert via Samsung Note II via Tapatalk

 

 


With a large enough sample set, even the low resolution wouldn't be a problem. The precision errors would average out, and no device would have an unfair advantage because the errors would be close to randomly distributed.

 

However, calculating confidence intervals would be difficult, and the sample size of overlapping data points for any given device is probably too small, at least on LTE. Also, the fact that coverage is in flux because of active deployment makes the data almost worthless for this purpose at this point.
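The claim that zero-mean errors wash out with enough samples is easy to check with a quick Monte Carlo: simulate a fixed true signal plus random noise from location error and fading, and compare a small sample's mean with a large one. The noise magnitude here is an arbitrary assumption for illustration.

```python
import random
import statistics

random.seed(42)  # deterministic for reproducibility

def observed_mean(true_dbm, n, jitter_db=6.0):
    """Mean of n readings: true signal plus zero-mean Gaussian noise
    standing in for location error and multipath fading (assumed 6 dB sigma)."""
    return statistics.fmean(true_dbm + random.gauss(0, jitter_db) for _ in range(n))

# Same true signal both times; the small sample wanders, the large one settles.
small = observed_mean(-90.0, 10)
large = observed_mean(-90.0, 100_000)
```

The catch, as noted above, is that real Sensorly data rarely has 100,000 overlapping points per device per area, so the small-sample case is the realistic one.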

 

With a large enough sample set, even the low resolution wouldn't be a problem. The precision errors would average out, and no device would have an unfair advantage because the errors would be close to randomly distributed.

 

However, calculating confidence intervals would be difficult, and the sample size of overlapping data points for any given device is probably too small, at least on LTE. Also, the fact that coverage is in flux because of active deployment makes the data almost worthless for this purpose at this point.

 

I did my testing in Waco and Wichita Falls in fully deployed areas for this reason. I tested each device at four different intervals using averages. I also did testing in high signal areas, midrange signal areas and low signal areas. My interest was mostly in radio performance via signal strength and quality of signal. Although I also did data performance testing, this was not the purpose of my testing.

 

I also tested Verizon LTE and T-Mobile HSPA+ for comparison with all the Sprint LTE device testing.

 

Robert via Samsung Note II via Tapatalk

 

 


Now that sounds like a good methodology. I'm really looking forward to that comparison article. You're just waiting until you can get a new comparison for the Evo?

Now that sounds like a good methodology. I'm really looking forward to that comparison article. You're just waiting until you can get a new comparison for the Evo?

 

Yes. I don't feel good publishing results that show the EVO LTE at a noticeable disadvantage if those problems have been reduced or no longer exist. I'm hoping to arrange something this month still.

 

Robert via Samsung Note II via Tapatalk

 

 


With a large enough sample set, even the low resolution wouldn't be a problem. The precision errors would average out, and no device would have an unfair advantage because the errors would be close to randomly distributed.

 

However, calculating confidence intervals would be difficult, and the sample size of overlapping data points for any given device is probably too small, at least on LTE. Also, the fact that coverage is in flux because of active deployment makes the data almost worthless for this purpose at this point.

 

I think the best that Sensorly could do would be to report the median signal level per device during a select time period in which all of the devices in the survey were available. The law of large numbers, as you note, would presumably average out the inconsistencies among reporting locations.
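The per-device median suggested above is a one-liner with the standard library; the median is also more robust than the mean against the occasional deep-fade outlier. The device names and readings below are hypothetical.

```python
import statistics

# Hypothetical dBm readings per device within one survey window
readings = {
    "EVO LTE": [-88, -95, -101, -92, -97],
    "Galaxy S III": [-85, -93, -99, -90],
}

# Median per device: robust to a stray reading taken in a deep fade
medians = {device: statistics.median(v) for device, v in readings.items()}
```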

 

However, I would want to see the numbers of each device used to calculate those stats. And such a survey still could not account for location bias. Is it possible, for example, that Sprint LTE subs in markets with minimal deployment prefer the Galaxy S3?

 

Furthermore, any Sensorly signal measurement would take into account the downlink only. But that tells only half the story. Strong downlink reception combined with weak uplink transmission equals poor service, though the Sensorly downlink data would suggest otherwise.

 

AJ


Location bias shouldn't matter if only multidevice datapoints are used. If an area has only datapoints from one device, it should be ignored.

Also, can't a reasonable examination of uplink performance be taken from FCC documents?


Location bias shouldn't matter if only multidevice datapoints are used. If an area has only datapoints from one device, it should be ignored.

 

You are missing my point. People who live in markets with larger site spacing might disproportionately prefer device A, while people who live in markets with tighter site spacing might disproportionately prefer device B. That could be unlikely, but if true, it would constitute bias in the sample set.

 

Also, can't a reasonable examination of uplink performance be taken from FCC documents?

 

Well, we already use those FCC OET uplink stats to project RF performance. But Bob's proposal is to use Sensorly data instead. Additionally, uplink/downlink performance really need to be assessed holistically. The LG Viper, for instance, offers healthy max ERP/EIRP, but reportedly offers rather mediocre performance.

 

AJ


You are missing my point. People who live in markets with larger site spacing might disproportionately prefer device A, while people who live in markets with tighter site spacing might disproportionately prefer device B. That could be unlikely, but if true, it would constitute bias in the sample set.

Anecdotal evidence: people living in my (rural, widely spaced) area tend to prefer certain devices over others due to perceived signal strength and quality. The Motorola Photon sold disproportionately well compared to some more urban areas (yes, I checked the sales numbers). We've had a few EVO LTEs returned for similarly weak perceived service.


All in all, Sensorly's gathering methods are very sound. Although there is some variation between devices, the results Sensorly reports via crowdsourcing are superior to the carriers' own maps, especially Sprint's LTE coverage maps, which are grossly exaggerated.

 

Robert via Samsung Note II via Tapatalk

 

 

The Motorola Photon sold disproportionately well compared to some more urban areas...

 

I am not surprised that the Photon sold better than urban areas. City real estate tends to be expensive.

 

;)

 

AJ


You are missing my point. People who live in markets with larger site spacing might disproportionately prefer device A, while people who live in markets with tighter site spacing might disproportionately prefer device B. That could be unlikely, but if true, it would constitute bias in the sample set.

 

I think we're talking about different methodologies of comparison. What I'm saying is that the difference in measurements between devices at specific locations is what should be analysed. Say, on a busy stretch of highway, users of several different devices have sent in signal strength data points. The best way would be to take each point and compare it with all of the others within a certain radius, but it could be done more simply by dividing the highway into blocks. The blocks would be large enough to contain at least one or two points from each of the devices, but small enough to have a fairly consistent 'actual' signal strength. The average for each device within each block would be compared to the averages for the other devices in the block. Then, those differences could be aggregated and analysed.
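The block-based scheme described above can be sketched in a few lines: bucket readings into blocks, average each device within a block, take pairwise differences per block, then aggregate the differences. The highway samples, device labels, and one-mile block size are all hypothetical.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical samples along a highway: (device, mile_marker, dBm)
samples = [
    ("A", 0.1, -80), ("B", 0.2, -74),
    ("A", 1.3, -95), ("B", 1.4, -90),
    ("A", 2.6, -100), ("B", 2.7, -93),
]

def pairwise_block_deltas(samples, block_miles=1.0):
    """Average each device per block, difference devices block by block,
    then average those per-block differences across all shared blocks."""
    blocks = defaultdict(lambda: defaultdict(list))
    for device, mile, dbm in samples:
        blocks[int(mile // block_miles)][device].append(dbm)
    deltas = defaultdict(list)
    for per_device in blocks.values():
        means = {d: sum(v) / len(v) for d, v in per_device.items()}
        for d1, d2 in combinations(sorted(means), 2):
            deltas[(d1, d2)].append(means[d1] - means[d2])
    # Mean advantage (in dB) of d1 over d2 across the blocks both appear in
    return {pair: sum(v) / len(v) for pair, v in deltas.items()}
```

Because only within-block differences are aggregated, blocks covered by a single device contribute nothing, which is exactly why the location bias discussed earlier would only shrink the usable dataset rather than skew the comparison.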

 

The bias you're talking about wouldn't have any effect other than reducing the total useful dataset.


I have the data. I have done dozens of tests on 10 Sprint devices, including radio performance for 1x, EVDO, WiMax and LTE. I was prepared to do an article series on my testing, but now I'm holding it because the Jelly Bean update on the EVO seems to have altered its LTE connectivity with weak signals. The EVO's LTE signal used to bounce around by 10-12 dBm when it got weak, and it was impossible to keep weak LTE signals unless it was in LTE-only mode.

 

That being said, I can tell you that in my radio performance testing, the EVO LTE is not a top RF performer. In 1x and EVDO, my testing yielded middle-of-the-pack performance. In LTE, the EVO performed pretty well with strong signals, but near the bottom of the pack with midrange and weak signals.

 

But this may have changed with the improvements in Jelly Bean. Because the signal strength is no longer bouncing around, it will likely keep the LTE connection and perhaps even improve in performance.

 

Robert via Samsung Note II via Tapatalk

 

I have noted a couple of times, since my EVO LTE's upgrade, that the phone holds an LTE connection in places where it used to lose it. Of course that result might or might not be because of the upgrade, and the incident frequency hasn't been high enough in enough different places to draw conclusions.

 

I was hoping Sensorly would have enough data in some geographic locations to make comparisons at least interesting data points, if not definitive.

 

Bob


Purple in Howell, Michigan!


I just want to thank our sources here at S4GRU. You have to hide in the shadows and never get the public accolades you deserve. Thanks for all you do to give us a heads up about things like Howell. There is no S4GRU without you. Thanks for all of it.

 

Robert via Samsung Note II via Tapatalk


Somebody was in the Wendy's restaurant and turned Sensorly on for a moment. Too bad they did not let Sensorly plot their path as they departed. We need more Sensorly users, and the users we do have need to use it as much as they reasonably can. Sensorly does not help if it is not being used.


Somebody was in the Wendy's restaurant and turned Sensorly on for a moment. Too bad they did not let Sensorly plot their path as they departed. We need more Sensorly users, and the users we do have need to use it as much as they reasonably can. Sensorly does not help if it is not being used.

Heck, I've been driving/biking/walking around with it running just to expand LTE map boundaries and fill in the gaps! Gets me out and about to see the area more, and kind of fun to see the coverage map fill up with purple!


Heck, I've been driving/biking/walking around with it running just to expand LTE map boundaries and fill in the gaps! Gets me out and about to see the area more, and kind of fun to see the coverage map fill up with purple!

I remember pushing my son around in his stroller farther and farther from our home just to be able to test the app and how the maps were updating :)

