

Posts posted by MrZorbatron

It is just filling out a form on the IRS website, and you get your EIN at the conclusion. When I got mine several years ago, I think they had to mail it to me, but it was quite easy besides that.

     

    EDIT- As a sole proprietor it is very easy.

    You get a PDF copy immediately and then it comes in the mail within a few days. This is true even as a C Corp.

My understanding is that it only uses as much spectrum as you want the end user to have access to. So if you want each end user to have 10 MHz, then you only deploy 10 MHz. Now, backhaul is a big beast with this, but if I am understanding his claims correctly, Sprint could basically get away with using almost none of their spectrum and still have more end-user bandwidth than they have now. I will definitely be following this.

Actually, backhaul won't be as big a beast as you might think. Given the processing they're using their data center for, it looks like that infrastructure would do double duty as backhaul, just as it does for his radio signals.

     

    I, on the other hand, am talking about spectrum reuse between technologies. Namely, this network, and conventional LTE networks.

His video claims that a phone can switch between pCell and conventional cellular. See around 45:00. How it does that, I don't know.

    Yeah, I saw that, but didn't see details. LTE and CDMA2000 hand off too, but it isn't a handoff of equals. Same with LTE and UMTS, or UMTS and GSM. 800 MHz LTE hands off with 1900 gracefully, but now we are talking about two different bands. If this is the case, and I am over at Sprint trying to deploy this, will it use up all of my 2500 MHz spectrum and preclude me from deploying conventional LTE on anything but 800 and 1900?

Thanks for the link; it provides slightly more information than the initial demo video posted yesterday, though we'd love to know more about the underlying technologies.

     

    ...

    Very slightly more, unfortunately.

     

     

     

    Here's a simple question...  How will this work in conjunction with more broad-reaching LTE solutions?  The system as presented would work well in a city, but not in the middle of nowhere.

     

I live in a state with large rural areas which are covered by sites spaced at the limits of their radio technologies.  I see lots of areas with 5-10 mile site spacing and 300-400' antenna heights, whose radios are running at their maximum possible design power.  Some of these cells should be split due to changes in capacity requirements, but it rarely happens.  This leads to LTE speeds of pretty much either 25 Mbps or 5 Mbps, with little in between.  Either the cell is overloaded but not quite enough to be split, or it's just sitting there providing great coverage to the farmer's cattle and seeing 10-20 actual device connections per day.

     

    If this truly is compatible with conventional LTE technologies at the phone's end, how will it handle the handoff with conventional networks?  I would hate to see a separate band required, but would hate even more to see a zone of alienation between the two LTE technologies to prevent interference.  Even though your system is intended to leverage interference to its own benefit, I cannot see it being able to do so with that which is caused by neighboring conventional cell sites, as the signaling would not be precisely known.  This could present some major spectrum issues, thereby substantially increasing operator expense.  Frequency reuse is a headache now.  I would hate for it to become a migraine.

I don't know if I would call BS on this just yet.  If it is for real, then watch out.  This guy has been behind some nice projects in the past, and they have some pretty slick demos.  That said, there is a lot of vaporware in the world; with any luck this will be concrete.

    And there has been some nice vaporware with his name on it.  Like I said, I would like to see this work, and if it does, for someone to put it into service.

     

It just feels too much like a Quixtar seminar to me, especially in how so much talking tells so little about what's really (or allegedly) going on.  I wonder how many radios one will have to sell to hit gold diamond whatever status.

I'm calling BS on this. I've never had my B25 LTE pings higher than 75 ms. They are usually sub-58 ms. Lately, they've been down in the 30 ms range. As for EVDO pings, pre-NV they were always high. Now they're mostly below 100 ms unless it's an overburdened site/sector or a legacy site.

     

I see 65-120 ms generally on LTE, 80-140 ms on EVDO/eHRPD, and about the same with Rev. A.

  7. Simple overloading.

     

[Attached image: uploadfromtaptalk1392876927980.jpg]

     

These are normal LTE speeds for me in northwest/west San Antonio. None of the speeds shown had an LTE signal strength of less than -85. And if I fall back to 3G, forget about it. Does anybody know what kind of issues could be behind these speeds? Backhaul, or possibly saturated 1900 MHz spectrum?

     

    I should add that I kept changing servers in an attempt to get better results, but it ended up making them worse.

     

    Sent from my Nexus 5 using Tapatalk

  8. Unless someone else is using those same Powerwave antennas, I'd guess that's a Samsung high capacity NV site.

    I see nothing like that here.

     

    I don't see anything that strikes me as NV equipment. Have you checked out the NV equipment spotting threads? Ericsson NV equipment is very easy to spot once you've seen it.

The one below the center rack looks like Sprint legacy equipment.

Which is really unfortunate for those of us in West Michigan. While I agree that Band 41 provides more than enough capacity for the medium to long term, in the short term we have limited deployment and low penetration of Band 41 capable devices. If nothing else, converting Clearwire sites to full 800/1900/2500 NV sites would help some of the cell edge and in-building woes we suffer around here.

I absolutely HATED Sprint's in-building CDMA performance the last time I spent a few days in GR. Even when I had good signal, it dropped calls badly, and it didn't take more than a move of a few feet to turn 3 bars into 1 or none. Totally awful. It seemed to have improved the last time I was out there, which was about a year and a half ago, but I wasn't in as many places or there as long as I was about 4 or 5 years ago.

He is not going to release any technical details because people (read: Chinese, Koreans) will clone it and he will have to stop them in court. As I said, let it play out. I will let those in SF, and then NYC, test it for me. Or the technical sites.

     

    He could give a little bit of real information and just stay away from the sensitive parts.  I don't expect a book on it.  Enough content to fill an index card (unillustrated) would be nice.

It is OK to be skeptical. As an old EE, trained in classical RF, I was highly skeptical of multipath, QAM, cross-polarization, beam forming, etc. But they work. It could be an extreme case of beam forming!

     

The man is putting his money where his mouth is. He is building a network in San Francisco, then NY, then other large cities. The proof is in the pudding. Let it play out!

     

Well, multipath has yet to be used to any seriously helpful effect, though it has been pretty well mitigated in many cases on modern airlinks.  802.11n is the standard that comes closest to trying.  QAM has come a long way from what it was when it was first brought out, and has far more potential than most traditionalists expected.  I have no issue with cross-polarization as used to retain orthogonality, though in my first post here I said "intersecting" because I am not even sure of the proper word to use.

     

    Pretty much it sounds like he is talking about a selectively phased and polarized signal that would be detected correctly only in a specific location, and the components of which would be encoded in such a way that in other locations and combined with other components of other signals, they would carry different data.  Kind of like a "beam forming plus" where you are actually using different parts of a signal in different ways.  Phased array radar came to mind for a bit while I was half asleep thinking about this last night.  I still can't see this working with a conventional LTE radio on the device, which he claims to be using.  The device would be extremely sensitive to physical orientation as well as position.  How would position be tracked with such granularity?

     

    I love finding ways to bend/break the rules. I do not have any problems with others who find ways to do things other than those that come to my mind.  I just like technical explanation and not technical-sounding buzzwords and "consumer-grade" high tech terms coming from professionals.

  12. I have a hard time with the fact that the people he is demoing this for are journalists and mostly not particularly technical. In the second link, it is stated that his machine sent a signal directly into a device. How do we know it isn't omnidirectional? How do we know that the signal in that mysterious little bubble around the device is at all dissimilar to the signal a few inches, feet, or yards away?

     

Eight high-definition streams to eight iPhones? Any old DWL-2100AP could handle that. The 4K video? Do we trust that it's actually 4K and not just UHD? Either way, a 450 Mbps 802.11n setup could handle that before its morning coffee. Sounds like he's demoing Wi-Fi to me. Take the device apart and post a few pictures. Are there specialized chipsets in there, or is it Broadcom or perhaps Atheros?

     

There's too much obfuscation and not enough explanation in all of this. Everything here reads like an attempt to keep us from understanding, perhaps to secure the investments of idiots.

     

    Now, am I a total radio specialist? No. I am an electrical engineer with a hobby in radio, whose business includes high speed radio connections. I would just as soon believe in subspace radio as in this.

  13. This is complete nonsense... bullshit even.

     

First off, backhaul requirements would be insane. Second, it would have to take advantage of some sort of intersecting polarizations or something, which would be insanely delicate to set up and maintain. Third, the processing requirements, at both the head end/central office and the phone/device, would be huge. Fourth, it would need a lot more than a new SIM card; it would need an entirely new type of radio. Fifth, the speeds he is suggesting would require such a complicated modulation scheme that interference would become a huge factor; even something as simple as multipath would make it unusable.

I don't use them, but I see where they could be useful on the road -- sort of like a gas gauge. I would love an RSSI or RSRP number twice the size.

    If I didn't want the bars... Well, CDMA Field Test is free. Two of my reasons for choosing Signal Check Pro were the simplified geolocation and the signal bars.

     

    As far as bigger notification items, that would really suck with the stupid huge 4G thing on the newer Samsung firmwares.

  15. Ok, didn't see your chart there.  It pretty well agrees with my findings.

     

Incidentally, any idea what each of Sensorly's 5 color grades means?  Their breakpoints seem to be:

> -75, > -85, > -95, > -110, < -110

but this is just a "seat of the pants" feeling.

     

Also, I do agree that the numbers are accurate for everyone.  I just feel like the bars show much more usable signal than there really is.  I am divided on which I display more often.  Sometimes the bars are easier, so I don't have to look closely enough to read the tiny status-bar-sized numbers (please do not make them bigger; this is not what I mean).  I think the bar issue is especially true on some relatively LTE-deaf devices -- the Kyocera Torque and Evo LTE, for two.
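Those guessed breakpoints amount to a simple threshold lookup. A minimal sketch (both the cutoffs and the numeric grade labels here are my "seat of the pants" guesses from watching the map, not anything published by Sensorly):

```python
# Guessed Sensorly LTE color-grade breakpoints, in dBm.
# Purely an estimate from map observation, not Sensorly's real scale.
BREAKPOINTS = [(-75, 5), (-85, 4), (-95, 3), (-110, 2)]

def sensorly_grade(dbm: float) -> int:
    """Return a guessed color grade, 5 (best) down to 1 (worst)."""
    for cutoff, grade in BREAKPOINTS:
        if dbm > cutoff:
            return grade
    return 1  # below -110 dBm
```

So a -90 dBm reading would land in the middle (third) color band under this guess.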

Are you referring to the SignalCheck icon bars? They are shown on an RSRP scale based on technical articles by AJ and others on here. If you have a suggestion or example, I am always open to adjustments. Having that as a user option seems unnecessary; X dB is X dB, regardless of device or region.

     

    -Mike

     

    Well, it shows all bars down to about -90 RSRP or so.  I would personally define the scale as more like:

    6 @ > -75

    5 @ > -82

    4 @ > -89

    3 @ > -96

    2 @ > -103

    1 @ > -110

    0 @ < -110

     

    Or even:

    6 @ > -75

    5 @ > -83

    4 @ > -91

    3 @ > -99

    2 @ > -107

    1 @ > -115

    0 @ < -115

     

I do understand that these bars are showing signal level on a scale relative to the levels required to get maximum performance, but I would really rather see it on a scale of absolutely how much is there.  A simple scenario:  I frequently use my phone to do certain types of file downloads.  I then plug it into the computer to copy the file over.  According to Sprint, this is a perfectly acceptable alternative to tethering.  These files are frequently device drivers, etc., and tend to be about 5-20 MB.  Sometimes I do them in the car on the way somewhere.  Showing more signal range graphically would present more of an actual impression of how far I can go in an area with limited LTE coverage before my download switches over to CDMA and then either stalls permanently or continues at a very low rate.  If I see 5 bars when I start, I know I have at least a couple of minutes of usable signal before I get out of range, and odds are my download will be complete well before then.  This is just one example.

     

Right now, it still shows bars while the signal is unstable to unusable.  Phones do not hold signal reliably when the program is showing 1-2 bars, because that is borderline on usability.  Presence of signal on the very edge of usability should be indicated by zero bars.  It's like the old phones: the manuals even explained that while the tower symbol alone indicated signal presence and communication with the control channel, and calls might be possible, those calls would be subject to increased interference and higher odds of disconnection.

     

    Now, since I got the Note II, I am not seeing any of this allegedly highly superior RF performance.  It seems to me that it is very marginally better than the GS3 that I used to have.  It does seem to hold onto weak LTE better before giving up and going back to EVDO, though it does not achieve the same speeds and usability that the GS3 did on a weak signal of -115 or less.

     

    For example, right now approximately:

    >-102 shows 5 bars

    >-107 shows 4 bars

    >-111 shows 3 bars

>-115 shows 2 bars

<-120 shows 0 bars

     

    Incidentally, I do know that decibels are logarithmic, but the signal indicators on the vast majority of phones are not calculated with that taken into account, and the rate of falloff is numerically pretty linear with distance, anyway.
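The first proposed scale above boils down to a threshold table. A minimal sketch of that suggestion (these thresholds are the ones proposed in this post, not SignalCheck's actual algorithm), with a helper that makes the logarithmic point concrete:

```python
# Sketch of the first proposed RSRP-to-bars scale above.
# Thresholds are the suggestion from this post, not SignalCheck's
# real algorithm.

THRESHOLDS_DBM = [-75, -82, -89, -96, -103, -110]  # 6 bars down to 1 bar

def rsrp_to_bars(rsrp_dbm: float) -> int:
    """Map an LTE RSRP reading (dBm) to 0-6 bars."""
    for bars, threshold in zip(range(6, 0, -1), THRESHOLDS_DBM):
        if rsrp_dbm > threshold:
            return bars
    return 0  # at or below -110 dBm

def dbm_to_mw(dbm: float) -> float:
    """Decibel-milliwatts to absolute milliwatts: 10^(dBm/10)."""
    return 10 ** (dbm / 10)
```

On this scale, a -80 dBm reading shows 5 bars; and since each 7 dB step is roughly a 5x change in absolute power, the even spacing in dB is anything but even in received power.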

  17. Would it be wise at this point to fork SignalCheck?  Maybe a SignalCheck One app for 3GPP and/or 3GPP2 single radio path devices, a SignalCheck Two app for SVDO and/or SVLTE devices?  Or can that be seamlessly accomplished inside just one app?

     

    AJ

    That could probably be done with a menu selection for device type.

     

I would like an option to adjust the LTE dBm-per-bar algorithm, because it always shows too many bars for the level of LTE signal present when the status bar indicator is set to bars rather than numbers.

The top one kind of looks like some Nextel iDEN panels I saw that were installed very late in iDEN's life, as replacements or at new sites -- like post-2008. Do you see these often?

     

    Robert via Nexus 5 using Tapatalk

     

    Also looks a lot like what Sprint used to use (still does on some GMOs) in a few places near me.  Two ports at the bottom of a narrow, shallow unit.

That shouldn't be a problem unless you want to stream HD video to your 80-inch TV. If so, then you should pay more.

     

    Jim, Sent from my Photon 4G using Tapatalk 2

I've run 480-line video of Nova and Frontline from PBS.org on my MB855, via HDMI to my television, over CDMA. The picture actually isn't half bad. It isn't like I do it all the time, though. That day, my internet connection took a dump while I was watching Nova online, and I was really sick. I think I used about 4 hours and hit about 1.5 GB on that line all month.
  20. ...

     

Licensing and coordination are pretty easy; you just need to do it.

     

    ...

     

    FCC is easy, just paperwork and waiting.

     

Problem is that there are too many people who would "play" with this hardware without knowing (or often without caring about) the potential implications. They might not know what we know, both of us being in the telecommunications industry, but they would still know enough to make it work (or at least to cause enough trouble attempting to do so).

     

I read some of the ARRL's copies of FCC enforcement actions. It's really quite amusing to see the amount of havoc one can cause with a simple MF, HF, or VHF rig at a couple hundred watts.
