week-9

Wireless Broadband

I’ve always wanted to learn the differences between the different generations of wireless broadband networks, especially with 5G on the horizon. This week is the perfect week to do a bit of research and write it up here!

1G

1G, not surprisingly, was the first wireless network to be rolled out, launching in Tokyo, Japan in 1979 and spreading through the 1980s. 1G is an analog system: when someone speaks into a handset on a 1G network, their voice is modulated onto a high-frequency carrier and sent to a radio tower for transit. Because the signal is analog and unencrypted, you could listen in on other people’s calls if you were savvy enough.
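To make the modulation idea concrete, here’s a toy Python sketch of frequency modulation: the voice signal swings the instantaneous frequency of a carrier. All the numbers are made-up illustrative values, scaled far below the real 800/900 MHz bands 1G actually used, just so the sketch is easy to sample.

```python
import math

# Toy sketch of analog frequency modulation (FM), the idea behind 1G:
# the voice signal shifts the instantaneous frequency of a carrier.
# Illustrative numbers only -- real 1G carriers sat around 800/900 MHz.
CARRIER_HZ = 1000.0    # illustrative carrier frequency
DEVIATION_HZ = 200.0   # how far the voice swings the carrier
SAMPLE_RATE = 8000.0   # samples per second for this sketch

def fm_modulate(voice):
    """FM-modulate a list of voice samples in [-1, 1]."""
    out, phase = [], 0.0
    for v in voice:
        # instantaneous frequency = carrier + deviation * voice amplitude
        inst_freq = CARRIER_HZ + DEVIATION_HZ * v
        phase += 2 * math.pi * inst_freq / SAMPLE_RATE
        out.append(math.cos(phase))
    return out

# a short 300 Hz tone standing in for a voice
voice = [math.sin(2 * math.pi * 300 * n / SAMPLE_RATE) for n in range(80)]
signal = fm_modulate(voice)
print(len(signal), "modulated samples")
```

A receiver would recover the voice by tracking how fast that phase is changing, which is exactly what made 1G calls so easy to eavesdrop on: anyone with an FM receiver tuned to the right carrier could demodulate them.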

2G

2G was a lot like 1G except that it uses a digital signal instead of an analog one. It launched in Finland in the early 1990s. Instead of just modulating the caller’s voice to a higher frequency like 1G, 2G digitally encodes and encrypts the caller’s voice so that an intercepted call can’t be understood in any meaningful way; only the receiver can unscramble the message.
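To see why an encrypted digital call is useless to an eavesdropper, here’s a toy stream cipher in Python. GSM’s real cipher (A5/1) builds its keystream from shift registers; this sketch stands in a seeded PRNG, and the key and voice bits are made up.

```python
import random

# Toy stream cipher: like GSM's real A5/1, it XORs the digitized
# voice bits with a keystream derived from a shared secret key.
# (A5/1 uses linear feedback shift registers; this sketch just
# uses Python's seeded PRNG as a stand-in keystream generator.)

def keystream(key, n):
    rng = random.Random(key)          # both ends seed with the shared key
    return [rng.getrandbits(1) for _ in range(n)]

def xor_bits(bits, ks):
    return [b ^ k for b, k in zip(bits, ks)]

voice_bits = [1, 0, 1, 1, 0, 0, 1, 0]   # a digitized voice frame
key = 0xC0FFEE                           # hypothetical shared secret

ciphertext = xor_bits(voice_bits, keystream(key, len(voice_bits)))
recovered = xor_bits(ciphertext, keystream(key, len(voice_bits)))
print(recovered == voice_bits)  # the receiver with the key gets the voice back
```

An eavesdropper who intercepts `ciphertext` without the key just sees scrambled bits, which is exactly the property 1G’s raw analog signal lacked.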

2G networks went through a series of partial version bumps:

2.5G implemented a different type of routing (packet switching instead of circuit switching). It didn’t end up improving the service too much, although switching from circuits to packets on a high-traffic network should generally free up resources, so perhaps the improvements were just minor.

2.75G improved data transmission rates by using a different type of packet encoding.
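The resource argument behind 2.5G’s packet switching can be sketched with a classic back-of-envelope comparison: circuit switching reserves capacity for the whole call, while packet switching only uses the link when a user is actually sending. The link and user numbers below are illustrative, not from any real network.

```python
from math import comb

# Why packet switching frees capacity on bursty traffic.
LINK_KBPS = 1000      # shared link capacity (illustrative)
USER_KBPS = 100       # each user needs this while active
P_ACTIVE = 0.10       # fraction of time a user is actually sending

# Circuit switching reserves bandwidth whether or not it's used:
circuit_users = LINK_KBPS // USER_KBPS
print("circuit-switched users supported:", circuit_users)

# Packet switching admits more users and only congests when too many
# happen to transmit at once.  With 35 users, P(more than 10 active)
# follows a binomial distribution:
n = 35
p_congest = sum(comb(n, k) * P_ACTIVE**k * (1 - P_ACTIVE)**(n - k)
                for k in range(circuit_users + 1, n + 1))
print(f"packet-switched, {n} users, P(congestion) = {p_congest:.6f}")
```

With these numbers a circuit-switched link caps out at 10 users, while a packet-switched one can carry 35 with only a tiny chance of momentary congestion, which is the “freeing up resources” effect mentioned above.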

These 2G networks were rolled out across the Americas, and many were only decommissioned in the last couple of years. For example, AT&T’s 2G network was fully decommissioned in 2017, and Verizon’s this year in 2019.

3G

An international standard called the International Mobile Telecommunications-2000 specification (IMT-2000) was formulated and became the backbone of 3G networks. With the rise of the mobile internet, companies jumped at the chance to implement a universal network layer that could send and receive internet-sized packets at a reasonable (for the time) speed.

The IMT-2000 standard guaranteed a speed of no less than 0.2 Mbit/s. A company could not claim to be selling a 3G network that did not satisfy this requirement.

The first 3G network was again launched in Japan, in 1998. It caught on with large carriers like Verizon and AT&T in the early 2000s; however, because 3G sometimes uses different frequencies and new equipment, companies were slow to adopt it, since upgrading from their existing 2G towers meant building new infrastructure.

3G offered better security standards, making sure a handset authenticated the network it was connecting to before beginning transmission. It also used a different cipher to encrypt its messaging, though that cipher (KASUMI) was later found to have weaknesses.

4G

A group called the International Telecommunication Union Radiocommunication Sector (ITU-R) came together to create the specifications for a 4G network. A 4G network must be able to transmit at 100 megabits per second to a user in high-mobility conditions (on a train, in a car, etc.) and 1 gigabit per second to a stationary user. The network must be an all-IP, packet-switched network. The spec also includes requirements around smooth hand-offs between networks and suggestions about how resources should be shared to support the maximum number of concurrent users.
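Those headline rates are easier to feel as download times. Here’s a rough Python comparison using the 0.2 Mbit/s 3G floor, the two 4G targets above, and the 20 Gbit/s peak the IMT-2020 5G spec touts; the 50 MB file size is just an illustrative app-sized download.

```python
# Rough best-case download times for a 50 MB file at each generation's
# headline rate.  Real-world speeds are far more variable.
FILE_MBIT = 50 * 8  # 50 megabytes expressed in megabits

rates_mbps = {
    "3G floor (0.2 Mbit/s)": 0.2,
    "4G, moving (100 Mbit/s)": 100,
    "4G, stationary (1 Gbit/s)": 1000,
    "5G peak (20 Gbit/s)": 20000,
}

for label, rate in rates_mbps.items():
    print(f"{label:28s} {FILE_MBIT / rate:10.3f} s")
```

That’s over half an hour at the 3G floor versus four seconds on mobile 4G, which is the gap that made mobile video and app stores practical.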

The first 4G network is hard to pin down, as several countries demonstrated 4G speeds on test networks, but Japan is again at the top of that list in the mid-2000s. 4G networks began rolling out in the US in 2008/2009, with Sprint, Verizon, and AT&T among the players.

Early 4G didn’t actually meet the speed standards set out by the standards commission, but the implementation was in place and speeds increased over time to reach them. This is the standard most cell phone carriers use today.

A common misconception is that 4G LTE is a better version of, or equal to, 4G. In reality it turned out to be a way for slimy mobile companies to advertise something with “4G” in the title without actually having to reach the 4G standards set out by the standards commission, because LTE is tacked onto the end.

LTE-A (LTE Advanced) might be the closest thing on the market to the true 4G standard, but it still does not reach it in full.

5G

5G is not yet out, but optimists are hoping to see networks roll out in 2020. The IMT-2020 specification (the 5G spec) touts peak speeds of 20 Gbit/s.
5G networks achieve this high data transfer speed by using a wider range of frequencies: low and mid bands from roughly 600 MHz up to 6 GHz, plus much higher millimeter-wave bands above 24 GHz. With that much radio bandwidth available, things like augmented reality and VR for mobile become a much more exciting prospect than they would be on 4G.

There are some issues with those high-frequency waves, though: they have trouble passing through buildings and are even absorbed by atmospheric gases, so antennas may begin popping up more densely across cities (on top of power poles and the sides of buildings) to receive and relay 5G signals. Packing towers with large arrays of these antennas is called Multiple Input Multiple Output, or MIMO. Companies leading the charge here in the US are Qualcomm, Nokia, Cisco, and Samsung, some heavy hitters to say the least.
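The propagation problem follows straight from wavelength: λ = c / f, so higher-frequency signals have shorter wavelengths and are more easily blocked and absorbed. A quick calculation, using three frequencies I picked as representative of the 5G low, mid, and millimeter-wave bands:

```python
# Wavelength falls as frequency rises (lambda = c / f), which is why
# millimeter-wave 5G signals are blocked by walls and absorbed by
# atmospheric gases far more easily than lower-band signals.
C = 299_792_458  # speed of light, m/s

for label, freq_hz in [("700 MHz (low-band)", 700e6),
                       ("3.5 GHz (mid-band)", 3.5e9),
                       ("28 GHz (mmWave)", 28e9)]:
    print(f"{label:20s} wavelength = {C / freq_hz * 100:7.2f} cm")
```

A roughly 1 cm wave interacts with obstacles that a 43 cm wave sails around, which is why the millimeter-wave bands need that dense mesh of small antennas instead of a few big towers.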

Taxpayers have been dumping plenty of money into the pockets of many of these large telecommunications companies, so it would be a real disappointment if we didn’t begin to see some of these systems roll out over the next few years, although I wouldn’t be surprised.

These paragraphs were researched from:
https://en.wikipedia.org/wiki/1G
https://en.wikipedia.org/wiki/2G
https://en.wikipedia.org/wiki/3G
https://en.wikipedia.org/wiki/4G
https://en.wikipedia.org/wiki/5G

Lecture

The lectures this week were a bit harder to follow, as they were a bit less lively than the others. The instructor seemed to be reading from the slides more than in previous classes; however, the lecture topics were a nice cherry on top. When I was listening to Professor Ruiz discuss the FakeInstaller malware, I realized how far I’d come in this class. He identified the malware from Russia as polymorphic, and I actually knew what he was talking about! The malware would dynamically change itself on each install so it would be incredibly hard to pin down from a virus-protection perspective.

I also thought it was interesting how he discussed mobile malware trends over time. Mobile malware in the late 2000s was microscopic compared to the boom that happened in the early 2010s. This is likely because of the addition of mobile app stores as well as the surge in 4G network availability.

While I find the topics of mobile security interesting to a degree, I think the biggest takeaway for me is that mobile users are far more likely to tap or click on interesting-looking things because there is so much less taboo around mobile malware than there is around desktop viruses. For years it’s been beaten into our brains not to open any sketchy attachments or click on any weird links, but that level of caution is somewhat dampened with the pocket-computer model.

Everything is quickly available on mobile these days, and while some phones take precautions, like defaulting to HTTPS for example, users are (IMO) even more vulnerable to phishing and fake signup scams because it just seems like less of a risk to tap on interesting things on your phone. App stores can only do so much to flag malware. Especially on Android, where anyone can upload applications, it’s just too easy to push malicious code and have people download it thinking it’s been 100% vetted by Google or Apple.

I’m looking forward to getting back into Hack The Box for the final, so I will leave you now. Thanks for reading this quarter! It’s been a blast!