Saturday, November 15, 2014

14 Things People Get Wrong About Technology

I often read things and think, "Why did they say that? They are misleading people, and if they don't know it, then they don't know enough to be writing about it!" Sometimes it's just a common misconception, sometimes it's marketing, and sometimes it's "tech" authors skimming a pile of articles all at once in an attempt to seem like they know what they're talking about. But the worst part is that if enough people follow what has been written or said, they begin to believe the lies too. Here are some of the more obvious ones to me...

1) Macs Can't Get Viruses!
They can too! This one has been floating around for a long time, and many people still believe it. I have written about this before, but it's a biggie. The reason so many people believe it comes down to the simple fact that Mac viruses are uncommon. The most widely used OS in the world is Windows, so if you were someone who liked to create viruses, which OS would you target? Exactly. I've also said this before, but Apple used to have a line on the Mac site claiming that Macs did not get viruses. It was quietly taken down soon after a new piece of Mac-"friendly" malware went public.

2) USB 3.0 SuperSpeed+
The actual SuperSpeed name was designated for USB 3.0, which can theoretically do 5Gbit/s. SuperSpeed+ is the name used for USB 3.1, which doubles that to 10Gbit/s. The two are meant to be distinct from one another, but more often than not, people confuse the terms (and reasonably so). A large number of people think SuperSpeed means USB 3.1, while others think the name has simply stayed the same with a bump in speed. I doubt manufacturers, especially the Chinese ones churning out cheap devices, are going to be well informed on this, which makes it harder to just look at something and know whether it's 5Gbit/s or 10Gbit/s throughput.
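For a rough sense of what those line rates mean in practice, here's a back-of-the-envelope sketch (USB 3.0 uses 8b/10b encoding on the wire, while USB 3.1's 10Gbit/s mode uses the leaner 128b/132b encoding; real-world transfers will still come in lower because of protocol overhead):

# Rough usable throughput from the raw signaling rates.
# USB 3.0 (SuperSpeed) encodes 8 data bits in every 10 line bits;
# USB 3.1 Gen 2 (SuperSpeed+) encodes 128 data bits in every 132.
def usable_mb_per_s(gbit_per_s, encoding_efficiency):
    data_bits = gbit_per_s * 1_000_000_000 * encoding_efficiency
    return data_bits / 8 / 1_000_000  # bits -> bytes -> MB (decimal)

print(usable_mb_per_s(5, 8 / 10))      # USB 3.0 SuperSpeed  : ~500 MB/s
print(usable_mb_per_s(10, 128 / 132))  # USB 3.1 SuperSpeed+ : ~1212 MB/s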

3) Macs are Better For Designing
I am not trying to pick on Macs (or rehash items already discussed), but this one confuses me to this day. I continue to meet people who believe it, and it is frustrating to say the least. Firstly, there is a difference between technical superiority and personal preference. Just because you prefer something doesn't actually make it better.

Secondly, this rumor has somehow become a non sequitur in any conversation about Macs. For example, I will be discussing something or other about Macs with someone, and they will say, "Macs are better for graphic design." I have learned better than to argue, so instead I pose a basic question: "How are they better?" Usually I get an ambiguous response, or just a reiteration of the first statement (if I'm really lucky, an admission that they don't know). I then follow up with, "What specifically about a Mac makes it better?" By this point the other person normally has to give up. If they bring up hardware, I keep going and ask how that hardware makes Macs better.

Essentially, hardware is the only thing that could logically make a Mac better at design. But just because it is logical does not make it true. All the hardware in any Mac is available for PCs. Not only that, the same hardware is less expensive for a PC, and there are usually more (and, at least when first introduced, better) hardware options available for a PC. The only time I have read anything to the contrary was that way back in the day, graphics card bandwidth was actually slightly better on Macs, but nowadays the reverse is true...

Adobe doesn't make Photoshop and say, "Let's give Macs the better version." They come with the same features, end of story. What I'm saying is that there is no reason to think Macs are better for design unless it is your personal preference.

4) It's Better to Keep a Computer Running (Never Shutdown)
This is a myth that countless people just blindly believe. I'm not sure why, as it is just ignorant. Sorry, it is. This isn't to say there's never a reason to keep your computer on for lengthy periods of time. The most obvious: you're working on it, you will be back shortly, or you have a big download in progress.

The fact is that shutting down your computer saves energy and extends the life of your computer and its components. There is no harm in shutting down a computer if it's done properly. There may be some risk in constantly power-cycling it every few minutes, but that again comes down to the user.

I should also mention that a computer still draws a small amount of power when off, so some people figure that if it's going to use energy anyway, why not just leave it on. But this is true of anything plugged into a power source. If you actually unplug a device, instead of just turning it off, you save that much more power. This is where a surge protector (with a flippable switch for each outlet) comes in handy.
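If you want to put very rough numbers on it, here's a sketch. The wattages below are assumptions I'm plugging in purely for illustration, not measurements; check your own machine with a power meter for real figures.

# Back-of-the-envelope yearly energy use under assumed wattages.
IDLE_WATTS = 60          # assumption: desktop left on but sitting idle
SOFT_OFF_WATTS = 2       # assumption: shut down but still plugged in
UNPLUGGED_WATTS = 0      # switched off at the surge protector
IDLE_HOURS_PER_DAY = 16  # assumption: hours the machine goes unused each day

def kwh_per_year(watts, hours_per_day=IDLE_HOURS_PER_DAY):
    return watts * hours_per_day * 365 / 1000

print(kwh_per_year(IDLE_WATTS))       # ~350 kWh/year left running
print(kwh_per_year(SOFT_OFF_WATTS))   # ~12 kWh/year shut down
print(kwh_per_year(UNPLUGGED_WATTS))  # 0 kWh/year unplugged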

5) Base-2 & Base-10
You may not know what base-2 (binary) or base-10 (decimal) is, but chances are you have run into them more often than you realize. Both are ways of counting, and one of them is what hard drive and SSD manufacturers love to advertise with. The easy way to explain this is that base-10 counts 1MB (megabyte) as 1,000KB (kilobytes), while base-2 counts 1MB as 1,024KB. So a 3TB hard drive really is 3TB according to base-10 math. But once you put that 3TB drive into a Windows OS (Macs have displayed drive sizes in base-10 for a few years now), it will show something closer to 2.73TB. That's because Windows uses base-2 to calculate. (My actual 3TB shows 2.72TB; the small remaining difference comes from rounding and space reserved by the file system.)

There are valid reasons to use base-2, but the main point is that while most of us think in base-10, computers often do not. It also means you are not losing any storage space when buying and using an SD card, USB flash drive, hard drive, or SSD advertised in base-10 terms.
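If you want to check the math for yourself, here's a quick sketch of the conversion:

# Convert an advertised (base-10) capacity into what a base-2 OS reports.
def advertised_to_base2(size, unit_power=4):
    # unit_power: 3 for GB, 4 for TB
    bytes_total = size * 1000**unit_power   # manufacturers count in powers of 1,000
    return bytes_total / 1024**unit_power   # Windows counts in powers of 1,024

print(advertised_to_base2(3))                  # 3TB drive   -> ~2.73 "TB" in Windows
print(advertised_to_base2(500, unit_power=3))  # 500GB drive -> ~465.7 "GB"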

6) Java vs. JavaScript
Programmers tend to get upset when people who care little about programming use the terms interchangeably. While both are object-oriented programming (OOP) languages, the shared name does not mean they are related; it was a marketing decision. Java was made at Sun Microsystems, while JavaScript was made by people at Netscape and was originally called LiveScript. A major difference is that Java can be used to build standalone applications, while JavaScript traditionally has to live inside an HTML document.

And yes, that is the correct way to spell JavaScript. 

7) Brick or Bricking a Phone
This is more a pet peeve of mine, but I see and hear it all the time. A lot of people will say that they "bricked" their phone, yet it still operates in some capacity or another. Or that they bricked it and brought it back to life. The problem is that what they are actually describing is a "soft brick". Bricking, or a "hard brick", means that your phone is just like a brick: it does nothing. I have soft bricked my phone on numerous occasions, but I've never bricked it. So if someone is using a phone and tells you it was bricked at some point, it's far more likely they soft bricked it and recovered it with software than that they opened it up and actually repaired dead hardware.

RESOLUTIONS SECTION:

8) HD vs. Full HD
HD stands for high definition and covers 720p up through 1080p (and possibly 2K, since 2K doesn't seem to be counted as part of UHD). Full HD stands for full high definition and refers specifically to 1080p. This matters because of some of the other terms used below.

9) 720i
1080i is a common format for HD broadcasts. 1080i is interlaced and 1080p is progressive; the latter has better quality because of how it draws the picture. There is also 720p (arguably better than 1080i as well), and there are a lot of people who reference 720i. These all work on the same principles, except for one major difference: 720i does not exist.

You may be able to produce a 720i video through software, or have some option for it on older TV sets, but there are no source devices that should be able to create 720i natively.


10) qHD vs. QHD
This is another thing I have mentioned in the past, but it should be mentioned again, as I believe many smartphones will be sporting QHD screens in the next few years. QHD means Quad High Definition. It is sometimes confused with 4K, as many think that quad hi-def would be four times 1080p. Wrong, on at least two levels (one explained later)...

QHD (2560x1440) is actually four times 720p in pixel count. And QHD is really short for WQHD, which is Wide Quad High Definition. For those looking for an easy number to use instead, it's 1440p.

qHD (960x540) means quarter High Definition. To make things more confusing, and in contrast to what I just stated above, this is a quarter of Full HD. The only problem I have is that some people write qHD as QHD when it is not the same thing. But it's understandable, since the two actual meanings can be quite confusing.
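If the names make your head spin, the pixel counts make the relationships obvious. A quick check:

# Pixel-count sanity check for the "quad" and "quarter" naming.
resolutions = {
    "720p (HD)":       (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "qHD":             (960, 540),
    "QHD / WQHD":      (2560, 1440),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["QHD / WQHD"] / pixels["720p (HD)"])  # 4.0  -> QHD is quad 720p
print(pixels["qHD"] / pixels["1080p (Full HD)"])   # 0.25 -> qHD is a quarter of Full HD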

11) 4K vs. 1080p
This is a simple one that gets confused by loads of people, even friends of mine who are quite knowledgeable when it comes to technology. People assume that 4K must be defined as four times 1080p. Nope. 4K is actually four times 2K, and 2K is slightly larger than 1080p (the two often share the same height). So true 4K works out to a bit more than four times 1080p in pixel count, roughly 4.3 times rather than an even four.
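If you want to check the numbers yourself:

# Compare DCI 4K, DCI 2K, consumer "4K" (UHD) and 1080p by pixel count.
dci_4k  = 4096 * 2160
dci_2k  = 2048 * 1080
uhd_4k  = 3840 * 2160   # what most "4K" TVs actually are
full_hd = 1920 * 1080

print(dci_4k / dci_2k)   # 4.0   -> 4K is defined against 2K, not 1080p
print(dci_4k / full_hd)  # ~4.27 -> a bit more than four times 1080p
print(uhd_4k / full_hd)  # 4.0   -> consumer "4K" is exactly quad 1080p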

12) UHD vs. 4K
UHD is Ultra High Definition. People are using 4K and UHD interchangeably, and this too is wrong. While 4K is UHD, UHD is not just 4K; UHD also includes 6K and 8K. So 4K TVs are really just one part of UHD. It's a lot like having to make the distinction between an HD (720p) and a Full HD (1080p) TV. Trust me, I have been through the HD situation before, and it made it difficult for a salesperson to understand what I was actually looking for [1080p].

13) 4K UHD TV's Are Not Real 4K!
This is not something that bothers me, but it's good to know if you really want to squabble over what is what. 4K was not created by TV manufacturers; it was created by the Digital Cinema Initiatives consortium. The actual dimensions are 4096x2160 (note the "4096"), and it has its own encoding standard as well. While there are some TVs that can show content at 4096x2160, most TVs advertised as 4K only do 3840x2160. So, in reality, most "4K" TVs cannot even manage a width of 4,000 pixels...

The Consumer Electronics Association decided to create the term UHD in 2012, several years after 4K had already been coined, and it covers any content of 3840x2160 and up. This is where the 4K confusion comes from. Many times TV manufacturers will label their "4K" TVs as "4K UHD" TVs, which is a sly way of admitting that their 4K is not the real 4K.

If you want to annoy people who have fake 4K televisions (and won't admit it), you can point out that your 1080p TV is a 2K TV, since its resolution is nearly the same as DCI 2K.

What I find weird about this is that someone suddenly decided that instead of using the height as the indicator, as with 1080p (1920x1080) or 720p (1280x720), the width should now be used...

14) 2K is What Dimensions?
2K seems to have varying dimensions, so this is one instance where you can be right and wrong at the same time! In digital video, 2K is 2048x1080, which is just a tad wider than 1080p video. I normally use this as my reference since I deal in digital video. However, there are plenty of others:

The "reference" resolution is 2048x1536. The full-aperture resolution (I see now being used on smartphone screens) is 2048x1556. There is also the HDTV 16:9 aspect ratio resolution, 2048x1152. And finally, 2048x872, a Cinemascope 2.35:1 aspect ratio resolution. Maybe this is why it is hard to class 2K as either part of HD or UHD...

GEEK BONUS:

Zelda, the Hero!
No, she's not. If you have only ever heard of The Legend of Zelda, you might assume, like many people, that Zelda is the name of the hero of the game. Actual players know otherwise: Link is the name of the hero, while Zelda is the princess he must save.
