Explainers Archives - GadgetMatch https://www.gadgetmatch.com/category/sections/features/explainers/ We help you find the right gadgets to match your lifestyle

ChatGPT Explained: Should we be scared of AI? https://www.gadgetmatch.com/chatgpt-explainer-language-learning-should-we-be-scared/ Thu, 09 Feb 2023 08:18:55 +0000 Will the talking robot take over the world?

The post ChatGPT Explained: Should we be scared of AI? appeared first on GadgetMatch.

Back in the earlier days of the internet, a short-lived trend involved chatbots that could hold a conversation with whoever talked to them. Does this sound familiar? Today, a similar phenomenon is creating a lot of waves online, headed by ChatGPT. The exceedingly popular software is turning heads out of fear that the technology will eventually upend society and eradicate a lot of jobs.

But what exactly is ChatGPT? How is it different from the language programs of the past? Is the world right to worry about it?

On the rise of language learning

ChatGPT is hardly the first software to generate comprehensible dialogue without human intervention. Decades ago, the internet hosted rudimentary versions of today’s chatbot technology, and the concept was somewhat similar. The early versions relied on a database of responses from human users. If you asked about coffee, for example, the answer you got likely came from the logs of another user who had talked about coffee in the past.

Because the system was imperfect in its infancy, part of the appeal was trying to get the software to fumble a conversation. However, if it did mess up, you could count on it asking what it should have said. The next time someone asked the same question, the software might mirror your answer, creating a learning loop between the software and its users.
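The log-and-mirror loop described above can be sketched in a few lines. This is a toy illustration, not any specific chatbot's implementation; the class name, keywords, and canned responses are all made up.

```python
# Toy sketch of an early log-based chatbot: answers come from logged human
# responses, and the bot "learns" by storing corrections from users.

class LogChatbot:
    def __init__(self):
        # keyword -> response previously logged from a human user
        self.logs = {"coffee": "I drink mine black."}

    def reply(self, message):
        # Reuse a logged human answer if any known keyword appears
        for keyword, response in self.logs.items():
            if keyword in message.lower():
                return response
        return "What should I have said?"  # prompt the user to teach it

    def teach(self, keyword, response):
        # The next user who mentions this keyword gets the mirrored answer
        self.logs[keyword.lower()] = response

bot = LogChatbot()
print(bot.reply("Do you like coffee?"))  # reuses a logged human answer
print(bot.reply("What about tea?"))      # no match: asks to be taught
bot.teach("Tea", "Green tea, always.")
print(bot.reply("What about tea?"))      # now mirrors the taught answer
```

The "database" here is a single dictionary, but the flow is the same one the article describes: match, respond, and fall back to asking the human for the right answer.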

Today’s chatbots — meaning those usually deployed by businesses — operate in the same way. If a customer comes with a query, the software relies on a set of canned responses to address the user’s problem as appropriately as it can. If it can’t come up with a solution, the ball usually gets passed on to a human consultant.

Is ChatGPT just another chatbot?

Though the label certainly gets thrown around, ChatGPT isn’t strictly a chatbot. Instead, the software uses GPT-3.5, a large language model created by OpenAI. Whereas earlier, more rudimentary versions of the technology could already store an unbelievable amount of information in memory, ChatGPT can analyze billions of words and the relationships between them.

Further, OpenAI extensively trains the software, ensuring that its comprehension and grammar live up to today’s standards. The learning is supervised: the company even uses a reward system to steer the software toward the most appropriate responses. With users also contributing to the learning process, ChatGPT is quickly emerging as a powerhouse for the technology.

The results speak for themselves. While users can generate simple conversations with the software, ChatGPT can just as easily answer more extensive queries with lengthier responses. If you ask it to create an essay about Christopher Columbus, for example, it can write a lengthy piece that can easily fool a casual reader. It can even handle more speculative queries. In a sample published by the developer, ChatGPT can answer what would happen if Columbus discovered America in 2015.

What’s it good for?

Based solely on what the software can do, ChatGPT can find its purpose in today’s world. The software can improve voice assistants and chatbots all over the internet. It can make big strides in the world of automation, enabling a more responsive interface between user and software.

On a more human note, the software can also handle simpler professional tasks, such as writing basic marketing copy. It can help with cursory research, giving users simple answers to otherwise complex questions.

And, on the more technical side, ChatGPT can reportedly analyze a piece of code and detect what’s wrong with it. Developers can use the software to potentially repair code without having to pore over every single line. A powerful tool that can inspect code has implications for applications all over the world, including smart vehicles and industrial machinery.

However, as with every piece of technology, users will always find a way to push something beyond what it was originally designed for. ChatGPT is now shaking up education, as students use the software to do their homework for them. Though many of the sample texts look like they would fool only the lower levels of education, a Wharton business school professor (via Business Insider) recently stated that a ChatGPT essay would have fooled him, grading a sample a passable B or B-.

Should we be scared of ChatGPT?

ChatGPT is undoubtedly rocking the world of education. While some schools have banned the technology outright, others are debating the software’s impact on how students are taught. Since ChatGPT deals in readily available factual information, could education reinvent itself around more personal, tailored learning rather than the ability to spit out memorized facts? (“Factual” might even be an exaggeration. CNET, which recently experimented with AI-written articles, discovered a plethora of errors from using the software.)

Now, education isn’t the only world in peril. The creative industry faces an extreme challenge: ChatGPT can potentially cost workers their jobs. Though the danger certainly seems real, the limitations of the technology are just as real. ChatGPT can create comprehensible text that can fool a human, but it will likely stumble at conceptualization.

A piece of software is just software. Even if it can write an essay about existentialism, it cannot think of the concept metaphysically. In the same way, even if it can show you a photo of a parrot, it cannot think of that photo as anything but a pattern of pixels. To a language learning software, words don’t mean anything else besides their relationship with each other. It’s the same thought process as a dog learning to run to its human when its name is called. The dog doesn’t know that you just said its name (or even the mere concept of a name); it just knows to do a certain action after hearing a specific sound.
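To see how "relationships between words" can produce plausible output without any understanding, here's a toy bigram model. It's a drastically simplified stand-in for what large language models do (they model far longer-range relationships), and the tiny corpus is made up for illustration.

```python
# A bigram model learns only which words follow which other words.
# It has no concept of meaning, yet it can still "predict" continuations.
from collections import Counter, defaultdict

corpus = "the dog runs to the park and the dog sees the ball".split()

# Count, for each word, how often each other word immediately follows it
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict(word):
    # Predict the next word as the most frequent follower seen in training
    return following[word].most_common(1)[0][0]

print(predict("the"))  # picked purely from co-occurrence counts
```

Like the dog in the analogy above, the model reacts to a pattern ("the" is usually followed by "dog" in its training data) without knowing what a dog, or a word, is.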

Can ChatGPT change the world? Overall, the jury is still out, but it’s unlikely that a piece of learning software can do much to replace human-centric work. Regardless, it’s important to think of how ChatGPT can improve (or harm) humanity.

Like other supposedly dangerous technology, this is a Pandora’s box: once it’s open, there’s no closing it again. Instead of worrying about how the technology might destroy the world, the more productive response is to figure out how it can better humanity without sacrificing anyone’s wellbeing in the process.

RAM Explained: The Unsung Hero of Smartphones https://www.gadgetmatch.com/micron-ram-explained-unsung-hero-of-smartphone/ Fri, 28 Oct 2022 13:00:56 +0000 https://www.gadgetmatch.com/?p=149076 There's more than just the chipset

The post RAM Explained: The Unsung Hero of Smartphones appeared first on GadgetMatch.

When you’re looking to buy a new device, which specs should you pay attention to? Which upgrades should you consider?

In this video, instead of reviewing the latest smartphone, we’re going to talk about its unsung hero: RAM.

We partnered with @MicronTech to help you understand all the magical things that you get to do on your smartphone thanks to internal memory and storage.

To find out more about Micron’s mobile memory and storage solutions and how they’re bringing mobile innovation to life, visit https://www.micron.com/solutions/mobile or watch our explainer video.

Should you be excited for Apple’s satellite connectivity? https://www.gadgetmatch.com/apple-satellite-connectivity-explainer/ Sun, 23 Oct 2022 04:12:49 +0000 https://www.gadgetmatch.com/?p=147220 Fad or future?

The post Should you be excited for Apple’s satellite connectivity? appeared first on GadgetMatch.

If you watched Apple’s latest Far Out event, you might have noticed the debut of a new technology coming to smartphones: satellite connectivity. Though Apple has made the biggest deal of the new feature, the technology existed even before Apple’s announcement. In fact, various smartphone makers are announcing their own takes to go along with Apple’s satellites.

But what exactly is satellite connectivity? Before the world gets more of the new feature, let’s take a look at this emerging technology.

Look at the sky

The night sky is filled with satellites. Though you might not see any of them with the naked eye, they are all there in low Earth orbit (LEO). As you might imagine, their uses are aplenty: most LEO satellites handle imaging, navigation, and data gathering. However, one emerging use is improving connectivity for consumer devices.

Now, the biggest proponent of satellite internet is Starlink, a project of Elon Musk’s SpaceX. With over 3,000 satellites, the company’s constellation provides internet to several countries, including areas with low coverage. In fact, the company’s services have already partially arrived in the Philippines.

Satellite internet does have its benefits. While the service isn’t the fastest, it offers connectivity where regular towers normally can’t reach. Imagine being out on a hike when you suddenly remember: oh my God, you left your stove running at home. Satellite internet will let you connect and ask a neighbor to check whether your apartment has any wayward burn marks running up its walls.

Quite a feat, isn’t it? But is this what Apple just launched?

SOS, please someone help me

Contrary to what you might think, Apple’s new satellite connectivity doesn’t offer internet. It’s also not Starlink. Instead, it’s a simple SOS messaging service through the Globalstar satellite constellation. It won’t solve your faulty 5G service. However, it’ll help you in a pinch if you find yourself lost in the middle of nowhere.

Once again, imagine you’re out hiking. Suddenly, you fall down an unseen slope and break your ankle. You find yourself miles and miles away from civilization, and no one knows where you are. Apple’s satellite connectivity can help you send an SOS message to the authorities.

Here’s how it works. When you’re in a predicament, fire up the feature and point your phone towards the satellite nearest you. (Don’t worry; the phone will tell you where it is).

Now, it might take a few seconds to a minute before the phone can connect to a satellite — especially if the skies aren’t clear or if you’re underneath a canopy of trees. It might not even connect if the skies are completely obstructed. Regardless, while it tries to connect, the phone will ask a series of questions including who needs help and if anyone was harmed. This helps the phone craft and compress the necessary information for your message.

Because the service is only for emergencies, you can’t write an essay. Apple says it compresses messages to a third of their usual size to ease transmission. It’s a wide-reaching 911 call for when you can’t actually call 911.

After compressing the message, the satellite will then beam the message to a nearby relay station on the ground, which will alert authorities for you.
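Apple hasn't published the details of its protocol, so the following is only a hypothetical sketch of the general idea: answers to fixed multiple-choice questions and a pair of coordinates can be packed into a few bytes instead of sending free-form text. The emergency categories, field sizes, and byte layout below are all assumptions for illustration.

```python
# Hypothetical sketch: packing fixed-choice SOS answers into a tiny payload.
import struct

EMERGENCIES = ["lost", "injured", "crime", "fire", "vehicle"]  # assumed categories

def encode_sos(emergency, people_involved, anyone_hurt, lat, lon):
    # 1 byte category + 1 byte count + 1 byte flag + two 4-byte floats = 11 bytes
    return struct.pack(
        "<BBBff",
        EMERGENCIES.index(emergency),
        people_involved,
        1 if anyone_hurt else 0,
        lat,
        lon,
    )

plain_text = "I am injured, 2 people involved, someone is hurt, at 14.5995, 120.9842"
packed = encode_sos("injured", 2, True, 14.5995, 120.9842)

# The packed form is several times smaller than the equivalent sentence
print(len(plain_text), len(packed))
```

This is why the questionnaire matters: answers drawn from a known, fixed menu can be sent as a handful of bytes, which is far easier to push through a narrow satellite link than arbitrary text.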

Who can use it?

Naturally, only Apple users with the latest devices will have the feature for now. Also, because it’s so new, it’s only coming to the United States and Canada with the upcoming iOS 16 update later this year. Of course, it might arrive in other territories soon after the initial launch.

China, however, is a no-go. Huawei launched its own satellite connectivity, packing the feature into the new Mate 50 series. If you’re in mainland China, you’ll have to use Huawei’s services.

Interestingly, Apple made it a point to say that the feature is free for the first two years. The implication seems clear: users might have to fork over cash to keep the feature once those two years are up. Now, if you’re celebrating the arrival of this arguably essential feature, the possibility of a paywall might leave a sour taste in your mouth. Should companies gatekeep who gets to send emergency messages in their time of need?

Passing fad or the future?

If you’re not a regular hiker, satellite connectivity might not appeal to you. However, it’s still interesting to wonder whether the technology will make an impact beyond Apple. And it does seem that way.

It’s no coincidence that a few brands, including Apple and Huawei, have suddenly launched their own satellite connectivity features within a short span. Companies are putting a lot of money into the future of satellites. Very likely, Apple won’t be the last company to adopt the new feature.

Emergency satellite services are essential. Even if you don’t hike, you never know when you might find yourself in a precarious situation without cell coverage. Satellite connectivity mitigates that risk.

Now, how will the technology evolve beyond emergency services? Coupled with the efforts of Starlink, the early stages of satellite connectivity prove the concept of a satellite-laden future. Currently, a lot of services still struggle with the lack of towers in certain locations. The aid of satellites points to a future that won’t need towers everywhere. Though speed might be an issue, connectivity won’t.

That future is quite possible. However, it also comes with a host of questions. With space limited to only a handful of providers, will companies launch more satellites to address the potential need? What will space look like then? Will it just be a wasteland of used satellites? How much will everything cost? Though it’s coming, the future still has much to clarify.

Illustrations by Garel Perpetua.

DITO is all-in for the next generation of mobile connectivity https://www.gadgetmatch.com/dito-next-gen-volte-vilte-feature/ Thu, 25 Aug 2022 01:05:28 +0000 https://www.gadgetmatch.com/?p=130068 But what do they mean by this, exactly?

The post DITO is all-in for the next generation of mobile connectivity appeared first on GadgetMatch.

For as long as most Filipinos can remember, the Philippines has had only two major mobile networks to choose from. This made the choice of network provider simple but very limited for consumers. Recently, however, an actual third player came into the picture in the form of DITO Telecommunity.

While availability started back in March, DITO has been making strides in the services it offers. Currently, they are available in over 650 cities and municipalities across the country, and their offers largely consist of high-speed data plans at affordable rates. Where they want you to shift your attention, however, is their network’s support for 5G connectivity. In their words, it’s the “next-gen technology” they want their consumers to experience.

So, what is this “next-gen technology” that they’re going on about?

Let’s review: the essence of 5G

The biggest thing about DITO’s new network service is their claim to bring “the real 5G” to the Philippines, at least according to DITO CTO Retired Major General Rodolfo Santiago. We’ve already talked about the whole 5G experience and what it brings to the table, so we won’t get into it too much. Basically, 5G is the next big thing in mobile connectivity, promising faster connections and wider coverage.

As such, telecommunication companies are opting in to provide just that to their consumer bases. However, as with most new technologies, there are obstacles in the way that make fast mobile data difficult to achieve. Well, DITO pretty much has that covered with what they call their world-class digital infrastructure.

Standing alone, or not standing alone?

While competitors began introducing 5G to their consumers earlier, what they initially deployed was a 5G non-standalone (NSA) network. This means their 5G architecture is assisted by their existing 4G infrastructure. DITO, on the other hand, has been developing their 5G standalone (SA) network since their rollout in 2019.

In theory, 5G standalone networks like DITO’s provide super-fast transmission speeds with ultra-low latency, which is suitable for most enterprises. As the name implies, the network relies on its own 5G infrastructure instead of using legacy 4G infrastructure as a jump-off point. In turn, DITO users will experience true 5G speeds every time they connect to the internet through mobile data. In other words, 5G standalone is “true 5G.”

With 5G standalone, DITO unlocks “true 5G,” accommodating what 4G networks previously couldn’t. While it builds on what 4G connectivity initially offered, over time it will solidify itself as the standard for mobile connectivity.

The true goal for DITO

“Our goal has been to allow Filipinos to experience next-generation technology and we in DITO are excited to bring 5G to more areas in the country to truly transform digital connectivity and online interactions,” said DITO Chief Technology Officer Rodolfo Santiago.

For DITO, this is the “breakthrough connectivity” they want to bring to the general public. Apart from greater mobile data speeds, each DITO SIM gives users access to enhanced versions of innovations that have been widely available since the introduction of 4G. Two of these are VoLTE and ViLTE, the latter of which DITO proudly boasts.

Their next-gen offer: VoLTE and ViLTE

What exactly are these two innovations? Let’s start with VoLTE, mostly because it isn’t necessarily something new for most people. Simply put, Voice over LTE (VoLTE) allows users to make voice calls without compromising mobile data speed. Normally, it’s an either-or thing, wherein one use case takes priority.

What is new is ViLTE, or as DITO calls it, Video over LTE, which works the same way as VoLTE but for video calls. In essence, users can make video calls from their devices without the need for a video calling app. These video calls are also charged at the same rates as a normal voice call. However, the feature is currently limited to video calls between DITO subscribers.

Bringing it all together, DITO offers a package with faster connections and greater savings considering the innovations. It’s not something that a lot of other telcos are offering; for DITO, however, the experience doesn’t stop there.

Any phone will do, but what exactly do you need?

Upon its early availability, DITO released a list of compatible phones that supposedly bring out the telco’s best features. Like most providers’ SIMs, the DITO SIM works with any smartphone, 5G or not, for baseline features like calling, texting, and mobile data. To experience “the real 5G,” however, you’ll need a 5G device that supports standalone architecture.

For context, the 5G smartphones on this list come with either 5G NSA or 5G SA support. The main difference between the two is, well, non-standalone architecture isn’t necessarily true 5G; rather, it applies 5G on top of a 4G network. Meanwhile, 5G SA is its own 5G network, built on and connected to 5G base stations to deliver higher speeds with lower latency than 4G.

In DITO’s case, only a handful of smartphones actually support the 5G SA architecture on which their 5G capabilities are built. Currently, they continue to explore avenues to expand their network and bring “the real 5G” to more Filipinos, provided they have a supported smartphone. Again, these kinds of smartphones are a bit pricey, but to experience “the real 5G,” it’s not a bad trade-off.

The future is DITO (here)?

DITO enters the scene with the goal of bringing the next big thing in telecommunications, and their offer hinges on it. In their eyes, the promise of faster internet and wider coverage is already here, and it’s just a matter of getting people to opt-in. With its latest innovations, DITO provides a more enhanced mobile data experience.

To fully experience the next generation of technology, users must be properly equipped to wield such power. There’s a reason that DITO put out a device compatibility list upon initial launch: to provide users the best possible experience with all the features they have. Sure, any device will work with the DITO SIM, but certain devices give you that best experience.

Is it time to make the switch to the next-gen? In DITO’s eyes, the answer is simple and they’re waiting for people to join them.


This feature is a collaboration between GadgetMatch and DITO Philippines.

The secrets behind iPhone 13’s Cinematic Mode https://www.gadgetmatch.com/iphone-13-series-cinematic-mode-explained/ Thu, 23 Sep 2021 13:00:07 +0000 https://www.gadgetmatch.com/?p=130834 Together with Apple's VP for iPhone Product Marketing as well as their Human Interface Designer

The post The secrets behind iPhone 13’s Cinematic Mode appeared first on GadgetMatch.

For the first time ever, we had a three-way interview with Apple’s VP for iPhone Product Marketing, Kaiann Drance, as well as one of the company’s leading Human Interface Designers, Johnnie Manzari. If you’re not starstruck enough, both of them appeared in Apple’s September 2021 keynote event!

Other than new camera sensors, the new iPhone 13 Series also brings new camera features. One of those is the new Cinematic Mode.

If you’ve watched some of our latest iPhone videos, including the Sierra Blue iPhone 13 Pro Max unboxing, you’ve already had a sneak peek at the new video mode.

We’re not gonna lie, it’s one amazing camera feature Apple has managed to deliver.

But what are the secrets behind it? Are you curious about how it works under the hood?

Watch our 16-minute interview with the Apple executives explaining why Cinematic Mode is the next big thing in mobile videography.

 

How Google alerted the Philippines during the July earthquake https://www.gadgetmatch.com/how-google-alerted-the-philippines-during-the-july-earthquake/ Mon, 16 Aug 2021 02:38:43 +0000 https://www.gadgetmatch.com/?p=127852 Crowd-sourcing data

The post How Google alerted the Philippines during the July earthquake appeared first on GadgetMatch.

Back in July, an earthquake rocked Metro Manila. Unbeknownst to most but noticed by some, a globally renowned company was helping everyone through the event: Google. In the minutes leading up to and during the magnitude 6.7 earthquake, Android users received important alerts warning them of the ongoing tremors. Though it wasn’t the dreaded Big One, the alert afforded attentive users a few precious seconds to seek appropriate cover or stop doing dangerous tasks.

Incidentally, the tech behind Google’s earthquake alert system wasn’t hastily built on existing databases or social media. Google actually built a fully responsive earthquake sensor into Android phones.

Faster than an earthquake

The ever-increasing speed of technology has been a point of fascination since the rise of smartphones. Developers and users alike have wondered how accurately or quickly our favorite devices can warn us of things happening around us. There’s even an XKCD comic about how Twitter can warn readers of an earthquake moments before the shaking reaches them.

Over the years, technology has developed new ways to deliver alerts. From simple weather apps to city-wide messaging systems, users can receive warnings in a timely fashion. Practically nothing is a surprise anymore with the right technology.

To that end, Google has successfully developed a new system that relies on other Android smartphones to accurately tell whether or not an earthquake is happening.

A quake detector in your pocket

Speaking to Android Police, the feature’s lead engineer Marc Stogaitis described how Google’s earthquake sensor leverages other devices to warn users about quakes. It all revolves around the different sensors built inside your phone.

As it is, every smartphone comes with a host of sensors to support its different functions: an ambient light sensor adjusts brightness and camera settings, while a magnetometer supports the compass, for example. For earthquakes, the most important signals are a smartphone’s movement and vibration.

According to the lead engineer, figuring out the metrics for detecting an earthquake wasn’t a problem. After decades of accurate seismograph technology, developers already had an idea of what they needed to measure.

However, the technology doesn’t stop there. Naturally, there are hiccups to relying on just a single (or even every) phone’s data. For one, a city-wide messaging system can make every phone in an area buzz at once, potentially causing false positives. Plus, relying on a single phone is definitely tricky: multiple everyday actions can cause vibrations akin to an earthquake.

Crowdsourcing a quake

The feature doesn’t rely on just one phone. It doesn’t tap into every Android phone in an area, either. Instead, it collates data from phones plugged into a charger. Naturally, a plugged-in phone is the most reliable source: it won’t die out in the middle of an earthquake and cut off a stream of data. Additionally, charging phones are often stationary, so they won’t be thrown off by motions that merely mimic earthquakes.

Google “listens” to charging devices in an area. If the subset meets the criteria for an earthquake, the company quickly determines the earthquake’s epicenter (based on approximate location) and magnitude. Once the system declares that a quake is indeed happening, it sends out an alert to nearby devices and gives them the time needed to seek shelter.
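The detection logic described above can be sketched roughly like this. Google's real pipeline is far more sophisticated; the shake threshold, the phone quorum, and the centroid-based epicenter estimate below are simplifying assumptions made for illustration.

```python
# Hypothetical sketch of crowdsourced quake detection: sample shaking from
# stationary, charging phones, and only declare a quake when enough agree.

SHAKE_THRESHOLD = 2.0   # assumed accelerometer magnitude cutoff
MIN_PHONES = 3          # assumed quorum to avoid single-phone false positives

def detect_quake(readings):
    """readings: list of (lat, lon, shake_magnitude) from charging phones."""
    triggered = [(lat, lon) for lat, lon, shake in readings
                 if shake >= SHAKE_THRESHOLD]
    if len(triggered) < MIN_PHONES:
        return None  # a dropped phone or a passing truck, not a quake
    # Crude epicenter estimate: centroid of the triggered phones' locations
    lat = sum(p[0] for p in triggered) / len(triggered)
    lon = sum(p[1] for p in triggered) / len(triggered)
    return (lat, lon)

# One shaking phone is ignored; several agreeing phones trigger an alert.
print(detect_quake([(14.6, 121.0, 5.0)]))  # None
print(detect_quake([(14.6, 121.0, 5.0), (14.61, 121.01, 4.2),
                    (14.59, 120.99, 3.8)]))  # approximate epicenter
```

The key idea survives even in this toy form: no single phone is trusted, but many stationary phones shaking at once in the same area are very hard to explain by anything other than an earthquake.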

The alerts naturally prioritize people nearer to the epicenter. But, of course, the speed will ultimately depend on the phone’s connectivity. A phone hooked up to a building’s fast Wi-Fi connection will receive alerts faster than a commuter’s phone on data while going through a tunnel.

Still, the short window the alerts give users is enough to save themselves from a precarious situation. Though the feature can potentially warn users of quakes well in advance, Stogaitis says it will more realistically push alerts five to ten seconds before the shaking arrives. Even five seconds is enough to get under a table and gain some protection against falling debris.

Still keeping things private

For anyone worried about how Google handles their data, Stogaitis says the company strips all identifiers from the data except approximate location. Despite that, Google maintains that the feature is as accurate as it can be. Either way, it will be useful for earthquakes in the future.

The earthquake sensor is available on any Android phone running Lollipop and above. Naturally, the feature still requires users to turn on emergency alerts on their phones.

The industry’s next big thing: Cloud gaming explained https://www.gadgetmatch.com/cloud-gaming-explainer-stadia-geforce-now/ Wed, 02 Dec 2020 02:27:10 +0000 https://www.gadgetmatch.com/?p=111253 It’s gaming on the go, but for internet that’s not slow

The post The industry’s next big thing: Cloud gaming explained appeared first on GadgetMatch.

Everybody’s getting into gaming these days, and you can’t blame them. With the pandemic continuing to ravage the world, people turn to their consoles or PCs for some action. However, not everyone can afford expensive PCs or the next-gen consoles when they come out.

Instead, a new player enters the fray with a pretty great idea: what if you could play your favorite games from any device? And what if we told you they wouldn’t take up space on your device at all? That’s basically what cloud gaming offers: a way to play games from any device at any time!

So, how does that actually work? What do you need to ensure quality gameplay, and should you even consider it?

The basics of playing on a cloud

On paper, cloud gaming is pretty easy to understand. Basically, you get access to a library of games hosted on a cloud service. When you subscribe, you can play that library from virtually any device, regardless of specs. You also don’t have to worry about storage, since the games live on a server.

It’s no joke when these companies tell you that you can play your games on any device. With their dedicated data centers, they make sure the games run smoothly once you access them from the cloud. On your end, you’ll need a strong, consistent internet connection to play smoothly.

Several companies already have cloud gaming software available for people to subscribe to. Some examples include NVIDIA’s GeForce Now, Microsoft’s xCloud, and Google Stadia — all of which store PC games on a server. These companies even take the time to update their server hardware every so often to bring the best possible quality.

System requirements for cloud gaming

Much like your PC or gaming console, cloud gaming services need certain equipment to run smoothly. First, companies must set up data centers and server farms to run the games. These data centers ensure games are up and running while keeping latency down. In other words, they serve as the powerhouse of cloud gaming.

Next on the list is the network infrastructure needed to deliver the games to users. To ensure that people don’t experience lag while playing, companies also invest in proper data connections. In most cases, however, this isn’t something these companies fully control; it mostly depends on the internet service providers available to the user.

On the front end, companies also provide dedicated hardware and software to house the cloud. For example, NVIDIA integrated GeForce Now into its own cloud streaming device, the NVIDIA Shield, back in 2013. Meanwhile, Google Stadia relies heavily on pre-existing Google software like Google Chrome and the Stadia app.

Something great to offer, for the most part

Cloud gaming services offer something unique in the industry. Essentially, they spare users from investing in expensive PCs by letting them play from virtually any device. Whether on a smartphone, laptop, or even a smart TV, people get access to games at high frame rates without owning an RTX 3080.

Furthermore, game and save files are stored on the cloud and don’t take up any storage on your devices. This greatly benefits people already running on limited storage space, especially if they play Call of Duty: Warzone. With everything stored on the cloud, you won’t need most of that 512GB of SSD storage.

However, one of the biggest issues with cloud gaming revolves around the very thing it's built on: the internet. Specifically, the user's internet connection, as these services need a fast, stable link to run smoothly on any device. Basically, you will want either a wired Ethernet or a 5G wireless connection to keep latency as low as possible.

That infrastructure isn't readily available in most markets, which is a prominent issue in many developing countries. Furthermore, even among providers that have 5G in their pipeline, many also put data caps on it. So even if users can play at an optimal frame rate, they're doing so with a restriction in place.
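
To get a rough feel for why those data caps sting, here's a quick back-of-the-envelope sketch in Python. The bitrates below are illustrative assumptions, not figures published by any streaming service:

```python
# Illustrative stream bitrates in megabits per second (assumed, not official).
STREAM_BITRATES_MBPS = {
    "720p at 60fps": 10,
    "1080p at 60fps": 20,
    "4K at 60fps": 45,
}

def data_per_hour_gb(bitrate_mbps: float) -> float:
    """Gigabytes consumed by one hour of streaming at the given bitrate."""
    megabits = bitrate_mbps * 3600   # seconds in an hour
    return megabits / 8 / 1000       # megabits -> megabytes -> gigabytes

for quality, mbps in STREAM_BITRATES_MBPS.items():
    print(f"{quality}: ~{data_per_hour_gb(mbps):.1f} GB per hour")
```

At an assumed 20 Mbps for 1080p, a nightly two-hour session burns through roughly 18GB, which a capped mobile plan can exhaust in a week or two.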

Does this new player have any place?

With the world continuously opening its arms to the gaming industry, innovation sits at the forefront of success. Companies come up with a variety of gaming technologies that cater to a wide range of people. From individual components to pre-built systems, gaming has long revolved around hardware.

Cloud gaming doesn't just give people another option within the mix. Rather, it challenges the notion of availability and accessibility, and offers a viable solution. Essentially, it takes away the physical hardware limitations on the user's end and makes gaming available to everyone.

But like most gaming technologies, everything still has its limits. These systems experience bottlenecks on both the manufacturer's and the user's end. In the end, it will depend on how much you're willing to shell out for them, and how willing you are to accept the risks.

Illustrations by Raniedel Fajardo

The post The industry’s next big thing: Cloud gaming explained appeared first on GadgetMatch.

Your MagSafe Questions Answered https://www.gadgetmatch.com/apple-magsafe-explainer/ Fri, 06 Nov 2020 06:42:12 +0000 https://www.gadgetmatch.com/?p=112822 Do you really need it?

The post Your MagSafe Questions Answered appeared first on GadgetMatch.

If you've ever owned an old MacBook, you'll know that its charger magnetically snaps into place. That particular technology is called MagSafe.

After the MacBook Pro's Touch Bar and USB-C overhaul in 2016, everyone thought MagSafe was gone for good. That was until Apple announced the new MagSafe for the iPhone 12 series four years later.

The MagSafe technology might not be new, but its implementation on the latest iPhones makes it even more usable. Beyond holding the phone securely in place for wireless charging, a plethora of case manufacturers continuously work on accessories that plug into the existing MagSafe ecosystem.

But is the Apple MagSafe more than just a gimmick? And do you really need it?

Watch our in-depth Apple MagSafe explainer here.

Explaining smartphone display refresh rates https://www.gadgetmatch.com/explainer-smartphone-display-refresh-rates/ Tue, 11 Feb 2020 08:44:21 +0000 https://www.gadgetmatch.com/?p=94960 Are they really any different from PC displays?

The post Explaining smartphone display refresh rates appeared first on GadgetMatch.

Smartphones, little by little, are turning into mini-PCs, along with the features that come with them. From browsing social media to playing video games, technology is slowly adapting to a more "on-the-go" lifestyle. Recently, smartphones acquired another feature that your desktop or laptop already has.

Some recently released premium and gaming smartphones now come with displays sporting higher refresh rates. Refresh rates aren't new, but seeing high ones on a compact device has a lot of people wondering. How different or similar is it to a PC's refresh rate? And is it actually something good to have?

A crash course on refresh rates

A display's refresh rate is, basically, the number of times your display updates every second. Depending on that rate, your screen takes only a fraction of a second to load each new image. For example, a 60Hz refresh rate means that in one second, any image on your display is refreshed 60 times. Your eyes won't catch each individual update, but that's how your display works.
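
To make that concrete, here's a quick sketch converting common refresh rates into the time each refresh stays on screen:

```python
def frame_time_ms(refresh_rate_hz: float) -> float:
    """Milliseconds between display updates at a given refresh rate."""
    return 1000.0 / refresh_rate_hz

for hz in (60, 90, 120, 240):
    print(f"{hz}Hz -> a new image every {frame_time_ms(hz):.2f} ms")
```

At 60Hz the screen redraws roughly every 16.7 ms; at 120Hz that drops to about 8.3 ms, which is why motion looks noticeably smoother.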

For most PC displays, the default is 60Hz, with companies releasing displays that range up to 240Hz. You mostly see these on displays built for gaming, since gamers prefer higher refresh rates for improved performance. If you're someone who mostly watches movies, it really doesn't matter how high the refresh rate is; films typically play at just 24 frames per second.

Note that this is entirely different from frame rate, which is how many images your hardware produces within a second. A high refresh rate matters because it lets those higher frame rates actually show up on screen. That's why you see some gamers complain about playing on a 60Hz display.

Transitioning to a smartphone near you

Eventually, the concept of amping up refresh rates reached the world of smartphones. In fact, the OnePlus 7 Pro was the first mainstream smartphone to have a display with a 90Hz refresh rate. Most smartphones, even budget ones, still have displays built with a 60Hz refresh rate. Something about it just makes you scroll through your phone without feeling too dizzy, unless you scroll too fast.

Premium smartphones mostly incorporate either a 90Hz or 120Hz refresh rate for a smoother UI experience. With higher refresh rates, scrolling through your phone feels a lot smoother without risking an eyesore. Of course, these smartphones do cost significantly more than your average, everyday smartphone.

Apart from premium smartphones, gaming smartphones have also incorporated higher than 60Hz refresh rates. Phones like the Razer Phone 2 and the ASUS ROG Phone 2 both come with a 120Hz refresh rate to suit mobile gamers, especially FPS (first-person shooter) gamers. With these higher refresh rates, mobile gamers see clearer images with less motion blur involved.

Do you really need all the hertz?

That begs the question: what do you need a high refresh rate screen for? When you use a PC, 60Hz is already good for most tasks and games. Going for higher refresh rates usually means you're doing more than the ordinary, with fast-paced, competitive gaming benefiting the most.

The same logic works for smartphone displays, just on a smaller screen. Most of what you do, you can already do on a 60Hz display. If you just browse social media, watch Netflix on the daily, and play games casually, you don't need anything higher. Although, it is a nice premium to have if you want buttery smooth software.

If you play games competitively, you'll prefer higher refresh rates, just like on gaming monitors. Higher refresh rates let you perform at an optimal level when chasing higher frame rates. We're talking close to zero image tearing or motion blur when you play PUBG Mobile or Call of Duty. While you can perform well at the default 60Hz, going for a 90Hz or 120Hz display ideally makes the experience better.

Some final thoughts

Refresh rates have always been part of display technology, and screens were built so everyone could benefit from them. It's only fairly recently that smartphone companies found a way to make the experience noticeably smoother. Hence, smartphones started incorporating higher refresh rates.

It almost feels like having that high refresh rate is a premium, given only select smartphones have it. But it’s a premium that you don’t really need unless you have a good reason to. Apart from the cost of experiencing it, it really depends on what you plan to do with your smartphone.

At the end of the day, it’s better to ask yourself if it’s a feature worth getting. If it’s something you feel you can’t live without, by all means, right?

Stranger Things 3: What exactly is an ignition cable? https://www.gadgetmatch.com/stranger-things-3-what-exactly-is-an-ignition-cable/ Sat, 20 Jul 2019 00:42:20 +0000 https://www.gadgetmatch.com/?p=82299 Possessed Billy knew what he was doing

The post Stranger Things 3: What exactly is an ignition cable? appeared first on GadgetMatch.

By now, you’ve probably seen the third and newest season of Stranger Things on Netflix. If you still haven’t, it goes without saying that there are spoilers ahead and you should stay away from this article.

Seeing a pop culture reference such as Stranger Things together with the seemingly unrelated world of automotive in one writeup such as this could be strange (pun intended) for some. We really don’t mind and thought it would be a fun and unique way to talk about the show and learn a few things from it, as well.

So we ask the question: What exactly is an ignition cable?

The ignition cable is part of a vehicle's ignition system. In simplest terms, it's the mechanism that starts the engine. The system steps up the car's battery voltage into the high voltage the spark plugs need, and the plugs then ignite the air-fuel mixture in the engine's combustion chambers to get it up and running.

And in order to transfer that voltage from the source to the engine, you'll need an ignition cable. It's like a subway system: a pathway for the high voltage to pass through. So if the ignition cable is not present, there's no way to start the car.

Back to Stranger Things: Billy (although already possessed by the Mind Flayer) obviously still had his knowledge of cars, so he took away the ignition cable, trapping our favorite gang in Starcourt Mall's parking lot.

Just to further stress the importance of an ignition cable and the whole ignition system for that matter, we’d like to visit other possibilities and ask, “What if Billy didn’t take it away?”

Well, the plan was for Eleven and her group to go to Bauman’s secret place and stay safe while Joyce, Hopper, and the rest try to close the portal and render the Mind Flayer powerless. If their ignition cable was intact, they’d be a lot safer away from the Mind Flayer although we wouldn’t be able to see that amazing fireworks scene inside the mall.

Through this, we see the importance of that one small part under the hood of a car. In real life, it really pays to make sure everything is in good working condition; one faulty cable could mean trouble for you if left unaddressed — unless there's a car on display inside a mall somewhere that you can take spare parts from!

SEE ALSO: Netflix launches AR Trailer with Stranger Things 3

 

A phone’s water protection plan: IP ratings explained https://www.gadgetmatch.com/ip-rating-explainer/ Mon, 08 Jul 2019 11:01:21 +0000 https://www.gadgetmatch.com/?p=81592 It doesn’t give you the right to dunk it in water, though

The post A phone’s water protection plan: IP ratings explained appeared first on GadgetMatch.

If you plan to bring your phone to a beach trip with your friends, you normally bring a pouch with you. The main function of that pouch is to protect your phone from contact with any liquid while you enjoy the waves. Of course, it doesn’t fully guarantee that water won’t seep through it — especially when a big wave crashes on you and opens the pouch. But, it does give a sense of safety and security for your beloved smartphone.

That’s the whole concept behind an IP rating that’s given to most smartphones today. Nowadays, you hear a lot about these smartphones being advertised with IP68 ratings. But, what does an IP68 rating actually mean? Is it worth something to consider when buying a new smartphone?

What is an IP rating?

IP ratings are not new in the tech world. In fact, a lot of the electrical appliances and devices you have at home come with one. An IP rating, or ingress protection rating, basically tells you the level of protection an electrical device has against solid and liquid objects. It acts as a security measure to determine what the device can handle without malfunctioning.

The International Electrotechnical Commission (IEC) defines these ratings, which manufacturers use as a safety standard for production. A rating consists of two numbers that describe protection against a vast range of objects, even human touch. The first number denotes a device's protection against common solid objects and dust. Meanwhile, the second number denotes a device's protection against liquids, up to high-pressure and steam jets. The higher the number, the more protection it gets!
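
As a sketch of how those two digits break down, here's a tiny parser. The digit summaries below are simplified paraphrases of the IEC definitions, not the full standard text:

```python
# Simplified, paraphrased summaries of the two IP digits (not exhaustive).
SOLID = {
    0: "no protection",
    5: "dust protected",
    6: "dust tight",
}
LIQUID = {
    0: "no protection",
    7: "immersion up to 1 m",
    8: "immersion beyond 1 m",
}

def parse_ip_rating(code: str) -> tuple:
    """Split a code like 'IP68' into its solid and liquid protection levels."""
    digits = code.upper().removeprefix("IP")
    solid, liquid = int(digits[0]), int(digits[1])
    return (SOLID.get(solid, f"solid level {solid}"),
            LIQUID.get(liquid, f"liquid level {liquid}"))

print(parse_ip_rating("IP68"))  # ('dust tight', 'immersion beyond 1 m')
```

So an IP68 phone reads as dust tight plus protected against immersion beyond a meter, exactly the combination most flagships advertise.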

IP ratings are not just present in most recent smartphones. Things like electrical sockets, cameras, even phone cases come with IP ratings, as well. 

The reason it exists

Manufacturers and consumers see an IP rating quite differently. Those two numbers ultimately stand for how well your device can stand up against, well, anything. For manufacturers, an IP rating gives them a standard to follow when producing devices. Before shipping their latest smartphones, they subject their devices to numerous tests to validate their IP ratings.

Also, it gives manufacturers a more concrete way of stating that their devices resist such objects. When you come across smartphones that claim to be water resistant, you often wonder just how resistant they are. The IP rating gives a more definitive measure for that claim. For example, a smartphone with an IP68 rating is dust tight, and you can submerge it in water deeper than a meter — perfect for beach trips.


For consumers, the IP rating simply provides peace of mind when buying a new smartphone. It's basically there to tell you that your phone will survive a cloud of dust or a dunk in fairly deep water. You see this in YouTube videos that bend, scratch, and dunk phones in buckets of water. In the end, you won't have to worry as much about destroying your phone when you go on that beach trip without a pouch.

Some manufacturers simply don’t need the rating

However, some manufacturers find the rating unnecessary, or simply a marketing tool. OnePlus even made an entire ad showing off its new flagship devices, the OnePlus 7 and OnePlus 7 Pro, without an IP rating. The move sparked debates on whether IP ratings really make sense, or whether companies could simply do without them.

OnePlus argues that one reason its new smartphones lack an IP rating is the cost of getting one. Even submitting a phone for consideration costs a lot on the manufacturing side, which ultimately bumps up the phone's price. Pete Lau, one of the company's co-founders, estimated the cost of getting an IP rating at US$ 30. Of course, it is entirely up to consumers to judge that value against the overall product.

The other reason concerns warranty coverage, particularly for water damage. OnePlus claims that even when smartphones have IP ratings showing how water resistant they are, water damage isn't fully covered by warranty. This furthers their argument on why they wouldn't want to spend on getting the rating in the first place. An IP rating is not a license to have your phone fixed for free after dunking it in a bucket of water.

To them, it does not make sense to attach an IP rating to a phone even as a marketing tool. It gives off the wrong impression that the device is waterproof, when the rating really only means the phone is water resistant.

Do we really need to know the IP rating? 

The IEC created IP ratings for everyone’s protection — from manufacturers to consumers. The whole purpose of having an IP rating is to provide a level of protection for anything electrical, smartphones included. It ensures the safety of everyone, but it’s not a way to bail anyone out when they dunk their phones in water.

While some may argue that knowing your device's IP rating helps you take better care of it, others see it as a marketing ploy: a way to sell a device perceived as waterproof according to a standard. IP ratings were never meant to mark your phone as waterproof. They're there to tell you that your phone can handle water, just possibly not too much.

At the end of the day, we have to ask ourselves whether we truly see the value in having these IP ratings. Whether or not your preferred device has an IP rating, just remember: it’s not a reason for you to exploit your phone.

Huawei vs the US: A timeline https://www.gadgetmatch.com/huawei-trump-google-ban-faq-explained/ Fri, 28 Jun 2019 10:00:31 +0000 https://www.gadgetmatch.com/?p=81167 An FAQ on Huawei's problems

The post Huawei vs the US: A timeline appeared first on GadgetMatch.

Who’s afraid of Huawei? Right now, everyone is. Does anyone really know why?

Since 2017, the US has dealt continuous blows against the Chinese company. More than two years later, the war is still in full swing. Both sides have fired multiple salvos against the other. Still, despite the conflict’s longevity, most people are not really sure what’s happening.

Why are they fighting? Should we stay away from Huawei? Is it time to get rid of our Huawei devices as soon as possible? Should we really fear for our cybersecurity?

For ordinary consumers, the entire Huawei debacle is mired in political lingo and endless controversy. It’s time to clear the air. What’s up, Huawei?

How did this all begin?

Let's go back to where it all started. In late 2017, American lawmakers reviewed the businesses of ZTE, another Chinese tech company. Soon after, the investigation uncovered a flurry of shady business deals involving Iran. By law, companies operating in the US are not allowed to trade with blacklisted countries, including North Korea and Iran. Naturally, the violation brought monumental sanctions against ZTE. The US banned ZTE from American soil — effectively the same ban imposed on Huawei today.

At this time, Huawei was just a moderately innocent passerby stuck between two fighting giants. At most, Huawei was accused of spying on its customers. American lawmakers proposed a boycott of Huawei’s products. The proposal drew from the emerging rise of Sinophobia. Still, at the time, the US government’s eyes were firmly on ZTE.

Current restrictions prevent Huawei from providing 5G technologies to American consumers

In its infancy, the Huawei-ZTE issue was a product of a small fear. It still hadn’t affected everyone. In fact, US President Donald Trump even tried to save both companies from utter destruction. Both companies enjoyed a reprieve from America’s ire. However, this was short-lived.

In a surprising about-face, Trump started his controversial trade war against China. The American leader abandoned his salvific efforts. Instead, he adopted an incredibly aggressive push against Chinese companies. Unsurprisingly, ZTE had already crumbled from the initial push, leaving Trump without a company to make an example of.

Trump set his sights on Huawei, the world’s second largest smartphone maker. His weapon: the same ban meant for ZTE. His motive: potential cybersecurity issues. This time, America means business. Recently, Trump finally pulled the trigger, enacting a total ban against Huawei on American soil. However, instead of just the US, Trump has been lobbying for a similar ban on other countries. Since then, Huawei has suffered a world of hurt.

What does the ban mean?

Naturally, a “total ban” sounds daunting. Banning Huawei smells like certain doom for the tech giant but what does the ban really mean?

When enforced, Huawei can no longer deal with American companies. To Huawei's dismay, the tech maker uses a fair number of American components in its products. Most notably, Huawei's smartphones run Google's Android. The ban will prevent Huawei from using the operating system going forward. On paper, this is a huge deal. Android remains the world's biggest operating system, and a lot of consumers trust it. Losing it strips a massive chunk out of Huawei's package.

As if that wasn't enough, Facebook — and its slew of apps — has withdrawn from Huawei's products. The company's smartphones will no longer have Facebook, Messenger, Instagram, or WhatsApp installed out of the box. The threat is becoming real.

Huawei makes its own chipsets but relies on several American companies for other components

Additionally, Intel, Broadcom, and Qualcomm have blacklisted Huawei after Google’s announcement. Huawei has also lost the support of the ubiquitous ARM chip architecture.

It’s not looking good for the Chinese company. Huawei is slowly being dismembered. Faced with an army of bans, it’s natural to worry about Huawei. Worst case scenario, Huawei will become a mere shadow of its former self, devoid of the components that helped its recent success.

Should we really worry, though?

Not just yet. Right now, Huawei is enjoying a temporary reprieve. Soon after the initial ban, the American government granted the company a three-month extension. Until around the end of August, Huawei can still operate with its current partnerships. Except for Facebook's apps, its devices will still ship with the same components we love. At least for the near future, Huawei is safe.

In the meantime, Huawei is hunting for adequate alternatives to the parts it stands to lose. This means a new operating system, new chips, and likely an entirely new package. To its credit, Huawei's development team is working around the clock. Only a month removed from ground zero, it is already promising optimistic developments for the future. Huawei remains confident in its future, launching a bevy of new phones amid the controversy.

Likewise, some American companies are also lamenting the loss of business. Before the ban, Huawei was a loyal customer, delivering American components to a massive global audience. They aren’t happy with Trump’s ban. For one, Google has publicly defended Huawei. According to them, Huawei’s — and subsequently, the world’s — cybersecurity standards will collapse without a collaboration between international companies. With Android, Google can act as Huawei’s checks and balances against potential cybersecurity threats from malicious forces. If anything, Huawei still has its share of public defenders.

The US ban will prevent Huawei from using Android as its operating system moving forward

Most importantly, Trump still has the power to reverse the ban before the 90-day extension runs out. If China and the US reach a meeting point, all might go back to normal. Though uncertain, it’s too early to give up on Huawei just yet.

What will Huawei 2.0 look like?

Unfortunately, Huawei’s future is muddled with uncertainty. This includes any potential iterations in the future. As far as we know, Huawei isn’t bleeding from the multitude of losses. The company has reinforced its Kirin chipsets. Further, they are developing their own dedicated operating system codenamed Ark OS.

Other than that, there's not much to go on. Speculatively, the biggest changes will come from its app support. If Google leaves, Huawei will be left without the Play Store's catalog and security. The Chinese company will have to rely on its own native software to power its phones. Unfortunately, an all-Chinese ecosystem is less than ideal for most. In fact, having one might even justify the American Sinophobia. But again, it's all up in the air.

I have a Huawei phone. Should I just sell it?

No, you still shouldn't. The grey market is already bracing against an onslaught of Huawei returns. If you don't know a willing contact, finding a buyer will be difficult. And if you do find one, you'll receive only a fraction of what you paid.

In their current iteration, Huawei's phones are still top-tier. They are a delight to hold and use and, if anything, have pushed competitors to offer better value to consumers over the years. Right now, it's best to play the long game. Wait and see what happens. If anything, Huawei and its official partners already have an insurance policy in place. Several retailers have declared a 100 percent refund policy in countries like Singapore. If Google cuts the cord, Huawei users can get their money back.

Similarly, Google has promised Android Q support for existing Huawei handsets. Just this week, Huawei also announced the rollout of Android-based EMUI 9.1 to older models. If you already own one, a Huawei phone shouldn't be an immediate cause for panic.

So, should we really be worried about Huawei?

Understandably, uncertainty isn't ideal for anyone. Huawei's troubles are an excruciating thorn for businesses and consumers alike. Switching to another brand is a natural response to the company's shaky future. However, if you're looking for the silver lining, worrying is likely premature. If you're not a Huawei user, the controversies shouldn't affect you. If you already own a Huawei device or are looking to buy one, it will likely pay off to play the long game. After all, Huawei devices are still some of the best smartphones on the market.

Editor's Note: Looks like we really shouldn't worry after all. Not even a day after this article was originally published, Huawei is no longer banned in the US. Rejoice, Huawei users!

Explaining OLED screens and Dark Mode https://www.gadgetmatch.com/oled-explainer-dark-mode/ Sun, 07 Apr 2019 00:02:51 +0000 https://www.gadgetmatch.com/?p=73731 Why that screen fits in the dark

The post Explaining OLED screens and Dark Mode appeared first on GadgetMatch.

Most of the applications you're currently using have probably rolled out their own version of dark mode by now. The smooth transition from a light to a dark interface can be done at the push of a button, or by sending the moon emoji on Messenger. A lot of people also find dark mode quite sexy, and that's probably because of the screen they're looking at.

A lot of newly released smartphones now have OLED screens, and dark mode seems to work best on such displays! But why is that? How do OLED panels allow dark mode to flourish?

Better, blacker, affordable screens

Organic LED (light-emitting diode) or OLED is essentially a kind of display technology. In a nutshell, OLED panels allow for better and clearer images and colors.

Thin layers of organic, carbon-based compounds make up OLED screens. Because these layers emit their own light, screens show brighter and more vibrant colors. Apart from that, OLED screens show deeper blacks and reduce instances of motion blur when navigating. The best part is that OLED screens are gradually becoming cheaper to manufacture. That explains why more and more of today's smartphones use this kind of panel.

More colorful than the rest

In comparison to the LED-backlit LCD screens of the past, OLED promises more accurate colors by producing light from individual pixels instead of relying on a backlight. LCD screens relied heavily on that backlight to make colors pop, which can also make colors look washed out, especially when compared to OLED.


However, OLED’s colors don’t always turn out better than on LED and LCD screens. One such case is when you turn your screen’s brightness to its maximum, especially under strong daylight conditions. LED and LCD screens are designed to perform relatively better in color accuracy when your screen’s brightness is set to max. OLED screens were not designed for maximum brightness, so colors at that point would be saturated.

Which OLED is best?

There are two types of OLED technologies currently in use: AMOLED and PMOLED. You hear AMOLED tossed around a lot because lots of smartphones use it. Essentially, AMOLED (active-matrix OLED) pairs each pixel with a storage capacitor that controls how much light that pixel gives off. It's responsible for projecting all sorts of vibrant colors on most OLED smartphone screens. Apart from that, AMOLED screens support higher resolutions at a more affordable and power-efficient rate.

PMOLED (passive-matrix OLED), on the other hand, does not have a storage capacitor for each pixel. Instead, external circuitry drives the display one row at a time, so pixels only light up while their row is selected. You can find PMOLED screens on smaller devices like older iPods and pocket Wi-Fi units. Take note that this scheme uses more power as displays get bigger, which is why PMOLED stays on small screens.

Joining the dark side

Ever since dark mode rolled out across different apps and interfaces, people have been contemplating the switch — and for good reason. On normal LED-backlit LCD screens, the feature doesn't pair well with the technology. The deep blacks dark mode is supposed to show aren't reflected well, to the point that they look more gray than actual black. This is much more obvious when the screen's brightness is turned all the way up.


Aesthetically, dark mode looks better on OLED screens because of the technology's emphasis on deeper blacks. Since each pixel on an OLED screen emits its own light, showing black simply means switching pixels off. As such, dark mode shows up deeper and blacker, which is the intended look. But there's actually more to this mode than aesthetics.

It’s also been proven that dark mode on OLED helps save your battery life. Google confirmed this at its Android Dev Summit, citing that on max brightness, blacks consume less power than all other colors. Individual pixels need less electricity to show blacks on screen, which results in lower power consumption through time. Note that Google got these findings through tests on their original Pixel smartphones and their own apps like YouTube.
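
Here's a toy model of that effect. It assumes, very roughly, that each OLED pixel's power draw scales with its brightness, and it ignores per-color subpixel differences; treat the numbers as illustrative only:

```python
# Toy model: each OLED pixel emits its own light, so assume power scales
# linearly with pixel brightness (0.0 = off/black, 1.0 = full white).
def relative_panel_power(pixels):
    """Average pixel brightness as a stand-in for relative panel power."""
    return sum(pixels) / len(pixels)

# A 100-"pixel" mock screen: light theme vs. dark theme.
light_ui = [0.9] * 80 + [0.1] * 20   # white background, dark text
dark_ui  = [0.05] * 80 + [0.6] * 20  # near-black background, bright text

print(f"light theme: {relative_panel_power(light_ui):.2f}")  # ~0.74
print(f"dark theme:  {relative_panel_power(dark_ui):.2f}")   # ~0.16
```

Under these toy assumptions, the dark theme drives the emissive layer at roughly a fifth of the light theme's level, which is the intuition behind Google's findings; an LCD's backlight, by contrast, stays on regardless of pixel color.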

What’s left for OLED and dark mode

Apps and operating systems are now starting to embrace or consider incorporating dark mode into their software. While apps like Twitter and YouTube introduced such an option early on, others are beginning to take notice. Of course, you’re gonna need the right screen to fully immerse yourself.


It has been proven: OLED and dark mode are indeed a perfect match. But, it is entirely up to you whether you want to stay in the light or switch to the dark side.

The new online generation: Explaining 5G internet https://www.gadgetmatch.com/5g-internet-explainer/ Sun, 23 Dec 2018 00:32:10 +0000 https://www.gadgetmatch.com/?p=66280 Faster, better, and more available?

The post The new online generation: Explaining 5G internet appeared first on GadgetMatch.

Are you still bothered by slow internet in your country? Even with the advancements and supposed improvements in infrastructure, we’re all living in a 4G world. The current generation of internet connectivity is still present in today’s mobile and telecommunication networks. But now, a new generation has emerged, and it has the potential of taking the whole world by storm.

Let’s stop and ask first: What really is this new generation? How different is it from the existing generation’s internet? And, what needs to be done to welcome the change?

What really is 5G?

5G is the new generation we’re speaking of here. Specifically, it’s the next level of mobile network connectivity being rolled out at the moment. What 5G offers to everyone is pretty straightforward: faster internet speeds, close to zero latency, and improved accessibility. It’s expected that 5G will replace existing 4G technology once fully deployed in the near future.

Currently, 5G is still in its early stages of deployment, much like an early-access game. Companies have until around March 2019 to integrate 5G connectivity into their devices. Once that initial deployment is done, 5G will be available on more devices, whether it’s your phone or another smart device.

A connection that comes in waves

Remember that one science class you had about the electromagnetic spectrum and visible light? Basically, devices that emit electromagnetic waves fall somewhere on a spectrum depending on their frequencies and wavelengths. Network connections follow the same concept, with 4G sitting at relatively low frequencies and 5G’s new bands reaching much higher up the spectrum.

There are two ways that 5G can work in any place at any time, and one of them involves very high-frequency waves. This strand of 5G is called millimeter wave (mmWave), and it’s currently found mostly in research facilities and military equipment. With mmWave, 5G connections are ideally faster (peaking at 10Gbps) and virtually lag-free, because the higher frequencies add extra bandwidth for devices to use. However, mmWave is easily blocked by obstacles such as walls and floors, which simply bounce the signal away.

The second way is through the sub-6GHz spectrum. Unlike mmWave, sub-6GHz is more of a middle-of-the-pack approach to 5G connectivity. Here, 5G strengthens connections by running over the same sub-6GHz frequency ranges that existing 3G and 4G networks already occupy. This method is more cost-effective, and it doesn’t experience interference as easily.
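To see why mmWave struggles with range while sub-6GHz doesn’t, you can compare free-space path loss at the two kinds of frequencies. The sketch below uses the standard free-space path loss formula; the 3.5GHz and 28GHz example bands are illustrative assumptions, not any carrier’s actual deployment.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20log10(d) + 20log10(f) + 20log10(4*pi/c)."""
    c = 299_792_458  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

sub6 = fspl_db(100, 3.5e9)    # an example sub-6GHz 5G band
mmwave = fspl_db(100, 28e9)   # an example mmWave band
print(round(mmwave - sub6, 1))  # 18.1: extra dB of loss at the same distance
```

Roughly 18dB more loss at the same distance (before walls and floors even enter the picture) is why mmWave trades its extra bandwidth for much shorter reach.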

How different is it really from 4G?

We always talk about how 5G is faster than 4G in terms of data transfer, which is true. But there are other things that differentiate 5G from its predecessor. For starters, 5G connections can cover a wider area than 4G. This means that even if you’re far from your router or cell tower, you can still access 5G networks at the same speed. Just don’t stray too far, as the technology can’t reach that far yet.

Apart from that, 5G is less prone to interference than 4G. Even though mmWave is hampered by obstacles, it still performs relatively better than 4G. For example, even with several other antennas in your area, you’d still experience better speeds on a 5G network than on 4G. That’s because 5G beams signals at devices directly instead of spreading the waves across the whole area.

Finally, with 5G connections, more devices have access to the network. Currently, 4G networks still have a cap when it comes to the number of devices simultaneously connected. As more devices connect to the same 4G network, internet speeds tend to get slower. With 5G, however, adding more devices won’t hamper its overall performance mostly because of additional bandwidth and wider coverage.

What’s next for the new generation?

Believe it or not, we’re living in the early-access world of 5G. Major telecommunication companies are starting to adopt 5G in their mobile networks, and things are about to get bigger. As their data plans reach the general public, several improvements to network infrastructure will follow. We’re talking better signal towers, and more of them across the world.

In the future, 5G may not be limited to just mobile networks. Car companies are looking at the possibility of applying 5G to smart cars, especially for navigation. Cars on the road will be able to share data like traffic conditions, road hazards, and other delays. Even virtual and augmented reality can make use of 5G for better simulations.

By March 2019, the early deployment of 5G will be finished. Hopefully by then, we can get more information on what 5G can do for the world. The new generation is here, but we still have to wait and see how far 5G will take us.

The post The new online generation: Explaining 5G internet appeared first on GadgetMatch.

]]>
C is the key: Explaining USB Type-C https://www.gadgetmatch.com/usb-type-c-explainer-thunderbolt/ Sun, 09 Dec 2018 23:10:44 +0000 https://www.gadgetmatch.com/?p=64879 What really makes this new standard special

The post C is the key: Explaining USB Type-C appeared first on GadgetMatch.

]]>
For years, people have grown accustomed to using USB ports for almost all of their devices. Whether you need to charge your phone using your computer or use a controller to play games, you can always count on a USB port to be readily available. But 2018 was a year of change and innovation, and the USB port you know and love evolved in a big way.

Introducing: USB Type-C, the newest port added to the family. Its reversible oval shape brought many new uses and functionalities to your devices. But how different is it from its much older siblings? And how have companies revolutionized its use in mainstream devices?

What is this USB Type-C port?

The USB Type-C (USB-C) port is not exactly a recent invention. The USB Implementers Forum (USB-IF) developed the port back in 2013 and launched it into mass production the following year. The connector has a reversible oval shape, a departure from the usual rectangular shape of previous generations. Its reversibility means the cable works in either orientation, whether you’re transferring files or charging your device.

USB-IF developed USB-C following the USB 3.1 standard, chosen for its faster transfer speeds and charging capabilities. With a USB-C port, you can transfer an hour-long movie in under 30 seconds, provided you have the appropriate connector for it.

Computer and smartphone manufacturers have incorporated the USB-C port in most of their devices. One of the early adopters of the new technology was Apple, with their redesigned 12-inch MacBook in 2015. Other computer manufacturers followed in the later years, especially with the release of the Thunderbolt 3 technology used for gaming machines.

It’s the younger, faster and more all-around sibling

USB-C has been around for the past four years, and it has gradually developed into an all-around port. Alongside Thunderbolt 3, USB-C posts the highest data transfer speeds of all available USB connections. On top of that, today’s USB-C ports can connect your devices to external GPUs and displays, and charge them as well. Most even support fast charging for smartphones.

While the technology is built on the USB 3.1 standard, it’s still very different from other ports that use the USB 3.1 protocol. For starters, most USB-C ports implement USB 3.1 Gen 2, which offers twice the data transfer performance of USB 3.1 Gen 1. Most Gen 1 ports also use the older USB Type-A connector, which works for most of your gadgets and peripherals today. However, you would need more adapters for other functionalities, like outputting to a monitor.

But the USB-C port is a far cry from the old USB 2.0 and 3.0 protocols, which have been around for well over a decade. Data transfer speeds on those two protocols are significantly slower than on a USB-C port; an hour-long movie would ideally take one to two minutes over USB 2.0. Older USB protocols also can’t deliver much power, so charging devices through them isn’t nearly as fast.
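For a back-of-the-envelope feel for those speeds, here’s a quick comparison of nominal transfer times across USB generations. The 5GB file size (roughly an hour-long HD movie) is an assumption for illustration, and real-world throughput runs lower than these raw signaling rates.

```python
# Nominal signaling rates per USB spec, in gigabits per second.
SPEEDS_GBPS = {
    "USB 2.0": 0.48,
    "USB 3.1 Gen 1": 5.0,
    "USB 3.1 Gen 2": 10.0,
}

file_gb = 5  # assumed size of an hour-long HD movie, in gigabytes

for name, gbps in SPEEDS_GBPS.items():
    seconds = file_gb * 8 / gbps  # 8 bits per byte
    print(f"{name}: ~{seconds:.0f} seconds")
```

That works out to roughly 83 seconds on USB 2.0 versus 4 seconds on USB 3.1 Gen 2, which is the same ballpark as the one-to-two-minutes and under-30-seconds figures above.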

Supercharged with Thunderbolt 3

So, you’re probably wondering what makes a USB-C port that fast. It’s not so much that it’s oval, or that it’s new; rather, it’s the technology inside it. Late 2015 saw the arrival of the new Thunderbolt 3 standard, built specifically for USB-C ports. It first appeared in select Windows laptops before making its way to the 2016 MacBook Pro and several gaming motherboards.

What Thunderbolt 3 does for USB-C ports is significantly expand their capacity and capabilities. We’re talking faster file transfers, heightened gaming experiences, and the ability to plug in 4K displays for clearer images. Thunderbolt 3 also allows much larger devices to charge at a controlled rate. This is most evident with the MacBook Pro, several high-end Ultrabooks, and most recently, the 2018 iPad Pro.

The charging capacity brought about by Thunderbolt 3 comes down to how USB Power Delivery works. USB Power Delivery standards state the specific conditions each USB standard must meet to power up devices. Early versions of USB deliver only a small amount of power (2.5W), while USB-C allows the full 100W. Basically, you went from powering a mouse and keyboard to charging your entire laptop.
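The wattage jump is simple arithmetic: power is voltage times current, and USB-C with Power Delivery raises both. The pairings below are the commonly cited maximums for each spec.

```python
# Power = volts * amps for each USB power profile.
profiles = {
    "USB 2.0 (5V, 0.5A)": (5.0, 0.5),      # 2.5W: enough for a mouse
    "USB 3.0 (5V, 0.9A)": (5.0, 0.9),      # 4.5W
    "USB-C PD max (20V, 5A)": (20.0, 5.0), # 100W: enough for a laptop
}

for name, (volts, amps) in profiles.items():
    print(f"{name}: {volts * amps:.1f}W")
```

Going from 5V at half an amp to 20V at five amps is a 40x increase in deliverable power, which is why the same port that once topped up peripherals can now charge a MacBook Pro.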

What’s to come for USB-C?

At this point in time, you’re already living in the future that the USB-C port hopes to achieve. Suddenly, you can simply bring a USB-C cable around, plug it into a powerbank, and you can already charge your expensive MacBook. More and more devices are starting to adopt USB-C because of its potential to enhance your tech experience as a whole.

However, people still find it difficult to switch to USB-C, and for good reason. Most devices continue to use a USB Type-A or micro-USB connector, especially gaming controllers and peripherals. And the old ports are, arguably, still more widely available. In the not-so-distant future, though, the USB-C port could even replace the phone’s headphone jack.

The future of USB-C is still uncertain. Companies will keep ironing out the new technology so it can go fully mainstream. Let’s just hope that by the time that happens, there won’t be a USB Type-D yet.

The post C is the key: Explaining USB Type-C appeared first on GadgetMatch.

]]>
No more cords: Wireless charging explained https://www.gadgetmatch.com/wireless-charging-explainer-qi/ Sat, 01 Dec 2018 23:41:47 +0000 https://www.gadgetmatch.com/?p=64406 More and more things are going wireless

The post No more cords: Wireless charging explained appeared first on GadgetMatch.

]]>
A lot of things have gone wireless over the past few years. From internet connections to gaming with your friends, the world is becoming more accessible without the need for physical wires. Over the course of 2018, another aspect of our lives has gone this route: charging one’s device.

Perhaps you’ve already heard of wireless charging and its presence in today’s smartphones, particularly the latest Apple devices. You may have even owned something that could wirelessly charge devices. But, what is wireless charging all about?

Let’s break down the technicalities

Wireless charging is a highly technical concept in the world of electronics. Basically, the way it works is that your charging pad contains coils that give off electromagnetic fields. These fields carry energy with them, which can be converted into electricity to power up the compatible device when placed on the pad. 

There are two ways devices can charge wirelessly: inductive charging and resonance charging. Inductive charging is mostly found in low-power charging devices, ones that require less electricity to power up. This form is limited in range, to the point that your phone only charges while it’s sitting on the pad. Resonance charging, on the other hand, extends the range but transfers less power.
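Resonance charging works by tuning the transmitter and receiver coils to the same resonant frequency, which for an LC circuit is f = 1/(2π√(LC)). Here’s a minimal sketch; the coil and capacitor values are illustrative assumptions that happen to land near the low-hundreds-of-kilohertz range wireless chargers typically operate in.

```python
import math

def resonant_freq_hz(inductance_h, capacitance_f):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Assumed example values: a 24 uH coil with a 100 nF capacitor.
f = resonant_freq_hz(24e-6, 100e-9)
print(f"{f / 1000:.0f} kHz")  # ~103 kHz
```

When both coils resonate at the same frequency, energy transfers efficiently even with some distance between them, which is exactly the property resonance charging exploits.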

Induction charging

Within the last ten years, several non-profit organizations have created and set wireless charging standards for companies to follow. The most popular of which is the Qi standard established in 2008 by the Wireless Power Consortium (WPC). Other standards include the Power Matters Alliance (PMA) standard in 2012, and Rezence by Alliance for Wireless Power (A4WP) from 2012 to 2015.

All about that Qi

As mentioned earlier, the Qi standard is the most popular wireless charging standard in the world. Most of today’s smartphones and peripherals are supported by Qi. It was established in 2008, with smartphones first adopting it in 2012 through the Nokia Lumia 920.

Qi focuses primarily on energy regulation. Most charging pads that use this standard work with flat surfaces for better energy distribution. Chargers with the Qi standard regulate the amount of charge they give to devices, and immediately go on standby once full. These chargers only activate once a device is placed on top, saving on the cost of electricity in the process.
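The regulation described above boils down to a simple rule: deliver power only when a device is on the pad and not yet full. A toy sketch of that logic (the 5W output figure is an assumption for illustration, not a Qi specification):

```python
def charger_output(device_present, battery_pct):
    """Qi-style regulation sketch: standby with no device, stop at full."""
    if not device_present:
        return 0.0  # standby: nothing on the pad, don't waste electricity
    if battery_pct >= 100:
        return 0.0  # fully charged: stop sending power
    return 5.0      # otherwise deliver power (watts, illustrative)

print(charger_output(True, 40))   # 5.0: actively charging
print(charger_output(True, 100))  # 0.0: full, back to standby
```

Real Qi chargers negotiate this over a communication channel between pad and phone, but the standby-when-idle behavior is what saves electricity in practice.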

Magnetic resonance charging

Most smartphone companies have chosen to implement the Qi standard in their latest models. Apart from Nokia, companies like LG and Samsung adopted it beginning with the LG Nexus 4 and Samsung Galaxy S6, respectively. In 2017, Apple adopted the standard with the release of the iPhone 8, iPhone 8 Plus, and iPhone X. The company also planned a charging mat called AirPower that could charge multiple devices at once, but it has yet to launch.

Why do most companies prefer Qi, but some don’t?

The goal of the WPC is to put forward one worldwide standard for wireless charging. The organization developed the Qi standard so that companies can integrate it into their products seamlessly. It’s because of this standard that smartphones align with wireless charging pads through magnets for better charging capacity.

Apart from that, the Qi standard allows for more intelligent control over charging your phone. It can tell if your phone is fully charged and will stop sending electricity to avoid overdoing it. Of course, you’ll be able to maximize the charging capacity of your Qi wireless charger if you’re only charging one device at a time.

Wireless chargers for the Razer Phone 2, Google Pixel 3, and Xiaomi Mi Mix 3

However, some companies recognize that most people own several smart devices. This is where other organizations like the Power Matters Alliance come in. PMA initially based its wireless charging on induction, the same technique Qi uses. Since then, the organization has looked into resonance charging, which removes the range limitation Qi has.

That’s one of the reasons why Samsung, for example, incorporated both Qi and PMA standards into their Samsung Galaxy S6. With resonance charging, devices can be charged a few centimeters away from the pad. This is especially good for people who use their phones while charging. While WPC is looking to incorporate resonance charging into Qi, certain factors and compatibility issues with devices make the standard less effective.

What does the future hold for wireless charging?

With all the talk about standards and devices, there’s no denying that wireless charging is here to stay. There are talks between the WPC and PMA on possibly coming up with just one true standard for all companies to follow. The best part is that it doesn’t stop there.

Both organizations are looking to expand their technologies beyond smartphones and consumer devices. WPC has already done so with furniture retailers like IKEA, bringing wireless charging to office tables and couches. Meanwhile, PMA is looking to introduce wireless charging to restaurants and establishments like McDonald’s and Starbucks through wireless-charging tables. Tech startups are even developing their own hardware for wireless charging over longer distances.

It’s safe to say that the future is definitely bright for wireless charging. Whether companies will start making it a must-have feature for all their products remains to be seen.

Illustrations by MJ Jucutan

The post No more cords: Wireless charging explained appeared first on GadgetMatch.

]]>
Here’s what you need to know about eSIM https://www.gadgetmatch.com/what-is-esim-explainer-iphone-xs/ Sat, 29 Sep 2018 10:12:55 +0000 https://www.gadgetmatch.com/?p=59193 The technology behind Apple's first dual-SIM iPhone

The post Here’s what you need to know about eSIM appeared first on GadgetMatch.

]]>
When Apple first revealed their new iPhone XS and iPhone XS Max, people were expecting something different. While on the outside nothing seems to have changed, the inside is a whole different story. The most notable change is the introduction of eSIM (embedded SIM) technology, something that they’ve done before with the Apple Watch.

But, what is this eSIM? How different is it from the SIM card that you know and love? And does using an eSIM change the game completely?

Let’s talk about the SIM and eSIM

One of the essentials for any phone in the market is a SIM card. Short for Subscriber Identity Module, a SIM card contains key identification and security features from any network carrier. It is used by these networks to identify their consumers and provide mobile connectivity for them — through calls, texts, and access to the internet. SIM cards also allow you to store information when you decide to switch devices every now and then.

eSIM technology, as the name implies, is embedded into the phone itself, yet it keeps the same functionality as a regular SIM. On devices designed with only one SIM card slot, adding an eSIM makes for a virtual dual-SIM machine.

How have regions adopted eSIM?

As mentioned earlier, this isn’t the first time Apple has dealt with eSIM tech. The company initially launched the eSIM on the Apple Watch Series 3 to give it better connectivity on the go. While Apple continues to incorporate eSIM in the newer Watch Series 4, it decided to take things one step further with the iPhone XS and iPhone XS Max.

However, as of writing, only ten countries in the entire world currently support eSIM, mostly those with the proper infrastructure in place. While smartphone companies are looking to incorporate the new technology, the market for it remains relatively small.

The good and bad about eSIM

Like any other new technology, eSIM comes with its own set of benefits and difficulties, especially for those transitioning from the traditional SIM card. With an eSIM built into your phone, you no longer have to go through the hassle of buying a specific SIM card.

Ideally, having an eSIM also allows you to switch between networks easily. Apart from an eSIM-capable phone, it also comes with the needed software to make the switching process faster and easier. In essence, you will be able to free up the allocated SIM card slot for a physical SIM card if your device supports it. This is most helpful when you travel abroad, and you need a local number in that country to access their network.

However, some processes become harder with eSIM. One of them is transferring your number to another phone, which matters if you frequently switch devices. Unlike a traditional SIM card, which you simply move over, you’d have to contact your service provider to activate the number on your new phone. Depending on your provider’s customer service, this can be cumbersome.

Furthermore, if the eSIM in your phone becomes corrupted or damaged in any way, you might need to replace the whole phone. Because the eSIM is integrated inside the device, it can’t simply be pried out when things go wrong. With a traditional SIM card, you’d just replace the card itself.

Are smartphones ready for the eSIM?

The eSIM technology is still in its early stages, and only a handful of devices currently support it. There’s potential for the tech to spread across more devices in the future, even if only a few countries welcome it today. For now, though, most people still rely on traditional SIM cards, given the difficulties of using an eSIM.

In the case of the new iPhones, for example, you can’t create two instances of chat apps on iOS. So even if you have two numbers running at the same time, you’d need a separate phone for another WhatsApp or Viber number, until Apple comes up with a software patch for this.

In the end, the technology’s impact can only be measured once more devices embrace it. But, for now, let’s celebrate how the eSIM gave us the first dual-SIM iPhone and see where the future will take us.

Illustrations by MJ Jucutan

The post Here’s what you need to know about eSIM appeared first on GadgetMatch.

]]>
All filters: Article 13 of the EUCD explained https://www.gadgetmatch.com/eucd-article-13-explainer/ Mon, 24 Sep 2018 05:00:28 +0000 https://www.gadgetmatch.com/?p=58350 Is this the end for memes everywhere?

The post All filters: Article 13 of the EUCD explained appeared first on GadgetMatch.

]]>
If you haven’t been on the web often lately, this may be something that has slipped past your radar. On September 12, 2018, the European Parliament voted to pass a directive that could change the way we approach the internet for years to come. But, consider first that it’s only the initial review, with a final vote happening next year.

What is this directive, and why is the internet involved? Why are people suddenly seeing #Article13 trend on Twitter a few hours after the decision was passed? What’s with this #SaveTheInternet nonsense?

Understanding the copyright directive

The directive at the forefront of this entire debacle is known as the European Union Copyright Directive, or EUCD. The EUCD hopes to harmonize regulations for the protection of intellectual property in the EU. It was first adopted in 2001, following the 1996 World Intellectual Property Organization Copyright Treaty. Earlier this year, another version of the directive was drafted with added articles and stipulations.

Basically, the EUCD seeks to create measures to protect one’s copyright on created content. The range of intellectual property that should be protected include music, videos, images, algorithms/codes, and even software. The directive calls for member countries to enact and implement laws that protect copyright owners. Eventually, such stipulations also reach big companies that operate within the EU.

You might be wondering why there’s an outcry over it in the first place, especially when the directive’s purpose seems clear. Well, there’s one particular part of the EUCD that a lot of people disagree on: Article 13.

The unlucky Article 13

Article 13 of the EUCD isn’t a lengthy piece of reading. The whole article contains three provisions for the implementation of copyright protection on websites that host user-generated content. The directive makes a note that these websites store large amounts of user-generated content, with the main purpose, if not one of its main purposes, of earning profit. Basically, any website that allows you to upload your own content and allows you to earn money from it is affected by the directive.

The article also cites that such websites should create measures such as “effective content recognition technologies,” complaint management systems, and tracking solutions. These measures should be readily available the moment users upload content on the website itself. With such measures taken into account, it allows content creators and service providers to properly engage in discussions should there be a dispute. It’s basically what YouTube Creators is all about.

Websites like YouTube, Twitch, Facebook, and Twitter, as well as streaming apps such as Spotify, Apple Music, and IGTV (when monetization is available) are most likely the article’s main targets. The directive also explicitly states that non-profit service providers and online marketplaces will not be affected. So, Wikipedia and Shopee aren’t affected, don’t worry.

The ongoing debate towards copyright protection

For some people, the EUCD is inherently good for intellectual property protection. They argue that the primary goal of the directive is to protect users from piracy and copyright infringement. Through the EUCD, there will be systems in place that protect music labels, content creators, and publishers from any illegal use of their content online. For these people, users should be held liable for infringement of any kind (memes, remixes, and parodies are a few examples).

Furthermore, the directive affects not only users but also the companies that run these websites. It basically mandates companies to build better content recognition systems, or change their existing systems for stricter copyright protection. If they don’t make adjustments, they will be held liable for any infringement-related issues. To those in favor of the EUCD, Article 13 is simply a needed improvement.

However, others believe that the directive is a little too extreme and could do more harm than good. Leading institutions and companies in the tech industry think the provisions are too vague, leaving them open to interpretation. This opens the door for companies to abuse copyright claims without effective means of intervention. Furthermore, any significant changes to existing systems would be costly to implement.

The bigger picture here is how the directive affects the internet as a whole. Big names in the tech industry argue that it’s an attack on the creative freedom of users. Instead of keeping the internet an open space for creativity, it simply adds more filters and restrictions. Basically, you couldn’t put up an Avengers meme without getting approval from Disney and Marvel Studios first.

So, what happens now?

The EUCD was put in place to protect copyright, a simple and basic goal. There is broad recognition that measures must be in place to uphold copyright, and no denying that big companies have to abide by intellectual property rules or suffer severe consequences for infringement. However, many argue that these measures are both vague and extreme. Not only does the directive infringe on creative freedom, it also makes the whole process costly and rigid.

At the end of the day, everybody wants to protect copyright. The argument over the EUCD is already past the debate on whether protecting copyright is right or wrong. The debate now is whether an open space like the internet should be kept that way or be strictly policed at all costs.

All of these will come into play in January 2019, when the European Parliament casts its vote for or against the directive. If you have the time to read the EUCD, you can access the full document here.

The post All filters: Article 13 of the EUCD explained appeared first on GadgetMatch.

]]>
Play more, charge less: Huawei’s GPU Turbo explained https://www.gadgetmatch.com/huawei-gpu-turbo-gaming-explained/ Sun, 26 Aug 2018 00:34:27 +0000 https://www.gadgetmatch.com/?p=52518 Better visuals without sacrificing battery life?

The post Play more, charge less: Huawei’s GPU Turbo explained appeared first on GadgetMatch.

]]>
Aside from using your phone to call, text, and take pictures, you now have the power to access the internet and play games with others. Instead of limiting yourself to Snake and Bounce, you now have online games such as PUBG Mobile and Mobile Legends.

There’s just one problem: Not all games are playable across all smartphones. With the gaming world now expanding to the mobile scene, you would need a smartphone with the latest hardware and software inside it. Even if that’s not the case, you would need a smartphone that can handle long hours of gaming, as well. It’s an intense fight over what matters to you the most: performance versus efficiency.

Fortunately, the choice shouldn’t be very difficult thanks to Huawei’s latest mobile advancement: GPU Turbo.

What’s GPU Turbo all about?

GPU Turbo processing technology aims to enhance the gaming experience across Huawei’s smartphones. Executives promise that the tech will boost gaming performance while maintaining the phone’s efficiency. This means you can play games on your smartphone without sacrificing much — like battery life, for example.

The technology looks at the graphical capabilities of your phone and adjusts it accordingly, especially for gaming. With GPU Turbo, technologies such as 4D gaming and both augmented and virtual reality (AR and VR) are taken care of. Huawei believes that GPU Turbo will boost graphical performance by 60 percent, and can make even budget phones play graphically intensive games.

Apart from boosting visual performance, GPU Turbo also helps smartphones maximize efficiency. One common problem across all smartphones is that the battery depletes fast while gaming. Pair that with an ineffective cooling solution inside the phone, and your device takes a real beating during play. What GPU Turbo does is extend your phone’s battery life by 30 percent and keep the device relatively cool while playing.

Implications on Huawei Smartphones

One of the key insights Huawei executives received was about consumer demand for a smoother mobile gaming experience. Because people want to play the latest mobile games seamlessly, they would want to buy smartphones that are capable of doing so. Graphical performance should not suffer in the slightest, especially for multiplayer online battle arena (MOBA) and battle royale games.

The fun doesn’t stop there: With Huawei smartphones supporting GPU Turbo, other technologies such as AR and VR get a chance to truly shine. Huawei executives claim that GPU Turbo opens up opportunities for innovations like online shopping through AR or telemedicine through VR. At this rate, in theory, you could have a truly complete smartphone experience on your hands.

As of writing, GPU Turbo will take effect on Huawei’s latest smartphones, like the new Huawei Nova 3 series. However, older smartphones running the latest EMUI will receive the upgrade, as well. (View the list here.)

If you’ve been dying to have the full mobile gaming experience, GPU Turbo is definitely something to watch out for.

Illustrations by MJ Jucutan

The post Play more, charge less: Huawei’s GPU Turbo explained appeared first on GadgetMatch.

]]>
The future for all games: Ray Tracing explained https://www.gadgetmatch.com/nvidia-geforce-rtx-ray-tracing-explained/ Thu, 23 Aug 2018 09:05:44 +0000 https://www.gadgetmatch.com/?p=56139 The magic behind NVIDIA's RTX series

The post The future for all games: Ray Tracing explained appeared first on GadgetMatch.

]]>
NVIDIA seems to have struck gold with the announcement of its brand new graphics cards for gamers. These cards are set to bring the gaming world to unparalleled heights thanks to the technologies behind them. The company calls them the RTX series, and the biggest feature within these graphics cards is real-time ray tracing technology.

But, what is ray tracing technology? What is it about this technology that had NVIDIA wanting to produce a new line of cards to house it? Will it really change the gaming experience as a whole?

Ray tracing in a nutshell

Ray tracing is a rendering technique that uses rays of light to project images or objects onto your screen. These rays determine the colors, reflections, refractions, and shadows that objects possess. The results are more accurate, more realistic images, since each ray is traced back to a light source in the scene. To put it simply, ray tracing mimics the way your eyes capture the world around you.
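At its core, tracing a ray means solving for where it first hits an object. The classic building block is the ray-sphere intersection test, sketched below; a full renderer fires one such ray per pixel and then traces bounces toward the light sources.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic)."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    return (-b - math.sqrt(disc)) / (2 * a)  # distance to the nearest hit

# A ray shot straight down the z-axis hits a unit sphere centered 5 units away.
t = hit_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1)
print(t)  # 4.0: the ray enters the sphere at z = 4
```

Multiply that by millions of pixels and several bounces per ray, and you can see why real-time ray tracing needs dedicated hardware.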

The technology isn’t that new; in fact, it has been used since the 1960s for movies and television. Ray tracing is the main technique behind CGI, where special effects are rendered to recreate realistic backgrounds with accurate coloring. In 2008, Intel demonstrated a version of the game Enemy Territory: Quake Wars that used ray tracing powered by a Xeon Tigerton processor. Today, applications such as Adobe After Effects let you render videos using ray tracing.

Shifting from rasterization to ray tracing

For the longest time, NVIDIA has worked with multiple companies to produce game-grade graphics cards for consumers. The main technology behind these cards is rasterization. In a nutshell, rasterization converts a scene’s 3D geometry into pixels on a 2D screen, then shades those pixels to approximate the reflections and shadows the objects would cast. The technique doesn’t use much processing power to produce high-quality images, so gamers get smooth frame rates alongside realistic image quality.

However, NVIDIA wanted to take things up a notch when producing the next generation of graphics cards for the modern-day gamer. The company wanted to improve the gaming experience by any means, thus bringing in ray tracing in their graphics cards. With ray tracing, colors are more accurate allowing for a more immersive gaming experience — at least that’s how the company explains it. This is clearly seen with their exclusive gameplay of Shadow of the Tomb Raider:

This technology became the backbone of the new RTX graphics cards, which put heavy emphasis on real-time interactions within games. The RTX cards pack greater memory capacity and processing speeds to keep up with the demands of the technology inside them. With NVIDIA’s Turing architecture, these new cards make ray tracing much faster while using fewer computations.

Risks of going for ray tracing

Of course, with new technologies come risks to consider before buying in. First, ray tracing relies on an enormous number of calculations to generate accurate images on your screen. In the past, computers and graphics cards were not powerful enough to produce quality ray-traced images in real time; rendering could take days, weeks, or even months, as with most movies that rely heavily on CGI.

When ray tracing is applied to modern games, graphics cards suffer more. Its computational requirements can exceed what the card’s video memory (VRAM) can handle. How severely depends on how much memory your graphics card includes — and even then, the card consumes more energy than it’s optimized for. These are the risks NVIDIA is constantly trying to address with its new RTX cards.

There is still a lot of work needed to prove that ray tracing is the future of gaming. While the technology promises the most immersive gaming experience yet, it also comes at a heavy cost — and not just to your wallet. Let’s hope the RTX series is worth the wait.


]]>
The importance of artificial intelligence in smartphones https://www.gadgetmatch.com/artificial-intelligence-smartphones-importance/ Mon, 02 Jul 2018 05:41:23 +0000 https://www.gadgetmatch.com/?p=46837 Is this still the future of technology?

The post The importance of artificial intelligence in smartphones appeared first on GadgetMatch.

]]>
Have you ever wondered what smartphone brands actually mean when they tell you that their cameras use artificial intelligence (AI)?

With AI now becoming a significant part of our daily lives, we start to look into how this technology found its way into the market, and see whether or not AI truly is the future.

What is Artificial Intelligence?

Artificial intelligence, or AI for short, is not exactly a new concept in the world of technology. It basically means giving machines human-like intelligence through systems of information and programs built into them.

Machines with AI built in can perform a variety of tasks normally associated with human intelligence, like problem solving, gathering knowledge, and logical reasoning — among others. It’s basically making machines smarter and, in a way, more human-like.

Illustrations by Kimchi Lee

AI has been a part of many devices over the past few years, from smart homes to applications on your smartphone. Companies like Amazon and Google have built smart home devices around assistants such as Alexa and Google Assistant to help people with their day-to-day tasks.

Businesses with an online presence through company websites have also integrated chatbots and online assistance bots that automatically answer customer concerns based on the information given.

How AI found its way to smartphones

Artificial intelligence has often been associated with robots that perform human-like functions at a much faster, more efficient rate — an image heavily portrayed in mainstream media. Through AI, these machines learn about the environment they’re in and carefully adjust to meet the needs of their users. This process is called machine learning.

Nowadays, machine learning isn’t just limited to AI robots that learn what people are doing, but has now branched out to what people are thinking, inquiring about, and saying to other people. AI has slowly made its way into other devices that are much more accessible to us, primarily through the internet.

Machine learning is now incorporated into smart home devices, online video streaming websites like YouTube and Netflix, social media websites such as Facebook and Twitter; basically, the technology behind AI constantly learns more about people, their interests, and day-to-day activities.

The newest members of the AI-integrated device family are smartphones themselves. Companies like Apple and Google have integrated AI into the processors of their flagship phones — the iPhone and Pixel series, respectively. Early 2018 saw most Android smartphone brands integrate AI into their phones to further enhance the user experience; Huawei and ASUS released new flagship lines whose cameras use AI to respond more intelligently to the environment around the user.

It’s quite possible that smartphones could very well lead the transition of all devices towards machine learning and AI in the near future.

Smartphones with AI

As mentioned, two companies have integrated AI into their smartphones to enhance the user experience in a totally different way. One is ASUS, whose recently released ZenFone 5 series features cameras powered by AI. Its shooters focus primarily on taking better photos by adjusting to the environment around you. The ZenFone 5’s AI Photo Learning lets the phone learn how you like your photos and adjust the settings accordingly, so you don’t have to.

Apart from its cameras, the ZenFone 5 series uses AI to boost overall performance. The base model is powered by a Qualcomm Snapdragon 636 processor, which enables full use of the phone’s AI features. AI Boost gives the handset an instant jump in performance when running heavy-duty applications and games, while the phone’s AI also predicts which apps you’ll open next and learns which ones you use regularly.

Another company that integrates AI in its smartphones is Huawei, with the Mate 10 and P20 series. They’re powered by the Kirin 970 processor — which boosts overall performance and efficiency using integrated AI. This means that the phones will adjust to how much you use them and maximize performance every step of the way. They also come with Huawei’s EMUI 8.0 with its own set of AI features such as Smart Screen for multitasking and real-time translation during calls.

Much like the ZenFone 5, the Huawei Mate 10 and P20 phones also have cameras powered by AI. This powers the phones’ dual-lens camera setups for scene and object recognition, automatically adjusting the camera’s settings to suit the situation. Huawei also emphasizes producing professional-grade photos by allowing the AI to adjust the camera’s focus on the subject. That way, you are able to achieve a perfect-looking selfie or portrait — without the need to manually adjust the settings for a long period of time.
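To picture what “AI adjusting the camera” means in practice, here is a deliberately simplified sketch; the scene labels and setting values are hypothetical, not Huawei’s or ASUS’ actual parameters. A classifier detects the scene, and the phone overlays preset tweaks onto its default settings:

```python
# Hypothetical scene presets, for illustration only.
SCENE_PRESETS = {
    "food":     {"saturation": 1.3, "exposure": 0.3},
    "night":    {"iso": 3200, "shutter": 1 / 15},
    "portrait": {"background_blur": True, "skin_smoothing": 0.5},
}

def auto_tune(detected_scene, defaults):
    """Overlay scene-specific tweaks onto the camera's default settings."""
    tuned = dict(defaults)
    tuned.update(SCENE_PRESETS.get(detected_scene, {}))
    return tuned

# The phone's classifier says "food", so saturation and exposure get bumped.
print(auto_tune("food", {"saturation": 1.0, "exposure": 0.0}))
```

The hard part, of course, is the scene recognition itself, which these phones run on dedicated neural processing hardware; the settings overlay is the easy last step.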

What we get from AI

Artificial intelligence opens up many opportunities for technology to process thoughts and insights the way humans do. AI lets machines learn more about us and tailor-fit their processes and capabilities to match, from search engines to smarter applications. Treated properly, AI can deliver better, more efficient ways of dealing with the problems people face almost every single day.

The downside is that AI can also invade one’s privacy, especially through one’s smartphone. Because the technology is constantly learning about its user through his or her devices, the door opens for that data to be retrieved by, quite literally, anyone on the internet.

And because people nowadays reach for their smartphones at every chance, those who truly understand how AI works can abuse that knowledge for personal gain, whether through malicious activities like cyberstalking and cyberbullying or online attacks like hacking and phishing.

The future of AI

2018 is looking like the year of AI with the unveiling of smartphones and revamped smart devices to upgrade the user experience. The possibilities for artificial intelligence are endless, given its wide usage across any available platform.

For now, it’s intelligent smartphone cameras that adjust settings for you, saving the hassle of getting the perfect image. Some time in the future, AI could exist even in a gaming controller or mirrorless camera, adjusting to your needs. However, we have to be aware of the dangers of using AI to its fullest, as it can also amplify our own careless actions.

Indeed, the future is bright for artificial intelligence — as long as we use it for the right reasons.


]]>
Setting the VAR: Football’s newest technology https://www.gadgetmatch.com/var-football-fifa-world-cup/ Wed, 27 Jun 2018 07:10:26 +0000 https://www.gadgetmatch.com/?p=51314 And where better to shine than the 2018 FIFA World Cup

The post Setting the VAR: Football’s newest technology appeared first on GadgetMatch.

]]>
The 2018 FIFA World Cup is in full swing, with the last few matches about to take place en route to the Round of 16 on Saturday. While the world’s greatest football players are taking center stage, another main attraction in the tournament is the football world’s latest technology: the Video Assistant Referee.

The Video Assistant Referee or VAR has been adopted in sports like tennis and rugby, and recently by football leagues such as the English FA Cup and the Bundesliga in Germany. Ideally, the VAR helps make decisions for referees much easier and more accurate — especially for crucial, game-changing calls. But is the technology useful and helpful in every possible way?

Illustrations from FIFA.com

What is the VAR?

The VAR is a video system that feeds information to referees on the pitch through a wireless earpiece. Assistant referees gather the information away from the stadium and forward it to the on-pitch referees when a call is contested. The VAR marks a huge step in football’s digitization and has been in use since last year.

It utilizes goal-line technology that lets the cameras in the stadium scan the pitch continuously. Movement on the pitch is detected from every possible angle, so calls can be made more precisely. Assistant referees inside a control room have access to all these cameras and send a live feed pitchside via tablet should the referees want to review the footage themselves.

The VAR reviews game-changing calls on the football pitch at the time a protest is filed. FIFA lists only four game-changing calls to be considered: goals, penalties, direct red card incidents, and mistaken identity. The VAR checks the validity of these calls and sends the information to the referees. Do note, however, that the referees themselves still have the final decision on what call to make.

The system made its debut at the FIFA Club World Cup in December 2016, in a match between Atletico Nacional and Kashima Antlers. The referee awarded Kashima a penalty after reviewing a play inside the penalty box.

Putting the VAR to work

The 2017-18 season also saw the VAR’s debut in the English FA Cup, but it came with its own set of controversies. During a quarterfinal match between Tottenham and Rochdale, a Tottenham goal was reversed for reasons the VAR left unclear. Germany’s Bundesliga also used the VAR during its latest season, but the system received mixed reactions from players and fans.

In the 2018 FIFA World Cup, the VAR takes center stage as a decision-making aid for referees in the group stage. The first instance was a non-call on a foul by Spain’s Diego Costa in their 3-3 epic against Portugal. Costa sliced through the Portuguese defense to tie the game at 1-1 at the time, but took down Pepe in the process. After the VAR review, the referee stood by his decision to count the goal.

https://www.youtube.com/watch?v=Hf8d06eXdxY

The second instance came in the France-Australia game, when French striker Antoine Griezmann was tackled inside the box yet the referee let play continue. Griezmann had received a pass from Paul Pogba and virtually blitzed through the Socceroos’ defensive line. After the referee reviewed the VAR footage, Griezmann was awarded the penalty, and France went on to win, 2-1.

The third instance was in the Peru-Denmark game, when another penalty was awarded, this time to Christian Cueva of Peru. Denmark’s Yussuf Poulsen tackled the Peruvian in the box, yet the referee let play continue until the incident was reviewed via the VAR. Cueva missed the penalty, however, and Poulsen scored on another possession to give the Danes the win, 1-0.

A VAR too high or too low?

While the VAR has been around for just over a year, it isn’t exempt from both praise and criticism. Many have praised this newest addition to the football world: the VAR adds certainty and legitimacy to referees’ calls during matches instead of letting the same wrong calls stand. With football players and managers focusing on the tiniest details to improve their game, information from the VAR becomes important.

The VAR provides an opportunity for football games to be fair and balanced. Referees now have different vantage points to look at when making calls that ultimately change the outcome of the game. People came to see a quality match wherein the players truly shine, but sometimes the referee’s poor decisions hamper that. In this regard, there is no excuse for not making the right decision with all the video evidence available.

However, a lot of people also have strong feelings against the VAR. While the effort to make the right calls is appreciated, it gets in the way of what makes football so special. When referees call for the VAR — especially on contested goals — fans become anxious instead of jubilant. Usually, fans go into a frenzy the moment the ball goes through the net, no replays needed. All the technicalities can make the game feel artificial.

Some football players and coaches argue that the VAR only confuses fans. Because some stadiums are built without big screens, fans are left unaware of what’s happening when the referee calls for the VAR. Iran’s coach Carlos Queiroz lambasted the use of the VAR for close judgment calls — particularly the offside call against his squad in a loss to Spain. He believes the VAR was put in place to correct obvious mistakes by referees, not debatable calls.

Photo from FIFA.com

Final verdict

The VAR is a fairly new technology in the world of football, and surely, it’s not perfect. It’s a bold take on digitizing the sport, keeping up with today’s technological demands. Because football is decided by people making the right calls at the right time, the VAR becomes an important basis for such calls, and a useful tool for referees to make the right decisions on the pitch.

However, we must be critical about the role the VAR should play during crucial moments in the game. The VAR should help give fans a fair yet exciting football match without draining its spirit. With the Round of 16 coming up, all eyes will be on the VAR and whether it makes the road to the final interesting or not.

At the end of the day, football fans came to see the best players in the world do what they do best, and no amount of technology should get in the way of that.


]]>
Why do Android updates arrive so late? https://www.gadgetmatch.com/why-android-updates-arrive-late-fragmentation/ Fri, 15 Jun 2018 05:58:55 +0000 https://www.gadgetmatch.com/?p=49199 And what Google has been doing to solve it

The post Why do Android updates arrive so late? appeared first on GadgetMatch.

]]>
With new devices popping up left and right, more and more people now have access to the latest Android operating system (OS) and its technologies. From artificial intelligence (AI)-powered cameras to smoother, simpler designs to the user interface, Android has been looking to attract more users to its platform over the past few years.

However, consumers who own or wish to buy cheaper devices that still run older versions of Android wonder whether they’ll get to experience the new updates for themselves — only to realize that it’s the end of the line for their gadgets.

Updates arrive slower, mostly in small parts, and sometimes the entire OS cannot be upgraded any further. The questions Android developers have been facing from consumers within the last few years are these: Why do updates arrive so late, and what is Google doing about it?

The Android way

The Android operating system is one big, open-source platform for developers and manufacturers. This means they are given the liberty to modify the software when building and improving their products. It’s largely through this approach that Android smartphone companies set themselves apart with their unique interfaces.

According to Google’s Android Developers website, 63.2 percent of Android devices in the market run versions older than Android 7.0 Nougat; manufacturers opt to sell devices with much older software because they insist on applying their own Android “skins,” or customized versions of the OS.

Companies such as Xiaomi, Samsung, Huawei, and ASUS customize the Android operating systems to give users a unique experience when using their devices. Xiaomi’s MIUI 10 and Samsung’s Experience bring new features for AI and major redesigns for their latest smartphones. ASUS’ ZenUI offers features that support the gaming capabilities of their smartphones, while Huawei’s EMUI allows you to sync your LinkedIn account to your address book.

Implementing such skins either limits the number of updates a device receives or leaves the gadget unable to upgrade at all. This is how Android fragmentation works, and unfortunately, it’s also why you can’t easily upgrade an older Android device to the latest software.

People were excited when several companies announced which smartphones would receive an upgrade to Android 8.0 Oreo over the past few months. However, only about six percent of devices have the update ready for users either due to delays in the rollout or because of bugs that affected the device’s performance.

Android fragmentation has become a problem for third-party developers, especially those who were hoping to use the newer and more updated software to create better games and utility apps for people. Because of fragmentation, developers are limited to the older and less secure versions of Android, as well as the codes and programs that come with it.

The applications these developers make are not guaranteed to work without encountering problems along the way. The late arrival of updates hampers the developers’ ability to make any changes to their applications, and even put the user’s safety at risk.

Google’s plan of action

At present, Google’s developers have undertaken a number of projects to make updates arrive faster, and all at once, for third-party developers and phone manufacturers.

They came up with pure Android software known as Android One, and they encouraged device manufacturers to create smartphones using the Android One OS. Android One became Google’s standard for manufacturers and developers to use in their new devices and applications. With smartphones incorporating Android One, updates become more regular and can be streamlined across multiple devices all at once.

Android One has been available on a few devices since its initial launch in 2014, from the Cherry Mobile G1 to the Xiaomi Mi A1. However, the pure Android OS disappeared for a while because the software gave manufacturers no freedom to differentiate themselves. Eventually, Android One found its way back to the market, with Nokia spearheading its reintroduction through the likes of the well-received Nokia 7 Plus.

Don’t confuse Android One with Android Go, Google’s cut-down version of its Android OS, however. While Android One is the standard Android software Google wants to apply across all devices, Android Go is designed for entry-level devices. Devices running Android Go will be able to maximize storage options and mobile data management for you, so you will be able to do many things with your phone without worrying about space and data consumption.

The latest experiment: Project Treble

Another project undertaken by Google to address the fragmentation issue is Project Treble. Rather than a user-facing service, Treble is a re-architecture of Android that separates the low-level, device-specific vendor implementation from the core Android OS framework; it applies to devices that ship with Android Oreo out of the box.

With Treble, manufacturers can deliver updates themselves without going through long and expensive processes, because the OS framework can be updated without reworking the vendor code beneath it. It also lets developers create applications using the new codes and programs provided by the Android software.
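The separation can be pictured with a toy sketch; the class names here are invented for illustration and are not Android’s real APIs. The framework depends only on a stable interface to the vendor layer, so it can move to a new release without the vendor code changing:

```python
class VendorImplementation:
    """Device-specific code shipped by the silicon vendor; rarely updated."""

    def capture_photo(self):
        return "raw sensor data"


class AndroidFramework:
    """The OS layer Google updates; depends only on the stable interface."""

    def __init__(self, vendor, version):
        self.vendor = vendor
        self.version = version

    def take_picture(self):
        return f"Android {self.version}: processed {self.vendor.capture_photo()}"


vendor = VendorImplementation()
print(AndroidFramework(vendor, "8.1").take_picture())
# The framework moves to a new release without touching vendor code:
print(AndroidFramework(vendor, "9.0").take_picture())
```

Before Treble, the equivalent of these two layers was entangled, so every OS update forced manufacturers to rework the device-specific parts as well.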

Following Project Treble was the release of the beta version for Android P. Like in previous iterations, Google did this so developers can already work on their own software-specific applications and technologies that fit the profile Android P brings to the table. Of course, the beta version is still only available to a select number of companies working on new devices, but it will be available across all devices once a final version is released.

Initially, Project Treble and Android P Beta were only available on Google’s Pixel phones, but they’ve now branched out to non-Pixel phones, as well. Treble is available for all new devices that have Android Oreo pre-installed, so developers can experience Android P Beta and work around the new software. A list of devices that already support Android P Beta can be found here and on Android’s Developer website.

What’s next for Android?

With Project Treble and Android continuously bringing updates to the platform faster to consumers, Google is hoping to have just one centralized operating system in the future. Over the past year, Google has been working on Fuchsia, designed to be the central operating system that is potentially going to replace both Chrome OS and Android in the near future. Fuchsia is expected to further streamline updates as a way of fighting Android fragmentation.

Android P is still in its beta version as of writing, meaning that Google is getting feedback from companies that have devices already powered or tested using the latest Android software over the past few months. Google is constantly working on better and faster ways for software updates to reach Android devices, provided that such devices have the necessary hardware to accommodate the upgrades.

For third-party developers, Google has even made their services more accessible to older Android devices. Recently, it gave older devices access to the company’s virtual assistant service, Google Assistant, as long as these devices were running at least an Android 5.0 Lollipop system.

With all these developments, it’s safe to say that Google has done what it can to address late-arriving updates. So don’t worry if your phone is still running an older Android OS; Google hasn’t forgotten you.

Illustrations by Yanni Panesa


]]>
Basics of cryptocurrency: Risks and benefits https://www.gadgetmatch.com/cryptocurrency-basics-bitcoin-risk-benefits/ Mon, 23 Apr 2018 00:46:55 +0000 https://www.gadgetmatch.com/?p=44225 Should you buy in on the craze?

The post Basics of cryptocurrency: Risks and benefits appeared first on GadgetMatch.

]]>
For a while, cryptocurrencies became the talk of the town across the internet. People all over the world saw the potential of what is essentially “virtual money,” starting a frenzy of investments, theories, and yes, memes — particularly towards one of the more popular cryptocurrencies, Bitcoin.

But do we really understand the power these cryptocurrencies yield, and how such power can affect the whole world over?

What are cryptocurrencies?

Cryptocurrencies are virtual currencies exchanged online without interference from anyone, not even governments. Secured by cryptography, these currencies are exchanged and recorded through a shared ledger known as a blockchain.

No one regulates the exchanges and no one controls how much of a cryptocurrency is in circulation, but the blockchain keeps every exchange transparent and fair for everyone. Think of it as openly trading a slice of your pizza to a friend for money, with your other friends keeping track of the deal. They make sure you actually have a slice to give, your friend has the money he promised, and that both items really come from each of you and not from someone else.

Because of the creation of numerous cryptocurrencies all over the internet, a virtual market has formed where people interested and invested in these currencies trade among themselves. Groups of people have also made an effort to produce their own cryptocurrency from their computers through cryptomining. Cryptomining, much like regular mining, creates new cryptocurrency tokens (an online version of coins) and adds them to the blockchain to be traded; it’s like printing your own money, except it’s done from a computer and shared online.

In Bitcoin, for example, people who want to contribute to the blockchain and earn a share of the cryptocurrency turn to cryptomining. Despite being one of the primary ways to create and gain Bitcoin, it’s also one of the more expensive, since most mining setups require computers with the most up-to-date hardware and processing speeds. Anyone who wishes to mine would spend a ton of money on the necessary hardware alone — all just to mine their own Bitcoin.
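As a rough sketch of the idea (real Bitcoin mining is vastly more complex and uses specialized hardware), proof-of-work mining boils down to hunting for a nonce that gives a block’s hash a required prefix, with each block committing to the previous block’s hash — which is what chains the records together:

```python
import hashlib

def mine_block(prev_hash, data, difficulty=2):
    """Brute-force a nonce until the block hash starts with `difficulty` zeros.

    A toy stand-in for proof-of-work: the hash commits to the previous
    block's hash and the new data, so tampering with history would force
    all later blocks to be re-mined.
    """
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

nonce, block_hash = mine_block("0" * 64, "alice pays bob 1 slice of pizza")
print(block_hash.startswith("00"))  # → True
```

Raising `difficulty` by one multiplies the expected work by sixteen (one more hex digit must be zero), which is why serious mining burns through so much hardware and electricity.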

Where did the hype come from?

The tail end of 2017 (October to December) saw people get into a frenzy over cryptocurrencies and their perceived value. People started not only to be invested (pun intended) in learning about cryptocurrencies in general; they also searched for “Bitcoin” a whole lot.

With more people understanding cryptocurrencies, investments in these virtual currencies (particularly Bitcoin) increased, expanding the market by a whopping 1,200 percent. Imagine your Facebook post about your dog getting 15,000 shares within two days — that’s how quickly it blew up.

Another phenomenon that contributed to the rise of cryptocurrencies is the initial coin offering (ICO). An ICO is a public, unregulated way of raising funds for cryptocurrency ventures, widely used by startups to bypass the usual fundraising routes for capital; ICOs are much like crowdfunding (think Kickstarter or GoFundMe), except no one controls how the funding goes.

ICOs are usually distributed in Bitcoins; these will be used to start projects or applications that people create but initially have no money to operate. Because people have new ideas and the Internet is one of the faster ways to have the idea develop and spread all over, more and more people would go through ICOs to fund their projects instead of getting bank loans or using their own money.

Effects of cryptocurrencies

The impact of these cryptocurrencies plays out on a grand scale, especially in an economic context. People continually join the hype, so much so that it drives demand for them. Trading cryptocurrencies online is faster than trading on the stock market and easily accessible, since it is unregulated.

As such, governments are pushing for cryptocurrencies as a means for payment to add convenience for customers, especially those with plans to go paperless with their money. The Indian government, for example, is learning to embrace Bitcoin within their monetary system after taking in measures against tax evasion in black markets; they are also looking into regulating Bitcoin and other cryptocurrencies as well in the near future.

The risk of partaking in cryptocurrencies lies in their greatest feature: being an organic, unregulated form of virtual currency. Because no entity controls them — not even governments — these currencies are prone to online attacks, most commonly hacking, which can hamper their growth and slash their value. With so many people trading cryptocurrencies online, the risk from hackers rises significantly, and traders stand to lose more money when worse comes to worst.

Another threat posed by that same feature is that people can abuse the hype around high returns to entice new investors into purchasing tokens. Because no body regulates online trading, scammers take advantage of new investors who aren’t properly guided in the virtual currency market — even though the currency itself is heavily secured by cryptography.

Such schemes make the trade unfair, even with efforts to keep things equal for everyone. One example is the Bitcoin Savings and Trust Ponzi scheme, launched in 2011 and shut down in 2012; its perpetrator, Trendon Shavers, was accused of raising 700,000 BTC — all from new investors who didn’t know any better.

Cryptocurrencies at present

At the moment, Bitcoin remains the top-traded cryptocurrency in the market, valued at US$ 151.1 billion despite its decline over the past few months. Countries are starting either to accept Bitcoin into their national economies or to reject it along with its risks. Litecoin, dubbed an alternative to Bitcoin, hasn’t performed as well over the past month, culminating in a so-far failing venture with digital wallet service Abra. Ethereum, one of Bitcoin’s closest competitors, has risen quickly thanks to its value to customers.

Some countries believe cryptocurrencies can pull them back from total economic collapse and keep them afloat. Venezuela, for instance, released its own cryptocurrency, Petro, after its national currency lost its value. Other struggling nations such as Iran and Turkey are looking to follow suit, but would need enough investment to acquire the equipment for creating their own cryptocurrencies.


Even with the possibility of countries going paperless with their currencies, some still fear the effects of cryptocurrencies and have not wholeheartedly embraced them. Despite the aforementioned efforts from the Indian government to shift to cryptocurrency-based payment methods, the Reserve Bank still deems engaging in cryptocurrencies illegal, to the point of barring banks from dealing in them. Meanwhile, reports of ransomware spreading in the United States and hijacking computers to mine Bitcoin raise security concerns for anyone investing in it.

Should you be worried?

Whether you are currently investing in cryptocurrencies or not, the risks of these virtual currencies will remain as long as people keep pouring money into them. Their value continues to be unstable to this day, especially as the hype slowly dies down and people learn more about cryptocurrencies and their possible (and real) dangers.

The call for would-be investors is to practice caution. Do some research, get to know the terminology used in the world of cryptocurrencies, and read the news reports; with the internet at your disposal, it's better to know what you're getting into before you get into it. And anyone who wishes to create their own cryptocurrency might want to start saving up now for all the hardware.

Should you be worried? Yes, to an extent, but it helps to be prepared.

Illustrations by Yanni Panesa

The post Basics of cryptocurrency: Risks and benefits appeared first on GadgetMatch.

]]>
How Google’s Android Go is different from Android One https://www.gadgetmatch.com/android-one-go-difference-explained/ Thu, 05 Apr 2018 00:00:40 +0000 https://www.gadgetmatch.com/?p=42649 Don't let their names confuse you

The post How Google’s Android Go is different from Android One appeared first on GadgetMatch.

]]>
When shopping for an Android phone, people often ask about the software version. Google keeps these easy to remember by naming them after desserts, in alphabetical order. For example: Android 6.0 is Marshmallow, 7.0 is Nougat, and 8.0 is Oreo, the latest publicly available version. Android P (no dessert name as of writing) is still in the works, so let's not worry about it for now.

Most Android phones don’t get the latest version, though. Usually, only the expensive flagship phones receive them, leaving other affordable devices in the dust. This is where Android One comes into the picture.

What is Android One?

Android One is not new; Google announced it back in 2014 as a platform to bring a smooth Android experience to emerging markets. Android One was made for low-cost, low-spec devices that get major OS updates for two years and security updates for three, ship with the core Google services, and run a stock Android interface.

The introduction of Android One was a relief since manufacturers had been pre-installing bloatware on their phones. Android One phones were known as the “poor man’s Nexus” since they were priced around US$ 150 — you practically got the software support and lag-free performance of Nexus phones for cheap.

The program started to lose its momentum after a couple of years and we barely saw any new devices. Only a few countries got fresh releases.

Android One came back to the spotlight with the announcement of the Xiaomi Mi A1 last year. Google’s newfound partnership with Xiaomi gave us hope for the Android One program. But, this was also when we noticed that Android One isn’t focused on affordable devices anymore.

Android One became a platform for manufacturers to give consumers a pure and fresh Android experience. Nokia made the smart move to make all their new phones — midrange or high-end — embrace Android One. We honestly hope others will follow.

So, what happens with cheap devices now? That’s what Android Go is for.

What about Android Go?

Android Go was originally announced in May 2017, although we didn’t get to see any device running it until Mobile World Congress 2018 in Barcelona. Now referred to as Android Oreo (Go edition), it picks up where Android One left off — well, kind of. It’s a stripped-down version of Android (specifically Android Oreo) built to run on devices with 1GB of memory or even less.

The goal now is to power really cheap devices. Expect them to be priced under US$ 100, or in some cases less than US$ 50. Examples of smartphones with Android Oreo (Go edition) are the ZTE Tempo Go, Nokia 1, and Alcatel 1X, all entry-level devices.

How can Google make sure Android works okay despite the limited hardware?

Every core Android app, from Gmail to Maps to Assistant, has been rebuilt and stripped of extra features. They’re streamlined and now labeled with “Go” (e.g. Gmail Go and Maps Go). To highlight apps that’ll work best with 1GB of memory or less, the Play Store for Android Oreo (Go edition) is tweaked to showcase such apps like Facebook Lite.

Since Android Oreo (Go edition) is designed for truly low-cost phones, it features data management tools for both internal storage and mobile data. To help with limited storage, Android Go is nearly half the size of "stock" Android, which means there's more room for apps, especially if the phone only has 8GB of storage. Go and Lite apps are also 50 percent smaller in file size; some need just 1MB to install. Moreover, the OS helps users save data by restricting background data access.

One thing to note about Android Oreo (Go edition) is that it has no promised updates, hence the specific Oreo label. Perhaps when Android P gets announced, we’ll then have Android P (Go edition).

Conclusion

Let’s be clear: Android Go is not necessarily a replacement for Android One.

Android One is a line of phones defined and managed by Google. Android Go is just software that can run on entry-level devices. Android Go stretches the original purpose of Android One by making sure that the Android OS can run even if your phone is very basic.

Android Go bridges the gap between feature phones and smartphones. Hopefully, if pricing is right, consumers in developing markets will just buy Android Go-powered phones instead of feature phones.

Illustrations by MJ Jucutan

SEE ALSO: What will Android P be called?

The post How Google’s Android Go is different from Android One appeared first on GadgetMatch.

]]>
Where did the 18:9 ratio come from? https://www.gadgetmatch.com/where-did-the-18-9-ratio-come-from/ Thu, 22 Mar 2018 00:02:01 +0000 https://www.gadgetmatch.com/?p=42132 And should you get an 18:9 phone?

The post Where did the 18:9 ratio come from? appeared first on GadgetMatch.

]]>
2018 will be the year of the notched, bezel-less display. Along with it comes a new number that not everyone knows what to make of yet — the 18:9 aspect ratio.

Smartphones are quickly adopting the new 18:9 standard. Since the ratio is still in its relative infancy, though, you've probably heard more about its shorter predecessor, 16:9. Ever since the invention of widescreen TVs, everyone has accepted 16:9 as the industry standard. With 18:9 now on the horizon, should we care that our phones are getting taller?

What is aspect ratio?

First, let’s define what an aspect ratio is. The two numbers describe how big a device’s screen or a piece of media is. Specifically, it compares how wide your screen is relative to its height. The larger number is its height, while the smaller is its width.

For example, the square photos on Instagram have an aspect ratio of 1:1. As a screen gets wider or taller, the corresponding number in the ratio increases. Ratios vary from the old 4:3 to the ubiquitous 16:9 to the new 18:9.
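If you want to check a screen's label yourself, here's a minimal Python sketch (our own illustration, not from the article) that reduces a resolution to its simplest width-to-height ratio:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return width // d, height // d

print(aspect_ratio(1920, 1080))  # (16, 9) -- a Full HD screen
print(aspect_ratio(2160, 1080))  # (2, 1)  -- the shape marketed as 18:9
```

Note that 18:9 reduces to 2:1; phone makers keep the unreduced form so the number reads as a natural step up from 16:9.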

Where did it come from?

The history of aspect ratios has always been intimately linked to the movie industry. Experimentation began with cinematographers trying to perfect their films' vision, and every new experiment gave birth to a new aspect ratio. Sometimes, those experiments grew popular enough to become industry standards.

The world's first documented aspect ratio, 4:3, is also one of the most popular. Remember those bulky CRT monitors you used to have? Those used the 4:3 ratio, which was adapted from old-school cinema. The earliest filmmakers measured their frames by the film negative's perforations (the holes along its edges); the standard 35mm frame was four perforations tall, giving roughly a 4:3 picture.

When television was invented, the world of cinema faced tremendous competition. At the time, TVs also started off with a 4:3 display. Naturally, the homely convenience of a TV placed it at an advantage over the inconvenience of driving to a movie theater. The film industry had to compete.

Going head-to-head with the TV, cinematographers invented wider and taller aspect ratios. From this era, we saw the invention of the 70mm film. These huge ratios could fit more content on the screen. They became so popular that 70mm is still a standard that’s used today.

Our old friend, 16:9, arrived shortly after this boom. With aspect ratios popping up everywhere, there came a need for a standard everyone could follow. Dr. Kerns Powers, an expert at the craft, proposed the 16:9 format as a compromise between the industry's most-used ratios. With this format, you can watch either a TV show or a movie with minimal letterboxing (the black bars on the edges of your screen).
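Powers' compromise wasn't arbitrary: 16:9 (about 1.78) sits close to the geometric mean of the day's extreme formats, 4:3 television and 2.35:1 widescreen film. A quick check in Python (our own illustration):

```python
from math import sqrt

# Extreme formats of the era, as width-to-height decimals:
tv = 4 / 3          # ~1.33, standard television
widescreen = 2.35   # anamorphic widescreen film

compromise = sqrt(tv * widescreen)  # geometric mean of the two
print(round(compromise, 2))         # ~1.77; 16/9 is ~1.78
```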

Its flexibility skyrocketed the ratio into ubiquity, and screens everywhere adopted it as a result. Years later, almost every device released prior to 2017 used a 16:9 display. Even now, 16:9 is the ratio we're most familiar with.

Now, if 16:9 is so effective, where did 18:9 come from?

The birth of 18:9

Because of the massive popularity of 16:9, 4:3 displays became obsolete; today, finding a 4:3 device involves a trip to the nostalgia store. 16:9 began as a compromise, but once everyone adopted it, it became the norm. The feud now was between HDTVs and cinemas.

Hence, the gap between portable 16:9 screens and cinema’s 70mm still exists. The next logical step is to create a compromise between 16:9 and the current 70mm cinema standard.

In 1998, cinematographer Vittorio Storaro solved this by inventing the Univisium film format, or what we now know as 18:9. Seeing the need for a new standard, he pitched 18:9 as a format that could make both cinemas and TVs happy.

At the time of his invention, only a handful of films used his new standard. In fact, most of them, like Exorcist: The Beginning, were his own films. Univisium would lie low for over a decade.

The rise of 18:9

In 2013, Univisium entered a renaissance with hit streaming show House of Cards, which was shot in the format. Having found a new home, Univisium crawled its way into other shows like Stranger Things and Star Trek: Discovery. In some circles, 18:9 was already known as the “streaming ratio.”

With the effectiveness of 18:9 proven, devices started adopting the new ratio. In 2017, the LG G6 and the Samsung Galaxy S8 launched with 18:9 in tow. (The S8 would use a slightly taller 18.5:9.)

After the trendsetters, more phones jumped on the trend. Google, OnePlus, and Huawei would soon adopt the new ratio. Even the Apple iPhone X uses a taller ratio: 19.5:9.

Should you get an 18:9 phone?

As it’s still in its infancy, the usefulness of 18:9 isn’t as apparent. However, the ratio already carries a flurry of benefits for early adopters.

Firstly, an 18:9 phone is future-proofed for a standard that's quickly gaining traction. More shows are using 18:9, and even Hollywood is testing the waters: the 2015 film Jurassic World used the ratio.

Secondly, Univisium optimizes existing smartphone features like split-screen view. With more real estate, two apps can easily share the screen for a true multitasking experience. Even without the feature, phones can display more content in one screen without scrolling.

It's likely that you won't see the benefits until further down the line. The shows that use 18:9 are still too few to call it a true standard, so when you watch a contemporary video on an 18:9 screen, you'll still notice letterboxing. Despite their flexibility, some apps might even have trouble stretching to 18:9.

The vision of 18:9 is still in the future. It will have its growing pains. Critics will even put it down as a fad. However, the new ratio shows a lot of promise in uniting content under one pleasing ratio.

Illustrations by MJ Jucutan

The post Where did the 18:9 ratio come from? appeared first on GadgetMatch.

]]>
Explainer: Differences between Snapdragon processors https://www.gadgetmatch.com/explainer-snapdragon-processors/ Mon, 19 Mar 2018 07:27:28 +0000 https://www.gadgetmatch.com/?p=41108 Let's understand what's inside our phones

The post Explainer: Differences between Snapdragon processors appeared first on GadgetMatch.

]]>
In the world of mobile phones, each device is ranked by the performance of what's powering it. The processor inside your smartphone works constantly to keep everything running.

Today, especially on Android phones and tablets, the most popular of all mobile processors is Snapdragon from Qualcomm. There are several Snapdragon processors out there, and each model number gets more confusing as new variants come out. Let us help you with that.

First, a brief introduction. Snapdragon is a family of system on chip (SoC) products made by Qualcomm for use in a variety of mobile devices such as phones and tablets. It contains not just a central processing unit (CPU), but also a graphics processing unit (GPU), global positioning system (GPS), modems for LTE and Wi-Fi, and whatever is needed to create a complete chip to power a mobile device. Let’s simply refer to it as a processor so we won’t get too technical.

Not all Snapdragon processors are on the same level. Currently, Qualcomm has four Snapdragon platforms, each classified by a three-digit model number. The series tells you what tier (i.e. entry-level, midrange, flagship) a phone belongs to at launch, and knowing each series gives us a quick idea of how the device will fare.
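As a rough mental model (our own sketch, not an official Qualcomm mapping), the hundreds digit of the model number is what signals the tier:

```python
def snapdragon_tier(model):
    """Map a Snapdragon model number to its series tier (2018-era lineup).
    A rough illustration; capability still varies within a series."""
    series = model // 100 * 100  # 625 -> 600, 845 -> 800, and so on
    return {
        200: "entry-level",
        400: "budget/lower-midrange",
        600: "midrange",
        800: "flagship",
    }.get(series, "unknown")

print(snapdragon_tier(625))  # midrange
print(snapdragon_tier(845))  # flagship
```

As the article notes later, this is only a first approximation: a newer 400-series chip can match an older 600-series one.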

Snapdragon 200 series

The Snapdragon 200 series is the entry-level processor range. As of writing, there are five models under the 200 series: 200, 205, 208, 210, and 212. They are found on low-cost phones and other smaller devices that don’t require much processing power. The latest to be powered by these processors is the Nokia 2 which is a cheap Android smartphone for basic functions.

We don’t see many Snapdragon 200 series-powered phones lately due to competition with MediaTek, another SoC maker that’s known to be found on budget Android devices.

Snapdragon 400 series

Moving up the ladder, we have the Snapdragon 400 series. This series bridges the gap between the entry-level and mid-tier. Like with the 200 series, the 400 series is commonly used for budget devices around the US$ 200 range and also faces tough competition with MediaTek’s offerings.

There are a number of models in this series but thankfully, as the number goes up, so do the specifications and performance. Some models differ only slightly, with minor tweaks in speed and modem features. Also, as higher-tier processors advance, lower-tier lines like the 400 series inherit the older high-end features.

Some of the processors in this series power the Huawei Y7 Prime and LG Q6, which both have a Snapdragon 435, as well as the OPPO A71 (2018) and Vivo V7, which have a Snapdragon 450, the latest and greatest in the series as of writing.

Snapdragon 600 series

Many consider the Snapdragon 600 series to be the most well-rounded in Qualcomm’s family. Why? It offers a great balance between performance and cost. Smart buyers would prefer a great midrange phone rather than an expensive flagship which they would replace in a year or two. That’s where the 600 series comes in. It offers far greater performance than the 400 series and inherits the features of a high-tier processor without the added cost.

There are more model numbers that fall under the 600 series, but the most famous of them all is the Snapdragon 625. It was a game changer when it was announced back in 2016 because it brought the efficiency of more expensive processors to cheaper phones. The Snapdragon 625 is still widely used today since it’s a reliable processor and gives budget phones midrange performance.

Since the introduction of the 625, more manufacturers are relying on the 600 series. The latest releases, the Snapdragon 630/636 and 660, are now even up to par with flagship processors from 2016. The newest phones like the Nokia 7 Plus and OPPO R11s have the Snapdragon 660, while the recently announced ASUS ZenFone 5 has the Snapdragon 636 with artificial intelligence (AI) features.

Snapdragon 800 series

The Snapdragon 800 series is Qualcomm's top-tier lineup. Flagship phones use the latest Snapdragon 800 series processor at launch. The 800 series is not as confusing as the others because Qualcomm doesn't release multiple high-tier processors at the same time; it usually announces two per year. In fact, 2017 had only one, the Snapdragon 835, and so far 2018 has brought just the Snapdragon 845.

All the newest features are found on the latest 800 series processor. It uses the latest manufacturing process, highest performing graphics unit, best display tech such as higher dynamic range, and has support for the fastest storage and memory. With the trend of artificial intelligence among mobile devices, the Snapdragon 845 even has a neural processing engine dedicated to AI.

The Snapdragon 800 series has the best and most exclusive features, but they come at a price. Since 800 series processors power flagship phones, they're always expensive to own, with exceptions from the likes of Xiaomi and OnePlus.

Since we’re still in the first quarter of 2018, there aren’t that many phones available with the latest Snapdragon 845 but the list already includes the Samsung Galaxy S9, Xperia XZ2, and ZenFone 5Z. Last year’s Android flagships were all powered by the Snapdragon 835 like the OnePlus 5T, Google Pixel 2 XL, LG V30, and HTC U11+.

Ranking of the processors

At this point, it’s pretty obvious that the 800 series is the best performer of the bunch since it always gets the latest features and advancements in mobile processors. But let’s not belittle the capabilities of the 600 series which vastly improves with every release. Since it’s the next in line, whatever the 800 series has will soon be available to the 600 series. There are even rumors about a 600 series processor based on the same 10nm manufacturing process of the Snapdragon 835/845 which will be a big deal for midrange phones.

The 400 series is there to draw the line between upper-midrange and lower-midrange phones. Gadgets powered by a 400 series processor, especially the latest Snapdragon 450, aren't totally inferior to 600 series-powered devices, though; each year, the 400 series picks up where the 600 series left off the year before. If the phone has a 200 series processor, don't expect much. It's really designed to cover the basics while keeping up with faster LTE speeds.

How the new low-tier processors are catching up to the old mid-tier processors

It may seem easy to rank the processors based on what series they belong to but, as mentioned earlier, lower-tier processors inherit the features of higher-tier processors. Also, a higher number doesn’t always mean better. The best example would be the Snapdragon 625 and the new Snapdragon 450. The Snapdragon 450 was announced a year after the Snapdragon 625, but they are practically the same. The only advantage of the 625 over the 450 is a slightly faster clock speed for marginally better performance.

Then there's the Snapdragon 630 and Snapdragon 652. You'd think the 652 is better than the 630, but it isn't. The Snapdragon 630 is newer, more efficient, and performs better all around. We can't blame you for the confusion: the Snapdragon 652 was formerly known as the Snapdragon 620, and it's Qualcomm that caused the mix-up by renaming older processors.

What about Kirin, Exynos, and MediaTek?

Before we wrap up, let's be clear that Snapdragon is not the only mobile processor on the market. Snapdragon chips might be widely used, but even phone manufacturers make their own: Samsung has Exynos, which powers the Galaxy S9 in some markets, while Huawei is quite loyal to the Kirin processors found on most of its phones.

Both Exynos and Kirin can match the performance of Snapdragon processors, thus making the phone market more exciting for consumers but fragmented for developers. Then there’s also MediaTek that’s quite popular among budget devices. They also have high-tier processors but they’re yet to make a dent in Snapdragon’s share.

Illustrations by Jeca Martinez

The post Explainer: Differences between Snapdragon processors appeared first on GadgetMatch.

]]>
4 electric car myths, debunked https://www.gadgetmatch.com/explainer-electric-car-vehicle-hybrid/ Sat, 24 Feb 2018 07:44:03 +0000 https://www.gadgetmatch.com/?p=38952 What you should know about the car of the future

The post 4 electric car myths, debunked appeared first on GadgetMatch.

]]>
Did you know that the first electric vehicle was invented by Scottish inventor Robert Anderson in 1832? Back then, electricity-powered cars were nothing but curiosities and novelties. Now, electric vehicles are readying themselves to take over the car industry in just a few decades.

As with all revolutionary technology, reception for electric cars is lukewarm at best. Most consumers are still wary of converting to full electric, citing an unstable and uncertain future for the industry.

With the car and fuel industries hanging in the balance, gas car companies have a lot to gain by downplaying the benefits of electric vehicles. And with little reliable information available, unproven myths inevitably pop up. Myths, as always, need debunking, especially as electric cars move to overtake gas car production.

Myth 1: Electric cars are more expensive than gas cars

The cost of an electric vehicle is the most hotly contested aspect of EVs. Admittedly, the world’s most famous electric car, the Tesla Model S, still falls under the luxury car category. The battery-powered car still hovers around the US$ 100,000 range.

Budget-friendlier alternatives are out now, but their price ranges are still a bit more than a conventional car. The Chevrolet Bolt and the Nissan Leaf both cost around US$ 40,000, for example.

Illustrations by Yanni Panesa


Additionally, installing a home charging station adds about US$ 600 on top of that price.

It’s no surprise that most consumers are turned off by the exorbitant costs of EVs. However, the one-time price tag fails to show how much cheaper it is in the long run.

Right now, the cost of one kilowatt-hour (the standard unit for EVs) is well below the cost of one liter of gasoline. As a rough estimate at today's prices, one kWh costs about 20 cents, while one liter of gas costs about US$ 1.

The Nissan Leaf carries a 40kWh battery. Charging it to full will cost 40kWh x US$ 0.20 = US$ 8. Meanwhile, a 40L gas car will cost 40L x US$ 1 = US$ 40. Added with a much steeper maintenance cost, gasoline vehicles will quickly overtake the cost of EVs in the long run. (Of course, actual costs will still vary on usage, real prices, and road conditions.)
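The back-of-the-envelope math above can be written as a tiny Python sketch, using the article's illustrative prices (real costs vary by market and usage):

```python
# Illustrative figures from the article: electricity at US$0.20/kWh, gas at US$1/L
PRICE_PER_KWH = 0.20
PRICE_PER_LITER = 1.00

def full_charge_cost(battery_kwh, price_per_kwh=PRICE_PER_KWH):
    """Cost to charge an EV battery from empty to full."""
    return battery_kwh * price_per_kwh

def full_tank_cost(tank_liters, price_per_liter=PRICE_PER_LITER):
    """Cost to fill a gasoline tank from empty."""
    return tank_liters * price_per_liter

print(round(full_charge_cost(40), 2))  # 8.0  -- topping up a 40kWh Nissan Leaf
print(round(full_tank_cost(40), 2))    # 40.0 -- filling a 40L gas tank
```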

Myth 2: EVs don’t perform as well as gas cars

Don't be fooled. Even if EVs are remarkably silent on the road, they hide powerful motors that are quickly catching up to today's standards of speed.

At their core, gasoline vehicles are inherently inefficient. Their emissions aren't only a contributor to air pollution; they also mean a car wastes a huge portion of its energy as heat, smoke, and other harmful pollutants.

On the other hand, EVs convert up to 62 percent of their stored energy into movement. For comparison, gas cars manage only about 21 percent.

In terms of mileage, EVs can travel up to 193 kilometers on a full charge, adequate for a day’s worth of traveling. However, gas cars still rule the road by hundreds of kilometers more. It’s only a matter of time before EVs catch up, though. The industry-leading Tesla Model S 100D already tops out at 530+ kilometers.

Finally, when it comes to speed, EVs can keep up with traffic just fine. Both the Nissan Leaf and the Chevrolet Bolt, for example, reach speeds of up to 150km/h. And while the more widely available EVs can still be left in the dirt on a straightaway, the Tesla Model X blazes through with a top speed of 250km/h.

Amid all of this, EVs do their jobs quietly. If you’re not paying attention, an EV can sneak up on you from behind. Besides air pollution, EVs avoid noise pollution, too.

Myth 3: Maintaining an EV is more trouble than it’s worth

Both an EV and a gas car take you from one place to the other. EVs just do it with far fewer components. Unlike conventional cars, EVs aren’t frequent visitors to the mechanics. Fewer parts mean fewer components to maintain.

That doesn’t mean that everything is breezy, though. Replacing the battery is a nightmare for your budgeting. For example, a Nissan Leaf replacement battery costs US$ 5,499.

Thankfully, batteries are a lot more durable than you would expect. The Nissan Leaf guarantees a battery life of eight years or 100,000 miles (or approximately 161,000 kilometers). Most electric car brands already offer warranties (including replacements) before their batteries expire. Moreover, electric car batteries are completely recyclable. You might even get a trade-in return for your old battery.

Currently, the only hurdle to maintaining an electric car is the shortage of mechanics who specialize in EVs. On the bright side, by the time you need a thorough repair, the repair industry will likely have evolved to accommodate you.

Myth 4: Electric vehicles are the saviors of the environment

There is no doubt that EVs eliminate the carbon emissions that gas cars will always emit. Even from their construction, EVs carry a design trait that puts them beyond gas cars: They don’t have a tailpipe.

Motor vehicles currently account for a large share of air pollution; some estimates put it as high as 75 percent. With their energy-efficient design, EVs eliminate the pollution caused by tailpipe emissions. Converting to an EV is one of the greenest decisions you can make for the environment.

However, it has its own fair share of gray areas. Critics often share the myth that EVs only displace the emissions from the tailpipe to a coal plant’s smoke stack.

Which is partly true.


On their own, the world’s main methods of producing power are terribly unprepared for a sudden surge in demand. Despite recent developments in renewable energy, coal power is still the world’s leading generator of electricity.

Hypothetically, if everyone in the world adopted EVs right now, coal plants would have to exponentially increase their output, creating more smokestack emissions as a result.

Luckily, the world isn't ready to go full EV yet. Early predictions still date the takeover to 2040, so we have time to shift our energy mix toward cleaner sources like solar, hydro, and nuclear.

In reality, EVs can’t save the world by themselves. The myth that they just displace damage is only half-true. However, the environment can’t survive with 50 percent solutions. It has to rely on us changing our perspectives on energy.

Electric vehicles are the future. But with unchecked energy consumption rates, that future can look quite grim.

SEE ALSO: The Best Car Tech of CES 2018

The post 4 electric car myths, debunked appeared first on GadgetMatch.

]]>
Battle of the reversibles: USB-C vs Lightning connector https://www.gadgetmatch.com/usb-c-vs-lightning-connector-port-explained/ Sat, 10 Feb 2018 02:09:21 +0000 https://www.gadgetmatch.com/?p=37879 Which port is best for your device?

The post Battle of the reversibles: USB-C vs Lightning connector appeared first on GadgetMatch.

]]>
Gone are the days of the peculiar dance of flipping a plug until it fits, thanks to reversible connectors. We're talking about the USB-C standard and Apple's Lightning connector. Both are a boon for consumers, but the two are quite different. And no, it's not a matter of Android versus iPhone.

What is USB-C?

USB-C, technically known as USB Type-C, is the latest and most versatile USB connector to date. If you happen to have a premium phone, you already have a USB-C port for charging and wired connectivity. If you have the latest MacBook or MacBook Pro, it’s the sole type of port on your laptop for wired video and data output, as well as charging. You will find USB-C on most mobile devices nowadays, even laptops, because it’s a standard that anyone can use. But not all USB-C ports and connectors are created equal.

A technical explanation for why they're not all equal: USB-C is just the style of the connector and port; the real capability comes from the technology behind it. USB 3.1 is capable of a 10Gbps data transfer rate, and with USB Power Delivery the port can carry up to 100 watts. USB-C can also support Thunderbolt 3 technology for an even faster 40Gbps transfer. But not all USB-C ports have USB 3.1 or Thunderbolt 3 speeds, especially on mobile phones.

While the older USB connectors we're familiar with were mainly used for storing and transferring files, the new USB-C standard isn't limited to that. It can drive displays with support for full DisplayPort A/V performance up to 8K resolution, and it's backward-compatible with VGA, DVI, and the trusty HDMI as long as you have the right adapters.

Since all USB-C ports and connectors look alike, it’s now harder to distinguish what the port or cable is for. Could it be a power source or for charging? Maybe for high-resolution video? Or high-speed data transfer? You’ll have to know the specifications to be sure.

What is Lightning?

Apple already had a proprietary connector on the early iPhones, but it was the introduction of the Lightning connector with the iPhone 5 in 2012 that made the company's own design popular.

From a cumbersome 30-pin dock connector, Apple moved to a smaller, reversible one that was ahead of its time. Even the common micro-USB port can't compete with the convenience of the Lightning connector. Since it's proprietary, only Apple can use it, and third-party accessory makers have to pay a licensing fee to put it on their products.

The publicly available technical specification of Lightning is pretty limited, but when it first came out, tests showed speeds of up to 480Mbps, the same as the old USB 2.0 standard. In 2015, the iPad Pro showed a faster speed of 5Gbps, but that's still only half of USB 3.1 speeds.
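To put those headline speeds in perspective, here's a rough Python sketch (our own illustration) estimating how long a hypothetical 10GB file would take over each link; real transfers will be slower because this ignores protocol overhead:

```python
def transfer_seconds(file_gb, link_gbps):
    """Rough transfer time: convert gigabytes to gigabits (x8),
    then divide by the link's headline speed. Ignores overhead."""
    return (file_gb * 8) / link_gbps

# Headline speeds mentioned in the article, in Gbps
links = {
    "Lightning (USB 2.0-class)": 0.48,
    "Lightning (2015 iPad Pro)": 5,
    "USB-C with USB 3.1": 10,
    "USB-C with Thunderbolt 3": 40,
}

for name, gbps in links.items():
    print(f"{name}: {transfer_seconds(10, gbps):.1f} s for a 10GB file")
```

At 10Gbps the same file moves in 8 seconds; at Lightning's 480Mbps it takes nearly three minutes.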

What are the significant differences between the two?

It’s easy to differentiate the two based on their appearances. If you’ve ever used or seen an iPhone, you’re already familiar with how the Lightning connector looks with its pins exposed. USB-C looks cleaner and simpler with its symmetrical connector.

Lightning connector (left) and USB-C (right)

Again, USB-C refers to the style of the port and connector rather than the technology it has. It is convenient because it’s reversible and universal. The whole point was to have a single style of connector and port that could run pretty much everything.

The Lightning connector is solely used to connect Apple mobile devices like iPhones, iPads, and iPods to host computers, external monitors, cameras, battery chargers, and other peripherals. You won’t find it on any other device, even MacBooks.

Why is Apple not using the Lightning connector on MacBooks and will USB-C replace Lightning on iPhones?

Will we ever see a Lightning connector on a MacBook? Highly unlikely. But there's a possibility that Apple will adopt USB-C on iPhones soon. Last year's rumors pointed to the iPhone X having USB-C, but it didn't materialize.

With the new MacBooks relying purely on USB-C, an iPhone with USB-C is not far from reality. That’s unless Apple wants to keep the revenue from Lightning connector licensing.

Which is better?

When paired with USB 3.1 or Thunderbolt 3 technology, USB-C is faster, more powerful, and more versatile than Lightning. It's also now widely adopted across consumer technologies, be it on phones, laptops, or other mobile gadgets.

USB-C is the future. Apple has already adopted it on its premium notebooks, a move that frustrated some professionals who use MacBooks, but that's the future we're heading toward. We'll eventually reach a point where we just plug in a cable and it'll simply work. For now, we still need to understand the differences and live with dongles.

Illustrations by MJ Jucutan

SEE ALSO: Why is USB Type-C so important?

The post Battle of the reversibles: USB-C vs Lightning connector appeared first on GadgetMatch.

]]>
Inside the house of tomorrow: Smart home explained https://www.gadgetmatch.com/smart-home-explained-google-assistant-alexa/ Thu, 01 Feb 2018 01:50:54 +0000 http://www.gadgetmatch.com/?p=36437 The future is now

The post Inside the house of tomorrow: Smart home explained appeared first on GadgetMatch.

]]>
Ten years ago, smart homes belonged to the realm of science fiction. Back then, you would only see connected smart devices in TV shows like Black Mirror, rather than at conventions and trade shows.

Today, a connected household isn’t just a working theory; it’s already a reality pushed by the world’s leading tech brands. Common, everyday tasks can now be automated by artificial intelligence or simplified through voice commands.

We are living in a world where every device has a voice, whether it’s Alexa, Siri, Cortana, or the Google Assistant. While some anticipate the curiosities that the future will bring, some fear the rapid changes that a house from the future beckons.

Regardless of how you feel about these futuristic abodes, living in one can still be a mystery, especially for the everyday homeowner. There are layers of tech to wade through. As with every house hunt, it's time to take a tour of the house of tomorrow before you inevitably live in one.

Garage: letting the right one in

As you pull up into the driveway with your electric car, the garage door automatically opens to the sound of your voice. The lights go on to help you park. You climb out of the car and hook it up to the charging station on the wall.

As you head to the front door, the garage door closes behind you. The smart camera above the door detects who you are and notifies your family that you’ve arrived. The locks disengage, and you enter.

The most common elements of a smart home are those seen from the outside. If you live within a gated community, you've seen automatic gates open and close by themselves. You might've also seen electric vehicles roaming the streets already. They may be outside the house, but these machines have become essential to the smart home ecosystem. They've become extensions of your smart house that you can take with you wherever you go.

Beyond the garage, other smart devices are being fitted around your house. The Nest Cam IQ, for example, is a smart security camera that adds an extra layer of protection. It can record in HD, listen in on conversations, and detect familiar faces.

Having an integrated smart security system allows you to enter and exit your home without fussing with keychains and padlocks.

Living room: command center

Entering your house, you kiss your spouse hello, kick off your shoes, and watch a bit of TV before dinner. Just as you plop down onto your couch, you remember that your house security is still disabled. You ask Alexa to turn it on. You rest easy while watching the latest House of Cards episode.

The evolution of the smart home began in earnest with the living room. As it was the central hub of the entire house, the living room also became the center for the Internet of Things. The new smart home ecosystem coordinated everything from the lights to the TV to the security system — right from the comforts of your sofa.

Who hasn’t heard of a smart TV? The industry’s newest TVs integrate the internet to build a more comprehensive entertainment experience. From a device that connects to mere broadcast stations, the TV evolved to access a vast catalog of online entertainment. You could watch Netflix while searching for your favorite recipes on Google. The smart TV became the desktop of your living room.

As people spent more time interacting with their TVs, smart devices started installing themselves around the luxuries of the living room. While you’re watching a movie, you can change everything from the temperature to the lights without standing up or pausing the programming.

The Philips Hue, for example, takes control of your house’s lighting system. That’s not all. The smart bulb automates your lights’ operation for both when you’re in and out of the house. It can even change a room’s hue to set the mood.

Another example is the Ecobee 3 smart thermostat. The automated system optimizes the temperature based on your activity inside and on the temperature outside. Further, it also makes your energy usage more efficient.

Kitchen: robots get hungry, too

As you open your fridge, a voice lists down the food you have in stock. Knowing how much pasta and olive oil you have left, the voice assistant suggests pesto for dinner. You agree. Alexa, then, preheats the oven for the pasta and preps your dishwasher for the oily dishes later.

Despite the oodles of devices inside a kitchen, tech makers are only starting to optimize the room for the smart home. LG, for example, launched a series of devices that assist you even before you start preparing the dish.

Their smart refrigerator catalogs the supplies you have left. It alerts you when you’re short of ingredients and recommends recipes based on what’s inside. Plus, it even has its own entertainment system to get your groove on while you cook.

After you gather all the ingredients, the system passes the recipe down to the appliances you’ll need. A smart oven preheats to fit the temperature you need; a smart dishwasher customizes its spin cycles to wash dishes optimally.

Bedroom: the last frontier

The day is over. Before you drift off to sleep, you remember to charge your devices — iPhone X, Apple Watch — on the wireless charging stand. You set Google Home to wake you up at 7am by playing a Rihanna song.

The bedroom is the last frontier of the home of tomorrow. The bed is the last sanctuary from a life taken over by tech. That, however, won’t last. As early as now, the Internet of Things follows you even to the bedroom.

Wireless charging stations, smart thermostat panels, and security panels pervade our bedrooms, allowing us easy access to how our house works before we call it a night. A smart bed is still forthcoming, but technology is already reaching out before it inevitably comes.

With Google Home and Amazon Echo, voice assistants now lull us to sleep and wake us up in the morning. Alexa, Siri, Google Assistant, and Cortana will become the first and last voices we hear every day. The eerily human voice assistants have already lent their voices to every device in our home.

It’s only a matter of time before our house becomes a machine itself. Whether you embrace the future or shun it, technology will always find a way to make our lives easier. But don’t worry when it comes. All you’ll hear is the soothing voice of Alexa, asking how you want your meat cooked.

Illustrations by MJ Jucutan

The post Inside the house of tomorrow: Smart home explained appeared first on GadgetMatch.

]]>
Why it’s time to finally make the switch to LTE https://www.gadgetmatch.com/smart-4g-lte-upgrade-switch/ Fri, 03 Nov 2017 09:30:10 +0000 http://www.gadgetmatch.com/?p=22412 Local telcos are continuously improving their infrastructure to improve the country’s mobile internet speed. And as mobile technology progresses, so should your mobile device. Do you still remember when 3G was made available in the Philippines in 2006? It was a game changer, because it introduced us to video calling before Skype was a thing […]

The post Why it’s time to finally make the switch to LTE appeared first on GadgetMatch.

]]>
Local telcos are continuously improving their infrastructure to improve the country’s mobile internet speed. And as mobile technology progresses, so should your mobile device.

Do you still remember when 3G was made available in the Philippines in 2006? It was a game changer, because it introduced us to video calling before Skype was a thing and enabled us to browse desktop websites on our mobile phones. But that was 11 years ago, and technology has evolved.

What was fast before is already slow in comparison to what’s new today. If you’re still using a 3G phone, you’re depriving yourself of a better internet experience. It’s now time to upgrade to LTE (Long-Term Evolution), also known as 4G.

If you’re going to jump to LTE, you can check out Smart’s fastest LTE network in the Philippines. According to the latest survey from OpenSignal, Smart is ahead in terms of 4G LTE download speeds with an average of 10.55Mbps or over 3Mbps faster than the competition.

Not only that, Smart is currently re-equipping their cell sites to use low-frequency bands like 700MHz and 850MHz. The use of such bands allows signals to better penetrate walls of houses and office buildings. This means we get better indoor coverage than before. Also, cell towers equipped with lower bands have a wider reach, extending service even to the outskirts of a town, for example.

Smart continues to expand their LTE coverage nationwide — not just in major cities. The plan is to give more than 90 percent of the country’s population access to Smart’s LTE network by end of 2018. And hopefully by then, we’ll also transition to more widespread LTE-Advanced. It’s a promising future to expect, so you better upgrade today.

How to upgrade to LTE?

The first step is to get an LTE phone. Most flagship smartphones fully support LTE, including the new low-frequency bands. Don't worry, fast mobile internet doesn't have to be expensive because there are affordable LTE phones available in the market for as low as PhP 2,488.

When choosing your next phone, it is important to invest in a device that will work best with your network’s frequencies. For the best possible mobile data experience, Smart recommends LTE phones that are compatible with the 700MHz band.

Some examples of 700MHz-compatible phones which are available in the market today are:

But having an LTE phone is not enough. An LTE phone should have an LTE-ready SIM card inside. If you’re still holding on to your old SIM card, it’s time to have it replaced with a new one. The upgrade is free and you can still retain your existing number, even if you’re on prepaid. We all know how important it is to stick to a single mobile phone number.

Not sure if your SIM is already LTE-capable? If you’re a Smart user, you can do a quick check by texting SIMCHECK to 5832. If you’re not yet on LTE, you can upgrade for free at any Smart Store.

If you have an LTE phone with a new LTE SIM card, you can enjoy the best possible mobile internet experience in areas covered by LTE. Make sure you adjust your phone's network settings to use LTE or 4G as the preferred network type.

SEE ALSO: LTE-A Explained



This feature was produced in collaboration between GadgetMatch and Smart Communications.

The post Why it’s time to finally make the switch to LTE appeared first on GadgetMatch.

]]>
Here’s all you need to know about HDR https://www.gadgetmatch.com/heres-need-know-hdr-explainer-tv-smartphone/ Mon, 26 Jun 2017 13:38:34 +0000 http://www.gadgetmatch.com/?p=15478 The screen is the most important part of your smartphone, and serves as both display and interface. Over the past decade or so, we’ve been inundated with selling points like Retina Display, 720p, 1080p, 1440p, and even 4K. The latest buzzword for screens is high dynamic range or HDR, and the latest flagship phones tout […]

The post Here’s all you need to know about HDR appeared first on GadgetMatch.

]]>
The screen is the most important part of your smartphone, and serves as both display and interface. Over the past decade or so, we’ve been inundated with selling points like Retina Display, 720p, 1080p, 1440p, and even 4K. The latest buzzword for screens is high dynamic range or HDR, and the latest flagship phones tout it as a must-have spec.

But what is HDR in the first place?

More colors, more brightness

Put simply, HDR lets your display exhibit a far wider range of colors. How? First, whites are whiter, and blacks are blacker. The screen does this by being much brighter than what you’re normally used to, to the tune of 1,000 nits (a unit used for display brightness). By comparison, your old flagship phone probably topped out at around 500 nits. An HDR display can also illuminate or darken specific areas of the screen, whereas a non-HDR display commonly lights the entire screen evenly.

Second is wide color gamut, or WCG. This feature increases both the color palette (the number of colors available) and the bit depth (the number of shades of those colors). Your old phone could display nearly 17 million colors. By contrast, an HDR display can show over a billion, which is much closer to what you can see in real life. Working in tandem, these two processes enable the screen to show a range of colors more akin to real life. This is why people who have seen an HDR display liken it to looking out a window.
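Those "17 million" and "over a billion" figures fall straight out of bit depth. Here's the back-of-the-envelope arithmetic, assuming an 8-bit-per-channel panel versus a 10-bit one:

```python
# Total displayable colors = (shades per channel) ^ 3,
# for the red, green, and blue channels combined.
def total_colors(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel  # shades each primary color can take
    return shades ** 3              # every possible R/G/B combination

print(total_colors(8))   # 16777216  -> the "nearly 17 million" of standard panels
print(total_colors(10))  # 1073741824 -> the "over a billion" of 10-bit HDR panels
```

The jump from 8 to 10 bits per channel is only 25 percent more data per pixel, but it multiplies the total number of colors 64 times over.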

But there lies the problem. The most difficult issue with conveying what HDR is exactly is that if you don’t actually have access to an HDR-capable display, it’s impossible to show the difference. Conversely, if you have an HDR display and a non-HDR display side by side, the improvements are instantly evident. This technology represents a generational leap, much like the jump from black-and-white to color, or standard definition to high definition.

A feature by any other name

You’ve probably heard the abbreviation “HDR” before as a vaunted feature for smartphones even before it was used for displays. Another source of confusion is that HDR also applies to another selling point for phones: the HDR feature of a camera. In camera HDR, the camera takes many exposures of the same scene and combines them, thereby showing all of the subtle steps in highlights and shadows.

The principle behind HDR displays and HDR photography remains roughly the same, with the end result being an image that has higher contrast and more colors. Crushed blacks and blown-out whites are also eliminated as a result. This is where the phrase "high dynamic range" comes in, representing the difference between the darkest parts of an image and the lightest.

Which gadgets have HDR?

All of this year’s important flagship phones have HDR-enabled displays, including the LG G6 and the Sony Xperia XZ Premium. However, these phones appear to be using different standards for what constitutes HDR — the LG G6 uses both the open HDR10 standard and proprietary Dolby Vision, and Sony’s 4K phone appears to be using its own definition of HDR.

Hopefully, HDR specifications normalize in subsequent generations of phones. You wouldn’t want to get a phone with HDR to find out that your HDR content won’t even display properly on it. Thankfully, at this year’s Mobile World Congress, the Ultra HD Alliance announced Mobile HDR Premium, which is an HDR standard specifically developed for smartphones, tablets, and laptops. The Samsung Galaxy S8 and S8+, two of the best phones of 2017, were the first pair to adhere to this certification.

Here are the specifics of Mobile HDR Premium for a wide variety of portable devices:

Device | Resolution | Dynamic Range | Color Space | Bit Depth
Smartphones (3- to 7-inch screens) | 60 pixels/degree | 0.0005-540 nits | 90% of P3 color gamut | 10
Tablets (7- to 12.9-inch screens) | 60 pixels/degree | 0.0005-540 nits | 90% of P3 color gamut | 10
Laptops (9.5- to 18-inch screens) | 60 pixels/degree | 0.0005-540 nits or 0.1-600 nits | 90% of P3 color gamut | 10

Why do you need HDR?

If you’re after the top-end phones, you’ll find it increasingly difficult to avoid HDR. The benefits that a more accurate display brings is most useful if you’ll be doing image work on your phone, like quickly editing photos on VSCO before sharing it to the ether. But its advantages will also be immediately noticeable if you consume content on your phone (like most people do). More and more media providers are putting out HDR content —YouTube, for one, has had HDR support since last year, and both Netflix and Amazon Video have it as well.

And with diminishing returns in terms of resolution on a five- to six-inch screen (do you really need 4K on a screen as big as your palm?), HDR is one display buzzword that is instantly apparent. Plus, unlike a 4K screen that eats batteries for breakfast, HDR on your phone can actually extend your phone’s longevity; the dynamic and selective adjustment of brightness, depending on what’s shown, should increase your screen-on time by a significant amount.

SEE ALSO: What exactly is Fast Charging? And how does it work?


The post Here’s all you need to know about HDR appeared first on GadgetMatch.

]]>
What exactly is Fast Charging? And how does it work? https://www.gadgetmatch.com/exactly-fast-charging-work/ Mon, 05 Jun 2017 09:15:12 +0000 http://www.gadgetmatch.com/?p=14641 The mention of fast charging technologies for smartphones has become quite common lately. You’ve probably already heard of Qualcomm’s Quick Charge, OPPO’s VOOC flash charge, or OnePlus’ Dash Charge, which can juice up a smartphone’s battery to around 60 percent in just 30 minutes. So, how exactly do they work? Most devices use lithium-ion batteries […]

The post What exactly is Fast Charging? And how does it work? appeared first on GadgetMatch.

]]>
The mention of fast charging technologies for smartphones has become quite common lately. You’ve probably already heard of Qualcomm’s Quick Charge, OPPO’s VOOC flash charge, or OnePlus’ Dash Charge, which can juice up a smartphone’s battery to around 60 percent in just 30 minutes. So, how exactly do they work?

Most devices use lithium-ion batteries

To understand how these technologies work, knowing the basic principle of how a smartphone’s battery gets charged is a must. Most, if not all, smartphones today use a type of battery called lithium-ion (Li-ion). A Li-ion battery is composed of a positive and negative electrode and an electrolyte in between them. The lithium ions inside the battery move from one electrode to another, allowing the battery to be in a charging (storing energy) or discharging (expending energy) state.

The direction of lithium ions determines whether a battery is charging (positive to negative) or discharging (negative to positive).

Battery capacity is measured in milliampere hour (mAh)

Great, we’ve got some background on how Li-ion batteries work! The next question is how exactly do we determine the speed at which a Li-ion battery gets charged. You’re probably familiar with the rating used to gauge the capacity of a smartphone’s battery. If not, it’s the number that uses mAh (milliampere hour) as its unit of measurement. A larger number means larger capacity, which translates to longer battery life.

All else being equal, a 6000mAh battery will last twice as long as a 3000mAh battery. The same applies to charging: the larger the capacity of a Li-ion battery, the longer it takes to fully charge. The amount of current a charger can output is usually the determining factor in how fast a battery charges, which is why a tablet charger that outputs 2A (amperes) will charge roughly twice as fast as a smartphone charger that outputs 1A.

Another important nature of a Li-ion battery is that it doesn’t charge in a linear fashion. It’s easier to charge the battery when it’s nearly empty compared to charging when it’s nearly full. Think of it like packing a bag; it gets harder to put things in as it gets filled.

As mentioned, increasing the current used to charge a battery decreases charging time, but only up to a certain point. A Li-ion battery can only take in so much current, and increasing it past the threshold only results in dissipated energy in the form of heat. Therefore, if you use a tablet charger to charge a smartphone, it usually charges faster but also heats up faster.
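The capacity-versus-current relationship above boils down to simple division. Here's an idealized sketch; real Li-ion charging tapers near full (as described above) and loses some energy as heat, so treat these as optimistic lower bounds rather than actual times:

```python
def ideal_charge_hours(capacity_mah: float, charge_current_a: float) -> float:
    """Idealized charge time: capacity divided by charging current.

    Ignores the taper near full charge and conversion losses, so real
    Li-ion charging always takes longer than this.
    """
    return capacity_mah / (charge_current_a * 1000)  # mAh / mA -> hours

print(ideal_charge_hours(3000, 1.0))  # 3.0 hours on a 1A phone charger
print(ideal_charge_hours(3000, 2.0))  # 1.5 hours on a 2A tablet charger
```

Doubling the current halves the ideal time, which is exactly why the 2A tablet charger feels so much faster on a phone, up to the safe-current threshold.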

Battery charging has evolved through the years

With all these things in mind, we can go back to the question of how fast charging technologies work. As the name implies, they allow rapid charging of a smartphone's battery. This is usually done by increasing the power output of a charger, raising either the voltage or the current it provides to the device. You might ask whether it's safe to increase the amount of power we pump into our devices: on its own, it wouldn't be, but with the right hardware for monitoring power output and temperature, things become much safer.

Smartphones nowadays are smart when it comes to charging. Most devices today have a built-in chip for monitoring battery temperatures and the amount of power going through as the phone charges. This allows the smartphone to intelligently lessen or stop receiving power from the charger once the battery is full or if the battery gets too hot. That’s why when you leave your phone to charge, you’ll notice the charger and the battery heat up while charging, and once they’re done, both will stop heating up.

Taking things further are these new fast charging technologies that can provide more than half of a battery’s capacity in less than an hour. They work by pushing as much power as the device can handle to ensure the battery is charging at its maximum rate. As mentioned earlier, when a battery is at a low capacity, it’s easier to charge since the lithium ions have more freedom to move. This nature is what Qualcomm and other manufacturers take advantage of for faster charging.

Qualcomm’s Quick Charge gets better every year 

Qualcomm’s Quick Charge technology leverages on different power outputs — mostly voltage adjustments — for the charger, depending on the current battery capacity of the device. Thanks to the special chip installed on both the device and charger, the latter can actively adjust the power output depending on the device’s needs. So, at lower capacities, it delivers the highest power rating the device can safely handle, and as the battery gets more juice, the device communicates with the charger and tells it to provide less power.

Ever since Quick Charge was introduced, Qualcomm has continued its development and currently has five iterations: Quick Charge 1.0, 2.0, 3.0, 4.0, and just recently, 4+. Here’s a table to summarize what the first four iterations of Quick Charge are capable of:

Quick Charge Version | Voltage | Current | Power (Watts)
1.0 | 5V | 2A | Up to 10W
2.0 | 5V, 9V, 12V | 2A, 2A, 1.67A | Up to 18W
3.0 | 3.2V to 20V, in dynamic 200mV increments | 2.6A, 4.6A | Up to 18W
4.0 | Dynamic | Dynamic | Up to 28W
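The wattage column is simply voltage multiplied by current. A quick check of the Quick Charge 2.0 modes listed in the table:

```python
# Power (watts) = voltage (volts) x current (amperes).
qc2_modes = {"5V/2A": 5 * 2.0, "9V/2A": 9 * 2.0, "12V/1.67A": 12 * 1.67}

for label, watts in qc2_modes.items():
    print(f"{label} -> {watts:.1f}W")
# The 9V mode is what delivers the headline 18W; the 12V/1.67A pairing
# actually works out to about 20W on paper, before real-world limits.
```

The same arithmetic explains the other rows: 5V at 2A is the 10W ceiling of Quick Charge 1.0, and higher-wattage versions get there by raising voltage rather than current.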

Quick Charge 4.0 builds on the success of QC 3.0 by adding new features: compliance with USB Type-C and USB Power Delivery; a newer version of Intelligent Negotiation for Optimum Voltage (INOV), allowing the device to determine the optimum power level to request from the charger; and the inclusion of Dual Charge, which adds a secondary power management chip in the device for better thermal dissipation and more efficient charging.

Even though few smartphones supporting QC 4.0 have been released, Qualcomm has already launched an update, version 4.0+. It further improves the Dual Charge feature of its predecessor with the addition of Intelligent Thermal Balancing, which eliminates hot spots by moving current through the coolest path available during charging. Building on the already robust safety features of QC 4.0, this update goes one step further by also monitoring the temperature levels of the case and connector. The added layer of protection helps prevent overheating and short-circuit damage.

High-current charging for OPPO and OnePlus

Being sister companies, OPPO (with VOOC) and OnePlus (with Dash Charge) use the same method to charge faster: providing high amounts of current (around 4A) while charging. The current level tapers off as the device charges up. Again, thanks to the special chips installed in the device and charger, OPPO and OnePlus devices supporting these technologies can charge faster.

Quick Charge and VOOC/Dash Charge may both be fast charging technologies, but they have some differences. Quick Charge mainly leverages higher voltages, while VOOC and Dash Charge use high-current charging. OPPO and OnePlus also made sure the charger absorbs the bulk of the heat generated while charging, which is not the case for Qualcomm's Quick Charge, where both the charger and the device heat up.

Because the phone itself doesn't heat up as much, OPPO and OnePlus devices can be used while fast charging without any issues. One caveat: OPPO and OnePlus' fast charging technology is proprietary, which means you'll need the charger and cable that came with your device to use it.

Samsung has its own Adaptive Fast Charging technology

If you own a recent Samsung device, you’re probably familiar with Adaptive Fast Charging. This is essentially the same as Qualcomm’s Quick Charge technology, since Samsung acquired the license from Qualcomm to use its technology on devices that have non-Qualcomm processors. This means a Quick Charge adapter can be used on a Samsung device that features Adaptive Fast Charging and vice versa.

Fast Charging requires specific hardware

Keep in mind that to make use of such tech, you’ll need a smartphone that supports a fast charging technology and a certified charger and/or cable. If you’re using a higher-end phone that’s been released in the last couple of years, chances are your handset supports fast charging.

Summing things up: Fast, quick, or rapid charging, whatever manufacturers call it, is technically just a smarter form of charging that takes advantage of how Li-ion batteries work. With the prerequisites in place (a compatible smartphone and charger), you won't be stuck near a wall outlet for hours just to get an ample amount of energy into your device. Until better battery technology comes along, fast charging might be the only solution we have for a while.

Illustrations: Kimchi Lee

SEE ALSO: Why is USB Type-C so important?


The post What exactly is Fast Charging? And how does it work? appeared first on GadgetMatch.

]]>
This is how SIM cards work https://www.gadgetmatch.com/sim-card-explainer/ Fri, 31 Mar 2017 14:14:41 +0000 http://www.gadgetmatch.com/?p=11895 We don’t think about SIM cards as much as we used to — like back when they were required to start up a phone — but these tiny chips are still essential in keeping a modern smartphone wirelessly connected at all times. This brings us to a question we should be asking: How exactly does […]

The post This is how SIM cards work appeared first on GadgetMatch.

]]>
We don’t think about SIM cards as much as we used to — like back when they were required to start up a phone — but these tiny chips are still essential in keeping a modern smartphone wirelessly connected at all times. This brings us to a question we should be asking: How exactly does a SIM card work?

Subscriber Identity Module

We all talk about SIM cards without necessarily knowing what the name means. As you'd expect, it's short for Subscriber Identity Module, and it does exactly what it implies: provide cellular networks with your identity in order to establish a secure connection.

They come in different sizes, with nano-SIMs becoming the new standard because of how small they are, allowing phone manufacturers to leave more space for other vital components in their products. What’s also minuscule is their storage capacity, often limited to 256KB. That, however, is more than enough to store all the important pieces of information, including the identification numbers.

The four universal SIM card sizes

Without getting too technical (and there is a lot to take in when it comes to authentication protocols), every SIM card holds a unique 64-bit number that identifies the device it's attached to with the cellular network. Being a 64-bit number, there are more than enough possible combinations for trillions and trillions of subscribers — so, no, you can't buy all the SIM cards in the world and run out of phone numbers to use.
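To put that 64-bit figure in perspective, here's the arithmetic (the eight-billion world population is just a round number used for scale):

```python
# A 64-bit identifier has 2^64 possible values.
combinations = 2 ** 64
print(combinations)  # 18446744073709551616 possible identifiers

world_population = 8_000_000_000  # round figure, for illustration only
print(combinations // world_population)  # roughly 2.3 billion SIMs per person
```

Even if every person on Earth held billions of SIM cards, the identifier space wouldn't be exhausted.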

Connecting to a network

As soon as you turn your phone on with a SIM card inside, it’ll communicate with the network carrier to establish a connection. Once its unique number along with a security authentication key is sent to a nearby tower, the service provider will send encrypted information back in hopes of a match. If your SIM card is able to decrypt the randomized code successfully, your handset will receive its well-deserved wireless signal.
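The exchange just described is a classic challenge-response protocol. Below is a toy sketch of the idea in Python. Real SIMs use the GSM/3GPP algorithms (COMP128, Milenage) rather than HMAC-SHA256, which merely stands in here as the keyed function, and `ki` models the secret key shared between the card and the carrier:

```python
import hashlib
import hmac
import os

def sign_challenge(secret_key: bytes, challenge: bytes) -> bytes:
    # Keyed response function. Real SIMs use COMP128/Milenage, not HMAC,
    # but the shape of the exchange is the same.
    return hmac.new(secret_key, challenge, hashlib.sha256).digest()

ki = os.urandom(16)  # secret key held by both the SIM and the carrier

# 1. The network sends a random challenge over the air.
challenge = os.urandom(16)

# 2. The SIM computes a response with its secret key. The key itself
#    never leaves the card; only the response is transmitted.
sim_response = sign_challenge(ki, challenge)

# 3. The carrier computes the expected response and compares. Only a SIM
#    holding the right key produces a match.
authenticated = hmac.compare_digest(sim_response, sign_challenge(ki, challenge))
print(authenticated)
```

Because each challenge is random, a captured response is useless for the next login attempt, which is what makes the scheme safe to run over the air.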

This process ensures only your SIM has the capability of figuring out the encryption and letting you use the mobile number you were provided with — but that’s just for the service itself. What if someone physically gets a hold of your SIM card and decides to use it on his or her phone? That’s where the personal identification number (PIN) and personal unblocking code (PUK) come in.

Phones like the Huawei Mate 9 can accept two SIM cards at the same time

By adding a PIN to your SIM, a four- to eight-digit code will be required to turn it on. If someone tries to break in using brute force, the SIM will be locked after three unsuccessful attempts; the PUK, which can be found on the card that comes with every SIM, is needed to release the lock.

The future of SIM cards

What makes SIM cards so popular is how easy they are to transfer from one device to another. Some companies, however, believe we can take this a step further with embedded SIM cards (e-SIM for short). As you can tell by the name, this form would embed itself in a device and can’t be swapped for another. Doesn’t this defeat the purpose of traditional SIMs? Not at all.

By being integrated into the hardware, you can change your network carrier without replacing a physical card. Instead, all you have to do is jump into the phone or tablet's interface and select a carrier from there. Apple has already begun implementing this technology on its iPads, and Samsung's Gear smartwatches have been following suit.

While it may sound like a pain to adjust to yet another new standard, this technology could lead to even slimmer devices and greater convenience once more companies get on board. The hassles of international roaming would also become a thing of the past, since selecting your preferred service would only take a few taps on your screen.

SEE ALSO: What’s the difference between RAM and internal storage?


The post This is how SIM cards work appeared first on GadgetMatch.

]]>
What’s the difference between RAM and internal storage? https://www.gadgetmatch.com/whats-the-difference-between-ram-and-internal-storage/ Mon, 27 Mar 2017 15:45:22 +0000 http://www.gadgetmatch.com/?p=11696 RAM? Memory? 32 gigabytes of internal storage you can expand using a microSD card? What does this all mean?! We admit to throwing around lots of techie jargon when we talk about smartphones and computers, but we’ll now take a step back to talk about what RAM does and how it differs from typical data […]

The post What’s the difference between RAM and internal storage? appeared first on GadgetMatch.

]]>
RAM? Memory? 32 gigabytes of internal storage you can expand using a microSD card? What does this all mean?!

We admit to throwing around lots of techie jargon when we talk about smartphones and computers, but we’ll now take a step back to talk about what RAM does and how it differs from typical data storage.

Random Access Memory

When we talk about RAM, we’re referring to Random Access Memory, which is often just called memory. Practically every task you perform, whether it be opening a web browser or camera app, has the gadget’s processor temporarily store data in the memory while it’s in use; when a device shuts down or restarts, the entire memory clears up.

So, why have something that doesn’t keep data for long periods of time? RAM serves an important purpose in every computing device, and that’s to speed up the system. By having its own little space for dumping and retrieving data at rapid rates, the processor doesn’t have to access the system’s larger, slower internal storage.
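The fast-path idea above can be shown with a toy Python sketch in which a dictionary plays the role of RAM, caching data in front of slower storage. The file names and the artificial delay are made up purely for illustration.

```python
import time

# Toy model: RAM as a fast cache in front of slower internal storage.
STORAGE = {"photo.jpg": b"...", "song.mp3": b"..."}  # "internal storage"
ram_cache = {}                                        # "RAM"

def read_file(name):
    if name in ram_cache:        # fast path: data already in memory
        return ram_cache[name]
    time.sleep(0.01)             # pretend storage is much slower
    data = STORAGE[name]
    ram_cache[name] = data       # keep a copy in RAM for next time
    return data
```

The first read of a file pays the slow storage penalty; every repeat comes straight out of the cache. And just like real RAM, `ram_cache` vanishes the moment the program ends.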


Here are four sticks of RAM inside a desktop computer

Using RAM is a lot like going through the smartphone in your pocket to quickly find information, rather than doing a search on the desktop computer nestled in your home. It’s all about speed and convenience; that’s why it’s needed.

In theory, the more RAM you have, the better, since it allows your device to store more temporary data at once. At the same time, having too much memory can be a bad thing: it’s terribly inefficient to make your smartphone or computer constantly sift through unused space to find just one piece of data.

A Huawei executive recently claimed that 4GB (gigabytes) of memory is more than enough on a smartphone (for now), and is the reason why the P10 doesn’t have as much RAM as, say, the generous 6GB of the OnePlus 3T. He added that it’s more beneficial to have lots of internal storage than excessive memory. Let’s delve into that next.

Internal Storage

This is as straightforward as it gets: More storage means more space for all your personal files, apps, and operating system. Unlike RAM, you can always expand your storage to fit more files. In the case of smartphones and tablets, you can extend the capacity by inserting a microSD card or accessing your favorite cloud services, such as Google Drive or Dropbox.

For laptops and desktop computers, you always have the option to add another HDD or SSD, or simply plug a flash drive into an available USB port. As long as you make use of all that space, there’s no need to hold back on purchasing more.


For smartphones, the RAM, storage, and processor can all be found on a tiny SoC (System on a Chip) like the Qualcomm Snapdragon 835

And so, while RAM and storage are measured in similar ways, they have distinctly different purposes in a system. More importantly, they must work together — along with the device’s heart, the processor — to keep your gadget running as smoothly as possible.

SEE ALSO: SSD and HDD: What’s the difference?


Image credit: Ram Joshi


]]>
Why is USB Type-C so important? https://www.gadgetmatch.com/usb-type-c-important/ Mon, 06 Feb 2017 12:07:41 +0000 http://www.gadgetmatch.com/?p=9952 Over the past decade, devices using the Universal Serial Bus (USB) standard have become part of our daily lives. From transferring data to charging our devices, this standard has continued to evolve over time, with USB Type-C being the latest version. Here’s why you should care about it. First, here’s a little history Chances are […]

The post Why is USB Type-C so important? appeared first on GadgetMatch.

]]>
Over the past decade, devices using the Universal Serial Bus (USB) standard have become part of our daily lives. From transferring data to charging our devices, this standard has continued to evolve over time, with USB Type-C being the latest version. Here’s why you should care about it.

First, here’s a little history

Chances are you’ve encountered devices with a USB port, such as a smartphone or computer. But what exactly is the USB standard? Simply put, it’s a communication protocol that lets devices talk to one another through a standardized port or connector. It’s basically what a common language is for humans.

Here’s an example of a USB hub that uses Type-A connectors (Image credit: Anker)

When USB was first introduced to the market, the connectors used were known as USB Type-A. You’re likely familiar with this connector; it’s rectangular and can only be plugged in a certain orientation. To be able to make a connection, a USB Type-A connector plugs into a USB Type-A port just like how an appliance gets connected to a wall outlet. This port usually resides on host devices such as computers and media players, while Type-A connectors are usually tied to peripherals such as keyboards or flash drives.

There are also USB Type-B connectors, and these usually go on the other end of a USB cable that plugs into devices like a smartphone. Due to the different sizes of external devices, there are a few different designs for Type-B connectors. Printers and scanners use the Standard-B port, older digital cameras and phones use the Mini-B port, and recent smartphones and tablets use the Micro-B port.

Samples of the different USB Type-B connectors. From left to right: Standard-B, Mini-B, and Micro-B (Image credit: Amazon)

Specifications improved through the years

Aside from the type of connectors and ports, another integral part of the USB standard lies in its specifications. As with all specifications, these document the capabilities of the different USB versions.

The first-ever version of USB, USB 1.0, specified a transfer rate of up to 1.5Mbps (megabits per second), but this version never made it into consumer products. Instead, the first revision, USB 1.1, was released in 1998. It’s also the first version to be widely adopted and is capable of a max transfer rate of up to 12Mbps.

The next version, USB 2.0, was released in 2000. This version had a significantly higher transfer rate of up to 480Mbps. Both versions can also be used as power sources with a rating of 5V, 500mA or 5V, 100mA.

Next up was USB 3.0, which was introduced in 2008 and defines a transfer rate of up to 5Gbps (gigabits per second) — that’s a tenfold increase from the previous version. This feat was achieved by doubling the pin count. To make them easier to spot, these new connectors and ports are usually colored blue, compared to the usual black or gray for USB 2.0 and below. USB 3.0 also improves upon its power delivery with a rating of 5V, 900mA.

In 2013, USB was updated to version 3.1. This version doubles what USB 3.0 was capable of in terms of bandwidth, as it’s capable of up to 10Gbps. The big change comes in its power delivery specification, now providing up to 20V, 5A, which is enough to power even notebooks. Apart from the higher power delivery, power direction is bidirectional this time around, meaning either the host or peripheral device can provide power, unlike before, when only the host device could provide power.

Here’s a table of the different USB versions:

Version       Bandwidth          Power Delivery               Connector Type
USB 1.0/1.1   1.5Mbps / 12Mbps   5V, 500mA                    Type-A to Type-A, Type-A to Type-B
USB 2.0       480Mbps            5V, 500mA                    Type-A to Type-A, Type-A to Type-B
USB 3.0       5Gbps              5V, 900mA                    Type-A to Type-A, Type-A to Type-B
USB 3.1       10Gbps             5V up to 2A; 12V up to 5A;   Type-C to Type-C, Type-A to Type-C
                                 20V up to 5A
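A quick way to feel the jump between these versions is to work out how long a 3GB file would take at each theoretical peak rate, plus USB 3.1’s maximum wattage. These are spec-sheet maximums; real-world transfers are always slower.

```python
# Transfer time for a 3GB file at each USB version's theoretical peak.
# 1 gigabyte = 8,000 megabits, so 3GB is 24,000 megabits.
FILE_MEGABITS = 3 * 8 * 1000

rates_mbps = {
    "USB 1.1": 12,
    "USB 2.0": 480,
    "USB 3.0": 5000,   # 5Gbps
    "USB 3.1": 10000,  # 10Gbps
}

for version, rate in rates_mbps.items():
    print(f"{version}: {FILE_MEGABITS / rate:.1f} seconds")

# USB 3.1 power delivery at its top profile: 20 volts x 5 amps
print(f"Peak power: {20 * 5} watts")  # 100 watts
```

That works out to over half an hour on USB 1.1, 50 seconds on USB 2.0, and just a few seconds on USB 3.0/3.1 — and the 100W peak is why USB 3.1’s power delivery can feed a notebook.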

Now that we’ve established the background of how USB has evolved from its initial release, there are two things to keep in mind: One, each new version of USB usually just bumps its transfer rate and power delivery, and two, there haven’t been any huge changes regarding the ports and connectors aside from the doubling of pin count when USB 3.0 was introduced. So, what’s next for USB?

USB Type-C isn’t your average connector

After USB 3.1 was announced, the USB Implementers Forum (USB-IF), which manages the USB standard, followed it up with a new connector, USB Type-C. The new design promised to fix the age-old issue of orientation when plugging a connector into a port. There’s no “wrong” way to plug in a Type-C connector since it’s reversible. Another issue it addresses is how older connectors hinder the creation of thinner devices, which isn’t the case for the Type-C connector’s slim profile.

Here’s what a USB Type-C connector looks like. Left: Type-A to Type-C cable, Right: Type-C to Type-C cable (Image credit: Belkin)

From the looks of it, the Type-C connector could become the only connector you’ll ever need in a device. It has high bandwidth for transferring 4K content and other large files, as well as power delivery that can run even most 15-inch notebooks. It’s also backwards compatible with previous USB versions, although you might have to use Type-A-to-Type-C cables, which are becoming more common anyway.

Another big thing about USB Type-C is that it can support different protocols in its alternate mode. As of last year, Type-C ports are capable of outputting video via DisplayPort or HDMI, but you’ll have to use the necessary adapter and cable to do so. Intel’s Thunderbolt 3 technology is also listed as an alternate mode partner for USB Type-C. If you aren’t familiar with Thunderbolt, it’s basically a high-speed input/output (I/O) protocol that supports the transfer of both data and video on a single cable. Newer laptops have this built in.

A USB Type-C Thunderbolt 3 port (with compatible dock/adapter) does everything you’ll ever need when it comes to I/O ports (Image credit: Intel)

Rapid adoption of the Type-C port has already begun, as seen on notebooks such as Chromebooks, Windows convertibles, and the latest Apple MacBook Pro line. Smartphones using the Type-C connector are also increasing in number.

Summing things up, the introduction of USB Type-C is a huge step forward when it comes to I/O protocols, as it can support almost everything a consumer would want for their gadgets: high-bandwidth data transfer, video output, and charging.

SEE ALSO: SSD and HDD: What’s the difference?



]]>
SSD and HDD: What’s the difference? https://www.gadgetmatch.com/ssd-hdd-whats-difference/ Fri, 20 Jan 2017 02:02:24 +0000 http://www.gadgetmatch.com/?p=9623 For the past few years, solid-state drives (SSDs) have become quite popular in the computing world, mostly because of how fast they are compared to hard disk drives (HDDs). So, what exactly sets an SSD apart from an HDD? Nowadays, computers use non-volatile medium for storage, which means data that’s stored in it doesn’t get […]

The post SSD and HDD: What’s the difference? appeared first on GadgetMatch.

]]>
For the past few years, solid-state drives (SSDs) have become quite popular in the computing world, mostly because of how fast they are compared to hard disk drives (HDDs). So, what exactly sets an SSD apart from an HDD?

Nowadays, computers use non-volatile media for storage, meaning the data stored on them isn’t lost once the computer shuts down. Storage for modern-day computers and notebooks has been handled by hard disk drives for the longest time, and it’s only now, with SSDs becoming more affordable, that consumers are seeing a different storage medium in their computers.

Hard disk drives have mechanical parts

If you aren’t familiar, hard disk drives store data on circular disks, often called platters, made of aluminum, glass, or ceramic and coated with a magnetic layer. Since these platters are responsible for holding the data, the storage capacity of an HDD depends on how many platters it has.

The big disks are the platters and the arm hovering above is the actuator arm.

When the computer’s processor sends out instructions to read and write data, the motor on the drive moves the actuator arm across the platter. At the end of the actuator arm are the read/write heads which are made up of tiny magnets responsible for reading data already stored on the platter or writing new data on the empty spaces on the platter. The combined movement of the actuator arm and the rotation of the platter allows the computer to read and write data, which is kind of like the arm of a record player touching a vinyl record to play music.

Having all these moving parts means an HDD’s read and write speed is dependent on how fast the platters can rotate and how fast the actuator arm can track locations on the platter. These parts can only move up to a certain speed or else they’ll break down, and nobody wants a broken storage device. As with all mechanical parts, heat and noise are by-products of their movements, which is why an HDD can become hot and/or noisy during operation.

Solid-state drives have no moving parts

As its name suggests, an SSD is a drive that uses a type of solid-state storage called flash memory, which is also a non-volatile storage medium, to store and retrieve data. Each flash memory chip on an SSD’s circuit board contains memory cells made up of floating-gate transistors: a special type of transistor that can store or discharge an electrical charge in a cage-like part called the floating gate. This storing capability is what allows the data to remain, even when there’s no electricity flowing through the cells.

Inside an SSD is a circuit board with a bunch of embedded chips, including the flash memory, controller, and cache.

As mentioned, an SSD doesn’t have moving parts like the actuator arm and motors of an HDD. Instead, it has an embedded processor called a controller. Much like the computer’s processor, the controller does all the heavy lifting, as it’s the one responsible for locating the blocks of memory where data can be read or written to.

This is also the reason why SSDs perform faster than HDDs; since they don’t need to wait for any moving parts to read or write data, the controller just needs to receive the instructions from the computer’s processor and it can start reading or writing data.

SSDs may not suffer from a mechanical breakdown, but they’re far from faultless. Flash memory can only have data written and erased a finite number of times before its cells degrade and become unreliable. This means an SSD can only write a certain amount of data before it fails, which is why SSD specification sheets typically include Terabytes Written (TBW), so consumers know how much data can be written into the drive before it eventually fails. However, SSDs these days can last more than ten years in typical day-to-day usage.
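A back-of-the-envelope estimate shows how a TBW rating translates into years of use. The 150TBW rating and 30GB-per-day write load below are hypothetical numbers chosen for illustration, not any particular drive’s spec.

```python
# Rough SSD lifespan estimate from a Terabytes Written (TBW) rating.
# Figures are illustrative; check your drive's actual spec sheet.

def years_until_tbw(tbw_terabytes, gb_written_per_day):
    days = (tbw_terabytes * 1000) / gb_written_per_day  # 1TB = 1,000GB
    return days / 365

# e.g. a drive rated for 150TBW, writing 30GB every day:
print(round(years_until_tbw(150, 30), 1))  # ≈ 13.7 years
```

Even a fairly heavy 30GB-a-day habit would take over a decade to exhaust such a rating, which matches the ten-plus-year figure for typical day-to-day usage.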

Both storage mediums have pros and cons

So, is a solid-state drive better than a hard disk drive, or vice versa? Sadly, there’s no simple answer to this question, as it all depends on the needs of the consumer, which usually come down to speed and storage capacity.

On one hand, if a person wants faster read/write times, an SSD is the clear winner, but you’ll lose out on storage capacity, since most SSDs today start from 120GB and can only go up to 1TB or 2TB. Mind you, those high-capacity SSDs will surely burn a big hole in your wallet.

On the other hand, if a person values capacity more, an HDD is the better option, with drives typically ranging from 500GB to 6TB of storage capacity for mainstream HDDs. Also, HDDs don’t cost an arm and a leg compared to SSDs if you want to get large-capacity ones.

With these in mind, there’s no stopping consumers from having both an SSD and an HDD in the same system. Setting up an SSD as your main drive with the operating system and other important software, while having a secondary HDD to store all your media and personal files, would net you the best of both worlds: a speedy system boot up without sacrificing storage space.

SEE ALSO: LTE-A Explained


]]>
LTE-A Explained https://www.gadgetmatch.com/lte-a-explained/ Thu, 19 May 2016 11:57:15 +0000 http://www.gadgetmatch.com/?p=2500 A few weeks ago, I was lounging on a beach chair in Boracay Island in the Philippines, sipping on a mojito and enjoying crazy fast mobile internet speeds of about 200 megabits per second. For the uninitiated, at 200 Mbps, you can download the original Iron Man movie in high def, or the popular racing […]

The post LTE-A Explained appeared first on GadgetMatch.

]]>
A few weeks ago, I was lounging on a beach chair in Boracay Island in the Philippines, sipping on a mojito and enjoying crazy fast mobile internet speeds of about 200 megabits per second.

For the uninitiated, at 200 Mbps, you can download the original Iron Man movie in high def, or the popular racing game Need For Speed in well under 2 minutes.
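The arithmetic behind that claim is simple enough to check, assuming a roughly 3GB high-definition download:

```python
# How long a download takes at a given connection speed.
# Note the units: file sizes are in gigaBYTES, link speeds in megaBITS.

def download_seconds(size_gb, speed_mbps):
    megabits = size_gb * 8 * 1000  # 1 gigabyte = 8,000 megabits
    return megabits / speed_mbps

print(download_seconds(3, 200))  # 120.0 seconds, i.e. about two minutes
```

A 3GB file at 200Mbps lands in about two minutes flat; smaller HD encodes finish comfortably faster.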

Not that the average joe constantly needs to download large apps or movies while out and about. But the promise of LTE-A is an internet experience that’s free from waiting – YouTube videos that don’t buffer, web pages that load in a snap, and hiccup-free FaceTime calls.

It’s almost cruel that this alternate reality, this internet utopia, exists on this tiny little island. But that’s the point; because it is both a relatively small land mass and one that’s heavily frequented by local and foreign tourists, the paradise island of Boracay is the perfect test case for what eventually should roll out to the entire archipelago.

If all goes according to plan (and promise) the wait won’t be too long. Smart Communications, the telco behind the Boracay experiment says LTE-A will become a reality for Filipinos within a year and a few months. Users in Australia, the US, South Korea and Turkey already enjoy LTE-A speeds.

A few years back, while living in Seoul for a few months, I would get faster data speeds on my smartphone than I did in my apartment. And that’s saying a lot, considering South Korea enjoys some of the fastest internet speeds in the world.

So what does LTE-A mean? How does it work? And how is it able to deliver super fast mobile internet speeds?

The secret sauce behind LTE-A is a technology called carrier aggregation. But for you to understand this better, it’s important to first understand how data is transferred from the internet through cell towers to your phone.

Every time you access the internet, whether to download a new instant message, stream a Spotify track, or watch a video on Facebook, it’s like transferring cargo from point A to point B. Trucks pick up the cargo and drive it down a single-lane highway to its destination.


Imagine each truck carried 100 megabytes of data. You’d need 30 trucks to make up 3GB, the average size of a Full HD movie.


It will take a while for 30 trucks to travel from point A to point B on a single-lane highway.


But imagine if there were five lanes: more trucks could travel at the same time, allowing all 30 trucks to get to the destination sooner.


That’s exactly how carrier aggregation on LTE-A works. Using multiple LTE bands (up to five in theory; three on Smart’s LTE-A network), data is transferred to your smartphone simultaneously. Smartphones that support LTE-A have multiple antennas for receiving multiple data streams from multiple signals.
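The lane analogy boils down to a one-line sum: aggregate throughput is roughly the total of the individual bands’ rates. The per-band speeds below are illustrative examples, not Smart’s actual band plan.

```python
# Carrier aggregation, in miniature: total bandwidth is roughly the
# sum of the aggregated LTE bands (per-band figures are illustrative).

def aggregate_mbps(band_speeds):
    return sum(band_speeds)

single_band = aggregate_mbps([75])           # one "lane"
three_bands = aggregate_mbps([75, 75, 50])   # three aggregated carriers

print(single_band, three_bands)  # 75 200
```

Three modest bands together land right around the 200Mbps experienced on Boracay, without any single band having to carry that load alone.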


LTE-A promises theoretical speeds of up to 1Gbps. That’s insanely fast. Here’s to hoping that soon it will become a reality for everyone.

Editor’s Note: As is always the case, I expect a bit of backlash from the consumer public: why tease the next big thing in mobile internet when LTE speeds aren’t even that great yet? And let’s not get started on the data capping debate. Both are important issues that deserve attention and discussion, but they should not distract from the fact that the technology exists and, if implemented properly, is something that can improve lives.

SEE ALSO: Singapore, S. Korea dominate 4G LTE rankings, Philippines struggles


]]>