Why the OnePlus 6 is one of XDA's favorite gaming phones of 2018 (Sponsored) - OnePlus 5T Guides, News, & Discussion

Why the OnePlus 6 is one of XDA's favorite gaming phones of 2018
The OnePlus 6 is here, and we at XDA have been testing it and sharing our thoughts with you on the Portal. Our testing suggests this phone is the best gaming phone of 2018 so far. OnePlus tagged the OP6 with the slogan "The Speed You Need," and it proves to be true.
Powered by the Snapdragon 845 with options for 6GB or 8GB of RAM, this is the fastest phone on the market right now. The Adreno 630 GPU runs the most recent games like PUBG effortlessly at 30FPS, the game's maximum framerate. There is also a noticeable improvement in game load times. We tested Asphalt 8 and PUBG load times against the OnePlus 5T, and here are the results:
Asphalt 8 app launch time:
OnePlus 5T: 5.18 seconds
OnePlus 6: 4.95 seconds
PUBG app launch time:
OnePlus 5T: 19.79 seconds
OnePlus 6: 17.98 seconds
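For context, the launch-time figures above work out to modest but measurable relative improvements. A quick sketch (numbers taken directly from the post):

```python
# Relative launch-time improvement implied by the measurements above.
times = {
    "Asphalt 8": {"OnePlus 5T": 5.18, "OnePlus 6": 4.95},
    "PUBG":      {"OnePlus 5T": 19.79, "OnePlus 6": 17.98},
}

for game, t in times.items():
    saved = t["OnePlus 5T"] - t["OnePlus 6"]
    pct = 100 * saved / t["OnePlus 5T"]
    print(f"{game}: {saved:.2f}s faster ({pct:.1f}% improvement)")
```

That is roughly a 4% gain for Asphalt 8 and a 9% gain for PUBG.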
This performance extends to the UI as well. OnePlus always goes with a near-stock Android experience and is constantly optimizing its software to ensure the smoothest UX possible. Navigating the phone produces very few dropped frames, with animations that are quick and lag-free.
"The Adreno 630 featured in the OnePlus 6’s Snapdragon 845 is actually one of the beefiest specification upgrades this new flagship brings. This GPU features a revamped architecture, with Qualcomm claiming a 30% boost to graphics performance and 30% in power reduction (at the same level of performance as last 2017’s Snapdragon 835), something which we were able to verify in our Snapdragon 845 hands-on earlier this year." - Mario Serrafero
With up to 8GB of RAM, the OnePlus 6 has the same great RAM management we have seen in previous phones. We were able to load four mobile games (Asphalt 8, Lineage II: Revolution, PUBG, and Modern Combat 5) and have them all remain in memory, so gamers should appreciate the additional RAM even if the number of apps that can be held at any given time is limited.
"The OnePlus 6 certainly feels extremely fast, and when you do try and measure its speed for improvements you can also find some small steps forward. I get to use multiple phones every year, and often carry two phones at any given time — I get to notice the speed advantage every day, even if it’s not always that significant. For example, while I’ve grown fond of my Galaxy Note 8 (which brought its fair share of improvements) the difference between that and the OnePlus 6 is clear and immediate the moment I switch phones. I’m not just talking app launch speeds here, either, it’s an advantage that permeates the user experience. This is an area where OnePlus has been consistently surpassing competitors, and that’s almost become popular knowledge with reviews, YouTube speed tests and user feedback all agreeing on the matter."- Mario Serrafero
Check out more of our coverage on the OnePlus 6 here:
OnePlus 6 Speed, Smoothness & Gaming XDA Review: Living up to the Slogan
OnePlus 6 Hands-On: Redefined Speed and a Premium Design that Reflects 2018’s Smartphone Trends
Check out the OnePlus 6 forums to see what users think about this phone.
OnePlus 6 Forums
We thank OnePlus for sponsoring this post. Our sponsors help us pay for the many costs associated with running XDA, including server costs, full time developers, news writers, and much more. While you might see sponsored content (which will always be labeled as such) alongside Portal content, the Portal team is in no way responsible for these posts. Sponsored content, advertising and XDA Depot are managed by a separate team entirely. XDA will never compromise its journalistic integrity by accepting money to write favorably about a company, or alter our opinions or views in any way. Our opinion cannot be bought.

What is this shameless spam post doing in our OP5T threads?????????????????.............take your "notch", and get out!! lol

Isn't it against forum rules ?
Rule 11 and 13 at least.

Aklo01 said:
Isn't it against forum rules ?
Rule 11 and 13 at least.
Click to expand...
Click to collapse
It says: "sponsored by OnePlus"............so that means, money talks!!

Aklo01 said:
Isn't it against forum rules ?
Rule 11 and 13 at least.
Click to expand...
Click to collapse
It is.

XDARoni said:
Why the OnePlus 6 is one of XDA's favorite gaming phones of 2018 [...]
Click to expand...
Click to collapse
Don't have to take pictures with a sub par camera!

Related

SGS2(mali400) beats iPad2(sgx543) in GLbenchmark

I checked the GLBenchmark website just now and found that the results for the i9100 are now public.
I captured the pictures for your convenience.
If the benchmark only runs at native resolution, then this is useless, because the iPad has a much higher resolution and would therefore perform worse. Of course, if it runs at a fixed resolution, I couldn't be happier.
I know it's maybe a little unfair, since it does run at native resolution, but it's still impressive.
I did a little calculation with other models' results: 25% more pixels leads to roughly 12.5% fewer frames.
The iPad 2 has about 100% more pixels than the SGS II.
(all approximate values)
1.25 × 1.25 × 1.25 ≈ 2;
0.875 × 0.875 × 0.875 ≈ 0.67;
So if the SGS II ran at the iPad 2's resolution, its benchmark score would be about one third lower.
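The estimate above can be sketched in a few lines. The "+25% pixels costs ~12.5% of the framerate" rule is the poster's own empirical assumption from other devices' results, not a measured law:

```python
# Sketch of the poster's scaling estimate. Assumption (from the post):
# each +25% in pixel count costs about 12.5% of the framerate.
ipad2_pixels = 1024 * 768      # iPad 2
sgs2_pixels = 800 * 480        # Galaxy S II
ratio = ipad2_pixels / sgs2_pixels   # a bit over 2x more pixels

# Doubling pixels is roughly three compounding +25% steps: 1.25**3 ~ 1.95
fps_scale = 0.875 ** 3         # ~0.67, i.e. about one third fewer frames

print(f"pixel ratio: {ratio:.2f}")
print(f"estimated framerate factor: {fps_scale:.2f}")
```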
Well, to me it seems that the SGX543 has better performance. The good thing is that the gap won't be that big.
Sent from my GT-I9000 using Tapatalk
enzografix said:
Well, to me it seems that the SGX543 has better performance. The good thing is that the gap won't be that big.
Sent from my GT-I9000 using Tapatalk
Click to expand...
Click to collapse
I don't think performance is really that important, tbh. It's all about the kind of games that will be developed for the platforms. For example, even though the original SGS with its SGX540 is FAR better than the iPhone 4's SGX535, the quality of the games available and their graphics are much, MUCH better on the iPhone 4. I have both, and the gulf in class between the two when it comes to the quality of the gaming library is absolutely enormous; only the Tegra 2 phones have done something to bridge this gap for Android phones.
One can only imagine the kind of awesome games that will be developed for that beastly SGX543 in the iPhone 5. I just hope that the majority of those games will also find their way onto the Android Market and the SGS2 (and not just Tegra Zone), with its ever-growing popularity, because the Mali definitely has the capability to run them well.
Personally, I feel that the next iteration of the Xperia Play will be the ultimate device for hardcore gamers, beating even the iOS devices of the time. The current one is the ultimate emulation phone thanks to its brilliant gamepad, and runs all old-school PSX, N64, SNES, etc. games with aplomb; if you're into that kind of thing, it's an absolute joy to behold.
No, most of the scores I've seen and run myself are around 40-44 fps for the Mali, NOT 50 fps. I'm not sure what they've done to get over 50 fps, but that's not what the phone normally scores.
Also, there's a resolution difference.
EDIT: Just rechecked; that figure is the Top score, not the Average. When looking at benchmark numbers you need to look at the average framerate.
It's actually 46.7 fps for the iPad 2 and 40.2 fps for the Galaxy S II.
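The EDIT above makes a point worth illustrating: peak (Top) and average framerate can diverge a lot, so only averages are comparable. A minimal sketch with made-up frame times:

```python
# Toy illustration (invented frame times, not benchmark data) of why the
# average framerate is the number to compare, not the peak: a run can
# briefly touch ~60 fps while averaging far lower.
frame_times_ms = [16.7] * 10 + [33.3] * 30  # short 60 fps burst, then 30 fps

peak_fps = 1000 / min(frame_times_ms)
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

print(f"peak: {peak_fps:.1f} fps, average: {avg_fps:.1f} fps")
```

Note the average is computed as total frames over total time, not as the mean of instantaneous fps values, which would overweight the fast frames.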
omersak said:
I don't think performance is really that important, tbh. It's all about the kind of games that will be developed for the platforms. For example, even though the original SGS with its SGX540 is FAR better than the iPhone 4's SGX535, the quality of the games available and their graphics are much, MUCH better on the iPhone 4. I have both, and the gulf in class between the two when it comes to the quality of the gaming library is absolutely enormous; only the Tegra 2 phones have done something to bridge this gap for Android phones.
One can only imagine the kind of awesome games that will be developed for that beastly SGX543 in the iPhone 5. I just hope that the majority of those games will also find their way onto the Android Market and the SGS2 (and not just Tegra Zone), with its ever-growing popularity, because the Mali definitely has the capability to run them well.
Personally, I feel that the next iteration of the Xperia Play will be the ultimate device for hardcore gamers, beating even the iOS devices of the time. The current one is the ultimate emulation phone thanks to its brilliant gamepad, and runs all old-school PSX, N64, SNES, etc. games with aplomb; if you're into that kind of thing, it's an absolute joy to behold.
Click to expand...
Click to collapse
If the xperia play has good quality N64 roms, my SGS 2 is going back to mr orange
Honestly, benchmarks mean nothing if there are NO apps.
And take it from me: my iOS library is over 600 purchased apps and games, and I've got around 20 purchased games on Android.
That's a huge difference. I know of almost every game in existence and get them RIGHT away when they come out, so I know the very vast differences in choice between Android gaming and iOS.
And I always miss my iOS games when I'm out (on my Android phone); it's a good thing my iPad 2 gives me the fix I need. =D
For you iOS gamers out there: seriously, try out a game called "Battleheart". It will devour your soul... freaking epic game...
MaxxiB said:
If the xperia play has good quality N64 roms, my SGS 2 is going back to mr orange
Click to expand...
Click to collapse
You'll be amazed: http://www.youtube.com/watch?v=qYodquYXArs
tbh, every Android phone can run N64 games well, I guess; what sets the Play apart is that there aren't any ugly and cumbersome touch controls taking up the screen.
But the fact remains that the SGS2, apart from the above, does everything else SO much better!
How many cores?
Can someone confirm whether the Mali-400 used in the GS2 is multi-core? I believe I read somewhere that Samsung decided to use only a single core for the GS2.
I think you can find some game pad for android on sale in the near future,since Gingerbread 2.3.4 supports Open Accessory API.
So you can connect to a pad when you play games,and get rid of it when you don't.
Based on some of those scores, it looks like it's hitting the 60fps cap
Even at higher res the Exynos scores great; check the Hardkernel benches =) their tab uses the same/similar chip
Sent from my SGH-T959 using Tapatalk
Bear in mind that GPUs in phones tend to be clocked lower than those in tablets to keep power consumption down. So it's difficult to do an apples-to-apples comparison (excuse the pun) in this case. We will have to wait for iPhone 5 numbers.
rd_nest said:
Can someone confirm whether the Mali-400 used in the GS2 is multi-core? I believe I read somewhere that Samsung decided to use only a single core for the GS2.
Click to expand...
Click to collapse
What I have read is that it uses the 2-core model. The 4-core model is headed for the Sony NGP.
Papi4baby said:
What I have read is that it uses the 2-core model. The 4-core model is headed for the Sony NGP.
Click to expand...
Click to collapse
That's the PowerVR SGX543 you're thinking of. The iPad 2 uses the dual-core variant, the NGP will be using the quad-core variant.
The GS II has a quad-core Mali-400. Anything less, and it would never attain the results it's currently getting on GLBenchmark.
ph00ny said:
Based on some of those scores, it looks like it's hitting the 60fps cap
Click to expand...
Click to collapse
That's the case with the Pro test, but not with the Egypt test; there the max is around 50 fps, so it never even reaches 60 fps.

[Open source] ARM A9+FPGA+64 cores board : Parallella: A Supercomputer For Everyone

Hello,
I backed this interesting project on Kickstarter: Parallella: A Supercomputer For Everyone. Basically, it's a 3.4'' x 2.1'' board.
It has open access (i.e., no NDAs), it is based on free and open-source development tools and libraries, and it's very affordable (the project aims to make boards available for $100 apiece).
This project only has 25 hours left to reach its funding goal. You can help by spreading the word or, even better, by becoming a backer as well!
Looking forward to your comments!
Cheers,
-- Freddy
PS : as a XDA forum noob I'm not able to add any URLs to this post, but searching for "Parallella: A Supercomputer For Everyone" will do the trick!
Visionscaper said:
Hello,
I backed this interesting project on Kickstarter: Parallella: A Supercomputer For Everyone. Basically, it's a 3.4'' x 2.1'' board.
It has open access (i.e., no NDAs), it is based on free and open-source development tools and libraries, and it's very affordable (the project aims to make boards available for $100 apiece).
This project only has 25 hours left to reach its funding goal. You can help by spreading the word or, even better, by becoming a backer as well!
Looking forward to your comments!
Cheers,
-- Freddy
PS : as a XDA forum noob I'm not able to add any URLs to this post, but searching for "Parallella: A Supercomputer For Everyone" will do the trick!
Click to expand...
Click to collapse
Interesting proposal; this FPGA+SoC combo is the same as on the ZedBoard, which costs $299.
It's a good deal even without the parallel co-processor if you want ARM with the programmable FPGA setup.
It's useful for a whole host of embedded and robotic applications if you need the FPGA.
I guess the co-processor is a nice addition too.
Of course there is a risk that they won't deliver, I guess there is no guarantee that people who pledge get the board at the end.
I pledged for 2 boards, hopefully they will come through.
And the link.
http://www.kickstarter.com/projects/adapteva/parallella-a-supercomputer-for-everyone
I've backed it for 2 boards and will be checking anxiously to see if it passes the funding goal tomorrow!!! Fingers crossed.
Very Cool!!
Some More Info:
How many cores do the initial $99 Epiphany-III based Parallella boards have?
2 ARM-A9 cores and 16 Epiphany cores.
When will the 64-core Parallella boards be available?
We will be offering the Epiphany-IV based 66-core (2+64) version of the Parallella boards as soon as we reach our stretch funding goal of $3M. The reward will be available to those who pledge more than $199. The estimated delivery time for the 64-core boards would be May 2013.
Why do you call the Parallella a supercomputer?
The Parallella project is not just a board; it's intended to be a long-term computing project and community dedicated to advancing parallel computing. The current $99 board isn't considered a supercomputer by 2012 standards, but a cluster of 10 Parallella boards would have been considered one 10 years ago. Our goal is to put a bona fide supercomputer in the hands of everyone as soon as possible, but the first Parallella board is just the first step. Once we have a strong community in place, work will begin on PCIe boards containing multiple 1024-core chips with 2048 GFLOPS of double-precision performance per chip. At that point, there should be no question that the Parallella qualifies as a true supercomputing platform.
Where can I learn more about the Epiphany processors?
Introduction:
http://www.adapteva.com/introduction
Microprocessor Report:
http://www.adapteva.com/news/adapteva-more-flops-less-watts/
Epiphany Datasheets:
http://www.adapteva.com/products/silicon-devices/e16g301
http://www.adapteva.com/products/silicon-devices/e64g401
Why is there only 1GB of RAM?
The current board configuration only supports up to 1GB of SDRAM. This is a limit of our current host ARM CPU. If the Parallella project gets funded, more boards will come with significantly more RAM.
Will Parallella run Windows?
The Parallella board uses a dual-core, 32-bit A9 ARM CPU that currently supports Ubuntu 11.10 and barebones Linux. Our plan is to move to Ubuntu 12.04 LTS as soon as possible. It may be possible to support Windows through Wine or a virtual machine going forward, but we haven't checked those options yet. It may not be practical due to the board's memory limitation.
Will you open source the Epiphany chips?
Not initially, but it may be considered in the future.
Why is the performance so much lower than a leading edge GPU or CPU?
The Epiphany chips are much smaller than high-end CPUs and GPUs. The 64-core Epiphany chip occupies only 10mm^2, about 1/30th the size of modern GPUs and CPUs. If we scaled our chips up to the same die size, the Epiphany chips would win in terms of raw performance. Still, that's not really the point: within a 5-watt power envelope, it's energy efficiency that matters.
Why should I buy this board instead of Raspberry Pi?
We think you should buy both! The Raspberry Pi has a much bigger ecosystem at the moment and is a great starting point. Still, the Parallella board has some distinct advantages:
1.) 10-50x more performance than the Raspberry Pi
2.) An accelerator that can be programmed in OpenCL/ C/ C++
3.) Open specs/documents
4.) Gigabit ethernet
5.) More flexible and powerful GPIO
Why do you say the Parallella is a 45GHz computer?
We have received a lot of negative feedback regarding this number, so we want to explain its meaning and motivation. A single number can never characterize the performance of an architecture. The only thing that really matters is how many seconds and how many joules YOUR application consumes on a specific platform.
Still, we think multiplying the core frequency (700MHz) by the number of cores (64) is as good a metric as any. As a comparison point, the theoretical peak GFLOPS number often quoted for GPUs is really only reachable if you have an application with significant data parallelism and limited branching. Other numbers used in the past for processors include peak GFLOPS, MIPS, Dhrystone scores, CoreMark scores, SPEC scores, Linpack scores, etc. Taken by themselves, datasheet specs mean very little. We have published all of our data and manuals, and we hope it's clear what our architecture can do. If not, let us know how we can convince you.
Does Parallella come with an Operating System?
Yes, the Parallella prototypes have been extensively tested with Ubuntu 12.04. The Ubuntu O/S runs on the dual-core ARM A9 CPU on the board.
Click to expand...
Click to collapse
I see nothing very new in this. Why? http://www.nvidia.co.uk/object/cuda_home_new_uk.html
A $100 Nvidia card can have ~96 compute units, and those can communicate over a bus much faster than 1.4GB/s.
Impressive!! Good! Interesting!
Rebellos said:
I see nothing very new in this. Why? http://www.nvidia.co.uk/object/cuda_home_new_uk.html
A $100 Nvidia card can have ~96 compute units, and those can communicate over a bus much faster than 1.4GB/s.
Click to expand...
Click to collapse
Doesn't nvidia produce graphics cards?
This is a full computer, with CPU(s) and a complete motherboard.
Sent from my HTC One S using XDA app
hiu115 said:
Doesn't nvidia produce graphics cards?
This is a full computer, with CPU(s) and complete motherboard
Sent from my HTC One S using XDA app
Click to expand...
Click to collapse
I think they are both very interesting development platforms, but in the end the question is: what would you do with it?
There are plenty of interesting dev boards and highly parallel computing platforms out there. Another is Tilera (which is obviously more focused on networking: L4-L7 filtering, UTM applications).
I know CUDA runs on a GPU designed for graphics, but it is still a processor, and you can use it to, say, offload computation or compilation, so you could have a highly parallel build server. There are many more applications for a processor like that, but it would require a host computer with PCIe and a large, high-output power supply.
The interesting thing about the Parallella is its power usage:
Once completed, the 64-core version of the Parallella computer would deliver over 90 GFLOPS of performance and would have horsepower comparable to a theoretical 45 GHz CPU [64 CPU cores * 700MHz] on a board the size of a credit card, while consuming only 5 watts under typical workloads. For certain applications, this would provide more raw performance than a high-end server costing thousands of dollars and consuming 400W.
Click to expand...
Click to collapse
But again, what would you do with a board like this?
I can think of a few applications, but I don't think they would be android based.
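For concreteness, here is the arithmetic behind the figures quoted in this thread. All numbers are the project's own marketing claims, not independent measurements:

```python
# Sketch of the quoted Parallella claims (figures from the project's
# FAQ and pitch, not independently verified).
cores = 64
clock_ghz = 0.7
virtual_ghz = cores * clock_ghz          # the quoted "45 GHz" metric: 44.8

gflops = 90.0                             # claimed peak, 64-core board
watts = 5.0                               # claimed typical power draw
efficiency = gflops / watts               # 18 GFLOPS per watt

server_watts = 400.0
print(f"'virtual' clock: {virtual_ghz:.1f} GHz")
print(f"efficiency: {efficiency:.0f} GFLOPS/W "
      f"({server_watts / watts:.0f}x less power than the 400W server)")
```

This makes the thread's skepticism easy to locate: the 45 GHz number is just cores times clock, while the GFLOPS-per-watt figure is the claim that actually matters for the 5W envelope.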

NVIDIA Tegra 4 vs Nexus 10 processor

They've unveiled it today
http://www.engadget.com/2013/01/06/nvidia-tegra-4-official/
and apparently it's much more powerful and faster than the Exynos in the Nexus 10, but I don't know that much about this kind of tech. I'm probably finally going to buy the Nexus 10 this week if Samsung doesn't unveil a more powerful tablet, so I was wondering if this Tegra 4 processor is worth waiting for until it's implemented in a tablet.
May TEGRA 3 Rest in Peace ...
Sent from my GT-I9100 using Tapatalk 2
Yes, that thing is packing heat. Best case, the first device with a Tegra 4 will come out next Christmas. Unless they've been hiding something.
cuguy said:
Yes that thing is packing heat. Best case, the next device with a tegra 4 will come out next Christmas. Unless they've been hiding something.
Click to expand...
Click to collapse
It will be out somewhere between June and August, maybe..
It will not take that long ...
Sent from my GT-I9100 using Tapatalk 2
I think March... mark my words
Their browser test has the Nexus 10 running Chrome while the Tegra runs the AOSP browser. In my eyes that makes it a 100% unfair comparison.
Between bad experiences with Tegra 2 and 3 (Atrix/TF700) and their requirement that Tegra-optimized games not run on other SoC vendors' chips for no real reason other than because they can, I can't even consider a mobile Nvidia device. All they're good for is keeping the more reputable chip makers on their toes.
yes it's nice
It would be interesting to see this with both devices running the AOSP browser! From my experience it is much faster than the current Chrome version (which is still version 18 on Android, compared to 23 on desktop). Maybe the Tegra 4 would be faster as well, but not by that much.
Everything on my N10 is extremely fast and fluid, so I wouldn't wait for whenever the first Tegra 4 devices become available. Plus, it's a Nexus, so you know what you are buying!
Jotokun said:
Their browser test has the Nexus 10 running Chrome while the Tegra runs the AOSP browser. In my eyes that makes it a 100% unfair comparison.
Between bad experiences with Tegra 2 and 3 (Atrix/TF700) and their requirement that Tegra-optimized games not run on other SoC vendors' chips for no real reason other than because they can, I can't even consider a mobile Nvidia device. All they're good for is keeping the more reputable chip makers on their toes.
Click to expand...
Click to collapse
Agreed, they're making an apples-and-pears comparison that was undoubtedly set up to show the new processor in a good light. It's only to be expected; it is a sales pitch, after all. It will no doubt be a faster chip, though.
Sent from my Nexus 10 using XDA Premium HD app
I would much rather see a couple of benchmark runs myself. A time comparison in a web browser is no way to test the power of a new chipset.
Still, I would expect the Tegra 4 to be WAY more powerful than the Exynos 5250. Both devices use the A15 architecture, and the Tegra 4 has twice as many CPU cores as we have; that alone is already a big boost in multithreaded apps. Then look at the GPU, where you can't compare the two at all except by end-result numbers. They are just far too different: we have 4 GPU cores, the Tegra 4 has 72, but those cores are designed very differently and are not nearly as powerful per core. It is all about each company's definition of what a GPU "core" is. And then you have a smaller process node as well, which by itself already promises to use less power than the larger process node the Exynos 5250 uses.
I would honestly expect the Tegra 4 chipset to completely destroy our tablet in terms of performance, but a much fairer comparison would be the Tegra 4 against the Exynos 5 Quad. Those two are actually designed to compete with each other.
If you want to compare Exynos and Tegra 4, then wait for the Exynos 5450 (quad A15), which should come with the Galaxy S4. The number of cores makes a difference here; the T4 is quad-core. But early GL benchmarks show that the A6X and Exynos 5250 have a better GPU.
First Tegra 4 Tablet running stock android 4.2:
http://www.theverge.com/2013/1/7/3845608/vizio-10-inch-tablet-combines-tegra-4-android-thin-body
Let's hope and pray Vizio managed to add microSD capability to it. Free of Nexus restraints, it should be doable, but since it is running stock JB, the storage would have to be mounted as USB (?). So far this Vizio 10" was the most exciting Android development out of CES. We have a few more hours of press events scheduled in Vegas, and then it will be all over.
rashid11 said:
Let's hope and pray Vizio managed to add microSD capability to it. Free of Nexus restraints, it should be doable, but since it is running stock JB, the storage would have to be mounted as USB (?). So far this Vizio 10" was the most exciting Android development out of CES. We have a few more hours of press events scheduled in Vegas, and then it will be all over.
Click to expand...
Click to collapse
Don't expect the Nexus advantages of up-to-date software or timely updates.
EniGmA1987 said:
I would much rather see a couple benchmark runs myself. A time comparison of a web browser is no way to test the power of a new chipset.
Still, I would expect Tegra 4 to be WAY WAY WAY more powerful than the Exynos 5250. Both devices use the A15 architecture, and Tegra 4 has twice as many CPU cores as we have. This alone is already a big boost in multithreaded apps. Then look at the GPU where you cant even compare the two at all except by end result numbers. They are just far too different. We have 4 GPU cores, Tegra 4 has 72 GPU cores. But those cores are designed far differently and not nearly as powerful per core. It is all about the companies definition of what a GPU "core" is. And then you have a smaller process node as well, which by itself already promises to use less power than the larger process node the Exynos 5250 uses.
I would honestly expect the Tegra 4 chipset to completely destroy our tablet in terms of performance, but a much more fair comparison would be to compare the Tegra 4 to the Exynos 5 quad. Those two are actually designed to compete with each other.
Look at the new iPhone: only 2 cores (different architecture) beating the higher-clocked dual-core Galaxy S3 in some disciplines..
This presentation is scientifically SO SO irrelevant, especially because they use different software.. I LOL SO HARD at ppl thinking this is anywhere near comparable
schnip said:
Look at the new iPhone: only 2 cores (different architecture) beating the higher-clocked dual-core Galaxy S3 in some disciplines..
This presentation is scientifically SO SO irrelevant, especially because they use different software.. I LOL SO HARD at ppl thinking this is anywhere near comparable
The new iPhone 5 doesn't use the same ARM architecture as the S3 though; it is a custom design. So those can be compared against each other fine to see which architecture is better, and if a lower-clocked CPU gets better scores then we know it is a superior design. This is the basis of all benchmarking. If we were only allowed to compare identical architectures, we wouldn't learn anything.
Tegra 4 uses a (probably slightly modified) A15 core, and the Exynos 5xxx uses a fairly stock A15 core. So a higher-clocked A15 should beat a lower-clocked A15 in a direct comparison no matter what. Then when you throw in 2 additional cores it should always win in multithreaded benchmarks too. Seems pretty common sense to me.
The main difference will be on the graphics side, where Nvidia has its own GPU design versus Samsung's use of Mali GPUs.
You can still compare them just fine; it just needs to be on the same browser if a browser comparison is being done. In this PR release, Nvidia skewed the results like all companies do, so we can't really see the difference between the two from those pictures and need to wait for third-party review sites to do proper testing. Yet we can still estimate performance plenty well, since we already have a baseline for the architecture with this tablet.
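The core-count argument above can be put in rough numbers with Amdahl's law: extra cores only speed up the parallel fraction of a workload. A quick back-of-the-envelope sketch (the fractions are illustrative, not measurements of either chip):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# where p is the parallel fraction of the workload and n is the core count.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A mostly serial task (e.g. UI thread work) barely benefits from 4 cores:
print(round(amdahl_speedup(0.2, 4), 2))   # -> 1.18
# A well-threaded task (e.g. video encoding) benefits far more:
print(round(amdahl_speedup(0.9, 4), 2))   # -> 3.08
```

This is why twice the cores shows up clearly in multithreaded benchmarks but much less in everyday single-threaded use.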
"Tegra 4 more powerful than Nexus 10"... well duh! It's a new chip just unveiled by nvidia that won't show up in any on sale devices for at least a couple of months. Tablet and smartphone tech is moving very quickly at the moment, nvidia will hold the android performance crown for a couple of months and then someone (probably samsung or qualcomm) will come along with something even more powerful. Such is the nature of the tablet/smartphone market. People that hold off on buying because there is something better on the horizon will be waiting forever because there will always be a better device just a few months down the line!
EniGmA1987 said:
The new iPhone 5 doesnt use the same ARM architecture are the S3 though, it is a custom design. So those can be compared against each other fine to see which architecture is better. And if a slower clock speed CPU gets better scores then we know it is a superior design. This is the basis of all benchmarking. If we were only allowed to compare the exact same architectures together then we wouldnt learn anything.
Tegra4 uses a (probably slightly modified) A15 core, and the Exynos 5xxx uses a fairly stock A15 core. So a higher clocked A15 should beat out a lower clocked A15 in a direct comparison no matter what. Then when you throw 2 additional cores on top it should always win in multithreaded benchmarks too. Seems pretty common sense to me.
The main difference will be in the graphics side of things, where Nvidia has their own designed GPU compared to Samsung's use of the Mali GPU's.
You can still compare them together just fine, it just need to be both of them on the same browser if there is a browser comparison being done. In this PR release, Nvidia skewed the results like all companies do. So we cant really see the difference between the two from those pictures and we need to wait for 3rd party review sites to do proper testing to see actual results. Yet we can still estimate performance plenty fine since we have a baseline of the architecture already with this tablet.
That was kind of his point.
I don't think anyone is denying that the Tegra will be faster. What's being disputed here is just how much faster it is. Personally, I don't think it'll be enough to notice in everyday use. Twice the cores does not automatically a faster CPU make: you need software that can properly take advantage, and even then it's not a huge plus in everyday tasks. Also, in the past Nvidia has made pretty crappy chips due to compromise, a good example being how the Tegra 2 lacked NEON support. The only concrete advantages I see are more cores and a higher clock rate.
Based on the hype : performance ratio of both Tegra 2 & 3 I wouldn't have high hopes until I see legit benchmark results.
What does seem promising, though, is that they are making more significant changes than from T2 to T3, such as dual-channel memory (finally, after 1-2 years of all other SoCs having it -.-), and the GPU cores are different too.
Still, the GPU has always been Tegra's weakest point, so I still don't think it can beat an overclocked T-604 by much, even though this time around they won't be the first to debut a next-gen SoC. Given the A15 architecture they can't really screw up the CPU even if they wanted to, so that should be significantly faster than the Exynos 5 Dual.
I've also just read an article on AnandTech about power consumption: the SoC in the Nexus 10 consumes multiple times as much power as other tablet chipsets, making me wonder how nVidia plans to solve the battery-life issue with twice as many cores and a (seemingly) beefier GPU, not even mentioning implementation in phones..
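For perspective on why that power figure matters, battery life is just capacity over average draw. A rough sketch (the ~33.3 Wh figure is the Nexus 10's rated battery; the wattages are made-up examples, not measurements):

```python
# runtime_hours = battery_watt_hours / average_draw_watts
def runtime_h(battery_wh: float, draw_w: float) -> float:
    return battery_wh / draw_w

# Nexus 10 carries a ~33.3 Wh (9000 mAh) battery. Doubling the average
# SoC+screen draw under load halves the time you get out of it:
print(round(runtime_h(33.3, 4.0), 1))  # -> 8.3 (hours)
print(round(runtime_h(33.3, 8.0), 1))  # -> 4.2 (hours)
```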
freshlysqueezed said:
First Tegra 4 Tablet running stock android 4.2:
http://www.theverge.com/2013/1/7/3845608/vizio-10-inch-tablet-combines-tegra-4-android-thin-body
Apparently this is the tablet that comes to take the Nexus 10's spot: a Vizio 10-inch tablet with Tegra 4, 2560 x 1600 resolution, 32GB storage, Android 4.2. It should be coming out Q1 2013. I think this one makes me wait to hear some more before I buy the Nexus 10, although to be honest the brand is a bit of a letdown for me.
Edit: for the 10-inch model, key specs (aside from Tegra 4) include a 2,560 x 1,600 display, 32GB of on-board storage, NFC and dual 5MP / 1.3MP cameras.
http://www.engadget.com/2013/01/07/vizio-10-inch-tegra-4-tablet-hands-on/

[Discussion] M9+ benchmarks, real life performance experiences

All HTC M9+ owners!
We're a handful yet on XDA, but getting more numerous as the device rolls out to more locations. People who look for an elegant, high-end device with premium build quality and extra features like the fingerprint scanner, 2K display and duo camera settle on this fantastic device. It's unfortunately not perfect for everything, and not the best gaming phone out there, but in my experience it's a very well performing device in terms of call quality, reception, WiFi strength, multimedia, battery, display panel and overall UX feel and smoothness. High-end games from 2015 might stutter a bit here and there or drop some shaders to compensate, but it handles 3D games from 2014 and earlier, and all kinds of 2D games, well. Let's gather the experiences and benchmarks of this unique device, which was maybe the first MTK device in an aluminium body.
Let's discuss performance-related experience, real user feel and benchmarks, free of whining, facing the truth that in some respects it's not a top-notch device, but letting the curious ones who are considering this device know what to expect from this elegant business-class phone.
I'll start with some of my benchmarks result screenshots in separate posts.
UPDATE:
Here's my short game testing video on M9+
https://www.youtube.com/watch?v=QmLGCoI4NLw
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real-life experience is that in everyday chores the phone is very snappy, with some small lags when loading new apps into memory. Task switching is quite good.
There's unfortunately a memory-leak issue with Android 5.0.x which kicks in after a while (as on most 5.0.x devices), but overall the UX smoothness is quite good. Occasionally some apps bring up popup windows with a bit of stutter; the Facebook comments animation, for example, tends to stutter.
The Sense Home/Blink Feed experience is just perfect. At normal operations, when no big application updates are happening in the background, I never faced any lags on the Sense Home UI.
As for games: titles from 2014 and earlier run perfectly. Newer ones might get some shaders removed or run with reduced polygon counts, so don't expect a jaw-dropping 3D experience. If you're OK with 2014-and-earlier 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to suit the PowerVR GPU in the M9+'s MTK chipset. The phone mostly keeps a good temperature, except when charging and playing 3D-heavy games at the same time (but that's expected from most devices, especially with an alu body).
The screen quality is quite good; I got a perfect panel with my first unit, the refresh rate is 60Hz, and it's smooth with lovely DPI and brightness at 2K.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU. All tests show 3D performance far below 2014's flagships. The PowerVR G6200 is from the same generation as the iPhone 5s's GPU, which, let's face it, is mediocre for driving a 2K display. If you accept that gaming won't come with the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
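The "2K display vs. old GPU" point is easy to quantify: fill-rate demand scales with pixel count. A quick sketch (resolutions are the M9+'s 2560x1440 panel and a typical 1080p flagship; the 60 fps target is my assumption):

```python
def pixel_rate(width: int, height: int, fps: int = 60) -> int:
    """Pixels the GPU must fill every second at a given resolution."""
    return width * height * fps

qhd = pixel_rate(2560, 1440)   # the M9+'s 2K panel
fhd = pixel_rate(1920, 1080)   # a typical 1080p flagship
print(round(qhd / fhd, 2))     # -> 1.78, i.e. ~78% more fill-rate demand
```

So the same GPU that was adequate behind a smaller, lower-resolution screen has nearly twice the work to do here.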
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, with a few hits and misses. I'll be back after a few days of use with more adequate feedback.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years ahead on the M9
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years ahead on the M9
Yeah, though the AnTuTu 3D score does tell you something true after all: 3D is 9k on the M9+ versus 21k on the M9, less than half. The CPU of the M9+ is quite good while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone and gaming is mediocre in high-end games.
tbalden said:
Yeah, though the AnTuTu 3D score does tell you something true after all: 3D is 9k on the M9+ versus 21k on the M9, less than half. The CPU of the M9+ is quite good while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone and gaming is mediocre in high-end games.
I see, I hadn't noticed that. I don't understand Mediatek though: why did they put such an outdated GPU on their 'flagship' processor?
DeadPotato said:
I see, I hadn't noticed that. I don't understand Mediatek though: why did they put such an outdated GPU on their 'flagship' processor?
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way the CPU itself is a promising high-end part, and unfortunately we can't help with the GPU. Maybe there will be some possibility to tweak it at the kernel level. Remains to be seen, but I'm not holding my breath; no wonders can be done
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way the CPU itself is a promising high-end part, and unfortunately we can't help with the GPU. Maybe there will be some possibility to tweak it at the kernel level. Remains to be seen, but I'm not holding my breath; no wonders can be done
Yeah, I agree. The sad thing is that CPU-wise Mediatek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the Mediatek Helio X20 to actually be considered a valid option for flagship devices
DeadPotato said:
Yeah, I agree. The sad thing is that CPU-wise Mediatek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the Mediatek Helio X20 to actually be considered a valid option for flagship devices
Second you.
The X10 is MTK's first high-end SoC. They should have equipped it with a high-end GPU to become well known, not a two-year-old part from the iPhone 5s era. Most casual gaming is OK, but that doesn't belong in a "flagship" device.
It would be excellent if there were some possibility to tweak it at the kernel level, but the Mediatek Helio X20 really should do better on the GPU part.
I'm not a heavy gamer, so the UX is pretty good for me. (The camera is another topic.)
Even when I measured CPU speed with a chess game, the result reached the top tier.
tbalden said:
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
FYI.
Mine: stock, Taiwanese base 1.08, rooted device
Sigh... the result was maxed out for my device on Ice Storm "Extreme".
Very similar result to mine, except Ice Storm Unlimited. I think it might be related to temperature throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I'm really looking forward to the HTC M9+; I think it's what the M9 should have been, but I don't understand HTC's sh*tty 2015 choices in everything.
What I would like to know about is battery life; can you guys tell me about it?
I just bought a Galaxy S6 Edge a few days ago and it's creepy: I have no battery left by the end of the afternoon. I don't play games, just a few photos and browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only hope now is the Lenovo Vibe X3, but there has been no news since March...
Thanks guys!
After installing more and more software that synchronizes, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+, with average screen-on time around 4 hrs. All in all not a miracle, but that's about what I got with my previous phone, the OPO, with its larger battery and lower screen resolution; a bit less, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic: my friend's Z3C gets 3+ days and 5 hours of screen-on time. And I think that's done with a great stock kernel enhanced by Sony engineers and the Sony stock ROM... wish other phones could do that
Not sure why, but I cannot choose "no lock screen" or "lock screen without security" under security settings. Both options are greyed out, and it says they are disabled by an administrator, encryption policies or other apps.
I only have Prey and Lookout Security installed, which could affect this, but this was totally fine on my M7.
Anyone have any idea? Thank you.
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC, picking an MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC, picking an MTK SoC (**** or crap).
I am very disappointed.
1.61 base, stock, rooted
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
yvtc75 said:
1.61 base, stock, rooted
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
Yes, and it's fairly accurate.
I got 11871 at 2K; I couldn't screenshot it, as the phone doesn't know how to (it doesn't work above 1080p without dedicated software).
M8 at 1080p.
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see single-core performance is much better on the 801 :/.
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see single-core performance is much better on the 801 :/.
Hmmm......
Software decoding efficiency is one place where multiple cores do show up vividly in real life.
There have been many comparisons between the MT6795 (not "T") and the SD801 on this site, covering browser speed, RAR compression, real gaming, power consumption........
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
yvtc75 said:
1.61 base, stock, rooted
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
Are those results after reducing the resolution to 1080p? Did I read that correctly? Because if so, it's sad to confirm the limits of the 3D hardware, as others stated before. If you look at the M8 result above (which I find surprisingly higher than I experienced on my own M8, anyway), you'll notice even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multi-core wise the Helio X10 is a beast, but graphics-wise it looks like it's not so much

The 808 is why the 5X > 6P

Anyone who bought a nexus device this year probably has followed the saga of the 810 and its thermal related issues which have been all over the news.
It is now clear that no version (or revision) of the 810 can operate at its max frequency for very long before thermal controls kick in and throttle the device to lower frequencies.
Throttling comparison snapdragon 810 vs 808:
The 810 v2.1 revision was supposed to fix all this, but an in-depth look at v2.1 shows the 810 still throttles like crazy, which leads to the same poor performance and battery drain associated with other 810 devices.
http://www.anandtech.com/show/9388/comparing-snapdragon-810-v2-and-v21
"there’s noticeably less throttling on the A57 cluster compared to Snapdragon 810 v2. However even with that change - and unlike the Snapdragon 808 and competing SoCs - both variants of the Snapdragon 810 still see the unfortunate characteristic of ultimately forcing all threads off of the A57 cluster to stay within TDP limits in high load conditions, such as when running Basemark OS II’s battery test."
All the 810's problems are well documented at this point, and it's obvious LG and Motorola made the right choice using the 808 in lieu of the 810 for their flagships. Every 810 device released suffers from excess heat generation and must throttle processing power so quickly that performance suffers during average usage.
@ my fellow 5X users - be glad our device comes with the 808 and not the 810.
Sent from my Nexus 5X using Tapatalk
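The behaviour AnandTech describes, all threads eventually forced off the A57 cluster under sustained load, can be illustrated with a toy thermal model (every number here is made up for illustration; this is not Qualcomm's actual governor):

```python
# Toy model: the big cores heat the die quickly; once a temperature limit
# is hit, the governor migrates load off the big (A57) cluster for good,
# matching the sustained-load behaviour described in the quote above.
def simulate(seconds: int, heat_per_s: float = 1.5, cool_per_s: float = 0.8,
             limit: float = 45.0, ambient: float = 25.0):
    temp, on_big_cluster = ambient, True
    timeline = []
    for _ in range(seconds):
        temp += heat_per_s if on_big_cluster else -cool_per_s
        temp = max(temp, ambient)
        if on_big_cluster and temp >= limit:
            on_big_cluster = False        # threads forced onto the A53 cluster
        timeline.append(on_big_cluster)
    return timeline

run = simulate(30)
print(run.count(True))   # -> 13: seconds of full A57 performance, then throttled
```

With these made-up constants the model gives full performance for only the first few seconds of a 30-second load, which is the "bursty usage only" pattern the benchmarks keep showing.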
You should've added a disclaimer stating that your post has an extremely tenuous link to either of the new Nexus phones, since the graph shown is not related to either of them and doesn't take into consideration, for example, the fact that the Nexus 6P has an all-aluminium body.
oscillik said:
You should've added a disclaimer stating that your post has an extremely tenuous link to either of the new Nexus phones, since the graph shown is not related to either of them and doesn't take into consideration, for example, the fact that the Nexus 6P has an all-aluminium body.
Thanks for your input. The HTC One M9 also has an all-aluminium body and a Snapdragon 810. It was one of the first devices to suffer heat-related issues because of the 810.
As AnandTech showed, an all-aluminium body or copious thermal paste (Xiaomi Mi Note Pro) won't keep the 810 from overheating.
http://www.anandtech.com/show/9388/comparing-snapdragon-810-v2-and-v21
Mi Note Pro 810 overheating even with extensive heat-reducing design elements, including paste:
http://www.phonearena.com/news/Xiao...t-is-equipped-with-Snapdragon-810-SoC_id69242
_jordan_ said:
Thanks for your input. The HTC One M9 also has an all-aluminium body and a Snapdragon 810. It was one of the first devices to suffer heat-related issues because of the 810.
As AnandTech showed, an all-aluminium body or copious thermal paste (Xiaomi Mi Note Pro) won't keep the 810 from overheating.
http://www.anandtech.com/show/9388/comparing-snapdragon-810-v2-and-v21
Mi Note Pro 810 overheating even with extensive heat-reducing design elements, including paste:
http://www.phonearena.com/news/Xiao...t-is-equipped-with-Snapdragon-810-SoC_id69242
So you've just proved my point again, it seems?
The HTC One M9 doesn't have version 2.1, and the Mi Note Pro has a glass back...
When you've got some graphs that are actually pertinent to these phones, then this thread will make sense. At the moment, it's just FUD.
Hi
oscillik said:
So you've just proved my point again, it seems?
The HTC One M9 doesn't have version 2.1, and the Mi Note Pro has a glass back...
When you've got some graphs that are actually pertinent to these phones, then this thread will make sense. At the moment, it's just FUD.
The problem with cooling is that the SoC typically has a big DRAM chip stacked on top of it, which means the type of case doesn't really help: heat simply cannot transfer out of the SoC quickly enough under full load.
To be fair, it's in the design that only bursty usage can run at full performance. Even Intel CPUs throttle in benchmarks despite huge fans and heatsinks. Phones are not really games consoles, and high-clocked CPUs are driven more by marketing, with best-case ratings making the headline and soon falling flat in artificial benchmarks.
For the usage the device is designed for, they perform well, but push them and it becomes obvious our smartphones are not water-cooled supercomputers.
Regards
Phil
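On any Linux-based device (Android included, given a shell) you can watch this happen yourself: the kernel exposes die temperatures under /sys/class/thermal. A small sketch that reads them (zone names and availability vary by SoC; millidegrees Celsius is the common unit, though some drivers differ):

```python
import os

def read_thermal_zones(base="/sys/class/thermal"):
    """Return {zone_type: temp_in_celsius} from a Linux sysfs thermal dir."""
    zones = {}
    if not os.path.isdir(base):
        return zones                      # e.g. not running on Linux
    for name in sorted(os.listdir(base)):
        zone = os.path.join(base, name)
        type_file = os.path.join(zone, "type")
        temp_file = os.path.join(zone, "temp")
        if (name.startswith("thermal_zone")
                and os.path.isfile(type_file) and os.path.isfile(temp_file)):
            with open(type_file) as f:
                ztype = f.read().strip()
            with open(temp_file) as f:
                # most drivers report millidegrees Celsius
                zones[ztype] = int(f.read().strip()) / 1000.0
    return zones

print(read_thermal_zones())
```

Run it before and during a benchmark loop and you can see exactly when the temperature crosses the trip point and the clocks fall off.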
The LG G Flex 2 is a poor implementation and representation of the 810.
Test it against the newer 810 devices and let's come back and discuss.
---------- Post added at 09:19 PM ---------- Previous post was at 09:18 PM ----------
oscillik said:
So you've just proved my point again, it seems?
The HTC One M9 doesn't have version 2.1, and the Mi Note Pro has a glass back...
When you've got some graphs that are actually pertinent to these phones, then this thread will make sense. At the moment, it's just FUD.
The M9 does have the 2.1...
ALL phones that use the 810 have 2.1...
Qualcomm has essentially said 2.1 is a driver update for the chip, nothing more.
While I also preferred the N5X because of size (though once I got it I wished it were even smaller...), the 808 was part of the decision. I hoped for a device that wouldn't fry my hand.
I won't go to the extreme of saying the 810 is bad. Is it a design failure? Maybe. But it's still capable and even reliable when underclocked, as the latest devices and reviews have shown.
In addition to that:
- The N6P was reviewed and I haven't read many complaints about overheating; benchmarks are usually good/on par, though as with the N5X not at the top (but as always, a new OS means kernel changes that might impact benchmarks?).
- The processor is only part of the system, and reviews usually include the benefits of the whole SoC; that's the reason the N6P got better camera features, for example.
The only sad thing is that at the N5X's price they could've included an additional USB-C to USB-A cable and 3GB of RAM.
Let's be honest here: Qualcomm SoCs have been a disaster for the past 2 years, since they got caught flat-footed by Apple's release of the 64-bit A7 (and I can't stand Apple).
- The 801/805 was them trying to milk their old architecture for everything it was worth.
- The 810 was an abortion with overheating and throttling issues, because they had to rush a 64-bit chip to market and went with stock A57/A53 cores.
- The 808 was an attempt to fix the 810's overheating by removing 2 of the 4 A57 cores.
- The 617 (new HTC One A9) is 8 A53 cores and makes zero sense; let's just have a ****load of low-power cores and no fast ones.
- The 618/620 (upcoming) is basically a replacement for the 808/810 with stock A72 cores in place of the A57.
All of these are interim hack jobs until Qualcomm can get the 820 out early next year. By all accounts the 820 will be a great SoC, but in the meantime arguing over which interim release is best is largely a waste of time. They're all capable, but none are really great.
