G2 Processor faster than Nexus One? - G2 and Desire Z General

I constantly see references to the G2 beating the Nexus One in benchmark tests but the benchmarks always include GPU performance as part of the score. It was established before the phone came out that the GPU performance would beat just about everything else (save the Galaxy S), however that is somehow conflated with CPU performance.
Yes, the G2 can and probably will be overclocked once root is achieved but for now, apples to apples comparison:
[benchmark screenshot]
Overall score is higher than Nexus One but CPU score is lower
I don't know that this is all that significant and I'd still take my G2 over a Nexus One, but please stop spreading misinformation by saying how the "Scorpion CPU pwnz the Snapdragon LOLZWTF!!!!1!!"

And I'm sure once the G2 gets clocked at its native speed, it's going to put your point to rest... The G2 still runs faster than my Nexus. I don't care about numbers; what I care about is real-world performance.
Sent from my T-Mobile G2 using XDA App

Both the G2 and N1 are Snapdragon SoCs, and both have Scorpion CPUs. The newer one is on a 45nm die while the older is on 65nm (less heat and lower power consumption, and as a result longer battery life on the G2, thanks to the 45nm die).
CPU architecture-wise there is not much difference; the application processing, however, has been tweaked to perform better overall (thanks to the 45nm die, I believe), and obviously, as you can see, the GPU, the Adreno 205, thrashes the older Adreno 200.
The newer SoC also supports more video codecs (mainly DivX, actually): (MPEG-4, H.264, H.263, VC-1, DivX, DivX 3.11, Sorenson Spark, VP6) vs (MPEG-4, H.264, H.263, VC-1, Sorenson Spark, VP6).

msmith1991 said:
And I'm sure once the G2 gets clocked at its native speed, it's going to put your point to rest
I constantly see references to the G2 beating the Nexus One in benchmark tests
Yes, the G2 can and probably will be overclocked once root is achieved but for now, apples to apples comparison
I don't know if I came across as bashing the phone. Maybe it was when I said I still prefer it over a Nexus one?
Regardless, my point wouldn't be "put to rest." For now, the CPU on the Nexus One outperforms the G2 on a benchmark. That is what I was addressing, nothing more.
Superfrag said:
Both the G2 and N1 are Snapdragon SoCs, and both have Scorpion CPUs. The newer one is on a 45nm die while the older is on 65nm (less heat and lower power consumption, and as a result longer battery life on the G2, thanks to the 45nm die).
CPU architecture-wise there is not much difference; the application processing, however, has been tweaked to perform better overall (thanks to the 45nm die, I believe), and obviously, as you can see, the GPU, the Adreno 205, thrashes the older Adreno 200.
The newer SoC also supports more video codecs (mainly DivX, actually): (MPEG-4, H.264, H.263, VC-1, DivX, DivX 3.11, Sorenson Spark, VP6) vs (MPEG-4, H.264, H.263, VC-1, Sorenson Spark, VP6).
My personal experience matches what you've said exactly. Barely a hiccup running the phone and excellent battery life (relatively speaking at least). Like I said, the only thing I'm addressing is people using benchmarks which include the GPU in the score to say that the G2 processor is faster at 800mhz than the N1 at 1ghz.

Is there anything like Adobe Flash benchmark or test page? G2 is supposed to hardware accelerate Flash.

G2 is pure sex compared to my last phone, motorola cliq hahah

Is there anything like Adobe Flash benchmark or test page? G2 is supposed to hardware accelerate Flash.
This is particularly interesting, as the phone seems to load and scroll webpages faster than my Nexus One did with Enom's ROM. Well, the G2 is faster in everything, basically. Not by a lot, but enough to be noticeable.

What is interesting is the obnoxiously higher IO score in the G2 compared to the N1.
I wonder what is entailed in an IO test on a phone...
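Quadrant doesn't document exactly what its I/O test does, so the following is only a rough sketch of what such a test typically measures: sequential write and read throughput on internal storage. The file name and sizes are made-up illustration values.

```python
import os
import time

def io_benchmark(path="io_test.bin", size_mb=16, block_kb=256):
    """Time sequential writes and reads of a scratch file, roughly what a phone I/O test does."""
    block = b"\0" * (block_kb * 1024)
    blocks = size_mb * 1024 // block_kb

    start = time.time()
    with open(path, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force the data to storage, not just the page cache
    write_mb_s = size_mb / (time.time() - start)

    start = time.time()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass  # note: reads may be served from cache unless it is dropped first
    read_mb_s = size_mb / (time.time() - start)

    os.remove(path)
    return write_mb_s, read_mb_s

if __name__ == "__main__":
    w, r = io_benchmark()
    print(f"sequential write: {w:.1f} MB/s, sequential read: {r:.1f} MB/s")
```

A score like this is dominated by the filesystem and the flash controller rather than the CPU, which is probably why the G2 and N1 differ so much here.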

here is some light reading, and chart lookin....
http://androidandme.com/2010/10/news/3dmarkmobile-gpu-showdown-adreno-205-vs-powervr-sgx540/
But just remember that the Galaxy S class is still stuck on 2.1, so we are missing some updated drivers and JIT, and yes, that does make a difference.

Oh I forgot.
Better IO scores, I believe, are thanks to the fact that HTC decided to go with an ext3 partition for the OS, thus having faster reads/writes (I'm not sure about this... I think it's thanks to the ext3 partition).
Also yeah, like I said, the CPU architecture is similar (very similar actually); just the overall application processing is better due to tweaks, and one guy mentioned hardware Flash acceleration which I forgot to mention, and obviously the GPU. Basically it's a very well refined package of the Snapdragon that was on the N1/Desire.
Next year's 3rd-gen Snapdragons will have dual-core Scorpion CPUs, even better tweaking and optimization, and a HUGE GPU improvement.
Qualcomm says that the Adreno 220 is 4-5 times faster than the Adreno 205.

carlitozway57 said:
My personal experience matches what you've said exactly. Barely a hiccup running the phone and excellent battery life (relatively speaking at least). Like I said, the only thing I'm addressing is people using benchmarks which include the GPU in the score to say that the G2 processor is faster at 800mhz than the N1 at 1ghz.
Yup, at stock speeds the Nexus One's CPU is faster simply due to clock speed(since they have similar architecture). BUT, thanks to the overall package, tweaking and optimization, and a better GPU, the new 2nd gen Snapdragons are extremely well refined and thus the phone runs MUCH smoother. That's why you see Linpack scores of the G2 and N1 are very similar.
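For reference, Linpack measures double-precision floating-point throughput on the CPU alone (the GPU never enters into it), which is why it's a reasonable apples-to-apples number here. A minimal sketch of a Linpack-style measurement, not the actual Android app's code:

```python
import random
import time

def linpack_like_mflops(n=200):
    """Time Gaussian elimination on an n x n system and estimate MFLOPS (no pivoting; timing illustration only)."""
    a = [[random.random() + 1.0 for _ in range(n)] for _ in range(n)]
    b = [random.random() for _ in range(n)]

    start = time.time()
    for k in range(n):                 # forward elimination
        for i in range(k + 1, n):
            factor = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= factor * a[k][j]
            b[i] -= factor * b[k]
    elapsed = time.time() - start

    flops = (2.0 / 3.0) * n ** 3       # classic operation-count estimate for LU factorisation
    return flops / elapsed / 1e6

print(f"~{linpack_like_mflops():.1f} MFLOPS (Python interpreter overhead included)")
```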

Superfrag said:
Yup, at stock speeds the Nexus One's CPU is faster simply due to clock speed(since they have similar architecture). BUT, thanks to the overall package, tweaking and optimization, and a better GPU, the new 2nd gen Snapdragons are extremely well refined and thus the phone runs MUCH smoother. That's why you see Linpack scores of the G2 and N1 are very similar.
Agreed.
My G2 scores 1675 in Quadrant.
Smoothness comparable to the iPhone, but just running at 245 MHz!
Many people got scared off by the 800 MHz mark,
but I would proudly say they have underestimated that babe.
Running at 800 MHz brings satisfactory battery stamina as well.

o>c said:
Agreed.
My G2 scores 1675 in Quadrant.
Smoothness comparable to the iPhone, but just running at 245 MHz!
Many people got scared off by the 800 MHz mark,
but I would proudly say they have underestimated that babe.
Running at 800 MHz brings satisfactory battery stamina as well.
Tell me about it, it's the smoothest Android phone out there and fastest too I might add.
Once we root and O/C this baby it'll fly!

Think of the difference between the N1's QSD8250 and the G2's MSM7230 as a Core 2 Duo E6700 vs a Core 2 Duo E7200, for the most part. One is older and supports slightly fewer instructions, but the overall architecture is the same. They just refined it a bit when they made the switch to 45nm is all.

After using almost every Snapdragon phone over the past few months, including my old Nexus One, this 800 MHz CPU reminds me of the Intel Centrino CPUs: although they had lower clock rates, they were fast CPUs. From what I see and read, the G2 as a package is the fastest phone on the market. The CPU has been optimised, and after going through many Android phones, this one is by far the smoothest and fastest of them all, in my opinion.
Sent from my T-Mobile G2 using XDA App

This is the highest I've been able to get it to

carlitozway57 said:
I don't know that this is all that significant and I'd still take my G2 over a Nexus One, but please stop spreading misinformation by saying how the "Scorpion CPU pwnz the Snapdragon LOLZWTF!!!!1!!"
I haven't done any benchmarking but I can definitively say the G2 is far and away snappier and seems to run much smoother than the N1. It's the first time an android phone has seemed to just flow and not get in the way of itself. Granted I stopped using cooked roms when I ditched winmo, so I can't speak for CM6.
Sent from my T-Mobile G2 using Tapatalk

Of course the absolutely correct statement would be chipset vs chipset, not processor vs processor, but that's all semantics. You cannot compare processor vs processor when it comes to mobile devices; these are all SoCs, and you cannot separate them.
And the fact is, the G2 as a whole performs better than the N1 as a whole. The reason people keep harping on the G2 being faster than the N1 in the first place is just to make the point that an 800 MHz clock does not mean the phone as a whole is slower than the N1, and thus less qualified for a Gingerbread upgrade than the N1 (which in turn shows that the so-called rumour of a 1 GHz requirement for Gingerbread is not logical). How can one say that just because the N1 has a 1 GHz processor and the G2 has an 800 MHz processor, the G2 is not as qualified as the N1 to get Gingerbread (again, assuming the 1 GHz requirement), when as a whole phone it is faster than the N1? So we can conclude that the 1 GHz requirement is just bull****, and that the G2 is not handicapped for having an 800 MHz processor.
Sorry, did I confuse you? I think I confused myself.

I was able to get it 1 point higher lmao
Sent from my T-Mobile G2 using XDA App

Some of you keep thinking the G2 has a Snapdragon processor in it. It's a Scorpion processor, not a 2nd-gen Snapdragon.

Related

Is HummingBird Really Slower than Snapdragon Gen 2? [Independent of JIT]

I know this topic has been debated over time, but I noticed that most people attribute the differences in performance to the firmware difference (2.1 vs. 2.2).
Today an article was released about the G2 overclocked to 1.42 GHz. Along with the article I noticed a "Native Benchmark" run using SetCPU, which doesn't use JIT.
Lower is Better.
G2 Result:
[benchmark screenshot]
Now My Vibrant at 1.2 Ghz:
C: 702.89
Neon: 283.15
The difference between the two phones is so great that I doubt it is due to the 200 MHz difference alone.
As a comparison, my score at regular 1 GHz is:
C: 839.21
Neon: 334.51
There is about a 130 ms decrease for a 200 MHz overclock, which means that if the Vibrant were at 1.4 GHz it would put the two CPUs really close to each other, with the G2 still having a slight edge (a rough scaling check is sketched below). Remember this test is supposed to be JIT-independent, running native code. But since the Vibrant can only be stably overclocked to 1.3 GHz (with what is available anyway), the newer generation of Snapdragon may just be more efficient than the Hummingbird, despite what we Galaxy owners believe otherwise.
Another thing to keep in mind, though, is that the Snapdragon supposedly has an edge in the NEON instruction set, so I didn't look into that score too much.
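Here is the scaling check mentioned above, reusing only the scores quoted in this post; it assumes the benchmark time scales inversely with clock speed, which these numbers roughly confirm:

```python
# If nothing else changes, a fixed-work native benchmark's time scales roughly as 1/clock.
c_at_1_0 = 839.21    # ms, Vibrant C test at 1.0 GHz (score quoted above)
c_at_1_2 = 702.89    # ms, measured at 1.2 GHz

predicted_1_2 = c_at_1_0 * (1.0 / 1.2)
print(f"predicted at 1.2 GHz: {predicted_1_2:.1f} ms  (measured: {c_at_1_2} ms)")

# Extrapolating to a hypothetical 1.4 GHz Vibrant:
predicted_1_4 = c_at_1_0 * (1.0 / 1.4)
print(f"predicted at 1.4 GHz: {predicted_1_4:.1f} ms")
# The NEON score (334.51 ms -> 283.15 ms) scales by almost the same ratio.
```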
It appears to be true.
It appears the Hummingbird is not only slower than the new-generation Scorpions, but also unable to fully capture the CPU performance gain of the Dalvik JIT compiler in Froyo 2.2.
http://www.youtube.com/watch?v=yAZYSVr2Bhc
Dunno, something is not right about this 2.2.
The thing that really bugs me is that 2.2 is supposed to allow the full functionality of our 512MB of RAM... but it doesn't.
Erickomen27 said:
Dunno, something is not right about this 2.2.
The thing that really bugs me is that 2.2 is supposed to allow the full functionality of our 512MB of RAM... but it doesn't.
It's not 2.2, it's Samsung.
SamsungVibrant said:
It's not 2.2, it's Samsung.
I agree, they should use ext4 on their phones.
I don't see why they would stick with their old RFS.
SamsungVibrant said:
It appears the Hummingbird is not only slower than the new-generation Scorpions, but also unable to fully capture the CPU performance gain of the Dalvik JIT compiler in Froyo 2.2.
http://www.youtube.com/watch?v=yAZYSVr2Bhc
I'm sorry, but could you explain what your youtube link has to do with the topic? I'm curious, as I wasn't any wiser on the question at hand when I watched it.
NEON is an architecture extension for the ARM Cortex™-A series processors*
Is Snapdragon an ARM Cortex™-A series processor? NO!
Remember SSE instruction set in Intel, and the war AMD vs Intel?
Welcome back, LOL
*The source for NEON: http://www.arm.com/products/processors/technologies/neon.php
Probably is, but does it really matter?
Sent from my SGS Vibrant.
Scorpion/Snapdragon have faster FPU performance due to a 128 bit SIMD FPU datapath compared to Cortex-A8's 64 bit implementation. Both FPUs process the same SIMD-style instructions, the Scorpion/snapdragon just happens to be able to do twice as much.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similarly high Scorpion/Snapdragon scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average Linpack scores of 15-20.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now... and even by then we will have dual-core CPUs. It's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead, thanks to the 90 million-triangles-per-second PowerVR SGX540.
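To put rough numbers on the 128-bit vs 64-bit SIMD datapath point above: at the same clock, the wider path can retire twice as many single-precision elements per cycle through the same NEON-style instructions. A purely illustrative peak-throughput sketch (it ignores issue width, memory, and everything else that matters in practice):

```python
def peak_simd_floats_per_second(clock_hz, datapath_bits, element_bits=32):
    """Peak single-precision elements per second through one SIMD datapath."""
    return clock_hz * (datapath_bits // element_bits)

scorpion = peak_simd_floats_per_second(1.0e9, 128)   # 1 GHz Scorpion, 128-bit FPU path
cortex_a8 = peak_simd_floats_per_second(1.0e9, 64)   # 1 GHz Cortex-A8, 64-bit FPU path

print(f"Scorpion peak:  {scorpion / 1e9:.0f} Gfloat/s")
print(f"Cortex-A8 peak: {cortex_a8 / 1e9:.0f} Gfloat/s")
# Same clock, same SIMD instructions, but twice the per-cycle throughput.
```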
demo23019 said:
Scorpion/Snapdragon have faster FPU performance due to a 128 bit SIMD FPU datapath compared to Cortex-A8's 64 bit implementation. Both FPUs process the same SIMD-style instructions, the Scorpion/snapdragon just happens to be able to do twice as much.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similarly high Scorpion/Snapdragon scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average Linpack scores of 15-20.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now... and even by then we will have dual-core CPUs. It's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead, thanks to the 90 million-triangles-per-second PowerVR SGX540.
Once again quoting ARM HQ website:
NEON technology is cleanly architected and works seamlessly with its own independent pipeline and register file.
NEON technology is a 128 bit SIMD (Single Instruction, Multiple Data) architecture extension for the ARM Cortex™-A series processors, designed to provide flexible and powerful acceleration for consumer multimedia applications, delivering a significantly enhanced user experience. It has 32 registers, 64-bits wide (dual view as 16 registers, 128-bits wide).
Scorpion is not an ARM Cortex™-A series processor.
Fuskand said:
I'm sorry, but could you explain what your youtube link has to do with the topic? I'm curious, as I wasn't any wiser on the question at hand when I watched it.
I provided the link because the first part of it talks about the JIT compiler, which increases CPU performance. I put that there in case someone has never heard of this before. Thus, when I mentioned that the Hummingbird cannot take full advantage of the JIT compiler, people would know what I'm talking about.
demo23019 said:
Scorpion/Snapdragon have faster FPU performance due to a 128 bit SIMD FPU datapath compared to Cortex-A8's 64 bit implementation. Both FPUs process the same SIMD-style instructions, the Scorpion/snapdragon just happens to be able to do twice as much.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similarly high Scorpion/Snapdragon scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average Linpack scores of 15-20.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now... and even by then we will have dual-core CPUs. It's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead, thanks to the 90 million-triangles-per-second PowerVR SGX540.
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
lqaddict said:
Once again quoting ARM HQ website:
Scorpion is not an ARM Cortex™-A series processor.
LOL, I never said the Scorpion is ARM Cortex™-A.
Try reading my post again.
SamsungVibrant said:
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
LOL, if it is faster it might be by at most 1-2 seconds if it's lucky.
Sorry, it's going to take a lot more than that to impress me... again, it's a phone, not a high-end PC.
SamsungVibrant said:
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
That is largely due to the different filesystem implementations; once there is a workable hack to convert the entire filesystem on the Galaxy S to a real filesystem, you can make the comparison on things like launching apps.
Demo, I didn't mean to come off as a ****; I was just pointing out the flaw in the OP benchmark - the NEON instruction set execution is flawed. The G2 processor is ARMv7, which is the base of the Cortex-A8; the Cortex-A8 adds instructions specifically targeted at applications, like multimedia, and that's where NEON comes into play.
lqaddict said:
That is largely due to the different filesystem implementations; once there is a workable hack to convert the entire filesystem on the Galaxy S to a real filesystem, you can make the comparison on things like launching apps.
Agreed. +10 char
lqaddict said:
Demo, I didn't mean to come off as a ****; I was just pointing out the flaw in the OP benchmark - the NEON instruction set execution is flawed. The G2 processor is ARMv7, which is the base of the Cortex-A8; the Cortex-A8 adds instructions specifically targeted at applications, like multimedia, and that's where NEON comes into play.
No problem, I didn't really take it that way.
Also, I noticed I overlooked a lot of things in the OP... blame the ADD.
What difference does it make? In real world use the difference is negligible. And in three months our phones will be mid-tier anyway. At least we don't have hinges that will fall apart in two months.
how is the hummingbird not able to fully take advantage of JIT?
Well, there is a fix for our phones now. And from what I can tell there's no way the G2 can open apps faster than my Vibrant with the z4mod. It's smoking fast, by far the fastest I've ever seen this phone. No delays whatsoever. Can't wait till I get Froyo with OC/UV; this will be unreal. I feel like this phone is a high-end PC running Android or something. When I say instant, it's instant lol.
Kubernetes said:
What difference does it make? In real world use the difference is negligible. And in three months our phones will be mid-tier anyway. At least we don't have hinges that will fall apart in two months.
Exactly.
People seem to forget that the SGS line is like 6 months old now, we should be glad they're still doing as well as they are.
Then there's the fact that there aren't many other phones that come with 16GB internal. To me, having 16GB and being able to upgrade to 48 [minus the 2GB that Samsung steals] total is worth way more than starting with 4GB [1.5GB usable] and upgrading to a max of 36 [minus what HTC steals from internal].
But, if you don't like your phone, SELL IT while it's still worth something!

If Nexus S (2?) has similar hardware to Galaxy S, is it easy to port 2.3?

Even though I have followed dev/porting for 8 months starting with HTC Touch, I have little knowledge of how it is actually done. So here's my question to the developers.
We all know that Nexus S (2?) will have Gingerbread 2.3. Looking at the rumored specs and model number, it seems that Nexus S is a slight upgrade from Galaxy S.
Assuming most of the hardware is identical to Galaxy S, how easy is it to port 2.3 to Epic 4G, once Galaxy S becomes available?
Specifically, what is needed to bake a new 2.3 rom? Do you need to reverse engineer like what devs did on HTC WM devices? Or is it a straight port? I suspect it's somewhere in between but want to hear from you.
(If necessary, please move this to android development forum.)
The simple answer to this (which has been answered in other threads already, if you looked) is no, it won't be easy.
Also, the Nexus S is rumored and pretty much guaranteed to launch with a dual core processor. The rumor is that they delayed the device and gingerbread to implement this, since it will be google's new flagship device and has to be cutting edge. Everyone knows that dual core processors are set to hit the market within the first couple months of 2011 anyway, so releasing an old generation processor in a flagship google phone just makes no sense.
So no, it will not be easy to port from the Nexus S. It will not only have a completely different processor, but will also probably only be a GSM phone.
muyoso said:
but will also probably only be a GSM phone.
I agree and think this is the bigger problem (for the Epic at least).
Sure would be nice if the folks at Google would release one clean Google device for each carrier. I'd be on it in a heartbeat.
vansmack said:
I agree and think this is the bigger problem (for the Epic at least).
Sure would be nice if the folks at Google would release one clean Google device for each carrier. I'd be on it in a heartbeat.
Yes, that would be nice if they did that. I am surprised they are even doing another Google phone; from the way it sounded when they stopped marketing the Nexus One, they had no interest in doing another Google-branded phone.
muyoso said:
Also, the Nexus S is rumored and pretty much guaranteed to launch with a dual core processor. The rumor is that they delayed the device and gingerbread to implement this, since it will be google's new flagship device and has to be cutting edge. Everyone knows that dual core processors are set to hit the market within the first couple months of 2011 anyway, so releasing an old generation processor in a flagship google phone just makes no sense.
So no, it will not be easy to port from the Nexus S. It will not only have a completely different processor, but will also probably only be a GSM phone.
Those are just rumors, all of the official specs that have come out say it will be a 1.2ghz hummingbird. The dual core 1ghz orion chip is definitely on the horizon but I highly doubt they will be able to get it in at the last minute, and there's a good chance we won't see it in phones until late next year. All rumors of a dual core Nexus S have had no credibility with their sources.
That said, if this turns out to be a dual core phone and gingerbread turns out to be optimized for dual cores, a port will probably be very difficult. But if it's just a 1.2ghz hummingbird then it would just be a matter of getting the CDMA radio working.
LucJoe said:
Those are just rumors, all of the official specs that have come out say it will be a 1.2ghz hummingbird. The dual core 1ghz orion chip is definitely on the horizon but I highly doubt they will be able to get it in at the last minute, and there's a good chance we won't see it in phones until late next year. All rumors of a dual core Nexus S have had no credibility with their sources.
That said, if this turns out to be a dual core phone and gingerbread turns out to be optimized for dual cores, a port will probably be very difficult. But if it's just a 1.2ghz hummingbird then it would just be a matter of getting the CDMA radio working.
The reason that I think it has to have a dual core in it is as follows:
If it has a 1.2 ghz hummingbird processor, BFD. It immediately launches and is mediocre. Nothing exciting at all about it.
If it launches as the first dual core phone, it is the top of the line phone worthy of being branded as a Google flagship device.
Also, if it just had a 1.2 GHz Hummingbird processor, what is the holdup? It is no different from phones released months ago. Also, if it's a 1.2 GHz processor, it will be eclipsed within a matter of a month or two performance-wise by Tegra 2 and dual-core Snapdragon processors. Basically, it would be an embarrassing flagship device. The original Nexus is still to this day a damn good phone that is near the top of the pack of Android phones performance-wise, and it is a year old.
muyoso said:
The reason that I think it has to have a dual core in it is as follows:
If it has a 1.2 ghz hummingbird processor, BFD. It immediately launches and is mediocre. Nothing exciting at all about it.
If it launches as the first dual core phone, it is the top of the line phone worthy of being branded as a Google flagship device.
Also, if it just had a 1.2 GHz Hummingbird processor, what is the holdup? It is no different from phones released months ago. Also, if it's a 1.2 GHz processor, it will be eclipsed within a matter of a month or two performance-wise by Tegra 2 and dual-core Snapdragon processors. Basically, it would be an embarrassing flagship device. The original Nexus is still to this day a damn good phone that is near the top of the pack of Android phones performance-wise, and it is a year old.
You are fully aware that the Tegra 2 does not even exceed the Hummingbird in GPU or CPU performance, right? I'm just saying, 'cause it would suck if you didn't know what you're talking about.
Plus, Gingerbread will have HW acceleration, making GPU performance a big factor in the overall fluidity of the GUI. So again... what's faster?
Really? I assumed it would greatly outperform it. Where did you get your facts?
Sent from my SPH-D700 using XDA App
InfDaMarvel said:
Really? I assumed it would greatly outperform it. Where did you get your facts?
Sent from my SPH-D700 using XDA App
I'm trying to find the article again, but I know they were close and the Tegra 2 did not outperform the Hummingbird. Apparently they have now optimized the platform more.
Don't get me wrong, I love Nvidia; that's all I've purchased, and I've stayed with them even though they still don't have a decent DX11 card that doesn't need two power supplies. But they really need to step up, and their CEO needs to watch what he says and deliver more.
Here's a quote: "On the 3D side, Nvidia says it has doubled the performance of the initial Tegra, resulting in a peak speed of 90 million triangles per second. This level is well beyond the performance of any mobile processor shipping or even sampling today." The Hummingbird has the same exact performance. And CPU performance is a very interesting area. Anyway, the GPU performance is almost on par, with the Hummingbird leading maybe by 3-5%.
apatcas said:
I'm trying to find the article again, but I know they were close and the Tegra 2 did not outperform the Hummingbird. Apparently they have now optimized the platform more.
Don't get me wrong, I love Nvidia; that's all I've purchased, and I've stayed with them even though they still don't have a decent DX11 card that doesn't need two power supplies. But they really need to step up, and their CEO needs to watch what he says and deliver more.
Here's a quote: "On the 3D side, Nvidia says it has doubled the performance of the initial Tegra, resulting in a peak speed of 90 million triangles per second. This level is well beyond the performance of any mobile processor shipping or even sampling today." The Hummingbird has the same exact performance. And CPU performance is a very interesting area. Anyway, the GPU performance is almost on par, with the Hummingbird leading maybe by 3-5%.
Since the Tegra 2 is dual core and Android does not have dual-core support till Gingerbread (actually, I don't think it even supports Cortex-A9 till Gingerbread), if they ran 1 core vs 1 core I'd expect a Hummingbird win against a Tegra 2; but if the Tegra 2 is running dual core (and the software is optimized for it) it should win... but by that analogy the Orion would then be superior.
gTen said:
Since the Tegra 2 is dual core and Android does not have dual-core support till Gingerbread (actually, I don't think it even supports Cortex-A9 till Gingerbread), if they ran 1 core vs 1 core I'd expect a Hummingbird win against a Tegra 2; but if the Tegra 2 is running dual core (and the software is optimized for it) it should win... but by that analogy the Orion would then be superior.
Not to mention that Tegra 2 does 1080P video recording. So yes, releasing a google flagship phone that within one month is eclipsed by LG with the first Tegra 2 phone, would be embarrassing. The Nexus 1 set the standard for almost a year, before the Galaxy S line came out. If the Nexus 2 can only set the standard for under a month, that would be stupid. Therefore, it is easy to conclude that the rumors of the Nexus S having a dual core are most likely true. Doesn't mean it has to be the Orion, but it would be awesome if it was.
Tegra 2 is a Cortex A9 CPU... as is the Samsung Orion and the TI OMAP4xxx chips. They accomplish 2.5 instructions per MHz as opposed to the 2 instructions per MHz in the Cortex A8 Hummingbird, and that's not counting improvements to instruction efficiency (getting more done with less instructions.) Add to that improvements such as out of order instruction handling and dual-channel memory support and Cortex A9 chips are head and shoulders above Cortex A8.
The only reason Tegra 2 wouldn't outperform Hummingbird significantly is, as mentioned, lack of dual-core support in current builds of Android, and the nVidia GPU which is, surprisingly, only just about on par with Hummingbird's PowerVR SGX540.
Sent from my SPH-D700 using XDA App
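Taking the 2.5 vs 2.0 instructions-per-MHz figures above at face value, a crude peak-throughput estimate looks like this (it ignores memory, the GPU, and whether the OS can actually schedule work on both cores):

```python
def crude_throughput(cores, instructions_per_mhz, clock_mhz):
    """Very crude peak estimate: cores x instructions-per-MHz x clock (MHz)."""
    return cores * instructions_per_mhz * clock_mhz

hummingbird      = crude_throughput(1, 2.0, 1000)   # Cortex-A8 class at 1 GHz
tegra2_one_core  = crude_throughput(1, 2.5, 1000)   # one Cortex-A9 core at 1 GHz
tegra2_dual_core = crude_throughput(2, 2.5, 1000)   # both cores, if the OS can use them

print(hummingbird, tegra2_one_core, tegra2_dual_core)   # 2000.0 2500.0 5000.0
```

On these assumptions a single A9 core is only ~25% ahead at the same clock; the big jump only appears once the second core is actually usable.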
Electrofreak said:
Tegra 2 is a Cortex A9 CPU... as is the Samsung Orion and the TI OMAP4xxx chips. They accomplish 2.5 instructions per MHz as opposed to the 2 instructions per MHz in the Cortex A8 Hummingbird, and that's not counting improvements to instruction efficiency (getting more done with less instructions.) Add to that improvements such as out of order instruction handling and dual-channel memory support and Cortex A9 chips are head and shoulders above Cortex A8.
The only reason Tegra 2 wouldn't outperform Hummingbird significantly is, as mentioned, lack of dual-core support in current builds of Android, and the nVidia GPU which is, surprisingly, only just about on par with Hummingbird's PowerVR SGX540.
Sent from my SPH-D700 using XDA App
OK here's facts dudes. Tegra sucks... really please get it.
Tegra250 based Toshiba AC100 Running Neocore Benchmark
http://www.youtube.com/watch?v=uJav9ns6b4o
apatcas said:
OK here's facts dudes. Tegra sucks... really please get it.
Tegra250 based Toshiba AC100 Running Neocore Benchmark
http://www.youtube.com/watch?v=uJav9ns6b4o
I'm not sure that triangles per second is accurate to describe performance.
Still, if you want an article:
http://alienbabeltech.com/main/?p=17125
The Hummingbird has memory bandwidth limitations that I don't think the Tegra 250 will. Let's wait and see.
apatcas said:
OK here's facts dudes. Tegra sucks... really please get it.
Tegra250 based Toshiba AC100 Running Neocore Benchmark
http://www.youtube.com/watch?v=uJav9ns6b4o
Here are the facts, dude, and try to get this: Tegra 2 does not, in fact, suck. You just posted a video of it being benchmarked on a netbook running Android 2.1, which cannot make full use of Tegra 2's dual-core CPU. Secondly, Neocore is a GPU test, not a CPU test. We already discussed the fact that the Tegra 2 GPU is only just about on par with the SGX540. Thirdly, that test is being run at a significantly higher resolution than a mobile device would run, and frankly, considering this, the score isn't bad.
Fail, man, fail.
Sent from my SPH-D700 using XDA App
sauron0101 said:
I'm not sure that triangles per second is accurate to describe performance.
Still, if you want an article:
http://alienbabeltech.com/main/?p=17125
The Hummingbird has memory bandwidth limitations that I don't think the Tegra 250 will. Let's wait and see.
I would also like to point out that I wrote the article that sauron0101 just linked; it's also posted on my blog (linked in my signature), back in March. Tegra2 does feature dual-channel memory support as part of the Cortex A9 architecture, which is a significant advantage.
Sent from my SPH-D700 using XDA App
Electrofreak said:
I would also like to point out that I wrote the article that sauron0101 just linked; it's also posted on my blog (linked in my signature), back in March. Tegra2 does feature dual-channel memory support as part of the Cortex A9 architecture, which is a significant advantage.
Sent from my SPH-D700 using XDA App
Any idea how fast the other Arm Cortex A9s are compared to the 250?
- We know that the SGX540 will be in the TI OMAP 4 series; probably not bandwidth limited - I am surprised that they did not opt for the SGX545
- The Samsung Orion series has Mali 400 (unknown performance)
- The Snapdragon (A8, unless Qualcomm opts to keep the name "Snapdragon" for its A9 CPUs) will have a new generation of Adreno 300 graphics
Unknown if we will see this on mobile
- Marvell also has a new SOC
Also interesting is Samsung's Netbook roadmap, which uses the same SOCs on a phone:
Sorry if all of this is a bit off topic, but it is worth looking at what everyone has.
Edit: Qualcomm is keeping the Snapdragon name for the A9 processors.
Does no one see I was talking about GPU performance? That's what's going to matter in Gingerbread. And that's running at 1024x600; at that res the Galaxy Tab gets around 53 fps. It's the same thing that Vista started doing with HW acceleration, so you understand.

Adreno 225 vs "New" Mali 400MP4

So I have been doing a lot of research looking into which will be better, and I am guessing that the Mali 400 is going to outperform the Adreno 225. I wish Android had a solid GPU test that would give something close to real-world results. But if you take a look at these articles you will see that on paper the Adreno is the same as the Apple 4S's PowerVR SGX543MP2:
http://www.anandtech.com/show/5559/...mance-preview-msm8960-adreno-225-benchmarks/3
Now the International version will have the new Exynos 4 Quad (4412) Quad Core Cortex A9 but the US version is rumored to have a Dual Core Qualcomm Snapdragon MSM8960 with the Adreno 225.
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
and here are some other benchmarks just to sum up the difference in performance.
http://www.anandtech.com/show/5810/samsung-galaxy-s-iii-performance-preview
The main thing I am worried about is GPU performance; the CPUs in just about every phone out right now seem like overkill. I want to make sure the phone I buy will be able to run FPSE (a PlayStation emulator) and N64oid (an N64 emulator) smoothly. FPSE now has an OpenGL plugin that needs a hardcore GPU to run well. My Galaxy Nexus is just not cutting it anymore.
So............... get the international version.
cmd512 said:
So............... get the international version.
LTE Speed vs 3G Speed = not worth it.
Don't get me wrong, I want a phone with a powerhouse GPU, but if the mobile connection is slow it's just not worth it. I'm on Verizon and I don't want to move away from their LTE.
Zzim said:
LTE Speed vs 3G Speed = not worth it.
Don't get me wrong, I want a phone with a powerhouse GPU, but if the mobile connection is slow it's just not worth it. I'm on Verizon and I don't want to move away from their LTE.
I hear ya, LTE is blazing fast. But on my unbranded SGS2, I get downloads of up to 7.5Mbps, pay $10 a month for unlimited HSPA+ data w/ tethering, and everything is plenty fast for what I do on my phone. So, while LTE is tempting for sure, still doesn't outweigh the other benefits.
Now, if I ever need my phone to seed torrents or something, I'll have to look at LTE then... hah.
cmd512 said:
I hear ya, LTE is blazing fast. But on my unbranded SGS2, I get downloads of up to 7.5Mbps, pay $10 a month for unlimited HSPA+ data w/ tethering, and everything is plenty fast for what I do on my phone. So, while LTE is tempting for sure, still doesn't outweigh the other benefits.
Now, if I ever need my phone to seed torrents or something, I'll have to look at LTE then... hah.
Are you with AT&T? Because I can get a line through my work for $20 a month, unlimited everything. 7.5 Mbps would be enough speed to make me switch; how consistent are these speeds?
How could they just give the US version a dual core? That makes the phone a very slight upgrade over the S2.
Sent from my HTC Sensation Z710e using XDA
@ OP
Did you see the date of the article regarding the "Mobile SoC GPU Comparison"? It's dated February, and they are comparing with the SGS2's Mali 400 GPU, not the one in the SGS3. The new Mali GPU is already beating the entire current lineup of GPUs in many benchmarks.
bala_gamer said:
@ OP
Did you see the date of the article regarding the "Mobile SoC GPU Comparison"? It's dated February, and they are comparing with the SGS2's Mali 400 GPU, not the one in the SGS3. The new Mali GPU is already beating the entire current lineup of GPUs in many benchmarks.
The other articles were just to show the performance of the 225; this article shows how the new Mali will run: http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
That article also shows that the GS2 (Mali 400) and the GS3 (Mali 400MP4) are different in some way.
Zzim said:
Are you with AT&T? Because I can get a line through my work for $20 a month, unlimited everything. 7.5 Mbps would be enough speed to make me switch; how consistent are these speeds?
At work (average congestion), it's consistently 7Mbps+. In areas of great congestion (the mall, etc), it does slow down, but again, for work E-mails, surfing the web, youtube, etc, I've never had issues. Of course, I'm in Austin, TX as well, and I've heard HSPA+ speeds are very much region specific.
If you can get a line from work with unlimited everything, they may be able to get you onto the smartphone data plan tier, where some folks have gotten up to 10-11+ Mbps. I'm on the $10 a month unlimited non-smartphone plan, so I think AT&T caps it at around 7.5-8 Mbps. Still, though, plenty fast for what I do with my phone.
(And, the unlimited tethering is a blessing when you're in airports and stuff. Our US airports blow as there is almost never free WIFI.)
Zzim said:
The other articles were just to show the performance of the 225; this article shows how the new Mali will run: http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
The score charts do show the new Mali 400 topping the chart by a good margin. What else do you need from a GPU?
The SGS3's Mali-400 is just overclocked.
Anyways, if the US SGS3 comes with the S4 Pro (which has the new Adreno 320) then the difference in GPU will probably be minor.
Dude, have you seen those scores? It beats the 4S graphics, which we can't deny is a great GPU...
This is more than just an overclocked Mali 400... it may still be a Mali 400/MP4, but it's not just overclocked; it's been reworked and has much higher clocks by the look of it.
Also, I'm not sure about it coming with the S4 Pro with Adreno 320... I heard it's not ready till the end of the year at the earliest. The Mali-400 was the best Android GPU, and now it's the best mobile GPU out atm.
^^
You are right, it's not just overclocked; there are some changes in the hardware which we will learn about eventually in the coming days. I can easily OC my SGS2's Mali 400 to 400 MHz, but you know it won't give the same result as the SGS3, which has many more pixels than the S2.
urmothersluvr said:
How could they just give the US version a dual core? That makes the phone a very slight upgrade over the S2.
Sent from my HTC Sensation Z710e using XDA
More cores doesn't = faster...
Look at AMD's Bulldozer CPU with 8 cores vs Intel's Core i5 with 4 cores... the i5 is faster in basically everything except for very specialized applications.
Faster cores > more cores.
The LTE dual core version of the SGS3 will use Krait S4 cores which are faster than A9 Exynos cores.
I wish Samsung had done dual-core A15s instead of quad-core A9s.
Daemos said:
More cores doesn't = faster...
Look at AMD's Bulldozer CPU with 8 cores vs Intel's Core i5 with 4 cores... the i5 is faster in basically everything except for very specialized applications.
Faster cores > more cores.
The LTE dual core version of the SGS3 will use Krait S4 cores which are faster than A9 Exynos cores.
I wish Samsung had done dual-core A15s instead of quad-core A9s.
Let's be clear on this
CPU vs CPU
Dual core S4 is not quicker than Quad Core Exynos
ph00ny said:
Let's be clear on this
CPU vs CPU
Dual core S4 is not quicker than Quad Core Exynos
Hmmm I don't know about that...
Zzim said:
Hmmm I don't know about that...
I for one certainly do. The Exynos 4412 uses a 32nm fab process, as opposed to nearly every other A9-architecture-based processor (like the 4+1 T3), and High-K metal gate tech, which basically means twice the processing power of the Exynos 4410 dual core with about 20% less power consumption, and that's on a core-against-core basis. The 4410 was used in the Galaxy S II. So even if the Exynos 4412 were dual core, it would already natively be 20% more battery efficient and twice as powerful as last year's model. Clearly we're talking about a lot more than just quad vs dual and 28nm vs 32 or 40. There is a LOT that has gone into the design of the Exynos, for instance keeping it the same size physically as the dual-core model, or accepting 128-bit instructions rather than the paltry 64-bit instructions most other mobile processors are limited to.
Trust me, do your research, a Google search of Exynos 4412 brought up instant results that detail what a beast this chip set is.
Like these:
http://www.phonearena.com/news/Exyn...re-processor-in-the-Samsung-Galaxy-S3_id29615
http://www.phonearena.com/news/Sams...nos-to-appear-in-Samsung-Galaxy-S-III_id29494
And of course the official press release. Read through this and then the benchmarks you pointed out in the OP (I'm linking them anyway). Anandtech's benchmark tests were performed on demo units on display to be handled and groped by hundreds of people. There's no telling how many people had used it before they benchmarked it, and no telling if they were able to do it clean (reboot the device, no other apps running). If not, then they tested it after some fairly heavy use and it still proved itself a beast.
http://phandroid.com/2012/04/25/sam...ynos-4-quad-for-their-next-generation-galaxy/
http://www.anandtech.com/show/5810/samsung-galaxy-s-iii-performance-preview
Research is your best friend. If you're looking for the most powerful CPU and GPU on a phone right now, this is it. And when the devs get a hold of it, it will become even better and will really be utilized to its full.
Sent from my PG86100 using Tapatalk 2
Gene_Bailey said:
I for one certainly do. The Exynos 4412 uses a 32nm fab process as opposed to nearly every other A9 architecture based processor (like the 4+1 T3) and High K metal gate tech which basically means twice the processing power of the Exynos 4410 dual core with about 20% less power consumption and that's on a core against core basis. The 4410 was used in the Galaxy S II. So even if the Exynos was dual core, it's already natively 20% more battery efficient and twice as powerful. Clearly we're talking about a lot more than just quad vs dual and 28nm vs 32 or 40. There is a LOT that has gone into the design of the Exynos. For instance keeping it the same size physically as the dual core model, or accepting 128 bit instructions rather than the paltry 64 bit instructions most other mobile processors are limited to.
Trust me, do your research, a Google search of Exynos 4412 brought up instant results that detail what a beast this chip set is.
Like these:
http://www.phonearena.com/news/Exyn...re-processor-in-the-Samsung-Galaxy-S3_id29615
http://www.phonearena.com/news/Sams...nos-to-appear-in-Samsung-Galaxy-S-III_id29494
And of course the official press release. Read through this and then the benchmarks you pointed out in the OP (I'm linking them anyway). Anandtech's benchmark tests were performed on demo units on display to be handled and groped by hundreds of people. There's no telling how many people had used it before they benchmarked it, and no telling if they were able to do it clean (reboot the device, no other apps running). If not, then they tested it after some fairly heavy use and it still proved itself a beast.
http://phandroid.com/2012/04/25/sam...ynos-4-quad-for-their-next-generation-galaxy/
http://www.anandtech.com/show/5810/samsung-galaxy-s-iii-performance-preview
Research is your best friend. If you're looking for the most powerful CPU and GPU on a phone right now, this is it. And when the devs get a hold of it, it will become even better and will really be utilized to its full.
Sent from my PG86100 using Tapatalk 2
Zzim said:
Hmmm I don't know about that...
Outside of floating-point tests such as Linpack, CPU benches will even have the quad-core Tegra 3 well ahead of the dual-core S4.
Quadrant, AnTuTu, etc. will all show the exact same performance gap, and it's a big one.
Let's get this straight
Main selling points for Dual Core S4 setup = battery life from 28nm die size and integrated LTE
Spartoi said:
The SGS3's Mali-400 is just overclocked.
Anyways, if the US SGS3 comes with the S4 Pro (which has the new Adreno 320) then the difference in GPU will probably be minor.
You are wrong; the Mali 400MP4 is a quad-core GPU while the Mali 400 is a dual-core GPU.
I want to see the Mali 400MP4 against the SGX543MP4.

GLBenchmark HTC ONE only 34 FPS @ Egypt HD 1080P Offscreen

The HTC One's GLBenchmark score is only 34 FPS at 1080p offscreen; this is much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test, both using the Snapdragon 600. IIRC the HTC One is using LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use LPDDR3 (which is supported by the S600)?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
how do you know HTC One uses LPDDR2 memory
kultus said:
how do you know HTC One uses LPDDR2 memory
http://www.htc.com/uk/smartphones/htc-one/#specs
http://www.anandtech.com/show/6754/hands-on-with-the-htc-one-formerly-m7/2
Turbotab said:
The HTC One's GLBenchmark score is only 34 FPS at 1080p offscreen; this is much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test, both using the Snapdragon 600. IIRC the HTC One is using LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use LPDDR3 (which is supported by the S600)?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
My first question would be: how did they even get a benchmark of the SHV-E300?
Xistance said:
My first question would be: how did they even get a benchmark of the SHV-E300?
How do any results appear on GLBenchmark?
I believe that with GLBenchmark, if you don't register/log in before running the test, it automatically uploads the result to their server for public viewing, so maybe it was done intentionally, or somebody forgot to log in?
fp581 said:
He is spamming all around the HTC One forums; just look at his posts. Please ban him from posting in any HTC forum ever again.
He probably works for Sony, Nokia, or Samsung.
Who are you talking about?
Sorry, wrong person; I'll delete that last one.
But I would love pics of that benchmark for proof.
fp581 said:
Sorry, wrong person; I'll delete that last one.
But I would love pics of that benchmark for proof.
Dude, I was going to go atomic; I admit it, I have a terrible temper.
I believe the benchmark was run by a German Android site called Android Next. There is a video on YouTube; the GLBenchmark run starts at 2:22.
http://www.youtube.com/watch?v=Wl1dmNhhcXs&list=UUan0vBtcwISsThTNo2uZxSQ&index=1
Thanks, Turbo, for advancing my knowledge... what a shame they didn't choose LPDDR3, but I think it's not an issue these days.
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has DDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It was most likely not worth the extra cost.
Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has DDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It was most likely not worth the extra cost.
GLBenchmark is a test of GPU performance and isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz, ran GLBenchmark, and got 10 FPS. I then ran the test again with the CPU at 1.6 GHz; the result, 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason is that my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s max bandwidth in dual-channel configuration.
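Those two bandwidth figures can be reproduced from the bus parameters: LPDDR transfers data on both clock edges over a 32-bit channel, so peak bandwidth is clock x 2 x 4 bytes per channel. A quick check (the 32-bit channel width and dual-channel setup are the usual phone configuration, assumed here):

```python
def lpddr_peak_gb_s(clock_mhz, channels=2, bus_bits=32):
    """Peak bandwidth = clock x 2 transfers/cycle (DDR) x bytes/transfer x channels."""
    bytes_per_transfer = bus_bits / 8
    return clock_mhz * 1e6 * 2 * bytes_per_transfer * channels / 1e9

print(f"LPDDR2 @ 533 MHz, dual channel: {lpddr_peak_gb_s(533):.1f} GB/s")  # ~8.5
print(f"LPDDR3 @ 800 MHz, dual channel: {lpddr_peak_gb_s(800):.1f} GB/s")  # 12.8
```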
Turbotab said:
GLBenchmark is a test of GPU performance and isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz, ran GLBenchmark, and got 10 FPS. I then ran the test again with the CPU at 1.6 GHz; the result, 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason is that my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s max bandwidth in dual-channel configuration.
In that case the results are quite disappointing.
All these fantastic new phones, and so much disappointment.
Sent from my GT-I9300 using xda premium
Tomatoes8 said:
They could have used faster memory for the same price if they hadn't cut off Samsung as a supplier. Makes you wonder where their priorities lie: making the best products possible, or just going through the motions.
No one is going to take anything you say here seriously, as you've managed to have 2 threads closed in the last 30 mins. One of those inane posts you made involved you saying that HTC is going to be paying, according to your genius calculation, 20% of their profits to Apple (I forget what insanely unintelligent reason you gave). Yeah, because being able to completely migrate data from 1 completely different phone to another is such a bad idea for a company that wants to push their product.
So, what is the per unit cost of what HTC is paying for RAM now vs. what they could have gotten from Samsung? Exactly, you have no idea. I also didn't hear anything about HTC "cutting off" Samsung as a supplier, but maybe I missed it, so I google'd "htc cut off samsung supplier" and found 2 links...
http://tech2.in.com/news/smartphones/following-apple-htc-cuts-component-orders-from-samsung/505402
http://www.digitimes.com/news/a20121009PD213.html
I'm not sure if you have the capability of reading or not, but I'll spoon feed you this information, ok hunny? I've taken the info from the 1st link, since there is more there.
After Apple Inc slashed its orders for memory chips for its new iPhone from major supplier and competitor, Samsung Electronics Co Ltd, HTC too has reportedly cut down on its smartphone component orders from the South Korean company.
So, Apple cut down on memory orders. You know, they are the ones who make the iPhone? Have a logo of an apple on their products? Steve Jobs was the CEO before he died. Anyway, I'll continue...
According to a report by DigiTimes, HTC has reduced its orders from Samsung, and instead opted to order CMOS image sensors from OmniVision and Sony. The company has also chosen to move part of its AMOLED panel orders to AU Optronics, DigiTimes reported citing ‘sources’.
Notice it said that HTC reduced its orders from Samsung, specifically on the image sensors (that's for the camera, if you didn't know) and the screen. You know, the thing on the front of your phone that you touch to make it do things? You know what I mean, right? I encourage you to read this link (or possibly have someone read it to you)...
http://dictionary.reference.com/browse/reduce
The point is that reduce isn't the same as cut off. Cutting off would require HTC not ordering ANYTHING from Samsung. Guess what? The One doesn't use an OmniVision CMOS sensor (don't forget, that's what the camera uses) or an AMOLED screen (the bright part of your phone that shows you info).
Also, this is a far better designed phone, especially in regard to hardware, than anything Samsung has ever produced. I went back to my EVO 4G LTE mainly because I couldn't stand the terrible build quality of the Note 2. It just feels like a cheap toy. And, IMO, Sense is far better than TW. Samsung may have the market right now because of the Galaxy line of products, but that doesn't mean HTC is out of the game by any means.
Seriously, attempt to use just a bit of intelligence before opening your mouth and spewing diarrhea throughout the One forums. As the saying goes: "it's better to keep your mouth shut and have people think you're an idiot, than to open your mouth and prove it". Unfortunately for you, it's too late.
I really think Turbo was too hasty in opening a new thread for this, as we've been discussing it in the mega thread.
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
It scores 34 fps in Egypt HD 1080p offscreen, while the leaked Samsung S600 device scores 41 fps, which is perfectly in line with Qualcomm's promised speed (3x Adreno 225).
Here is a video of what I suspect is the source of the benchmark, since we had no benchmark before it:
http://www.youtube.com/watch?v=Wl1dmNhhcXs
Notice how the battery is almost at its end (the HTC bar at this stage means it's in the last 25%); also notice the activity in the notification area.
More importantly, the poster ran more than a few full benchmarks, like Quadrant, before running GLBenchmark; this alone is enough to lower the score, especially since the Adreno 320 was known to throttle in the Nexus 4.
I think benchmark scores should not be relied on at such events, especially with hundreds of hands messing with the device. We learned this from the One X launch, where videos popped up showing horrible performance from the One X that eventually turned out to be very far from the final device in your hands.
Finally, the One X and the Nexus 7, at the same GPU clock but with the first on DDR2 and the second on DDR3, score the same in GLBenchmark.
In other words, it's worrying, but it's best to wait for proper testers like Anand.
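For anyone who wants to sanity-check the throttling point above on their own device, here's a rough sketch that polls the Adreno GPU clock between benchmark runs. It assumes a rooted Qualcomm device exposing the usual kgsl sysfs node; the path and permissions differ between ROMs and kernels, so treat this as illustrative rather than a guaranteed recipe:

```python
# Minimal sketch: watch the Adreno GPU clock to spot thermal throttling
# between benchmark runs. Assumes a rooted device with the common kgsl
# sysfs node (path is an assumption; adjust for your ROM/kernel).
import time

GPU_CLK = "/sys/class/kgsl/kgsl-3d0/gpuclk"

def read_gpu_clock_mhz():
    with open(GPU_CLK) as f:
        return int(f.read().strip()) / 1_000_000   # node reports Hz

if __name__ == "__main__":
    # Sample once a second; a clock stuck well below maximum right after a
    # benchmark run is a sign the GPU is throttling.
    for _ in range(30):
        print(f"{read_gpu_clock_mhz():.0f} MHz")
        time.sleep(1)
```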
Thread cleaned
...from some serious trolling. There should be no trace of him for some time.
but remember:
But...
I just wonder why a Samsung phone uses high-end parts from Qualcomm instead of Samsung's own processors. But I'm not into Samsung devices so far, so I won't judge.
Gz
Eddi
Here's a second video also showing the Egypt off-screen bench at 34 FPS.
https://www.youtube.com/watch?v=wijp79uCwFg
Skip to 3:30
Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 fps difference when you factor in the difference in UI.
If the Samsung device really does have DDR3 RAM, and the difference (after accounting for clock speed) is 2-3 fps, I can understand why HTC opted not to include it. It most likely wasn't worth the extra cost.
Click to expand...
Click to collapse
So you're saying that 200 MHz on the CPU can account for 7 fps on a GPU test?
Following that logic, the Nexus 4 should have scored 27 fps, since it runs 200 MHz lower...
But no, it scored 33.7, only 0.3 fps less than the One!
And you know why? First, both use the same GPU (which is what counts in a graphics test), and second, HTC phones are always a bit slower due to Sense!
So stop *****ing and realize that the One is no god phone.
The Samsung device is running 4.2.1.
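To put rough numbers on the clock-scaling argument a few posts up: if the off-screen test were CPU-bound, the score should scale roughly with CPU clock, which is clearly not what the reported figures show. A back-of-the-envelope check, using the clock speeds mentioned in this thread:

```python
# Back-of-the-envelope check: would a CPU-bound test explain the reported scores?
one_fps, one_clock_ghz = 34.0, 1.7     # HTC One result and clock, as reported above
nexus4_clock_ghz = 1.5                 # Nexus 4 clock (same Adreno 320 GPU)

expected_if_cpu_bound = one_fps * nexus4_clock_ghz / one_clock_ghz
print(f"expected if CPU-bound: ~{expected_if_cpu_bound:.0f} fps")   # ~30 fps
print("reported for the Nexus 4: 33.7 fps -> the test is GPU-bound")
```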

[Discussion] M9+ benchmarks, real life performance experiences

All HTC M9+ owners!
We're only a handful here on XDA yet, but we're getting more numerous as the device rolls out to more locations, and people who look for an elegant, high-end device with premium build quality and extras like the fingerprint scanner, the 2K display and the Duo camera settle on this fantastic device. Unfortunately it's not perfect at everything, and it's not the best gaming phone out there, but in my experience it's a very well performing device in terms of call quality, reception, WiFi strength, multimedia, battery, display panel and overall UX feel and smoothness. High-end games from 2015 might stutter a bit here and there or drop some important shaders to compensate, but it's fine for earlier titles (3D games from 2014 and before, and all kinds of 2D games). Let's gather the experiences and benchmarks of this unique device, which was maybe the first MTK device in an alu body.
Let's discuss performance-related experience, real user feel and benchmarks, free of whining, facing the truth that in some respects it's not a top-notch device, and let the curious ones who are considering buying it know what to expect from this elegant business-class phone.
I'll start with some of my benchmark result screenshots in separate posts.
UPDATE:
Here's my short game testing video on M9+
https://www.youtube.com/watch?v=QmLGCoI4NLw
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real-life experience is that in everyday chores the phone is very snappy, with some small lags when loading new apps into memory. Task switching is quite good.
There's unfortunately a memory leak issue in Android 5.0.x that kicks in after a while (as on most 5.0.x devices), but overall the UX smoothness is quite good. Occasionally some apps bring up popup windows with a bit of stutter; the Facebook comments animation, for example, tends to stutter.
The Sense Home/BlinkFeed experience is just perfect. In normal operation, when no big application updates are happening in the background, I never faced any lag on the Sense Home UI.
As for games, titles from 2014 and before run perfectly. Newer ones might get some shaders removed or polygon counts reduced, so don't expect a jaw-dropping 3D experience. If you are OK with last years' (2014 and before) 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to match the PowerVR GPU in the M9+'s MTK chipset. The phone keeps a good temperature mostly, except when charging the battery while playing 3D-heavy games (but that's expected of most devices, especially with an alu body).
The screen quality is quite good; I got a perfect panel with the first unit, the refresh rate is 60 Hz, it's smooth, and the 2K resolution gives a lovely DPI and brightness.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU. All tests show 3D performance far below 2014's flagships. The PowerVR G6200 GPU is from the same generation as the one built into the iPhone 5s, which, let's face it, is a mediocre GPU for a 2K display. If you face this fact and accept that gaming won't be with the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
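For a sense of scale on the resolution point above, here's a quick pixel-count comparison (panel resolutions are the commonly quoted ones, so treat the ratios as approximate):

```python
# Quick comparison: how many pixels a similar-class GPU has to drive.
resolutions = {
    "iPhone 5s (1136x640)": 1136 * 640,
    "1080p (1920x1080)":    1920 * 1080,
    "M9+ 2K (2560x1440)":   2560 * 1440,
}
base = resolutions["iPhone 5s (1136x640)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.1f}x the iPhone 5s")
# The 2K panel is roughly 5x the pixels that class of GPU drove in the iPhone 5s.
```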
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, with a few hits and misses. I'll be back after a few days of using it with more adequate feedback.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years ahead on the M9.
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years ahead on the M9.
Click to expand...
Click to collapse
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
tbalden said:
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
Click to expand...
Click to collapse
I see, I didn't notice that. I don't understand MediaTek though: why did they put such an outdated graphics core on their 'flagship' processor?
DeadPotato said:
I see, I didn't notice that. I don't understand MediaTek though: why did they put such an outdated graphics core on their 'flagship' processor?
Click to expand...
Click to collapse
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
Click to expand...
Click to collapse
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
DeadPotato said:
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
Click to expand...
Click to collapse
I second that.
The X10 is MTK's first high-end-tier SoC. They should have equipped it with a high-end GPU to make a name for it, not one from two years ago, the same as the iPhone 5s era. Most casual gaming is fine, but that's not what "flagship" devices are supposed to deliver.
It would be excellent if there were some possibility to tweak it at the kernel level, but the MediaTek Helio X20 really needs to do better and better on the GPU part.
I'm not a heavy gamer, so the UX is pretty good for me. (The camera is another topic.)
Even when I measured the CPU speed with a chess game, the result reached the top tier.
tbalden said:
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
Click to expand...
Click to collapse
FYI.
Mine: stock, Taiwanese base 1.08, rooted device.
Sigh... the result was maxed out for my device on Ice Storm "Extreme".
Very similar result to mine, except Ice Storm Unlimited. I think it might be related to thermal throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I've really been looking forward to the HTC M9+; I think it's what the M9 should have been, but I don't understand HTC's 2015 sh*tty choices on everything.
What I would like to know about is battery life; can you guys tell me what it's like?
I just bought a Galaxy S6 Edge a few days ago and it's terrible: I have no battery left by the end of the afternoon. I don't play games, just a few photos and browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only hope now is the Lenovo Vibe X3, but there has been no news since March...
Thanks guys!
After installing more and more synchronized software, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+, with average screen-on time around 4 hours. All in all, no miracle, but that's about what I usually got with my previous phone, the OPO, with its larger battery and lower screen resolution. A bit less, though, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic; my friend's Z3C gets 3+ days with 5 hours of screen-on time. And I think that's achieved with a great stock kernel tuned by Sony engineers and the Sony stock ROM... wish other phones could do that.
Not sure why, but I cannot choose "no lock screen" or "lock screen without security" under the security settings. Both options are greyed out, and it says they are disabled by the administrator, encryption policies or other apps.
I only have Prey and Lookout Security installed, which could affect this, but it was totally fine on my M7.
Anyone have any idea? Thank you.
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change; it's really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change; it's really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
Click to expand...
Click to collapse
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU really does hold the result down, right?
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU really does hold the result down, right?
Click to expand...
Click to collapse
Yes, and it's fairly accurate.
I got 11871 at 2K; I couldn't screenshot it, as the tool doesn't know how to (it doesn't work above 1080p without dedicated software).
M8 at 1080p.
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has little real-life use, and as you can see single-core performance is much better on the 801 :/.
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has little real-life use, and as you can see single-core performance is much better on the 801 :/.
Click to expand...
Click to collapse
Hmmm...
The efficiency of software decoding shows up vividly with multiple cores in real life.
There are plenty of comparisons between the MT6795 (not the "T") and the SD801 on this site,
covering browser speed, RAR compression, real gaming, power consumption...
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU really does hold the result down, right?
Click to expand...
Click to collapse
Are those results after reducing the resolution to 1080p? Did I read that correctly? Because if that's the case, it's sad to confirm the limits of the 3D graphics core, as others stated before. If you look at the M8 result above (which I find surprisingly higher than what I experienced on my own M8 anyway), you'll notice that even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multicore-wise the Helio X10 is a beast, but graphics-wise, it looks like it's not so much.
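As a rough yardstick for comparing the 1080p and 2K runs in this thread: only the GPU-bound part of a benchmark scales with pixel count, so the ratio below is an upper bound on what dropping the resolution can buy (the example score is hypothetical, purely for illustration):

```python
# Rough sketch: how much a 3D score could change purely from pixel count when
# dropping the render resolution from 2K to 1080p. Only the GPU-bound part of a
# benchmark scales like this; CPU sub-scores shouldn't move much.
px_2k, px_1080p = 2560 * 1440, 1920 * 1080
scale = px_2k / px_1080p
print(f"2K pushes {scale:.2f}x the pixels of 1080p")        # ~1.78x
example_2k_score = 10000                                     # hypothetical GPU sub-score at 2K
print(f"upper bound at 1080p: ~{example_2k_score * scale:.0f}")
```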