GLBenchmark HTC ONE only 34 FPS @ Egypt HD 1080P Offscreen - One (M7) General

The HTC One scores only 34 FPS in GLBenchmark's Egypt HD 1080p offscreen test, much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test; both use the Snapdragon 600. IIRC the HTC One uses LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use the LPDDR3 that the S600 supports?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One

how do you know HTC One uses LPDDR2 memory

kultus said:
how do you know HTC One uses LPDDR2 memory
http://www.htc.com/uk/smartphones/htc-one/#specs
http://www.anandtech.com/show/6754/hands-on-with-the-htc-one-formerly-m7/2

Turbotab said:
The HTC One scores only 34 FPS in GLBenchmark's Egypt HD 1080p offscreen test, much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test; both use the Snapdragon 600. IIRC the HTC One uses LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use the LPDDR3 that the S600 supports?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
My first question would be: how did they even get a benchmark of the SHV-E300?

Xistance said:
My first question would be: how did they even get a benchmark of the SHV-E300?
How do any results appear on GLBenchmark?
I believe that if you don't register / log in before running the test, GLBenchmark automatically uploads the result to its server for public viewing, so maybe it was done intentionally, or somebody forgot to log in?

fp581 said:
He is spamming all around the HTC One forums, just look at his posts. Please ban him from posting in any HTC forum ever again.
He probably works for Sony, Nokia or Samsung.
Who are you talking about?

Sorry, wrong person, I'll delete that last one.
But I would love pics of that benchmark for proof.

fp581 said:
Sorry, wrong person, I'll delete that last one.
But I would love pics of that benchmark for proof.
Dude, I was going to go atomic; I admit I have a terrible temper.
I believe the benchmark was run by a German Android site called Android Next. There is a video on YouTube, and the GLBenchmark run starts at 2:22.
http://www.youtube.com/watch?v=Wl1dmNhhcXs&list=UUan0vBtcwISsThTNo2uZxSQ&index=1

Thanks Turbo for advancing my knowledge... what a shame they didn't choose LPDDR3, but I don't think it's a big issue these days.

Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has LPDDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It most likely was not worth the extra cost.

Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has LPDDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It most likely was not worth the extra cost.
GLBenchmark is a test of GPU performance; it isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz, ran GLBench and got 10 FPS. I then ran the test again with the CPU at 1.6 GHz, and the result was 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason is that my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in a dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s, in a dual-channel configuration.
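For anyone who wants to sanity-check those bandwidth figures, here is a minimal back-of-the-envelope sketch in C. It assumes 32-bit (4-byte) channels, double data rate, and a dual-channel controller; those assumptions reproduce the 8.5 / 12.8 GB/s numbers quoted above, but neither phone's actual memory controller configuration is confirmed in this thread.
Code:
#include <stdio.h>

/* Rough peak bandwidth: transfers per second x bytes per transfer x channels. */
static double bandwidth_gbs(double clock_mhz, int bus_bytes, int channels)
{
    double transfers_per_s = clock_mhz * 1e6 * 2.0;  /* DDR: two transfers per clock */
    return transfers_per_s * bus_bytes * channels / 1e9;
}

int main(void)
{
    /* assumed 32-bit channels, dual channel */
    printf("LPDDR2 @ 533 MHz: %.1f GB/s\n", bandwidth_gbs(533.0, 4, 2));  /* ~8.5  */
    printf("LPDDR3 @ 800 MHz: %.1f GB/s\n", bandwidth_gbs(800.0, 4, 2));  /* ~12.8 */
    return 0;
}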

Turbotab said:
GLBenchmark is a test of GPU performance; it isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz, ran GLBench and got 10 FPS. I then ran the test again with the CPU at 1.6 GHz, and the result was 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason is that my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in a dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s, in a dual-channel configuration.
In that case the results are quite disappointing.

All these fantastic new phones, and so much disappointment.
Sent from my GT-I9300 using xda premium

Tomatoes8 said:
They could have used faster memory for the same price if they didn't cut off Samsung as a supplier. Makes you wonder where their priorities lie: making the best products possible, or just going through the motions.
No one is going to take anything you say here seriously, as you've managed to have 2 threads closed in the last 30 mins. One of those inane posts you made involved you saying that HTC is going to be paying, according to your genius calculation, 20% of their profits to Apple (I forget what insanely unintelligent reason you gave). Yeah, because being able to completely migrate data from 1 completely different phone to another is such a bad idea for a company that wants to push their product.
So, what is the per unit cost of what HTC is paying for RAM now vs. what they could have gotten from Samsung? Exactly, you have no idea. I also didn't hear anything about HTC "cutting off" Samsung as a supplier, but maybe I missed it, so I google'd "htc cut off samsung supplier" and found 2 links...
http://tech2.in.com/news/smartphones/following-apple-htc-cuts-component-orders-from-samsung/505402
http://www.digitimes.com/news/a20121009PD213.html
I'm not sure if you have the capability of reading or not, but I'll spoon feed you this information, ok hunny? I've taken the info from the 1st link, since there is more there.
After Apple Inc slashed its orders for memory chips for its new iPhone from major supplier and competitor, Samsung Electronics Co Ltd, HTC too has reportedly cut down on its smartphone component orders from the South Korean company.
So, Apple cut down on memory orders. You know, they are the one's who make the iPhone? Have a logo of an Apple on their products? Steve Jobs was the CEO before he died. Anyway, I'll continue...
According to a report by DigiTimes, HTC has reduced its orders from Samsung, and instead opted to order CMOS image sensors from OmniVision and Sony. The company has also chosen to move part of its AMOLED panel orders to AU Optronics, DigiTimes reported citing ‘sources’.
Notice it said that HTC reduced its orders from Samsung, specifically on the image sensors (that's for the camera, if you didn't know) and the screen. You know, the thing on the front of your phone that you touch to make it do things? You know what I mean, right? I encourage you to read this link (or possibly have someone read it to you)...
http://dictionary.reference.com/browse/reduce
The point is that reduce isn't the same as cut off. Cutting off would require HTC not ordering ANYTHING from Samsung. Guess what? The One doesn't use an OmniVision CMOS sensor (don't forget, that's what the camera uses) or an AMOLED screen (the bright part of your phone that shows you info).
Also, this is a far better designed phone, especially in regards to hardware, than anything Samsung has ever produced. I went back to my EVO 4G LTE, mainly because I couldn't stand the terrible build quality of the Note 2. It just feels like a cheap toy. And, IMO, Sense is far better than TW. Samsung may have the market right now because of the Galaxy line of products, but that doesn't mean that HTC is out of the game by any means.
Seriously, attempt to use just a bit of intelligence before opening your mouth and spewing diarrhea throughout the One forums. As the saying goes: "it's better to keep your mouth shut and have people think you're an idiot, than to open your mouth and prove it". Unfortunately for you, it's too late.

I really think Turbo was too hasty to open a new thread for this, as we've been discussing it in the mega thread.
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
It scores 34 FPS in Egypt HD 1080p offscreen, while the leaked Samsung S600 device scores 41 FPS, which is perfectly in line with Qualcomm's promised speed (3x Adreno 225).
Here is a video of what I suspect is the source of the benchmark, because we had no benchmark before it:
http://www.youtube.com/watch?v=Wl1dmNhhcXs
Notice how the battery is almost empty (the HTC bar at this stage means it's in the last 25%); also notice the activity in the notification area.
More importantly, the poster ran more than a few full benchmarks, like Quadrant, before running GLBenchmark. This alone is enough to lower the score, especially since the Adreno 320 was known to throttle in the Nexus 4.
I think benchmark scores should not be relied on at such events, especially with hundreds of hands messing with the device. We learned this from the One X launch, where videos popped up showing horrible performance that eventually turned out to be very far from the final device in your hands.
Finally, the One X and the Nexus 7 at the same GPU clock, the first with DDR2 and the second with DDR3, score the same in GLBenchmark.
In other words it's worrying, but it's best to wait for proper testers like Anand.
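To illustrate the throttling point, here is a toy sketch in C. The starting score and the 5% clock loss per back-to-back run are invented purely for illustration; they are not measurements from the One or any other device.
Code:
#include <stdio.h>

int main(void)
{
    double fps_cool = 41.0;       /* hypothetical score on a cool chip */
    double clock_scale = 1.0;

    /* assume each back-to-back benchmark run heats the SoC and costs ~5% GPU clock */
    for (int run = 1; run <= 4; run++) {
        printf("run %d: ~%.1f fps\n", run, fps_cool * clock_scale);
        clock_scale *= 0.95;
    }
    return 0;
}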

Thread cleaned
...from some serious trolling. There should be no trace of him for some time.
but remember:

But...
I'm just surprised that a Samsung phone uses high-end parts from Qualcomm instead of Samsung's own processors. But I haven't been into Samsung devices so far, so I won't judge.
Gz
Eddi

Here's a second video also showing Egypt off screen bench at 34FPS.
https://www.youtube.com/watch?v=wijp79uCwFg
Skip to 3:30

Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has LPDDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It most likely was not worth the extra cost.
So you're saying that 200 MHz on the CPU can account for 7 FPS in a GPU test?
Following what you said, the Nexus 4 should have scored 27 FPS, since it has 200 MHz less...
But no, it scored 33.7... only 0.3 FPS less than the One!
And you know why? First, both use the same GPU (which is what counts in a graphics test), and second, HTC phones are always slower due to Sense!
So stop *****ing and realize that the One is no god phone.

Samsung device is running 4.2.1

Related

G2 Processor faster than Nexus One?

I constantly see references to the G2 beating the Nexus One in benchmark tests but the benchmarks always include GPU performance as part of the score. It was established before the phone came out that the GPU performance would beat just about everything else (save the Galaxy S), however that is somehow conflated with CPU performance.
Yes, the G2 can and probably will be overclocked once root is achieved but for now, apples to apples comparison:
Overall score is higher than Nexus One but CPU score is lower
I don't know that this is all that significant and I'd still take my G2 over a Nexus One, but please stop spreading misinformation by saying how the "Scorpion CPU pwnz the Snapdragon LOLZWTF!!!!1!!"
And I'm sure once the G2 gets clocked at its native speed it's going to put your point to rest... The G2 still runs faster than my Nexus. I don't care about numbers; what I care about is real-world performance.
Sent from my T-Mobile G2 using XDA App
Both are Snapdragon SoCs (G2 and N1), and both have Scorpion CPUs. The newer one is on a 45nm die while the older is on 65nm (less heat and lower power consumption, and as a result longer battery life on the G2, thanks to the 45nm die).
CPU architecture-wise there is not much difference. The application processing, however, has been tweaked to perform better overall (thanks to the 45nm die, I believe), and obviously, as you can see, the GPU, the Adreno 205, thrashes the older Adreno 200.
Also, the newer SoC supports more video codecs (mainly DivX, actually): (MPEG-4, H.264, H.263, VC-1, DivX, DivX 3.11, Sorenson Spark, VP6) vs (MPEG-4, H.264, H.263, VC-1, Sorenson Spark, VP6).
msmith1991 said:
And I'm sure once the G2 gets clocked at its native speed it's going to put your point to rest
I constantly see references to the G2 beating the Nexus One in benchmark tests
Yes, the G2 can and probably will be overclocked once root is achieved but for now, apples to apples comparison
I don't know if I came across as bashing the phone. Maybe it was when I said I still prefer it over a Nexus one?
Regardless, my point wouldn't be "put to rest." For now, the CPU on the Nexus One outperforms the G2 on a benchmark. That is what I was addressing, nothing more.
Superfrag said:
Both are Snapdragon SoCs (G2 and N1), and both have Scorpion CPUs. The newer one is on a 45nm die while the older is on 65nm (less heat and lower power consumption, and as a result longer battery life on the G2, thanks to the 45nm die).
CPU architecture-wise there is not much difference. The application processing, however, has been tweaked to perform better overall (thanks to the 45nm die, I believe), and obviously, as you can see, the GPU, the Adreno 205, thrashes the older Adreno 200.
Also, the newer SoC supports more video codecs (mainly DivX, actually): (MPEG-4, H.264, H.263, VC-1, DivX, DivX 3.11, Sorenson Spark, VP6) vs (MPEG-4, H.264, H.263, VC-1, Sorenson Spark, VP6).
My personal experience matches what you've said exactly. Barely a hiccup running the phone and excellent battery life (relatively speaking at least). Like I said, the only thing I'm addressing is people using benchmarks which include the GPU in the score to say that the G2 processor is faster at 800mhz than the N1 at 1ghz.
Is there anything like Adobe Flash benchmark or test page? G2 is supposed to hardware accelerate Flash.
G2 is pure sex compared to my last phone, motorola cliq hahah
Is there anything like Adobe Flash benchmark or test page? G2 is supposed to hardware accelerate Flash.
This is particularly interesting, as the phone seems to load and scroll webpages faster than my Nexus One did with Enom's ROM. Well, the G2 is faster in everything, basically. Not by a lot, but enough to be noticeable.
What is interesting is the obnoxiously higher IO score in the G2 compared to the N1.
I wonder what is entailed in an IO test on a phone...
Here is some light reading, and some charts to look at:
http://androidandme.com/2010/10/news/3dmarkmobile-gpu-showdown-adreno-205-vs-powervr-sgx540/
But just remember that the Galaxy class is still stuck on 2.1, so we are missing some updated drivers and JIT, and yes, that does make a difference.
Oh, I forgot.
The better IO scores, I believe, are thanks to the fact that HTC decided to go with an ext3 partition for its OS, thus getting faster reads/writes (I'm not sure about this... I think it's thanks to the ext3 partition).
Also, like I said, the CPU architecture is similar (very similar actually); just the overall application processing is better due to tweaks, plus the hardware Flash acceleration one guy mentioned, which I forgot to bring up, and obviously the GPU. Basically it's a very well refined package compared to the Snapdragons that were on the N1/Desire.
Next year's 3rd-gen Snapdragons will have dual-core Scorpion CPUs, even better tweaking and optimization, and a HUGE GPU improvement.
Qualcomm says that the Adreno 220 is 4-5 times faster than the Adreno 205.
carlitozway57 said:
My personal experience matches what you've said exactly. Barely a hiccup running the phone and excellent battery life (relatively speaking at least). Like I said, the only thing I'm addressing is people using benchmarks which include the GPU in the score to say that the G2 processor is faster at 800mhz than the N1 at 1ghz.
Yup, at stock speeds the Nexus One's CPU is faster simply due to clock speed(since they have similar architecture). BUT, thanks to the overall package, tweaking and optimization, and a better GPU, the new 2nd gen Snapdragons are extremely well refined and thus the phone runs MUCH smoother. That's why you see Linpack scores of the G2 and N1 are very similar.
Superfrag said:
Yup, at stock speeds the Nexus One's CPU is faster simply due to clock speed(since they have similar architecture). BUT, thanks to the overall package, tweaking and optimization, and a better GPU, the new 2nd gen Snapdragons are extremely well refined and thus the phone runs MUCH smoother. That's why you see Linpack scores of the G2 and N1 are very similar.
agreed
my G2 scores 1675 in Quadrant
smoothness comparable to the iPhone, but running at just 245 MHz!
many people got scared off by the 800 MHz mark
but I would proudly say they have underestimated that babe
running at 800 MHz brings satisfactory battery stamina as well
o>c said:
agreed
my G2 scores 1675 in Quadrant
smoothness comparable to the iPhone, but running at just 245 MHz!
many people got scared off by the 800 MHz mark
but I would proudly say they have underestimated that babe
running at 800 MHz brings satisfactory battery stamina as well
Tell me about it, it's the smoothest Android phone out there and fastest too I might add.
Once we root and O/C this baby it'll fly!
Think of the difference between the N1 QSD8250 vs the G2 MSM7230 as a Core 2 Duo E6700 vs Core 2 Duo E7200 for the most part. One is older, supports slightly less instructions, but the overall architecture is the same. They just refined it a bit when they made the switch to 45nm is all.
After using almost every Snapdragon phone over the last few months, including my old Nexus One, I think the G2's 800 MHz CPU reminds me of the Intel Centrino CPUs: although they had lower clock rates, they were fast CPUs. From what I see and read, the G2 as a package is the fastest phone on the market. The CPU has been optimised, and after going through many Android phones, this one is by far the smoothest and fastest of them all in my opinion.
Sent from my T-Mobile G2 using XDA App
This is the highest I've been able to get it to
carlitozway57 said:
I don't know that this is all that significant and I'd still take my G2 over a Nexus One, but please stop spreading misinformation by saying how the "Scorpion CPU pwnz the Snapdragon LOLZWTF!!!!1!!"
I haven't done any benchmarking but I can definitively say the G2 is far and away snappier and seems to run much smoother than the N1. It's the first time an android phone has seemed to just flow and not get in the way of itself. Granted I stopped using cooked roms when I ditched winmo, so I can't speak for CM6.
Sent from my T-Mobile G2 using Tapatalk
Of course the absolutely correct statement would be chipset vs chipset, not processor vs processor, but that's all semantics. You cannot compare processor vs processor when it comes to mobile devices. These are all SoCs; you cannot separate them.
And the fact is, the G2 as a whole has better performance than the N1 as a whole. The reason people keep harping on the fact that the G2 is faster than the N1 in the first place is just to make the point that being 800 MHz does not mean the phone as a whole is slower than the N1, and thus less qualified for a Gingerbread upgrade (which in turn shows that the so-called rumour of a 1 GHz requirement for Gingerbread is not logical). How can one say that just because the N1 has a 1 GHz processor and the G2 has an 800 MHz processor, the G2 is not as qualified as the N1 to get Gingerbread (again, assuming the 1 GHz requirement), when as a whole the phone is faster than the N1? So we can conclude that the 1 GHz requirement is just bull****, and that the G2 is not handicapped by having an 800 MHz processor.
Sorry, did I confuse you? I think I confused myself.
I was able to get it 1 point higher lmao
Sent from my T-Mobile G2 using XDA App
Some of you keep thinking the G2 has a snapdragon processor in it. It's a scorpion processor not a 2nd gen snapdragon.

Is HummingBird Really Slower than Snapdragon Gen 2? [Independent of JIT]

I know this topic has been debated over time, but I noticed that most people attribute the differences in performance to the firmware difference (2.1 vs. 2.2).
Today there's an article about the G2 being overclocked to 1.42 GHz. Along with the article I noticed a "Native Benchmark" using SetCPU, which doesn't use JIT.
Lower is better.
G2 Result:
Now My Vibrant at 1.2 Ghz:
C: 702.89
Neon: 283.15
The difference between the two phones is so great that I doubt it is due to the 200 MHz difference alone.
As a comparison, my score at regular 1 GHz is:
C: 839.21
Neon: 334.51
There is about a 130 ms decrease for a 200 MHz overclock, which means that if the Vibrant were at 1.4 GHz, the two CPUs would be really close to each other, with the G2 still having a slight edge. Remember, this test is supposed to be JIT-independent, running native code. But since the Vibrant can only be stably overclocked to 1.3 GHz (with what is available anyway), the newer generation of Snapdragon may just be more efficient than the Hummingbird, despite what we Galaxy owners believe.
Another thing to keep in mind, though, is that Snapdragons supposedly have an edge in the NEON instruction set, so I didn't read into that score too much.
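For what it's worth, here is the extrapolation behind that estimate, sketched in C from the two Vibrant data points quoted above. It assumes the benchmark time scales linearly with clock, which is a simplification, and the G2's own score (posted as a screenshot) is not reproduced here.
Code:
#include <stdio.h>

int main(void)
{
    /* (clock in MHz, C benchmark time in ms) from the post; lower is better */
    double c1 = 1000.0, t1 = 839.21;   /* stock 1.0 GHz        */
    double c2 = 1200.0, t2 = 702.89;   /* overclocked 1.2 GHz  */

    double slope = (t2 - t1) / (c2 - c1);        /* ms per MHz, about -0.68 */
    double t_1400 = t2 + slope * (1400.0 - c2);  /* naive estimate at 1.4 GHz */

    printf("~%.0f ms gained per extra 200 MHz\n", -slope * 200.0);  /* ~136 ms */
    printf("extrapolated C time at 1.4 GHz: ~%.0f ms\n", t_1400);   /* ~567 ms */
    return 0;
}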
It appears to be true.
It appears Hummingbird is not only slower than the new Generation Scorpions, it also appears the Hummingbird is unable to fully capture the CPU performance gain of the Dalvik JIT compiler in Froyo 2.2
http://www.youtube.com/watch?v=yAZYSVr2Bhc
Dunno, something is not right about this 2.2.
The thing that really bugs me is that 2.2 is supposed to let us use the full 512 MB of RAM... but it doesn't.
Erickomen27 said:
Dunno, something is not right about this 2.2.
The thing that really bugs me is that 2.2 is supposed to let us use the full 512 MB of RAM... but it doesn't.
It's not 2.2, it's Samsung.
SamsungVibrant said:
It's not 2.2, it's Samsung.
I agree, they should use ext 4 on their phones.
I don't see why they would stick to their old RFS.
SamsungVibrant said:
It appears Hummingbird is not only slower than the new Generation Scorpions, it also appears the Hummingbird is unable to fully capture the CPU performance gain of the Dalvik JIT compiler in Froyo 2.2
http://www.youtube.com/watch?v=yAZYSVr2Bhc
I'm sorry, but could you explain what your youtube link has to do with the topic? I'm curious, as I wasn't any wiser on the question at hand when I watched it.
NEON is an architecture extension for the ARM Cortex™-A series processors.*
Is Snapdragon an ARM Cortex™-A series processor? NO!
Remember the SSE instruction set on Intel, and the AMD vs Intel war?
Welcome back, LOL.
*The source for NEON: http://www.arm.com/products/processors/technologies/neon.php
Probably is, but does it really matter?
Sent from my SGS Vibrant.
Scorpion/Snapdragon have faster FPU performance due to a 128 bit SIMD FPU datapath compared to Cortex-A8's 64 bit implementation. Both FPUs process the same SIMD-style instructions, the Scorpion/snapdragon just happens to be able to do twice as much.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similar Scorpion/Snapdragon high scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average Linpack scores of 15-20.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now, and even by then we will have dual-core CPUs... it's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead thanks to the 90 Mtriangles/s PowerVR SGX540.
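For readers who have never seen NEON code, here is a minimal sketch of what those 128-bit SIMD instructions look like as C intrinsics. Both Scorpion and the Cortex-A8 run this same instruction stream; the point made above is that Scorpion retires it over a wider FPU datapath. It assumes an ARMv7 toolchain with NEON enabled (e.g. building with -mfpu=neon).
Code:
#include <arm_neon.h>
#include <stdio.h>

int main(void)
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    float32x4_t va = vld1q_f32(a);       /* load 4 floats into one 128-bit register */
    float32x4_t vb = vld1q_f32(b);
    float32x4_t vc = vaddq_f32(va, vb);  /* one instruction, four additions */
    vst1q_f32(out, vc);

    printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
    return 0;
}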
demo23019 said:
Scorpion/Snapdragon have faster FPU performance due to a 128 bit SIMD FPU datapath compared to Cortex-A8's 64 bit implementation. Both FPUs process the same SIMD-style instructions, the Scorpion/snapdragon just happens to be able to do twice as much.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similar Scorpion/Snapdragon high scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average Linpack scores of 15-20.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now, and even by then we will have dual-core CPUs... it's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead thanks to the 90 Mtriangles/s PowerVR SGX540.
Once again quoting ARM HQ website:
NEON technology is cleanly architected and works seamlessly with its own independent pipeline and register file.
NEON technology is a 128 bit SIMD (Single Instruction, Multiple Data) architecture extension for the ARM Cortex™-A series processors, designed to provide flexible and powerful acceleration for consumer multimedia applications, delivering a significantly enhanced user experience. It has 32 registers, 64-bits wide (dual view as 16 registers, 128-bits wide).
Scorpion is not ARM Cortex™-A series processor
Fuskand said:
I'm sorry, but could you explain what your youtube link has to do with the topic? I'm curious, as I wasn't any wiser on the question at hand when I watched it.
I provided the link, because the first part of the link talks about the JIT compiler which increases CPU performance. I put that there in-case someone has never heard of this before. Thus, when I mentioned the Hummingbird can not take full advantage of the JIT compiler, someone would know what I'm talking about.
demo23019 said:
Scorpion/Snapdragon have faster FPU performance due to a 128 bit SIMD FPU datapath compared to Cortex-A8's 64 bit implementation. Both FPUs process the same SIMD-style instructions, the Scorpion/snapdragon just happens to be able to do twice as much.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similar Scorpion/Snapdragon high scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average Linpack scores of 15-20.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now, and even by then we will have dual-core CPUs... it's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead thanks to the 90 Mtriangles/s PowerVR SGX540.
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
lqaddict said:
Once again quoting ARM HQ website:
Scorpion is not ARM Cortex™-A series processor
LOL i never said the scorpion is ARM Cortex™-A
try reading my post again
SamsungVibrant said:
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
LOL, if it is faster it might be by at most 1-2 seconds if it's lucky.
Sorry, it's going to take a lot more than that to impress me... again, it's a phone, not a high-end PC.
SamsungVibrant said:
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
That's largely due to the different filesystem implementation; once there is a workable hack to convert the entire filesystem on the Galaxy S to a real filesystem, you can make the comparison on things like launching apps.
Demo, I didn't mean to come off as a ****; I was just pointing out the flaw in the OP's benchmark - the NEON instruction set execution is flawed. The G2's processor is ARMv7, which is the base of the Cortex-A8; the Cortex-A8 adds instructions specifically targeted at applications, like multimedia, and that's where NEON comes into play.
lqaddict said:
That's largely due to the different filesystem implementation; once there is a workable hack to convert the entire filesystem on the Galaxy S to a real filesystem, you can make the comparison on things like launching apps.
Agreed. +10 char
lqaddict said:
Demo, I didn't mean to come off as a ****; I was just pointing out the flaw in the OP's benchmark - the NEON instruction set execution is flawed. The G2's processor is ARMv7, which is the base of the Cortex-A8; the Cortex-A8 adds instructions specifically targeted at applications, like multimedia, and that's where NEON comes into play.
No problem, I didn't really take it that way.
Also, I noticed I overlooked a lot of things in the OP... blame the ADD.
What difference does it make? In real world use the difference is negligible. And in three months our phones will be mid-tier anyway. At least we don't have hinges that will fall apart in two months.
how is the hummingbird not able to fully take advantage of JIT?
Well, there is a fix for our phones now. And from what I can tell there's no way the G2 can open apps faster than my Vibrant with the z4mod. It's smoking fast, by far the fastest I've ever seen this phone. No delays whatsoever. Can't wait till I get Froyo with OC/UV; this will be unreal. I feel like this phone is a high-end PC running Android or something. When I say instant, it's instant, lol.
Kubernetes said:
What difference does it make? In real world use the difference is negligible. And in three months our phones will be mid-tier anyway. At least we don't have hinges that will fall apart in two months.
Exactly.
People seem to forget that the SGS line is like 6 months old now, we should be glad they're still doing as well as they are.
Then there's the fact that there aren't many other phones that come with 16 GB internal. To me, having 16 GB and being able to upgrade to 48 GB total [minus the 2 GB that Samsung steals] is worth way more than starting with 4 GB [1.5 GB usable] and upgrading to a max of 36 GB [minus what HTC steals from internal].
But, if you don't like your phone, SELL IT while it's still worth something!

Why Nexus S is Google's new baby.

Hardware. Pure and simple. The Nexus One hardware was great at the time, but there are a few things in the Nexus One's hardware that needed to be upgraded, or that they wanted to support in their new dev phone:
1) Proper Multi-touch screen.
Nexus One's screen isn't multi touch, and it's hardly even dual touch. It's a single touch screen that offered some limited dual touch support that only really works for pinch to zoom. The rotate with two fingers gesture that's in the new version of maps isn't supported on the Nexus One.​
2) Front facing camera.
iPhone has one, and made it somewhat popular. Google needed it in their dev phone to keep up.​
3) PowerVR SGX540 GPU.
The PowerVR SGX540 chip is *the* most powerful mobile chip on the market. It's significantly better than the adreno 200 found in n1, and has roughly double the power of PowerVR SGX535 that's in the iPhone 4 and iPad. Galaxy S maxes out most of the commonly used benchmarks, and comes close to maxing nenamark1 too.​
4) Wolfson Sound Chip is brilliant
The Galaxy S phones have *the* best sound chip on the market, and Nexus S has the same chip
check out the perfect audio quality part in GSMArena's review of Galaxy S​
Oh, and there's also the NFC chip, Super AMOLED screen, three-axis Gyroscope, and larger battery.
Rawat said:
Hardware. Pure and simple. The Nexus One hardware was great at the time, but there are a few things in the Nexus One's hardware that needed to be upgraded, or that they wanted to support in their new dev phone:
1) Proper Multi-touch screen.
Nexus One's screen isn't multi touch, and it's hardly even dual touch. It's a single touch screen that offered some limited dual touch support that only really works for pinch to zoom. The rotate with two fingers gesture that's in the new version of maps isn't supported on the Nexus One.​
2) Front facing camera.
iPhone has one, and made it somewhat popular. Google needed it in their dev phone to keep up.​
3) PowerVR SGX540 GPU.
The PowerVR SGX540 chip is *the* most powerful mobile chip on the market. It's significantly better than the adreno 200 found in n1, and has roughly double the power of PowerVR SGX535 that's in the iPhone 4 and iPad. Galaxy S maxes out most of the commonly used benchmarks, and comes close to maxing nenamark1 too.
4) Wolfson Sound Chip is brilliant
The Galaxy S phones have *the* best sound chip on the market, and Nexus S has it too
check out the perfect audio quality part in GSMArena's review of Galaxy S​
Oh, and there's also the NFC chip, Super AMOLED screen, three-axis Gyroscope, and larger battery.
This is a great analysis Rawat
I'll be really interested to see how quick my Nexus-1 gets gingerbread. If it takes weeks after the 16th or after the new year then I would have to agree
ap3604 said:
This is a great analysis Rawat
I'll be really interested to see how quick my Nexus-1 gets gingerbread. If it takes weeks after the 16th or after the new year then I would have to agree
I think the trend will be that the newer versions of Android will be developed on the Nexus S, and as such it'll be the first to receive them, and the N1 will get the updates around a month or so later, as long as the device meets the minimum spec.
Google said they do their OS development on one device. I think it was Andy Rubin, when he was showing parts of the Moto tablet, and it was maybe in one of the Nexus S / Gingerbread phone videos.
The Nexus One actually has an Adreno 200; the 205s are much improved, as seen in the G2, Desire HD and myTouch 4G. Also, the new Snapdragons are believed to be on par with, if not better than, the Hummingbird CPUs.
Some comparison androidevolutions . com /2010/10/13/gpu-showdown-adreno-205-msm7230-in-htc-g2-vs-powervr-sgx540-hummingbird-in-samsung-galaxy-s/
Indeed you're correct. 1st gen chips had adreno 200, 2nd gen had 205s.
I don't think the GPU and CPU are the reason, more so the screen, along with Samsung's ability to produce said screens.
adox said:
The Nexus One actually has an Adreno 200; the 205s are much improved, as seen in the G2, Desire HD and myTouch 4G. Also, the new Snapdragons are believed to be on par with, if not better than, the Hummingbird CPUs.
Some comparison androidevolutions . com /2010/10/13/gpu-showdown-adreno-205-msm7230-in-htc-g2-vs-powervr-sgx540-hummingbird-in-samsung-galaxy-s/
The CPUs may be on par. However, the CPU isn't what needs improving on the Snapdragons.
This is correct. SGX540 does perform about 2x as fast as SGX530 (found in the Droid X, Droid 2, iPhone 3GS, and a variation of it in the iPhone 4). Unfortunately, Samsung's Galaxy S has been using the same GPU for many months now, so TI is playing catch-up with Samsung's SoC. To be fair, other manufacturers aren't exactly doing any better. Qualcomm's second-generation GPU, the Adreno 205, also performs significantly worse than the SGX540, and the (soon to be released) Tegra 2's GPU is also expected to be outperformed by the SGX540. With Samsung claiming Orion will improve GPU performance by another 3-4x over the SGX540, it must sound scary to other manufacturers!
SGX540 = Hummingbird's GPU.
GPU means a ton when it comes to what you're actually going to see in action on the screen.
In the link I posted that doesn't seem so; the GPU actually fared well against the Hummingbird in the Epic.
adox said:
I don't think the GPU and CPU are the reason, more so the screen, along with Samsung's ability to produce said screens.
Google said they added more features for better game programming. That's one of the major improvements in 2.3, so why would they pick the screen over the GPU? Galaxy S phones are considered some of the best devices for Android gaming, so it makes a lot of sense to have Samsung make the phone. The screen is icing on the cake. I bet Samsung is going to use SAMOLED screens a lot more on the big phones they manufacture.
so true cant wait!
adox said:
In the link I posted that doesn't seem so; the GPU actually fared well against the Hummingbird in the Epic.
On one benchmark. I wouldn't read into those results too much
http://www.anandtech.com/show/4059/nexus-s-and-android-23-review-gingerbread-for-the-holidays
AnandTech review
Rawat said:
Hardware. Pure and simple. The Nexus One hardware was great at the time, but there are a few things in the Nexus One's hardware that needed to be upgraded, or that they wanted to support in their new dev phone:
1) Proper Multi-touch screen.
Nexus One's screen isn't multi touch, and it's hardly even dual touch. It's a single touch screen that offered some limited dual touch support that only really works for pinch to zoom. The rotate with two fingers gesture that's in the new version of maps isn't supported on the Nexus One.​
2) Front facing camera.
iPhone has one, and made it somewhat popular. Google needed it in their dev phone to keep up.​
3) PowerVR SGX540 GPU.
The PowerVR SGX540 chip is *the* most powerful mobile chip on the market. It's significantly better than the adreno 200 found in n1, and has roughly double the power of PowerVR SGX535 that's in the iPhone 4 and iPad. Galaxy S maxes out most of the commonly used benchmarks, and comes close to maxing nenamark1 too.​
4) Wolfson Sound Chip is brilliant
The Galaxy S phones have *the* best sound chip on the market, and Nexus S has the same chip
check out the perfect audio quality part in GSMArena's review of Galaxy S​
Oh, and there's also the NFC chip, Super AMOLED screen, three-axis Gyroscope, and larger battery.
Goddammit!!! I can't wait till Thursday!!!
zachthemaster said:
Goddammit!!! I can't wait till Thursday!!!
Rofl I can't wait till there are tons of threads started such as "Goddammit I LOVE this phone!!!"
ap3604 said:
Rofl I can't wait till there are tons of threads started such as "Goddammit I LOVE this phone!!!"
Haha goddammit i can't wait to post in those threads.. I'm so excited... New phone, new network... PUMPED
Hmm... sounds awesome...
But hey, does anyone know if we can open the battery cover to replace the battery? I'm too used to carrying two batteries... I need it for long weekends with heavy usage of the phone. >.<
I didn't find anything about this :3
D4rkSoRRoW said:
Hmm... sounds awesome...
But hey, does anyone know if we can open the battery cover to replace the battery? I'm too used to carrying two batteries... I need it for long weekends with heavy usage of the phone. >.<
I didn't find anything about this :3
yah... duh haha
Sure you can
Here's a view of the phone with the cover off:

[US Variants] Adreno 225 woes

Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me; the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by obtaining hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proof experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Div033 said:
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed.
I thought the biggest problem with the Nexus One was the limited space for system files. Other Adreno 200 devices, such as the Evo 4G, have Android 4.0 running on them, and I hear it works really well.
I know that the early Exynos found on the Nexus S also works quite well.
I think any modern chipset easily surpasses the performance required for the type of GPU tasks being implemented at the system level. Games are still a concern, but compatibility is more of an issue there than performance, and the Adreno 225 is popular enough that it should be supported.
But there's always next year for really kick-ass GPUs.
Div033 said:
Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me; the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by obtaining hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proof experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Well, you also have to look at the resources available and the time constraints. The introduction of LTE in the US probably forced said chip maker to make some concessions. What they lost in state-of-the-art GPU, they gained in the ridiculous profit they made this year, because theirs is the only chip that includes LTE. From their perspective, they've won the war thus far.
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G, and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators, as well as high end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
real world performance > benchmarks
My Sensation has an Adreno 220 and it plays every game and movie just fine. Sure, it doesn't get the best benchmark numbers, but it more than holds its own when playing any game. I'm sure the Adreno 225 will hold up just fine over the next couple of years. In fact, I still love my Sensation. Side by side it's still just as fast as most phones out there. You only see a difference when running benchmarks, which isn't practical. I personally don't care if I'm getting 200 FPS or 50. It's not like anyone can tell the difference.
I also want to note that the 220 is crazy powerful compared to the 200 and 205. It was the first GPU that Qualcomm seemed to really take a stab at gaming with. I'm fine with the 220 and can't wait to begin using the 225.
bradleyw801 said:
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G, and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators, as well as high end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be based solely on the GPU (Adreno 200). CPU, RAM, resolution, etc. have a ton to do with it as well.
Will the GPU do better in this phone, given the extra RAM, compared to the One series with the same S4/225 combo?
Div033 said:
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
[I originally posted this in the international forum, but felt it belonged here.]
You should also note that the GS2 only had a 480 x 800 resolution (384,000 total pixels), and even at that much lower resolution its score was only slightly higher in that test, whereas the GS3 is pushing 720 x 1280 (921,600 total pixels). That means that the GS3 is working 2.4 times harder than the GS2, and it delivers almost the same gaming performance at worst, and better performance in other tests. That's not bad if you ask me, seeing as how we all thought the GS2 was a powerhouse just 12 months ago.
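The pixel arithmetic behind that "2.4 times harder" figure is easy to check; here is a tiny C sketch (scaling GPU load purely with pixel count is of course a simplification).
Code:
#include <stdio.h>

int main(void)
{
    long gs2 = 480L * 800L;    /* 384,000 pixels */
    long gs3 = 720L * 1280L;   /* 921,600 pixels */
    printf("GS2: %ld px, GS3: %ld px, ratio: %.1fx\n", gs2, gs3, (double)gs3 / gs2);
    return 0;
}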
incubus26jc said:
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be based solely on the GPU (Adreno 200). CPU and RAM have a ton to do with it as well.
Agreed, I haven't seen anyone suffering from GPU woes other than benchmark nuts who obsess over the highest score... everyone actually using it for real things says it works great... and honestly, my take is if you want gaming performance, don't use a phone; plug your 360 into your big screen and kick ass from the couch in HD and surround sound.
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, its become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip and was one of the most popular chipsets around at the time but it still did not receive ICS. I know they claimed space restrictions were the reason but I find this highly unlikely considering the other more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, its become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip and was one of the most popular chipsets around at the time but it still did not receive ICS. I know they claimed space restrictions were the reason but I find this highly unlikely considering the other more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Well, it might interest you to know that the Adreno 225 supports DirectX feature level 9_3 and texture compression, where the Mali 400 does not. It's a requirement for Windows 8. Now, you might say "so what"... but I for one plan on trying to dual-boot, or even run a version of Windows RT, perhaps in a virtual machine. That's something else the S4 Krait/Adreno package supports natively, I do believe, that the Exynos/Mali doesn't.
Sent from my DROIDX using xda premium
Div033 said:
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
Voltage Spike said:
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
+1. I do believe more ram (largest among this generation phones) matters for long term usage.
The GPU is fine.
Guys, this is a Galaxy S phone. The newest one at least.
It is GUARANTEED a Jelly Bean update from Samsung (albeit late). It is also most likely getting at least 1 or 2 more major Android updates because of XDA.
Remember, ALL OF US has the SAME Galaxy S3. That is a LOT of devs that will be working on it.
Don't worry about that. It will come with time.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as popularity of a chipset goes, its become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip and was one of the most popular chipsets around at the time but it still did not receive ICS. I know they claimed space restrictions were the reason but I find this highly unlikely considering the other more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Click to expand...
Click to collapse
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200, even for its time, was woefully weak. The 205 that followed was easily 3x or more the performance; most devices based on that GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by keeping the same architecture but building it on a smaller process with higher clocks and more memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. Yet when we look at the desktop GPUs of the AMD 6000 series, the upgrade was less than 20% over the previous year and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
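To put those generational multipliers in perspective, compounding them suggests the 225 sits somewhere around 15-20x an Adreno 200. This is only a rough sketch using the figures quoted above; the 3x for the 220 step is my own assumption, not a number from the post:
Code:
// Rough compounding of the per-generation multipliers mentioned above.
// The 3.0x for the 220 step is an assumption for illustration, not a measured figure.
public class AdrenoScaling {
    public static void main(String[] args) {
        double adreno205 = 3.0;        // "easily 3x or more" over the Adreno 200
        double adreno220 = 3.0;        // "similarly huge increase" (assumed ~3x)
        double adreno225Low = 1.6;     // stated 1.6x-2x over the 220
        double adreno225High = 2.0;

        double low = adreno205 * adreno220 * adreno225Low;    // ~14.4x
        double high = adreno205 * adreno220 * adreno225High;  // ~18.0x
        System.out.printf("Adreno 225 vs 200: roughly %.0fx to %.0fx%n", low, high);
    }
}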
nativestranger said:
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200, even for its time, was woefully weak. The 205 that followed was easily 3x or more the performance; most devices based on that GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by keeping the same architecture but building it on a smaller process with higher clocks and more memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. Yet when we look at the desktop GPUs of the AMD 6000 series, the upgrade was less than 20% over the previous year and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
Click to expand...
Click to collapse
Fair enough. I suppose you're right, the Adreno 200 was already severely underpowered at launch. The 225 may not be the best, but it's still up among the top-tier GPUs. I guess I have nothing to worry about. The 2GB of RAM is definitely nice too.
Sent from my Droid Incredible using the XDA app.
Just put this here for a reference:
[Attached screenshot: GLBenchmark results]
The Nexus One is running the Adreno 200. The HTC One V with the Adreno 205 is over 5x faster. The Rezound has an Adreno 220: over 3x faster than the One V while also pushing more than 2x the resolution. The GS3 with the Adreno 225 is hard up against the vsync wall in the hover jet test and about 3x faster than the Rezound in the Egypt test. It's amazing how much Adreno has improved in just two years: from 2 fps on the 200 to never dropping below 60 fps on the 225.
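Two things make those numbers easier to read: with vsync on, an onscreen test can never report more than the panel's refresh rate (so a "60 fps" result may understate the GPU), and phones at different resolutions are fairer to compare per pixel. A small sketch of both ideas; the panel resolutions are the usual specs for these phones (assumed, not taken from the screenshot) and the fps figures are made up for illustration:
Code:
// Illustrates (1) why an onscreen result saturates at the display refresh rate and
// (2) normalizing fps by pixel count when comparing phones with different screens.
public class BenchmarkNormalizer {
    static double onscreenReading(double rawFps, double refreshHz) {
        // With vsync enabled, the benchmark cannot report more frames than the panel refreshes.
        return Math.min(rawFps, refreshHz);
    }

    static double pixelsPerSecond(double fps, int width, int height) {
        return fps * width * height;
    }

    public static void main(String[] args) {
        System.out.println("A GPU capable of 95 fps reads as "
                + onscreenReading(95, 60) + " fps onscreen");

        // Assumed panels: One V 800x480, Rezound 1280x720. The 30 fps figures are hypothetical.
        double oneV = pixelsPerSecond(30, 800, 480);
        double rezound = pixelsPerSecond(30, 1280, 720);
        System.out.printf("Per-pixel throughput ratio (Rezound vs One V at equal fps): %.1fx%n",
                rezound / oneV);
    }
}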
Thank you, this helps me make my decision too ^. Also, does having a high-resolution screen make graphics look better? Like, NOVA 3 on my SGS2 looks awesome; all the effects are there, bullet smoke and whatnot. So will those effects, or the graphics in general, look better on the SGS3's screen?
Thanks!
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and International versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and its perceived lower UI performance compared to iOS. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
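As an aside, the vsync-aligned rendering that Project Butter introduced is exposed to apps through the Choreographer API added in Jelly Bean (API 16). A minimal sketch of watching frame timing with it; the class name and the 17 ms threshold are just illustrative, the Choreographer calls themselves are the real API:
Code:
import android.util.Log;
import android.view.Choreographer;

// Logs frames whose inter-frame gap exceeds one 60 Hz interval, i.e. visible jank.
// Choreographer arrived in API 16 (Jelly Bean) alongside the Project Butter work.
// Call start() from the main (UI) thread, since Choreographer is per-Looper.
public class JankWatcher implements Choreographer.FrameCallback {
    private long lastFrameNanos;

    public void start() {
        Choreographer.getInstance().postFrameCallback(this);
    }

    @Override
    public void doFrame(long frameTimeNanos) {
        if (lastFrameNanos != 0) {
            long deltaMs = (frameTimeNanos - lastFrameNanos) / 1_000_000;
            if (deltaMs > 17) { // longer than one 60 Hz frame -> at least one dropped frame
                Log.w("JankWatcher", "Slow frame: " + deltaMs + " ms");
            }
        }
        lastFrameNanos = frameTimeNanos;
        Choreographer.getInstance().postFrameCallback(this); // re-register for the next frame
    }
}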
Cruiserdude said:
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and International versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and its perceived lower UI performance compared to iOS. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
Click to expand...
Click to collapse
I see, lol, well that's good, I don't wanna have to buy a new phone every half a year! But will the HD resolution make any of the Gameloft games look any better than they do on my Galaxy S2 with the Mali-400 GPU? Thanks!
Sent from my SPH-D710 using XDA Premium App

[Discussion] M9+ benchmarks, real life performance experiences

All HTC M9+ owners!
We're only a handful on XDA yet, but we're getting more numerous as the device rolls out to more locations, and as people looking for an elegant, high-end device with premium build quality and extras like the fingerprint scanner, the 2K display and the Duo Camera settle on this fantastic phone. It's not perfect at everything, unfortunately, and not the best gaming phone out there, but in my experience it's a very well performing device in terms of call quality, reception, WiFi strength, multimedia, battery, display panel and overall UX feel and smoothness. High-end games from 2015 might stutter a bit here and there or drop some important shaders to compensate, but it's good for last year's (2014 and earlier) 3D gaming and all kinds of 2D games. So let's gather experiences and benchmarks for this unique device, which was maybe the first MTK device in an aluminium body.
Let's discuss performance-related experience, real user feel and benchmarks, free of whining, facing the truth that in some respects it's not a top-notch device, but letting the curious ones who are considering buying it know what to expect from this elegant business-class phone.
I'll start with some of my benchmarks result screenshots in separate posts.
UPDATE:
Here's my short game testing video on M9+
https://www.youtube.com/watch?v=QmLGCoI4NLw
Antutu on stock Europe base 1.61
[Attached screenshot: AnTuTu results]
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real-life experience is that in everyday chores the phone is very snappy, with some small lags when loading new apps into memory. Task switching is quite good.
There's unfortunately a memory leak issue with Android 5.0.x which kicks in after a while (as on most 5.0.x devices), but overall UX smoothness is quite good. Occasionally some apps bring up popup windows a bit stuttery; the Facebook comments animation, for example, tends to stutter.
The Sense Home/BlinkFeed experience is just perfect. In normal operation, when no big application updates are happening in the background, I never faced any lag on the Sense Home UI.
As for games, titles from 2014 and before run perfectly. Newer ones might have some shaders removed or run with reduced polygon counts, so don't expect a jaw-dropping 3D experience. If you are OK with last year's (2014 and before) 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to suit the PowerVR GPU in the M9+'s MTK chipset. The phone keeps a good temperature mostly, except when charging the battery while playing 3D-heavy games (but that's expected from most devices, especially with an alu body).
The screen quality is quite good; I got a perfect panel with the first unit, the refresh rate is 60 Hz, and the 2K resolution gives lovely DPI and brightness.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU: all tests show 3D performance far below 2014's flagships. The PowerVR G6200 is from the same family as the GPU in the iPhone 5s, which, let's face it, is a mediocre GPU for a 2K display. If you accept that gaming won't run with the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
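To put "mediocre GPU for a 2K display" in numbers: pixel count alone tells you how much harder the GPU works per frame compared to a 1080p flagship. A back-of-the-envelope sketch, assuming the usual 2560x1440 and 1920x1080 panel resolutions:
Code:
// Back-of-the-envelope fill-rate comparison between a 2K and a 1080p panel.
public class FillRate {
    public static void main(String[] args) {
        long qhd = 2560L * 1440;      // assumed M9+ panel
        long fhd = 1920L * 1080;      // assumed 1080p flagship panel

        System.out.printf("Pixels per frame: %,d vs %,d (%.2fx)%n",
                qhd, fhd, (double) qhd / fhd);
        // At 60 fps the GPU must shade ~221 million pixels per second at 2K
        // versus ~124 million at 1080p, before any overdraw is counted.
        System.out.printf("Pixels per second @60fps: %,d vs %,d%n", qhd * 60, fhd * 60);
    }
}
So a GPU driving this panel has to push roughly 1.8x the pixels of the same GPU behind a 1080p screen, which is why the offscreen and reduced-resolution scores later in this thread look so much better.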
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, a few hits and misses. I'll be back after a few days of using it with more adequate feedback.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years away on the M9.
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years away on the M9.
Click to expand...
Click to collapse
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
tbalden said:
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
Click to expand...
Click to collapse
I see, I didn't notice that. I don't understand MediaTek though: why did they put such an outdated GPU on their 'flagship' processor?
DeadPotato said:
I see, I didn't notice that. I don't understand MediaTek though: why did they put such an outdated GPU on their 'flagship' processor?
Click to expand...
Click to collapse
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and the GPU. Maybe they simply don't have a better compatible GPU, or there are technical difficulties with incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and the GPU. Maybe they simply don't have a better compatible GPU, or there are technical difficulties with incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
Click to expand...
Click to collapse
Ya, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
DeadPotato said:
Ya, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
Click to expand...
Click to collapse
Seconded.
The X10 is MTK's first high-end SoC. They should have equipped it with a high-end GPU to make a name for it, not one from two years ago, the same era as the iPhone 5s. Most casual gaming is OK, but that's not what 'flagship' devices are measured by.
It would be excellent if there were some possibility to tweak it at the kernel level. But the MediaTek Helio X20 needs to do better, much better, on the GPU side.
I'm not a heavy gamer, so the UX is pretty good for me. (The camera is another topic.)
Even when I measured CPU speed with a chess game, the result reached the top tier.
tbalden said:
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
Click to expand...
Click to collapse
FYI.
Mine: stock, Taiwanese base 1.08, rooted device.
Sigh... the result was maxed out for my device on Ice Storm "Extreme".
Very similar result to mine, except Ice Storm Unlimited. I think it might be related to thermal throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I'm really looking forward to the HTC M9+; I think it is what the M9 should have been, but I don't understand HTC's sh*tty 2015 choices on everything.
What I would like to know about is battery life; can you guys tell me what it's like?
I just bought a Galaxy S6 Edge a few days ago and it's awful. I have no battery left by the end of the afternoon. I don't play games, just a few photos and browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only hope left is the Lenovo Vibe X3, but there has been no news since March...
Thanks guys !
After installing more and more software that syncs, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+ hours, with average screen-on time around 4 hours. All in all, not a miracle, but that's about what I used to get with my previous phone, the OPO, with its larger battery and lower screen resolution. A bit less, though, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic; my friend's Z3C gets 3+ days and 5 hours of screen-on time. And I think that's done with a great stock kernel tuned by Sony engineers and the Sony stock ROM... wish other phones could do that.
Not sure why, but I cannot choose "No lock screen" or "lock screen without security" under the security settings. Both options are greyed out, and it says they are disabled by an administrator, encryption policies or other apps.
I only have Prey and Lookout Security installed, which could affect this, but this was totally fine on my M7.
Does anyone have any idea? Thank you.
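Those options usually grey out when some app holds a device-administrator policy; anti-theft tools like the ones mentioned typically register one so they can force a lock remotely. A hedged sketch of listing the active admins to find the culprit (call it with any valid Context; the class name and log tag are made up, the DevicePolicyManager call is the real API):
Code:
import android.app.admin.DevicePolicyManager;
import android.content.ComponentName;
import android.content.Context;
import android.util.Log;
import java.util.List;

public final class AdminAudit {
    // Lists every app currently holding device-admin rights; one of these is
    // usually what forces the "secure lock screen only" policy.
    public static void logActiveAdmins(Context context) {
        DevicePolicyManager dpm =
                (DevicePolicyManager) context.getSystemService(Context.DEVICE_POLICY_SERVICE);
        List<ComponentName> admins = dpm.getActiveAdmins();
        if (admins == null || admins.isEmpty()) {
            Log.i("AdminAudit", "No active device administrators");
            return;
        }
        for (ComponentName admin : admins) {
            Log.i("AdminAudit", "Active admin: " + admin.flattenToShortString());
        }
    }
}
The same list is visible under Settings > Security > Device administrators; deactivating the anti-theft app there usually re-enables the plain lock-screen options.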
They really should've gone with the 801, YES, THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change; really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES, THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change; really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
Click to expand...
Click to collapse
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag the result down, right?
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag the result down, right?
Click to expand...
Click to collapse
Yes, and it's fairly accurate.
I got 11871 at 2K; I couldn't take a screenshot as it doesn't know how to (it doesn't work above 1080p without dedicated software).
M8, 1080p.
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
Click to expand...
Click to collapse
Hmmm...
The efficiency of software decoding does benefit clearly from multiple cores in real life.
There are plenty of comparisons between the MT6795 (not the "T") and the SD801 at the link below, covering browser speed, RAR compression, real gaming and power consumption:
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag the result down, right?
Click to expand...
Click to collapse
Are those results after reducing the resolution to 1080p, did I read that correctly? Because if that's the case, it sadly confirms the limits of the 3D graphics, as others stated before. If you look at the M8 above (which I find surprisingly higher than what I experienced on my own M8 anyway), you'll notice even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multicore-wise the Helio X10 is a beast, but graphics-wise, it looks like it's not so much.
