HD+ CM10.1 eMMC Benchmark Comparisons - Nook HD, HD+ General

Only a few tablets on the list, but a nice comparison:

mrpunk2u said:
Only a few tablets on the list, but a nice comparison:
Nice to see it schools the Kindle. Really glad I got this puppy for $150!! :good:

Thank you for the benchmark!
It clearly shows that the Nook HD+'s specs are spectacular. You probably think I'm crazy right now, but take a look: the HD+ is just below the Galaxy S2, and behind the Nexus 7 by less than half. The Galaxy S2 pushes only 480x800 pixels to its screen, whereas the Nexus 7 pushes 1280x800 (slightly more than 720p HD), and let's not forget, the Nexus 7 is a quad-core after all.
The Nook HD+ has been able to hold its place on a dual-core while driving a little more than full-HD resolution, 1920x1280. The screen is also 9", compared to the Nexus at 7" and the S2 at 4.3". If a quad-core (say a Tegra 3, the same as the Nexus 7) were placed into the Nook, I'd comfortably say it would beat the Nexus 7's benchmarks.
The Nook's processor, RAM and GPU are working harder than the Nexus 7's, or even the S2's. The Nexus only sports a 720p-class display, compared to the 1080p-class display of the Nook. I'd say the Nook HD+ with CyanogenMod 10 is a perfect replacement for high-end Android tablets.
EDIT: I wonder how the Nook HD (7") would compare on this benchmark. If my knowledge is correct, the HD has the same processor, RAM and GPU as the Nook HD+, so it should be a fair test to compare against the Nexus. At the moment I don't have access to my Nook HD, plus it's installed on an SD card via a hybrid setup. Once I get a chance at it, I'll run the benchmark.
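To put numbers on the pixel-pushing argument above, here is a quick Python sketch comparing the raw pixel counts of the three devices (resolutions as given in the post; this models only pixel count, not actual GPU load):

Code:
# Raw pixel counts for the three devices compared in this thread.
displays = {
    "Galaxy S2": (800, 480),
    "Nexus 7": (1280, 800),
    "Nook HD+": (1920, 1280),
}

s2_pixels = 800 * 480
for name, (w, h) in displays.items():
    px = w * h
    print(f"{name}: {w}x{h} = {px:,} pixels ({px / s2_pixels:.1f}x the S2)")
# -> Galaxy S2: 384,000 (1.0x); Nexus 7: 1,024,000 (2.7x); Nook HD+: 2,457,600 (6.4x)

At a comparable score, the HD+'s GPU is filling roughly six times the pixels of the S2 and about 2.4 times those of the Nexus 7, which is the point being made.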

The HD+ scores half of what my S4 does, but it sure as heck doesn't feel half as slow in actual use.

Very good benchmarks indeed; they make me pleased I went with the HD+ over the Kindle 8.9.
It seems the biggest area it loses out on is 3D graphics, but I've never had a problem when it comes to 3D games etc.
Shows the difference between a benchmark and real-world performance, I guess.

Anyone have the stock rom benchmarks? If not, I'll download it and do it tonight. I just got mine yesterday and I haven't put CM on it yet. It would be nice to see an array of benchmarks on both stock and CM 10.1 for good comparison.

Poke_N_PDA said:
Anyone have the stock rom benchmarks? If not, I'll download it and do it tonight. I just got mine yesterday and I haven't put CM on it yet. It would be nice to see an array of benchmarks on both stock and CM 10.1 for good comparison.
I ran it (couple of days ago) on my unrooted stock HD+ and the score came out in excess of 9000!
I'm at work currently, will look at it more closely when I get home tonight.

Poke_N_PDA said:
Anyone have the stock rom benchmarks? If not, I'll download it and do it tonight. I just got mine yesterday and I haven't put CM on it yet. It would be nice to see an array of benchmarks on both stock and CM 10.1 for good comparison.
I am still on stock 2.1.0, about to make the jump to CM. I downloaded the free AnTuTu benchmark from the market to run it. (I wasn't sure what was used above, because my results didn't have the graphic layout pictured above.) The AnTuTu score I got on stock was 9737. Is it possible that stock scores higher than a custom ROM?

jack man said:
I am still on stock 2.1.0, about to make the jump to CM. I downloaded the free AnTuTu benchmark from the market to run it. (I wasn't sure what was used above, because my results didn't have the graphic layout pictured above.) The AnTuTu score I got on stock was 9737. Is it possible that stock scores higher than a custom ROM?
Actually, AOSP ROMs often perform worse than stock these days, due to stock's driver optimizations and to compatibility issues. Most drivers aren't released as open source to the community, so the developers of these ROMs just have to do the best they can. For instance, the Exynos chips in many Samsung phones don't work very well on AOSP due to Samsung's lack of openness.
I'm not sure what the status of CM compatibility for the Nook is. It uses a TI SoC (System on Chip, essentially the processor), an OMAP, and I'm not sure how open TI is. Typically the best AOSP ROMs are for products that use Qualcomm SoCs (the Snapdragon S4, 600, and 800).
Anyone know the status of the drivers for the Nook on CM? I've been disappointed with CM on my Note 2 and wouldn't mind knowing if the Nook HD+ has the same issues.

I got better AnTuTu results
Not sure what the difference is, but I got 9871 on my Nook HD+ with CM10.1 and a SanDisk Class 10 MicroSD card. Without the MicroSD card I got 95xx at the highest and 87xx at the lowest. (The AnTuTu score varies for some reason.)
Attached is the screenshot. Note this is without modification, on the default settings: 396 MHz-1500 MHz, interactive CPU governor, cfq I/O scheduler.
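Those defaults can be verified from a root shell on the device. A minimal sketch, assuming the standard Linux cpufreq and block-queue sysfs paths, and that the eMMC appears as mmcblk0 (typical, but an assumption here):

Code:
# Print the CPU governor, frequency limits and I/O scheduler from sysfs.
# Run on-device (e.g. via a terminal app or adb shell); the paths are the
# standard Linux ones and may differ per kernel.
def read(path):
    with open(path) as f:
        return f.read().strip()

cpufreq = "/sys/devices/system/cpu/cpu0/cpufreq"
print("governor:", read(cpufreq + "/scaling_governor"))         # expect: interactive
print("min freq:", read(cpufreq + "/scaling_min_freq"), "kHz")  # expect: 396000
print("max freq:", read(cpufreq + "/scaling_max_freq"), "kHz")  # expect: 1500000
# The active I/O scheduler is the bracketed entry, e.g. "noop [cfq]".
print("io sched:", read("/sys/block/mmcblk0/queue/scheduler"))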

Poke_N_PDA said:
Actually, AOSP ROMs often perform worse than stock these days, due to stock's driver optimizations and to compatibility issues. Most drivers aren't released as open source to the community, so the developers of these ROMs just have to do the best they can. For instance, the Exynos chips in many Samsung phones don't work very well on AOSP due to Samsung's lack of openness.
I'm not sure what the status of CM compatibility for the Nook is. It uses a TI SoC (System on Chip, essentially the processor), an OMAP, and I'm not sure how open TI is. Typically the best AOSP ROMs are for products that use Qualcomm SoCs (the Snapdragon S4, 600, and 800).
Anyone know the status of the drivers for the Nook on CM? I've been disappointed with CM on my Note 2 and wouldn't mind knowing if the Nook HD+ has the same issues.
TI has OmapZoom, which I've heard is very open and a great resource for devs (that was said by top CM maintainers for other devices).

View92612 said:
Not sure what the difference is, but I got 9871 on my Nook HD+ with CM10.1 and a SanDisk Class 10 MicroSD card. Without the MicroSD card I got 95xx at the highest and 87xx at the lowest. (The AnTuTu score varies for some reason.)
Attached is the screenshot. Note this is without modification, on the default settings: 396 MHz-1500 MHz, interactive CPU governor, cfq I/O scheduler.
I get 9891 with no SD card inserted.
I guess the test results just fluctuate too much all by themselves.

madsquabbles said:
The HD+ scores half of what my S4 does, but it sure as heck doesn't feel half as slow in actual use.
A quad core makes a much bigger impact in benchmarks than in actual use.
My Galaxy Tab 2 felt pretty much just as fast as my Nexus 7 in actual use, and that one is weaker than the Nook HD (with a ridiculously low-res screen, too).
drakester09 said:
TI has OmapZoom, which I've heard is very open and a great resource for devs (that was said by top CM maintainers for other devices).
OMAP is one of the most open SoCs out there.

My first reaction to this device after using both stock and CM10.1?
It seems the screen resolution is a bit much for this GPU to push, unless there's some tweaking to be done to smooth the UI out.

I saw some videos on YouTube, and it seems laggy.
Is that real? The videos aren't recent.
Can anyone upload a current video for reference? I'm thinking of buying it.
Thanks

Just chiming in: AnTuTu benchmark = 9731 with the CM10.1 eMMC 6/24 build.

My Nook tested 11066 on the stock CM10.1 6/30 build. FWIW, my S3 on euroskank CM10.1 only tested 10504. The Nook HD is peppy!

aigochamaloh said:
My Nook tested 11066 on the stock CM10.1 6/30 build. FWIW, my S3 on euroskank CM10.1 only tested 10504. The Nook HD is peppy!
The Nook HD is just a Nook HD+ with a smaller screen (but the same CPU and everything else).
And you can see how the smaller screen (= fewer pixels to push around) favorably affects framerates in the screen benchmarks; it's those values that ultimately produce the higher overall number (the rest of the numbers are the same between HD and HD+: memory bandwidth, CPU speed and SD card I/O speed).
In particular, in the 3D benchmark with the ninja fighting scene, you can see that on the Nook HD framerates remain in the 10-17 fps range, while on the Nook HD+ they are in the 5-11 fps range.
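The fill-rate argument can be sanity-checked numerically. A rough sketch, assuming the Nook HD panel is 1440x900 and that the test is purely fill-rate bound (real scenes only approximate this):

Code:
# If the same GPU is purely fill-rate bound, fps scales inversely with
# pixel count. Predict HD+ framerates from the observed HD ones.
hd_pixels  = 1440 * 900    # Nook HD (assumed panel resolution)
hdp_pixels = 1920 * 1280   # Nook HD+

ratio = hdp_pixels / hd_pixels
print(f"HD+ pushes {ratio:.2f}x the pixels of the HD")   # ~1.90x

for fps in (10, 17):  # observed Nook HD range in the ninja scene
    print(f"HD at {fps} fps -> predicted HD+ ~{fps / ratio:.1f} fps")
# Predicted ~5.3-9.0 fps, close to the observed 5-11 fps range.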

verygreen said:
The Nook HD is just a Nook HD+ with a smaller screen (but the same CPU and everything else).
And you can see how the smaller screen (= fewer pixels to push around) favorably affects framerates in the screen benchmarks; it's those values that ultimately produce the higher overall number (the rest of the numbers are the same between HD and HD+: memory bandwidth, CPU speed and SD card I/O speed).
In particular, in the 3D benchmark with the ninja fighting scene, you can see that on the Nook HD framerates remain in the 10-17 fps range, while on the Nook HD+ they are in the 5-11 fps range.
Yep, that makes perfect sense. Loving this tablet, and thank you for your hard work!
Individual benchmark numbers are higher in every category compared to my S3 as well, CPU and all that jazz.
Double-checked, and my S3 actually only tested 10100.

The OMAP 4470 isn't a terrible performer.
http://www.anandtech.com/show/6158/the-archos-101-xs-review/3

Related

Poor Graphics Performance on Galaxy Tab 10.1

Really frustrating to see that THD versions of games don't hit the frame rates they do on the Xoom.
Check out the video I put up to see what I mean.
http://www.youtube.com/watch?v=pfNJ8kIblR8
What do you guys think?
I'm still not entirely convinced that this thing is a Tegra 2. Has anyone confirmed this?
Anyway, I have noticed significant lag quite often. I have a feeling it might just be a software issue though, because some things are very smooth, e.g. the carousel in the Music app.
I don't have the HD version of Fruit Ninja and I don't really want to buy it since I already paid for the non-HD version. Can you try running logcat with it? Look for any SurfaceFlinger lines. I wish these things had grep...
Edit: if you don't have the adb drivers, you should be able to get them here
Edit2: This must be a Tegra device. There are tons of files like tegra_gpio, tegra_rpc, etc. on there.
Yeah, it has to be. But I haven't seen it referred to as Tegra anywhere other than by the official NVIDIA Tegra Twitter. But they never mentioned Tegra 2, which they always mention. Hmmm...
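Since the device shell lacks grep, the filtering can be done host-side instead. A minimal sketch that pipes adb logcat through Python and keeps only the SurfaceFlinger lines (assumes adb is installed on the PC and the tablet is authorized):

Code:
# Stream `adb logcat` on the host and print only SurfaceFlinger lines,
# working around the missing grep on the device. Stop with Ctrl+C.
import subprocess

proc = subprocess.Popen(["adb", "logcat"], stdout=subprocess.PIPE, text=True)
try:
    for line in proc.stdout:
        if "SurfaceFlinger" in line:
            print(line, end="")
except KeyboardInterrupt:
    proc.terminate()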
oxeneers said:
Yeah, it has to be. But I haven't seen it referred to as Tegra anywhere other than by the official NVIDIA Tegra Twitter. But they never mentioned Tegra 2, which they always mention. Hmmm...
Run Quadrant and see if it says Ventana under CPU Hardware?
Sent from my HTC Eva 4G using Tapatalk
Jocelyn said:
Run Quadrant and see if it says Ventana under CPU Hardware?
Sent from my HTC Eva 4G using Tapatalk
Yes, please do. Or run cat /proc/cpuinfo, or Android System Info -> System -> Expand CPU. If it's like the other dev/carrier/test unit posted in this area, it will say "p3", which is anybody's guess as to what that is (it might be Samsung-packaged Tegra stuff they bought from NVIDIA). Or... crazy unfounded theory here:
Tegra 3...
I owned a Xoom and it also bogged down.
Jocelyn said:
Run Quadrant and see if it says Ventana under CPU Hardware?
Sent from my HTC Eva 4G using Tapatalk
Here you go:
http://www.androidpolice.com/2011/0...rizon-samsung-galaxy-tab-10-1-android-tablet/
I am really sad to see this,
but did you see others having the same problem? Maybe your tab is faulty.
I hope the final version will be better.
Edit: BTW, I see the Xoom is lagging in games after the 3.1 update :S
Can this actually play videos? Or do we have the same playback issues the Xoom is plagued with?
Sent from my Nexus S using XDA App
kponti said:
Can this actually play videos? Or do we have the same playback issues the Xoom is plagued with?
Sent from my Nexus S using XDA App
Play videos from what, the browser via Flash, or the YouTube app? What do you mean?
xavierdylan said:
Here you go:
http://www.androidpolice.com/2011/0...rizon-samsung-galaxy-tab-10-1-android-tablet/
His remarks on the screen are insulting. Side by side with the iPad 2, the Galaxy Tab is clearly brighter and more vibrant, and that's saying a lot because the iPad screen is damn good. I had the Xoom; that screen is awful. Well, not awful, it's just that I was spoiled by the iPad screen.
As for Tegra: yes it is, and how do I know? It sucks at 720p High Profile videos... a player app helps out a little bit.
Sent from my GT-P7510 using XDA Premium App
kponti said:
Can this actually play videos? Or do we have the same playback issues the Xoom is plagued with?
Sent from my Nexus S using XDA App
I suspect this will still have all the same video playback issues and dramas that the Xoom, Transformer, and other Tegra 2 devices have.
bobdude5 said:
His remarks on the screen are insulting. Side by side with the iPad 2, the Galaxy Tab is clearly brighter and more vibrant, and that's saying a lot because the iPad screen is damn good. I had the Xoom; that screen is awful. Well, not awful, it's just that I was spoiled by the iPad screen.
As for Tegra: yes it is, and how do I know? It sucks at 720p High Profile videos... a player app helps out a little bit.
Sent from my GT-P7510 using XDA Premium App
Yeah, I like this line from the article...
"It won’t win any kind of awards for its brightness or crispness – unfortunately, it’s just LCD, rather than AMOLED"
Just LCD? It's certainly not as bright and crisp as all those other AMOLED tablets... oh wait, there are no AMOLED tablets.
I'm also disappointed it only has Wi-Fi instead of a flux capacitor. So old school.
Samsung is known for the AMOLED screens in their Android phones. I'm on one now, my little Vibrant, and it is staggeringly beautiful compared to most regular LCDs. I'm personally still holding out and hoping a tablet with a similarly wonderful screen will eventually appear, as it's hard to go back after using this.
Ravynmagi said:
Yeah, I like this line from the article...
"It won’t win any kind of awards for its brightness or crispness – unfortunately, it’s just LCD, rather than AMOLED"
Just LCD? It's certainly not as bright and crisp as all those other AMOLED tablets... oh wait, there are no AMOLED tablets.
The guy's comments in the review about the screen are rubbish. It's much better than the Xoom's; the colours are much more vibrant and the images a lot more crisp! He is either comparing it to an AMOLED phone screen or he is an Apple fanboy.
I own a Galaxy Tab 10.1v; in Quadrant it says for the CPU:
Name: ARMv7 Processor rev0 (v7l)
Current freq.: 216MHz
Max freq: 1000 MHz
Min freq: 216MHz
Cores: 2
Architecture: 7
BogoMIPS: 999.42
Hardware: P3
Revision: 0
Serial #: 000000000000000000
For GPU it says:
Vendor: NVIDIA Corporation
Renderer: NVIDIA AP
Version: OpenGL ES-CM 1.1
Max texture units: 2
Max texture size: 2048
Max lights: 8
Not sure what that means... Does that mean it's a Tegra2 or not?
EDIT: I just watched your video; your device seems to have tons more lag than mine does. I have no lag at all playing Fruit Ninja HD... I have to say though, I rooted my device and found it really laggy at first, so I went into SetCPU to check if my device was underclocked or locked. I found that the scaling governor was set to Interactive; I changed this to OnDemand, which was superior in my experience... After doing that I saw a noticeable difference in the menus and when playing Angry Birds...
When I used to play Angry Birds and pulled a bird back before launch, it would have the same lag you have in your videos... Now it's silky smooth.
I have a Google I/O one...
cat /proc/cpuinfo says:
Processor : ARMv7 Processor rev 0 (v7l)
processor : 0
BogoMIPS : 1998.84
processor : 1
BogoMIPS : 1998.84
Features : swp half thumb fastmult vfp edsp vfpv3 vfpv3d16
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x1
CPU part : 0xc09
CPU revision : 0
Hardware : p3
Revision : 000b
Serial : 32479030325.....
oxeneers said:
Really frustrating to see that THD versions of games don't hit the frame rates they do on the Xoom.
Check out the video I put up to see what I mean.
http://www.youtube.com/watch?v=pfNJ8kIblR8
What do you guys think?
This video is rubbish. First off, what Android version is he running, and what is running in the background? Android 3.1 made my Xoom seem 25% faster. On my Xoom, if I have a lot of items running in the background and I don't use a task killer or close apps manually (NES and SNES emulators, etc.), some games will occasionally bog down. Notice the video is zoomed in and doesn't show the task bar.
This guy could be an "inferior-Pad 2", I mean iPad 2, fan, so beware and ignore this article.
DensoYamaha41 said:
This video is rubbish. First off, what Android version is he running, and what is running in the background? Android 3.1 made my Xoom seem 25% faster. On my Xoom, if I have a lot of items running in the background and I don't use a task killer or close apps manually (NES and SNES emulators, etc.), some games will occasionally bog down. Notice the video is zoomed in and doesn't show the task bar.
This guy could be an "inferior-Pad 2", I mean iPad 2, fan, so beware and ignore this article.
Haha, dude... you are way off point. I am running 3.0 on the 10.1 tab, OBVIOUSLY, since that's the only version out for it right now. SINCE IT'S NOT EVEN OUT YET.
And I absolutely love the tablet. I am just wondering if there's something wrong. Nothing is running in the background. I am just wondering why the hell there's any lag. I also own a Xoom. So, relax.
There's a reason I've been going to I/O every year for the last 3 years while you were sitting at home crying.
oxeneers said:
Haha, dude... you are way off point. I am running 3.0 on the 10.1 tab, OBVIOUSLY, since that's the only version out for it right now. SINCE IT'S NOT EVEN OUT YET.
And I absolutely love the tablet. I am just wondering if there's something wrong. Nothing is running in the background. I am just wondering why the hell there's any lag. I also own a Xoom. So, relax.
There's a reason I've been going to I/O every year for the last 3 years while you were sitting at home crying.
Sorry, I didn't mean any disrespect. So the Samsung doesn't have 3.1 yet? My Xoom updated itself last Friday to 3.1, and it made a huge difference in performance. Give us an update when the Samsung is running 3.1.
Also, has anybody seen a video or pics of the Samsung 10.1 next to the iPad 2? My friend keeps rubbing in how much better the iPad 2 screen is than the Xoom's. I can't wait to get the Samsung.

SGS2 (Mali-400) beats iPad 2 (SGX543) in GLBenchmark

I checked the GLBenchmark website just now and found that the i9100 results are now public.
I captured the pictures for your convenience.
If the benchmark only runs at native resolution then this is useless, because the iPad has a much higher resolution and would therefore perform worse. Of course, if it runs at a fixed resolution, I couldn't be happier.
I know it's maybe a little unfair, since it runs at native resolution, but it's still impressive.
And I did a little calculation with other models' results: 25% more pixels leads to 12.5% fewer frames.
The iPad 2 has 100% more pixels than the SGS II.
(all approximate values)
1.25 * 1.25 * 1.25 ≈ 2
0.875 * 0.875 * 0.875 ≈ 0.67
So if the SGS II had the iPad 2's resolution, its benchmark score would be about one third lower.
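That rule-of-thumb calculation can be written out explicitly. A short sketch of the same arithmetic (the "+25% pixels -> -12.5% fps" rule is the poster's empirical estimate from other models' results, not a GPU law):

Code:
# iPad 2 (1024x768) vs. Galaxy S II (800x480) pixel counts.
sgs2  = 800 * 480
ipad2 = 1024 * 768
print(f"pixel ratio: {ipad2 / sgs2:.2f}")   # ~2.05, i.e. ~100% more pixels

# Three successive +25% pixel steps roughly double the pixel count...
print(f"1.25^3  = {1.25 ** 3:.2f}")         # ~1.95, close to 2
# ...so the fps penalty is three -12.5% steps:
print(f"0.875^3 = {0.875 ** 3:.2f}")        # ~0.67, about one third less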
Well, for me it seems that the SGX543 has better performance. But the good thing is that the difference won't be that much.
Sent from my GT-I9000 using Tapatalk
enzografix said:
Well, for me it seems that the SGX543 has better performance. But the good thing is that the difference won't be that much.
Sent from my GT-I9000 using Tapatalk
I don't think performance is really that important, TBH. It's all about the kind of games that will be developed for the platforms. For example, even though the original SGS with its SGX540 is FAR better than the iPhone 4's SGX535, the quality of the games available and their graphics are much, MUCH better on the iPhone 4. I have both, and the gulf in class between the two when it comes to the quality of the gaming library is absolutely enormous; only the Tegra 2 phones have done something to bridge this gap for Android phones.
One can only imagine the kind of awesome games that will be developed for that beastly SGX543 in the iPhone 5. I just hope that the majority of those games also find their way to the Android Market and the SGS2 (and not Tegra Zone), with its ever-growing popularity, because the Mali definitely has the capability to run those games well.
Personally, I feel that the next iteration of the Xperia Play will be the ultimate device for hardcore gamers, beating even the iOS devices of the time. The current one is the ultimate emulation phone thanks to its brilliant gamepad, and runs all the old-school PSX, N64, SNES etc. games with aplomb; if you're into that kind of thing, it's an absolute joy to behold.
No, most of the scores I've seen and run myself are around 40-44 fps for the Mali, NOT 50 fps. I'm not sure what they've done to get over 50 fps, but that's not what the phone normally scores.
Also, there's a resolution difference.
EDIT: Just rechecked; that figure is the Top value, not the Average. When looking at benchmark numbers you need to look at the average framerate.
It's actually 46.7 fps for the iPad 2 and 40.2 fps for the Galaxy S II.
omersak said:
I don't think performance is really that important, TBH. It's all about the kind of games that will be developed for the platforms. For example, even though the original SGS with its SGX540 is FAR better than the iPhone 4's SGX535, the quality of the games available and their graphics are much, MUCH better on the iPhone 4. I have both, and the gulf in class between the two when it comes to the quality of the gaming library is absolutely enormous; only the Tegra 2 phones have done something to bridge this gap for Android phones.
One can only imagine the kind of awesome games that will be developed for that beastly SGX543 in the iPhone 5. I just hope that the majority of those games also find their way to the Android Market and the SGS2 (and not Tegra Zone), with its ever-growing popularity, because the Mali definitely has the capability to run those games well.
Personally, I feel that the next iteration of the Xperia Play will be the ultimate device for hardcore gamers, beating even the iOS devices of the time. The current one is the ultimate emulation phone thanks to its brilliant gamepad, and runs all the old-school PSX, N64, SNES etc. games with aplomb; if you're into that kind of thing, it's an absolute joy to behold.
If the Xperia Play has good-quality N64 ROMs, my SGS 2 is going back to Mr Orange.
Honestly, benchmarks mean nothing if there are NO apps.
And take it from me: my iOS library is over 600 purchased apps and games, and I've got around 20 purchased games on Android.
That's a huge difference. I know of almost every game in existence and get them RIGHT away when they come out, so I know the very vast difference in choice between Android gaming and iOS.
And I always miss my iOS games when I'm out (Android phone); it's a good thing my iPad 2 gives me the fix I need. =D
For you iOS gamers out there: seriously try out a game called "Battleheart". It will devour your soul... freaking epic game...
MaxxiB said:
If the Xperia Play has good-quality N64 ROMs, my SGS 2 is going back to Mr Orange.
You'll be amazed: http://www.youtube.com/watch?v=qYodquYXArs
TBH, every Android phone can run N64 games well, I guess; what sets the Play apart is that there aren't any ugly and cumbersome touch controls taking up the screen.
But the fact remains that the SGS2, apart from the above, does everything else SO much better!
How many cores?
Can someone confirm whether the Mali-400 used in the GS2 is multi-core? I believe I read somewhere that Samsung decided to use only a single core for the GS2.
I think you'll find gamepads for Android on sale in the near future, since Gingerbread 2.3.4 supports the Open Accessory API.
So you can connect a pad when you play games, and get rid of it when you don't.
Based on some of those scores, it looks like it's hitting the 60 fps cap.
Even at higher res the Exynos scores great; check the Hardkernel benches =) Their tab uses the same/similar chip.
Sent from my SGH-T959 using Tapatalk
Bear in mind that GPUs in phones tend to be clocked lower than those in tablets to keep power consumption down, so it's difficult to do an apples-to-apples comparison (excuse the pun) in this case. We'll have to wait for iPhone 5 numbers.
rd_nest said:
Can someone confirm whether the Mali-400 used in the GS2 is multi-core? I believe I read somewhere that Samsung decided to use only a single core for the GS2.
What I have read is that it uses the 2-core model. The 4-core model is headed for the Sony NGP.
Papi4baby said:
What I have read is that it uses the 2-core model. The 4-core model is headed for the Sony NGP.
That's the PowerVR SGX543 you're thinking of. The iPad 2 uses the dual-core variant; the NGP will be using the quad-core variant.
The GS II has a quad-core Mali-400. Anything less, and it would never attain the results it's currently getting on GLBenchmark.
ph00ny said:
Based on some of those scores, it looks like it's hitting the 60 fps cap.
In the Pro test that's the case, but not in the Egypt test; the max there is around 50 fps, so it never even reaches 60 fps.

[TEST] Tegra 2 OC vs Tegra 3

Hello guys! Wandering around the net, I found some diagrams comparing benchmarks of the stock Tegra 2 vs. the Tegra 3 and other processors. I've always wondered whether the processor, now that it's overclocked to 1.7 GHz (70% more clock speed), has closed the distance to the upcoming quad-core (which, as NVIDIA has said, is supposed to be 5 times faster than the current SoC).
Can some developers make this comparison? It would be interesting to see the results.
"when looking at the compiler versions and settings used to compile the CoreMark benchmark, the Core 2 Duo numbers were produced via GCC 3.4 and only the standard set of optimizations (-O2), while the Tegra 3 numbers were run on a more recent GCC 4.4 with aggressive optimizations (-O3). Il Sistemista website took a Core 2 Duo T7200 and re-ran the benchmark compiled with GCC 4.4 and the same optimization settings. The results were no longer in favor of NVIDIA, as the Core 2 chip scored about 15,200 points, compared to the Tegra's 11,352."
CoreMark benchmark comparing the NVIDIA Tegra 3 @ 1 GHz to various real and hypothetical products.
The CoreMark/MHz index shows how many CoreMarks a particular chip can extract given its clock frequency.
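The CoreMark/MHz index from the caption is just the score divided by the clock. A small sketch using the two re-run scores quoted above (the T7200's 2.0 GHz clock is an assumption here; core counts are not normalized):

Code:
# CoreMark/MHz = score / clock in MHz, a rough per-clock efficiency index.
chips = {
    # name: (CoreMark score, clock in MHz)
    "Tegra 3 (GCC 4.4, -O3)": (11352, 1000),
    "Core 2 Duo T7200 (GCC 4.4, -O3)": (15200, 2000),  # 2.0 GHz assumed
}
for name, (score, mhz) in chips.items():
    print(f"{name}: {score / mhz:.2f} CoreMark/MHz")
# Tegra 3: ~11.4; T7200: ~7.6 -- but the Tegra needs four cores against
# two, so per core and per MHz the Core 2 is still ahead.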
Wow... more powerful than a Core 2 Duo O_O!!
Now THAT'S something to get excited about!!!
Ahmed_PDA2K said:
Wow... more powerful than a Core 2 Duo O_O!!
Now THAT'S something to get excited about!!!
If you read the bolded part: once they fixed the compiler optimizations, the Tegra did not hold up.
At this point it would be really interesting to know how much the gap has been reduced with the Tegra 2 super-overclocked to 1.7 GHz (is that really the maximum achievable?). I don't know how to operate CoreMark 1.0, so I'm asking someone in the community to check whether the values are underestimated or substantially improved compared to those offered by NVIDIA, and especially to see whether the Tegra 3 can really be worth it.
I recently read that NVIDIA is pushing a lot of its projects and has already started development of the Tegra 4. The specifications mention manufacturing the Tegra 4 chip at 28 nm. IMHO it would make more sense to wait for that before updating Tegra 2 hardware (so these benchmarks could be a way to understand the situation in this context).
I highly doubt many people can get their Tegra 2 up to 1.7. I can only get mine stable at 1.4, probably 1.5, and maybe 1.6 if I spent a lot of time messing with the voltages. I think I read somewhere that 1.7 GHz produces a lot of heat, almost in the danger zone, but I could be exaggerating a little.
Sent from my ADR6300 using Tapatalk
In any case, it would be nice to have a comparison at all frequencies against the values offered by NVIDIA, to give device owners an idea of the hardware's potential. But I think it's a matter of development and optimization, as we are seeing here on the forum these days... the tablet is slowly improving on all fronts.
I'm sure that once we have access to the OS code, the TF will run like a beast! I had an OG Droid, and the difference between stock and modded was mind-blowing.
Sent from my ADR6300 using Tapatalk
brando56894 said:
I highly doubt many people can get their Tegra 2 up to 1.7. I can only get mine stable at 1.4, probably 1.5, and maybe 1.6 if I spent a lot of time messing with the voltages. I think I read somewhere that 1.7 GHz produces a lot of heat, almost in the danger zone, but I could be exaggerating a little.
Sent from my ADR6300 using Tapatalk
I'm one of the few with one that does 1.7GHz no problem.
It's not only the CPU speed of the Tegra 3 that will pwn the Transformer 1: there's a 12-core GPU, support for dynamic lighting, High Profile HD decoding and much more. No way a Tegra 2 will outperform the Tegra 3.
Just face it: the Transformer 1 will be pwned big time by the Transformer 2, launching Oct/Nov 2011...
Not to mention the new TI CPUs coming Q1 2012... they will pwn even more than the Tegra 3...
Sent from my Desire HD using XDA App
Tempie007 said:
It's not only the CPU speed of the Tegra 3 that will pwn the Transformer 1: there's a 12-core GPU, support for dynamic lighting, High Profile HD decoding and much more. No way a Tegra 2 will outperform the Tegra 3.
Just face it: the Transformer 1 will be pwned big time by the Transformer 2, launching Oct/Nov 2011...
Not to mention the new TI CPUs coming Q1 2012... they will pwn even more than the Tegra 3...
Sent from my Desire HD using XDA App
Yeah, it is true that it's stronger, but my thesis is that the Tegra 3 is only a technology shift. Although it has 12 computing cores capable of dynamically processing lighting, this CPU produces these values, and it would be interesting to relate them to the values of an overclocked Tegra 2 and see whether the Tegra 3 really is a worthwhile replacement. We are faced with two different SoCs, the first a dual-core, the second a quad-core, but common sense tells me that, if well exploited, this device could give the next model a long, hard time (remember the story of the HTC HD2, released back in 2009 and still one of the longest-running phones thanks to constant software rework). I'd argue these devices don't need raw power so much as efficient computation, since batteries aren't keeping pace, and simply adding capacity won't make devices sleeker in the near future.
Does someone know how to use CoreMark to update these values?
Of course the Tegra 3 will beat the crap out of the Tegra 2; that's like comparing a Core 2 Duo to a Core i7, lol.
chatch15117 said:
I'm one of the few with one that does 1.7GHz no problem.
Lucky bastard! :-D How many mV are you running it at?
Never tried 1.7 GHz, but I've done 1.5-1.6 on 3 tabs without messing with voltages. Actually, right now I'm running 1.2 GHz and messing with LOWERING the voltages.
CoreMark download
If anyone can compile CoreMark to run under Android, here is the software to download:
Download the Coremark Software readme file
Download the CoreMark Software documentation
This documentation answers all questions about porting, running and score reporting
Download the Coremark Software
Download the CoreMark Software MD5 Checksum file to verify download
Use this to verify the downloads: >md5sum -c coremark_<version>.md5
Download the CoreMark Platform Ports
These CoreMark ports are intended to be used as examples for your porting efforts. They have not been tested by EEMBC, nor do we guarantee that they will work without modifications.
Here is the results table:
http://www.coremark.org/benchmark/index.php
If a dev can compile it and run some tests to compare the results at the new frequencies, that would be great!
brando56894 said:
I highly doubt many people can get their Tegra 2 up to 1.7. I can only get mine stable at 1.4, probably 1.5, and maybe 1.6 if I spent a lot of time messing with the voltages. I think I read somewhere that 1.7 GHz produces a lot of heat, almost in the danger zone, but I could be exaggerating a little.
Sent from my ADR6300 using Tapatalk
There is no danger zone... at least not on my kernel. I haven't posted the one for 3.1 (only 3.01), but there is a built-in downclock once the chip reaches temps beyond 47°C... This only happens if the tablet (in laptop form) is left in a bag with the screen on (by accident; sometimes the hinge says it's open when it's closed). So if you can run 1.7 GHz stable, there's no reason not to, as you'll be protected from heat by the built-in thermal throttling.
The default thermal throttling had some seriously bizarre ranges... 80-120°C.
I wish NVIDIA's boards were produced like this:
http://hardkernel.com/renewal_2011/main.php
http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=G129705564426
Something modular would be a nice change in the mobile tech segment. Swapping SoCs would require some serious low-level work though, unless writing to the bootloader and boot partitions were made easy via an external interface... something like that.
Blades said:
There is no danger zone... at least not on my kernel. I haven't posted the one for 3.1 (only 3.01), but there is a built-in downclock once the chip reaches temps beyond 47°C... This only happens if the tablet (in laptop form) is left in a bag with the screen on (by accident; sometimes the hinge says it's open when it's closed). So if you can run 1.7 GHz stable, there's no reason not to, as you'll be protected from heat by the built-in thermal throttling.
The default thermal throttling had some seriously bizarre ranges... 80-120°C.
I wish NVIDIA's boards were produced like this:
http://hardkernel.com/renewal_2011/main.php
http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=G129705564426
Something modular would be a nice change in the mobile tech segment. Swapping SoCs would require some serious low-level work though, unless writing to the bootloader and boot partitions were made easy via an external interface... something like that.
And so? Are you up for using CoreMark to run some benchmarks on your device and comparing them with the Tegra 3 results? Because that's what I'd like to test here.
devilpera64 said:
"when looking at the compiler versions and settings used to compile the CoreMark benchmark, the Core 2 Duo numbers were produced via GCC 3.4 and only the standard set of optimizations (-O2), while the Tegra 3 numbers were run on a more recent GCC 4.4 with aggressive optimizations (-O3).
That statement right there tells you why benchmarks are extremely unreliable and misleading.
I'm one of the few that can reach 1.7 GHz no problem, too. Never ran it long enough to get hot though, maybe 30 mins just to run benchmarks. Never had FCs or had my system crash.

GLBenchmark HTC ONE only 34 FPS @ Egypt HD 1080P Offscreen

The HTC One only scores 34 FPS in GLBenchmark's Egypt HD 1080p offscreen test, much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test, both using the Snapdragon 600. IIRC the HTC One uses LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use LPDDR3 (which the S600 supports)?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
How do you know the HTC One uses LPDDR2 memory?
kultus said:
How do you know the HTC One uses LPDDR2 memory?
http://www.htc.com/uk/smartphones/htc-one/#specs
http://www.anandtech.com/show/6754/hands-on-with-the-htc-one-formerly-m7/2
Turbotab said:
The HTC One only scores 34 FPS in GLBenchmark's Egypt HD 1080p offscreen test, much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test, both using the Snapdragon 600. IIRC the HTC One uses LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use LPDDR3 (which the S600 supports)?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
My first question would be: how did they even get a benchmark of the SHV-E300?
Xistance said:
My first question would be: how did they even get a benchmark of the SHV-E300?
How do any results appear on GLBenchmark?
I believe that with GLBenchmark, if you don't register/log in before running the test, it automatically uploads the result to their server for public viewing. So maybe it was done intentionally, or somebody forgot to log in?
fp581 said:
He is spamming all around the HTC One forums, just look at his posts. Please ban him from posting in any HTC forum ever again.
He probably works for Sony, Nokia or Samsung.
Who are you talking about?
Sorry, wrong person, I'll delete that last one.
But I would love pics of that benchmark for proof.
fp581 said:
Sorry, wrong person, I'll delete that last one.
But I would love pics of that benchmark for proof.
Dude, I was going to go atomic; I admit it, I have a terrible temper.
I believe the benchmark was run by a German Android site called Android Next. There is a video on YouTube; the GLBenchmark run starts at 2:22.
http://www.youtube.com/watch?v=Wl1dmNhhcXs&list=UUan0vBtcwISsThTNo2uZxSQ&index=1
Thanks Turbo for advancing my knowledge... what a shame they didn't choose LPDDR3, but I don't think it's a big issue these days.
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 fps difference when you factor in the difference in UI.
If in fact the Samsung device really has DDR3 RAM, and the difference (after accounting for clock speed) is 2-3 fps, I can understand why HTC opted not to include it. It most likely was not worth the extra cost.
Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 fps difference when you factor in the difference in UI.
If in fact the Samsung device really has DDR3 RAM, and the difference (after accounting for clock speed) is 2-3 fps, I can understand why HTC opted not to include it. It most likely was not worth the extra cost.
GLBenchmark is a test of GPU performance; it isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz, ran GLBench and got 10 FPS. I then ran the test again with the CPU at 1.6 GHz; the result, 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason: my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s in dual-channel configuration.
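Those two figures follow from the usual DDR arithmetic: peak bandwidth is the clock, times two transfers per cycle, times the bus width. A quick sketch, assuming two 32-bit channels (64 bits total):

Code:
# Peak bandwidth in GB/s = clock * 2 (double data rate) * bus width in bytes.
def ddr_bandwidth_gbs(clock_mhz, bus_bits=64):  # 64-bit = dual 32-bit channels
    return clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

print(f"LPDDR2 @ 533 MHz: {ddr_bandwidth_gbs(533):.1f} GB/s")  # ~8.5
print(f"LPDDR3 @ 800 MHz: {ddr_bandwidth_gbs(800):.1f} GB/s")  # 12.8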
Turbotab said:
GLBenchmark is a test of GPU performance; it isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz, ran GLBench and got 10 FPS. I then ran the test again with the CPU at 1.6 GHz; the result, 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason: my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s in dual-channel configuration.
In that case the results are quite disappointing.
All these fantastic new phones, and so much disappointment.
Sent from my GT-I9300 using xda premium
Tomatoes8 said:
They could have used faster memory for the same price if they hadn't cut off Samsung as a supplier. Makes you wonder where their priorities lie: making the best products possible, or just going through the motions.
No one is going to take anything you say here seriously, as you've managed to have 2 threads closed in the last 30 minutes. One of those inane posts involved you claiming that HTC is going to be paying, according to your genius calculation, 20% of their profits to Apple (I forget what insanely unintelligent reason you gave). Yeah, because being able to completely migrate data from one completely different phone to another is such a bad idea for a company that wants to push their product.
So, what is the per-unit cost of what HTC is paying for RAM now vs. what they could have gotten from Samsung? Exactly, you have no idea. I also didn't hear anything about HTC "cutting off" Samsung as a supplier, but maybe I missed it, so I googled "htc cut off samsung supplier" and found 2 links...
http://tech2.in.com/news/smartphones/following-apple-htc-cuts-component-orders-from-samsung/505402
http://www.digitimes.com/news/a20121009PD213.html
I'm not sure if you have the capability of reading or not, but I'll spoon-feed you this information, OK hunny? I've taken the info from the 1st link, since there is more there.
After Apple Inc slashed its orders for memory chips for its new iPhone from major supplier and competitor, Samsung Electronics Co Ltd, HTC too has reportedly cut down on its smartphone component orders from the South Korean company.
So, Apple cut down on memory orders. You know, they're the ones who make the iPhone? Have a logo of an apple on their products? Steve Jobs was the CEO before he died. Anyway, I'll continue...
According to a report by DigiTimes, HTC has reduced its orders from Samsung, and instead opted to order CMOS image sensors from OmniVision and Sony. The company has also chosen to move part of its AMOLED panel orders to AU Optronics, DigiTimes reported citing ‘sources’.
Notice it said that HTC reduced its orders from Samsung, specifically on the image sensors (that's for the camera, if you didn't know) and the screen. You know, the thing on the front of your phone that you touch to make it do things? You know what I mean, right? I encourage you to read this link (or possibly have someone read it to you)...
http://dictionary.reference.com/browse/reduce
The point is that reducing isn't the same as cutting off. Cutting off would require HTC not ordering ANYTHING from Samsung. Guess what? The One doesn't use an OmniVision CMOS sensor (don't forget, that's what the camera uses) or an AMOLED screen (the bright part of your phone that shows you info).
Also, this is a far better designed phone, especially in regards to hardware, than anything Samsung has ever produced. I went back to my EVO 4G LTE, mainly because I couldn't stand the terrible build quality of the Note 2; it just feels like a cheap toy. And, IMO, Sense is far better than TouchWiz. Samsung may have the market right now because of the Galaxy line of products, but that doesn't mean HTC is out of the game by any means.
Seriously, attempt to use just a bit of intelligence before opening your mouth and spewing diarrhea throughout the One forums. As the saying goes: "it's better to keep your mouth shut and have people think you're an idiot than to open your mouth and prove it". Unfortunately for you, it's too late.
I really think Turbo was too hasty opening a new thread for this, as we've been discussing it in the mega thread.
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
It scores 34 fps in Egypt HD 1080p offscreen, while the leaked Samsung S600 device scores 41 fps, which is perfectly in line with Qualcomm's promised speed (3x Adreno 225).
Here is a video of what I suspect is the source of the benchmark, because we had no benchmark before it:
http://www.youtube.com/watch?v=Wl1dmNhhcXs
Notice how the battery is almost at its end (the HTC bar at this stage means it's in the last 25%); also notice the activity in the notification area.
More importantly, the poster ran more than a few full benchmarks, like Quadrant, before running GLBenchmark. This alone is enough to lower the score, especially since the Adreno 320 was known to throttle in the Nexus 4.
I think benchmark scores should not be relied on in such events, especially with hundreds of hands messing with the device. We learned from the One X launch, where videos popped up showing horrible performance, that those units eventually turned out to be very far from the final device in your hands.
Finally, the One X and Nexus 7 at the same GPU clock, the first with DDR2 and the second with DDR3, score the same in GLBenchmark.
In other words it's worrying, but it's best to wait for proper testers like Anand.
Thread cleaned
...from some serious trolling. There should be no trace of him for some time.
But...
I just wonder why a Samsung phone uses high-end parts from Qualcomm instead of Samsung's own processors. But I'm not into Samsung devices so far, so I won't judge.
Gz
Eddi
Here's a second video also showing the Egypt offscreen bench at 34 FPS.
https://www.youtube.com/watch?v=wijp79uCwFg
Skip to 3:30
Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7 GHz, while the Samsung device is running at 1.9 GHz.
Although 200 MHz does not seem like much, it could possibly account for the 7 fps difference when you factor in the difference in UI.
If in fact the Samsung device really has DDR3 RAM, and the difference (after accounting for clock speed) is 2-3 fps, I can understand why HTC opted not to include it. It most likely was not worth the extra cost.
So you're saying that 200 MHz on the CPU can account for 7 fps in a GPU test?
Following what you said, the Nexus 4 should have scored 27 fps, since it has 200 MHz less...
But no, it scored 33.7... only 0.3 fps less than the One!
And you know why? First, both use the same GPU (which is what counts in a graphics test), and second, HTC phones are always slower due to Sense!
So stop *****ing and realize that the One is no god phone.
The Samsung device is running 4.2.1.

[Discussion] M9+ benchmarks, real life performance experiences

All HTC M9+ owners!
We're only a handful on XDA yet, but getting more numerous as the device rolls out to more locations, and people who look for an elegant, high-end device with premium build quality and extra features like the fingerprint scanner, the 2K display and the Duo camera settle on this fantastic device. It's unfortunately not perfect at everything, and not the best gaming phone out there, but in my experience it is a very well performing device in terms of call quality, reception quality, Wi-Fi strength, multimedia, battery, display panel and overall UX feel and smoothness. High-end games from 2015 might stutter a bit here and there or drop some important shaders to compensate, but it's nonetheless good for earlier 3D gaming (2014 and before) and all kinds of 2D games. So let's gather the experiences and benchmarks of this unique device, which was maybe the first MTK device in an alu body.
Let's discuss performance-related experiences, real user feel and benchmarks, free of whining, facing the truth that in some respects it's not the top-notch device, but letting the curious ones who are considering this device know what to expect from this elegant business-class phone.
I'll start with some of my benchmark result screenshots in separate posts.
UPDATE:
Here's my short game-testing video on the M9+:
https://www.youtube.com/watch?v=QmLGCoI4NLw
AnTuTu on stock Europe base 1.61
Vellamo tests, base 1.61
Futuremark test, base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real-life experience is that in everyday chores the phone is very snappy, with some small lags when loading new apps into memory. Task switching is quite good.
There's unfortunately a memory-leak issue with Android 5.0.x which kicks in after a while (on most 5.0.x devices as well), but overall the UX smoothness is quite good. Occasionally some apps bring up popup windows a bit stutteringly; for example, the Facebook comments animation tends to stutter.
The Sense Home/BlinkFeed experience is just perfect. In normal operation, when no big application updates are happening in the background, I never faced any lags on the Sense Home UI.
As for games, titles from 2014 and before run perfectly. New ones might get some shaders removed or polygon counts reduced, so don't expect a jaw-dropping 3D experience. If you are OK with last years' (2014 and before) 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to suit the PowerVR GPU in the M9+'s MTK chipset. The phone keeps a good temperature mostly, except when charging the battery while playing 3D-heavy games (but that's expected from most devices, especially with an alu body).
The screen quality is quite good; I got a perfect panel with the first unit, and the refresh rate is 60 Hz, smooth, with a lovely DPI and brightness at 2K.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU. All tests show 3D performance that is far below 2014's flagships. The GPU, a PowerVR 6200, is the same that was built into the iPhone 5s, which, let's face it, is a mediocre GPU for a 2K display. If you face this fact and accept that gaming won't be with the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
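Taking the post's GPU comparison at face value, the pixel-load gap between the panel that GPU generation originally drove and the M9+'s 2K screen is easy to quantify. A quick sketch (standard panel resolutions):

Code:
# Pixel load: iPhone 5s-class panel vs. the M9+'s 2K panel.
iphone5s = 1136 * 640   # iPhone 5s display
m9_plus  = 2560 * 1440  # HTC One M9+ display
print(f"iPhone 5s: {iphone5s:,} pixels")
print(f"M9+:       {m9_plus:,} pixels ({m9_plus / iphone5s:.1f}x)")
# ~5x the pixels on a similar GPU class goes a long way toward
# explaining the weak 3D scores at native resolution.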
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, with a few hits and misses. I'll be back after a few days of using it with more adequate feedback.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light-years ahead on the M9.
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light-years ahead on the M9.
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era, when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
tbalden said:
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era, when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
I see, I didn't notice that. I don't understand MediaTek though: why did they put such an outdated graphics core in their 'flagship' processor?
DeadPotato said:
I see, I didn't notice that. I don't understand MediaTek though: why did they put such an outdated graphics core in their 'flagship' processor?
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties incorporating one.
Either way the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. Remains to be seen, but I'm not holding my breath; no wonders can be done.
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties incorporating one.
Either way the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. Remains to be seen, but I'm not holding my breath; no wonders can be done.
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
DeadPotato said:
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
Seconded.
The X10 is MTK's first high-end SoC. They should have equipped it with a high-end GPU to make a name for it, not a two-year-old one from the iPhone 5s era. Most casual gaming is OK, but that's not what a "flagship" device should be limited to.
It would be excellent if there were some possibility to tweak it at the kernel level, but the MediaTek Helio X20 really should do better and better on the GPU side.
I'm not a heavy gamer, so the UX is pretty good for me. (The camera is another topic.)
I even measured the CPU speed with a chess game, and the result reached the top tier.
tbalden said:
AnTuTu on stock Europe base 1.61
Vellamo tests, base 1.61
Futuremark test, base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
FYI.
Mine: stock, Taiwanese base 1.08, rooted device.
Sigh... the result was maxed out for my device on Ice Storm "Extreme".
Very similar result to mine, except Ice Storm Unlimited. I think it might be related to temperature throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I've really been looking forward to the HTC M9+; I think it is what the M9 should have been, but I don't understand HTC's sh*tty 2015 choices across the board.
What I would like to know about is battery life; can you guys tell me about it?
I bought a Galaxy S6 Edge a few days ago and it's scary: I have no battery left by the end of the afternoon. I don't play games, just a few photos and browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only hope left is the Lenovo Vibe X3, but there has been no news since March...
Thanks guys!
After installing more and more software that synchronizes, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+, with average screen-on time around 4 hrs. All in all, not a miracle, but that's about what I got with my previous phone, the OPO, with its larger battery and lower screen resolution. A bit less, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic; my friend's Z3C gets 3+ days and 5 hours of screen-on time. And I think that's done with a great stock kernel enhanced by Sony engineers and the Sony stock ROM... wish other phones could do that.
Not sure why, but I cannot choose "no lock screen" or "lock screen without security" under the security settings. Both options are greyed out, and it says they are disabled by the administrator, encryption policies or other apps.
I only have Prey and Lookout Security installed, which could affect this, but it was totally fine on my M7.
Does anyone have any idea? Thank you.
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag the result down, right?
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag the result down, right?
Yes, and it's fairly accurate.
I got 11871 at 2K; I couldn't screenshot it, as the tool doesn't know how to (it doesn't work above 1080p without dedicated software).
M8 at 1080p.
Also, now that I look at it, your CPU integer score is out of proportion due to all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion due to all the cores the MTK SoC has, lol; too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
Hmmm...
The efficiency of software decoding shows up vividly with multiple cores in real life.
There are many comparisons between the MT6795 (not the "T") and the SD801 on this site,
covering browser speed, RAR compression, real gaming, power consumption...
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag the result down, right?
Are those results after reducing the resolution to 1080p? Did I read that correctly? If that's the case, it sadly confirms the limits of the 3D graphics core, as others stated before. If you look at the M8 above (which I find surprisingly higher than what I experienced on my own M8, anyway), you'll notice even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multicore-wise the Helio X10 is a beast; graphics-wise, it looks like not so much.
