SGSIII Mali 400 Drivers on the note! - Galaxy Note GT-N7000 General

The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same GPU as the Adreno 220, just at double the frequency. The frequency isn't the point here, though: using these drivers gave them a HUGE performance boost at the STOCK frequency.
As we know, the Mali-400 GPU in the SGSIII is clocked at 400MHz, but even if you overclocked the Mali-400 in your Note (which has nearly the same resolution) you wouldn't reach that performance, which tells me it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali-400 drivers from the SGSIII so we can use them on our phones?
This isn't a question, so I think it belongs here rather than in the Q&A section; it's just a discussion of whether this will work or not!

Same driver, bigger screen = performance loss.
That is why Sammy clocked the Note's CPU 200MHz faster than the S2's.

The screen size has NOTHING to do with it; the resolution is what matters, and that is the same on the SGSIII and the Note.
Also, that's why I said that even if you overclock the GPU to 400MHz you still won't reach that performance, so it has to be the drivers.

The Note and SGSIII do indeed have different screen resolutions: the Note is 1280x800, while the SGSIII is 1280x720. Not much of a difference, though; basically 16:10 vs 16:9, respectively. I believe the new Mali-400 drivers will be in the next ROM update anyway.

Hell Guardian said:
The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same GPU as the Adreno 220, just at double the frequency. The frequency isn't the point here, though: using these drivers gave them a HUGE performance boost at the STOCK frequency.
As we know, the Mali-400 GPU in the SGSIII is clocked at 400MHz, but even if you overclocked the Mali-400 in your Note (which has nearly the same resolution) you wouldn't reach that performance, which tells me it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali-400 drivers from the SGSIII so we can use them on our phones?
This isn't a question, so I think it belongs here rather than in the Q&A section; it's just a discussion of whether this will work or not!
Click to expand...
Click to collapse
Well, if they are exactly the same apart from clock speed, then I would think they should work indeed.
This is interesting and I certainly hope it works. Not that the GPU is lacking at 400MHz or even less, but who doesn't like more performance for free?

Muskie said:
The Note and SGSIII do indeed have different screen resolutions: the Note is 1280x800, while the SGSIII is 1280x720. Not much of a difference, though; basically 16:10 vs 16:9, respectively. I believe the new Mali-400 drivers will be in the next ROM update anyway.
Click to expand...
Click to collapse
I know that, but that difference is by no means major enough to affect performance that much if they both run at the same frequency.
shaolin95 said:
Well, if they are exactly the same apart from clock speed, then I would think they should work indeed.
This is interesting and I certainly hope it works. Not that the GPU is lacking at 400MHz or even less, but who doesn't like more performance for free?
Click to expand...
Click to collapse
My thoughts exactly. If the folks at the Sensation section did it, why can't we?

Link to the drivers that were extracted from the One S:
http://forum.xda-developers.com/showthread.php?t=1643472
Just check the replies to see the performance boost. This is the EXACT same situation as with the Note and SGSIII GPUs.

Wow, that's a good boost.

nex7er said:
Wow, that's a good boost.
Click to expand...
Click to collapse
I think if Note users can get that kind of boost on their phones, it will eliminate ANY kind of lag in the UI and it will be amazingly smooth. It would also give a huge boost to SGSII users.

If this really happens and it does work, what about battery life? It could be worse, I think.

In theory, I see where you're going with this, and in theory it sounds plausible. However, something that I think has been overlooked is the process design of the new S3's chipset vs the one found in the current-generation S2/Note (32nm vs 45nm). It's entirely possible that the only reason Samsung is able to run the Mali-400 at 400MHz is that the 32nm process is just that much more efficient, such that you can safely run at 400MHz using the same power as you would running at 266MHz on the 45nm process.
I just get the feeling that trying to push the 45nm process up to 400mhz might simply melt the silicon (or at least gobble your battery life in one gulp!). Call me defeatist if you have to, but I remain skeptical until I see evidence to the contrary.
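The process-node argument above can be sketched with the usual first-order rule that dynamic switching power scales as P ≈ C·V²·f. The capacitance and voltage values below are invented placeholders, not Samsung's actual figures; the sketch only shows the shape of the trade-off:

```python
# First-order dynamic power model: P = C_eff * V^2 * f.
# All numbers here are illustrative guesses, NOT real chip data.
def dynamic_power(c_eff, voltage, freq_hz):
    """Rough dynamic switching power for a CMOS block."""
    return c_eff * voltage**2 * freq_hz

# Hypothetical 45nm part at 266 MHz vs a 32nm shrink (lower C and V) at 400 MHz.
p_45nm = dynamic_power(c_eff=1.0, voltage=1.1, freq_hz=266e6)
p_32nm = dynamic_power(c_eff=0.7, voltage=1.0, freq_hz=400e6)
print(p_32nm / p_45nm)  # ~0.87: higher clock, yet less power, on these numbers
```

On these made-up numbers the shrink runs 50% faster while drawing less power, which is the skeptic's point: the 400MHz figure may be a property of the 32nm process, not the driver.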

I run my Galaxy Nexus with the GPU clocked to 512MHz (stock is 308MHz), and that chip also uses the 45nm process.
I've been running it like that for the last 3 months with no issues, and game FPS is greatly improved.
Are there any kernels at all that even support overclocking the Note's GPU?

Very interesting, Would like to see this being investigated further for sure!

The screen has nothing to do with it... on the Note we have about 100k more pixels (1280x800 minus 1280x720 is roughly 100k)... and the S3 has more cores in its Mali GPU... but yeah, I think the drivers would get us more performance.
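For anyone who wants to check the arithmetic behind the "~100k more pixels" figure, here it is worked out (plain Python, nothing device-specific):

```python
# Pixel counts behind the "~100k more pixels" figure quoted above.
note_pixels = 1280 * 800   # Galaxy Note (WXGA)
sgs3_pixels = 1280 * 720   # Galaxy S III (720p)

extra = note_pixels - sgs3_pixels
print(extra)                      # 102400 extra pixels per frame on the Note
print(100 * extra / sgs3_pixels)  # ~11.1% more pixels for the GPU to fill
```

So the Note pushes roughly 11% more pixels per frame than the SGSIII: a real but fairly modest extra load.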

lyp9176 said:
If this really happens and it does work, what about battery life? It could be worse, I think.
Click to expand...
Click to collapse
The SGS3 seems to have decent battery life.

resistant said:
The screen has nothing to do with it... on the Note we have about 100k more pixels (1280x800 minus 1280x720 is roughly 100k)... and the S3 has more cores in its Mali GPU... but yeah, I think the drivers would get us more performance.
Click to expand...
Click to collapse
After some digging, I found that the GPU in the Exynos 4210 (SGS2/Note) and the 4412 (SGS3) is exactly the same Mali-400MP4 (same number of GPU cores)! The only difference is that the 4412's GPU can go up to 400MHz, which is doable on our GPU too and has already been done on the SGS2. The main difference here is the four CPU cores helping the GPU. I'm skeptical that the new drivers will do much, if anything, for performance! Oh, and let's not forget that the Adreno GPU drivers are written by Qualcomm, who can't do anything right, so their updated drivers may just be better written (or at least less buggy) than the old ones!

Manya3084 said:
I run my Galaxy Nexus with the GPU clocked to 512MHz (stock is 308MHz), and that chip also uses the 45nm process.
I've been running it like that for the last 3 months with no issues, and game FPS is greatly improved.
Click to expand...
Click to collapse
It has been proven to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
Game frame rate is simply due to force-GPU rendering being active.
Sent from my GT-I9300 using xda premium

Mahoro.san said:
The SGS3 seems to have decent battery life.
Click to expand...
Click to collapse
That is due to the new processor voltage and the low idle drain of the CPU
Sent from my GT-I9300 using xda premium

GR36 said:
It has been proven to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
Game frame rate is simply due to force-GPU rendering being active.
Sent from my GT-I9300 using xda premium
Click to expand...
Click to collapse
Was this during kernel development in the Gingerbread days, or with the current ICS kernels?
Sent from my Galaxy Nexus using Tapatalk 2

Maybe...
Clocking the GPU at 400MHz would give a boost in performance, but at the cost of battery life, and it would also make the phone really hot, which is not ideal. Just wait a little while and see how the S3 performs under those conditions...

Related

Galaxy S SGX540 GPU. Any details up till now?

Hi everyone,
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see declarations that the SGX540 can do 90 million polygons per second, and on the other I see stuff like "twice the performance of the SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 offers.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs the iPhone using the SGx535. The only data I can find seems like these two GPU's are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than the Snapdragon with its Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's a modified Xenos core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary with all the memory efficiencies between the Cortex-A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
Click to expand...
Click to collapse
I think that is the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones the speed in games is far better, much better fps though there is a problem that we might have to wait for any games to really test its power as most are made to run on all phones.
This was the same problem with the xbox and ps2, the xbox had more power but the ps2 was king and so games were made with its hardware in mind which held back the xbox, only now and then did a xbox only game come out that really made use of its power....years later xbox changed places which saw 360 hold the ps3 back (dont start on which is better lol) and the ps3 has to make do with 360 ports but when it has a game made just for it you really get to see what it can do...anywayits nice to know galaxy is future proof game wise and cannot wait to see what it can do in future or what someone can port on to it.
On a side note I did read that the videos run through the graphics chip which is causing blocking in dark movies (not hd...lower rips) something about it not reading the difference between shades of black, one guy found a way to turn the chip off and movies were all good, guess rest of us have to wait for firmware to sort this.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
Click to expand...
Click to collapse
Smart move, Sammy.
voodoochild2008-
I wouldn't say we'd have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels).
Even the Motorola Droid (SGX530 at 110MHz, about 9 Mpolys and 280 Mpixels at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware is not yet being pushed to its limit, but weaker devices should be hitting theirs soon.
bl4ckdr4g00n- Why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at lightspeed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now that firms are heading more towards Android. I did read about one big firm in the USA dropping marketing for Apple and heading to Android, and well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone took 1MP photos when others were on 3MP, then it had no video when most others did, and then they hyped it when it moved to a 3MP camera and got video... OMG. OK, I'll stop, as it makes my blood boil that people buy into Apple; yes, they started the ball rolling, and good on them for that, but then they just sat back and counted the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for? lol
Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way ahead spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
The Droid X still uses the SGX530, but unlike the original Droid, it runs at the stock 200MHz (or at least 180).
At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
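As a rule of thumb, theoretical GPU throughput figures scale roughly linearly with clock, which is how numbers like these are usually derived. A quick sketch using the 200MHz Droid X figures quoted just above (the 110MHz result is simply that figure scaled down; real parts won't track the ideal exactly):

```python
# Linear clock scaling of theoretical GPU throughput figures.
def scale_with_clock(value_at_ref, ref_mhz, target_mhz):
    """Scale a peak throughput figure linearly with GPU clock."""
    return value_at_ref * target_mhz / ref_mhz

mpix_at_200 = 450  # midpoint of the 400-500 Mpixels/s quoted at 200 MHz
print(scale_with_clock(mpix_at_200, 200, 110))  # 247.5 Mpixels/s at 110 MHz
```

That lands a bit below the ~280 Mpixels figure quoted earlier in the thread for the 110MHz part, which shows how loosely these peak numbers get thrown around.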
The 535 is a downgrade from the 540; the 540 is the latest and greatest of the PowerVR line.
Samsung did not cost-cut; they in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D/power!
This worries me as well... Seems like it might not be as great as we thought. HOWEVER, again, this is a new device, and it might be fixed in firmware updates, because the hardware is obviously stellar; there's something holding it back.
Pika007 said:
The Droid X still uses the SGX530, but unlike the original Droid, it runs at the stock 200MHz (or at least 180).
At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
Click to expand...
Click to collapse
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact that it's running 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
wait so what about the droid x vs the galaxy s gpu?? i know the galaxy s is way advanced in specs wise... the droid x does have a dedicated gpu can anyone explain??
Click to expand...
Click to collapse
Same here. I want to know which one has better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip in the original Droid could overclock to 1.2GHz from, what, 550MHz?
How about the CPUs in the Droid X and Galaxy S? Has anyone compared those chips? Which can overclock higher, and which is better overall?
Sorry about the poor English; I hope you guys can understand.
The CPU in the Droid X is a stock Cortex-A8 running at 1GHz. The Samsung Hummingbird is a specialized version of the Cortex-A8, designed by Intrinsity, running at 1GHz.
Qualcomm likewise did a complete redesign of the Cortex-A8 for the 1GHz Snapdragon. While the original A8 could only be clocked at 600MHz with a reasonable power drain, the stripped-down versions of the A8 could be clocked higher while maintaining better power.
An untouched Cortex-A8 can do more at the same frequency than a specialized, stripped-down A8.
If anything, the Samsung Galaxy S is better balanced, leveraging the SGX540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence; the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
TexUs-
I wouldn't take it too seriously.
Quadrant isn't a very rigorous benchmark; plus, I think you can blame the fact that 2D acceleration on the SGS is done by the processor, while the Droid X does 2D acceleration on the GPU.
I can assure you there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let all devices get past their early teething problems, and test again, with more than one benchmark.
Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't a very rigorous benchmark; plus, I think you can blame the fact that 2D acceleration on the SGS is done by the processor, while the Droid X does 2D acceleration on the GPU.
I can assure you there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let all devices get past their early teething problems, and test again, with more than one benchmark.
Click to expand...
Click to collapse
The SGS might be falling behind in I/O speeds... It is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy but a whole lot easier on the battery.
thephawx said:
At the end of the day, you really shouldn't care too much about obsolescence; the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
Click to expand...
Click to collapse
Smartphones aren't the problem, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smartphones aren't the problem, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
Click to expand...
Click to collapse
I think so. Battery life will be the biggest issue for smartphones in the future if capacity stays at 1500mAh or even less.
A dual-core CPU could be fast, but power-hungry as well.

Tegra 2 overclocking?

Any info out there about overclocking this baby? Will standard overclocking tools work, or does new software need to be developed?
To overclock the CPU, I think you'd need a custom kernel that allows it first. But if the bootloader is locked, custom kernels can't be flashed.
You won't have to worry about performance issues with the Tegra 2 for a while, though.
As if you needed to run Crysis on it?
Tough crowd this morning!
This site is here for getting the most out of devices. Rooting and removing bloatware increase performance. Custom ROMs improve performance and the user experience. I merely asked about another tool for optimizing a device.
bee55 said:
To overclock the CPU, I think you'd need a custom kernel that allows it first. But if the bootloader is locked, custom kernels can't be flashed.
You won't have to worry about performance issues with the Tegra 2 for a while, though.
Click to expand...
Click to collapse
Haha, don't underestimate the people who hang out at XDA and other dev sites; we find ways to work these phones to the bone. I know I'll probably have 100 apps downloaded and installed in the first 24 hours, and I will be testing its limits.
You have the best CPU ever put in a phone and you want to overclock. Wow. Why?
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Snapdragon was the best at one time, and most ROMs had overclocking built in!
Snapdragon is the worst CPU at 1GHz. Even the TI OMAP is better than Qualcomm. The main reason I won't buy any more HTC phones is Qualcomm and their ****ty in-phone performance compared to Samsung, TI, and now Nvidia.
Recon Freak said:
Snapdragon was the best at one time, and most ROMs had overclocking built in!
Click to expand...
Click to collapse
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Hence why he said 'at one time'.
Sent from my SGH-I897 using XDA App
AllTheWay said:
Snapdragon is the worst CPU at 1GHz. Even the TI OMAP is better than Qualcomm. The main reason I won't buy any more HTC phones is Qualcomm and their ****ty in-phone performance compared to Samsung, TI, and now Nvidia.
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Click to expand...
Click to collapse
Snapdragon is far from being the worst CPU, clock for clock. First of all, Snapdragon is not a CPU; it's an SoC (System on a Chip), and the CPU core inside Snapdragon is called Scorpion. Scorpion is neither a standard ARM Cortex-A8 nor an A9 core, unlike the CPU cores inside the Hummingbird/TI OMAP/Nvidia Tegra, but it can be thought of as being in the same class as Cortex-A8 CPUs. Scorpion has some big advantages over a standard Cortex-A8 core in some areas (e.g. floating point). The reason many found the first generation (in the Nexus One and HTC Desire) to be "slow" is that they looked only at composite benchmarks like Quadrant and/or 3D games. The first-generation Snapdragon has a rather dated GPU (Adreno 200) in it, and the Adreno 200's 3D performance is, honestly, bad. The second-generation Snapdragon (Desire Z/G2, Desire HD) uses a much faster GPU, the Adreno 205, making its 3D performance on par with the Hummingbird and other current-generation SoCs.
So before you go saying again that Snapdragon is the slowest "CPU", do some reading, and think before posting. Here is some good reading for you:
http://www.anandtech.com/show/4144/...gra-2-review-the-first-dual-core-smartphone/4
http://www.anandtech.com/show/4165/the-motorola-atrix-4g-preview/5
AllTheWay said:
Snapdragon is the worst CPU at 1GHz. Even the TI OMAP is better than Qualcomm. The main reason I won't buy any more HTC phones is Qualcomm and their ****ty in-phone performance compared to Samsung, TI, and now Nvidia.
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Click to expand...
Click to collapse
If you blindly trust benchmarks, the Scorpion CPU in the 2nd-gen Snapdragons is quite fast... my G2 benchmarks at:
Quadrant: 2,700ish
Linpack: 52.69
SunSpider: 2,257
Neocore: 57
In fact, all of those benchmarks either match or surpass the Atrix 4G.
No problems here with my 1GHz Snapdragon; Linpack is a constant 42+.
Now that the phone is rooted, can we use SetCPU to underclock it to save battery?
Or does SetCPU not support dual-core?
Also, is what I said above true: if we have root, can we underclock without flashing a custom kernel?
The Nvidia Tegra 2 kernel does not have a simple method to modify the CPU frequency table. The dev working on the gTablet kernel would be a good resource to ask; his name is Pershoot. From my understanding, he would have to backport the original ARM scaling, which is not trivial in the least.
Maybe someone can figure out another way.
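For context on why the frequency table matters: tools like SetCPU can only pick speeds that the kernel's cpufreq table actually exposes, which is why a locked-down table blocks both over- and underclocking. A minimal sketch of that snapping behavior follows; the table values are invented, in kHz as cpufreq reports them, and are not Tegra 2's real table:

```python
# Toy cpufreq frequency table, in kHz; values are made up for illustration.
FREQ_TABLE_KHZ = [216000, 456000, 608000, 760000, 912000, 1000000]

def nearest_supported(requested_khz, table=FREQ_TABLE_KHZ):
    """Snap a requested CPU frequency to the closest entry in the table."""
    return min(table, key=lambda f: abs(f - requested_khz))

print(nearest_supported(500000))   # 456000: closest table entry
print(nearest_supported(1300000))  # 1000000: can't exceed the table's ceiling
```

Overclocking means patching new, higher entries into the kernel's table itself, which is exactly the non-trivial backporting work described above.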
tsekh501 said:
As if you needed to run Crysis on it?
Click to expand...
Click to collapse
Actually yeah, and who wouldn't? That's probably enough to get you instantly laid in some countries.
Arkasai said:
Actually yeah, and who wouldn't? That's probably enough to get you instantly laid in some countries.
Click to expand...
Click to collapse
Serious bragging rights right there.
Guy 1: "Damnit, I just got Crysis 2, and I can't even run Crysis 1 on my computer."
Guy 2: "Yeah well I can run it on my cell phone...look."
Guy 1's Girlfriend: "Take me, now, Guy 2!."
You get the picture.
Sorry to go off-topic there, but I do have a question: isn't the Tegra 2 Cortex-A9 based? And there's nothing wrong with wanting to push a device to its limits. Overclocking is fun.
dandmcd said:
Haha, don't underestimate the people who hang out at XDA and other dev sites; we find ways to work these phones to the bone. I know I'll probably have 100 apps downloaded and installed in the first 24 hours, and I will be testing its limits.
Click to expand...
Click to collapse
lol, same here. I have about 45 installed on my Galaxy Tab, and all of them will be installed on the Atrix immediately and tested. I plan on testing every single game I can find on the Market, lol; the biggest for now being Dungeon Defenders... it runs a bit slow on the Galaxy Tab, and I've heard it runs *GREAT* on Tegra 2.
AllTheWay said:
You have the best CPU ever put in a phone and you want to overclock. Wow. Why?
Sent from my SAMSUNG-SGH-I897 using Tapatalk
Click to expand...
Click to collapse
Because you can make it better. Why settle for less? My Captivate is fast and does everything I need it to do at 1GHz, but I have it at 1.3GHz now, and undervolted.
Why? Because it is better.
Captivate 2.2.1 Paragon
Is there a simple way to back up all the apps installed on my phone so I can just dump them instantly onto a new phone? Preferably without having to hit "install" for every app in the Market.
Wow, it's a dual-core processor and you want to OC... ugh, get out... lol

[Q] Low clock speeds

Any reason for the 1.15GHz CPU speed and 400(ish)MHz GPU speed other than cost? Or do you think they underclocked to save the battery? I'm hoping we can overclock to T30L speeds.
foxorroxors said:
Any reason for the 1.15GHz CPU speed and 400(ish)MHz GPU speed other than cost? Or do you think they underclocked to save the battery? I'm hoping we can overclock to T30L speeds.
Click to expand...
Click to collapse
Definitely clocked that way to improve battery life and reduce heat.
Sent from my GT-N7000 using xda premium
No need to worry; developers will get this tablet to at least 1.5GHz or more. Overclock tweaks for the Transformer Prime should work on this too; all it'll need is root.
Do we really need to overclock this? I mean, I probably will anyway, but a 1.3GHz quad is pretty zippy by itself!
As the Tegra 3's GPU is fairly weak compared to, say, the Galaxy S3's (international), I only hope we can OC the GPU enough to make a difference. I'm not that bothered about OCing the CPU, but I do care about the GPU.
miketoasty said:
Do we really need to overclock this? I mean, I probably will anyway, but a 1.3GHz quad is pretty zippy by itself!
Click to expand...
Click to collapse
True; even at 1.0GHz it'll do fine with most games.
I underclock my S2 to 1.0GHz and I've experienced no hiccups whatsoever, and I'm still on dual-core.
Sent from my GT-I9100 using Tapatalk 2
Questions go in the Q&A section
foxorroxors said:
Any reason for the 1.15GHz CPU speed and 400(ish)MHz GPU speed other than cost? Or do you think they underclocked to save the battery? I'm hoping we can overclock to T30L speeds.
Click to expand...
Click to collapse
The Tegra 3 used in the Nexus 7 is a version of the Tegra 3 chip that didn't work within guidelines at the regular speeds, but were within guidelines for a lower speed. This is done regularly in Intel/AMD CPUs as well. That's why there are different speed CPUs in the same model family. This way they can sell the high speed CPUs at a higher cost and still make money off the CPUs that can't run as fast. Eventually the process to make the chips will be so efficient that they will artificially lower the speeds to sell as the cheaper version and that's when you can overclock like crazy and not have instability (if the CPU product cycle lasts that long).
http://en.wikipedia.org/wiki/Product_binning
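The binning process described above can be pictured as a simple sort into speed grades. The SKU names and MHz thresholds below are invented for illustration (loosely echoing the T30/T30L naming used earlier in the thread), not Nvidia's real bin criteria:

```python
# Toy model of product binning: each die is tested for its maximum stable
# clock, then sold as the fastest SKU it qualifies for. Thresholds invented.
SKUS = [("T33 @ 1.6 GHz", 1600), ("T30 @ 1.4 GHz", 1400), ("T30L @ 1.2 GHz", 1200)]

def bin_die(max_stable_mhz):
    """Return the fastest SKU this die qualifies for, or reject it."""
    for name, required_mhz in SKUS:
        if max_stable_mhz >= required_mhz:
            return name
    return "reject"

print(bin_die(1650))  # T33 @ 1.6 GHz
print(bin_die(1250))  # T30L @ 1.2 GHz: same silicon, slower bin
print(bin_die(1100))  # reject
```

This is also why low-bin parts often have little overclocking headroom: a die sold in the slow bin may genuinely top out barely above its rated speed.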
Outrager said:
The Tegra 3 used in the Nexus 7 is a version of the Tegra 3 chip that didn't work within guidelines at the regular speeds, but were within guidelines for a lower speed. This is done regularly in Intel/AMD CPUs as well. That's why there are different speed CPUs in the same model family. This way they can sell the high speed CPUs at a higher cost and still make money off the CPUs that can't run as fast. Eventually the process to make the chips will be so efficient that they will artificially lower the speeds to sell as the cheaper version and that's when you can overclock like crazy and not have instability (if the CPU product cycle lasts that long).
http://en.wikipedia.org/wiki/Product_binning
Click to expand...
Click to collapse
This suggests the Nexus 7 probably won't OC so well, which wouldn't surprise or disappoint me. It appears Asus dropped a lot of little features to keep cost down (which I think is a good move), and using CPUs that didn't bin well is one good way to keep costs low.
i777 w/ Siyah 3.4.3 dual booting AOKP and Shostock... yet sent from my iPad using Forum Runner

CPU/Processor Showdown - HTC One vs Galaxy S4

Which processor will be better, the Exynos 5 Octa or a simple Snapdragon 600 quad?
In my POV, the Octa will be useless, since it will be a battery hog and no apps really use that many cores or that much power. The S600 will be more efficient for day-to-day use, since it consumes less power and will actually be fully used.
-.-.-.-.-.-.-.-.-
Sent from a dark and unknown place
Galaxy Tab 2 7.0 P3100
I thought the S4 had the same processor as the One, but clocked at 1.9GHz? I could be wrong; I wasn't really paying attention.
Sent from my HTC One using Tapatalk 2
I'd imagine this thread will get closed.
In the meantime, read this thread and then make a judgement because the "it uses more power so it sucks" mentality is just simply incorrect.
[Info] Exynos Octa and why you need to stop the drama about the 8 cores
AndreiLux said:
Misconception #1: Samsung didn't design this, ARM did. This is not some stupid marketing gimmick.
Misconception #2: You DON'T need to have all 8 cores online, actually, only maximum 4 cores will ever be online at the same time.
Misconception #3: If the workload is thread-light, just as we did hot-plugging on previous CPUs, big.LITTLE pairs will simply remain offline under such light loads. There is no wasted power with power-gating.
Misconception #4: As mentioned, each pair can switch independently of the other pairs. It's not the whole cluster that switches between A15 and A7 cores. You can have only a single A15 online, together with two A7s, while the fourth pair is completely offline.
Misconception #5: The two clusters have their own frequency planes. This means A15 cores all run on one frequency while the A7 cores can be running on another. However, inside of the frequency planes, all cores run at the same frequency, meaning there is only one frequency for all cores of a type at a time.
Click to expand...
Click to collapse
Addition: I am not a Samsung fanboy by any means; however, the amount of incorrect information floating around about both of these flagships is starting to get annoying.
2nd addition: Read this as well, the big.LITTLE technology being used in the Octa is pretty amazing: big.LITTLE Processing
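The per-pair switching described in the quoted post can be pictured with a toy model: each of the four A15/A7 pairs independently runs its big core, its little core, or stays offline, so no more than 4 cores are ever online at once. The load thresholds below are invented purely for illustration and are not ARM's actual scheduling policy:

```python
# Toy big.LITTLE pair selection; threshold values are made up.
def pick_core(pair_load):
    """Decide which core of one A15/A7 pair is online for a given load (0..1)."""
    if pair_load == 0:
        return "offline"
    return "A15" if pair_load > 0.6 else "A7"

loads = [0.9, 0.3, 0.1, 0.0]                 # one load figure per pair
states = [pick_core(load) for load in loads]
print(states)                                # ['A15', 'A7', 'A7', 'offline']
print(sum(s != "offline" for s in states))   # 3 cores online, never more than 4
```

The point of the sketch: "8 cores" is an inventory, not a concurrency figure, which is why the battery-hog assumption in the opening post doesn't follow automatically.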
I hope that the overclocking or higher clock rate doesn't produce Moment-esque results.
Alsybub said:
I thought the s4 had the same processor as the One, but it was clocked to 1.9? I could be wrong. I wasn't really paying attention.
Sent from my HTC One using Tapatalk 2
Click to expand...
Click to collapse
In the US that is true; they are both S600s, with the S4 having a 0.2GHz higher clock speed. Many of the other S4s will have the Exynos Octa chip.
crawlgsx said:
In the US that is true: they are both Snapdragon 600s, with the S4 clocked 0.2GHz higher. Many of the other S4s will have the Exynos Octa chip.
Ah. I see. Different hardware for different regions. Like the One X.
Even though it has eight cores, it's probably complete overkill. Yet another bigger number to put on the marketing material. How many apps will actually use that? How many apps use four cores at the moment?
There have been articles arguing that multiple cores are more for the point of sale than for the end user. Even if you're signing up for a contract right now, I doubt much would be making use of it in two years' time. So the future-proofing argument is moot.
It'll be interesting to see. Of course the Galaxy builds of Android will use the cores; with things like the stay-awake feature and PiP they're useful. Outside of the OS I can't see them being necessary.
The "octa"-core processor is complete bullsh*t. Imo, two or four cores are perfectly fine as long as they optimise them and perfect the hardware; why stack eight cores when only four work at a time and no app will use all that power?
They should have focused on design instead, to make it look less like a toy phone and use a better finish.
Oh, the marketing...
Not an HTC or whatever fanboy, just stating my opinion.
rotchcrocket04 said:
I'd imagine this thread will get closed.
In the meantime, read this thread and then make a judgement because the "it uses more power so it sucks" mentality is just simply incorrect.
[Info] Exynos Octa and why you need to stop the drama about the 8 cores
Addition: I am not a Samsung fanboy by any means, however, the amount of incorrect information floating around about both of these flagships is starting to get annoying.
2nd addition: Read this as well, the big.LITTLE technology being used in the Octa is pretty amazing: big.LITTLE Processing
Very good read, thanks for taking the time to post it. Surprised no-one has mentioned that we need this in our Ones. Would certainly help with the battery.
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only four of the eight cores will ever be enabled at once.
The GPU on the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
Nekromantik said:
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only four of the eight cores will ever be enabled at once.
The GPU on the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
Benchmarks show the Adreno 320 keeps up nicely. You won't see any real-world differences besides a slightly lower benchmark score.
http://forum.xda-developers.com/showthread.php?t=2191834
Squirrel1620 said:
Benchmarks show the Adreno 320 keeps up nicely. You won't see any real-world differences besides a slightly lower benchmark score.
http://forum.xda-developers.com/showthread.php?t=2191834
Those are from the S600 version.
With its higher clock speed and Android 4.2 it will be slightly ahead.
No benchmarks from the Octa version yet.
Nekromantik said:
Those are from the S600 version.
With its higher clock speed and Android 4.2 it will be slightly ahead.
No benchmarks from the Octa version yet.
I'll just stick with the One and wait for the 4.2 update. By then we should have custom kernels to overclock it ourselves.
Here you go
Nekromantik said:
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only four of the eight cores will ever be enabled at once.
The GPU on the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
"Octa" is not gimmicky or just marketing.
Octa is the name of the SoC, and there is nothing wrong with how it was named.
There are three implementations that can be used, and one allows a maximum of eight cores running at the same time.
The GS4 doesn't use that implementation, but that doesn't mean the SoC cannot be "Octa". If you have a house with eight rooms but only want to open four of them, it's still an eight-room house.
hung2900 said:
"Octa" is not gimmicky or just marketing.
Octa is the name of the SoC, and there is nothing wrong with how it was named.
There are three implementations that can be used, and one allows a maximum of eight cores running at the same time.
The GS4 doesn't use that implementation, but that doesn't mean the SoC cannot be "Octa". If you have a house with eight rooms but only want to open four of them, it's still an eight-room house.
How do you know all eight can run at the same time? Has Samsung demonstrated that already? Any links?
Also, what would the speed be if all eight were running at the same time?
And did you see that an Intel dual core @ 2GHz beat the Exynos Octa in benchmarks! So eight cores running at slower speeds might not actually be very good. It might even slow things down further...
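The "more, slower cores can lose to fewer, faster ones" point can be made with a back-of-the-envelope Amdahl's-law calculation. The clock speeds and parallel fraction below are illustrative numbers, not measurements of any real chip:

```python
# Amdahl's-law sketch: effective speed relative to one 1 GHz core for a
# workload that is only partly parallelisable. Illustrative numbers only.

def effective_speed(ghz, cores, parallel_frac):
    serial = 1.0 - parallel_frac
    return ghz / (serial + parallel_frac / cores)

# A workload that is 50% parallelisable:
dual_fast = effective_speed(2.0, 2, 0.5)   # dual core @ 2 GHz
octa_slow = effective_speed(1.2, 8, 0.5)   # eight cores @ 1.2 GHz
print(round(dual_fast, 2))   # 2.67
print(round(octa_slow, 2))   # 2.13 -- the faster dual core wins
```

With a highly parallel workload the extra cores would win instead; the point is only that core count alone decides nothing.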
We recently demonstrated a dual core running at 3GHz at MWC in Barcelona. That chip was able to load games at crazy speeds: a game that took 15s to load on the existing Exynos quad core loaded in just 6s on our chip!
joslicx said:
We recently demonstrated a dual core running at 3GHz at MWC in Barcelona. That chip was able to load games at crazy speeds: a game that took 15s to load on the existing Exynos quad core loaded in just 6s on our chip!
And it used three times the energy to do it... Was that tested at all?
backfromthestorm said:
And it used three times the energy to do it... Was that tested at all?
It's all about bragging rights, really. Same as what Samsung is doing with the Octa.
The chip that could run at 3GHz could also run at 1GHz at just 0.6V (so consuming far less power than anything else on the market). A dual core at 1GHz is still good enough for mundane tasks like playing videos or browsing the internet, so in practice it would have been a very efficient solution. It was a real innovation. Sadly the company did not have the money to pour more funds into the program and has shut it down.
It was demonstrated at Mobile World Congress in Barcelona in February this year.
Anyway, the point is, we did not need an extra set of power-efficient cores like Samsung is adding. We ran the same cores at crazy high speeds and in an even crazier power-efficient mode! That's a very neat solution.
Here's a press link: http://www.itproportal.com/2013/02/25/mwc-2013-exclusive-dual-core-st-ericsson-novathor-l8580-soc-crushes-competition-benchmarks/
To quote the article:
A continuous running test monitored by an infra-red reader showed that the 3GHz prototype smartphone remained cooler as it uses less energy and in some scenarios, it could add up to five hours battery life in a normal usage scenario
hung2900 said:
"Octa" is not gimmicky or just marketing.
Octa is the name of the SoC, and there is nothing wrong with how it was named.
There are three implementations that can be used, and one allows a maximum of eight cores running at the same time.
The GS4 doesn't use that implementation, but that doesn't mean the SoC cannot be "Octa". If you have a house with eight rooms but only want to open four of them, it's still an eight-room house.
Actually, no. At least not in my opinion. Octa-core means eight CPU cores on one chip.
I would see it like this:
You have two houses next to each other on your lawn. Every house has four rooms, and you have to switch houses to open up the rooms. Just like the Exynos "Octa" has to, since it cannot run both CPU clusters at the same time.
If you were in one house with eight rooms, you still couldn't be in all eight rooms at once, but you could open the doors between them and walk freely into every room. Not with this implementation.
I wouldn't call the Exynos "Octa" an octa-core; it's a dual-CPU system with 2x4 cores, with the difference that regular desktop dual-CPU systems can use both CPU units at once, which the Exynos "Octa" cannot. Still, a dual quad system comes closer than a pure octa-core system.
This is kind of a hybrid. Nice technology for a mobile device, but at the same time somewhat unneeded and inefficient compared to regular quad-core systems. Even the Tegra 3 approach, with four active cores and one companion core for standby tasks, seems more efficient in terms of die space and resources.
Ah well, let's see how the supposed, so-called "octa-core" will score in the future...
Processor differences
OK, I know both processors are Snapdragon 600s, but why is the Galaxy S4's clocked at 1.9GHz while the HTC One's is clocked at 1.7GHz? Is it just an instance of Samsung overclocking the S600, or are they different variations of the same processor? I have done some research and cannot find a clear answer to this question, even on the Snapdragon website.
dawg00201 said:
OK, I know both processors are Snapdragon 600s, but why is the Galaxy S4's clocked at 1.9GHz while the HTC One's is clocked at 1.7GHz? Is it just an instance of Samsung overclocking the S600, or are they different variations of the same processor? I have done some research and cannot find a clear answer to this question, even on the Snapdragon website.
They should be identical. I think it's just a manufacturer choice, but it could also be related to thermals or battery.
Because Samsung took the higher-frequency chips, there is the possibility that they also get the "better" binned chips: lower voltage for the same frequency. But that's just an assumption.
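The binning point above can be illustrated with the standard CMOS dynamic-power relation, P ≈ C·V²·f: a chip that holds the same frequency at lower voltage draws noticeably less power. The capacitance, voltages and frequency below are made-up numbers for illustration:

```python
# Rough CMOS dynamic-power relation P = C * V^2 * f. A "better" binned
# chip needing less voltage at the same clock draws less dynamic power.
# All the values here are invented for illustration.

def dynamic_power(c_eff, volts, freq_ghz):
    return c_eff * volts ** 2 * freq_ghz

average_bin = dynamic_power(1.0, 1.10, 1.9)   # average chip at 1.10 V
good_bin = dynamic_power(1.0, 1.00, 1.9)      # good bin at 1.00 V
saving = 1 - good_bin / average_bin
print(f"{saving:.0%}")   # ~17% less dynamic power at the same 1.9 GHz
```

Because voltage enters squared, even a 0.1V difference between bins matters more than it looks.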

GPU and benchmarks

Hey everyone.
I'm a bit lost and don't know which to buy: the I9500 or the I9505.
So far I know that the Adreno 320 is fully OpenGL ES 3.0 compatible, while the PowerVR SGX544MP3 is not.
The Adreno 320 scores 4 FPS more than the PowerVR in the GLBenchmark 2.7.0 T-Rex test.
The PowerVR scores 1-2 FPS more in GLBenchmark 2.5 Egypt.
Both GPUs score about the same in the AnTuTu and Quadrant video tests, with the PowerVR slightly better for a few seconds (the Adreno drops to 30 FPS for 1-2 seconds of the test while the PowerVR stays constant at 50-60).
In AnTuTu's 3rd test (the one with the DNA code), the Adreno 320 stays at 30-40 FPS while the PowerVR scores a constant 60.
Both 3DMark and GLBenchmark show the PowerVR in the S4 weaker even than the Nexus 4 and some Chinese phones.
What's the deal? What the hell is happening? Is the PowerVR that weak in the new graphics tests while scoring well in the older ones?
Also, is there any OpenGL ES 3.0 benchmark so we can compare the Adreno 320 (fully OpenGL ES 3.0) with the PowerVR 544MP3 (OpenGL ES 2.0, but with some 3.0 features exposed through an API), to see what the score and quality are? I really want to see what that 3.0 API can do, as Imagination doesn't really say what it does. Will there be games or apps using only OpenGL ES 3.0 that we will have trouble running because of this old GPU?
I'm wondering: if an OpenGL ES 3.0-only game is released in a year, what happens on the S4 Octa? It won't be able to play it, right? I have no idea how OpenGL works in this respect, but I remember that a game requiring DirectX 10 will not run with DirectX 9.
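For what it's worth, version gating usually works much like the DirectX case mentioned above: the app compares the ES version the driver reports against what it requires. A hypothetical sketch (the version strings are examples; a real Android app would query the value via glGetString(GL_VERSION)):

```python
# Hypothetical sketch of how a game could gate on the OpenGL ES version
# a GPU reports. The version strings below are examples only.

def supports(reported, required):
    """Compare 'major.minor' ES version strings numerically."""
    parse = lambda s: tuple(int(x) for x in s.split('.'))
    return parse(reported) >= parse(required)

# An Adreno 320 reports ES 3.0; an SGX544MP3 reports ES 2.0.
print(supports('3.0', '3.0'))   # True  -> an ES 3.0-only game can run
print(supports('2.0', '3.0'))   # False -> it refuses to start
```

Vendor extensions can expose individual 3.0-style features on a 2.0 driver, but an app that hard-requires ES 3.0 would still bail out on the version check.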
The PowerVR really disappoints here. Samsung should have used the PowerVR Series6 "Rogue".
My opinion is that the Qualcomm scores very well; even my S3 is enough to play every single game, but the phone is short on RAM and that's why I'm replacing it now. Buying the Octa will cost me $150 more than the Qualcomm version, and I would need to send it overseas if I ever have a warranty problem. With those $150 I could buy two spare batteries and the Samsung S Band instead of getting the Octa. I want the Octa, but does this phone really deserve such attention with that old, rubbish PowerVR GPU? I don't have 4G in my area, so I don't care about LTE (it would be nice if I travel somewhere with 4G, but for me HSPA+ is fast enough), so the only things that count here are the CPU, the GPU and battery life. Battery life can be solved with an additional battery, so that leaves the GPU and the CPU. The A15 cores are very fast but can use a lot of energy, so I might get two days of battery life texting and calling but only two hours playing games and watching 1080p video, while with the A9 I would get something similar to the S3.
Can any developer or experienced guy here answer these questions?
Nobody?
I'm in the same situation; I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S4 and one with the Snapdragon version. They should run the same tests (Linpack, Vellamo, AnTuTu and more) and give us the results.
For OpenGL ES 3.0 I think it's better to have native support, not support via APIs. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and more. I find the Snapdragon more tweakable than the Exynos, but the PowerVR is still a good GPU.
Alberto96 said:
I'm in the same situation; I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S4 and one with the Snapdragon version. They should run the same tests (Linpack, Vellamo, AnTuTu and more) and give us the results.
For OpenGL ES 3.0 I think it's better to have native support, not support via APIs. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and more. I find the Snapdragon more tweakable than the Exynos, but the PowerVR is still a good GPU.
Totally agree with you. I don't get why people say the PowerVR is better. I see that it scores better than the Adreno in AnTuTu, but in GLBenchmark it's awful. This is my only worry right now: what happens if we put the two GPUs through a full OpenGL ES 3.0 test? Will it throw an error, or pass with a lower score? I don't care so much about the score, just whether it can pass the test at all. If it passes, I'm sold on the Octa.
Also, I found that the Octa supports LPDDR3 at 800MHz, which means 12.8GB/s of bandwidth, while the S600 uses LPDDR3 at only 600MHz or so (about 9.6GB/s).
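Those bandwidth figures follow from the usual DDR arithmetic: clock rate times two transfers per cycle times the bus width in bytes. The 8-byte (2x32-bit) bus below is an assumption, but it is the one that matches the quoted 12.8GB/s:

```python
# DDR bandwidth arithmetic: clock * 2 transfers/cycle * bus width.
# A 2x32-bit (8-byte) bus is assumed; this matches the 12.8 GB/s figure.

def ddr_bandwidth_gbps(clock_mhz, bus_bytes=8):
    return clock_mhz * 1e6 * 2 * bus_bytes / 1e9

print(ddr_bandwidth_gbps(800))   # 12.8 GB/s -- LPDDR3-1600 on the Octa
print(ddr_bandwidth_gbps(600))   # 9.6 GB/s  -- close to the S600 figure
```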
I just read (on an Italian forum) that the Exynos will be able to use all eight cores together with kernel 3.8.
So... I think I will buy the Exynos. I'm just waiting for a reply from a friend who bought his on Expansys USA. If he receives it and all is good, I will buy from that site. With Italian taxes (21%) and shipping it will cost about 730-740€.
Alberto96 said:
I just read (on an Italian forum) that the Exynos will be able to use all eight cores together with kernel 3.8.
Why would you need these eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at some specific tasks like rendering to texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I'd expect from the raw GFLOPS figures.
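The unified-shader point above can be made with a toy throughput model: with a fixed vertex/fragment split, a fragment-heavy scene bottlenecks on the fragment units while vertex units sit idle, whereas a unified pool keeps every ALU busy. The unit counts and workloads below are made up for illustration:

```python
# Toy illustration of unified vs. fixed-function shader pools.
# Work is in arbitrary units; unit counts are invented.

def fixed_split_time(vertex_work, fragment_work, v_units, f_units):
    # Each pool only processes its own stage; the slower pool sets the pace.
    return max(vertex_work / v_units, fragment_work / f_units)

def unified_time(vertex_work, fragment_work, units):
    # Every unit can work on either stage.
    return (vertex_work + fragment_work) / units

# A fragment-heavy scene (complex shaders on every surface, as in T-Rex):
fixed = fixed_split_time(10, 90, 4, 4)   # 4 vertex + 4 fragment units
pooled = unified_time(10, 90, 8)         # 8 unified units
print(fixed)    # 22.5 -- fragment units are the bottleneck
print(pooled)   # 12.5 -- the unified pool finishes much sooner
```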
Phobos Exp-Nord said:
Why would you need these eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at some specific tasks like rendering to texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I'd expect from the raw GFLOPS figures.
Well, when you play some heavy games you need all the cores. It's also useful to have all cores available while the phone is charging, without killing the battery.
I need a new phone because my Galaxy S I9000 is slow with new apps and Android versions. If I buy this, waiting for an S800 version is pointless for me. The CPU is fast; the GPU may not match the Adreno 330, but with an overclock we can boost performance a lot.
Dude, using all eight cores would simply melt the phone in your hands LOL. You'd be drinking an S4 cocktail LOL. A quad core is enough, but a GPU never is. The same thing is happening with PCs. I don't need huge FPS in T-Rex, just some solid reviews and opinions from people who really know these things... but so far only you two have answered (I won't claim yet that this forum is full of noobs LOL).
I want a new phone because of the S3's lack of RAM, even though it's smooth for me. I was happy to hear about the Octa version because I wanted to try something new, but I'm kind of lost now.
Alberto96, please let me know when your friend gets that I9500. I want to get it from Expansys too (I think we already talked about this in other threads). If I buy the I9505 I will get it from Amazon Italy, as it's cheaper there than anywhere else.
I'm just comparing:
I9500: 1 year of warranty (overseas)
I9505: 2 years of warranty (local)
I9500 = I9505 + 3 additional S4 batteries with an external charger
That's because 740€ = 625€ + 3 x 35€ batteries (and I'd still have money left for a Burger King and a Cola).
So... is it really worth the risk? Still nobody has answered me about OpenGL ES 3.0.
The S800 and Adreno 330 will not be in a Samsung device soon (maybe never), and 2.1-2.3GHz looks like too much for a mobile phone. We already have warming issues with the S4 (I even have them on my S3, with the phone getting warmer). Also, my laptop is a dual-core 2.1GHz AMD, for God's sake.
@Alberto96, I beg you, when your friend gets the phone, please test it and let me know what you think.
demlasjr said:
2.1-2.3GHz looks too much for a mobile phone
Not when playing Hi10P in software.
I do not know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application get dispatched to two different core domains, making it really expensive to exchange data between them. As each domain probably has its own cache subsystem, performance could drop even below running both threads together on the A7 domain.
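That cache-domain concern can be sketched as a toy cost model: two communicating threads pay a penalty whenever they sit in different clusters, because shared data must cross the interconnect. The cost numbers are invented purely for illustration:

```python
# Toy model of cross-cluster communication cost. Two threads in the same
# cache domain (both on A7s, say) hand data off cheaply; a handoff across
# the A15/A7 boundary is penalised. All costs are made-up numbers.

def sync_cost(placement_a, placement_b, base=1.0, cross_penalty=4.0):
    """Relative cost of one producer/consumer handoff between two threads."""
    same_domain = placement_a == placement_b
    return base if same_domain else base * cross_penalty

print(sync_cost('A7', 'A7'))    # 1.0 -- both threads share the A7 cache
print(sync_cost('A15', 'A7'))   # 4.0 -- handoff crosses the cluster boundary
```

Under this model a chatty pair of threads can indeed run slower split across clusters than co-scheduled on the slower cores, which is the worry raised above.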
Phobos Exp-Nord said:
Not when playing Hi10P in software.
I do not know the exact internal scheme of Exynos Octa, so it's easy for me to imagine the situation when two threads of single application will be dispatched to two different core domains, making it really hard to exchange the data between them, as probably each domain has its own cache subsystem, so the performance will drop even higher than with two threads on A7-domain together.
Yeah, you're right there. I don't have much knowledge of this profile as I don't watch anime, but in the S4's case it seems to depend more on the GPU than the CPU. I'm sure the Exynos Octa is able to run it, but not so sure about the PowerVR. I've read that Hi10P plays at anywhere from 15-20fps (watchable, but still not great) on a Tegra 3 quad core overclocked to 1.6GHz, so there is still hope.
demlasjr said:
I've read that Hi10P plays at anywhere from 15-20fps
That's for 720p. I just asked in another thread: at 1080p the S4 cannot play it smoothly enough with MX Player. It's not a question of resolution; it's the problem of using a file from a 1080p home collection without any extra effort.
We'll see; maybe an update will be released later for such issues. I think the GPU and CPU of both variants are capable of playing such videos.
Hey guys,
http://withimagination.imgtec.com/i...or-todays-leading-platforms#comment-880303396
jumping directly from OpenGL ES 2.0 to 3.0 would create a situation where app compatibility would be severely broken across devices. But most people update their devices every two years; by that time, PowerVR Series6 would be the dominant OpenGL ES 3.0 GPU generation shipping in most devices.
It is also important to remember that the PowerVR Series5XT GPU family has been successfully holding its own against recently released competing graphics solutions despite being released almost four years ago, which in itself is an amazing feat.
So... should we trust alexvoica and go forward with the PowerVR SGX544MP3 even though it lacks OpenGL ES 3.0? He said the road to OpenGL ES 2.0 adoption was long, but it wasn't as long as he claims. Now every single game uses OpenGL ES 2.0, and I'm sure there will soon be OpenGL ES 3.0-only games, and not in two years.
Take a look at this: http://gfxbench.com/compare.jsp?cols=2&D1=Samsung+GT-I9500+Galaxy+S4&D2=Samsung+GT-I9505+Galaxy+S4
