Mali-400 MP vs Adreno 220 - Galaxy S II General

Which is the better GPU, and why?

I'm not sure of the technical reasons; people may just be going off benchmarks, but the general consensus is that the Adreno 220 has better gaming performance.
However, unless you are planning on some hardcore gaming, the Mali-400 MP or GeForce ULP will be just fine.

The Mali-400 MP is, IMO, the faster GPU, but it lacks things needed to be a good all-round GPU. At the low level, some of its major 3D scores are even lower than the Adreno 205's: the Mali-400 is slower than the Adreno 205 in the geometric, common, and exponential tests, so the quality there is poor. It is also missing many texture compression formats, so compatibility is low. Most games will come up with a solution for that, but it takes time, and that time could eat into the life cycle of the GS2. The Adreno 220 will be a slightly slower GPU in synthetic tests, but with more compatibility, better quality at the low level, and more texture compression formats, and it will be compatible with all games from the start, since Adreno GPU games are already abundant in the market. So it's more like Samsung delivered the fastest GPU, but with major flaws. Here the Adreno 220 is like ATI and Nvidia, and the Mali-400 is like a generic GPU from some other company. And the Galaxy S II also coming with Tegra 2 would really hurt the Mali-400's compatibility, since the Mali will be missing from many devices, so the Mali-400 could be the one left out here.
Right now the game here is a faster GPU (by a small margin) vs. a more compatible GPU. Which is better? If you can wait, with no definite future, it's the Mali; if you want everything now and in the future, it's the Adreno.
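To make the texture-format point concrete, here is a rough sketch in Python of how a game engine typically picks a compressed texture format from the driver's GL_EXTENSIONS string. The extension names are real OpenGL ES extensions, but the per-GPU lists below are illustrative, not dumps from actual devices:

PREFERRED_FORMATS = [
    ("GL_IMG_texture_compression_pvrtc", "PVRTC (PowerVR)"),
    ("GL_AMD_compressed_ATC_texture", "ATC (Adreno)"),
    ("GL_EXT_texture_compression_s3tc", "S3TC/DXT (Tegra)"),
    ("GL_OES_compressed_ETC1_RGB8_texture", "ETC1 (baseline, no alpha)"),
]

def pick_texture_format(gl_extensions):
    # Return the best compressed format the driver advertises, else fall back.
    for ext, name in PREFERRED_FORMATS:
        if ext in gl_extensions:
            return name
    return "uncompressed RGBA (biggest download, most memory)"

# Illustrative extension strings, not dumps from real devices:
mali_exts = "GL_OES_compressed_ETC1_RGB8_texture GL_OES_depth24"
adreno_exts = "GL_AMD_compressed_ATC_texture GL_OES_compressed_ETC1_RGB8_texture"

print(pick_texture_format(mali_exts))    # ETC1 only -> games must repack textures or ship fallbacks
print(pick_texture_format(adreno_exts))  # ATC -> works out of the box with the many Adreno-targeted games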

With CF working on compatibility, I wouldn't be surprised if we're all playing Tegra Zone next month.

bilboa1 said:
With CF working on compatibility, I wouldn't be surprised if we're all playing Tegra Zone next month.
I agree.

Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5GHz Adreno 220)
Should give you a rough idea of what to expect.

_dsk_ said:
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5GHz Adreno 220)
Should give you a rough idea of what to expect.
No, it's not even close.
My Galaxy S II scores 42.2 fps in the same benchmark; the Adreno scores an impressive 38 fps, but that is with the CPU at 1.5GHz.

_dsk_ said:
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5GHz Adreno 220)
Should give you a rough idea of what to expect.
No, this one is not accurate.
Just look at the firmware: the SGS2 was running Android 2.3.1 at the time; it was not a retail device.
A retail SGS2 outperforms anything currently on GLBenchmark.

"Originally Posted by iwantandroid
I cried when I lerned this phone i got from tmobile didnt have Android. Can sum1 help me get Android on my new G1 and then tel me how to jailbroke it please"
LOL OMG

_dsk_ said:
Samsung's Galaxy S II Preliminary Performance: Mali-400MP Benchmarked (1GHz Mali-400)
Dual Core Snapdragon GPU Performance Explored - 1.5 GHz MSM8660 and Adreno 220 Benchmarks (1.5GHz Adreno 220)
Should give you a rough idea of what to expect.
These tests are kind of misleading, between the non-final device/software and the capped frame rate.
I'm a bit disappointed that it comes from AnandTech, since they usually try to have everything squared away on PCs ;-)

lol, you guys are very defensive about your phones, understandably.
What you should be able to ascertain, though, is that the 1GHz Mali benchmarks are decent and you can expect better performance with it clocked at 1.2GHz.
Conversely, you should be able to see that the Adreno at 1.5GHz, though impressive, will be less so when clocked at 1GHz as in the Sensation, which also has a higher-resolution screen.
I only provided the links so that people could make up their own mind by using the same logic.

Are you sure the Mali-400 is clocked at 1.2GHz?
Because when I overclocked my SGS2 to 1.5GHz I saw a 25% increase in computing performance, but almost no increase at all in graphics performance (using GL Benchmark), so I thought the frequencies of the two were totally unrelated.
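A quick back-of-the-envelope check in Python of the figures in that post suggests the graphics test is simply GPU-bound, consistent with the CPU and GPU clocks being set independently on this chip (that interpretation is mine, not confirmed in the thread):

cpu_stock_ghz, cpu_oc_ghz = 1.2, 1.5
cpu_gain = (cpu_oc_ghz / cpu_stock_ghz - 1) * 100
print(f"Expected CPU-bound gain: {cpu_gain:.0f}%")  # -> 25%, matching the computing score

# If GL Benchmark barely moves while CPU-bound scores gain ~25%, the test
# is limited by the GPU, whose clock did not change with the CPU overclock.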

I don't know what the clock speeds of the GPU are, but CPU speed bumps will also help with 3D performance.

_dsk_ said:
I don't know what the clock speeds of the GPU are, but CPU speed bumps will also help with 3D performance.
Well, in my case it did not. I guess a dual-core 1.2GHz CPU is not a bottleneck on a smartphone, lol.

I've heard there are FPS caps on the Galaxy line; I'm not sure how true this is, but benchmarks should usually see an increase when handsets are overclocked.

_dsk_ said:
I've heard there are FPS caps on the Galaxy line; I'm not sure how true this is, but benchmarks should usually see an increase when handsets are overclocked.
It's true for the SGS1 and SGS2 at least; the frame rate is capped between 56 and 66 fps depending on kernel/version etc.
Many benchmarks hit the cap (Quadrant, for example).
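A minimal sketch in Python (illustrative numbers only) of why a frame-rate cap makes capped benchmarks a poor way to compare GPUs or to measure overclocks:

def reported_fps(uncapped_fps, cap=60):
    # A vsync-style cap flattens anything above the cap.
    return min(uncapped_fps, cap)

for gpu, raw in [("GPU A, ~75 fps uncapped", 75), ("GPU B, ~95 fps uncapped", 95)]:
    print(gpu, "->", reported_fps(raw), "fps reported")
# Both report 60 fps, so a faster GPU or an overclock shows no gain in a
# capped benchmark once the cap is reached, which is why capped Quadrant
# runs can look identical.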

Related

Why Adreno GPU is better than Tegra 2 GPU.

Many people have said that SE should have put a Tegra 2 dual core chip inside the Xperia Play instead of the Snapdragon with Adreno 205.
In the real world the Adreno 205 was a much better choice for complex game effects and battery life.
This is a heavy read, but there are plenty of charts and pictures that tell a fairer story from a game developer's point of view:
http://blogs.unity3d.com/wp-content/uploads/2011/08/FastMobileShaders_siggraph2011.pdf
Here's hoping Tegra 3 is a much better effort.
First of all, Tegra 2 is not a GPU; it's a whole chipset. So a more valid comparison would be Snapdragon vs. Tegra 2, or Adreno vs. GeForce.
The Adreno 200 really was a poor GPU, and Qualcomm made a mess when they purchased the Adreno project from ATI, although I think we're all agreed that the jump from the Adreno 200 to the Adreno 205 was massive.
The Adreno 205 is easily on par with the GPU in any single-core chipset. I don't quite think it is a match for the 8-core ULP GeForce inside the Tegra 2.
And IMO Nvidia has proven with some of the Tegra 2 games that the mobile GeForce inside their chip is in a league of its own compared to our GPU, although I think the Adreno 220 is on par with the Tegra 2 GPU. The soon-to-be-released quad-core Tegra 3 comes with such an awesome GPU it will be hard to beat:
http://www.youtube.com/watch?v=cI-guAGGK3s
AndroHero said:
First of all, Tegra 2 is not a GPU; it's a whole chipset. So a more valid comparison would be Snapdragon vs. Tegra 2, or Adreno vs. GeForce.
I think I tried to say that in the post; the header could have been a bit clearer.
The Adreno 200 really was a poor GPU, and Qualcomm made a mess when they purchased the Adreno project from ATI, although I think we're all agreed that the jump from the Adreno 200 to the Adreno 205 was massive.
Agreed. I have tried rudimentary GPU benchmarking on all my phones; the Xperia Play would have been severely weakened if it had gone ahead with an Adreno 200-based SoC.
The Adreno 205 is easily on par with the GPU in any single-core chipset. I don't quite think it is a match for the 8-core ULP GeForce inside the Tegra 2.
And IMO Nvidia has proven with some of the Tegra 2 games that the mobile GeForce inside their chip is in a league of its own compared to our GPU, although I think the Adreno 220 is on par with the Tegra 2 GPU. The soon-to-be-released quad-core Tegra 3 comes with such an awesome GPU it will be hard to beat.
I also thought the Nvidia GPU would be much better, but after reading the PDF I don't think it is. It looks like to get the best from the Nvidia GPU you have to use the CPUs much more than with the Adreno 205, which will hit battery life. The Adreno also looks like it has some hidden tricks that help in more complex scenes.
Give the PDF a read.
From the (very little; it is a really technical paper) content I can extract, it seems that the Nvidia Tegra devices follow a "classic" approach and load many more things onto the CPU, while the Adreno and PowerVR (i.e., Apple's chip) follow a "smarter" approach, reducing the CPU load, loading the GPU instead, and using tricks.
I'd say that, if that is correct, it comes from Nvidia's legacy as a desktop PC GPU maker, and it makes sense that Nvidia is betting on getting multi-core devices out ASAP, since their approach is much more CPU-taxing and multiple cores help reduce the CPU stress.
Techdread said:
I think I tried to say that in the post; the header could have been a bit clearer.
Agreed. I have tried rudimentary GPU benchmarking on all my phones; the Xperia Play would have been severely weakened if it had gone ahead with an Adreno 200-based SoC.
I also thought the Nvidia GPU would be much better, but after reading the PDF I don't think it is. It looks like to get the best from the Nvidia GPU you have to use the CPUs much more than with the Adreno 205, which will hit battery life. The Adreno also looks like it has some hidden tricks that help in more complex scenes.
Give the PDF a read.
I did look at the PDF, but to be honest, it's a little over my head lol
Sent from my R800i using Tapatalk
Interesting results for the Adreno 205.
Shader Performance
•Normalized to iPad2 resolution
•From single color:
• 1.4ms iPad2
• 3.5ms XperiaPlay
• 3.8ms Tegra2
• 14.3ms iPhone3Gs
•To fully per-pixel bump spec:
• 19.3ms iPad2
• 18.4ms XperiaPlay
• 47.7ms Tegra2
• 122.4ms iPhone3Gs
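For anyone not used to the units: those figures are milliseconds of shading time per frame at iPad 2 resolution, so lower is better. A quick Python conversion of the numbers quoted above makes the gap easier to read (the "fps ceiling" is just 1000/ms and ignores every other per-frame cost):

full_bump_ms = {"iPad2": 19.3, "XperiaPlay": 18.4, "Tegra2": 47.7, "iPhone3Gs": 122.4}

for device, ms in full_bump_ms.items():
    # "fps ceiling" = frame rate if this shader were the only per-frame cost.
    print(f"{device}: {1000.0 / ms:.0f} fps ceiling")
# iPad2 ~52, XperiaPlay ~54, Tegra2 ~21, iPhone3Gs ~8

print(f"Tegra2 takes {47.7 / 18.4:.1f}x longer than the Xperia Play on this shader")  # ~2.6x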
AndroHero said:
The soon-to-be-released quad-core Tegra 3 comes with such an awesome GPU it will be hard to beat:
http://www.youtube.com/watch?v=cI-guAGGK3s
Holy cr*p that looks amazing!
When are the Tegra 3 and Adreno 220 coming out? Which will be the best? Tablet only, or on phones too?
FK1983 said:
Holy cr*p that looks amazing!
When are the Tegra 3 and Adreno 220 coming out? Which will be the best? Tablet only, or on phones too?
The Adreno 220 is already out with the dual-core Qualcomm chips:
http://www.youtube.com/watch?v=Ehfyxvh2W4k&feature=related
Although the game in the demo is Desert Winds, an Xperia Play (Adreno 205) exclusive.
Comparing dual-core Qualcomm chips to the Tegra is like comparing our current chip to the Samsung Hummingbird.
The former is more widely supported and better optimized, whereas the latter is not well supported; although it's supposed to be better on paper, its real-life performance isn't as good.
Sent from my R800
Logseman said:
From the (very little; it is a really technical paper) content I can extract, it seems that the Nvidia Tegra devices follow a "classic" approach and load many more things onto the CPU, while the Adreno and PowerVR (i.e., Apple's chip) follow a "smarter" approach, reducing the CPU load, loading the GPU instead, and using tricks.
That's the impression I got.
I'd say that, if that is correct, it comes from Nvidia's legacy as a desktop PC GPU maker, and it makes sense that Nvidia is betting on getting multi-core devices out ASAP, since their approach is much more CPU-taxing and multiple cores help reduce the CPU stress.
Desktop and handheld have vastly different power and heat requirements. Nvidia was probably rushing their dual-core SoCs to market; the lack of NEON in the initial shipments and the poor GPUs seem to confirm this.

SGSIII Mali 400 Drivers on the note!

The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same as the Adreno 220 GPU but with double the frequency. The frequency has nothing to do with it here, though: using these drivers gave them a HUGE performance boost at the STOCK frequency.
As we know, the Mali 400 GPU in the SGSIII is clocked at 400 MHz, but even if you clock the Mali 400 GPU in your Note (which has the same resolution) that high, you won't be able to reach that performance, which tells me that it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali 400 drivers from the SGSIII so we can use them on our phones?
This is not a question, so I think it belongs here rather than in the Q&A section; it's just a discussion of whether this is going to work or not.
Same driver, bigger screen = performance loss.
That is why Sammy set the CPU 200 MHz faster on the Note than on the S2.
Screen size has NOTHING to do with it; the resolution does, and that is the same on the SGSIII and the Note.
Also, that's why I said that even if you overclock the GPU to 400 MHz you still won't reach that performance, so it has to be the drivers.
The Note and SGSIII do indeed have different screen resolutions, the Note being 1280x800 while the SGSIII is 1280x720. Not much of a difference, though: basically 16:10 vs 16:9, respectively. I believe the new Mali 400 drivers will be in the next ROM update anyway.
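Putting numbers on that: a quick Python check of the two panels shows the Note only pushes about 11% more pixels per frame than the SGSIII, which is the ~100k figure mentioned a few posts down:

note_pixels = 1280 * 800   # Galaxy Note panel: 1,024,000 pixels
sgs3_pixels = 1280 * 720   # Galaxy S III panel: 921,600 pixels

print(note_pixels - sgs3_pixels)                                            # 102400 extra pixels (~100k)
print(f"{(note_pixels / sgs3_pixels - 1) * 100:.1f}% more pixels to fill")  # ~11.1%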
Hell Guardian said:
The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same as the Adreno 220 GPU but with double the frequency. The frequency has nothing to do with it here, though: using these drivers gave them a HUGE performance boost at the STOCK frequency.
As we know, the Mali 400 GPU in the SGSIII is clocked at 400 MHz, but even if you clock the Mali 400 GPU in your Note (which has the same resolution) that high, you won't be able to reach that performance, which tells me that it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali 400 drivers from the SGSIII so we can use them on our phones?
This is not a question, so I think it belongs here rather than in the Q&A section; it's just a discussion of whether this is going to work or not.
Well, if they are exactly the same, just at different clock speeds, then I would think they should indeed work.
This is interesting and I certainly hope it does. Not that the GPU is lacking at 400 MHz or even less, but who doesn't like more performance for free?
Muskie said:
The Note and SGSIII do indeed have different screen resolutions, the Note being 1280x800 while the SGSIII is 1280x720. Not much of a difference, though: basically 16:10 vs 16:9, respectively. I believe the new Mali 400 drivers will be in the next ROM update anyway.
I know that, but that difference is not major by any means; it won't affect performance that much if they both run at the same frequency.
shaolin95 said:
Well, if they are exactly the same, just at different clock speeds, then I would think they should indeed work.
This is interesting and I certainly hope it does. Not that the GPU is lacking at 400 MHz or even less, but who doesn't like more performance for free?
My thoughts exactly. If the folks on the Sensation did it, why can't we?
Link to the drivers that were extracted from the One S:
http://forum.xda-developers.com/showthread.php?t=1643472
Just check the replies to see the performance boost. This is the EXACT same situation as with the Note and the SGSIII GPU.
Wow, that's a good boost.
nex7er said:
Wow, that's a good boost.
I think if Note users can get that kind of boost on their phones, it will eliminate ANY kind of lag in the UI and it will be amazingly smooth. It would also give a huge boost to SGSII users.
If this really happens and it does work, what about the battery life? It could be poorer, I think.
In theory, I see where you're going with this, and in theory it sounds plausible. However, something that I think has been overlooked is the process design of the new S3's chipset vs. the one found in the current-generation S2/Note (32nm vs. 45nm). It's entirely possible that the only reason Samsung is able to run the Mali-400 at 400 MHz is that the 32nm process is just that much more efficient, such that you can safely run at 400 MHz using the same power as you would running at 266 MHz on the 45nm process.
I just get the feeling that trying to push the 45nm process up to 400 MHz might simply melt the silicon (or at least gobble your battery life in one gulp!). Call me defeatist if you have to, but I remain skeptical until I see evidence to the contrary.
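The worry can be roughed out with the usual dynamic-power relation P ≈ C·V²·f. The Python sketch below uses made-up capacitance and voltage values purely to illustrate the shape of the argument; the real figures for these chips are not in this thread:

def dynamic_power(c_eff, volts, freq_hz):
    # Rough dynamic power: P ~ C * V^2 * f (static/leakage power ignored).
    return c_eff * volts ** 2 * freq_hz

c = 1e-9  # effective switched capacitance, placeholder value
p_45nm_266 = dynamic_power(c, 1.10, 266e6)        # assumed 45nm Mali-400 at 266 MHz
p_45nm_400 = dynamic_power(c, 1.20, 400e6)        # same silicon pushed to 400 MHz, likely needing more voltage
p_32nm_400 = dynamic_power(c * 0.8, 1.00, 400e6)  # assume 32nm cuts capacitance and voltage

print(f"{p_45nm_400 / p_45nm_266:.2f}x power at 400 MHz on the old process")   # ~1.79x
print(f"{p_32nm_400 / p_45nm_266:.2f}x power at 400 MHz on the 32nm process")  # ~0.99x, roughly break-even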
I run my Galaxy Nexus with the GPU clocked to 512 MHz (standard is 308 MHz), and that chip also uses the 45nm process.
I've been running it like that for the last three months with no issue, and game fps is greatly improved.
Are there any kernels at all that even support overclocking the GNote GPU?
Very interesting. I'd like to see this investigated further, for sure!
Screen size has nothing to do with it... on the Note we've got about 100k more pixels (1280x800 vs 1280x720 is roughly 100k more)... and the S3 has more cores in its Mali GPU... but yeah, I think the drivers would get us more performance.
lyp9176 said:
If this really happens and it does work, what about the battery life? It could be poorer, I think.
The SGS3 seems to have decent battery life.
resistant said:
Screen size has nothing to do with it... on the Note we've got about 100k more pixels (1280x800 vs 1280x720 is roughly 100k more)... and the S3 has more cores in its Mali GPU... but yeah, I think the drivers would get us more performance.
After some digging, I found that the GPU in the Exynos 4210 (SGS2/Note) and the 4412 (SGS3) is exactly the same Mali-400 MP4 (same number of GPU cores)! The only difference is that the 4412's GPU can go up to 400 MHz, which is doable on our GPU too and has already been done on the SGS2. The main difference here is the four CPU cores helping the GPU. I'm skeptical that the new drivers will do much, if anything, for performance. Oh, and let's not forget that the Adreno GPU drivers are written by Qualcomm, and they can't do anything right, so the updated drivers may just be better written (or at least less buggy) than the old ones!
Manya3084 said:
I run my Galaxy Nexus with the GPU clocked to 512 MHz (standard is 308 MHz), and that chip also uses the 45nm process.
I've been running it like that for the last three months with no issue, and game fps is greatly improved.
It has been proven to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
The game frame rate improvement is simply due to 'force GPU rendering' being active.
Sent from my GT-I9300 using xda premium
Mahoro.san said:
The SGS3 seems to have decent battery life.
That is due to the new processor voltage and the low idle drain of the CPU.
Sent from my GT-I9300 using xda premium
GR36 said:
It has been proven to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
The game frame rate improvement is simply due to 'force GPU rendering' being active.
Sent from my GT-I9300 using xda premium
Was this during kernel development in the Gingerbread days, or with the current ICS kernels?
Sent from my Galaxy Nexus using Tapatalk 2
Maybe...
Clocking the GPU at 400 MHz would give a boost in performance, but at the cost of battery life... and it would also make the phone really hot, which is not ideal... just wait a little while and see how the S3 performs under those conditions...

What's the GPU clock of Exynos used in S3?

I'm wondering: was the GPU clock dropped from the initially stated 533 MHz?
Sorry, I'm asking about the S4, not the S3.
GPU: PowerVR SGX544MP3 @ 533 MHz
Source: http://en.wikipedia.org/wiki/Exynos
demlasjr said:
GPU: PowerVR SGX544MP3 @ 533 MHz
Source: http://en.wikipedia.org/wiki/Exynos
Holy ****, I meant to ask about the S4.
Hahahaha... you asked, I answered... LOL
- PowerVR SGX544MP3 GPU; 3 cores, 533 MHz
- Adreno 320; 4 cores, 450 MHz
Now I hope you won't tell me that you were talking about the Nexus 4, LMAO.
So, while the A15 cores were slowed down a bit from the initial specs, the GPU runs at the same frequency as announced. Thank you.
Phobos Exp-Nord said:
So, while the A15 cores were slowed down a bit from the initial specs, the GPU runs at the same frequency as announced. Thank you.
No, the A15 cores were not slowed down. It was rumored, and even released, with a max frequency of 1.6 GHz, while in Korea it will be clocked at 1.8 GHz.

Which one do you suggest when buying the S4: Snapdragon 600 or Exynos 5 Octa?

I am planning to buy the S4, but I'm not sure which is better:
Snapdragon 600
1.9 GHz
Quad-core Krait 300
or
Exynos 5 Octa
1.6 GHz octa-core
Quad-core Cortex-A15 / quad-core Cortex-A7
4G and installing ROMs are not important to me.
Which is better in terms of performance?
Which is better for gaming?
Which is better for battery life?
Octa for all.
tuxonhtc said:
Octa for all.
why?
an9093 said:
why?
The processor is stronger than the Snapdragon's.
Also, the Exynos version will be the global version, so you will get updates in time.
But the PowerVR SGX and the Adreno 320 offer about the same results as GPUs.
As a result, I recommend you buy the Exynos 5 Octa.
any other suggestions?
an9093 said:
I am planning to buy the S4, but I'm not sure which is better:
Snapdragon 600
1.9 GHz
Quad-core Krait 300
or
Exynos 5 Octa
1.6 GHz octa-core
Quad-core Cortex-A15 / quad-core Cortex-A7
4G and installing ROMs are not important to me.
Which is better in terms of performance?
Which is better for gaming?
Which is better for battery life?
Performance-wise, the Octa version will win on the CPU side and be on par on the GPU side. For gaming, the Snapdragon will probably win, because it seems to be becoming the standard on high-end Android devices. Battery will also be a tie: the Octa wins in low-demand tasks thanks to its A7 cores, and the S600 wins in more demanding tasks because the Krait consumes less than the A15.
Sent from my iPhone 5 using Tapatalk
any other suggestions?
an9093 said:
any other suggestions?
The above suggestions are not enough?
Sent from my HTC Desire X using xda app-developers app
The Exynos 5 sweeps the floor with the Snapdragon.
nitinvaid said:
The above suggestions are not enough?
Sent from my HTC Desire X using xda app-developers app
Not enough, I want more.
Blackwolf10 said:
The Exynos 5 sweeps the floor with the Snapdragon.
What do you mean?
an9093 said:
What do you mean?
Better CPU, same GPU, better battery life, and faster updates. The only things the Snappy beats it on are AOSP ROM support and 4G.
an9093 said:
What do you mean?
Given Samsung's record of making sources available, you'd be better off with the Snapdragon version, because Qualcomm will release their sources well before Samsung does. Consequence:
development for the i9505 (Snapdragon version) will take off, while the Exynos version's development will be slower...
f.
Blackwolf10 said:
Better CPU, same GPU, better battery life, and faster updates. The only things the Snappy beats it on are AOSP ROM support and 4G.
Stop saying the GPU is the same! The Adreno 320 sweeps the floor, and even the windows, with the ****ing PowerVR!!!
But the Octa sweeps the floor with the Qualcomm.
Both are important, CPU and GPU, but we have enough CPU in both. What sense does it make to have a huge CPU if you can't play or use some apps because of an old GPU without OpenGL ES 3.0?
demlasjr said:
Stop saying the GPU is the same! The Adreno 320 sweeps the floor, and even the windows, with the ****ing PowerVR!!!
But the Octa sweeps the floor with the Qualcomm.
Both are important, CPU and GPU, but we have enough CPU in both. What sense does it make to have a huge CPU if you can't play or use some apps because of an old GPU without OpenGL ES 3.0?
You changed your comment to something more neutral, so I guess I can say that you're right, but it's still awesome to look like a boss in AnTuTu benchmarks.
Blackwolf10 said:
You changed your comment to something more neutral, so I guess I can say that you're right, but it's still awesome to look like a boss in AnTuTu benchmarks.
Like a boss in AnTuTu while playing Angry Birds, while Snapdragon 600 owners cry about not being kings of the hill but play the new, high-graphics games.
I found one more thing which makes the Exynos Octa better: the RAM in the Exynos is LPDDR3 at 800 MHz, which supports 12.8 GB/s of bandwidth, while the S600 has LPDDR3 at 600 MHz (? not sure), which is around 9.x GB/s (I can't find this information).
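The 12.8 GB/s figure falls out of the standard peak-bandwidth formula if you assume the commonly quoted 2x32-bit LPDDR3 interface (the bus width is my assumption, not something stated in this thread); a quick Python check:

def peak_bandwidth_gb_s(clock_mhz, bus_bits):
    # Peak theoretical bandwidth = clock * 2 (double data rate) * bus width in bytes.
    return clock_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

print(peak_bandwidth_gb_s(800, 64))  # Exynos 5 Octa claim: 800 MHz, 2x32-bit -> 12.8 GB/s
print(peak_bandwidth_gb_s(600, 64))  # Snapdragon 600 at 600 MHz (if correct) -> 9.6 GB/s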
demlasjr said:
Like a boss in AnTuTu while playing Angry Birds, while Snapdragon 600 owners cry about not being kings of the hill but play the new, high-graphics games.
I found one more thing which makes the Exynos Octa better: the RAM in the Exynos is LPDDR3 at 800 MHz, which supports 12.8 GB/s of bandwidth, while the S600 has LPDDR3 at 600 MHz (? not sure), which is around 9.x GB/s (I can't find this information).
Oh, and Angry Birds Friends is releasing soon.
Most of the RAM is wasted on S-apps and features anyway, and don't forget the Wolfson audio chipset in the Exynos.
Blackwolf10 said:
Oh, and Angry Birds Friends is releasing soon.
Most of the RAM is wasted on S-apps and features anyway, and don't forget the Wolfson audio chipset in the Exynos.
I know. Even so, I'm a big fan of the Yamaha grand piano (I own one), so if the audio chip inside the Qualcomm is a Yamaha, that isn't a problem for me.
The only thing that makes me angry about the PowerVR 544MP3 junk is OpenGL ES 3.0. I'm not 100% sure of its importance, but I'm sure that many games with this support will be released soon, and the PowerVR has an OpenGL ES 2.0 API that gives "extended support of OpenGL ES 3.0. However, if you want full 3.0 compliance choose PowerVR 6X Rogue".
WTF does that extended API even do? I want to see an OpenGL ES 3.0 benchmark (GLBenchmark promised to release one) of both the Adreno 320 and the PowerVR 544.
demlasjr said:
Like a boss in AnTuTu while playing Angry Birds, while Snapdragon 600 owners cry about not being kings of the hill but play the new, high-graphics games.
I found one more thing which makes the Exynos Octa better: the RAM in the Exynos is LPDDR3 at 800 MHz, which supports 12.8 GB/s of bandwidth, while the S600 has LPDDR3 at 600 MHz (? not sure), which is around 9.x GB/s (I can't find this information).
What is bandwidth, and how does it make the Exynos stronger?
The Exynos seems to be overheating (see the relevant thread in Q&A).
A question once and for all: DOES the 9500 support LTE/4G or NOT!? Because that's a deciding factor in my purchase. I live in Denmark. I mean, a definite yes or no should be possible, right?
Overheating + no LTE will definitely sway me towards the 9505, but it still seems to be too early to tell.

How does the speed/performance compare to the 2013 Nexus 7?

Just wondering if this is a step up from what I am currently using as my daily Android device. Alright, after a bit of reading, my best guess is about a 25% increase in performance with the Redmi Note 2.
PowerVR G6200 vs Adreno 320:
The PowerVR chip was ranked 8th best for 2015, compared to 14th for the dated Adreno 320.
The CPU clock is 2000 MHz compared to 1500 MHz.
However, benchmarks on AnTuTu seem to show it scoring twice as high.
I was hoping for a little more performance, but for $110 and a far superior GPS/camera I'll take it.
I've never heard any complaints on here about performance on the RN2; in fact, people have been underclocking the CPU to 1.2-1.5 GHz and still had no lag. It's not as good as the SD 810 and not nearly as good as the Adreno 430, but there aren't really any devices this cheap with more powerful hardware.
