Galaxy S SGX540 GPU. Any details up 'till now? - Galaxy S I9000 General

Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90 Mpolys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see declarations that the SGX540 can do 90 million polygons per second, and on the other side I see stuff like "twice the performance of the SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.

I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs. the iPhone's SGX535. The only data I can find suggests these two GPUs are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than Snapdragon with the Adreno 200 GPU. I don't know if Adreno supports TBDR; I just know it's derived from ATI's Xenos architecture. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary given all the memory efficiencies between the Cortex A8 and TBDR on the SGX540.

thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I think that's the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.

I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.

Having tried a few phones, the speed in games is far better, with much higher fps. There is one problem though: we might have to wait for any games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held back the Xbox; only now and then did an Xbox-only game come out that really made use of its power. Years later the consoles changed places, and the 360 held the PS3 back (don't start on which is better lol): the PS3 has to make do with 360 ports, but when it has a game made just for it you really get to see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I can't wait to see what it can do in future or what someone can port to it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips), something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; guess the rest of us have to wait for firmware to sort this.

thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
smart move sammy

voodoochild2008-
I wouldn't say we'd have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels/sec).
Even the Motorola Droid (SGX530 at 110MHz, about ~9 Mpolys and ~280 Mpixels at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware isn't being pushed to its limit yet, but weaker devices should be hitting theirs soon.
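To put those fillrate numbers in perspective, here's a rough back-of-the-envelope sketch (my own toy calculation, not from any vendor doc; the overdraw factor is an assumption): given a pixel fillrate and a screen resolution, you can estimate an upper bound on frame rate.

```python
# Upper bound on FPS from pixel fillrate alone, ignoring shading
# and geometry cost. Overdraw = how many times each screen pixel
# gets drawn per frame (assumed value; varies wildly per game).

def max_fps(fillrate_mpix_per_s, width, height, overdraw=2.5):
    pixels_per_frame = width * height * overdraw
    return fillrate_mpix_per_s * 1_000_000 / pixels_per_frame

# Snapdragon-class 133 Mpixels/sec on a WVGA (800x480) screen:
snapdragon_cap = max_fps(133, 800, 480)
# Original Droid's SGX530 (~280 Mpixels/sec) on its 854x480 screen:
droid_cap = max_fps(280, 854, 480)
```

Even modest overdraw eats the budget quickly: at an overdraw of 5 the Snapdragon figure drops to about 69fps, and fill-heavy games push overdraw much higher than that.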
bl4ckdr4g00n- Why the hell should we care? I don't see any problem with 2D content and/or videos, everything flies at lightspeed.

Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now that firms are heading more towards Android; I did read about one big firm in the USA dropping marketing for Apple and heading to Android. Well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone did a 1-megapixel photo when others were on 3 megapixels, then it had no video when most others did, then they hyped it when it moved to a 3-megapixel camera and did video. OK, I'm going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling and good on them for that, but then they just sat back and started to count the money as others moved on. Oh, and when I bought my Galaxy the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for? lol

Wait, so what about the Droid X vs the Galaxy S GPU? I know the Galaxy S is way more advanced spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?

The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it runs at the stock 200MHz (or at least 180MHz).
At that speed it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.

The 535 is a downgrade from the 540. The 540 is the latest and greatest from the PowerVR line.
Samsung did not cost-cut; they've in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.

Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly higher clocked.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D power!

This worries me as well... Seems like it might not be as great as we thought. HOWEVER, again, this is a new device that might be fixed in firmware updates. Because the hardware is obviously stellar, there must be something holding it back.
Pika007 said:
The droid X still uses the SGX530, but in the droid x, as opposed to the original droid, it comes in the stock 200mhz (or at least 180)
At that state it does 12-14Mpolygons/sec and can push out 400-500Mpixels/sec
Not too shabby
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact it's 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?

gdfnr123 said:
wait so what about the droid x vs the galaxy s gpu?? i know the galaxy s is way advanced in specs wise... the droid x does have a dedicated gpu can anyone explain??
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip in the original Droid can overclock to 1.2GHz from, what, 550MHz?
How about the CPUs in the Droid X and Galaxy S? Did anyone compare those chips? Which can overclock higher, and which is better overall?
Sorry about the poor English. Hope you guys can understand.

The CPU in the Droid X is a stock Cortex A8 running at 1GHz. The Samsung Hummingbird is a specialized version of the Cortex A8, tuned by Intrinsity, running at 1GHz.
Qualcomm likewise did a complete redesign of the Cortex A8 for the 1GHz Snapdragon. While the original A8 could only be clocked at around 600MHz with a reasonable power drain, the reworked versions can be clocked higher while maintaining better power characteristics.
An untouched Cortex A8 can do more at the same frequency than a specialized stripped-down A8.
If anything, the Samsung Galaxy S is better balanced, leveraging the SGX540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.

TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark. Plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
I can assure you there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo for all devices, let all devices shake off teething troubles of any kind, and test again, with more than one benchmark.

Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious of a benchmark, plus, i think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the DROID X has 2D acceleration by the GPU.
I can assure you- There is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say- let's wait for froyo for all devices, let all devices clear from "birth ropes" of any kind, and test again. with more than one benchmark.
The SGS might be falling behind in I/O speeds... It is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...

Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy, but a whole lot easier on the battery.

thephawx said:
At the end of the day. You really shouldn't care too much about obsolescence. I mean the Qualcomm Dual-core scorpion chip is probably going to be coming out around December.
Smart phones are moving at a blisteringly fast pace.
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is because all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.

TexUs said:
Smartphones are, but batteries aren't.
IMO the only reason we haven't had huge battery issues is because all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if capacity stays at 1500mAh or less.
A dual-core CPU could be fast, but power-hungry as well.

Related

OpenGL ES 2.0 (better games) support on N1? When?

I don't really play a lot of games on my phone, but I have tried a number of them, such as Abduction, Robo Defense, Wave Blazer, and Speed Forge, and it's getting visually boring. I wonder if Android will ever get real OpenGL ES 2.0 games like the iPhone has; for example, Need for Speed and Street Fighter 4.
Don't get me wrong, I love my phone very much, but from time to time everyone craves more. I mean, it's not really a hardware issue, since the N1 has better hardware specs than our friendly Apple counterparts. Come on, we want NEON floating-point support on our N1... don't you agree?
I can't wait to see the games improve.
Hopefully after the Google I/O conference on May 19th. I read somewhere that Google is going to make gaming much friendlier on Android. Until then, try Raging Thunder 2 and Asphalt 5.
I really do miss my iPhone games though :/
I made a post similar to this a while back. I'm also seeking better games. There are a couple of iPhone games I hope get ported to Android.
I hope they have games better than the iPhone...especially considering the hardware on this thing -man, it has better specs than some of the first computers I had!! haha
erebusting said:
I hope they have games better than the iPhone...especially considering the hardware on this thing -man, it has better specs than some of the first computers I had!! haha
Unfortunately this specifically won't happen....we may have the far superior CPU, but we have no GPU....graphical rendering is all done CPU-side; whereas the iphone has a GPU chip in it. Our CPU is pretty ridiculously powerful, such that it performs graphically about 80% of what the iphone is capable of, but we are limited by the lack of a GPU onboard.
MaximReapage said:
Unfortunately this specifically won't happen....we may have the far superior CPU, but we have no GPU....graphical rendering is all done CPU-side; whereas the iphone has a GPU chip in it. Our CPU is pretty ridiculously powerful, such that it performs graphically about 80% of what the iphone is capable of, but we are limited by the lack of a GPU onboard.
What are you talking about? Of course the Nexus One does have a GPU, capable of OpenGL ES 1.1 and 2.0.
MaximReapage said:
Unfortunately this specifically won't happen....we may have the far superior CPU, but we have no GPU....graphical rendering is all done CPU-side; whereas the iphone has a GPU chip in it. Our CPU is pretty ridiculously powerful, such that it performs graphically about 80% of what the iphone is capable of, but we are limited by the lack of a GPU onboard.
Dude, you're completely wrong. The N1 has a separate chip for 3D rendering, a separate HARDWARE chip.
It's NOT CPU-based.
The Snapdragon processor has a dedicated GPU called Adreno 200. A dedicated GPU is a requirement for Flash 10.1 and the reason why some older Android devices will not be able to have it. The reason why some games on the iPhone look better than on the N1 is that the N1 has a lot more pixels to push. The N1 can process more polys/second than the iPhone, but because there are fewer pixels, the iPhone can render some games better than the N1.
It would be nice to see a software update that optimizes the GPU, or perhaps a new Cyanogen ROM.
jlevy73 said:
The snapdragon processor has a dedicated GPU called Adreno 200. A dedicated GPU is a requirement of Flash 10.1 and the reason why some older Android devices will not be able have it. The reason why some games on the iPhone look better than the N1 is because the N1 has a lot more pixels to push. The N1 can processes more Polys/second then iPhone but because there are less pixels the iPhone can render some games better than the N1.
Would be nice to see a software update the optimizes the GPU or perhaps on a new Cyanogen Rom
Awww, I thought we had the Hummingbird GPU... I must have misread something somewhere. Oh no wait, that was the Samsung Galaxy S. It's much better than the Snapdragon; don't know by how much, but it is better. 3 times better...
Why did Google render a hummingbird in the video demonstrating the graphics processing of the Nexus? -.-
Don't forget that any games designed for the Marketplace need to run on older versions of Android as well as older devices (such as the Dream). Because of that developers need to choose between making apps backwards compatible for maximum customers or writing apps better on only specific high-end devices (N1).
Eclair~ said:
Awww I thought we had the Hummingbird GPU.. I must have incorrectly read something somewhere. Oh no wait, that was the Samsung Galaxy S. It's much better than the Snapdragon, don't know about much, but it is better. 3 times better..
Why did Google render a Hummingbird in the video demonstrating the graphics processing of the Nexus -.-
Yep, the Samsung Galaxy S has a PowerVR SGX540 = 90 million triangles/sec, versus the Nexus One = 22 million triangles/sec.
Not sure why Google did what they did, but Samsung is using their new S5PC110 application processor. This processor pairs an ARM Cortex-A8 core with a PowerVR SGX540 GPU.
andythefan said:
Don't forget that any games designed for the Marketplace need to run on older versions of Android as well as older devices (such as the Dream). Because of that developers need to choose between making apps backwards compatible for maximum customers or writing apps better on only specific high-end devices (N1).
In some respects that's true, but look at Asphalt 5 for example. The game was designed with the iPhone in mind, and when run on a stock N1 it lags really badly. I think a lot of these game developers make one version instead of platform-specific ones.

Is the Galaxy S GPU really that powerful?

I have heard that the Galaxy S GPU can do 90M triangles/sec. Is that true? Some sources claim it only does 28M tri/sec ( http://en.wikipedia.org/wiki/PowerVR ), and the higher-end SGX545 does 40M, so how does the SGX540 do 90M?
hoss_n2 said:
i have heared that galaxy s Gpu can give 90M triangles/sec is that true as some sources claming that it only gives 28M tri/sec http://en.wikipedia.org/wiki/PowerVR , and the higher one sgx 545 gives 40 m so how the sgx 540 gives 90M
I don't think the number listed on Wikipedia is 'triangles' per second... it just says polys... so it could be a different shape that's harder to render?
Just my guess.
Besides, if the claimed 90M is actually the 28 million, then don't worry, because the same goes for the iPhone's GPU (the 535): it claims around 22M and wiki lists it as 14M.
Aaannnnddd if you're worried about the GPU, take comfort that no 3D benchmark I've seen has even slowed it down so far, and you can see tons of videos on YouTube of Galaxy S phones face-rolling every other Android device in gaming FPS benchmarks. Even if it isn't as amazing as the claimed numbers, there's no doubt it's the best on the market at the moment, and by quite a lot too!
I'm not going to pretend that I read the comment thoroughly, but I've read a similar question. The person who seemed to know what they were talking about said that essentially the 90M is a "theoretical number" and that about half of that is what the phone should? can? will? potentially? do... (skimming, memory, and probably comprehension make that a very difficult word to fill in accurately)... but this is how all manufacturers report their graphics capabilities, at least in smartphones, though I'll assume the same holds true for desktop/laptop graphics cards.
So, while the number is definitely overstated, it's within the standard reporting convention... and relative to other numbers, still accurate (2x as many triangles is 2x as many, whether everything gets cut in half or by a factor of 3).
*I'll remove my fuzzy language when someone better informed than me responds*
I also read a good article (don't know where it is now, sorry) all about how the GPU relies heavily on the memory and the bus between them, and how, for example, one phone could run the same GPU as another yet have much worse performance because it uses less, or slower, memory. Apparently our SGS has done pretty well in all departments.
To untangle the confusion:
Triangles = "polys" (polygons).
The SGS does nowhere near 90M, but on the other side, none of the other phones are doing what their manufacturers claim either.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months, with ever-changing results. No official specs are available from ImgTec.
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10, which uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Pika007 said:
...
One thing i CAN tell you is that the GPU on the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a refrence the Compal NAZ10 that uses the ever-glorified Nvidia TEGRA 2, and the iPhone 4 (SGX535)
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Appearently someone over at Sammy did something right.
Extremely right.
Well, one important factor is the pixel count in the glbenchmark link you sent. The iPhone 4 and iPad share the same GPU. The difference in pixels is about 20%, and hence the difference between those two.
Let me make one ugly calculation to map the SGS's score to the iPhone 4's. The pixel-count ratio between the SGS and the i4 is a factor of 0.625. That would make the SGS score 1146 at the iPhone resolution (or 1723 for the i4 at 800x480). Of course there are more factors involved, but this is the best estimate I can make at the moment.
The difference turns out not to be that great after all.
I knew this argument was going to pop up soon enough, so I'll add one VERY important factor:
Score doesn't decrease proportionally with an increase in resolution.
For example, doubling the resolution won't give half the score; more like ~70%.
Try running 3DMark on your PC at different resolutions, you'll see some interesting results.
Personally, GLBenchmark 1.1 for me is just a very crude example, for general demonstrations. It's not really close to being accurate.
I'm waiting for GLBenchmark 2.0, which should be a great tool to effectively compare the devices.
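To make that concrete, here's a small sketch of both mappings (my own toy code; the ~70%-per-doubling figure is from this post, the 0.625 pixel ratio is from the post above, and the 1834 input score is just a made-up placeholder):

```python
import math

# Doubling the pixel count is assumed to drop the score to ~70%,
# i.e. score ~ pixels**ALPHA with ALPHA = log2(0.7) ≈ -0.515.
ALPHA = math.log(0.7, 2)

def rescale(score, pixels_from, pixels_to):
    """Map a benchmark score between resolutions, sublinearly."""
    return score * (pixels_to / pixels_from) ** ALPHA

sgs = 800 * 480   # Galaxy S: 384,000 px
ip4 = 960 * 640   # iPhone 4: 614,400 px  (ratio 0.625)

score = 1834                          # placeholder WVGA score
linear = score * (sgs / ip4)          # proportional mapping, = 1146.25
sublinear = rescale(score, sgs, ip4)  # gentler penalty, ≈ 1440
```

The two mappings disagree by roughly 25%, which is exactly the kind of gap being argued about here.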
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Pika007 said:
To untangle the confusion-
Triangles= "polys" (polygons)
The SGS does nothing bear 90M, but on the other side, none of the other phones are doing what the manufacturers are claiming them to do.
Plus, the wikipedia article is FAR from being reliable, it's been edited more than 5 times over the past 2 months, with ever changing results. No official specs are available from imgtec.
One thing i CAN tell you is that the GPU on the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a refrence the Compal NAZ10 that uses the ever-glorified Nvidia TEGRA 2, and the iPhone 4 (SGX535)
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Appearently someone over at Sammy did something right.
Extremely right.
Yes, it's been edited more than 5 times, but there's an official source that says the SGX545 gives only 40M polygons, so how does the SGX540 give 90M? I know numbers aren't important if there's nothing to use them on, but I just wanted to know.
I think it's due to the fact that the older chip has 2D acceleration too, while the 540 is pure 3D and we use the CPU for 2D. That's why it's faster.
It is important to note that PowerVR does not do 3D rendering using the traditional immediate-mode pipeline used by Nvidia and ATI cards. It uses a unique tile-based rendering engine. This approach is more efficient and uses less memory bandwidth as well as less raw horsepower. IIRC, the original PowerVR 3D PC card was a PCI card that could compete head-to-head with AGP-based cards from 3dfx and ATI at the time. Unfortunately, its unique rendering engine did not fit well with Direct3D and OpenGL, which favored traditional polygon-based rendering pipelines.
So, the 90M figure could well be the equivalent performance number for a traditional 3D rendering pipeline as compared to the tile-based PowerVR setup.
Power VR does indeed use the traditional 3D polygon based pipeline.
Tile based rendering is in addition, not instead.
Do note that not all games (and actually, far from it) are using TBR properly (if at all).
Read the release notes and press release, it has enough details.
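For anyone curious what "tile-based" actually means in practice, here's a minimal toy sketch (purely illustrative, not how the SGX really implements it): triangles are first binned to the screen tiles they touch, then each tile is rasterized entirely in fast on-chip memory before being written out once.

```python
# Toy TBR binning pass: assign each triangle to every screen tile
# its bounding box overlaps. A real tile-based GPU then shades one
# tile at a time on-chip, writing each finished tile to RAM once.

TILE = 32  # tile edge in pixels (real tile sizes vary by GPU)

def bin_triangles(triangles, width, height):
    """triangles: list of ((x, y), (x, y), (x, y)) in screen space."""
    cols = (width + TILE - 1) // TILE
    rows = (height + TILE - 1) // TILE
    bins = {}  # (tile_x, tile_y) -> list of triangle indices
    for i, tri in enumerate(triangles):
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        tx0, tx1 = int(min(xs)) // TILE, int(max(xs)) // TILE
        ty0, ty1 = int(min(ys)) // TILE, int(max(ys)) // TILE
        for ty in range(max(ty0, 0), min(ty1, rows - 1) + 1):
            for tx in range(max(tx0, 0), min(tx1, cols - 1) + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

# One triangle spanning 40x40 px lands in a 2x2 block of tiles:
bins = bin_triangles([((0, 0), (40, 0), (0, 40))], 128, 128)
```

The payoff is that depth testing and blending for a tile happen against on-chip memory, so overdrawn pixels never cost external bandwidth; that's why fillrate numbers for tile-based parts punch above their weight.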
hoss_n2 said:
yes it is edited more than 5 times but there is an offcial sources says that sgx 454 gives only 40M polygons so hw sgx450 gives 90M i know numbers are not important if there is nothing to use it but i only wanted to know
All the given "official" spec numbers for PowerVR GPUs are for a frequency of 200MHz.
Those chips can do well above 400MHz. So, for example, if an SGX530 does 14M polygons and 500 Mpixels per second @200MHz, clocking it up to 400MHz gives 28M polys / 1 Gpixel.
Though I extremely doubt Samsung has the SGX540 clocked at 600MHz in the SGS...
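That linear-scaling rule is trivial to sanity-check (a sketch of the arithmetic only; real chips can hit memory-bandwidth limits before scaling this cleanly):

```python
# Scale the quoted 200MHz reference figures linearly with clock.
REF_MHZ = 200

def scaled(ref_value, target_mhz):
    return ref_value * target_mhz / REF_MHZ

# SGX530 reference: 14 Mpolys/s and 500 Mpixels/s @ 200MHz
polys_400 = scaled(14, 400)    # 28.0 Mpolys/s
pixels_400 = scaled(500, 400)  # 1000.0 Mpixels/s, i.e. 1 Gpixel/s
polys_110 = scaled(14, 110)    # 7.7 Mpolys/s, near the Droid figure
```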
A practical, good example that shows off the power of the Galaxy S is Gameloft's Real Football 2010. The game hasn't got a frame lock, so it's playable on the Desire and Nexus One. Since pictures tell a thousand words, and videos even more so, I'll provide you this YouTube link: http://www.youtube.com/watch?v=S0DxP0sk5s0
Pika007 said:
All the given numbers for "official" specs about PowerVR GPU's are for a frequenct of 200mhz.
Those chips can do well above 400mhz, so for example, if an SGX530 does 14M polygons and 500Mpixels per second @200mhz, if you clock it up to 400, it'll do 28Mpolys/1Gpixels.
Though i extremely doubt samsung has the SGX540 clocked at 600mhz in ths SGS...
This is true; however, overclocking the GPU to those numbers is silly, because the memory and memory bus can't support that much data throughput anyway. I don't even think there's enough bandwidth to support the standard clock rate. There is a lot more to consider than just the GPU when it comes to graphics here.
You're taking that article you read way too seriously.
Plus, we have no idea what the bandwidth limit of the Galaxy S is; we don't know what kind of memory is used, how much of it, at what frequency, etc.
WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
+1
Re: lag, mine was doing badly until I installed one of the fixes. Now I've officially entered crazy-town.
If I had to guess, it has to do with S5PC110 optimizations. When rendering polygons, many things contribute aside from the GPU. Think of it as maybe similar to Hybrid SLI... (but this is just a guess).
If you want to look at it in more detail, someone posted the official documentation and spec sheet for the S5PC110 a while back. I didn't get a chance to look at it, but my guess is the clock speeds and other stuff would be there :/
WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Well, I don't have any lag whatsoever after the lag fix. Something else must be troubling your phone. Auto Memory Manager is a need though, if you want to keep it really snappy.
Sent from my GT-I9000 using XDA App

Just what you always wanted - 2400 page processor manual!

I'm probably the only person on this planet who would ever download a 20.5MB, 2426-page document titled "S5PC110 RISC Microprocessor User's Manual", but if there are other hardware freaks out there interested, here you go:
http://pdadb.net/index.php?m=repository&id=644&c=samsung_s5pc110_microprocessor_user_manual_1.00
As you may or may not know, the S5PC110, better known as Hummingbird, is the SoC (System on a Chip) that is the brain of your Epic. Now, when you have those moments when you really just gotta know the memory buffer size for your H.264 encoder or are dying to pore over a block diagram of your SGX540 GPU architecture, you can!
( Note: It does get a little bit dry at parts. Unless you're an ARM engineer, I suppose. )
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
Now we can overclock the GPU.
/sarcasm
cbusillo said:
Why arent you working on porting CM6 or gingerbread via CM7?? lol
Hah, because I know exactly squat about Android development. Hardware is more my thing, though if I find some spare time to play around with the Android SDK maybe that can change.
Sent from my SPH-D700 using XDA App
This is actually really exciting news. RISC architectures in general, and the ARM instruction set especially, are great, and honestly it would do the world a lot of good to kick the chains of x86.
Sent from my Nexus S with a keyboard
Interesting - the complete technical design of the Hummingbird chips.
After reading your blog about how Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting this phone, the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex A9 CPUs are coming in just a couple of months? We know that in the real world Hummingbird is more powerful than Snapdragon and the OMAP 3600 series, even though benchmark scores tend not to reflect real-world performance.
Performance-wise: it's known that the out-of-order A9 parts are at least 30% faster clock-for-clock in real-world performance. There will be dual-core and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts. There's now the PowerVR SGX545, the Mali 400, and the Tegra 2.
Edit: There is also the successor, the Mali T604. I don't expect to see it in a phone in the near future, nor do I expect the Tegra 3. Maybe around this time next year, though.
sauron0101 said:
Interesting - the complete technical design of the Hummingbird chips.
After reading your blog as to how Hummingbird got its extra performance, I still wonder at times - did we make the right choice in getting this phone the Epic 4G (I bought one for $300 off contract and imported it to Canada) knowing that there are going to be ARM Cortex A9 CPUs coming around in just a couple of months? We know that in the real world, Hummingbird is more powerful than Snapdragon and the OMAP 3600 series, while benchmark scores tend to not reflect real world performance.
Performance-wise: It's know that the out of order A9 parts are at least 30% faster clock for clock in real world performance. There will be dual and maybe quad core implementations. What's really up in the air is the graphics performance of the A9 parts. There's now the Power VR SGX 545, the Mali 400, and the Tegra 2.
Edit: There is also the successor, the Mali T-604. I don't expect to see this in a phone in the near future. Nor do I expect the Tegra 3. Maybe close to this time next year though.
You're always going to be playing catch-up. I personally think the Epic has great hardware for the time. I mean, on Samsung's roadmap for 2012/13 is their Aquila processor, a quad-core 1.2GHz part. It's going to be endless catch-up; every year there will be something that completely overshadows the rest.
gTen said:
You're always going to be playing catch-up... I personally think the Epic has great hardware for the time... I mean, on Samsung's roadmap for 2012/13 is their Aquila processor, which is a quad-core 1.2GHz... it's going to be endless catch-up... every year there will be something that completely overshadows the rest...
Click to expand...
Click to collapse
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely that anything is going to beat it significantly until at least 2012, when the quad-core parts begin to emerge.
It takes a year or so from the time a CPU is announced to the time it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) arrived about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
Well, I got the Epic knowing Tegra 2 was coming in a few months with next-gen performance. I was badly in need of a new phone and the Epic, while not a Cortex A9, is no slouch.
Sent from my SPH-D700 using XDA App
sauron0101 said:
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely that anything is going to beat it significantly until at least 2012, when the quad-core parts begin to emerge.
Click to expand...
Click to collapse
That's relative; in terms of GPU performance our Hummingbird doesn't do so badly. The GPU that TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon that is rumored to be in the dual cores next summer is also on par with our GPU performance. So yes, we will lose out to newer hardware, which is to be expected, but I wouldn't consider it a slouch either...
It takes a year or so from the time a CPU is announced to the time it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) arrived about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
Click to expand...
Click to collapse
The first phone was the TG01. That said, I guarantee you that within a year, if not less, of the first Tegra release there will be a better processor out... it's bound to happen...
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... also, Tegra 2 I think buffers at 16-bit, while Hummingbird buffers at 24-bit...
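For what it's worth, the 16-bit vs 24-bit difference is easy to put numbers on. A quick sketch (assuming both panels are WVGA and packed RGB565 vs RGB888 framebuffers; the numbers are illustrative, not from any spec sheet):

```python
# Rough illustration of what 16-bit vs 24-bit framebuffers mean in practice.
# WVGA (800x480) is assumed here since both phones use that resolution.

WIDTH, HEIGHT = 800, 480  # WVGA panel (assumption for this sketch)

def framebuffer_bytes(bits_per_pixel):
    """Memory for a single framebuffer at the given color depth."""
    return WIDTH * HEIGHT * bits_per_pixel // 8

rgb565 = framebuffer_bytes(16)   # 5 bits red, 6 green, 5 blue
rgb888 = framebuffer_bytes(24)   # 8 bits per channel

print(f"16-bit (RGB565): {rgb565 / 1024:.0f} KiB per buffer")
print(f"24-bit (RGB888): {rgb888 / 1024:.0f} KiB per buffer")

# The visible cost of 16-bit: only 2^5 = 32 red levels instead of 256,
# which is why gradients show banding unless the GPU dithers.
print("red levels at 16-bit:", 2 ** 5, "vs 24-bit:", 2 ** 8)
```

The real catch with 16-bit isn't the memory saved so much as the 32 levels per channel, which is where visible banding (and the need for dithering) comes from.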
gTen said:
That's relative; in terms of GPU performance our Hummingbird doesn't do so badly. The GPU that TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon that is rumored to be in the dual cores next summer is also on par with our GPU performance. So yes, we will lose out to newer hardware, which is to be expected, but I wouldn't consider it a slouch either...
The first phone was the TG01. That said, I guarantee you that within a year, if not less, of the first Tegra release there will be a better processor out... it's bound to happen...
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... also, Tegra 2 I think buffers at 16-bit, while Hummingbird buffers at 24-bit...
Click to expand...
Click to collapse
AFAIK, dual cores are only fully supported in Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
Electrofreak said:
AFAIK, dual cores are only fully supported in Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
Click to expand...
Click to collapse
I see. I actually read before that Gingerbread would allow for dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing the performance of Hummingbird and Tegra 2, both single-core and dual-core. Is that the same one?
One thing, though: Nvidia and ATI are quite known for tweaking their graphics cards to perform well on benchmarks... I hope it's not the same with their CPUs :/
gTen said:
Edit: Some benchmarks for Tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I am not sure if it's using both cores or not... also, Tegra 2 I think buffers at 16-bit, while Hummingbird buffers at 24-bit...
Click to expand...
Click to collapse
Here are some additional benchmarks comparing the Galaxy Tab to the Viewsonic G Tablet:
http://www.anandtech.com/show/4062/samsung-galaxy-tab-the-anandtech-review/5
It's possible that the Tegra 2 isn't optimized yet. Not to mention, Honeycomb will be the release that makes the most of dual cores. However, the graphics gains are lackluster; most of the improvement seems to come purely from the CPU.
I'm not entirely sure that Neocore is representative of real-world performance either. It may have been optimized for some platforms; I would not be surprised if Neocore gave inflated scores for the Snapdragon and its Adreno graphics platform. Of course, neither is Quadrant representative.
I think that real-world games like Quake III-based games are the way to go, although until we see more graphics-demanding games, I suppose there's little to test (we're expecting more games for Android next year).
Finally, we've gotten to a point for web browsing where it's the data connection (HSPA+, LTE, or WiMAX) that dictates how fast pages load. It's like upgrading the CPU in a PC. I currently run an overclocked Q6600; if I were to upgrade to, say, a Sandy Bridge when it comes out next year, I wouldn't expect significant improvements in real-world browsing performance.
Eventually, the smartphone market will face the same problem the PC market does. Apart from us enthusiasts who enjoy benchmarking and overclocking, apart from high-end gaming, and perhaps some specialized operations (like video encoding, which I do a bit of), you really don't need the latest and greatest CPU or 6+ GB of RAM (which many new desktops come with). Same with high-end GPUs. Storage follows the same dilemma. I imagine that as storage grows, I'll be storing FLAC music files instead of AAC, MP3, or OGG, and more video. I will also use my cell phone to replace my USB key drive. Otherwise, there's no need for bigger storage.
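The browsing point is basically Amdahl's law: if most of a page load is spent waiting on the network, speeding up only the CPU barely moves the total. A toy calculation (the 2.1 s network / 0.9 s CPU split is made up purely for illustration):

```python
# Amdahl's-law style sketch: only the CPU-bound portion of a page load
# gets faster when you upgrade the processor; the network part doesn't.
# The 2.1 s / 0.9 s split below is an assumption for illustration.

def page_load_time(network_s, cpu_s, cpu_speedup):
    """Total load time when only the CPU-bound portion gets faster."""
    return network_s + cpu_s / cpu_speedup

baseline = page_load_time(network_s=2.1, cpu_s=0.9, cpu_speedup=1.0)  # 3.0 s
upgraded = page_load_time(network_s=2.1, cpu_s=0.9, cpu_speedup=3.0)  # 2.4 s

print(f"overall speedup from a 3x faster CPU: {baseline / upgraded:.2f}x")
# A 3x faster CPU only buys ~1.25x overall -- the connection dominates.
```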
gTen said:
I see. I actually read before that Gingerbread would allow for dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing the performance of Hummingbird and Tegra 2, both single-core and dual-core. Is that the same one?
One thing, though: Nvidia and ATI are quite known for tweaking their graphics cards to perform well on benchmarks... I hope it's not the same with their CPUs :/
Click to expand...
Click to collapse
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
Electrofreak said:
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
Click to expand...
Click to collapse
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
Since I can't access the PDF: does the whitepaper state what Android version they used for their tests? For example, if they used 2.1 on the SGS and Honeycomb in their own tests, it wouldn't exactly be a fair comparison. Do they also give the actual FPS, not just percentages? We are capped on FPS, for example...
Lastly, does the test say whether the Tegra 2 was dithering at 16-bit or 24-bit?
gTen said:
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
Click to expand...
Click to collapse
I'm one of Taylor's (unofficial) tech consultants, and I spoke with him regarding that article. Though, credit where it's due: Taylor has been digging stuff up recently that I don't have a clue about. We've talked about Honeycomb and dual-core tablets. Since Honeycomb will be the first release of Android to officially support tablets, since Motorola seems to be holding back the release of its Tegra 2 tablet until Honeycomb (quickly checks AndroidAndMe to make sure I haven't said anything Taylor hasn't already said), and since rumors say that Honeycomb will have dual-core support, it all makes sense.
But yes, the whitepaper is the one he used to base that article on.
gTen said:
Since I can't access the PDF: does the whitepaper state what Android version they used for their tests? For example, if they used 2.1 on the SGS and Honeycomb in their own tests, it wouldn't exactly be a fair comparison. Do they also give the actual FPS, not just percentages? We are capped on FPS, for example...
Lastly, does the test say whether the Tegra 2 was dithering at 16-bit or 24-bit?
Click to expand...
Click to collapse
Android 2.2 was used in all of their tests, according to the footnotes in the document. While I believe that Android 2.2 is capable of using both cores simultaneously, I don't believe it is capable of threading them separately. But that's just my theory. I'm just going off what the Gingerbread documentation from Google says; and unfortunately there is no mention of improved multi-core processor support in Gingerbread.
http://developer.android.com/sdk/android-2.3-highlights.html
As for FPS and the dithering... they don't really go there; the whitepaper is clearly focused on CPU performance, and so it features benchmark scores and timed results. I take it all with a pinch of salt anyhow; despite the graphs and such, it's still basically an NVIDIA advertisement.
That said, Taylor has been to one of their expos or whatever you call it, and he's convinced that the Tegra 2 GPU will perform several times better than the SGX 540 in the Galaxy S phones. I'm not so sure I'm convinced... I've seen comparable performance benchmarks come from the LG Tegra 2 phone, but Taylor claims it was an early build and he's seen even better performance. Time will tell I suppose...
EDIT - As for not being able to access the .pdfs, what are you talking about?! XDA app / browser and Adobe Reader!

Tegra 2 faster than a5, Exynos, Dual-core Snapdragon

I am sick of everyone thinking the upcoming dual-core devices will blow away Tegra 2.
Tegra 2 vs Dual-Core A5 (iPad 2)
A lot of talk about the AnandTech OpenGL benchmark trumping Tegra 2, but what about Stockfish and BenchIt Pi, where the A5 got slaughtered (PC Magazine)? With half the RAM and a lower clock, I don't see this thing smoking Tegra 2 in all benchmarks, or in real-life CPU situations.
Tegra 2 vs Exynos (Some Galaxy S2)
Lower benchmarks in Smartbench Gaming. Plus, there are early benchmarks showing Quadrant scores of 2100 for tablets running the Exynos 4210. There is a reason why the Samsung Galaxy S2 is including Tegra 2 in some regions.
Androidevolution.."One negative surprise on the S2 so far has been the level of GPU performance. So far, most of the early benchmark shows that Exynos 4210 isn’t up to par when it comes to the GPU performance. This is strange given that Samsung was leading the market when they introduced the previous generation SoC ...... Smartbench 2011 GPU numbers are once again, very disappointing"
Tegra 2 vs Dual Core-Snapdragon (HTC Pyramid)
This thing got smoked in Smartbench with gaming and productivity.
" Their tests confirm that the Pyramid indeed houses a dual-core chip, but the popular Smarbench 2011 shows a CPU and GPU that simply don’t hold up to the Tegra 2 chip found in the LG Optimus 2X and Motorola Atrix 4G"
Yeah, you're comparing pre-release builds of phones (S2 and Pyramid) with a Tegra 2 that has been out for months? Also, it's sad how poorly the Tegra 2 platforms perform compared to the SGX540, which has been out for half a year already and still outscores them in most benchmarks.
Oh and if you look at the most recent GLBenchmark 2.0 Egypt... Samsung's Exynos scores around 4000 compared to the Xoom's 1300 and Atrix's 2000. Even the original Galaxy S scores higher... around 2400.
Odroid-A Tablet which runs Exynos: http://www.glbenchmark.com/result.j...version=all&certified_only=2&brand=Hardkernel
Xoom and Atrix: http://www.glbenchmark.com/result.j...4&version=all&certified_only=2&brand=Motorola
Original Galaxy S: http://www.glbenchmark.com/result.j...=0&version=all&certified_only=2&brand=Samsung
And don't even bring up the Ipad 2. That thing has a dual core SGX543 which even in the single core version outperforms the SGX540, which the Tegra 2 can't even beat.
rex-tc said:
I am sick of everyone thinking the upcoming dual-core devices will blow away Tegra 2.
Tegra 2 vs Dual-Core A5 (iPad 2)
A lot of talk about the AnandTech OpenGL benchmark trumping Tegra 2, but what about Stockfish and BenchIt Pi, where the A5 got slaughtered (PC Magazine)? With half the RAM and a lower clock, I don't see this thing smoking Tegra 2 in all benchmarks, or in real-life CPU situations.
Tegra 2 vs Exynos (Some Galaxy S2)
Lower benchmarks in Smartbench Gaming. Plus, there are early benchmarks showing Quadrant scores of 2100 for tablets running the Exynos 4210. There is a reason why the Samsung Galaxy S2 is including Tegra 2 in some regions.
Androidevolution.."One negative surprise on the S2 so far has been the level of GPU performance. So far, most of the early benchmark shows that Exynos 4210 isn’t up to par when it comes to the GPU performance. This is strange given that Samsung was leading the market when they introduced the previous generation SoC ...... Smartbench 2011 GPU numbers are once again, very disappointing"
Tegra 2 vs Dual Core-Snapdragon (HTC Pyramid)
This thing got smoked in Smartbench with gaming and productivity.
" Their tests confirm that the Pyramid indeed houses a dual-core chip, but the popular Smarbench 2011 shows a CPU and GPU that simply don’t hold up to the Tegra 2 chip found in the LG Optimus 2X and Motorola Atrix 4G"
Click to expand...
Click to collapse
****ty deal... I wonder how much software it will take to make it speedy gonzales
Sent from my Xoom using XDA App
dinan said:
Yeah, you're comparing pre-release builds of phones (S2 and Pyramid) with a Tegra 2 that has been out for months? Also, it's sad how poorly the Tegra 2 platforms perform compared to the SGX540, which has been out for half a year already and still outscores them in most benchmarks.
Oh and if you look at the most recent GLBenchmark 2.0 Egypt... Samsung's Exynos scores around 4000 compared to the Xoom's 1300 and Atrix's 2000. Even the original Galaxy S scores higher... around 2400.
Odroid-A Tablet which runs Exynos: http://www.glbenchmark.com/result.j...version=all&certified_only=2&brand=Hardkernel
Xoom and Atrix: http://www.glbenchmark.com/result.j...4&version=all&certified_only=2&brand=Motorola
Original Galaxy S: http://www.glbenchmark.com/result.j...=0&version=all&certified_only=2&brand=Samsung
And don't even bring up the Ipad 2. That thing has a dual core SGX543 which even in the single core version outperforms the SGX540, which the Tegra 2 can't even beat.
Click to expand...
Click to collapse
ouch well put??? lol
dinan said:
Yeah, you're comparing pre-release builds of phones (S2 and Pyramid) with a Tegra 2 that has been out for months? Also, it's sad how poorly the Tegra 2 platforms perform compared to the SGX540, which has been out for half a year already and still outscores them in most benchmarks.
Oh and if you look at the most recent GLBenchmark 2.0 Egypt... Samsung's Exynos scores around 4000 compared to the Xoom's 1300 and Atrix's 2000. Even the original Galaxy S scores higher... around 2400.
Odroid-A Tablet which runs Exynos: http://www.glbenchmark.com/result.j...version=all&certified_only=2&brand=Hardkernel
Xoom and Atrix: http://www.glbenchmark.com/result.j...4&version=all&certified_only=2&brand=Motorola
Original Galaxy S: http://www.glbenchmark.com/result.j...=0&version=all&certified_only=2&brand=Samsung
And don't even bring up the Ipad 2. That thing has a dual core SGX543 which even in the single core version outperforms the SGX540, which the Tegra 2 can't even beat.
Click to expand...
Click to collapse
Are you freaking kidding me?! You're an idiot, mate.
All these devices have different resolutions, so obviously your devices with **** resolutions (i.e. iPad) will have awesome scores.
Dude, seriously poor effort.
Nado85 said:
Are you freaking kidding me?! You're an idiot, mate.
All these devices have different resolutions, so obviously your devices with **** resolutions (i.e. iPad) will have awesome scores.
Dude, seriously poor effort.
Click to expand...
Click to collapse
Both the iPad 2 and the Odroid run at higher resolutions than the Atrix; that should make them worse off, not better.
The Xoom's resolution is ~30% higher than the iPad 2's, but that is not enough to explain why the iPad 2 is 4 times better.
The Odroid's resolution is again higher than the Xoom's, and it performs 3 times better than the Xoom.
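To put some numbers behind the resolution argument, here's a rough sketch that scales the GLBenchmark scores quoted earlier in the thread by native pixel count (the resolutions are the devices' panel specs as I understand them, and fill work doesn't scale perfectly linearly with pixels, so treat this as a sanity check only):

```python
# Scores are the GLBenchmark Egypt numbers quoted in the thread;
# resolutions are the devices' native panel sizes (assumed).
devices = {
    "Odroid-A (Exynos)":  {"score": 4000, "res": (1366, 768)},
    "Xoom (Tegra 2)":     {"score": 1300, "res": (1280, 800)},
    "Atrix (Tegra 2)":    {"score": 2000, "res": (960, 540)},
    "Galaxy S (SGX540)":  {"score": 2400, "res": (800, 480)},
}

for name, d in devices.items():
    w, h = d["res"]
    pixels = w * h
    # Frames rendered scale roughly with fill work, so score * pixels
    # approximates pixel throughput; higher is better.
    per_mpix = d["score"] * pixels / 1e6
    print(f"{name:20s} score={d['score']:5d} "
          f"pixels={pixels:7d} score*Mpix={per_mpix:8.0f}")
```

Even after crediting each device for the pixels it pushes, the Exynos board stays well ahead, so resolution alone doesn't explain the gap.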
dinan said:
Yeah, you're comparing pre-release builds of phones (S2 and Pyramid) with a Tegra 2 that has been out for months? Also, it's sad how poorly the Tegra 2 platforms perform compared to the SGX540, which has been out for half a year already and still outscores them in most benchmarks.
Oh and if you look at the most recent GLBenchmark 2.0 Egypt... Samsung's Exynos scores around 4000 compared to the Xoom's 1300 and Atrix's 2000. Even the original Galaxy S scores higher... around 2400.
Odroid-A Tablet which runs Exynos: http://www.glbenchmark.com/result.j...version=all&certified_only=2&brand=Hardkernel
Xoom and Atrix: http://www.glbenchmark.com/result.j...4&version=all&certified_only=2&brand=Motorola
Original Galaxy S: http://www.glbenchmark.com/result.j...=0&version=all&certified_only=2&brand=Samsung
And don't even bring up the Ipad 2. That thing has a dual core SGX543 which even in the single core version outperforms the SGX540, which the Tegra 2 can't even beat.
Click to expand...
Click to collapse
You know what this points out? That people are VERY stupid and care too much about upcoming technology! Everything people buy ends up being obsolete in about a week or two. It's REALLY sad to see this, because these software developers put in a lot of time for something that will only be hot for a few weeks and then it's yesterday's news. That's why software is getting choppier, and there is no quality backing anymore.
Aside from my *****ing... I do like how that Samsung platform works. Quite impressive; I'd like to see what Nvidia will do next. These new technologies have been pushed mad crazy this last year. I think quality and reliability will take quite a hard hit due to the silicon being pushed to the limit of its threshold... we're not too far from that.
Mafisometal said:
... I think quality and reliability will take quite a hard hit due to the silicon being pushed to the limit of its threshold... we're not too far from that.
Click to expand...
Click to collapse
Silicon has a long way to go before it maxes out. The good news is Nvidia has years of quality GPU fabbing behind it, and they've got loads of tricks up their sleeves yet.
What Samsung and Qualcomm don't have right now is games & software optimised for their chipsets. This is where the Tegra 2 is a step ahead of the rest..
So don't stress peoples!
tadjiik said:
Both the iPad 2 and the Odroid run at higher resolutions than the Atrix; that should make them worse off, not better...
Click to expand...
Click to collapse
Also, the Atrix is running Android 2.2, whereas the Odroid is on 2.3, which is optimised for dual-core CPUs...
We will all see a big difference when Moto releases 2.3 / 2.4 for the Atrix.
I can confirm the SGX540 (iPhone 4 graphics processor) beats Tegra 2. I know the resolution is lower, but the smoothness, and especially quick scrolling on jam-packed websites like non-mobile YouTube, show it's smoother.
I have not done a bench yet. I am more than happy with my Atrix. Actually, I just got an AMD Zacate Fusion E-350 and it's on par with my Atrix dual core, yet eats 18 watts. Actually, the Atrix plays less choppy than the Zacate.
However, it's not as fast 'yet' as the SGX540, and OMG, I bet the SGX543 is awesome.
The iPhone 5 had a higher resolution than the Atrix, yet scored 15-16 fps in tests where the Atrix gets 48-50. The Tegra 2 is very future-proof for a few months, especially considering that many manufacturers are still making single-core phones.
To see what optimization can do, download Fruit Slice and compare it to Fruit Ninja Tegra HD.
Sent from my MB860 using XDA App
Techcruncher said:
The iPhone 5 had a higher resolution than the Atrix, yet scored 15-16 fps in tests where the Atrix gets 48-50. The Tegra 2 is very future-proof for a few months, especially considering that many manufacturers are still making single-core phones.
Click to expand...
Click to collapse
Are you referring to the iPhone 4 here? That has an SGX535, which is worse than Tegra 2; that is correct. But it is still not beating the SGX540 (Samsung Galaxy S); they seem to be about on par, according to glbenchmark.com. If you were referring to the iPad 2, that beats Tegra 2 in just about anything.
To see what optimization can do, download Fruit Slice and compare it to Fruit Ninja Tegra HD.
Click to expand...
Click to collapse
And what proof do you have that those games are not able to run on non-Tegra phones, or (more likely) could be optimized just as well or better for them? Throwing up a game that has not really been tested on non-Tegra phones does not prove anything.
To throw the ball back: if you want to see what optimizations can do for Exynos, do a search for "engadget exynos gdc", which has a 1080p 3D demo @ 60 fps (I am unable to post links...)
lol nice try.
oh and "your devices"? I like how you assume I'm an apple fanboy when I'm actually a die-hard android user... i HAVE an atrix, a nexus S, a nexus one. What phones do you have? and the benchmark scores I posted were all between android devices so I'm not sure where you're seeing these "awesome scores" for the ipad?
come back when you actually have something to contribute.
Nado85 said:
Are you freaking kidding me?! You're an idiot, mate.
All these devices have different resolutions, so obviously your devices with **** resolutions (i.e. iPad) will have awesome scores.
Dude, seriously poor effort.
Click to expand...
Click to collapse
rex-tc said:
I am sick of everyone thinking the upcoming dual-core devices will blow away Tegra 2.
Tegra 2 vs Dual-Core A5 (iPad 2)
A lot of talk about the AnandTech OpenGL benchmark trumping Tegra 2, but what about Stockfish and BenchIt Pi, where the A5 got slaughtered (PC Magazine)? With half the RAM and a lower clock, I don't see this thing smoking Tegra 2 in all benchmarks, or in real-life CPU situations.
Tegra 2 vs Exynos (Some Galaxy S2)
Lower benchmarks in Smartbench Gaming. Plus, there are early benchmarks showing Quadrant scores of 2100 for tablets running the Exynos 4210. There is a reason why the Samsung Galaxy S2 is including Tegra 2 in some regions.
Androidevolution.."One negative surprise on the S2 so far has been the level of GPU performance. So far, most of the early benchmark shows that Exynos 4210 isn’t up to par when it comes to the GPU performance. This is strange given that Samsung was leading the market when they introduced the previous generation SoC ...... Smartbench 2011 GPU numbers are once again, very disappointing"
Tegra 2 vs Dual Core-Snapdragon (HTC Pyramid)
This thing got smoked in Smartbench with gaming and productivity.
" Their tests confirm that the Pyramid indeed houses a dual-core chip, but the popular Smarbench 2011 shows a CPU and GPU that simply don’t hold up to the Tegra 2 chip found in the LG Optimus 2X and Motorola Atrix 4G"
Click to expand...
Click to collapse
You really need to wait for benchmarks on the EVO 3D to come out.
And you really need to see a finalized and optimized (Driver wise) S2.
To make a fair comparison.
dinan said:
Yeah, you're comparing pre-release builds of phones (S2 and Pyramid) with a Tegra 2 that has been out for months? Also, it's sad how poorly the Tegra 2 platforms perform compared to the SGX540, which has been out for half a year already and still outscores them in most benchmarks.
Oh and if you look at the most recent GLBenchmark 2.0 Egypt... Samsung's Exynos scores around 4000 compared to the Xoom's 1300 and Atrix's 2000. Even the original Galaxy S scores higher... around 2400.
Odroid-A Tablet which runs Exynos: http://www.glbenchmark.com/result.j...version=all&certified_only=2&brand=Hardkernel
Xoom and Atrix: http://www.glbenchmark.com/result.j...4&version=all&certified_only=2&brand=Motorola
Original Galaxy S: http://www.glbenchmark.com/result.j...=0&version=all&certified_only=2&brand=Samsung
And don't even bring up the Ipad 2. That thing has a dual core SGX543 which even in the single core version outperforms the SGX540, which the Tegra 2 can't even beat.
Click to expand...
Click to collapse
Again, more OpenGL benchmarks that prove nothing; nothing more than a fill-rate test at best. The Tegra 2 has already proven to be better at productivity and has twice the RAM of the iPad 2. Which means higher-res textures, and with the better CPU, better physics with PhysX. SGX is nothing but a tile renderer that fakes what a true T&L engine produces. When you start having more CPU-centric games with high-res textures, we will see who will prevail. Plus the toolset of NVIDIA is MULTIPLE times better and we are already seeing straight PC ports.
On paper, Tegra 2 should be the weakest among them...
In fact, in the tests, Tegra 2 does not fall behind.
I still think the Exynos, A5, and dual-core Snapdragon will perform better than Tegra 2; it's just a matter of time.
Well, Tegra 2 is good enough, but Tegra 3 and Tegra 4 are the ones that will take the lead.
Well, until those "CPU-centric" games you're talking about actually come out, the only thing we can compare is what's out there right now. If you want to see the Tegra 2 get shamed by the iPad 2's SGX543: http://www.anandtech.com/show/4216/...rmance-explored-powervr-sgx543mp2-benchmarked
rex-tc said:
Again, more OpenGL benchmarks that prove nothing; nothing more than a fill-rate test at best. The Tegra 2 has already proven to be better at productivity and has twice the RAM of the iPad 2. Which means higher-res textures, and with the better CPU, better physics with PhysX. SGX is nothing but a tile renderer that fakes what a true T&L engine produces. When you start having more CPU-centric games with high-res textures, we will see who will prevail. Plus the toolset of NVIDIA is MULTIPLE times better and we are already seeing straight PC ports.
Click to expand...
Click to collapse
rex-tc said:
Again, more OpenGL benchmarks that prove nothing; nothing more than a fill-rate test at best.
Click to expand...
Click to collapse
The tests are comprehensive and exercise different parts of the chip/driver. There are a few "real-life" tests, as well as a bunch of synthetic tests.
The Tegra 2 has already proven to be better at productivity and has twice the RAM of the iPad 2.
Click to expand...
Click to collapse
Wait, so the "productivity" tests do prove something Well, I belive the ipad is clocked somewhat lower than T2, so no real surprise there. Trying to differentiate between different Dual-A9 cores might be hard, though, since they are all based on the same design. The only thing I could see Tegra2 had donewas the inclusion of a hardware JPEG decoder on the Tegra2, that might skew the productivity tests a little. On the other hand, they are not including NEON, so for tests that include that, they might be at a loss.
SGX is nothing but a tile renderer that fakes what a true T&L engine produces.
Click to expand...
Click to collapse
Do you even know what a tile renderer is? It is not "faking" what a true "T&L engine" produces; it is about deferring rasterization and fragment processing until frames are swapped, thus enabling the use of only a small render buffer (a tile). The only thing it "fakes" is that overdrawn pixels are not fragment processed - but this is also done on non-tile-based GPUs to a lesser degree (with early-Z).
By the way, "T&L engine"? There is no hardware "T&L engine" anymore - it is all done through shaders nowadays.
When you start having more CPU-centric games with high-res textures, we will see who will prevail.
Click to expand...
Click to collapse
If we are talking CPU-centric, then Tegra 2 will be at a loss because of its lack of NEON (which I believe Exynos supports). I am not sure if the Apple A5 has it, but that is still comparing apples and oranges (different OS) when it comes to benchmarks.
Plus the toolset of NVIDIA is MULTIPLE times better and we are already seeing straight PC ports.
Click to expand...
Click to collapse
Could you point me to the toolset? When I googled "tegra2 toolset", this post was the first result...
Are we seeing PC ports? Could you mention some names/examples? Any reason why they would not run on non-Tegra 2 devices?
SlimJ87D said:
You really need to wait for benchmarks on the EVO 3D to come out.
And you really need to see a finalized and optimized (Driver wise) S2.
To make a fair comparison.
Click to expand...
Click to collapse
No no no it's important to cherry-pick and declare victory as soon as possible.
I think people need to wait for the release of the actual phones before comparing. I mean, HTC hasn't even announced the Pyramid and people think it's crap because of its in-development benchmark. Of course scores are going to be crap if it's still in development?
Sent from my thumbs

GPU and benchmarks

Hey everyone.
I'm a bit lost and I don't know which one to buy: the I9500 or the I9505.
So far I know that Adreno 320 is fully OpenGL 3.0 compatible, while PowerVR SGX544MP3 not.
Adreno 320 is scoring 4 FPS more than PowerVR in T-Rex GLBenchmark 2.7.0.
PowerVR is scoring 1-2 more FPS in GLBenchmark 2.5 Egypt
Both GPU is scoring the same in Antutu and Quadrant video test, with PowerVR slightly better for few seconds (Adreno is dropping 1-2 seconds of the test to 30 FPS while PowerVR stay constant at 50-60)
In Antutu, the 3rd test (with the DNA code), Adreno 320 stays at 30-40 fps while PowerVR scores constant 60.
Both, 3dmark and glbenchmark show the PowerVR in the S4 even weaker than Nexus 4 and other chinese mobiles.
What's the deal....what the hell it's happening ? Is PowerVR that weak in the new graphic technologies but scores well in the new ones ?
Also, is there any OpenGL 3.0 benchmark so we can compare the Adreno 320 (fully OpenGL 3.0) with the PowerVR 544MP3 (OpenGL 2.0 but with some OpenGL 3.0 features thanks to an API), to see what the score and quality is ? I really want to see what that 3.0 API knows to do, as the Imagination doesn't really says what that API really do. Would there be games or apps using only OpenGL 3.0 and we will have trouble to run them because of this old GPU ?
I'm wondering...if in one year will be released an OpenGL 3.0 game, what will happens with S4 Octa ? It will not be able to play it, right ? I have no idea how that OpenGL thing works, but I remember that a game requesting DirectX 10 will not work with DirectX 9.
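For what it's worth, OpenGL ES versioning does work roughly like the DirectX analogy above: the driver reports a version string such as "OpenGL ES 3.0 ...", and an app (or a store filter) simply compares the reported major.minor against what it requires; a device reporting a lower version can't run a 3.0-only title. A rough Python sketch of that comparison (the version strings below are made-up examples, not real driver output):

```python
import re

def parse_gles_version(version_string):
    """Extract (major, minor) from a GL_VERSION-style string,
    e.g. 'OpenGL ES 3.0 V@84.0' -> (3, 0)."""
    match = re.search(r"OpenGL ES (\d+)\.(\d+)", version_string)
    if match is None:
        raise ValueError("not a recognisable OpenGL ES version string")
    return int(match.group(1)), int(match.group(2))

def supports_gles(version_string, major, minor=0):
    """True if the reported version meets or exceeds the requested one."""
    return parse_gles_version(version_string) >= (major, minor)

# An ES 3.0 device can run ES 3.0 content; an ES 2.0 device cannot.
print(supports_gles("OpenGL ES 3.0 V@84.0", 3))     # True
print(supports_gles("OpenGL ES 2.0 build 1.9", 3))  # False
```

So a 2.0-class GPU like the SGX544MP3 would be filtered out of a 3.0-only game, exactly as a DX9 card is for a DX10 title; a wrapper API exposing some 3.0 features doesn't change the reported core version.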
The PowerVR really sucks. Samsung should have used the PowerVR Series6 "Rogue" instead.
My opinion is that the Qualcomm scores very well; even my S3 is enough to play every single game, but the phone is short on RAM and that's why I'm replacing it now. Buying the Octa will cost me $150 more than the Qualcomm version, and I would need to send it overseas if I ever have a warranty problem. For those $150 I could buy two spare batteries and the Samsung S Band instead of getting the Octa.
I want the Octa, but does this phone really deserve such attention with that old, rubbish PowerVR GPU? I don't have 4G in my area, so I don't care about 4G, though it would be nice if I travel somewhere that has it; for me HSPA+ is fast enough. So the only things that count here are the CPU, the GPU and the battery life. Battery life can be solved with an extra battery, so that leaves the GPU and the CPU. The A15 cores are very fast but can use a lot of energy, so I might get two days of battery life with texting and calling but only two hours playing games and watching 1080p video, while with A9 I would get something similar to the S3.
Can any developer or experienced guy here answer these questions?
Nobody ?
I'm in the same situation; I'm still deciding which version I should buy...
We need a user with the Galaxy S 4 Exynos and one with the Snapdragon. They should run the same tests (Linpack, Vellamo, AnTuTu and so on) and give us the results.
For OpenGL ES 3.0 I think it's better to have native support, not support via a wrapper API. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and more. I find the Snapdragon more tweakable than the Exynos, but the PowerVR is still a good GPU.
Alberto96 said:
I'm in the same situation; I'm still deciding which version I should buy...
We need a user with the Galaxy S 4 Exynos and one with the Snapdragon. They should run the same tests (Linpack, Vellamo, AnTuTu and so on) and give us the results.
For OpenGL ES 3.0 I think it's better to have native support, not support via a wrapper API. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and more. I find the Snapdragon more tweakable than the Exynos, but the PowerVR is still a good GPU.
Click to expand...
Click to collapse
Totally agree with you. I don't get why people say the PowerVR is better. I see that it scores better than the Adreno in the AnTuTu benchmark, but in GLBenchmark it's awful. This is my only worry right now: what happens if we run a full OpenGL ES 3.0 test on both GPUs? Will the PowerVR throw an error, or pass with a lower score? I don't care so much about the score, only about its ability to pass the test. If it passes, I'm sold on the Octa.
I also found that the Octa supports LPDDR3 at 800 MHz, which means 12.8 GB/s of bandwidth, while the S600 uses LPDDR3 at only around 600 MHz (roughly 9.6 GB/s).
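Those bandwidth numbers follow from simple arithmetic: DDR-style memory transfers data on both clock edges, and these SoCs pair two 32-bit channels for a 64-bit bus (the bus width here is an assumption for these specific chips), so peak bandwidth is clock x 2 x 8 bytes. A quick sketch:

```python
def lpddr_bandwidth_gb_s(clock_mhz, bus_width_bits=64, transfers_per_clock=2):
    """Peak theoretical bandwidth in GB/s for a DDR-style memory bus.

    clock_mhz           -- I/O clock in MHz
    bus_width_bits      -- total bus width (assumed 2x32-bit channels here)
    transfers_per_clock -- 2, since DDR moves data on both clock edges
    """
    bytes_per_transfer = bus_width_bits / 8
    return clock_mhz * 1e6 * transfers_per_clock * bytes_per_transfer / 1e9

print(lpddr_bandwidth_gb_s(800))  # 12.8
print(lpddr_bandwidth_gb_s(600))  # 9.6
```

These are peak theoretical figures; real-world throughput is lower and shared between the CPU and GPU.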
Sent from my GT-I9300 using Tapatalk 2
I just read (on an Italian forum) that the Exynos will be able to use all 8 cores together in the future with kernel 3.8.
So... I think I will buy the Exynos. I'm just waiting for a reply from a friend who bought it from Expansys USA. If he receives it and everything is fine, I will buy it from that site. With Italian taxes (21%) and shipping it will cost about 730-740€.
Alberto96 said:
I just read (on an Italian forum) that the Exynos will be able to use all 8 cores together in the future with kernel 3.8.
Click to expand...
Click to collapse
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at some tasks such as rendering to texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I expected from the raw GFLOPS figures.
Phobos Exp-Nord said:
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; it's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at some tasks such as rendering to texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I expected from the raw GFLOPS figures.
Click to expand...
Click to collapse
Well, when you play some heavy games you need all the cores. It's also useful to use all the cores while the phone is charging, without killing the battery.
I need a new phone, because my Galaxy S I9000 is slow with new apps and Android versions. If I buy this one, an S800 version is pointless to me. The CPU is fast; the GPU maybe not as fast as the Adreno 330, but with overclocking we can boost performance a lot.
Dude, using all eight cores will simply melt the phone in your hands LOL. You will be drinking an S4 cocktail LOL. A quad-core CPU is enough, but a GPU never is. The same thing is happening with PCs. I don't need huge FPS in T-Rex, just some solid reviews and opinions from people who really know this stuff... but so far only you two were able to answer (I won't claim yet that this forum is full of noobs LOL).
I want a new phone because of the lack of RAM in the S3, even though it's smooth for me. I was happy to hear about the Octa version, because I wanted to try something new, but I'm kind of lost now.
Alberto96, please let me know when your friend gets that I9500. I want to get it from Expansys too (I think we already talked about this in other threads). If I buy the I9505 I will get it from Amazon Italy, as it's cheaper than other places.
I'm just comparing:
I9500 - 1 year of warranty (overseas)
I9505 - 2 years of warranty (local)
I9500 = I9505 + 3 additional S4 batteries with an external charger
That's because:
740€ = 625€ + 35€ x 3 batteries (and I would still have money left for a Burger King and a Cola)
So... is it really worth the risk? Still nobody has answered me about OpenGL ES 3.0.
The S800 and Adreno 330 will not be in a Samsung device soon (maybe never), and 2.1-2.3 GHz seems like too much for a mobile phone. We already have heat issues with the S4 (I even have them on the S3, with the phone getting warm). Also, my laptop is a dual-core 2.1 GHz AMD, for God's sake.
@Alberto96, I beg you, when your friend gets the phone, please test it and let me know what you think?
demlasjr said:
2.1-2.3 GHz seems like too much for a mobile phone
Click to expand...
Click to collapse
Not when playing Hi10P in software.
I don't know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application are dispatched to two different core domains, making it really hard to exchange data between them, since each domain probably has its own cache subsystem; the performance drop could then be even worse than running both threads on the A7 domain together.
Phobos Exp-Nord said:
Not when playing Hi10P in software.
I don't know the exact internal layout of the Exynos Octa, so it's easy for me to imagine a situation where two threads of a single application are dispatched to two different core domains, making it really hard to exchange data between them, since each domain probably has its own cache subsystem; the performance drop could then be even worse than running both threads on the A7 domain together.
Click to expand...
Click to collapse
Yeah, you're right there. I don't know much about this profile as I don't watch anime, but in the S4's case it seems to depend more on the GPU than the CPU. I'm fairly sure the Exynos Octa can run it, but I'm not sure about the PowerVR. I've read that Hi10P plays at anywhere from 15-20 FPS (watchable, but still not great) on a Tegra 3 quad-core overclocked to 1.6 GHz, so there is still hope.
demlasjr said:
I've read that Hi10P plays at anywhere from 15-20 FPS
Click to expand...
Click to collapse
That was about 720p. I just asked in another thread: the S4 cannot play 1080p smoothly enough with MX Player. It's not a question of resolution; the problem is playing a file from a 1080p home collection without any extra effort.
We'll see; maybe an update will be released later for such issues. I think the GPU and CPU of both variants are capable of playing such videos.
Hey guys,
http://withimagination.imgtec.com/i...or-todays-leading-platforms#comment-880303396
jumping directly from OpenGL ES 2.0 to 3.0 would create a situation where app compatibility would be severely broken across devices. But most people update their devices every two years; by that time, PowerVR Series6 would be the dominant OpenGL ES 3.0 GPU generation shipping in most devices.
It is also important to remember that the PowerVR Series5XT GPU family has been successfully holding its own against recently released competing graphics solutions despite being released almost four years ago, which in itself is an amazing feat.
Click to expand...
Click to collapse
So... should we trust alexvoica and go with the PowerVR SGX544MP3 even though it lacks OpenGL ES 3.0? He said there was a long way to go until OpenGL ES 2.0 became the norm, but it wasn't as long as he claimed. Now every single game uses OpenGL ES 2.0, and I'm sure there will soon be OpenGL ES 3.0-only games, not in two years.
Take a look at this: http://gfxbench.com/compare.jsp?cols=2&D1=Samsung+GT-I9500+Galaxy+S4&D2=Samsung+GT-I9505+Galaxy+S4
