Is the Galaxy S GPU really that powerful? - Galaxy S I9000 General

I have heard that the Galaxy S GPU can do 90M triangles/sec. Is that true? Some sources claim it only does 28M tri/sec (http://en.wikipedia.org/wiki/PowerVR), and the higher-end SGX545 gives 40M, so how can the SGX540 give 90M?

hoss_n2 said:
I have heard that the Galaxy S GPU can do 90M triangles/sec. Is that true? Some sources claim it only does 28M tri/sec (http://en.wikipedia.org/wiki/PowerVR), and the higher-end SGX545 gives 40M, so how can the SGX540 give 90M?
Click to expand...
Click to collapse
I don't think the number listed on Wikipedia is 'triangles' per second... it just says polys... so it could be a different shape that's harder to render?
Just my guess.
Besides, if the claimed 90M is actually the 28 million, don't worry, because the same goes for the iPhone's GPU (the 535): it's claimed to do around 22M and wiki lists it as 14.
Aaannnnddd if you're worried about the GPU, take comfort that no 3D benchmark I've seen has even slowed it down so far, and you can see tons of videos on YouTube of Galaxy S series phones face-rolling every single other Android device in gaming FPS benchmarks. Even if it isn't as amazing as the numbers claimed, there is no doubt that it's the best on the market at the moment, and by quite a lot too!

I'm not going to pretend that I read the comment thoroughly, but I've read a similar question. The person who seemed to know what they were talking about said that essentially the 90M is a "theoretical number" and that about half of that is what the phone should? can? will? potentially? do... (skimming, memory and probably comprehension make that a very difficult word to fill in accurately)... but this is how all manufacturers report their graphics capabilities (at least in smartphones, but I'll assume the same holds true for desktop/laptop graphics cards).
So, while the number is definitely overstated, it's within the standard reporting convention... and relative to other numbers, still accurate (2x as many triangles is 2x as many whether everything gets cut in half or cut by a factor of 3).
*I'll remove my fuzzy language when someone better informed than me responds*
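To see why a uniform derating factor leaves comparisons intact, here is a tiny sketch. The claimed figures are the ones discussed in this thread; the derating factors themselves are purely illustrative:

```python
# Claimed triangle rates in millions/sec (from this thread's discussion).
claims = {"SGX540": 90, "SGX535": 28}

def derate(claims, factor):
    """Apply the same real-world derating factor to every vendor's claim."""
    return {gpu: rate * factor for gpu, rate in claims.items()}

half = derate(claims, 0.5)      # "about half" scenario
third = derate(claims, 1 / 3)   # a harsher scenario

# Relative standings survive any uniform derating:
print(claims["SGX540"] / claims["SGX535"])  # ≈ 3.21
print(half["SGX540"] / half["SGX535"])      # ≈ 3.21
print(third["SGX540"] / third["SGX535"])    # ≈ 3.21
```

So as long as every vendor inflates by roughly the same factor, "2x the claimed triangles" really does mean "2x the real triangles".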

I also read a good article (don't know where it is now, sorry) all about how the GPU relies heavily on the memory and the bus between them, and how, for example, one phone could run the same GPU as another yet have much worse performance because it uses less memory, or slower memory. Apparently our SGS has done pretty well in all departments.

To untangle the confusion:
Triangles = "polys" (polygons)
The SGS does nowhere near 90M, but on the other hand, none of the other phones are doing what the manufacturers claim either.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months, with ever-changing results. No official specs are available from ImgTec.
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10, which uses the ever-glorified NVIDIA Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.

Pika007 said:
...
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10, which uses the ever-glorified NVIDIA Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Click to expand...
Click to collapse
Well, one important factor is the pixel count in the GLBenchmark link you sent. The iPhone 4 and iPad share the same GPU; the difference in pixels is about 20%, and hence the difference between those two.
Let me make one ugly calculation to map the SGS's score onto the iPhone 4's. The pixel-count difference between the i4 and the SGS is a factor of 0.625. That would make the SGS score 1146 at the iPhone resolution (or 1723 for the i4 at 800x480). Of course there are more factors involved, but this is the best estimate I can make at the moment.
The difference turns out not to be that great after all.
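The back-of-the-envelope mapping above can be written out. The raw score used here (~1834 for the SGS) is a hypothetical value backed out from the post's own arithmetic, and the linear model assumes the score is inversely proportional to pixels rendered:

```python
# Linear rescaling of a fill-rate-bound benchmark score by pixel count.
sgs_px = 800 * 480   # Galaxy S: 384,000 pixels
ip4_px = 960 * 640   # iPhone 4: 614,400 pixels

factor = sgs_px / ip4_px   # 0.625, the factor quoted in the post

def map_score(score, from_px, to_px):
    """Rescale a score, assuming it is inversely proportional to pixels."""
    return score * from_px / to_px

sgs_score = 1834  # hypothetical raw SGS score implied by the post
print(round(map_score(sgs_score, sgs_px, ip4_px)))  # → 1146
```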

I knew this argument was going to pop up soon enough, so I'll add one VERY important factor:
Score doesn't decrease proportionally to an increase in resolution.
For example, doubling the resolution won't give half the score; more like ~70%.
Try running 3DMark on your PC at different resolutions; you'll see some interesting results.
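The "doubling resolution keeps ~70% of the score" observation corresponds to a sub-linear power law. A minimal sketch, with the exponent fitted to that single rule of thumb rather than to any real benchmark data:

```python
import math

# score ∝ pixels**(-alpha), with 2**(-alpha) = 0.7  →  alpha ≈ 0.515
alpha = math.log(1 / 0.7, 2)

def predicted_score(base_score, base_px, new_px):
    """Sub-linear scaling: doubling the pixel count keeps ~70% of the score."""
    return base_score * (base_px / new_px) ** alpha

# Doubling 384,000 pixels keeps 70% of a 1000-point score:
print(round(predicted_score(1000, 384_000, 768_000)))  # → 700
```

This also explains why a naive linear pixel-count rescaling (as in the earlier post) will overstate how much a lower resolution helps.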
Personally, I consider GLBenchmark 1.1 a very crude tool, good for general demonstrations but not really close to accurate.
I'm waiting for GLBenchmark 2.0, which should be a great tool for effectively comparing the devices.

Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.

Pika007 said:
To untangle the confusion:
Triangles = "polys" (polygons)
The SGS does nowhere near 90M, but on the other hand, none of the other phones are doing what the manufacturers claim either.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months, with ever-changing results. No official specs are available from ImgTec.
One thing I CAN tell you is that the GPU in the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10, which uses the ever-glorified NVIDIA Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Click to expand...
Click to collapse
Yes, it's been edited more than 5 times, but there is an official source that says the SGX545 gives only 40M polygons, so how can the SGX540 give 90M? I know numbers aren't important if there's nothing to use them on, but I just wanted to know.

I think it's due to the fact that the older chip has 2D acceleration too, while the 540 is pure 3D and we use the CPU for 2D. That's why it's faster.

It is important to note that PowerVR does not do 3D rendering using the traditional immediate-mode pipeline used by NVIDIA and ATI cards. It uses a unique tile-based rendering engine. This approach is more efficient and needs less memory bandwidth as well as raw horsepower. IIRC, the original PowerVR 3D PC card was a PCI card that could compete head to head with AGP-based cards from 3dfx and ATI at the time. Unfortunately, its unique rendering engine did not fit well with Direct3D and OpenGL, which favor traditional polygon-based rendering pipelines.
So, the 90M figure could well be the equivalent performance number for a traditional 3D rendering pipeline as compared to the tile-based PowerVR setup.
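As a toy illustration of the tile-based idea, the sketch below bins triangles into screen tiles by bounding box; each tile can then be shaded entirely in fast on-chip memory before being written out. The names and the tile size are illustrative, not PowerVR's actual pipeline:

```python
TILE = 32  # tile size in pixels (illustrative)

def tile_bins(triangles, width, height):
    """Assign each triangle (by its bounding box) to the tiles it touches."""
    cols, rows = width // TILE, height // TILE
    bins = {(tx, ty): [] for tx in range(cols) for ty in range(rows)}
    for tri in triangles:
        xs = [x for x, _ in tri]
        ys = [y for _, y in tri]
        for tx in range(min(xs) // TILE, max(xs) // TILE + 1):
            for ty in range(min(ys) // TILE, max(ys) // TILE + 1):
                if (tx, ty) in bins:
                    bins[(tx, ty)].append(tri)
    return bins

# One triangle spanning four 32x32 tiles on a 64x64 screen:
tri = [(10, 10), (40, 10), (10, 40)]
bins = tile_bins([tri], 64, 64)
print(sum(1 for b in bins.values() if b))  # → 4
```

The payoff is that depth testing and blending for each tile stay on-chip, which is where the bandwidth savings come from.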

Power VR does indeed use the traditional 3D polygon based pipeline.
Tile based rendering is in addition, not instead.
Do note that not all games (and actually, far from it) are using TBR properly (if at all).
Read the release notes and press release, it has enough details.
hoss_n2 said:
Yes, it's been edited more than 5 times, but there is an official source that says the SGX545 gives only 40M polygons, so how can the SGX540 give 90M? I know numbers aren't important if there's nothing to use them on, but I just wanted to know.
Click to expand...
Click to collapse
All the numbers given in the "official" specs for PowerVR GPUs are for a frequency of 200MHz.
Those chips can do well above 400MHz. So, for example, if an SGX530 does 14M polygons and 500Mpixels per second @200MHz, clocked up to 400MHz it'll do 28Mpolys/1Gpixels.
Though I extremely doubt Samsung has the SGX540 clocked at 600MHz in the SGS...
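The clock-scaling arithmetic in this post can be checked directly. A small sketch assuming throughput scales linearly with clock (a first-order approximation; real chips are also bandwidth-limited):

```python
BASE_MHZ = 200  # frequency at which quoted PowerVR spec figures are given

def scaled(spec_at_base, mhz):
    """First-order linear clock scaling of a quoted throughput figure."""
    return spec_at_base * mhz / BASE_MHZ

# SGX530 example from the post: 14 Mpolys and 500 Mpixels at 200 MHz.
print(scaled(14, 400), scaled(500, 400))  # → 28.0 1000.0

# Clock an SGX540 rated at 28 Mpolys @200 MHz would need to hit 90M:
print(round(90 / 28 * BASE_MHZ))  # → 643 (MHz)
```

That ~643MHz figure is exactly why the 90M claim looks implausible at any realistic phone clock.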

A practical, good example that shows off the power of the Galaxy S is Gameloft's Real Football 2010. The game hasn't got a frame lock, so it's playable on the Desire and Nexus One. Since pictures tell a thousand words and videos even more so, I'll provide this YouTube link: http://www.youtube.com/watch?v=S0DxP0sk5s0

Pika007 said:
All the numbers given in the "official" specs for PowerVR GPUs are for a frequency of 200MHz.
Those chips can do well above 400MHz. So, for example, if an SGX530 does 14M polygons and 500Mpixels per second @200MHz, clocked up to 400MHz it'll do 28Mpolys/1Gpixels.
Though I extremely doubt Samsung has the SGX540 clocked at 600MHz in the SGS...
Click to expand...
Click to collapse
This is true; however, overclocking the GPU to those numbers is silly because the memory and memory bus can't support that much data throughput anyway. I don't even think there is enough to support the standard clock rate. There is a lot more to consider than just the GPU when it comes to graphics here.

You're taking that article you read way too seriously.
Plus, we have no idea what the bandwidth limit of the Galaxy S is; we don't know what kind of memory is used, how much of it, at what frequency, etc.

WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Click to expand...
Click to collapse
+1
Re: lag, it was doing bad until I installed one of the fixes. Now I've officially entered crazy-town.

If I had to guess, it has to do with S5PC110 optimizations. When rendering polygons there are many things that contribute aside from the GPU. Think of it as maybe similar to Hybrid SLI... (but this is just a guess).
If you want to look at it in more detail, someone posted the official documentation and spec sheet for the S5PC110 a while back. I didn't get a chance to look at it, but my guess is the clock speeds and other stuff would be in there :/

WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Click to expand...
Click to collapse
Well, I don't have any lag whatsoever after the lag fix. Something else must be troubling your phone. Auto Memory Manager is a must though if you want to keep it real snappy.
Sent from my GT-I9000 using XDA App

Related

OpenGL ES 2.0 (better games) support on N1? When?

I don't really play a lot of games on my phone, but I have tried a number of them, such as Abduction, Robo Defense, Wave Blazer and Speed Forge, and it's getting visually boring... I wonder if Android will ever get real OpenGL ES 2.0 games like the iPhone has; for example, Need for Speed and Street Fighter 4.
Don't get me wrong, I love my phone very much, but from time to time everyone craves more. I mean, it's not really a hardware issue, since the N1 has better hardware specs than our friendly Apple counterparts. Come on, we want NEON floating-point support on our N1... don't you agree?
I can't wait to see the games improve.
Hopefully after the Google I/O conference on May 19th. I read somewhere that Google is going to make gaming much friendlier on Android. Until then, try Raging Thunder 2 and Asphalt 5.
I really do miss my iPhone games though :/
I made a post similar to this a while back. I'm also seeking better games. There are a couple of iPhone games I hope get ported to Android.
I hope they have games better than the iPhone... especially considering the hardware on this thing. Man, it has better specs than some of the first computers I had!! haha
erebusting said:
I hope they have games better than the iPhone...especially considering the hardware on this thing -man, it has better specs than some of the first computers I had!! haha
Click to expand...
Click to collapse
Unfortunately this specifically won't happen... we may have the far superior CPU, but we have no GPU... graphical rendering is all done CPU-side, whereas the iPhone has a GPU chip in it. Our CPU is pretty ridiculously powerful, such that it performs graphically at about 80% of what the iPhone is capable of, but we are limited by the lack of an onboard GPU.
MaximReapage said:
Unfortunately this specifically won't happen... we may have the far superior CPU, but we have no GPU... graphical rendering is all done CPU-side, whereas the iPhone has a GPU chip in it. Our CPU is pretty ridiculously powerful, such that it performs graphically at about 80% of what the iPhone is capable of, but we are limited by the lack of an onboard GPU.
Click to expand...
Click to collapse
What are you talking about? Of course the Nexus One has a GPU, capable of OpenGL ES 1.1 and 2.0.
MaximReapage said:
Unfortunately this specifically won't happen... we may have the far superior CPU, but we have no GPU... graphical rendering is all done CPU-side, whereas the iPhone has a GPU chip in it. Our CPU is pretty ridiculously powerful, such that it performs graphically at about 80% of what the iPhone is capable of, but we are limited by the lack of an onboard GPU.
Click to expand...
Click to collapse
Dude, you're completely wrong. The N1 has a separate chip for 3D rendering, a separate HARDWARE chip.
It's NOT CPU-based.
The Snapdragon processor has a dedicated GPU called the Adreno 200. A dedicated GPU is a requirement of Flash 10.1, and the reason why some older Android devices will not be able to have it. The reason some games on the iPhone look better than on the N1 is that the N1 has a lot more pixels to push. The N1 can process more polys/second than the iPhone, but because there are fewer pixels, the iPhone can render some games better than the N1.
Would be nice to see a software update that optimizes the GPU, or perhaps a new Cyanogen ROM.
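The "more pixels to push" point in numbers; a quick sketch assuming the Nexus One's 800x480 display versus the 480x320 display of iPhones contemporary with it:

```python
n1_px = 800 * 480      # Nexus One: 384,000 pixels per frame
iphone_px = 480 * 320  # pre-Retina iPhone: 153,600 pixels per frame

ratio = n1_px / iphone_px
print(ratio)  # → 2.5

# At 30 fps the N1 must fill 2.5x the pixels for the same scene:
print(n1_px * 30, iphone_px * 30)  # 11,520,000 vs 4,608,000 pixels/sec
```

So even with a higher polygon rate, the N1's GPU has a fillrate burden two and a half times larger for the same game.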
jlevy73 said:
The Snapdragon processor has a dedicated GPU called the Adreno 200. A dedicated GPU is a requirement of Flash 10.1, and the reason why some older Android devices will not be able to have it. The reason some games on the iPhone look better than on the N1 is that the N1 has a lot more pixels to push. The N1 can process more polys/second than the iPhone, but because there are fewer pixels, the iPhone can render some games better than the N1.
Would be nice to see a software update that optimizes the GPU, or perhaps a new Cyanogen ROM.
Click to expand...
Click to collapse
Awww, I thought we had the Hummingbird GPU.. I must have incorrectly read something somewhere. Oh no wait, that was the Samsung Galaxy S. It's much better than the Snapdragon; don't know by how much, but it is better. 3 times better..
Why did Google render a hummingbird in the video demonstrating the graphics processing of the Nexus -.-
Don't forget that any games designed for the Marketplace need to run on older versions of Android as well as older devices (such as the Dream). Because of that developers need to choose between making apps backwards compatible for maximum customers or writing apps better on only specific high-end devices (N1).
Eclair~ said:
Awww, I thought we had the Hummingbird GPU.. I must have incorrectly read something somewhere. Oh no wait, that was the Samsung Galaxy S. It's much better than the Snapdragon; don't know by how much, but it is better. 3 times better..
Why did Google render a hummingbird in the video demonstrating the graphics processing of the Nexus -.-
Click to expand...
Click to collapse
Yep, the Samsung Galaxy S has a PowerVR SGX540 = 90 million triangles/sec versus the Nexus One = 22 million triangles/sec.
Not sure why Google did what they did, but Samsung is using their new S5PC110 application processor. This processor pairs an ARM Cortex-A8 core with a PowerVR SGX540 GPU.
andythefan said:
Don't forget that any games designed for the Marketplace need to run on older versions of Android as well as older devices (such as the Dream). Because of that developers need to choose between making apps backwards compatible for maximum customers or writing apps better on only specific high-end devices (N1).
Click to expand...
Click to collapse
In some respects that is true, but look at Asphalt 5, for example. The game was designed with the iPhone in mind, but when run on a stock N1 it lags really badly. I think a lot of these game developers make one version instead of platform-specific ones.

Galaxy S SGX540 GPU. Any details up 'till now?

Hi everyone
For quite a long time I've been thinking about the whole "Galaxy S can do 90M polys per second" thing.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see declarations that the SGX540 can do 90 million polygons per second, and on the other I see stuff like "twice the performance of the SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 has.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed light on the matter.
I asked a Samsung rep what the difference was and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs the iPhone using the SGx535. The only data I can find seems like these two GPU's are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
I also tried getting in contact with ImgTec to find out an answer, but I haven't received a reply back. It's been two weeks now.
Also, the chip is obviously faster than the Snapdragon with its Adreno 200 GPU. I don't know if the Adreno supports TBDR; I just know it's derived from the Xenos core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* necessary given all the memory efficiencies between the Cortex A8 and TBDR on the SGX540.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
Click to expand...
Click to collapse
I think that is the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already?
The HTC Athena (HTC Advantage) failed miserably at adding the ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better, with much better FPS, though there is a problem: we might have to wait for games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and PS2. The Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held back the Xbox; only now and then did an Xbox-only game come out that really made use of its power. Years later they changed places, and the 360 held the PS3 back (don't start on which is better, lol): the PS3 has to make do with 360 ports, but when it gets a game made just for it, you really see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I cannot wait to see what it can do in the future, or what someone can port onto it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips): something about it not reading the difference between shades of black. One guy found a way to turn the chip off and movies were all good; I guess the rest of us have to wait for firmware to sort this.
thephawx said:
A: SGX540 is the latest GPU that provides better performance and more energy efficiency.
SGX535 is equipped with 2D Graphic Accelerator which SGX540 does not support.
Click to expand...
Click to collapse
Smart move, Sammy.
voodoochild2008-
I wouldn't say we'd have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133Mpixels).
Even the Motorola Droid (SGX530 at 110MHz: about ~9 Mpolys and ~280 Mpixels at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware is not yet being pushed, but weaker devices should be hitting the limit soon.
bl4ckdr4g00n- Why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at lightspeed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now, as firms are heading more towards Android. I did read about one big firm in the USA dropping marketing for Apple and heading to Android, and well, that's what you get when you try to sell old ideas. It always made me laugh that the first iPhone did a 1MP photo when others were on 3MP, then it had no video when most others did, then they hyped it when it moved to a 3MP camera and did video... OMG. OK, I'm going to stop, as it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and started to count the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for, lol
Wait, so what about the Droid X vs the Galaxy S GPU?? I know the Galaxy S is way more advanced spec-wise... but the Droid X does have a dedicated GPU. Can anyone explain??
The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it comes at the stock 200MHz (or at least 180).
At that rate it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
The 535 is a downgrade from the 540. The 540 is the latest and greatest from the PowerVR line.
Samsung did not cost-cut; they've in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly higher clocked.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D power!
This worries me as well... Seems like it might not be as great as we thought. HOWEVER, again, this is a new device that might be fixed in firmware updates. Because the hardware is obviously stellar, there's something holding it back.
Pika007 said:
The Droid X still uses the SGX530, but in the Droid X, as opposed to the original Droid, it comes at the stock 200MHz (or at least 180).
At that rate it does 12-14 Mpolygons/sec and can push out 400-500 Mpixels/sec.
Not too shabby.
Click to expand...
Click to collapse
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact that it's running 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
Wait, so what about the Droid X vs the Galaxy S GPU?? I know the Galaxy S is way more advanced spec-wise... but the Droid X does have a dedicated GPU. Can anyone explain??
Click to expand...
Click to collapse
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip in the original Droid can overclock to 1.2GHz from, what, 550MHz?
How about the CPUs in the Droid X and Galaxy S? Has anyone compared those chips? Which can overclock higher, and which is better overall?
Sorry about the poor English. Hope you guys can understand.
The CPU in the Droid X is a stock Cortex A8 running at 1GHz. The Samsung Hummingbird is a specialized version of the Cortex A8, tuned by Intrinsity, running at 1GHz.
Qualcomm, meanwhile, did a complete redesign of the Cortex A8 for the 1GHz Snapdragon. While the original A8 could only be clocked at 600MHz with a reasonable power drain, the reworked versions of the A8 could be clocked higher while maintaining better power efficiency.
An untouched Cortex A8 can do more at the same frequency than a specialized, stripped-down A8.
If anything, the Samsung Galaxy S is better balanced, leveraging the SGX540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark. Plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say: let's wait for Froyo for all devices, let all devices clear from "birth ropes" of any kind, and test again, with more than one benchmark.
Pika007 said:
TexUs-
I wouldn't take it too seriously.
Quadrant isn't too serious a benchmark. Plus, I think you can blame it on the fact that 2D acceleration in the SGS is done by the processor, while the Droid X has 2D acceleration on the GPU.
I can assure you: there is no way in hell that the SGX540 is inferior to the 530. It's at least twice as strong in everything related to 3D acceleration.
I say: let's wait for Froyo for all devices, let all devices clear from "birth ropes" of any kind, and test again, with more than one benchmark.
Click to expand...
Click to collapse
The SGS might be falling behind in I/O speeds... it is well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games...
Using the GPU to render the UI would take tons of battery power.
I prefer it being a bit less snappy but a whole lot easier on the battery.
thephawx said:
At the end of the day, you really shouldn't care too much about obsolescence. I mean, the Qualcomm dual-core Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
Click to expand...
Click to collapse
Smartphones aren't, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smartphones aren't, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power, CPU usage, etc.) has improved...
Dual-core or 2GHz devices sound nice on paper, but I worry whether battery technology can keep up.
Click to expand...
Click to collapse
I think so. The battery will be the biggest issue for smartphones in the future if it just remains at 1500mAh or even less.
A dual-core CPU could be fast but power-hungry as well.

Hummingbird VS Snapdragon

I cannot understand why everyone is saying that the Hummingbird processor is better than the Snapdragon, and that's why I started this thread.
I own an HD2 (Snapdragon) and an SGS (Hummingbird).
I've run Linpack and Quadrant on both phones, and here are the results showing that the Snapdragon is 4 to 5 times faster:
Hummingbird: Linpack 13.864, Quadrant CPU 1456
Snapdragon: Linpack 63.122, Quadrant CPU 4122
I'm only talking about the CPU, because if you go to 3D I'll agree that the Hummingbird is better (but I don't care about 3D since I don't use my device for games).
Both phones have Android 2.2 installed, and I have the Voodoo lag fix installed on the SGS.
johcos said:
I cannot understand why everyone is saying that the Hummingbird processor is better than the Snapdragon, and that's why I started this thread.
I own an HD2 (Snapdragon) and an SGS (Hummingbird).
I've run Linpack and Quadrant on both phones, and here are the results showing that the Snapdragon is 4 to 5 times faster:
Hummingbird: Linpack 13.864, Quadrant CPU 1456
Snapdragon: Linpack 63.122, Quadrant CPU 4122
I'm only talking about the CPU, because if you go to 3D I'll agree that the Hummingbird is better (but I don't care about 3D since I don't use my device for games).
Both phones have Android 2.2 installed, and I have the Voodoo lag fix installed on the SGS.
Click to expand...
Click to collapse
After looking into it for a while, I was focusing on what makes the Nexus One so much better than the other phones. On the chip level, I didn’t see it. Then it dawned on me to look at what Google had to say on the matter. Well, it was there in black and white. In their 20 May 2010 Developer’s Blog entry (http://android-developers.blogspot.com/2010/05/android-22-and-developers-goodies.html) they say that people could see a 2-5x speed increase. I think it is pointed out in an entry later in the blog dealing with NDK, which I initially missed: “ARM Advanced SIMD (a.k.a. NEON) instruction support The NEON instruction set extension can be used to perform scalar computations on integers and floating points. However, it is an optional CPU feature and will not be supported by all Android ARMv7-A based devices. The NDK includes a tiny library named “cpufeatures” that can be used by native code to test at runtime the features supported by the device’s target CPU.”
So, I guess this means that NEON is the difference. If your phone’s CPU has it and it’s enabled for JIT, you can expect higher Linpack numbers.
Click to expand...
Click to collapse
http://www.greenecomputing.com/2010...ack-scores-so-mucher-higher-than-on-my-phone/
Now stop making topics like this.
The difference you notice is software-related.
If you want a real test, run an HD video on both phones, or a PSX emulator, and see if the Nexus One is 5x faster... it is the same, if not slower, than the SGS.
Well, the SGS has hardware H.264 decoding acceleration. Also, maybe you forgot, but:
The Hummingbird comes with 32KB each of data and instruction caches, an L2 cache (the size of which can be customized), and an ARM® NEON™ multimedia extension.
Click to expand...
Click to collapse
SAMSUNG and Intrinsity Jointly Develop the World's Fastest ARM® Cortex™-A8 Processor Based Mobile Core in 45 Nanometer Low Power Process
Advanced SIMD (NEON)
The Advanced SIMD extension, marketed as NEON technology, is a combined 64- and 128-bit single instruction multiple data (SIMD) instruction set that provides standardized acceleration for media and signal processing applications. NEON can execute MP3 audio decoding on CPUs running at 10 MHz and can run the GSM AMR (Adaptive Multi-Rate) speech codec at no more than 13 MHz. It features a comprehensive instruction set, separate register files and independent execution hardware. NEON supports 8-, 16-, 32- and 64-bit integer and single-precision (32-bit) floating-point data and operates in SIMD operations for handling audio and video processing as well as graphics and gaming processing. In NEON, the SIMD supports up to 16 operations at the same time. The NEON hardware shares the same floating-point registers as used in VFP.
Click to expand...
Click to collapse
source: wiki
This means Hummingbirds are equipped with NEON. Why isn't it so effective/used in Quadrant/Linpack? My guess is that these benchmarks are not compiled/optimised for Hummingbird, just for Snapdragon.
I came from owning an iPhone and playing lots of games on it. I bought the SGS purely for the gaming performance of the Hummingbird processor.
Having seen the difference in game quality between the HTC Desire and the SGS, I know I made the right decision. Benchmarks don't mean anything.
As long as the device can run apps, games, multimedia smoothly, I dont care much about those benchmarkers, maybe they were designed and/or optimized for snapdragon prior to hummingbird.
Sent from my GT-I9000 using XDA App
i bet you anything he actually doesn't have an SGS... lol
jealousy, maybe just a troll, ignore
In terms of overall smoothness (everything, not just games) the SGS is vastly superior to any other android phone I've seen (Desire included).
Darkimmortal said:
everything
Click to expand...
Click to collapse
Really? You have to go all out and use the word "everything" when the phone can get major lockups?
"most things" sounds like a more reasonable and believable choice of words...
Sent from my GT-I9000 using XDA App
My friends, I do own an SGS (not happy with it though) and the tests that I posted were run by me.
I wasn't talking about gaming performance (I know the SGS is the best out there).
This thread was started so that we can find an answer to why this is happening.
I see some answers that cover it, but I believe not completely, because in everyday use of the phones I see that the HD2 is snappier than the SGS (not by much, but it is), even with the lagfix.
The best test, I believe, would be to have the phones encode something (like a video), but I don't know any software that could do that. (If anyone knows some, please point me to it and I'll be happy to post the results here.)
The tests you mention with PSX and multimedia won't show what we're looking for, because the SGS will clearly win thanks to the GPU.
johcos said:
My friends, I do own an SGS (not happy with it though) and the tests that I posted were run by me.
I wasn't talking about gaming performance (I know the SGS is the best out there).
This thread was started so that we can find an answer to why this is happening.
I see some answers that cover it, but I believe not completely, because in everyday use of the phones I see that the HD2 is snappier than the SGS (not by much, but it is), even with the lagfix.
The best test, I believe, would be to have the phones encode something (like a video), but I don't know any software that could do that. (If anyone knows some, please point me to it and I'll be happy to post the results here.)
The tests you mention with PSX and multimedia won't show what we're looking for, because the SGS will clearly win thanks to the GPU.
Click to expand...
Click to collapse
man, if you are not happy, then I think you should sell it. no one here will give you a satisfying answer that warms your heart. look for the Desire HD or something.
to answer your questions: I get 2100+ on Quadrant, using the Voodoo fix and OCLF on Eclair. lag free and smooth as butter.
but either way, these test scores mean nothing. they were not designed for Samsung hardware; they were designed around HTC and the Snapdragon processor.
even people who use Neocore for the GPU are wrong. if you want to test GPU performance, use NenaMark1: the SGS gives you 49+ fps while the Desire HD struggles to give you 35, whereas in Neocore the SGS gives you 56 and the Desire HD 58.
my point is most of that software was designed with HTC hardware in mind, so you can't really compare them.
just test your device for yourself. apply whatever best ROMs you find here. if it doesn't lag and is smooth for you, then ^^^^ everyone else.
the display alone is worth keeping the SGS for me. sure, people might like the iPhone 4 display more, but nothing in my eyes comes close to the contrast and colors of the Super AMOLED. watching a movie or playing a game is a joy on this device.
hell, yesterday evening a local HTC store had a demo of the Desire HD, and the guy was nice enough to let me play with it for like an hour.
as a piece of hardware it looks friggin sexy as hell. screen? beautiful large 4.3" screen. color quality compared to the SGS? fail. a little slow and laggy, though I am sure it's because of the firmware; once ROMs are out, it will be faster.
I was honestly thinking of changing to the Desire HD, but I walked away from the store kissing my SGS.
I love the Desire HD look and feel, but as of now it's not as smooth as my SGS, and the screen isn't as vibrant.
Psx emulator does not use the gpu...yet
Sent from my GT-I9000 using XDA App
android53 said:
Psx emulator does not use the gpu...yet
Sent from my GT-I9000 using XDA App
Click to expand...
Click to collapse
this. i played king of fighters on my hd2 and it was laggy as hell
smooth as butter on my galaxy s
to be honest. the day psx4droid use gpu. galaxy owners are in heaven.
Its unlikely it ever will though, even modern pc emulators barely use the gpu, only for anti aliasing
Sent from my GT-I9000 using XDA App
johcos said:
My friends, I do own an SGS (not happy with it though) and the tests that I posted were run by me.
I wasn't talking about gaming performance (I know the SGS is the best out there).
This thread was started so that we can find an answer to why this is happening.
I see some answers that cover it, but I believe not completely, because in everyday use of the phones I see that the HD2 is snappier than the SGS (not by much, but it is), even with the lagfix.
The best test, I believe, would be to have the phones encode something (like a video), but I don't know any software that could do that. (If anyone knows some, please point me to it and I'll be happy to post the results here.)
The tests you mention with PSX and multimedia won't show what we're looking for, because the SGS will clearly win thanks to the GPU.
Click to expand...
Click to collapse
Why in hell would you want to encode a video using a smartPHONE...?
It's like trying to fit your family and groceries in a sports car... not made for this, bro!
stop trying to find reasons to "not like" the SGS; if you don't like it, sell it and be done...
Snapdragon/Hummingbird scores in glbenchmark (nexus one/galaxy s):
integer: 20661/27624
float: 11173/7968
I guess glbenchmark uses native C code (hopefully with armv7 optimization), so the JIT compiler has no effect. From the scores it seems that the floating point unit in Snapdragon is faster - but most of the time it is not used (except video & games).
Anyway, a benchmark to measure the same algorithm in both native & java code with scalar & vector instructions would be great...
t1mman said:
Why in hell would you want to encode a video using a smartPHONE...?
It's like trying to fit your family and groceries in a sports car... not made for this, bro!
stop trying to find reasons to "not like" the SGS; if you don't like it, sell it and be done...
Click to expand...
Click to collapse
he's not whining, well, not in the first place, and I don't see any harm in that. I think he's trying to UNDERSTAND the reasons behind the numbers and the daily-use experience with the help of other people, and so am I. if I had to sell a phone for every problem I encountered, I would probably be without a (smart)phone by now.
i don't care about benchmarks, but if you think the SGS is smoother than an XDA-optimized HD2 (with WM 6.5 or Android 2.2), you obviously never owned an HD2. I'm not talking about games; like johcos says, Galaxy S gaming performance is not in question. but Android is not all about games. anyway, I don't think hardware is the problem here; sure, the SGS is superior in many aspects, we know that, regardless of benchmarks (even if it seems here that only benchmarks where the SGS wins are trustworthy, and the others are not good, not optimized, not realistic, meaningless for real-life performance, etc.). with a little help from Samsung and this community, the SGS will soon outperform (in real usage) all Snapdragon phones. I hope.
...when average men talk about the high tech w/o knowledge, boo
ll_l_x_l_ll said:
man, if you are not happy, then I think you should sell it. no one here will give you a satisfying answer that warms your heart. look for the Desire HD or something.
to answer your questions: I get 2100+ on Quadrant, using the Voodoo fix and OCLF on Eclair. lag free and smooth as butter.
but either way, these test scores mean nothing. they were not designed for Samsung hardware; they were designed around HTC and the Snapdragon processor.
even people who use Neocore for the GPU are wrong. if you want to test GPU performance, use NenaMark1: the SGS gives you 49+ fps while the Desire HD struggles to give you 35, whereas in Neocore the SGS gives you 56 and the Desire HD 58.
my point is most of that software was designed with HTC hardware in mind, so you can't really compare them.
just test your device for yourself. apply whatever best ROMs you find here. if it doesn't lag and is smooth for you, then ^^^^ everyone else.
the display alone is worth keeping the SGS for me. sure, people might like the iPhone 4 display more, but nothing in my eyes comes close to the contrast and colors of the Super AMOLED. watching a movie or playing a game is a joy on this device.
hell, yesterday evening a local HTC store had a demo of the Desire HD, and the guy was nice enough to let me play with it for like an hour.
as a piece of hardware it looks friggin sexy as hell. screen? beautiful large 4.3" screen. color quality compared to the SGS? fail. a little slow and laggy, though I am sure it's because of the firmware; once ROMs are out, it will be faster.
I was honestly thinking of changing to the Desire HD, but I walked away from the store kissing my SGS.
I love the Desire HD look and feel, but as of now it's not as smooth as my SGS, and the screen isn't as vibrant.
Click to expand...
Click to collapse
Honestly, couldn't agree more, even with all the problems the SGS has. The screen+hardware combination is just too overwhelming for me to swap the phone for something else.

Galaxy S 2 and FPSE (9th June)

hey all,
I thought I'd share, for those who'd be interested, my first hands-on experience with the Samsung Galaxy S2.
well, I went round my friend's house after work, as he was waiting for one to be delivered as an upgrade. anyway, he rang me to say it was here,
so I thought I'd go up and have a look.
the first thing I did when he gave it to me was... drop it, lol.
he passed it to me and it slipped right through my fingers, it was so thin.
it bounced on his hard laminated floor a few times, but he just laughed.
it is a nice device, very slim, although a bit wider than the Play.
the brightness of the screen was great,
and it was very, very snappy scrolling across homescreens and loading up apps,
and exploring the memory with Astro. I was impressed.
it's a lot lighter than the Play as well.
the Quadrant scores were off the chart, getting something like 3k+,
and in Linpack he was getting 48.
I did notice that when running Quadrant some of the textures were just plain solid shapes, no actual textures. I thought maybe that's got something to do with the Mali chipset that I heard doesn't support certain texture compression, but anyway.
I told him to fire up FPSE to see what that ran like with the new version.
mine runs great, hovering at around 45-50 fps
on most games, until I put on screen filtering,
and that has a massive impact on fps: it drops to around
27-30 fps and games become unplayable even with frame skip on max.
enhanced 3D rendering has little effect on fps, so I keep that on.
anyway, I thought that the Galaxy S2, with the 1.2 GHz dual core and the Mali-400MP,
would power along when putting screen filtering on.
so I tried Tekken 3: loaded it up and ran it, but without screen filtering,
and it was buttery smooth, stuck at 60 fps.
then I turned on screen filtering and it dropped to 30/35 fps
and was in the same unplayable state mine was.
I was gobsmacked. surely a dual core would best this???
my mate said, "so it looks like you're not missing much gaming-wise compared to mine, then." I just smiled. he hasn't got his for gaming, mind.
I don't know if there was something going on, or an incompatibility with FPSE,
but it was an interesting discovery. I tried a few other games and they were all the same.
so as far as emulation and FPSE go, we're not getting left behind because we have no dual-core CPU.
Someone please correct me here if I'm wrong, but what you're saying is that a dual-core CPU is the same speed as a single-core CPU on single-threaded code. well, yes, that is going to be the obvious result (different single-core clock speeds will still give different results).
But unless a program has been coded to use more than one processor, it will not make use of a dual-core processor.
for example, a single-core processor can work out
x = 5
x * y = 15
y = ?
and this takes as much time on a single-core processor as it would on a dual-core processor, because you are waiting for the result of 15 / x to work out what y is; each step depends on the one before it.
so until FPSE is programmed to take advantage of dual-core processors, you will end up with the same speed, or very similar.
I have both, well, sort of; the Play is gone for repair with Sony, so I bought an S2 in the meantime. to tell you, the S2, with 1 GB of RAM and a dual-core processor that can be overclocked to a stable 1.66 GHz, is way faster compared to the Play, trust me. graphics-wise both have 16M colors, but the S2 with Super AMOLED somehow does it better... I miss playing games the Play way, that's all... I guess the S2 is the world's fastest stable phone for now... at least I have both, so I know...
shotgunfool said:
Someone please correct me here if I'm wrong, but what you're saying is that a dual-core CPU is the same speed as a single-core CPU on single-threaded code. well, yes, that is going to be the obvious result (different single-core clock speeds will still give different results).
But unless a program has been coded to use more than one processor, it will not make use of a dual-core processor.
for example, a single-core processor can work out
x = 5
x * y = 15
y = ?
and this takes as much time on a single-core processor as it would on a dual-core processor, because you are waiting for the result of 15 / x to work out what y is; each step depends on the one before it.
so until FPSE is programmed to take advantage of dual-core processors, you will end up with the same speed, or very similar.
Click to expand...
Click to collapse
you're forgetting about the extra onboard RAM the S2 has, plus it has the Mali-400MP GPU
crispyduckling said:
you're forgetting about the extra onboard RAM the S2 has, plus it has the Mali-400MP GPU
Click to expand...
Click to collapse
The Tegra 2 bests that GPU in every possible way.
Also, doesn't the SGS II have an FPS limit?
Also, Exynos as a whole is not all that much more powerful than Tegra 2. I mean, the reason the SGS II does so well on benchmarks is Android 2.3; look at the Atrix 2.3 leak, its benchmarks are off the charts as well.
And my last point: there is no need for all this power if it is not going to be used. I mean, games need that type of power, and games are best played with a gamepad. Why Sony Ericsson didn't put a Tegra in the Play is beyond me.
I tried Tekken on it; I could see punches coming at me, but my thumb was always in the way. I told the owner to uninstall it, as gaming on something like that was a joke. He didn't agree until he tried my Play.
Sent from my R800a using XDA Premium App
RacecarBMW said:
The Tegra 2 bests that GPU in every possible way.
Also, doesn't the SGS II have an FPS limit?
Also, Exynos as a whole is not all that much more powerful than Tegra 2. I mean, the reason the SGS II does so well on benchmarks is Android 2.3; look at the Atrix 2.3 leak, its benchmarks are off the charts as well.
And my last point: there is no need for all this power if it is not going to be used. I mean, games need that type of power, and games are best played with a gamepad. Why Sony Ericsson didn't put a Tegra in the Play is beyond me.
Click to expand...
Click to collapse
yeah, I tried to play Tekken and I just couldn't on the touch screen.
I'm so glad I have a Play.
it's a great phone though, the GS2.
crispyduckling said:
yeah, I tried to play Tekken and I just couldn't on the touch screen.
I'm so glad I have a Play.
it's a great phone though, the GS2.
Click to expand...
Click to collapse
Reason I switched too.
Sent from my R800i using XDA App
Touchscreens suck for gaming.
No amount of processing power in the World will overcome this fact.

Android and Multi-Core Processor

Bell points the finger at chipset makers - "The way it's implemented right now, Android does not make as effective use of multiple cores as it could, and I think - frankly - some of this work could be done by the vendors who create the SoCs, but they just haven't bothered to do it. Right now the lack of software effort by some of the folks who have done their hardware implementation is a bigger disadvantage than anything else."
Click to expand...
Click to collapse
What do you think about this, guys?
He knows his stuff.
Sent from my GT-I9300
i would take it with a pinch of salt, though there are not many apps that take advantage of multi-core processors. let's see what Intel says when they have their own dual-core processor out in the market.
Pretty good, valid arguments for the most part.
I mostly agree, but I think Android makes good use of up to 2 cores; anything more than that, it doesn't at all.
There is a huge chunk of the article missing too.
Sent from my GT-I9300
full article
jaytana said:
What do you think about this, guys?
Click to expand...
Click to collapse
I think they should all be covered in honey and then thrown into a pit full of bears and honey bees. And the bears should have knives duct-taped to their feet, and the bees' stingers should be dipped in chilli sauce.
Reckless187 said:
I think they should all be covered in honey and then thrown into a pit full of bears and honey bees. And the bears should have knives duct-taped to their feet, and the bees' stingers should be dipped in chilli sauce.
Click to expand...
Click to collapse
wow, saying Android isn't ready for multi-core deserves such treatment? or had this guy committed a more serious crime previously?
Actually it's a total fail, but in Android 5 I think it can be solved.
Sent from my GT-I9300 using XDA
This was a serious problem on desktop Windows as well, back when multi-core chips first started coming out. I remember having to download patches for certain games and, in other cases, having to set the CPU affinity so certain games/apps would run on only one core and not freeze up. I am sure Android will move forward with multi-core support in the future.
simollie said:
wow, saying Android isn't ready for multi-core deserves such treatment? or had this guy committed a more serious crime previously?
Click to expand...
Click to collapse
It's a harsh but fair punishment imo. They need to sort that sh*t out, as it's totally unacceptable, or they're gonna get a taste of the cat o' nine tails.
The Android kernel is based on Linux, so this is suggesting the Linux kernel is not built to support multi-core either. Not true. There is a reason the SGS3 gets 5000+ in Quadrant while the San Diego only gets 3000+, and the San Diego is running 200 MHz faster.
Just look at the blue bar here: http://www.engadget.com/2012/05/31/orange-san-diego-benchmarks/ . My SGS3 got over 2.5K on the CPU portion alone.
What Intel said is true: Android is multicore-aware, but the OS and apps aren't taking advantage of it. When this user disabled 2 cores on the HTC One X, it made no difference at all in anything other than benchmarks.
http://forum.xda-developers.com/showpost.php?p=26094852&postcount=3
Disabling CPU cores will do nothing to the GPU, hence still getting 60 FPS. And you say that like you expected to see a difference. Those games may not be particularly CPU-intensive; that's why they continue to run fine. They will more than likely be GPU-limited.
Android is not a difficult OS to run; that's why it can run on the G1, and why AOKP can run smooth as silk on my i9000. If it can run smooth as silk on one 2-year-old 1 GHz chip, how could it go faster on a next-gen chip like in the SGS3 or HOX? In terms of just using the phone, I've not experienced any lag at all.
If you're buying a phone with dual/quad CPU cores and only expecting to use it as a phone (i.e., not to play demanding games, benchmark, mod, or whatever else), of course you won't see any advantage, and you may feel cheated. And if you disable those extra cores and still only use it as a phone, of course you won't notice any difference.
If a pocket calculator appears to calculate 1+1 instantly, and a HOX also calculates 1+1 instantly, is the pocket calculator awesome, is the HOX not using all its cores, or is what it is being asked to do simply not taxing enough to use all the CPU power the HOX has got?
I've been hearing this for some time now and is one of the reasons I didn't care that we weren't getting the quad core version of the GS3
916x10 said:
I've been hearing this for some time now and is one of the reasons I didn't care that we weren't getting the quad core version of the GS3
Click to expand...
Click to collapse
Okay folks... firstly, the Linux kernel, which Android is based on, is aware of multiple cores (that's obvious), but most applications are not aware of them, that's true!.. but it is not Android that is to blame, nor the SoC makers. This is like the flame Intel made when they wanted to say their single core is faster than a dual-core ARM, LOL (maybe Intel will make 1 core run 4 or 8 threads) <- impossible for now, dunno later
you will notice the core usage while playing HD video that requires the CPU to decode (more cores decode it faster)... and I'm not sure a single-core Intel does better than a dual-core ARM.. ~haha~
but for the average user the differences are not noticeable.. if Intel is aiming for that market, yes, that makes sense... but Android users are above-average users.. they will optimize their phones eventually, IMO
What they have failed to disclose is which SoC they did their test on, and their methodology. There's not much reason to doubt what he's saying, but you have to remember that Intel currently has only a single-core mobile SoC and is aiming to get a foothold in the mobile device ecosystem, so part of this could be throwing salt on competing products, since it's something that should be taken care of by Google optimising the CPU scheduling of their OS.
The problem is in the chipset. I currently attend SUNY Oswego, and a professor of mine, Doug Lea, works on many concurrent data structures. He is currently working on the ARM spec sheet used to make chips. The benchmarks he has done show that no matter how lucky or unlucky you get, the time it takes to do a concurrent operation is about the same, whereas on desktop chips there is a huge difference between best case and worst case. The blame falls on the people who make the chips, for now. They need to change how the chips handle concurrent operations, and then, if Android still can't use multi-core processors, it falls on the shoulders of Google.
That is my two cents on the whole situation. I just finished a concurrency course with Doug, and after many talks this is my current opinion.
Sent from my Transformer Prime TF201 using XDA
Flynny75 said:
Disabling CPU cores will do nothing to the GPU, hence still getting 60 FPS. And you say that like you expected to see a difference. Those games may not be particularly CPU-intensive; that's why they continue to run fine. They will more than likely be GPU-limited.
Android is not a difficult OS to run; that's why it can run on the G1, and why AOKP can run smooth as silk on my i9000. If it can run smooth as silk on one 2-year-old 1 GHz chip, how could it go faster on a next-gen chip like in the SGS3 or HOX? In terms of just using the phone, I've not experienced any lag at all.
If you're buying a phone with dual/quad CPU cores and only expecting to use it as a phone (i.e., not to play demanding games, benchmark, mod, or whatever else), of course you won't see any advantage, and you may feel cheated. And if you disable those extra cores and still only use it as a phone, of course you won't notice any difference.
If a pocket calculator appears to calculate 1+1 instantly, and a HOX also calculates 1+1 instantly, is the pocket calculator awesome, is the HOX not using all its cores, or is what it is being asked to do simply not taxing enough to use all the CPU power the HOX has got?
Click to expand...
Click to collapse
That doesn't mean daily tasks don't need CPU power. When I put my SGS3 in power save mode, which cuts the CPU back to 800 MHz, I feel the lag instantly when scrolling around and navigating the internet. So I conclude that per-core performance is still much more important than the number of cores. There isn't any performance difference either between the dual-core Sensation XE and the single-core Sensation XL running side by side.
The hardware needs to be out for developers to have an incentive to make use of it. It's not like Android was built from the ground up to utilize 4 cores. That said, once it's in enough hands, Android and the software running on it will be made to utilize the new hardware.
