GPU and benchmarks - Galaxy S 4 Q&A, Help & Troubleshooting

Hey everyone.
I'm a bit lost and I don't know which one to buy: the I9500 or the I9505.
So far I know that the Adreno 320 is fully OpenGL ES 3.0 compatible, while the PowerVR SGX544MP3 is not.
The Adreno 320 scores about 4 FPS more than the PowerVR in the GLBenchmark 2.7.0 T-Rex test.
The PowerVR scores 1-2 FPS more in GLBenchmark 2.5 Egypt.
Both GPUs score about the same in the AnTuTu and Quadrant video tests, with the PowerVR slightly ahead for a few seconds (the Adreno drops to 30 FPS for 1-2 seconds of the test, while the PowerVR stays constant at 50-60).
In AnTuTu's third test (the one with the DNA helix), the Adreno 320 stays at 30-40 FPS while the PowerVR holds a constant 60.
Both 3DMark and GLBenchmark show the PowerVR in the S4 as even weaker than the Nexus 4 and some Chinese phones.
What's the deal... what the hell is happening? Is the PowerVR really that weak in the newer graphics tests while scoring well in the older ones?
Also, is there any OpenGL ES 3.0 benchmark so we can compare the Adreno 320 (full OpenGL ES 3.0) against the PowerVR 544MP3 (OpenGL ES 2.0, with some 3.0 features exposed through an API layer), to see what the scores and quality look like? I really want to see what that 3.0 API can do, since Imagination doesn't really say. And will there be games or apps that require OpenGL ES 3.0 only, which we'll have trouble running because of this old GPU?
I'm wondering: if an OpenGL ES 3.0-only game is released in a year, what happens on the S4 Octa? It won't be able to play it, right? I have no idea how OpenGL versioning works, but I remember that a game requiring DirectX 10 won't run on DirectX 9.
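The DirectX comparison is the right intuition: a required context version is a hard floor. Here is a minimal sketch of that version gate in Python; the function and the version tuples are illustrative, not a real Android API (real apps check the GLES version the device reports, e.g. the manifest's `uses-feature glEsVersion`), and the "some 3.0 features via extensions" caveat for the SGX544MP3 is deliberately ignored here:

```python
# Hypothetical version gate, mirroring the DirectX analogy above.
# A game that requires ES (major, minor) only runs if the device
# reports an equal or higher context version.

def supports(game_requires: tuple, device_supports: tuple) -> bool:
    return device_supports >= game_requires

ADRENO_320 = (3, 0)   # full OpenGL ES 3.0 context
SGX544MP3 = (2, 0)    # ES 2.0 context (some 3.0 features only via extensions)

print(supports((3, 0), ADRENO_320))   # ES 3.0-only game on the Adreno 320
print(supports((3, 0), SGX544MP3))    # ES 3.0-only game on the PowerVR
print(supports((2, 0), SGX544MP3))    # an ES 2.0 game still runs fine
```

So under this simple model, yes: a strictly ES 3.0-only title would be gated off the SGX544MP3, unless the developer ships an ES 2.0 fallback path.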
The PowerVR really sucks here. Samsung should have used the PowerVR Series6 "Rogue".
My opinion is that the Qualcomm version scores very well; even my S3 is enough to play every single game, but the phone is short on RAM, which is why I'm replacing it now. Buying the Octa will cost me $150 more than the Qualcomm version, and I'd need to ship it overseas if I ever have a problem and need warranty service. With those $150 I could buy 2 spare batteries and the Samsung S Band instead of getting the Octa. I want the Octa, but does this phone really deserve that much attention with that old, rubbish PowerVR GPU? I don't have 4G in my area, so I don't care about 4G, though it would be nice if I travel somewhere that has it; for me HSPA+ is enough and very fast. So the only things that count here are the CPU, the GPU, and battery life. Battery life can be solved with a spare battery, which leaves the GPU and the CPU... The A15 cores are very fast, but they can use a lot of energy. So I might get 2 days of battery life with texting and calling but only 2 hours playing games and watching 1080p video, while with A9 cores I'd get something similar to the S3.
Can any developer or experienced user here answer these questions?

Nobody?

I'm in the same situation. I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S 4 and one with the Snapdragon version. They should run the same tests (Linpack, Vellamo, AnTuTu, and more) and give us the results.
For OpenGL ES 3.0 I think native support is better than going through an API layer. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and then some. I find the Snapdragon more tunable than the Exynos, but the PowerVR is still a good GPU.

Alberto96 said:
I'm in the same situation. I'm still deciding which version I should buy...
We need a user with the Exynos Galaxy S 4 and one with the Snapdragon version. They should run the same tests (Linpack, Vellamo, AnTuTu, and more) and give us the results.
For OpenGL ES 3.0 I think native support is better than going through an API layer. Also, on the Snapdragon we can reach the same performance as the Exynos via overclocking, and then some. I find the Snapdragon more tunable than the Exynos, but the PowerVR is still a good GPU.
Totally agree with you. I don't get why people say the PowerVR is better. I see that it scores higher than the Adreno in AnTuTu, but in GLBenchmark it's awful. This is my only worry right now: what happens if we put the two GPUs through a full OpenGL ES 3.0 test? Will the PowerVR throw an error, or pass with a lower score? I don't care that much about the score, just whether it can pass the test at all. If it passes, I'm sold on the Octa.
Also, I found that the Octa supports LPDDR3 at 800 MHz, which means 12.8 GB/s of bandwidth, while the S600 uses LPDDR3 at only around 600 MHz (roughly 9.6 GB/s).
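Those bandwidth figures check out with simple arithmetic, assuming a 64-bit (8-byte) memory interface and double data rate (two transfers per clock); the bus width here is an assumption about these particular SoCs, not something stated in the thread:

```python
# Back-of-the-envelope LPDDR bandwidth: clock x 2 (DDR) x bus width.

def lpddr_bandwidth_gbs(clock_mhz: float, bus_bytes: int = 8) -> float:
    transfers_per_sec = clock_mhz * 1e6 * 2   # DDR: two transfers per cycle
    return transfers_per_sec * bus_bytes / 1e9

print(lpddr_bandwidth_gbs(800))   # Exynos Octa at 800 MHz: 12.8 GB/s
print(lpddr_bandwidth_gbs(600))   # S600 at 600 MHz: 9.6 GB/s
```

So the "12.8 GB/s" figure is exact for 800 MHz, and the S600 number quoted above as "9.4 or so" comes out to 9.6 GB/s under the same assumptions.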
Sent from my GT-I9300 using Tapatalk 2

I just read (on an Italian forum) that the Exynos will be able to use all 8 cores together in the future, with kernel 3.8.
So... I think I will buy the Exynos. I'm just waiting for a reply from a friend who bought it on Expansys USA. If he receives it and everything is fine, I will buy it from that site. With Italian taxes (21%) and shipping it will cost about 730-740€.

Alberto96 said:
I just read (on an Italian forum) that the Exynos will be able to use all 8 cores together in the future, with kernel 3.8.
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; that's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at certain tasks like render-to-texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I'd expect from its raw GFLOPS numbers.

Phobos Exp-Nord said:
Why would you need those eight cores working together? How can you be sure Android will dispatch your application's threads among them properly? Just another headache. I also don't believe they will really help save battery; that's pure marketing. But the A15 is a bit more powerful than the Krait in the S600.
I think the PowerVR 544MP3 scores below the Adreno 320 in T-Rex because of the unified shader architecture in the Adreno. That test uses complex shaders on every surface, so the Octa's GPU probably runs out of fragment processors.
If you don't need a new phone right now, wait for the S800 models. I don't think the Mali T65x is good enough either. Looking at the S3's GPU: yes, it's pretty fast at certain tasks like render-to-texture, but it has some weird bottlenecks that make Horn and T-Rex much slower in FPS than I'd expect from its raw GFLOPS numbers.
Well, when you play some heavy games you need all the cores. It's also useful to use all the cores while the phone is charging, without killing the battery.
I need a new phone, because my Galaxy S I9000 is slow with new apps and Android versions. If I buy this one, waiting for an S800 version is pointless for me. The CPU is fast; the GPU maybe not as fast as the Adreno 330, but with overclocking we can boost performance a lot.

Dude, using all eight cores will simply melt the phone in your hands LOL. You'll be drinking an S4 cocktail. A quad-core CPU is enough, but a GPU never is; the same thing happens with PCs. I don't need huge FPS in T-Rex, just some solid reviews and opinions from people who really know this stuff... but so far only you two have been able to answer (I won't claim yet that this forum is full of noobs LOL).
I want a new phone because of the lack of RAM in the S3, even though it's smooth for me. I was happy to hear about the Octa version, because I wanted to try something new, but I'm kind of lost now.
Alberto96, please let me know when your friend gets that I9500. I want to get it from Expansys too (I think we already talked about this in other threads). If I buy the I9505 I'll get it from Amazon Italy, as it's cheaper than other places.
I'm just comparing:
I9500: 1 year of warranty (overseas)
I9505: 2 years of warranty (local)
I9500 = I9505 + 3 additional S4 batteries with an external charger
That's because:
740€ = 625€ + 35€ × 3 batteries (and I'd still have money left over for a Burger King and a Cola)
So... is it really worth the risk? Still nobody has answered me about OpenGL ES 3.0.
The S800 and Adreno 330 won't be in a Samsung device soon (maybe never), and 2.1-2.3 GHz looks like too much for a mobile phone. We already have heating issues with the S4 (I even have them on my S3, with the phone getting warm). Also, my laptop is a dual-core 2.1 GHz AMD, for God's sake.
@Alberto96, I beg you, when your friend gets the phone, please test it and let me know what you think.

demlasjr said:
2.1-2.3 GHz looks like too much for a mobile phone
Not when you're playing Hi10p in software.
I don't know the exact internal layout of the Exynos Octa, so it's easy to imagine a situation where two threads of a single application get dispatched to two different core domains, making data exchange between them really expensive; each domain probably has its own cache subsystem, so performance could drop even harder than with both threads running together on the A7 domain.
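The worry above is about the handoff cost when two threads keep passing data back and forth. Here is a toy Python sketch of that ping-pong pattern; Python can't pin threads to big.LITTLE core domains, so this only illustrates the communication structure (every increment crosses a thread boundary), not the actual cache-crossing cost:

```python
# Two threads bouncing a value through queues: each round-trip is one
# full handoff. On a big.LITTLE part, a handoff between core domains
# would also cross cache subsystems, which is the cost discussed above.

import threading
import queue

def ping_pong(rounds: int) -> int:
    a_to_b, b_to_a = queue.Queue(), queue.Queue()

    def worker():
        for _ in range(rounds):
            b_to_a.put(a_to_b.get() + 1)   # receive, increment, send back

    t = threading.Thread(target=worker)
    t.start()
    value = 0
    for _ in range(rounds):
        a_to_b.put(value)
        value = b_to_a.get()               # wait for the other thread
    t.join()
    return value

print(ping_pong(1000))   # 1000: every single increment crossed threads
```

A scheduler that keeps both threads in the same domain avoids the cross-domain traffic entirely, which is exactly why naive dispatch across eight cores can hurt rather than help.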

Phobos Exp-Nord said:
Not when you're playing Hi10p in software.
I don't know the exact internal layout of the Exynos Octa, so it's easy to imagine a situation where two threads of a single application get dispatched to two different core domains, making data exchange between them really expensive; each domain probably has its own cache subsystem, so performance could drop even harder than with both threads running together on the A7 domain.
Yeah, you're right there. I don't know much about that profile since I don't watch anime, but it seems to depend more on the GPU than the CPU in the S4's case. I'm fairly sure the Exynos Octa's CPU can handle it, but I'm not sure about the PowerVR. I've read that Hi10p plays at anywhere from 15-20 FPS (watchable, but still not great) on a Tegra 3 quad-core overclocked to 1.6 GHz, so there is still hope.

demlasjr said:
I've read that Hi10p plays at anywhere from 15-20 FPS
That was for 720p. I just asked in another thread about 1080p: the S4 cannot play it smoothly enough with MX Player. It's not a question of resolution; it's about playing a file from a 1080p home collection without any extra effort.

We'll see; maybe an update will be released later for these issues. I think the GPU and CPU of both variants are capable of playing such videos.

Hey guys,
http://withimagination.imgtec.com/i...or-todays-leading-platforms#comment-880303396
jumping directly from OpenGL ES 2.0 to 3.0 would create a situation where app compatibility would be severely broken across devices. But most people update their devices every two years; by that time, PowerVR Series6 would be the dominant OpenGL ES 3.0 GPU generation shipping in most devices.
It is also important to remember that the PowerVR Series5XT GPU family has been successfully holding its own against recently released competing graphics solutions despite being released almost four years ago, which in itself is an amazing feat.
So... should we trust alexvoica and go ahead with the PowerVR SGX544MP3 even though it lacks OpenGL ES 3.0? He said there was a long way to go until OpenGL ES 2.0 took over, but it wasn't as long a way as he claimed. Now every single game uses OpenGL ES 2.0, and I'm sure we'll see OpenGL ES 3.0-only games soon, not in two years.

Take a look at this: http://gfxbench.com/compare.jsp?cols=2&D1=Samsung+GT-I9500+Galaxy+S4&D2=Samsung+GT-I9505+Galaxy+S4

Related

Galaxy S SGX540 GPU. Any details up 'till now?

Hi everyone
For quite a long time I've been thinking about the whole "the Galaxy S can do 90 Mpolys per second" claim.
It sounds like total bull****.
So, after many, many hours of googling, and some unanswered mails to ImgTec, I'd like to know:
Can ANYONE provide any concrete info about the SGX540?
On one side I see declarations that the SGX540 can do 90 million polygons per second, and on the other I see claims like "twice the performance of the SGX530".
...but twice the performance of the SGX530 is EXACTLY what the SGX535 offers.
So is the 540 a rebrand of the 535? That can't be, so WHAT THE HELL is going on?
I'm seriously confused, and would be glad if anyone could shed some light on the matter.
I asked a Samsung rep what the difference was, and this is what I got:
Q: The Samsung Galaxy S uses the SGX540 vs. the iPhone's SGX535. All the data I can find suggests these two GPUs are very similar. Could you please highlight some of the differences between the SGX535 and the SGX540?
A: The SGX540 is the latest GPU and provides better performance and more energy efficiency.
The SGX535 is equipped with a 2D graphics accelerator, which the SGX540 does not have.
I also tried getting in contact with ImgTec to find an answer, but I haven't received a reply. It's been two weeks now.
Also, the chip is obviously faster than the Snapdragon with its Adreno 200 GPU. I don't know if the Adreno supports TBDR; I just know it's derived from the Xenos core. Also, the Galaxy S uses LPDDR2 RAM, so throughput is quite a bit faster, even though it's not *as* critical given the memory efficiencies of the Cortex-A8 and TBDR on the SGX540.
thephawx said:
A: The SGX540 is the latest GPU and provides better performance and more energy efficiency.
The SGX535 is equipped with a 2D graphics accelerator, which the SGX540 does not have.
I think that's the clue: cost saving for Samsung.
Besides, who needs a 2D accelerator with a CPU as fast as this one already is?
The HTC Athena (HTC Advantage) failed miserably with its added ATI 2D accelerator, which no programmers were able to take advantage of; in the end the CPU did all the work.
I'd imagine it's a 535 at 45nm. Just a guess; the CPU is also 45nm.
Having tried a few phones, the speed in games is far better, with much higher FPS, though there's a catch: we might have to wait for games that really test its power, as most are made to run on all phones.
This was the same problem with the Xbox and the PS2: the Xbox had more power, but the PS2 was king, so games were made with its hardware in mind, which held the Xbox back; only now and then did an Xbox-only game come out that really used its power. Years later they swapped places, and the 360 held the PS3 back (don't start on which is better lol): the PS3 has to make do with 360 ports, but when it gets a game made just for it you really see what it can do. Anyway, it's nice to know the Galaxy is future-proof game-wise, and I can't wait to see what it can do in the future, or what someone can port to it.
On a side note, I did read that videos run through the graphics chip, which is causing blocking in dark movies (not HD, lower-quality rips), something about it not distinguishing between shades of black. One guy found a way to turn the chip off and movies were all good; I guess the rest of us have to wait for firmware to sort this out.
thephawx said:
A: The SGX540 is the latest GPU and provides better performance and more energy efficiency.
The SGX535 is equipped with a 2D graphics accelerator, which the SGX540 does not have.
Smart move, Sammy.
voodoochild2008 -
I wouldn't say we'll have to wait that long.
Even today, Snapdragon devices don't do very well in games, since their fillrate is so low (133 Mpixels/s).
Even the Motorola Droid (SGX530 at 110 MHz; about 9 Mpolys/s and 280 Mpixels/s at that frequency) fares MUCH better in games, and actually runs pretty much everything.
So I guess the best hardware isn't being pushed to its limit yet, but weaker devices should be hitting theirs soon.
bl4ckdr4g00n - why the hell should we care? I don't see any problem with 2D content and/or videos; everything flies at light speed.
Well, I can live in hope, and I guess Apple's mess (aka the iPhone 4) will help now, as firms are heading more towards Android; I read about one big firm in the USA dropping marketing for Apple and moving to Android. That's what you get when you try to sell old ideas. It always made me laugh that the first iPhone took 1-megapixel photos when others were at 3 megapixels, and it had no video recording when most others did; then they hyped it when it moved to a 3-megapixel camera and finally did video... OMG. OK, I'm going to stop, because it makes my blood boil that people buy into Apple. Yes, they started the ball rolling, and good on them for that, but then they just sat back and counted the money as others moved on. Oh, and when I bought my Galaxy, the website did say "able to run games as powerful as the Xbox" (the old one), so is HALO too much to ask for? lol
Wait, so what about the Droid X vs. the Galaxy S GPU? I know the Galaxy S is way ahead spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
The Droid X still uses the SGX530, but unlike the original Droid it runs at the stock 200 MHz (or at least 180).
At that clock it does 12-14 Mpolys/s and can push out 400-500 Mpixels/s.
Not too shabby.
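Those throughput numbers can be sanity-checked against each other, since fillrate and polygon rate scale roughly linearly with GPU clock. A small sketch, using the per-clock rates implied by the figures quoted in this thread (forum numbers, not official ImgTec specs):

```python
# Estimate throughput at a new clock, assuming linear scaling with frequency.

def scale_rate(rate_at_ref: float, ref_mhz: float, target_mhz: float) -> float:
    return rate_at_ref * target_mhz / ref_mhz

# SGX530 at 110 MHz was quoted earlier as ~9 Mpolys/s and ~280 Mpixels/s.
print(scale_rate(280, 110, 200))   # ~509 Mpixels/s at 200 MHz
print(scale_rate(9, 110, 200))     # ~16 Mpolys/s at 200 MHz
```

The pixel figure lands right at the top of the "400-500 Mpixels/s" range quoted for the Droid X, while the polygon estimate (~16 Mpolys/s) comes out a bit above the quoted 12-14, so treat all of these forum numbers as rough.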
The 535 is a downgrade from the 540; the 540 is the latest and greatest of the PowerVR line.
Samsung did not cut costs; they in fact spent MORE to get this chip into their Galaxy S line. No one else has the 540 besides Samsung.
Like I said, it's probably just a process shrink, which means our GPU uses less power and is possibly clocked higher.
P.S. Desktop graphics cards haven't had dedicated 2D acceleration for years; removing it saves transistors for more 3D power!
This worries me as well... It seems like it might not be as great as we thought. HOWEVER, again, this is a new device, and it might be fixed in firmware updates, because the hardware is obviously stellar; something else is holding it back.
Pika007 said:
The Droid X still uses the SGX530, but unlike the original Droid it runs at the stock 200 MHz (or at least 180).
At that clock it does 12-14 Mpolys/s and can push out 400-500 Mpixels/s.
Not too shabby.
http://www.slashgear.com/droid-x-review-0793011/
"We benchmarked the DROID X using Quadrant, which measures processor, memory, I/O and 2D/3D graphics and combines them into a single numerical score. In Battery Saver mode, the DROID X scored 819, in Performance mode it scored 1,204, and in Smart mode it scored 963. In contrast, the Samsung Galaxy S running Android 2.1 – using Samsung’s own 1GHz Hummingbird CPU – scored 874, while a Google Nexus One running Android 2.2 – using Qualcomm’s 1GHz Snapdragon – scored 1,434. "
The N1's performance can be explained by the fact that it's running 2.2...
But the Droid X, even with the "inferior" GPU, outscored the Galaxy S? Why?
gdfnr123 said:
Wait, so what about the Droid X vs. the Galaxy S GPU? I know the Galaxy S is way ahead spec-wise, but the Droid X does have a dedicated GPU. Can anyone explain?
Same here. I want to know which one has the better performance as well.
Besides that, does anyone know which CPU is better between the Droid X and the Galaxy S?
I know the OMAP chip in the original Droid could be overclocked to 1.2 GHz from, what, 550 MHz?
How about the CPUs in the Droid X and the Galaxy S? Has anyone compared those chips? Which can be overclocked higher, and which is better overall?
Sorry about the poor English. I hope you guys can understand.
The CPU in the Droid X is a stock Cortex-A8 running at 1 GHz. The Samsung Hummingbird is a specialized version of the Cortex-A8, tuned by Intrinsity, running at 1 GHz.
Qualcomm likewise did a substantial redesign of the Cortex-A8 for the 1 GHz Snapdragon. While the original A8 could only be clocked at around 600 MHz with a reasonable power drain, the reworked versions can clock higher while drawing less power.
An untouched Cortex-A8 can do more at the same frequency than a specialized, stripped-down A8.
If anything, the Samsung Galaxy S is better balanced, leveraging the SGX540 as a video decoder as well. However, the Droid X should be quite snappy in most uses.
At the end of the day, you really shouldn't care too much about obsolescence. The dual-core Qualcomm Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
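The "more per clock" comparison above can be made concrete with the DMIPS/MHz figures commonly quoted for these cores. The numbers below are approximate marketing values, not measurements from this thread, so treat the output as a rough ordering rather than a benchmark:

```python
# Rough clock-for-clock comparison using commonly quoted DMIPS/MHz figures
# (approximate; real workloads vary with memory and compiler).

DMIPS_PER_MHZ = {
    "Cortex-A8 (Hummingbird)": 2.0,
    "Scorpion (Snapdragon)": 2.1,
    "Cortex-A9": 2.5,
}

def dmips(core: str, clock_mhz: float) -> float:
    return DMIPS_PER_MHZ[core] * clock_mhz

for core in DMIPS_PER_MHZ:
    print(core, dmips(core, 1000))   # all three normalized to 1 GHz
```

At equal 1 GHz clocks this puts the out-of-order A9 roughly 25-30% ahead of the A8-class cores, which matches the "at least 30% faster clock for clock" claim made later in this thread.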
TexUs -
I wouldn't take it too seriously.
Quadrant isn't a very serious benchmark. Plus, I think you can blame the fact that 2D acceleration on the SGS is done by the processor, while on the Droid X it's done by the GPU.
I can assure you there is no way in hell the SGX540 is inferior to the 530; it's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device shake off its birth pains, and test again, with more than one benchmark.
Pika007 said:
TexUs -
I wouldn't take it too seriously.
Quadrant isn't a very serious benchmark. Plus, I think you can blame the fact that 2D acceleration on the SGS is done by the processor, while on the Droid X it's done by the GPU.
I can assure you there is no way in hell the SGX540 is inferior to the 530; it's at least twice as strong in everything related to 3D acceleration.
I say let's wait for Froyo on all devices, let every device shake off its birth pains, and test again, with more than one benchmark.
The SGS might be falling behind in I/O speeds... It's well known that all the app data is stored on a slower internal SD-card partition... Has anyone tried the benchmarks with the lag fix?
Also, if only Android made use of the GPU to help render the UI... It's such a shame that the GPU only gets used in games.
Using the GPU to render the UI would take tons of battery power.
I'd prefer it being a bit less snappy but a whole lot easier on the battery.
thephawx said:
At the end of the day, you really shouldn't care too much about obsolescence. The dual-core Qualcomm Scorpion chip is probably coming out around December.
Smartphones are moving at a blisteringly fast pace.
Smartphones aren't the problem, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power draw, CPU usage, etc.) has improved...
Dual-core or 2 GHz devices sound nice on paper, but I worry whether battery technology can keep up.
TexUs said:
Smartphones aren't the problem, but batteries are.
IMO the only reason we haven't had huge battery issues is that all the other tech (screen, RAM power draw, CPU usage, etc.) has improved...
Dual-core or 2 GHz devices sound nice on paper, but I worry whether battery technology can keep up.
I think so. The battery will be the biggest issue for smartphones in the future if it stays at 1500 mAh or less.
A dual-core CPU can be fast, but power-hungry as well.

Just what you always wanted - 2400 page processor manual!

I'm probably the only person on this planet who would ever download a 20.5 MB, 2426-page document titled "S5PC110 RISC Microprocessor User's Manual", but if there are other hardware freaks out there interested, here you go:
http://pdadb.net/index.php?m=repository&id=644&c=samsung_s5pc110_microprocessor_user_manual_1.00
As you may or may not know, the S5PC110, better known as Hummingbird, is the SoC (System on a Chip) that is the brain of your Epic. Now, when you have one of those moments when you really just gotta know the memory buffer size of your H.264 encoder, or are dying to pore over a block diagram of your SGX540 GPU architecture, you can!
(Note: it does get a little dry in parts. Unless you're an ARM engineer, I suppose.)
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
Now we can overclock the GPU.
/sarcasm
cbusillo said:
Why aren't you working on porting CM6, or Gingerbread via CM7?? lol
Hah, because I know exactly squat about Android development. Hardware is more my thing, though if I find some spare time to play around with the Android SDK, maybe that can change.
Sent from my SPH-D700 using XDA App
This is actually really exciting news. RISC architectures in general, and the ARM instruction set especially, are great, and honestly it would do the world a lot of good to kick the chains of x86.
Sent from my Nexus S with a keyboard
Interesting - the complete technical design of the Hummingbird chip.
After reading your blog post on how the Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex-A9 CPUs are coming in just a couple of months? We know that in the real world the Hummingbird is more powerful than the Snapdragon and the OMAP 3600 series, even though benchmark scores tend not to reflect real-world performance.
Performance-wise, it's known that the out-of-order A9 parts are at least 30% faster clock for clock in the real world. There will be dual-core and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts: there's now the PowerVR SGX545, the Mali-400, and the Tegra 2.
Edit: there is also the successor, the Mali T604. I don't expect to see it in a phone in the near future, nor do I expect the Tegra 3. Maybe around this time next year, though.
sauron0101 said:
Interesting - the complete technical design of the Hummingbird chip.
After reading your blog post on how the Hummingbird got its extra performance, I still wonder at times: did we make the right choice in getting the Epic 4G (I bought one for $300 off contract and imported it to Canada), knowing that ARM Cortex-A9 CPUs are coming in just a couple of months? We know that in the real world the Hummingbird is more powerful than the Snapdragon and the OMAP 3600 series, even though benchmark scores tend not to reflect real-world performance.
Performance-wise, it's known that the out-of-order A9 parts are at least 30% faster clock for clock in the real world. There will be dual-core and maybe quad-core implementations. What's really up in the air is the graphics performance of the A9 parts: there's now the PowerVR SGX545, the Mali-400, and the Tegra 2.
Edit: there is also the successor, the Mali T604. I don't expect to see it in a phone in the near future, nor do I expect the Tegra 3. Maybe around this time next year, though.
You're always going to be playing catch-up. I personally think the Epic has great hardware for its time. I mean, Samsung's roadmap for 2012/13 includes their Aquila processor, a 1.2 GHz quad-core. It's going to be endless catch-up; every year there will be something that completely overshadows the rest.
gTen said:
You're always going to be playing catch-up. I personally think the Epic has great hardware for its time. I mean, Samsung's roadmap for 2012/13 includes their Aquila processor, a 1.2 GHz quad-core. It's going to be endless catch-up; every year there will be something that completely overshadows the rest.
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely anything will beat it significantly until at least 2012, when the quad-core parts begin to emerge.
It takes a year or so from the time a CPU is announced to the time it gets deployed in a handset. For example, the Snapdragon was announced in late 2008 and the first phones (HD2) arrived about a year later. If you buy an A9 dual-core part early on, you should be set for some time.
Well, I got the Epic knowing Tegra 2 was coming in a few months with next-gen performance. I was badly in need of a new phone, and the Epic, while not a Cortex-A9, is no slouch.
Sent from my SPH-D700 using XDA App
sauron0101 said:
No, but I mean, if you buy the latest technology when it's released, you'll be set for quite some time.
For example, if you were to buy one of the first Tegra 2 phones, it's unlikely anything will beat it significantly until at least 2012, when the quad-core parts begin to emerge.
That's relative. In terms of GPU performance our Hummingbird doesn't do so badly: the GPU TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon rumored for next summer's dual-core parts is also on par with our GPU. So yes, we'll lose out to newer hardware, which is to be expected, but I wouldn't call it a slouch either.
As for the Snapdragon timeline: the first phone was actually the TG01, not the HD2. That said, I guarantee that within a year, if not less, of the first Tegra release there will be a better processor out; it's bound to happen.
Edit: some benchmarks for tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I'm not sure if it's using both cores or not... also, I think Tegra 2 renders to a 16-bit framebuffer, while the Hummingbird uses 24-bit.
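That 16-bit vs. 24-bit framebuffer difference has a direct memory-traffic cost. A quick sketch, assuming a WVGA (800x480) panel at 60 fps and 4 bytes per pixel for 24-bit color stored as 32-bit words; these screen and storage assumptions are mine, and real compositors add overdraw on top:

```python
# MB/s of memory writes just to fill one full frame per refresh.

def framebuffer_mbs(width: int, height: int,
                    bytes_per_pixel: int, fps: int = 60) -> float:
    return width * height * bytes_per_pixel * fps / 1e6

print(framebuffer_mbs(800, 480, 2))   # 16-bit (RGB565): ~46 MB/s
print(framebuffer_mbs(800, 480, 4))   # 24-bit stored as 32-bit: ~92 MB/s
```

So a 16-bit buffer roughly halves the framebuffer bandwidth, which is one plausible reason a bandwidth-limited part would choose it at the cost of banding in gradients.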
gTen said:
That's relative. In terms of GPU performance our Hummingbird doesn't do so badly: the GPU TI chose to pair with the dual-core OMAP is effectively a PowerVR SGX540, and the Snapdragon rumored for next summer's dual-core parts is also on par with our GPU. So yes, we'll lose out to newer hardware, which is to be expected, but I wouldn't call it a slouch either.
As for the Snapdragon timeline: the first phone was actually the TG01, not the HD2. That said, I guarantee that within a year, if not less, of the first Tegra release there will be a better processor out; it's bound to happen.
Edit: some benchmarks for tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I'm not sure if it's using both cores or not... also, I think Tegra 2 renders to a 16-bit framebuffer, while the Hummingbird uses 24-bit.
AFAIK, dual-core support is only fully there in Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
Electrofreak said:
AFAIK, dual-core support is only fully there in Honeycomb. But if you feel like buying into NVIDIA's explanation of Tegra 2 performance, check this out: http://www.nvidia.com/content/PDF/t...-Multi-core-CPUs-in-Mobile-Devices_Ver1.2.pdf
I see. I actually read before that Gingerbread would allow dual-core support, but I guess that was delayed to Honeycomb...
Either way, this means that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least the middle of next year.
I can't open PDFs right now, but I read a whitepaper comparing Hummingbird and Tegra 2 performance on both single and dual core; is that the same one?
One thing, though: NVIDIA and ATI are quite well known for tweaking their graphics cards to perform well on benchmarks... I hope it's not the same with their CPUs :/
gTen said:
Edit: some benchmarks for tablets:
http://www.anandtech.com/show/4067/nvidia-tegra-2-graphics-performance-update
Though I'm not sure if it's using both cores or not... also, I think Tegra 2 renders to a 16-bit framebuffer, while the Hummingbird uses 24-bit.
Here are some additional benchmarks comparing the Galaxy Tab to the Viewsonic G Tablet:
http://www.anandtech.com/show/4062/samsung-galaxy-tab-the-anandtech-review/5
It's possible that the Tegra 2 isn't optimized yet. Not to mention, Honeycomb will be the release that makes the most of dual cores. However, there are lackluster performance gains in terms of graphics - most of it seems to be purely CPU gains in performance.
I'm not entirely sure that Neocore is representative of real-world performance either. It's possible that it may have been optimized for some platforms. Furthermore, I would not be surprised if Neocore gave inflated scores for the Snapdragon and its Adreno graphics platform. Of course, neither is Quadrant.
I think that real world games like Quake III based games are the way to go, although until we see more graphics demanding games, I suppose that there's little to test (we're expecting more games for Android next year).
Finally, we've gotten to the point for web browsing where it's the data connection (HSPA+, LTE, or WiMAX) that will dictate how fast pages load. It's like upgrading the CPU on a PC. I currently run an overclocked Q6600; if I were to upgrade to, say, Sandy Bridge when it comes out next year, I wouldn't expect significant improvements in real-world browsing performance.
Eventually, the smartphone market will face the same problem the PC market does. Apart from us enthusiasts who enjoy benchmarking and overclocking, apart from high-end gaming, and perhaps some specialized operations (like video encoding, which I do a bit of), you really don't need the latest and greatest CPU or 6+ GB of RAM (which many new desktops come with). Same with high-end GPUs. Storage follows the same dilemma. I imagine that as storage grows, I'll be storing FLAC music files instead of AAC, MP3, or OGG, and more video. I will also use my cell phone to replace my USB key drive. Otherwise, there's no need for bigger storage.
gTen said:
I see. I actually read before that Gingerbread would allow dual-core support, but I guess that was delayed to Honeycomb...
Either way, this would mean that even if a Tegra-based phone comes out, it won't be able to utilize both cores until at least mid next year.
I can't open PDFs right now, but I read a whitepaper comparing the performance of Hummingbird and Tegra 2 on both single core and dual core... is that the same one?
One thing though: Nvidia and ATI are well known for tweaking their graphics cards to perform well on benchmarks... I hope it's not the same with their CPUs :/
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
Electrofreak said:
Gingerbread doesn't have any dual-core optimizations. It has some JIT improvements in addition to some other minor enhancements, but according to rumor, Honeycomb is where it's at, and it's why the major tablet manufacturers are holding off releasing their Tegra 2 tablets until it's released.
And yeah, that paper shows the performance of several different Cortex A8s (including Hummingbird) compared to Tegra 2, and then goes on to compare Tegra 2 single-core performance vs dual.
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
Since I can't access the PDF: does the whitepaper state what Android version they used for their tests? For example, if they used 2.1 on the SGS and Honeycomb in their own tests, it wouldn't exactly be a fair comparison. Do they also give the actual FPS, not just percentages? We are capped on FPS, for example...
Lastly, does the test say whether the Tegra 2 was dithering at 16-bit or 24-bit?
gTen said:
I looked at:
http://androidandme.com/2010/11/new...u-will-want-to-buy-a-dual-core-mobile-device/
I'm one of Taylor's (unofficial) tech consultants, and I spoke with him regarding that article. Though, credit where it's due to Taylor, he's been digging stuff up recently that I don't have a clue about. We've talked about Honeycomb and dual-core tablets, and since Honeycomb will be the first release of Android to support tablets officially, and since Motorola seems to be holding back the release of its Tegra 2 tablet until Honeycomb (quickly checks AndroidAndMe to make sure I haven't said anything Taylor hasn't already said), and rumors say that Honeycomb will have dual-core support, it all makes sense.
But yes, the whitepaper is the one he used to base that article on.
gTen said:
Since I can't access the PDF: does the whitepaper state what Android version they used for their tests? For example, if they used 2.1 on the SGS and Honeycomb in their own tests, it wouldn't exactly be a fair comparison. Do they also give the actual FPS, not just percentages? We are capped on FPS, for example...
Lastly, does the test say whether the Tegra 2 was dithering at 16-bit or 24-bit?
Android 2.2 was used in all of their tests according to the footnotes in the document. While I believe that Android 2.2 is capable of using both cores simultaneously, I don't believe it is capable of threading them separately. But that's just my theory. I'm just going off of what the Gingerbread documentation from Google says; and unfortunately there is no mention of improved multi-core processor support in Gingerbread.
http://developer.android.com/sdk/android-2.3-highlights.html
As for FPS and the dithering... they don't really go there; the whitepaper is clearly focused on CPU performance, and so it features benchmark scores and timed results. I take it all with a pinch of salt anyhow; despite the graphs and such, it's still basically an NVIDIA advertisement.
That said, Taylor has been to one of their expos or whatever you call it, and he's convinced that the Tegra 2 GPU will perform several times better than the SGX 540 in the Galaxy S phones. I'm not so sure I'm convinced... I've seen comparable performance benchmarks come from the LG Tegra 2 phone, but Taylor claims it was an early build and he's seen even better performance since. Time will tell, I suppose...
EDIT - As for not being able to access the .pdfs, what are you talking about?! XDA app / browser and Adobe Reader!

Samsung Galaxy S III for Verizon keeps its design intact

Verizon is one of the five carriers to start offering the Samsung Galaxy S III this month, and leaked pictures show that the device will be a virtually untouched version of its international GSM sibling. Android Central got their hands on some photos of the Galaxy S III for Big Red which, excluding the 4G Verizon logo on the back, looks the same as the GSM model of the device. The main internal difference is that it runs on a dual-core 1.5GHz Qualcomm Snapdragon S4 MSM8960 chipset with 2GB of RAM.
Samsung has decided not to alter the Galaxy S III as much as it did with the Galaxy S II lineup last year, and will launch the device in the same outfit as everywhere else. This seems to be the case with the US-bound T-Mobile version and the one sold by AT&T as well.
Speaking of launch, it's yet unclear when Verizon is going to put the Galaxy S III on the shelves, but it will surely be sometime this month.
Dual core with 2 gigs of RAM? Isn't the S3 quad core with 1 gig? Hmmm
Sent from my Nexus S 4G using XDA Premium App
QBANBOY407 said:
Dual core with 2 gigs of RAM? Isn't the S3 quad core with 1 gig? Hmmm
Sent from my Nexus S 4G using XDA Premium App
2GB of RAM is nice, but I'd rather have a quad-core Exynos since I'm a gamer and that's a big selling point of the Galaxy line.
Product F(RED) said:
2GB of RAM is nice, but I'd rather have a quad-core Exynos since I'm a gamer and that's a big selling point of the Galaxy line.
Me too!
Sent from my Nexus S 4G using XDA Premium App
Snapdragon....so does that mean no Wolfson DAC for Verizon's phone?
alpha-niner64 said:
Snapdragon....so does that mean no Wolfson DAC for Verizon's phone?
Sent u a pm can u please reply ??
June 6th they are starting to take pre-orders is what I just saw.
Sent from my MB870 using xda premium
As the release of the Samsung Galaxy S III looms, I am wondering what events will take place. Do you think Big Red will officially roll out the new data plans before launch? I doubt it.
When the S3 is released this month, presumably before the new data plans roll out, will I be able to keep my grandfathered unlimited plan?
Ready to leave Apple for Android, but is the GS3 good enough?
Hey guys, not sure if I should get this phone... I'm sad it will not have the overclocked 400MHz Mali-400 GPU, but I know the S4 CPU with the Adreno 225 is a beast. I held off on the Galaxy Nexus on Big Red because of the old PowerVR SGX540 GPU (WTF), but I know ICS is much better at using the GPU, whereas 3.2 and below mostly used the CPU for graphics processing. I'm a big gamer; that's why I use the iPhone 4S. I love the PowerVR SGX543MP2, it's badass. So... what should I get? I kinda want to wait for the LG Eclipse, as I hear it comes with the Adreno 320, and that alone makes me giddy. Or does anyone know of any phones coming out with the Exynos 5250? I hear that the Mali-T604 GPU can walk all over the PowerVR SGX543MP4 in the iPad 3. So anyone, please help... should I wait for phones with the next-gen GPUs, the Adreno 320 and Mali-T604, or will my gaming needs be met by the GS3 with the S4 CPU running the Adreno 225 GPU? I'm ready to get rid of my iPhone 4S, but I still want the same graphics performance as the PowerVR SGX543MP2 in my iPhone 4S. I love the idea of Android and I can't wait to leave the dark side of Apple!!!! FTW Android!!!
P.S. I know I'm a noob here, so sorry for the long post.
jfriend33 said:
As the release of the Samsung Galaxy SIII looms, I am wondering what events will take place. Do you think big red will officially roll out the the new data plans before launch? I doubt it.
When the S3 is released this month presumably before the new data plans roll out, will I be able to keep my grandfathered unlimited plan?
If you go ahead and pre-order you will be able to keep your Unlimited. Tiered plans are supposed to begin July 1st, so anytime before then should be fine.
vader540is said:
Hey guys, not sure if I should get this phone... I'm sad it will not have the overclocked 400MHz Mali-400 GPU, but I know the S4 CPU with the Adreno 225 is a beast. I held off on the Galaxy Nexus on Big Red because of the old PowerVR SGX540 GPU (WTF), but I know ICS is much better at using the GPU, whereas 3.2 and below mostly used the CPU for graphics processing. I'm a big gamer; that's why I use the iPhone 4S. I love the PowerVR SGX543MP2, it's badass. So... what should I get? I kinda want to wait for the LG Eclipse, as I hear it comes with the Adreno 320, and that alone makes me giddy. Or does anyone know of any phones coming out with the Exynos 5250? I hear that the Mali-T604 GPU can walk all over the PowerVR SGX543MP4 in the iPad 3. So anyone, please help... should I wait for phones with the next-gen GPUs, the Adreno 320 and Mali-T604, or will my gaming needs be met by the GS3 with the S4 CPU running the Adreno 225 GPU? I'm ready to get rid of my iPhone 4S, but I still want the same graphics performance as the PowerVR SGX543MP2 in my iPhone 4S. I love the idea of Android and I can't wait to leave the dark side of Apple!!!! FTW Android!!!
P.S. I know I'm a noob here, so sorry for the long post.
Developers almost never use the highest-end hardware when designing their games, for exactly the reason you're worried about. They'll always use the hardware that's the most developer-friendly and easily sourced, in favor of something completely different like the Mali GPUs (which are more reserved for tablets anyway, if theory comes to fact). Mali is still unproven, whereas Adreno is easily sourced. I'll put money on developers favoring Adreno for some time yet.
alpha-niner64 said:
Developers almost never use the highest-end hardware when designing their games, for exactly the reason you're worried about. They'll always use the hardware that's the most developer-friendly and easily sourced, in favor of something completely different like the Mali GPUs (which are more reserved for tablets anyway, if theory comes to fact). Mali is still unproven, whereas Adreno is easily sourced. I'll put money on developers favoring Adreno for some time yet.
The exception to your statement is of course the Tegra platform, which has versions of games optimized specifically for it. But in general you're correct. The Mali is significantly more powerful than the S4's Adreno, although in real-world usage the difference would be negligible.
Does the VZW version with the Snapdragon MSM8960 have the LTE radio on the actual SoC? Or is the LTE radio on a separate chip, like on the Bionic and Galaxy Nexus? Basically, is there any battery saving with this radio by having the LTE on the SoC itself instead of on a standalone chipset?
proxus01 said:
Does the VZW version with the Snapdragon MSM8960 have the LTE radio on the actual SoC? Or is the LTE radio on a separate chip, like on the Bionic and Galaxy Nexus? Basically, is there any battery saving with this radio by having the LTE on the SoC itself instead of on a standalone chipset?
It's actually integrated into the CPU block diagram.
Sent from my ADR6400L using XDA
I found a diagram:
The Qualcomm Snapdragon S4 (MSM8960) is composed of two Krait CPUs clocked between 1.2 and 1.5 GHz, an Adreno 225 GPU, and a modem subsystem with LTE, GPS, WiFi, Bluetooth and FM support. It will be manufactured on a 28nm process and provide much lower power consumption compared to previous generations.
Snapdragon S4 Block Diagram
Key features and improvements:
New CPU micro-architecture: the Krait CPU offers a 60% performance improvement over the Scorpion CPU used in previous generations.
CPU Performance Roadmap
SIMD/VFP performance: multimedia instructions (SIMD) and floating-point operations have also been improved, but no metrics have been provided.
Optimized memory subsystem: Krait includes dual-channel memory, which is critical for the processor to be able to handle the large bandwidth requirements of multicore systems.
25-40% power improvement: thanks to asynchronous multi-core processing, the MSM8960 consumes 25 to 40% less power.
Reduced complexity: Qualcomm explains that a companion core is not needed to achieve power savings, as they use aSMP (asynchronous SMP) technology. This goes against Nvidia's choice to include a companion core in Tegra 3.
50% increase in GPU performance: the Adreno 225 GPU delivers 50% greater graphics processing power than the previous-generation Adreno 220, and six times the processing power of the Adreno 200.
Adreno GPU Power Improvements
Fully integrated 3G/4G world/multimode LTE modem: supports all of the world's leading 2G, 3G and 4G LTE standards. It also includes integrated support for multiple satellite positioning networks (GPS and GLONASS) as well as short-range radios via Bluetooth, WiFi, FM and NFC.
Programmable Hexagon DSP architecture: according to the block diagram above, there are several Hexagon DSPs, and they all contribute to the improved performance of the mobile processor. Custom DSP applications can also be written by OEMs and ISVs.
Read more: http://www.cnx-software.com/2011/10/08/qualcomm-snapdragon-s4-msm8960/
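The asynchronous SMP point above can be sketched with the standard dynamic-power model P ≈ C·V²·f. All voltage/frequency pairs below are made-up illustrative numbers, not Qualcomm's data; the point is only that letting each core pick its own operating point beats locking every core to the busiest core's frequency:

```python
# Toy dynamic-power model: P ~ C * V^2 * f.
# Voltage/frequency pairs are invented illustrative numbers, not Qualcomm data.

OPP = {0.4: 0.80, 1.0: 0.95, 1.5: 1.10}  # hypothetical freq (GHz) -> voltage (V)

def core_power(freq_ghz, c=1.0):
    """Dynamic power of one core at a given operating point."""
    return c * OPP[freq_ghz] ** 2 * freq_ghz

# Synchronous SMP: one busy core drags all four up to the top frequency.
sync_smp = 4 * core_power(1.5)

# Asynchronous SMP (aSMP): each core runs at its own frequency and voltage.
asmp = core_power(1.5) + core_power(1.0) + 2 * core_power(0.4)

print(f"sync={sync_smp:.2f}  aSMP={asmp:.2f}  saving={1 - asmp / sync_smp:.0%}")
```

With these toy numbers the mixed-load aSMP case uses roughly half the power of the all-cores-at-max case, in the same ballpark as the quoted 25-40% figure.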
alpha-niner64 said:
Developers never ever use the highest end hardware when designing their games because of exactly why you're worried. They'll always use the hardware that's the most friendly and easily-sourced in favor of something that is completely different like the Mali GPUs (which is more reserved for tablets anyways if theory comes to fact). Mali is still unproven whereas Adreno is easily sourced. I'll put money that developers will favor Adreno for some time until.
Very true, but with the growth of mobile gaming today, developers must use next-gen GPUs. For example, the Mali-T628 will have native support for OpenCL, high-res 3D, multi-threading and 64-bit. Look at TV: soon there will be ultra definition, which will make 1080p look like my original Nintendo Game Boy from the 90s. Smartphones will follow suit. Look at LG's new superphone, the Eclipse: a 5-inch display at 440 ppi, and it has an Adreno 320 GPU. Apple knows how important a smooth graphical interface is; Apple has always used high-power GPUs in its iPads and iPhones. And look at ICS: Google finally used integrated hardware graphics acceleration in the operating system, and you can tell the difference in how smooth 4.0 is compared to 2.3 and 3.0. The future looks good right about now... it's the waiting that is killing me lol
Sent from my ADR6400L using XDA
I am considering the Galaxy S III to keep my Verizon Unlimited Data plan, but I am wondering if it is rootable?
S. Prime said:
I am considering the Galaxy S III to keep my Verizon Unlimited Data plan, but I am wondering if it is rootable?
Samsung phones are ALWAYS rootable. They allow it. In fact, the bootloader just gives you a warning but lets you proceed.

CPU/Processor Showdown - HTC One vs Galaxy S4

Which processor will be better, the Exynos 5 Octa or a simple Snapdragon 600 quad?
In my POV, the Octa will be useless since it will be a battery hog, and no apps really use that many cores and that much power. The S600 will be more efficient for day-to-day use since it consumes less power and will actually be used.
-.-.-.-.-.-.-.-.-
Sent from a dark and unknown place
Galaxy Tab 2 7.0 P3100
I thought the s4 had the same processor as the One, but it was clocked to 1.9? I could be wrong. I wasn't really paying attention.
Sent from my HTC One using Tapatalk 2
I'd imagine this thread will get closed.
In the meantime, read this thread and then make a judgement because the "it uses more power so it sucks" mentality is just simply incorrect.
[Info] Exynos Octa and why you need to stop the drama about the 8 cores
AndreiLux said:
Misconception #1: Samsung didn't design this, ARM did. This is not some stupid marketing gimmick.
Misconception #2: You DON'T need to have all 8 cores online, actually, only maximum 4 cores will ever be online at the same time.
Misconception #3: If the workload is thread-light, just as we did hot-plugging on previous CPUs, big.LITTLE pairs will simply remain offline under such light loads. There is no wasted power with power-gating.
Misconception #4: As mentioned, each pair can switch independently of the other pairs. It's not the whole cluster that switches between A15 and A7 cores. You can have a single A15 online together with two A7s, while the fourth pair is completely offline.
Misconception #5: The two clusters have their own frequency planes. This means A15 cores all run on one frequency while the A7 cores can be running on another. However, inside of the frequency planes, all cores run at the same frequency, meaning there is only one frequency for all cores of a type at a time.
Addition: I am not a Samsung fanboy by any means, however, the amount of incorrect information floating around about both of these flagships is starting to get annoying.
2nd addition: Read this as well, the big.LITTLE technology being used in the Octa is pretty amazing: big.LITTLE Processing
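For readers who think in code, the pair-wise switching described in that quote can be modeled with a toy sketch. The loads and thresholds below are invented for illustration; the real decisions are made by the kernel's cpufreq/switcher machinery, not anything this simple:

```python
# Toy model of big.LITTLE pair switching as described above: each A15/A7 pair
# independently runs its A15, its A7, or stays power-gated (offline), so at
# most 4 of the 8 cores are ever online. Thresholds/loads are invented values.

OFFLINE, A7, A15 = "offline", "A7", "A15"

def pick_core(load, light=0.10, heavy=0.60):
    """Choose which core of one big.LITTLE pair handles the given load."""
    if load < light:
        return OFFLINE               # pair power-gated: no wasted power
    return A15 if load > heavy else A7

def schedule(pair_loads):
    """Each of the four pairs decides independently of the others."""
    return [pick_core(load) for load in pair_loads]

state = schedule([0.05, 0.30, 0.90, 0.70])
print(state)                          # ['offline', 'A7', 'A15', 'A15']
assert sum(s != OFFLINE for s in state) <= 4  # never more than 4 cores online
```

Note this sketches only the per-pair switching; per the quote, all online A15s share one frequency plane and all online A7s another, which the model ignores.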
I hope that the overclocking or higher clock rate doesn't produce Moment-esque results.
Alsybub said:
I thought the s4 had the same processor as the One, but it was clocked to 1.9? I could be wrong. I wasn't really paying attention.
Sent from my HTC One using Tapatalk 2
In the US that is true; they are both S600s, with the S4 having a 0.2GHz higher clock speed. Many of the other S4s will have the Exynos Octa chip.
crawlgsx said:
In the US that is true; they are both S600s, with the S4 having a 0.2GHz higher clock speed. Many of the other S4s will have the Exynos Octa chip.
Ah. I see. Different hardware for different regions. Like the One X.
Even though it's eight cores, it is probably complete overkill. Yet another bigger number for the marketing. How many apps will actually use that? How many apps use four cores at the moment?
There have been some articles about multiple cores being more for point of sale than for the end user. Even if you're signing up for a contract right now, I doubt much would be making use of it in two years' time. So the future-proofing argument is moot.
It'll be interesting to see. Of course the Galaxy builds of Android will use the cores; with things like the Stay Awake feature and PiP it is useful. Outside of the OS I can't see it being necessary.
Sent from my Transformer Prime TF201 using Tapatalk HD
The "octa"-core processor is complete bullsh*t. IMO, 2 or 4 cores are perfectly fine as long as they optimize them and perfect the hardware. Why stack 8 cores when only 4 work at a time and no app will use all that power?
They should have focused on design instead, to make it look less like a toy phone and use a better finish.
Oh the marketing..
Not HTC or whatever fanboy, just stating my opinion.
rotchcrocket04 said:
I'd imagine this thread will get closed.
In the meantime, read this thread and then make a judgement because the "it uses more power so it sucks" mentality is just simply incorrect.
[Info] Exynos Octa and why you need to stop the drama about the 8 cores
Addition: I am not a Samsung fanboy by any means, however, the amount of incorrect information floating around about both of these flagships is starting to get annoying.
2nd addition: Read this as well, the big.LITTLE technology being used in the Octa is pretty amazing: big.LITTLE Processing
Very good read, thanks for taking the time to post it. Surprised no-one has mentioned that we need this in our Ones. Would certainly help with the battery.
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only 4 of the 8 cores will ever be enabled at once, max.
The GPU in the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
Nekromantik said:
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only 4 of the 8 cores will ever be enabled at once, max.
The GPU in the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
Benchmarks show the Adreno 320 keeps up nicely. You won't see any real-world differences besides a slightly lower benchmark score.
http://forum.xda-developers.com/showthread.php?t=2191834
Sent from my ADR6425LVW using xda app-developers app
Squirrel1620 said:
Benchmarks show the Adreno 320 keeps up nicely. You won't see any real-world differences besides a slightly lower benchmark score.
http://forum.xda-developers.com/showthread.php?t=2191834
Sent from my ADR6425LVW using xda app-developers app
Those are from the S600 version.
A higher clock speed and Android 4.2 will mean it's slightly ahead.
No benchmarks from the Octa version yet.
Nekromantik said:
Those are from the S600 version.
A higher clock speed and Android 4.2 will mean it's slightly ahead.
No benchmarks from the Octa version yet.
I'll just stick with the One and wait for the 4.2 update. By then we should have custom kernels to overclock ourselves.
Sent from my ADR6425LVW using xda app-developers app
Here you go
Nekromantik said:
Saying it's an 8-core CPU is marketing, simply put.
As has been said, only 4 of the 8 cores will ever be enabled at once, max.
The GPU in the Octa might be better than the Adreno 320, but we'll have to wait for benchmarks.
"Octa" is not gimmicky or just marketing.
Octa is the name of the SoC, and there is nothing wrong with how it was named.
There are 3 implementations that can be used, and one of them has a maximum of 8 cores running at the same time.
The GS4 doesn't use that implementation, but that doesn't mean the SoC can't be "Octa". If you have a house with 8 rooms but you only want to open 4 of them, it is still an 8-room house.
hung2900 said:
"Octa" is not gimmicky or just marketing.
Octa is the name of the SoC, and there is nothing wrong with how it was named.
There are 3 implementations that can be used, and one of them has a maximum of 8 cores running at the same time.
The GS4 doesn't use that implementation, but that doesn't mean the SoC can't be "Octa". If you have a house with 8 rooms but you only want to open 4 of them, it is still an 8-room house.
How do you know all 8 can run at the same time? Has Samsung demonstrated that already? Any links?
Also, what would the speed be if all 8 were running at the same time?
Also, did you see that an Intel dual-core @ 2GHz beat the Exynos Octa in benchmarks? So all 8 cores running at a slower speed might not be very good, actually. It might even slow things down even more...
We recently demonstrated a dual core running at 3GHz at MWC in Barcelona. That chip was able to load games at crazy speeds. A game that took 15s to load on existing Exynos Quad core was loading in just 6s with our chip!
joslicx said:
We recently demonstrated a dual core running at 3GHz at MWC in Barcelona. That chip was able to load games at crazy speeds. A game that took 15s to load on existing Exynos Quad core was loading in just 6s with our chip!
And it used 3 times the energy to do it... Was that tested at all?
backfromthestorm said:
And it used 3 times the energy to do it... Was that tested at all?
It's all about bragging rights, really. Same as what Samsung is doing with the Octa.
The chip that could run at 3GHz could also very well run at 1GHz at just 0.6V (so consuming far less power than anything else on the market). A dual core at 1GHz is still good enough for all mundane tasks like playing videos or internet browsing. So in practice it would have been a very efficient solution. It was a real innovation, really. Sadly, the company did not have the money to pour more funds into the program and has shut it down.
It was demonstrated at Mobile World Congress in Barcelona in February this year.
Anyway, the point is, we did not need an extra set of power-efficient cores like Samsung is using. We ran the same cores that could hit crazy high speeds and an even crazier power-efficient mode! That's a very neat solution.
Here's a press link: http://www.itproportal.com/2013/02/25/mwc-2013-exclusive-dual-core-st-ericsson-novathor-l8580-soc-crushes-competition-benchmarks/
To quote the article:
A continuous running test monitored by an infra-red reader showed that the 3GHz prototype smartphone remained cooler as it uses less energy and in some scenarios, it could add up to five hours battery life in a normal usage scenario
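As a rough sanity check of that claim, the same dynamic-power model P ≈ C·V²·f shows why 1 GHz at 0.6 V is so cheap. Only the 1 GHz / 0.6 V point comes from the post; the 1.2 V figure for 3 GHz operation is an assumed, illustrative value:

```python
# Dynamic power P ~ C * V^2 * f. Only 1 GHz / 0.6 V is from the post above;
# the 1.2 V value for 3 GHz operation is an assumed illustrative figure.

def dyn_power(volts, freq_ghz, c=1.0):
    return c * volts ** 2 * freq_ghz

burst = dyn_power(1.2, 3.0)   # assumed 3 GHz burst operating point
frugal = dyn_power(0.6, 1.0)  # the quoted low-power point

print(f"3 GHz burst uses {burst / frugal:.0f}x the power")  # prints "... 12x ..."
```

Because voltage enters squared (and a lower voltage is what permits the lower frequency to be efficient), modest voltage drops buy outsized power savings, which is the whole argument behind that "crazier power-efficient mode".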
hung2900 said:
"Octa" is not gimmicky or just marketing.
Octa is the name of the SoC, and there is nothing wrong with how it was named.
There are 3 implementations that can be used, and one of them has a maximum of 8 cores running at the same time.
The GS4 doesn't use that implementation, but that doesn't mean the SoC can't be "Octa". If you have a house with 8 rooms but you only want to open 4 of them, it is still an 8-room house.
Actually, no. At least not in my opinion. Octa-core means 8 CPU cores on one chip.
I would see it like this:
You have 2 houses on your lawn, next to each other. Every house has 4 rooms. You have to switch houses to open up the rooms, just like the Exynos "Octa" has to, since it cannot run both CPUs at the same time.
If you were in one house with 8 rooms, you could not be in all 8 rooms at once either, but you could connect the open doors between all the rooms, and since you're in that house, you could freely walk into every room. Not so with this implementation.
I wouldn't call the Exynos "Octa" an octa-core; it's a dual-CPU system with 2x4 cores, with the difference that regular desktop dual-CPU systems can use both CPU units at once, unlike the Exynos "Octa". Still, a dual quad system comes closer than a pure octa-core system.
This is kind of a hybrid. Nice technology for a mobile device, but at the same time kind of unneeded / inefficient compared to regular quad-core systems. Even the Tegra 3 approach, with 4 active cores and 1 companion core for standby tasks, seems more efficient (in terms of used space and resources).
Ah well, let's see how the supposed and so-called "octa-core" will score in the future...
Processor differences
Okay, I know both processors are Snapdragon 600s, but why is the Galaxy S4's processor clocked at 1.9GHz while the HTC One's is clocked at 1.7GHz? Is it just an instance of Samsung overclocking the S600, or are they different variations of the same processor? I have done some research and can find no clear answer to this question, even on the Snapdragon website.
dawg00201 said:
Okay, I know both processors are Snapdragon 600s, but why is the Galaxy S4's processor clocked at 1.9GHz while the HTC One's is clocked at 1.7GHz? Is it just an instance of Samsung overclocking the S600, or are they different variations of the same processor? I have done some research and can find no clear answer to this question, even on the Snapdragon website.
They should be identical. I think it's just a manufacturer choice, but it could also be associated with thermals or battery.
Because Samsung took the higher-frequency chips, there is the possibility that they also got the "better" chips: lower voltage for the same frequency. But that's just an assumption.

Which one do you suggest when buying the S4 .... Snapdragon 600 or Exynos 5 Octa ?

I am planning to buy the S4,
but I'm not sure which is better:
Snapdragon 600
1.9 GHz
Quad-core Krait 300
or
Exynos 5 Octa
1.6 GHz octa-core
Quad-core Cortex-A15 / quad-core Cortex-A7
4G and installing ROMs are not important to me.
Which is better in terms of performance?
Which is better for gaming?
Which is better for battery life?
Octa for all.
tuxonhtc said:
Octa for all.
why?
an9093 said:
why?
The processor is stronger than the Snapdragon.
Also, the Exynos version will be the global version, so you can get updates in time.
But the PowerVR SGX and the Adreno 320 offer about the same results as GPUs.
As a result, I recommend you buy the Exynos 5 Octa.
any other suggestions?
an9093 said:
I am planning to buy the S4,
but I'm not sure which is better:
Snapdragon 600
1.9 GHz
Quad-core Krait 300
or
Exynos 5 Octa
1.6 GHz octa-core
Quad-core Cortex-A15 / quad-core Cortex-A7
4G and installing ROMs are not important to me.
Which is better in terms of performance?
Which is better for gaming?
Which is better for battery life?
Performance-wise, the Octa version will win on the CPU side and be on par otherwise. For gaming, the Snapdragon will probably win, because it seems to be becoming a standard on high-end Android devices. Battery will also be a tie: the Octa wins in low-demand tasks thanks to its A7s, and the S600 wins in higher-demand tasks because the Krait consumes less than the A15.
Sent from my iPhone 5 using Tapatalk
any other suggestions?
an9093 said:
any other suggestions?
The above suggestions are not enough?
Sent from my HTC Desire X using xda app-developers app
The Exynos 5 sweeps the floor with the Snapdragon.
nitinvaid said:
The above suggestions are not enough?
Sent from my HTC Desire X using xda app-developers app
Not enough, I want more.
Blackwolf10 said:
Exynos 5 sweeps the floor with the snapdragon
What do you mean?
an9093 said:
What do you mean?
Better CPU, same GPU, better battery life, faster updates. The only things the Snapdragon beats it at are AOSP ROM support and 4G.
an9093 said:
What do you mean?
Given how slow Samsung is to make its sources available, you'd be better off with the Snapdragon version, because Qualcomm will release its sources well before Samsung does. Consequence:
Development for the i9505 (Snapdragon version) will take off, while development for the Exynos version will be slower...
f.
Blackwolf10 said:
Better CPU, same GPU, better battery life, faster updates. The only things the Snapdragon beats it at are AOSP ROM support and 4G.
Stop saying the GPU is the same! The Adreno 320 sweeps the floor, and even the windows, with the ****ing PowerVR!
But the Octa sweeps the floor with the Qualcomm on the CPU side.
Both are important, CPU and GPU, but both phones have enough CPU. What sense does it make to have a huge CPU if you can't play some games or use some apps because the old GPU lacks OpenGL ES 3.0?
demlasjr said:
Stop saying the GPU is the same! The Adreno 320 sweeps the floor, and even the windows, with the ****ing PowerVR!
But the Octa sweeps the floor with the Qualcomm on the CPU side.
Both are important, CPU and GPU, but both phones have enough CPU. What sense does it make to have a huge CPU if you can't play some games or use some apps because the old GPU lacks OpenGL ES 3.0?
You changed your comment to something more neutral, so I guess I can say you're right. Still, it's awesome to look like a boss in the AnTuTu benchmarks.
Blackwolf10 said:
You changed your comment to something more neutral, so I guess I can say you're right. Still, it's awesome to look like a boss in the AnTuTu benchmarks.
Like a boss in AnTuTu while playing Angry Birds, while Snapdragon 600 owners cry about losing the king-of-the-hill crown but get to play the new, high-graphics games.
I found one more thing which makes the Exynos Octa better: the RAM. The Exynos has LPDDR3 at 800 MHz, which supports 12.8 GB/s of bandwidth, while the S600 has LPDDR3 at 600 MHz (? not sure), which gives around 9.X GB/s (can't find this information).
demlasjr said:
Like a boss in AnTuTu while playing Angry Birds, while Snapdragon 600 owners cry about losing the king-of-the-hill crown but get to play the new, high-graphics games.
I found one more thing which makes the Exynos Octa better: the RAM. The Exynos has LPDDR3 at 800 MHz, which supports 12.8 GB/s of bandwidth, while the S600 has LPDDR3 at 600 MHz (? not sure), which gives around 9.X GB/s (can't find this information).
Oh, and Angry Birds Friends is releasing soon.
Most of the RAM is wasted on S-apps and features anyway, and don't forget the Wolfson audio chipset in the Exynos.
Blackwolf10 said:
Oh, and Angry Birds Friends is releasing soon.
Most of the RAM is wasted on S-apps and features anyway, and don't forget the Wolfson audio chipset in the Exynos.
I know. Even so, I'm a big fan of Yamaha (I own a Yamaha grand piano), so if the audio chip inside the Qualcomm version is a Yamaha, that isn't a problem for me.
The only thing that makes me angry about the PowerVR 544MP3 is OpenGL ES 3.0. I'm not 100% sure how important it is, but I'm sure many games will soon be released with that support, and PowerVR only has an OpenGL ES 2.0 API that gives "extended support of OpenGL ES 3.0. However, if you want full 3.0 compliance choose PowerVR 6X Rogue".
What does that extended API actually do? I want to see an OpenGL ES 3.0 benchmark (GLBenchmark promised to release one) of both the Adreno 320 and the PowerVR 544.
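Side note on the ES 3.0 worry: on Android, the maximum OpenGL ES version a device supports is reported as a packed int via ConfigurationInfo.reqGlEsVersion, with the major version in the upper 16 bits and the minor in the lower 16. A minimal decoder as a sketch (the GlEsVersion class name is mine; on a real device you'd get the int from ActivityManager.getDeviceConfigurationInfo()):

```java
// Decodes Android's packed OpenGL ES version int
// (ConfigurationInfo.reqGlEsVersion): major in the upper 16 bits,
// minor in the lower 16 bits, e.g. 0x00030000 = ES 3.0.
public class GlEsVersion {
    // Extract the major version (3 for 0x00030000).
    public static int major(int packed) {
        return (packed >> 16) & 0xFFFF;
    }

    // Extract the minor version (1 for 0x00030001).
    public static int minor(int packed) {
        return packed & 0xFFFF;
    }

    // True if the device reports at least the requested ES version.
    public static boolean supportsAtLeast(int packed, int wantMajor, int wantMinor) {
        int maj = major(packed);
        return maj > wantMajor || (maj == wantMajor && minor(packed) >= wantMinor);
    }
}
```

A game needing ES 3.0 would check supportsAtLeast(version, 3, 0) and fall back to its ES 2.0 render path on a GPU like the SGX544MP3, which is why an ES 3.0-only game on an ES 2.0 device is the exception rather than the rule.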
demlasjr said:
Like a boss in AnTuTu while playing Angry Birds, while Snapdragon 600 owners cry about losing the king-of-the-hill crown but get to play the new, high-graphics games.
I found one more thing which makes the Exynos Octa better: the RAM. The Exynos has LPDDR3 at 800 MHz, which supports 12.8 GB/s of bandwidth, while the S600 has LPDDR3 at 600 MHz (? not sure), which gives around 9.X GB/s (can't find this information).
What is memory bandwidth, and how does it make the Exynos stronger?
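For what it's worth, the peak numbers quoted earlier are just clock-times-bus-width arithmetic: a DDR interface moves two transfers per clock, so a 64-bit bus at 800 MHz gives 800 x 2 x 8 bytes = 12.8 GB/s. A small sketch of that calculation (the MemBandwidth helper name is mine; the 64-bit, i.e. 2x32-bit, bus width is the commonly reported configuration, not something I can verify here):

```java
// Peak theoretical bandwidth of a DDR memory interface:
//   bandwidth (GB/s) = clock (MHz) * 2 transfers/clock * bus width (bytes) / 1000
public class MemBandwidth {
    // Returns peak bandwidth in GB/s for a double-data-rate interface.
    public static double peakGBps(int clockMHz, int busWidthBits) {
        double transfersPerUs = clockMHz * 2.0;        // DDR: two transfers per clock
        double bytesPerTransfer = busWidthBits / 8.0;  // bus width in bytes
        return transfersPerUs * bytesPerTransfer / 1000.0;
    }
}
```

The same formula at 600 MHz gives 9.6 GB/s, which lines up with the "9.X GB/s" figure mentioned for the S600. As for why it matters: the GPU streams textures and framebuffers through the same memory, so at high resolutions it is usually the component most limited by bandwidth.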
The Exynos seems to be overheating (see the relevant thread in Q&A).
A question once and for all: does the I9500 support LTE/4G or not? That's a deciding factor in my purchase. I live in Denmark. A definite yes or no should be possible, right?
Overheating + no LTE would definitely sway me towards the I9505, but it still seems too early to tell.
