I hope... - Galaxy S III General

The iPhone 5 really takes a good jab at the Galaxy S III, so maybe it teaches Samsung to stop being lazy with their hardware. They thought they could rest on their laurels and produce a so-so product because they had a bit of hype in their sails.
I wanted them to go above and beyond the norm when it comes to phones. The same GPU and RAM as last year? A modestly clocked CPU? They clearly didn't go above and beyond like last time. I don't see the improvement in the camera aside from snapping photos a bit faster, albeit it's supposed to have zero shutter lag. They even cut the front-facing camera back by 0.1 megapixels.
I hope they focus more on specs next time around, since they homed in on software this time. With under 2 GB of RAM, this phone isn't as future-proof.

I also thought they would 'improve' the battery. At least 2,300 mAh would have been a more substantial improvement.

megagodx said:
The iPhone 5 really takes a good jab at the Galaxy S III, so maybe it teaches Samsung to stop being lazy with their hardware. They thought they could rest on their laurels and produce a so-so product because they had a bit of hype in their sails.
The iPhone will be an average phone when it comes out, as all iPhones are. Maybe a good display and camera, but that's it.
megagodx said:
I wanted them to go above and beyond the norm when it comes to phones. The same GPU and RAM as last year? A modestly clocked CPU? They clearly didn't go above and beyond like last time. I don't see the improvement in the camera aside from snapping photos a bit faster, albeit it's supposed to have zero shutter lag. They even cut the front-facing camera back by 0.1 megapixels.
I hope they focus more on specs next time around, since they homed in on software this time. With under 2 GB of RAM, this phone isn't as future-proof.
The GPU is not the same. Just because it says Mali 400 doesn't mean it's the same; that's like saying all Core i7s are the same when there are many versions. This Mali 400 is better than its predecessor.
The RAM is not the same either, just the same amount.
The CPU is clocked at 1.4 GHz for battery consumption purposes. It can certainly be pushed.
The front camera records 720p and it's 1.9 megapixels instead of the 1.3 the HOX has. Maybe the megapixel count there has to do with the ability to record 720p? I don't really know, but 0.1 megapixels doesn't mean worse.
A phone practically doesn't need more than 1 GB of RAM. Since you don't notice any lag, the games run smoothly, and you don't get any memory-full messages, I see no reason to whine about more RAM.
Well... I understand that people want a revolution, but why would Samsung raise the bar when they are still ahead with less?

Damn, what are you running that makes you want 2 GB of RAM?
Sent from my Epic 4G Touch using XDA

General chat can go here.
Also, I deleted some crap posts; keep on topic and report dupes/junk.


Is the Galaxy S GPU really that powerful?

I have heard that the Galaxy S GPU can give 90M triangles/sec. Is that true? Some sources are claiming it only gives 28M tri/sec (http://en.wikipedia.org/wiki/PowerVR), and the higher-end SGX 545 gives 40M, so how does the SGX 540 give 90M?
hoss_n2 said:
I have heard that the Galaxy S GPU can give 90M triangles/sec. Is that true? Some sources are claiming it only gives 28M tri/sec (http://en.wikipedia.org/wiki/PowerVR), and the higher-end SGX 545 gives 40M, so how does the SGX 540 give 90M?
I don't think the number listed on Wikipedia is 'triangles' per second... it just says polys... so it could be a different shape that's harder to render?
Just my guess.
Besides, if the claimed 90M is actually the 28 million, then don't worry, because the same goes for the iPhone's GPU (the 535): it claims around 22M and wiki lists it as 14.
Aaannnnddd if you're worried about the GPU, take comfort: no 3D benchmark I've seen has even slowed it down so far, and you can see tons of videos on YouTube of Galaxy S series phones face-rolling every single other Android device in gaming FPS benchmarks. Even if it isn't as amazing as the numbers they claimed, there is no doubt that it's the best on the market at the moment, and by quite a lot too!
I'm not going to pretend that I read the comment thoroughly, but I've read a similar question. The person who seemed to know what they were talking about said that essentially the 90M is a "theoretical number" and that about half of that number is what the phone should? can? will? potentially? do... (skimming, memory, and probably comprehension make that a very difficult word to fill in accurately)... but this is how all manufacturers report their graphics capabilities (at least in smartphones, but I'll assume the same holds true for desktop/laptop graphics cards).
So, while the number is definitely overstated, it's within the standard reporting convention... and relative to other numbers, still accurate (2x as many triangles is 2x as many whether everything gets cut in half or cut by a factor of 3).
*I'll remove my fuzzy language when someone better informed than me responds*
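The point about relative numbers surviving a uniform derating is easy to demonstrate. A minimal sketch in Python, using the claimed figures quoted in this thread (90M tri/s for the SGX540, ~22M for the iPhone's SGX535); the derating factors are just illustrative:

```python
# If every vendor derates its "theoretical" triangle rate by the same
# factor, relative comparisons survive: the ratio is scale-invariant.
claimed_sgs, claimed_535 = 90e6, 22e6   # tri/s figures quoted in this thread

for derate in (1.0, 0.5, 1 / 3):        # "cut in half or cut by a factor of 3"
    real_sgs = claimed_sgs * derate
    real_535 = claimed_535 * derate
    print(f"derate {derate:.2f}: SGS/535 ratio = {real_sgs / real_535:.2f}")
# Prints the same ratio (~4.09) for every derating factor.
```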
I also read a good article (I don't know where it is now, sorry) all about how the GPU relies heavily on the memory and the bus between them, etc. For example, there could be a phone running the same GPU as another and have much less performance because it doesn't use much memory, or uses slow memory. Apparently our SGS has done pretty well in all departments.
To untangle the confusion:
Triangles = "polys" (polygons)
The SGS does nothing near 90M, but on the other hand, none of the other phones are doing what the manufacturers claim them to do.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months, with ever-changing results. No official specs are available from imgtec.
One thing I CAN tell you is that the GPU on the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10 that uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Pika007 said:
...
One thing I CAN tell you is that the GPU on the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10 that uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Well, one important fact is the pixel count in the GLBenchmark link you sent. The iPhone 4 and iPad share the same GPU. The difference in pixels is about 20%, and hence the difference between those two.
Let me make one ugly calculation to map the SGS's score to the iPhone 4's. The pixel-count difference between the i4 and the SGS is a factor of 0.625. That would make the SGS score 1146 at the iPhone resolution (or 1723 for the i4 at 800x480). Of course there are more factors involved, but this is the best estimate I can make at the moment.
The difference turns out not to be that great after all.
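For the curious, here's that calculation spelled out; a minimal sketch assuming, as the post does, that score scales inversely with pixel count (the next post explains why this is only an approximation):

```python
# Naive resolution scaling: assume the score is inversely proportional
# to pixel count. This reproduces the "ugly calculation" above.
sgs_px    = 800 * 480   # Galaxy S:  384,000 pixels
iphone_px = 960 * 640   # iPhone 4:  614,400 pixels

factor = sgs_px / iphone_px          # = 0.625, the quoted factor

# Mapped scores quoted in the post:
sgs_at_iphone_res = 1146             # SGS score scaled to the iPhone 4 resolution
i4_at_sgs_res     = 1723             # iPhone 4 score scaled to 800x480

# The raw scores those mappings imply under this naive model:
print(round(sgs_at_iphone_res / factor))   # ~1834: SGS at its native 800x480
print(round(i4_at_sgs_res * factor))       # ~1077: iPhone 4 at its native 960x640
```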
I knew this argument was going to pop up soon enough, so I'll add one VERY important factor:
Score doesn't decrease proportionally to an increase in resolution.
For example, doubling the resolution won't give half the score. More like ~70%.
Try running 3DMark on your PC at different resolutions; you'll see some interesting results.
Personally, for me GLmark 1.1 is just a very crude example, for general demonstrations. It's not really close to being accurate.
I'm waiting for GLmark 2.0, which should be a great tool to effectively compare the devices.
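To illustrate the non-proportional scaling claim with the same numbers as the previous posts: if doubling the pixel count only costs ~30% of the score, the implied exponent on pixel count is about 0.51, not 1. A toy sketch (the 1000-point score is hypothetical):

```python
import math

# Toy model: score ~ pixels ** -alpha. The post above says doubling the
# resolution leaves roughly 70% of the score, so solve 2 ** -alpha = 0.70.
alpha = -math.log(0.70) / math.log(2)   # ~0.51, i.e. clearly sublinear

def scaled_score(score, px_from, px_to, a=alpha):
    """Scale a benchmark score between two resolutions."""
    return score * (px_from / px_to) ** a

# A hypothetical 1000-point score moved from 800x480 up to 960x640:
print(round(scaled_score(1000, 800 * 480, 960 * 640, a=1.0)))  # 625 (naive, proportional)
print(round(scaled_score(1000, 800 * 480, 960 * 640)))         # ~785 (70%-per-doubling model)
```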
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Pika007 said:
To untangle the confusion:
Triangles = "polys" (polygons)
The SGS does nothing near 90M, but on the other hand, none of the other phones are doing what the manufacturers claim them to do.
Plus, the Wikipedia article is FAR from reliable; it's been edited more than 5 times over the past 2 months, with ever-changing results. No official specs are available from imgtec.
One thing I CAN tell you is that the GPU on the SGS is nothing less than a monster.
http://glbenchmark.com/result.jsp?benchmark=glpro11&certified_only=2
I'd like you to take as a reference the Compal NAZ10 that uses the ever-glorified Nvidia Tegra 2, and the iPhone 4 (SGX535).
I don't know what trick Samsung used, but there shouldn't be such a massive difference between the 535 and the 540.
Apparently someone over at Sammy did something right.
Extremely right.
Yes, it has been edited more than 5 times, but there is an official source that says the SGX 545 gives only 40M polygons, so how does the SGX 540 give 90M? I know numbers are not important if there is nothing to use them with, but I only wanted to know.
I think it's due to the fact that the older chip has 2D acceleration too, while the 540 is pure 3D and we use the CPU for 2D. That's why it's faster.
It is important to note that PowerVR does not do 3D rendering using the traditional polygon-based 3D pipeline, like those used in Nvidia and ATi cards. It uses a unique tile-based rendering engine. This approach is more efficient and uses less memory bandwidth as well as raw horsepower. IIRC, the original PowerVR 3D PC card was a PCI card that could compete head to head with AGP-based cards from 3dfx and ATi at the time. Unfortunately, its unique rendering engine does not fit well with Direct3D and OpenGL, which favor traditional polygon-based rendering pipelines.
So, the 90M figure could well be the equivalent performance number when using a traditional 3D rendering pipeline as compared to the tile-based PowerVR setup.
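To make the tile-based idea concrete, here is a deliberately oversimplified Python sketch of the two phases (bin geometry to screen tiles, then shade tile-by-tile in local memory). It is only an illustration of the concept, not of how the SGX actually works internally:

```python
# Conceptual sketch of tile-based rendering: bin geometry into small
# screen tiles first, then shade each tile entirely in fast on-chip
# memory and write it to the framebuffer once. The bandwidth saving
# comes from that single write-out per tile.
TILE = 32  # tile edge in pixels (illustrative; real hardware varies)

def bbox(tri):
    """Axis-aligned bounding box of a triangle given as three (x, y) points."""
    xs = [p[0] for p in tri]
    ys = [p[1] for p in tri]
    return min(xs), min(ys), max(xs), max(ys)

def bin_triangles(tris):
    """Phase 1 (binning): list, per tile, every triangle whose bbox touches it."""
    tiles = {}
    for tri in tris:
        x0, y0, x1, y1 = bbox(tri)
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                tiles.setdefault((tx, ty), []).append(tri)
    return tiles

def render(tris):
    """Phase 2 (per-tile shading): rasterise each tile locally, write out once."""
    framebuffer = {}
    for tile, tile_tris in bin_triangles(tris).items():
        on_chip = len(tile_tris)      # stand-in for shading in local tile memory
        framebuffer[tile] = on_chip   # the single external-memory write per tile
    return framebuffer

# One triangle covering a 100x100 area touches a 4x4 block of 32px tiles:
print(len(render([[(0, 0), (100, 0), (0, 100)]])))  # 16
```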
PowerVR does indeed use the traditional polygon-based 3D pipeline.
Tile-based rendering is in addition, not instead.
Do note that not all games (actually, far from it) use TBR properly, if at all.
Read the release notes and press release; they have enough details.
hoss_n2 said:
Yes, it has been edited more than 5 times, but there is an official source that says the SGX 545 gives only 40M polygons, so how does the SGX 540 give 90M? I know numbers are not important if there is nothing to use them with, but I only wanted to know.
All the given numbers for "official" specs of PowerVR GPUs are for a frequency of 200 MHz.
Those chips can do well above 400 MHz. So, for example, if an SGX530 does 14M polygons and 500M pixels per second at 200 MHz and you clock it up to 400, it'll do 28M polys and 1G pixels.
Though I extremely doubt Samsung has the SGX540 clocked at 600 MHz in the SGS...
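The linear scaling described above is trivial to sanity-check; a small sketch with the SGX530 figures from the post:

```python
# PowerVR throughput specs scale linearly with clock frequency.
def scale_specs(polys_per_s, pixels_per_s, base_mhz, target_mhz):
    """Scale geometry/fill rates from a base clock to a target clock."""
    k = target_mhz / base_mhz
    return polys_per_s * k, pixels_per_s * k

# SGX530 reference figures from the post: 14M polys/s and 500M pixels/s at 200 MHz.
polys, pixels = scale_specs(14e6, 500e6, base_mhz=200, target_mhz=400)
print(polys / 1e6, "M polys/s")    # 28.0 M polys/s
print(pixels / 1e9, "G pixels/s")  # 1.0 G pixels/s
```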
A practical and good example that shows off the power of the Galaxy S is Gameloft's Real Football 2010. The game hasn't got a frame lock, so it's playable on the Desire and Nexus One. Since pictures tell a thousand words and videos even more so, I'll provide this YouTube link: http://www.youtube.com/watch?v=S0DxP0sk5s0
Pika007 said:
All the given numbers for "official" specs of PowerVR GPUs are for a frequency of 200 MHz.
Those chips can do well above 400 MHz. So, for example, if an SGX530 does 14M polygons and 500M pixels per second at 200 MHz and you clock it up to 400, it'll do 28M polys and 1G pixels.
Though I extremely doubt Samsung has the SGX540 clocked at 600 MHz in the SGS...
This is true; however, overclocking the GPU to those numbers is silly because the memory and memory bus can't support that much data throughput anyway. I don't even think there is enough to support the standard clock rate. There is a lot more to consider than just the GPU when it comes to graphics here.
You're taking that article you read way too seriously.
Plus, we have no idea what the bandwidth limit of the Galaxy S is; we don't know what kind of memory is used, how much of it, at what frequency, etc.
WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
+1
Re: lag, I wasn't doing badly until I installed one of the fixes. Now I've officially entered crazy-town.
If I had to guess, it has to do with S5PC110 optimizations. When rendering polygons there are many things that contribute aside from the GPU. Think of it as maybe similar to Hybrid SLI... (but this is just a guess).
But if you want to look at it in more detail, someone posted the official documentation and spec sheet for the S5PC110 a while back... I didn't get a chance to look at it, but my guess is the clock speeds and other stuff would be there :/
WiseDuck said:
Who cares if the phone is powerful when there are no great games that take advantage of the power and when you have an OS that lags all the damn time despite the fact that Quadrant gives me 2100+. Even opening the PHONE app can take up to 10 seconds. This thing can drive me crazy at times.
Well, I don't have any lag whatsoever after the lag fix. Something else must be troubling your phone. Auto Memory Manager is needed though if you want to keep it real snappy.
Sent from my GT-I9000 using XDA App

Old Rumor "Samsung Galaxy S2 I9200" - Might be True !!!

I am sure you all remember that after the release of the Samsung Galaxy S, a Russian site disclosed information about a proposed Samsung Galaxy S2 with the following specs.
4.3” 1280x720px Super AMOLED 2 display
2GHz CPU,
1GB RAM/4GB ROM
32GB of built-in flash memory, +32GB microSD card slot
8 megapixel camera with FullHD video recording
A-GPS, Bluetooth 3.0, Wi-Fi b/g/n
3.5 mm headset jack
Accelerometer, gyroscope, proximity and ambient light sensors
OS GingerBread
Many thought that it was fake or a fanboy gone crazy... well, they must think again, because Samsung announced some amazing news yesterday:
A "DUAL CORE A9 1GHz Processor"
The system-on-a-chip (SoC) is codenamed Orion; it is based on the dual-core Cortex-A9 architecture from ARM and will have the cores running at 1GHz. Samsung claims it will be 5 times faster in graphics rendering than its current Hummingbird.
Full HD video recording and decoding at 30fps, support for cameras up to 18MP, and support for up to three displays simultaneously - two on the device itself, and one external HDTV, for example, via HDMI-out. The Cortex-A9 chipset should also deliver a 30% reduction in power consumption.
Have a look at the following sites:
http://www.intomobile.com/2010/09/0...-a9-1080p-encodedecode-triple-screen-support/
http://www.tcmagazine.com/tcm/news/...g-tegra-2-orion-dual-core-1ghz-cortex-a9-chip
http://newsblaze.com/story/2010090622341800001.bw/topstory.html
http://www.slashgear.com/samsung-1g...lash-and-5mp14-6mp-cmos-chips-outed-07100868/
The specs are just the max of what we could imagine. That's why I am skeptical. Talking about 3.0 when 2.2 is not final on the I9000 - that's a rather bad sign.
They will for sure give up on some specs when releasing the phone, and it will also depend on competitors' products.
Gingerbread is a reality, and rumors about Honeycomb (the next version after Gingerbread) have already started spreading.
Whether the source is legit or not... I wouldn't be surprised if most of those specs are true.
We haven't heard of a "Super AMOLED 2" yet, but it would be a damn fine screen if it, say, fixes the color banding on the current PenTile AMOLED (maybe use the RGBW I've heard about; that'd be awesome xD) and other issues (improved battery?).
Personally I think the current SGS size is perfect, so a 4.3" screen... well, they'd better have absolutely no wasted space around the screen lol.
It would also have to have HDMI out, either micro-HDMI or more likely the same connector as the Galaxy Tab... and on that note, these specs sound very similar to the Tab on steroids (the Tab already has a confirmed Gingerbread update + 1280x720 screen, IIRC).
If they release it (assuming it exists) in <6 months, though, I think most of us would be a little mad... lol.
EDIT: OMG, maybe "Super AMOLED 2" could be one of those foldable AMOLED screens... that would be pretty sweet haha!
dark_sith said:
The specs are just the max of what we could imagine. That's why I am skeptical. Talking about 3.0 when 2.2 is not final on the I9000 - that's a rather bad sign.
If they can achieve the screen, then nothing else listed is really that mind-blowing, particularly if the OP's sources are accurate (OP, I didn't read them, too busy). Re: Froyo/Gingerbread, Froyo is supposedly ready in a fashion that they can use it (and they have put lots of work into getting apps to scale) for the Tab. It's fairly apparent that the Galaxy Tab has been the focus of their software department for the better part of a month, but we will reap part of those rewards even if it has partially delayed our Froyo... it's definitely not too early to talk about Gingerbread, particularly if Samsung can work with Google (Google might be very interested if they like the Galaxy Tab) to become the next Google-experience device.
oswade said:
Personally I think the current SGS size is perfect, so a 4.3" screen... well, they'd better have absolutely no wasted space around the screen lol.
If they release it (assuming it exists) in <6 months, though, I think most of us would be a little mad... lol.
I agree on 4.0" being the sweet spot; then again, before big devices became in vogue I wanted the 5.0" Streak, until the SGS was announced.
In the other thread about this topic a user said 4.3" makes the rumors less likely, because why would Samsung switch sizes (I know that's not what you said, but I'll paraphrase my reply to him anyway): 'I completely disagree. First, perhaps it was too hard for them to fabricate a 4.3" screen in a device that they expect to sell like the SGS. Second, what percentage of Samsung's customers will upgrade from a 4.0" to a 4.3" 6 months later? They are trying to hit a different niche that perhaps opted for the Evo, or would prefer a 4.3" if they were out of contract.'
As far as the progression of smartphones goes, the faster the better; make my device obsolete tomorrow, because that means the newest ones are nearly literally mini-laptops... I wouldn't need one for the duration of my contract. I'd only be jealous of the screen, and that is a jealousy I can live with (and probably pay a premium to switch devices for). Everything else I'd be able to live without, and I bought this device thinking "This is the one I won't worry about for a year." Shorten that to 6 months and touché, Samsung, you've done your job and done it well. The way the smartphone market is these days, if they don't do it someone else will (well, they probably can't do screens like these for at least 12 months).
Wonder if Sammy will have fixed the GPS on the current model by the time the S2 releases.
Holy crap!
What will the I10000 have?
5" 1080p Super Super AMOLED
5 Ghz quad-core CPU
4 GB Ram/32 GB ROM
128GB of flash
12 megapixel camera with 1080 recording
???????????????????????
If they can make a dock for it, there is no reason to have a PC or a laptop anymore!
fayeznoor said:
4.3” 1280x720px Super AMOLED 2 display
2GHz CPU,
1GB RAM/4GB ROM
32GB of built-in flash memory, +32GB microSD card slot
8 megapixel camera with FullHD video recording
A-GPS, Bluetooth 3.0, Wi-Fi b/g/n
3.5 mm headset jack
Accelerometer, gyroscope, proximity and ambient light sensors
OS GingerBread
Even if it's true, you won't see it until Q4 2011.
Any update on this?
A Samsung insider should be asked; anyone know such a person?
Looks pretty awesome to me. I absolutely agree, a 4" screen is perfect. 4.3" is a bit too wide, not as easy to hold in one hand.
But I mean, let's be real, all of this depends on the software they'll put in it. Samsung is getting much better at supporting their smartphones, but still...
If this thing gets a good OS / support then it will kick ass (look at the X10, still on 1.6...).
AND, who knows what the competition will have out by then. Maybe a new iPhone 6 with even fewer buttons.
I9200 or I9020?
http://forum.xda-developers.com/showthread.php?t=816893
Necro a dead thread about a false rumor.
Good job.
4.3” 1280x720px Super AMOLED 2 display
2GHz CPU,
1GB RAM/4GB ROM
32GB of built-in flash memory, +32GB microSD card slot
8 megapixel camera with FullHD video recording
A-GPS, Bluetooth 3.0, Wi-Fi b/g/n
3.5 mm headset jack
Accelerometer, gyroscope, proximity and ambient light sensors
OS GingerBread
And a battery with nuclear-polymer technology: 10,000 mAh, standby 2,600 hours, talk time 1,300 min... It would be the ideal gadget.
Dream, only dream...

[US Variants] Adreno 225 woes

Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by obtaining hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proof experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain, and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Div033 said:
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed.
I thought the biggest problem with the Nexus One was the limited space for system files. Other Adreno 200 devices, such as the Evo 4G, have Android 4.0 running on them, and I hear it works really well.
I know that the early Exynos found on the Nexus S also works quite well.
I think any modern chipset easily surpasses the performance required for the type of GPU tasks being implemented at the system level. Games are still a concern, but compatibility is more of an issue there than performance, and the Adreno 225 is popular enough that it should be supported.
But there's always next year for really kick-ass GPUs.
Div033 said:
Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by obtaining hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proof experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain, and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Well, you also have to look at the resources available and time constraints. The introduction of LTE in the US probably forced said chip maker to make some concessions. What they lost in state-of-the-art GPU, they gained in the ridiculous profit they made this year, because theirs is the only chip that includes LTE. From their perspective, they've won the war thus far.
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators as well as high-end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
real world performance > benchmarks
My Sensation has an Adreno 220 and it plays every game and movie just fine. Sure, it doesn't get the best benchmark numbers, but it more than holds its own when playing any game. I'm sure the Adreno 225 will hold up just fine over the next couple of years. In fact I still love my Sensation. Side by side it's still just as fast as most phones out there. You only see a difference when running benchmarks, which isn't practical. I personally don't care if I'm getting 200 fps or 50. It's not like anyone can tell the difference.
I also want to note that the 220 is crazy powerful compared to the 200 and 205. It was the first GPU that Qualcomm seemed to really take a stab at gaming with. I'm fine with the 220 and can't wait to begin using the 225.
bradleyw801 said:
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators as well as high-end games so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be judged solely on the GPU (Adreno 200); the CPU, RAM, resolution, etc. have a ton to do with it as well.
Will the GPU do better in this phone due to the extra RAM, compared to the One series with the same S4/225 combo?
Div033 said:
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
[I originally posted this in the international forum, but felt it belonged here.]
You should also note that the GS2 only had a 480 x 800 (384,000 total pixels) resolution, and even at that way lower resolution its score was only slightly higher in that test, whereas the GS3 is pushing 720x1280 (921,600 total pixels). That means the GS3 is working 2.4 times harder than the GS2 and delivers almost the same gaming performance at worst, and better performance in other tests. That's not bad if you ask me, seeing as how we all thought the GS2 was a powerhouse just 12 months ago.
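That 2.4x figure is simply the ratio of the two pixel counts:

```python
gs2_px = 480 * 800    # Galaxy S II (WVGA):  384,000 pixels
gs3_px = 720 * 1280   # Galaxy S III (720p): 921,600 pixels
print(gs3_px / gs2_px)  # 2.4 -> the GS3 pushes 2.4x the pixels every frame
```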
incubus26jc said:
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be judged solely on the GPU (Adreno 200); the CPU and RAM have a ton to do with it as well.
Agreed. I haven't seen anyone suffering from GPU woes other than benchmark nuts who obsess over the highest score... everyone actually using it for real things says it works great... and honestly, my take is if you want gaming performance, don't use a phone; plug your 360 into your big screen and kick ass from the couch in HD and surround sound.
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two or three versions of Android without being hardware-bottlenecked.
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless, homescreen scrolling remains sluggish and somewhat laggy.
As far as the popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip, one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future Android versions like Key Lime Pie and Licorice or whatever they call them.
Sent from my Droid Incredible using the XDA app.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two or three versions of Android without being hardware-bottlenecked.
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless, homescreen scrolling remains sluggish and somewhat laggy.
As far as the popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip, one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future Android versions like Key Lime Pie and Licorice or whatever they call them.
Sent from my Droid Incredible using the XDA app.
Well, it might interest you to know the Adreno 225 supports DirectX 9.3 and texture compression where the Mali 400 does not. It's a requirement for Windows 8. Now, you might say "so what"... but I for one plan on trying to dual-boot or even run a version of Windows RT, perhaps in a virtual machine. Something else that the S4 Krait/Adreno package supports natively, I do believe, that the Exynos/Mali doesn't.
Sent from my DROIDX using xda premium
Div033 said:
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless, homescreen scrolling remains sluggish and somewhat laggy.
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
Voltage Spike said:
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
+1. I do believe more RAM (the largest among this generation's phones) matters for long-term usage.
The GPU is fine.
Guys, this is a Galaxy S phone. The newest one at least.
It is GUARANTEED a Jelly Bean update from Samsung (albeit late). It is also most likely getting at least 1 or 2 more major Android updates because of XDA.
Remember, ALL OF US have the SAME Galaxy S3. That is a LOT of devs that will be working on it.
Don't worry about that. It will come with time.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two or three versions of Android without being hardware-bottlenecked.
I've used ICS on my Incredible, which is virtually the same as the Evo 4G, but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless, homescreen scrolling remains sluggish and somewhat laggy.
As far as the popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip, one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future Android versions like Key Lime Pie and Licorice or whatever they call them.
Sent from my Droid Incredible using the XDA app.
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200 GPU, even for its time, was woefully weak. The 205 that followed was easily 3x or more the performance; most devices based on this GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by using the same architecture, but built on a smaller process with increased clocks and memory bandwidth, resulting in a 1.6-2x performance increase. Hence the comments about it being a lame upgrade. But look at the desktop GPUs of the AMD 6000 series: the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception versus actual results.
nativestranger said:
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200 GPU, even for its time, was woefully weak. The 205 that followed was easily 3x or more the performance; most devices based on this GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225 by using the same architecture, but built on a smaller process with increased clocks and memory bandwidth, resulting in a 1.6-2x performance increase. Hence the comments about it being a lame upgrade. But look at the desktop GPUs of the AMD 6000 series: the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception versus actual results.
Fair enough. I suppose you're right, the Adreno 200 was already severely underpowered at launch. The 225 may not be the best, but it's still up among the top-tier GPUs. I guess I have nothing to worry about. The 2 GB of RAM is definitely nice too.
Sent from my Droid Incredible using the XDA app.
Just putting this here for reference:
The Nexus One is running the Adreno 200. The HTC One V with the Adreno 205 is over 5x faster. The Rezound has an Adreno 220: over 3x faster than the One V, but also running more than 2x the resolution. The GS3 with the Adreno 225 is hard up against the vsync wall in the hover jet test and about 3x faster than the Rezound in the Egypt test. It's amazing how much Adreno has improved in just 2 years: from 2 fps on the 200 to never dropping below 60 fps on the 225.
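Chaining those quoted multipliers together gives a feel for the pace; a quick sketch (the per-step factors are the post's rough figures, so the ~45x total is equally rough and not resolution-normalised):

```python
# Rough cumulative speedup across Adreno generations, using the raw
# multipliers quoted above (not normalised for resolution differences).
steps = [
    ("Adreno 200 -> 205 (Nexus One -> One V)", 5.0),
    ("Adreno 205 -> 220 (One V -> Rezound)", 3.0),   # Rezound also pushes >2x the pixels
    ("Adreno 220 -> 225 (Rezound -> GS3, Egypt)", 3.0),
]

total = 1.0
for label, mult in steps:
    total *= mult
    print(f"{label}: x{mult:.0f} (cumulative: x{total:.0f})")
# Roughly a 45x jump from the Adreno 200 to the 225 in about two years.
```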
Thank you, this helps me make my decision too ^. Also, does having a high-resolution screen make graphics look better? Like, NOVA 3 on my SGS2 looks awesome; all the effects are there, bullet smoke and whatnot. So will these effects, or the graphics in general, look better on the SGS3 screen?
Thanks!
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime real soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and perceived lower performance relative to iOS due to the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2 GB of RAM.
Cruiserdude said:
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime real soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and perceived lower performance relative to iOS due to the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. It looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there and will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2 GB of RAM.
I see, lol. Well, that's good; I don't wanna have to buy a new phone every half a year! But will the HD resolution make any of the Gameloft games look any better than they do on my Galaxy S2 with the Mali 400 GPU? Thanks!
Sent from my SPH-D710 using XDA Premium App

All Exynos users... your reviews?

Whoever has the Exynos version, please post a detailed review.
StylishMafia said:
Whoever has the Exynos version, please post a detailed review.
+1
please!!!
Just got one yesterday and it's really fast compared to my Note 2. Here is my first AnTuTu score on it.
Just got mine. Loving the device (SM-N900). So far I can say that it is blazing fast. The cam has no lag. Although my hand would hurt using this as my primary phone, the one-handed operation features are quite brilliant; they've taken it to the next level. The screen is beautiful. Will play with it more.
I wrote some impressions in a thread in the general section. Someone asked for benchmarks, and while Google was resyncing apps, data, etc., I got a 17K Quadrant score. I've rerun it a couple of times since then and my highest broke 22K.
Synthetic benchmarks aside, it absolutely flies. However, without an S800 to compare it to, I cannot say which is faster. They're both beasts.
On the other hand... it seems like they're generally pretty close, with the S800 edging out the Exynos 5420 in most CPU-bound benchmarks. This makes me wonder - perhaps the Exynos will be the better chip once the firmware update for heterogeneous big.LITTLE processing rolls out? Time will tell.
My money is on the Mali GPU, too - this is extremely unscientific, but the 604 was light years ahead of the Adreno 225 and better than the 320 in many cases, and the 6xx (628? I forget) sounds like such an enormous improvement. I'm sure the 330 is no slouch, but I think it could get edged out... again, time will tell.
Of course, none of this really matters. What really matters is the software, and I expect the S800 will be EXTREMELY well supported. Every high-end device (and even mid-range; did you see the new Kindle Fire?) has an S800, whereas the Exynos 5420 has almost no design wins and a tarnished reputation from the chip in the Galaxy S4 (the 5410?).
If you're on the fence, I would say go with an S800. If you want to take a little gamble, get an Exynos - it won't be much worse than the S800, and in "Q4" (is that now? I don't know what Samsung's fiscal quarter schedule is like) or later, I think there's a very real chance Exynos will come out on top.

Will Google show benchmarks on stage on 9/29?

The rumored SoCs don't hold up to the competition (and by competition I obviously mean Samsung and Apple).
How are they going to make the SD810 look good at their conference? (Or will they not talk about hardware performance at all...) Either way, if it's an SD810, it's likely to get destroyed in reviewers' benchmarks.
The SD810 is much slower than the latest Exynos, and far, far, far slower than Apple's new A9 chip (it is probably even worse than Apple's year-old A8).
Many of us were hoping for either the SD820 or the Kirin 950... but there are so many people confirming the SD810...
I'm not super happy with my Nexus 6, but the SD810 doesn't seem like much of an upgrade.
Personally, I would rather they just keep the price down than engage in the ever-ridiculous spec war. An 810 would be more than enough for a high-end phone. I'm on an HTC M7 with an SD600 and it is still quite fast.
NikAmi said:
Personally, I would rather they just keep the price down than engage in the ever-ridiculous spec war. An 810 would be more than enough for a high-end phone. I'm on an HTC M7 with an SD600 and it is still quite fast.
Absolutely. I, too, am still rocking this M7. It's no quitter by any stretch of the imagination! I just want stock Android and guaranteed timely updates, which this phone will definitely provide. Additionally, it looks like the M7 as well! As long as there isn't a ridiculous camera bump and it's just an area of the phone with different materials used (for radios and other gadgets that can't pierce through aluminum), I'd be sound as a pound. Besides, with the incredible performance rumors marching across the internet (it's apparently FOUR TIMES faster than last year's Nexus 6), I think it's safe to say that this phone will be in my pocket for many years to come.
I remember an AnandTech article talking about the price of SoCs. They said that a high-end SoC costs less than $30… and the low-end ones are $10...
I don't think Google chose the SD810 because it was cheap. They chose it because there are very few options.
Apple doesn't sell their SoCs. Samsung doesn't sell much, certainly not to real competition.
Nvidia can't do an SoC at low power. That leaves Intel, QCOM, and some of the Chinese brands.
The Chinese brands may not be chosen because the Nexus needs to get approved quickly by lots of carriers, and US carriers are quicker to approve QCOM.
I would happily pay an extra $10-20 for a top-of-the-line SoC.
NikAmi said:
Personally, I would rather they just keep the price down than engage in the ever-ridiculous spec war. An 810 would be more than enough for a high-end phone. I'm on an HTC M7 with an SD600 and it is still quite fast.
The 808 is also very quick; what's important, though, is optimization (as well as I/O, RAM type, and LTE/WiFi speed). I imagine even the SD600/S4P will continue to be useful for a few more years, depending on how much of a burden future Android releases become.
Specs are good but not very useful if the software isn't on par. There's a reason why even phones like the G4, OP2, or S6 can show lag.
Sent from my LG-H950
Ace42 said:
The 808 is also very quick; what's important, though, is optimization (as well as I/O, RAM type, and LTE/WiFi speed). I imagine even the SD600/S4P will continue to be useful for a few more years, depending on how much of a burden future Android releases become.
Specs are good but not very useful if the software isn't on par. There's a reason why even phones like the G4, OP2, or S6 can show lag.
Sent from my LG-H950
Totally agree. I just want both. We had both with the N4, N5, and N6. All three of them technically had the very best SoC available at the time (at least if you don't count Apple).
The SD810 was not the 'best' even when it launched 6 months ago, which is rare for Qualcomm. I'm surprised they estimate the SD820 won't be out until next year, because that means the entire 2015 has been a QCOM disaster.
If the rumors are true, these Nexuses will be a little bit of a letdown for me in the SoC department.
Just look at 2015. Samsung dropped them, and their Exynos 7420 was far superior to the SD810. And now Amazon just announced their new Fire TV has dropped QCOM and is using a top-end MediaTek with the new A72 cores!
SyXbiT said:
Totally agree. I just want both. We had both with the N4, N5, and N6. All three of them technically had the very best SoC available at the time (at least if you don't count Apple).
The SD810 was not the 'best' even when it launched 6 months ago, which is rare for Qualcomm. I'm surprised they estimate the SD820 won't be out until next year, because that means the entire 2015 has been a QCOM disaster.
If the rumors are true, these Nexuses will be a little bit of a letdown for me in the SoC department.
Just look at 2015. Samsung dropped them, and their Exynos 7420 was far superior to the SD810. And now Amazon just announced their new Fire TV has dropped QCOM and is using a top-end MediaTek with the new A72 cores!
Unlike previous years, Qualcomm didn't have their own custom architecture (Kryo or whatever) ready for 2015, so the SD810 felt more like a placeholder. The thermal issues are likely a side effect of them using standard A57/A53 cores; they usually rely on custom architectures, like Apple does.
The SD820, according to QC, has a bunch of improvements; however, I'm unsure whether it can beat the next Exynos or A9X.
I haven't checked out the new Kindles, but if they'll use A72s, that's pretty good considering their HDX used the SD800.
Sent from my LG-H950
No, they won't talk about performance. They know the 810 is a bad chip. They've most likely already throttled it, or will do so soon afterwards, and it'll still overheat, just like all the others.
TransportedMan said:
No, they won't talk about performance. They know the 810 is a bad chip. They've most likely already throttled it, or will do so soon afterwards, and it'll still overheat, just like all the others.
New benchmarks showed up yesterday on Geekbench and they were typical 810... something along the lines of 1300 single-core, 4400 multi-core... Weak.
Sent from my SM-N920V using Tapatalk
2swizzle said:
New benchmarks showed up yesterday on Geekbench and they were typical 810... something along the lines of 1300 single-core, 4400 multi-core... Weak.
Sent from my SM-N920V using Tapatalk
Well, those scores are still higher than any other Snapdragon at the moment. It's no E7420, but it's the next best thing behind it. We also can't forget that if this is the revised SD810, it's been throttled to deal with its heating issues, so the scores could possibly be higher. I don't care what the scores on paper say. All I want to know is whether the heating issues are fixed, because some devices with the new 810 are still overheating.
2swizzle said:
New benchmarks showed up yesterday on Geekbench and they were typical 810... something along the lines of 1300 single-core, 4400 multi-core... Weak.
Sent from my SM-N920V using Tapatalk
A single benchmark doesn't matter; you have to look at sustained performance. I could just take a phone out of a fridge to run a benchmark and I can guarantee you it'll be amazing, but if I run the same benchmark several times continuously, the score will drop significantly. The key here is whether Google/Huawei can do something to keep performance consistent. Let's say I run the benchmark 5 times in a row: how much deviation will there be between the first and last run? That's the important thing here. The typical SD810's performance isn't bad; it's the throttling that everyone hates.
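A minimal sketch of that sustained-performance check: run the same workload several times back to back and report the drop from the first run to the last. `run_benchmark()` here is a hypothetical stand-in for whatever benchmark you actually use; on a throttling device, the drop percentage is the number to watch:

```python
import time

def run_benchmark():
    """Hypothetical stand-in for a real benchmark: time a fixed
    CPU-bound workload and return a score (higher is better)."""
    start = time.perf_counter()
    sum(i * i for i in range(2_000_000))   # fixed synthetic workload
    return 1.0 / (time.perf_counter() - start)

def sustained_check(runs=5):
    """Run the benchmark back to back and report the throttling drop."""
    scores = [run_benchmark() for _ in range(runs)]
    drop = (scores[0] - scores[-1]) / scores[0] * 100
    print(f"first: {scores[0]:.1f}, last: {scores[-1]:.1f}, drop: {drop:.1f}%")
    return scores

sustained_check()
```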
