Samsung Galaxy SIII - Analysis - Galaxy S III General

OK, let's get a few things into perspective. I've read a lot of negative posts about this phone, so let's go into a more technical analysis of the matter.
CPU : Exynos 4 Quad Core (1.4GHz)
For some reason people seem to think 1.4GHz vs the Tegra 3's 1.5GHz makes the Tegra a faster processor. Not true. Yes, the Tegra has a higher clock speed, but performance-wise the Exynos 4 at 1.4GHz can outperform it (due to its design). Also, people seem to think the A15 architecture would make a major difference. The A15 architecture was designed to increase power efficiency, which is something the Exynos 4 has done by shrinking its die from 45nm to 32nm (from the SII to the SIII), and with its nice 2100mAh battery I'm sure you will have all the battery power you need. (Since it can be removed, I'm sure you will be able to upgrade the battery in the future, something you cannot do with the HTC One X.)
Design
What's wrong with plastic? Although you may think plastic is "cheap", it may very well increase your phone's durability, since plastic is flexible. Drop a plastic phone on the floor and you will be left with minor scratches; drop a metal-and-glass phone on the floor and you will be left with a malformed frame, shattered glass and internal damage from the shock not being absorbed. I would also like to know what's wrong with curves? A curved design will fit snugly in your hand. As for the colour schemes, I guess they can be left for discussion, although the pebble blue looks more black than anything.
I quite like its sleek design.
P.S : Watch this before you judge.
http://www.youtube.com/watch?v=KzDwxpFgoDg
Screen
HD Super AMOLED is only possible on PenTile displays because of the major jump in resolution. If the SIII were to have a Super AMOLED+ display, it would not be able to brag about its 720p display, since at HD resolutions a Plus display is not possible. Super AMOLED+ offers improved detail, but at 720p that detail gain would be practically nil because of the extreme pixel density, which basically means the HD Super AMOLED display is far superior to the Super AMOLED Plus displays.
What is the point of PenTile displays?
The PenTile RGBG layout uses green subpixels interleaved with alternating red and blue subpixels. The human eye is most sensitive to green, especially for high-resolution luminance information. Thus the RG-BG scheme creates a colour display with one third fewer subpixels than a traditional RGB-RGB scheme but with the same measured luminance resolution (which means your eyes will notice no difference between an AMOLED+ RGB display and an AMOLED RGBG display at the same resolution).
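To put numbers on the "one third fewer subpixels" point, here's a quick back-of-the-envelope sketch (my own illustration; three subpixels per pixel for RGB stripe versus two for PenTile RGBG are the standard figures):
Code:
# Subpixel counts for a 1280x720 panel
pixels = 1280 * 720                   # 921,600 pixels

rgb_stripe = pixels * 3               # RGB stripe: 3 subpixels per pixel
pentile_rgbg = pixels * 2             # PenTile RGBG: 2 subpixels per pixel (G plus alternating R/B)

print(rgb_stripe)                     # 2764800
print(pentile_rgbg)                   # 1843200
print(1 - pentile_rgbg / rgb_stripe)  # 0.333... -> one third fewer, as stated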
Camera
Why are people complaining about it being 8MP instead of 12MP? The 8MP pictures it takes can be as good as, if not better than, a 12MP image taken by another device. Why? Megapixels are just the resolution of the image. A crisp 8MP image will look far better than a grainy 12MP image; the actual quality of the image depends on the sensor, and I'm pretty sure that would have been upgraded since the SII.
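To illustrate why the megapixel number matters less than people think, consider the arithmetic (the output sizes below are typical for 8MP and 12MP sensors, used here purely as an example):
Code:
# Megapixels vs. actual detail gained (typical output sizes)
w8, h8 = 3264, 2448        # ~8MP
w12, h12 = 4000, 3000      # 12MP

print(w8 * h8, w12 * h12)  # 7990272 12000000 pixels
print(round(w12 / w8, 2))  # 1.23 -> only ~23% more linear resolution
# A 50% jump in megapixels buys barely a quarter more detail in each
# dimension, which a noisier sensor or worse optics easily wipes out.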
RAM
Oh please... when will you ever use 1GB of RAM...
So let's compare this to the HTC One X (I honestly have nothing against this phone; I'm just going to be comparing specifications).
CPU : 1.5GHz Tegra 3 vs 1.4GHz Exynos 4
They are both quad-cores, so both should be able to run multi-threaded processes very well compared to dual-cores. The Tegra 3 does have its extra core for idling (when it's not doing anything) to save battery life, although I feel this is a bit pointless if the Exynos 4 can just under-clock itself to save battery when idling; this also gives it the ability to still use all four cores for multi-threading even when idle. As for overclocking headroom, I feel the 40nm Tegra won't clock as well as the 32nm Exynos; a smaller die means less heat, which results in more stable clock speeds. But I guess we will have to wait and see. From my point of view the Exynos wins this.
Code:
Galaxy SIII - AnTuTu Benchmark Score
Score : 11901
RAM : 3431
CPU Integer : 3999
CPU Float : 3053
2D Graphics : 297
3D Graphics : 1245
Database IO : 550
SD Write : 13.0 MB/s
SD Read : 42.8 MB/s
CPU Frequency : 1400MHz
HTC One X - AnTuTu Benchmark Score
Score : 10539
RAM : 2169
CPU Integer : 3439
CPU Float : 2543
2D Graphics : 289
3D Graphics : 1209
Database IO : 545
SD Write : 23.8 MB/s
SD Read : 34.3 MB/s
CPU Frequency : 1500MHz
Screen : HD LCD vs HD Super AMOLED
Although they are both the same resolution, the AMOLED screen should be able to outperform the HD LCD screen thanks to its contrast between blacks and whites. The HD LCD screen also needs a back-light to be viewed, whereas the AMOLED screen emits its own light, which gives it a more natural look and better viewing angles compared to the HD LCD.
Code:
HD LCD
Pros
- HD Resolution with the RGB stripe array
- Overall less power consumption
Cons
- Uses a back-light
- Blacks are not true black
HD Super AMOLED
Pros
- HD Resolution
- Vibrant colours
- Blacks are true blacks
- No back-light
Cons
- Pentile array (not as sharp as RGB stripe array)
- Overall more power consumption
Although the PenTile array means images are not as sharp as with an RGB stripe array, at HD resolutions it's seriously not noticeable; and as for power consumption, the HD Super AMOLED screen actually uses dramatically less power when it comes to displaying blacks.
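Here's a toy model of that last point (the numbers are invented purely to show the shape of the trade-off, not measured figures):
Code:
# Toy model: LCD power is a constant backlight, AMOLED power scales with lit pixels
def lcd_power(avg_brightness):
    return 1.0                      # backlight burns the same regardless of content

def amoled_power(avg_brightness):
    return 1.3 * avg_brightness     # emissive: scales with content ("overall more power" on white)

for content, b in [("all black", 0.0), ("dark UI", 0.2), ("all white", 1.0)]:
    print(content, lcd_power(b), round(amoled_power(b), 2))
# all black: AMOLED ~0.0 vs LCD 1.0; all white: AMOLED 1.3 vs LCD 1.0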
That's about it. So what you have there is a durable phone with the best screen, camera and processor currently available.

Very interesting analysis
Especially for all those guys who mock the phone whenever they can.

I kinda disagree with the RAM argument; considering smartphones nowadays can be compared to computers, there's no such thing as too much RAM. But for everything else, I have no complaints.
There are just too many haters: either they secretly want this phone but bought another one and need to reassure themselves, or they're arguing just because they can, or they're plain trolling.

furt890 said:
I kinda disagree with the RAM argument; considering smartphones nowadays can be compared to computers, there's no such thing as too much RAM. But for everything else, I have no complaints.
There are just too many haters: either they secretly want this phone but bought another one and need to reassure themselves, or they're arguing just because they can, or they're plain trolling.
1GB of RAM is more than enough. Watch this:
HTC One X with its 1GB RAM (same as the Samsung Galaxy SIII)
http://www.youtube.com/watch?v=qeMuzSQUdTg
He loads a whole range of memory-hungry games without closing their tasks (meaning they will still be loaded in RAM).

Can the title be changed to SIII, please?

TheUnkn0wn said:
1GB of RAM is more than enough. Watch this:
HTC One X with its 1GB RAM (same as the Samsung Galaxy SIII)
http://www.youtube.com/watch?v=qeMuzSQUdTg
He loads a whole range of memory-hungry games without closing their tasks (meaning they will still be loaded in RAM).
The video shows another aspect Samsung designed well, in my opinion. When you look at the One X's screen you can see those three little dots for a menu key.
It not only shrinks the usable screen slightly; I can also see myself accidentally tapping it while I'm playing.
In short: I love the hardware menu key!

The new fad at the moment is to hate. If the screen size were the same as the Note's, I would pick it up day 1.
Sent from my GT-N7000 using xda premium

TheUnkn0wn said:
OK, let's get a few things into perspective. I've read a lot of negative posts about this phone, so let's go into a more technical analysis of the matter.
CPU : Exynos 4 Quad Core (1.4GHz)
For some reason people seem to think 1.4GHz vs the Tegra 3's 1.5GHz makes the Tegra a faster processor. Not true. Yes, the Tegra has a higher clock speed, but performance-wise the Exynos 4 at 1.4GHz can outperform it (due to its design). Also, people seem to think the A15 architecture would make a major difference. The A15 architecture was designed to increase power efficiency, which is something the Exynos 4 has done by shrinking its die from 45nm to 32nm (from the SII to the SIII), and with its nice 2100mAh battery I'm sure you will have all the battery power you need. (Since it can be removed, I'm sure you will be able to upgrade the battery in the future, something you cannot do with the HTC One X.)
Design
What's wrong with plastic? Although you may think plastic is "cheap", it may very well increase your phone's durability, since plastic is flexible. Drop a plastic phone on the floor and you will be left with minor scratches; drop a metal-and-glass phone on the floor and you will be left with a malformed frame, shattered glass and internal damage from the shock not being absorbed. I would also like to know what's wrong with curves? A curved design will fit snugly in your hand. As for the colour schemes, I guess they can be left for discussion, although the pebble blue looks more black than anything.
I quite like its sleek design.
Screen
HD Super AMOLED is only possible on PenTile displays because of the major jump in resolution. If the SIII were to have an AMOLED+ display, it would not be able to brag about its 720p display, since at HD resolutions a Plus display is not possible. AMOLED+ offers improved detail, but at 720p that detail gain would be minimal, which basically means the HD Super AMOLED displays are far superior to the AMOLED Plus displays.
Camera
Why are people complaining about it being 8MP instead of 12MP? The 8MP pictures it takes can be as good as, if not better than, a 12MP image taken by another device. Why? Megapixels are just the resolution of the image. A crisp 8MP image will look far better than a grainy 12MP image; the actual quality of the image depends on the sensor, and I'm pretty sure that would have been upgraded since the SII.
RAM
Oh please... when will you ever use 1GB of RAM...
So let's compare this to the HTC One X (I honestly have nothing against this phone; I'm just going to be comparing specifications).
CPU : 1.5GHz Tegra 3 vs 1.4GHz Exynos 4
They are both quad-cores, so both should be able to run multi-threaded processes very well compared to dual-cores. The Tegra 3 does have its extra core for idling (when it's not doing anything) to save battery life, although I feel this is a bit pointless if the Exynos 4 can just under-clock itself to save battery when idling; this also gives it the ability to still use all four cores for multi-threading even when idle. As for overclocking headroom, I feel the 40nm Tegra won't clock as well as the 32nm Exynos; a smaller die means less heat, which results in more stable clock speeds. But I guess we will have to wait and see. From my point of view the Exynos wins this.
Screen : HD LCD vs HD Super AMOLED
Although they are both the same resolution, the AMOLED screen should be able to outperform the HD LCD screen thanks to its contrast between blacks and whites. The HD LCD screen also needs a back-light to be viewed, whereas the AMOLED screen emits its own light, which gives it a more natural look and better viewing angles compared to the HD LCD.
That's about it. So what you have there is a durable phone with the best screen, camera and processor currently available.
Yeah, you're right!

Nice analysis!
Can't agree more!
All the trolls => garbage

Haters gonna hate!
touness69 said:
Nice analysis!
Can't agree more!
All the trolls => garbage
True..
I also did a comparison between the HTC One X and the SGS3, similar to what the OP did here.
You can find it here.

funb0b said:
Can the title be changed to SIII, please?
Yes, just a small typo there lol

TheUnkn0wn said:
OK, let's get a few things into perspective. I've read a lot of negative posts about this phone, so let's go into a more technical analysis of the matter.
CPU : Exynos 4 Quad Core (1.4GHz)
For some reason people seem to think 1.4GHz vs the Tegra 3's 1.5GHz makes the Tegra a faster processor. Not true. Yes, the Tegra has a higher clock speed, but performance-wise the Exynos 4 at 1.4GHz can outperform it (due to its design). Also, people seem to think the A15 architecture would make a major difference. The A15 architecture was designed to increase power efficiency, which is something the Exynos 4 has done by shrinking its die from 45nm to 32nm (from the SII to the SIII), and with its nice 2100mAh battery I'm sure you will have all the battery power you need. (Since it can be removed, I'm sure you will be able to upgrade the battery in the future, something you cannot do with the HTC One X.)
First, I should start by saying that I like the SGS3.
In your analysis you're mixing physical design (32nm vs 40/45nm) and logical design (Cortex A9 vs A15). All Cortex A15 SoCs will be made on 32nm or better (e.g. 28nm), so there won't be any advantage for the 4412 in this regard.
The purpose of Cortex A15 is to increase performance (but of course with decent power efficiency). So Cortex A15 is much better than Cortex A9 if you like performance; there is nothing Samsung has done with the Exynos 4412 that brings it closer to A15 performance-wise. But in certain benchmarks a quad-core A9 will beat a dual-core A15, though.
But A15 will not reach the market before Q3, maybe later, so it was never even close to getting into the SGS3 anyway.

Agree.

touness69 said:
Nice analysis!
Can't agree more!
All the trolls => garbage
+1
Agree with everything the OP said.
Trolls and haters, no one is forcing you to stay here; you've had your rant, now go away and leave those of us who are looking forward to this phone to it.

Very nice analysis. People had a knee-jerk reaction because their OWN expectations were not met. Samsung did not really hype this device; they did what they always do at launch.
The first thing people did not like was the "look", which, in my opinion only, is kind of silly for a mobile phone; fretting about it as if it's something one actually "wears" that reflects who one is as a person, as if a self-described "ugly phone" means an ugly person to anyone who happens to see the offending device. But some people are fashion conscious all the way to their cell phones, I suppose.
We then find out the form was purposely intended to avoid Apple lawsuits, which at the end of the day cost Samsung money in time and personnel whose time can be better spent in the R&D labs coming up with non-Apple FUNCTIONS. Like hot-swappable SD cards and batteries. Something Apple (and perhaps now even HTC) will NEVER have.
The phone is better than the SGS II... better than the Note (except for screen size, if that matters). The "form fighters" are the most likely to have purchased and the most disappointed, and they will eventually realize they have to get on board, because many cannot stand not having the latest and greatest.
The HTC One X buyers will realize they jumped too soon and the SGS III is indeed the better phone. "Form" and a PenTile screen are not enough of a rationalization to overcome their disappointment at not waiting just a little longer.
Their only hope for redemption is if, when the masses start the real beta test of the SGS III, glaring issues start popping up à la the Captivate.

TheUnkn0wn said:
CPU : 1.5GHz Tegra 3 vs 1.4GHz Exynos 4
They are both quad-cores, so both should be able to run multi-threaded processes very well compared to dual-cores. The Tegra 3 does have its extra core for idling (when it's not doing anything) to save battery life, although I feel this is a bit pointless if the Exynos 4 can just under-clock itself to save battery when idling; this also gives it the ability to still use all four cores for multi-threading even when idle. As for overclocking headroom, I feel the 40nm Tegra won't clock as well as the 32nm Exynos; a smaller die means less heat, which results in more stable clock speeds. But I guess we will have to wait and see. From my point of view the Exynos wins this.
Thank you for a great constructive post without fanboyism.
I just have a couple of things to say about the Tegra 3 part.
The so-called "companion core" is not just for "idling". It also takes care of the "easy", less demanding tasks, such as when you're reading your mail, viewing your pictures, watching a movie and such.
The great part about that is that the "companion core" is really power efficient and doesn't need to stress any other core to do the easy tasks.
The other cores work individually depending on the workload as well.
And why would a CPU do multithreading when idling? Isn't idling just... idling?
What I want to say is that the Tegra's four cores behave just like the Exynos's four cores (if I've done my homework), with the exception that the Tegra has a helper core on top of this.
That being said, the Tegra 3 also works at 1.4GHz when in quad mode; the 1.5GHz figure is for single-core use, as I understand it.
I think the Exynos will be an awesome processor though. I'm not saying anything else, just updating some facts.

Aja82 said:
And why would a CPU do multithreading when idling? Isn't idling just... idling?
Even when the phone is idle there are still threads running, and since there are still threads running, it can distribute them amongst its four cores.
I do like the idea of the companion core helping reduce battery usage, but I know that Android can lower the clock rate of the processor dynamically with the workload, which would effectively produce the same outcome (reduced battery usage).
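For the curious, you can watch this happen on a rooted device through the kernel's cpufreq interface; a minimal sketch (these sysfs paths are the standard Linux ones, but the governors and frequencies you'll see vary by kernel):
Code:
# Peek at the CPU frequency scaling state over adb shell / a terminal app
base = "/sys/devices/system/cpu/cpu0/cpufreq/"
for name in ("scaling_governor", "scaling_cur_freq",
             "scaling_min_freq", "scaling_max_freq"):
    try:
        with open(base + name) as f:
            print(name, "=", f.read().strip())
    except OSError:
        print(name, "unavailable on this device")
# With the common "ondemand" governor, scaling_cur_freq sits near
# scaling_min_freq at idle and ramps to scaling_max_freq under load.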

I laughed at the screen analysis. LCD2 smokes Super AMOLED HD. Bit of a biased analysis here... also, you fail to recognize that the GSIII's design is horrid. The reason I am so pissed at Samsung is that I've been a GSII user ever since it came out, paid the premium price for it, and it's been the best smartphone I've EVER owned. Samsung hyped up the design for the III so much, and they hyped up the whole event with their "sheep" teaser. I'll be a glad owner of a One X when it comes to AT&T on Sunday.
Sent from my GT-I9100 using xda premium

The S3 is great hardware-wise, but the PenTile display REALLY IS THE ****. And no bias at all; this is coming from someone who previously used a Galaxy Nexus, with the same screen technology and an even slightly higher pixel density, if you want to be precise.
PenTile screens have a distinct grid and fuzziness no matter how pixel-dense they are. And if you think that at the nominal viewing distance (about 30 cm) you can't see the fuzziness, you'd be really disillusioned. And AMOLEDs, PenTile or not, have this problem: at low brightness levels, solid colours, especially grey, show distinct vertical banding and also extreme fuzziness that makes the screen look like sandpaper. This is the exact reason I forced Samsung to give me a full refund so that I could buy my One X.
By the way, a little correction: the One X uses IPS LCD tech, one of the best panel types in the industry. They call it Super LCD 2 because IPS is a patented technology. IPS loses out in contrast, and colours are less saturated compared to AMOLEDs, for sure. But it makes up for it with truly high pixel density and the RGB matrix. And it's impeccable; as Steve Jobs said, 300+ dpi with an RGB matrix means pixels are impossible to see with the naked eye, even at 10 cm. Also worth mentioning are the excellent viewing angles, just like AMOLEDs, and the more natural colour reproduction.
Not that I'm trolling, nitpicking or anything; the S3 is an awesome phone, no doubt. Processor, RAM and camera are very legit arguments. This is just a bit of real experience for those who have never actually seen a high pixel density PenTile AMOLED display in real life and are about to buy this based on what spec-sheet comparisons say about it. To be fair, I would say it's a slight imperfection in an almost flawless phone.
Sent from my HTC One X using xda premium

King Shady said:
I laughed at the screen analysis. LCD2 smokes Super AMOLED HD. Bit of a biased analysis here... also, you fail to recognize that the GSIII's design is horrid. The reason I am so pissed at Samsung is that I've been a GSII user ever since it came out, paid the premium price for it, and it's been the best smartphone I've EVER owned. Samsung hyped up the design for the III so much, and they hyped up the whole event with their "sheep" teaser. I'll be a glad owner of a One X when it comes to AT&T on Sunday.
Sent from my GT-I9100 using xda premium


After Careful Analysis of Trends and Rumors, here is what I think of the S3 / Note 2!

Well, after cross-referencing other sources, it is very likely that the specifications below will appear on new models this year. Applicable to the S3 and Note 2. - Cheers!
Next month Samsung will be announcing who knows how many new devices. One of the expected models is the Galaxy S3 and, possibly, the Note 2.
I can say the S3 will have the following, after doing a careful analysis of trends and rumors:
4.5" OR 4.65" SAMOLED HD Pentile or RGB/Plus Display with 3D (also known as Super AMOLED III) with 1280x720 Resolution, Lotus Glass
1.5 GHz Quad Core Processor 32nm (Exynos 4412 with Mali T604)
2GB LPDDR2, DDR2 or DDR3 RAM (Dual Channel)
16 or 12 Megapixel Lens Rear Camera (Samsung Technology)
3.2 Megapixel Front Camera with Face Detection
16GB or 32GB Built-in Memory, Expandable up to 64GB
Bluetooth 4.0
Built In NFC
Ice Cream Sandwich
2000mAh Battery
HzO Technology--water-proof technology
ETA--Late Spring, 2012
Who wants to bet?!
I can say the Note 2 will have the following, after doing a careful analysis of trends and rumors:
5.3" SAMOLED HD Pentile Display with 3D (also known as Super AMOLED III) with 1280x800 Resolution, Lotus Glass
1.8 GHz Quad Core Processor 32nm (Exynos 4412 with Mali T604) or 2.0GHz Dual Core Processor 32nm (Exynos 5250 with Mali T604)
2GB LPDDR2, DDR2 or DDR3 RAM
16 or 12 Megapixel Lens Rear Camera (Samsung Technology)
3.2 Megapixel Front Camera with Face Detection
16GB or 32GB Built-in Memory, Expandable up to 64GB
Bluetooth 4.0 (or possibly 5.0)
USB 3.0
Built In NFC
Ice Cream Sandwich
2750mAh Battery
HzO Technology--water-proof technology
ETA: mid-Summer, 2012
Who wants to bet?!
A 4.5" inch screen is more probable. I agree.
Well, a possible release date of Jelly Bean has yet to be confirmed. And if released in August this year, this certainly will not be in the Note 2. Jelly Bean will first arrive in Nexus 4 and it will take developers a few weeks to overlay their UI's on newer models
Touchwiz 5.0 is obvious for new devices coming out with ICS, so no need to include that in the specifications. Existing S2 and and Note will have ICS soon, but with the existing Touchwiz 4.0 interface.
16MP looks promising since sensors were already introduced late last year, Oct. 2011
Battery seems to be appropriate for the S2, since the battery compartment size will be able accommodate larger batteries due to the .2" inch increase in screen size.
3D may or may not appear. What remains suspicious is how Samsung decided to refer their new AMOLED displays as SAMOLED 3. HEY, YOU NEVER KNOW!
HzO is a possibility, but seeing that Samsung has to shell extra money out of their pockets puts this feature into question.
2GB will be able to accommodate more processes to run in the background, and since the CPU will be bumped, 2GB doesn't look like too much.
The Exynos 5250 (A15) that you are referring to may be featured in Phablets and or Tablets. This we will have to wait and see.
I think a lot of these updates are obvious, so as the clock ticks, we sit down and think.
I would lower the CPU frequency to 1.8GHz, the camera to 12 megapixels and the battery to 2000mAh (you wrote mV, which is not the correct unit for referring to charge).
SAlmighty said:
Who wants to bet?!
Money or peanuts /
jje
I don't see why they would use a weaker CPU in the Galaxy Note 2.
2GHz A15 dual-core >> 1.5GHz A9 quad-core
The Mali T604 still won't beat the SGX543/4MP2, so all you fanboys who obsess over benchmarks, don't get too excited. Unless it's a Mali T604MP2.
It's too early for the Galaxy Note 2
I'll take that bet, because this is totally wrong.
The SIII will not ship with the 5250 because it won't be ready in time. It will use the quad-core 4412, since that's the most powerful SoC that will be available for mass production on the SIII's timetable. Heck, there are no devices even running the 4412 yet, it's that new; the 5250 is even newer and won't be used in devices until later in the year, most likely in the next top-end tablet.
It will not have 2GB of RAM, too big a jump; 1.5GB max instead.
Camera: 12MP at most.
Note II... come on, that's a year away.
Note 2 won't be announced at MWC if we go by the timing of last year's announcements.
Definitely the Note 2 won't be using a 1280x800 Super AMOLED. It would either be RGB (the + moniker) at this resolution or something like PenTile at 1600x1000.
Don't think 2GHz for the GS3. Samsung will "surely" reduce the clock speed to save battery. This was the same strategy they used for the GN, where they under-clocked the OMAP 4460 to 1.2GHz instead of the stock 1.5GHz. This allows a lower operating voltage, and dropping core voltage yields a more-than-linear decrease in power consumption. So, if they use the 5250, my guess is something like 1.5GHz. There has been a lot of discussion and recent press from Samsung about battery life, but I doubt they can do something extraordinary because of the large AMOLED screen. Unfortunately all websites are smitten with white; they had to reduce the brightness drastically in AMOLEDs to save battery. In the GN, it barely crosses 200 nits.
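That "more than linear" point follows from the standard dynamic-power approximation P ≈ C·V²·f: lowering the clock also allows a lower voltage, and voltage enters squared. A quick sketch with made-up numbers (the voltages here are illustrative, not Samsung's actual operating points):
Code:
# Dynamic power approximation: P ~ C * V^2 * f
C = 1.0                                # effective switched capacitance (arbitrary units)

def power(volts, freq_ghz):
    return C * volts**2 * freq_ghz

stock = power(1.20, 1.5)               # say 1.5GHz at 1.20V
underclocked = power(1.05, 1.2)        # 1.2GHz lets the voltage drop too
print(round(underclocked / stock, 2))  # 0.61 -> a 20% clock cut gives ~39% less power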
I am expecting HD Super AMOLED "Plus" in GS3.
About camera, you may want to check this: http://www.samsung.com/global/business/semiconductor/newsView.do?news_id=1262
Too early to speculate about the Note. But it would definitely be clocked higher than the GS3 and have a bigger battery, like 3000mAh, and a better resolution.
"About camera, you may want to check this: http://www.samsung.com/global/busine...o?news_id=1262 "
pretty certain it get a 16mp camera ,60 fps HD recording ....
I would say that a 720p SAMOLED+ (RGB) screen is likely. Not sure if the A15s are even sampling yet, so I wouldn't expect one; a higher-clocked Exynos, or maybe the quad (but that would be murder on the battery, IMO).
Rumor is rumor, zzz
Sent from my GT-I9100 using Tapatalk
I remember a lot of threads in the HD2 general section, saying "The HD3 will have this" and "The HD3 will have that".
Has the SGS3 actually been officially confirmed?
rd_nest said:
Note 2 won't be announced at MWC if we go by the timing of last year's announcements.
Definitely the Note 2 won't be using a 1280x800 Super AMOLED. It would either be RGB (the + moniker) at this resolution or something like PenTile at 1600x1000.
Don't think 2GHz for the GS3. Samsung will "surely" reduce the clock speed to save battery. This was the same strategy they used for the GN, where they under-clocked the OMAP 4460 to 1.2GHz instead of the stock 1.5GHz. This allows a lower operating voltage, and dropping core voltage yields a more-than-linear decrease in power consumption. So, if they use the 5250, my guess is something like 1.5GHz. There has been a lot of discussion and recent press from Samsung about battery life, but I doubt they can do something extraordinary because of the large AMOLED screen. Unfortunately all websites are smitten with white; they had to reduce the brightness drastically in AMOLEDs to save battery. In the GN, it barely crosses 200 nits.
I am expecting HD Super AMOLED "Plus" in GS3.
About camera, you may want to check this: http://www.samsung.com/global/business/semiconductor/newsView.do?news_id=1262
Too early to speculate about the Note. But it would definitely be clocked higher than the GS3 and have a bigger battery, like 3000mAh, and a better resolution.
No offence, but you're wrong about the clock speed anyway. Yes, the reason they downclocked it was to save battery, BUT it was manufactured on 45nm whereas these are 32nm, WHICH IS a huge difference in power consumption and heat. So 1.5GHz seems fine to me, since it'll draw equal or less power. The process node plays quite a huge role in power consumption.
My contract ends tomorrow. I currently have an SGS2; do you think it is worth upgrading now, maybe to a Galaxy Note, or waiting for the new devices?
brodzik said:
My contract ends tomorrow. I currently have an SGS2; do you think it is worth upgrading now, maybe to a Galaxy Note, or waiting for the new devices?
Wait. Your phone is good enough to last you at least another year. Compared to other phones on the market, and to what we know is planned for release, you still have one of the best, if not the best, Android phones on the market. So enjoy your phone, slap ICS on it when that comes out, and wait for the S3 when it gets released.
Archer said:
I remember a lot of threads in the HD2 general section, saying "The HD3 will have this" and "The HD3 will have that".
Has the SGS3 actually been officially confirmed?
Mhm. And the successor was the HD7, in a way. Lawl.
$1 gets you a reply
cokeman2 said:
"About camera, you may want to check this: http://www.samsung.com/global/busine...o?news_id=1262 "
Pretty certain it'll get a 16MP camera, 60fps HD recording...
I hope they won't go any higher with the MP count, as it actually decreases picture quality in terms of noise and low light. It's a phone, and 8MP is more than enough. Hell, my DSLR has 12MP, which is plenty, and that's on a sensor about 20 times bigger than the one in the GS2 and 4S. More is not always better...
Sent from my GT-I9100 using Tapatalk
JJEgan said:
Money or peanuts /
jje
Peanut Brittle..
Sent from TAPAKING using Tapatalk
Lol, 3D? Not a chance in hell... If Sammy does that (which I'm nearly 100% sure they will not) after HTC and LG failed miserably with the Evo 3D and Optimus 3D, respectively, then I will consider them idiots...
@OP: After less than careful analysis of your post I can confidently say that you are wrong.
Whoops, Physics 101, didn't proofread, thanks.
Filiprino said:
I would lower the CPU frequency to 1.8GHz, the camera to 12 megapixels and the battery to 2000mAh (you wrote mV, which is not the correct unit for referring to charge).

[US Variants] Adreno 225 woes

Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by obtaining hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proofing experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain, and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Div033 said:
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed.
I thought the biggest problem with the Nexus One was the limited space for system files. Other Adreno 200 devices, such as the Evo 4G, have Android 4.0 running on them, and I hear it works really well.
I know that the early Exynos found on the Nexus S also works quite well.
I think any modern chipset easily surpasses the performance required for the type of GPU tasks being implemented at the system level. Games are still a concern, but compatibility is more of an issue there than performance, and the Adreno 225 is popular enough that it should be supported.
But there's always next year for really kick-ass GPUs.
Div033 said:
Two years ago I bought an Incredible. I could have waited a month for the Fascinate, and I'm glad I didn't, but something began to bug me: the Adreno 200 was very underpowered. Fast forward to now and I'm forced to upgrade to keep my unlimited data. The obvious choice is the upcoming Galaxy S3, so I pre-ordered one. I can't help but wonder if I'm repeating my last phone purchase by obtaining hardware with a GPU that simply won't hold up in the future.
The biggest blow of having a weak GPU in my Incredible was the incompatibility with ICS. Google would not and could not support the Nexus One due to the meager Adreno 200 GPU it housed. This upset many QSD-based device owners, especially the Nexus One adopters. I know ICS ROMs are continually improving for QSD-based phones, but they'll always lag. Meanwhile the Fascinate has received great ICS support due to having the same GPU as the Galaxy Nexus. One month could have changed my future-proofing experience, but inevitably the Fascinate had a bunch of issues I'm glad I didn't have to deal with.
I know hardware becomes obsolete. It happens, I get it. We all want to try and do the best we can though, especially those of us on Verizon with unlimited data; this is our last subsidized upgrade allowing us to retain unlimited data.
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
Will the US Galaxy S3 withstand the test of time and provide future-proof hardware for a reasonable amount of time? Or will it fall short of the two-year expected lifespan like the QSD Adreno 200 devices did?
I am uncertain, and wish Qualcomm would seriously step up its GPU game. Am I alone in this line of thinking?
[I originally posted this in the international forum, but felt it belonged here.]
Well, you also have to look at the resources available and time constraints. The introduction of LTE in the US probably forced said chip maker to make some concessions. What they lost in state-of-the-art GPU, they gained in the ridiculous profit they made this year, because theirs is the only chip that includes LTE. From their perspective, they've won the war thus far.
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators, as well as high-end games, so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
real world performance > benchmarks
My Sensation has an Adreno 220 and it plays every game and movie just fine. Sure, it doesn't get the best benchmark numbers, but it more than holds its own when playing any game. I'm sure the Adreno 225 will hold up just fine over the next couple of years. In fact, I still love my Sensation. Side by side it's still just as fast as most phones out there. You only see a difference when running benchmarks, which isn't practical. I personally don't care if I'm getting 200fps or 50; it's not like anyone can tell the difference.
I also want to note that the 220 is crazy powerful compared to the 200 and 205. It was the first GPU with which Qualcomm seemed to really take a stab at gaming. I'm fine with the 220 and can't wait to begin using the 225.
bradleyw801 said:
I agree with this line of thinking. As an earlier poster said, the Evo 4G had the Adreno 200. I use N64oid all the time. The Evo would struggle with games that the first Galaxy S family had no problem at all with. I have since switched to the Motorola Photon 4G and the Tegra 2 (Nvidia GeForce GPU). It handles both emulators, as well as high-end games, so much better than my Evo did. I would have already pre-ordered the S3 if it weren't for this.
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be blamed solely on the GPU (Adreno 200). CPU, RAM, resolution, etc. have a ton to do with it as well.
Will the GPU do better in this phone, due to the extra RAM, compared to the One series with the same S4/225 combo?
Div033 said:
Looking at GSM Arena's benchmarks, specifically the Egypt off-screen test, the Adreno 225 lags behind yesteryear's Galaxy S2 with the Mali 400. The international GS3's performance in this test zooms ahead of the Qualcomm variants by double.
[I originally posted this in the international forum, but felt it belonged here.]
You should also note that the GS2 only had a 480x800 resolution (384,000 total pixels), and even at that far lower resolution its score was only slightly higher in that test, whereas the GS3 is pushing 720x1280 (921,600 total pixels). That means the GS3 is working 2.4 times harder, and it delivers almost the same gaming performance at worst and better performance in other tests. That's not bad if you ask me, seeing as how we all thought the GS2 was a powerhouse just 12 months ago.
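The arithmetic backs that up:
Code:
# Pixels pushed per frame: GS2 vs GS3
gs2 = 480 * 800     # 384,000 pixels
gs3 = 720 * 1280    # 921,600 pixels
print(gs3 / gs2)    # 2.4 -> the GS3 renders 2.4x the pixels of the GS2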
incubus26jc said:
As nativestranger said in another thread,
"The 225 despite its deceptive naming is 4-6x the performance of the 205 and roughly 1.6-2x the 220."
Also, performance on the Evo cannot be blamed solely on the GPU (Adreno 200). CPU and RAM have a ton to do with it as well.
Agreed. I haven't seen anyone suffering from GPU woes other than benchmark nuts who obsess over the highest score... everyone actually using it for real things says it works great. And honestly, my take is that if you want gaming performance, don't use a phone; plug your 360 into your big screen and kick ass from the couch in HD and surround sound.
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as the popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip, one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as the popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip, one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
Well, it might interest you to know that the Adreno 225 supports DirectX 9.3 and texture compression, where the Mali 400 does not. That's a requirement for Windows 8. Now, you might say "so what"... but I for one plan on trying to dual boot, or even run a version of Windows RT, perhaps on a virtual machine. That's something else the S4 Krait/Adreno package supports natively, I do believe, that the Exynos/Mali doesn't.
Sent from my DROIDX using xda premium
Div033 said:
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
Voltage Spike said:
This is almost certainly a RAM issue. With the ridiculous 2GB on this phone, I would hope you wouldn't run into issues until Android at least wraps the alphabet.
+1. I do believe more RAM (the largest among this generation of phones) matters for long-term usage.
The GPU is fine.
Guys, this is a Galaxy S phone. The newest one at least.
It is GUARANTEED a Jelly Bean update from Samsung (albeit late). It is also most likely getting at least 1 or 2 more major Android updates because of XDA.
Remember, ALL OF US have the SAME Galaxy S3. That is a LOT of devs who will be working on it.
Don't worry about that. It will come with time.
Div033 said:
I did somehow forget to acknowledge the resolution of the GS3 vs. GS2 because that most certainly makes a difference. It is an unfair comparison.
My primary concern isn't with gaming on the device. I just want the device to be able to run the next two-three versions of android without being hardware bottlenecked.
I've used ICS on my Incredible which is virtually the same as the Evo 4G but performance is still lacking. In some cases it can take a good 4-5 seconds to move from an app to the homescreen when pressing the home button. This may not be entirely the GPU's fault, but regardless homescreen scrolling remains sluggish and somewhat laggy.
As far as the popularity of a chipset goes, it's become evident that this factor does not affect how long manufacturers will support it. The Nexus One had a QSD chip, one of the most popular chipsets around at the time, but it still did not receive ICS. I know they claimed space restrictions were the reason, but I find this highly unlikely considering the other, more limiting factors.
Maybe the 225 will be good enough for future android versions like key lime pie and licorice or whatever they call it.
Sent from my Droid Incredible using the XDA app.
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200 GPU, even for its time, was woefully weak. The 205 that followed was easily 3x or more the performance. Most devices based on that GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225, using the same architecture but built on a smaller process with increased clocks and memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. However, when we look at AMD's desktop 6000 series, the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
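Taking the rough multipliers quoted in this thread at face value (a back-of-the-envelope sketch, not a measured benchmark), the cumulative jump is striking:
Code:
# Compound the per-generation speedups quoted above
adreno_205_over_200 = 3.0          # "easily 3x or more"
adreno_225_over_205 = (4.0, 6.0)   # "4-6x the performance of the 205"

low = adreno_205_over_200 * adreno_225_over_205[0]
high = adreno_205_over_200 * adreno_225_over_205[1]
print(low, high)                   # 12.0 18.0 -> roughly 12-18x an Adreno 200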
nativestranger said:
You simply can't compare the Adreno 200 of first-generation Snapdragon devices with the 225 of current devices. The Adreno 200 GPU, even for its time, was woefully weak. The 205 that followed was easily 3x or more the performance. Most devices based on that GPU, including the Xperia Play, actually had no problem playing the latest games. The 220 saw a similarly huge increase. Qualcomm eased off a little with the 225, using the same architecture but built on a smaller process with increased clocks and memory bandwidth, resulting in a 1.6x-2x performance increase. Hence the comments about it being a lame upgrade. However, when we look at AMD's desktop 6000 series, the upgrade was less than 20% over the previous year, and people universally praised the 6970. This is how gullible fanboyism can result in a strongly skewed perception of actual results.
Fair enough. I suppose you're right, the Adreno 200 was already severely underpowered at launch. The 225 may not be the best, but it's still up among the top-tier GPUs. I guess I have nothing to worry about. The 2GB of RAM is definitely nice too.
Sent from my Droid Incredible using the XDA app.
Just put this here for a reference: [attached benchmark comparison chart]
The Nexus One is running the Adreno 200. The HTC One V with the Adreno 205 is over 5X faster. The Rezound has an Adreno 220: over 3X faster than the One V while also running more than 2X the resolution. The GS3 with the Adreno 225 is hard up against the vsync wall in the hoverjet test and about 3X faster than the Rezound in the Egypt test. It's amazing how much Adreno has improved in just two years: from 2fps on the 200 to never dropping below 60fps on the 225.
Thank you, this helps me make my decision too ^. Also, does having a high-resolution screen make graphics look better? NOVA 3 on my SGS2 looks awesome, for example; all the effects are there, bullet smoke and whatnot. So will these effects, or the graphics in general, look better on the SGS3 screen?
Thanks!
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge, and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and perceived lower performance than iOS due to the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. Looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there, and it will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
Cruiserdude said:
Your GPU will be just fine. Other posters have already shown that it is perfectly competitive with the GPUs in other top-tier phones of this generation, and most top-tier phones sold in the US since the beginning of the year have run S4 SoCs. It comes down to LTE, something the rest of the world doesn't have (for the most part, and nowhere near our level). I for one would much rather give up an absolutely world-crushing GPU than be stuck on 3G forever.
Also, keep in mind that the Galaxy S3 is on track to be the fastest-selling device ever. The American market is huge, and is home to many of the major players in this industry (including, of course, Google themselves), not to mention that Samsung seems to want to treat the two devices (US and international versions) as one and the same. It's not like they'll want all those customers to have a GPU that'll make the phone feel old in 3 months, so I wouldn't worry.
And honestly, I don't really see Android becoming significantly more hardware-intensive anytime soon. The current/upcoming generation of hardware can beat many people's PCs in terms of UI navigation, launching apps, and even running games. Two HUGE things Google talked about with Jelly Bean were introducing the PDK and Project Butter. This shows that they recognize that some of the platform's biggest weaknesses were its slow, inconsistent updates and perceived lower performance than iOS due to the UI. From what I have seen of Jelly Bean in videos, I don't see much further room to speed up the UI; it's already about as fast and smooth as it can get, it seems. I would imagine screen resolution won't be making huge jumps in the immediate future; there'll be some 1080p screens, but I doubt people will demand that in a sub-5" device they hold a foot in front of their face, considering the negative impact on battery life.
What I'm trying to say is I don't see Android really demanding significantly more powerful hardware to keep growing, as it already runs as smooth and fast as possible on hardware we're already calling outdated. Sure, more RAM and more powerful CPUs may let you multitask with more apps without slowdowns, and better GPUs can run newer games even better, but I just don't see the system itself requiring significantly better hardware than we have now for a few generations. Looks like we'll be more focused on multitasking performance, efficiency/battery life, and manufacturing costs, so everyone can enjoy the kind of performance that's currently reserved for us nuts buying new top-tier phones every 8 months.
So again, no, I wouldn't worry. Sure, it's not the best thing out there, and it will soon be outclassed, but I don't see it handicapping Android anytime soon, especially with 2GB of RAM.
I see, lol. Well, that's good; I don't wanna have to buy a new phone every half a year! But will the HD resolution make any of the Gameloft games look better than they do on my Galaxy S2 with the Mali 400 GPU? Thanks!
Sent from my SPH-D710 using XDA Premium App

NVIDIA Tegra 4 vs Nexus 10 processor

They've unveiled it today
http://www.engadget.com/2013/01/06/nvidia-tegra-4-official/
and apparently it's much more powerful and faster than the Exynos in the Nexus 10, but I don't know that much about this kind of tech. I'm probably finally going to buy the Nexus 10 this week if Samsung doesn't unveil a more powerful tablet, so I was wondering if this Tegra 4 processor is worth waiting for until it's implemented in a tablet.
May TEGRA 3 Rest in Peace ...
Sent from my GT-I9100 using Tapatalk 2
Yes, that thing is packing heat. Best case, the first device with a Tegra 4 will come out next Christmas. Unless they've been hiding something.
cuguy said:
Yes, that thing is packing heat. Best case, the first device with a Tegra 4 will come out next Christmas. Unless they've been hiding something.
It will be out somewhere between June and August, maybe.
It will not take that long ...
Sent from my GT-I9100 using Tapatalk 2
I think March... mark my words.
Their browser test has the Nexus 10 run Chrome while the Tegra runs the AOSP browser. In my eyes that makes it a 100% unfair comparison.
Between bad experiences with Tegra 2 and 3 (Atrix/TF700) and their requirement that Tegra-optimized games not run on other vendors' SoCs, for no real reason other than because they can, I can't even consider a mobile Nvidia device. All they're good for is keeping the more reputable chip makers on their toes.
yes it's nice
Would be interesting to see this with both devices running the AOSP browser! From my experience it is much faster than the current Chrome version (which is still version 18 on Android, compared to 23 on desktop). Maybe the Tegra 4 would be faster as well, but not by that much.
Everything on my N10 is extremely fast and fluid, so I wouldn't wait for whenever the first Tegra 4 devices become available. Plus it's a Nexus, so you know what you are buying!
Jotokun said:
Their browser test has the Nexus 10 running Chrome while the Tegra runs the AOSP browser. In my eyes that makes it a 100% unfair comparison.
Between bad experiences with Tegra 2 and 3 (Atrix/TF700) and their requirement that Tegra-optimized games not run on other vendors' SoCs, with no real reason other than because they can, I can't even consider a mobile Nvidia device. All they're good for is keeping the more reputable chip makers on their toes.
Agreed, they're making an apples-and-pears comparison that was undoubtedly set up to show the new processor in a good light. It's only to be expected; it is a sales pitch after all. It will no doubt be a faster chip, though.
Sent from my Nexus 10 using XDA Premium HD app
I would much rather see a couple of benchmark runs myself. A time comparison of a web browser is no way to test the power of a new chipset.
Still, I would expect Tegra 4 to be WAY WAY WAY more powerful than the Exynos 5250. Both devices use the A15 architecture, and Tegra 4 has twice as many CPU cores as we have. This alone is already a big boost in multithreaded apps. Then look at the GPU, where you can't compare the two at all except by end-result numbers. They are just far too different. We have 4 GPU cores; Tegra 4 has 72 GPU cores. But those cores are designed far differently and are not nearly as powerful per core. It is all about each company's definition of what a GPU "core" is. And then you have a smaller process node as well, which by itself already promises to use less power than the larger process node the Exynos 5250 uses.
I would honestly expect the Tegra 4 chipset to completely destroy our tablet in terms of performance, but a much fairer comparison would be Tegra 4 against the Exynos 5 Quad. Those two are actually designed to compete with each other.
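Since each vendor defines a GPU "core" differently, raw core counts can't be compared directly; one rough way to put them on a common axis is peak arithmetic throughput. Here is a minimal sketch of that idea; the one-FMA-per-"core" assumption and the clock figures are illustrative guesses, not published specs.
Code:
# Rough peak-GFLOPS comparison for GPUs whose "core" counts mean
# different things. Assumes each marketing "core" is one scalar ALU
# retiring 1 FMA (= 2 FLOPs) per cycle -- an illustrative
# simplification -- and the clocks below are guesses, not specs.

def peak_gflops(alus, clock_ghz, flops_per_alu_cycle=2):
    return alus * clock_ghz * flops_per_alu_cycle

# Hypothetical numbers, just to show the shape of the argument:
print(peak_gflops(72, 0.6))      # "72-core" GPU @ 600 MHz -> 86.4 GFLOPS
print(peak_gflops(4 * 16, 0.5))  # 4 cores x 16 ALUs each @ 500 MHz -> 64.0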
If you want to compare Exynos and Tegra 4, then wait for the Exynos 5450 (quad A15), which should come with the Galaxy S4. The number of cores makes a difference here (the T4 is quad-core), but early GL benchmarks show that the A6X and Exynos 5250 have a better GPU.
First Tegra 4 tablet running stock Android 4.2:
http://www.theverge.com/2013/1/7/3845608/vizio-10-inch-tablet-combines-tegra-4-android-thin-body
Let's hope and pray Vizio managed to add microSD capability to it. Free of Nexus restraints, it should be doable, but since it is running stock JB, the storage would have to be mounted as USB (?). So far this Vizio 10" was the most exciting Android development out of CES. We have a few more hours of press events scheduled in Vegas and then it will be all over.
rashid11 said:
Let's hope and pray Vizio managed to add microSD capability to it. Free of Nexus restraints, it should be doable, but since it is running stock JB, the storage would have to be mounted as USB (?). So far this Vizio 10" was the most exciting Android development out of CES. We have a few more hours of press events scheduled in Vegas and then it will be all over.
Don't expect the Nexus advantage of up-to-date software or timely updates.
EniGmA1987 said:
I would much rather see a couple of benchmark runs myself. A time comparison of a web browser is no way to test the power of a new chipset.
Still, I would expect Tegra 4 to be WAY WAY WAY more powerful than the Exynos 5250. Both devices use the A15 architecture, and Tegra 4 has twice as many CPU cores as we have. This alone is already a big boost in multithreaded apps. Then look at the GPU, where you can't compare the two at all except by end-result numbers. They are just far too different. We have 4 GPU cores; Tegra 4 has 72 GPU cores. But those cores are designed far differently and are not nearly as powerful per core. It is all about each company's definition of what a GPU "core" is. And then you have a smaller process node as well, which by itself already promises to use less power than the larger process node the Exynos 5250 uses.
I would honestly expect the Tegra 4 chipset to completely destroy our tablet in terms of performance, but a much fairer comparison would be Tegra 4 against the Exynos 5 Quad. Those two are actually designed to compete with each other.
Look at the new iPhone: only two cores (a different architecture) beating the higher-clocked, dual-core Galaxy S3 in some disciplines...
This presentation is scientifically SO irrelevant, especially because they use different software. I laugh so hard at people thinking this is anywhere near comparable.
schnip said:
Look at the new iPhone: only two cores (a different architecture) beating the higher-clocked, dual-core Galaxy S3 in some disciplines...
This presentation is scientifically SO irrelevant, especially because they use different software. I laugh so hard at people thinking this is anywhere near comparable.
The new iPhone 5 doesn't use the same ARM architecture as the S3, though; it is a custom design. So those can be compared against each other just fine to see which architecture is better. And if a lower-clocked CPU gets better scores, then we know it is a superior design. This is the basis of all benchmarking. If we were only allowed to compare the exact same architectures together, then we wouldn't learn anything.
Tegra 4 uses a (probably slightly modified) A15 core, and the Exynos 5xxx uses a fairly stock A15 core. So a higher-clocked A15 should beat out a lower-clocked A15 in a direct comparison no matter what. Then, when you throw two additional cores on top, it should always win in multithreaded benchmarks too. Seems pretty common sense to me.
The main difference will be on the graphics side of things, where Nvidia has its own GPU design compared to Samsung's use of Mali GPUs.
You can still compare them together just fine; it just needs to be both of them on the same browser if a browser comparison is being done. In this PR release, Nvidia skewed the results like all companies do. So we can't really see the difference between the two from those pictures, and we need to wait for third-party review sites to do proper testing to see actual results. Yet we can still estimate performance plenty fine, since we already have a baseline of the architecture with this tablet.
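As a naive upper bound on that reasoning: if both chips really are plain A15s with the same per-clock performance, ideal multithreaded throughput scales with cores times clock. A quick sketch, treating the commonly quoted clocks (1.9 GHz quad-core Tegra 4, 1.7 GHz dual-core Exynos 5250) as assumptions:
Code:
# Naive same-architecture model: identical per-core IPC and perfect
# scaling, so multithreaded throughput ~ cores x clock. This is an
# upper bound only; real workloads scale sublinearly.

def naive_throughput_ratio(cores_a, ghz_a, cores_b, ghz_b):
    return (cores_a * ghz_a) / (cores_b * ghz_b)

# Commonly quoted clocks, treated here as assumptions:
print(round(naive_throughput_ratio(4, 1.9, 2, 1.7), 2))  # ~2.24x ideal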
"Tegra 4 more powerful than Nexus 10"... well duh! It's a new chip just unveiled by nvidia that won't show up in any on sale devices for at least a couple of months. Tablet and smartphone tech is moving very quickly at the moment, nvidia will hold the android performance crown for a couple of months and then someone (probably samsung or qualcomm) will come along with something even more powerful. Such is the nature of the tablet/smartphone market. People that hold off on buying because there is something better on the horizon will be waiting forever because there will always be a better device just a few months down the line!
EniGmA1987 said:
The new iPhone 5 doesn't use the same ARM architecture as the S3, though; it is a custom design. So those can be compared against each other just fine to see which architecture is better. And if a lower-clocked CPU gets better scores, then we know it is a superior design. This is the basis of all benchmarking. If we were only allowed to compare the exact same architectures together, then we wouldn't learn anything.
Tegra 4 uses a (probably slightly modified) A15 core, and the Exynos 5xxx uses a fairly stock A15 core. So a higher-clocked A15 should beat out a lower-clocked A15 in a direct comparison no matter what. Then, when you throw two additional cores on top, it should always win in multithreaded benchmarks too. Seems pretty common sense to me.
The main difference will be on the graphics side of things, where Nvidia has its own GPU design compared to Samsung's use of Mali GPUs.
You can still compare them together just fine; it just needs to be both of them on the same browser if a browser comparison is being done. In this PR release, Nvidia skewed the results like all companies do. So we can't really see the difference between the two from those pictures, and we need to wait for third-party review sites to do proper testing to see actual results. Yet we can still estimate performance plenty fine, since we already have a baseline of the architecture with this tablet.
That was kind of his point.
I don't think anyone is denying that the Tegra will be faster. What's being disputed here is just how much faster it is. Personally, I don't think it'll be enough to notice in everyday use. Twice the cores does not automatically a faster CPU make: you need software that can properly take advantage of them, and even then it's not a huge plus in everyday tasks. Also, in the past Nvidia has made pretty crappy chips due to compromises; a good example is how the Tegra 2 lacked NEON support. The only concrete advantages I see are more cores and a higher clock rate.
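The "twice the cores does not automatically a faster CPU make" point is essentially Amdahl's law: if only a fraction p of a task runs in parallel, n cores give a speedup of 1/((1-p) + p/n). A small illustration:
Code:
# Amdahl's law: the speedup from n cores when only a fraction p of
# the work is parallelizable -- why doubling cores rarely doubles
# everyday performance.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A workload that is only 50% parallel barely benefits past 2 cores:
for n in (1, 2, 4, 8):
    print(n, "cores ->", round(amdahl_speedup(0.5, n), 2), "x")
# 1 -> 1.0x, 2 -> 1.33x, 4 -> 1.6x, 8 -> 1.78x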
Based on the hype-to-performance ratio of both Tegra 2 and 3, I wouldn't have high hopes until I see legit benchmark results.
What does seem promising, though, is the fact that they are making more significant changes than from T2 to T3, such as dual-channel memory (finally, after 1-2 years of all other SoCs having it -.-), and the GPU cores are different too.
Still, the GPU has always been the weakest point of Tegra, so I still don't think it can beat an overclocked T-604 by much, even though this time around they will not be the first ones to debut a next-gen SoC. Given the A15 architecture, they can't really screw up the CPU even if they wanted to, so that should be significantly faster than the Exynos 5 Dual.
I've also just read an article on AnandTech about power consumption, and the SoC in the Nexus 10 consumes multiple times as much power as other tablet chipsets, making me wonder how Nvidia plans to solve the battery life issue with twice as many cores and a (seemingly) beefier GPU, not even mentioning implementation in phones...
freshlysqueezed said:
First Tegra 4 tablet running stock Android 4.2:
http://www.theverge.com/2013/1/7/3845608/vizio-10-inch-tablet-combines-tegra-4-android-thin-body
Apparently this is the tablet that comes to take the Nexus 10's spot: a Vizio 10-inch tablet with Tegra 4, 2560 x 1600 resolution, 32GB storage and Android 4.2. And it should be coming out in Q1 2013. I think this one makes me want to wait and hear some more about it before I buy the Nexus 10, although to be honest the brand is a bit of a letdown for me.
Edit: the 10-inch model, key specs (aside from Tegra 4) include a 2,560 x 1,600 display, 32GB of on-board memory, NFC and dual 5MP / 1.3MP cameras.
http://www.engadget.com/2013/01/07/vizio-10-inch-tegra-4-tablet-hands-on/

[SPECS] Nexus 9 Specifications - Members' Thoughts

Hey all,
It's good to see HTC back in the tablet game! This device features Google's latest version of Android (Lollipop), nicknamed Android "L". Feel free to discuss the hardware or the overall design of the Nexus 9 :good:
Specifications:
Display - IPS LCD, 1536 x 2048 pixels, 8.9 inches (~281 ppi pixel density)
Chipset - Nvidia Tegra K1
CPU - Dual-core 2.3 GHz Denver
GPU - Kepler DX1
RAM - 2GB
Memory - 16/32GB
Camera - 8MP, 3264 x 2448 pixels, autofocus, LED flash
Secondary Camera - 1.6MP
Sensors - Accelerometer, gyro, proximity, compass
Connectivity - Wi-Fi 802.11 a/b/g/n/ac, dual-band, Wi-Fi Direct, DLNA, Wi-Fi hotspot, Bluetooth v4.0
Battery - Non-removable 6700 mAh
Dimensions - 228.2 x 153.7 x 7.9 mm
Weight - 425g
Source: gsmarena.
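For what it's worth, the quoted pixel density follows directly from the resolution and diagonal (ppi = sqrt(w^2 + h^2) / diagonal). A quick check; the small gap to the listed ~281 ppi presumably comes from the source using a slightly different effective diagonal:
Code:
import math

# Pixel density from resolution and screen diagonal.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1536, 2048, 8.9), 1))  # ~287.6 ppi for the Nexus 9 panel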
ForumNinja said:
Hey all,
It's good to see HTC back in the tablet game! This device features Google's latest version of Android (Lollipop), nicknamed Android "L". Feel free to discuss the hardware or the overall design of the Nexus 9 :good:
Image coming soon!
Specifications:
Display - IPS LCD, 1536 x 2048 pixels, 8.9 inches (~281 ppi pixel density)
Chipset - Nvidia Tegra K1
CPU - Quad-core 2.3 GHz Denver
GPU - Kepler DX1
RAM - 2GB
Memory - 16/32GB
Camera - 8MP, 3264 x 2448 pixels, autofocus, LED flash
Secondary Camera - 1.6MP
Sensors - Accelerometer, gyro, proximity, compass
Connectivity - Wi-Fi 802.11 a/b/g/n/ac, dual-band, Wi-Fi Direct, DLNA, Wi-Fi hotspot, Bluetooth v4.0
Battery - Non-removable 6700 mAh
Dimensions - 228.2 x 153.7 x 7.9 mm
Weight - 425g
Source: gsmarena.
It's big!! And a giant upgrade coming from the Nexus 7 (2012).
Does the Nexus 9 also have turbo charging like the Nexus 6?
jjardinero.01 said:
Does the Nexus 9 also have turbo charging like the Nexus 6?
Doesn't matter, we will make it happen.
Kernel hacking
I don't think so; I think fast charging is a feature built into the Qualcomm chips.
Sent from my GT-I9505G using XDA Free mobile app
For me there are 3 things causing noise:
1) The pixel density is what causes the first impact (not in a good way) for me. I'm no expert when it comes to judging screens, but when other devices with far smaller screens, as well as devices that have been on the market for a long time, come with much higher densities, it does make you wonder about this brand-new flagship device that only offers 281 ppi. (And please, spare me the explanation about how irrelevant pixel density is in relation to smaller screen sizes; I understand that.)
2) Also, the 2GB of RAM (as opposed to 3).
3) The portability aspect of such a big device: the physical size and weight seem to be a concern, especially when you're upgrading from the fantastically compact Nexus 7 (2013).
edited: I forgot about a 4th thing that has me deeply disappointed:
4) Lack of wireless Qi-charging capability.
But who are we kidding: being the nerds we are, we know most of us here will be getting one regardless.
It's not a quad-core Denver chip but a dual-core Denver chip.
I agree on #1 and #3. This is a step backwards from the DPI on the N7, and at 425g it's pretty much the same as the 9.7-inch iPad Air (437g).
rdelfin said:
For me there are 3 things causing noise:
1) The pixel density is what causes the first impact (not in a good way) for me. I'm no expert when it comes to judging screens, but when other devices with far smaller screens, as well as devices that have been on the market for a long time, come with much higher densities, it does make you wonder about this brand-new flagship device that only offers 281 ppi. (And please, spare me the explanation about how irrelevant pixel density is in relation to smaller screen sizes; I understand that.)
2) Also, the 2GB of RAM (as opposed to 3).
3) The portability aspect of such a big device: the physical size and weight seem to be a concern, especially when you're upgrading from the fantastically compact Nexus 7 (2013).
edited: I forgot about a 4th thing that has me deeply disappointed:
4) Lack of wireless Qi-charging capability.
But who are we kidding: being the nerds we are, we know most of us here will be getting one regardless.
I don't see any mention of a GPS in the Nexus 9... Can anyone confirm GPS (or maybe GLONASS?) capability?
Snowflake6 said:
I don't see any mention of a GPS in the Nexus 9... Can anyone confirm GPS (or maybe GLONASS?) capability?
It's there:
http://en.wikipedia.org/wiki/Nexus_9
I like the 4:3 form factor. The only time I have used this form factor on a tablet was on an iPad. I personally don't like the iPad, but one thing I loved was that form factor. Easier to hold and work with.
Android L looks good. I can't wait to see how fast the K1 will be using ART.
Funny to see that the N9 camera will protrude a little from the tablet, like on the new iPhone 6 (iPhone users complained about this). I personally don't care, but I'm sure some future N9 users will complain about it.
The camera is not the best, but hey, who brings a tablet to take pictures? Grab your DSLR or your phone; people look so silly taking pictures with a tablet, lol. This also makes me think of something else: why not put a higher-resolution camera on the front? Personally, when I'm using my tablet camera, it's for Skype. I have never taken a picture using the main camera. Imo, the 8 MP camera should be front-facing so we can have decent video resolution when Skyping.
2 GB of RAM is enough for me, especially when using "pure" Android (no Sense or TouchWiz UI). 3 GB of RAM would have raised the price too much, and most users don't need 3 GB.
Polycarbonate cover:
HTC were so excited when they announced that the One X was getting a unibody polycarbonate shell back in 2012. That polycarbonate design caused a lot of trouble with that phone: the body was somewhat flexible, which caused bad electrical contact between the main board and the antennas glued inside the cover (i.e. the infamous Wi-Fi issue of the One X). I personally don't trust polycarbonate anymore; I prefer an aluminium body like the M7/M8. Hopefully they have designed the N9 so that problem will not occur again...
Tegra K1 CPU:
I also had a bad experience with a Tegra CPU in my One X. They announced the Tegra 3 (in 2012) as a powerful quad-core CPU with very good power consumption, using a battery-saver "companion" core to handle low-power tasks when 4 cores are not required (standby mode, playing music, etc.). The performance wasn't there (compared to the Qualcomm equivalent of that year) and power consumption was really bad. I was very disappointed by Tegra CPUs. I also used the Tegra 2 in my gf's Asus TF101. I prefer Qualcomm CPUs: much faster and more power-friendly. It also looks like there is more development for Qualcomm CPUs than for Tegra CPUs. But the K1 is ready and Qualcomm's 810 isn't, so yes, the K1 will be the most powerful chip on the mobile market, but I'm not optimistic about power consumption, where the 810 will probably be much better when released.
Overall I think it will be a very good tablet, keeping in mind that it is designed for the average consumer and not for enthusiasts.
edit:
Front-facing speakers (BoomSound) will be awesome!
A bit bummed out by only 2 GB of RAM, no Qi charging and the lowish DPI, but am ordering it anyway...
Do really like the turbo charging, AC Wi-Fi, BT aptX, HTC design and big battery...
I hate that they went big with both the 6&9. The 7" fit perfectly for my needs. Oh well...
4:3, front-facing speakers, and stock Android? WHERE HAVE YOU BEEN ALL MY LIFE!?!?
This tablet will be a huge hit, despite the less-than-record-setting specs. I do wish it were cheaper, but I think people who do buy it will love it.
And it's time for 16:9 tablets to die. I hope this is the start!
jjardinero.01 said:
Does the Nexus 9 also have turbo charging like the Nexus 6?
Yeah, it comes with a turbo charger, which is weird considering the turbo charger is made by Motorola. (I forget where I read this, so it may not be reliable, but I know I read it, lol.)
Sent from my LG-VS980
4:3 is what I've been looking for in a tablet, as one of my main uses for it as a medical professional is reading PDFs on the go. 16:9 or 16:10 is not the aspect ratio for reading letter-size PDFs, though it can be managed on the larger >10" tablets. Up to this point, the iPad was the best device for consuming PDFs, but now I have another choice on the market to take a look at.
roninmedia said:
4:3 is what I've been looking for in a tablet, as one of my main uses for it as a medical professional is reading PDFs on the go. 16:9 or 16:10 is not the aspect ratio for reading letter-size PDFs, though it can be managed on the larger >10" tablets. Up to this point, the iPad was the best device for consuming PDFs, but now I have another choice on the market to take a look at.
I too have been waiting for a 4:3 Android option without resorting to cheapo Chinese tablets with no after-sales service. And to those who will explain that 16:9 is better for movies: that is a very limited point of view in my opinion. How do iPad users manage? I mostly watch movies on my tablet in bed, with the other lights off or very dimmed, so I just can't see the bezel anyway, and I will gladly take a much better format for books and Internet consumption!
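The letter-size PDF argument is easy to quantify: scale a US-letter page (8.5 x 11) to fit the screen and see how much of the panel it actually uses. A minimal sketch, using aspect ratios only (the absolute sizes are arbitrary):
Code:
# Fraction of the screen area a US-letter page fills when scaled to
# fit entirely on screen in portrait. Only aspect ratios matter here.

def page_fill(screen_w, screen_h, page_w=8.5, page_h=11.0):
    scale = min(screen_w / page_w, screen_h / page_h)
    return (page_w * scale) * (page_h * scale) / (screen_w * screen_h)

print(round(page_fill(3, 4), 2))    # 4:3 tablet   -> ~0.97 of the screen
print(round(page_fill(10, 16), 2))  # 16:10 tablet -> ~0.81
print(round(page_fill(9, 16), 2))   # 16:9 tablet  -> ~0.73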
It's not THAT different from most other phones, but I like the size.
I hope it's dual-core and not quad-core, so it's finally worth investing in for real.
What I like:
- Tegra K1 - 'nuff said
- Android apps will automatically compile to 64-bit when installed on it. The beauty of Dalvik/ART :')
- 4:3 screen - FINALLY an Android tablet for working people. Working with PDFs and documents will be an absolute joy with this.
- Android updates
Cons:
- Limited storage -- what's the point of having all that power if you don't have enough storage for content to make use of it? HD games and videos use a LOT of space.
Even Apple realised that and dropped $100 off the 64 and 128GB models.
- eMMC 4.51 -- which peaks at 200 MB/s -- is significantly slower than other flagships. My iPhone 5S gets over 360 MB/s read speeds in benchmarks... SSD speeds, really. (A crude way to sanity-check numbers like these is sketched after this list.)
- 2GB RAM, unlikely to be a problem in the near future.
- Too expensive for what it has to offer.
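On the read-speed point above: sequential throughput is easy to approximate yourself by timing a large streamed read. A rough sketch; the path is a placeholder, and you need a file larger than RAM or the page cache will inflate the number:
Code:
import os, time

# Crude sequential-read benchmark: stream a file, report MB/s.
# The path below is hypothetical; results are only meaningful for
# files larger than RAM (otherwise the OS page cache inflates them).
def seq_read_mb_per_s(path, chunk_bytes=4 * 1024 * 1024):
    size = os.path.getsize(path)
    start = time.time()
    with open(path, "rb") as f:
        while f.read(chunk_bytes):
            pass
    return size / (time.time() - start) / (1024 * 1024)

# print(seq_read_mb_per_s("/sdcard/bigfile.bin"))  # hypothetical file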
All in all, I think it's the best Android tablet on the market now. Sad to see it outpowered by the iPad Air 2 only a day after release, though the difference is only minimal.
Can't wait to get mine

[Discussion] M9+ benchmarks, real-life performance experiences

All HTC M9+ owners!
We're only a handful on XDA yet, but getting more and more as the device rolls out to more locations, and people who look for an elegant, high-end device with premium build quality and extra features like the fingerprint scanner, 2K display and duo camera settle on this fantastic device. It's unfortunately not perfect for everything; it's not the best gaming phone out there, but in my experience it's a very well performing device in terms of call quality, reception, Wi-Fi strength, multimedia, battery, display panel and overall UX feel and smoothness. High-end games from 2015 might stutter a bit here and there or drop some important shaders to compensate, but it's good for earlier (2014 and before) 3D gaming and all kinds of 2D games. So let's gather the experiences and benchmarks of this unique device, which was maybe the first MTK device in an alu body.
Let's discuss performance-related experiences, real user feel and benchmarks, free of whining, facing the truth that in some respects it's not a top-notch device, and let the curious ones considering this device know what to expect from this elegant business-class phone.
I'll start with some of my benchmark result screenshots in separate posts.
UPDATE:
Here's my short game testing video on M9+
https://www.youtube.com/watch?v=QmLGCoI4NLw
Antutu on stock Europe base 1.61
Vellamo tests, base 1.61
Futuremark test, base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real-life experience is that in everyday chores the phone is very snappy, with some small lags when loading new apps into memory. Task switching is quite good.
Unfortunately there's a memory-leak issue with Android 5.0.x which kicks in after a while (as on most 5.0.x devices), but overall the UX smoothness is quite good. Occasionally some apps bring up popup windows with a bit of stutter; the Facebook comments animation, for example, tends to stutter a bit.
The Sense Home/BlinkFeed experience is just perfect. In normal operation, when no big application updates are happening in the background, I never faced any lag on the Sense Home UI.
As for games, titles from 2014 and before run perfectly. New ones might have some shaders removed or polygon counts reduced, so don't expect a jaw-dropping 3D experience. If you are OK with last year's (2014 and before) 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to suit the PowerVR GPU in the M9+'s MTK chipset. The phone mostly keeps a good temperature, except when charging the battery while playing 3D-heavy games. (But that's expected of most devices, especially with an alu body.)
The screen quality is quite good; I got a perfect panel with the first unit, and the refresh rate is 60 Hz, smooth, with the lovely DPI and brightness of 2K.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU. All tests show 3D performance that is below 2014's flagships by far. The PowerVR G6200 GPU is from the same family as the one built into the iPhone 5S, which, let's face it, is mediocre for a 2K display. If you accept that gaming won't be with the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, with a few hits and misses. I'll be back after a few days of using it with more adequate feedback.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light-years ahead on the M9.
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light-years ahead on the M9.
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it tells you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5S era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
tbalden said:
Yeah, though the AnTuTu 3D score is less than half: 3D is 9k on the M9+ and 21k on the M9, so after all it tells you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5S era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
I see, I didn't notice that. I don't understand MediaTek, though: why did they put such an outdated GPU on their 'flagship' processor?
DeadPotato said:
I see, I didn't notice that. I don't understand MediaTek, though: why did they put such an outdated GPU on their 'flagship' processor?
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. That remains to be seen, but I'm not holding my breath; no wonders can be done.
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
DeadPotato said:
Yeah, I agree. The sad thing is that CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to actually be considered a valid option for flagship devices.
Seconded.
The X10 is MTK's first high-end SoC. They should have equipped it with a high-end GPU to make a name for it, not a two-year-old one from the iPhone 5S era. Most casual gaming is OK, but that's not what "flagship" devices are measured by.
It would be excellent if there were some possibility to tweak it at the kernel level. But the MediaTek Helio X20 really should do better, much better, on the GPU part.
I'm not a heavy gamer, so the UX is pretty good for me. (The camera is another topic.)
Even when I measured CPU speed with a chess game, the result reached the top tier.
tbalden said:
Antutu on stock Europe base 1.61
Vellamo tests, base 1.61
Futuremark test, base 1.61
FYI, mine: stock, Taiwanese base 1.08, rooted device.
Sigh... the result was maxed out for my device on Ice Storm Extreme.
Very similar result to mine, except for Ice Storm Unlimited. I think it might be related to thermal throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I've really been looking forward to the HTC M9+; I think it is what the M9 should have been, but I don't understand HTC's sh*tty 2015 choices in everything.
What I would like to know about is battery life; can you guys tell me what it's like?
I bought a Galaxy S6 Edge a few days ago and it's creepy: I have no battery left by the end of the afternoon. I don't play games, just a few photos and browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only hope now is the Lenovo Vibe X3, but there has been no news since March...
Thanks guys!
After installing more and more software that syncs, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+, with average screen-on around 4 hrs. All in all, not a miracle, but that's about what I got with my previous phone, the OPO, with its larger battery and lower screen resolution. A bit less, though, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic; my friend's Z3C gets 3+ days and 5 hours of screen-on time. And I think that's done with a great stock kernel enhanced by Sony engineers and the Sony stock ROM... wish other phones could do that.
Not sure why, but I could not choose "no lock screen" or "lock screen without security" under the security settings. Both options are greyed out, and it says they are disabled by the administrator, encryption policies or other apps.
I only have Prey and Lookout Security installed, which could affect this, but this was totally fine on my M7.
Anyone have any idea? Thank you.
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance, AND the price wouldn't change. Really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES THE 801, for this phone.
At the same resolution the Adreno 330 has better performance, AND the price wouldn't change. Really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
Yes, and it's fairly accurate.
I got 11871 at 2K. I couldn't screenshot it, as the tool doesn't know how to (it doesn't work above 1080p without dedicated software).
M8, 1080p.
Also, now that I look at it, your CPU integer score is out of proportion due to all the cores the MTK SoC has, lol. Too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion due to all the cores the MTK SoC has, lol. Too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/
Hmmm...
The efficiency of software decoding is where multiple cores show up vividly in real life.
There are many comparisons between the MT6795 (not the "T") and the SD801 on this site, covering browser speed, RAR compression, real gaming, power consumption...
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU does drag down the result, right?
Are those results after reducing the resolution to 1080p? Did I read that correctly? Because if that's the case, it's sad to confirm the limits of the 3D hardware, as others stated before. If you look at the M8 above (which I find surprisingly higher than what I experienced on my own M8 anyway), you'll notice that even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multicore-wise the Helio X10 is a beast, but graphics-wise it looks like it's not so much.
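For context on the 1080p-vs-2K comparisons in this thread: a purely fill-rate-bound score should scale roughly with the inverse of the pixel count, so dropping from 2560x1440 to 1920x1080 is worth about 1.78x on its own. A back-of-the-envelope sketch (real results also depend on vertex load and thermals; the score below is hypothetical):
Code:
# Project a fill-rate-bound GPU score from one resolution to another:
# score scales roughly inversely with pixels drawn.

def scaled_score(score, from_res, to_res):
    return score * (from_res[0] * from_res[1]) / (to_res[0] * to_res[1])

# Hypothetical 12000-point run at 1080p projected to a 2K panel:
print(round(scaled_score(12000, (1920, 1080), (2560, 1440))))  # ~6750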
