I played with the Arc today and... - XPERIA X10 General

First off, I don't want any fanboy reactions from SGS2 users or owners of any other phone. I'm just wondering why we haven't got the same.
Today I played with my nephew's Arc. And what can I say? We updated to the newest firmware before starting.
I must say that our X10 has received a very, very crappy 2.3.3 update. The Arc has ALMOST NO lag. Everything runs smoother than on our X10. They've both got a 1 GHz processor, so how did this happen?
Battery life is also MUCH better. The X10 was on the lowest brightness, the Arc on FULL brightness, both connected to WiFi. The X10 had a SIM card and the Arc did not. But damn, our X10 lost a big percentage of battery in constant use while the Arc lost just 1%! And the X10 is even running a custom ROM...
Both phones have a similar processor, only a newer generation on the Arc, and both run at the same clock speed. How the heck can our X10 then perform so much worse?
Just a few things I'm wondering, and NO, I'm not going to buy an Arc. I've got better things to buy.
Could any developer then fully uncap the FPS to 60 on our X10, including 3D? I'm not expecting 3D games to run at 60 FPS, but at least they would be able to reach the maximum FPS our GPU can handle if uncapped... The only thing we've got uncapped is 2D graphics, nothing more... and 2D graphics on the Arc are still smoother...
Who will be able to make our X10 perform better? Is it kernel related? Does the Arc have a newer kernel?
I know this thread is a bit messy, but how the heck can a phone with almost similar hardware (just a newer-generation processor and a few new things) perform so much better? You can't make me believe that the processor in the Arc is 100% faster... Both run at the same clock speed, with just some small power improvements and a stronger GPU, and the UI has no heavy graphics... The Arc is butter smooth in scrolling and all UI-related stuff.
I believe SE is capping more than only the FPS?

Did you know that the Arc's processor is better than the X10's, its RAM is higher, and its GPU performance is higher? What are you talking about, comparing a new phone that appeared almost 2 years later with the first Android phone from SE?
EDIT: I'm scared of this one:
Both phones have a similar processor
--------------------------------------
almost similar hardware

A better processor would also mean a better GPU...
It is entirely the processor. Try any phone with that generation of chip; it's the same reason those phones can overclock to 1.8 GHz comfortably.

Hey guys...
My Dell with a 2.4 GHz Pentium 4 running Win7 runs like crap compared to my Dell with a 2.4 GHz Intel Core CPU and an NVIDIA GTX 460. Why did Dell give us such an inferior Win7?
I mean, both laptops have the same clock speed, but the Core one lasts 4x as long as the old one! They're so similar, but it's so *convenient* that the newer one runs Win7 better.
Doesn't anyone else think that Dell is crippling our Win7 on purpose? It's a conspiracy, folks!

I guess there are several clarifications needed.
1. The X10's display is lit by fluorescent-type illuminators, compared to the Arc's LED backlight. Have you ever wondered why the backlight sometimes doesn't dim smoothly? A fluorescent-illuminated display consumes a hell of a lot more battery than an LED one. This is rather self-explanatory.
2. The process node in the X10's and the Arc's processors is different: the X10 uses 65 nm while the Arc uses 45 nm. Power consumption and efficiency are obviously different, and performance-wise they differ too.
3. The Adreno 200 and Adreno 205 differ greatly. The Adreno 205 enables hardware-accelerated Flash and rendering while the 200 does not. The Adreno 200 is ancient, FYI.
4. FPS on the X10 is uncapped to 55. I suppose that doesn't make much difference compared to 60?
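On point 4, a quick back-of-the-envelope check of what a 55 vs 60 FPS cap means in frame time (simple arithmetic, nothing measured on a device):

```python
# Frame-time difference between a 55 FPS cap and 60 FPS (pure arithmetic, nothing device-specific)
for fps in (55, 60):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
print(f"difference: {1000 / 55 - 1000 / 60:.1f} ms per frame")
```

So the cap by itself is only about 1.5 ms per frame; the perceived smoothness gap presumably comes from how often each phone actually reaches its ceiling.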

There is no "3d fps cap". This is the actual performance of Adreno200 (compare it with N1)
Cheers,
z

zdzihu said:
There is no "3d fps cap". This is the actual performance of Adreno200 (compare it with N1)
Cheers,
z
Can't we overclock our GPU, sir?
And why does it show 280 MB of RAM instead of 384 MB?
Sorry if my question is off topic..

Oodie said:
Can't we overclock our gpu sir ?
And why does it show 280mb ram instead of 384mb ram ?
Sorry if my question is out of topic ..
I don't think there is a way to directly overclock the GPU on the SoC; it's not like on a PC, where we have a separate graphics card doing the rendering.
About 100 MB of the RAM is dedicated as vRAM.

silveraero said:
Did you know that the Are processor higher than X10, RAM is higher than X10 and GPU performance is higher than X10? What are you talking about? Compare a new phone appears almost 2 years later with the first Android phone from SE?
EDIT: I'm scared of this one
But RAM doesn't render the UI? Both have a 1 GHz processor, so why is the X10 so laggy then...

Oodie said:
Can't we overclock our gpu sir ?
And why does it show 280mb ram instead of 384mb ram ?
Sorry if my question is out of topic ..
You can overclock the GPU by overclocking the whole processor; overclocking only the GPU is not possible.

KeizBaby said:
but ram doesnt render the ui? both 1ghz processor why is the x10 so laggy then...
Did you read any of the thread?

Oodie said:
Can't we overclock our gpu sir ?
No, but as mentioned before, you can overclock the whole chip.
Oodie said:
And why does it show 280mb ram instead of 384mb ram ?
The "missing memory" is reserved by the Linux kernel (DSP, camera, encoder, etc.) and is not accessible to the user. This is common.
Cheers,
z
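If you want to see this on your own device, the RAM the kernel actually exposes shows up in /proc/meminfo (a standard Linux interface). A minimal sketch, assuming you can run Python somewhere with access to that file, or you can just `cat /proc/meminfo` in a shell; the numbers vary per device and firmware:

```python
# Minimal sketch: report how much RAM the kernel exposes vs. the nominal 384 MB.
def kernel_visible_ram_mb(path="/proc/meminfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("MemTotal:"):
                kb = int(line.split()[1])   # value is reported in kB
                return kb / 1024.0
    raise RuntimeError("MemTotal not found")

visible = kernel_visible_ram_mb()
print(f"kernel-visible RAM: {visible:.0f} MB")
print(f"carved out (vs. 384 MB nominal): {384 - visible:.0f} MB")
```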

Old technology vs New technology...
go figure.

Thanks ToonXW , keizbaby and Z for repling. Got it now

So is there any way to overclock the whole chip on stock 2.3.3?

Why battery life is better on the Arc:
X10: 65nm CPU vs 45nm on the Arc
X10: 998 MHz @ 1300 mV vs 1024 MHz @ 1200 mV on the Arc
These things alone make it better.
And you forgot to mention that the Arc's screen reacts faster/better.
I couldn't type really fast on my X10; it didn't register some touches. On my Arc I can type really, really fast.
All these things make it better, BUT the Arc is 1 year newer than the X10!
In 1 year hardware makes huge steps forward.
Sent from my LT15i using XDA Premium App
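A rough sanity check on those voltage/clock numbers, assuming dynamic CPU power scales roughly with f x V^2 and ignoring the 65nm-vs-45nm process difference entirely (so this understates the real gap):

```python
# Relative dynamic CPU power, P ~ f * V^2, using the figures quoted above.
x10 = {"mhz": 998, "mv": 1300}
arc = {"mhz": 1024, "mv": 1200}

ratio = (arc["mhz"] * arc["mv"] ** 2) / (x10["mhz"] * x10["mv"] ** 2)
print(f"Arc vs X10 dynamic CPU power: {ratio:.2f}x")  # ~0.87x, i.e. ~13% less
```

So even before counting the newer process node, the Arc's slightly higher clock still comes in around 13% cheaper on dynamic power by this crude estimate.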

Flo95 said:
Why is battery life better on Arc:
X10 65nm CPU vs 45nm on Arc
X10 998MHz @ 1300mV vs 1024MHz @ 1200mV on Arc
Alone these things make it better.
And you forgot to mention that Arc's screen reacts faster/better.
Couldn't type really fast on my X10, it didn't register some touches then. On my Arc I can type really, really fast.
All these things make it better, BUT the Arc is 1 year newer than the X10!
In 1 year hardware makes huge steps forward.
Sent from my LT15i using XDA Premium App
It reacts better because it has better FPS and multi-touch...
Sent from my X10i using Tapatalk

adham3322 said:
So is there any way to overclock rhe whole chip on stock 2.3.3
Custom kernels... We're waiting for the SEMC sources...
Sent from my X10i using Tapatalk

The CPU can already be overclocked to 1.3 GHz; install a custom ROM if you want it.
The differences between the X10 and the Arc are the CPU, RAM and GPU. Same CPU family but different GPU (Adreno 200 vs 205), 384 MB vs 512 MB of RAM, and since the GPU is part of the CPU package, newer CPU technology means a better GPU. That's why everything on the Arc is smoother than on the X10. You must know exactly what you are talking about, not just the headline number of the processor.
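For the overclocking side, custom kernels normally expose the limit through the standard Linux cpufreq sysfs nodes. A minimal sketch of checking and raising the ceiling; the paths are the generic cpufreq ones, the higher frequency step has to exist in the kernel's table, and root is required, so treat this as illustrative rather than a guide for any specific ROM:

```python
# Minimal sketch: inspect and (with root) raise the CPU frequency ceiling via cpufreq sysfs.
# The kernel must already provide the higher frequency step; this only moves the limit.
CPUFREQ = "/sys/devices/system/cpu/cpu0/cpufreq"

def read(node):
    with open(f"{CPUFREQ}/{node}") as f:
        return f.read().strip()

print("hardware max :", read("cpuinfo_max_freq"), "kHz")
print("current limit:", read("scaling_max_freq"), "kHz")
print("available    :", read("scaling_available_frequencies"))

# Example only: push the limit to 1.3 GHz if the kernel's table offers it.
# with open(f"{CPUFREQ}/scaling_max_freq", "w") as f:
#     f.write("1300000")
```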

The X10 is pretty damn smooth now; I get all-day usage from it, constantly on WiFi. The people complaining must have forgotten what 1.6 was like; it's night and day now.
Sent from my X10i using XDA Premium App


JP7 at Linpack

Found the following entry at http://www.greenecomputing.com/apps/linpack/linpack-by-device/ (Samsung Galaxy S - position 6):
1000.0MHz samsung/GT-I9000/GT-I9000/GT-I9000:2.2/FROYO/XXJP7:user/release-keys
I know it's possible to set the note manually, but maybe JP7 is unofficially out now.
Is there a download available?
See also: http://samsunggalaxysforums.com/showthread.php/1268-you-will-hate-me-for-this-but...
Hmmm, our Galaxy's benchmarks seem a bit slow compared to the Desire, don't they?
I think Froyo with JIT enabled will boost our Galaxy.
JIT is not working in JP1/2/3.
http://androidandme.com/2010/05/news/jit-performance-boost-coming-with-android-2-2/
Why do you think it's not working? Look at the scores of the Droid 2, which comes with Froyo. It has the same scores as the SGS. Looks like the JIT is more optimised for Qualcomm CPUs.
Aery said:
JIT is not working in JP1/2/3.
In fact, it's disabled in JP1 and JP2, but it is enabled in JP3.
Mikulec said:
Why do u think its not working?Look at the scores of Droid 2 which comes with FROYO. It has the same scores as SGS. Looks like the jit is more optimised fro qualcomm CPUs.
I was just thinking the same thing..
Mikulec said:
Why do u think its not working?Look at the scores of Droid 2 which comes with FROYO. It has the same scores as SGS. Looks like the jit is more optimised fro qualcomm CPUs.
I can understand it being more optimized for Qualcomm's CPUs, but there is a 100% difference in the score.
Seems a bit odd to me.
Mikulec said:
Why do u think its not working?Look at the scores of Droid 2 which comes with FROYO. It has the same scores as SGS. Looks like the jit is more optimised fro qualcomm CPUs.
Sigh. Damn the N1.
More optimized for Qualcomm CPUs??? Isn't the CPU in both the Samsung Hummingbird and the Snapdragon in essence the same Cortex-A8 class core? Yes, they have some minor differences, but please, nearly 3 times the difference? I'm beginning to think that Linpack is favoring Qualcomm chips!
And something else! How the hell did the Nexus One score 78 MFLOPS?!
Let's get back to the more important thing: what about JP7, fake or real?
All I can say is that the Snapdragon CPU and the Hummingbird are built on the same architecture, so they should perform quite close to each other.
The only thing that would prevent that is drivers. But just wait; I think Samsung will show the true power of the device with 2.2 soon.
They're all based on the Cortex-A8, more or less, if that's what you're saying.
Just like Motorolas using TI OMAP3 don't get high scores, neither does the Galaxy S;
they never get 30 MFLOPS.
There is another thread somewhere that mentioned JP7, so yes, I think it's out in the wild somewhere.
I don't know, but maybe VFP (hardware FPU) functionality is enabled in the 2.2 Snapdragon builds (Linpack is FPO - floating point operations - intensive).
For example, on the S3C6410 (ARMv6), with a mod that enables VFP, the average Linpack scores are:
Stock: ~2.8 - 3.0
Stock w/ JIT: ~4.5
Stock w/ this mod: ~4.6
Stock w/ JIT & this mod: ~7.5 - 7.7
http://forum.sdx-developers.com/android-2-1-development/arm11-optimized-libdvm-so-3587/
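For anyone wondering what these numbers actually measure: Linpack reports MFLOPS from timed double-precision arithmetic on a linear system. A toy sketch of the idea follows (not the real Linpack kernel, just a multiply-add loop, so the absolute numbers are not comparable to the app's scores):

```python
# Toy FLOPS estimate: time n multiply-adds in double precision and report MFLOPS.
# This only illustrates what "MFLOPS" means; it does not reproduce Linpack scores.
import time

def toy_mflops(n=2_000_000):
    x, y = 1.000001, 0.0
    start = time.perf_counter()
    for _ in range(n):
        y = y * x + x          # one multiply + one add = 2 floating-point ops
    elapsed = time.perf_counter() - start
    return (2 * n) / elapsed / 1e6, y  # y returned so the loop isn't optimized away

mflops, _ = toy_mflops()
print(f"~{mflops:.1f} MFLOPS (toy loop)")
```

JIT and VFP matter so much here precisely because every iteration of a loop like this is floating-point work that either runs on hardware FPU instructions or gets emulated.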
I'm hoping Samsung has put a performance leash on the test versions because they want to blow people's minds once they release it properly.
It's a bit of a stretch though...
pepitodequetequejas said:
(limpack is FPU intensive).
If true, this makes Linpack a very, very poor benchmark for phone speed... Nothing really uses the FPU on a phone besides graphics, and that is all done by the graphics chipset, not the CPU, anyway.
RyanZA said:
If true, this makes linpack a very very poor benchmark for phone speed... Nothing really uses the FPU on a phone besides for graphics, and that is all done by the graphics chipset and not the CPU anyway.
Yes, my slower (800 MHz ARMv6, WM) i8000 can easily beat my SGS in
floating-point operations per second with Chainfire's modded libraries (which enable VFP).
http://en.wikipedia.org/wiki/LINPACK
Pardon my language, but the benchmarks made me cry: WTF is up with Samsung? But then, maybe Linpack has a x2 multiplier if it sees Motorola as the brand name or something. LOL
After all, why do we care about the Linpack score?
OrionBG said:
After All why do we care about Linpack score?
Because the kind of geeks who read a forum like this are obsessed with benchmarks and anything they can point to in order to say they've got the best geek toy on the planet right now!
MomijiTMO said:
Sigh. Damn the N1.
Yeah, damn the N1. LOL. Seriously, I have one, and it's such a love/hate relationship:
Perfect form factor/size/weight (well, wouldn't mind a 4" screen).
Great button layout.
Quality build.
Gets the latest greatest Google sh*t first.
Horrible touchscreen - almost impossible for me to type anything on it.
Horrible red bias to the screen/display.
Horrible pink dot in pictures.
Unusable in sunlight.
So close to the perfect phone, but not quite.
The same can be said for our SGS, though. The hardware is pretty damned sweet: the screen is accurate, beautiful and usable in sunlight; the weight is perfect; the size is perfect. Of course, out of the box it's almost unusable because of the lag! At least the SGS can be fixed with software for the most part. The N1's flaws are hardware and permanent. I'm not even sure HTC attempted to fix any of them in later builds of the phone.

[CLOSED/DZ] HTC Desire Z (G2 w/ Sense)

HTC just announced the Desire Z, the version of their "Vision" handset that is essentially identical to the G2. It runs a newer version of HTC Sense that's got a lot of nice new enhancements. I expect G2 owners will want to port this in a hurry.
The Desire Z will be available in October, in Europe and Asia only.
Check out this "Fast Boot" (boots in about 4 seconds): http://www.youtube.com/watch?v=OpWPMens9C8
YouTube video:
http://www.youtube.com/watch?v=ERH9BoU64pw
http://www.youtube.com/watch?v=NXnCOhvoqgw
http://www.youtube.com/watch?v=2FAIlKN2u5M
http://www.youtube.com/watch?v=XwiNDsZNRag
Specifications (from here):
800 MHz Qualcomm Scorpion
Qualcomm MSM7230
2x faster than Snapdragon CPU
5x better than Snapdragon GPU
Android 2.2 w/ HTC Sense
3.7 inch display
800 x 480 resolution
SLCD technology
5 megapixel camera
LED flash
720p HD recording
HSPA+ integration
WiFi b/g/n
Size: 4.7 x 2.4 x 0.6 inches
Weight: 6.5 ounces
Battery: 1300 mAh Li-ion
Talk time: up to 6.5 hours
Standby time: up to 17.5 days
Pop-out QWERTY keyboard
Four rows
Programmable “quick access” keys
4GB of internal memory
Full aluminum body
HTC Fast Boot
Pre-installed 8GB microSD card
GPS with A-GPS
DLNA Streaming and Media Sharing
3.5 mm stereo audio jack
micro-USB (5-pin micro-USB 2.0)
Audio supported formats – Playback: .aac, .amr, .ogg, .m4a, .mid, .mp3, .wav, .wma (Windows Media Audio 9)
Video supported formats – Playback: .3gp, .3g2, .mp4, .wmv (Windows Media Video 9)
Ambient light sensor
Proximity sensor
Digital compass
That fast boot is crazy. I'm loving the phone, definitely fast. However, the phone looks VERY thick.
So it's called "HTC Fast Boot". I wonder if that means the G2 won't have it. I'd be a lot more likely to turn my phone off and save battery if I knew I could turn it back on that quickly.
The Desire Z has 1.5GB of internal memory from what I've read at Engadget, not 4.
In fact, your source also says 1.5GB of internal memory.
Where are they getting that the CPU is 2x faster than Snapdragon? For all we know it's the same CPU shrunk to 45nm, so it should be the same speed.
It's possible HTC said that at the event? I'm just speculating.. I would be happy if it's 2x faster.. lol.
dualityim said:
Where are they getting that the CPU is 2x faster than Snapdragon? For all we know it's the same CPU shrunk to 45nm, so it should be the same speed.
A lot of the performance gain is from the Adreno 205 GPU as opposed to the older Adreno 200 GPU on the Desire, HD2 and Evo. The Desire HD, however, also has the Adreno 205.
Superfrag said:
Its possible HTC said that at the event? I'm just speculating.. I would be happy if its 2x faster.. lol.
Not really sure if that equates to better performance, but it is more efficient.
If you think about it, past 1st-gen Snapdragon phones had very poor GPUs, which made the CPU look slower. Maybe the upgraded GPU adds speed to an otherwise speedy CPU.
That could be true. Maybe the GPU in earlier Snapdragons was bottlenecking the CPU. Makes sense.. I also think these can be OC'ed more without issues due to the 45nm process.
But will the 1300 mAh battery survive?
I think it should be better. The shrink in die size results in a more efficient CPU, which emits less heat and consumes less power. If it was 1500 mAh it would have been super, but I think 1300 mAh will be pretty good.
Maybe, just maybe, HTC might release 1500 mAh batteries, sold separately. Or we might have to turn to aftermarket battery makers to satisfy our needs! (Obviously not the extended version; same size but higher mAh.)
Superfrag said:
That could be true. Maybe the GPU in earlier Snapdragons was bottlenecking the CPU. Makes sense.. I also think these can be OC'ed more without issues due to the 45nm architecture.
I've heard that the MSM7x30 is actually the new 1 GHz Snapdragon underclocked to 800 MHz. Apparently that info is on the Qualcomm website somewhere. If true, then hell yeah the CPU can be overclocked, at the expense of battery power though.
Sent from my SGH-T959 using XDA App
The video on YouTube titled "htc desire z - a closer look" is pretty amazing (I can't post links as I'm too new to this site). I'm just wondering how much of it is Sense and how much will be on the G2...
skulk3r said:
a lot of the performance gain is from the Adreno 205 GPU as opposed to the older Adreno 200 GPUs on the Desire, HD2 and Evo. The Desire HD, however, also has the Adreno 205.
Notice that those specs specifically mention that the CPU is 2x faster and the GPU is 5x faster, so when they say the CPU is 2x faster they are not talking about the performance boost gained from using the Adreno 205; they are specifically referring to a faster CPU core. That's what I'm wondering about: the GPU is faster for sure, but all the evidence points to a CPU that performs the same (or worse, due to the 200 MHz lower clock), just shrunk to 45nm for better power efficiency.
Do you guys think the G2 will get DLNA functions?
We don't know. AFAIK, the Desire Z doesn't have DLNA, nor does the Desire HD. All we can do is wait for it to be released, or for reviews of it to come out.
sheek360 said:
You guys think the g2 will get dlna functions?
Superfrag said:
We don't know. AFAIK, the Desire Z doesn't have DLNA, nor does the Desire HD. All we can do is wait for it to be released, or reviews of it to come out.
The Desire Z seems to have DLNA, according to HTC. It's around 4 mins into the promo video: http://www.youtube.com/watch?v=ly4C0TXmnlQ
People... clock speed is like horsepower in a car: it's not the only determinant of performance. Other stuff matters in a car, right? Like aerodynamics, weight, transmission, etc. Same for a chip; engineering matters. 2nd-gen Snapdragons are 2nd gen for a reason... they're better.
Sent from my HTC Dream using XDA App
sheek360 said:
You guys think the g2 will get dlna functions?
Probably not. I don't see T-Mobile digging into their pockets for that.
j.books said:
ppl... clock speed is like horsepwoer in a car. its not the only determinant of performance. other stuff matters in a car right? like aerodynamics, weight, transmission etc? same for a chip. engineering matters. 2nd gen snapdragons are 2nd gen for a reason... they're better
Sent from my HTC Dream using XDA App
You don't understand: Snapdragon is not a chip, it's a chipset or SoC. The CPU in Snapdragon is called Scorpion, and the G2 supposedly uses the same CPU, except manufactured on a 45nm process rather than a 65nm process. So except for the clock speed, we shouldn't expect any changes in the performance of the CPU.

[Q] Which one is faster ... HTC Desire Z or HTC Desire HD ?

Hi,
Can anyone tell me which device is faster,
the HTC Desire Z or the HTC Desire HD?
And if you wanted one of them, which one would you take?
an9093 said:
Hi
Can any one tell me which device is faster
HTC Desire Z or HTC Desire HD
and if want one of them which one you will take
This seems a strange question.
Desire Z = 800 MHz with 512 MB RAM
Desire HD = 1000 MHz with 768 MB RAM
So the HD is clearly faster.
Writing a Facebook status or anything else is faster on the HD because you don't have to pull out the keyboard.
Because of the hardware specs I think the HD has more power and is faster, but we have to wait for the first reviews... ?!
Same CPU and GPU, but the Desire HD's is clocked higher, so that wins.
Bigger screen = easier menu access = quicker
MacaronyMax said:
Same CPU and GPU but Desire HD's is clocked higher so that wins that.
Bigger screen = easier menu acses = quicker
I always thought that if the devices have the same CPU and GPU, the one with the smaller screen would be faster, as the same GPU has less real estate to render...
arnozzle said:
I always thought that if the device has same cpu n gpu the the one with he smaller screen will be faster as same gpu has less real estate to render...
Except in this case these two devices have different CPUs
chief1978 said:
this seems a strange question
Desire Z = 800mhz with 512mb RAM
Desire HD = 1000mhz and 768mb Ram
So the HD is clearly faster.
Sent from my HTC HD2 using XDA App
arnozzle said:
I always thought that if the device has same cpu n gpu the the one with he smaller screen will be faster as same gpu has less real estate to render...
Quite apart from the higher clock speed mentioned, I'm not sure this is true. It's not like it has to render more pixels; same resolution = same real estate, just blown up a bit. Maybe I'm wrong.
The difference would show if the resolution were smaller... for example:
HTC Wildfire: 1 GHz with 240x320 at 4.3" (I know it really has a 600 MHz CPU and a 3.2" screen, but imagine the same CPU and the bigger screen size)
HTC Desire HD: 1 GHz with 480x800 at 4.3"
The Wildfire would have at least 30% better performance... It's all about the pixels, not the physical size of the LCD.
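For what it's worth, the raw pixel counts behind that example (simple arithmetic on the resolutions quoted):

```python
# Pixels pushed per frame at the two resolutions mentioned above.
wildfire = 240 * 320      # 76,800 pixels
desire_hd = 480 * 800     # 384,000 pixels
print(f"Desire HD renders {desire_hd / wildfire:.0f}x the pixels of the Wildfire-style screen")
```

So per frame the GPU is filling 5x the pixels at the same clock, which is why resolution, not panel size, is the number that matters here.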
TNStrangelove said:
Quite apart from the higher clock speed mentioned I'm not sure this is true, it's not like it has to render more pixels, same resolution = same real estate, just blown up a bit. Maybe I'm wrong.
You're correct.
Seriously????
Are you really all answering this not so intelligent question? REALLY???
For the guy, who is asking this:
Use google!!!
Do research!!!
Search the forum!!!
Find the CPU model numbers, read about process nodes (nm) and GPUs - START TO LEARN.
This is WAY too dumb a question (pardon me) to be asked in a new thread. If you are asking such questions, I really advise you NOT to get this phone, because it's going to get SO MUCH MORE DIFFICULT than you would ever expect, judging from the simplicity of your question.
Stop living under a rock.
Sorry for my behavior; it's just that I get a little enraged from time to time. It's only a matter of time before someone asks something as basic as: which is brighter, a dual LED flash or a single LED flash, and which one would you get?
Isn't the Desire Z fitted with a newer CPU than the Desire HD? If so, it could be more powerful even if the clock speed is lower, especially since I just looked up the Desire Z's CPU and it says its recommended top clock speed is 1 GHz. I give it a few weeks after release at the most until the speed is raised with a custom kernel.
They are both 2nd-generation 45nm processors with the same Adreno 205 GPU... only the Desire HD is clocked higher at 1 GHz and the Z is at 800 MHz.
MaybachMan said:
Isn't the Desire Z fitted with a newer CPU than the Desire HD? If so, it could be more powerful even if the clock speed is lower, especially since I just looked up the Desire Z's CPU and it says it's recommended top clock speed is 1GHz - I give it a few weeks after release at the most until the speed's upgraded with a custom kernel.
Well, given both phones are being released at the same time, they both have the same model of CPU. Really man. Why would HTC give the Desire HD a worse processor than the Desire Z...
Smartmob said:
There are both 2nd generation 45nm processors with the same Adreno 205 GPU...only the Desire HD is clocked more on 1GHz and Z is at 800MHz
Does that mean we can overclock the Z to 1 GHz?
What about RAM? That would make a difference too.
Questions, questions... more is better... that's it. Numbers are numbers...
You can probably push it to some limit, but then you can push the DHD even further than the Z...
MaybachMan said:
Isn't the Desire Z fitted with a newer CPU than the Desire HD? If so, it could be more powerful even if the clock speed is lower, especially since I just looked up the Desire Z's CPU and it says it's recommended top clock speed is 1GHz - I give it a few weeks after release at the most until the speed's upgraded with a custom kernel.
No, both CPUs are next-gen Snapdragons. In fact I believe they are the exact same chip model; the Desire Z is just underclocked. It should be very easy to overclock, but that would just make them equal.
Yeah, I thought it made more sense to give them both the same CPU, TBH; I'd just read otherwise in other threads.
Smartmob said:
questions questions...more is better...thats it. Numbers are numbers...
You can probably push him to some limit, but then you can push DHD even more then Z...
Not necessarily. If the Z is using the same CPU underclocked, then theoretically you should be able to overclock them to the same level.
Opel Astra 1.9 CDTI 100 hp and Astra 1.9 CDTI 150 hp... Are they the same, only chipped? No, there are some minor differences in the engine too...
For those of you that haven't seen it yet... someone has overclocked a G2 to 1.4 GHz.

Desire Z processor is not that powerful

All the people claiming that the Desire Z processor at 800 MHz scores better in benchmarks than an overclocked Snapdragon: this is not true. While on a Desire Z ROM I underclocked my HD2 to 806.4 MHz (the same as the DZ) and I got a score of 1512 on the first try, the same as the DZ with its perfect GPU. SO OUR CPU PERFORMS BETTER (you can try it yourself); I am not a liar. I think the improved performance is in THE ROM itself, not in the processor.
Sent from my HTC HD2 T8585 using XDA App
You have to remember that the benchmarks (Quadrant, Linpack etc.) are all synthetic, like 3DMark back in the day for PC graphics cards. There are so many things that can affect your scores, both adversely and positively, that they should only be used as a very rough guideline and nothing more. Direct comparisons are all but pointless.
Reno_79 said:
You have to remember that the benchmarks (quadrant, linpack etc) are all synthetic, like 3dmark back in the day for pc graphics cards. There are so many things that can affect your scores both adversely and positively that they should only be used as a very rough guideline and nothing more. Direct comparisons are all but pointless.
I am talking to the people who take Quadrant as proof of performance; I know that its results are not accurate.
Sent from my HTC HD2 T8585 using XDA App
I think you should use Quadrant Advanced to compare CPU scores; I know the I/O scores help our HD2 a lot.
Even software affects the Quadrant CPU score, so it is not reliable. Quadrant benchmarks H.264 decoding performance as part of its CPU test, and, for example, having the Stagefright decoder enabled doubles the CPU score! Disable Stagefright and your CPU will score 400 instead of 800 (in Quadrant Advanced). Using a better software decoder changes the CPU score by a large amount, and re-benchmarking produces higher results because of caching. MIPS-calculating benchmarks are better (like the one in SetCPU).
Sent from my HTC HD2 using XDA App
That might be true but the Desire Z/G2 has a co-processor for apps that we don't have.
Sent from my HTC HD2 T8585 using XDA App
psykick5 said:
That might be true but the Desire Z/G2 has a co-processor for apps that we don't have.
What is this coprocessor? The Desire Z has the same Scorpion core as the HD2, only on 45 nm tech.
I'd say it has a better processor... it just got overclocked to 1.4 GHz.
http://www.androidcentral.com/t-mobile-g2-overclock-gets-even-better-and-released
Wow, that makes me want it.
RobertsDF said:
I'd say it has a better processor... it just got overclocked to 1.4 Ghz.
http://www.androidcentral.com/t-mobile-g2-overclock-gets-even-better-and-released
Your battery will say thank you. Your chipset too. This phone is not made for such things; that won't last very long, I think. But it is quite impressive, it seems to be veeeeery fast.
JanssoN said:
Your battey will say thank you. Your chipset too. This phone is not made for such things. That won't last very long I think. But it is quite impressive, seems to be veeeeery fast
+1.. Imagine, a 75% overclock?? I wonder if the carpet has burn marks on it when he lifts the phone up, or if his face has burn marks and that's why he's not showing it on cam.. lol. Because with a 75% OC on a very small device where there is not enough room to breathe, the whole phone would be like a big heatsink if used for any period of time. I guess that's also the reason why HTC put in the 800 MHz CPU instead of the 1 GHz one. I'm sure our HD2 can achieve such a high OC, but it wouldn't be advisable, as that kind of heat could melt/crack the solder joints on the GPU and/or processor. And I believe the GPU and application coprocessor they're talking about on the G2 is just a marketing ploy to justify its price tag, maybe to cover the build cost, since there are moving parts (the hinge) and the hardware keyboard. Even the guy at the T-Mobile store told me that the G2 isn't fast at all; he said it's nothing close to the Evo or the Nexus One, as some people and websites claim. Funny, when he asked me what kind of phone I have, I pulled out my HD2 and showed it to him. He was surprised to see Android on it and asked if he could play with it, so I let him. After playing with it for a while, he advised me to wait for the new device that's supposed to come out before the end of the year. He even told me that getting a G2 would be the same as downgrading, as he feels my HD2 is way, way faster than the G2. I told him I was thinking about getting the Vibrant because the port for our HD2 is nothing close to perfect and it's still running from the SD card; again he discouraged me and told me to wait for the next device. So I guess that's what I will do.
I'm surprised that you were in the store and didn't test drive the G2 for yourself. Are you sure he is a salesperson? He didn't sound like one.
justwonder said:
I'm surprised that you were in store and didn't test drive G2 for yourself, are you sure he is sale person?, he didn't sound like one.
As a matter of fact, I did. That's actually one of the main reasons I went to the store, as I've been reading a lot of good things about the G2. I went there to compare the G2 with the Samsung Vibrant, but in the end I didn't like the G2's performance, despite the fact that it's the only phone right now on T-Mo that supports HSPA+. And yes, he's a salesperson. I was surprised as well when he told me about the upcoming Desire HD, but that didn't happen until I showed my HD2 to him and let him play with it for a while. Maybe he knew that I'm a phone enthusiast and that I might just end up returning the phone within the 30-day period after playing with the G2. Who knows? I think the G2 is wayyy overrated; it performs within its specs, nothing special.
RobertsDF said:
I'd say it has a better processor... it just got overclocked to 1.4 Ghz.
http://www.androidcentral.com/t-mobile-g2-overclock-gets-even-better-and-released
The Desire Z overclocks to exactly the same speeds. The record OC is 1470 MHz.
So they are the same CPUs, clocked at different speeds. Bigger screen = higher clock to handle the bigger screen.
EDIT: the Desire Z overclocks to 1.7 GHz at 1400 mV.
EDIT: the Desire Z overclocks to 1.9 GHz at 1500 mV.
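A rough sanity check on those two EDIT data points, assuming dynamic power scales roughly with f x V^2 (this ignores leakage and thermals, so treat it as a lower bound):

```python
# Relative dynamic power between the two reported overclock points, P ~ f * V^2.
oc_17 = {"ghz": 1.7, "mv": 1400}
oc_19 = {"ghz": 1.9, "mv": 1500}

ratio = (oc_19["ghz"] * oc_19["mv"] ** 2) / (oc_17["ghz"] * oc_17["mv"] ** 2)
print(f"1.9 GHz @ 1500 mV draws ~{ratio:.2f}x the dynamic power of 1.7 GHz @ 1400 mV")
```

So that last 200 MHz costs roughly 28% more CPU power on top of an already aggressive overclock, which is why the battery and heat concerns above are fair.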

SGSIII Mali 400 Drivers on the note!

The folks in the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S. As some of you may know, the Adreno 225 is the same as the Adreno 220 GPU but with double the frequency. The frequency has nothing to do with it here, if you ask me: using these drivers gave them a HUGE performance boost at the STOCK frequency.
As we know, the Mali 400 GPU in the SGSIII is clocked at 400 MHz, but even if you clock the Mali 400 GPU in your Note (which has the same resolution) to that speed, you won't reach that performance, which tells me it's all about the drivers, just like with the Adreno 225.
So can the developers extract the Mali 400 drivers from the SGSIII so we can use them on our phones?
This is not a question, so I think it belongs here rather than in the Q&A section; it's just a discussion about whether this is going to work or not!
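For anyone curious how such driver blobs are usually grabbed for inspection: they are just userspace libraries pulled off a running device with adb. A minimal sketch follows; the exact library names and locations vary per firmware, so treat /system/lib/egl/ and the Mali filename below as assumptions to verify on your own ROM:

```python
# Minimal sketch: pull the GPU userspace libraries from a connected device with adb.
# Paths/filenames below are typical guesses and MUST be verified per firmware.
import subprocess

CANDIDATES = [
    "/system/lib/egl/",            # EGL/GLES userspace drivers usually live here
    "/system/lib/libMali.so",      # assumed Mali blob name; check `adb shell ls /system/lib`
]

for path in CANDIDATES:
    # `adb pull` copies files or whole directories into the current working directory.
    subprocess.run(["adb", "pull", path, "."], check=False)
```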
Same driver, bigger screen = performance loss.
That is why Sammy set the CPU 200 MHz higher on the Note than on the S2.
The screen size has NOTHING to do with it; the resolution does, and that is the same on the SGSIII and the Note.
Also, that's why I said that even if you overclock the GPU to 400 MHz you still won't reach that performance, so it has to be the drivers.
The Note and SGSIII do indeed have different screen resolutions, the Note being 1280x800 while the SGSIII is 1280x720. Not much of a difference though, basically 16:10 vs 16:9, respectively. I believe the new Mali 400 drivers will be in the next ROM update anyway.
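The actual pixel counts behind that comparison (plain arithmetic on the two resolutions):

```python
# How many pixels each panel pushes per frame, and the relative difference.
note = 1280 * 800     # 1,024,000
sgs3 = 1280 * 720     #   921,600
print(f"Note renders {note - sgs3:,} more pixels (~{(note / sgs3 - 1) * 100:.0f}% more) than the SGSIII")
```

That is roughly an 11% heavier fill load per frame; real, but nowhere near enough on its own to explain a large performance gap.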
Hell Guardian said:
The folks at the HTC Sensation/EVO 3D section extracted the Adreno 225 drivers from the HTC One S, as some of you may know that the Adreno 225 is the same as the Adreno 220 GPU but just have double the frequency! the frequency has nothing to do here if you ask, using these drivers gave them a HUGE performance boost with the STOCK frequency
as we know that the Mali 400 GPU at the SGSIII is clocked at 400mhz but even if you clocked your Mali 400 GPU in your Note (which has the same Resolution) you wont be able to reach that performance which tells me that its all about the drivers just like the Adreno 225
So can the Developers extract the Mali 400 Drivers from the SGSIII so we can use it on our phones?
This is not a question so i think it belongs to here not the Q/A section as its just a discussion if this is going to work or not!
Well, if they are exactly the same, just at different clock speeds, then I would think they should indeed work.
This is interesting and I certainly hope it does. Not that the GPU is lacking at 400 MHz or even less, but who doesn't like more performance for free?
Muskie said:
The note and SGSIII do indeed have different different screen resolutions, the Note being at 1280x800, while the SGSIII is at 1280x720. not much of a difference though, basically 16:10 vs 16:9, respectively. I believe the new Mali400 Drivers will be in the next ROM update anyway.
I know that, but that difference is not major by any means; it won't affect the performance that much if they both run at the same frequency.
shaolin95 said:
Well , if they are exactly the same just different clock speeds then I would think they should work indeed.
This is interesting and I certainly hope it does , not that at 400mhz or even less, the GPU is lacking but who does not like more performance for free?
My thoughts exactly. If the folks in the Sensation section did it, why can't we?
Link to the drivers that were extracted from the One S:
http://forum.xda-developers.com/showthread.php?t=1643472
Just check the replies to see the performance boost. This is the EXACT same situation as with the Note and the SGSIII GPU.
Wow, that's a good boost.
nex7er said:
Wow, that's a good boost.
I think if the Note users can get that kind of boost on their phones, it will eliminate ANY kind of lag in the UI and it will be amazingly smooth. It would also give a huge boost to SGSII users.
If this really happens and it does work, what about the battery life... it could be poorer, I think.
In theory, I see where you're going with this, and in theory it sounds plausible. However, something that I think has been overlooked is the process node of the new S3's chipset vs the one found in the current-generation S2/Note (45nm vs 32nm). It's entirely possible that the only reason Samsung is able to run the Mali-400 at 400 MHz is that the 32nm process is just that much more efficient, such that you can safely run at 400 MHz using the same power as you would at 266 MHz on the 45nm process.
I just get the feeling that trying to push the 45nm process up to 400 MHz might simply melt the silicon (or at least gobble your battery life in one gulp!). Call me defeatist if you have to, but I remain skeptical until I see evidence to the contrary.
I run my Galaxy Nexus with the GPU clocked to 512 MHz (standard is 308 MHz), and that chip too is built on a 45nm process.
I've been running it like that for the last 3 months with no issues, and game FPS is greatly improved.
Are there any kernels at all that even support overclocking the GNote's GPU?
Very interesting. I would like to see this investigated further, for sure!
The screen has nothing to do with it... on the Note we've got about 100k more pixels (1280x800 - 1280x720 = ~102k),
and the S3 has more cores in the Mali GPU... but yeah, I think the drivers would get us more performance.
lyp9176 said:
if this really happened and it does work, what about the battery-life... can be poorer i think
The SGS3 seems to have decent battery life.
resistant said:
screen has nothing to do with it...on note we got 100k more pixels 1280x800-1280x720=100k
,,, and s3 has more cores in the mali-gpu...but yea i think the drivers would get us more performance
After some digging I found that the GPU in the Exynos 4210 (SGS2/Note) and the 4412 (SGS3) is exactly the same Mali-400 MP4 (same number of GPU cores)! The only difference is that the 4412's GPU can go up to 400 MHz (which is doable with our GPU too and has already been done on the SGS2). The main difference here is the four CPU cores that help the GPU. I'm skeptical that the new drivers will do much (if anything) in terms of performance! Oh, and let's not forget that the Adreno GPU drivers are written by Qualcomm, and they can't do anything right, so the updated drivers may just be better written (or at least less buggy) than the old ones!
Manya3084 said:
I run my galaxy nexus with the GPU clocked to 512mhz (standard is 308mhz), and that cpu too uses the 45nm process.
Been running it like that for the last 3 months with no issue, and game fps is greatly improved.
It has been shown to make very little improvement over a well-developed kernel, hence why developers like Franco and imyosen took it out.
The game frame rate gain is simply due to force GPU rendering being active.
Mahoro.san said:
The sg s3 seems to have a decent battery life
That is due to the new processor voltage and the low idle drain of the CPU
Sent from my GT-I9300 using xda premium
GR36 said:
It has been proved to make very little improvement over a well developed kernal. Hence why developers like Franco and imyosen took it out.
Game frame rate is simply due to force gpu being active
Sent from my GT-I9300 using xda premium
Was this during kernel development in the Gingerbread days, or with the current ICS kernels?
Sent from my Galaxy Nexus using Tapatalk 2
Maybe...
Clocking the GPU at 400 MHz would give a boost in performance, but at the cost of battery life... and it would also make the phone really hot... which is not ideal... Just wait a little while and see how the S3 performs under those conditions...
