News on the Wizard - 8125, K-JAM, P4300, MDA Vario General

Ok, here is a quick round-up of the news that is currently available on this device. For anyone who doesn't know, the Wizard looks like it is going to be the follow-up to the Magician. Size-wise it is almost identical, except the Wizard looks like it is going to be a touch thicker than the Magician. However, this is forgiven, as the extra thickness is due to the addition of a sideways slide-out keyboard (see pic). This will please many owners of the Magician who have found text input a little too fiddly on that device. It is rumoured that the device will be made available in a number of different versions, i.e. some with a camera, some without, etc. However, it is generally thought that the specs for the various models run along the lines of a 200MHz OMAP processor (hmm, I hope that is quicker than it sounds?! :shock: ), GPRS/EDGE, built-in WiFi and Bluetooth. The screen size is 240x320 and it runs WM5!
It is already expected that the device will be sold under various different brandings. Many of the pics floating around the internet show the device branded as the Qtek 9100. It is also almost 100% confirmed that Orange will offer the Wizard as the SPV M600, and it could hit the shelves as early as September!!! No doubt the other operators will also offer the phone under various different names!
Phone Scoop reports that the HTC Wizard has passed FCC certification, and the submitted version has 850 and 1900MHz support, which is good news for those in the US.
Pics are courtesy of WindowsMobile.no and there are many more available HERE
And there is a video available HERE which is also thanks to those good people at WindowsMobile.no
Enjoy:
Tom 8)

but its CPU speed is 175MHz.

Hmm, EDGE... no use in the UK then for Internet speed.

There are also pictures of a CDMA version. Sprint is calling it the PPC-6700 and it should be out before the end of October.
Report with pics on PCSintel.

SaltyDawg said:
There are also pictures of a CDMA version. Sprint is calling it the PPC-6700 and it should be out before the end of October.
Report with pics on PCSintel.
PhoneArena has an article on the CDMA version implying a fast Intel processor, not the slow TI everyone is concerned about:
http://phonearena.com/htmls/HTC-Apache-the-CDMA-version-of-the-Wizard-article-a_643.html

i'm gonna cry
see my sig? i thought it was enough compensation that the wizard has a slow processor - won't make me sad that i didn't wait for it. now this CDMA version will really make me cry. BTW, is CDMA = 3G? currently doing a google search on this. and how does its size compare to the magician/non-CDMA wizard? i mean, the size might change because of the antenna, but i want to know by how much.
(i forgot one more thing, hence the editing)
would've been excellent if only the screen slid out, so it looks like the Sidekick. awkwardly off-center screen, but i can live with that.

CDMA itself is not 3G, no. But the CDMA carriers in the USA have EVDO, which is a 3G technology. So, CDMA is not 3G, but if it's CDMA in the USA it means it will be 3G, assuming it supports EVDO.
On the size, look at the picture showing it next to the Samsung i730. It's the same size. Too bad it has a big, fat antenna. But not counting the antenna, it's the same size as the Samsung i730.

http://gullfoss2.fcc.gov/prod/oet/forms/blobs/retrieve.cgi?attachment_id=555794&native_or_pdf=pdf
Nice to know

yeah it looks good, i may replace my mda compact, but the processor is the only thing though that may let the device down :?:
info on the OMAP850 chip:
http://focus.ti.com/general/docs/wt...00&path=templatedata/cm/product/data/omap_850

Rumor has it that the CDMA version (Sprint PPC-6700) does have EVDO, which (like SaltyDawg said) is 3G.

when does it release?
So it sounds like the slow processor isn't going to support Skype then? Skype has to run on at least a 400MHz CPU...
Any idea when it's going to be released? cheers

question for those who really know: would the OMAP 850 really be uber slow compared to Intel's XScales? i mean, doesn't the OMAP work like 'dual processors'? one for phone and one for PDA functions (i read this somewhere, just can't remember where), and doesn't it consume less power? any insight would be appreciated.

bnycastro said:
question for those who really know: would the OMAP 850 really be uber slow compared to Intel's XScales? i mean, doesn't the OMAP work like 'dual processors'? one for phone and one for PDA functions (i read this somewhere, just can't remember where), and doesn't it consume less power? any insight would be appreciated.
It depends on which XScale we are talking about. Compared to the outgoing XScale parts, well... I can assume its slowness is marginal, but compared to the existing PXA27x... it's no contest. AFAIK, anything OMAP with a three-digit name equals bad everything... It's not even dual core, like everyone believed it is.
Again, AFAIK, the dual-core OMAPs use a four-digit naming scheme, like the 1510 in the iPAQ 6300 series and the Tungsten T. Just check the official Texas Instruments site.
I am very much hoping that the HTC Wizard will replace my Treo 650... I had bad memories with my Blue Angel almost a year ago... but since then, I couldn't live without an integrated QWERTY keyboard. The only replacement good enough for me is the iPAQ 6500 series, but with my Treo I can trek places where no PDA-phone has ever gone before, as it has 1800 mAh of battery capacity.
This processor fiasco has me thinking again about buying the Wizard. But we shall see the truth in the near future.

Hrm, yep... seems that it is only a single core: http://focus.ti.com/general/docs/wt...00&path=templatedata/cm/product/data/omap_850... Again, I think this is a non-issue, since it's only _really_ going to matter for things such as Skype, where the phone AND the PDA side of things need to be working together. For stuff like web browsing, IM, planners etc. it should be more than enough (at least compared to a damned Symbian Nokia which runs at 103MHz).
In a way I see the slow processor as a blessing (though, watch as I eat my words when the thing actually arrives) - keeping it at a slower clock speed means the battery will last longer. Couple this with the fact that it uses WM5.0, which has persistent storage, and we should be looking at pretty good battery life - or they'll have provided us with a really lame battery to save money *shrugs*.

ShALLaX said:
Hrm, yep... seems that it is only a single core: http://focus.ti.com/general/docs/wt...00&path=templatedata/cm/product/data/omap_850... Again, I think this is a non-issue, since it's only _really_ going to matter for things such as Skype, where the phone AND the PDA side of things need to be working together. For stuff like web browsing, IM, planners etc. it should be more than enough (at least compared to a damned Symbian Nokia which runs at 103MHz).
In a way I see the slow processor as a blessing (though, watch as I eat my words when the thing actually arrives) - keeping it at a slower clock speed means the battery will last longer. Couple this with the fact that it uses WM5.0, which has persistent storage, and we should be looking at pretty good battery life - or they'll have provided us with a really lame battery to save money *shrugs*.
I don't understand your comment here; the datasheet clearly shows that the OMAP 850 is a dual-core CPU. The "primary" core is an ARM926 which will be used to run the OS, apps, etc. The "secondary" core is actually a multi-core which contains an ARM7 coupled with a C54 DSP to do all GSM/GPRS baseband processing.
True, only a single core will be used for the OS, but with Xscale phone designs a single core handles BOTH the OS and the GSM processing.
On another note, it makes little sense to compare CPUs of two different architectures (XScale and OMAP) solely by their clock speeds, as in many cases the amount of work that one CPU can perform in a given clock cycle is different from another's.
Given that the "primary" core on the OMAP 850 is dedicated 100% to the OS, along with its dual-core nature allowing it to process the GSM baseband and the OS simultaneously, I think this CPU is up to the task. I don't think it will outperform the XScale (especially at higher clock speeds), but I don't think it's gonna be a dog either.
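To make that clock-vs-work-per-cycle point concrete, here's a minimal Python sketch; the IPC figures are invented placeholders purely for illustration, not measured values for either chip:

```python
# Throughput is roughly clock * instructions-per-cycle (IPC), and IPC
# differs per architecture, so MHz alone is a poor yardstick.
# Both IPC values below are made-up placeholders, not measured figures.

def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Crude estimate of millions of instructions per second."""
    return clock_mhz * ipc

print(throughput_mips(200, 1.1))  # hypothetical OMAP850 ARM926 core -> 220.0
print(throughput_mips(520, 0.9))  # hypothetical XScale PXA27x core  -> 468.0
```

On that crude model, a higher-IPC core at a lower clock can close part of the gap, which is why the datasheet architecture matters as much as the MHz.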

Hah, you're right... I read ARM7 MCU as ARM9 MCU and assumed it was just the Memory Control Unit for the ARM9 processor... then I looked again and saw that there is a memory control unit for both processors in another layer (that and the fact that a memory control unit in the GSM/GPRS controller makes no damned sense at all). My bad, I'm ill with a fever in bed at the moment so cut me some slack ;p
So yes, we have an ARM7 and an ARM926...
Even though it isn't technically fair to compare the processors on clock speed alone, we still know the OMAP will lose out, because XScale is a technological advancement... a plain "ARM9" will be slower.
I agree with you, I don't think it will be terrible. I made do with a 133MHz SH-3 device for a good few years... and SH-3 processors were horribly slow.

my head is spinning, but I do thank you for your informed replies! so it is dual core after all... that's reassuring. I read in a preview on one of the Dutch/Swedish/Norwegian (not sure which) enthusiast sites that it was zippy... but then again they might not have loaded it with a lot of stuff. I also saw the video at WM.no and it seemed zippy... really waiting for a full review from our friends with fast access to cool new gadgets... yoohoo!

Yeah, thanks for the info... I apologize then; I couldn't read the schematics on the TI site. I just looked at the navigation tab and saw that the OMAP 850 falls in the category of Modems and Applications, whereas the OMAP 1510 falls in the category of High-Performance... so based on that I extracted an assumption that TI reserved the dual-core design for their higher-performing chipsets.
I agree wholeheartedly that OMAP is the most efficient (power-wise) processor available today, though. But I'd prefer an XScale powering the Wizard... at least I have tried overclocking my Treo to 400MHz without any problem... I expect my next Windows Mobile powered device to be powerful yet still manage a full day's usage.
It basically comes down to the usage of the device. Probably HTC did some research a while back showing that Magician users don't utilise their device to the max, so HTC can cut production costs a little by using the TI processors... well, that's my two cents. At least all the Magician users here in my country are ladies and girls who don't know how to reset the device... let alone install something... they just use the device as-is.


wow, that's great news for you guys over there in the US! wish we had the same news here... but I suspect we'd be a bit late as usual!

Related

Is HummingBird Really Slower than Snapdragon Gen 2? [Independent of JIT]

I know this topic has been debated over time, but I noticed that most people attribute the differences in performance to the firmware difference (2.1 vs. 2.2).
Today an article was released about the G2 overclocked to 1.42 GHz. Along with the article I noticed a "Native Benchmark" using SetCPU, which doesn't use JIT.
Lower is Better.
G2 Result:
Now my Vibrant at 1.2 GHz:
C: 702.89
Neon: 283.15
The difference between the two phones is so great that I doubt it is due to the 200 MHz difference alone.
As a comparison, my score at regular 1 GHz is:
C: 839.21
Neon: 334.51
There is about a 130 ms decrease for the 200 MHz overclock, which means a Vibrant at 1.4 GHz would put the two CPUs really close to each other, but with the G2 having a slight edge. Remember, this test is supposed to be JIT-independent, running native code. But since the Vibrant can only be stably overclocked to 1.3 GHz (with what is available anyway), the newer generation of Snapdragon may just be more efficient than the Hummingbird, despite what we Galaxy owners believe.
Another thing to keep in mind, though, is that Snapdragons are supposed to have an edge in the NEON instruction set, so I didn't read into that score too much.
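As a sanity check on those numbers, here's a small Python sketch using the naive model that native benchmark time scales inversely with clock speed (the scores above are times in ms, lower is better); the input figures are the ones quoted in this post:

```python
# Naive scaling model: runtime is inversely proportional to clock speed.

def expected_time_ms(base_time_ms: float, base_mhz: float, new_mhz: float) -> float:
    return base_time_ms * base_mhz / new_mhz

c_at_1000 = 839.21  # Vibrant "C" score at stock 1.0 GHz, from the post
print(expected_time_ms(c_at_1000, 1000, 1200))  # ~699 ms vs. the measured 702.89
print(expected_time_ms(c_at_1000, 1000, 1400))  # ~599 ms, if 1.4 GHz were stable
```

The measured 702.89 ms at 1.2 GHz is within about 0.5% of the naive prediction, which suggests the Vibrant's score really does scale with clock, and the remaining gap to the G2 comes from somewhere else.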
It appears to be true.
It appears the Hummingbird is not only slower than the new-generation Scorpions; it also appears the Hummingbird is unable to fully capture the CPU performance gain of the Dalvik JIT compiler in Froyo 2.2.
http://www.youtube.com/watch?v=yAZYSVr2Bhc
Dunno, Something Is Not Right About This 2.2.
The Thing That Really Bugs Me Is That 2.2 Is Supposed To Allow The Full Functionality Of Our 512MB Of RAM... But It Doesn't.
Erickomen27 said:
Dunno, Something Is Not Right About This 2.2.
The Thing That Really Bugs Me Is That 2.2 Is Supposed To Allow The Full Functionality Of Our 512MB Of RAM... But It Doesn't.
It's not 2.2, it's Samsung.
SamsungVibrant said:
It's not 2.2, it's Samsung.
I agree, they should use ext4 on their phones.
I don't see why they would stick to their old RFS.
SamsungVibrant said:
It appears the Hummingbird is not only slower than the new-generation Scorpions; it also appears the Hummingbird is unable to fully capture the CPU performance gain of the Dalvik JIT compiler in Froyo 2.2.
http://www.youtube.com/watch?v=yAZYSVr2Bhc
I'm sorry, but could you explain what your youtube link has to do with the topic? I'm curious, as I wasn't any wiser on the question at hand when I watched it.
NEON is an architecture extension for the ARM Cortex™-A series processors.*
Is Snapdragon an ARM Cortex™-A series processor? NO!
Remember the SSE instruction set in Intel, and the AMD vs. Intel war?
Welcome back, LOL
*The source for NEON: http://www.arm.com/products/processors/technologies/neon.php
Probably is, but does it really matter?
Sent from my SGS Vibrant.
Scorpion/Snapdragon has faster FPU performance due to a 128-bit SIMD FPU datapath, compared to the Cortex-A8's 64-bit implementation. Both FPUs process the same SIMD-style instructions; the Scorpion/Snapdragon just happens to be able to do twice as much per instruction.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similarly high Scorpion/Snapdragon scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average 15-20 in Linpack scores.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now... and even by then we will have dual-core CPUs. It's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead, thanks to the 90M-triangles/s PowerVR SGX540.
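For what the 128-bit vs 64-bit datapath claim means in practice, here's a toy Python sketch that just counts the SIMD instructions needed per array; this is lane arithmetic under the post's assumptions, not real NEON code:

```python
# A 128-bit SIMD unit packs four 32-bit floats per instruction, a 64-bit
# unit packs two, so the wider unit issues half as many instructions for
# the same amount of data. Instruction counts only; no actual vector math.

def simd_ops_needed(n_floats: int, datapath_bits: int) -> int:
    lanes = datapath_bits // 32       # 32-bit float lanes per instruction
    return -(-n_floats // lanes)      # ceiling division

print(simd_ops_needed(1_000_000, 128))  # 250000 (Scorpion-style FPU)
print(simd_ops_needed(1_000_000, 64))   # 500000 (Cortex-A8-style FPU)
```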
demo23019 said:
Scorpion/Snapdragon has faster FPU performance due to a 128-bit SIMD FPU datapath, compared to the Cortex-A8's 64-bit implementation. Both FPUs process the same SIMD-style instructions; the Scorpion/Snapdragon just happens to be able to do twice as much per instruction.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similarly high Scorpion/Snapdragon scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average 15-20 in Linpack scores.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now... and even by then we will have dual-core CPUs. It's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead, thanks to the 90M-triangles/s PowerVR SGX540.
Once again quoting ARM HQ website:
NEON technology is cleanly architected and works seamlessly with its own independent pipeline and register file.
NEON technology is a 128 bit SIMD (Single Instruction, Multiple Data) architecture extension for the ARM Cortex™-A series processors, designed to provide flexible and powerful acceleration for consumer multimedia applications, delivering a significantly enhanced user experience. It has 32 registers, 64-bits wide (dual view as 16 registers, 128-bits wide).
Scorpion is not an ARM Cortex™-A series processor
Fuskand said:
I'm sorry, but could you explain what your youtube link has to do with the topic? I'm curious, as I wasn't any wiser on the question at hand when I watched it.
I provided the link because the first part of it talks about the JIT compiler, which increases CPU performance. I put that there in case someone has never heard of this before. Thus, when I mentioned that the Hummingbird cannot take full advantage of the JIT compiler, people would know what I'm talking about.
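For anyone who hasn't met a JIT before, here's a rough Python analogy (not Dalvik code): the same loop run step-by-step by the interpreter vs. handed to a precompiled routine. A JIT narrows exactly this kind of gap by compiling hot bytecode to native code at runtime:

```python
import timeit

def interpreted_sum(xs):
    total = 0
    for x in xs:          # each iteration walks through the bytecode interpreter
        total += x
    return total

data = list(range(100_000))
t_interp = timeit.timeit(lambda: interpreted_sum(data), number=50)
t_compiled = timeit.timeit(lambda: sum(data), number=50)  # C-compiled fast path
print(f"interpreted: {t_interp:.3f}s  compiled path: {t_compiled:.3f}s")
```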
demo23019 said:
Scorpion/Snapdragon has faster FPU performance due to a 128-bit SIMD FPU datapath, compared to the Cortex-A8's 64-bit implementation. Both FPUs process the same SIMD-style instructions; the Scorpion/Snapdragon just happens to be able to do twice as much per instruction.
http://www.insidedsp.com/Articles/t...ualcomm-Reveals-Details-on-Scorpion-Core.aspx
2.2 isn't going to magically give the Galaxy S similarly high Scorpion/Snapdragon scores.
Just look at the Droid X and other Cortex-A8 phones that already have official 2.2 ROMs; they average 15-20 in Linpack scores.
This doesn't make the Hummingbird a bad CPU at all, LOL. It's stupid benchmarks, IMHO, that are not going to show in real-world use... maybe when the OS matures and becomes more complex, but not now... and even by then we will have dual-core CPUs. It's a gimmick for HTC to have the "fastest CPU".
IMO in real-world use they are pretty much on par, but when you look at GPU performance it's quite obvious the Galaxy S pulls ahead, thanks to the 90M-triangles/s PowerVR SGX540.
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
lqaddict said:
Once again quoting ARM HQ website:
Scorpion is not an ARM Cortex™-A series processor
LOL, I never said the Scorpion is ARM Cortex™-A.
Try reading my post again.
SamsungVibrant said:
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
LOL, if it is faster, it might be by at most 1-2 seconds if it's lucky.
Sorry, it's going to take a lot more than that to impress me... again, it's a phone, not a high-end PC.
SamsungVibrant said:
Search the net; people have made real-world videos of the Galaxy S running 2.2 compared to the G2. The G2 is faster in the real world at things like launching apps.
Largely due to the different filesystem implementations; once there is a workable hack to convert the entire filesystem on the Galaxy S to a real filesystem, you can make the comparison on things like launching apps.
Demo, I didn't mean to come off as a **** - I was just pointing out the flaw in the OP's benchmark: the NEON instruction set execution comparison is flawed. The G2's processor is ARMv7, which is the base of the Cortex-A8; the Cortex-A8 adds instructions specifically targeted at applications like multimedia, and that's where NEON comes into play.
lqaddict said:
Largely due to the different filesystem implementations; once there is a workable hack to convert the entire filesystem on the Galaxy S to a real filesystem, you can make the comparison on things like launching apps.
Agreed. +10 char
lqaddict said:
Demo, I didn't mean to come off as a **** - I was just pointing out the flaw in the OP's benchmark: the NEON instruction set execution comparison is flawed. The G2's processor is ARMv7, which is the base of the Cortex-A8; the Cortex-A8 adds instructions specifically targeted at applications like multimedia, and that's where NEON comes into play.
No problem, I didn't really take it that way.
Also, I noticed I overlooked a lot of things in the OP... blame the ADD.
What difference does it make? In real world use the difference is negligible. And in three months our phones will be mid-tier anyway. At least we don't have hinges that will fall apart in two months.
how is the hummingbird not able to fully take advantage of JIT?
Well, there is a fix for our phones now. And from what I can tell, there's no way the G2 can open apps faster than my Vibrant with the z4mod. It's smoking fast - by far the fastest I've ever seen this phone. No delays whatsoever. Can't wait till I get Froyo with OC/UV; this will be unreal. I feel like this phone is a high-end PC running Android or something. When I say instant, it's instant lol.
Kubernetes said:
What difference does it make? In real world use the difference is negligible. And in three months our phones will be mid-tier anyway. At least we don't have hinges that will fall apart in two months.
Exactly.
People seem to forget that the SGS line is like 6 months old now; we should be glad they're still doing as well as they are.
Then there's the fact that there aren't many other phones that come with 16GB internal. To me, having 16GB and being able to upgrade to 48 [minus the 2GB that Samsung steals] total is worth way more than starting with 4GB [1.5GB usable] and upgrading to a max of 36 [minus what HTC steals from internal].
But, if you don't like your phone, SELL IT while it's still worth something!

Why Nexus S is Google's new baby.

Hardware. Pure and simple. The Nexus One's hardware was great at the time, but there are a few things in it that needed to be upgraded, or that they wanted to support in their new dev phone:
1) Proper multi-touch screen.
The Nexus One's screen isn't true multi-touch, and it's hardly even dual-touch. It's a single-touch screen that offers some limited dual-touch support that only really works for pinch-to-zoom. The rotate-with-two-fingers gesture that's in the new version of Maps isn't supported on the Nexus One.
2) Front-facing camera.
The iPhone has one, and made it somewhat popular. Google needed it in their dev phone to keep up.
3) PowerVR SGX540 GPU.
The PowerVR SGX540 is *the* most powerful mobile GPU on the market. It's significantly better than the Adreno 200 found in the N1, and has roughly double the power of the PowerVR SGX535 that's in the iPhone 4 and iPad. The Galaxy S maxes out most of the commonly used benchmarks, and comes close to maxing NenaMark1 too.
4) Wolfson sound chip is brilliant.
The Galaxy S phones have *the* best sound chip on the market, and the Nexus S has the same chip.
Check out the 'perfect audio quality' part of GSMArena's review of the Galaxy S.
Oh, and there's also the NFC chip, Super AMOLED screen, three-axis gyroscope, and larger battery.
Rawat said:
Hardware. Pure and simple. The Nexus One's hardware was great at the time, but there are a few things in it that needed to be upgraded, and that they wanted to support in their new dev phone:
1) Proper multi-touch screen.
The Nexus One's screen isn't true multi-touch, and it's hardly even dual-touch. It's a single-touch screen that offers some limited dual-touch support that only really works for pinch-to-zoom. The rotate-with-two-fingers gesture that's in the new version of Maps isn't supported on the Nexus One.
2) Front-facing camera.
The iPhone has one, and made it somewhat popular. Google needed it in their dev phone to keep up.
3) PowerVR SGX540 GPU.
The PowerVR SGX540 is *the* most powerful mobile GPU on the market. It's significantly better than the Adreno 205 found in the N1, and has roughly double the power of the PowerVR SGX535 that's in the iPhone 4 and iPad. The Galaxy S maxes out most of the commonly used benchmarks, and comes close to maxing NenaMark1 too.
4) Wolfson sound chip is brilliant.
The Galaxy S phones have *the* best sound chip on the market, and the Nexus S has it too.
Check out the 'perfect audio quality' part of GSMArena's review of the Galaxy S.
Oh, and there's also the NFC chip, Super AMOLED screen, three-axis gyroscope, and larger battery.
This is a great analysis, Rawat.
I'll be really interested to see how quickly my Nexus One gets Gingerbread. If it takes weeks after the 16th, or until after the new year, then I would have to agree.
ap3604 said:
This is a great analysis, Rawat.
I'll be really interested to see how quickly my Nexus One gets Gingerbread. If it takes weeks after the 16th, or until after the new year, then I would have to agree.
I think the trend will be that newer versions of Android will be developed on the Nexus S, and as such it'll be the first to receive them, and the N1 will get the updates around a month or so later, as long as the device meets the minimum spec.
Google said they do their OS development on one device. I think it was Andy Rubin who said it, when he was showing parts of the Moto tab, or maybe it was in one of the Nexus S / Gingerbread phone videos.
The Nexus One actually has an Adreno 200; the 205s are much improved, as seen in the G2, Desire HD, and myTouch 4G. Also, the new Snapdragons are believed to be on par with, if not better than, the Hummingbird CPUs.
Some comparison: androidevolutions.com/2010/10/13/gpu-showdown-adreno-205-msm7230-in-htc-g2-vs-powervr-sgx540-hummingbird-in-samsung-galaxy-s/
Indeed you're correct. 1st-gen chips had the Adreno 200, 2nd-gen had the 205.
I don't think the GPU and CPU are the reason; more so the screen, along with Samsung's ability to produce said screens.
adox said:
The Nexus One actually has an Adreno 200; the 205s are much improved, as seen in the G2, Desire HD, and myTouch 4G. Also, the new Snapdragons are believed to be on par with, if not better than, the Hummingbird CPUs.
Some comparison: androidevolutions.com/2010/10/13/gpu-showdown-adreno-205-msm7230-in-htc-g2-vs-powervr-sgx540-hummingbird-in-samsung-galaxy-s/
The CPUs may be on par. However, the CPU isn't what needs improving on the Snapdragons.
This is correct. The SGX540 does perform about 2x as fast as the SGX530 (found in the Droid X, Droid 2, iPhone 3GS, and, in a variation, in the iPhone 4). Unfortunately, Samsung's Galaxy S has been using the same GPU for many months now, so TI is playing catch-up to Samsung's SoC. To be fair, other manufacturers aren't exactly doing any better. Qualcomm's second-generation GPU, the Adreno 205, also performs significantly worse than the SGX540, and the (soon to be released) Tegra 2's GPU is also expected to be outperformed by the SGX540. With Samsung claiming that Orion improves GPU performance by another 3-4x over the SGX540, it must sound scary to other manufacturers!
SGX540 = Hummingbird's GPU.
GPU means a ton when it comes to what you're actually going to see in action on the screen.
In the link I posted that doesn't seem so; the GPU actually fared well against the Hummingbird in the Epic.
adox said:
I don't think the GPU and CPU are the reason; more so the screen, along with Samsung's ability to produce said screens.
Google said they added more features for better game programming. That's one of the major improvements in 2.3, so why would they pick the screen over the GPU? Galaxy S phones are considered among the best devices for Android gaming, so it makes a lot of sense to have Samsung make the phone. The screen is icing on the cake. I bet Samsung is going to use SAMOLED screens a lot more on the big phones they manufacture.
so true, can't wait!
adox said:
In the link I posted that doesn't seem so; the GPU actually fared well against the Hummingbird in the Epic.
On one benchmark. I wouldn't read into those results too much
http://www.anandtech.com/show/4059/nexus-s-and-android-23-review-gingerbread-for-the-holidays
AnandTech review
Rawat said:
Hardware. Pure and simple. The Nexus One's hardware was great at the time, but there are a few things in it that needed to be upgraded, or that they wanted to support in their new dev phone:
1) Proper multi-touch screen.
The Nexus One's screen isn't true multi-touch, and it's hardly even dual-touch. It's a single-touch screen that offers some limited dual-touch support that only really works for pinch-to-zoom. The rotate-with-two-fingers gesture that's in the new version of Maps isn't supported on the Nexus One.
2) Front-facing camera.
The iPhone has one, and made it somewhat popular. Google needed it in their dev phone to keep up.
3) PowerVR SGX540 GPU.
The PowerVR SGX540 is *the* most powerful mobile GPU on the market. It's significantly better than the Adreno 200 found in the N1, and has roughly double the power of the PowerVR SGX535 that's in the iPhone 4 and iPad. The Galaxy S maxes out most of the commonly used benchmarks, and comes close to maxing NenaMark1 too.
4) Wolfson sound chip is brilliant.
The Galaxy S phones have *the* best sound chip on the market, and the Nexus S has the same chip.
Check out the 'perfect audio quality' part of GSMArena's review of the Galaxy S.
Oh, and there's also the NFC chip, Super AMOLED screen, three-axis gyroscope, and larger battery.
Goddammit!!! I can't wait till Thursday!!!
zachthemaster said:
Goddammit!!! I can't wait till Thursday!!!
Rofl I can't wait till there are tons of threads started such as "Goddammit I LOVE this phone!!!"
ap3604 said:
Rofl I can't wait till there are tons of threads started such as "Goddammit I LOVE this phone!!!"
Haha goddammit i can't wait to post in those threads.. I'm so excited... New phone, new network... PUMPED
hmm.. sounds awesome..
but hey, does anyone know if we can open the battery cover to replace the battery? i'm too used to carrying two batteries.. i need it for long weekends with heavy usage of the phone.. >.<
i didn't find anything about this :3
D4rkSoRRoW said:
hmm.. sounds awesome..
but hey, does anyone know if we can open the battery cover to replace the battery? i'm too used to carrying two batteries.. i need it for long weekends with heavy usage of the phone.. >.<
i didn't find anything about this :3
yah... duh haha
Sure you can
Here's a view of the phone with the cover off:

Dual-Core phones (Sprint)

I'm starting to notice now that some are hitting the market. Surely Sprint will lose customers if they don't come out with one soon...
I've seen the Echo, and I couldn't care less about dual screens... I do care about dual-core though. Has anyone heard of a rumor or announcement regarding Sprint's first dual-core phone?
Sent from my SPH-D700 using XDA App
Our SGS2, and it will ship with 2.2.1, but 2.3 will be soon to follow, enabling the 3G and 4G radio features. No, really, I dunno what Sprint has up their butt; probably another overhyped HTC. Or will they try to resurrect Palm again??? RIM? I swear they got the brown acid at headquarters... the Echo is beyond dumb, and no more Premier unless you pay for landline minutes nobody uses....???
Overstew said:
I'm starting to notice now that some are hitting the market. Surely Sprint will lose customers if they don't come out with one soon...
I've seen the Echo, and I couldn't care less about dual screens... I do care about dual-core though. Has anyone heard of a rumor or announcement regarding Sprint's first dual-core phone?
Sent from my SPH-D700 using XDA App
Just curious why you want dual core, outside of saying that you have it? Is there some app in particular that you use that will be updated to use both cores? Most apps that run in the background have so little overhead that a bump in speed is far more beneficial than adding another core. I'd like to have the option of a single multi-threaded core or dual core. I'd take a 1.5GHz multi-threaded single core over a 1.0GHz dual core any day of the week in my phone. Not sure if there is a single Android app out there that would actually benefit from another core at all???
I think the true benefit would come from a heterogeneous processor design that basically fed all of the 'phone' functions to a 100MHz-or-so core that did nothing but "basic" phone functions. Any graphics or other types of processing would rely on the video unit as well as the "high-speed" core, which is typically switched off when the phone is being used as a phone, or is idle waiting for a call, etc. The video CPU should have adequate RAM to store icons and such, so screen rotations, page flips, etc. should be instantaneous. There shouldn't be any lag when rotating orientations and whatnot. Battery life could be significantly greater using a low-power/speed CPU when that's all that is required.
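As a minimal sketch of that heterogeneous idea, here's some hypothetical Python routing logic for sending work to a slow low-power core or a fast core on demand; the core names and the 100-MIPS threshold are invented for illustration:

```python
# Route light "phone" work to a low-power core; wake the fast core only
# when the estimated load exceeds what the small core can handle.

LOW_POWER_CORE = "phone-core@100MHz"   # hypothetical always-on core
FAST_CORE = "apps-core@1.5GHz"         # hypothetical on-demand core

def pick_core(estimated_load_mips: float) -> str:
    # assume the small core tops out around 100 MIPS (made-up figure)
    return LOW_POWER_CORE if estimated_load_mips <= 100 else FAST_CORE

print(pick_core(5))    # idle/radio housekeeping -> low-power core
print(pick_core(800))  # page rendering, games   -> fast core
```

This is essentially the scheduling idea that later shipped commercially as ARM's big.LITTLE designs.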
I agree with the last post. The only reason to upgrade to a dual-core phone today is to future-proof yourself, and maybe a bit more efficient battery usage. Seems like, as it stands now, it's the equivalent of buying a high-end Blu-ray player just to watch standard DVDs.
Sent from my SPH-D700 using XDA App
I have been reading the forum for a while now, and the only use I could tell would be compiling packages or ROMs. Dual core would definitely help with that. Outside of that it would just be a benchmark machine. There are two ways to use a dual-core chip. Either you program code to run across both cores at the same time (which most programmers do not do, if you couldn't tell from the computer side; the simplicity of apps does not really give programmers a desire to do this), or you multitask. You can run more than one thread at a time, with one going to each core. I can't see this helping all that much unless you have programs that run in the background.
I can, however, see it draining battery. But to each his own.
I bought a PS3 when Blu-ray first came out, and now I don't have to waste money on a 3D player.
I want a dual-core phone solely for the fact that I'll be future-proof.
Unless it's like the Epic, I'm not gonna upgrade though. The Epic is probably the best single-core phone, so that's adequate enough to hold me down until October.
Sent from my SPH-D700 using XDA App
October? Sh...t, I'll wait 2 years before I get another phone. Let them really get these bugs of the dual cores fixed and smoothed out, and let the carriers catch up on upgrading Android. Unless there are some really sweet features that I can't live without and that we can't get modded onto this phone.
I'll probably wait until they iron things out and see if there will be more implementations that take advantage of the dual-core feature, rather than purchasing the first-gen models.
Well... to put the future-proofability of the Epic into perspective... the G1 just got a half-working port of Honeycomb LOL
Sent from my SPH-D700 using XDA App
A_Flying_Fox said:
I bought a PS3 when Blu-ray first came out, and now I don't have to waste money on a 3D player.
I want a dual-core phone solely for the fact that I'll be future-proof.
The PS3 is 5 yrs old already. It took more than 2 years for titles to start hitting Blu-ray regularly. And since that time, the $500 PS3 dropped $100 after just 1 yr, and came down another $100 2 yrs after that. I'm betting against the "future" having any use for the dual-core phones dropping now. Sort of how the future has nothing to do with the Galaxy Tab: by the time the software is written to utilize the Tab, it will be an Atom amongst i3s. If it can run 3.0, it will be so painfully slow that you'll toss it out the window. You may have plans for the 'future', but the future will have no plans for you. Why don't you just save $$ by using your head, and buy things when they can actually be utilized? You haven't had the Epic more than 7 months at most... keep it a full 2 years and grab the 2nd run of dual cores that have the issues sorted out and software to utilize them.
Sort of like buying a USB 3 card 2 years ago for $100 when nothing utilized it. Now that devices are actually releasing for it, they're $27.
I am aware technology tends to advance quickly, but why would I want to buy something that will be outdated by the time the next best thing comes out?
Intel mobile .24nm dual core FTW!
Seriously, I don't see how the Galaxy S2 will not have horrible battery life.
As long as they stay away from that Tegra 2 stinker of a dual core. What a letdown that is.
muyoso said:
As long as they stay away from that Tegra 2 stinker of a dual core. What a letdown that is.
Hummingbird dual-core makes me drool...
Overstew said:
Hummingbird dual-core makes me drool...
IDK. I am all about the TI OMAP 4430 with the PowerVR SGX540 clocked at 300MHz. The Optimus 3D has it, and it DESTROYS the Tegra 2 and UTTERLY DESTROYS the Exynos Samsung dual-core thing. The PowerVR SGX540 is the exact GPU we have in our phones, except clocked 50% faster. WANT!!!!!!!!!!!!
Check out the AnandTech review of the Optimus 3D for benchmarks.
muyoso said:
IDK. I am all about the TI OMAP 4430 with the PowerVR SGX540 clocked at 300MHz. The Optimus 3D has it, and it DESTROYS the Tegra 2 and UTTERLY DESTROYS the Exynos Samsung dual-core thing. The PowerVR SGX540 is the exact GPU we have in our phones, except clocked 50% faster. WANT!!!!!!!!!!!!
Check out the AnandTech review of the Optimus 3D for benchmarks.
Has it been said which carrier will have the Optimus 3D?
EDIT: Just looked it up; looks like T-Mobile... along with the Galaxy S II. WTH? lol
Overstew said:
Has it been said which carrier will have the Optimus 3D?
Haven't seen a carrier yet. It's a damn cool phone and would be awesome for Sprint. I am not into the whole 3D thing yet, but having that option is kinda neat. The hardware though, omg, the hardware makes me drool.
If only we could overclock our PowerVR SGX540 GPU; that would be amazing. Not much use at the moment, but if we ever got microUSB-to-HDMI out, it would be able to play 1080p as well as the 720p it's currently capable of.
Edit: Damnit, T-Mobile gets all of the amazing phones.
muyoso said:
Haven't seen a carrier yet. It's a damn cool phone and would be awesome for Sprint. I am not into the whole 3D thing yet, but having that option is kinda neat. The hardware though, omg, the hardware makes me drool.
If only we could overclock our PowerVR SGX540 GPU; that would be amazing. Not much use at the moment, but if we ever got microUSB-to-HDMI out, it would be able to play 1080p as well as the 720p it's currently capable of.
Edit: Damnit, T-Mobile gets all of the amazing phones.
Well, I'm taking this link into consideration when I make a conclusion... still kinda iffy though.
http://answers.yahoo.com/question/index?qid=20110210173729AAswla4
I do plan on keeping the Epic... it's basically the epitome of the single-core generation. The only way to one-up it is with dual cores. By the time I have to upgrade, quad cores will be out.
I hate how fast tech ages... but the Hummingbird is more than adequate for my mobile needs... I mean, come on. The Epic has been out for 7 months (the Galaxy S has been out since May, I think...) and only NOW are phones/tablets catching up to its capabilities.
I'm probably not gonna get a dual-core phone. But I wouldn't mind a dual-core tablet to complement the Epic for when stuff gets real serious and I need to whip out the PC-grade stuff.
Sent from my SPH-D700 using XDA App
Overstew said:
Well, I'm taking this link into consideration when I make a conclusion... still kinda iffy though.
http://answers.yahoo.com/question/index?qid=20110210173729AAswla4
You see the benchmarks on AnandTech?

GLBenchmark HTC ONE only 34 FPS @ Egypt HD 1080P Offscreen

The HTC One GLBenchmark only scores 34 FPS at 1080p offscreen; this is much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test, both using the Snapdragon 600. IIRC, the HTC One is using LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use the LPDDR3 that is supported by the S600?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
How do you know the HTC One uses LPDDR2 memory?
kultus said:
How do you know the HTC One uses LPDDR2 memory?
http://www.htc.com/uk/smartphones/htc-one/#specs
http://www.anandtech.com/show/6754/hands-on-with-the-htc-one-formerly-m7/2
Turbotab said:
The HTC One GLBenchmark only scores 34 FPS at 1080p offscreen; this is much lower than the Samsung SHV-E300S, which scores 41.3 FPS in the same test, both using the Snapdragon 600. IIRC, the HTC One is using LPDDR2 RAM, so are we seeing a lack of bandwidth compared to the Samsung, which may use the LPDDR3 that is supported by the S600?
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
My first question would be: how did they even get a benchmark of the SHV-E300?
Xistance said:
My first question would be: how did they even get a benchmark of the SHV-E300?
How do any results appear on GLBenchmark?
I believe with GLBenchmark, if you don't register/login before running the test, it automatically uploads the result to their server for public viewing, so maybe it was done intentionally, or somebody forgot to log in?
fp581 said:
He is spamming all around the HTC One forums; just look at his posts. Plz ban him from posting in any HTC forum ever again.
He probably works at Sony, Nokia, or Samsung.
Who are you talking about?
Sorry, wrong person; I'll delete that last one.
But I would love pics of that benchmark for proof.
fp581 said:
Sorry, wrong person; I'll delete that last one.
But I would love pics of that benchmark for proof.
Dude, I was going to go atomic; I admit it, I have a terrible temper.
I believe the benchmark was run by a German Android site called Android Next; there is a video on YouTube, and the GLBenchmark starts at 2:22.
http://www.youtube.com/watch?v=Wl1dmNhhcXs&list=UUan0vBtcwISsThTNo2uZxSQ&index=1
Thanks Turbo for advancing my knowledge... what a shame they didn't choose LPDDR3, but I think it's not an issue these days.
Just to temper this news, we must remember that the HTC One is running at 1.7GHz, while the Samsung device is running at 1.9GHz.
Although 200MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has LPDDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It was most likely not worth the extra cost.
Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7GHz, while the Samsung device is running at 1.9GHz.
Although 200MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has LPDDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It was most likely not worth the extra cost.
GLBenchmark is a test of GPU performance; it isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz and ran GLBench: 10 FPS. I then ran the test again with the CPU at 1.6 GHz; the result, 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason: my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in a dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s max bandwidth in a dual-channel configuration.
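Those bandwidth figures can be reproduced with the standard formula (transfers per second x bus width x channels); here's a quick Python sketch, assuming a 32-bit bus per channel and two channels:

```python
# DDR memory transfers twice per clock; a 32-bit bus per channel and
# dual-channel configuration are assumed, matching the figures above.

def bandwidth_gb_s(clock_mhz: float, bus_bits: int = 32, channels: int = 2) -> float:
    transfers_per_s = clock_mhz * 1e6 * 2            # double data rate
    return transfers_per_s * (bus_bits / 8) * channels / 1e9

print(f"LPDDR3 @ 800 MHz: {bandwidth_gb_s(800):.1f} GB/s")  # 12.8
print(f"LPDDR2 @ 533 MHz: {bandwidth_gb_s(533):.1f} GB/s")  # ~8.5
```

That roughly 50% bandwidth gap is in the same ballpark as the 30% FPS difference seen between the N7 and the Transformer Infinity above.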
Turbotab said:
GLBenchmark is a test of GPU performance; it isn't really changed by CPU clock speed, but it is affected by bandwidth.
As a test, I downclocked my Nexus 7 from an overclocked 1.6 GHz to just 1.15 GHz and ran GLBench: 10 FPS. I then ran the test again with the CPU at 1.6 GHz; the result, 10 FPS again.
I've benched the N7 with both CPU & GPU overclocked to the same level as the Transformer Infinity, which gets 13 FPS, but I always get 10 FPS. The reason: my N7 has lower memory bandwidth than the Transformer Infinity, because it uses slower RAM. That is a difference of 30% in FPS, just because of lower bandwidth.
I read that LPDDR3 starts at 800 MHz, or 12.8 GB/s in a dual-channel configuration, whereas LPDDR2 maxes out at 533 MHz, or 8.5 GB/s max bandwidth in a dual-channel configuration.
In that case the results are quite disappointing.
All these fantastic new phones, and so much disappointment.
Sent from my GT-I9300 using xda premium
Tomatoes8 said:
They could have used faster memory for the same price if they hadn't cut Samsung off as a supplier. Makes you wonder where their priorities lie: making the best products possible, or just going through the motions.
No one is going to take anything you say here seriously, as you've managed to have 2 threads closed in the last 30 mins. One of those inane posts involved you saying that HTC is going to be paying, according to your genius calculation, 20% of their profits to Apple (I forget what insanely unintelligent reason you gave). Yeah, because being able to completely migrate data from one completely different phone to another is such a bad idea for a company that wants to push their product.
So, what is the per-unit cost of what HTC is paying for RAM now vs. what they could have gotten from Samsung? Exactly, you have no idea. I also didn't hear anything about HTC "cutting off" Samsung as a supplier, but maybe I missed it, so I googled "htc cut off samsung supplier" and found 2 links...
http://tech2.in.com/news/smartphones/following-apple-htc-cuts-component-orders-from-samsung/505402
http://www.digitimes.com/news/a20121009PD213.html
I'm not sure if you have the capability of reading or not, but I'll spoon-feed you this information, ok hunny? I've taken the info from the 1st link, since there is more there.
After Apple Inc slashed its orders for memory chips for its new iPhone from major supplier and competitor, Samsung Electronics Co Ltd, HTC too has reportedly cut down on its smartphone component orders from the South Korean company.
So, Apple cut down on memory orders. You know, they're the ones who make the iPhone? Have a logo of an apple on their products? Steve Jobs was the CEO before he died. Anyway, I'll continue...
According to a report by DigiTimes, HTC has reduced its orders from Samsung, and instead opted to order CMOS image sensors from OmniVision and Sony. The company has also chosen to move part of its AMOLED panel orders to AU Optronics, DigiTimes reported citing ‘sources’.
Notice it said that HTC reduced its orders from Samsung, specifically on the image sensors (that's for the camera, if you didn't know) and the screen. You know, the thing on the front of your phone that you touch to make it do things? You know what I mean, right? I encourage you to read this link (or possibly have someone read it to you)...
http://dictionary.reference.com/browse/reduce
The point is that reducing isn't the same as cutting off. Cutting off would require HTC not ordering ANYTHING from Samsung. Guess what? The One doesn't use an OmniVision CMOS sensor (don't forget, that's what the camera uses) or an AMOLED screen (the bright part of your phone that shows you info).
Also, this is a far better designed phone, especially in regards to hardware, than anything Samsung has ever produced. I went back to my EVO 4G LTE, mainly because I couldn't stand the terrible build quality of the Note 2. It just feels like a cheap toy. And, IMO, Sense is far better than TW. Samsung may have the market right now because of the Galaxy line of products, but that doesn't mean that HTC is out of the game by any means.
Seriously, attempt to use just a bit of intelligence before opening your mouth and spewing diarrhea throughout the One forums. As the saying goes: "it's better to keep your mouth shut and have people think you're an idiot, than to open your mouth and prove it". Unfortunately for you, it's too late.
I really think Turbo was too hasty in opening a new thread for this, as we've been discussing it in the mega thread.
http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro25&D=HTC+One
It scores 34 FPS in Egypt HD 1080p offscreen, while the leaked Samsung S600 device scores 41 FPS, which is perfectly in line with Qualcomm's promised speed (3x Adreno 225).
Here is a video of what I suspect is the source of the benchmark, because we had no benchmark before it:
http://www.youtube.com/watch?v=Wl1dmNhhcXs
Notice how the battery is almost at its end (the HTC bar at this stage means it's in the last 25%); also notice the activity in the notification area.
More importantly, the poster ran more than a few full benchmarks, like Quadrant, before running GLBenchmark; this alone is enough to lower the score, especially since the Adreno 320 was known to throttle in the Nexus 4.
I think benchmark scores should not be relied on in such events, especially with hundreds of hands messing with the device. We learned from the One X launch, where videos popped up showing horrible performance, that they can be very far from the final device in your hands.
Finally, both the One X and Nexus 7 at the same GPU clock, where the first is DDR2 and the second is DDR3, score the same in GLBenchmark.
In other words, it's worrying, but it's best to wait for proper testers like Anand.
Thread cleaned
...from some serious trolling. There should be no trace of him for some time.
But...
I just wonder why a Samsung phone uses high-end parts from Qualcomm instead of Samsung's own processors. But I am not into Samsung devices so far, so I would not judge this.
Gz
Eddi
Here's a second video also showing the Egypt offscreen bench at 34 FPS.
https://www.youtube.com/watch?v=wijp79uCwFg
Skip to 3:30.
Maedhros said:
Just to temper this news, we must remember that the HTC One is running at 1.7GHz, while the Samsung device is running at 1.9GHz.
Although 200MHz does not seem like much, it could possibly account for the 7 FPS difference when you factor in the difference in UI.
If in fact the Samsung device really has LPDDR3 RAM, and the difference (after accounting for clock speed) is 2-3 FPS, I can understand why HTC opted not to include it. It was most likely not worth the extra cost.
So you're saying that 200MHz on the CPU can account for 7 FPS on a GPU test?
Following what you said, the Nexus 4 should have scored 27 FPS, since it has 200MHz less...
But no, it scored 33.7... only 0.3 FPS less than the One!
And you know why? First, both use the same GPU (and that's what counts in a graphics test), and second, HTC phones are always slower due to Sense!
So stop *****ing and realize that the One is no god phone.
The Samsung device is running 4.2.1.

[Discussion] M9+ benchmarks, real life performance experiences

All HTC M9+ owners!
We're a handful yet on XDA, but getting more and more as the device rolls out to more locations, and as people who look for an elegant, high-end device with premium build quality and extra features like the fingerprint scanner, the 2K display, and the Duo camera settle on this fantastic device. It's unfortunately not perfect for everything - not the best gaming phone out there - but in my experience a very well performing device in terms of call quality, reception, WiFi strength, multimedia, battery, display panel, and overall UX feel and smoothness. High-end games from 2015 might stutter a bit here and there or lack some important shaders to compensate, but it's nonetheless good for earlier (2014 and before) 3D gaming and all kinds of 2D games. So let's gather the experiences and benchmarks of this unique device, which was maybe the first MTK device in an alu body.
Let's discuss performance-related experience - real user feel and benchmarks - free of whining, facing the truth that in some respects it's not a top-notch device, but letting the curious ones who are considering this device know what to expect from this elegant business-class phone.
I'll start with some of my benchmark result screenshots in separate posts.
UPDATE:
Here's my short game testing video on M9+
https://www.youtube.com/watch?v=QmLGCoI4NLw
Antutu on stock Europe base 1.61
Vellamo tests, base 1.61
Futuremark test, base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
My real life experience is that in everyday chores, the phone is very snappy, with some small lags when starting new apps to the memory. Task switching is quite good.
There's unfortunately some memory leak issue with 5.0.x Android version which after a while kicks in (on most 5.0.x devices as well), but overall the UX smoothness is just quite good. Occasionally there are some apps that brings up popup windows a bit stuttery, like facebook comments animation tends to be a bit stuttery.
The Sense Home/Blink Feed experience is just perfect. At normal operations, when no big application updates are happening in the background, I never faced any lags on the Sense Home UI.
As for games: titles from 2014 and before run perfectly. Newer ones might have some shaders removed or polygon counts reduced, so don't expect a jaw-dropping 3D experience. If you're okay with 2014-and-earlier 3D game quality, the M9+ is quite good, but the latest games will most probably run in a dumbed-down mode to suit the PowerVR GPU in the M9+'s MTK chipset. The phone mostly keeps a good temperature, except when charging the battery and playing 3D-heavy games at the same time (but that's expected of most devices, especially with an alu body).
The screen quality is quite good; I got a perfect panel with the first unit, the refresh rate is a smooth 60Hz, and the 2K resolution gives lovely dpi and brightness.
The benchmarks show generally good performance in CPU-bound operations. Where the MTK chip comes up short is the GPU. All tests show 3D performance far below 2014's flagships. The PowerVR G6200 is from the same family as the GPU in the iPhone 5s, which, let's face it, is mediocre for a 2K display. If you face this fact and accept that gaming won't have the highest-quality textures and shaders, you probably won't be disappointed.
I'm curious what others think...
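To put the "mediocre GPU for a 2K display" point in numbers, a tiny sketch comparing pixel counts (panel resolutions are the public specs; treating the two GPUs as roughly the same class is the assumption from the paragraph above):

```python
# Pixels the GPU class has to drive: iPhone 5s panel vs the M9+'s 2K panel.
iphone5s = 1136 * 640    # iPhone 5s native resolution
m9_plus  = 2560 * 1440   # M9+ native resolution

print(f"iPhone 5s: {iphone5s / 1e6:.2f} MP")
print(f"M9+:       {m9_plus / 1e6:.2f} MP ({m9_plus / iphone5s:.1f}x more pixels)")
# ~5x the pixels on roughly iPhone-5s-class graphics silicon.
```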
Played with it for a few hours. First impression is that it's clearly slower than the M9. The fingerprint sensor is almost perfect, with a few hits and misses. I'll be back with more adequate feedback after a few days of use.
Sent from my HTC_M9pw using Tapatalk
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years away on the M9.
DeadPotato said:
That's why I don't trust benchmarks at all: @tbalden's benchmarks show it should be on par with the M9, and I got those scores on mine, but the 3D department is awful to say the least; Godfire is light years away on the M9.
Yeah, though the AnTuTu 3D score is less than half on the M9+: 9k versus 21k on the M9. So after all, it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
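For what it's worth, normalizing those AnTuTu 3D scores by the pixels each phone pushes makes the gap even starker. A minimal sketch, assuming the 3D test renders onscreen at native resolution:

```python
# Normalize the quoted AnTuTu 3D scores by native pixel count.
devices = {
    "M9+ (2K panel)":   {"score": 9_000,  "res": (2560, 1440)},
    "M9 (1080p panel)": {"score": 21_000, "res": (1920, 1080)},
}

for name, d in devices.items():
    mpix = d["res"][0] * d["res"][1] / 1e6
    print(f"{name}: {d['score']} pts / {mpix:.2f} MP = {d['score'] / mpix:.0f} pts/MP")

# ~2400 pts/MP on the M9+ vs ~10100 pts/MP on the M9: per pixel it has to
# drive, the M9+'s GPU is roughly 4x weaker, which matches the gaming feel.
```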
tbalden said:
Yeah, though the AnTuTu 3D score is less than half on the M9+: 9k versus 21k on the M9. So after all, it does tell you something true. The CPU of the M9+ is quite good, while the GPU is rather old, from the iPhone 5s era when display resolutions were much smaller. That says it all.
So everyday chores are quite snappy on the phone, and gaming is mediocre in high-end games.
I see, I didn't notice that. I don't understand MediaTek though; why did they put such an outdated GPU in their 'flagship' processor?
DeadPotato said:
I see, I didn't notice that. I don't understand MediaTek though; why did they put such an outdated GPU in their 'flagship' processor?
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. Remains to be seen, but I'm not holding my breath; no wonders can be done.
tbalden said:
Indeed, it should have been a better one, although it's an open question whether the SoC could handle higher bandwidth between system memory and GPU memory. Maybe they simply don't have a better compatible GPU, or there are technical difficulties in incorporating one.
Either way, the CPU itself is a promising high-end one, and unfortunately we can't help with the GPU part. Maybe there will be some possibility to tweak it at the kernel level. Remains to be seen, but I'm not holding my breath; no wonders can be done.
Yeah, I agree. It's a sad thing: CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to be considered a valid option for flagship devices.
DeadPotato said:
Yeah, I agree. It's a sad thing: CPU-wise MediaTek is doing a very good job of entering the high-end smartphone tier, but it looks like they need to improve a lot GPU-wise for the MediaTek Helio X20 to be considered a valid option for flagship devices.
Seconded.
The X10 is MTK's first high-end SoC. They should have equipped it with a high-end GPU to make a name for it, not a two-year-old one from the iPhone 5s era. Most casual gaming is OK, but that's not what 'flagship' devices are about.
It would be excellent if there were some possibility to tweak it at the kernel level. But the Helio X20 really needs to do better on the GPU part.
As I'm not a heavy gamer, the UX is pretty good for me. (The camera is another topic.)
I even measured the CPU speed with a chess game, and the result reached the top tier.
tbalden said:
Antutu on stock Europe base 1.61
Vellamo tests base 1.61
Futuremark test base 1.61
Sent from my HTC One M9PLUS using XDA Free mobile app
FYI.
Mine: stock, Taiwanese base 1.08, rooted device.
Sigh... the result was maxed out for my device on Ice Storm "Extreme".
Very similar result to mine, except Ice Storm Unlimited. I think it might be related to temperature throttling; I'll test it again later.
Yeah, second run on base 1.61
Hello guys,
I live in France and I'm really looking forward to the HTC M9+; I think it is what the M9 should have been, but I don't understand HTC's sh*tty 2015 choices at all.
What I would like to know about is battery life; can you guys tell me how it is?
I bought a Galaxy S6 edge a few days ago and it's awful: the battery is empty by the end of the afternoon. I don't play games, just a few photos and browsing, and Messenger sometimes. And anyway, I miss BoomSound and the metal body.
I'm so disappointed. I should never have sold my old One M8 or Xperia Z3.
My only remaining hope is the Lenovo Vibe X3, but there has been no news since March...
Thanks guys!
After installing more and more synchronized software, like Dropbox and a few others, my battery life dropped a notch, from 23+ hours to 22+, with average screen-on around 4 hrs. All in all, no miracle, but that's about what I got with my previous phone, the OPO, with its larger battery and lower screen resolution; a bit less, admittedly. So it gets me through the day nicely, but I'm not gaming.
It's average, I think. Compared to the amazing Sony Z3 Compact it's pathetic; my friend's Z3C gets 3+ days and 5 hours of screen-on time. And I think that's done with a great stock kernel enhanced by Sony engineers and the Sony stock ROM... wish other phones could do that.
Not sure why, but I cannot choose "No lock screen" or "Lock screen without security" under security settings. Both options are greyed out, and it says they are disabled by administrator, encryption policies or other apps.
I only have Prey and Lookout Security installed, which could affect this, but this was totally fine on my M7.
Does anyone have any idea? Thank you.
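In case it helps narrow it down: that message usually means some app holds device-administrator rights (or an Exchange/encryption policy is active). Here's a rough helper for listing the active admins from a PC over adb; this is only a sketch, assuming adb is in PATH and USB debugging is enabled, and since the dumpsys output format varies by Android version, the parsing is best-effort:

```python
# Best-effort: list active device admins via adb to find which app is
# enforcing the lock screen policy. Output format varies across Android
# versions, so this only greps for ComponentInfo{pkg/cls} tokens.
import subprocess

def active_device_admins():
    out = subprocess.run(
        ["adb", "shell", "dumpsys", "device_policy"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.strip() for line in out.splitlines() if "ComponentInfo" in line]

if __name__ == "__main__":
    for admin in active_device_admins():
        print(admin)
```

Whatever shows up there (Prey and Lookout are likely candidates) can be deactivated under Settings > Security > Device administrators, which should un-grey those lock screen options.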
They really should've gone with the 801, YES, THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
JellyKitkatLoli said:
They really should've gone with the 801, YES, THE 801, for this phone.
At the same resolution the Adreno 330 has better performance AND the price wouldn't change. Really shameful of HTC to pick an MTK SoC (**** or crap).
I am very disappointed.
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU really does hold back the result, right?
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU really does hold back the result, right?
Yes, and it's fairly accurate.
I got 11871 at 2K; I couldn't screenshot it, as the device doesn't know how to (it doesn't work above 1080p without dedicated software).
M8 at 1080p.
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol. Too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/.
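To put numbers on why the 1080p run helps: a minimal sketch of the pixel arithmetic, using my 11871 score from the 2K run above and the idealized assumption that a GPU-bound score scales inversely with pixels rendered:

```python
# Idealized resolution scaling for a GPU-bound benchmark score.
qhd = 2560 * 1440   # native 2K panel
fhd = 1920 * 1080   # reduced resolution
ratio = qhd / fhd   # ~1.78x fewer pixels at 1080p

score_2k = 11871    # quoted above
print(f"Pixel ratio 2K/1080p: {ratio:.2f}x")
print(f"Idealized 1080p estimate: {score_2k * ratio:.0f}")  # ~21104

# Real gains are smaller: vertex work, the CPU share and memory bandwidth
# don't all scale with resolution. But it shows why 1080p flagships pull ahead.
```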
JellyKitkatLoli said:
Also, now that I look at it, your CPU integer score is out of proportion because of all the cores the MTK SoC has, lol. Too bad that has no real-life use, and as you can see, single-core performance is much better on the 801 :/.
Hmmm...
The efficiency of software decoding benefits visibly from multiple cores in real life.
There have been many comparisons between the MT6795 (not the "T") and the SD801 on this site, covering browser speed, RAR compression, real gaming, power consumption...
http://tieba.baidu.com/p/3825199511?see_lz=1#69839534914l
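To make the multi-core point reproducible outside of phone benchmarks, here's a toy Python sketch of a CPU-bound job split across cores; a stand-in for software decode, where independent chunks can be processed in parallel (chunk count and sizes are arbitrary):

```python
# Toy multi-core speedup demo: the same CPU-bound work run serially and
# then split across all cores with multiprocessing.
import time
from multiprocessing import Pool, cpu_count

def busy_chunk(n):
    # Deliberately CPU-heavy: sum of squares over a range.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8

    t0 = time.perf_counter()
    serial = [busy_chunk(n) for n in chunks]
    t1 = time.perf_counter()

    with Pool(cpu_count()) as pool:
        parallel = pool.map(busy_chunk, chunks)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial:   {t1 - t0:.2f}s")
    print(f"parallel: {t2 - t1:.2f}s across {cpu_count()} cores")
```

On an 8-core chip like the MT6795 the parallel run should land several times faster, which is the kind of gain software decoding sees when the codec parallelizes well.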
yvtc75 said:
1.61 base, stock, rooted.
Lowered the resolution to 1080p, just for fun.
The GPU really does hold back the result, right?
Are those results after reducing the resolution to 1080p? Did I read that correctly? Because if so, it's sad to confirm the limits of the 3D graphics card, as others stated before. If you look at the M8 score above (which I find surprisingly higher than I experienced on my own M8), you'll notice that even last year's M8 has a better 3D score, not to mention the regular M9, which produces almost double that score. Multi-core-wise the Helio X10 is a beast, but graphics-wise, it looks like it's not so much.
