Octacore?! Only 4 cores work... why? - P8lite Q&A, Help & Troubleshooting

Hi everyone!
I don't know if anyone has noticed yet, but someone using the Honor 4X (which uses the same chipset) has asked this question.
Our chipset is indeed octa-core, but no matter what app you use, what benchmark you run, whatever you do, only 4 cores work. Why is this?
Then someone says they only activate under heavy loads that demand a lot from the CPU... that's not true: when you run any benchmark or heavy-load app that should push the CPU to its limit, it does show 8 cores, but 4 of them are offline/asleep and don't report a clock speed. I know it probably doesn't make much of a difference, but we pay for 8 cores and only get 4 of them to work.
Does anyone know how to get them to work?
Thanks in advance

siul_lasc said:
Hi everyone!
I don't know if anyone has noticed yet, but someone using the Honor 4X (which uses the same chipset) has asked this question.
Our chipset is indeed octa-core, but no matter what app you use, what benchmark you run, whatever you do, only 4 cores work. Why is this?
Then someone says they only activate under heavy loads that demand a lot from the CPU... that's not true: when you run any benchmark or heavy-load app that should push the CPU to its limit, it does show 8 cores, but 4 of them are offline/asleep and don't report a clock speed. I know it probably doesn't make much of a difference, but we pay for 8 cores and only get 4 of them to work.
Does anyone know how to get them to work?
Thanks in advance
Click to expand...
Click to collapse
>P8 Lite General
>>Honor 4x
Facepalm, but ok
The other cores sleep to save power (a stock EMUI feature); they wake up only when necessary, e.g. while multitasking. Any other question?

Rerikh's got the point; however, they should show up in the benchmarking apps as Offline.
Also there's a difference between versions of the phone.
There's one called the ALE-L04, there's our ALE-L21, and I've heard about an ALE-L23 too.
The ALE-L04 has a different chipset, a Snapdragon 615. That chipset comes with one quad-core Cortex-A53 cluster clocked at 1.5 GHz and another quad-core Cortex-A53 cluster clocked at only 1.0 GHz. This model also has a different GPU in the form of the Adreno 405.
The ALE-L21/L23 has the HiSilicon Kirin 620 chipset, which comes with an octa-core Cortex-A53 clocked at 1.2 GHz and a Mali-450 MP4 GPU.
You can see the comparison between the two models at GSMArena.
The kernel disables/enables the whole secondary CPU depending on the load of the system.
I'm using the PACPerformance application, and it shows all 8 cores correctly for me; I'm an owner of an ALE-L21 model, though.
With this app you should also be able to disable cores independently, which allows greater battery life for slightly decreased performance, but that part doesn't work, because the built-in governor keeps turning the offline cores back on.
I used this program on my previous S3 Mini, and it worked great.
If you're interested, you can get the app from here: https://goo.gl/LCpxrA (the program stored in my Google Drive inside a ZIP archive)
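If you'd rather check this without an app, the kernel's hotplug state can be read straight from sysfs over adb. This is the generic Linux CPU-hotplug interface rather than anything EMUI-specific, so treat it as a sketch:
Code:
# which cores the kernel currently has online (e.g. "0-3" while the rest are parked)
adb shell cat /sys/devices/system/cpu/online
# which cores physically exist (should read "0-7" on the Kirin 620)
adb shell cat /sys/devices/system/cpu/present
# per-core state: 1 = online, 0 = hotplugged off
adb shell 'for c in /sys/devices/system/cpu/cpu[0-7]; do echo $c $(cat $c/online 2>/dev/null); done'
# with root you can try parking a core yourself, but as said above the stock
# hotplug logic will usually flip it straight back on
adb shell "su -c 'echo 0 > /sys/devices/system/cpu/cpu7/online'"
Apps like PACPerformance or CPU Stats are basically just polling these same files in a loop.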

Thanks for the explanation, daviddosa...
About the other guy: just shut up if you don't want to answer my question. What's the problem with referring to the Honor 4X? I don't give a f&#? what you say...

Has anyone been trying to compile the kernel from source? I'm trying, but I'm stuck at a weird error. I think Huawei's kernel sources for our phone are missing vendor files that are present in the P8 kernel sources.
About the error: at some point the compilation process doesn't find init/mounts.o. I don't know if this is related to the vendor files.
Cheers
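For anyone wanting to reproduce it, the generic kernel build flow is roughly the following. The defconfig name below is only a placeholder (use whatever Huawei ships in the sources), and whether it's ARCH=arm or arm64 depends on the tree, so take this as a rough sketch, not exact instructions for our package:
Code:
export ARCH=arm64                               # or arm, depending on the released tree
export CROSS_COMPILE=aarch64-linux-android-     # prefix of whatever toolchain is on your PATH
make O=out <device>_defconfig                   # placeholder: pick the defconfig shipped in Huawei's sources
make O=out -j4 2>&1 | tee build.log             # keep a log; the message just above the missing init/mounts.o is usually the real error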

arviit said:
Has anyone been trying to compile the kernel from source? I'm trying, but I'm stuck at a weird error. I think Huawei's kernel sources for our phone are missing vendor files that are present in the P8 kernel sources.
About the error: at some point the compilation process doesn't find init/mounts.o. I don't know if this is related to the vendor files.
Cheers
Click to expand...
Click to collapse
I'd love to be doing that, but unfortunately I don't know anything about code... I'd love to learn at this point... Huawei is the future... In less than a year Huawei will be the next Apple... believe me...

daviddosa said:
Rerikh's got the point; however, they should show up in the benchmarking apps as Offline.
Also there's a difference between versions of the phone.
There's one called the ALE-L04, there's our ALE-L21, and I've heard about an ALE-L23 too.
The ALE-L04 has a different chipset, a Snapdragon 615. That chipset comes with one quad-core Cortex-A53 cluster clocked at 1.5 GHz and another quad-core Cortex-A53 cluster clocked at only 1.0 GHz. This model also has a different GPU in the form of the Adreno 405.
The ALE-L21/L23 has the HiSilicon Kirin 620 chipset, which comes with an octa-core Cortex-A53 clocked at 1.2 GHz and a Mali-450 MP4 GPU.
You can see the comparison between the two models at GSMArena.
The kernel disables/enables the whole secondary CPU depending on the load of the system.
I'm using the PACPerformance application, and it shows all 8 cores correctly for me; I'm an owner of an ALE-L21 model, though.
With this app you should also be able to disable cores independently, which allows greater battery life for slightly decreased performance, but that part doesn't work, because the built-in governor keeps turning the offline cores back on.
I used this program on my previous S3 Mini, and it worked great.
If you're interested, you can get the app from here: https://goo.gl/LCpxrA (the program stored in my Google Drive inside a ZIP archive)
Click to expand...
Click to collapse
The ALE-L23 is the one that got shipped to Mexico; I got mine with Telcel. I don't know if it's the same for the rest of Latin America.

Well, this phone is running with only 4 cores... every CPU program on the market says so. With 100% load for more than 10 minutes the other cores never turn on, that's it.

reindjura said:
Well, this phone is running with only 4 cores... every CPU program on the market says so. With 100% load for more than 10 minutes the other cores never turn on, that's it.
Click to expand...
Click to collapse
You have to find a CPU program made for octa-core chips, CPU Stats for example.

http://forum.xda-developers.com/p8lite/help/cores-doesn-t-t3273248
It is mainly a problem with the apps not showing all 8 cores!

siul_lasc said:
Thanks for the explanation, daviddosa...
About the other guy: just shut up if you don't want to answer my question. What's the problem with referring to the Honor 4X? I don't give a f&#? what you say...
Click to expand...
Click to collapse
Hahah XD, you are so funny. Why do you think the forum is divided into topics? By the same logic I could ask about a Samsung in a Nexus topic; not the same, huh? Please follow the rules next time.

Time_Bandit said:
http://forum.xda-developers.com/p8lite/help/cores-doesn-t-t3273248
It is mainly a problem with the apps not showing all 8 cores!
Click to expand...
Click to collapse
Thanks, good app, thank you.

Download CPU Stats from the Play Store and you will see 8 cores.

Hello, like you I have seen only 4 cores working. It's like this because cores 4 to 7 don't have governors and the other system files that cores 0 to 3 have.
Because of this, our devices have 8 cores but only 4 can run. It's really bad.
And it's only 4 real cores.

dkonect said:
Hello, like you I have seen only 4 cores working. It's like this because cores 4 to 7 don't have governors and the other system files that cores 0 to 3 have.
Because of this, our devices have 8 cores but only 4 can run. It's really bad.
And it's only 4 real cores.
Click to expand...
Click to collapse
No, just use an appropriate app to see them.
I saw all 8 cores working while testing CPU Stats + Asphalt 8.

dkonect said:
Hello, like you I have seen only 4 cores working. It's like this because cores 4 to 7 don't have governors and the other system files that cores 0 to 3 have.
Because of this, our devices have 8 cores but only 4 can run. It's really bad.
And it's only 4 real cores.
Click to expand...
Click to collapse
All 8 cores running

Octa cores
By chance one good guy is working on a custom kernel for this exotic chipset.
It's in testing stages now.
To enable all the cores all the time, you only need to edit one entry in the build.prop.
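The entry itself isn't named here, so no guesses on the key, but the general way to edit build.prop on a rooted phone is the same whatever the key is (back it up first; a broken build.prop can bootloop the phone):
Code:
adb root && adb remount                         # only works with a rooted / insecure adbd; otherwise use a root file manager
adb pull /system/build.prop build.prop.bak      # keep an untouched backup
# edit a copy on the PC, then push it back and restore sane permissions
adb push build.prop.edited /system/build.prop
adb shell chmod 644 /system/build.prop
adb reboot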

dkonect said:
To enable all the cores all the time
Click to expand...
Click to collapse
Why should all 8 cores work when they are not needed at all?! That is an absolute waste of energy!
When it comes to high load, all 8 cores are activated!

I got this phone a week ago. As I noticed, whether it runs on 4 or 8 cores depends on the battery settings:
normal is 8-core,
intelligent is 4-core.
I checked with a video + CPU stat; it lagged with 4 cores.

All cores working here
Sent from my ALE-UL00 using Tapatalk

Related

[TEST] Tegra 2 OC vs Tegra 3

Hello guys! Browsing around the net I found some charts that compare benchmarks of the stock Tegra 2 vs. the Tegra 3 relative to other processors. I've always wondered whether, now that the processor is overclocked to 1.7 GHz (70% more clock), it has closed the gap to the upcoming quad-core (which, as NVIDIA has said, is supposed to be 5 times faster than the current SoC).
Can some developers make this comparison? It would be interesting to see the benefits.
"when looking at the compiler versions and settings used to compile the CoreMark benchmark, the Core 2 Duo numbers were produced via GCC 3.4 and only the standard set of optimizations (-O2), while the Tegra 3 numbers were run on a more recent GCC 4.4 with aggressive optimizations (-O3). Il Sistemista website took a Core 2 Duo T7200 and re-ran the benchmark compiled with GCC 4.4 and the same optimization settings. The results were no longer in favor of NVIDIA, as the Core 2 chip scored about 15,200 points, compared to the Tegra's 11,352."
CoreMark benchmark comparing the NVIDIA Tegra 3 @ 1 GHz clock to various real and hypothetical products
The CoreMark/MHz index shows how many CoreMarks a particular chip can extract given its frequency
WoW.... More powerful than a Core 2 Duo O_O!!!
Now THAT'S something to get excited about!!!
Ahmed_PDA2K said:
WoW.... More powerful than a Core 2 Duo O_O!!!
Now THAT'S something to get excited about!!!
Click to expand...
Click to collapse
If you read the bolded part, they fixed the compiler optimizations and the Tegra did not hold up.
At this point it would be really interesting to know how much the gap has been reduced with the super-overclock of the Tegra to 1.7 GHz (is that really the maximum achievable?). I don't know how to run CoreMark 1.0, so I'm asking someone in the community to check whether the values are underestimated or substantially improved compared to those published by NVIDIA, and especially to see whether the Tegra 3 can really be worth it.
I recently read that NVIDIA is pushing a lot of its projects and has already started development of the Tegra 4. The specifications include manufacturing the Tegra 4 chip at 28 nm. IMHO it would make more sense to wait for that before upgrading the Tegra 2 hardware (so these benchmarks could be a way to understand this in context).
I highly doubt many people can get their tegra 2 up to 1.7, I can only get mine stable at 1.4, probably 1.5 and maybe 1.6 if I spent a lot of time messing with the voltages. I think I read somewhere that 1.7ghz produces a lot of heat, like almost in the danger zone but I could be exaggerating it a little.
Sent from my ADR6300 using Tapatalk
In any case it would be nice to have a comparison at all frequencies against the values published by NVIDIA, to give device owners an idea of the hardware's potential. But I think it's a matter of development and optimization, as we are seeing here on the forum these days... the tablet is slowly improving on all fronts.
I'm sure that once we have access to the OS code the tf will run like a beast! I had an OG Droid and the difference between stock and modded was mind blowing.
Sent from my ADR6300 using Tapatalk
brando56894 said:
I highly doubt many people can get their tegra 2 up to 1.7, I can only get mine stable at 1.4, probably 1.5 and maybe 1.6 if I spent a lot of time messing with the voltages. I think I read somewhere that 1.7ghz produces a lot of heat, like almost in the danger zone but I could be exaggerating it a little.
Sent from my ADR6300 using Tapatalk
Click to expand...
Click to collapse
I'm one of the few with one that does 1.7GHz no problem.
It's not only the CPU speed of the Tegra 3 that will pwn the Transformer 1: a 12-core GPU, support for dynamic lighting, High Profile HD decoding and much more. No way a Tegra 2 will outperform the Tegra 3.
Just face it: the Transformer 1 will be pwned big time by the Transformer 2, launching Oct/Nov 2011....
Not to mention the new TI CPUs coming Q1 2012.... they will pwn even more than the Tegra 3...
Sent from my Desire HD using XDA App
Tempie007 said:
It's not only the CPU speed of the Tegra 3 that will pwn the Transformer 1: a 12-core GPU, support for dynamic lighting, High Profile HD decoding and much more. No way a Tegra 2 will outperform the Tegra 3.
Just face it: the Transformer 1 will be pwned big time by the Transformer 2, launching Oct/Nov 2011....
Not to mention the new TI CPUs coming Q1 2012.... they will pwn even more than the Tegra 3...
Sent from my Desire HD using XDA App
Click to expand...
Click to collapse
Yeah, it is true that it's stronger, but my thesis is that the Tegra 3 is only a technology shift. Although it has 12 computing cores capable of dynamically processing lighting, this CPU produces the values above, and it would be interesting to relate them to the values of an overclocked Tegra 2 and see whether the Tegra 3 can really be a worthwhile replacement for the Tegra 2. We are faced with two different SoCs, the first a dual-core, the second a quad-core, but common sense tells me that, if well exploited, this device could give the next model a long hard time (I remember the story of the HTC HD2, which was released back in 2009 and is still today one of the longest-running phones thanks to constant software rework). I'd argue that what these devices need is not raw power but efficiency of computation, since batteries haven't kept up, and simply adding capacity won't make the devices more streamlined in the near future.
Does anyone know how to use CoreMark to update these values?
Of course the Tegra 2 will beat the crap out of the Tegra 3, that's like comparing a Core 2 Duo to a Core i7 lol
chatch15117 said:
I'm one of the few with one that does 1.7GHz no problem.
Click to expand...
Click to collapse
Lucky bastard! :-D how many mV are you running it at?
Never tried 1.7 GHz but have done 1.5-1.6 on 3 tabs without messing with voltages. Actually right now I'm running 1.2 GHz, messing with LOWERING the voltages.
Coremark download
If anyone can compile CoreMark to run under Android, here is the software to download:
Download the Coremark Software readme file
Download the CoreMark Software documentation
This documentation answers all questions about porting, running and score reporting
Download the Coremark Software
Download the CoreMark Software MD5 Checksum file to verify download
Use this to verify the downloads: >md5sum -c coremark_<version>.md5
Download the CoreMark Platform Ports
These CoreMark ports are intended to be used as examples for your porting efforts. They have not been tested by EEMBC, nor do we guarantee that they will work without modifications.
Here is the results table:
http://www.coremark.org/benchmark/index.php
If a dev can compile it and run some tests to compare the results at the new frequencies, that would be great!
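For anyone willing to try, a rough sketch of the cross-compile with an old standalone NDK toolchain could look like the lines below. The PORT_DIR / XCFLAGS / ITERATIONS knobs come from the CoreMark readme linked above (double-check them there; the binary may be named just "coremark" depending on the port), and the compiler prefix is only an example:
Code:
# cross-compile CoreMark statically for ARM, then push it to the device and run it
make PORT_DIR=linux CC=arm-linux-androideabi-gcc XCFLAGS="-O2 -static" ITERATIONS=0 link
adb push coremark.exe /data/local/tmp/
adb shell chmod 755 /data/local/tmp/coremark.exe
adb shell /data/local/tmp/coremark.exe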
brando56894 said:
I highly doubt many people can get their tegra 2 up to 1.7, I can only get mine stable at 1.4, probably 1.5 and maybe 1.6 if I spent a lot of time messing with the voltages. I think I read somewhere that 1.7ghz produces a lot of heat, like almost in the danger zone but I could be exaggerating it a little.
Sent from my ADR6300 using Tapatalk
Click to expand...
Click to collapse
There is no danger zone... at least not on my kernel. I haven't posted the one for 3.1 (only 3.01), but there is a built-in downclock once the chip reaches temps beyond 47 degC... This only happens if the tablet (in laptop form) is left in a bag with the screen on (by accident; sometimes the hinge says it's open when it's closed). So if you can run 1.7 GHz stable, there's no reason not to run it, as you'll be protected from heat by the built-in thermal throttling.
The default thermal throttling had some seriously bizarre ranges... 80-120 degC.
I wish nvidia's boards were produced like this:
http://hardkernel.com/renewal_2011/main.php
http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=G129705564426
Something modular would be a nice change in the mobile tech segment.. Swapping SoCs would require some serious low-level work tho.. Unless writing to the bootloader and boot partitions were made easy via external interface.. something like that.
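If anyone wants to watch the temperature themselves while benchmarking, the generic Linux thermal sysfs nodes are usually exposed; zone numbering and units vary per kernel (most report millidegrees C), so poke around:
Code:
# list every thermal zone with its type and current reading
adb shell 'for z in /sys/class/thermal/thermal_zone*; do echo $z $(cat $z/type) $(cat $z/temp); done'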
Blades said:
There is no danger zone... at least not on my kernel. I haven't posted the one for 3.1 (only 3.01), but there is a built-in downclock once the chip reaches temps beyond 47 degC... This only happens if the tablet (in laptop form) is left in a bag with the screen on (by accident; sometimes the hinge says it's open when it's closed). So if you can run 1.7 GHz stable, there's no reason not to run it, as you'll be protected from heat by the built-in thermal throttling.
The default thermal throttling had some seriously bizarre ranges... 80-120 degC.
I wish nvidia's boards were produced like this:
http://hardkernel.com/renewal_2011/main.php
http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=G129705564426
Something modular would be a nice change in the mobile tech segment.. Swapping SoCs would require some serious low-level work tho.. Unless writing to the bootloader and boot partitions were made easy via external interface.. something like that.
Click to expand...
Click to collapse
And so? Are you able to use CoreMark to run some benchmarks on your device and compare them with the Tegra 3 results? That's what I'd like to see tested here.
devilpera64 said:
"when looking at the compiler versions and settings used to compile the CoreMark benchmark, the Core 2 Duo numbers were produced via GCC 3.4 and only the standard set of optimizations (-O2), while the Tegra 3 numbers were run on a more recent GCC 4.4 with aggressive optimizations (-O3).
Click to expand...
Click to collapse
That statement right there shows why benchmarks are extremely useless and misleading.
I'm one of the few that can reach 1.7 GHz no problem too. Never ran it long enough to get hot though, maybe 30 mins just to run benchmarks. Never had FCs or had my system crash.

People are saying the Nexus 7's processor is faster than the Nexus 10's. Is this true?

Thx for any feedback
Sent from my SGH-I747 using xda app-developers app
Who's saying that? Lol. Must be misinformed
Sent from my EVO using Tapatalk 2
Nexus 7: 1.2GHz quad core, A9 based
Nexus 10: 1.7GHz dual core, A15 based
It's probably quite difficult to find a use case on a tablet where more than two cores provides any tangible performance increase, but the much higher clock rate and newer architecture should definitely make a difference.
Sent from my HTC One X using xda app-developers app
Benchmarks show that the new dual-core is much faster than the N7's quad. There's also the 2 GB of RAM.
I suspect it's mostly because people don't understand how 2 cores can be faster than 4 cores.
Yes, they can and yes, they are. Cortex-A15 CPUs have a new and vastly superior architecture.
It's top notch. Although core counts are great for marketing, don't get too worried. After people get their hands on it, we will have a better idea of performance. I remain confident I'm about to purchase the most powerful tablet available.
Sent from my HTC One S using Tapatalk 2
Biohazard0289 said:
It's top notch. Although core counts are great for marketing, don't get too worried. After people get their hands on it, we will have a better idea of performance. I remain confident I'm about to purchase the most powerful tablet available.
Sent from my HTC One S using Tapatalk 2
Click to expand...
Click to collapse
Technically that's not true; the latest iPad, I believe, vastly outperforms this thing. That said, how is it gonna make use of all that power other than playing games, which, let's face it, is all iPad buyers really do...
Not like it'll be cracking aircrack dumps eh.
Sent from my GT-I9300 using Tapatalk 2
I saw a benchmark that placed the nexus 10 under the transformer TF300 (antutu). Would this be accurate? I would've thought that the nexus10 would've outperformed the transformer which has a quad core tegra 3 processor.
Sent from my ASUS Transformer Pad TF300T using xda app-developers app
UndisputedGuy said:
I saw a benchmark that placed the nexus 10 under the transformer TF300 (antutu). Would this be accurate? I would've thought that the nexus10 would've outperformed the transformer which has a quad core tegra 3 processor.
Sent from my ASUS Transformer Pad TF300T using xda app-developers app
Click to expand...
Click to collapse
Saw that too, as well as benchmarks in which the Note 10.1 performs better than the Nexus 10.
http://www.engadget.com/2012/11/02/nexus-10-review/
Funnily enough, other sites really do say the Nexus 10 is the fastest they have seen so far.
4z01235 said:
Nexus 7: 1.2GHz quad core, A9 based
Nexus 10: 1.7GHz dual core, A15 based
Click to expand...
Click to collapse
4 * 1.2 GHz = 4.8 GHz
2 * 1.7 GHz = 3.4 GHz
Clearly, the Nexus 7 is much faster.
That is the discussion level when people come to the conclusion stated in the topic. Even in benchmarks - which all of the reviewers agree don't mirror the level of performance they experience in real world use - the Nexus 10 comes out way higher than the Nexus 7 while pushing 4 times the pixels! The Nexus 10 runs circles around its little brother...
The differences are so vast, it is almost incomprehensible to some. I'll use the same reference I use for desktop processors. When quad cores really started coming around and making big appearances, software and games weren't exactly capable of utilizing them. So an E8400 3.0 GHz dual-core would outperform its bigger quad-core brother running at the same frequency and architecture in a lot of applications. Back in 2008, gaming rigs almost always sported dual-core processors. The simple fact is that those processors use cache that is on the die: 6 MB for two cores is better than 8 MB for four cores.
So, until software developers really have a need for a quad-core, dual cores will keep up with quad cores just fine. Benchmarks, however, almost never show real-world performance. I'm sure this dual-core is going to surprise even the more skeptical guys and gals.
Sent from the Blue Kuban on my Epic 4G Touch
Maxey said:
Saw that too, as well as benchmarks in which the Note 10.1 performs better than the Nexus 10.
http://www.engadget.com/2012/11/02/nexus-10-review/
Funnily enough, other sites really do say the Nexus 10 is the fastest they have seen so far.
Click to expand...
Click to collapse
The benchmarks are selling the N10 very, very short. There is a great review linked in this post where the author says exactly that.
Here's the post that stamps its feet and declares the absolute truth:
Cores aren't everything.
And you can think that we're just a bunch of fanboys trying to justify the use of dual core in the Nexus 10, but even non-fanboys who know how these processors work would tell you a dual core A15 is far more powerful than a quad core A9.
Now everything below this post is purely observation based (I'm not an engineer nor have I really studied up on computer parts), but I think it gives a casual user a good idea of how a CPU works.
So, before you go out screaming "wtf. why are there only 2 cores in the nexus 10. lol so noob," ask yourself this: what do extra cores even do? If you can't answer that, then don't complain because you clearly don't know what you're talking about. It's not simple multiplication of the frequency (for example, 4 x 1.2ghz = 4.8ghz). A 4.8ghz phone would be able to run Crysis without a problem (of course with a dedicated GPU, but that's besides the point). That's obviously not the case.
The major difference between the A15 and the A9 is their microarchitecture. You can think of it this way. A mouse has to go through a course. There are two courses:
a winding maze, and a straight route.
We call the maze A9 and we call the straight route A15.
Now, we will pretend "ghz" is a measure of intelligence (higher the better). A mouse that has an intuition level of 2.0ghz has to go through maze A9. Complicated right? It'd take awhile even if the mouse was kinda smart. But now a mouse that has an intuition level of 1.7 ghz has to go through maze A15. Easy. The maze route is a straight line - who wouldn't be able to find the end? So in the end, we ask ourselves, "Who gets the job done faster?" Obviously, the mouse in the A15 does even though it's not as smart. In this way, the A15 is just more efficient. So clock speeds (which are things like 1.0 ghz, 2.0 ghz, 1.5 ghz, 1.8 ghz) are only a part of the story. You need to factor in the microarchitecture (or in this example, the way the maze is organized).
Now, we go into cores. We can think of cores as splitting the work. Now, we will consider this scenario: we want to calculate how much time it takes for the maze to be completed four times (regardless of how many mice you are using - we just want to complete this maze 4x). A quadcore CPU that contains the 2.0ghz mouse can be represented by 4 mazes:
The dualcore CPU that contains the 1.7 ghz mouse can be represented by 2 mazes.
With the quadcore CPU, we will finish the task in only one go. Just put a mouse in each maze, and once they're done, we've completed the maze four times. With the dualcore CPU, we will finish the task in two go's. The mouse in each maze will have to go through each maze twice to finish the maze 4x. However, let's look at the "microarchitecture," or the maze route. Even though the dual core CPU (A15) needs to finish this task in two go's, it'll still do it a lot faster, because the route is far easier to go through. This makes the A15 more powerful. You can complete tasks quickly.
So when judging the "power" of SOCs, you need to keep three things in mind: cores, clock speed, and microarchitecture.
Clock speed = frequency such as 1.5ghz
Cores = dual core, quad core
Microarchitecture = A9, A15
Sometimes the microarchitecture won't be a vast enough improvement to justify a seriously lopsided clock speed. For example, a 4.5 ghz Intel Sandy Bridge CPU will be tons faster than a 3.2 ghz Intel Ivy Bridge CPU even though the Ivy Bridge CPU has a new microarchitecture. But, an Intel Sandy Bridge CPU clocked at 4.5 ghz will be slower than an Ivy Bridge CPU clocked at 4.4 ghz because the microarchitecture is slightly better.
Anyway, I hope this clears things up. I know the information here is probably not 100% accurate, but I'm hoping this is easier to understand than pure numbers and technical talk.
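To put very rough numbers on the maze story: a toy model is throughput ≈ cores × clock × work-per-clock. The work-per-clock figures below are invented purely for illustration (the A15 really is wider per clock than the A9, but the exact factor depends on the workload):
Code:
# made-up per-clock numbers, just to show that clock x cores alone is misleading
awk 'BEGIN {
  printf "quad A9  @ 1.2 GHz: %.2f units total, %.2f per core\n", 4*1.2*1.0, 1.2*1.0;
  printf "dual A15 @ 1.7 GHz: %.2f units total, %.2f per core\n", 2*1.7*1.6, 1.7*1.6;
}'
Even with invented numbers, the per-core column is where the dual A15 pulls far ahead, and that's what you feel in everyday use, where one or two threads do most of the work.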
Where are the idiots that keep saying this? Honestly, I'm growing tired
Think about it, do you really think the quad core in your tablet is faster than say, a dual core core i5 in a laptop?
NO
Maybe this explains to you how a new processor architecture (A15), even at a dual core, can be faster than a quad core on the old architecture
Good explanation @404 !
What I'm worried about is whether the processor can handle that screen. I wonder if its possible to overclock it?

Samsung announces Exynos 5 Octa, an A15 and A7 hybrid SoC

Samsung's most exciting announcement yet - the Exynos 5 Octa chipset. :laugh:
It brings 8 processor cores, which distribute the work load among each other.
Four Cortex-A15 cores ensure incredible performance while the other four are low-power Cortex-A7s that kick in for the less demanding tasks and save battery power.
The chipset is based on ARM's big.LITTLE tech, which ensures that you will always get enough performance without having to deal with terrible battery life.
Samsung promises up to 70% lower power consumption compared to the Exynos 5 Dual, which is powering the Google Nexus 10 tablet and the latest Chromebook by Samsung.
The Exynos 5 Octa could possess as much as twice the 3D rendering prowess of the Exynos 4 Quad, which is found in the Galaxy S III. :good:
Will this end up in the Samsung S4?
Hard to say; normally it would use the Exynos 5 Quad, since no device is using it yet.
Also, the other 4 cores should use Cortex-A9.
Then it's totally a beast already.
Sent from my GT-I9300 using xda premium
So? What is it that a single A15 core can not do but 4 A7 cores can? Low expected performance or low power consumption?
_delice_doluca_ said:
So? What is it that a single A15 core can not do but 4 A7 cores can? Low expected performance or low power consumption?
Click to expand...
Click to collapse
I think the main idea behind this chip is the low power consumption.
Tegra used a single companion core, but Samsung uses 4 companion-like cores.
Samsung is crazy; they may add it in the S4, and besides, they are the manufacturer.
XeactorZ said:
Hard to say; normally it would use the Exynos 5 Quad, since no device is using it yet.
Also, the other 4 cores should use Cortex-A9.
Then it's totally a beast already.
Sent from my GT-I9300 using xda premium
Click to expand...
Click to collapse
Exynos 5, maybe. But the chip is the same as the Exynos 5, just with the addition of the little cores. Am I right?
XeactorZ said:
Hard to say; normally it would use the Exynos 5 Quad, since no device is using it yet.
Also, the other 4 cores should use Cortex-A9.
Then it's totally a beast already.
Sent from my GT-I9300 using xda premium
Click to expand...
Click to collapse
Agree. The A7 cores may provide good battery life, but I guess performance will not be up to the mark. But then there are 4 A15 cores, a heck of a lot of performance.
Sent from my GT-I9300 using xda app-developers app
That announcement all sounds great, but how do you cool it down when it reaches high clocks and high temperatures? Isn't this big.LITTLE chip more suitable for a bigger body - a Chromebook or tablet - rather than a phone? I understand the on-the-fly swapping between the lower and higher cores, but the heat question is not answered for me. Can somebody with better knowledge enlighten me?
Aur3L said:
That announcement all sounds great, but how do you cool it down when it reaches high clocks and high temperatures? Isn't this big.LITTLE chip more suitable for a bigger body - a Chromebook or tablet - rather than a phone? I understand the on-the-fly swapping between the lower and higher cores, but the heat question is not answered for me. Can somebody with better knowledge enlighten me?
Click to expand...
Click to collapse
Some rumours suggest that they'll use the 8-core in tablets and Chromebooks, and that the S4 may have a 4-core version with 2 A15s and 2 A7s. Just a rumour, but heat and voltage will be a lot lower with 4 fewer cores. I have no idea how dual A15s are in terms of performance; I haven't checked other phones' benchmarks in ages.
Just need to announce the most expected product: exynos sources
Sent from my GT-I9300 using xda premium
_delice_doluca_ said:
So? What is it that a single A15 core can not do but 4 A7 cores can? Low expected performance or low power consumption?
Click to expand...
Click to collapse
You must look at this like the current Intel laptops that have i3/i5/i7 processors.
When you are browsing the web, the onboard video chip does all the rendering. But when you start up, for example, Far Cry, the onboard chip gets overloaded and the dedicated graphics card takes over. Power when you need it, saved when you don't.
Another example:
when the V-TEC kicks in, yo!
Sent from my GT-I9300 using xda premium
The reason for the A15 and A7 pairing is that the A7 will handle simple tasks and minimal multitasking, but when pressure is placed on the CPU it will switch to the A15 to handle it easily, then back to the A7 when the power of the A15 is not needed - aka big.LITTLE tech, which means the perfect balance between performance and battery life.
Sent from my GT-I9300 using xda premium
marcellocord said:
Just need to announce the most expected product: exynos sources
Sent from my GT-I9300 using xda premium
Click to expand...
Click to collapse
1+ mate
DramatikBeats said:
Some rumours suggest that they'll use the 8-core in tablets and Chromebooks, and that the S4 may have a 4-core version with 2 A15s and 2 A7s. Just a rumour, but heat and voltage will be a lot lower with 4 fewer cores. I have no idea how dual A15s are in terms of performance; I haven't checked other phones' benchmarks in ages.
Click to expand...
Click to collapse
If they switch to dual A15 and dual A7, then the Sony Z will take over in performance and will be the number one choice.
If the S4 is going to have the big.LITTLE 8-core, it's gonna rock and will have unbeatable performance.
About the temperature: when on 3G and browsing I guess it will be on the A7s, so temperatures should be low.
Did you know that most of the time the S3 uses only two cores and the rest are off, and sometimes a single core only!!
Note III will have plenty of space to include this monster of a chip. Looking at all the 5"+ devices at CES, the S IV will also be big enough for it (and also with enough room to dissipate all the heat).
Guys, just think, if you check on wikipedia, there is no Exynos 5 quad and I don’t think they will release any after the OCTA: http://en.wikipedia.org/wiki/Exynos_(system_on_chip)
As everywhere is said, the exynos 5 quad will be just a murder for a battery, so I’m sure they will use OCTA in S4. Come on Samsung, we want S4 with OCTA !
demlasjr said:
Guys, just think, if you check on wikipedia, there is no Exynos 5 quad and I don’t think they will release any after the OCTA: http://en.wikipedia.org/wiki/Exynos_(system_on_chip)
As everywhere is said, the exynos 5 quad will be just a murder for a battery, so I’m sure they will use OCTA in S4. Come on Samsung, we want S4 with OCTA !
Click to expand...
Click to collapse
The Exynos Octa is an Exynos 5 Quad with an A7 CPU cluster added.
demlasjr said:
Guys, just think, if you check on wikipedia, there is no Exynos 5 quad and I don’t think they will release any after the OCTA: http://en.wikipedia.org/wiki/Exynos_(system_on_chip)
As everywhere is said, the exynos 5 quad will be just a murder for a battery, so I’m sure they will use OCTA in S4. Come on Samsung, we want S4 with OCTA !
Click to expand...
Click to collapse
Wikipedia is written by users, not companies.
There is an Exynos 5 Quad; they announced it in their smart TVs and in that expansion module for upgrading the >2012 models. It's running at 1.3 GHz.
We know there's a 5410 (As shown during the Octa presentation at CES) and a 5440, as disclosed in the Linux upstream kernel.
Do you have the thread link on PCInLife? Edit: Found it, but attachments are for registered users only and I can't get past the Chinese captcha.
Anyway, those slides are great. It supports all use-case models, so basically the possibilities are limitless on the kernel and power-management side. The only thing I wonder now is whether the two clusters are on separate frequency planes.
I think it comes from here: http://we.pcinlife.com/forum.php?mod=viewthread&tid=2038663&extra=&ordertype=1&threads=thread&mobile=yes
But I'm not sure because I don't speak Chinese. I found it on AnandTech's forum.
Edit: Sorry, I'm a little late.

Exynos Octa 5 is confirmed to be used in Galaxy S4

The Octa 5, which is also named the 5410, has 4+4 cores (both clusters can be enabled at the same time, but Samsung doesn't provide an 8-core mode because of power consumption) and a 544MP GPU (maybe a 544MP3 at 533 MHz).
The good news is that the 28nm HP process gives it much lower power consumption than the 4412 when only the A7 cores are running.
But sadly it only gets 24000+ in AnTuTu, lower than the Tegra 4. What's worse, neither of them supports OpenGL 3.0!!!
Sent from my GT-N7100 using xda premium
alexander1995 said:
The Octa 5, which is also named the 5410, has 4+4 cores and a 544MP GPU.
The good news is that the 28nm HP process gives it much lower power consumption than the 4412 when only the A7 cores are running.
But sadly it only gets 24000+ in AnTuTu, lower than the Tegra 4. What's worse, neither of them supports OpenGL 3.0!!!
View attachment 1773421
View attachment 1773422
Sent from my GT-N7100 using xda premium
Click to expand...
Click to collapse
I find this hard to believe. For a start, A15 cores are huge and hot and made for tablets unless clocked around 900 MHz. The die area for an 8-core design is going to be way bigger than anything else on the market. Makes me think this isn't made for a phone.
Even the Tegra 4 is made for tablets, and the 4i is the phone variant with LTE built in. Since all of these are 28nm, there is no die shrink to help.
The Snapdragon 600 looks to be a quad-core with low power use, and it benchmarks very high. I doubt the solution is an 8-core A15 + A7 chip.
Where is the proof?
From 'various' sources I have read (just Google it), the international version will have the 8-core while the US version will have the Snapdragon 600. This is mostly speculation with 'some' insider info. Won't be much longer until we know for sure...
Here is the maximum power list
4210→about 3w
4412→2w
8064→about 5w
tegra4→between 5w and 7w
Sent from my GT-N7100 using xda premium
alexander1995 said:
The Octa 5, which is also named the 5410, has 4+4 cores and a 544MP GPU.
The good news is that the 28nm HP process gives it much lower power consumption than the 4412 when only the A7 cores are running.
But sadly it only gets 24000+ in AnTuTu, lower than the Tegra 4. What's worse, neither of them supports OpenGL 3.0!!!
View attachment 1773421
View attachment 1773422
Sent from my GT-N7100 using xda premium
Click to expand...
Click to collapse
That 40000 bugs me.
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
Hmm...
5.5W?
Seems likely this is real deal
Sent from my GT-N7100 using xda app-developers app
alexander1995 said:
But sadly it only gets 24000+ in AnTuTu, lower than the Tegra 4.
Click to expand...
Click to collapse
Those preliminary numbers from Teg4 are nothing but Nvidia's famous marketing hype which they are far better at than meeting delivery schedules and self-declared performance claims. The benchmarks were from a cobbled together device that can't even be considered a reference platform. And there was no battery life optimization applied so, at the scores posted, a tablet would probably last three hours. Until their chips are in a device that's purchasable anything Nvidia's says or does should be taken with a grain of salt.
Or maybe the Octa starts from the A7s before the A15s. That it isn't higher than Intel dual cores makes no sense.
Sent from my GT-N7100 using xda premium
alexander1995 said:
Or maybe the Octa starts from the A7s before the A15s. That it isn't higher than Intel dual cores makes no sense.
Sent from my GT-N7100 using xda premium
Click to expand...
Click to collapse
What makes no sense?
Are you guys still fighting over Antutu/quadrant scores? Someone should kill those benchmarks. Waste of time and a disgrace in the name of benchmark tests. an antutu/quadrant score is just a score. you could literally smell a device and give better prediction about performance than those benchmarks could possibly give you.
hot_spare said:
What makes no sense?
Are you guys still fighting over Antutu/quadrant scores? Someone should kill those benchmarks. Waste of time and a disgrace in the name of benchmark tests. an antutu/quadrant score is just a score. you could literally smell a device and give better prediction about performance than those benchmarks could possibly give you.
Click to expand...
Click to collapse
Not to mention this forum is about the sgn2, not speculation or otherwise on other devices. :/
-----
I would love to help you, but help yourself first: ask a better question
http://www.catb.org/~esr/faqs/smart-questions.html
spycedtx said:
Not to mention this forum is about the sgn2, not speculation or otherwise on other devices. :/
-----
I would love to help you, but help yourself first: ask a better question
http://www.catb.org/~esr/faqs/smart-questions.html
Click to expand...
Click to collapse
http://forum.xda-developers.com/showthread.php?t=2091232
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
*another* variant
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
i9100g user said:
*another* variant
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
Click to expand...
Click to collapse
26xxx points in Antutu? I think all these benchmark screenshots are fake. Tegra 4 gets more than 36xxx points while getting double the cpu integer and floating point performance of the exynos octa. Both chipsets are using 4 A15 cores. The only reason for such a bad score can be that there are only the A7 cores running instead of the A15 cores. So I think, the final S4 with Exynos Octa will get at least 30xxx points or more in Antutu.
SaschaHa said:
26xxx points in Antutu? I think all these benchmark screenshots are fake. Tegra 4 gets more than 36xxx points while getting double the cpu integer and floating point performance of the exynos octa. Both chipsets are using 4 A15 cores. The only reason for such a bad score can be that there are only the A7 cores running instead of the A15 cores. So I think, the final S4 with Exynos Octa will get at least 30xxx points or more in Antutu.
Click to expand...
Click to collapse
I will be actually happy if the device runs most times on A7. If that means lower scores on some benchmarks, i will have no complaints. BTW, it's useless discussing about nvidia's marketing numbers. we can talk this to death after it's released in a device. i am sure we can also do a lot with benchmarks on a heatsinked test device.
the moment a whitepaper talks about antutu/quadrant scores, it starts to lose credibility.
BTW, antutu is a device benchmark tool, not a processor benchmark.
Finally freed from the ARM Mali-400 MP4! The Note 3 will probably also have this system. Right now Real Racing 3 on high details runs at ~25-30 FPS, so it's sad.
Sorry for the bad English... Google Translate.
Delete
Those S4 leaks were apparently fake (updated by SamMobile).
But these are real ones; this is the AT&T variant running the Snapdragon 600:
Source here: http://www.gsm-israel.co.il/%D7%97%D7%93%D7%A9%D7%95%D7%AA/%D7%91%D7%9C%D7%A2%D7%93%D7%99-%D7%A6%D7%99%D7%9C%D7%95%D7%9E%D7%99-%D7%9E%D7%A1%D7%9A-%D7%9E%D7%AA%D7%95%D7%9A-%D7%94-galaxy-s-iv
sent from an Galaxy s3 GT I9300
Running perseus kernel 33.1 , XELLA 4.1.2 leaked build
forum.xda-developers.com/showthread.php?t=1784401
Dont click,you might regret , I won't be responsible if you brick ur head
Exynos Octa 5 vs Intel arm, may be soon !
Exynos for me all the way. The best SoC of the current era. No overheating issues, easily overclockable, and it gives the max.
Sent from my GT-N7000 using xda app-developers app

Exynos 5420 vs Snapdragon 800

Hello guys,
I am a newbie and I would love it if someone could technically compare these two SoCs.
I have already read the topics in the Galaxy S4 section, but they are not up to date.
I would like to hear from expert users what are the differences performance-wise.
I also have some questions:
1) Which version is the Octacore soc in the Note 3? 5420?
2) In practice I know that if I choose the Snap800 version I get LTE and 4k video recording; but what are the advantages of the Octa Core soc?
I know that the new Octa Core is 15-20% more powerful than the 5410, but so is the snap800 vs snap600. What is the best in raw performance between the two socs?
3) How does the new Adreno 330 compare against the new Mali T628 in terms of pure performance?
4) I read in the S4 forum that the Exynos 5410 is affected by a bug at the Corelink which caused some slow downs in the switch between A7 and A15 cores. Is the Exynos 5420 bugfree?
5) Which cpu is more future proof? I mean, aren't the A15 cores newer than the Snap's Krait 400?
6) Do we actually know if the 5420 supports HMP (Heterogeneous Multi-Processing)? If it does, will we have to wait for a software upgrade? (kernel?)
7) Which will be the best for battery consumption? Today I read that the Snap800 has Envelope Tracking, which should reduce the battery consumption caused by the mobile network by 30%, but I have read everywhere that the Octa core is a bit more power efficient.
Sorry for all those questions and for my English. I study economics and have just a little knowledge of these things.
It does now. And they actually created a song to show that:
CLARiiON said:
It does now. And they actually created a song to show that:
Click to expand...
Click to collapse
Thank you, so we have an answer for my 6th question, but are we sure that the video is referring to the 5420 soc and not to a future revision?
Edit: for the people who might be interested, I found a forum page with a comparison between the Adreno 330 and the Mali T628:
http://beyond3d.com/showthread.php?p=1768581 and in terms of pure performance the Adreno 330 seems to be around 25-26% better than the Mali. But both deliver more GFLOPS than the Tegra 4.
So at the moment:
Snap800:
+ Ultra HD 4K video recording
+ LTE
+ Better GPU performance (approx. 26% more)
OctaCore 5420:
+ HMP, which should lead to better battery life
+ New A15 platform
The only things left to understand are the raw power of the CPUs (4x A15 @ 1.9 GHz vs 4x Krait 400 @ 2.3 GHz) and the efficiency (Octa's HMP vs Snap's Envelope Tracking, http://gigaom.com/2013/09/05/thanks...xy-note-3s-huge-screen-wont-kill-its-battery/ which is said to save 25% of battery life! Is that believable??)
Any idea?
TMaLuST said:
Edit: for the people who might be interested, I found a forum page with a comparison between the Adreno 330 and the Mali T628:
http://beyond3d.com/showthread.php?p=1768581 and in terms of pure performance the Adreno 330 seems to be around 28-29% better than the Mali.
Click to expand...
Click to collapse
Don't look at GFLOPS numbers for 2 different architectures. It doesn't make much sense.
Just as an example, check the numbers for the Adreno 330 and the T4. The GFLOPS are quite different, but the actual GFXBench numbers are very close.
Samsung says the GPU is 2.3 times the first-gen Octa's.
At the SIGGRAPH event, they showed it's 2x the N10's numbers in a prototype. It's not just the fps; the overall demo looks much cleaner. So I expect the number to be close to 2.5x-3x once it's fully optimized. You'll notice something similar happened with the T4 in the Shield: over time, it has gotten much better results.
Another very nice demo of the T628 GPU:
TMaLuST said:
Hello guys,
I am a newbie and I would love it if someone could technically compare these two SoCs.
I have already read the topics in the Galaxy S4 section, but they are not up to date.
I would like to hear from expert users what are the differences performance-wise.
I also have some questions:
1) Which version is the Octacore soc in the Note 3? 5420?
2) In practice I know that if I choose the Snap800 version I get LTE and 4k video recording; but what are the advantages of the Octa Core soc?
I know that the new Octa Core is 15-20% more powerful than the 5410, but so is the snap800 vs snap600. What is the best in raw performance between the two socs?
3) How does the new Adreno 330 compare against the new Mali T628 in terms of pure performance?
4) I read in the S4 forum that the Exynos 5410 is affected by a bug at the Corelink which caused some slow downs in the switch between A7 and A15 cores. Is the Exynos 5420 bugfree?
5) Which cpu is more future proof? I mean, aren't the A15 cores newer than the Snap's Krait 400?
6) Do we actually know if the 5420 supports HMP (Heterogeneous Multi-Processing)? If it does, will we have to wait for a software upgrade? (kernel?)
7) Which will be the best for battery consumption? Today I read that the Snap800 has Envelope Tracking, which should reduce the battery consumption caused by the mobile network by 30%, but I have read everywhere that the Octa core is a bit more power efficient.
Sorry for all those questions and for my English. I study economics and have just a little knowledge of these things.
Click to expand...
Click to collapse
1) Exynos 5420
4) Exynos 5410 had its CCI disabled-a hardware limitation, not a bug. The CCI is enabled in 5420
6) It does support all the 3 processing types. Latest linux kernel has the support.
The rest of your questions (3,5,7) need real life testing and user reaction and professional reviews as the two variants will finally be out.
BTW, the youtube video demonstrated another new variant- Exynos Octa-Pella which most probably will be called Exynos 5440 and will come later this year.
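On a related note: if you end up with the Exynos model and its kernel exposes its config (not all builds do), a rough way to check whether the big.LITTLE MP / HMP options are actually enabled is to grep the running config. CONFIG_SCHED_HMP is the switch used by the Linaro big.LITTLE MP patch set, assuming Samsung's kernel is based on it:
Code:
# dump the running kernel's config and look for the big.LITTLE / HMP scheduler options
adb shell 'zcat /proc/config.gz | grep -i -E "SCHED_HMP|BIG_LITTLE"'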
thekoRngear said:
1) Exynos 5420
4) Exynos 5410 had its CCI disabled-a hardware limitation, not a bug. The CCI is enabled in 5420
6) It does support all the 3 processing types. Latest linux kernel has the support.
The rest of your questions (3,5,7) need real life testing and user reaction and professional reviews as the two variants will finally be out.
BTW, the youtube video demonstrated another new variant- Exynos Octa-Pella which most probably will be called Exynos 5440 and will come later this year.
Click to expand...
Click to collapse
Thank you very much. I learn a lot everyday here on Xda!
If you want a more future-proof device, i'd go with an obvious (for me, at least) choice for the S800. You'll get the drivers and more dev support. Samsung doesn't release all the drivers - they never did and i don't think that's going to change for their Exynos chipsets.
Not to mention the fact that if you like CM/AOSP stuff, you better stick with the Snapdragon model
Edit: adding link:
http://forum.xda-developers.com/showpost.php?p=31873214&postcount=7773
.
electronical said:
If you want a more future-proof device, i'd go with an obvious (for me, at least) choice for the S800. You'll get the drivers and more dev support. Samsung doesn't release all the drivers - they never did and i don't think that's going to change for their Exynos chipsets.
Click to expand...
Click to collapse
Really ?
And I thought GPU drivers got updated on my s3(Exynos) :wink:
Sent from my GT-I9300 using Tapatalk HD
qualcomn for better development on ASAP based roms
Sent from my GT-I9505 using Tapatalk 2
Going with the s800.
Purely for the fact that it will receive better dev support.
Interesting to see how they both compare against each other.
If they both had 4g it would be a different story - What is samsungs excuse for the lack of 4g? Surely the 5420 supports it
I'd go for the 5420 for the HMP and the Wolfson DAC
Sent from my iPhone 5 using Tapatalk
hisname said:
qualcomn for better development on ASAP based roms
Sent from my GT-I9505 using Tapatalk 2
Click to expand...
Click to collapse
ASAP? Or did you mean AOSP?
TMaLuST said:
Snap800:
+ Ultra HD 4K video recording
+ LTE
+ Better GPU performance (approx. 26% more)
OctaCore 5420:
+ HMP, which should lead to better battery life
+ New A15 platform
Click to expand...
Click to collapse
Wait, is this true? Only the Snap800 will have LTE support? That doesn't seem logical? Isn't the Exy5420 supposed to be a super modern SOC?
batna.antab said:
I'd go for the 5420 for the HMP and the Wolfson DAC
Click to expand...
Click to collapse
Damn...the Wolfson dac.....serious point !
Is the Krait dac on the Snapdragon that much worse ?
.
LTE has been around for quite some time already... I wonder why the Samsung processor doesn't support it...
Edit: Never mind
.
I really want to know the performance of Exynos version before I buy it (because Snapdragon version is not available in my country)
Is there any lag in Exynos version?
Riimez said:
Going with the s800.
Purely for the fact that it will receive better dev support.
Interesting to see how they both compare against each other.
If they both had 4g it would be a different story - What is samsungs excuse for the lack of 4g? Surely the 5420 supports it
Click to expand...
Click to collapse
Regarding better dev support for the S800: a person on this site in the comments section (most likely spam, but I'm unsure, not being fully immersed in the tech world) claims the opposite - poor support from Qualcomm and better support from Samsung.
It seems I can't post the link because I am a new member. Can anyone confirm this?
Also, on a side note, what process are the A15 and the new Snapdragon 800 built on, e.g. a 22 nm or 28 nm process?
Preliminary benchmarks!
Some preliminary benchmarks from the Exynos Octa version.
