Picture atop this section - Xperia Z5 Compact General

Now, I may be splitting hairs here, but the picture of the white device at the top of the Z5C section isn't a Z5C!


Color transition render problem?

Hi everyone,
I just got my Nexus 7 (C80) and was wondering why the color transition is sometimes so bad. I attached two images which show the "issue": a screenshot and a photo of the Quadrant benchmark.
In the photo you can see the bad color transition at the red balls (marked with a yellow circle); there is clearly a bad transition from red to black visible.
In the second image, which is a screenshot taken in Android, you can see how it should look (again marked with a yellow circle); here you can see a nice color transition.
Since the photo shows how my eyes see it (it's not the camera I took the picture with), I was wondering if it is the display and not the GPU or CPU?
Has anyone noticed the same?
Thanks,
ICEMANno1
ICEMANno1 said:
I just got my Nexus 7 (C80) and was wondering why the color transition is sometimes so bad? ...
Are you referring to the discrete transitions of shading on the red balls? To me, that would indicate that the display/GPU doesn't have a sufficient number of bits. Kind of like the early Windows graphics cards with 16k colors instead of 16M.
mmarquis said:
Are you referring to the discrete transitions of shading on the red balls? ...
Yes, exactly! It's like when you boot Windows into safe mode and it disables the GPU driver, and you get those bad transitions.
It's just kind of strange, since the photo shows what the display actually outputs, while the screenshot shows what the GPU sends to the display. At least that's what I guess.
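To see what too few bits per channel looks like in isolation, here is a minimal sketch, assuming NumPy and Pillow are installed, that renders a smooth red-to-black gradient and then quantizes it; the quantized version shows the same stepped transitions described above. The 4-bit depth is just an illustrative choice, not what the Nexus 7 pipeline actually uses.

```python
# Minimal sketch: simulate the "bad transition" by quantizing a smooth
# 8-bit red-to-black gradient down to fewer bits per channel, the way a
# low-bit-depth display/GPU path would. (Assumes NumPy and Pillow.)
import numpy as np
from PIL import Image

def make_gradient(width=512, height=128):
    # Smooth 8-bit gradient from black to pure red.
    ramp = np.linspace(0, 255, width, dtype=np.uint8)
    img = np.zeros((height, width, 3), dtype=np.uint8)
    img[..., 0] = ramp  # red channel only
    return img

def quantize(img, bits):
    # Keep only the top `bits` bits of each channel, discarding the rest.
    step = 256 // (2 ** bits)
    return (img // step) * step

smooth = make_gradient()
banded = quantize(smooth, bits=4)  # 4 bits/channel -> clearly visible steps

Image.fromarray(smooth).save("gradient_8bit.png")
Image.fromarray(banded).save("gradient_4bit.png")
```

Comparing the two saved images side by side shows exactly the effect in the Quadrant photo: the 8-bit version blends smoothly, while the 4-bit version breaks into discrete bands.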

What colour, Black or White, and why?

As per the title:
Our country only offers the S6 and S6 Edge in black and white.
I'll be picking mine up tomorrow.
Which colour would be better, and why?
Thanks!
From what I saw in reviews and comments, white is less prone to fingerprints and smudges, or at least you can't see them as much.
Black, because I think it looks more "professional", and it looks better in the dark, as you will see all screen and no bezel.
You don't really need to start a new poll for this.
http://forum.xda-developers.com/galaxy-s6-edge/general/s6-edge-colors-availability-t3044109
http://forum.xda-developers.com/galaxy-s6/help/galaxy-s6-black-white-t3067430
http://forum.xda-developers.com/galaxy-s6/help/samsung-galaxy-s6-color-t3043646
http://forum.xda-developers.com/galaxy-s6/help/black-sapphire-versus-traditional-black-t3071324
Mod Edit
There is already a thread on this:
http://forum.xda-developers.com/galaxy-s6/help/samsung-galaxy-s6-color-t3043646
Thread closed
malybru
Forum Moderator

Screen banding

It's come to my attention that some people may not have tinted displays, but do have horizontal screen banding.
This is not normal, and no, it is not a property of AMOLED screens. It is not image retention. It is a defect. It is especially visible on grey backgrounds.
You can check for yourself by downloading "Dead Pixel Test" by Ossibus Software from the Play Store. When viewing the darker shade of grey, if your screen has banding, it'll be clearly visible as one or more horizontal lines.
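If you'd rather check without installing anything, here is a minimal sketch, assuming Pillow is installed, that generates the same kind of flat grey frames to view full-screen; the resolution and the particular grey levels are my own choices, not taken from the app.

```python
# Minimal sketch: generate flat grey test frames to view full-screen and
# inspect for horizontal banding. Darker greys tend to show banding best.
# (Resolution is the Nexus 6P's 1440x2560 panel in portrait; the shade
# values are illustrative choices, not taken from the Dead Pixel Test app.)
from PIL import Image

RESOLUTION = (1440, 2560)

for level in (32, 64, 96, 128):
    Image.new("RGB", RESOLUTION, (level, level, level)).save(f"grey_{level}.png")
```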
Has anyone experienced anything similar?
*Raises hand!
HMkX2 said:
You want to see banding on a Nexus 6P? I'll show you banding.
Both are Nexus 6P. Guess which one is going back for replacement.
For some reason it shows up better the farther away you are... in person it is fairly obvious.
Here is my post from the "pink hue" thread, which is most appropriate here. It shows an example of what to look for.
(Ignore the diagonal lines. Those are scanline artifacts of a CMOS camera sensor.)
In the thumbnails, from a distance, or in person, it is far more evident, especially on movies and images with gradients. The human eye is remarkably sensitive to contrast.
You shouldn't have to hold a phone 5" or closer to your face to avoid noticing abnormalities. If you have to, it's a defect. That's my reasoning, anyway.
Also, this is a "minor" defect, but a very noticeable one.
It's like a watermark or bright spot on the display, and can be very distracting during daily use.
It draws your eye's attention to one region of the screen, regardless of the content there. (Text, etc.; the lower third, in this case, gets the focus.)
This criticism is especially true because this is an "ultra flagship" device.
(I didn't spend $900 on my first car, let alone my first phone.)
From all these threads I'm seeing about the screen, it really looks like Samsung sent Huawei the defective panels that were headed for the garbage.

Nexus 6P: Camera Issues (Hot Pixels?)

I'm having issues with my brand new Nexus 6P. While taking HDR+ photos under low-light conditions, or using the flash with HDR+, I see a red dot in the right corner (vertical) or the very top center (horizontal) in every single photo while it is processing the image. After processing is done, the dot is filtered out by the software, but I don't think this is normal, since the dot is always in the same place. I have a few sample photos of how it looks in low light, but I'm not allowed to post links here yet.
The red dot does not appear when HDR+ is switched off, or when there's plenty of light when taking the photo.
I'm guessing nobody else has this issue, or I'm worried about nothing. It's just peculiar how the red spot is always in the exact same place while processing. That got me wondering whether there is something wrong with it.
I have not had this issue on the Nexus 6P. However, I've had this issue on a Canon digital camera before, although there the dot appeared in all my photos!
What I can tell you is that stuck/hot pixels are common. In fact, almost all camera sensors are produced with some number of bad pixels (by accident, of course; it's just part of manufacturing tiny things), but the makers simply modify the firmware to edit the pixels out. This only works so long as there aren't a ton of bad pixels and no new bad pixels arise after the camera ships. In the case of my Canon camera, I had to send it back under warranty for them to modify the firmware and map out the bad pixel. Later, when I got more bad pixels out of warranty, I sideloaded custom software on the camera to map out the new ones myself.
Now, what may be happening in your case is that the Nexus 6P firmware is designed to map out bad pixels on the fly. Does the dot appear in any final photos, or just in the processing preview? Either way, I would suggest contacting customer support immediately, as your phone may require a replacement to prevent it from getting any worse!
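For the curious, here is a minimal sketch of the bad-pixel mapping idea described above, assuming NumPy: find outliers in a dark frame, then patch them with the median of their neighbours. It illustrates the general technique only, not the 6P's actual firmware; the function names are my own.

```python
# Minimal sketch of bad-pixel mapping: locate hot/stuck pixels using a
# dark frame, then patch them with the median of the 3x3 neighbourhood.
# (Illustrative only; not the Nexus 6P's real firmware logic.)
import numpy as np

def find_hot_pixels(dark_frame, threshold=50):
    # In a frame shot with the lens covered, anything well above black
    # is a stuck/hot pixel candidate. Returns (y, x) coordinates.
    return np.argwhere(dark_frame > threshold)

def patch_pixels(image, coords):
    patched = image.copy()
    h, w = image.shape[:2]
    for y, x in coords:
        y0, y1 = max(y - 1, 0), min(y + 2, h)
        x0, x1 = max(x - 1, 0), min(x + 2, w)
        # The 3x3 median is robust to the single outlier at (y, x).
        patched[y, x] = np.median(image[y0:y1, x0:x1])
    return patched

# Usage (with `dark` and `photo` as 2D greyscale uint8 arrays):
#   hot = find_hot_pixels(dark)
#   clean = patch_pixels(photo, hot)
```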
Hannes084 said:
I'm having issues with my brand new Nexus 6P. While taking HDR+ photos under low-light conditions, or using the flash with HDR+, I see a red dot ...
Check to see if there is any bleed on the screen. I had something vaguely similar on my S3, where I had some black dots visible only in certain conditions; there is a screen test app which keeps the screen on while it's blacked out, and at certain angles you can see the blotches.
Apologies if this isn't the case...
If not, take pics of clear and bright images, e.g. a white piece of paper or anything light and clear, and take shots with different effects turned on to see if you can capture the speck.
If it does indeed appear to be a hardware fault, the only fix is going to be a return or a hardware repair.

Camera Aspect Ratios and MP

Why is the 16:9 ratio locked to 12MP? When the main camera is 16MP and the 4:3 aspect ratio can go up to 16MP, why can't 16:9? And could somebody possibly develop a mod to allow 16:9 to use the full capability of the camera? I would try, but I don't know how. Just a question I thought I'd ask; thanks in advance for any replies.
Joe199799 said:
Why is the 16:9 ratio locked to 12MP? ...
16:9 just isn't a thing for 16MP on any camera, really. I'd try to explain it, but I'm bad at explaining, so I'll summarize (or at least give how I like to think of it; someone can correct me). Think of a megapixel as a large portion of a photo. When you have 12, it's basically 4 megapixels across and 3 megapixels tall, making 12. 16 just adds 4 more to the top or bottom row of the photo. So it's not a limitation of the phone, just of cameras in general.
Thisisabadname said:
16:9 just isn't a thing for 16MP on any camera, really. ...
The 16:9 (12MP) toggle comes enabled on the phone out of the box (at least I think it does; if not, the creator of the stock ROM I'm using might have switched it, but it's not mentioned anywhere else in his thread), and 4:3 (16MP) won't let a picture fill the screen completely. I know MP isn't everything when it comes to a camera, but it's just odd that a 16MP main sensor only uses 12 of 16 out of the box. It's possible I'm not comprehending this; all the last phones I've had (GS6 and LG G5) have had camera mods that gave them 60FPS recording, improvements in general, and 100% JPG quality. I appreciate your answer, and as I've said, I might just not be grasping the reasoning behind it. I was digging around the stock camera app, came upon the setting, and thought I'd ask people who know more about this stuff than me. Thank you.
It's a hardware sensor limitation in the 16MP camera on the LG V20. While you could get an app on the market that takes 16:9 pictures on the V20 by stretching the 4:3 picture to fit the ratio, I would just accept what we have at the moment, considering the alternative is kind of harmful to image fidelity.
EDIT: Yes, there is an option to switch to 16:9 with the drop to 12MP/6MP, depending on the options you choose (just looked, thanks to the post Joe199799 made just before mine).
This "degradation" in Megapixel count is actually fairly normal and happens in most cameras. My previous phone had a 13MP camera but recorded 9.7MP 16:9 shots. Here's a detailed explanation for why this happens:
If you think about how a camera works, light passes through the lens to hit the sensor. Making this lens round is the easiest option to make the image projected onto the sensor as close to reality without any distortion. Making the lens another shape would distort the image, ab bit like using the wide angle camera does, which is difficult to correct in post.
The lens projects a circular image, which you want to record as much of as possible. A circular sensor could capture everything, but circular content on rectangular screens doesn't make much sense.
The next shape that takes up the most area within a circle is a square. But at the time digital image sensors were developed, computer screens had adopted ratios of 4:3 or 3:2, slight deviations from the square.
As a result, image sensors were also built in 4:3 ratios, which sacrifice some, but not too much of the entire projected image. Now, with screen ratios favouring more width, we could adapt our sensors, however, to capture images of comparable quality, lenses would have to become larger or sensor electronics would need to shrink. Neither option is particularly preferable, which is why the 4:3 ratio has stuck around and most cameras recording 16:9 images simply crop out part of a 4:3 image. Because it arguably makes sense, megapixel count refers to the entire area of the image sensor. By cropping the image, the megapixel count will obviously decrease as a certain amount of pixels is deleted.
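To put numbers on that crop, here is a minimal sketch assuming the common 16MP 4:3 resolution of 4608x3456 (the phone's actual sensor resolution may differ slightly): keeping the full width but cropping to 16:9 lands at roughly 12MP, matching the toggle discussed above.

```python
# Minimal sketch of the 4:3 -> 16:9 crop arithmetic. Assumes a 4608x3456
# sensor (a typical 16MP 4:3 resolution; the exact sensor may differ).
width, height_4_3 = 4608, 3456

height_16_9 = width * 9 // 16            # 2592 rows survive the crop
mp_4_3 = width * height_4_3 / 1e6        # ~15.9 MP, marketed as 16MP
mp_16_9 = width * height_16_9 / 1e6      # ~11.9 MP, marketed as 12MP

print(f"4:3 -> {mp_4_3:.1f} MP, 16:9 -> {mp_16_9:.1f} MP")
```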
Alexsp32 said:
This "degradation" in megapixel count is actually fairly normal and happens in most cameras. ...
Thank you for the explanation; it's very thorough and fleshed out. I think I understand now.
