Megapixels don’t matter

Edwin Chabuka

In 2012 Nokia released the 808 PureView and the headlines were all about its massive 41MP camera. Everyone else at the time was still hovering between 8MP and 16MP, so 41 was A LOT of megapixels.

And for a whole three years after it was released, it was simply the best camera in a phone. An absolute beast. But things have changed since then. Huawei is up to 50MP, Samsung and Xiaomi are at 108MP, and there are rumours of a 200MP camera in the works.

Yet companies like Apple and Google, who have the very best cameras in the industry, are still using 12MP cameras. Which raises the question: if a 12MP camera is doing the job as well as, or even better than, a camera with 100+ megapixels, what is the point of making a big deal about megapixels? You can watch the video below for the visuals.

So, just a disclaimer: I’ll be going into the weeds a bit on the technical bits behind what makes a good camera, but I promise I’ll try my best to explain it as simply and clearly as possible.

Let’s start with what makes up an image. Think of an image as a page in a maths exercise book. Don’t worry, we won’t be doing any maths right now. Each box on this page is equivalent to a pixel, and we can colour each box differently until the collection of all these boxes makes up an image.

This is where the word megapixel comes from. Mega is a prefix meaning one million, so 1 megapixel is one million of these boxes on one page. In maths terms, it’s the area that the image covers.
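If you want to see that maths, here is a quick back-of-the-envelope sketch in Python. The 4000 x 3000 resolution is just a made-up example of a typical 12MP photo, not any particular phone’s output:

```python
# Back-of-the-envelope megapixel maths: a 4000 x 3000 photo
# (an illustrative 4:3 resolution, not any specific camera).
width, height = 4000, 3000

pixels = width * height              # total "boxes" on the page
megapixels = pixels / 1_000_000      # mega = one million

print(f"{pixels} pixels = {megapixels:.0f} MP")  # 12000000 pixels = 12 MP
```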

Now that we all know what a pixel is and what a megapixel means, let’s go a bit deeper. How does a pixel work? How do pixels produce the exact colour at that part of the image? I remember in primary school we were taught that the primary colours are red, green and blue. What I did not know was why.

Later on I understood that red, green and blue are called primary colours (the primary colours of light) because they are the building blocks for every other colour. Mixing different intensities of these 3 primary colours will make any shade of any colour.

The same is true for cameras. Each pixel is split into 3, creating the colour at that part of the image by mixing different amounts of red, green and blue. But this is just the surface of it, so we are going to go even deeper into how the sensor works.
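As a rough sketch of what that mixing looks like in code: each pixel is just three numbers, one each for red, green and blue (0 to 255 in the common 8-bit case). The specific colours below are only examples:

```python
# Each pixel is just three numbers: how much red, green and blue to mix.
# Values run from 0 (none) to 255 (full strength) in the usual 8-bit range.
pixels = {
    "pure red":  (255, 0, 0),
    "yellow":    (255, 255, 0),    # red + green light mix to yellow
    "white":     (255, 255, 255),  # all three at full strength
    "dark teal": (0, 100, 100),    # dimmer green + blue
}

for name, (r, g, b) in pixels.items():
    print(f"{name:>9}: R={r:3d} G={g:3d} B={b:3d}")
```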

The camera sensor in a phone is usually a CMOS sensor, and in its pure form it can only see images in greyscale. Basically, the sensor sees things in thousands or even millions of shades of grey, and not just the 50 that you see in movies.

For the picture to be turned into colour, there is a colour filter put on top of the sensor called a Bayer filter. It makes each pixel record only red, green or blue light, and those readings are then combined to work out the full colour at each point of the image. Think of it as a filter laid over the picture, a bit like them Snapchat and Insta filters, except this one is a physical layer sitting on the chip.

This filter and the CMOS sensor need to be carefully made so that they work together to recreate the best possible image. The filter needs to map colours as accurately as possible and also allow as much light as possible through it, so the CMOS sensor gets the clearest and most detailed image of whatever it’s looking at.

Think of the CMOS sensor as your eye and the Bayer filter as some sunglasses. Good sunglasses are ones that let your eyes see clearly even when you are facing the direction of the sun. Bad ones make everything look hazy or cloudy when the sun shines on them.
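If you want to see the idea in code, here is a toy Python sketch with made-up numbers. A real camera interpolates the missing colours far more cleverly than this; the point is only that the sensor records one brightness value per pixel and the Bayer pattern decides which colour that value belongs to:

```python
# Toy sketch of a Bayer colour filter array (not a real demosaicing algorithm).
# The raw sensor only records brightness; the filter decides which colour each
# pixel "sees", in a repeating 2x2 RGGB pattern.
import numpy as np

raw = np.array([            # made-up brightness values straight off the sensor
    [120, 200, 118, 203],   # R  G  R  G
    [190,  60, 188,  62],   # G  B  G  B
    [121, 201, 119, 204],   # R  G  R  G
    [191,  61, 189,  63],   # G  B  G  B
], dtype=float)

def toy_demosaic(block):
    """Crudely rebuild one colour from a 2x2 RGGB block: take the R and B
    readings directly and average the two G readings."""
    r = block[0, 0]
    g = (block[0, 1] + block[1, 0]) / 2
    b = block[1, 1]
    return r, g, b

for i in range(0, 4, 2):
    for j in range(0, 4, 2):
        r, g, b = toy_demosaic(raw[i:i+2, j:j+2])
        print(f"block at ({i},{j}) -> R={r:.0f} G={g:.0f} B={b:.0f}")
```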

The last piece of the puzzle, which completes the picture… get it… picture? Anyway, it is the ISP, or Image Signal Processor. It is a chip dedicated to collecting the information coming off the CMOS sensor through the Bayer filter and converting it into an image that you can then see on the screen of your phone, tablet, laptop, TV, etc.

It’s also where camera engineers tune the look of the images the sensors capture. For example, the Samsung Galaxy S20 Ultra and the Xiaomi Redmi Note 10 Pro use the same 108MP sensor, but the ISPs in the Samsung and the Xiaomi are different, so the images these cameras capture will look different.
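Here is a tiny, made-up example of what that “tuning” means in practice: the same raw readings pushed through two hypothetical tone curves come out looking different. Real ISP tuning involves far more than a single curve, but the principle is the same:

```python
import numpy as np

# The same raw sensor reading (a small grey gradient) processed with two
# hypothetical tuning profiles. Identical sensor data ends up looking
# different once each maker's ISP applies its own "look".
raw = np.linspace(0.1, 0.9, 5)   # stand-in for raw brightness off the sensor

def tone_curve(x, gamma, boost):
    """A toy tone-mapping step: gamma sets contrast, boost lifts brightness."""
    return np.clip((x ** gamma) * boost, 0, 1)

profile_a = tone_curve(raw, gamma=0.8, boost=1.10)   # punchier, brighter tuning
profile_b = tone_curve(raw, gamma=1.1, boost=0.95)   # flatter, more muted tuning

print("raw      :", np.round(raw, 2))
print("profile A:", np.round(profile_a, 2))
print("profile B:", np.round(profile_b, 2))
```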

The ISP can be made so powerful that the properties of the CMOS sensor, like its physical size or the number of megapixels, stop being the deciding factor. That is how Google was able to use the same 12MP sensor for 3 years, across 3 different iterations of their Pixel smartphone, and still take some of the best images of any phone, day and night.

They had a super powerful ISP that extracted a lot more performance out of average hardware, producing spectacular images that were as good as, or better than, those from cameras with over 4 times the number of pixels and, in some cases, over twice the physical size.

Side note: a physically bigger CMOS sensor means more area to capture more light. This means you can be a bit lazier with the ISP and lean on great hardware for great photos. More light generally means better pics and, these days, better low-light performance.

So you can have a phone with a 40, 64, 108 or even 200MP camera, but if the quality of the CMOS sensor, Bayer filter and ISP is not up to scratch, all you’ll have is a hot mess with plenty of pixels.

The best way to show you why huge megapixel counts are now pointless is that any camera from 40MP and above hardly ever takes photos at its full resolution in auto mode. There is a technology called pixel binning where multiple pixels are combined into one.

So, for example, the Huawei P30 Pro I have will take 10MP photos in auto mode even though the main camera is 40MP. It is grouping 4 pixels on the main sensor into 1, so that each pixel in this 10MP photo carries 4 times the light information it would if I took the pic at the full-fat 40MP.
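Here is a minimal sketch of what that 4-to-1 binning does, using a tiny made-up 4x4 grid in place of a real 40MP sensor (real sensors do this in hardware before you ever see the image):

```python
import numpy as np

# 4-to-1 pixel binning: every 2x2 block of sensor pixels is combined into
# one output pixel. The numbers are made up purely for illustration.
sensor = np.arange(16, dtype=float).reshape(4, 4)   # pretend 4x4 sensor crop

h, w = sensor.shape
binned = sensor.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))  # add up each 2x2 block

print(sensor)   # 16 individual pixel readings
print(binned)   # 4 binned pixels, each carrying the light of 4 originals
```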

It’s a compromise manufacturers are now making, where they put fewer resources into the image processing in favour of the hardware, because the hardware is easier to work with than the software.

So your 48MP camera is taking 12MP photos almost all the time, your 64MP camera is taking 16MP photos, and so forth. And because companies like Apple and Google prefer letting the ISP do most of the heavy lifting, they will start with average hardware spec-wise compared to competing brands but go full nerd with the image processing software to make their photos amazing. In other words, they don’t fuss about having the most megapixels but still take photos that are as good as or better than everyone else’s.

Next time you go buying a phone for its camera, try taking a couple of photos with it and see how it does. Don’t just buy it because it has the most megapixels; those numbers don’t mean much these days.

14 comments

  1. Technomancer

    Very misleading title as usual. Megapixels do matter, especially when it comes to viewing images on high resolution displays or when printing. While 12MP might be suitable for some use cases, it does not fit every use case.

    12 megapixels is about the equivalent resolution of a 4K display. If you were to crop an image taken on a 12MP camera, this would result in the introduction of “noise” into the photograph when displayed on a 4K display. That noise would be even more evident when we upscale that image to an 8K display.

    The same applies for print. A 12MP image would result in an 8″ x 12″ print at 300PPI. Once cropped an image taken from that camera would result in a print with a resolution of less than 300PPI.

    1. Laywoman

      I’d rather buy a real 5 megapixel device than a fake 100 megapixel one.

      1. Tim Apple

        What is a “fake 100 megapixel one”? If megapixels didn’t matter…if it didn’t matter to Apple, to Samsung, to any of the smartphone makers & manufacturers, then they’d have all along been talking about AI, advanced algorithms, and enhanced lenses on 2MP cameras a decade ago. Oh no, you say? You want a clearer image with more detail? That’s why things moved up to 5MP, 8MP, and 12MP, and why Samsung is, and eventually Apple too will be, binning higher MP cameras for higher quality images and to have enough performance capability to record high quality, stabilized video at high frame rates.

  2. Imi vanhu musadaro

    Many bad analogies and incorrect explanations.

    Anyway, one thing that has been overlooked is that hardware is fixed, if you have a 12 MP sensor that’s it, whilst if you have a 20 MP sensor that too doesn’t change. The difference is that the manufacturer can issue updates to the underlying camera firmware, but you can never increase the sensor MPs through an update. I’d rather go with higher megapixels, though that isn’t the only decision factor.

    Megapixels matter even more on professional cameras, such as DSLRs, where you may want to work with raw images. Oddly, the discussion has been restricted to mobile phone cameras.

    1. Laywoman

      I don’t find it odd that the discussion has been restricted to mobile phones. Most of Zimbabwe’s population has a (cheap Chinese) mobile phone that claims to have 100+ megapixels but only a few rich can afford a professional camera.

      1. Imi vanhu musadaro

        The discussion isn’t about poverty or device affordability, it’s about camera megapixels.

  3. Isaac Machakata

    An insult to photographers & video editors 😂. Great post and comments 👌🏾

  4. The Nerd

    Does the article take into account higher pixel count in A3 or A2 posters? Pixels do matter!

    1. Laywoman

      Pixels do matter but 5 real megapixels are better than 100 fake megapixels.

      1. Vincent

        I think your argument is premised on fake phones which have fake megapixels. How about Samsung and Huawei megapixels, did you account for them? Are those fake too, even though they are above 50+ MP? Mind you, those companies market their phones based on camera resolution and MPs. Take that away from their business model (if they are proved fake) and they have nothing.

  5. Sagitarr

    An otherwise decent article on explaining pixels for the entry-level “mobile phone” community was ruined by massive generalisations. Pixels are used in photography which is a very wide field, they may not be the only, or most important aspect in smartphones. This is different in other applications. In general, it is difficult to manage what you can’t measure, which is why we have units of measurement. If we lose this aspect in product design what then becomes the basis for comparison or development? A kilobyte is 2^10=1024 bytes and a megabyte is a kilobyte squared i.e. 1024 x 1024 = 1048576 bytes. In both cases kilo and mega are approximations, kilo for 1000 and mega for 1000000 as used in the metric not binary sense.
    “Because the hardware is easier to work with than software.” – this is highly subjective. Admittedly, software development is borrowing design principles from hardware manufacturing processes because hardware is more reliable. Also, there are far more pretenders in the software development arena than in mechanical or electronic/electrical engineering, for instance. Reliable software is now being used in hardware design! I’m old school, having finished my primary school in 1977, but at that time our Rhodesian teachers had fully illustrated to us that primary colours, when mixed, give the rest of the secondary colours. Red and Blue = Magenta/Purple; Green and Blue = Cyan; Green and Red = Yellow. We were taught Venn diagrams that showed this and in practice used crayons so that the lessons sank in… and they did.

  6. Anonymously

    Judging by the comments section, I guess it may be wise for TechZim to once in a while consult one or a few experts in the field to get some unbiased perspective. It seems many of these articles contain generalised opinions.

    When it comes though to Apple vs the others in terms of cameras and stuff, in my non-expert opinion, lol, I’d opine that perhaps Apple makes products, in particular phones, that may be targeted at the general consumer, whereas the other guys may be making products just in case, just in case that is, you may need to use your phone for other stuff.

    Though performance wise, Apple does the most with the least, thumbs up to them, they’re doing great actually.

  7. Rapid Onset Obsolescence

    “Computational Photography” is the buzzword that neatly sums up Google’s and Apple’s secret photography sauce. Some of the underlying concepts were touched on here, but if you are a shutterbug that wants to learn more about this, google that term to ‘go deeper Papa’.

    On the issue of megapixels for large format prints, it still matters, but AI-driven post processing is rapidly closing the gap. I was lucky enough to get to try out an online service’s open beta a few years ago and it was obscenely capable by the time it became fully commercial. I managed to scale SOME images to the point of being exhibition worthy, with the filled-in detail standing up to close inspection. There is actually quite an interesting and growing ecosystem of online AI imaging services that can scale, enhance, isolate subjects, repair, generate images… AI is coming for my job but I have to admit the progress is fascinating!

  8. Lucky

    Senior, the primary colours are actually red, yellow and blue
