Use industry standard terminology for compression settings (not SD/HD)

When it comes to codecs, the word codec means compression. MPEG-2, MPEG-4 Part 10 (H.264) and H.265 are all compression-based codecs. Codec efficiency means that for a given bitrate, one codec produces fewer compression artifacts than another. The magic of a codec is a collection of clever rules that describe a frame, plus rules that describe how the next frame differs from the previous one. The algorithms are amazing, and each successive codec has more of them built in, which is why they keep getting better.
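
To make the "frame plus differences" idea concrete, here is a toy sketch in Python. It is not how any real codec works (real codecs add motion estimation, transforms and entropy coding); it only illustrates the keyframe-plus-delta concept:

```python
# Toy illustration (not a real codec): a "keyframe" stores the full frame,
# and each following "delta frame" stores only the pixels that changed.

def encode(frames):
    keyframe = frames[0]                      # full description of the first frame
    deltas = []
    for prev, cur in zip(frames, frames[1:]):
        # record only (index, new_value) pairs where the pixel changed
        deltas.append([(i, c) for i, (p, c) in enumerate(zip(prev, cur)) if p != c])
    return keyframe, deltas

def decode(keyframe, deltas):
    frames = [list(keyframe)]
    for delta in deltas:
        frame = list(frames[-1])
        for i, value in delta:
            frame[i] = value
        frames.append(frame)
    return frames

# A mostly static scene compresses well: only a few pixels change per frame.
video = [[0, 0, 0, 0], [0, 9, 0, 0], [0, 9, 9, 0]]
key, deltas = encode(video)
assert decode(key, deltas) == video
```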

Codec development, storage media development and display resolution development loosely go hand in hand. That's why DVD used MPEG-2 at 480p for the display standard of its day, Blu-ray used H.264 at 1080p for the display standard of its day, and 4K Blu-ray used H.265 at 2160p for the display standard of its day. Codec efficiency has improved over time. For DVD, Blu-ray and 4K Blu-ray, the codecs were tuned to the maximum bitrate the storage media could hold, in order to ensure there were no compression artifacts, i.e. visible loss of image quality caused by the codec's compression.
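
As a rough illustration of how storage capacity caps the bitrate, here is a back-of-the-envelope calculation. The disc capacities are approximate nominal figures and the two-hour runtime is just an assumption; real discs also carry audio, menus and overhead, and each format's spec sets its own maximum bitrate:

```python
# Average video bitrate a disc can sustain is roughly capacity / runtime.

def avg_bitrate_mbps(capacity_gb, runtime_hours):
    bits = capacity_gb * 1e9 * 8          # decimal gigabytes to bits
    seconds = runtime_hours * 3600
    return bits / seconds / 1e6           # megabits per second

for name, capacity_gb in [("DVD", 4.7), ("Blu-ray", 25), ("4K Blu-ray", 66)]:
    print(f"{name:>11}: ~{avg_bitrate_mbps(capacity_gb, 2):.0f} Mbps average for a 2 h movie")
```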

The point is that the term 'compression' does not mean image quality. Compression is a codec selection decision and that alone. Wyze cameras all use H.264 compression.

For the various resolutions and for a given codec, bitrate tuning is the key. This is what determines how visible compression artifacts are (image quality). Too low a bitrate and the image quality suffers. For example, at 15 FPS with H.264 at 150 Kbps, the image quality (compression artifacts) of 360p vs 1080p is devastatingly different. That's because a 1080p image has 9 times more pixels than a 360p image, so the bitrate needs to be higher for 1080p. You could argue that 1080p at 9 x 150 = 1,350 Kbps and 15 FPS would be an equivalently tuned bitrate to 360p at 150 Kbps and 15 FPS. Selection of resolution is generally driven by the capabilities of the display. Case in point: YouTube. When viewing full screen, you wouldn't select 4K if your display is 1080p; there is no point. Similarly, when viewing in a small window, there is no point choosing 1080p on a 1080p display.
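
To show where the 9x and 1,350 Kbps figures come from, here is a quick calculation (the pixel dimensions are the standard 640x360 and 1920x1080; scaling bitrate linearly with pixel count is only a rule of thumb, since real codecs don't scale perfectly linearly):

```python
# Pixel counts drive the bitrate comparison between 360p and 1080p.

def pixels(width, height):
    return width * height

p360  = pixels(640, 360)       # 230,400 pixels
p1080 = pixels(1920, 1080)     # 2,073,600 pixels

ratio = p1080 / p360           # 9.0
equivalent_kbps = 150 * ratio  # 1,350 Kbps at the same 15 FPS

print(ratio, equivalent_kbps)
```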

With IP cameras, the bitrate of the stream is relevant for a few things:

  • required bandwidth for viewing the live or recorded stream
  • required storage for continuous recording to storage media (a rough calculation follows this list)
  • compression artifacts (perceived image quality)
  • display size when viewed
  • connection between devices as it pertains to maximum supported bandwidth
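
As a sketch of how bitrate translates into both bandwidth and storage, using the 150 Kbps and 1,350 Kbps figures from above (simple arithmetic only; a real camera adds audio and protocol overhead on top):

```python
# A camera streaming continuously at a given bitrate needs that much
# bandwidth to view live, and fills storage at the same rate around the clock.

def gb_per_day(bitrate_kbps):
    bits_per_day = bitrate_kbps * 1000 * 86_400   # seconds in a day
    return bits_per_day / 8 / 1e9                 # bits -> bytes -> GB

for kbps in (150, 1350):
    print(f"{kbps:>5} Kbps ≈ {gb_per_day(kbps):.1f} GB per day of continuous recording")
```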

Note that bitrate and bandwidth use the same unit of measure, bits per second.

I did a post that has an analysis of this.

Wyze have ignored industry standard terminology with their use of SD and HD, and this alone is the basis for so much confusion on this topic. My ask of the Wyze team is that they update their terminology to align with industry standards.

JJWatMyself post: Bytes, bits, bitrates, FPS, codecs (compression) and Samsung PRO Endurance microSD cards

Display resolutions explained

Codecs explained
