AFAIK, the current Cam OG/OG-T use the Realtek Ameba IoT RTL8735bm SoC with built-in 802.11n WLAN and BLE.
Based on the documents, 802.11ax is supported on this new potential variant, which may point to a new SoC. It might be a newer Realtek Ameba IoT SoC that is 802.11ax capable. Or could they be switching to the Ingenic T31 (used in the Cam V3), or the T41 with the Altobeam ATBM6062 WiFi chip (used in the Cam V4)?
One concerning detail is that the FCCID label PDF file shows a photo of a Cam OG without a microSD slot, but that could be a dummy mockup for the label placement reference only.
Interesting, maybe the old SoC is just going end-of-life or becoming more expensive (as older tech often does once it’s no longer widely used).
It’s funny that the current one has BLE - completely useless/unnecessary in the cam but probably was just what was available.
I’d be very surprised if they didn’t leave the SD card in it. But perhaps it will be aimed at truly being a “Cam Plus Only” cam and exceptionally cheap to incentivize that.
I saw that a new FCC report had been filed for an OG variant, but on the 2 sites I watch for those, it told me the files were restricted until a later date. Interesting that you found one showing some of the restricted files.
As you know, it’s not uncommon for Wyze to continue to create new variants with different SoCs, WiFi modules, or image sensors. We know the V2 cam has at least 2 variants, the V3 has at least 3 variants, the Indoor plug has at least 3 variants, and the standard color bulb has at least 2 variants, even though all variants are still generally referred to by the same name. It doesn’t surprise me that they would make changes to their entry-level camera over time to keep it competitive as one of the most affordable options on the market while offering higher-end features.
That is weird that the label picture shows a setup button with no SD card slot, but I would be shocked if it turns out not to have SD card support. That model in particular targets people who are trying to SAVE as much money as possible and are less likely to get a subscription. Of all the cameras, it will almost definitely have to have SD card support. We’ll have to see when more of the details are public. I tried to check the official FCC gov site but it kept timing out. Usually they will have “External Pictures” of the correct updated version on there, and those are often not restricted.
BLE is Bluetooth Low Energy - designed for low-power data exchange over Bluetooth connections (LE Audio, a newer addition built on top of it, sends high-quality audio at lower power draw, though that part is only compatible with newer phones/devices). Given that setup takes a couple minutes, it is sort of pointless to have here, so I’m guessing it isn’t used at all. It just happened to be on the SoC they picked.
But that aside, yes I would have expected the OG to have to use QR code, it is surprising that the OG has Bluetooth for setup when the Panv3 and other more expensive ones require QR. My guess is the SOC that was available and cheapest at the time just had it, so they used it. I just thought it was funny that it lists BLE support when that is totally useless here.
I can tell you why everyone is using BLE for provisioning now. It’s because the Connectivity Standards Alliance (CSA) has chosen BLE as the standard for Matter device provisioning. After provisioning, most devices are supposed to then operate over Wi-Fi, Thread, or Ethernet. Though it does sound like they may be considering expanding to allow Z-Wave as a standard as well. I’m not totally certain if that will fly, but Z-Wave has been making concessions lately so they don’t get left behind.
Regardless, BLE will be used by everyone from now on because all the Big companies have declared it as the new standard for setting anything up securely and all efforts are being poured into BLE provisioning.
They give a lot of rationales for why they chose BLE over other options, including standard Bluetooth, but ultimately the rationales don’t matter much. All that matters is that all the major players agreed this must be the standard, so that’s what almost everyone is going to do with newer devices if they want to leave open the option for compatibility and certification at some point.
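For anyone curious what that BLE provisioning step actually looks like on the wire: a Matter device in commissioning mode just broadcasts a tiny service-data payload that a phone scans for before the secure session even starts. Here’s a rough sketch of decoding one in Python; the 0xFFF6 service UUID and the exact field offsets are my reading of public spec summaries, so treat the layout as an assumption rather than gospel:

```python
import struct

MATTER_SERVICE_UUID = 0xFFF6  # 16-bit UUID Matter advertises under (assumption)

def parse_commissionable_adv(payload: bytes) -> dict:
    """Parse an 8-byte Matter-style commissionable-node service data blob.

    Layout as I understand it (verify against the real spec before relying on it):
      byte 0    : opcode, 0x00 = commissionable node
      bytes 1-2 : 12-bit discriminator + 4-bit version, little endian
      bytes 3-4 : Vendor ID, little endian
      bytes 5-6 : Product ID, little endian
      byte 7    : additional-data flag
    """
    if len(payload) < 8:
        raise ValueError("service data too short")
    opcode, disc_ver, vid, pid, extra = struct.unpack("<BHHHB", payload[:8])
    return {
        "commissionable": opcode == 0x00,
        "discriminator": disc_ver & 0x0FFF,   # low 12 bits, matched against the QR/pin code
        "version": disc_ver >> 12,            # high 4 bits
        "vendor_id": vid,
        "product_id": pid,
        "has_additional_data": bool(extra & 0x01),
    }

# Build a fake advertisement for a device with discriminator 0xF00 (3840)
fake = struct.pack("<BHHHB", 0x00, 0x0F00, 0x1234, 0x5678, 0x00)
print(parse_commissionable_adv(fake))
```

The discriminator is the interesting bit: it’s how the phone tells “the device I just scanned the code for” apart from every other commissionable gadget shouting nearby, which is a big part of why they wanted a dedicated low-overhead radio for this step.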
Just seems silly for something you’re going to use for a few minutes once (maybe a few times) then will remain disabled.
But like I said, I’m guessing that whatever is popular and desirable is on the cheapest bulk products so it is more there as coincidence than any actual need. I haven’t tried setting up an OG or v4 with my really old phone (which does not support BLE), I’m guessing since we haven’t seen a ton of complaints that people can’t set up these cams with older phones that it likely doesn’t rely on BLE at all.
My 4 year old “new phone” just barely supports it, I think it was the first or second Pixel with it, and it wasn’t enabled until Android 14 rolled out.
Yeah, I get that too. My understanding is that it has lower complexity, reduced power usage (especially for battery devices), and easier implementation with a simpler connection overall. I’m no expert, but from what I saw in an article on the CSA rationales, they claimed BLE was also preferred precisely because provisioning needs no audio streaming - just simple, secure data exchange, which BLE does more efficiently than regular Bluetooth, without the unnecessary overhead. They also claim BLE has been standard in screened devices for a long time now - it was introduced in 2009 and has been included in basically everything since the mid-2010s - and since phones are considered insecure within a few short years (no more security patches after that), almost every device people still use should have BLE by now.
I mean, I do still have a functioning Palm Centro from 2008 that doesn’t have BLE (just Bluetooth 1.2), but it’s pretty rare for anyone with a still-working smartphone or tablet not to have BLE. I’d say you have to go back at least 7 or more years to find something without it, and it would be rare even for devices that old.
Still, I’m a little surprised that they made a “standard” out of something that requires devices to have a SEPARATE radio/controller just for provisioning that is never used again. We’re seriously paying something like $0.50 extra on all smart devices now just to use BLE once for setup. It’s like a $0.50 activation fee nobody knows about. I would think if the device will use WiFi, they would just do provisioning over WiFi, like some smart bulbs used to: you tell your phone to connect to the smart bulb’s own WiFi network, then hand it your internet credentials. No need for Bluetooth. But I will admit that provisioning over Bluetooth is a lot simpler and smoother than provisioning through WiFi, so I think it is a good idea for all the less tech-savvy people out there.
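That old smart-bulb “SoftAP” flow is simple enough to sketch: the device stands up its own access point with a tiny HTTP endpoint, the phone joins that AP, and POSTs the home Wi-Fi credentials to it. Here’s a toy in-process illustration in Python; the `/provision` path and the JSON shape are made up for the example, not any vendor’s real API:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = {}  # the credentials the "bulb" ends up holding

class ProvisionHandler(BaseHTTPRequestHandler):
    """Toy stand-in for the HTTP endpoint a SoftAP bulb exposes on its own network."""
    def do_POST(self):
        if self.path != "/provision":          # made-up path, not a real vendor API
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        received.update(json.loads(self.rfile.read(length)))  # device would now join this SSID
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):              # silence per-request logging
        pass

# The "bulb": listen on a random free localhost port in a background thread.
server = HTTPServer(("127.0.0.1", 0), ProvisionHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "phone", already joined to the bulb's AP, hands over home Wi-Fi credentials.
port = server.server_address[1]
body = json.dumps({"ssid": "HomeWiFi", "psk": "hunter2"}).encode()
req = urllib.request.Request(f"http://127.0.0.1:{port}/provision", data=body,
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    print(resp.read())  # b'ok'

server.shutdown()
print(received["ssid"])  # HomeWiFi
```

The catch with this approach in real life is exactly why BLE won: the phone has to drop off its own network to join the device’s AP (which OSes increasingly fight you on), and the credentials ride over an open, unauthenticated hotspot unless the vendor bolts on its own crypto.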
BLE is a great thing (and it can have audio if they put LE Audio support in the chip). Just seems like whatever battery it might use during setup (after which bluetooth is disabled until you hold the setup button again) is negligible. I honestly don’t think this was any sort of intentional inclusion by Wyze, but who knows. When Pixel added full BLE with audio, my family member with bluetooth hearing aids went from 10 hour battery life to 18 (on the aids, not the phone), simply because the phone to hearing aid connection wasn’t constantly active anymore, only when needed. The one disadvantage is she can’t just say “hey google” anymore when the phone isn’t near, she has to tap the aid twice to activate the connection, but a small price to pay.
Bluetooth, second only to HDMI, is somewhat run on the Mafia or Union model. You do what we say, or else. HDMI is the perfect example of when you allow commercial companies to set standards (Thunderbolt too).
Sadly USB is even sort of like that, even though the guy that invented it released the technology free of charge with no patent rights. But at least to USB’s credit, they generally maintain full backwards compatibility and just add more throughput. Although USB-C/Thunderbolt/PD is starting to change that with the varying capabilities that you have to study the icons to figure out (and even then it isn’t always clear).
I hate that so much. I can’t tell you how many times people tell me things like their second monitor is broken when it turns out they just don’t understand the difference between the types of USB-C/Thunderbolt Cords and were using the wrong “USB-C” cord instead of a higher quality Thunderbolt cord. But I admit it’s tricked me a couple of times too. I’ve started wrapping a special label around the higher quality cords that are needed to run and power certain devices because they won’t function with the other kinds of USB-C.
It’s been a recent frustration with certain picky devices that really need the higher throughput and LOOK like they’ll work with any USB-C cord, when they won’t. I’ve been trying to explain this to people, but it seems hard for some of them to grasp that USB-C cords are not all the same even though they all LOOK the same to them.
I do agree with this:
I think it’s mostly that they are just trying to move forward with the new standard.
The problem is USB-C is no longer a universal protocol, just a universal connector (which is what they should have called it). It may be video only, it may be data only, it may be power only, or it may be any combination of those 3. Different data rates, different power rates too.
And in someone’s infinite wisdom they’ve got one lightning bolt icon to signify PD, and a similar one to signify Thunderbolt. And of course the cables often don’t have those icons on them.
At least USB 3 ports are color coded to differentiate from USB 2 ports. But with C it is just a guess mostly.
You’ve just explained the bane of my sanity recently and why I have reverted to labeling cords so people stop getting so confused about what works with what even though they all look the same and “fit.”