Pets vs People events

AI Question?

Does anyone know how the AI works to determine pets vs. people?
Is it the size of the heat source?

I ask because we get deer in the front yard overnight, which I would like to try to differentiate from cats and raccoons.


Believe it or not, there is a lot of worldwide debate on the subject, and the general consensus is that there isn’t a single person in the world who really knows how AI works to determine anything. :sweat_smile: We feed it training data, tell it what the difference is with different samples, and it figures out its own patterns to guess how we decided that. But the underlying code is so complex that nobody has ever been able to fully explain why it makes any particular decision.
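To make that "it figures out its own patterns" idea concrete, here's a toy sketch in Python. The heat-source sizes, labels, and the single-threshold "model" are all invented for illustration; this is not how Wyze's AI actually works, just the general flavor of learning a rule from labeled samples instead of hand-coding it:

```python
# Toy sketch: feed in labeled samples, and the "model" picks its own
# size cutoff. All sizes/labels below are made up for illustration.

# (heat-source size in arbitrary units, label) pairs
samples = [(2, "cat"), (3, "cat"), (4, "raccoon"), (5, "raccoon"),
           (9, "deer"), (11, "deer"), (12, "deer")]

def learn_threshold(samples, positive="deer"):
    """Scan candidate cutoffs and keep the one that best separates
    'deer' from everything else -- the pattern is learned, not coded."""
    best_cut, best_correct = None, -1
    for cut in range(1, 14):
        correct = sum((size >= cut) == (label == positive)
                      for size, label in samples)
        if correct > best_correct:
            best_cut, best_correct = cut, correct
    return best_cut

cut = learn_threshold(samples)
print(cut)                                 # the cutoff the "model" chose itself
print("deer" if 10 >= cut else "not deer") # classify a new size-10 heat source
```

Nobody typed "deer are bigger than 6 units" anywhere; the rule fell out of the samples, and with different samples it would land somewhere else. That's the tiny version of the opacity problem.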

Some classic research examples:

  • ImageNet Errors: A study found that the ImageNet dataset, which is widely used to train AI models, contained numerous labeling errors. For instance, a mushroom was labeled as a spoon, and a frog was labeled as a cat.
  • Single Pixel Confusion: Research highlighted that changing just one pixel in an image could lead to misclassification by AI. In some cases, the errors were significant, such as a stealth bomber being labeled as a dog. Humans can’t be tricked that easily with a single pixel change. In fact, humans wouldn’t even notice there was a difference.

There have also been instances where training makes connections it shouldn’t. I’ve seen examples of a certain color of car triggering a face detection for someone because most of their previous face-detection sample data included them wearing a shirt of that color. The model assumed that color was related to their face and reported the car as that person, even though no person was in the image at all.
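That shirt-color story can be sketched in a few lines of Python. Everything here is invented for illustration: a naive "learner" scores two candidate rules against the training photos, and because the red shirt correlates with the person better than the (sometimes hidden) face does, it latches onto the color:

```python
# Toy sketch of a spurious correlation. All data is invented:
# (contains_face, dominant_color, label) for each training photo.
train = [
    (True,  "red",  "person"),
    (True,  "red",  "person"),
    (False, "red",  "person"),   # face turned away, shirt still red
    (False, "gray", "empty"),
    (False, "blue", "empty"),
]

# two candidate rules the learner could pick
rules = {
    "has_face": lambda face, color: face,
    "is_red":   lambda face, color: color == "red",
}

def best_rule(train, rules):
    """Keep whichever single rule explains the most training labels."""
    def score(name):
        rule = rules[name]
        return sum(rule(f, c) == (label == "person") for f, c, label in train)
    return max(rules, key=score)

chosen = best_rule(train, rules)
print(chosen)                    # "is_red" -- the color rule scores higher

red_car = (False, "red")         # a red car, no person anywhere
print(rules[chosen](*red_car))   # True -> reported as that person
```

The color rule fits the training data better than the face rule, so the learner picks it, and a red car with nobody in frame comes back as a "person."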

This complexity is a concept known as the “black box” nature of AI. Due to the complexity of machine learning models, especially deep learning, it can be challenging to interpret exactly how these models arrive at their decisions. The intricate network of connections and weights within models like neural networks processes information in ways that are not always transparent, even to the developers of the AI.

Experts in the field are working on making AI decision-making more interpretable and better aligned with the practical objectives it serves.
This is part of a broader effort in the field of AI known as Explainable AI (XAI), which aims to create more transparent models. So it may be possible in the future, just not currently.

For instance, the Smart “Predict, then Optimize” (SPO) framework is an approach that focuses on minimizing decision error rather than prediction error, which helps align machine learning models more closely with practical objectives.
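A rough numeric illustration of the SPO idea in Python (the travel times are invented): what matters is the quality of the decision a prediction induces, not the raw prediction accuracy. Below, one model has a smaller average error but flips the ranking between two routes, so it produces a worse decision than a model with bigger errors that gets the ranking right:

```python
# Toy illustration of "Predict, then Optimize": judge a model by the
# decision its predictions induce, not by raw accuracy. Invented numbers.

true_times = {"route_A": 10.0, "route_B": 11.0}   # A is truly faster

pred_1 = {"route_A": 10.6, "route_B": 10.5}  # tiny error, ranking flipped
pred_2 = {"route_A": 12.0, "route_B": 14.0}  # bigger error, ranking right

def mae(pred):
    """Prediction error: mean absolute error vs. the true times."""
    return sum(abs(pred[r] - true_times[r]) for r in true_times) / len(true_times)

def decision_cost(pred):
    """Decision error: the time you actually incur by trusting the model."""
    choice = min(pred, key=pred.get)
    return true_times[choice]

print(mae(pred_1), decision_cost(pred_1))   # small MAE, picks the slow route
print(mae(pred_2), decision_cost(pred_2))   # bigger MAE, picks the fast route
```

Minimizing prediction error would prefer the first model; minimizing decision error prefers the second, which is the alignment SPO is after.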

Other techniques, such as feature importance analysis and model-agnostic methods, are also being developed to shed light on the reasoning behind AI decisions.
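As one concrete flavor of that, here's a Python sketch of permutation feature importance, a model-agnostic probe: break the pairing between one feature and the labels, and see how much accuracy drops. The tiny "model" and data are invented; real implementations shuffle the column randomly, while this sketch reverses it so the result is deterministic:

```python
# Sketch of permutation feature importance, a model-agnostic probe.
# rows: ((size, hour_of_day), label) -- only size actually matters here.
data = ([((2, h), "pet") for h in range(10)]
        + [((9, h), "deer") for h in range(10)])

def model(features):
    size, hour = features
    return "deer" if size >= 5 else "pet"   # ignores hour entirely

def accuracy(rows):
    return sum(model(f) == label for f, label in rows) / len(rows)

def permutation_importance(rows, feature_index):
    # real implementations shuffle the column randomly; reversing it is a
    # deterministic stand-in that still breaks the feature/label pairing
    col = [f[feature_index] for f, _ in rows][::-1]
    shuffled = []
    for k, (f, label) in enumerate(rows):
        f = list(f)
        f[feature_index] = col[k]
        shuffled.append((tuple(f), label))
    return accuracy(rows) - accuracy(shuffled)

print(permutation_importance(data, 0))  # 1.0: size is what the model uses
print(permutation_importance(data, 1))  # 0.0: hour never mattered
```

The appeal is that this treats the model as a black box: you never look inside it, you just watch what happens when you scramble each input.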

But currently? No. Nobody truly knows. We know we can feed it new training data and new weights, but we don’t ultimately know all the reasons it decided what it did. We can make some reasonable deductions and inferences based on detection outcomes and proper testing though.


But to answer your general question more directly, Wyze has people submit all animals as “pets” so the training data is likely to list all of the above as pets right now with no differentiation.


Except probably not deer, as I don’t expect too many deer are pets!!! :grinning:

Deer absolutely show up as pets. It’s actually the only reason my mom keeps paying for Cam Plus on her Wyze Doorbell, so she can get pet notifications when a deer passes by their house.

She got Cam Plus included for one year with her doorbell purchase a while ago (the VDB had a Christmas special for something like $15 with a year of Cam Plus), so they used it. Then it alerted her to deer as pet detections the few times deer passed by her house, and she got all excited. So when the year ended, she and my dad decided to keep Cam Plus SOLELY because it sends them a pet notification when deer pass by their house a few times per year. :joy: Seriously, they were not going to pay for Cam Plus at all; otherwise they’d totally get rid of it, because all they really need is for the VDB to call them when someone presses the button.

Kind of funny, but there it is. @WyzeMatt, are you reading the above? People LOVE being alerted to cool wildlife, not just domesticated pets. If you want to increase your subscriptions, consider that. I can tell you my parents don’t care about any of the other detections or Cam Plus benefits. They pay you solely so they can know when deer pass by their house. :joy: You are getting paid $20/yr just for the 2-3 notifications per year that tell my mom when she had a deer by her house.


We have talked about doing a totally separate subscription just to identify different types of birds.



And deer!! :grinning:

WyzeMatt (Wyze Team), June 26:
> We have talked about doing a totally separate subscription just to identify different types of birds.


So for those introduced to the AI Video Search Beta… the future roadmap is to turn literally anything into a notification/granular detection. This is expensive, but can be done. The question really is… how much will customers pay for it.


That’s a cool idea, allowing custom notifications. I’m curious how you’d implement it. I started making a list of all the objects in its recognition vocabulary, and it’s a lot, so you’d have to do something like let a person type words into a notification field rather than face checkboxes for all the possibilities.
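A rough Python sketch of that type-words-instead-of-checkboxes idea (the label names and keywords are invented examples, not Wyze's actual detection vocabulary): match the user's typed keywords against whatever labels the AI reported for a clip, and alert only on the overlap.

```python
# Sketch: match a user's typed notification keywords against the labels
# the AI reported for a clip. Labels and keywords are invented examples.

def matching_alerts(detected_labels, subscriptions):
    """Return which subscribed keywords appear in this clip's detections."""
    detected = {label.lower() for label in detected_labels}
    return sorted(word for word in subscriptions if word.lower() in detected)

clip_labels = ["deer", "tree", "driveway"]   # what the AI saw in one clip
my_keywords = ["Deer", "raccoon", "cardinal"]  # what the user typed

print(matching_alerts(clip_labels, my_keywords))  # ['Deer'] -> send one alert
```

A free-text field like this scales to however many objects the recognizer knows, and the matching stays trivial on the backend; the hard part remains the recognition itself.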

Would this not be included with one of the new subscription tiers showing up in the FAQs or on the website, including the one that hasn’t launched yet? Or would you make it separate?
