Hey Wyze: Waving trees are not people

I’m getting tired of waving trees being mis-recognized as people. I always go into the event and give the appropriate feedback, too.

Come on man.

FYI, most of the time the waving trees are not what’s actually being detected as a person. They are the event trigger, which causes the AI to search the rest of the frame, and often it’s something else in the frame that gets falsely detected as an AI object: a mailbox, a fire hydrant, a front yard lamp/post, maybe a chair, and several other things.

That’s not to say those things shouldn’t be improved too, but in the short term, if you figure out what is being falsely identified, there are sometimes things you can do to resolve it, such as blocking out part of the object or altering it slightly so it stops being identified that way. For example, I have had a chair get identified as a person by multiple brands’ cameras, not just Wyze, but when I tilted the chair slightly or put something on it, the false detection suddenly went away for all brands’ cameras.

Granted, I have the benefit of having multiple cameras that will tell me exactly which object is being identified as what, including one of my Wyze cameras. A dev [publicly] offered in an AMA to enable detection identification on one of my V3 cameras, so it highlights in purple the object the AI is actually detecting. Everybody else only gets the green motion box, which does not show what is being detected as an AI object, only what is having pixel changes, and that may be something totally separate. I have verified with my specially enabled V3 camera that sometimes the green-highlighted movement has nothing to do with the actual AI object being detected.

At least that’s how the AI currently works. Wyze has stated that they are working on updating their AI system so that it will only show us AI objects when they have movement and will ignore stationary objects. So hopefully this will mostly get resolved sometime soon. I’m hoping they launch it at the same time as the new special subscription they’ve been working on all year, but who knows.

For now, if you’d like some help from people who have a lot of experience identifying the likely culprit, you are welcome to post an example, either a video or a screenshot, of an event where a false person detection occurred. Many of us have gotten pretty good at figuring out what is likely being misidentified by the AI and giving advice to numerous people about potential options for resolving it. This is one really useful part of the forums: user-to-user feedback like this. Lots of us have been able to figure out and help people with similar things.

But don’t feel pressured at all. Being concerned about privacy is 100% okay and understandable :+1: That’s why I tried to describe some common examples of what to look for, in case you’d like to do some experimentation yourself, maybe blocking out a certain area or trying other kinds of things.

Either way, Wyze has stated that they have been working on and testing a new AI model that should resolve a lot of these false detection issues; they just haven’t launched it publicly yet. Well, they sort of launched it for the Floodlight Pro, in a way, but that might not be the exact model they’ll use across the board.


These days, when I get repeatedly mis-categorized AI events, I submit video feedback and then hit the camera’s Reset Services button.

Somehow it seems to help with the issue. Not sure why…


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.