Cam V3 Automation To Turn ON Garage Lights

I’ll get a few out this evening. :+1:


Something else I noticed when reading through this and looking at @cara.rejas’s excellent visuals is that even the display of the “Turn on for” Action is buggy: there’s no space between “10” and “minutes”. I suspect these screenshots were taken on iOS, because on Android, when I try to recreate something like this, I see “10 minutes”, but I’ve seen other iOS users post screenshots that look more like the “10minutes” seen above.

I don’t know if that helps narrow the scope of the bug search for @WyzeDesmond or anyone else addressing this, but I haven’t seen it mentioned elsewhere, so I wanted to point it out.

:pencil2: Edit: I just tested with a Device & Service Trigger Automation using a Cam Pan v3 (IF “Detects motion”) and a Bulb Color Group (DO “Turn on for 1 minutes”), and it performed as expected. I’m using Android only, and I see that @cara.rejas’s profile notes “Both iOS & Android” (as does @Seapup’s), so I really think iOS might be the common denominator here.

It fails under both because it’s server-related (the automation is executed in the cloud). “X time” is unpredictable: sometimes the action occurs as intended, sometimes it doesn’t. Keep triggering your test automation. It will fail, unless Wyze recently fixed it. I won’t have time to retest for at least 4 hours.


I’ll keep doing some more testing. I still think the iOS thing (the missing space) is weird, though. I don’t know if that’s related to this specific issue in any way, but if nothing else it looks a little sloppy. :man_shrugging:

The missing space is an iOS app UI typo.

I don’t disagree with that and am willing to stipulate that it may be a separate problem. I’ve continued testing here, though, in an Android-only Wyze environment, where I set up that new Automation today, and it succeeds every time so far.

If the iOS UI can have a typo, then I believe it’s also possible that there’s an error in the iOS app code that’s sending bad data to Wyze’s servers, causing this kind of Automation Action to fail. I don’t know if that’s actually the case here, and I really am speculating based on my own limited observations. I just think it might be a potential piece of the puzzle to consider.

In other words, I think it might actually matter whether an Android or iOS device was used to create or save (after an edit) such an Automation.

I tested both platforms, multiple OS versions, and multiple app versions. All failed. The iOS typo is a presentation-level typo; it doesn’t affect the automation logic passed to the server at rule creation or execution time. This issue is being reported by numerous users running iOS and Android, with different input triggers and different output devices for actions. The only common denominators are the “…for X time” action threshold and the fact that automations are stored on and executed from the server.
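As a rough mental model of what “stored on and executed from the server” implies (this is pure speculation on my part, not Wyze’s actual implementation — `send_command` and everything else here is made up), a server-side “turn on for X” action boils down to sending an on command and scheduling a delayed off command. Any bug in that scheduler would hit every client identically, which would match what we’re seeing:

```python
import threading

def run_turn_on_for(send_command, device_id, minutes):
    """Hypothetical server-side handler for a 'turn on for X minutes' action.

    send_command(device_id, state) is a stand-in for however the real
    service pushes commands to a device. The key point: the off command
    comes from a server-side timer, not from the phone app, so which app
    created the rule is irrelevant once the rule exists on the server.
    """
    send_command(device_id, "on")
    # If this timer fires early (or with the wrong duration), the device
    # turns off too soon -- regardless of which app created the rule.
    timer = threading.Timer(minutes * 60, send_command, args=(device_id, "off"))
    timer.start()
    return timer
```

Under this model, the missing-space typo in one app’s display layer couldn’t matter, because the app is out of the picture by the time the off command is due.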


I don’t doubt your tests, and I do have doubts about my own speculation, because it seems weird that this could pop up with existing Automations if it were merely an app issue (and I’m not saying it is, because obviously the Automations run from Wyze’s servers). However, I’ve been unable to replicate the failures despite trying many times today, so I’m just thinking out loud here and remarking on my own observations, hoping that leads to discovery of the problem’s root and a definitive solution. I’ll be happy to log and report my own Automation’s failure as soon as I see one, if I do.


Maybe your automations are different in some way?

If you still have them, please post screenshots. :pray:

Edit: Belay last… I need to retest to make sure issue still exists. Maybe Wyze fixed the issue sometime yesterday. I need to retest anyway and generate logs before Wyze gets back in the office tomorrow. Can’t retest for a few hours though… too many other pressing issues.


This is the one I’ve been triggering with success all day today:

At this point it’s almost annoyingly reliable. :wink:

I hear that. My new Wyze toys that arrived today will have to wait until later in the week.


I haven’t tried motion on a CPv3 as a trigger, just Cam v3, Cam v4, AI types, Motion Sensor v2, and Entry Sensor v2. But I think the key may be your time setting. Maybe it has something to do with the time expiring before the server receives the motion-clear status. Try setting the time to 5 minutes or longer (and try not to move :grin:).


Good idea. I’ll bump it up and let you know what happens. I have to walk into the next room to trigger the camera, and the lights are where my :brain: is parked right now.

:pencil2: Edit: I just tested at 5 and 8 minutes, and both passed. :white_check_mark::white_check_mark::man_shrugging:

:pencil2: Edit: I should’ve included these earlier for good measure:

That last test at 1909 was unintentionally (on my part) triggered by my :cat: “lab assistant”, who, thankfully, wasn’t moving through that room during previous testing, but I’m glad he’s part of the team.


How do you get it to NOT detect motion when the light turns off, and then trigger the light back on, in an endless loop of light?

If you still have your test setup, try setting “on for X” to 10 minutes.

I’m currently testing:

  1. CPv3 (motion) > Bulb White v2 (on for 2m) pass 1, 2, 3
  2. CPv3 (motion) > Bulb White v2 (on for 3m) pass 1, 2, 3
  3. CPv3 (motion) > Bulb White v2 (on for 5m) pass 1, 2
  4. CPv3 (motion) > Bulb White v2 (on for 10m) fail 1 (off@5m), pass 2, pass 3
  5. Cam v3 (motion) > Bulb White v2 (on for 2m) pass 1, 2
  6. Cam v3 (motion) > Plug 2022 (on for 4m) pass 1, 2
  7. Cam v3 (motion) > Plug 2022 (on for 10m) fail 1 (off@5m), pass 2, pass 3, fail 4 (off@9m), pass 5
  8. Motion Sensor v2 (motion) > Plug 2021 (on for 2m) pass 1, 2, 3, 4
  9. Motion Sensor v2 (motion) > Plug 2021 (on for 3m) pass 1, 2, 3
  10. Motion Sensor v2 (motion) > Plug 2021 (on for 5m) pass 1, 2
  11. Motion Sensor v2 (motion) > Plug 2021 (on for 10m) fail 1 (off@6m), pass 2, fail 3 (off@5m)
  12. Motion Sensor v2 (motion) > Bulb White v2 (on for 5m) pass 1, 2
  13. Motion Sensor v2 (motion) > Bulb White v2 (on for 10m) fail 1 (off@2m), fail 2 (off@7m), pass 3, pass 4, fail 5 (off@5m) Log ID: 1616750
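Tallying the run counts transcribed from the list above makes the pattern plain: every sub-10-minute run passed, while exactly half of the 10-minute runs failed. A quick sketch of that arithmetic:

```python
# (duration_minutes, passes, fails) transcribed from test cases 1-13 above.
runs = [
    (2, 3, 0),   # 1: CPv3 > Bulb White v2
    (3, 3, 0),   # 2
    (5, 2, 0),   # 3
    (10, 2, 1),  # 4: one early-off failure
    (2, 2, 0),   # 5
    (4, 2, 0),   # 6
    (10, 3, 2),  # 7
    (2, 4, 0),   # 8
    (3, 3, 0),   # 9
    (5, 2, 0),   # 10
    (10, 1, 2),  # 11
    (5, 2, 0),   # 12
    (10, 2, 3),  # 13
]

sub10_fails = sum(f for d, p, f in runs if d < 10)
ten_fails = sum(f for d, p, f in runs if d == 10)
ten_total = sum(p + f for d, p, f in runs if d == 10)
print(sub10_fails)            # 0 failures below 10 minutes
print(ten_fails, ten_total)   # 8 of 16 ten-minute runs failed
```

That 50% failure rate at 10 minutes, against zero failures at shorter durations, is consistent with a duration-threshold bug on the server rather than anything device-specific.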

FYI… half of the test cases were created under iOS, half under Android: some using the prod 2.50.x app, some using prod 3.1.x, and some using beta 3.2.x. A few more tests and I’ll start generating some log files…

@WyzeDesmond - starting log submissions. When submitting a log, there is no option for the app or Automations; only devices are available to choose from. I’m choosing the target/action device when submitting logs.

Test case #13, run #5: Motion Sensor v2 (motion) > Bulb White v2 (on for 10m).

Bulb turned off @ 5m. Log ID: 1616750


This is exactly what I experienced when doing a test for another user in a different topic:

You’re totally right about that, and I think each environment is going to be different and will require some trial and error to achieve the desired effect. I’m speculating that a stationary camera with the right combination of Detection Zone and Motion Detection Sensitivity settings might be able to ignore the pixel/light changes or that a “4-minute on” Action might work inside the 5-minute cooldown period (for users who aren’t subscribers), but I haven’t tested these things. The testing I was doing yesterday involved…
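To make my “4-minute on inside the 5-minute cooldown” speculation concrete (and I stress this is a guess about behavior, with made-up names and a 300-second cooldown assumed from the free-tier 5-minute event cooldown): if the camera can’t register a new motion event until the cooldown elapses, then a light that turns off *before* the cooldown expires can’t re-trigger the automation, which would break the endless loop:

```python
def can_retrigger(last_event_ts, now_ts, cooldown_s=300):
    """Return True if the camera's event cooldown has elapsed.

    Hypothetical model of the speculation above: a free-tier camera only
    reports a new motion event once every cooldown_s seconds. If the
    'turn on for' duration is shorter than the cooldown, the light
    switching off lands inside the cooldown window and cannot start a
    new event, so no endless on/off loop.
    """
    return (now_ts - last_event_ts) >= cooldown_s

# A 4-minute "on" action: the off transition at t=240s falls inside the
# 300 s cooldown, so the light turning off can't re-trigger the camera.
print(can_retrigger(last_event_ts=0, now_ts=240))  # False: loop broken
print(can_retrigger(last_event_ts=0, now_ts=320))  # True: next real motion can fire
```

Again, untested by me; a tuned Detection Zone or lower sensitivity might accomplish the same thing without relying on the cooldown at all.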

  1. Starting a countdown timer on my Wyze Watch 47 or Google Home Mini.
  2. Walking into the next room to trigger a motion event in front of the test camera.
  3. Returning to the first room to resume whatever I was doing before.
  4. Observing what happened to my test lights once the timer ended.
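For anyone repeating this manual procedure, here’s a tiny helper I could imagine using to judge each run; the function name, the 30-second tolerance, and the timestamps are entirely my own, nothing Wyze-specific:

```python
from datetime import datetime, timedelta

def judge_run(triggered_at, expected_minutes, observed_off_at, tolerance_s=30):
    """Compare when the light actually turned off against the automation's
    'turn on for' duration, within a small tolerance.

    Returns (verdict, drift_s), where drift_s is negative if the light
    turned off early and positive if it turned off late.
    """
    expected_off = triggered_at + timedelta(minutes=expected_minutes)
    drift = (observed_off_at - expected_off).total_seconds()
    verdict = "pass" if abs(drift) <= tolerance_s else "fail"
    return verdict, drift

# Example: a 10-minute automation whose bulb went off at the 5-minute mark.
t0 = datetime(2024, 8, 1, 19, 0, 0)
print(judge_run(t0, 10, t0 + timedelta(minutes=5)))  # ('fail', -300.0)
```

Logging the drift (not just pass/fail) might help Wyze, since the reported failures above show the off command arriving at several different early marks.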

Every test I ran yesterday passed. :man_shrugging:

I don’t know if that really answers your question, though. We might need more detail if there’s a specific problem you’re trying to solve.

I can do that. It might be later in the day. I might throw some more devices into my testing since a couple of other cameras showed up yesterday.


If you’re busy with “real” life, it’s not necessary, but I appreciate the help. :+1: I can get this to fail under a variety of conditions about 50% of the time. I’ll be able to send some logs to Wyze today, and they should be able to make headway from my results.


I still haven’t been able to make this fail, and I’ve been running 5- and 10-minute tests on my new Cam OG and Cam OG Telephoto. It’s boringly reliable.

I noticed that you amended your post to include information about your app and mobile platform versions, and I still wonder if that’s a piece of the puzzle, though I agree that I’ve also seen users who designate in their profiles that they’re using the Android app report this issue. (Still, that doesn’t preclude their iOS use; it just means they haven’t been explicit about using it, so that’s potentially a confounding variable.)

For the record, my test Automations today were created in the v2.50.9.512 app; yesterday I was using v3.1.5.569 in my tests with Cam Pan v3.

I haven’t submitted any logs, because I haven’t been able to reproduce the error, but I hope this is useful somehow.


I’ve been creating automations and testing using the same Android app versions as you, but I’ve also been testing the iOS equivalents and the latest beta versions under both iOS and Android. I haven’t found an app version under Android or iOS that doesn’t fail, because it’s not the app that is failing.

As I replied to the OP, there is no consistency. That is, there are no tell-tale trends regarding trigger device, action device, “turn on for X time” length (turn-off time), phone/tablet brand/model, operating system version, Wyze app version, or whether the automation is newly created, an old used-to-work automation, or an old used-to-work automation that’s been modified. I’m out of input trigger device types, output action device types, and phone/tablet models with which to find something that consistently works. The reason I’m not finding something that consistently works is that the issue isn’t related to any of those things. The issue is the time threshold on the server that sends the turn-off command to the target device. That is evident when one automation, whether old, new, or old-modified, produces the same behavior… inconsistent, unpredictable turn-off time.

I’m just waiting for extended family to leave so I can run a few more uninterrupted tests and generate 2-3 additional logs. I’m hoping to catch a log for a test where “turn on for” is less than 10 minutes, and another where “turn on for” is 10 minutes but the action occurs well past 10 minutes. It just takes so much time to do this stuff. And I’m not sure how app logs will even be of help to Wyze, but I am sure they’ll figure this issue out soon. :+1:

Note that even actions that are not user-configurable fail if the action is related to server turn-on-for time. For example, bulb blink also fails in the same manner, because to the server, blink is merely a fixed value of 5 seconds (the firmware performs the blink for the server-specified turn-on-for X time). That applies to all flavors of lighting products that support the blink action.
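Sketching that description as code (again, a speculative model; the function and field names are illustrative, not Wyze’s actual API): if “blink” is normalized into the same state-plus-duration form as “turn on for X”, then both funnel through the same server-side duration path, which is why a timing bug there would break blink too:

```python
BLINK_SECONDS = 5  # fixed server-side value, per the description above

def build_action(kind, user_minutes=None):
    """Normalize a user action into the single device-facing form the
    server is said to use: a state plus a duration in seconds.
    Hypothetical sketch; not a real Wyze API."""
    if kind == "blink":
        return {"state": "blink", "duration_s": BLINK_SECONDS}
    if kind == "turn_on_for":
        return {"state": "on", "duration_s": int(user_minutes * 60)}
    raise ValueError(f"unknown action kind: {kind}")

print(build_action("blink"))            # {'state': 'blink', 'duration_s': 5}
print(build_action("turn_on_for", 10))  # {'state': 'on', 'duration_s': 600}
# Both produce the same duration field -- so a bug in the server's
# duration handling shows up for blink and "turn on for X" alike.
```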

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.