Is Wyze using AI now to reply to support tickets?

Yup. Thankfully they always quickly realize what information they didn’t give and send it.

Can’t say the same for the general population…

I think log follow-up is another important benefit that a lot of users seem to overlook. I have a better understanding of this after reading one of your recent posts, and I followed that back to last year’s AMA that you quoted. I really appreciate your taking the time to do this, and I think it’s an important reminder for the many users who complain on the Forum, “I sent a log but never heard back,” or who just post the log number in a Forum topic.

Well…uh…did you actually create a Support ticket like the system told you to immediately after you submitted the log? :roll_eyes:

:hole:

It is possible that Wyze AI is misidentifying us when we are on hold on the support phone line. The AI could be identifying us as a “Package” or “Pet” instead of the “Person” we are.

Sorry, I could not resist the humor.

My cats tell me those are both more accurate identifications for how they see me.

Unless the call has video, then maybe Wyze AI Sound Detection is identifying crying…or possibly meowing in @carverofchoice’s case. Perhaps later in the call it’s identifying breaking glass and gunshots, the sounds of extreme frustration. :weary:

I worked on a few call center apps and they can:

  • keep notes on individual customers as in “never has a problem, only wants to chat”
  • keep track of how many times a particular phone number calls in and identify it as a “trouble caller”.

Not saying anybody here would be considered a troublemaker, but there may be some truth in the AI identification idea. :wink:
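
In case anyone’s curious what that looks like under the hood, here’s a rough toy sketch (the field names, the cutoff, and the flagging rule are all invented for illustration; this isn’t any real call-center product):

```python
# Toy sketch of the per-caller records a call-center app might keep.
# Field names and the "trouble caller" cutoff are made up for illustration.
from collections import defaultdict

caller_notes = {}               # phone number -> agent's free-form note
call_counts = defaultdict(int)  # phone number -> how many times they've called

def log_call(phone, note=""):
    """Record a call, keep any note, and flag frequent callers."""
    call_counts[phone] += 1
    if note:
        caller_notes[phone] = note
    return "trouble caller" if call_counts[phone] >= 5 else "ok"

print(log_call("555-0100", "never has a problem, only wants to chat"))
```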

@Crease @IL1

I believe (with some evidence) that one’s long-on-hold grumblings are monitored and affect hold-times. Algorithms love this kind of stuff.

…in other news

Nurses say be careful when advocating for a loved one in a hospital. You must assert, not aggress. Otherwise you’re noted in the chart as argumentative (one of the few things in nurses’ notes doctors will actually read).

Source: nursessay.com

Yeah, one of the nurses told me so when I was at the hospital. She told me my note says “easy-going”, which was good to hear. So, all that data entry they’re doing isn’t just for medical info.

I was interested in reading that.

WOMP

This hasn’t been my experience, but it’s possible I’m atypical… :thinking:

Nurses lie. A lot. Like…a lot.

I’m totally kidding! I’ve worked with some great nurses, and medical charts can sometimes be unexpectedly amusing reads. Semi-tangentially-related to the original topic, I imagine AI is gonna suck all the humor (and humanity) right outta health care documentation someday.

That’s generous. :wink:

I like to bag on doctors. The best doctors I ever had had both been nurses first. And, at least in my town, all the paramedics/firemen were starting quarterbacks in HS and never lost the attitude. One of the nursedoctors backed me up on that. True, only one nursedoctor, but still…

…and @IL1 is way easy-going, for sure…

Saw Mr Elon say recently that he’s developing a new AI to compete with the dominant two. His primary focus: TRUTH… and the best sense of humor.

‘If we are gonna die at least we should die laughing,’ is what he actually said.

Key indicators of a bot or an agent that’s useless in helping solve your problem:

  • “How are you today?”
  • “First of all, I want to express my gratitude for …”
  • “I do apologize for the inconvenience …”

This is exactly the same thing that level 1 “support” does at just about every tech company. Level 1 support is contracted out, usually to Asia, and the “support” agents are provided a book of answers to cut and paste to customers, indexed by keywords. This is a function that could easily be handled as well or better by AI, but it is usually done by people because people cost less than setting up and maintaining a functional AI (so long as you contract the support out to countries with low wages).
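
If I had to guess at the mechanics, it’s basically a keyword lookup into a table of canned replies. A toy sketch of the idea (the keywords and canned text are entirely made up, not Wyze’s or any real vendor’s actual script):

```python
# Toy keyword-indexed "book of answers," as described above.
# Keywords and canned responses are invented for illustration only.
CANNED_ANSWERS = {
    "cache": "Please clear your cache and reboot, then try again.",
    "browser": "Please try another browser.",
    "offline": "Please power-cycle the device and wait two minutes.",
}

def level1_reply(ticket_text):
    """Return the first canned answer whose keyword appears in the ticket."""
    text = ticket_text.lower()
    for keyword, answer in CANNED_ANSWERS.items():
        if keyword in text:
            return f"I do apologize for the inconvenience. {answer}"
    return "First of all, I want to express my gratitude for contacting us."

print(level1_reply("My camera keeps going offline"))
```

Swap the dictionary for a language model and you get the same answers with better grammar, which brings us right back to the guessing game below.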

We have absolutely reached the point where dealing with “support” at any tech company is like an inverted Turing test – trying to guess if the support is smart enough to be AI or is just human.

If I am told to “clear your cache and reboot” or “try another browser” one more time, I will scream (no, actually I did scream, and yes it was at Wyze “support” about a week ago).

I’m pretty sure Wyze is using humans for “support” because I have occasionally found one with enough real intelligence to put aside the script after my response to the first page, and one who actually provided some useful information. On the other hand, sometimes they give you that “let me have 3-5 minutes to look into this” and then just disconnect.

Depending on who you happen to get, Wyze “support” can range from bad to worse, but unfortunately it is about average for the tech industry.

Sad to hear. It is a sign of the times.

I sadly find the forums a little useless lately. When you bring up an issue, you just get a bunch of people telling you the stuff is garbage, plus a bunch of random suggestions.

Or a bunch of people saying “I never had that problem” or “I never had any problem,” followed by a paragraph or two explaining that anything with the Wyze name on it is always perfect and that the only problems are either the user doing something wrong or the user expecting the product to do things that supposedly “Wyze never claimed it would do.”

I’ve found you really need to dig through the wasteland to find a few gems :gem:.
