Real Time Streaming Protocol (RTSP)

@WildBill - A year ago (and things change) I tried MotionEye for about half a day with some other (not Wyze) cameras. Never got good results. I mocked up a video source (Unix is still the best kit ever; RIP DMR, and thanks for your enduring legacy) and quickly determined the camera wasn’t the problem. The MotionEye software, at that time at least, had a strong hobbyist itch-scratching vibe. And that’s brilliant. Just not where I wanted to contribute my time. You might do better, if you’re willing to jump into the code.

Been running this for years, using other cameras, without a problem. So still not sure what’s creating the issue.

2 Likes

@WildBill - thanks for the info - I just installed motionEye to experiment. Very similar setup to yours, but I pulled the latest version (motioneye-0.42) and installed it manually to have control over the configuration files and settings.

I installed it on the same box that is running Zoneminder 1.32. I added 3 cameras, all using the Network Camera with the TCP option:
[screenshot]

I then went ahead and modified other settings after adding each camera, to change the resolution and frame rate:

So - I’m now connecting via RTSP to 3 of my cameras from 2 different sources (Zoneminder and motionEye are both running)…and I’m not getting any drops or issues. I am running Wyze Cam V2s with firmware 4.28.4.41. Most of my cams are running at 80%+ signal strength.

I have disabled motion detection and have no alerts set up on the cams (Motion Tagging disabled from the default). I basically tried to make them consume as few CPU cycles as possible. I am saving to a 32 GB SD card in each camera (Micro Center cheapo 32 GB cards, not the endurance cards).

My install uses motionEye 0.42 along with Motion 4.0…

2 Likes

I’m using MotionEye 0.41, which I installed and manually configured some time ago.

Camera configuration:

[screenshots]

I’ll try changing some of the parameters to see if that changes anything.

1 Like

There wasn’t much that changed from 0.41 to 0.42 that would impact performance substantially, at least for Wyze cameras.

You could try changing the parameters to 1920x1080 @ 15 fps - and make sure it’s set to TCP. The cameras use 10 fps during night vision, but when I had mine set to 15 it worked fine (or just keep it at 10).

Are you able to disable specific 802.11 modes on your access point? I personally disabled everything except “n” on the access points that manage my cameras. @UserCustomerGwen since RTSP seems so picky with bandwidth, maybe an option in the Wyze app to specify what connection speed the cameras use to connect to the AP/router would be helpful? I have a feeling that many of the dropout issues relate to interference and the 802.11 standard the cams connect with over Wi-Fi.

For reference - I had the exact same connectivity issues until I forced 802.11n on my APs…after that, it’s been solid and smooth sailing.

Also - @WildBill have you done an AP channel audit? Install NetSpot from the Android store and see if you have signal from other APs on the same channel as yours…if so, change the channel to one that doesn’t have as much interference - here’s how I laid out my APs…

2 Likes

I don’t know what would be involved with that but I’ll bring it up with the team. :slight_smile:

1 Like

@ElectroStrong another great share - you’re doing a lot of testing on the RTSP firmware. Hopefully Wyze can learn something from it to develop and improve the RTSP firmware.

1 Like

I am so glad I found this thread! I’m slowly reading through it, but I have one quick question for now. My house is an Apple environment and I want to use VLC on all platforms (Mac, iPhone, iPad, AppleTV) to view local-only streams from Wyze cameras using RTSP. I think this is all possible using the RTSP beta firmware - am I correct?

And many thanks to Wyze for supporting this community. Must be the great air up here in Seattle.

2 Likes

Yes - with the RTSP firmware loaded onto your Wyze cameras, you can easily view the RTSP feeds using VLC (Open Network Stream > rtsp://root:ismart12@[cam’s ip address]/live).
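If you want to script against that same URL - for example, to verify the stream is up or to check the resolution and frame rate the camera is actually sending - here is a minimal sketch with Python and OpenCV. The IP address and credentials are placeholders (use whatever you configured for RTSP in the Wyze app), and the environment variable is just a hint to OpenCV's FFmpeg backend to pull RTSP over TCP:

```python
# Minimal sketch: probe a Wyze RTSP stream with OpenCV.
# The IP address and credentials are placeholders.
import os
import cv2

# Ask OpenCV's FFmpeg backend to pull RTSP over TCP (more robust over Wi-Fi).
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;tcp"

URL = "rtsp://root:ismart12@192.168.1.50/live"  # placeholder address

cap = cv2.VideoCapture(URL)
if not cap.isOpened():
    raise SystemExit("Could not open RTSP stream")

# Report what the camera is actually sending.
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fps = cap.get(cv2.CAP_PROP_FPS)
print(f"{width}x{height} @ {fps:.1f} fps")

# Grab one frame and save it as a snapshot.
ok, frame = cap.read()
if ok:
    cv2.imwrite("snapshot.jpg", frame)
cap.release()
```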

Good series of tests, btw! I have also experienced the audio-pause-audio-pause pattern, especially while watching one of the cameras through the BI5 web interface. However, the recorded video from BI5 does not have this; it only happens during live viewing.

I was wondering the same about the audio part of the feed; when I tried to manually (command line) combine MP4s from the mSD card, I would get a message from ffmpeg that A-law audio is not supported in an MP4 container, so I had to use MKV. Luckily, when BI5 captures, it writes the audio as AAC in the MP4 container, so it’s easy to combine. Does μ-law or AAC encoding eat up more CPU cycles? I would imagine any encoding decisions are made so that the CPU doesn’t get bottlenecked on these cameras.

For the record, I don’t work for Blue Iris; I recently tried it (after reading a lot of good things about it) to automate creating a single daily file for the baby cam. Before, I had to do it manually from all the files on the mSD card - a huge, huge pain (and also a waste of space). Now, with the RTSP firmware, I use BI to capture only clips with motion; it combines them automatically into a single<< file, and I have a much smaller daily file (since it doesn’t capture video when there is no motion).

<< If you use BI’s native recording format, it will record/build a single daily file with all the motion clips appended in chronological order; if you use MP4, it will create several files, but that’s OK - it’s still much less work than combining 1,440 files… :slight_smile:
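If you do end up combining clips by hand (whether the per-minute files off the mSD card or the per-event MP4s from BI), ffmpeg’s concat demuxer will join them without re-encoding. A sketch driven from Python, with placeholder paths - note that if the clips still carry the camera’s A-law audio, the joined output has to be MKV (or the audio re-encoded to AAC), as mentioned above:

```python
# Sketch: join a day's worth of clips without re-encoding, using
# ffmpeg's concat demuxer. Paths are placeholders.
import subprocess
from pathlib import Path

clips = sorted(Path("clips").glob("*.mp4"))  # the individual clips, in order

# The concat demuxer reads a text file listing the inputs.
list_file = Path("concat.txt")
list_file.write_text("".join(f"file '{c.resolve()}'\n" for c in clips))

# Output to MKV because A-law (PCM) audio is not allowed in an MP4
# container; '-c copy' avoids any re-encoding.
subprocess.run(
    ["ffmpeg", "-f", "concat", "-safe", "0",
     "-i", str(list_file), "-c", "copy", "daily.mkv"],
    check=True,
)
```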

I was just viewing a file recorded by MotionEye, and the video stops 10 seconds in (15:49:40), then resumes shortly after at 22 seconds (15:49:50). I did some research, and the problem may be with my NUC. The info I found suggests the CPU can’t handle real-time encoding of the video, resulting in gaps in the recordings. I’m testing passthrough recording, which should bypass the re-encoding.

Interesting - I looked up the specs of that processor, and it should be plenty capable of decoding/encoding a single stream. Heck, I’m running all 12 (but with Zoneminder pass-thru) on an i7-2700K…a CPU introduced in Q4 of 2011. Yours was launched in Q1 of 2017…

Is this a general-purpose NUC (are you using it as a workstation as well)? When you start the stream and open the system monitor in the desktop interface, what does the CPU load look like? If you’re not using a GUI, you could use “top” to look at CPU load as well…

This is good info @teredactle! I took a deeper look at the MP4 container specification, and PCM, including A-law and μ-law, is not a supported codec for the MP4 container.

Both A-law and μ-law are very similar in terms of compute power. AAC would more than likely consume a bit more, but it should be negligible at the bitrate they are capturing. PCM is sooooo much thicker in terms of bandwidth. It would make so much more sense to stream either AAC, MP3, or even FLAC (support added in 2018) in RTSP. AAC would be my personal preference as every player and tool on earth works with H.264/AVC video and AAC audio in an MP4 container.
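To put rough numbers on the bandwidth difference, here’s a back-of-the-envelope sketch. The 8 kHz, 8-bit mono capture is an assumption about these cameras (it matches typical G.711-style A-law/μ-law audio), and the AAC figure is just a common low-bitrate setting:

```python
# Back-of-the-envelope audio bandwidth comparison.
# Assumption: 8 kHz, 8-bit, mono capture (typical G.711-style A-law/u-law).
sample_rate_hz = 8_000
bits_per_sample = 8          # companded PCM (A-law / u-law)
channels = 1

pcm_kbps = sample_rate_hz * bits_per_sample * channels / 1000
aac_kbps = 32                # a common low-bitrate AAC-LC setting

print(f"A-law / u-law PCM: {pcm_kbps:.0f} kb/s")  # 64 kb/s
print(f"AAC-LC (example):  {aac_kbps} kb/s")      # roughly half, at comparable or better quality
```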

Here’s where it gets funny. When we stream via RTSP you can see that A-law is the codec used:
[screenshot]

BUT (a big but!!) if you record a video in the Wyze app, you get the following:

As you can see - the Wyze app creates a standard compliant MP4 container, while any direct pass-thru with RTSP will result in a non-standard MP4 container because it contains PCM.
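If you want to check what you ended up with yourself, ffprobe will report the audio codec directly. A sketch; the URL is a placeholder, and for a saved file you can drop the -rtsp_transport option and pass the file path instead:

```python
# Sketch: show which audio codec an RTSP stream is carrying, using ffprobe.
# The URL is a placeholder.
import subprocess

url = "rtsp://root:ismart12@192.168.1.50/live"

subprocess.run(
    ["ffprobe", "-rtsp_transport", "tcp",
     "-select_streams", "a",                 # audio streams only
     "-show_entries", "stream=codec_name",   # e.g. pcm_alaw vs. aac
     url],
    check=True,
)
```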

This really only affects you if you use RTSP and save the stream directly. If you transcode it, you will be able to convert the audio to AAC. But if you try to pass-thru:

It’d be really, really, really nice if RTSP just streamed AAC…Wyze is doing it everywhere else…this should also be buttoned up :grin:
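In the meantime, a practical workaround when saving an RTSP capture is to copy the video untouched and re-encode only the audio to AAC so the result is a standards-compliant MP4. A sketch, with placeholder filenames:

```python
# Sketch: keep the H.264 video as-is and re-encode only the A-law audio
# to AAC so the result fits in a standards-compliant MP4 container.
# Input/output names are placeholders.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "capture.mkv",
     "-c:v", "copy",                 # pass the video through untouched
     "-c:a", "aac", "-b:a", "64k",   # re-encode just the audio
     "capture.mp4"],
    check=True,
)
```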

1 Like

I’ll have to look at the load on the system. It serves two functions: motionEye and Plex Media Server. Plex is only in use during the evening to record a few OTA shows and to watch them; the rest of the day it is idle. Problems with the cameras occur during daytime hours, and probably evenings as well.

1 Like

I’ve been doing more research, and it appears to be a problem with motionEye. When running, CPU utilization is around 60% for my 9 cameras. Here’s a screenshot of top running on the system:

Seems to be a known problem with RTSP streams.

@WildBill - That’s definitely some high CPU - here’s an example of mine with 3 cameras running:

The zm* processes are specific to Zoneminder, my primary video capture system.

As you’re on Ubuntu 18.04, you could download motionEye 0.42 and install it manually - I installed mine to /usr/local so that I could remove it when needed.

Sad to hear that it is motionEye that is the issue, but hopefully another solution (or an update) can address it so you can enjoy RTSP! :slight_smile:

Just saw your post. Here’s the interesting thing: I have two Wyze cams passing feeds to BI5 via RTSP. I do a direct-to-disk save when recording video off the 2 cams (no encoding whatsoever), and when it records to disk it writes the audio as AAC-LC, much like recording via the Wyze app (what you posted). This works great: if I have several motion-triggered videos recorded via BI, I can concatenate them via ffmpeg in the MP4 container without issue (unlike the A-law files).

My only question at this point is what settings I could use to compress the larger files into HEVC. I use Handbrake for this, and while most other MP4/AVC files compress well, the files off the Wyze cams don’t compress at all - they actually grow in size when I convert to HEVC. I would love to save some space and get these files smaller.

Anyone have any ideas??

Cheers

I think the RTSP feed consumption by the Motion Project (https://motion-project.github.io/) software – which is used by motionEye – is very sensitive to variations; I installed Live555 as a proxy, and it appears to be better able to handle the Wyze camera’s RTSP output. However, I have also noticed sensitivity to the Wi-Fi network. I found the most reliable connection is using an RPi as a Wi-Fi access point with its own SSID and subnet for the Wyze camera; I run the Live555 proxy on the RPi and have the Motion Project software read from the proxy. Check GitHub - dcmartin/raspberrypi-access-point: Script and documentation to configure RaspberryPi as WiFi access point. for more information.
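For anyone who wants to try the same arrangement, the live555 proxy is a single binary that takes the camera’s RTSP URL(s) as arguments and re-serves them. A rough sketch of driving it from Python; the camera URL is a placeholder, and the proxied URL mentioned in the comments is only the typical default - live555ProxyServer prints the exact URL for each proxied stream when it starts:

```python
# Rough sketch: put the live555 proxy between a Wyze camera and its
# consumers (Motion/motionEye, VLC, ...). The camera URL is a placeholder.
import subprocess

camera_url = "rtsp://root:ismart12@192.168.1.50/live"

# Assumes the live555ProxyServer binary (built from the live555 sources)
# is on PATH. It keeps running until killed.
proxy = subprocess.Popen(["live555ProxyServer", camera_url])

# Clients (Motion, VLC, ...) then read from the proxy instead of the
# camera, typically at a URL like rtsp://<proxy-host>:554/proxyStream
# (check the proxy's startup output for the exact path).
proxy.wait()
```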

1 Like

Funny enough, Wyze is using Live555 as their RTSP server.

Hey @teredactle - sorry for the delay - I was on vacation! :slight_smile:

Moving from the H.264-encoded output that you have to H.265 (HEVC) should reduce the size by around 25-50%, depending on the quality parameters you enter. I generally use FFmpeg for all encoding. I haven’t used Handbrake in some time, but as I recall they were looking at transitioning from libav to ffmpeg around Handbrake 1.2.

Your question sparked my curiosity, so I downloaded Handbrake 1.3.1 and took a video file that I had from Zoneminder - a direct RTSP capture with the audio re-encoded as AAC - and had the following file:

The key aspect here is that the video stream size was 9.57 MB. I then took the file, popped it into Handbrake, set the CQ to 23, selected H.265, and encoded it - lo and behold, I had a file that was 4x larger!

Ouch! So I figured it had to be the settings in Handbrake, since we should be seeing lower bit rates for comparable quality. The audio stream didn’t really change, but the video size definitely did. The first thing to do was to back off the quality target by raising the Constant Quality value from the default 22/23 to something more like 40. I went ahead and set the codec to H.265 (not 10/12-bit - the source material from Wyze is 8-bit), set the FPS to same as source and variable, selected the Slow preset, and specified the Profile as Main along with a CQ of 40:

After doing this, I saw a 30% reduction in file size with very little difference in quality that I could tell, and the stream size was now around 7 MB:
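For anyone who prefers the command line, roughly the same settings can be driven through ffmpeg’s libx265 encoder. This is only a sketch: the filenames are placeholders, and x265’s CRF scale doesn’t map exactly onto Handbrake’s CQ slider, so treat the value as a starting point:

```python
# Sketch: roughly the same single-pass settings via ffmpeg's libx265.
# Filenames are placeholders; adjust the CRF to taste.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "clip.mp4",
     "-c:v", "libx265",
     "-preset", "slow",
     "-crf", "40",             # higher CRF = smaller file, lower quality
     "-profile:v", "main",     # 8-bit source, so Main profile is enough
     "-c:a", "copy",           # leave the audio alone
     "clip_hevc.mp4"],
    check=True,
)
```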

Another option I generally use is average-bitrate encoding: you specify an average bit rate, but the encoder allocates a higher bit rate to frames with motion (I record extra time before/after an event in my setup). When I set mine up as follows with a lower average bitrate and 2-pass encoding, the quality was a little better (barely) IMO, but obviously a much slower encode:

The file output at an average of 800 kb/sec was around 7 MB:
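The ffmpeg equivalent of that two-pass, average-bitrate approach looks roughly like this - again just a sketch: filenames are placeholders, the 800k target mirrors the figure above, and on Windows the null sink is NUL rather than /dev/null:

```python
# Sketch: two-pass average-bitrate HEVC encode at ~800 kb/s.
# Filenames are placeholders; use NUL instead of /dev/null on Windows.
import subprocess

common = ["ffmpeg", "-y", "-i", "clip.mp4",
          "-c:v", "libx265", "-preset", "slow", "-b:v", "800k"]

# Pass 1: analysis only, no usable output needed.
subprocess.run(common + ["-x265-params", "pass=1",
                         "-an", "-f", "null", "/dev/null"], check=True)

# Pass 2: the real encode, reusing the stats gathered in pass 1.
subprocess.run(common + ["-x265-params", "pass=2",
                         "-c:a", "copy", "clip_2pass.mp4"], check=True)
```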

You’ll probably want to experiment a bunch to see what settings are acceptable to you. I personally keep mine in AVC format, as a 30% reduction in size doesn’t really matter in my environment - I’ve set it up for a 30-day retention window, with events deleted if they are not archived (which means nothing bad really happened!).

Hope this helps a bit!