[Tutorial] Dewarping 360 Video #13326
-
This is very clever, and timely - I only found out recently that this sort of dewarping could be done effectively within Frigate. I hope to try it with my doorbell camera, which gives a "fisheye" image as seen in my other post, to create a corrected image (the example there is courtesy of the EZViz app, but I don't know whether this degree of success is feasible with ffmpeg).
-
Hi! I went down the tortuous road of ffmpeg some weeks ago - not an expert but a newbie in Frigate - following two or three posts on Reddit and here about dewarping, so in my case it was trial and error. I have a slim passageway outside my kitchen where I put a turret Dahua-clone cam, but it only covered one side of the corridor. I found a fisheye Hikvision DS-2CD2955FWD-IS, as I didn't want to put up two cameras focusing on the two points of interest. Looking for a solution, I found topics recommending ffplay to experiment toward the best result. Although at first I thought the best option was output=dfisheye like you, the resulting stream in HA and the detection wouldn't be the best for my use case. Finally I decided to divide the stream and use it as if it were two cameras (a rough sketch of that idea follows below). Obviously this increases CPU usage, but so far my i7-8700 is holding up fine.
The result: I miss the OSD with the name, date, and time that my other cameras have, and the fan of the PC running Frigate speeds up often, but for now this is the best I've got. Audio is not working yet, but I will try to make it work, if possible, when I have more time.
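For anyone wanting to try the same split, here is a rough sketch of one way to do it: carve two virtual views out of the single fisheye feed with go2rtc exec sources, then point Frigate at each as its own camera. The URL, stream names, and yaw/pitch values are assumptions, not my actual settings - the right angles depend entirely on how the camera is mounted, so tune them with ffplay first.

```yaml
go2rtc:
  streams:
    # Two dewarped views carved out of one ceiling-mounted fisheye.
    # URL, credentials, and angles are placeholders, not real values.
    passage_a:
      - 'exec:ffmpeg -rtsp_transport tcp -i rtsp://USER:PASSWORD@CAM_IP:554/stream1 -vf "v360=fisheye:output=hequirect:ih_fov=180:iv_fov=180:pitch=90:yaw=0" -c:v libx264 -preset veryfast -an -f rtsp {output}'
    passage_b:
      # Same source rotated 180 degrees to face the opposite end.
      - 'exec:ffmpeg -rtsp_transport tcp -i rtsp://USER:PASSWORD@CAM_IP:554/stream1 -vf "v360=fisheye:output=hequirect:ih_fov=180:iv_fov=180:pitch=90:yaw=180" -c:v libx264 -preset veryfast -an -f rtsp {output}'
```

Each stream then becomes an ordinary camera entry in Frigate (rtsp://127.0.0.1:8554/passage_a and .../passage_b), which is why it costs roughly twice the CPU of a single view.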
-
I have a Ubiquiti AI 360 camera that I wanted to feed into Frigate to have it alongside all my other non-Ubiquiti cameras. The AI 360 was the cheapest path to a "good" 360 camera since I already had a UDM SE.
Figuring out a "good" way to dewarp the video for consumption involved using `ffplay` to quickly test various filter options:

```
ffplay -i rtsps://USER:PASSWORD@HOST:7441/CAMERA_KEY -vf "v360=fisheye:output=equirect:ih_fov=180:iv_fov=180:pitch=90,crop=in_w:in_h/2:0:in_h/2"
```
Here is what the camera outputs and a few output options:
-vf "v360=fisheye:output=c6x1:ih_fov=180:iv_fov=180:pitch=90"
-vf "v360=fisheye:output=equirect:ih_fov=180:iv_fov=180:pitch=90,crop=in_w:in_h/2:0:in_h/2"
-vf "v360=fisheye:output=dfisheye:ih_fov=180:iv_fov=180:pitch=90,crop=in_w:in_h/2:0:in_h/2"
-vf "v360=fisheye:output=hequirect:ih_fov=180:iv_fov=180:pitch=90,crop=in_w:in_h/2:0:in_h/2"
Ideally I would feed two half-equirectangular streams into Frigate and treat them as independent cameras. Unfortunately the v360 dewarp and crop filters do not support hardware acceleration, so this process ends up being CPU intensive, using an entire core of an i5-12500.
So for a "single stream" option I went with the dual fisheye output format, which Frigate's object detection seems to handle well enough.
go2rtc is the real hero here, since the entire dewarp and crop process can be configured on its stream, allowing Frigate to consume as many copies of that single stream as needed without increasing the CPU cost of the operations.
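To make that concrete, here is a minimal sketch of the go2rtc side using an exec source; the stream name, the URL placeholders, and the encoder settings are assumptions rather than the exact values from this setup:

```yaml
go2rtc:
  streams:
    # The dewarp + crop runs once here; every Frigate role that
    # consumes this stream reuses the same already-filtered video.
    ai360_dewarped:
      - 'exec:ffmpeg -rtsp_transport tcp -i rtsps://USER:PASSWORD@HOST:7441/CAMERA_KEY -vf "v360=fisheye:output=dfisheye:ih_fov=180:iv_fov=180:pitch=90,crop=in_w:in_h/2:0:in_h/2" -c:v libx264 -preset veryfast -an -f rtsp {output}'
```

Frigate can then consume it as rtsp://127.0.0.1:8554/ai360_dewarped for however many roles are needed.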
The next issue I ran into is the setup time for the stream. UniFi's re-streaming endpoint is slow to start AND the dewarp filter adds to the initialization time. I had to manually set the `input_args` for the camera to override the default 5s timeout and increase it to 9s (a sketch follows below).

Ideally I'd be able to use a camera like the Hikvision DS-2CD63C5G0E-IVS, which supports on-board ePTZ streams allowing multiple "virtual" dewarped camera streams. That camera is more than 2x the cost of the AI 360, however.
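Here is a minimal sketch of that camera entry, assuming Frigate's stock RTSP input args with only the timeout raised; the timeout value is in microseconds, and older ffmpeg builds spell the flag -stimeout rather than -timeout:

```yaml
cameras:
  ai360:
    ffmpeg:
      # Stock RTSP input args with the socket timeout raised from the
      # default 5s (5000000 us) to 9s; assumed defaults, check your version.
      input_args: >-
        -avoid_negative_ts make_zero -fflags +genpts+discardcorrupt
        -rtsp_transport tcp -timeout 9000000 -use_wallclock_as_timestamps 1
      inputs:
        # Consume the dewarped go2rtc restream from the sketch above.
        - path: rtsp://127.0.0.1:8554/ai360_dewarped
          roles:
            - detect
```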
Breaking down the ffmpeg filter config:

- `fisheye` - input video format
- `output=dfisheye` - output video format, dual fisheye
- `ih_fov=180` - input horizontal field of view; since the video is 360 and we are doing "dual" outputs, this needs to be 180
- `iv_fov=180` - input vertical field of view; since the video is 360 and we are doing "dual" outputs, this needs to be 180
- `pitch=90` - orients the "virtual view" 90 degrees up so the view is "to the side" vs "down at the ground"
- `in_w:in_h/2` - Width:Height - sets the width and height of the output image; `in_w` and `in_h` are references to the input width and height. Since we are cropping away the black top half of the dewarped image, we set the output width to match the input and the output height to half the input
- `0:in_h/2` - X Position:Y Position - where to place the top-left corner of the crop, so X is the left edge (0) and Y is halfway down the image (`in_h/2`)

As a concrete example (dimensions assumed for illustration): with a 2048x2048 source, `crop=in_w:in_h/2:0:in_h/2` keeps a 2048x1024 region whose top-left corner sits at (0, 1024), i.e. the bottom half of the dewarped frame.