CodeProject AI + Coral (Reddit roundup). I'm still relatively new to running CodeProject.AI and Blue Iris together. I currently have a Coral dual TPU on the same machine as Blue Iris and it's doing a phenomenal job: detections are usually under 10ms, though occasionally 2000ms+, and it sometimes flags random objects like an airplane. I don't usually park one in my backyard, and by the time I got that notification it would be long gone anyway.

I used the unRAID Docker template for codeproject_ai and swapped out the sections you listed. If it's running in Docker, open a Docker terminal and launch bash (a sketch of the command follows below).

I'm currently running Deepstack off my CPU and it isn't great, rather slow. It's really sad that the CodeProject.ai developers have not prioritized low-cost, high-output GPU/TPU support, although it's worth pointing out that they do support other models and AI acceleration now. This will most likely change once CPAI is updated.

When I open the app, my alerts are very sparse, some weeks old, and if I filter to cancelled I can see all my alerts, but the AI didn't confirm human, dog, or truck. This started when I installed the current version of CP.AI.

I bought the Coral TPU coprocessor. Has anyone managed to get face recognition working? I tried it many moons ago, but it was very flaky, it barely saved any faces, and I ended up giving up.

Should mesh be switched on on both PCs? Any thoughts? Looking to hear from people who are using a Coral TPU.

Works great now. I have BI running for my business. Go back to "Install Modules" and re-install the Coral module.

Usually the Deepstack processing is faster than taking the snapshot, because for whatever reason the SSS API takes 1-2 seconds to return the image (regardless of whether it's using high quality/balanced/low).

I finally got access to a Coral Edge TPU and also saw CodeProject.AI. Edit (5/11/2024): here's the Coral/CP.AI setup I've settled on for now.

CodeProject.AI (Deepstack) vs CompreFace: I've been using DT for a long time now. CodeProject AI should be adding Coral support soon. CodeProject wasn't significantly better than Deepstack at the time (4 months ago), but many people have started migrating away from Deepstack by now, and CP.AI isn't worse either, so it may not matter. Ran Scrypted for most of this year.

I'm using Nginx to push a self-signed cert to most of my internal network services and I'm trying to do the same for the CodeProject web UI.

I have it running on a VM on my i3-13100 server, CPU-only object detection along with a second custom model, and my average wattage has only increased by about 5W. CodeProject.AI 2.0 was just released and features a lot of improvements, including a fresh new frontend interface. It's hard to find benchmarks on this sort of thing, but I get 150ms to 500ms with CodeProject.AI for object detection.

For the Docker setup, I'm running PhotonOS in a VM, with Portainer on top to give me a GUI for Docker. I recently switched from Deepstack to CP.AI.

Coral support is very immature on CPAI; I would not recommend using it. And from the moment you stop the service, it can take 20-30 seconds for the process to exit.
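For the "open a Docker terminal and launch bash" step, here is a minimal sketch. The container name codeproject-ai is an assumption; substitute whatever name your unRAID template or docker run command actually used.

```bash
# List running containers to confirm the CodeProject.AI container name (assumed here: codeproject-ai)
docker ps --format '{{.Names}}\t{{.Image}}'

# Open an interactive bash shell inside that container
docker exec -it codeproject-ai /bin/bash
```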
In BI on the AI tab, if I check "use custom models" it keeps saying to stop the server and restart to populate, but this doesn't succeed in populating. Still the same empty field.

I've switched back and forth between CP and CF, tweaking the config to try to get the most accuracy on facial recognition.

I've had Deepstack running on my mini server in Docker this way for years. Suddenly, about a week ago, it started giving me an AI timeout or not responding. I had the same thing happen to me after a power loss.

It works fine for my 9 cameras, but for my indoor cameras I'd like to try using it for person and cat. I see in the list of objects that cat is supported, but I'm not sure where to enter "cat" to get it working (see the curl sketch after this block for a quick way to check what labels come back). I played with Frigate a little bit.

(Translated from French:) It seems each run takes 150 to 160 ms, according to the CodeProject AI web interface logs.

Blue Iris is running in a Win10 VM. I had Deepstack working well, and when CodeProject came out and I heard Deepstack was being deprecated, I made an image and then installed it. I am constantly getting notifications on my phone for all sorts of movement. The AI setting in BI is "medium".

Anyway, top question for me, as my own Coral has just finally arrived: how goes support for Coral with CodeProject.AI, and is there anything people can do to help?

I run CodeProject.ai with a Google Coral, but I also have Frigate for the Home Assistant integration, and I might take the time to dial in sending motion alerts from Frigate to BI so I can get rid of CP.AI.

I would like to try out CodeProject AI with Blue Iris. Will this work? I see a lot of talk about running on a Raspberry Pi but not much about Ubuntu/Docker on x86. My preference would be to run CodeProject AI with a Coral USB in Docker on an Ubuntu x86 VM on Proxmox. It already has an M.2 NVMe drive that I was intending to use for the OS and DB. I have read the limited threads on Reddit, IPCamTalk, and CodeProject.ai's forums, and nothing jumps out at me as something I have not tried. I have a USB Coral I'm trying to pass through to Docker.

Free, open-source Frigate combined with a $30 Coral card turns any legacy computer into a top-end NVR.

This post was useful in getting Blue Iris configured properly for custom models. I use it in lieu of motion detection on the cameras.

On the Nginx front: it is now working, in that I see the CodeProject web interface when accessing the alternate DNS entry I made pointing to Nginx Proxy Manager, but on that page the Server URL also shows the alternate DNS entry, which results in the logs not showing.

There seem to be many solutions addressing different problems.

(Translated from Spanish:) Does anyone have opinions on these two? I set up Deepstack about a month ago, but I read that the developer is…

I hear about Blue Iris, CodeProject AI, Frigate, Synology Surveillance Station, and Scrypted. I don't understand what exactly each system does and which of these (or other) tools I would need.
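To sanity-check what the server actually detects (for example, whether "cat" comes back for an indoor snapshot), you can post a test image straight to the Deepstack-compatible detection endpoint that CodeProject.AI exposes. A minimal sketch, assuming the server is reachable on localhost at the default port 32168 and that test.jpg is a snapshot saved from the camera:

```bash
# Send a test image to the object detection endpoint and pretty-print the JSON response.
# The "predictions" array lists each detected label with its confidence score.
curl -s -X POST http://localhost:32168/v1/vision/detection \
  -F "image=@test.jpg" \
  -F "min_confidence=0.4" | python3 -m json.tool
```

If you are using a custom model instead of the default one, the Deepstack-style custom endpoint is typically /v1/vision/custom/<model-name>.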
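For the "USB Coral passed through to Docker" question, the usual approach is to hand the container the USB bus. A sketch only: the image tag and port are the stock ones, and whether the Coral module actually initializes inside the container depends on your CP.AI version and on the Edge TPU runtime being available in the image.

```bash
# Run the CodeProject.AI server with the host's USB bus visible inside the container,
# so the Coral USB accelerator can be claimed by the Coral object detection module.
docker run -d --name codeproject-ai \
  -p 32168:32168 \
  --device /dev/bus/usb:/dev/bus/usb \
  codeproject/ai-server
```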
When I reboot my unRAID server, the Blue Iris VM comes online before the CodeProject.AI container has started and fails to connect (a sketch of a wait-then-start script is included after this block).

How is the Tesla P4 working for you with CodeProject AI? Do you run CodeProject on Windows or in Docker? Curious, because I am looking for a GPU for my Windows 10 CodeProject AI setup.

CodeProject AI has better models out of the box. I removed all other modules except what's in the screenshot, assuming the Coral ObjectDetection module is the only one I'd need. I believe I ran the batch file too. I was wondering if there are any performance gains from using the Coral Edge TPU for object detection.

The camera AI is useful to many people, but BI has way more motion-setting granularity than the cameras, and some people need that additional detail, especially if they want AI for more than a car or a person.

When I start Object Detection (Coral), the logs show the following messages:
17:11:17: Started Object Detection (Coral) module
17:11:43: objectdetection_coral_adapter.py: TPU detected
17:11:43: objectdetection_coral_adapter.py: Using Edge TPU

If you're new to Blue Iris and CP.AI, remember to read this before starting: FAQ: Blue Iris and CodeProject.AI.

Stick to Deepstack if you have a Jetson. You can get a full Intel N100 system for $150 which will outperform a Coral in both speed and precision.
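One way to handle the boot-order race is to hold the Blue Iris VM back until the CodeProject.AI container answers. This is only a sketch for something like the unRAID User Scripts plugin; the VM name "BlueIris" and the localhost URL are assumptions you would replace with your own values.

```bash
#!/bin/bash
# Wait until the CodeProject.AI web UI responds, then start the Blue Iris VM via libvirt.
CPAI_URL="http://127.0.0.1:32168"   # assumed: container publishes the default web UI port
VM_NAME="BlueIris"                  # assumed: the libvirt name of the Blue Iris VM

for i in $(seq 1 60); do
  if curl -fsS -o /dev/null "$CPAI_URL"; then
    break                           # server is up, stop polling
  fi
  sleep 5
done

virsh start "$VM_NAME"
```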
Run the ASP.NET Core 7 runtime installer and select Repair. On the main AI settings, check the box next to "Use custom models" and uncheck the box next to "Default object detection."

If you had a larger computer that could take a GPU with CUDA cores, you probably wouldn't need the Coral. CodeProject.AI Server is better supported by its developers and has been found to be more stable overall.

If you're running CodeProject.AI Server in Docker or natively in Ubuntu and want to force the installation of libedgetpu1-max, first stop the Coral module from CodeProject.AI (a sketch of the usual commands follows below).

I have them outside, and instead of using the Blue Iris motion detection I have a script that checks for motion every second against the camera's web service; if there is motion, the script pulls the image down from the camera's HTTP service, feeds it into Deepstack, and, if certain parameters are met, triggers a recording (a rough sketch of this loop is also included after this block).

Blue Iris with CodeProject AI is awesome. Check the AI Dashboard and press Ctrl+R to force-reload it; you should see the modules installing. I stopped YOLOv5 6.2 and used the YOLOv5 .NET module instead.

Both BI and AI are running inside a Windows VM on an i7-7700 with 6 cores and 10GB of RAM allocated, no GPU. I have seen there are different programs to accomplish this task, like CodeProject.AI, CompreFace, Deepstack, and others.

As mentioned, I made a huge performance improvement by running Deepstack in Docker on my Proxmox host instead of in a Windows VM. Am I missing something there, or am I also missing a driver or setting to get the integrated 850 Quick Sync to work with v5?
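For the libedgetpu1-max step, a minimal sketch of the usual Coral runtime install on Debian/Ubuntu, assuming Google's coral-edgetpu apt repository. libedgetpu1-max is the maximum-frequency runtime and runs the TPU hotter than the default libedgetpu1-std.

```bash
# Add Google's Coral package repository and signing key
echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | \
  sudo tee /etc/apt/sources.list.d/coral-edgetpu.list
curl -fsSL https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update

# Install the maximum-frequency Edge TPU runtime (replaces libedgetpu1-std if present)
sudo apt-get install -y libedgetpu1-max

# Re-plug the USB Coral (or reboot) so the new runtime is picked up, then restart the Coral module
```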
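A rough sketch of that polling approach. Everything here except the Deepstack-style /v1/vision/detection endpoint is hypothetical: the camera motion URL, the snapshot URL, the response format, and the trigger URL are placeholders for whatever your particular camera and Blue Iris expose.

```bash
#!/usr/bin/env bash
# Poll the camera every second; when it reports motion, grab a snapshot,
# run it through the AI server, and trigger a recording if a person is seen.
CAM_MOTION_URL="http://192.168.1.50/api/motion"         # hypothetical camera motion endpoint
CAM_SNAPSHOT_URL="http://192.168.1.50/snapshot.jpg"     # hypothetical camera snapshot endpoint
AI_URL="http://127.0.0.1:32168/v1/vision/detection"     # CodeProject.AI / Deepstack-compatible endpoint
TRIGGER_URL="http://127.0.0.1:81/admin?camera=cam1&trigger"   # hypothetical NVR trigger URL

while true; do
  if curl -fsS "$CAM_MOTION_URL" | grep -q "motion=1"; then    # hypothetical response format
    curl -fsS -o /tmp/snap.jpg "$CAM_SNAPSHOT_URL"
    result=$(curl -fsS -X POST "$AI_URL" -F "image=@/tmp/snap.jpg" -F "min_confidence=0.5")
    if echo "$result" | grep -q '"label":"person"'; then
      curl -fsS "$TRIGGER_URL" > /dev/null                     # met the parameters: trigger recording
    fi
  fi
  sleep 1
done
```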
Within Blue Iris, go to the settings > "AI" tab > and click open AI Console. Coral's github repo last update is 2~3 yrs ago. 13 as available for the last couple weeks. Mise à jour : je viens d'essayer Coral + CodeProject AI et cela semble bien fonctionner ! J'ai ré-analysé certaines de mes alertes (clic droit sur la vidéo -> Tests et réglages -> Analyser avec l'IA) et la détection a bien fonctionné. Didn't uninstall anything else. The Coral would fit, but I believe there are issues with the Wyse being an AMD CPU for Frigate (there might be comments to this effect on this post to that effect, I can't remember and on my phone, but certainly worth having a dive into that issue first). I have an i7 CPU with built It's also worth noting that the Coral USB stick is no longer recommended. 2. Any It appears that python and the ObjectDetectionNet versions are not set correctly. Now for each camera, go to the settings, then click the AI button. It's interesting to see alternatives to Frigate appearing, at least for object detection. AI Dashboard: 19:27:24:Object Detection (Coral): Retrieved objectdetection_queue command 'detect' It defaulted to 127. My little M620 GPU actually seems to be working with it too. Thanks for you great insight! I have two corals (one mpcie and one m. Computer Vision is the scientific subfield of AI concerned with developing algorithms to extract meaningful information from raw images, videos, and sensor data. AI and then let me know if you can start it again. I was therefore wondering if people have found any creative use cases for the TPU with Blue Iris. Short story is I decided to move my BlueIris out of my Xeon EXSi VM server and into its own dedicated box. Coral over USB is supposedly even worse. I have a Nvidia 1050ti and a Coral TPU on a pci board (which I just put in the BI server since I've been waiting on Coral support. Hopefully performance improves because I understand performance is better on Linux than Windows? I have codeproject AI's stuff for CCTV, it analyzes about 3-5x 2k resolution images a second. 7. 2 NVME drive that I was intending to use for the OS & DB. Hey, it takes anywhere from 1-6 seconds depending on whether you use Low, Medium or High MODE on Deepstack in my experience. AI available I found it has issues self configuring. AI 2. Posted by u/GiantsJets - 8 votes and 40 comments May 13, 2020 · This is documented in the codeproject AI blue iris faq here : Blue Iris Webcam Software - CodeProject. The AI is breaking constantly and my CPU is getting maxed out which blows my mind as I threw 20 cores at this VM. 1MP): ~35ms Coral USB A (12. 12 However, they use far more power. The ESP32 series employs either a Tensilica Xtensa LX6, Xtensa LX7 or a RiscV processor, and both dual-core and single-core variations are available. The backup node has 2 x Xeon E5-2667 V4's and 128GB of RAM. This worked for me for a clean install: after install, make sure the server is not running. Get the Reddit app Scan this QR code to download the app now. 8 Beta version with YOLO v5 6. I've set it up on Windows Server 2022 and it's working OK. Very quick and painless and it worked great! That was a over a month ago. 5. ai (2. For installation, I had to download the 2. Modify the registry (Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Perspective Software\Blue Iris\Options\AI, key 'deepstack_custompath') so Blue Iris looks in C:\Program Files\CodeProject\AI\AnalysisLayer\ObjectDetectionYolo\custom-models for custom models, and copy your models into there. 8 (I think?). 
Thanks for this. On my i5-13500 with YOLOv5 6.2, I'm seeing analyze times around 280ms with the small model and 500ms with the medium model.

Viseron is a self-hosted NVR deployed via Docker which uses machine learning to detect objects and start recordings. I just installed Viseron last night and am still tinkering with the config.

The second entry shows that BI sent a motion alert to the AI and the AI confirmed it was a person.

Restart the AI to apply. They self-configure; don't mess with the modules. I have CodeProject AI running in Docker on Linux, and inside Docker I'm pulling the codeproject/ai-server image.

Rob from The Hookup just released a video about this (Blue Iris and CodeProject.AI setup for license plate reading). Search for it on YouTube!

Hey, looking for a recommendation on the best way to proceed. So I'm not the most tech-savvy; I have BI with CodeProject and it was working perfectly until a few weeks ago.

This should pull up a web-based UI that shows that CPAI is running.

Despite having my GPU passed through and visible in Windows, with CodeProject seeing the GPU as well, the strange thing is that nvidia-smi says the graphics card is "off" and does not report any processes running (a quick way to watch this is sketched below).

Now, when I try to install the Object Detection (Coral) module…

It looks like Frigate is the up-and-coming person and object detection AI and NVR that folks should consider. First, there's the issue of which modules I need for it to recognize specific objects. For folks who want AI and alerts on animals, or specifically a UPS truck, they need the additional AI that comes from CodeProject.

Revisiting my previous question here, I can give feedback now that I've had more time with CodeProject. Now if CodeProject AI could just start recognizing faces.

One note, unrelated to the AI stuff: I messed around with actively cooled RPi4s and heatsinks for ages before moving to a passively cooled case, which works significantly better and has the added bonus of no moving parts.

Getting excited to try CodeProject AI. With the TOPS of the Coral, what models do you think it can handle best? Thank you! I have Blue Iris on a NUC and it is averaging 900ms for detection.

Clean uninstall/reinstall: delete C:\Program Files\CodeProject, delete C:\ProgramData\CodeProject, restart, then install CodeProject 2.x.

I see CodeProject.AI detection times with my P620 of around 250ms on average.

For other folks who ordered a Coral USB A device and are awaiting delivery: I placed the order 6/22/22 from Mouser and received it today, 10/17/22.

Apr 22, 2024. Edit: This conversation took a turn to focus more on Google Coral TPU setups, so I'm editing the title accordingly.

I got it working: I had to use the drivers included as part of the Coral module rather than the ones downloaded from Coral's website.

It seems CodeProject has made a lot of progress supporting the Coral TPU, so I was hoping things are a bit better now? Is anyone able to make it work? Credit for this workaround goes to PeteUK on the CodeProject discussions.

The backup node has 2 x Xeon E5-2667 V4s and 128GB of RAM. They do not support the Jetson, Coral, or other low-power accelerators.

When I look at the BI logs, after a motion trigger it says "AI: Alert canceled [AI: not responding] 0ms". Any ideas?
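A simple way to check whether CodeProject.AI is actually touching the passed-through card is to log GPU utilization while you re-analyze an alert in Blue Iris. This assumes the NVIDIA driver and nvidia-smi are installed in the environment where the server runs.

```bash
# Print GPU utilization and memory use once per second; trigger a detection and watch for a spike.
nvidia-smi --query-gpu=timestamp,utilization.gpu,memory.used --format=csv -l 1
```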
I'm on a Windows machine running BI 5.x. I want to give it GPU support for CodeProject as I have 15 cameras undergoing AI analysis. While there is a newer version of CodeProject.AI available, I found it has issues self-configuring.

Dec 11, 2020: Some interesting results testing the tiny, small, medium and large MobileNet SSD models with the same picture.

The primary node, which runs Blue Iris as well as CodeProject.AI, has 2 x Xeon E5-2640 V4s and 128GB of RAM. VMs and management have their own dedicated 10Gbps SFP+ connections.

When I open CodeProject, I get:
2023-12-10 15:30:38: ** App DataDir: C:\ProgramData\CodeProject\AI
2023-12-10 15:30:38: Video adapter info:

I don't think so, but CodeProject.AI has a license plate reader model you can implement.

Comparing the AI analysis of similar alerts between DeepStack and CodeProject.AI: yes, CodeProject was way slower for me, though I don't know why, but object type recognition was also way better with CodeProject. I recently switched from Deepstack AI to CodeProject AI.

Everything was running fine until I had the bad idea to upgrade CodeProject to 2.x. Afterwards, the AI is no longer detecting anything. I'm attaching my settings as well as pictures of the logs. Any idea what could cause that? The Coral module is correctly detected in Device Manager. In the Object Detection (Coral) menu the test result is "AI test failed: ObjectDetectionCoral test not provisioned", but I see this in the CodeProject.AI Dashboard:
19:27:24: Object Detection (Coral): Retrieved objectdetection_queue command 'detect'

It seems silly that Deepstack was supporting a Jetson two years ago; it's really unclear why CodeProject AI seems unable to do so. Will keep an eye on this. I installed the drivers from the apps section but it still doesn't work. It is an AI accelerator (think GPU, but for AI). Problem: they are very hard to get. They are not expensive (25-60 USD) but they seem to be always out of stock.

Each module tells you if it's running and whether it's running on the CPU or GPU.

By default, Frigate uses some demo ML models from Google that aren't built for production use cases, and you need the paid version of Frigate ($5/month) to get access to better models, which ends up more expensive than Blue Iris. Blue Iris is a paid product, but it's essentially a once-off payment (edit: you do only get one year of updates though). CodeProject AI plus the models bundled with Blue Iris worked a lot better for me compared to Frigate.

I got Frigate running on Unraid and have it connected to Home Assistant, which is in a VM on the same Unraid box. So the next step for me is setting up facial recognition, since Frigate doesn't natively do this. Yes, you can include multiple custom models for each camera (comma separated, no spaces, no file extension).

The first entry shows that BI sent a motion alert to the AI but the AI found nothing. These are both preceded by MOTION_A:
11/14/2022 5:11:51 PM - CAMERA02 AI: Alert cancelled [nothing found]
11/14/2022 5:09:12 PM - CAMERA02 AI: [Objects] person: 63%

The CodeProject status log is showing the requests, but the Blue Iris log is not showing any AI requests or feedback, only motion detects. "The CodeProject.AI Server log shows requests every minute or less when there is no motion detection."

The CodeProject.AI team have released a Coral TPU module so it can be used on devices other than the Raspberry Pi. I have it installed and configured as I would expect based upon tutorials. Overall it seems to be doing okay, but I'm confused by a few things and having a few issues.

Has anyone found any good sources of information on how to use a Coral TPU with CodeProject? I ask because my 6700T seems to struggle a bit (18% at idle, 90%+ when motion is detected) and I only have 5 streams of 2MP cameras.

The PIP errors will look something like this: turn off all Object Detection modules. Her tiny PC only has one M.2 slot, which is where I'm putting the Coral TPU, so I'll use the only 2.5" SATA SSD for the Windows OS.

I then followed the advice: uninstalling CodeProject, deleting its Program Files and ProgramData folders, making sure the BI service was not automatically restarting on reboot, rebooting, reinstalling CodeProject, and installing the AI modules before starting BI. Remove everything under C:\ProgramData\CodeProject\AI\, and also anything under C:\Program Files\CodeProject\AI\downloads. After Googling similar issues, I found some solutions.

The modules included with CodeProject.AI are configured via the modulesettings.json files in each module's directory, typically located at C:\Program Files\CodeProject\AI\modules\<ModuleName>\modulesettings.json, where ModuleName is the name of the module.

Sounds like you did not have BI configured right, as choppy video playback is not normal and no one I know sees that as an issue. Been running on the latest versions with 2MP cameras running main and sub streams.

Original question: Is there a guide somewhere for how to get CP.AI running with BI on a Windows machine?
Relying on the uninstaller to stop the service and remove the files has been problematic because of this lag in terminating the process.

I uninstalled Blue Iris as well as CodeProject and re-set everything up, but it still doesn't work. Now I've done a manual install on a fresh Debian 12 LXC and that works rock solid. While I am not computer savvy, I have looked through the logs before the crashes to see if anything pops out, and there doesn't seem to be anything out of the ordinary.

You can now run AI acceleration on OpenVINO and Tensor, i.e. Intel CPUs 6th gen or newer. Coral is not particularly good anymore, as modern Intel iGPUs have caught up and surpassed it. However, they use far more power. Yeah, I have 3 (and one coming) 4K cameras with a resolution of 2560x1440.

The Coral does not show up when running lsusb, and it shows in the system devices as some generic device (a quick check is sketched below). I haven't had reliable success with other versions.

I recently received the Coral TPU and have been trying to find ways to use it with my Blue Iris setup. Even if you get it working, the models are not designed for CCTV and have really poor detection.
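To confirm whether the accelerator is even visible on the USB bus, lsusb is the quickest check. The IDs below are the ones the Coral USB stick usually presents: "Global Unichip Corp." (1a6e:089a) before the Edge TPU runtime initializes it, and "Google Inc." (18d1:9302) afterwards; treat them as a guideline rather than a guarantee.

```bash
# Look for anything Coral-like on the USB bus
lsusb | grep -Ei '1a6e|18d1|coral|unichip|google' || echo "No Coral-looking USB device found"
```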
Apr 22, 2024 · Does anyone happen to have any best practice recommendations for CP. Clips and recordings will all be placed on a NAS. The CodeProject status log is showing the requests, but the BlueIris log is not showing any AI requests or feedback, only motion detects. Reply reply UncharacteristicZero 11/14/2022 5:11:51 PM - CAMERA02 AI: Alert cancelled [nothing found] 11/14/2022 5:09:12 PM - CAMERA02 AI: [Objects] person: 63%. AI team have released a Coral TPU module so it can be used on devices other than the Raspberry Pi. I have it installed and configured as I would expect based upon tutorials. remove everything under c:\programdata\codeproject\ai\ , also if you have anything under C:\Program Files\CodeProject\AI\downloads I got Frigate running on Unraid and have it connected to Home Assistant which is in a VM on my Unraid. Yes, you can include multiple custom models for each camera (comma separated, no spaces, no file extension). " Restart the AI, heck, even BI: nothing. CPU barely breaks 30%. ai It took a while, but it seems that I have something running here now. 4 out of 5 are using substreams too. However - it doesn't look like it is doing anything and BI shows new items in alerts when I walk around a camera - but then they go away. Is anyone using one of these successfully? The device is not faulty, works fine on my Synology i'm trying to migrate off of. Overall it seems to be doing okay but I'm confused by a few things and having a few issues. Search for it on YouTube! But in Object Detection (Coral) menu Test Result is this: AI test failed: ObjectDetectionCoral test not provisioned But I see this in the Codeproject. I have BI on one PC with codeproject ai setup on yolov5. More formal support for Code Project’s AI Server, now our preferred no-extra-cost AI provider over DeepStack. Hey guys, I've seen there is some movement about google coral TPU support in codeproject, and I was wondering if there is any way to make it work with Blue Iris NVR software. Try a Google Coral I’ve got one in a micro Optiplex, 6th gen i5, 16GB memory. 2 dual TPU. So the next step for me is setting up facial recognition since Frigate doesn't natively do this. After Googling similar issues I found some solutions. CodeProject AI + the models bundled with Blue Iris worked a lot better for me compared to Frigate. Get the Reddit app Scan this QR code to download the app now Codeproject. AI for object detection at first, but was giving me a problem. VM's and Management have their own dedicated 10Gbps SFP+ connections. Afterwards, The AI is no longer detecting anything. AI Server that handles a long-running process. net , stuck on cpu mode, no toggle to gpu option? I was using Deepstack and decided to give Codeproject. 1, I only get "call failed" no matter what verbosity I set. AI, and apparently CodeProject. AI Server Hardware. Running CodeProject. AI Server v2. I'm using macvlan as the networking config to give it an IP on the LAN. Detection times are 9000ms-20000ms in BI. I found that I had to install the custom model on both the windows computer that blueiris was running on in addition to the docker container that is running CodeProject AI in order for my custom model file to get picked up. AI Server. 6. I however am still having couple of scenarios that I'd like to get some help on and was hoping if there are any solutions worth exploring: I ended up buying an Intel NUC to run Frigate on separately, keeping the Wyse for HA. My driveway camera is great, it's detecting people and cars. 
List the objects you want to detect. They must be the correct case and match the objects that the model was trained on. If you want all the models, just type *. By default you'll be using the standard object model; if you plan to use custom models, I'd first disable the standard object model. I installed the custom models (ipcams*) and they worked well for a while.

docker run --name CodeProject.AI -d -p 32168:32168 -p 32168:32168/UDP codeproject/ai-server. The extra /UDP flag opens it up to be seen by the other instances of CP.AI and allows for meshing, very useful! That extra flag was missing in the official guide somewhere (the full command is laid out as a block below). Hi, does anyone know how mesh is supposed to work? Short summary: no.

I have BI on one PC with CodeProject AI set up on YOLOv5 .NET and it detects OK but slowly. I have a second PC with CodeProject running on the same port (the CP standard) and the same YOLOv5.

If you look towards the bottom of the UI you should see all of CodeProject AI's modules and their status.

I ended up reinstalling the Coral module, and also, under BI Settings > AI, I put the IP address of the PC running BI for "Use AI Server on IP/Port", with port 5000. It defaulted to 127.0.0.1:82, but on the CP.AI webpage it shows localhost:#####. Is it fine to have these different? I went into the camera settings > Trigger > AI and turned on CP.AI. However, it doesn't look like it is doing anything; BI shows new items in alerts when I walk around in front of a camera, but then they go away. I finally switched to darknet and got that enabled, but I'm not getting anything to trigger.

Just switched back to Blue Iris. Works great with BI. My driveway camera is great; it's detecting people and cars.

I've so far been using purely CPU-based DeepStack on my old system, and it really struggles: lots of timeouts. I was using Deepstack and decided to give CodeProject.AI a try, but it's stuck on CPU mode with no toggle to a GPU option. I had to install it under the section marked "CodeProject.AI Server Hardware".

I've got it somewhat running now, but 50% of the time the TPU is not recognized so it reverts to CPU, and about 40% of the time something makes CodeProject just go offline. The error in the log is:

07:52:22: objectdetection_coral_adapter.py: File "C:\Program Files\CodeProject\AI\modules\ObjectDetectionCoral\objectdetection_coral_adapter.py", line 10, in <module>
07:52:22: objectdetection_coral_adapter.py: from module_runner import ModuleRunner

I have been trying to spin up a codeproject/ai-server container with a second Google Coral, but it…

If CodeProject AI added Coral support I would give it a try. Uninstall, delete the database file in your C:\ProgramData\CodeProject folder, then delete the CodeProject folders under Program Files, reboot, and reinstall CP.AI. Now AI stops detecting. I have a Coral device but stopped using it. Not super useful when used with Blue Iris for AI detection.

More formal support for CodeProject's AI Server, now our preferred no-extra-cost AI provider over DeepStack.

Hey guys, I've seen there is some movement on Google Coral TPU support in CodeProject, and I was wondering if there is any way to make it work with the Blue Iris NVR software.

Try a Google Coral. I've got one in a micro OptiPlex, 6th gen i5, 16GB memory. I am using the Coral on my Home Assistant computer to offload some of the work, and now the detection time is 15-60ms.

Update: speed issues are fixed (faster than DeepStack), and there is GPU CUDA support for both… I use CodeProject AI for BI, only the object detection.
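The mesh-friendly docker run command quoted above, laid out as a block. The image and ports come from that quote; the container name is whatever you prefer, and the second mapping simply republishes the same port over UDP so other CP.AI instances can discover this one for meshing.

```bash
# Run CodeProject.AI Server with the web/API port published over both TCP and UDP.
# The UDP mapping is what lets other CP.AI instances see this server and form a mesh.
docker run --name CodeProject.AI -d \
  -p 32168:32168 \
  -p 32168:32168/udp \
  codeproject/ai-server
```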