Just switched back to Blue Iris. I'm still relatively new to CodeProject and Blue Iris working together. Currently I have a Coral dual TPU running on the same machine as Blue Iris and it seems to be doing a phenomenal job, usually detecting in under 10ms, but sometimes 2000+ms, and for the most random objects, like an airplane; I don't usually park one in my backyard, and if there is one, then by the time I get that notification…

I used the unRAID docker for codeproject_ai and swapped out the sections you have listed.

Blue Iris is a paid product, but it's essentially a once-off payment (edit: you do only get one year of updates though).

First, there's the issue of which modules I need for it to recognize specific objects.

11/14/2022 5:11:51 PM - CAMERA02 AI: Alert cancelled [nothing found]
11/14/2022 5:09:12 PM - CAMERA02 AI: [Objects] person: 63%

Now when I try to install the Object Detection (Coral) module…

It's interesting to see alternatives to Frigate appearing, at least for object detection.

I have CodeProject.AI (2.4-Beta) running as a Docker container on unRAID and used YOLOv5. I was therefore wondering if people have found any creative use cases for the TPU with Blue Iris.

Delete C:\Program Files\CodeProject, delete C:\ProgramData\CodeProject, restart, then install CodeProject 2.0, which was just released and features a lot of improvements, including a fresh new frontend interface.

It's hard to find benchmarks on this sort of thing, but I get 150ms to 500ms.

CodeProject.AI only supports the use case of the Coral Edge TPU via the Raspberry Pi image for Docker.

Coral is not particularly good anymore, as modern Intel iGPUs have caught up and surpassed it.

I would like to try out CodeProject AI with Blue Iris.

Now this is working, in that I see the CodeProject web interface when accessing the alternate DNS entry I made pointing to Nginx Proxy Manager, but on that page, under Server URL, I also see the alternate DNS entry, which results in the logs not showing.

I removed all other modules except for what's in the screenshot, assuming the Coral ObjectDetection is the only module I'd need.

It looks like Frigate is the up-and-coming person and object detection AI that NVR folks should consider.

I then followed the advice: uninstalling CodeProject, deleting its Program Files and ProgramData folders, making sure the BI service was not automatically restarting upon reboot, rebooting, reinstalling CodeProject, and installing the AI modules before starting BI.

Go back to 2.x. CPU barely breaks 30%.

I have BI on one PC with CodeProject AI set up on YOLOv5.

They are not expensive, 25-60 USD, but they seem to be always out of stock.

I run CP.AI with a Google Coral, but also have Frigate for the Home Assistant integration, and might take the time to dial in sending motion alerts from Frigate to BI so I can get rid of CP.AI.

When I look at the BI logs, after a motion trigger it says "AI: Alert canceled [AI: not responding] 0ms". Any ideas? I'm on a Windows machine running BI 5.x.

Even if you get it working, the models are not designed for CCTV and have really poor detection. Don't mess with the modules.
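A few of the posts above describe the same clean-reinstall sequence: stop everything, delete both CodeProject folders, reboot, then reinstall before Blue Iris starts. As a rough sketch from an elevated command prompt — the service name is the default used by recent installers and the paths are the standard ones, so adjust if yours differ:

```bat
rem stop Blue Iris (or temporarily disable its service) and the AI server first
net stop "CodeProject.AI Server"

rem wipe both the program folder and the data/config folder
rmdir /s /q "C:\Program Files\CodeProject"
rmdir /s /q "C:\ProgramData\CodeProject"

rem reboot, run the new CodeProject.AI installer, add modules, then start Blue Iris
shutdown /r /t 0
```

This mirrors the "delete Program Files and ProgramData, reboot, reinstall, add modules before starting BI" advice above; it is a sketch of those steps, not an official uninstall procedure.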
When I open CodeProject, I get: …

Dec 11, 2020: Some interesting results testing the tiny, small, medium and large MobileNet SSD models with the same picture.

One note, unrelated to the AI stuff: I messed around with actively cooled RPi4s + heatsinks for ages before moving to this passively cooled case, which works significantly better and has the added bonus of no moving parts.

I am, however, still having a couple of scenarios I'd like some help on, and was hoping there are solutions worth exploring:

I ended up buying an Intel NUC to run Frigate on separately, keeping the Wyse for HA.

I had Deepstack working well, and when CodeProject came out and I heard Deepstack was being deprecated, I made an image, then installed it.

CodeProject.AI (DeepStack) vs CompreFace: so I've been using DT for a long time now.

Hi Chris, glad you've set up a sub, as I personally really struggle with the board - takes me back to usenet days, lol.

CodeProject.AI are going to add Coral support at some point.

GPU CUDA support. Update: speed issues are fixed (faster than DeepStack). GPU CUDA support for both…

I use CodeProject AI for BI, only the object detection. I'm seeing analyze times around 280ms with the small model and 500ms with the medium model.

You can get a full Intel N100 system for $150, which will outperform a Coral in both speed and precision.

However, for the past week, the models field is empty.

It has an M.2 NVMe drive that I was intending to use for the OS & DB.

Run the ASP.NET Core 7 runtime installer and select Repair. On the main AI settings, check the box next to Use custom models and uncheck the box next to Default object detection.

While there is a newer version of CodeProject.AI available, I found it has issues self-configuring.

It seems CodeProject has made a lot of progress supporting the Coral TPU, so I was hoping things are a bit better now? Is anyone able to make it work?

Credit for this workaround goes to PeteUK on the CodeProject discussions. After Googling similar issues I found some solutions.

In the CodeProject.AI Dashboard: 19:27:24: Object Detection (Coral): Retrieved objectdetection_queue command 'detect'. It defaulted to 127.0.0.1:82, but on the CP.AI webpage…

Thanks for your great insight! I have two Corals (one mPCIe and one M.2).

Hey, it takes anywhere from 1-6 seconds depending on whether you use Low, Medium or High mode on Deepstack, in my experience.

Apr 23, 2023: I have been running my Blue Iris and AI (via CodeProject.AI)…

The .13 release has shown as available for the last couple of weeks.

Very quick and painless and it worked great! That was over a month ago.

If you're new to BlueIris and CP.AI…

May 13, 2020: This is documented in the CodeProject AI / Blue Iris FAQ here: Blue Iris Webcam Software - CodeProject.AI.

If you're running CodeProject.AI Server in Docker or natively in Ubuntu and want to force the installation of libedgetpu1-max, first stop the Coral module from CodeProject.AI.
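As a minimal sketch of that libedgetpu1-max swap for a Docker install: stop the Object Detection (Coral) module from the dashboard first, and note that the max-frequency runtime runs the TPU at full clock and noticeably hotter. The container name below is an assumption, and the commands assume Google's Coral apt repository is already configured inside the container.

```bash
# open a shell in the running server container (container name is an assumption)
docker exec -it codeproject-ai bash

# inside the container: replace the standard (throttled) Edge TPU runtime
# with the max-frequency one
apt-get update
apt-get remove -y libedgetpu1-std
apt-get install -y libedgetpu1-max

# afterwards, restart the Object Detection (Coral) module from the dashboard
```

On a native Ubuntu install the same two apt-get commands apply, just run on the host instead of inside a container.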
Each module tells you if it's running and if it's running on the CPU or GPU. Installation runs through, and on the first start, it downloads stuff to install 3 initial modules, FaceProcessing, ObjectDetection (YOLOv5 . AI also now supports the Coral Edge TPUs. ai running alright. The small model found far more objects that all the other models even though some were wrong! 19 votes, 28 comments. Both BI and AI are running inside a Windows VM on an i7-7700 with allocated 6 cores and 10GB of RAM, no GPU. I have a 2nd PC with codeproject running on the same ip:port (Cp standard) and same yolov5. Not super usefull when used with blueiris for ai detection. One thing I noticed. net module. How is the Tesla P4 working for you with CodeProject AI? Do you run CodeProject on Windows or Docker? Curious because I am looking for a GPU for my windows 10 CodeProject AI setup CodeProject AI has better models out-of-the-box. By default, Frigate uses some demo ML models from Google that aren't built for production use cases, and you need the paid version of Frigate ($5/month) to get access to better models, which ends up more expensive than Blue Iris. ai It took a while, but it seems that I have something running here now. Ai? Any improvements? Mar 9, 2021 · I've been using the typical "Proxmox / LXC / Docker / Codeproject" with Coral TPU usb passthough setup but it's been unreliable (at least for me) and the boot process is pretty long. Will this work? I see a lot of talk about running on a raspberry pi but not much about on ubuntu/docker on x86. I hear about Blueiris, codeproject ai, frigate, synology surveillance station, and scrypted. I had CodeProject. The second entry shows that BI sent a motion alert to the AI and the AI confirmed it was a person. Reply reply I ended up reinstalling the coral module, and also under BI Settings ->AI i put the ip address of the pc running BI for the Use AI Server on IP/Port: and port 5000. I have CodeProject AI running in docker on linux. (tried YOLOv8 too) I'm still trying to understand the nuance of Coral not supporting custom models with the most recent updates since it acts like CodeProject is using the Coral device with the custom models from MikeLud. Has anyone managed to get face recognition working? I tried it many moons ago, but it was very flaky, it barely saved any faces and I ended giving up. I don’t understand what exactly each system does and which of these (or other) tools I would need. AI setup I've settled with for now. ¿Alguien tiene opiniones sobre estos dos? Configuré Deepstack hace aproximadamente un mes, pero leí que el desarrollador está… Creating a LLM Chat Module for CodeProject. For my security cameras, I'm using Blue Iris with CodeProject. I want to give it GPU support for CodeProject as i have 15 cameras undergoing AI analysis. Running CodeProject. Run asp. Despite having my gpu passed through, visible in windows, and Code project is seeing my gpu as well. py", line 10, in 07:52:22 bjectdetection_coral_adapter. 9. Get the Reddit app Scan this QR code to download the app now Codeproject. ai isn't worse either, so it may not matter. AI webpage it shows localhost:##### Is it fine to have these different? I went into the camera settings->Trigger->AI and turned on CP. Mesh is ticked on in both. 1. py: File "C:\Program Files\CodeProject\AI\modules\ObjectDetectionCoral\objectdetection_coral_adapter. Get the Reddit app Scan this QR code to download the app now Go to codeproject_ai r/codeproject_ai. 
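Several of the posts above are really trying to work out whether Blue Iris or the CodeProject.AI server is the problem (wrong IP/port, "AI: not responding", nothing in the logs). One way to take Blue Iris out of the loop is to hit the server's DeepStack-compatible REST API directly with a test image; 32168 is the current default port, while older or DeepStack-style setups often use 5000, so adjust the host and port to your install:

```bash
# generic object detection endpoint
curl -s -F "image=@test.jpg" http://127.0.0.1:32168/v1/vision/detection

# a custom model, if installed (ipcam-general is one of MikeLud's models)
curl -s -F "image=@test.jpg" http://127.0.0.1:32168/v1/vision/custom/ipcam-general
```

If these return JSON with a predictions list, the server side is working and the remaining problem is in the Blue Iris AI settings (server IP/port, custom model names, or the camera trigger configuration).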
But to process already trained network in any resemblance of real time, you can't use CPU ( too slow even on big PCs), GPU (Graphic card can't fit to Raspberry Pi, or smaller edge devices ) therefore TPU, a USB dongle like device, that took the AI processing part out of graphic card (on smaller scale) and allows you to execute AI stuff directly Please first read the Mint Mobile Reddit FAQ that is stickied and linked in the sub about and sidebar, as this answers most questions posted in this sub. If in docker, open a Docker terminal and launch bash: I’m current running deep stack off my cpu and it isn’t great and rather slow. Here we I had the same thing happen to me after a power loss. e. I have it running on a VM on my i3-13100 server, CPU-only objectDetection along with a second custom model, and my avg watt/hr has only increased by about 5w. These are both preceded by MOTION_A Hello everyone. 16) and codeproject. 2023-12-10 15:30:38: Video adapter info: Welcome to the IPv6 community on Reddit. Am I missing something there, am i also missing a driver or setting to get the integrated 850 quick sync to work with v5. Clips and recordings will all be placed on a NAS. Ran Scrypted for most of this year. This community is home to the academics and engineers both advancing and applying this interdisciplinary field, with backgrounds in computer science, machine learning, robotics View community ranking In the Top 10% of largest communities on Reddit CodeProject unable to install module I'm getting this, tried removing windows python, reinstalled it a few times. It already has an M. Uninstall, Delete the database file in your C:\ProgramData\CodeProject folder and then delete the CodeProject folders under program files, then reboot, then reinstall CP. Short summary: No. Any It appears that python and the ObjectDetectionNet versions are not set correctly. List the objects you want to detect. When I open the app, my alerts are very sparse, some weeks old, and if I filter to cancelled, I can see all my alerts but AI didn't confirm human, dog, truck BlueIris with Codeproject AI is awesome. I recently received the Coral TPU and have been trying to find ways to use it with my Blue Iris setup, however, it seems that CodeProject. Here's my setup: At the base I'm running ESXi. We would like to show you a description here but the site won’t allow us. This worked for me for a clean install: after install, make sure the server is not running. In BI on the AI tab, if i check off custom models, it keeps saying stop the server and restart to populate, but this doesnt succeed in populating. My driveway camera is great, it's detecting people and cars. ai. I use it in lieu of motion detection on cameras. I recently switched from Deepstack to CP AI. Coral over USB is supposedly even worse. Mise à jour : je viens d'essayer Coral + CodeProject AI et cela semble bien fonctionner ! J'ai ré-analysé certaines de mes alertes (clic droit sur la vidéo -> Tests et réglages -> Analyser avec l'IA) et la détection a bien fonctionné. Is this latency too long given the hardware? One option is to run the AI in a docker container inside a Linux VM (on the same hardware). I have an i7 CPU with built It's also worth noting that the Coral USB stick is no longer recommended. I just installed Viseron last night and still tinkering with the config. Now AI stops detecting. Still same empty field. . VM's and Management have their own dedicated 10Gbps SFP+ connections. AI completely, then rebooting and reinstalling the 2. 
The primary node I'm running Blue Iris as well as CodeProject. Relying on the uninstaller to stop the service and remove the files has been problematic because of this lag to terminate the process. Usually the Deepstack processing is faster than taking the snapshot, because for whatever reason the SSS API takes 1-2 seconds to return the image (regardless of whether it's using high quality/balanced/low). 4 package. My little M620 GPU actually seems to be working with it too. sounds like you did not have BI configured right as choppy video playback is not normal and no one i know sees that as an issue. For the Docker setup, I'm running PhotonOS in a VM, with Portainer on top to give me a GUI for Docker. I was wondering if there are any performance gains with using the Coral Edge TPU for docker run --name CodeProject. The CodeProject. I recently switched from Deepstack AI to Code Project AI. Get the Reddit app Scan this QR code to download the app now i have been trying to spin up a codeproject/ai-server container with a second google coral but it I've so far been using purely CPU based DeepStack on my old system, and it really stuggles - lots of timeouts. Restart AI to apply. Apr 22, 2024 · Does anyone happen to have any best practice recommendations for CP. Is anyone using one of these successfully? The device is not faulty, works fine on my Synology i'm trying to migrate off of. Her tiny PC only has 1 m. The AI is breaking constantly and my CPU is getting maxed out which blows my mind as I threw 20 cores at this VM. I finally switched to darknet and got that enabled, but I'm not getting anything to trigger. Now for each camera, go to the settings, then click the AI button. I"m using Nginx to push a self signed cert to most of my internal network services and I' trying to do the same for codeproject web ui. I finally got access to a Coral Edge TPU and also saw CodeProject. at CodeProject. v2. AI Server log shows requests every minute or less when there is no motion detection" This is a Fakespot Reviews Analysis bot. Everything was running fine until I had the bad idea to upgrade CodeProject to 2. However - it doesn't look like it is doing anything and BI shows new items in alerts when I walk around a camera - but then they go away. 11 votes, 11 comments. true I have Blue Iris (5. Rob from the hookup just released a video about this (blue iris and CodeProject. I played with frigate a little bit. Oct 8, 2019 · 07:52:22 bjectdetection_coral_adapter. I am CONSTANTLY getting notificaitons on my phone, for all sorts of movement. Apr 22, 2024 · Edit: This conversation took a turn to focus solely more on Google Coral TPU setups, so editing the title accordingly. py: from module_runner import ModuleRunner The AI setting in BI is "medium". Im attaching my settings aswell as pictures of the logs. Hi does anyone know how mesh is supposed to work. 12 However, they use far more power. remove everything under c:\programdata\codeproject\ai\ , also if you have anything under C:\Program Files\CodeProject\AI\downloads I got Frigate running on Unraid and have it connected to Home Assistant which is in a VM on my Unraid. Free Frigate open source combined with a $30 Coral card turns any legacy computer into a top end NVR. AI are configued via the modulesettings. ). AI for object detection at first, but was giving me a problem. For installation, I had to download the 2. CodeProject AI and Frigate To start, I have a working Frigate config with about 10 cameras right now. 
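Following on from the notes above about stopping the service before installing a new version, and the 20-30 second lag before the process actually exits: a small Windows sketch of that wait, assuming the default service name on recent installs (check services.msc and Task Manager if yours differ):

```bat
net stop "CodeProject.AI Server"

rem the server process can take another 20-30 seconds to exit after the service stops
timeout /t 30

rem confirm nothing CodeProject-related is still running before launching the new installer
tasklist | findstr /i "CodeProject"
```

Running the installer while the old process is still shutting down is exactly the situation the uninstaller complaint above describes, so the extra check is cheap insurance.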
Javascript So I'm not the most tech-savvy, I have BI with CodeProject and it was working perfectly until a few weeks ago. The Coral would fit, but I believe there are issues with the Wyse being an AMD CPU for Frigate (there might be comments to this effect on this post to that effect, I can't remember and on my phone, but certainly worth having a dive into that issue first). Depending on markup it could be cheaper to get a decent graphics card which supports both the AI detection and ffmpeg acceleration. CodeProject AI + the models bundled with Blue Iris worked a lot better for me compared to Frigate. 4-Beta). The first entry shows that BI sent a motion alert to AI but the AI found nothing. AI FOR ALL! MUHAHAH For Frigate to run at a reasonable rate you really needed a Coral TPU. AI setup for license plate reading). py: TPU detected 17:11:43:objectdetection_coral_adapter. Afterwards, The AI is no longer detecting anything. 8) running in a Windows VM and CodeProject. r/codeproject_ai Coral usb TPU set to full precision (didn Hey looking for a recommendation on best way to proceed. It does not show up when running lsusb and does show in the system devices as some generic device. 6. I have a USB Coral i'm trying to passthru to docker. the installer never opens a co Sadly codeproject ai it’s not very environmentally or budget friendly. 12 votes, 30 comments. Any idea what could cause that ? Coral module is correctly detected in the device manager. json files in the module's directory, typically located at C:\Program Files\CodeProject\AI\modules\<ModuleName>\modulesettings. I'd like to keep this build as power efficient as possible, so rather than a GPU, I was going to take the opportunity to move to CodeProject AI with a Coral TPU. net , stuck on cpu mode, no toggle to gpu option? I was using Deepstack and decided to give Codeproject. Yes, you can include multiple custom models for each camera (comma separated, no spaces, no file extension). 7. 1 and ObjectDetection (YOLOv5 6. Anyway, top question for me, as my own Coral has just finally arrived, how goes support for Coral with CodeProject. AI, remember to read this before starting: FAQ: Blue Iris and CodeProject. NET) 1. I have it installed and configured as I would expect based upon tutorials. 8 - 2M cameras running main and sub streams. I've switched back and forth between CP and CF tweaking the config trying to get the most accuracy on facial recognition. Search for it on YouTube! But in Object Detection (Coral) menu Test Result is this: AI test failed: ObjectDetectionCoral test not provisioned But I see this in the Codeproject. 2) they both are hanging there for nothing. 6 Check AI Dashboard Press Ctrl R to force reload the dashboard Should see Modules installing I stopped YOLOv5 6. Get the Reddit app Scan this QR code to download the app now Also running it on a windows with a google coral setup and working great. I've got it somewhat running now but 50% of the time the TPU is not recognized so it reverts to CPU and about 40% of the time something makes Codeproject just go offline. My preference would be to run Codeproject AI with Coral USB in a docker on a Ubuntu x86 vm on Proxmox. Here is the analysis for the Amazon product reviews: Name: Google Coral USB Edge TPU ML Accelerator coprocessor for Raspberry Pi and Other Embedded Single Board Computers Company: Google Coral Amazon Product Rating: 4. On my i5-13500 with YOLOv5 6. 4W idle and 2W max, whereas a graphics card is usually at least 10W idle and can go far higher when in use. 
Within Blue Iris, go to the settings > "AI" tab > and click open AI Console. 2 nvme slot which is where I'm putting the Coral TPU then will use the only 2. Fakespot detects fake reviews, fake products and unreliable sellers using AI. Coral support is very immature on cpai, I would not recommend using it. The backup node has 2 x Xeon E5-2667 V4's and 128GB of RAM. So I'm not the most tech-savvy, I have BI with CodeProject and it was working perfectly until a few weeks ago. When I open the app, my alerts are very sparse, some weeks old, and if I filter to cancelled, I can see all my alerts but AI didn't confirm human, dog, truck I bought the Coral TPU coprocessor It is worth pointing out that they support other models and AI acceleration now. 0MP): ~200ms Obviously these are small sample sizes and YMMV but I'm happy with my initial tests/Blue Iris coral performance so far. If you have a specific Keyboard/Mouse/AnyPart that is doing something strange, include the model number i. when I installed the current version of cp ai. They do not support the Jetson, Coral, or other low power GPU use. I have blue iris on a NUC and it is averaging 900ms for detection. AI and is there anything people can do to help? It works fine for my 9 cameras. 5. While I am not computer savvy, I have looked through the logs before crashes to see if anything pop out and there doesn't seem to be anything out of the ordinary. Suddenly about a week ago, it started giving me an AI timeout or not responding. AI, CompreFace, Deepstack and others. Now i've done a manual install of a fresh Debian 12 lxc and that works rock solid. Original: Is there a guide somewhere for how to get CP. I haven't had reliable success with other versions. I've had Deepstack running on my mini server in a docker this way for years. 1MP): ~35ms Coral USB A (12. I have seen there are different programs to accomplish this task like CodeProject. AI 1. If I were to upgrade to a A2000 what kind of gains would I expect? I've heard faster cards do not make that much of a difference with detection times. 2023-12-10 15:30:38: ** App DataDir: C:\ProgramData\CodeProject\AI. AI on has 2 x Xeon E5-2640 V4's and 128GB of RAM. AI -d -p 32168:32168 -p 32168:32168/UDP codeproject/ai-server The extra /UDP flag opens it up to be seen by the other instances of CP-AI and allows for meshing, very useful!!! That extra flag was missing in the official guide somewhere. This should pull up a Web-based UI that shows that CPAI is running. AI) server all off my CPU as I do not have a dedicated GPU for any of the object detection. I have a coral device but stopped using it. 2 for object detection. AI and then let me know if you can start it again. 25 - 100ms with my T400. Am hoping to use it once it supports Yolo and custom models, but that is a while off. Manjaro is a GNU/Linux distribution based on Arch. 2 dual TPU. believe I ran the batch file too. For folks that want AI and alerts on animals or specifically a UPS truck then they need the additional AI that comes from CodeProject. Modify the registry (Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Perspective Software\Blue Iris\Options\AI, key 'deepstack_custompath') so Blue Iris looks in C:\Program Files\CodeProject\AI\AnalysisLayer\ObjectDetectionYolo\custom-models for custom models, and copy your models into there. I see in the list of objects that cat is supported, but I'm not sure where to enter "cat" to get it working. 
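The docker run command quoted in one of the posts above, reformatted for readability; publishing 32168 over UDP as well as TCP is what lets other CodeProject.AI instances discover this one for meshing.

```bash
# 32168/tcp is the HTTP API and dashboard; 32168/udp is the broadcast
# channel other CodeProject.AI instances use for mesh discovery.
docker run --name CodeProject.AI -d \
  --restart unless-stopped \
  -p 32168:32168 \
  -p 32168:32168/udp \
  codeproject/ai-server
```

The --restart flag is an optional addition for unattended servers; the rest is the command from the post. The idea is that each machine in the mesh runs its own copy and they find each other automatically, while Blue Iris keeps pointing at a single server address.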
Viseron is a self-hosted NVR deployed via Docker, which utilizes machine learning to detect objects and start recordings. I have them outside and instead of using the blue iris motion detection, I have a script that checks for motion every second on the camera web service and if there is motion, the script pulls down the image from the camera's http service, feeds it into deepstack and if certain parameters are met, triggers a recording. AI has an license plate reader model you can implement. 8. The strange thing is nvidia-smi says the graphics card is "off" and does not report any scripts running. AI with Blue Iris for nearly a year now, and after setting it up with my Coral Edge TPU couple of months ago, it has been amazing. 8 Beta version with YOLO v5 6. Hopefully performance improves because I understand performance is better on Linux than Windows? I have codeproject AI's stuff for CCTV, it analyzes about 3-5x 2k resolution images a second. If you want all the models, just type *. AI, yes CodeProject was way slower for me but I don't know why, object type recognition was also way better with CodeProject. I I am using the coral on my home assistant computer to offload some of the work and now the detection time is 15-60ms. Creating a LLM Chat Module for CodeProject. 8 (I think?). Javascript I had to install 2. I have been using CodeProject. If you look towards the bottom of the UI you should see all of CodeProject AI's modules and their status. When asking a question or stating a problem, please add as much detail as possible. I installed the drivers from the apps section but it still doesn't work. 0. Coral is ~0. I'm using macvlan as the networking config to give it an IP on the LAN. I have read the limited threads on reddit, IPCamTalk, Codeproject. 2 under the section marked "CodeProject. I uninstalled BlueIris aswell as CodeProject and re-setup everything, but it still doesnt work. When I start the Object Detection (Coral), logs show the following messages: 17:11:17:Started Object Detection (Coral) module 17:11:43:objectdetection_coral_adapter. Blue Iris is running in a Win10 VM. Or check it out in the app stores TOPICS Multiple ai models codeproject ai . I found that I had to install the custom model on both the windows computer that blueiris was running on in addition to the docker container that is running CodeProject AI in order for my custom model file to get picked up. For other folks who had ordered a Coral USB A device and are awaiting delivery I placed the order 6/22/22 from Mouser and received today 10/17/22. Should I expect a better performance when running AI in docker? One thing about CP AI is that you have to stop the service before installing a new version. Works great with bi. AI Server is better supported by its developers and has been found to be more stable overall. 2. If you plan to use custom models, I'd first disable the standard object model. Revisiting my previous question here, I can give feedback now that'd I've had more time with codeproject. 4 out of 5 are using substreams too. Overall it seems to be doing okay but I'm confused by a few things and having a few issues. net Waited for them to be installed. And from the moment you stop the service, it can take 20-30 seconds for the process to exit. They must be the correct case and match the objects that the model was trained on. AI. I have a Nvidia 1050ti and a Coral TPU on a pci board (which I just put in the BI server since I've been waiting on Coral support. Didn't uninstall anything else. 
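For the macvlan networking mentioned above (giving the container its own IP on the LAN), a sketch of the Docker side — the subnet, gateway, parent interface and address are placeholders and must match your network:

```bash
# create a macvlan network bridged to the host NIC (values are examples)
docker network create -d macvlan \
  --subnet=192.168.1.0/24 \
  --gateway=192.168.1.1 \
  -o parent=eth0 \
  lan_macvlan

# run CodeProject.AI with its own LAN address
docker run --name CodeProject.AI -d \
  --network lan_macvlan --ip 192.168.1.50 \
  codeproject/ai-server
```

One caveat worth knowing: with macvlan the Docker host itself generally cannot reach the container's LAN IP directly, which matters if Blue Iris runs on that same host rather than on a separate machine.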
AI running with BI on a windows machine? We would like to show you a description here but the site won’t allow us. API This post was useful in getting BlueIris configured properly for custom models. Getting excited to try CodeProject AI, with the TOPS power of coral, what models do you think it can handle the best? thank you! I have blue iris on a NUC and it is averaging 900ms for detection. Thanks for this. This will most likely change once CPAI is updated. ai's forums, and nothing jumps out at me as things I have not tried. Running BI and Codeproject here in windows 11. ESP32 is a series of low cost, low power system on a chip microcontrollers with integrated Wi-Fi and dual-mode Bluetooth. ai is rumoured to soon support tensorlite and coral. The PIP errors will look something like this: Turn off all Object Detection Modules. From CodeProject UI the Coral module is using the YOLOv5 models at medium size. Comparing similar alerts AI analysis between DeepStack and CodeProject. Hey guys, I've seen there is some movement about google coral TPU support in codeproject, and I was wondering if there is any way to make it work with Blue Iris NVR software. How’s the coral device paired with CP. For PC questions/assistance. Il semble que l'exécution prenne 150 à 160 ms, selon les journaux de l'interface Web de CodeProject AI. 2 setup with dual coral? Which model to use (yolov5, yolov8, mobilenet, SSD), custom models, model size? Can you filter out stuff you don't need with coral models? Jul 27, 2024 · I've been trying to get this usb coral TPU running for far too long. But my indoor cameras, I'd like to try using it for person and cat. Will keep an eye on this. Edit (5/11/2024): Here's the Coral/CP. ai developers have not prioritized low cost/high output GPU TPU. AI Server. Clean uninstall/reinstall. Try a Google Coral I’ve got one in a micro Optiplex, 6th gen i5, 16GB memory. If you had a larger computer that you could have a GPU with CUDA cores, you probably won’t need the coral. Inside Docker, I'm pulling in the codeproject/ai-server image. Should mesh be switched on on both PC,s Any thoughts? If I'm running BI (5. I'm using Coral TPU plugged into the USB port to support CodeProject. I have BI running for my business. Performance is mediocre - 250ms+ vs. It is an AI accelerator (Think GPU but for AI). More formal support for Code Project’s AI Server, now our preferred no-extra-cost AI provider over DeepStack. net and it detects ok but slow. AI a try. Looking to hear from people who are using a Coral TPU. AI detection times with my P620, probably on average around 250ms. While there is a newer version of CodeProject. Has anyone found any good sources of information on how to use a Coral TPU with code project? I ask because my 6700t seems to struggle a bit(18% at idle, 90+ when motion detected) I only have 5 streams of 2mp cameras. I think maybe you need to try uninstalling DeepStack and CodeProject. Go back to "Install Modules" and re-install Coral Module. I have about 26 cameras set up that are set to record substream continuously direct to disk recording with most cameras using INTEL +VPP for hardware decoding. If code project ai added coral i would give it a try. When I reboot my unRAID server the Blue Iris VM will come online before the CodeProject. The CodeProject status log is showing the requests, but the BlueIris log is not showing any AI requests or feedback, only motion detects. AI team have released a Coral TPU module so it can be used on devices other than the Raspberry Pi. 
I got it working - I had to use the drivers included as part of the Coral Module rather than the ones downloaded from Coral's website. If you're running CodeProject. It seems silly that Deepstack has been supporting a Jetson two years ago… it’s really unclear why codeproject AI seems to be unable to do so. I've set it up on Windows Server 2022 and it's working OK. I installed the custom models (ipcams*) and it worked well for a while. 5 SATA SSD for the windows OS. Short story is I decided to move my BlueIris out of my Xeon EXSi VM server and into its own dedicated box. CodeProject AI should be adding Coral support soon. py: Using Edge TPU Coral USB A (2. You can now run AI acceleration on OpenVINO and Tensor aka Intel CPUs 6th gen or newer or Yeah I have 3 (and one coming) 4K cameras with a res 2560x1440. AI Server that handles a long-running process. I don’t think so, but CodeProject. As mentioned also, I made a huge performance step by running deepstack on a docker on my proxmox host instead of running it in a windows vm. Coral's github repo last update is 2~3 yrs ago. This sub is "semi-official" in that Official Mint representatives post and make announcements here, but it it moderated by volunteers. Get the Reddit app Scan this QR code to download the app now. 4 By default you'll be using the standard object model. Works great now. AI, and apparently CodeProject. They self configure. Sep 30, 2023 · The camera AI is useful to many people, but BI has way more motion setting granularity than the cameras, and some people need that additional detail, especially if wanting AI for more than a car or person. json, where ModuleName is the name of the module. Problem: They are very hard to get. AI (2. AI 2. " Restart the AI, heck, even BI: nothing. So the next step for me is setting up facial recognition since Frigate doesn't natively do this. Now if codeproject ai can just start recognizing faces. Stick to Deepstack if you have a Jetson. 1, I only get "call failed" no matter what verbosity I set. ai (2. codeproject was not significantly better than deepstack at the time (4 months ago), but I guess many people have started migrating away from deepstack by now, and cp. CodeProject. Uninstall Coral Module. There seems to be many solutions addressing different problems. Detection times are 9000ms-20000ms in BI. So I assume I am doing something wrong there. Coral M. AI Server 4/4/2024, 7:13:00 AM by Matthew Dennis Create a ChatGPT-like AI module for CodeProject.
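For the posts above about the TPU only being recognized half the time, or showing up as a generic device: on a Linux host you can check how the USB accelerator is enumerating. The IDs below are the ones commonly reported for the Coral USB stick; treat them as a quick sanity check rather than an exhaustive diagnostic.

```bash
lsusb | grep -iE "1a6e|18d1|google|unichip"
# 1a6e:089a "Global Unichip Corp." -> stick is present, Edge TPU runtime not yet initialized
# 18d1:9302 "Google Inc."          -> the runtime has claimed the device (expected after first use)
```

If the device never re-enumerates as the Google ID after a detection request, the runtime or USB passthrough (VM/LXC) is usually the culprit rather than CodeProject.AI itself.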