Smart Search (Beta)

Smart Search enhances your ability to find specific actions or activities related to construction equipment and materials. It allows users to query for intersections between different types of construction equipment, and the results can be filtered and presented by various criteria, helping users locate relevant footage and data efficiently.

Smart Search has its own tab in the Left Side Panel (under Timeline). This feature is available upon request.




Camera Field of View (FOV) Requirements

To ensure reliable object detection and accurate Smart Search results, please adhere to the following positioning and visibility guidelines.

1. Hardware Requirements

  • Fixed Position: Cameras must be installed in a fixed, stationary position. PTZ (Pan-Tilt-Zoom) cameras are not supported.

2. Composition and Framing

  • Target Orientation: The camera should ideally be centred on the construction site.

  • Off-Centre Views: If the primary site area is shifted (e.g., to the far left or right), this is acceptable provided all relevant objects remain fully visible and recognisable.

  • View Distance: Construction equipment must be within a reasonable range. Avoid "wide-angle" or distant shots where machinery appears too small for the system to identify.

3. Environment Types

  • Construction Sites: Must feature clearly visible machinery and infrastructure.

  • Warehouses: Indoor views are supported if the FOV contains clear, monitorable subjects such as workers, MEWPs (Mobile Elevating Work Platforms), and other logistical equipment.

  • Top-Down Perspectives: While "birds-eye" or top-down views are compatible, please note that detection accuracy may be slightly lower compared to angled views.

4. Visibility and Obstructions

  • Unobstructed View: Ensure a clear line of sight to all target objects. Visual clutter or physical obstructions that hide parts of the equipment will reduce query reliability.


Pro-Tip: Testing Compatibility

Before finalising a camera for Smart Search, use the Object Detection tool in the dashboard:

Rule of Thumb: If the system can accurately detect and label objects in the live preview, the camera position is suitable for Smart Search.



🔍 How to Use

  1. Step 1: Select an Object 🚚



  2. Step 2: Select a Condition

    a. Object proximity search: search for an object near another object 🛠️

    b. Area-based search: search for an object within an area of the camera view 📍

    c. Identify a Time Schedule 🕛



  3. Step 3: Select a Date Range 📅


  4. Step 4: Save the query 💾


  5. Step 5: View and/or export the report 🔽
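As a rough illustration of the proximity condition in Step 2 (not the actual Smart Search implementation), a nearness check over two detected bounding boxes might look like the sketch below. The box format, the `are_near` helper, and the pixel threshold are all hypothetical:

```python
# Hypothetical sketch: checking whether two detections are "near" each other.
# Boxes are axis-aligned (x1, y1, x2, y2) tuples in pixels — an assumption,
# not the Smart Search data format.

def box_centre(box):
    """Centre point of an axis-aligned bounding box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def are_near(box_a, box_b, max_distance=50.0):
    """True if the centres of the two boxes lie within max_distance pixels."""
    (ax, ay), (bx, by) = box_centre(box_a), box_centre(box_b)
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 <= max_distance

# Example: a worker detected beside an excavator.
worker = (100, 100, 120, 160)     # centre (110, 130)
excavator = (130, 90, 230, 170)   # centre (180, 130) — 70 px away
print(are_near(worker, excavator, max_distance=100))  # True
print(are_near(worker, excavator, max_distance=50))   # False
```

A real system would also account for perspective (pixel distance at the top of the frame covers more ground than at the bottom), which is one reason the FOV guidelines above matter.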


Each event is based on a stationary object; once the object moves, a new event is created.
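The event-splitting rule above can be sketched as follows. `split_into_events`, the track format, and the movement threshold are illustrative assumptions, not the production logic:

```python
# Hypothetical sketch: splitting a track of (timestamp, x, y) detections into
# "stationary events". A new event starts whenever the object moves farther
# than move_threshold pixels from where the current event began.

def split_into_events(detections, move_threshold=10.0):
    events = []
    current = []
    anchor = None  # position where the current event started
    for ts, x, y in detections:
        if anchor is None:
            anchor = (x, y)
            current = [(ts, x, y)]
            continue
        dx, dy = x - anchor[0], y - anchor[1]
        if (dx * dx + dy * dy) ** 0.5 > move_threshold:
            events.append(current)   # object moved: close the current event
            anchor = (x, y)          # ...and start a new one here
            current = [(ts, x, y)]
        else:
            current.append((ts, x, y))
    if current:
        events.append(current)
    return events

# An excavator parked at (100, 200), then driven to (300, 200):
track = [(0, 100, 200), (1, 101, 200), (2, 102, 201), (3, 300, 200), (4, 301, 201)]
print(len(split_into_events(track)))  # 2 — one event before the move, one after
```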

Saved Queries are visible on the right-side panel:



Note: Data is based on automated visual analysis and may contain omissions or misclassifications.


  6. Optional Step: Check the Heatmap

    Object detection heatmaps highlight areas with frequent object occurrences or high model confidence. Colour intensity encodes activity (hotter colours mean more activity or confidence), revealing spatial patterns that are useful for understanding traffic flow, behaviour, or model focus. They are often generated with tools such as YOLOv8 and OpenCV for real-time video analysis or training supervision, transforming simple bounding boxes into intuitive visualisations of movement or importance and revealing hotspots.


    Motion Heatmaps: Display the movement of objects (people, cars) over time in a video, ideal for analysing traffic or crowd dynamics.
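As a rough sketch of how a motion heatmap is built, the following accumulates frame-to-frame pixel differences. It uses synthetic NumPy frames to stay self-contained; in practice frames would come from a video decoder such as OpenCV, and the threshold and normalisation here are illustrative choices, not the dashboard's actual pipeline:

```python
import numpy as np

def motion_heatmap(frames, diff_threshold=25):
    """Per-pixel motion count: each pixel tallies how many frame-to-frame
    differences exceeded the threshold, then the map is normalised to [0, 1]
    so it can be rendered with a colour map (hotter = more motion)."""
    heat = np.zeros(frames[0].shape, dtype=np.float64)
    for prev, curr in zip(frames, frames[1:]):
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        heat += (diff > diff_threshold)
    if heat.max() > 0:
        heat /= heat.max()
    return heat

# Synthetic example: a bright 2x2 "object" moving left to right across 8x8 frames.
frames = []
for step in range(4):
    f = np.zeros((8, 8), dtype=np.uint8)
    f[3:5, step:step + 2] = 255
    frames.append(f)

heat = motion_heatmap(frames)
print(heat.shape)       # (8, 8)
print(heat[3, 0] > 0)   # True — motion occurred along the object's path
```

The same accumulation idea extends to detection heatmaps: instead of thresholded pixel differences, each detected bounding box (or its confidence score) is added to the grid per frame.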