HitFactor

Video intelligence for competitive shooting

HitFactor is an iOS product in development that analyzes raw match footage to detect events, segment runs, and generate performance analytics from video.

In active development · Product company, not services

Intended users: Competitive shooters who want structured timing, review, and performance data from POV and match video.

Representative in-development iOS workflow (in active development)

[Product preview: HitFactor iOS run review on iPhone. A Stage 5 timeline shows event markers and splits; the on-device model turns the frame stream into shot and event detections, with markers for start detected, shot burst clustered, and finish confirmed.]

Embedded model pipeline

  • Runs in the iOS app with the model bundled into the product
  • Transforms raw footage into timeline markers and run segments
  • Generates structured review output for analytics and playback
On-device detection

Embedded model identifies shots and key events directly on the device.
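As a rough illustration of what this could look like in Swift, here is a minimal sketch that runs a bundled Core ML model over decoded video frames through Vision. The model class name ShotEventDetector, the label strings, and the DetectedEvent shape are placeholder assumptions, not the shipping implementation.

```swift
import AVFoundation
import CoreML
import Vision

// Placeholder event type for one detection in one frame.
struct DetectedEvent {
    let label: String      // e.g. "shot", "start_signal", "finish"
    let confidence: Float
    let timestamp: CMTime  // presentation time of the source frame
}

// Wraps a bundled Core ML model behind a Vision request.
// `ShotEventDetector` stands in for the auto-generated model class.
final class FrameDetector {
    private let request: VNCoreMLRequest

    init() throws {
        let coreMLModel = try ShotEventDetector(configuration: MLModelConfiguration()).model
        request = VNCoreMLRequest(model: try VNCoreMLModel(for: coreMLModel))
    }

    // Runs the model on a single decoded frame and tags each result
    // with the frame's timestamp so it can be placed on a timeline.
    func detect(in pixelBuffer: CVPixelBuffer, at timestamp: CMTime) throws -> [DetectedEvent] {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try handler.perform([request])
        let observations = request.results as? [VNRecognizedObjectObservation] ?? []
        return observations.compactMap { observation in
            guard let top = observation.labels.first else { return nil }
            return DetectedEvent(label: top.identifier,
                                 confidence: top.confidence,
                                 timestamp: timestamp)
        }
    }
}
```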

Run segmentation

Translates detections into a stage timeline with splits and review anchors.
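A minimal sketch of how detections might become a timeline, reusing the DetectedEvent type from the sketch above: the start signal anchors the run, and per-frame "shot" detections closer together than a debounce window are collapsed into discrete shots. The minShotGap threshold and all field names are illustrative assumptions.

```swift
import CoreMedia

// Illustrative run segment; field names are assumptions.
struct RunSegment {
    let start: Double        // start-signal time, in seconds
    let shotTimes: [Double]  // one entry per clustered shot, in seconds
    var finish: Double { shotTimes.last ?? start }
    var splits: [Double] {   // time between consecutive shots
        zip(shotTimes.dropFirst(), shotTimes).map { $0 - $1 }
    }
}

// Anchors the run on the start signal, then debounces per-frame
// "shot" detections into discrete shots. `minShotGap` is a guessed
// debounce window, not a tuned product value.
func segmentRun(events: [DetectedEvent], minShotGap: Double = 0.06) -> RunSegment? {
    let sorted = events.sorted { $0.timestamp < $1.timestamp }
    guard let startEvent = sorted.first(where: { $0.label == "start_signal" }) else { return nil }
    let start = CMTimeGetSeconds(startEvent.timestamp)

    var shots: [Double] = []
    for event in sorted where event.label == "shot" {
        let t = CMTimeGetSeconds(event.timestamp)
        guard t > start else { continue }
        if let last = shots.last, t - last < minShotGap { continue }
        shots.append(t)
    }
    return shots.isEmpty ? nil : RunSegment(start: start, shotTimes: shots)
}
```

Debouncing by time gap is the simplest way to collapse a burst of per-frame detections into one shot; a production pipeline would likely weigh confidence scores as well.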

Structured review

Generates output that can drive analytics, playback, and coaching feedback.
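One plausible shape for that structured output, sketched as Codable types so analytics, the playback timeline, and coaching views could all consume the same artifact. The actual schema is not public, so every field here is an assumption.

```swift
import Foundation

// Illustrative schema for structured review output.
struct RunReview: Codable {
    struct Marker: Codable {
        let kind: String   // "start", "shot", "finish"
        let time: Double   // seconds from the start of the video
    }
    let stage: Int
    let markers: [Marker]
    let splits: [Double]   // seconds between consecutive shots
    let totalTime: Double  // start signal to final shot
}

// One encoder for all consumers: analytics jobs, playback,
// and coaching views share the same JSON artifact.
func encodeReview(_ review: RunReview) throws -> Data {
    let encoder = JSONEncoder()
    encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
    return try encoder.encode(review)
}
```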

iOS app → Embedded model → Timeline review → Structured output

What this representative workflow demonstrates

A public product preview of the workflow we are building in the iOS app: an embedded model detects shots and events on device, and the product then assembles a review timeline and structured feedback.

Runtime: On-device iOS model
Review mode: Run timeline
Output type: Structured review data

Why This Product Exists

What HitFactor is designed to do

The product is being built for competitive shooters who want structured timing and review data from POV and match video. Core detection runs inside the iOS app through an embedded model, and that on-device inference turns raw match footage into the structured timing and review data the product delivers.

Core detection is designed to run in the iOS app with the model embedded directly in the product

Cloud infrastructure supports training, evaluation, storage, and iteration around the app

Built as a scalable product workflow, not a manual video-analysis service

Core capabilities

  • On-device shot and event detection from video
  • Run segmentation and timing analysis
  • Analytics generated from raw footage
  • Media ingestion and processing pipelines
  • User-facing review and feedback workflows

Representative workflow

  1. Import POV or match footage into the iOS app and prepare it for analysis (a decoding sketch follows this list)
  2. Run on-device detection to identify shots and key events, then segment the run
  3. Generate timeline review data and analytics from the resulting event stream
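For step 1, a hedged sketch of how imported footage might feed the detector, assuming a local file URL and the FrameDetector and DetectedEvent types sketched earlier: AVAssetReader decodes the video into pixel buffers, each stamped with its presentation time so downstream segmentation can place events on the timeline.

```swift
import AVFoundation

// Decodes an imported footage file into pixel buffers and feeds
// each frame to the detector. Error handling and cancellation
// are trimmed for brevity.
func detectEvents(in url: URL, using detector: FrameDetector) async throws -> [DetectedEvent] {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return [] }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    reader.add(output)
    reader.startReading()

    var events: [DetectedEvent] = []
    while let sample = output.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let timestamp = CMSampleBufferGetPresentationTimeStamp(sample)
        events += try detector.detect(in: pixelBuffer, at: timestamp)
    }
    return events
}
```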

Infrastructure Fit

How the product connects to the AI stack

The workloads below are why Busic is building on cloud infrastructure from the beginning.

Training, evaluation, and iteration for embedded video models

Storage and sync for footage metadata, derived event streams, and review artifacts

Product telemetry, APIs, and release infrastructure for ongoing product improvement

Business model

Busic is building digital products delivered through software, subscriptions, and cloud-connected product experiences. This product page exists to make that model publicly visible.