Conversation

@gioelecerati (Member)

No description provided.

codecov bot commented Dec 13, 2024

Codecov Report

Attention: Patch coverage is 17.14286% with 87 lines in your changes missing coverage. Please review.

Project coverage is 24.58194%. Comparing base (665bf99) to head (bacf55b).

Files with missing lines   Patch %     Lines
ai/clickhouse.go           0.00000%    64 Missing ⚠️
ai/client.go               54.54545%   14 Missing and 1 partial ⚠️
ai/query_spec.go           0.00000%    8 Missing ⚠️
Additional details and impacted files


@@                 Coverage Diff                 @@
##                main        #198         +/-   ##
===================================================
- Coverage   27.11864%   24.58194%   -2.53670%     
===================================================
  Files              6           9          +3     
  Lines            413         598        +185     
===================================================
+ Hits             112         147         +35     
- Misses           286         435        +149     
- Partials          15          16          +1     
Files with missing lines   Coverage Δ
ai/query_spec.go           0.00000% <0.00000%> (ø)
ai/client.go               54.54545% <54.54545%> (ø)
ai/clickhouse.go           0.00000% <0.00000%> (ø)

... and 6 files with indirect coverage changes

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data


"stream_id",
"avg(input_fps) as avg_input_fps",
"avg(output_fps) as avg_output_fps",
"countIf(last_error != '') as error_count",
Member:

This is not an entirely correct aggregation; the right way would be to aggregate the error events. How hard would that be? I expected the data pipeline itself to be doing that kind of aggregation, though, so we're not scanning the table at query time.
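A minimal sketch of the distinction being raised: `countIf(last_error != '')` counts every status row that still carries a `last_error`, so a single error can be counted once per event row. Deduplicating at query time (or, better, aggregating a dedicated error-event stream in the pipeline) avoids that. The `uniqExactIf` variant below is an assumption for illustration, not the fix the PR applies.

```go
package main

import "fmt"

// errorCountColumns contrasts the row-counting aggregation in the PR with a
// deduplicated alternative. uniqExactIf counts distinct non-empty last_error
// values instead of every row that repeats one.
func errorCountColumns() (naive, deduped string) {
	naive = "countIf(last_error != '') as error_count"
	deduped = "uniqExactIf(last_error, last_error != '') as error_count"
	return
}

func main() {
	n, d := errorCountColumns()
	fmt.Println(n)
	fmt.Println(d)
}
```

Note this still only deduplicates by error text; truly counting error *events* would need the pipeline to emit them separately, as the comment suggests.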

"avg(output_fps) as avg_output_fps",
"countIf(last_error != '') as error_count",
"arrayFilter(x -> x != '', groupUniqArray(last_error)) as errors",
"sum(restart_count) as total_restarts",
Member:

This is also incorrect: restart_count is already cumulative, so summing it double-counts. The simplest fix is to take a max instead. Should work.
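The suggested fix, sketched as the select-clause string this query builds (a sketch of the reviewer's suggestion, not the merged code): since each event row carries the cumulative restart count, `max` over the window yields the latest total, whereas `sum` adds every snapshot together.

```go
package main

import "fmt"

// restartColumn returns the corrected aggregation clause. restart_count is
// cumulative per stream, so max() picks the most recent total; the original
// "sum(restart_count) as total_restarts" would over-count.
func restartColumn() string {
	return "max(restart_count) as total_restarts"
}

func main() {
	fmt.Println(restartColumn()) // prints max(restart_count) as total_restarts
}
```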

"countIf(last_error != '') as error_count",
"arrayFilter(x -> x != '', groupUniqArray(last_error)) as errors",
"sum(restart_count) as total_restarts",
"arrayFilter(x -> x != '', groupUniqArray(last_restart_logs)) as restart_logs").
Member:

Maybe only keep the last one as well? I think it would be confusing to merge+uniq these logs.
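One way to keep only the most recent log, sketched below: ClickHouse's `argMax` returns the value from the row where the second argument is largest. The `timestamp` column name is an assumption for illustration; the actual ordering column in this table isn't shown in the diff.

```go
package main

import "fmt"

// restartLogsColumn replaces the merge+uniq of all restart logs with a clause
// that keeps only the log from the latest event row. Assumes an event-time
// column named "timestamp" exists (hypothetical name).
func restartLogsColumn() string {
	// was: "arrayFilter(x -> x != '', groupUniqArray(last_restart_logs)) as restart_logs"
	return "argMax(last_restart_logs, timestamp) as restart_logs"
}

func main() {
	fmt.Println(restartLogsColumn())
}
```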

var ErrAssetNotFound = errors.New("asset not found")

type StreamStatus struct {
StreamID string `json:"streamId"`
Member:

Should we use snake_case like the other AI APIs?

return m.rows, nil
}

func TestQueryAIStreamStatusEvents(t *testing.T) {
Member:

Nice


3 participants