diff --git a/.changeset/purple-jokes-rhyme.md b/.changeset/purple-jokes-rhyme.md
index df8239016..383bbcdbf 100644
--- a/.changeset/purple-jokes-rhyme.md
+++ b/.changeset/purple-jokes-rhyme.md
@@ -2,4 +2,4 @@
 '@callstack/reassure-measure': minor
 ---
 
-Added a `dropOutliers` option to detect and drop statistical outliers
+Added a `removeOutliers` option to detect and drop statistical outliers
diff --git a/.changeset/violet-hornets-call.md b/.changeset/violet-hornets-call.md
new file mode 100644
index 000000000..88014b660
--- /dev/null
+++ b/.changeset/violet-hornets-call.md
@@ -0,0 +1,7 @@
+---
+'reassure': patch
+'@callstack/reassure-measure': patch
+'@callstack/reassure-compare': patch
+---
+
+enable outlier detection by default
diff --git a/README.md b/README.md
index 9640a4dbc..d1e73b5bf 100644
--- a/README.md
+++ b/README.md
@@ -370,6 +370,7 @@ async function measureRenders(
 interface MeasureRendersOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   wrapper?: React.ComponentType<{ children: ReactElement }>;
   scenario?: (view?: RenderResult) => Promise<void>;
   writeFile?: boolean;
@@ -378,9 +379,10 @@ interface MeasureRendersOptions {
 
 - **`runs`**: number of runs per series for the particular test
 - **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs (default 1).
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
 - **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results; only the wrapped component is measured.
 - **`scenario`**: a custom async function, which defines user interaction within the UI by utilising RNTL or RTL functions
-- **`writeFile`**: (default `true`) should write output to file.
+- **`writeFile`**: should write output to file (default `true`)
 
 #### `measureFunction` function
 
@@ -399,13 +401,15 @@ async function measureFunction(
 interface MeasureFunctionOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
-- **`writeFile`**: (default `true`) should write output to file.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
+- **`writeFile`**: should write output to file (default `true`)
 
 #### `measureAsyncFunction` function
 
@@ -426,13 +430,15 @@ async function measureAsyncFunction(
 interface MeasureAsyncFunctionOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
-- **`writeFile`**: (default `true`) should write output to file.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
+- **`writeFile`**: should write output to file (default `true`)
 
 ### Configuration
diff --git a/docusaurus/docs/api.md b/docusaurus/docs/api.md
index 939971e71..4ab0e3e14 100644
--- a/docusaurus/docs/api.md
+++ b/docusaurus/docs/api.md
@@ -50,6 +50,7 @@ test('Test with scenario', async () => {
 interface MeasureRendersOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   wrapper?: React.ComponentType<{ children: ReactElement }>;
   scenario?: (view?: RenderResult) => Promise<void>;
   writeFile?: boolean;
@@ -57,10 +58,11 @@ interface MeasureRendersOptions {
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs (default: 1)
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
 - **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results, only the wrapped component is measured.
 - **`scenario`**: a custom async function, which defines user interaction within the ui by utilized RNTL functions
-- **`writeFile`**: (default `true`) should write output to file.
+- **`writeFile`**: should write output to file (default `true`)
 
 ### `measureFunction` function {#measure-function}
 
@@ -91,13 +93,15 @@ test('fib 30', async () => {
 interface MeasureFunctionOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
-- **`writeFile`**: (default `true`) should write output to file.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
+- **`writeFile`**: should write output to file (default `true`)
 
 ### `measureAsyncFunction` function {#measure-async-function}
 
@@ -136,13 +140,15 @@ test('fib 30', async () => {
 interface MeasureAsyncFunctionOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
-- **`writeFile`**: (default `true`) should write output to file.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
+- **`writeFile`**: should write output to file (default `true`)
 
 ## Configuration
diff --git a/packages/measure/CHANGELOG.md b/packages/measure/CHANGELOG.md
index 6ce1f672d..442e380ef 100644
--- a/packages/measure/CHANGELOG.md
+++ b/packages/measure/CHANGELOG.md
@@ -5,7 +5,7 @@
 
 ### Minor Changes
 
 - c51fb5f: feat: add measureAsyncFunction
-- 59b21d4: Added a `dropOutliers` option to detect and drop statistical outliers
+- 59b21d4: Added a `removeOutliers` option to detect and drop statistical outliers
 
 ### Patch Changes
diff --git a/packages/measure/src/__tests__/measure-async-function.test.tsx b/packages/measure/src/__tests__/measure-async-function.test.tsx
new file mode 100644
index 000000000..26a77ed18
--- /dev/null
+++ b/packages/measure/src/__tests__/measure-async-function.test.tsx
@@ -0,0 +1,70 @@
+/* eslint-disable promise/prefer-await-to-then */
+import stripAnsi from 'strip-ansi';
+import { measureAsyncFunction } from '../measure-async-function';
+import { setHasShownFlagsOutput } from '../output';
+
+// Exponentially slow function
+function fib(n: number): number {
+  if (n <= 1) {
+    return n;
+  }
+
+  return fib(n - 1) + fib(n - 2);
+}
+
+test('measureAsyncFunction captures results', async () => {
+  const fn = jest.fn(() => Promise.resolve().then(() => fib(5)));
+  const results = await measureAsyncFunction(fn, { runs: 1, warmupRuns: 0, writeFile: false });
+
+  expect(fn).toHaveBeenCalledTimes(1);
+  expect(results.runs).toBe(1);
+  expect(results.counts).toEqual([1]);
+});
+
+test('measureAsyncFunction runs specified number of times', async () => {
+  const fn = jest.fn(() => Promise.resolve().then(() => fib(5)));
+  const results = await measureAsyncFunction(fn, { runs: 20, warmupRuns: 0, writeFile: false });
+
+  expect(fn).toHaveBeenCalledTimes(20);
+  expect(results.runs).toBe(20);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
+  expect(results.counts).toHaveLength(20);
+  expect(results.meanCount).toBe(1);
+  expect(results.stdevCount).toBe(0);
+});
+
+test('measureAsyncFunction applies "warmupRuns" option', async () => {
+  const fn = jest.fn(() => Promise.resolve().then(() => fib(5)));
+  const results = await measureAsyncFunction(fn, { runs: 10, warmupRuns: 1, writeFile: false });
+
+  expect(fn).toHaveBeenCalledTimes(11);
+  expect(results.runs).toBe(10);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
+  expect(results.counts).toHaveLength(10);
+  expect(results.meanCount).toBe(1);
+  expect(results.stdevCount).toBe(0);
+});
+
+const errorsToIgnore = ['❌ Measure code is running under incorrect Node.js configuration.'];
+const realConsole = jest.requireActual('console') as Console;
+
+beforeEach(() => {
+  jest.spyOn(console, 'error').mockImplementation((message) => {
+    if (!errorsToIgnore.some((error) => message.includes(error))) {
+      realConsole.error(message);
+    }
+  });
+});
+
+test('measureAsyncFunction should log error when running under incorrect node flags', async () => {
+  setHasShownFlagsOutput(false);
+  const results = await measureAsyncFunction(jest.fn(), { runs: 1, writeFile: false });
+
+  expect(results.runs).toBe(1);
+  const consoleErrorCalls = jest.mocked(console.error).mock.calls;
+  expect(stripAnsi(consoleErrorCalls[0][0])).toMatchInlineSnapshot(`
+    "❌ Measure code is running under incorrect Node.js configuration.
+    Performance test code should be run in Jest with certain Node.js flags to increase measurements stability.
+    Make sure you use the Reassure CLI and run it using "reassure" command."
+  `);
+});
diff --git a/packages/measure/src/__tests__/measure-function.test.tsx b/packages/measure/src/__tests__/measure-function.test.tsx
index 6aa632c44..7f9d7cdde 100644
--- a/packages/measure/src/__tests__/measure-function.test.tsx
+++ b/packages/measure/src/__tests__/measure-function.test.tsx
@@ -39,7 +39,7 @@ test('measureFunction runs specified number of times', async () => {
 
   expect(fn).toHaveBeenCalledTimes(20);
   expect(results.runs).toBe(20);
-  expect(results.durations).toHaveLength(20);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
   expect(results.counts).toHaveLength(20);
   expect(results.meanCount).toBe(1);
   expect(results.stdevCount).toBe(0);
@@ -51,7 +51,7 @@ test('measureFunction applies "warmupRuns" option', async () => {
 
   expect(fn).toHaveBeenCalledTimes(11);
   expect(results.runs).toBe(10);
-  expect(results.durations).toHaveLength(10);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
   expect(results.counts).toHaveLength(10);
   expect(results.meanCount).toBe(1);
   expect(results.stdevCount).toBe(0);
diff --git a/packages/measure/src/__tests__/measure-renders.test.tsx b/packages/measure/src/__tests__/measure-renders.test.tsx
index e50663a59..0917279ce 100644
--- a/packages/measure/src/__tests__/measure-renders.test.tsx
+++ b/packages/measure/src/__tests__/measure-renders.test.tsx
@@ -17,7 +17,7 @@ test('measureRenders run test given number of times', async () => {
   const scenario = jest.fn(() => Promise.resolve(null));
   const results = await measureRenders(, { runs: 20, scenario, writeFile: false });
   expect(results.runs).toBe(20);
-  expect(results.durations).toHaveLength(20);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
   expect(results.counts).toHaveLength(20);
   expect(results.meanCount).toBe(1);
   expect(results.stdevCount).toBe(0);
@@ -32,7 +32,7 @@ test('measureRenders applies "warmupRuns" option', async () => {
 
   expect(scenario).toHaveBeenCalledTimes(11);
   expect(results.runs).toBe(10);
-  expect(results.durations).toHaveLength(10);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
   expect(results.counts).toHaveLength(10);
   expect(results.meanCount).toBe(1);
   expect(results.stdevCount).toBe(0);
@@ -63,9 +63,9 @@ function IgnoreChildren(_: React.PropsWithChildren<{}>) {
 test('measureRenders does not measure wrapper execution', async () => {
   const results = await measureRenders(, { wrapper: IgnoreChildren, writeFile: false });
 
-  expect(results.runs).toBe(10);
-  expect(results.durations).toHaveLength(10);
   expect(results.counts).toHaveLength(10);
+  expect(results.runs).toBe(10);
+  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
   expect(results.meanDuration).toBe(0);
   expect(results.meanCount).toBe(0);
   expect(results.stdevDuration).toBe(0);
diff --git a/packages/measure/src/config.ts b/packages/measure/src/config.ts
index 352144bc7..388d06aae 100644
--- a/packages/measure/src/config.ts
+++ b/packages/measure/src/config.ts
@@ -6,7 +6,7 @@ export type Cleanup = () => void;
 type Config = {
   runs: number;
   warmupRuns: number;
-  dropOutliers: boolean;
+  removeOutliers: boolean;
   outputFile: string;
   testingLibrary?: TestingLibrary;
 };
@@ -14,7 +14,7 @@ type Config = {
 const defaultConfig: Config = {
   runs: 10,
   warmupRuns: 1,
-  dropOutliers: false,
+  removeOutliers: true,
   outputFile: process.env.REASSURE_OUTPUT_FILE ?? '.reassure/current.perf',
   testingLibrary: undefined,
 };
diff --git a/packages/measure/src/measure-async-function.tsx b/packages/measure/src/measure-async-function.tsx
index 968cf16b6..3e27dffc8 100644
--- a/packages/measure/src/measure-async-function.tsx
+++ b/packages/measure/src/measure-async-function.tsx
@@ -25,7 +25,7 @@ async function measureAsyncFunctionInternal(
 ): Promise<MeasureResults> {
   const runs = options?.runs ?? config.runs;
   const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
-  const dropOutliers = options?.dropOutliers ?? config.dropOutliers;
+  const removeOutliers = options?.removeOutliers ?? config.removeOutliers;
 
   showFlagsOutputIfNeeded();
 
@@ -39,5 +39,5 @@ async function measureAsyncFunctionInternal(
     runResults.push({ duration, count: 1 });
   }
 
-  return processRunResults(runResults, { warmupRuns, dropOutliers });
+  return processRunResults(runResults, { warmupRuns, removeOutliers });
 }
diff --git a/packages/measure/src/measure-function.tsx b/packages/measure/src/measure-function.tsx
index f7b656dfd..a9f075b16 100644
--- a/packages/measure/src/measure-function.tsx
+++ b/packages/measure/src/measure-function.tsx
@@ -6,7 +6,7 @@
 export interface MeasureFunctionOptions {
   runs?: number;
   warmupRuns?: number;
-  dropOutliers?: boolean;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
@@ -23,7 +23,7 @@ export async function measureFunction(fn: () => void, options?: MeasureFunctionO
 function measureFunctionInternal(fn: () => void, options?: MeasureFunctionOptions): MeasureResults {
   const runs = options?.runs ?? config.runs;
   const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
-  const dropOutliers = options?.dropOutliers ?? config.dropOutliers;
+  const removeOutliers = options?.removeOutliers ?? config.removeOutliers;
 
   showFlagsOutputIfNeeded();
@@ -37,5 +37,5 @@ function measureFunctionInternal(fn: () => void, options?: MeasureFunctionOption
     runResults.push({ duration, count: 1 });
   }
 
-  return processRunResults(runResults, { warmupRuns, dropOutliers });
+  return processRunResults(runResults, { warmupRuns, removeOutliers });
 }
diff --git a/packages/measure/src/measure-helpers.tsx b/packages/measure/src/measure-helpers.tsx
index 28a7d8ef5..741a9030c 100644
--- a/packages/measure/src/measure-helpers.tsx
+++ b/packages/measure/src/measure-helpers.tsx
@@ -10,23 +10,23 @@ export interface RunResult {
 
 type ProcessRunResultsOptions = {
   warmupRuns: number;
-  dropOutliers?: boolean;
+  removeOutliers?: boolean;
 };
 
 export function processRunResults(inputResults: RunResult[], options: ProcessRunResultsOptions): MeasureResults {
   const warmupResults = inputResults.slice(0, options.warmupRuns);
   const runResults = inputResults.slice(options.warmupRuns);
-  const { results, outliers } = options.dropOutliers ? findOutliers(runResults) : { results: runResults };
+  const { results, outliers } = options.removeOutliers ? findOutliers(runResults) : { results: runResults };
 
   const durations = results.map((result) => result.duration);
-  const meanDuration = math.mean(durations) as number;
+  const meanDuration = math.mean(...durations) as number;
   const stdevDuration = math.std(...durations);
   const warmupDurations = warmupResults.map((result) => result.duration);
   const outlierDurations = outliers?.map((result) => result.duration);
 
-  const counts = results.map((result) => result.count);
-  const meanCount = math.mean(counts) as number;
+  const counts = runResults.map((result) => result.count);
+  const meanCount = math.mean(...counts) as number;
   const stdevCount = math.std(...counts);
 
   return {
diff --git a/packages/measure/src/measure-renders.tsx b/packages/measure/src/measure-renders.tsx
index 755206efd..521a64967 100644
--- a/packages/measure/src/measure-renders.tsx
+++ b/packages/measure/src/measure-renders.tsx
@@ -16,7 +16,7 @@ logger.configure({
 export interface MeasureRendersOptions {
   runs?: number;
   warmupRuns?: number;
-  dropOutliers?: boolean;
+  removeOutliers?: boolean;
   wrapper?: React.ComponentType<{ children: React.ReactElement }>;
   scenario?: (screen: any) => Promise<void>;
   writeFile?: boolean;
 }
@@ -56,7 +56,7 @@ async function measureRendersInternal(
   const runs = options?.runs ?? config.runs;
   const scenario = options?.scenario;
   const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
-  const dropOutliers = options?.dropOutliers ?? config.dropOutliers;
+  const removeOutliers = options?.removeOutliers ?? config.removeOutliers;
 
   const { render, cleanup } = resolveTestingLibrary();
   const testingLibrary = getTestingLibrary();
@@ -114,7 +114,7 @@ async function measureRendersInternal(
   revertRenderPolyfills();
 
   return {
-    ...processRunResults(runResults, { warmupRuns, dropOutliers }),
+    ...processRunResults(runResults, { warmupRuns, removeOutliers }),
     issues: {
       initialUpdateCount: initialRenderCount - 1,
       redundantUpdates: detectRedundantUpdates(renderJsonTrees, initialRenderCount),
diff --git a/packages/reassure/README.md b/packages/reassure/README.md
index 9640a4dbc..d1e73b5bf 100644
--- a/packages/reassure/README.md
+++ b/packages/reassure/README.md
@@ -370,6 +370,7 @@ async function measureRenders(
 interface MeasureRendersOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   wrapper?: React.ComponentType<{ children: ReactElement }>;
   scenario?: (view?: RenderResult) => Promise<void>;
   writeFile?: boolean;
@@ -378,9 +379,10 @@ interface MeasureRendersOptions {
 
 - **`runs`**: number of runs per series for the particular test
 - **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs (default 1).
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
 - **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results; only the wrapped component is measured.
 - **`scenario`**: a custom async function, which defines user interaction within the UI by utilising RNTL or RTL functions
-- **`writeFile`**: (default `true`) should write output to file.
+- **`writeFile`**: should write output to file (default `true`)
 
 #### `measureFunction` function
 
@@ -399,13 +401,15 @@ async function measureFunction(
 interface MeasureFunctionOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
-- **`writeFile`**: (default `true`) should write output to file.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
+- **`writeFile`**: should write output to file (default `true`)
 
 #### `measureAsyncFunction` function
 
@@ -426,13 +430,15 @@ async function measureAsyncFunction(
 interface MeasureAsyncFunctionOptions {
   runs?: number;
   warmupRuns?: number;
+  removeOutliers?: boolean;
   writeFile?: boolean;
 }
 ```
 
 - **`runs`**: number of runs per series for the particular test
-- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
-- **`writeFile`**: (default `true`) should write output to file.
+- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
+- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
+- **`writeFile`**: should write output to file (default `true`)
 
 ### Configuration
diff --git a/test-apps/native/jestSetup.js b/test-apps/native/jestSetup.js
index f2790b1d9..161690848 100644
--- a/test-apps/native/jestSetup.js
+++ b/test-apps/native/jestSetup.js
@@ -4,5 +4,4 @@ import { configure } from 'reassure';
 configure({
   testingLibrary: 'react-native',
   verbose: true,
-  dropOutliers: true,
 });
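
Note for reviewers: the patch routes outlier removal through a `findOutliers` helper that is referenced in `measure-helpers.tsx` but not shown in this diff. As a rough illustration of what a filter with that `{ results, outliers }` shape could look like, here is a minimal sketch using Tukey's fences (the 1.5 × IQR thresholds and the implementation details are assumptions, not the actual `@callstack/reassure-measure` code):

```ts
interface RunResult {
  duration: number;
  count: number;
}

// Hypothetical sketch of an IQR-based outlier filter. Runs whose duration
// falls outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] are treated as outliers; the
// real findOutliers helper may use different statistics.
function findOutliers(runResults: RunResult[]): { results: RunResult[]; outliers: RunResult[] } {
  if (runResults.length === 0) {
    return { results: [], outliers: [] };
  }

  // Sort a copy of the durations to locate the quartiles.
  const sorted = runResults.map((run) => run.duration).sort((a, b) => a - b);
  const q1 = sorted[Math.floor((sorted.length - 1) * 0.25)];
  const q3 = sorted[Math.floor((sorted.length - 1) * 0.75)];
  const iqr = q3 - q1;
  const lowerFence = q1 - 1.5 * iqr;
  const upperFence = q3 + 1.5 * iqr;

  const results: RunResult[] = [];
  const outliers: RunResult[] = [];
  for (const run of runResults) {
    if (run.duration < lowerFence || run.duration > upperFence) {
      outliers.push(run);
    } else {
      results.push(run);
    }
  }

  return { results, outliers };
}
```

Since the changeset flips the default to `true`, a test that needs raw, unfiltered durations can opt out per call, e.g. `await measureFunction(fn, { removeOutliers: false })`, which mirrors how the updated tests account for `outlierDurations` when asserting on run counts.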