2 changes: 1 addition & 1 deletion .changeset/purple-jokes-rhyme.md
@@ -2,4 +2,4 @@
'@callstack/reassure-measure': minor
---

Added a `dropOutliers` option to detect and drop statistical outliers
Added a `removeOutliers` option to detect and drop statistical outliers
7 changes: 7 additions & 0 deletions .changeset/violet-hornets-call.md
@@ -0,0 +1,7 @@
---
'reassure': patch
'@callstack/reassure-measure': patch
'@callstack/reassure-compare': patch
---

Enable outlier detection by default
16 changes: 11 additions & 5 deletions README.md
@@ -370,6 +370,7 @@ async function measureRenders(
interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
removeOutliers?: boolean;
wrapper?: React.ComponentType<{ children: ReactElement }>;
scenario?: (view?: RenderResult) => Promise<any>;
writeFile?: boolean;
@@ -378,9 +379,10 @@ interface MeasureRendersOptions {

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs (default 1).
- **`removeOutliers`**: should detect and remove statistical outlier results (default: `true`; see the usage sketch below)
- **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results; only the wrapped component is measured.
- **`scenario`**: a custom async function, which defines user interaction within the UI by utilising RNTL or RTL functions
- **`writeFile`**: (default `true`) should write output to file.
- **`writeFile`**: should write output to file (default `true`)
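
A minimal usage sketch for these options; the component and the `reassure` import path are illustrative assumptions, not part of this PR:

```tsx
import React from 'react';
import { Text } from 'react-native';
import { measureRenders } from 'reassure';

// Hypothetical component used only to illustrate the options above.
function ComponentUnderTest() {
  return <Text>Hello</Text>;
}

test('ComponentUnderTest render performance', async () => {
  // `removeOutliers` now defaults to `true`; it is spelled out here only for clarity.
  await measureRenders(<ComponentUnderTest />, { runs: 10, warmupRuns: 1, removeOutliers: true });
});
```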

#### `measureFunction` function

@@ -399,13 +401,15 @@ async function measureFunction(
interface MeasureFunctionOptions {
runs?: number;
warmupRuns?: number;
removeOutliers?: boolean;
writeFile?: boolean;
}
```

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`writeFile`**: (default `true`) should write output to file.
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
- **`removeOutliers`**: should remove statistical outlier results (default: `true`; see the sketch below)
- **`writeFile`**: should write output to file (default `true`)
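
A minimal sketch of opting out of outlier removal for a single `measureFunction` test; the `fib` helper mirrors the docs' own example:

```ts
import { measureFunction } from 'reassure';

// Deliberately slow function, as used in the existing docs examples.
function fib(n: number): number {
  return n <= 1 ? n : fib(n - 1) + fib(n - 2);
}

test('fib 30', async () => {
  // Keep every measured run, including statistical outliers.
  await measureFunction(() => fib(30), { runs: 10, removeOutliers: false });
});
```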

#### `measureAsyncFunction` function

@@ -426,13 +430,15 @@ async function measureAsyncFunction(
interface MeasureAsyncFunctionOptions {
runs?: number;
warmupRuns?: number;
removeOutliers?: boolean;
writeFile?: boolean;
}
```

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`writeFile`**: (default `true`) should write output to file.
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
- **`removeOutliers`**: should remove statistical outlier results (default: `true`; see the sketch below)
- **`writeFile`**: should write output to file (default `true`)
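
A minimal sketch for `measureAsyncFunction`; the `loadData` workload is a hypothetical placeholder:

```ts
import { measureAsyncFunction } from 'reassure';

// Hypothetical async workload; replace with the code you actually want to measure.
async function loadData(): Promise<number[]> {
  return Promise.resolve(Array.from({ length: 1000 }, (_, i) => i * i));
}

test('loadData performance', async () => {
  await measureAsyncFunction(() => loadData(), { runs: 10, warmupRuns: 1 });
});
```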

### Configuration

18 changes: 12 additions & 6 deletions docusaurus/docs/api.md
@@ -50,17 +50,19 @@ test('Test with scenario', async () => {
interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
removeOutliers?: boolean;
wrapper?: React.ComponentType<{ children: ReactElement }>;
scenario?: (view?: RenderResult) => Promise<any>;
writeFile?: boolean;
}
```

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs (default: 1)
- **`removeOutliers`**: should remove statistical outlier results (default: `true`; see the example below)
- **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results; only the wrapped component is measured.
- **`scenario`**: a custom async function, which defines user interaction within the UI by utilizing RNTL functions
- **`writeFile`**: (default `true`) should write output to file.
- **`writeFile`**: should write output to file (default `true`)
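
A hedged sketch of disabling the new default for a single noisy test; the component is illustrative:

```tsx
import React from 'react';
import { Text } from 'react-native';
import { measureRenders } from 'reassure';

test('noisy component, raw runs kept', async () => {
  // Opt out of outlier removal for this test only; all 20 measured runs are reported.
  await measureRenders(<Text>Noisy</Text>, { runs: 20, removeOutliers: false });
});
```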

### `measureFunction` function {#measure-function}

@@ -91,13 +93,15 @@ test('fib 30', async () => {
interface MeasureFunctionOptions {
runs?: number;
warmupRuns?: number;
removeOutliers?: boolean;
writeFile?: boolean;
}
```

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`writeFile`**: (default `true`) should write output to file.
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
- **`removeOutliers`**: should remove statistical outlier results (default: `true`; see the example below)
- **`writeFile`**: should write output to file (default `true`)
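
The resolved results expose both kept and removed runs; a sketch of checking them, with field names following the tests added in this PR:

```ts
import { measureFunction } from 'reassure';

test('kept durations plus outliers equal requested runs', async () => {
  const results = await measureFunction(() => Array.from({ length: 1000 }, (_, i) => i * i), {
    runs: 20,
    writeFile: false,
  });

  // With `removeOutliers: true` (the default), removed runs land in `outlierDurations`.
  expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
});
```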

### `measureAsyncFunction` function {#measure-async-function}

@@ -136,13 +140,15 @@ test('fib 30', async () => {
interface MeasureAsyncFunctionOptions {
runs?: number;
warmupRuns?: number;
removeOutliers?: boolean;
writeFile?: boolean;
}
```

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`writeFile`**: (default `true`) should write output to file.
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs
- **`removeOutliers`**: should remove statistical outlier results (default: `true`)
- **`writeFile`**: should write output to file (default `true`)

## Configuration

2 changes: 1 addition & 1 deletion packages/measure/CHANGELOG.md
@@ -5,7 +5,7 @@
### Minor Changes

- c51fb5f: feat: add measureAsyncFunction
- 59b21d4: Added a `dropOutliers` option to detect and drop statistical outliers
- 59b21d4: Added a `removeOutliers` option to detect and drop statistical outliers

### Patch Changes

70 changes: 70 additions & 0 deletions packages/measure/src/__tests__/measure-async-function.test.tsx
@@ -0,0 +1,70 @@
/* eslint-disable promise/prefer-await-to-then */
import stripAnsi from 'strip-ansi';
import { measureAsyncFunction } from '../measure-async-function';
import { setHasShownFlagsOutput } from '../output';

// Exponentially slow function
function fib(n: number): number {
if (n <= 1) {
return n;
}

return fib(n - 1) + fib(n - 2);
}

test('measureAsyncFunction captures results', async () => {
const fn = jest.fn(() => Promise.resolve().then(() => fib(5)));
const results = await measureAsyncFunction(fn, { runs: 1, warmupRuns: 0, writeFile: false });

expect(fn).toHaveBeenCalledTimes(1);
expect(results.runs).toBe(1);
expect(results.counts).toEqual([1]);
});

test('measureAsyncFunction runs specified number of times', async () => {
const fn = jest.fn(() => Promise.resolve().then(() => fib(5)));
const results = await measureAsyncFunction(fn, { runs: 20, warmupRuns: 0, writeFile: false });

expect(fn).toHaveBeenCalledTimes(20);
expect(results.runs).toBe(20);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
expect(results.counts).toHaveLength(20);
expect(results.meanCount).toBe(1);
expect(results.stdevCount).toBe(0);
});

test('measureAsyncFunction applies "warmupRuns" option', async () => {
const fn = jest.fn(() => Promise.resolve().then(() => fib(5)));
const results = await measureAsyncFunction(fn, { runs: 10, warmupRuns: 1, writeFile: false });

expect(fn).toHaveBeenCalledTimes(11);
expect(results.runs).toBe(10);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
expect(results.counts).toHaveLength(10);
expect(results.meanCount).toBe(1);
expect(results.stdevCount).toBe(0);
});

const errorsToIgnore = ['❌ Measure code is running under incorrect Node.js configuration.'];
const realConsole = jest.requireActual('console') as Console;

beforeEach(() => {
jest.spyOn(realConsole, 'error').mockImplementation((message) => {
if (!errorsToIgnore.some((error) => message.includes(error))) {
realConsole.error(message);
}
});
});

test('measureAsyncFunction should log error when running under incorrect node flags', async () => {
setHasShownFlagsOutput(false);
const results = await measureAsyncFunction(jest.fn(), { runs: 1, writeFile: false });

expect(results.runs).toBe(1);
const consoleErrorCalls = jest.mocked(realConsole.error).mock.calls;
expect(stripAnsi(consoleErrorCalls[0][0])).toMatchInlineSnapshot(`
"❌ Measure code is running under incorrect Node.js configuration.
Performance test code should be run in Jest with certain Node.js flags to increase measurements stability.
Make sure you use the Reassure CLI and run it using "reassure" command."
`);
});
4 changes: 2 additions & 2 deletions packages/measure/src/__tests__/measure-function.test.tsx
@@ -39,7 +39,7 @@ test('measureFunction runs specified number of times', async () => {

expect(fn).toHaveBeenCalledTimes(20);
expect(results.runs).toBe(20);
expect(results.durations).toHaveLength(20);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
expect(results.counts).toHaveLength(20);
expect(results.meanCount).toBe(1);
expect(results.stdevCount).toBe(0);
@@ -51,7 +51,7 @@ test('measureFunction applies "warmupRuns" option', async () => {

expect(fn).toHaveBeenCalledTimes(11);
expect(results.runs).toBe(10);
expect(results.durations).toHaveLength(10);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
expect(results.counts).toHaveLength(10);
expect(results.meanCount).toBe(1);
expect(results.stdevCount).toBe(0);
8 changes: 4 additions & 4 deletions packages/measure/src/__tests__/measure-renders.test.tsx
@@ -17,7 +17,7 @@ test('measureRenders run test given number of times', async () => {
const scenario = jest.fn(() => Promise.resolve(null));
const results = await measureRenders(<View />, { runs: 20, scenario, writeFile: false });
expect(results.runs).toBe(20);
expect(results.durations).toHaveLength(20);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(20);
expect(results.counts).toHaveLength(20);
expect(results.meanCount).toBe(1);
expect(results.stdevCount).toBe(0);
@@ -32,7 +32,7 @@ test('measureRenders applies "warmupRuns" option', async () => {

expect(scenario).toHaveBeenCalledTimes(11);
expect(results.runs).toBe(10);
expect(results.durations).toHaveLength(10);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
expect(results.counts).toHaveLength(10);
expect(results.meanCount).toBe(1);
expect(results.stdevCount).toBe(0);
@@ -63,9 +63,9 @@ function IgnoreChildren(_: React.PropsWithChildren<{}>) {

test('measureRenders does not measure wrapper execution', async () => {
const results = await measureRenders(<View />, { wrapper: IgnoreChildren, writeFile: false });
expect(results.runs).toBe(10);
expect(results.durations).toHaveLength(10);
expect(results.counts).toHaveLength(10);
expect(results.runs).toBe(10);
expect(results.durations.length + (results.outlierDurations?.length ?? 0)).toBe(10);
expect(results.meanDuration).toBe(0);
expect(results.meanCount).toBe(0);
expect(results.stdevDuration).toBe(0);
Expand Down
4 changes: 2 additions & 2 deletions packages/measure/src/config.ts
@@ -6,15 +6,15 @@ export type Cleanup = () => void;
type Config = {
runs: number;
warmupRuns: number;
dropOutliers: boolean;
removeOutliers: boolean;
outputFile: string;
testingLibrary?: TestingLibrary;
};

const defaultConfig: Config = {
runs: 10,
warmupRuns: 1,
dropOutliers: false,
removeOutliers: true,
outputFile: process.env.REASSURE_OUTPUT_FILE ?? '.reassure/current.perf',
testingLibrary: undefined,
};
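
With `removeOutliers` now defaulting to `true`, a project that prefers the previous behaviour can opt out globally. A hedged sketch, assuming the package's `configure` helper accepts the same fields as `Config`:

```ts
// e.g. in a Jest setup file loaded before the performance tests
import { configure } from 'reassure';

configure({
  // Restore the pre-change behaviour of keeping every measured run.
  removeOutliers: false,
});
```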
4 changes: 2 additions & 2 deletions packages/measure/src/measure-async-function.tsx
@@ -25,7 +25,7 @@ async function measureAsyncFunctionInternal(
): Promise<MeasureResults> {
const runs = options?.runs ?? config.runs;
const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
const dropOutliers = options?.dropOutliers ?? config.dropOutliers;
const removeOutliers = options?.removeOutliers ?? config.removeOutliers;

showFlagsOutputIfNeeded();

@@ -39,5 +39,5 @@
runResults.push({ duration, count: 1 });
}

return processRunResults(runResults, { warmupRuns, dropOutliers });
return processRunResults(runResults, { warmupRuns, removeOutliers });
}
6 changes: 3 additions & 3 deletions packages/measure/src/measure-function.tsx
@@ -6,7 +6,7 @@ import { showFlagsOutputIfNeeded, writeTestStats } from './output';
export interface MeasureFunctionOptions {
runs?: number;
warmupRuns?: number;
dropOutliers?: boolean;
removeOutliers?: boolean;
writeFile?: boolean;
}

@@ -23,7 +23,7 @@ export async function measureFunction(fn: () => void, options?: MeasureFunctionO
function measureFunctionInternal(fn: () => void, options?: MeasureFunctionOptions): MeasureResults {
const runs = options?.runs ?? config.runs;
const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
const dropOutliers = options?.dropOutliers ?? config.dropOutliers;
const removeOutliers = options?.removeOutliers ?? config.removeOutliers;

showFlagsOutputIfNeeded();

@@ -37,5 +37,5 @@
runResults.push({ duration, count: 1 });
}

return processRunResults(runResults, { warmupRuns, dropOutliers });
return processRunResults(runResults, { warmupRuns, removeOutliers });
}
10 changes: 5 additions & 5 deletions packages/measure/src/measure-helpers.tsx
@@ -10,23 +10,23 @@ export interface RunResult {

type ProcessRunResultsOptions = {
warmupRuns: number;
dropOutliers?: boolean;
removeOutliers?: boolean;
};

export function processRunResults(inputResults: RunResult[], options: ProcessRunResultsOptions): MeasureResults {
const warmupResults = inputResults.slice(0, options.warmupRuns);
const runResults = inputResults.slice(options.warmupRuns);

const { results, outliers } = options.dropOutliers ? findOutliers(runResults) : { results: runResults };
const { results, outliers } = options.removeOutliers ? findOutliers(runResults) : { results: runResults };

const durations = results.map((result) => result.duration);
const meanDuration = math.mean(durations) as number;
const meanDuration = math.mean(...durations) as number;
const stdevDuration = math.std(...durations);
const warmupDurations = warmupResults.map((result) => result.duration);
const outlierDurations = outliers?.map((result) => result.duration);

const counts = results.map((result) => result.count);
const meanCount = math.mean(counts) as number;
const counts = runResults.map((result) => result.count);
const meanCount = math.mean(...counts) as number;
const stdevCount = math.std(...counts);

return {
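
The `findOutliers` helper itself is not part of this diff; as a rough illustration only, an IQR-based split over durations might look like the sketch below. This is an assumption, not the actual implementation:

```ts
interface RunResult {
  duration: number;
  count: number;
}

// Illustrative 1.5 * IQR fence over run durations; the real `findOutliers` may differ.
function findOutliersSketch(runs: RunResult[]): { results: RunResult[]; outliers: RunResult[] } {
  if (runs.length === 0) return { results: [], outliers: [] };

  const sorted = [...runs].sort((a, b) => a.duration - b.duration);
  const quantile = (p: number) => sorted[Math.floor((sorted.length - 1) * p)].duration;
  const iqr = quantile(0.75) - quantile(0.25);
  const lower = quantile(0.25) - 1.5 * iqr;
  const upper = quantile(0.75) + 1.5 * iqr;

  return {
    results: runs.filter((r) => r.duration >= lower && r.duration <= upper),
    outliers: runs.filter((r) => r.duration < lower || r.duration > upper),
  };
}
```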
6 changes: 3 additions & 3 deletions packages/measure/src/measure-renders.tsx
@@ -16,7 +16,7 @@ logger.configure({
export interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
dropOutliers?: boolean;
removeOutliers?: boolean;
wrapper?: React.ComponentType<{ children: React.ReactElement }>;
scenario?: (screen: any) => Promise<any>;
writeFile?: boolean;
@@ -56,7 +56,7 @@ async function measureRendersInternal(
const runs = options?.runs ?? config.runs;
const scenario = options?.scenario;
const warmupRuns = options?.warmupRuns ?? config.warmupRuns;
const dropOutliers = options?.dropOutliers ?? config.dropOutliers;
const removeOutliers = options?.removeOutliers ?? config.removeOutliers;

const { render, cleanup } = resolveTestingLibrary();
const testingLibrary = getTestingLibrary();
@@ -114,7 +114,7 @@ async function measureRendersInternal(
revertRenderPolyfills();

return {
...processRunResults(runResults, { warmupRuns, dropOutliers }),
...processRunResults(runResults, { warmupRuns, removeOutliers }),
issues: {
initialUpdateCount: initialRenderCount - 1,
redundantUpdates: detectRedundantUpdates(renderJsonTrees, initialRenderCount),