refactor: rename measurePerformance to measureRenders (#433)
* refactor: rename measurePerformance to measureRenders

* refactor: expose legacy `measurePerformance` alias

* chore: file naming

* chore: add changeset

* refactor: add `writeFile` option

* refactor: warn once

* docs: tweaks

* refactor: rename MeasureRendersOptions

* refactor: rename resetToDefaults
mdjastrzebski authored Jan 5, 2024
1 parent d4a2001 commit 4352279
Showing 21 changed files with 179 additions and 108 deletions.
8 changes: 8 additions & 0 deletions .changeset/chatty-paws-turn.md
@@ -0,0 +1,8 @@
---
'@callstack/reassure-measure': minor
'reassure': minor
'test-app-native': minor
---

- Rename `measurePerformance` to `measureRenders`.
- Add `writeFile` option to `measureRenders`/`measureFunction`.
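
A minimal sketch of the rename and the new option described above (`ComponentUnderTest` is a placeholder component; the legacy `measurePerformance` import remains available as a deprecated alias, per the commit notes):

```ts
// ComponentUnderTest.perf-test.tsx
import { measureRenders } from 'reassure'; // previously: import { measurePerformance } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

test('Simple test', async () => {
  // Same call signature as before; the new writeFile option can skip writing results to disk.
  await measureRenders(<ComponentUnderTest />, { writeFile: false });
});
```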
35 changes: 19 additions & 16 deletions README.md
@@ -94,11 +94,11 @@ Now that the library is installed, you can write your first test scenario in a f

```ts
// ComponentUnderTest.perf-test.tsx
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

test('Simple test', async () => {
await measurePerformance(<ComponentUnderTest />);
await measureRenders(<ComponentUnderTest />);
});
```

@@ -111,7 +111,7 @@ This test will measure render times of `ComponentUnderTest` during mounting and
If your component contains any async logic or you want to test some interaction, you should pass the `scenario` option:

```ts
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';
import { ComponentUnderTest } from './ComponentUnderTest';

@@ -121,7 +121,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

await measurePerformance(<ComponentUnderTest />, { scenario });
await measureRenders(<ComponentUnderTest />, { scenario });
});
```

@@ -130,7 +130,7 @@ The body of the `scenario` function is using familiar React Native Testing Libra
In case of using a version of React Native Testing Library lower than v10.1.0, where [`screen` helper](https://callstack.github.io/react-native-testing-library/docs/api/#screen) is not available, the `scenario` function provides it as its first argument:

```ts
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { fireEvent } from '@testing-library/react-native';

test('Test with scenario', async () => {
@@ -139,7 +139,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

await measurePerformance(<ComponentUnderTest />, { scenario });
await measureRenders(<ComponentUnderTest />, { scenario });
});
```

@@ -352,34 +352,36 @@ Looking at the example, you can notice that test scenarios can be assigned to ce

### Measurements

#### `measurePerformance` function
#### `measureRenders` function

Custom wrapper for the RNTL `render` function responsible for rendering the passed screen inside a `React.Profiler` component,
measuring its performance and writing results to the output file. You can use the optional `options` object to customize aspects
of the testing.

```ts
async function measurePerformance(
async function measureRenders(
ui: React.ReactElement,
options?: MeasureOptions,
options?: MeasureRendersOptions,
): Promise<MeasureResults> {
```

#### `MeasureOptions` type
#### `MeasureRendersOptions` type

```ts
interface MeasureOptions {
interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
wrapper?: React.ComponentType<{ children: ReactElement }>;
scenario?: (view?: RenderResult) => Promise<any>;
writeFile?: boolean;
}
```

- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs (default 1).
- **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results; only the wrapped component is measured.
- **`scenario`**: a custom async function that defines user interaction within the UI using RNTL or RTL functions
- **`writeFile`**: whether to write measurement results to the output file (default `true`); see the sketch below.
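
A short sketch of passing these options (values are illustrative only; `ComponentUnderTest` is a placeholder):

```ts
import { measureRenders } from 'reassure';
import { ComponentUnderTest } from './ComponentUnderTest';

test('measures with custom options', async () => {
  await measureRenders(<ComponentUnderTest />, {
    runs: 20, // more runs per series for tighter statistics
    warmupRuns: 2, // extra runs discarded before measuring
    writeFile: false, // do not write results to the output file
  });
});
```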

#### `measureFunction` function

@@ -435,10 +437,11 @@ const defaultConfig: Config = {
```

**`runs`**: the number of repeated runs in a series per test (allows for higher accuracy by aggregating more data). Should be handled with care.

- **`warmupRuns`**: the number of additional warmup runs that will be done and discarded before the actual runs.
**`outputFile`**: the name of the file the records will be saved to
**`verbose`**: make Reassure log more, e.g. for debugging purposes
**`testingLibrary`**: where to look for `render` and `cleanup` functions, supported values `'react-native'`, `'react'` or object providing custom `render` and `cleanup` functions
**`outputFile`**: the name of the file the records will be saved to
**`verbose`**: make Reassure log more, e.g. for debugging purposes
**`testingLibrary`**: where to look for `render` and `cleanup` functions, supported values `'react-native'`, `'react'` or object providing custom `render` and `cleanup` functions

#### `configure` function

@@ -448,10 +451,10 @@ function configure(customConfig: Partial<Config>): void;

The `configure` function can override the default config parameters.
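
For example, a config override might look like this (a sketch mirroring the example in the API docs; values are illustrative):

```ts
import { configure } from 'reassure';

configure({
  runs: 7,
  verbose: true,
});
```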

#### `resetToDefault` function
#### `resetToDefaults` function

```ts
resetToDefault(): void
resetToDefaults(): void
```

Reset the current config to the original `defaultConfig` object.
44 changes: 27 additions & 17 deletions docusaurus/docs/api.md
@@ -7,24 +7,30 @@ sidebar_position: 4

## Measurements

### `measurePerformance` function
### `measureRenders()` function {#measure-renders}

Custom wrapper for the RNTL `render` function responsible for rendering the passed screen inside a `React.Profiler` component,
:::info

Prior to version 1.0, this function was named `measurePerformance`.

:::

Custom wrapper for the RNTL/RTL `render` function responsible for rendering the passed screen inside a `React.Profiler` component,
measuring its performance and writing results to the output file. You can use the optional `options` object to customize aspects
of the testing
of the testing.

```ts
async function measurePerformance(
async function measureRenders(
ui: React.ReactElement,
options?: MeasureOptions,
options?: MeasureRendersOptions,
): Promise<MeasureResults> {
```
#### Example
#### Example {#measure-renders-example}
```ts
// sample.perf-test.tsx
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';
import { ComponentUnderTest } from './ComponentUnderTest';

@@ -34,27 +40,29 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

await measurePerformance(<ComponentUnderTest />, { scenario });
await measureRenders(<ComponentUnderTest />, { scenario });
});
```
### `MeasureOptions` type
### `MeasureRendersOptions` type {#measure-renders-options}
```ts
interface MeasureOptions {
interface MeasureRendersOptions {
runs?: number;
warmupRuns?: number;
wrapper?: React.ComponentType<{ children: ReactElement }>;
scenario?: (view?: RenderResult) => Promise<any>;
writeFile?: boolean;
}
```
- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`wrapper`**: React component, such as a `Provider`, which the `ui` will be wrapped with. Note: the render duration of the `wrapper` itself is excluded from the results; only the wrapped component is measured (see the sketch after this list).
- **`scenario`**: a custom async function that defines user interaction within the UI using RNTL or RTL functions
- **`writeFile`**: whether to write measurement results to the output file (default `true`)
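
A sketch combining `wrapper` and `scenario` (`ThemeProvider` and `ComponentUnderTest` are placeholder components, not part of the library):

```ts
import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';
import { ThemeProvider } from './ThemeProvider';
import { ComponentUnderTest } from './ComponentUnderTest';

test('measures only the wrapped component', async () => {
  const scenario = async () => {
    fireEvent.press(screen.getByText('Go'));
    await screen.findByText('Done');
  };

  // The ThemeProvider render time is excluded; only ComponentUnderTest is measured.
  await measureRenders(<ComponentUnderTest />, { wrapper: ThemeProvider, scenario });
});
```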
### `measureFunction` function
### `measureFunction` function {#measure-function}
Allows you to wrap any synchronous function, measure its performance and write results to the output file. You can use the optional `options` object to customize aspects of the testing.
@@ -65,7 +73,7 @@ async function measureFunction(
): Promise<MeasureResults> {
```
#### Example
#### Example {#measure-function-example}
```ts
// sample.perf-test.tsx
@@ -77,17 +85,19 @@ test('fib 30', async () => {
});
```
### `MeasureFunctionOptions` type
### `MeasureFunctionOptions` type {#measure-function-options}
```ts
interface MeasureFunctionOptions {
runs?: number;
warmupRuns?: number;
writeFile?: boolean;
}
```
- **`runs`**: number of runs per series for the particular test
- **`warmupRuns`**: number of additional warmup runs that will be done and discarded before the actual runs.
- **`writeFile`**: whether to write measurement results to the output file (default `true`); see the sketch below.
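
A sketch extending the `fib` example above with explicit options (the `fib` helper and its module path are assumptions for illustration):

```ts
import { measureFunction } from 'reassure';
import { fib } from './fib';

test('fib 30 with options', async () => {
  await measureFunction(() => fib(30), {
    runs: 20,
    warmupRuns: 2,
    writeFile: false, // keep this measurement out of the output file
  });
});
```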
## Configuration
@@ -133,7 +143,7 @@ function configure(customConfig: Partial<Config>): void;
You can use the `configure` function to override the default config parameters.
#### Example
#### Example {#configure-example}
```ts
import { configure } from 'reassure';
@@ -144,13 +154,13 @@ configure({
});
```
### `resetToDefault` function
### `resetToDefaults` function {#reset-to-defaults}
```ts
resetToDefault(): void
resetToDefaults(): void
```
Reset current config to the original `defaultConfig` object. You can call `resetToDefault()` anywhere in your performance test file.
Reset current config to the original `defaultConfig` object. You can call `resetToDefaults()` anywhere in your performance test file.
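
For instance, a performance test file might tighten the config for one suite and restore the defaults afterwards (a sketch; assumes both helpers are imported from `reassure` alongside `configure`):

```ts
import { configure, resetToDefaults } from 'reassure';

beforeAll(() => {
  configure({ runs: 20 });
});

afterAll(() => {
  resetToDefaults();
});
```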
### Environmental variables
15 changes: 8 additions & 7 deletions docusaurus/docs/installation.md
@@ -44,10 +44,10 @@ Now that the library is installed, you can write your first test scenario in a fi

```ts
// ComponentUnderTest.perf-test.tsx
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';

test('Simple test', async () => {
await measurePerformance(<ComponentUnderTest />);
await measureRenders(<ComponentUnderTest />);
});
```

@@ -60,7 +60,7 @@ This test will measure render times of `ComponentUnderTest` during mounting and
If your component contains any async logic or you want to test some interaction, you should pass the `scenario` option:

```ts
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { screen, fireEvent } from '@testing-library/react-native';

test('Test with scenario', async () => {
@@ -69,7 +69,7 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

await measurePerformance(<ComponentUnderTest />, { scenario });
await measureRenders(<ComponentUnderTest />, { scenario });
});
```

@@ -78,7 +78,7 @@ The body of the `scenario` function is using familiar React Native Testing Libra
In case of using a version of React Native Testing Library lower than v10.1.0, where [`screen` helper](https://callstack.github.io/react-native-testing-library/docs/api/#screen) is not available, the `scenario` function provides it as its first argument:

```ts
import { measurePerformance } from 'reassure';
import { measureRenders } from 'reassure';
import { fireEvent } from '@testing-library/react-native';

test('Test with scenario', async () => {
@@ -87,14 +87,15 @@ test('Test with scenario', async () => {
await screen.findByText('Done');
};

await measurePerformance(<ComponentUnderTest />, { scenario });
await measureRenders(<ComponentUnderTest />, { scenario });
});
```

If your test contains any async changes, you will need to make sure that the scenario waits for these changes to settle, e.g. using
`findBy` queries, `waitFor` or `waitForElementToBeRemoved` functions from RNTL.

For more examples look into our example apps:

- [React Native (CLI)](https://github.com/callstack/reassure-examples/tree/main/examples/native)
- [React Native (Expo)](https://github.com/callstack/reassure-examples/tree/main/examples/native-expo)
- [React (Next.js)](https://github.com/callstack/reassure-examples/tree/main/examples/web-nextjs)
@@ -253,7 +254,7 @@ for performance tests you can add the following override to your `.eslintrc` file:
rules: {
'jest/expect-expect': [
'error',
{ assertFunctionNames: ['expect', 'measurePerformance'] },
{ assertFunctionNames: ['expect', 'measureRenders'] },
],
}
```
2 changes: 1 addition & 1 deletion packages/reassure-compare/src/output/json.ts
@@ -12,7 +12,7 @@ export async function writeToJson(filePath: string, data: CompareResult) {
} catch (error) {
logger.error(`❌ Could not write JSON output file ${filePath}`);
logger.error(`🔗 ${path.resolve(filePath)}`);
logger.error(error);
logger.error('Error details:', error);
throw error;
}
}
2 changes: 1 addition & 1 deletion packages/reassure-compare/src/output/markdown.ts
@@ -43,7 +43,7 @@ async function writeToFile(filePath: string, content: string) {
} catch (error) {
logger.error(`❌ Could not write markdown output file ${filePath}`);
logger.error(`🔗 ${path.resolve(filePath)}`);
logger.error(error);
logger.error('Error details:', error);
throw error;
}
}
1 change: 1 addition & 0 deletions packages/reassure-logger/src/index.ts
@@ -1 +1,2 @@
export * as logger from './logger';
export { warnOnce } from './warn-once';
18 changes: 9 additions & 9 deletions packages/reassure-logger/src/logger.ts
@@ -28,32 +28,32 @@ export function configure(options: Partial<LoggerOptions>) {
// Jest is wrapping console.* calls, so we need to get the raw console object
const rawConsole = require('console') as typeof console;

export function error(...args: unknown[]) {
rawConsole.error(colorError(...args));
export function error(message?: string, ...args: unknown[]) {
rawConsole.error(colorError(message, ...args));
}

export function warn(...args: unknown[]) {
export function warn(message?: string, ...args: unknown[]) {
if (config.silent) return;

rawConsole.warn(colorWarn(...args));
rawConsole.warn(colorWarn(message, ...args));
}

export function log(...args: unknown[]) {
export function log(message?: string, ...args: unknown[]) {
if (config.silent) return;

rawConsole.log(...args);
rawConsole.log(message, ...args);
}

export function verbose(...args: unknown[]) {
export function verbose(message?: string, ...args: unknown[]) {
if (!config.verbose || config.silent) return;

rawConsole.log(colorVerbose(...args));
rawConsole.log(colorVerbose(message, ...args));
}

export function color(color: keyof typeof colors, ...args: unknown[]) {
if (config.silent) return;

return rawConsole.log(chalk.hex(colors[color])(args));
return rawConsole.log(chalk.hex(colors[color])(...args));
}

/** Log message that indicates progress of operation, does not output the trailing newline. */
12 changes: 12 additions & 0 deletions packages/reassure-logger/src/warn-once.ts
@@ -0,0 +1,12 @@
import { warn } from './logger';

const warned = new Set<string>();

export function warnOnce(message: string, ...args: unknown[]) {
if (warned.has(message)) {
return;
}

warn(message, ...args);
warned.add(message);
}
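
A hypothetical sketch of how the deprecated `measurePerformance` alias could use `warnOnce` so the deprecation notice prints only once per test run (module paths, type names, and message wording are assumptions, not the actual implementation):

```ts
import type { ReactElement } from 'react';
import { warnOnce } from '@callstack/reassure-logger';
import { measureRenders } from './measure-renders';
import type { MeasureRendersOptions, MeasureResults } from './types';

/** @deprecated Use `measureRenders` instead. */
export function measurePerformance(
  ui: ReactElement,
  options?: MeasureRendersOptions
): Promise<MeasureResults> {
  // Emitted at most once per process thanks to warnOnce's message cache.
  warnOnce('The `measurePerformance` function has been renamed to `measureRenders`.');
  return measureRenders(ui, options);
}
```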