fix: typo - commited -> committed (#468)
* fix: typo - commited -> committed

* fix: update eslint rules after gts 5.3.1 broke it

* fix: output of npm run fix

* fix typos in samples readme

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
leahecole and gcf-owl-bot[bot] committed Jul 2, 2024
1 parent 017c783 commit 672ab7d
Showing 7 changed files with 15 additions and 15 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -291,7 +291,7 @@ Samples are in the [`samples/`](https://github.com/googleapis/nodejs-bigquery-st
| Sample | Source Code | Try it |
| --------------------------- | --------------------------------- | ------ |
| Append_rows_buffered | [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_buffered.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_buffered.js,samples/README.md) |
| Append_rows_json_writer_commited | [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_json_writer_commited.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_json_writer_commited.js,samples/README.md) |
| Append_rows_json_writer_committed | [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_json_writer_committed.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_json_writer_committed.js,samples/README.md) |
| Append_rows_json_writer_default | [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_json_writer_default.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_json_writer_default.js,samples/README.md) |
| Append_rows_pending | [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_pending.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_pending.js,samples/README.md) |
| Append_rows_proto2 | [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_proto2.js) | [![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_proto2.js,samples/README.md) |
10 changes: 5 additions & 5 deletions samples/README.md
@@ -118,7 +118,7 @@ See sample code on the [Quickstart section](#quickstart).
* [Before you begin](#before-you-begin)
* [Samples](#samples)
* [Append_rows_buffered](#append_rows_buffered)
* [Append_rows_json_writer_commited](#append_rows_json_writer_commited)
* [Append_rows_json_writer_committed](#append_rows_json_writer_committed)
* [Append_rows_json_writer_default](#append_rows_json_writer_default)
* [Append_rows_pending](#append_rows_pending)
* [Append_rows_proto2](#append_rows_proto2)
@@ -159,16 +159,16 @@ __Usage:__



### Append_rows_json_writer_commited
### Append_rows_json_writer_committed

View the [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_json_writer_commited.js).
View the [source code](https://github.com/googleapis/nodejs-bigquery-storage/blob/main/samples/append_rows_json_writer_committed.js).

[![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_json_writer_commited.js,samples/README.md)
[![Open in Cloud Shell][shell_img]](https://console.cloud.google.com/cloudshell/open?git_repo=https://github.com/googleapis/nodejs-bigquery-storage&page=editor&open_in_editor=samples/append_rows_json_writer_committed.js,samples/README.md)

__Usage:__


`node samples/append_rows_json_writer_commited.js`
`node samples/append_rows_json_writer_committed.js`


-----
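Note: the test in `samples/test/writeClient.js` (further down in this commit) runs this sample with positional project, dataset, and table IDs, e.g. `node samples/append_rows_json_writer_committed.js <projectId> <datasetId> <tableId>`; the bracketed argument names are illustrative, not part of the generated README.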
8 changes: 4 additions & 4 deletions samples/append_rows_json_writer_commited.js → samples/append_rows_json_writer_committed.js
@@ -19,11 +19,11 @@ function main(
datasetId = 'my_dataset',
tableId = 'my_table'
) {
// [START bigquerystorage_jsonstreamwriter_commited]
// [START bigquerystorage_jsonstreamwriter_committed]
const {adapt, managedwriter} = require('@google-cloud/bigquery-storage');
const {WriterClient, JSONWriter} = managedwriter;

async function appendJSONRowsCommitedStream() {
async function appendJSONRowsCommittedStream() {
/**
* TODO(developer): Uncomment the following lines before running the sample.
*/
@@ -105,8 +105,8 @@ function main(
writeClient.close();
}
}
// [END bigquerystorage_jsonstreamwriter_commited]
appendJSONRowsCommitedStream();
// [END bigquerystorage_jsonstreamwriter_committed]
appendJSONRowsCommittedStream();
}
process.on('unhandledRejection', err => {
console.error(err.message);
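For context, the file changed above is the committed-stream JSON writer sample itself (the region tags and function name are visible in the hunks). Below is a minimal sketch of the flow that sample implements; it assumes the `managedwriter` surface recalled from `@google-cloud/bigquery-storage` — method names such as `createWriteStreamFullResponse`, `createStreamConnection`, and `appendRows`, and the `customer_name`/`row_num` fields, should be checked against the real `samples/append_rows_json_writer_committed.js` rather than taken as the library's authoritative API.

```javascript
// Sketch only: API names below (createWriteStreamFullResponse, createStreamConnection,
// appendRows/getResult) are recalled from the managedwriter module and are assumptions;
// verify them against samples/append_rows_json_writer_committed.js before use.
const {adapt, managedwriter} = require('@google-cloud/bigquery-storage');
const {WriterClient, JSONWriter} = managedwriter;

async function appendJSONRowsCommittedStream(projectId, datasetId, tableId) {
  const destinationTable = `projects/${projectId}/datasets/${datasetId}/tables/${tableId}`;
  const writeClient = new WriterClient({projectId});
  try {
    // Create an explicit committed-type write stream: appended rows become
    // readable in the table as soon as the server acknowledges them.
    const writeStream = await writeClient.createWriteStreamFullResponse({
      streamType: managedwriter.CommittedStream,
      destinationTable,
    });
    const streamId = writeStream.name;
    console.log(`Stream created: ${streamId}`);

    // Derive a proto descriptor from the table schema so JSON rows can be
    // serialized into protobuf messages for AppendRows requests.
    const protoDescriptor = adapt.convertStorageSchemaToProto2Descriptor(
      writeStream.tableSchema,
      'root'
    );

    const connection = await writeClient.createStreamConnection({streamId});
    const writer = new JSONWriter({streamId, connection, protoDescriptor});

    // Append a small batch of JSON rows and wait for the server's result.
    // Field names are hypothetical; use the destination table's schema.
    const pendingWrite = writer.appendRows([
      {customer_name: 'Alice', row_num: 1},
      {customer_name: 'Bob', row_num: 2},
    ]);
    const result = await pendingWrite.getResult();
    console.log('Write result:', result);
  } finally {
    writeClient.close();
  }
}

appendJSONRowsCommittedStream('my_project', 'my_dataset', 'my_table').catch(console.error);
```

A committed-type stream makes rows visible in the table as soon as each append is acknowledged, while still giving the caller an explicit stream, hence the `Stream created:` log line that the test further down asserts on.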
2 changes: 1 addition & 1 deletion samples/customer_record_pb.js
@@ -1,4 +1,4 @@
/*eslint-disable node/no-extraneous-require, eqeqeq, block-scoped-var, id-length, no-control-regex, no-magic-numbers, no-prototype-builtins, no-redeclare, no-shadow, no-var, sort-vars*/
/*eslint-disable n/no-extraneous-require, eqeqeq, block-scoped-var, id-length, no-control-regex, no-magic-numbers, no-prototype-builtins, no-redeclare, no-shadow, no-var, sort-vars*/
'use strict';

var $protobuf = require('protobufjs/minimal');
2 changes: 1 addition & 1 deletion samples/sample_data_pb.js
@@ -1,4 +1,4 @@
/*eslint-disable node/no-extraneous-require, eqeqeq, block-scoped-var, id-length, no-control-regex, no-magic-numbers, no-prototype-builtins, no-redeclare, no-shadow, no-var, sort-vars*/
/*eslint-disable n/no-extraneous-require, eqeqeq, block-scoped-var, id-length, no-control-regex, no-magic-numbers, no-prototype-builtins, no-redeclare, no-shadow, no-var, sort-vars*/
'use strict';

var $protobuf = require('protobufjs/minimal');
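The `node/` → `n/` prefix swap in these two generated protobuf helpers is the eslint fix referenced in the commit message: `eslint-plugin-n` is the maintained fork of `eslint-plugin-node` and exposes the same rules under the `n/` prefix, and the gts 5.3.1 update appears to have switched to it, so the old `node/no-extraneous-require` disable directive no longer matched a known rule.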
2 changes: 1 addition & 1 deletion samples/test/writeClient.js
@@ -174,7 +174,7 @@ describe('writeClient', () => {
projectId = table.metadata.tableReference.projectId;

const output = execSync(
`node append_rows_json_writer_commited ${projectId} ${datasetId} ${tableId}`
`node append_rows_json_writer_committed ${projectId} ${datasetId} ${tableId}`
);

assert.match(output, /Stream created:/);
4 changes: 2 additions & 2 deletions system-test/managed_writer_client_test.ts
@@ -1683,8 +1683,8 @@ describe('managedwriter.WriterClient', () => {

async function deleteDatasets() {
let [datasets] = await bigquery.getDatasets();
datasets = datasets.filter(
dataset => dataset.id?.includes(GCLOUD_TESTS_PREFIX)
datasets = datasets.filter(dataset =>
dataset.id?.includes(GCLOUD_TESTS_PREFIX)
);

for (const dataset of datasets) {
