Snowflake v2 bug fixes and refactoring Convert action #3670

Merged · 30 commits merged into microsoft:dev on Oct 3, 2024

Conversation

jbrinkman
Contributor


When submitting a connector, please make sure that you follow the requirements below; otherwise your PR might be rejected. We want to make sure you have a well-built connector, a smooth certification experience, and happy users. :)

If this is your first time submitting to GitHub and you need some help, please sign up for this session.

  • I attest that the connector doesn't exist on the Power Platform today. I've verified by checking the pull requests in GitHub and by searching for the connector on the platform or in the documentation.
  • I attest that the connector works and I verified by deploying and testing all the operations.
  • I attest that I have added detailed descriptions for all operations and parameters in the swagger file.
  • I attest that I have added response schemas to my actions, unless the response schema is dynamic.
  • I validated the swagger file, apiDefinition.swagger.json, by running the paconn validate command.
  • If this is a certified connector, I confirm that apiProperties.json has a valid brand color and doesn't use the invalid brand colors #007ee5 or #ffffff. If this is an independent publisher connector, I confirm that I am not submitting a connector icon.

If you are an Independent Publisher, you must also attest to the following to ensure a smooth publishing process:

  • I have named this PR after the pattern of "Connector Name (Independent Publisher)" ex: HubSpot Marketing (Independent Publisher)
  • Within this PR markdown file, I have pasted screenshots that show: 3 unique operations (actions/triggers) working within a Flow. This can be in one flow or part of multiple flows. For each one of those flows, I have pasted in screenshots of the Flow succeeding.
  • Within this PR markdown file, I have pasted in a screenshot from the Test operations section within the Custom Connector UI.
  • If the connector uses OAuth, I have provided detailed steps on how to create an app in the readme.md.

Fixes #3562, #2947, #2892, #2887

Deprecate the Convert action and incorporate the conversion directly into the "Submit SQL Statement for Execution" and "Check the Status and Get Results" actions.
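For context, here is a minimal sketch of the conversion being folded into those actions: the Snowflake SQL API returns column metadata in resultSetMetaData.rowType and rows as positional arrays in data, and the connector rewrites each row as an object keyed by column name. This is an illustrative helper (SnowflakeRowConverter is not the connector's real script.csx code).

```csharp
// Illustrative only: a minimal sketch of the row-array-to-object conversion,
// assuming the documented Snowflake SQL API response shape.
using System.Linq;
using Newtonsoft.Json.Linq;

public static class SnowflakeRowConverter
{
    public static JArray ToObjects(JObject statementResponse)
    {
        // Column names come from resultSetMetaData.rowType[i].name.
        var columns = statementResponse["resultSetMetaData"]?["rowType"]?
            .Select(c => (string)c["name"])
            .ToArray() ?? new string[0];

        var converted = new JArray();
        foreach (var row in statementResponse["data"] ?? new JArray())
        {
            var obj = new JObject();
            for (var i = 0; i < columns.Length; i++)
            {
                // Json.NET clones tokens that already have a parent, so direct
                // assignment is safe here.
                obj[columns[i]] = row[i];
            }
            converted.Add(obj);
        }
        return converted;
    }
}
```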

TobinWritesCode and others added 29 commits August 8, 2024 12:05
Convert all partitions, not just the first one.
Fix placeholders for the query-string params that need to be read and set when fetching subsequent partitions.
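As a hedged illustration of the partition fetching these commits refer to: subsequent partitions come from the Snowflake SQL API's GET /api/v2/statements/{handle}?partition=N route. The helper name and accountUrl below are placeholders, not the connector's code.

```csharp
// Illustrative only: fetching one subsequent partition from the Snowflake SQL API.
using System.Net.Http;
using System.Threading.Tasks;

public static class PartitionFetcher
{
    public static async Task<string> GetPartitionAsync(
        HttpClient client, string accountUrl, string statementHandle, int partition)
    {
        // GET /api/v2/statements/{handle}?partition=N returns a single partition
        // of the result set for a previously submitted statement.
        var uri = $"{accountUrl}/api/v2/statements/{statementHandle}?partition={partition}";
        using (var response = await client.GetAsync(uri))
        {
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```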
Fix syntax errors to rule them out as the reason the connector upload is failing.
Fix more compilation errors.
Record that these changes are still returning subsequent partitions in array format, even though every response is being converted.

Is there some sort of caching of the connector behavior? I can't seem to get the behavior of my data flow to change at all...
This version of the connector is the most complete example that can successfully be uploaded as a custom connector.

Yet I still can't get the behavior to change no matter what code changes I make.
* Update script.csx

- Fix null detection.

* Fix issue with null handling in Snowflake connector

---------

Co-authored-by: jbrinkman <[email protected]>
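A minimal sketch of the kind of null handling the two commits above describe, assuming SQL NULLs arrive as JSON nulls inside the positional row arrays; the helper name is illustrative, not the connector's actual fix.

```csharp
// Illustrative only: treat an explicit JSON null (and a missing token) as null
// in the converted output, rather than coercing it to an empty string.
using Newtonsoft.Json.Linq;

public static class NullHandling
{
    public static JToken NormalizeCell(JToken cell)
    {
        // Map both an absent token and an explicit JSON null to null in the output object.
        if (cell == null || cell.Type == JTokenType.Null)
        {
            return JValue.CreateNull();
        }
        return cell;
    }
}
```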
- This version of the swagger JSON SHOULD be working, but we are seeing the DataSchema object being flattened out once uploaded as a custom connector.
Got the swagger right (it was really the code we had checked in before, with just a little cleanup).

The custom connector is now failing with an internal server error, so we need to find a way to use the test page in Power Apps online, even though it doesn't handle array data very well. Specifying the raw body data might be a workaround.
- Last few tweaks to get the custom connector to return subsequent partitions in pre-converted format.
- An extra body element is required; this caused a whole mess of issues.
- change DataSchema to required and deprecate or delete unused endpoints as needed.
- Remove fetchAllPages feature and separate into its own branch.
- Make log messages more accurate.
- Remove last remnant of fetchAllPartitions.
Add version information into readme documentation
SPC-12/SPC-32 - All partitions should be returned in JSON object notation.
- The code is misbehaving and returning the GetResults response as just a single "Data" property formatted as an array. Since this is partition 0, it should include metadata.
- The interface is also not showing the partition parameter for the ExecSql method, so something is broken.
- This was a very subtle issue: when you call the GetResults operation for partition zero there is no request body, so it cannot be parsed as JSON.
- Change the async ExecStmt response to match the schema of the sync version, because the Power Apps UI does not seem to handle async and sync having different response formats.
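A sketch of the guard implied by the partition-zero note above: only parse the request body as JSON when there is actually content. Names here are illustrative, not the connector's real code.

```csharp
// Illustrative only: partition 0 is requested without a body, so the script must not
// assume the request content parses as JSON.
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

public static class RequestBodyReader
{
    public static async Task<JObject> ReadBodyOrEmptyAsync(HttpRequestMessage request)
    {
        var content = request.Content == null
            ? null
            : await request.Content.ReadAsStringAsync();

        return string.IsNullOrWhiteSpace(content)
            ? new JObject()            // partition 0: nothing to parse
            : JObject.Parse(content);  // later partitions: parse as usual
    }
}
```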
* Update script.csx

- Base async detection on the response code instead of the request params, because the Snowflake API can apparently decide to return an async response if a synchronous response takes too long to return.

* Fix typo in script.csx

"BeginFetch" misspelled

---------

Co-authored-by: Joseph Brinkman <[email protected]>
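A hedged sketch of detecting an in-flight statement from the response itself: the Snowflake SQL API answers 202 Accepted while a statement is still executing, even when the caller asked for a synchronous run, so the check can key off the status code rather than the request's async parameter. Illustrative helper, not the connector's actual script.csx.

```csharp
// Illustrative only: async detection based on the response status code.
using System.Net;
using System.Net.Http;

public static class AsyncDetection
{
    public static bool IsStillExecuting(HttpResponseMessage response)
    {
        // 202 Accepted => keep polling GET /api/v2/statements/{handle} until results are ready.
        return response.StatusCode == HttpStatusCode.Accepted;
    }
}
```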
* Update apiDefinition.swagger.json

- Change the parameter name case to match the Snowflake docs exactly

* add StatementHandles

Map new response property for multi-statement handling.

* Apply mappings to GetResults

The same statementHandles mapping that was previously added to ExecSql is now applied to GetResults to support async execution (a sketch follows after this commit's notes).

* Remove async fixes

These changes are already in the dev branch; this was just a temporary change for debugging.
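Referenced from the statementHandles items above, a hedged sketch of carrying the handle properties through to the converted response; the helper is illustrative, and the property names follow the Snowflake SQL API, where multi-statement submissions return a statementHandles array alongside statementHandle.

```csharp
// Illustrative only: pass the statement handle properties through unchanged.
using Newtonsoft.Json.Linq;

public static class StatementHandleMapping
{
    public static void CopyStatementHandles(JObject source, JObject converted)
    {
        // Single-statement responses expose statementHandle; multi-statement submissions
        // additionally return statementHandles, an array with one handle per statement.
        var handle = source["statementHandle"];
        if (handle != null)
        {
            converted["statementHandle"] = handle.DeepClone();
        }

        var handles = source["statementHandles"];
        if (handles != null)
        {
            converted["statementHandles"] = handles.DeepClone();
        }
    }
}
```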
An inaccurate schema was causing compilation issues in Power Apps. Better to leave it as an untyped object, since the schema is dynamic.
They were represented as a string before.
* Document limitations per my experience.

* Tweak readme

* Update language limitations in the Readme documentation.

---------

Co-authored-by: jbrinkman <[email protected]>
I was able to type the untyped objects, but a lot of those OpenAPI spec validation errors are inherent to the fact that the Snowflake API routes are technically all partial matches for each other, since the exec stmt path is "/".
@jbrinkman jbrinkman requested a review from a team as a code owner September 29, 2024 23:41
@jbrinkman
Contributor Author

@jbrinkman please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.

@microsoft-github-policy-service agree [company="{your company}"]

Options:

  • (default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
@microsoft-github-policy-service agree
  • (when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
@microsoft-github-policy-service agree company="Microsoft"

Contributor License Agreement

@microsoft-github-policy-service agree [company="Snowflake"]

@jbrinkman
Contributor Author

@microsoft-github-policy-service agree company="Snowflake"

Contributor

@vmanoharas vmanoharas left a comment

Hello @jbrinkman,

Kindly resolve review comments.

Review thread on certified-connectors/Snowflake v2/readme.md (outdated, resolved)
Contributor

@vmanoharas vmanoharas left a comment

Dear Partner,

Congratulations, your PR is approved! We are proud to announce the brand-new certification experience for certifying Power Platform Copilot Connectors & Plugins. Read the blog here: Announcing Partner Center to certify and publish Power Platform Copilot Connectors and Plugins - Microsoft Power Platform Blog.

As next steps:

  1. Learn about the new certification experience: Get your Power Platform connector and plugin certified - Overview
  2. Package your connector and plugin files: Prepare Power Platform connector and plugin files for certification
  3. Validate the package structure before submitting it for certification to Partner Center: Run Package Validator tool
  4. Initiate a certification request in Partner Center: Verified publisher certification process
  5. Ensure your connector & plugin files comply with the Marketplace policies: 5000 Power Platform Connector Policies for Marketplace and also 1000 Marketplace policies (as applicable)
  6. Test your connector post-certification to go live for deployment: Testing Guidelines
  7. If you wish to update your connector or plugin in the future: Updating Guidelines

For any queries or concerns, please get in touch with us at [email protected].

Many thanks,
Power Platform
Copilot connector & plugin certification team

@vmanoharas vmanoharas merged commit 590ca87 into microsoft:dev Oct 3, 2024
2 checks passed
Development

Successfully merging this pull request may close these issues.

Power Automate flow: issue with Convert result set rows from array to objects
3 participants