Initial merge #1
Conversation
Hotfix for FLOW-143: Update connect schema to 2.0 (JSON SIM)
…or & Intermediary
* Remove custom fields
* Update dropdown labels to match NDSE
… recipients are specified. Instead, present the user with the option to add recipient name and email
…, so that it shows in the UI output
…mail subject" and "Email body" from the user
…velope custom field action
FLOW-312 - Value property needs to be part of the Get envelope custom field action
…port for email subject and body
FLOW-338: Template prefill: in the Create from template action, add support for email subject and body
* Support SMS delivery
* Support all recipient types
FLOW-343: Template Prefill - part 1
FLOW-345: DocGen: Get document id given a document name
Expose it, so that Microsoft's maker / C2 experience can leverage it.
FLOW-348: List Templates Action for Power Automate
}
},
"/accounts/{accountId}/connectV2/{connectId}": {
  "delete": {
Does this PR have R2 work that you are trying to merge to the release branch?
Yes, it has both R2 changes and C2 changes.
Since there are overlapping requirements and they are being shipped in the same time frame, I was thinking of using a common branch rather than separate ones.
FLOW-29: trigger statuses
* Create Readme.md
* Nodefusion Portal certified connector added
* Action route updated
* Updated according to the PR comments
* x-ms-summary back in the code
* Security property removal revert
* Nodefusion Portal b2c authUrl and actions updated (#1)
* Update apiProperties.json: b2c auth parameters updated
* Additional action added: Get Work Services
* Readme updated
* Update apiDefinition.swagger.json: added x-ms-summary and description properties
* Update Nodefusion Portal connector to version 8.3.4
* Downgraded swagger document version from 3.0 to 2.0
* Updated title property to pass paconn validation

Co-authored-by: Hrvoje Kusulja <hkusulja@kusulja.com>
Co-authored-by: mmustapic <mmustapic@3pro.eu>
Co-authored-by: dominicusmento <mario.mustapic13@gmail.com>
Co-authored-by: Jure Fadiga <jfadiga@nodefusion.com>
…s-action Add decline tab to add tabs action
* First pass at getting all partitions. Still needs a few parameters set up, so the code is currently broken. WIP.
* Update script.csx: convert all partitions, not just the first one.
* Update script.csx: fix placeholders of querystring params that need to be read/set for fetching subsequent partitions.
* Update script.csx: fix syntax errors to ensure that isn't why the connector upload is failing.
* Update script.csx: fix more compilation errors.
* Update script.csx: wanted a record of the fact that these changes are still returning subsequent partitions in array format, despite literally every response being converted. Is there maybe some sort of caching of the connector behavior? It seems like I can't ever get the behavior of my data flow to change at all...
* Update script.csx: this version of the connector is the most complete example that can successfully be uploaded as a custom connector. Yet I still can't get the behavior to change no matter what code changes I make.
* Issue #5 - Null detection and Type Conversion Error (#1)
  - Update script.csx: fix null detection.
  - Fix issue with null handling in Snowflake connector.
  Co-authored-by: jbrinkman <github@brinkman.me>
* Update apiDefinition.swagger.json: this version of the swagger JSON SHOULD be working, but we are seeing the DataSchema object being flattened out once uploaded as a custom connector.
* Array data for DataSchema: got the swagger right (it was really the code we had checked in before, with just a little cleanup). The custom connector is now failing due to an internal server error, so we need to find a way to use the test page in Power Apps online, despite the fact that it doesn't really handle array data very well. Possibly specifying the raw body data might be a workaround.
* Update script.csx: last few tweaks to get the custom connector to return subsequent partitions in pre-converted format.
* Add version information into the readme documentation.
* Cleanup endpoints:
  - The extra body element is required; it caused a whole mess of issues.
  - Change DataSchema to required, and deprecate or delete unused endpoints as needed.
  - Remove the fetchAllPages feature and separate it into its own branch.
* Code cleanup:
  - Make log messages more accurate.
  - Remove the last remnant of fetchAllPartitions.
* Code cleanup, plus more minor code cleanup.
* Intermediate check-in:
  - The code is acting absolutely insane and returning the GetResults method as just a single property "Data" formatted as an array. Since this is the 0 partition, it should include metadata.
  - The interface is also not showing the partition parameter for the ExecSql method, so something is borked.
* GetResults partition zero fixed:
  - This was a very subtle issue: when you call the GetResults operation for partition zero, you have no request body, so it cannot be parsed as JSON.
  - Change the response of the async ExecStmt to match the schema of the sync version, because the Power Apps UI does not seem to be able to deal with async and sync having different response formats.
* SPC-36: Handle unexpected async responses better (#5)
  - Update script.csx: fix async detection based on response code instead of request params, because apparently the Snowflake API can decide to return an async response if a synchronous response takes too long to return (see the first sketch after this list).
  - Fix typo in script.csx: "BeginFetch" misspelled.
  Co-authored-by: Joseph Brinkman <github@brinkman.me>
* SPC-39: MULTI_STATEMENT_COUNT parameter was being ignored (#6)
  - Update apiDefinition.swagger.json: change the parameter name case to match the Snowflake docs exactly.
  - Add statementHandles: map the new response property for multi-statement handling.
  - Apply mappings to GetResults: the same statementHandles mapping that was previously added to ExecSql was applied to GetResults to support async.
  - Remove async fixes: these changes are already in the dev branch; they were just a temporary change for debugging.
* GetResults schema (#7): an inaccurate schema was causing compilation issues in Power Apps. Better to leave it as an untyped object, since the schema is dynamic.
* Parse Object/Array types (#8): they were represented as strings before (see the second sketch after this list).
* Document limitations per my experience (#9)
  - Tweak readme.
  - Update language limitations in the readme documentation.
  Co-authored-by: jbrinkman <github@brinkman.me>
* OpenAPI spec validation errors (#10): I was able to type the untyped objects, but a lot of those OpenAPI spec validation errors are inherent to the fact that the Snowflake API routes are technically all partial matches for each other, since the exec stmt path is "/".
* Updated version history.
* Fix typo.

Co-authored-by: TobinWritesCode <tobin.chee@improving.com>
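To make the SPC-36 fix concrete, here is a minimal sketch (not the connector's actual script.csx code) of the two behaviors described above: treating HTTP 202 as the signal that Snowflake chose an async response, and guarding the GetResults partition-zero case where there is no body to parse. Newtonsoft.Json and all class/method names here are assumptions for illustration.

// Hedged sketch: detect async responses by status code and guard against
// an empty body, per the fixes described in the commit log above.
using System.Net;
using System.Net.Http;
using Newtonsoft.Json.Linq;

static class SnowflakeResponseHelper
{
    // Snowflake can return 202 (Accepted) when a statement is still
    // executing, even if the request asked for a synchronous result, so
    // async detection must key off the response code, not request params.
    public static bool IsAsyncResponse(HttpResponseMessage response)
    {
        return response.StatusCode == HttpStatusCode.Accepted; // 202
    }

    // GetResults for partition zero arrives with no request body, so the
    // body cannot be blindly parsed as JSON.
    public static JObject ParseBodyOrDefault(string body)
    {
        if (string.IsNullOrWhiteSpace(body))
        {
            return new JObject(); // nothing to parse; fall back to defaults
        }
        return JObject.Parse(body);
    }
}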
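Similarly, a hedged sketch of the "Parse Object/Array types" change: re-parsing values that arrive serialized as JSON strings so the connector returns structured tokens instead of quoted blobs. The helper name is hypothetical, and the column-type strings are assumptions loosely based on the type names in the Snowflake SQL API's resultSetMetaData.

// Hedged sketch: convert OBJECT/ARRAY/VARIANT column values that arrive
// as JSON strings into real JSON tokens.
using Newtonsoft.Json.Linq;

static class SnowflakeValueConverter
{
    // columnType is assumed to come from the statement's result metadata.
    public static JToken ConvertValue(string columnType, string rawValue)
    {
        if (rawValue == null)
        {
            return JValue.CreateNull();
        }
        if (columnType == "object" || columnType == "array" || columnType == "variant")
        {
            // These types are serialized as strings; re-parse them so the
            // caller sees structured JSON rather than a string.
            return JToken.Parse(rawValue);
        }
        return new JValue(rawValue);
    }
}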
This has R2 changes that Microsoft can leverage for C2
When submitting a connector, please make sure that you follow the requirements below; otherwise your PR might be rejected. We want to make sure you have a well-built connector, a smooth certification experience, and happy users :)
If this is your first time submitting to GitHub and you need some help, please sign up for this session.
- I validated the swagger file, apiDefinition.swagger.json, by running the paconn validate command.
- apiProperties.json has a valid brand color and doesn't use an invalid brand color, #007ee5 or #ffffff.
- If this is an independent publisher connector, I confirm that I am not submitting a connector icon.

If you are an Independent Publisher, you must also attest to the following to ensure a smooth publishing process: