Cloud Data Pull Using JSON HTTP Request in Power Automate

This post reviews the process of setting up a Scheduled Cloud Flow to pull data from the cloud into a local SQL Server using an HTTP request. The data arrives as a JSON payload and needs to be parsed before it can be loaded into SQL Server.

You will also need to set up a Data Gateway to securely connect to the data host, which is detailed here: Setting Up a Data Gateway in Power Automate.


In this example, I’m using a Scheduled Cloud Flow to pull stock market data from IEX Cloud. Any of the flow types will work with the HTTP request.

Start by selecting New Flow and then Scheduled Cloud Flow as shown below:

Scheduled Cloud Flow

Next, give your flow a name, set when it should start running, and choose how frequently it should be triggered. Then click "Create". There is no need to spend too much time on these settings, as you will be able to update them in the next step.

Create Scheduled Cloud Flow

The Recurrence trigger is automatically added and is the first item to update. Its options will change based on the frequency selected. For this example, I have it run on a weekly frequency, five days of the week, at 5:30pm EST.

Flow Recurrence

Next, select the HTTP request action. This is a premium connector that is part of the Power Automate Per User Plan. For testing purposes, there is a free 90-day trial.

Select HTTP Request

Set up the HTTP request action by selecting GET as the Method and entering the URL of the dataset.

HTTP Request Action
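For reference, a dataset URL for this kind of request looks roughly like the example below. This is a hypothetical IEX Cloud intraday-prices call; the stock symbol and token are placeholders, and you would substitute the endpoint and API token for your own dataset.

  https://cloud.iexapis.com/stable/stock/aapl/intraday-prices?token=YOUR_API_TOKEN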

Next, select the Parse JSON action. This function analyzes a valid JSON string and returns an untyped object representing the JSON structure. Follow the steps in order below:

  1. Content: select Dynamic Content and the Body of the HTTP request
  2. Generate from Sample: Press the Generate from Sample button and paste in the JSON data, see next two paragraphs for more details.
Parse JSON

Open your web browser and paste in the URL that was used in the HTTP request. Copy all or a subset of the JSON dataset from the page, click the Generate from Sample button, and paste in the content. If you paste a subset of the data, make sure to include at least one full record.

JSON Data
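To give a concrete picture, a single record from an intraday stock dataset might look something like the snippet below. The field names and values are hypothetical; your dataset will define its own.

  {
    "Date": "2021-03-15",
    "Minute": "15:59",
    "Label": "3:59 PM",
    "Open": 123.45,
    "Close": 123.50,
    "High": 123.60,
    "Low": 123.40,
    "Volume": 1250
  }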

After pasting in the sample JSON payload, select “Done” to generate the schema.

JSON Payload

Scroll down to the very bottom of the schema to see the list of required fields.

JSON Schema Fields

Then scroll back up to see the field types.

JSON Schema Data Types

Below is a screenshot comparing a record of data from the JSON payload against the JSON schema. This comparison will aid in creating the SQL table.

JSON to SQL Table Conversion

Open your SQL Editor of choice and use the SQL CREATE TABLE statement to build your table using the fields and data types from the JSON schema. Below is the statement I used to create my table.

Create SQL Table
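As a sketch only, a statement along these lines would cover the hypothetical fields used above; the table name, column names, and data types should follow your own JSON schema:

  -- Hypothetical table; adjust names and types to match your JSON schema
  CREATE TABLE dbo.StockIntraday (
      [Date]   DATE          NOT NULL,
      [Minute] VARCHAR(5)    NOT NULL,
      [Label]  VARCHAR(10)   NOT NULL,
      [Open]   DECIMAL(18,4) NULL,
      [Close]  DECIMAL(18,4) NULL,
      [High]   DECIMAL(18,4) NULL,
      [Low]    DECIMAL(18,4) NULL,
      [Volume] INT           NULL
  );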

Back in Power Automate, the last step is to add an Apply to Each action to the flow. Select Dynamic Content and the Body of the Parse JSON action, then click Add an Action.

Apply to Each

Search for and select the SQL Server Insert Row (Latest Version) (Premium) action. Choose your Authentication Type and fill in the required information to set up your connection. Select the Data Gateway you created, as referenced at the beginning of this post.

Connect to SQL Server

Once connected, you will need to re-enter the Server Name and Database Name and select the Table from the list of tables in your database. The form will then expand and list all of the fields from the table you selected.

Select SQL Server Table

Next, you will need to manually add the Apply to Each expression for each field in the table.

The formula is: items('Apply_to_each')['field name']

Formula examples based on my fields below (the remaining columns follow the same pattern, as shown after the screenshot):

  • items('Apply_to_each')['Date']
  • items('Apply_to_each')['Minute']
  • items('Apply_to_each')['Label']
Apply to Each Row
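Continuing with the hypothetical table sketched above, the remaining columns are mapped the same way, one expression per field, using the field name from the JSON payload:

  • items('Apply_to_each')['Open']
  • items('Apply_to_each')['Close']
  • items('Apply_to_each')['High']
  • items('Apply_to_each')['Low']
  • items('Apply_to_each')['Volume']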

Now when the scheduled flow runs, it will get the cloud data, parse the JSON payload into a format that SQL Server can understand, and append the data to the desired SQL table.
