Solved: Power Automate Flow to Extract an Email Attachment into a Doc Library and Update SharePoint

Isaac
Again, posting this to build up content/traffic in this forum.

We have an enterprise-level, I.T.-built system from which useful reports can be exported manually. I wanted to automate the ingestion of one of these XLSX reports for use in an Access database. (Since we have users on different networks, SharePoint and email are the main common denominators that provide interconnectedness.)

I created a business process plus a Flow that does the following:

  1. Convinced the admins of the I.T. system to automatically email this report to a shared (Office 365) mailbox each Friday morning
  2. Built a Microsoft Flow that does the following:
    1. Triggers on an incoming email to the shared mailbox with certain text in the subject line
    2. Saves the email's attachment to a SharePoint document library, using a variable that holds a unique file name (an illustrative expression follows this list)
    3. Creates a new record in a SharePoint list recording the unique filename given to the file, so that at any time I can use MS Access to query that linked SharePoint list for the most recent entry and then link to the downloaded XLSX file in the SharePoint doc library for further processing
    4. Moves the email to a subfolder of the shared mailbox's Inbox
  3. Wrote some Access code, triggered by a button, that determines whether we have already imported the most recent file and, if not, performs further import/processing.
This ETL process was accomplished with the following chained Flow operation steps:
(To make the screenshot information useful, I have left the official Microsoft default name of each operation as the portion before the parentheses; the portion in parentheses is my own descriptive rename.)

(Screenshots flow1.jpg through flow5.jpg show the Flow operations in order.)


After this, I can write Access code as follows (a rough sketch appears after the list):

  1. Use DLookup to get the FileName from a column of my "already imported" table
  2. Select the top 1 record (ordered by Created, descending) from the SharePoint list in which the Flow creates one record per download, and read the filename it contains
  3. If the two are the same, we've already imported the most recent one; if not, proceed
  4. Link to the downloaded XLSX in the SharePoint doc library, i.e.:
    "\\company.sharepoint.com@SSL\DavWWWRoot\sites\SPOSiteName\DocLibraryName\" & varFileName
  5. Perform further processing, append/update queries, etc
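Here is a minimal VBA sketch of that logic. The table, list, and field names (tblImportLog, tblFlowDownloads, tblLinkedReport, ID, FileName, Created) are placeholders for illustration, not the actual names in my database:

Public Sub ImportLatestReport()
    ' Assumed names: tblImportLog = local "already imported" log table,
    ' tblFlowDownloads = linked SharePoint list that the Flow writes to.
    Dim strLastImported As String
    Dim strNewest As String
    Dim rs As DAO.Recordset

    ' 1. FileName we imported last time (most recent log entry)
    strLastImported = Nz(DLookup("FileName", "tblImportLog", _
        "ID = " & Nz(DMax("ID", "tblImportLog"), 0)), "")

    ' 2. Newest record the Flow wrote to the SharePoint list
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT TOP 1 FileName FROM tblFlowDownloads ORDER BY Created DESC")
    If Not rs.EOF Then strNewest = Nz(rs!FileName, "")
    rs.Close

    ' 3. Already have the latest file? Then nothing to do.
    If strNewest = "" Or strNewest = strLastImported Then Exit Sub

    ' 4. Link to the downloaded XLSX via its WebDAV path
    DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel12Xml, _
        "tblLinkedReport", _
        "\\company.sharepoint.com@SSL\DavWWWRoot\sites\SPOSiteName\DocLibraryName\" & strNewest, _
        True

    ' 5. Run append/update queries against tblLinkedReport here,
    '    then log the file name so we don't import it twice.
    CurrentDb.Execute "INSERT INTO tblImportLog (FileName) VALUES ('" & _
        Replace(strNewest, "'", "''") & "')", dbFailOnError
End Sub

(Linked Excel tables are read-only in modern Access versions, which is fine here since the XLSX is only a source for append/update queries.)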
This resulted in a sound, fully automated, server-hosted ETL job with which I am happy!
 
Sir, I commend you on this. The last bit, the Access code, is exactly what I was looking for in a project I'm running, and it's something even ChatGPT couldn't do. Thank you for this write-up and all of the work you put into it.
 
I'm very glad to see that it helped someone.
Microsoft Flow (a.k.a. Power Automate) is indeed a powerful tool, especially if Integration Services is not available.
 
