Data in Access, uses Excel charts for a Word report, how to automate?

HartJF

Registered User.
Local time
Yesterday, 22:44
Joined
Jul 23, 2010
Messages
34
I created a one-time (I thought) comprehensive performance report. I mined 1.5GB of data in a back-end database with seven principal tables. My front-end has more than 25 queries, some nested, that produced summary output that I copied and pasted into an Excel workbook. The workbook contains 13 charts (2-D and 3-D column charts and pie charts) that I embedded in a 3-page Word document with a Word table for other output and a little connective narrative. I saved the document as a .pdf (570KB) and sent it to our Board of Trustees via e-mail.

Once and done, I thought.

My boss liked it so much that he wants it repeated on a regular basis.

(My first mistake was producing such a document in the first place :)

My fundamental question is: what is the best container for the master report? Is Word the best?

The data is disparate. No single query would contain all the data. No overall detail/group/total structure exists, so I wonder if creating an Access report with all these charts would be awkward and slow. In any case, it would exceed Access's 22-inch report width limit.

My secondary question is which application should drive this report? Since the data is in Access and I'm using an abundance of queries, should the code reside there? I anticipate scripting what I did manually. I produce a weekly scorecard that has a 3-D Excel chart embedded in the detail section of an Access report; I use Automation to push the detail to the particular cells in the workbook. I possess such skills.
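For what it's worth, that kind of push boils down to something like this (the workbook path, query and field names here are placeholders, not my actual ones):

Code:
' Rough sketch of pushing summary values into specific workbook cells from
' Access via Automation; the path, sheet, query and field names are placeholders.
Public Sub PushDetailToWorkbook()
    Dim xl As Object            ' Excel.Application, late bound
    Dim wb As Object
    Dim rs As DAO.Recordset
    Dim r As Long

    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open("C:\Reports\Scorecard.xlsx")
    Set rs = CurrentDb.OpenRecordset("qryWeeklySummary", dbOpenSnapshot)

    r = 2                       ' row 1 holds the headings
    Do While Not rs.EOF
        wb.Worksheets("ChartData").Cells(r, 1).Value = rs!Campus
        wb.Worksheets("ChartData").Cells(r, 2).Value = rs!HeadCount
        r = r + 1
        rs.MoveNext
    Loop

    rs.Close
    wb.Save
    wb.Close SaveChanges:=False
    xl.Quit
End Sub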

How best do I attack this monster? My initial thought is to create a Word template that links to a series of Excel charts in a workbook that receives data pushed to it by code in Access. Does a better way exist?
 

spikepl

Eledittingent Beliped
Local time
Today, 05:44
Joined
Nov 3, 2010
Messages
6,142
I have a more or less similar setup in one solution.

  • The report is prepared in Word because it is easy to format and change text.
  • Tables and graphs are linked from Excel. The graphs are made in Excel because they are easiest to set up there. The worksheet gets its data by linking into some 20-odd tables and queries in the DB.
  • The content of each table in the DB is automatically wiped and recreated on each analysis run.
  • We have a tricky bit, where some standard text snippets (part of the narrative describing the results) have to include values from the DB. Some of these texts are conditional on the value. For example, if some value is, say, 0.75, then the text says "The value of parameter X is 0.75, and that is far above the average of 0.25"; if X = 0.01, then "The value of parameter X is 0.01, and that is far below the average of 0.25". All such snippets are output into a table and picked up by linking from Excel. Each table cell in Excel is then referred to from Word (see the sketch below).
So the difference from your setup seems to be that we pull data from the DB into Excel, instead of pushing it. Our idea was to design the cycle to minimise work in case of any changes, and to use each app for what it's best at.
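To give an idea of the snippet part, the generation is along these lines (every table, field and query name below is invented for the example):

Code:
' Illustrative only: table, field and query names are invented.
' Writes one narrative snippet per parameter into tblSnippets, which Excel
' links to; each Excel cell is then referenced from Word.
Public Sub BuildSnippets()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim txt As String

    Set db = CurrentDb
    db.Execute "DELETE FROM tblSnippets", dbFailOnError

    Set rs = db.OpenRecordset( _
        "SELECT ParamName, ParamValue, AvgValue FROM qryParameters", dbOpenSnapshot)
    Do While Not rs.EOF
        txt = "The value of parameter " & rs!ParamName & " is " & rs!ParamValue & _
              ", and that is far " & IIf(rs!ParamValue > rs!AvgValue, "above", "below") & _
              " the average of " & rs!AvgValue & "."
        db.Execute "INSERT INTO tblSnippets (ParamName, Snippet) " & _
                   "VALUES (""" & rs!ParamName & """, """ & txt & """)", dbFailOnError
        rs.MoveNext
    Loop
    rs.Close
End Sub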
 

HartJF

Registered User.
Local time
Yesterday, 22:44
Joined
Jul 23, 2010
Messages
34
Your scenario sounds wonderfully similar to mine! I agree that Word is the best application in which to house the report, and that Excel is the best tool to create the chart pieces. Not only are Excel charts "easiest to set up," but I feel like I have better control over the formatting.

My data source provides a challenge. Our IS department does not grant ODBC access to the central database, so I dump data as text files for import into Access tables, a "parallel universe" to the central database. Because of the size of the report database, collecting all the data in Excel worksheets seems an inefficient option. (One table has 400K records with 84 fields; another has more than a million records with 38 fields, including one memo field. The tab-separated text files exceed 1GB! I fear the bulk would overwhelm my system.)

I appreciate your conditional commentary situation. I had a database (now obsolete) that did predictive modelling, for which I wrote a narrative cover composed almost entirely of standard phrases. Your approach would have been a great improvement!

Thank you for the confirmation that I am on a reasonable track. Since I have written code to import the text files into Access tables, I think I will keep all the code in that one place. Hopefully, that will simplify maintenance.
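For the record, the import side is little more than DoCmd.TransferText in a loop over the dump folder; roughly (the folder, the saved specification and the table-name convention are placeholders):

Code:
' Sketch of the text-file import; the folder, the saved import specification
' and the table-name convention are placeholders.
Public Sub ImportDumps()
    Dim f As String
    f = Dir("C:\Dumps\*.txt")
    Do While Len(f) > 0
        ' The saved import specification handles the tab delimiter and field types
        DoCmd.TransferText acImportDelim, "specTabDump", _
                           "tbl" & Left(f, Len(f) - 4), "C:\Dumps\" & f, True
        f = Dir()
    Loop
End Sub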

Your encouragement is greatly appreciated! To know that someone else has faced an elephant of a project and has successfully eaten it one bite at a time will keep me going!
 

DirMex

New member
Local time
Today, 13:44
Joined
Aug 23, 2011
Messages
3
Howdy

I've been in a similar situation to yours; luckily, ODBC access to the data warehouse is allowed at my work. This is probably how I'd go about getting it automated with minimal engagement of other parties.

I'd suggest trying to get an automated distribution of the table dumps. Whatever the basis - daily, weekly, monthly - get it automated... it's a pain in the butt to be beholden to a particular person for data on a reporting schedule.

Set up a macro in Access that will import all required data, and have your queries set up.

Set up an Excel workbook with pivot tables linked to the Access DB, and disable saving the source data with the pivot tables (you'll likely refresh each time you open it anyway, so why spend all that time loading the saved data?). You can use linked tables instead, but that will produce a bloated spreadsheet.
In the same workbook, create the charts (preferably from the pivot tables), or set up datasets that pull their values from the pivot tables with GETPIVOTDATA (on a separate sheet from the pivot table, to avoid problems when the pivot table reshapes).
Give logical names to the objects for scripting into the report; sequential names (XXX01, YYY01) are nice for arrays/loops.
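Sequential names pay off as soon as you loop; a quick example (the sheet and chart names are made up):

Code:
' Example of looping by sequential name: refresh every pivot cache, then
' walk the charts. Sheet and chart names (CHT01...) are made up.
Public Sub RefreshPivotsAndCharts()
    Dim pc As PivotCache
    For Each pc In ThisWorkbook.PivotCaches
        pc.Refresh                          ' re-query the Access back end
    Next pc

    Dim i As Long
    For i = 1 To 13                         ' CHT01 .. CHT13
        ThisWorkbook.Worksheets("Charts") _
            .ChartObjects("CHT" & Format(i, "00")).Chart.Refresh
    Next i
End Sub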

Now create your Word document.
Take note of the chart sizes and whatnot in your current report; chances are you'll have to reshape the charts when putting them into Word (perhaps you can set up dimensions in Excel that will travel to Word just fine).
Create placeholders in the document for where the charts/reports should go, like a mail merge.

I would probably put the script in Excel, but I don't see any reason why Word, Excel, or Access couldn't hold it.
Create a VBA script (a rough sketch follows below) that will:
  • Open Access and run the ImportTextFiles macro.
  • Compact and repair, then close the database.
  • Continuing the script, refresh the data for all pivot tables in Excel.
  • Recalculate the spreadsheet.
  • Refresh the charts.
  • Open the Word document and save it in a new location.
  • Replace the placeholders with the reports/charts.
  • Save the document.
  • Assuming you'll send it, open Outlook and send the Word document to a distribution list.
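Very roughly, and with every path, macro name, bookmark and address below invented for the example, the Excel-side driver could look like:

Code:
' Rough outline of the driver, run from the Excel workbook that holds the
' pivot tables and charts. All paths, names and addresses are invented.
Public Sub RunBoardReport()
    Dim acc As Object, wdApp As Object, wdDoc As Object, olApp As Object
    Dim docPath As String

    ' 1. Open Access and run the import macro (compact-on-close can be
    '    turned on in the database options to cover the compact step)
    Set acc = CreateObject("Access.Application")
    acc.OpenCurrentDatabase "C:\Reports\ReportData.accdb"
    acc.DoCmd.RunMacro "ImportTextFiles"
    acc.CloseCurrentDatabase
    acc.Quit

    ' 2. Refresh the pivot tables and recalculate; the charts follow their pivots
    ThisWorkbook.RefreshAll
    Application.CalculateFullRebuild

    ' 3. Open the Word template, paste charts at the bookmarks, save a copy
    docPath = "C:\Reports\Archive\BoardReport_" & Format(Date, "yyyymmdd") & ".docx"
    Set wdApp = CreateObject("Word.Application")
    Set wdDoc = wdApp.Documents.Open("C:\Reports\BoardReportTemplate.docx")
    ThisWorkbook.Worksheets("Charts").ChartObjects("CHT01").Chart.ChartArea.Copy
    wdDoc.Bookmarks("phChart01").Range.Paste
    ' ... repeat (or loop) for the remaining placeholders ...
    wdDoc.SaveAs docPath
    wdDoc.Close False
    wdApp.Quit

    ' 4. Mail it out
    Set olApp = CreateObject("Outlook.Application")
    With olApp.CreateItem(0)                ' 0 = olMailItem
        .To = "board-report-list@example.org"
        .Subject = "Performance report, " & Format(Date, "mmmm yyyy")
        .Attachments.Add docPath
        .Send
    End With
End Sub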

This may not be the best way (in fact, it's not!), but it can all happen with boss-impressing 'one-button magic', and you could write a shell script (AutoIt or something) to kick off the macro through a scheduled task - at 8am the report arrives in the boss's email before you're even in!

You'll also need some basic checks - did the text files come through? Is there an obvious way to detect an import error that should halt the script?

Lastly, if you don't know how to script and really don't think you'd be able to learn enough to do it in a short time, 'you' can be the script - that's still probably the approach that requires the least amount of work.
 

HartJF

Registered User.
Local time
Yesterday, 22:44
Joined
Jul 23, 2010
Messages
34
Thank you for your reply, DirMex. I see more and more professionals who need cross-program integration but who neither have the skill nor find the resources to implement it... and who, as a result, perform the integration manually.

I have created code in Access to link to the text files, to perform filtering queries, and to produce cross-tabulated output. (I find that TRANSFORM queries are more efficient for me to develop than PivotTables; this minimizes the bloat in both applications.) I had tried producing "report parts" in Access, then sending the total data to Excel and the report part to Word, but that failed miserably. I think I'll try next to send the cross-tab results to Excel and generate both the report part and the chart part there. Then I can write code to drop the parts into placeholders in the Word document, to archive the document, and to open Outlook to send notice that the document has been published.
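For the crosstab-to-Excel hop, I expect something along these lines will do (the saved query name, its SQL and the workbook path are placeholders):

Code:
' Sketch of pushing a crosstab result out to the chart workbook;
' the saved query, its SQL and the workbook path are placeholders.
Public Sub ExportCrosstab()
    ' qxtEnrollmentByTerm is a saved crosstab query, e.g.:
    '   TRANSFORM Count(*) AS Enrollments
    '   SELECT Campus
    '   FROM tblEnrollment
    '   GROUP BY Campus
    '   PIVOT Format(StartDate, "yyyy-mm");
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
                              "qxtEnrollmentByTerm", "C:\Reports\ChartData.xlsx", True
End Sub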

Besides the concerns IS has about compromising data integrity through an ODBC connection (I know access can be restricted to read-only, but they remain unmoved), they find these text dumps to have the potential to be a serious security violation. As a result, they have not been disposed toward automating and scheduling such a dump. The database system does not expose its scripting facility except to system administrators. So, as you say, I "am" the script. That allows me to manually monitor the success of the data refresh.

I especially appreciate your step-by-step analysis of the problem. I've been so overwhelmed by the extent of this project that I never considered the "How do you eat an elephant?" question.

This routine would be executed eleven times a year -- after each start date at each campus of our educational institution. The pressure for this project ebbs and flows.

Now, I feel like I have a strategy to attack this elephant! Thanks!
 
