Using/parsing JSON with VBA

Hmmm. I think in general that is true. But for large online sales/fulfillment orgs and other web services offering JSON or XML responses, there is documentation. The issue of the thread is to see if we can find some "code"/"program"/"approach" to request/receive and process the JSON response with Access and VBA.
 

...for a sole, specific source, with a known output format?
 

If that's what you have to offer, it would be a good start. I was thinking of something more general, even just an approach, since there are not many examples.
 
Thanks CJ. Interestingly, I had a PM about that very link at 7:30 tonight as well.

I did a little poking around the site. It seems in most cases you can't PM an author/poster, and you can't comment unless you have a 50-point reputation. Apparently the powers that be decided in 2010 that PMs etc. would only add junk info to the site???
There is little documentation for any code. I'm looking at the code and trying to work through some of it. I did find an email address for the author, who is willing to answer questions about any of his posts/solutions. Also, I see he was on the site only a few hours ago. I'm going to try to work a few samples with his code and see if I can make sense of it, at least enough to ask a relevant question.

There is another Regex sample (with 0 votes).

The title is "Another Regex based JSON parser (decode only)"; the author (user: drgs) was online as recently as Jul 18/17. He/she could be a useful resource if there were a way to make contact.

If you (or anyone) get a chance to look at either of the Regex examples, could you provide any documentation or comments on the code, and some sample usages?

Thanks.
 
Just came back to this after working on other things (& failing to get anywhere... sigh!)

Here's another thread that looks interesting
http://www.accessforums.net/showthread.php?t=62671
Haven't checked whether it's the same as the stackoverflow link quoted earlier
The OP got a long way but stopped. I haven't tried his approach

I also saw the link that CJ_London posted but got put off by the comment that the
approach makes the system vulnerable in some cases, since it allows the direct access to the drives (and other stuff) for the malicious JS code via ActiveX's.
Whilst the author later qualified that comment, I figured I wasn't knowledgeable enough to deal with that and left it alone...

Just for the record, I find both StackOverflow & ExpertsExchange really difficult to follow up posts or query accepted answers.
It sometimes feels like both are 'closed shops'

For now, I still wonder if the approach I outlined in post #13 using Excel as an intermediary is the best way forward.
I've used it on 20+ JSON files & it's worked in every single case with almost no effort.
I would really recommend others try it.

My thinking is that it may be possible to use automation to run the JSON import into Excel from Access and then import the results into Access itself.
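A rough sketch of that automation idea, from the Access side. This is only an outline under assumptions: the workbook path, table name, and the existence of a saved PowerQuery query in the workbook are all illustrative, not anything from this thread.

```vba
' Hypothetical sketch: drive Excel from Access to refresh a workbook whose
' PowerQuery connection pulls in the JSON, then import the resulting sheet.
Sub ImportJsonViaExcel()
    Dim xl As Object, wb As Object
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open("C:\Data\JsonImport.xlsx")  ' illustrative path
    wb.RefreshAll                       ' re-runs the saved PowerQuery import
    xl.CalculateUntilAsyncQueriesDone   ' wait for the query to finish (Excel 2010+)
    wb.Save
    wb.Close False
    xl.Quit
    ' Pull the refreshed sheet into an Access table
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
        "tblJsonImport", "C:\Data\JsonImport.xlsx", True
End Sub
```

The one wrinkle is waiting for the refresh: PowerQuery refreshes can be asynchronous, hence the CalculateUntilAsyncQueriesDone call before saving.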
 
Colin,

I'm not an Excel person, so the code you found may be the answer or at least an answer. Can you take 1 or 2 of your 20+ files and show us a database (or describe the scenario) that goes from a "json" file (simulated response) through to some Access table or process?

Your comment on the ActiveX solution in the link provided by CJ appears real. However, if you go past that to his Regex-based code, I think there is a good base (it just needs some comments to understand it, and a way to get it to show some of the intermediate data for processing with Access).

His code seems quite "professional/advanced" to me in that it is quite focused: no extra code for unnecessary loops, displays, debugging etc. that we often see in the forum. Perhaps that is the more common result when a "real programmer knowledgeable in the language and the requirements" does the job.
 

That's what I did in post #13
I used your JSON file & imported it into Excel using PowerQuery, then did a standard import into Access. Both Excel & Access files were attached to that post.

There's no code involved at all.
The PowerQuery add-in for 2010/2013 does all the work.
In Excel 2016 it's called Get & Transform.
If you have 2007 or earlier, sorry but you're stuffed

As for the 'how', I followed the method used in the video (see link in same post) but using Excel 2010.

I then repeated the process with some SMS messaging data using my Twilio account & various other JSON files emailed to me by @ashleedawg

If you post another example JSON file, I'm happy to repeat & post the results.
I could post another example of mine or ashleedawg's (with his agreement) if it helps, but the video explains how to do it better than I can put it in words.

Alternatively perhaps we could set up a remote session at an agreed time using Teamviewer so I could show you. As I'm in the UK, I'm several hours 'ahead' of you.
 
Lost my post somehow---I'll try to repeat it.

Thanks Colin. I don't have Excel 2016. I did add the Power Query add-in to Excel 2010. And I did watch the Excel 2016 video.
I don't know enough Excel to follow the post (2016) and translate it to 2010 Power Query, so I don't know if I can get multiple tables. The multiple tables derive from nested or multiple arrays in a JSON string.
It seems to be the best-supported tool for getting JSON into Excel and possibly Access. It would be nice to see a similar add-in for Access.

Here is a sample json with some arrays. Can you derive the appropriate tables for use with database/Access?

The jsonToAccess I downloaded does a single table, and you have to manipulate the result (write a program/VBA) to get the other tables.
 

I've been using json-csv, as mentioned in ridders' post 17, for about six months and it works well. I started with the free version and recently switched to the desktop version so that I could automate the process. Now all I need to do is click a button: after the API call returns the JSON string, it is fed to the json-csv desktop app via a shell command, and the data are then imported as CSV. I had looked at some of the examples mentioned before going with the json-csv approach, as the JSON string was too complicated for me to parse out via the examples provided, given my limited knowledge of APIs and JSON.
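The button-click workflow described above might look something like the sketch below. Everything here is an assumption for illustration: the converter's executable name and command-line arguments, the file paths, and the table name are all made up, since the json-csv desktop app's actual CLI isn't documented in this thread.

```vba
' Hypothetical sketch of "shell out to a converter, then import the CSV".
Sub ConvertAndImport()
    Dim exe As String, jsonFile As String, csvFile As String
    exe = """C:\Program Files\json-csv\json-csv.exe"""   ' illustrative
    jsonFile = "C:\Data\response.json"
    csvFile = "C:\Data\response.csv"
    ' Shell is asynchronous, so wait for the output file to appear
    Shell exe & " " & jsonFile & " " & csvFile, vbHide
    Do While Dir(csvFile) = ""
        DoEvents
    Loop
    ' Standard delimited-text import into an Access table
    DoCmd.TransferText acImportDelim, , "tblApiData", csvFile, True
End Sub
```

A real version would want a timeout on the wait loop and a saved import specification instead of the default delimiter guessing.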
 
sxschech,

Do you then process the csv through individual/custom procedures/code to get normalized tables?
I just passed the json file from post 29 through the online json-csv and got the attached.
I'm just curious how you would process the csv within your Access.
Thanks

I changed the extension to txt
 

@jdraw

I also managed to lose my reply post

2 zip files attached each with 3 files
a) ImportedCalls - json => xlsx using Excel PowerQuery => Access
b) MultiSquad - json => csv using online converter => Access

Multisquad proved more tricky as it had lots of empty fields - perhaps you chose it knowing that! :rolleyes:
I had trouble using PowerQuery so I used the online converter

To import csv files into Access you just use 'Import From Text'

@sxschech
Out of interest why did you switch to the paid version of the converter?
 

No, not empty fields. The original json has some arrays. In normalized database tables these would be separate tables with PK fields. With csv you get a table of one large record per item, padded out with empty fields.
That was my concern with processing more complex json strings.

You have Squads, each Squad has Members, and each member has Powers.

A Squad has 1 or more Members.
A Member has 1 or more Powers.

Squad-->Member-->Power

Also, in your MultiSquad table note that
members__powers____001 and
members__powers__001 are similar names for different fields.
And from a database perspective, when you have repeating groups, you would design a separate table to achieve normalized structure.
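For what it's worth, the normalised Squad -> Member -> Power structure described above could be created in Access with DDL along these lines. Table and field names are illustrative only, not taken from the sample file.

```vba
' Illustrative DDL for a normalised Squad -> Member -> Power structure,
' with foreign keys linking each child table back to its parent.
Sub CreateSquadTables()
    With CurrentDb
        .Execute "CREATE TABLE Squad (" & _
            "SquadID AUTOINCREMENT PRIMARY KEY, SquadName TEXT(100))"
        .Execute "CREATE TABLE Member (" & _
            "MemberID AUTOINCREMENT PRIMARY KEY, " & _
            "SquadID LONG CONSTRAINT fkSquad REFERENCES Squad (SquadID), " & _
            "MemberName TEXT(100))"
        .Execute "CREATE TABLE Power (" & _
            "PowerID AUTOINCREMENT PRIMARY KEY, " & _
            "MemberID LONG CONSTRAINT fkMember REFERENCES Member (MemberID), " & _
            "PowerName TEXT(100))"
    End With
End Sub
```

A JSON import routine would then write each nested array element as a row in the matching child table rather than as extra columns on one wide record.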
 

Attachments

  • jsonSqads.jpg
Hi Jack

First of all apologies.
After losing my first reply, I rushed my second reply & misread your post.
You were asking how the csv file would be processed to create normalised tables - I misread that as how to import them - doh!

Anyway, I roughly got the same structure as your jpg file shows when I tried importing into Excel using PowerQuery.
It was a tricky example!

[inline screenshot - see Attachments]


I didn't think you'd be happy with that, so I used the online converter to make a 'standard CSV file': un-normalised but 'accurate' in terms of the source data.
If it had been an XML file, Access would still have failed to split it correctly.

I've also redone the original json100 import into Excel, but this time keeping it as 100 records.
Basically Excel allows you to choose how to do each step, as you can hopefully see in the screenshot below.

[inline screenshot - see Attachments]
 

Attachments

  • ExcelJSONImportPNG.PNG
  • ImportJSON100.jpg
Colin,

No apologies needed. Most of the examples I've seen have a fairly straightforward object with no array(s). That's the reason for this latest json: a more realistic example (though it may still not include some json peculiarities).

The JsonToAccess routine I found builds 1 table. It does keep array info intact, in a single field for each array it finds. I wrote a small routine to review those fields (with multiple values) and create separate tables. However, you have to know the structure of the json (part of the schema) in order to create those related tables.
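The kind of "review the array field and create a separate table" routine mentioned above might be sketched as follows. This is not the actual routine; the table names, field names, and the assumption that the array values are semicolon-delimited are all hypothetical.

```vba
' Rough sketch: walk a field holding a delimited array (e.g. "a;b;c")
' and write one child-table row per value, keyed back to the parent ID.
Sub SplitArrayField()
    Dim rs As DAO.Recordset
    Dim item As Variant
    Set rs = CurrentDb.OpenRecordset("SELECT ID, Emails FROM tblImport")
    Do Until rs.EOF
        For Each item In Split(Nz(rs!Emails, ""), ";")
            CurrentDb.Execute "INSERT INTO tblEmail (ParentID, Email) " & _
                "VALUES (" & rs!ID & ", '" & Replace(item, "'", "''") & "')", _
                dbFailOnError
        Next
        rs.MoveNext
    Loop
    rs.Close
End Sub
```

As noted in the post, this only works because you already know which fields hold arrays; the schema knowledge can't be recovered from the flattened table itself.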

It looks to me as though the multiple IP addresses (and emails) in your example have just been concatenated together??
In my view they should be separate fields in a new table that also includes the ID field from the generated table.

I appreciate your efforts and comments.

As for JsonToAccess I did send an email to the author for more info. No reply yet on details.
 
late to the party, working away at the moment

@ridders to clarify this comment
approach makes the system vulnerable in some cases, since it allows the direct access to the drives (and other stuff) for the malicious JS code via ActiveX's
refers to using the JSON.parse function via the ScriptControl
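For reference, the ScriptControl technique being discussed looks roughly like this (32-bit Office only; MSScriptControl is not available to 64-bit VBA). The security concern quoted above is exactly this pattern: Eval executes whatever script the JSON string contains.

```vba
' Minimal sketch of the ScriptControl parsing technique under discussion.
' WARNING: Eval runs arbitrary JScript - this is the flagged vulnerability.
Function ParseJson(json As String) As Object
    Dim sc As Object
    Set sc = CreateObject("ScriptControl")
    sc.Language = "JScript"
    ' Wrapping in parentheses makes JScript treat the text as an expression
    Set ParseJson = sc.Eval("(" & json & ")")
End Function
```

The returned JScript object is awkward to consume from VBA; property access generally needs CallByName, which is one reason people move to a pure-VBA parser instead.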
 
I have lost another post??? retyping from memory...

I want to update the thread to say that the JsonToAccess utility mentioned earlier failed during the conversion when I tried to use the MultiSquad_Jed.json file. It didn't matter what sublevel I used, it failed each time.
I did send an email to the author/supplier questioning the routine and asked specifically if he could process the file. I suggested the routine may not be able to handle array(s) adequately. Similar for the DataFileConverter, as below.

I also tried his DataFileConverter to convert MultiSquad_JED.json from json to csv, and also json to sql. The conversion did not fail with an error, but the output was not sufficient to import into Access.
It left [] as the field name when it met the second array?? On the second element of the outer array it gave me over 20 fields???

UPDATE Aug1
I got a response from the author/supplier of JsonToAccess/DataFileConverter. They will work with the json file and let me know.

Further Update: The company returned my json file with a modification. I'm trying to attach it here, but for some reason cannot get Manage Attachments to work?? However, both my original and the modified version passed the json validator/viewer in Notepad++. Neither my original nor the revised version of the json (with embedded arrays) was converted successfully. See comments at the end of post #43.
 
Regarding the data, I do run it through some loops, recordsets and temp tables to remove the excess underscores from field names, as well as to "fill in the blanks" and get the data into a usable format. Still an effort, but the json-csv conversion is one less headache than trying to figure out the json translation on my own.

Before I had the desktop version, I used vba and an Excel macro so that one keystroke pasted the json string into the browser. After it translated, and the browser asked to open or save the Excel file, I chose open and ran a macro to fix the headers and columns, then selected and copied all the data to paste into the Access table (datasheet form) opened by vba, so that I didn't have to spend time saving and importing.

The data we are getting is from Survey Monkey, and it took quite a while to understand the relationships in their data in order to reassemble it in a layout that Access and SQL could handle. In a few cases where I only needed one or two elements, I used string functions to pull the data directly out of the json string rather than go through the conversion.
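The "string functions" shortcut mentioned above, for pulling a single value straight out of the json string, can be sketched like this. It is deliberately naive (no handling of escaped quotes, nesting, or numeric values), and the key name is illustrative.

```vba
' Fragile but simple: find "key":"value" in raw JSON text and return value.
' Only suitable for one-off fields in a known, well-behaved response.
Function ExtractJsonValue(json As String, key As String) As String
    Dim p As Long, q As Long
    p = InStr(json, """" & key & """:""")   ' locate "key":"
    If p = 0 Then Exit Function             ' key not found: return ""
    p = p + Len(key) + 4                    ' first character of the value
    q = InStr(p, json, """")                ' closing quote of the value
    If q > p Then ExtractJsonValue = Mid$(json, p, q - p)
End Function
```

For anything beyond one or two fields, a real parser (or the conversion route above) is the safer choice.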
 
Reading sxschech's reply, that process is used for data from a single source.

If the source is changed, a new set of rules would be required.
A bit like the import specifications but in multiple steps.

Essentially that's what PowerQuery does and the steps can be saved

@sxschech
Would you mind trying that Excel add-in on your JSON data and seeing if it works for you?
 
ridders, are you suggesting I try Excel Power Query on the json? If so: I had tried installing Power Query for Excel 2013 some time ago, for a different purpose, and it caused problems, locking up Excel; I had to use Task Manager to close it, so I uninstalled it.
 
