Importing non-normalized data into normalized tables

cricketbird
Registered User
Joined: Jun 17, 2013 · Messages: 117
I would like to import (using an append query) data in the following format:

Sample Name   Carbon   Nitrogen   Oxygen   Hydrogen
Sample 123      10        20        30        40
Sample 234      15        54        25        60

to a table in my database with the following columns:
RecordID (autonumber primary key)
SampleID (foreign key to table Samples)
AnalysisID (foreign key to table Elements)
AnalysisValue

I usually use the query builder, but it doesn't seem able to "flatten" the data in this way.

Can SQL handle this? Or can anyone think of a different approach than using queries?

End result: with the example data above, I want to generate the following records to append:

RecordID   SampleID   AnalysisID   AnalysisValue
1          123        1            10
2          123        2            20
3          123        3            30
4          123        4            40
5          234        1            15
6          234        2            54
7          234        3            25
8          234        4            60
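For illustration, here is a minimal Python sketch of the wide-to-long ("unpivot") transform being asked for. The Carbon=1 … Hydrogen=4 AnalysisID numbering is an assumption for the example, since the real IDs live in the Elements table:

```python
# Unpivot wide analysis rows into (SampleID, AnalysisID, AnalysisValue) records.
# The AnalysisID numbering (Carbon=1 ... Hydrogen=4) is assumed for illustration.
ELEMENTS = {"Carbon": 1, "Nitrogen": 2, "Oxygen": 3, "Hydrogen": 4}

wide_rows = [
    {"Sample Name": "Sample 123", "Carbon": 10, "Nitrogen": 20, "Oxygen": 30, "Hydrogen": 40},
    {"Sample Name": "Sample 234", "Carbon": 15, "Nitrogen": 54, "Oxygen": 25, "Hydrogen": 60},
]

def flatten(rows):
    """Yield one (sample_id, analysis_id, value) tuple per element column."""
    for row in rows:
        sample_id = row["Sample Name"].replace("Sample ", "")
        for element, analysis_id in ELEMENTS.items():
            yield (sample_id, analysis_id, row[element])

records = list(flatten(wide_rows))
# Two samples x four elements -> eight records, matching the table above.
```

The append step then only has to insert `records` one row at a time; RecordID is left to the autonumber column.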



Thanks,
CB
 
Where is the data coming from? A CSV? Excel? How consistent is the data?
 
Either import the data or create a linked table to it.

You will then need (based on your data) four append queries to append the data to your table.
 
I'm with CJ - don't try this with one monolithic query. Julius Caesar was right on target... divide and conquer.

On the other hand, don't forget that you can treat the multiple queries, however many there are, as a single transaction (BeginTrans and CommitTrans in DAO) if you are worried about consistency.
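Access specifics aside, that divide-and-conquer pattern (several small appends wrapped in one transaction) can be sketched with Python's stdlib sqlite3 as a stand-in; the table and column names follow the thread, everything else is assumed:

```python
import sqlite3

# In-memory stand-in for the Access table described in the thread.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE Analyses (
    RecordID INTEGER PRIMARY KEY AUTOINCREMENT,
    SampleID TEXT, AnalysisID INTEGER, AnalysisValue REAL)""")

records = [("123", 1, 10), ("123", 2, 20), ("234", 1, 15), ("234", 2, 54)]

# One transaction around all the appends: either every row lands or none does.
try:
    with conn:  # commits on success, rolls back on any exception
        conn.executemany(
            "INSERT INTO Analyses (SampleID, AnalysisID, AnalysisValue) VALUES (?, ?, ?)",
            records)
except sqlite3.Error:
    pass  # all inserts rolled back together

count = conn.execute("SELECT COUNT(*) FROM Analyses").fetchone()[0]
```

The same idea holds whether you run four append queries or four hundred: the commit makes them succeed or fail as a unit.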
 
We created a tool (an Excel add-in) that exports and normalizes data from Excel to SQL Server (Excel can hold up to 50 million records, by the way).

I work for Synergy USA llc. Google "Transform and Normalize Excel to SQL Server".
 
I would be interested to know how many columns this can handle. Is it limited to 255? Or did you find a way to handle spreadsheets with more columns than that?
 

Sure, no problem. We tried a 500-column Excel file, and it created a SQL Server table with 500 fields. I believe 1,024 columns is the first limit for a SQL Server table.

 
Thanks for letting me know.

I have written some code for importing large comma-separated value (CSV) files into MS Access; by this I mean files with many columns.

I always run into problems over a certain number of columns, which I think is something to do with the CSV file. I'm not being specific because this is from a project getting on for a year ago, and I'm not well up on the detail. I'm wondering if I might have more luck importing from Excel instead.

As in: import the CSV into Excel, then into Access.

Anyway, I now have another approach to explore!

Probably won't be able to get back to it for a year or so...

 
We tried a 500 columns Excel file. It created a SQL Server table with 500 fields

That didn't normalize the data; it simply imported it as-is. The normalized table should have only 3 columns and 500 × [number of rows in the spreadsheet] rows.
 
My final solution was to just do this in R on the way to Access. In R, it's a breeze to flatten data. I just trigger an R script that takes a .csv, flattens it, exports as a new .csv. (3 lines of code). In Access I just trigger that script and then import the new .csv. So much easier than trying to convince SQL to flatten data.
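For anyone without R to hand, a stdlib-only Python equivalent of that flatten step might look like this; the file names and the `flatten_csv` helper are hypothetical, and the first column is assumed to be the sample identifier:

```python
import csv

def flatten_csv(src, dst, id_col="Sample Name"):
    """Read a wide CSV and write a long CSV with one row per (sample, element)."""
    with open(src, newline="") as f:
        rows = list(csv.DictReader(f))
    # Every column other than the sample identifier is treated as an element.
    elements = [c for c in rows[0] if c != id_col]
    with open(dst, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([id_col, "Element", "AnalysisValue"])
        for row in rows:
            for element in elements:
                writer.writerow([row[id_col], element, row[element]])

# flatten_csv("wide.csv", "long.csv")  # hypothetical file names
```

Access can then import the long CSV directly and join the Element column against the Elements table to resolve AnalysisID.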
 
I would be interested to know how many columns this can handle? Is it limited to 255? Or did you find a way to handle spreadsheets with more column's than that?

The 64-bit version of Excel handles 16,384 columns.
 
Actually, all versions of Excel (32-bit and 64-bit) from 2007 onwards can handle 16,384 columns and 1,048,576 rows. In fact, so does the free Excel Mobile app.
The only caveat is that you must be using .xlsx files; it is the old .xls format that has the limitation (256 columns and 65,536 rows).

So, would you care to share your scripts? It might help others in a similar situation.

My final solution was to just do this in R on the way to Access. In R, it's a breeze to flatten data. I just trigger an R script that takes a .csv, flattens it, exports as a new .csv. (3 lines of code). In Access I just trigger that script and then import the new .csv. So much easier than trying to convince SQL to flatten data.
 
Apologies for responding nearly 6 years too late, but for future reference, here is my solution, including the R Shiny script to import and convert data from "wide to long", an example data file, and the Access VBA to import the "long" data into Access:
 
