Importing non-normalized data into normalized tables

cricketbird
Registered User · Joined Jun 17, 2013 · Messages: 106
I would like to import (using an append query) data in the following format:

Sample Name   Carbon   Nitrogen   Oxygen   Hydrogen
Sample 123    10       20         30       40
Sample 234    15       54         25       60

to a table in my database with the following columns:
RecordID (autonumber primary key)
SampleID (foreign key to table Samples)
AnalysisID (foreign key to table Elements)
AnalysisValue

I usually use the query builder, but it doesn't seem able to "flatten" the data in this way.

Can SQL handle this? Or can anyone think of a different approach than using queries?

End result: with the above example data, I want to generate the following records to append:

RecordID   SampleID   AnalysisID   AnalysisValue
1          123        1            10
2          123        2            20
3          123        3            30
4          123        4            40
5          234        1            15
6          234        2            54
7          234        3            25
8          234        4            60



Thanks,
CB
 

moke123
AWF VIP · Joined Jan 11, 2013 · Messages: 3,852
Where is the data coming from? A CSV? Excel? How consistent is the data?
 

CJ_London
Super Moderator · Staff member · Joined Feb 19, 2013 · Messages: 16,555
Either import the data or create a linked table to it.

You will then need (based on your data) four append queries to append the data to your table.
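To make that concrete, here is a minimal sketch in Access SQL. The names tblImport (the linked/imported wide table), tblResults (the target), Samples.SampleName, and the hard-coded AnalysisID values 1-4 are all placeholders for illustration; in practice you would look the IDs up from your Samples and Elements tables. The four SELECTs below are exactly the four append queries CJ describes, just stacked with UNION ALL into one saved query so that a single append query can read from it.

Saved query qryUnpivot (one SELECT per element column, stacked into rows):

SELECT [Sample Name] AS SampleRef, 1 AS AnalysisID, [Carbon] AS AnalysisValue
FROM tblImport
UNION ALL
SELECT [Sample Name], 2, [Nitrogen] FROM tblImport
UNION ALL
SELECT [Sample Name], 3, [Oxygen] FROM tblImport
UNION ALL
SELECT [Sample Name], 4, [Hydrogen] FROM tblImport;

Append query (resolves each sample name to its foreign key and inserts):

INSERT INTO tblResults (SampleID, AnalysisID, AnalysisValue)
SELECT s.SampleID, q.AnalysisID, q.AnalysisValue
FROM qryUnpivot AS q
INNER JOIN Samples AS s ON s.SampleName = q.SampleRef;

Either way, with four separate append queries or the single UNION-based one, the result is the eight rows listed in the opening post.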
 

The_Doc_Man
Immoderate Moderator · Staff member · Joined Feb 28, 2001 · Messages: 27,001
I'm with CJ - don't try this with one monolithic query. Julius Caesar was right on target... divide and conquer.

On the other hand, don't forget that you can treat the multiple queries, however many there are, as a single transaction via Begin Transaction and Commit, if you are worried about consistency.
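Within Access itself that wrapping is usually done from VBA, with BeginTrans and CommitTrans on a DAO workspace around the Execute call for each append query. Purely to illustrate the idea, and assuming the same placeholder names as the earlier sketch plus a backend that accepts transaction statements directly (e.g. SQL Server), the pattern looks like this:

BEGIN TRANSACTION;

-- the append queries run as one all-or-nothing unit
INSERT INTO tblResults (SampleID, AnalysisID, AnalysisValue)
SELECT s.SampleID, 1, i.[Carbon]
FROM tblImport AS i
INNER JOIN Samples AS s ON s.SampleName = i.[Sample Name];

-- ...repeat the INSERT for Nitrogen (2), Oxygen (3) and Hydrogen (4)...

COMMIT TRANSACTION;
-- on any error, issue ROLLBACK TRANSACTION instead and the target table stays untouched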
 

John Haword
New member · Joined Nov 19, 2018 · Messages: 1
We created a tool (an Excel add-in) that exports and normalizes data from Excel to SQL Server (Excel can hold up to 50 million records, by the way).

I work for Synergy USA llc. Google "Transform and Normalize Excel to SQL Server".
 

Uncle Gizmo
Nifty Access Guy · Staff member · Joined Jul 9, 2003 · Messages: 16,245
John Haword said:
We created a tool (an Excel add-in) that exports and normalizes data from Excel to SQL Server (Excel can hold up to 50 million records, by the way). I work for Synergy USA llc. Google "Transform and Normalize Excel to SQL Server".
I would be interested to know how many columns this can handle? Is it limited to 255? Or did you find a way to handle spreadsheets with more columns than that?

 

David Fisher
New member · Joined Nov 20, 2018 · Messages: 1
Uncle Gizmo said:
I would be interested to know how many columns this can handle? Is it limited to 255? Or did you find a way to handle spreadsheets with more columns than that?

Sure, no problem. We tried a 500-column Excel file. It created a SQL Server table with 500 fields. I believe 1,024 columns is the first limit for a SQL Server table.

 

Uncle Gizmo
Nifty Access Guy · Staff member · Joined Jul 9, 2003 · Messages: 16,245
Thanks for letting me know.

I have written some code for importing large comma-separated value files into MS Access; by this I mean files with many columns.

I always run into problems over a certain number of columns, which I think is something to do with the CSV file. I'm not being specific because the project is getting on for a year old and I'm not up on the detail any more... I'm wondering if I might have more luck importing from Excel.

As in: import the CSV into Excel, then from Excel into Access.

Anyway, I now have another approach to explore!

Probably won't be able to get back to it for a year or so...

 

plog
Banishment Pending · Joined May 11, 2011 · Messages: 11,613
David Fisher said:
We tried a 500-column Excel file. It created a SQL Server table with 500 fields.

That didn't normalize the data; it simply imported it as is. The normalized table should have only 3 columns and 500 × [number of rows in the spreadsheet] rows.
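For comparison, the normalized target is tall and narrow, essentially the same shape as cricketbird's table at the top of the thread. A rough sketch in SQL Server DDL, with placeholder names:

-- One row per cell of the original spreadsheet, not one column per spreadsheet column
CREATE TABLE Measurements (
    RecordID      INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    SampleID      INT NOT NULL,                   -- which spreadsheet row (FK to Samples)
    AnalysisID    INT NOT NULL,                   -- which spreadsheet column (FK to Elements)
    AnalysisValue FLOAT NULL                      -- the cell value
);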
 

cricketbird
Registered User · Joined Jun 17, 2013 · Messages: 106
My final solution was to just do this in R on the way to Access. In R it's a breeze to flatten data: I trigger an R script that takes a .csv, flattens it, and exports a new .csv (three lines of code). In Access I just trigger that script and then import the new .csv. So much easier than trying to convince SQL to flatten the data.
 

Galaxiom
Super Moderator · Staff member · Joined Jan 20, 2009 · Messages: 12,849
Uncle Gizmo said:
I would be interested to know how many columns this can handle? Is it limited to 255? Or did you find a way to handle spreadsheets with more columns than that?

The 64-bit version of Excel handles 16,384 columns.
 

isladogs
MVP / VIP · Joined Jan 14, 2017 · Messages: 18,186
Actually, all versions of Excel (32-bit and 64-bit) from 2007 onwards can handle 16,384 columns and 1,048,576 rows. In fact, so does the free Excel Mobile app.
The only caveat is that you must be using .xlsx files; it is the older .xls format that is limited to 256 columns and 65,536 rows.

 


Gasman
Enthusiastic Amateur · Joined Sep 21, 2011 · Messages: 14,052
So, would you care to share your script? It might help others in a similar situation.

cricketbird said:
My final solution was to just do this in R on the way to Access. In R it's a breeze to flatten data: I trigger an R script that takes a .csv, flattens it, and exports a new .csv (three lines of code). In Access I just trigger that script and then import the new .csv. So much easier than trying to convince SQL to flatten the data.
 
