Import csv to excel to access table

Kingz

Member
Local time
Tomorrow, 00:09
Joined
Mar 19, 2024
Messages
49
Hi,

When I manually import a CSV file, I tend to convert it to xlsx first and then import, because I get no import errors that way. Now I'm trying to do this dynamically, and it works syntactically, but I want it to skip 12 lines, and my delimiter is a semicolon. Neither of these is working properly. Maybe you can help me. Here is my code:

Private Sub CSV_2_XLS_And_Import(strPath As String, strCase As String)

    Dim strNewPath As String

    strNewPath = Left(strPath, InStrRev(strPath, ".") - 1) & ".xlsx"

    CreateExcel strPath, strNewPath

    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12, strCase, strNewPath

    Kill strNewPath

End Sub

Private Sub CreateExcel(strCSVPath As String, strXLSXPath As String)

    Dim wb As Object

    With CreateObject("Excel.Application") 'invisible by default

        .Workbooks.OpenText FileName:=strCSVPath, StartRow:=13, DataType:=1, Semicolon:=True '1 = xlDelimited

        Set wb = .ActiveWorkbook

        wb.SaveAs FileName:=strXLSXPath, FileFormat:=51 '51 = xlOpenXMLWorkbook (.xlsx)

        wb.Close False

        .Quit

    End With

End Sub
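
One thing worth checking (an assumption on my part, not something I can confirm for your Excel version): Excel tends to apply its own automatic parsing to files with a .csv extension, and the StartRow/Semicolon arguments of OpenText are sometimes reported to be ignored for such files. A commonly suggested workaround is to copy the file to a .txt name first so that OpenText's arguments take effect:

Code:
Private Sub CreateExcelViaTxt(strCSVPath As String, strXLSXPath As String)

    Dim strTxtPath As String
    Dim wb As Object

    'Copy to a .txt name so Excel does not auto-parse it as CSV
    strTxtPath = Left(strCSVPath, InStrRev(strCSVPath, ".") - 1) & ".txt"
    FileCopy strCSVPath, strTxtPath

    With CreateObject("Excel.Application")
        .Workbooks.OpenText FileName:=strTxtPath, StartRow:=13, DataType:=1, Semicolon:=True '1 = xlDelimited
        Set wb = .ActiveWorkbook
        wb.SaveAs FileName:=strXLSXPath, FileFormat:=51 '51 = xlOpenXMLWorkbook (.xlsx)
        wb.Close False
        .Quit
    End With

    Kill strTxtPath

End Sub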
 

CJ_London

Super Moderator
Staff member
Local time
Today, 23:09
Joined
Feb 19, 2013
Messages
16,795
Most would do it the other way round, as Excel can modify the data.

I appreciate you want to skip the first 12 rows, but that can be achieved with SQL.

On my phone, so I can't provide an example right now.
 

Kingz

Member
Local time
Tomorrow, 00:09
Joined
Mar 19, 2024
Messages
49
Well, it's just that the application from which I get the CSV file adds some unnecessary information, and I just want to make it easier for the user to integrate the data into Access.

Manually, I would first delete the 12 lines and save as xlsx; when I then import, I get no import errors. If I import the CSV file directly, I receive errors for half of the rows.
 

theDBguy

I’m here to help
Staff member
Local time
Today, 15:09
Joined
Oct 29, 2018
Messages
21,709
What were the errors? Are they not something you can just ignore?
 

ebs17

Well-known member
Local time
Tomorrow, 00:09
Joined
Feb 7, 2020
Messages
2,081
When I normally manually import a CSV file, I tend to convert it to xlsx first and then import, because I get no import errors this way.
This is cumbersome. A CSV is a text file, and a text file can be imported more easily, more quickly, and with fewer problems than an Excel table. Using an import specification gives you a lot of control.

If, for example, the text file contains 12 leading lines that do not belong to the actual table, you can remove these from the text file before importing.
Code:
Dim sContent As String
sContent = ReadFile(FullPathTextFile)
'Limit 13 => elements 0..12; element 12 holds everything after the first 12 lines
WriteFile NewFullPathTextFile, Split(sContent, vbCrLf, 13)(12)
DoCmd.TransferText ...
ReadFile
WriteFile
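
For completeness, minimal sketches of what such ReadFile/WriteFile helpers could look like (my own assumption of their shape; the helpers linked above may differ, e.g. in encoding handling):

Code:
Private Function ReadFile(ByVal strPath As String) As String
    Dim f As Integer
    f = FreeFile
    Open strPath For Input As #f
    ReadFile = Input(LOF(f), #f)
    Close #f
End Function

Private Sub WriteFile(ByVal strPath As String, ByVal sContent As String)
    Dim f As Integer
    f = FreeFile
    Open strPath For Output As #f
    Print #f, sContent; 'trailing ; avoids an extra line break
    Close #f
End Sub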
 

Kingz

Member
Local time
Tomorrow, 00:09
Joined
Mar 19, 2024
Messages
49
What were the errors? Are they not something you can just ignore?
Unfortunately not. There are just too many. I think Access finds it easier to assign an appropriate data type to a column when it's an Excel file.

What would help is to remove the 12 lines before I convert it to Excel. Does that seem feasible?
 
Last edited:

DickyP

Member
Local time
Today, 23:09
Joined
Apr 2, 2024
Messages
132
Unfortunately not. There are just too many. I think Access finds it easier to assign an appropriate data type to a column when it's an Excel file.

As for assigning data types, just use a specification in your TransferText command.

What would help is to maybe remove the 12 lines before I convert it to excel.. does that seem feasible?
Easy enough to remove the 12 lines using ebs17's solution above, or via the FileSystemObject.

Open two text streams, one for reading (A) and one for writing (B), call A.SkipLine twelve times, then:

Code:
Do While Not A.AtEndOfStream
    B.WriteLine A.ReadLine
Loop

A.Close
B.Close

Then use file B.
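
A minimal sketch of the specification approach mentioned above (the specification name "specMyCSV" and table name "tblImport" are placeholders for illustration; the specification itself is saved via the text import wizard's Advanced... > Save As... dialog):

Code:
'Import a semicolon-delimited CSV using a saved import specification,
'which pins the delimiter and data types instead of letting Access guess.
Private Sub ImportWithSpec(strPath As String)
    DoCmd.TransferText acImportDelim, "specMyCSV", "tblImport", strPath, True 'True = first row has field names
End Sub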
 
Last edited:

sxschech

Registered User.
Local time
Today, 15:09
Joined
Mar 2, 2010
Messages
802
Here is an example of using SQL to append data from Excel while specifying the starting row number. If you are doing a make-table or other type of query, change the SQL statement as needed.

stFileName is a variable for the file name (you can hard-code it if you prefer)
DOCS$A is the sheet/tab name along with the starting column
rn is a variable for the starting row number (you can hard-code it if you prefer)
:I is the ending column (if your data starts on Col C and ends on Col M, for example, you would change the A to C and the I to M)

Code:
    Dim db As DAO.Database
    'Dim rs As DAO.Recordset
    Dim stFileName As String
    Dim stsql As String
    Dim rn As Integer

    Set db = CurrentDb
...
...
...

    stsql = "INSERT INTO DocsTemp ( FileType, DocumentNo, DocumentTitle, Revision, DocumentStatus, DateModified, DocumentType, FileSize, Lock ) " & _
            "SELECT T.[File], T.[Document No], T.[Title], T.[Revision], T.[Status], T.[Date Modified], T.[Type], T.[Size], T.[Lock] " & _
            "FROM [excel 12.0 xml;hdr=yes;imex=1;DATABASE=" & stFileName & "].[DOCS$A" & rn & ":I] AS T"
    db.Execute stsql, dbFailOnError
 

DickyP

Member
Local time
Today, 23:09
Joined
Apr 2, 2024
Messages
132
It has occurred to me that this thread is symptomatic of a few others we have seen. For some reason people associate CSV files with Excel, whereas Excel is merely set up to read this particular type of text file if necessary. We really should treat text files as text files, for efficiency.

The secret to the data-type problems when importing a CSV file is to use specifications, but it appears that this (to my mind vital) part of Access is shunned, unpopular, or more often not even known about. Silly, really, because despite Microsoft's best efforts to hide them away, specifications are easy to use and a boon to simplified processing.

Is this a subject worthy of a separate thread?
 

Gasman

Enthusiastic Amateur
Local time
Today, 23:09
Joined
Sep 21, 2011
Messages
14,761
For some reason people associate csv files with Excel
Perhaps as it makes viewing them very easy?
Notepad is a little more difficult.
 

DickyP

Member
Local time
Today, 23:09
Joined
Apr 2, 2024
Messages
132
Perhaps as it makes viewing them very easy?
Notepad is a little more difficult.
I don't necessarily disagree, but I seldom, if ever, use Notepad, as it is too simplistic. My everyday choice is Notepad++ (free!), and for more complex text processing either UEDIT or PrimalScript, according to what I'm trying to do, although the latter two are ridiculously expensive.

I might also add that often Excel doesn't cope well with slightly complex CSV files, especially where (despite the name) you use other characters instead of commas as the delimiter. The use of a specification solves that in Access.
 

ebs17

Well-known member
Local time
Tomorrow, 00:09
Joined
Feb 7, 2020
Messages
2,081
The dilemma is that the CSV file extension is assigned to Excel as the default program for opening it (this is a questionable measure by Microsoft). The second problem is that people do not trust the existing structure in the CSV and want to view the contents first by opening it.

Actually, a Comma-Separated-Values file is not a file to be viewed, but rather a very slim format for data exchange.

Regarding viewing: I have also seen and imported CSV files where real tables were "drawn" with hyphens and pipes. They worked with fixed-length fields because of the tabular representation, and from file to file there could be different content lengths and thus different column widths.
The only thing correct about the .csv file extension was that it was a text file that somehow contained tabular data.

However, standard CSV is so simple that every reasonable program that processes and stores data in any form, and that can import and export, must be able to handle this file format. In Excel, too, one should import a CSV rather than simply open it: Excel has its own intelligence that changes data and data formats very quickly.
 
