Data Change in CSV file after TransferText data export.

I don't think so.
If I add a lot of records as text, including a long number saved as text, then export the table as a CSV (text file) and open it in Excel, the number is shown in E-notation.


View attachment 113319

I tried to repeat my observation, but couldn't. I will have a proper look with the next import I do, and see what happens.
 
If I add a lot of records as text, including a long number saved as text, then export the table as a CSV (text file) and open it in Excel, the number is shown in E-notation.
The problem is that Excel defaults to "general" as the data type. If you explicitly change the column definition to text, the issue goes away.

I haven't had to do this for quite a while (several versions) so I tried it again. Excel is now completely broken and does not respect the conventional defaults for how you define a .csv file, even when you use the Data tab and specifically import a properly formatted .csv file. So, if you allow it to detect data types, it assumes numeric even for columns enclosed in quotes and therefore, all long numbers get rendered as scientific notation and the actual value is lost. Your only option is to tell Excel to not detect the data type. Then you open the workbook and mark each column yourself manually. Stupid.

Not sure who is running the show at MS but they certainly dropped the ball with this mistake. Earlier versions of Excel followed the old convention that any column enclosed in quotes was defined as text. That allowed my zip code - 06614 to be properly rendered with the leading zero because Zip is a CODE, not a number. Now, Excel can no longer distinguish based on a very simple convention that used to work.
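The old convention described above is easy to demonstrate outside Excel. A minimal sketch in Python (the zip code is the one from this post; the mechanism is not Excel-specific): the quoted field keeps its leading zero in the file itself, and the zero is only lost when a consumer coerces the field to a number.

```python
import csv
import io

# Write a row where the zip code is quoted, as the old convention expects.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(["Stratford", "06614"])

# The raw text of the file keeps the leading zero...
raw = buf.getvalue()
print(raw.strip())   # "Stratford","06614"

# ...and a reader that respects the quoting returns it as text, intact.
row = next(csv.reader(io.StringIO(raw)))
print(row[1])        # 06614

# The leading zero is only lost if the consumer coerces the field to a number.
print(int(row[1]))   # 6614
```

The point is that the damage happens at type-guessing time in the consuming tool, not in the CSV.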

Keep in mind that after you import the data using Data with "do not detect data type", you MUST manually set text columns to Text. Excel STILL, even though you told it to import everything as text, marks the column as General, and you MUST convert it to Text or be bitten later.

When I open the .csv file with Excel, I do get this helpful message. If you don't blow past it, it will save your bacon. If you ever clicked the "don't notify me" option, you'll have to figure out how to make it prompt on every file again.
1711657164502.png
 
Just to clarify the scope of all of this, maybe there is another solution: actually, I'm required to prepare a CSV template file for the SAP functional team, who upload data into the system through a program that reads the CSV file. So, as I explained, I export my data from my MS Access database into a CSV file. The team who receive my template (CSV file) simply take it and import it into their application, but when they try, they get duplication errors due to the "wrong" transformation of those long numbers.
First question: If a CSV is required for an import into the SAP system and a CSV is created for the export from the Access application - where and why does Excel come into play?

Second question: If the CSV is opened in Excel out of curiosity or playfulness, the CSV itself does not change unless explicit steps are taken to save it back. Do the actors involved not know what they are doing? Remove blindfold, tie hands?
An XLSX is not a CSV - do we need to discuss this further?

Third question: If a CSV is requested for an import, why can't the CSV be handled there? Do “experts” have to be replaced?

If you write text to a text file and then import text into a text field, there can be no reformatting unless interference is explicitly caused.
 
Do the actors involved not know what they are doing?
Do you ever work with users?
An XLSX is not a CSV - do we need to discuss this further?
You know that. I know that, but a surprising number of others do not, even if we think they should. MS hides this to some degree by assigning .csv as an Excel file type, so you might want to put the blame where the blame actually belongs.
If you write text to a text file and then import text into a text field, there can be no reformatting unless interference is explicitly caused.
All it takes is Excel being helpful and you saying OK. Maybe there are a lot of columns and you didn't notice that one had been converted to display as scientific notation. Once you say OK to the save, the actual value in the field is gonzo.
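A quick sketch of that failure mode (in Python, since the arithmetic is not Excel-specific), using the three equipment numbers from this thread: the scientific-notation display is harmless on screen, but once that displayed text is what gets saved back, three distinct IDs collapse into one value, which is exactly where the duplication errors come from.

```python
# Three distinct equipment numbers from the example in this thread.
originals = ["2203000040001", "2203000050001", "2203000040002"]

# A "General" cell shows a long number in scientific notation with a few
# significant digits; mimic that display formatting here.
displayed = [f"{float(s):.3E}" for s in originals]
print(displayed)   # ['2.203E+12', '2.203E+12', '2.203E+12']

# If that displayed text is what gets saved back to the CSV, the three
# distinct values collapse into one -- hence the duplication errors.
recovered = {float(d) for d in displayed}
print(recovered)   # {2203000000000.0}
```

Note the data loss is irreversible: no TRIM or reformatting can recover the dropped digits after the save.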
 
Not sure who is running the show at MS but they certainly dropped the ball with this mistake.
MS hides this to some degree by assigning the .csv as an Excel data type so you might want to put the blame where the blame actually belongs.
To be honest, it's very easy to blame MS, but you should also consider that Excel has a lot of jobs to do, and it's designed to work on a spreadsheet, not a text file. If Windows defaults CSV files to open in Excel, it's just for convenience and a better visual presentation, not because Excel must follow the text file's rules. You can change your defaults at any time if you want to (in Control Panel).

So I don't think it's fair to blame MS for changing Excel's internal code to work better as spreadsheet software rather than being faithful to a text file type.
They are professional programmers who are faced with a lot of problems to solve, which none of us are aware of.
Criticizing them over this problem with CSV files is just like me criticizing your database and the way it behaves without knowing all the problems you had faced that forced you to go that route.
 
If the CSV file extension is such a huge problem, which is incomprehensible because this issue has existed for at least 30 years and generations of developers and users have been able to deal with it - if the problem is the file extension, then TXT should be used. Importing and exporting in Access are the same, and this will also be the case with most other import interfaces. Text file is text file.

In order to ward off blind chickens (please excuse the harsh choice of words), you could also temporarily give the text file a file extension such as ASC, DAT, XYZ or similar.

Do you ever work with users?
Yes, there are different ones. The vast majority have understanding and are capable of learning.

A CSV has been a standard format for data exchange for decades, not an exotic format that “surprises” everyone. The real question is why does someone feel compelled to open the file?
A database application creates the CSV. It is therefore machine-generated and sufficiently standardized using not-so-difficult programming. If the file is given a meaningful name, sent to the correct address and stored in an expected folder, no one has to look at what it is. The importing program, if it already requests a CSV, should be able to handle it. This is called an automatic process, so no curious and unqualified users have to get involved.
 
Criticizing them over this problem with CSV files is just like me criticizing your database and the way it behaves without knowing all the problems you had faced that forced you to go that route.
The difference is that I would never presume to change the meaning of a standard file format. That's what standards are all about.
 
First question: If a CSV is required for an import into the SAP system and a CSV is created for the export from the Access application - where and why does Excel come into play?

Second question: If the CSV is opened in Excel out of curiosity or playfulness, the CSV itself does not change unless explicit steps are taken to save it back. Do the actors involved not know what they are doing? Remove blindfold, tie hands?
An XLSX is not a CSV - do we need to discuss this further?

Third question: If a CSV is requested for an import, why can't the CSV be handled there? Do “experts” have to be replaced?

If you write text to a text file and then import text into a text field, there can be no reformatting unless interference is explicitly caused.
I imagine Excel comes into play because users double-click a CSV, it opens in Excel, and then maybe they save it as a CSV (Excel may even ask them if they want to save it), and that action might change the way the data looks in the CSV. It might add a time to CSV dates, mess with quote marks, or change number formats. I'm sure I had this issue with Sage transaction files, for instance.

I remember now. The file I mentioned that wouldn't import correctly is an Excel file (it downloads from SAP as an Excel file), and a column that's mainly numeric drops the values that are alphanumeric, even though the column in the destination table is defined as text. There are also two columns that are not found in the destination table because in the source spreadsheet the column names have a trailing space, so I have to manually edit the spreadsheet to remove the trailing spaces.

Another issue is that, for some reason, addresses that display correctly in the spreadsheet don't handle the line breaks correctly when imported into Access, and I have to change all the Chr(13) into Chr(10) & Chr(13). I've had that last issue in another database as well. I'm sure it's not me; it's the interaction between Access and the source file.
 
Good morning everyone.
I actually use a simple function to export some data through a selection query to a CSV file. The problem is that when I export the data from my table (through the query), some values (long numbers) change in the CSV output file. Please see below:

This is part of my function I use to export the data in a CSV file :

DoCmd.TransferText acExportDelim, , "008_EQUI_Temp", myfilename, True

These are part of my original data in the query :

LCM IT T22_DN150X100_CL300 2203000040001
LCM IT T22_DN50X40_CL300 2203000050001
LCM IT T22_DN150X100_CL300 2203000040002

And this is what I get in the CSV file (please focus on last column):

LCM IT T22_DN150X100_CL300 2.203E+12
LCM IT T22_DN50X40_CL300 2.203E+12
LCM IT T22_DN150X100_CL300 2.203E+12

This creates a duplication problem for me, on top of the wrong data, and I'm not sure how to solve it; I tried searching the net but didn't find an appropriate solution.
Has anyone faced the same problem and found a way to solve it?
Thank you for your usual kind support.
Cheers.
A CSV File is technically a Text File. Once it is opened in Excel, the Long Numbers are destroyed and converted to exponentials.

I always stress to clients to NEVER open a CSV file in Excel before it is imported. If it has been converted, you can use VBA to first open it in Excel and TRIM() the column to convert it back.
 
A CSV File is technically a Text File. Once it is opened in Excel, the Long Numbers are destroyed and converted to exponentials.
They are not converted though, just formatted as such?
 
If a little more collecting is done, it will ultimately turn out that dealing with a CSV is more difficult than flying to Mars.
 
They are not converted though, just formatted as such?

Technically, in Excel there is no such thing as a data type in the spreadsheet. That assertion does not extend to VBA code residing under Excel as a macro.

All data in a spreadsheet is stored as text as part of a Cell object, which is a collection member of the Rows collection or the Columns collection within a Sheet object, which is within a Workbook object. That is according to the COM object model for Excel.

Each cell contains (potentially independent) formatting information, i.e. every cell is its own independent entity. When you format a Row or Column, you are actually just applying that format to the cells that are the innermost collection of the Row or Column object. When you trigger a cell to contain a sum, the components of that sum are not affected. Instead, you make the contents of the cell contain a formula to sum a Range.
 
All the 'solutions' above ignore the obvious: use a specification, which is the empty ,, bit (the second parameter) in the DoCmd.TransferText example above. The facility is built in and designed to do exactly what you are attempting.
 
Ended up thinking about this problem and realised everyone is probably overcomplicating it. A number of things spring to mind:

1. The role of Excel in all this is the problem - it makes assumptions about a CSV file's format when opening it. Was the actual CSV text file checked in a text editor before opening it with Excel? If the file is to be used as a data source elsewhere, as it appears to be, and not in Excel, then you need to manipulate it in a text editor and NOT Excel!

2. Actually, the full data is almost certainly there even in Excel - you just need to widen the column. See the screenshots attached of an export of one of my tables.
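Point 1 can also be checked programmatically rather than by eye. A minimal sketch in Python (the rows mirror the sample data earlier in this thread), showing that the exported text itself still holds the full numbers and that any E-notation is added by whatever displays the file:

```python
import csv
import io

# Recreate a small export like the one in this thread, in memory.
rows = [
    ["LCM IT", "T22_DN150X100_CL300", "2203000040001"],
    ["LCM IT", "T22_DN50X40_CL300",   "2203000050001"],
]
buf = io.StringIO()
csv.writer(buf).writerows(rows)

# This is the raw text a plain text editor would show you.
raw = buf.getvalue()
print(raw)

# The full 13-digit values are in the file itself; any E-notation you see
# later is added by the tool *displaying* the file, not by the export.
assert "2203000040001" in raw and "2.203E" not in raw
```

If the same check on a real export file passes, the problem is entirely downstream of the TransferText step.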
 

Attachments

  • Screenshot 2024-04-14 101934.jpg
