Open and update Excel Workbook and saveas Question

The specification is called "Volksbank Import Spezification"
I would generally avoid spaces and special characters because they can quickly become problematic.

First all tables were called VBImport, now Import: you didn't understand the principle of DIFFERENT naming.
Oh, your path is very rocky, and common sense should not be ignored at any point.
 
Yes, it is called like that.
I will change it to no spaces. No problem!

Oh sorry, that was just another test, trying to work it out with a different table!

Oh, your path is very rocky, and common sense should not be ignored at any point.
Oh hmm ok :(
 
I suggest you upload a copy of your file. We are over 60 posts into something that is pretty straightforward. Most of the issues appear to be that you do not understand what you are being told.

As previously suggested, start with the basics using the sql window, not VBA

best guess it would be

Code:
SELECT * FROM (SELECT *
FROM [Text;DSN=Volksbank Import Spezification;FMT=Delimited;HDR=yes;IMEX=2;CharacterSet=850;DATABASE=C:\Users\Albert\Documents\AJL\CSV_Daten\Test\MyTestFiles\].[TestDaten.csv]) AS T


or my version

Code:
SELECT * FROM (SELECT * FROM [TEXT;DATABASE=C:\Users\Albert\Documents\AJL\CSV_Daten\Test\MyTestFiles;HDR=Yes].[TestDaten.csv]) AS T

execute it - do you see the required data?

if not, make a note of what is wrong - dates or numbers as text? extra rows?

now go to the query view (forget the sql view for now)

add in criteria to remove rows not required, use CDate or similar functions to convert text to numeric values etc. Every time you make a change, run the query to ensure you get the intended result.
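For example (a sketch only - F1 and F2 are the default field names from an HDR=No import, and the conversions shown are illustrative, not taken from your actual file):

Code:
SELECT CDate(F1) AS Buchungsdatum, F2 AS Zahlungsreferenz
FROM [TEXT;DATABASE=C:\Users\Albert\Documents\AJL\CSV_Daten\Test\MyTestFiles;HDR=No].[TestDaten.csv]
WHERE IsDate(F1)

The IsDate criterion drops header or footer rows that cannot be converted to a date.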

Once you are happy you have the right data, convert the query to an append query and append the appropriate fields

execute the query - does it work as intended?

if not, delete the rows in the destination table, modify your query and run again.

keep doing this until you get the required result.

Now you can go back to the sql view to see what your query looks like, copy and paste it to VBA, and make the required modifications to the string to replace path and table name with your constants/whatever.

use Debug.Print to compare the SQL produced with the original query you created in the query builder. If it is exactly the same (apart from path and file name) you should be good to go to execute the query.
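A minimal VBA sketch of those last steps (strPath and strFile are placeholder names, not from this thread):

Code:
Dim strSQL As String
Dim strPath As String, strFile As String

strPath = "C:\SomeFolder"      ' placeholder - your constant/whatever
strFile = "ImportFile.csv"     ' placeholder file name

strSQL = "INSERT INTO Import SELECT * FROM " & _
         "[TEXT;DATABASE=" & strPath & ";HDR=Yes].[" & strFile & "]"

Debug.Print strSQL             ' compare with the sql view of the saved query
CurrentDb.Execute strSQL, dbFailOnError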

This process should only take you a few minutes to get right.
 
Following on from my previous post, this is an example of one of the files I import (displayed in Excel so I can hide the sensitive data):
[screenshot: sample import file opened in Excel]


this is the sql that imports the file (path removed)

[screenshot: the SQL that imports the file]


In green is the basic select query, having resolved data types.

In yellow: HDR=no. This defaults field headings to F1, F2, etc., so I can eliminate the unwanted rows in the WHERE clause.

I left join to the destination table to prevent duplicates
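A sketch of that left-join pattern (the destination table name, path, and joined field here are invented for illustration; use whichever field uniquely identifies a row in your data):

Code:
INSERT INTO tblImport (Buchungsdatum, Zahlungsreferenz)
SELECT CDate(S.F3), S.F6
FROM [TEXT;DATABASE=C:\SomePath;HDR=No].[ImportFile.csv] AS S
LEFT JOIN tblImport AS D ON S.F6 = D.Zahlungsreferenz
WHERE D.Zahlungsreferenz IS NULL AND IsDate(S.F3)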
 
Hi CJ_London,

Sorry that this is going on for such a long time!

But there are just a few things that don't work the right way.

I found where the problems are:

Attached are pictures of what the issues are.
First of all, I cannot create an External Data import if the filename is the original filename (attachment "ErrorMessage_Originalfile").
It basically tells me that the file cannot be found and to check the file name or location.

However, for testing purposes I created some files with different names, just so I could create an import specification and import that data.

Some of the files, even though they have exactly the same structure and are "original" files, show a double quote in the specification, so I created two different specifications: one called "SpcecificationWithoutTextqualifier" and one called "SpecificationWithTextQualifier".

It imports, but I get an import error for some of the date values; those values are then not imported.

The code does work: it gets all the files within a folder, even those with an original filename such as
"umsaetze-girokont_
and imports them into the Import table.

So I cannot understand why some files show a double quote for the text qualifier and some do not, or how I could overcome this issue.

As for your input, I did understand it, but I was not able to find the issue, as mentioned above.

Cheers
 

Attachments

  • ErrorMessage.JPG
  • SavedImport_01.JPG
  • Specification_01.JPG
  • Specification_002.JPG
  • Specification_003.JPG
  • Specification_005.JPG
  • ErrorMessage_OriginalFileName.JPG
  • TextQualifier.JPG
Regarding the second image: a saved import is created. That is something SIGNIFICANTLY DIFFERENT from importing via TransferText or SQL, as shown above.
The name of a saved import is also different from the name of an import specification.

Otherwise, your picture book shows little of what you REALLY do.
Apparently you jump into every puddle you can reach. Please don't act like importing from a CSV is a big deal. It is basic; every popular program should be able to do it.
 
It looks to me like your error message does not include the .csv. If it does, then chances are the length of path plus file name is too long. Pretty sure that has been mentioned before.

your specification is using a semicolon as a field separator, not a comma (which is what a CSV is - Comma Separated Values). If you actually do have semicolons and not commas (we have yet to see an example of your file), then the usual way round that is to use a schema.ini file (at least for me).

it is unlikely the Umsatztest field will import as a datetime, since it is not in a valid date format for Access. You need to do some transformation to exclude the part after the time element if you want it as a date
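As an illustration only (the raw value below is hypothetical - we have not seen the real file), the transformation could be as simple as trimming the text before converting:

Code:
' hypothetical raw value with a fractional part Access cannot parse
Dim strRaw As String
strRaw = "10:05:15.000000"
Debug.Print CDate(Left(strRaw, 8))   ' keeps "10:05:15", which CDate accepts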

It would appear you don't try the things suggested, or if you do, you don't provide proper feedback but go off in a different direction. So unless you can provide a sample of the .csv file (not one that has been opened using Excel) and the details of your destination table, I'm going to drop out, since all I am doing is repeating myself.

It's not difficult: open the file in Notepad, leave the headers and 2 or 3 rows, and delete the rest. You say some files have text qualifiers and others do not - prove it: provide an example of both files.

It imports but get an import error for some of the Date Values
go back into the file (using Notepad), find a failing row and compare the record with the ones around it; see what is different
 
Thanks guys,
@CJ_London
In Germany and Austria, for some reason, CSV files are separated with a semicolon (";").

@ebs17
I will provide a sample file, but I am out for work again and will be back in a few days.
Then I hope things will get clearer.

Sorry for it taking some time, but I am trying what you guys are suggesting.


Many thanks for your patience.

Wish you all a great day :)

Cheers
 
which is what a csv is - Comma Separated Values
In German regional settings (in fact in all European regional settings with the exception of the British Isles), the comma is the decimal separator for numbers. Therefore it cannot be used in parallel as a column separator. Welcome to the non-American world.
 
OK, my last word on this.

create a schema.ini file and put it in the same folder as the file(s) you are importing. A schema.ini file is a bit like the import specification you are trying to create.

the file would contain the following two lines

Code:
[ImportFile.csv]
Format=Delimited(;)

As mentioned way back, I copy the original file to a standard import name (ImportFile.csv in this example), overwriting whatever was there before, before importing, so the ini file does not need to be recreated for each file.

if you want to do more - specifying things like field types, etc, you can. See this link

however, in my experience it is generally easier to handle that as part of the import process. Specifying HDR=no guarantees everything is brought through as text, which can easily be converted to numbers/dates etc. as required.
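The copy-and-import step can then be a couple of lines of VBA (both paths here are placeholders):

Code:
' overwrite the standard import file with the latest download
FileCopy "C:\Downloads\Source.csv", "C:\ImportFolder\ImportFile.csv"
CurrentDb.Execute "INSERT INTO Import SELECT * FROM " & _
    "[TEXT;DATABASE=C:\ImportFolder;HDR=No].[ImportFile.csv]", dbFailOnError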
 
In an international and creative world you find all of them, including "comma separated values" files without any delimiter at all. You just have to look at what you are given.

It is interesting to add yet another variant (schema.ini), with its own special features, to the variants of normal import by specification and saved import, in addition to the detour via Excel or row-by-row traversal. Anyone who hasn't understood what came BEFORE and mixes elements from it will just win a few more balls for the lottery drum.
 
I avoid the specification route because specifications can get 'lost' when distributing FEs, or they need updating remotely. I know you can use SQL to create them, provided at least one specification has been created to create the tables and set some properties. I guess those properties could be created via VBA, but I've never investigated, as I find the schema.ini route easier and more controllable.
 
You can handle one option; I can handle both. If necessary, we could also manage to read values sequentially from the text file and save them in a table.
However, a discussion about this will not help Albert at this point.

First of all, the text files used must be analyzed to see which catalogues of atrocities are collected there - or not, because everything is neatly standardized.
I take it for granted that the developer knows and understands the table definition of the target table, so it should be clear to the developer how the data must arrive.
 
Hi guys,
thanks for your replies!
@ebs17


yes it is.

The bank statements always have a certain structure for the first line of content:
IBAN; Auszugsnummer; Buchungsdatum; Valutadatum; Umsatzzeit; Zahlungsreferenz; Waehrung; Betrag; Buchungstext; Umsatztext

You are right that if someone has modified that file in any way there could be an issue.
But if it is from the original download, so to speak, then the file has the same structure.

In the Buchungsdatum column I would like to find the lowest date and the highest date. Currently I have code that opens the workbook and loops through the column.
As you guys mentioned, that is not the safest way, so I was trying to do it with the Input statement, but I am still working on it and not yet able to do so.

My concept was: open the workbook and check the first line of content. Get the account, which is the last five digits of the IBAN.
Find the lowest date and highest date in Buchungsdatum, and the year of that file, again from either the lowest or highest date value in Buchungsdatum.

Build a filename from that information and save it to "MyFiles/2023/12345/12345_01To10_2023.csv", for example.

Once that has been done, import the files with TransferText into my database, into the IMPORT table.

Import specifications for that have been created and work once the filename is something like 2023_01_09_12345.csv
or 12345_2023_01_09.csv, or whichever sequence it would be.

If a file has been renamed and is already saved with a different name, I would like to just view the file first, and if it is a valid file, do the above and save it to the right location with the right file name.
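That first-line / min-max approach can be sketched in VBA with Line Input (the path is a placeholder, and the column positions assume the header order IBAN; Auszugsnummer; Buchungsdatum shown above):

Code:
Dim f As Integer, strLine As String, arr() As String
Dim dtMin As Date, dtMax As Date, strKonto As String, d As Date

f = FreeFile
Open "C:\SomePath\Source.csv" For Input As #f    ' placeholder path
Line Input #f, strLine                           ' skip the header row
Do While Not EOF(f)
    Line Input #f, strLine
    arr = Split(strLine, ";")
    If strKonto = "" Then strKonto = Right(arr(0), 5)   ' last 5 digits of IBAN
    If UBound(arr) >= 2 And IsDate(arr(2)) Then         ' Buchungsdatum is the 3rd column
        d = CDate(arr(2))
        If dtMin = 0 Or d < dtMin Then dtMin = d
        If d > dtMax Then dtMax = d
    End If
Loop
Close #f
Debug.Print strKonto, dtMin, dtMax   ' building blocks for the target file name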

@spaLOGICng

OK, that is great, but there still remains the issue of building the filename I want out of the file content (the IBAN and the dates).

@CJ_London

I looked at the Input function.
Currently I am just able to read the first line of content with code, but not too sure how to get the dates out of it :(
Sorry for the delay.

Making sure the file path exists before doing anything else is necessary.

You can add the FileSystemObject to your project: Tools > References > Microsoft Scripting Runtime.


FolderExists method (Visual Basic for Applications) | Microsoft Learn
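A minimal example (late binding, so the reference is optional; the path is a placeholder):

Code:
Dim fso As Object
Set fso = CreateObject("Scripting.FileSystemObject")
If Not fso.FolderExists("C:\MyFiles\2023\12345") Then
    fso.CreateFolder "C:\MyFiles\2023\12345"   ' note: the parent folder must already exist
End If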
 
