Unbound multiline plain textbox 64k character limit (AC2013)

AOB

Registered User.
Hi experts!

I have an unbound multiline plain textbox on a form into which users can enter text (or paste it, usually from a column in an Excel spreadsheet, from an email, etc.), which is then taken as an input into a process.

I've just discovered that users can't enter, or paste, more than 64,000 characters into the textbox, which I found surprising. :confused:

When free-typing, once the 64k character limit is reached, no additional characters appear (no error or warning). When pasting text that exceeds the 64k limit, an error dialog appears:

The text is too long to be edited

This is a major problem for me, as the text input required is going to exceed 64k characters (including carriage returns etc.).

To clarify, the control is not bound to any field with a limit (i.e. Short Text = 255 or Long Text / Memo = 64k); it's just a handy way for me to allow users to enter their source data regardless of the actual source.

Is there any way of extending the character limit on the textbox? Or is there another, similar control that could be used that wouldn't suffer this problem? I need it to be plain text; I don't want any HTML tags cluttering up the string.

(I've experimented with changing the format of the textbox from Plain Text to Rich Text, but then the formatting of text pasted from Excel is all over the place, line breaks are lost, etc.) :banghead:

Any suggestions very gratefully received!

Thanks

AOB
 

Minty

AWF VIP
64k of plain text is an awful lot of text...

But yes, even though you aren't setting a data type for it, Access can't parse a single string longer than a 64k memo (Long Text) field, so that's your limit.
 

Frothingslosh

Premier Pale Stale Ale
For reference, 64,000 characters is roughly 20 pages of single-spaced, 12-point text.

That is one HELL of a lot of information to be entering into a database on a per-record basis, and is going to lead to an extremely large database very, very quickly. Quickly enough that I really hope you are using another engine as your back end.

Even if you switch to Memo or Long Text (I forget which 2013 calls it), you'll still be limited to 64k characters.

The only way I can think of to store the amount of data you're talking about is to use another engine as the back end. If you use SQL Server and bind the field to an nVarChar(MAX) field, then you will have 2GB of space for that field for each record.

What is this field and is there any chance of just storing it in a document and instead saving a link to the document?
 

isladogs

MVP / VIP
Just for info: although there is a limit of 65,535 characters in a memo / long text field when data is entered manually, you can have up to 1 gigabyte of storage space in a field if it is entered programmatically - in other words, imported, or built up by concatenating several smaller strings.
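For anyone who wants to test this themselves, a rough sketch along these lines (untested here; tblImport and MemoField are just placeholder names) will happily push a string well past 64k into a Long Text field via DAO:

Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim strBig As String

strBig = String(200000, "x")            ' 200,000 characters - well past the 64k UI limit

Set db = CurrentDb
Set rs = db.OpenRecordset("tblImport", dbOpenDynaset)
rs.AddNew
rs!MemoField = strBig                   ' programmatic writes aren't capped at 65,535
rs.Update
rs.Bookmark = rs.LastModified
Debug.Print Len(rs!MemoField)           ' confirms the full length was stored
rs.Close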

The attached screenshot doesn't go quite that far.
It shows a very large 76MB JSON file which has been imported into Access.



The total number of characters is 48,826,926.
Would that be enough for you??!!

However, I wouldn't dream of storing these permanently as one field.
Doing so would make it totally unmanageable / unsearchable / unusable.
So, once imported, I split the file into several different fields and many records.
 

Attachments

  • Capture1.jpg (104.7 KB)

AOB

Registered User.
Yup, it is, but the process splits it down into its constituent parts - it's actually a giant delimited string of 7-character codes. It's not one single record of 64k+ characters; it's more like 7,000-odd sets of 7-character codes, which, when you factor in line breaks, carriage returns and rogue whitespace, eat up the available 64k character space. So the database is never that big (plus, it gets cleaned out after every iteration).

So to clarify - it's not a single record in a single field. That would be madness, and I wouldn't have even started down this road if that were the case! There is no "field" to consider here; this is all being done in VBA before anything touches the tables.

A string variable in VBA can easily handle well in excess of 64k characters; it's just the textbox in the interface, used to take in that variable, that's causing the issue.
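(To illustrate the point, a throwaway sub like this shows a VBA string taking far more than 64k without complaint:)

Sub TestLongString()
    Dim s As String
    s = String(500000, "x")     ' half a million characters in one string variable
    Debug.Print Len(s)          ' prints 500000
End Sub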

The problem is, the list of codes is usually (but not always) derived from multiple Excel spreadsheets, and I don't want anything to do with that - how they come up with the list, or whether the list is wrong, is not my problem. A multiline plain textbox gives them a nice, handy interface: they can copy and paste information in as they go, insert further manual entries if need be, easily incorporate some QA, review and edit, etc., all super quickly (in real time) before kicking off the whole process.

Having them create yet another workbook, in which they would have to do all their reviews, and then having them import or link to the sheet, is far more cumbersome from a user perspective. Plus, if the QA suggests they need to make any changes (invalid codes etc.), they have to go back, edit the workbook, save it again, import it again... it's just messy. They really like the existing interface because it's nice and fluid and pretty much foolproof and extremely easy to use. No superfluous moving parts, as it were. The only issue with it is when they exceed 64k characters. That only happened for the first time today (so I suspect they are only edging over the limit).

I might play around with RichText a bit more in the morning - use a regular expression to derive a collection of valid text blocks and reassign the value in the textbox, stripped of the HTML formatting - but I have a suspicion it's going to be less than straightforward... :rolleyes:

I'll keep you posted...
 

jleach

Registered User.
Part of me thinks: if we need to parse all this data, why not stuff it into a file, then read the file contents (which can be done in chunks) and parse it that way?

Part of me thinks probably not, because you'd lose the nice UI factor of having a window to paste into.

You can somewhat catch a paste event in Access: use KeyDown and check for CTRL+V, or replace the standard Paste toolbar/ribbon command with your own to intercept the event itself, but neither will get you the data. For that, you could go to the Windows API to read the current clipboard contents, which isn't exactly a straightforward task (I don't think it'd be terribly bad if you're familiar with Win32 API programming, but that's definitely a matter of relativity...).
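The KeyDown side of it would be something along these lines (a sketch only; txtCodes and ProcessClipboardText are made-up names, and the clipboard-reading routine is the part you'd still have to write against the API):

Private Sub txtCodes_KeyDown(KeyCode As Integer, Shift As Integer)
    ' acCtrlMask flags the Ctrl key within the Shift argument
    If KeyCode = vbKeyV And (Shift And acCtrlMask) Then
        KeyCode = 0               ' swallow the normal paste so the 64k control never sees it
        ProcessClipboardText      ' hypothetical routine that reads and parses the clipboard
    End If
End Sub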

Between finding out when something is ready to be pasted (keydown/custom ribbon/toolbar function) and being able to read data from the clipboard, I'd say it's feasible to get what you're looking for. Whether or not there's a better way or if it's worth it... don't know.

(Colin: is that entire piece of JSON loaded into that textbox? I didn't think that was possible: thought they topped out at 64k regardless of how it was loaded)

Cheers
 

Mark_

Longboard on the internet
You MAY want to have them import workbooks one at a time. Dump what they put in into a temp table to allow QA, but remind them to ONLY put in one at a time.

From a design perspective, trying to get over 64k in one go is, well, asking for trouble. It means they can forget which one they've already put in and paste it again later, and other such mistakes. In some ways it's far better to have a "Header" record that shows who put in what, when, and (optionally) from where. Bill it as "trying to make your QA job easier by letting the reviewer quickly find who's putting in offal".
 

Frothingslosh

Premier Pale Stale Ale
Same here, jleach. Hell, when I looked up the specs on Large Text fields, it straight up said that the field itself can only hold 64k characters. They never mentioned a difference between keyed in and added programmatically. Interesting.
 

Frothingslosh

Premier Pale Stale Ale
Here's a thought.

Are these spreadsheets all using the same template, so the cells with your data are all in the same spot or column? If so, maybe you could use the file dialog system to just let them select the spreadsheets, then use VBA to go in and pull the data?
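Something roughly like this, perhaps (untested, and the staging table name and file filter are just guesses at your setup):

Dim fd As Object
Dim vrtSelectedItem As Variant

Set fd = Application.FileDialog(3)        ' 3 = msoFileDialogFilePicker
fd.AllowMultiSelect = True
fd.Filters.Clear
fd.Filters.Add "Excel workbooks", "*.xls; *.xlsx; *.xlsm"

If fd.Show Then
    For Each vrtSelectedItem In fd.SelectedItems
        ' pull each selected workbook straight into a staging table for QA
        DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
            "tblStagingCodes", vrtSelectedItem, True
    Next vrtSelectedItem
End If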
 

isladogs

MVP / VIP
Same here, jleach. Hell, when I looked up the specs on Large Text fields, it straight up said that the field itself can only hold 64k characters. They never mentioned a difference between keyed in and added programmatically. Interesting.

This is the quote from the MS website (see the attached AccessSpecs screenshot):



(Colin: is that entire piece of JSON loaded into that textbox? I didn't think that was possible: thought they topped out at 64k regardless of how it was loaded)

Yes, it is fully loaded! I've just re-checked by scrolling to the end (which took a very long time!) and then comparing it with the original file loaded into Notepad++.

Not only that, but the entire file is then analysed in Access, parsed and saved into one or more tables according to the file structure.

The screenshot showed the largest file I've managed to load successfully.

I do have another JSON file which is too much for Access to load. That one is a whopping 76MB and (according to Notepad++) is 78,231,204 characters.
My gut feeling is that it fails because my work PC is 7 years old and underpowered, with only 4GB of RAM, but perhaps that one really does exceed the Access limits.
 

Attachments

  • AccessSpecs.PNG (14 KB)

Frothingslosh

Premier Pale Stale Ale
Yeah, yeah. That wasn't the top link and I was in a hurry. :p I've never before had a need to figure out the max allowable size of a memo field!
 

isladogs

MVP / VIP
Yeah, yeah. That wasn't the top link and I was in a hurry. :p I've never before had a need to figure out the max allowable size of a memo field!

LOL. Nor did I until I started working with large JSON files last year.
 

jleach

Registered User.
Hi Colin - how is that data getting into the control, though? Is it bound directly to a table/query?

I know we can work with strings much larger than 64k in VBA and JET/ACE (the UI control itself is the only thing limited to 64k), so I was just curious how you managed to get more than that into the control.

I wrote an XML parser/editor for Data Macros a while back and ran into the same UI limit issue. I don't remember what I did to work around it though (if anything).
 

isladogs

MVP / VIP
Hi

One way of doing this is shown in the attached example.
For an explanation, see the thread where I uploaded it last autumn: SelLength overflow error. AWF member JHB provided excellent support at that time. Thanks again, Jorn.

In that example, a long JSON file of approx. 470,000 characters is copied via the clipboard into a table and then displayed on the form... instantaneously.

In my JSON app, it's just read into memory (via the clipboard) and then displayed on the form. To reduce file bloat, I don't save it to a table.

Hope that helps

I found some large JSON files that were useful for testing on GitHub: https://github.com/ozlerhakan/mongodb-json-files

IIRC the 'City' example is the large file I displayed in the earlier screenshot.
It's a big download... approx. 23MB.

The same datasource includes the 'Companies' JSON file (76MB) that defeated me!
 

Attachments

  • CopyALongText2.zip (136.2 KB)

AOB

Registered User.
That escalated quickly... :D

So I got around the problem by replacing my single plain textbox with a handful of other controls.

First, I added three local tables: one which stores all the "valid" codes, one which holds any duplicates of the valid codes (and the count of the dupes), and a third which holds any "garbage" that may have been pasted accidentally.

I then replaced the textbox on the form with continuous subforms linked to those tables. The user can no longer edit the imported data directly (as they could previously), but they still have a nice visual representation of what's been imported, which they can scroll through to review / remove individual items if necessary.

I then added two buttons below the subform. The first opens a popup modal dialog with effectively the same setup as before: a plain textbox into which data can be free-keyed and/or pasted. There is a warning label at the top of the popup explaining that the textbox can only handle up to 64,000 characters at a time, and another label at the bottom with a counter of the remaining characters available (updated in the Change event). So the users can tell if/when they are running out of space, import the data before they reach the limit, and then reopen the popup with a fresh textbox to continue if necessary.
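(For anyone curious, the counter is nothing fancier than something like this in the Change event; the control names here are just illustrative:)

Private Const MAX_CHARS As Long = 64000

Private Sub txtInput_Change()
    ' .Text holds the in-progress value while the control still has focus
    Me.lblRemaining.Caption = Format(MAX_CHARS - Len(Me.txtInput.Text), "#,##0") & _
                              " characters remaining"
End Sub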

When text is submitted, I have a regular expression which extracts all the valid codes and imports them into the table as individual records, thus making them neatly visible as a list in the first subform. Duplicate entries are added to the second table and displayed in the second subform. Anything which fails the regex is dumped into the third table and can be reviewed via the third subform. The process is very quick (<1 second) even for large chunks of text.
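(The extraction itself is nothing exotic; it's broadly this shape, although the actual pattern for our 7-character codes and the table name below are just placeholders:)

Public Sub ImportCodes(ByVal strPastedText As String)
    Dim re As Object, matches As Object, m As Object

    Set re = CreateObject("VBScript.RegExp")
    re.Pattern = "\b[A-Z0-9]{7}\b"        ' placeholder: 7 alphanumeric characters
    re.Global = True

    Set matches = re.Execute(strPastedText)
    For Each m In matches
        CurrentDb.Execute "INSERT INTO tblValidCodes (Code) " & _
                          "VALUES ('" & m.Value & "')", dbFailOnError
    Next m
End Sub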

Thus, if they need to submit more than 64,000 characters, they can simply split the data into batches, each of which obeys the limit, and use the textbox multiple times (the popup opens clean each time). This is generally how it happens anyway, as the overall list is actually an accumulation from various different sources, added one paste at a time. It maintains the easy interface but bypasses the text limit.

The second button is for situations where more than 64,000 characters need to be pasted in a single go - it simply hooks into the clipboard via a late-bound MSForms.DataObject, reads in the clipboard's text contents and parses them using the same regular expression. The valid codes are inserted into the first table, duplicates are inserted/updated in the second, and anything that remains is dumped into the third table so it can be easily reviewed.
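(The clipboard read is just the usual late-bound DataObject trick; a sketch below, with the caveat that the function name is made up and the CLSID is the one commonly quoted for MSForms.DataObject:)

Public Function GetClipboardText() As String
    Dim objData As Object
    ' late-bound MSForms.DataObject, so no reference to the MSForms library is needed
    Set objData = CreateObject("new:{1C3B4210-F441-11CE-B9EA-00AA006B1A69}")
    objData.GetFromClipboard
    If objData.GetFormat(1) Then           ' 1 = plain text format
        GetClipboardText = objData.GetText(1)
    End If
End Function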

Obviously this is open to the odd error depending on what's being passed through the clipboard, but there is sufficient handling that anything outside the expected is fairly obvious to the user and can be accounted for.

Anyway, I just wanted to let y'all know; if anybody is having a similar problem, this is a fairly straightforward workaround that maintains the fluidity of the interface without introducing too many additional steps.

Thanks everybody for the assistance, and I hope this thread helps somebody else with a similar issue!
 
